Pollster.com’s Mark Blumenthal posted an article at National Journal about two new ventures bringing “professional” polling to the masses. Essentially, these ventures (Pulse Opinion Research and Precision Polling) allow anyone with a credit card to inexpensively commission and field a telephone survey (either a robo-call or a recording of the purchaser’s own voice) on the topic of their choosing.
The fact that these ventures are run by “professional pollsters” is irrelevant: once you plop down your fee, you are the pollster, and there won’t be any professional guidance (or oversight) on how your questions are formulated or how your polls are used. Want to run a push poll? Go for it. Circumvent the “do not call” rules with a thinly veiled attempt to “gauge interest” in a new product or service? Be our guest. Blumenthal wisely notes that if we are flooded with these self-serve robo-polls, three changes will need to take place to safeguard the integrity of our industry: better measures of quality, better disclosure, and safeguards against abuse. Yes, please.
I’ll add a fourth: better outreach by AAPOR, CASRO and enlightened advocates in the press to educate journalists on how to read data and the questions they should be asking. I’ve commented numerous times on the sad state of survey reporting, and on the fact that so many journalists merely pass on the results of study after study without once questioning the sample composition, the claims of representativeness, who paid for the survey, or even what the exact questions were. I am sure there will be good studies done under the aegis of these two enterprises; I am equally sure there will be poor ones as well. The fourth estate has a responsibility to do more than “pass along” results indiscriminately. As data reporting quickly gets shortened to 140 characters, online reporters have an even greater responsibility to separate the wheat from the chaff.
Often, the survey itself does a fine job of disclosing its limitations, but the press (and the tech press especially, which is woeful in this regard) conflates “respondents” with “all Americans” or “all households.” With thousands of non-representative convenience samples on various topics, journalists will inevitably be confronted with surveys that offer differing and sometimes contradictory results. A journalist’s responsibility is to first ask, and answer, why those surveys might differ before blindly passing them along to the public. In survey reporting, it’s often the sins of omission, not commission, that do the most harm to readers.