"If enough of us refuse to answer, the polling data will become so unrepresentative and unreliable even the media would have to admit it was useless." — Arianna Huffington, writing on her blog, The Huffington Post, shortly after the New Hampshire Democratic primary.
On January 8, Hillary Clinton won that race by 2 points, a 10-point shift from what most polls had predicted. Watch any news program today and you are likely to hear a political pundit deriding the validity and reliability of political polls and, by extension, surveys.
Andrew Kohut, president of the Pew Research Center, wrote a column in The New York Times titled "Getting It Wrong," saying, "All the published polls, including those that surveyed through Monday, had Senator Barack Obama comfortably ahead with an average margin of more than 8 percent." To his credit, Kohut defends the overall reliability of polling methodology and suggests factors for further study.
Some of the factors that have been suggested include:
Sampling problems — Some have suggested that certain demographic segments of the voting population are underrepresented because they do not cooperate with pollsters, and that these people tend to be lower income, less educated, and more racially biased.
Media interpretation and bias — Some have suggested that the poll results were in fact accurate, but that the media is poorly educated in how to read and interpret poll results.
Racial bias — Some have suggested that many white voters are more liberal when speaking to pollsters than they are when alone in the voting booth. Would online polls be more representative?
Gender bias — Some have suggested that women would not want to seem biased toward Hillary Clinton, or men might be embarrassed to admit that they were going to vote for a woman, so their actual votes would be underrepresented.
Voting machine error — Some bloggers believe that the polls were correct, but that the voting machines were in error.
Bias in political polling — Much of the public simply does not trust political polls. Certainly push polling and other disreputable practices deepen this distrust. I was once asked to help a local campaign with a poll when one activist suggested, "Why don’t we just tell them we took a poll. Who would know?" I had to tell her, "I would know."
So, what do you think was going on in New Hampshire?
And what do you do in commercial market research, when the actual results differ significantly from what you or the client expected? Do you challenge your results, or the expectations? Have you ever encountered a client who just did not trust surveys?
Looking forward to your responses.
P.S. The CNN/UNH poll (http://i.a.cnn.net/cnn/2008/images/01/05/top10.pdf) predicted a tie, +/- 5 points. The actual difference was 2 points. Maybe not all of the polls were entirely wrong.
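For readers curious where a figure like "+/- 5 points" comes from: as a back-of-the-envelope check, the standard 95% margin-of-error formula for a simple random sample (a simplification — real polls use weighting and design effects, and the CNN/UNH poll's actual sample size is in the linked PDF, not assumed here) can be sketched like this:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    p=0.5 is the worst case (widest margin); z=1.96 is the
    critical value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Roughly 384 respondents produce a +/- 5-point margin at 95% confidence.
print(round(margin_of_error(384) * 100, 1))   # about 5.0
# Quadrupling the sample only halves the margin:
print(round(margin_of_error(1536) * 100, 1))  # about 2.5
```

The point of the sketch: a 2-point result inside a +/- 5-point margin is statistically consistent with "a tie," which is exactly why the P.S. above argues that not all of the polls were wrong.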
Steve Runfeldt (Senior Account Executive for Quantitative Research) came to Schwartz Consulting Partners in September 2007 with 27 years of research experience. His expertise is in innovative research design, statistics, and analysis. He holds a BA in Psychology and Anthropology from Brandeis University and did graduate work in Behavioral Sciences, Genetics, and Neurobiology at The Rockefeller University and in Comparative Psychology at Georgia State University. Steve has worked as project manager, statistician, and director of Internet research at Elrick & Lavidge (now TNS); as principal and VP of Research at CustomerSat.com; and as founder and CEO of justASKthem.com.
Steve is recognized as a pioneer and innovator in the field of Internet survey research, having introduced some of the first methods for online sample control, real-time online reporting, and customer relationship management. As head of justASKthem.com he designed and managed one of the first online customer satisfaction management systems, which enabled AT&T WorldNet to become the industry leader in customer service satisfaction. Other clients with whom Steve has worked include AMD, American Century, Cover Girl, Pac Bell, Price Waterhouse, Procter & Gamble, Roper Starch, SBC, Siemens, the U.S. Navy, and the World Bank.
In 2005 Steve developed a new method using Flash technology that allows websites to collect consumer feedback through a short three-question inline feedback application. When installed on a web shopping cart, this method has achieved response rates as high as 70%.
As a member of the Marketing Research Association Internet Ethics Task Force, Steve helped to write the association’s Internet market research survey guidelines. He has been a keynote speaker, panel moderator and workshop presenter for groups such as the Marketing Research Association, Emory School of Business, University of Georgia Marketing Research Program, the Institute for International Research and the International Quality and Productivity Center.