A recent study by WorldPublicOpinion.org shows that the majority of the American population believed false things about basic national issues, right before the 2010 mid-term elections. I don’t know how to interpret this as anything other than a catastrophic failure of American journalism, in its most fundamental, clichéd, “inform the public” role.
The most damning section of the report (PDF) is titled “Evidence of Misinformation Among Voters.”
The poll found strong evidence that voters were substantially misinformed on many of the issues prominent in the election campaign, including the stimulus legislation, the healthcare reform law, TARP, the state of the economy, climate change, campaign contributions by the US Chamber of Commerce and President Obama’s birthplace. In particular, voters had perceptions about the expert opinion of economists and other scientists that were quite different from actual expert opinion.
This study also found that Fox viewers were significantly more misinformed than average on many issues, which is mostly how this survey was covered in the blogosphere and mainstream news outlets. I think this Fox thing is a terrible diversion from the core problem: the American press did not succeed in informing the public. Not even right before an election, not even on the narrow set of issues that, by survey, voters cared to base their votes on.
The travesty here is that the relevant facts were instantly available from primary sources, such as the Congressional Budget Office and the Intergovernmental Panel on Climate Change. I interpret this failure in the following way: for many kinds of issues, the web makes it easy to find true information. But it doesn’t solve the problem of making people go look. That, perhaps, is a key role for modern journalism. Unfortunately, modern American journalism seems to be very bad at it. I imagine the same problem exists in the journalism of many other countries.
What the study actually says
The study compares what voters think experts believe with what those experts actually believe. This is a bit tricky, and the study isn’t saying that the experts are necessarily right, but we’ll get to that. First, some example findings:
- 68% of voters thought that “most economists” believe that the stimulus package “saved or created a few jobs” and 20% thought most economists believe that the stimulus caused job losses, whereas only 8% correctly said that most economists think it “saved or created several million jobs.” (The Congressional Budget Office estimates that the stimulus saved several million jobs, as do 75% of economists interviewed by the Wall Street Journal.)
- 53% of voters thought that economists believe that Obama’s health care reform plan will increase the deficit, while 29% said that economists were evenly divided on this issue. Only 13% said correctly that a majority of economists think that health care reform will not increase the deficit. (The Congressional Budget Office estimates a net reduction in deficits of $143 billion over 2010-2019, and the Medicare Boards of Trustees also believe that the Affordable Care Act will “postpone the exhaustion of … trust fund assets.”)
- 12% of voters thought that “most scientists believe” that climate change is not occurring, while 33% thought scientists were evenly divided on the issue. That’s 45% with an incorrect perception, as opposed to the 54% who said, correctly, that most scientists think climate change is occurring. (Aside from the IPCC reports and virtually every governmental study of the issue worldwide, an April 2010 survey of climate scientists showed that 97% believe that human-caused climate change is occurring.)
A fussy but necessary digression: all of this rests on the reliability of the WorldPublicOpinion.org survey results. The survey was conducted by Knowledge Networks, Inc. using an online response panel randomly selected from the US population. Those without internet access were apparently provided it for free. I have been unable to find any serious independent evaluation of Knowledge Networks’ methodology, but their many research papers on sample design certainly talk the talk. All of the basic sources of sampling error, such as self-selection and language bias (what about Hispanics?), are at least addressed on paper. The margin of error is reported as 3.9%.
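As a quick sanity check on that figure (my own back-of-the-envelope arithmetic, not anything stated in the report): for a simple random sample, the textbook margin-of-error formula at 95% confidence, assuming the worst case of an evenly split answer, implies a sample of roughly six hundred respondents, which is a plausible size for a national opinion poll.

```latex
% Margin of error for a sampled proportion
% (assumes 95% confidence and worst-case p = 0.5; these are my assumptions, not the report's)
\[
\mathrm{MOE} = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.25}{n}}
\]
% Setting MOE = 0.039 and solving for the implied sample size:
\[
n = \frac{1.96^2 \times 0.25}{0.039^2} \approx 631
\]
```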
So let’s take these survey results as accurate, for the moment. This means that the majority of the American public had an incorrect conception of expert opinion on the issues that they voted on. That’s a mouthful. It’s not the same as “believed false things,” and in fact asking “what do you think experts believe” deliberately dodges the tricky question of what is true. If there is some misperception of expert belief, then in the strictest terms the public is misinformed. The study addresses this point as follows:
In most cases we inquired about respondents’ views of expert opinion, as well as the respondents’ own views. While one may argue that a respondent who had a belief that is at odds with expert opinion is misinformed, in designing this study we took the position that some respondents may have had correct information about prevailing expert opinion but nonetheless came to a contrary conclusion, and thus should not be regarded as ‘misinformed.’
So this study does not say “the American public are wrong about the economy and climate change.” It says that they haven’t really looked into it. I’m all for questioning authority’s claim to truth — anyone who follows my work knows that I’m generally a fan of Wikipedia, for example — but I believe we must take lifelong study and rigorous methodology seriously. To put it another way: voting contrary to the opinions of economists may be a fine thing, but voting without any awareness of their work is just silly. Yet that seems to be exactly what happened in the last election.
The role of the press, then and now
Of course, voting is hard and stuff is complex, which is why we rely on the media to break it all down for us. The sad part is that economics and climate change are familiar ground for journalists. It’s not like the facts of these issues were not published in mainstream news outlets. For that matter, journalists were not even necessary here. Any citizen with a web browser could have found out exactly what the Affordable Care Act was predicted to do to the deficit. The Congressional Budget Office published their report and then blogged about it in plain language.
Maybe publishing the truth was never enough. Maybe journalism never actually “informed the public,” but merely created conditions where the curious could get themselves informed by diligently reading the news. But on big issues like whether a piece of national legislation will affect the deficit, we no longer need professionals to enable this kind of self-motivated discovery. The sources go direct in such cases, as the Congressional Budget Office did. And do we really expect that the social media sphere — that’s all of us — will remain silent about the next big global warming study? We’re all going to use Facebook etc. to share links to the next IPCC report when it comes out.
The web may have solved the problem of access to true information about these sorts of “votable issues,” but it has not solved the problem of getting every voter to go look at least once. That might be a job for informed professionals at the helm of big media channels. It’s a big responsibility for a news organization to take on, but I don’t see how it’s anything but the corollary to the responsibility to publish only true information. If some of that information is important enough to know, then consumers might well appreciate a news organization whose stated mission is to ensure they know it.
I suspect that paper-based habits are holding journalism back here. There is a deeply ingrained newsroom emphasis on reporting only what’s “new.” A budget report only gets to be news once, even if what it says is relevant for years. But there are no “editions” online; the same headline can float on the hot topics list for as long as it’s relevant. There is even more reason to keep directing attention to an issue if people are actively discussing it, if opinion on it is heavily polarized, or if there’s a lot of spin around it (see: the rise of fact-check journalism). In any case, journalists have long been good at keeping an issue in the news by advancing the story daily in one way or another. But first they have to know what the public doesn’t know.
So the burning question that the WorldPublicOpinion.org study leaves me with is just this: why wasn’t it a news organization that commissioned this survey?
See also: Does journalism work?