Who cues?


At the MAPOR conference last weekend, I presented a study on how partisan media strengthen belief gaps. The belief gap idea, first identified by Doug Hindman a couple of years ago, is an extension of the knowledge gap, a theory with over 40 years of work behind it. Whereas the knowledge gap hypothesis suggests that education predicts differential gains in knowledge about political issues — everyone learns, but high-education people learn more, creating a widening gap — the belief gap hypothesis suggests that ideology is a better predictor than education. It’s called the “belief” gap because a conceptualization based on beliefs better fits contexts in which facts and knowledge are politically contested.

My paper (with students Delwar Hossain and Ben Lyons) took the initial findings and expanded them in three ways. First, we examined partisanship in addition to ideology and found that it is consistently a better predictor of beliefs. We attribute this to the fact that both are essentially markers of group affiliation, but that partisanship is a clearer marker for both researchers and survey respondents. Ideology has long been conceptualized as a coherent belief system that drives opinion formation, but most research suggests few people actually hold this kind of formal ideology. Instead, we use cues from elites to guide our opinions, attitudes and beliefs.

Second, we examined the role of partisan traditional and social media in the belief gap process. Despite concern that social media are politically polarizing and insular, we found that partisan traditional media are far stronger drivers of partisan beliefs. There is a structural explanation for this (cable TV and radio have far larger audiences than blogs and pundits’ social media outlets) as well as a psychological one (we’re exposed to more elite opinion through those outlets, whether the elites are elected officials or opinion-leading commentators).

Finally, we examined belief gaps in five issues — two science-related issues that had previously been studied by Hindman (climate change and abstinence-only sex education), two evidence-free rumors about President Obama (he’s a Muslim, he was born outside the U.S.) and one factual economic issue (whether most Americans’ taxes have gone up during the Obama Administration). Each of these issues has a correct answer by consensus of relevant authorities, but each is also highly politicized. We found belief gaps for each, with largely similar patterns of partisan media influence.
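
The models behind these findings aren’t spelled out in this post, so purely as an illustration, here is a hedged sketch (simulated data, hypothetical variable names and scales) of the generic form a belief gap test takes: regress an issue belief on partisanship, education and partisan media use, and look for a partisanship-by-media interaction.

```python
# A hedged illustration, not the paper's actual analysis: simulated data and
# hypothetical variable names, just to show the generic shape of a belief gap test.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "partisanship": rng.integers(1, 8, n),    # assumed 7-point scale, strong Dem to strong Rep
    "education": rng.integers(1, 6, n),       # assumed 5-point attainment scale
    "partisan_media": rng.uniform(0, 7, n),   # assumed days-per-week measure of partisan outlet use
})

# Simulate a binary belief item whose link to partisanship strengthens with media use.
logit_p = -3 + 0.3 * df["partisanship"] + 0.1 * df["partisanship"] * df["partisan_media"]
df["belief"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# The belief gap prediction rides on the interaction term: partisanship should
# predict the belief more strongly among heavy partisan-media users.
model = smf.logit("belief ~ partisanship * partisan_media + education", data=df).fit()
print(model.summary())
```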

I lay all this out because thinking about our findings in the context of the other presentations in the belief gap panel — from Hindman; Ken Blake and Misa Culley; and Rob Daves, Allen White and Stephen Everett — got me thinking about the broader, more abstract facets of this idea.1 To my mind, there are two big questions to be answered.

First, we need to think about what things a person can have “beliefs” about. During the panel, Rob Daves talked about “verifiable” issues and referenced the work of Cecilie Gaziano in this area, but I think we can frame this in cognitive terms. Given that the belief gap idea grew out of the knowledge gap, I suggest we look toward the cognitive structure of knowledge to understand what we mean by “belief.” Presumably we are thinking of issues about which the believer can feel that their beliefs are “correct,” even if all evidence and authoritative consensus suggests otherwise, even if there is no consensus to draw on, and even if the answer exists but is unknowable. We may further want to separate issues that are retrospective (verification may already be possible), prospective (verification can’t be done yet) and ongoing (verification may be ephemeral or in constant dispute). These dimensions would be orthogonal to the dimension already in use in existing research, politicization.2 The typology might look something like this (consistent with a seat-of-the-pants typology, the examples are the result of some quick thinking and may not fit all that well):

|  | Politicized: retrospective | Politicized: prospective | Politicized: ongoing | Non-politicized: retrospective | Non-politicized: prospective | Non-politicized: ongoing |
| --- | --- | --- | --- | --- | --- | --- |
| Consensus | Obama born in U.S. | Global temperatures will rise | Climate change | Lincoln killed by Booth |  | Vaccines and autism |
| Disputed | Roe v. Wade lowered crime rate | Economy will improve next year | Gun ownership and safety | JFK killed by Oswald |  |  |
| Unknowable | 2000 election stolen |  |  |  | Jesus will return someday | Alien life exists |
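
For the code-minded, the same typology can also be written down as a simple data structure; the field names below are illustrative shorthand, and the placements simply restate the table above.

```python
# An illustrative encoding of the typology above; labels are shorthand, and the
# cell placements simply restate the table.
from dataclasses import dataclass

@dataclass(frozen=True)
class IssueType:
    politicized: bool   # contested along partisan/ideological lines?
    timing: str         # "retrospective", "prospective" or "ongoing"
    verification: str   # "consensus", "disputed" or "unknowable"

TYPOLOGY = {
    "Obama born in U.S.":             IssueType(True,  "retrospective", "consensus"),
    "Global temperatures will rise":  IssueType(True,  "prospective",   "consensus"),
    "Climate change":                 IssueType(True,  "ongoing",       "consensus"),
    "Lincoln killed by Booth":        IssueType(False, "retrospective", "consensus"),
    "Vaccines and autism":            IssueType(False, "ongoing",       "consensus"),
    "Roe v. Wade lowered crime rate": IssueType(True,  "retrospective", "disputed"),
    "Economy will improve next year": IssueType(True,  "prospective",   "disputed"),
    "Gun ownership and safety":       IssueType(True,  "ongoing",       "disputed"),
    "JFK killed by Oswald":           IssueType(False, "retrospective", "disputed"),
    "2000 election stolen":           IssueType(True,  "retrospective", "unknowable"),
    "Jesus will return someday":      IssueType(False, "prospective",   "unknowable"),
    "Alien life exists":              IssueType(False, "ongoing",       "unknowable"),
}

# e.g., the politicized issues for which an authoritative consensus exists
consensus_politicized = [issue for issue, t in TYPOLOGY.items()
                         if t.politicized and t.verification == "consensus"]
```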

If we believe that the process observed in the belief gap phenomenon is one of elite cuing by like-minded political leaders (consistent with the work of, e.g., John Zaller), the next question is who does the cuing across the range of this issue typology. For politicized issues, we’ve got a pretty strong hypothesis that political elites provide the most relevant cues, but who those elites are might vary by issue. Particularly for issues that are politicized along evangelical/non-evangelical religious lines, we might expect to see different people and sources playing important roles in mass-opinion formation. Maybe economic, defense and science issues all have different arrays of influential elites; still, we’re probably talking about a relatively narrow band of elites that cue beliefs across a lot of political issues.

But what about beliefs for which elite political cues are not relevant? With the possible exception of Michele Bachmann, nobody’s politicizing childhood vaccinations. So who cues beliefs about vaccinations? Is it scientific consensus (as reported by news media)? Jenny McCarthy? Oprah? If we can explain how non-political beliefs are cued, we may go a long way toward identifying the underlying cognitive and social psychological processes of political belief formation.

1. I should also acknowledge the suggestions of several members of the SIUC political science department (particularly Tobin Grant and Scott McClurg) during a preliminary presentation of this work, which I subsequently incorporated into the final product, and which have informed my ongoing thinking about this topic.

2. There’s another wrinkle here, which is the concept of issue domains and the cognitive work that goes into connecting our attitudes on related issues. For example, in the data used in our paper above, beliefs that tax cuts encourage job creation and that federal deficits discourage job creation were strongly correlated, even though tax cuts help to increase deficits. Additionally, our respondents anticipate strong inflation over the next year, even though we’ve been in a period of historically low inflation during the global recession. What it looks like is that, instead of considering each issue on its own, respondents relate all these economic issues to general economic attitudes — that is, the economy is bad, and inflation is bad, so we must be in an inflationary period. Job creation is good and tax cuts are good, so they must go together. This is probably relevant, but also probably not worth getting into until the first level of questions has been worked out.
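
A rough way to check that hunch is sketched below, with hypothetical item names and simulated data: if a single general economic attitude is doing the work, the belief items should correlate strongly and most of their variance should sit on one component.

```python
# A hedged sketch of how one might probe the "general economic attitude" hunch;
# the items and data here are hypothetical, not the survey measures from the paper.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 800
general_attitude = rng.normal(size=n)   # latent "the economy is bad/good" factor
items = pd.DataFrame({
    "tax_cuts_create_jobs": general_attitude + rng.normal(scale=0.8, size=n),
    "deficits_kill_jobs":   general_attitude + rng.normal(scale=0.8, size=n),
    "inflation_coming":     general_attitude + rng.normal(scale=0.8, size=n),
})

print(items.corr().round(2))                       # pairwise correlations among the belief items
eigvals = np.linalg.eigvalsh(items.corr().values)  # eigenvalues of the correlation matrix (ascending)
print("share of variance on the first component:", round(eigvals[-1] / eigvals.sum(), 2))
```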


    • Rob Daves

      Aaron…

      I’m fascinated by your partitioning of the verification issue — which, again, is science fact versus cultural “knowledge” (which may or may not be fact) — into retrospective and prospective domains.

      I sent Doug a note yesterday suggesting that in addition to examining the drivers of public opinion about the various issues, we need to examine at least some of the key components of opinion themselves. These could include:

      1.  Salience to the respondent
      2.  The respondent’s direction of opinion
      3.  The available choice frames to the respondent
      4.  The respondent’s fund of underlying information, i.e., how much does the R know about the topic
      5.  The respondent’s certainty of conviction (this is often used on horserace questions in preelection polls, for example)
      6.  The respondent’s likelihood of action
      7.  The perceived social context in which the opinion is held

      This might make an interesting little canonical correlation exercise, should we have the opportunity to test it.
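
      A minimal sketch of what that exercise might look like, using scikit-learn’s CCA with hypothetical placeholder measures for the seven components above on one side and a small set of belief and media measures on the other:

      ```python
      # A rough sketch of the canonical correlation exercise suggested above: relate
      # the opinion-component measures (left set) to belief/media measures (right set).
      # All column names and the simulated data are hypothetical placeholders.
      import numpy as np
      import pandas as pd
      from sklearn.cross_decomposition import CCA

      rng = np.random.default_rng(2)
      n = 500
      # Left set: the seven opinion components listed above, measured per respondent
      X = pd.DataFrame(rng.normal(size=(n, 7)), columns=[
          "salience", "direction", "choice_frames", "information",
          "certainty", "action_likelihood", "social_context",
      ])
      # Right set: belief and partisan media measures (again, just placeholders)
      Y = pd.DataFrame(rng.normal(size=(n, 3)), columns=[
          "belief_index", "partisan_tv_use", "partisan_social_media_use",
      ])

      cca = CCA(n_components=2)
      X_c, Y_c = cca.fit_transform(X, Y)

      # Canonical correlations: correlation between each pair of canonical variates
      for i in range(2):
          r = np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1]
          print(f"canonical correlation {i + 1}: {r:.2f}")
      ```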

      Rob Daves

    • Stephanie

      Thank you for sharing!