Notes on "Framing"

Shaping opinion and response through language.
With research on the "backfire effect" and the persistence of misconceptions.

26 April 2011. Modified: 27 January 2013. First published at; moved here 16 June 2013.
Last Modified: 19 February 2015.

What is a "frame"?

Source: Textual silence and the discourse of homelessness
Thomas Huckin, Discourse & Society 2002;13(3):347-372. p.354.

A frame is a socially based, abstract, high-level knowledge structure that organizes certain information about the world into a coherent whole; it is "a general, standardized, predefined structure (in the sense that it already belongs to the receiver's knowledge of the world) which allows re-cognition and guides perception" (Donati, 1992:141). Writers and speakers commonly frame public issues by mentioning certain relevant topics and subtopics while ignoring others. In so doing, they are in effect setting the context so as to invoke a certain context model, i.e. give the text representation a certain 'slant'.

Donati, P. (1992) 'Political Discourse Analysis', in M. Diani and R. Eyerman (eds.), Studying Collective Action, pp.136-67. London: Sage.

What is a "context model"?

Source: Cognitive Context Models and Discourse
Teun A. van Dijk, 'Cognitive Context Models and Discourse', in M. Stamenow (ed.), Language Structure, Discourse and the Access to Consciousness. Amsterdam: Benjamins. (1997:189-226) pp.192-194, passim.

During a conversation, a lecture, doctor-patient interaction, reading the newspaper or watching TV, participants of course also need to mentally monitor such encounters themselves, e.g., by planning, executing, controlling or indeed understanding them. It is here proposed that such ongoing, continuously updated episodic representations should be conceptualized as a special type of models, viz., context models. [...]

[C]ontexts typically consist of at least the following major categories, possibly each with their own internal schematic structure, as if they were sub-models:

  • Setting: location, timing of communicative event;
  • Social circumstances: previous acts, social situation;
  • Institutional environment;
  • Overall goals of the (inter)action;
  • Participants and their social and speaking roles;
  • Current (situational) relations between participants;
  • Global (non-situational) relations between participants;
  • Group membership or categories of participants (e.g., gender, age).

[...] Context models are episodic, personal and hence subjective interpretations and experiences of the communicative event or context. That is, speech participants will usually have similar or overlapping models of the event they participate in, but their models are both theoretically and practically unique and different, as is true for all models: Rather trivially, speech participants have different goals, perspectives, knowledge, opinions, etc., about ongoing text and talk. In written communication this may even be more pronounced, given the obviously different models of writers and readers, models that also have different information in their Setting (Time and Place) category. Indeed, routine complications in talk may be largely based on conflicting context models, and negotiation may be necessary to strategically manage such conflicts. [...]

Our context models make us more or less susceptible to the framed message.

How does framing work?

Source: The Future of Public Engagement
Matthew C. Nisbet & Dietram A. Scheufele, The Scientist, Volume 21, Issue 10, p.38.

The facts never speak for themselves, which is why scientists need to "frame" their messages to the public.

The earliest formal work on framing traces back 25 years to research by the cognitive psychologists Daniel Kahneman and Amos Tversky. In experiments examining risk judgments and consumer choices rather than content itself, the two psychologists discovered that the different ways in which a message is presented or framed can result in very different responses. They concluded in their Nobel Prize-winning research that "perception is reference-dependent."4
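By way of illustration (this example is not from the article), Tversky and Kahneman's classic "Asian disease" problem shows how two wordings of mathematically identical options elicit opposite preferences. The sketch below, with hypothetical names, simply verifies that the gain-framed options (lives saved) and the loss-framed options (lives lost) describe the same expected outcomes:

```python
# Tversky & Kahneman's "Asian disease" problem: 600 people at risk.
# Gain frame: Program A saves 200 for sure; Program B saves all 600
# with probability 1/3 (and nobody with probability 2/3).
# Loss frame: Program C lets 400 die for sure; Program D lets nobody
# die with probability 1/3 (and all 600 die with probability 2/3).

POPULATION = 600

def expected_survivors(outcomes):
    """Expected number of survivors over (probability, survivors) pairs."""
    return sum(p * s for p, s in outcomes)

# Gain-framed options, described in lives saved.
program_a = [(1.0, 200)]
program_b = [(1/3, 600), (2/3, 0)]

# Loss-framed options, described in lives lost (converted to survivors).
program_c = [(1.0, POPULATION - 400)]
program_d = [(1/3, POPULATION - 0), (2/3, POPULATION - 600)]

# The frames are extensionally identical: A == C and B == D.
print(expected_survivors(program_a), expected_survivors(program_c))  # 200.0 200.0
print(expected_survivors(program_b), expected_survivors(program_d))
```

Although every option has the same expected value of 200 survivors, majorities in the original experiments chose the sure option (A) under the gain frame and the gamble (D) under the loss frame; the preference reversal is driven by wording alone.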


Over the past two decades, research in the fields of political communication and sociology has added to previous work on framing to explain how media portrayals in interaction with cultural forces shape public views. In this research, frames are identified as being used by audiences as "interpretative schema" to make sense of and discuss an issue, by journalists to craft interesting and appealing news reports, and by policymakers to define policy options and reach decisions.5

In each of these contexts, frames simplify complex issues by lending greater importance to certain considerations and arguments over others. In the process, framing helps communicate why an issue might be a problem, who or what might be responsible, and what should be done.6 A typology of frames specific to science-related issues summarizes a common set of frames specific to science. Past research suggests that these generalizable interpretations play out over and over again across science debates.7 [...] [Read more]

The Ethics of Framing

"Cough or sneeze in your sleeve" is an excellent example of a complex theme framed in simple terms that communicate information of real benefit to others. The origin of "cough in your sleeve" is difficult to pin down, but some remember it from childhood: "Rather than coughing into your hands and spreading germs by touching everything you come into contact with, you should cough into your sleeve instead. This method will keep your saliva to yourself – a trick most of us learned in kindergarten or from our multi-tasking mothers who didn’t have time for consistent hand washing."

Whatever the origin, the concept quickly became part of the communication strategies pursued aggressively by public health authorities during the H1N1 pandemic.

We launched an aggressive communication strategy to get the word out to the American people — primarily about vaccination, since this is the single safest and most effective way to protect public health — but also about “what you can do to keep flu from spreading: cough in your sleeve; keep surfaces clean; stay home when you’re sick.”

"Cough in your sleeve" proved an extremely successful framing of the facts of contagion. It is a call for specific action in the interest of self-protection and shared responsibility in a social matrix.

But framing can also be used in a manner most of us would regard as unethical, particularly when objective reportage is compromised by personal bias that leads to the omission or distortion of "facts" in the frame. A good case in point is presented by Tucker Carlson, editor of The Daily Caller, in his recent exposés of email archives from Journolist, a now-defunct listserv comprised of several hundred liberal journalists, like-minded professors, and activists. These archives suggest a concerted effort on the part of certain journalists to frame information in pursuit of their own political biases, rather than convey the facts in an objective manner.

For Journolist founder Ezra Klein's take on it, see On Journolist, and Dave Weigel (25.06.10), in which he writes that, "insofar as the current version of Journolist has seen its archives become a weapon, and insofar as people's careers are now at stake, it has to die". Klein is a "26-year-old Washington Post blogger [...] who makes trenchant observations about health care and other complicated policy issues" and "could be seen as relatively inexperienced [...]", writes columnist Kathleen Parker. But while his postscript is an interesting exploration of personal motive and an attempt to place events in meaningful context, Klein seems to miss the key point. The fact is, we want to hold "reporters" to certain standards of conduct as professional framers of information in the public interest, and those standards preclude prejudicial or pejorative distortions or elisions of the facts with intent to manipulate public opinion. There is a difference, one hopes, between a reporter and an activist.

The Pew Research Center for the People & the Press released a survey report (12.09.09) entitled Press Accuracy Rating Hits Two Decade Low: Public Evaluations of the News Media: 1985-2009. Among the findings:

Just 29% of Americans say that news organizations generally get the facts straight, while 63% say that news stories are often inaccurate. In the initial survey in this series about the news media’s performance in 1985, 55% said news stories were accurate while 34% said they were inaccurate. That percentage had fallen sharply by the late 1990s and has remained low over the last decade.

Similarly, only about a quarter (26%) now say that news organizations are careful that their reporting is not politically biased, compared with 60% who say news organizations are politically biased. And the percentages saying that news organizations are independent of powerful people and organizations (20%) or are willing to admit their mistakes (21%) now also match all-time lows.

The Journolist controversy may be overblown in some respects — there is no foul in the personal exchange of opinions, for example — but in this case, more than the matter of media bias and mistaken facts, at issue is the intent to dissemble, disparage, and manipulate. See the video, below. And see Getting the message on Journolist's controversial postings, by Howard Kurtz, Washington Post (23.07.10).

In Politics, Sometimes the Facts Don't Matter

All things considered...

We sometimes hold to biased cognitions, inaccurate beliefs and interpretations, even when presented with evidence that those cognitions are falsely predicated, because we want to maintain our point of view (motivated reasoning).H We may defend against cognitive dissonance (an inner sense of disquiet or anxiety resulting from simultaneously held contradictory ideas) by ignoring or rationalizing the evidence in terms of our desired conclusion.A, F, I, K The interview, article, and research presented below explore this phenomenon, as well as the "backfire effect" by which such cognitive bias is more strongly reinforced.

The opposite side of the coin is the manner in which the factual evidence is framed, and by whom. When we are overloaded with information, dealing with contradictory frames presented by supposedly "expert" or "authoritative" sources, each competing for our attention, filters become all the more important in making sense of our world.

As David Shenk writes (2003),
'[t]he psychological reaction to ... an overabundance of information and competing expert opinions is to simply avoid coming to conclusions. "You can't choose any one study, any one voice, any one spokesperson or a point of view," explains psychologist Robert Cialdini. "So what do you do? It turns out that the answer is, you don't do anything. You reserve judgment. You wait and see what the predominance of opinion evolves to be."'

But waiting for the emergence of predominant opinion may not be practicable; the psychoemotional need for an operative understanding may be immediate. Cognitive processes (strategies for constructing and evaluating belief)H engage the experiential information at hand, in the interests of intrapsychic coherence and stability, driven by unconscious variables.F In objective terms the resulting solution may be more expedient than factual, but it nevertheless serves as an effective filter, a context model, in terms of which we can respond.

Ideally, self-perceptionJ and self-affirmationE, F make one more open to evidence that runs contrary to a motivated stance but, given the postmodern fluidity of "fact" and the decenteredness of meaning in an age of spin and sound bites, is it any wonder that we formulate and defend self-serving belief, political or otherwise?

"The Scream", Edvard Munch
(Restored; original dated 1893? 1910?)
Image Credit: AFP/Getty; adapted.

Source: In Politics, Sometimes The Facts Don't Matter
Talk of the Nation, Host: Neil Conan, with guests: Dana Milbank, national political columnist, Washington Post; Brendan Nyhan, Robert Wood Johnson scholar in health policy research, University of Michigan; and Alicia Shepard, ombudsman, NPR [National Public Radio] (13.07.10)

New research suggests that misinformed people rarely change their minds when presented with the facts -- and often become even more attached to their beliefs. The finding raises questions about a key principle of a strong democracy: that a well-informed electorate is best.

Talk of the Nation, NPR (00:30:17)

How facts backfire:
Researchers discover a surprising threat to democracy: our brains

Joe Keohane, Boston Globe (11.07.10) Link added.

[...] In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

"The general idea is that it's absolutely threatening to admit you're wrong," says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as "backfire" — is "a natural defense mechanism to avoid that cognitive dissonance." [...]

Political Behavior, Volume 32, Number 2 / June, 2010

When Corrections Fail:
The Persistence of Political Misperceptions

Nyhan B, Reifler J. Political Behavior. 2010;32(2):303–330. DOI 10.1007/s11109-010-9112-2.
Published online 30.03.10.

Abstract: An extensive literature addresses citizen ignorance, but very little research focuses on misperceptions. Can these false or unsubstantiated beliefs about politics be corrected? Previous studies have not tested the efficacy of corrections in a realistic format. We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a "backfire effect" in which corrections actually increase misperceptions among the group in question.

Notes F and H, added.
[...] The backfire effects that we found seem to provide further support for the growing literature showing that citizens engage in "motivated reasoning." F, H While our experiments focused on assessing the effectiveness of corrections, the results show that direct factual contradictions can actually strengthen ideologically grounded factual beliefs—an empirical finding with important theoretical implications. Previous research on motivated reasoning has largely focused on the evaluation and usage of factual evidence in constructing opinions and evaluating arguments (e.g. Taber and Lodge 2006)1. By contrast, our research — the first to directly measure the effectiveness of corrections in a realistic context — suggests that it would be valuable to directly study the cognitive and affective processes that take place when subjects are confronted with discordant factual information. Two recent articles take important steps in this direction. Gaines et al. (2007)2 highlight the construction of interpretations of relevant facts, including those that may be otherwise discomforting, as a coping strategy, while Redlawsk et al. (forthcoming) argue that motivated reasoners who receive sufficiently incongruent information may become anxious and shift into more rational updating behavior. [...]

  1. Taber CS, Lodge M.
    Motivated skepticism in the evaluation of political beliefs.
    American Journal of Political Science. 2006;50(3):755–769.
  2. Gaines BJ, Kuklinski JH, Quirk PJ, Peyton B, Verkuilen J.
    Interpreting Iraq: Partisanship and the meaning of facts.
    Journal of Politics. 2007;69(4):957–974.

And see:

  1. "There Must Be a Reason": Osama, Saddam, and Inferred Justification
    Prasad M, Perrin A, Bezila K, Hoffman SG, Kindleberger K, Manturuk K, Powers A.
    Sociological Inquiry. 2009;79(2).
  2. Same Facts, Different Interpretations: Partisan Motivation and Opinion on Iraq
    Gaines BJ, Kuklinski JH, Quirk PJ, Peyton B, Verkuilen J.
    Journal of Politics. 2007;69:957-974.
  3. Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns.
    Schwarz N, Sanna LJ, Skurnik I, Yoon C.
    Advances in Experimental Social Psychology. 2007;39:127-161.
    DOI: 10.1016/S0065-2601(06)39003-X
  4. Bridging the Partisan Divide: Self-Affirmation Reduces Ideological Closed-Mindedness and Inflexibility in Negotiation
    Cohen GL, Sherman DK, Bastardi A, Hsu L, McGoey M, Ross L.
    Journal of Personality and Social Psychology. 2007;93(3):415–430.
    DOI: 10.1037/0022-3514.93.3.415
  5. The psychology of self-defense: Self-affirmation theory.
    Sherman DK, Cohen GL. In Mark P. Zanna (Ed.)
    Advances in Experimental Social Psychology. 2006;38:183-242.
  6. Hot Cognition or Cool Consideration? Testing the Effects of Motivated Reasoning on Political Decision Making
    Redlawsk DP. Journal of Politics. 2002;64:1021-1044.
  7. Misinformation and the Currency of Democratic Citizenship
    Kuklinski JH, Quirk PJ, Jerit J, Schwieder D, Rich RF.
    Journal of Politics. 2000 Aug;62(3):790-816.
  8. The Case for Motivated Reasoning
    Kunda Z. Psychological Bulletin. 1990 Nov;108(3):480-498.
  9. The Nature and Origins of Mass Opinion
    Zaller JR. Cambridge University Press (1992).
  10. Self-Perception: An Alternative Interpretation of Cognitive Dissonance Phenomena.
    Bem DJ. Psychological Review. 1967;74(3):183-200.
  11. A Theory of Cognitive Dissonance
    Festinger L. White Plains, NY: Row, Peterson (1957).

Psychological Operations — Framing as a weapon...
The term "Psychological Operations" (PSYOP, PSY-OP) was superseded by
"Military Information Support Operations" — MISO — in 2010.µ

Source: Psychological Operations Tactics, Techniques, and Procedures
FM 3-05.301 (FM 33-1-1), MCRP 3-40.6A. Department of the Army, Washington, DC. (31 December 2003:p.1-1)

PSYOP are planned operations that convey selected information and indicators to foreign target audiences (TAs) to influence their emotions, motives, objective reasoning, and ultimately, the behavior of foreign governments, organizations, groups, and individuals. The purpose of all PSYOP is to create in neutral, friendly, or hostile foreign groups the emotions, attitudes, or desired behavior that support the achievement of U.S. national objectives and the military mission. In doing so, PSYOP influence not only policy and decisions, but also the ability to govern, the ability to command, the will to fight, the will to obey, and the will to support. The combination of PSYOP products and actions create in the selected TAs a behavior that supports U.S. national policy objectives and the theater commander’s intentions at the strategic, operational, and tactical levels.

History of American propaganda for political and corporate control

Source: Psychological Operations Field Manual
No. 33-1, 31 August 1979. Department of the Army. "Propaganda Techniques" is based upon "Appendix I: PSYOP Techniques" from Psychological Operations Field Manual No. 33-1, published by Headquarters, Department of the Army, Washington, DC, 31 August 1979.

Knowledge of propaganda techniques is necessary to improve one's own propaganda and to uncover enemy PSYOP stratagems. Techniques, however, are not substitutes for the procedures in PSYOP planning, development, or dissemination.

Techniques may be categorized as:

Characteristics of the content self-evident.  No additional information is required to recognize the characteristics of this type of propaganda. "Name calling" and the use of slogans are techniques of this nature.

Additional information required to be recognized.  Additional information is required by the target or analyst for the use of this technique to be recognized. "Lying" is an example of this technique. The audience or analyst must have additional information in order to know whether a lie is being told.

Evident only after extended output.  "Change of pace" is an example of this technique. Neither the audience nor the analyst can know that a change of pace has taken place until various amounts of propaganda have been brought into focus.

Nature of the arguments used.  An argument is a reason, or a series of reasons, offered as to why the audience should behave, believe, or think in a certain manner. An argument is expressed or implied.

Inferred intent of the originator.  This technique refers to the effect the propagandist wishes to achieve on the target audience. "Divisive" and "unifying" propaganda fall within this technique. It might also be classified on the basis of the effect it has on an audience.

The shift from a "needs" to a "desire" culture:
Changing mentalities and cultivating behaviors through framing.
Edward Bernays, "propaganda" → "public relations", and social control.

Source: The Century of Self,
Adam Curtis, BBC Documentary (Monday 29 April - Thursday 2 May 2002)

Adam Curtis' acclaimed series examines the rise of the all-consuming self against the backdrop of the Freud dynasty.

To many in both politics and business, the triumph of the self is the ultimate expression of democracy, where power has finally moved to the people. Certainly the people may feel they are in charge, but are they really? The Century of the Self tells the untold and sometimes controversial story of the growth of the mass-consumer society in Britain and the United States. How was the all-consuming self created, by whom, and in whose interests? [...]

Sigmund Freud's work into the bubbling and murky world of the subconscious changed the world. By introducing a technique to probe the unconscious mind, Freud provided useful tools for understanding the secret desires of the masses. Unwittingly, his work served as the precursor to a world full of political spin doctors, marketing moguls, and society's belief that the pursuit of satisfaction and happiness is man's ultimate goal.

More on the series:

See below in six parts; watch the whole series at OVGuide; or download in full at Internet Archive.

  1. Simple Framing: An introduction to framing and its uses in politics.
    George Lakoff, Rockridge Institute, (14 February 2006)
  2. The framing effect and risky decisions: Examining cognitive functions with fMRI
    Cleotilde Gonzalez, Jason Dana, Hideya Koshino, Marcel Just.
    Journal of Economic Psychology. 2005;26:1–20.
  3. The Framing Effect of Price Format
    Marco Bertini and Luc Wathieu, Working Paper, HBS Working Knowledge, (16 May 2006; pubdate: June 2006)
    See also: Fixing Price Tag Confusion
    Q&A with Luc R. Wathieu, by Sean Silverthorne. (11 December 2006), HBS Working Knowledge
  4. Framing (social sciences), Wikipedia
  5. The Environment: A Cleaner, Safer, Healthier America
    Frank Luntz. A 16-page memo (2003) that outlines a public relations strategy to help Republicans and President George W. Bush address vulnerabilities in their position on the environment and the matter of global warming.
    See also: Words that Work: It's Not What You Say, It's What People Hear
    Frank Luntz (Hyperion Books; pubdate: 31 January 2007)
  6. Scientific American Mind: When Words Decide
    Researchers are discovering the myriad ways in which language can have a profound effect on the choices we make - from the foods we eat to the laws we support. [...]
    Barry Schwartz, Scientific American Mind, (August/September 2007:37-43)