Matters of Interest | RichardDagan.com
Notes on "Framing": Shaping opinion and response through language.
With research on the "backfire effect" and the persistence of misconceptions. 26 April 2011. Modified: 27 January 2013. First published at Intraspec.ca; moved here 16 June 2013.
Last Modified: 19 February 2015.
What is a "frame"?
Source: Textual silence and the discourse of homelessness
A frame is a socially based, abstract, high-level knowledge structure that organizes certain information about the world into a coherent whole; it is "a general, standardized, predefined structure (in the sense that it already belongs to the receiver's knowledge of the world) which allows re-cognition and guides perception" (Donati, 1992: 141). Writers and speakers commonly frame public issues by mentioning certain relevant topics and subtopics while ignoring others. In so doing, they are in effect setting the context so as to invoke a certain context model, i.e. give the text representation a certain "slant".
Donati, P. (1992) 'Political Discourse Analysis', in M. Diani and R. Eyerman (eds.), Studying Collective Action, pp.136-67. London: Sage.
What is a "context model"?
Source: Cognitive Context Models and Discourse
During a conversation, a lecture, doctor-patient interaction, reading the newspaper or watching TV, participants of course also need to mentally monitor such encounters themselves, e.g., by planning, executing, controlling or indeed understanding them. It is here proposed that such ongoing, continuously updated episodic representations should be conceptualized as a special type of models, viz., context models. [...]
[C]ontexts typically consist of at least the following major categories, possibly each with their own internal schematic structure, as if they were sub-models:
[...] Context models are episodic, personal and hence subjective interpretations and experiences of the communicative event or context. That is, speech participants will usually have similar or overlapping models of the event they participate in, but their models are both theoretically and practically unique and different, as is true for all models: Rather trivially, speech participants have different goals, perspectives, knowledge, opinions, etc., about ongoing text and talk. In written communication this may even be more pronounced, given the obviously different models of writers and readers, models that also have different information in their Setting (Time and Place) category. Indeed, routine complications in talk may be largely based on conflicting context models, and negotiation may be necessary to strategically manage such conflicts. [...]
Our context models make us more or less susceptible to the framed message.
How does framing work?
The facts never speak for themselves, which is why scientists must "frame" their messages to the public.
Over the past two decades, research in the fields of political communication and sociology has added to previous work on framing to explain how media portrayals, in interaction with cultural forces, shape public views. In this research, frames are identified as being used by audiences as "interpretative schema" to make sense of and discuss an issue, by journalists to craft interesting and appealing news reports, and by policymakers to define policy options and reach decisions.[5]
In each of these contexts, frames simplify complex issues by lending greater importance to certain considerations and arguments over others. In the process, framing helps communicate why an issue might be a problem, who or what might be responsible, and what should be done.[6] A typology of frames summarizes a common set of interpretations specific to science-related issues. Past research suggests that these generalizable interpretations play out over and over again across science debates.[7] [...] [Read more]
The Ethics of Framing
"Cough or sneeze in your sleeve" is an excellent example of a complex theme framed in simple terms that communicate information of real benefit to others. The origin of "cough in your sleeve" is difficult to pin down, but some remember it from childhood: "Rather than coughing into your hands and spreading germs by touching everything you come into contact with, you should cough into your sleeve instead. This method will keep your saliva to yourself – a trick most of us learned in kindergarten or from our multi-tasking mothers who didn’t have time for consistent hand washing."
We launched an aggressive communication strategy to get the word out to the American people — primarily about vaccination, since this is the single safest and most effective way to protect public health — but also about “what you can do to keep flu from spreading: cough in your sleeve; keep surfaces clean; stay home when you’re sick.”
"Cough in your sleeve" proved an extremely successful framing of the facts of contagion. It is a call for specific action in the interest of self-protection and shared responsibility in a social matrix.
But framing can also be used in a manner most of us would regard as unethical, particularly when objective reportage is compromised by personal bias that leads to the omission or distortion of "facts" in the frame. A good case in point is presented by Tucker Carlson, editor of The Daily Caller, in his recent exposés of email archives from Journolist, a now-defunct listserv comprising several hundred liberal journalists, like-minded professors, and activists. These archives suggest a concerted effort on the part of certain journalists to frame information in pursuit of their own political biases, rather than to convey the facts in an objective manner.
For Journolist founder Ezra Klein's take on it, see On Journolist, and Dave Weigel (25.06.10), in which he writes that "insofar as the current version of Journolist has seen its archives become a weapon, and insofar as people's careers are now at stake, it has to die". Klein is a "26-year-old Washington Post blogger [...] who makes trenchant observations about health care and other complicated policy issues" and "could be seen as relatively inexperienced [...]", writes columnist Kathleen Parker. But while his postscript is an interesting exploration of personal motive and an attempt to place events in meaningful context, Klein seems to miss the key point. The fact is, we want to hold "reporters" to certain standards of conduct as professional framers of information in the public interest, and those standards preclude prejudicial or pejorative distortions or elisions of the facts with intent to manipulate public opinion. There is a difference, one hopes, between a reporter and an activist.
The Pew Research Center for the People & the Press released a survey report (12.09.09) entitled Press Accuracy Rating Hits Two Decade Low: Public Evaluations of the News Media: 1985-2009. Among the findings:
Just 29% of Americans say that news organizations generally get the facts straight, while 63% say that news stories are often inaccurate. In the initial survey in this series about the news media’s performance in 1985, 55% said news stories were accurate while 34% said they were inaccurate. That percentage had fallen sharply by the late 1990s and has remained low over the last decade.
Similarly, only about a quarter (26%) now say that news organizations are careful that their reporting is not politically biased, compared with 60% who say news organizations are politically biased. And the percentages saying that news organizations are independent of powerful people and organizations (20%) or are willing to admit their mistakes (21%) now also match all-time lows.
The Journolist controversy may be overblown in some respects (there is no foul in the personal exchange of opinions, for example), but in this case, more than the matter of media bias and mistaken facts, at issue is the intent to dissemble, disparage, and manipulate. See the video, below. And see Getting the message on Journolist's controversial postings, by Howard Kurtz, Washington Post (23.07.10).
In Politics, Sometimes the Facts Don't Matter
Source: In Politics, Sometimes The Facts Don't Matter
New research suggests that misinformed people rarely change their minds when presented with the facts – and often become even more attached to their beliefs. The finding raises questions about a key principle of a strong democracy: that a well-informed electorate is best.
Talk of the Nation, NPR (00:30:17)
[...] In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
"The general idea is that it's absolutely threatening to admit you're wrong," says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as "backfire" — is "a natural defense mechanism to avoid that cognitive dissonance." [...]
Abstract: An extensive literature addresses citizen ignorance, but very little research focuses on misperceptions. Can these false or unsubstantiated beliefs about politics be corrected? Previous studies have not tested the efficacy of corrections in a realistic format. We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a "backfire effect" in which corrections actually increase misperceptions among the group in question.
Source: Psychological Operations Field Manual
No. 33-1, 31 August 1979. Department of the Army. "Propaganda Techniques" is based upon "Appendix I: PSYOP Techniques" from Psychological Operations Field Manual No. 33-1, published by Headquarters, Department of the Army, Washington DC, on 31 August 1979.
Knowledge of propaganda techniques is necessary to improve one's own propaganda and to uncover enemy PSYOP stratagems. Techniques, however, are not substitutes for the procedures in PSYOP planning, development, or dissemination.
Techniques may be categorized as:
Characteristics of the content self-evident. No additional information is required to recognize the characteristics of this type of propaganda. "Name calling" and the use of slogans are techniques of this nature.
Additional information required to be recognized. Additional information is required by the target or analyst for the use of this technique to be recognized. "Lying" is an example of this technique. The audience or analyst must have additional information in order to know whether a lie is being told.
Evident only after extended output. "Change of pace" is an example of this technique. Neither the audience nor the analyst can know that a change of pace has taken place until various amounts of propaganda have been brought into focus.
Nature of the arguments used. An argument is a reason, or a series of reasons, offered as to why the audience should behave, believe, or think in a certain manner. An argument may be expressed or implied.
Inferred intent of the originator. This technique refers to the effect the propagandist wishes to achieve on the target audience. "Divisive" and "unifying" propaganda fall within this technique. It might also be classified on the basis of the effect it has on an audience.
Source: The Century of the Self
Adam Curtis, BBC Documentary (Monday 29 April - Thursday 2 May 2002)
Adam Curtis' acclaimed series examines the rise of the all-consuming self against the backdrop of the Freud dynasty.
To many in both politics and business, the triumph of the self is the ultimate expression of democracy, where power has finally moved to the people. Certainly the people may feel they are in charge, but are they really? The Century of the Self tells the untold and sometimes controversial story of the growth of the mass-consumer society in Britain and the United States. How was the all-consuming self created, by whom, and in whose interests? [...]
Sigmund Freud's investigations into the bubbling and murky world of the subconscious changed the world. By introducing a technique to probe the unconscious mind, Freud provided useful tools for understanding the secret desires of the masses. Unwittingly, his work served as the precursor to a world full of political spin doctors, marketing moguls, and society's belief that the pursuit of satisfaction and happiness is man's ultimate goal.
More on the series: Shenjiva: My Journal of Arts
Framing and repetition...
Deconstructing the corporate tax cut debate
Trish Hennessy, Framed in Canada (27.01.11)
[...] Given recent polling that indicates the majority of Canadians don’t like the idea of corporate tax cuts, it could be a risky wedge issue on which to stake an election. But conservatives in Canada are careful students of framing. They understand what neuroscience is teaching us: repetition changes minds.
There are several fascinating experiments that demonstrate this phenomenon. For instance, psychologist Ian Skurnik asked senior citizens to sit through a computer presentation of a series of health warnings that were randomly identified as either true or false: Aspirin destroys tooth enamel (true); corn chips contain twice as much fat as potato chips (false). Quizzed a few days later, the seniors remembered the false statements as true – repetition had rewired their brains to believe falsehoods.
Kimberlee Weaver of Virginia Tech did a study showing that if one person tells you that something is true, and tells you that over and over again, you are likely to conclude that the opinion is widely held. Norbert Schwarz from the University of Michigan helped show that even when the task is to educate the public with a myth-busting fact sheet, people walk away remembering the repeated myths as truth.
Astute politicians use repetition to build support for their side. Harper’s Conservatives are nothing if not disciplined message bearers. Their carefully scripted frame, which key Ministers are diligently repeating, is simple: Corporations = job creators; corporate tax cuts = job creation. Finance Minister Jim Flaherty has used it: “If we want more jobs, higher wages, an improved standard of living for all of us, Canada needs to be an attractive place for job-creators to do business and invest.” Government House Leader John Baird has used it: “We are reducing taxes for businesses because it creates jobs and it creates economic growth,” said Baird. “Our tax rates for job creators is one of the measures sustaining our fragile economic recovery.” And you can expect to hear a lot more of the Conservative frame. [...]
A key finding that has emerged in communications research over the years is that when propaganda fails, it's because audiences are active. They ask questions.
Communications professor Aaron Delwiche, Trinity University, San Antonio TX; propagandacritic.com.
See Military experts say psy-ops isn't brainwashing: Psychological operations just convinces the enemy to change behavior.