The Self-Fulfilling Prophecy, Confirmation Bias, Group-Think and News Pandering
Section: Dust Devils
Most people have heard of staying on message and building group loyalty. The purpose of this article is to show you how a set of techniques can be used to do just that. Such techniques have been used ethically and unethically. For example, these techniques have been used to create cults, illegal drug cultures, overly loyal employees and political groups that hate others and believe they have all the answers. However, they have also been used to improve learning.
One of the most practical approaches to behavior change in all of psychology is the self-fulfilling prophecy. According to Nada's ESL Island, a resource for teachers and students,
“In 1968, in a classic experiment, Robert Rosenthal, a professor of social psychology at Harvard, and Lenore Jacobson worked with elementary school children from 18 classrooms. They randomly chose 20% of the children from each room and told the teachers they were “intellectual bloomers.”
They explained that these children could be expected to show remarkable gains during the year. The experimental children showed average IQ gains of two points in verbal ability, seven points in reasoning and four points in overall IQ. The “intellectual bloomers” really did bloom.
Examples of how teachers communicate expectations
- Paying less attention to lows in academic situations (smiling less often, maintaining less eye contact, etc.)
- Calling on lows less often to answer questions or to make public demonstrations
- Waiting less time for lows to answer questions
- Not staying with lows in failure situations (e.g., providing fewer clues, asking fewer follow-up questions)
- Criticizing lows more frequently than highs for incorrect responses
- Praising lows more frequently than highs for marginal or inadequate responses
- Providing lows with less accurate and less detailed feedback than highs
- Demanding less work and effort from lows than from highs
- Interrupting lows more frequently than highs”
Another example, from the Accel Team, illustrates the full power of the self-fulfilling prophecy:
“In 1971 Robert Rosenthal, a professor of social psychology at Harvard, described an experiment in which he told a group of students that he had developed a strain of super-intelligent rats that could run mazes quickly. He then passed out perfectly normal rats at random, telling half of the students that they had the new “maze-bright” rats and the other half that they had “maze-dull” rats.
The rats believed to be bright improved daily in running the maze. They ran faster and more accurately. The “dull” rats refused to budge from the starting point 29% of the time, while the “bright” rats refused only 11% of the time. The message is that people’s expectations can cause them to get what they are expecting. According to Rosenthal, “Those who believed they were working with intelligent animals liked them better and found them more pleasant.”
“To make a person act in a certain way, all you have to do is believe this when you interact with them. If you find it hard to make this jump, persuade others that the target person has the desired attributes.
When people treat you as if you had certain attributes, decide whether this is desirable or not. Question their behavior if you do not wish to be pushed in this direction.”
There is one more aspect of human motivation required to understand how group loyalty is built: confirmation bias. According to Changing Minds,
“When we have made a decision or built a hypothesis, we will actively seek things which will confirm our decision or hypothesis. We will also avoid things which will disconfirm this. The alternative is to face the dissonance of being wrong. We use this approach both for searching our memory and looking for things in the external world.” That is, we seek out confirming information and ignore disconfirming information. As a result, once we develop a theory, it is very hard for us to change our minds.
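The filtering dynamic described above can be sketched as a toy simulation. This is a minimal illustration, not a psychological model: the function name, the step size, and the +1/-1 encoding of evidence are all invented for the example.

```python
# Toy model of confirmation bias: an agent holding a belief accepts
# evidence that confirms it and discards evidence that contradicts it,
# so its confidence can only move in one direction.

def update_confidence(confidence, evidence_stream, step=0.1):
    """Return confidence after filtering evidence through the bias.

    evidence_stream: iterable of +1 (confirming) / -1 (disconfirming).
    A biased agent only processes the +1 items.
    """
    for evidence in evidence_stream:
        if evidence > 0:                     # confirming: accepted
            confidence = min(1.0, confidence + step)
        # disconfirming evidence (-1) is ignored entirely
    return confidence

# Even a stream that is half disconfirming leaves the agent more
# convinced than before, because only the confirming half registers.
mixed_evidence = [+1, -1, +1, -1, +1, -1]
print(round(update_confidence(0.5, mixed_evidence), 2))  # 0.8
```

The design choice worth noticing is the missing `else` branch: nothing in the loop ever lowers confidence, which is exactly the asymmetry the quote describes.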
One way these techniques can be used unethically is to build cult-like groups, or groups that feel their cause is so just that other opinions are worthless. The political news media can use these two biases to build an audience and to make that audience loyal. The first step is to find an audience with certain beliefs and then pander to those beliefs. If you were building a political website in the U.S., you could pander to any one of the belief systems prevalent there (e.g., Christian Right, Libertarian, Business Conservative, Progressive Democrat, Social Democrat, Moderate Independent or Union Democrat). Next, you would read about the belief systems of your target audience and write news stories that confirm their expectations. Confirmation bias will ensure that people who share your website’s political slant seek out your confirming information. For that reason, you only write stories that confirm the beliefs of your audience. If you have an online commenting system, you delete posts and ban users who present ideas outside the group norm, because it is important to present only a world that is consistent with your readers’ existing belief system.
The effect on readers is to cement their beliefs, a condition referred to as group-think: group members try to minimize conflict and reach a consensus decision without critically evaluating alternative ideas or viewpoints. This is especially true when they know their comments will be deleted, or that they will be cyber-bullied, if they fall outside the norm. Fear of ostracism ensures that group-think occurs. According to Psychologists for Social Responsibility, there are
“eight symptoms of groupthink:
- Illusion of invulnerability – Creates excessive optimism that encourages taking extreme risks.
- Collective rationalization – Members discount warnings and do not reconsider their assumptions.
- Belief in inherent morality – Members believe in the rightness of their cause and therefore ignore the ethical or moral consequences of their decisions.
- Stereotyped views of out-groups – Negative views of “enemy” make effective responses to conflict seem unnecessary.
- Direct pressure on dissenters – Members are under pressure not to express arguments against any of the group’s views.
- Self-censorship – Doubts and deviations from the perceived group consensus are not expressed.
- Illusion of unanimity – The majority view and judgments are assumed to be unanimous.
- Self-appointed ‘mindguards’ – Members protect the group and the leader from information that is problematic or contradictory to the group’s cohesiveness, view, and/or decisions.”
Now that the political website has its victims where it wants them, it can cash in. In addition to using the previously mentioned tactics, the website has made sure not to offend any potential advertisers or donor groups by not writing any articles that would be offensive to businesses, lobbying groups, advertisers or people with money. Instead, the website has developed a cheery, positive, child-like tone that discourages criticism of any potential advertisers.
The website now begins to sign up certain advertisers or to ask for donations. For example, if you plan to sell beer, you have your writers write stories about how members of your political group love to drink beer, especially brand X (the brand you wish to sell). Under normal circumstances, people might not see a connection between beer and their political beliefs. However, the website has associated the two, and since it has become a source of beliefs for its victims, it might end up selling a lot of beer. The website might even use its tech-support team to plant fake readers who rave about how much they like certain beers, and of course those comments won’t be deleted for being off topic or pandering.
For some readers, these ideas might seem far-fetched, so let’s design a little experiment to test the hypothesis that some websites act in exactly the fashion described above. First, think of the two or three most politically biased websites you know. You would expect those websites to ban more users than other news websites, so let’s test that.
Google phrase and number of hits:
- "Banned From The Huffington Post": 2,180,000
- "Banned from Fox News": 59,200
- "Banned from MSNBC": 51,900
- "Banned from CNN" or "Banned from NBC": 32,000
- "Banned from ABC": 28,800
- "Banned from CBS": 15,500
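For readers who want to play with these figures, the counts quoted above can be tabulated and sorted with a short script. The numbers are simply the ones reported in this article (hit counts from an informal Google search, not a controlled measurement), and the combined CNN/NBC label is an abbreviation for the two phrases searched together.

```python
# Hit counts quoted in the article for each "banned from X" search phrase.
hits = {
    "Banned From The Huffington Post": 2_180_000,
    "Banned from Fox News": 59_200,
    "Banned from MSNBC": 51_900,
    "Banned from CNN / NBC": 32_000,
    "Banned from ABC": 28_800,
    "Banned from CBS": 15_500,
}

total = sum(hits.values())

# Print each phrase with its count and share of all hits, largest first.
for phrase, count in sorted(hits.items(), key=lambda kv: kv[1], reverse=True):
    share = 100 * count / total
    print(f"{phrase:35s} {count:>9,d}  ({share:4.1f}%)")
```

With these numbers, the Huffington Post phrase alone accounts for roughly 92% of all hits, which is the lopsidedness the informal experiment was meant to surface.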
In a subsequent editorial, we’ll explore whether The Huffington Post tends to ban comments in the manner predicted by the theories discussed in this article. As of this writing, the headline story for The Huffington Post is “Obama Begs Donors: Send more $$$”.
Of course, news websites are not the only ones to use these techniques. Politicians use them, businesses recommend that managers use them with their employees, advertisers use them, and so do cults and drug dealers.
by Todd Miller