I think we should, however, pursue this concept just a bit further. Consider the following graphic:
Just about everyone who took an intro course in psychology knows this is the chief image used in the famous Asch conformity experiments. In a nutshell, if you want a significant number of people to say that the line in the left-most box is the same length as line A or B in the right-most box, all you have to do is have four people agree that this is the case. It doesn't matter that C is clearly the line of equal length; a modest amount of social persuasion will make a significant number of people conform.
Asch's study simply placed a single real participant in a group of 5 to 7 other people, all confederates in on the experiment who had been told in advance whether to give right or wrong answers. When the lone non-confederate was asked which line was the same after the confederates had all agreed on the wrong answer, 41% of participants would go along with that wrong answer.
41% would say something just about everyone could see wasn't true.
With a little help from our friend technology, the Asch results can take on a life of their own. Following up on Asch's procedure in 2005, researchers replicated the experiment with a twist, adding an fMRI scanner to the mix. This way, they could see what portions of the brain are at work when conformity and non-conformity occur. "'We like to think that seeing is believing,' said Dr. Gregory Berns, a psychiatrist and neuroscientist at Emory University in Atlanta who led the study."
Ah, is seeing believing? Not when social pressure is applied. Just as in the Asch experiments, the subjects of this study faced confederate participants who gave wrong answers; only the test subject was in the fMRI scanner. What the brain told the researchers should be considered carefully:
As in Dr. Asch's experiments, many of the subjects caved in to group pressure. . . .
The researchers had two hypotheses about what was happening. If social conformity was a result of conscious decision making, they reasoned, they should see changes in areas of the forebrain that deal with monitoring conflicts, planning and other higher-order mental activities.
But if the subjects' social conformity stemmed from changes in perception, there should be changes in posterior brain areas dedicated to vision and spatial perception.
In fact, the researchers found that when people went along with the group on wrong answers, activity increased in the right intraparietal sulcus, an area devoted to spatial awareness, Dr. Berns said.
There was no activity in brain areas that make conscious decisions, the researchers found. But the people who made independent judgments that went against the group showed activation in the right amygdala and right caudate nucleus -- regions associated with emotional salience.
(I added emphasis)
The people who went along with the wrong answers had brains that worked to literally see those answers as right. The non-conformists, those who trusted their own eyes, had to work their brains to resolve the emotional conflict of seeing something other than what the crowd reported.
Seeing is not believing.
Okay, now that we have determined that many of us can be made to see what is obviously not there in a blatant situation, how difficult would it be to sway opinions on topics that are less clear-cut? Say, political opinions and how one should vote in coming elections?
Taking a cue from Dr. Asch, a political strategist named Joseph P. Overton developed a campaign strategy to sway elections based upon the pre-existing opinions of those in the election district.
Before the strategy begins, think of local opinion on the topic as a point, call it 0, on a scale running from Unthinkable through Radical, Acceptable, Sensible, and Popular to Policy. By flooding the media with messages far to the right of that 0, or "Policy," position of the population, Overton found he could shift the window of possibility far enough to the right for moderately-right-leaning opinion to move from "Radical" to "Sensible," bringing the topics he had been hired to get passed into "Policy."
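The mechanics can be sketched as a toy model. To be clear, the positions, window width, and update rule below are my own illustrative assumptions, not Overton's formalism: proposals sit on a one-dimensional scale, labels depend on distance from the window's center, and repeated messaging drags that center toward the message's position.

```python
# Toy model of an Overton-window shift (illustrative only; the
# scale positions and the update rule are assumptions, not Overton's).

LABELS = ["Unthinkable", "Radical", "Acceptable", "Sensible", "Popular", "Policy"]

def label(position, center, half_width=0.5):
    """Map a proposal's distance from the window's center to a label."""
    distance = abs(position - center)
    # The closer a proposal sits to the center, the more acceptable it reads.
    index = max(0, len(LABELS) - 1 - int(distance / half_width))
    return LABELS[index]

def flood(center, message_position, rounds, pull=0.15):
    """Each round of messaging drags the window's center toward the message."""
    for _ in range(rounds):
        center += pull * (message_position - center)
    return center

center = 0.0     # prevailing opinion before the campaign
proposal = 3.0   # a position well to the right of it
print(label(proposal, center))   # "Unthinkable" -- far outside the window
center = flood(center, message_position=5.0, rounds=10)
print(label(proposal, center))   # "Sensible" -- the window has moved
```

Note that the flooding messages sit even further right (at 5.0) than the proposal being sold (at 3.0): the extreme copy exists to make the merely right-leaning proposal look moderate by comparison.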
It's the Asch experiment writ large. Instead of study confederates, the campaign used paid writers, news pundits, and others hired specifically to shift the Window of possibility. All the campaign had to do was hire them to write the copy and pay media outlets to run it.
By flooding the media with right-wing talk of a Radical, sometimes Unthinkable nature, the voters shifted their own views without even knowing why. And it's surprisingly easy to do, it seems. In Episode #108 of the Skepticality podcast, Ray Hyman outlines the chief weapons of the Window Shifters: Source Amnesia and the Repetition Effect:
. . . (T)here is something what's called in psychology "source amnesia." So when people standing in the supermarket line, for example, . . . on the stands right near the checkout . . . the National Enquirer and . . . other rags, with headlines on it that Elizabeth Taylor had sex with a gorilla, or something like that. They say "That's silly." They know.
However . . . this stuff floats around in your mind; but the source [of the outrageous thought], where it came from, is not there anymore. As a result, this (gorilla sex idea) becomes much more plausible. The next time you hear something like that it becomes much more plausible, more believable. And we know this. This is called the Repetition Effect. There have been lots of experiments. It's a very powerful effect.
The way they used to do the original experiments, they give people a lot of statements, and ask them to rate these statements on how plausible they are. . . . (M)ost of these statements are rated very low in plausibility. Next week or the week after they give them another bunch of statements and have them rate them. The next week, another batch. Well . . . each time they were doing this, they repeat some of (the statements), and some of them were not repeated. . . . People don't consciously remember that they saw these before. The more (the statement) was repeated, the more it becomes believable.
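The design Hyman describes can be sketched as a toy simulation. The rating scale, the size of the familiarity bump, and the statement lists below are my own assumptions for illustration, not data from the actual experiments: statements are rated over several weekly sessions, and each prior exposure nudges a statement's plausibility rating upward.

```python
import random

# Toy simulation of the repetition-effect design Hyman describes
# (the scale and bump size are illustrative assumptions, not real data).

random.seed(42)

def rate(statement, times_seen, scale=7):
    """Rate plausibility from 1 to `scale`; prior exposures nudge it up."""
    base = random.uniform(1.0, 3.0)   # implausible statements start low
    bump = 0.8 * times_seen           # each repetition adds familiarity
    return min(scale, base + bump)

seen_counts = {}
sessions = [
    ["gorilla story", "statement B"],   # week 1
    ["gorilla story", "statement C"],   # week 2: one statement repeats
    ["gorilla story", "statement D"],   # week 3
]
for week, batch in enumerate(sessions, start=1):
    for s in batch:
        score = rate(s, seen_counts.get(s, 0))
        seen_counts[s] = seen_counts.get(s, 0) + 1
        print(f"week {week}: {s!r} rated {score:.1f}")
```

Run it and the repeated statement's rating drifts upward week over week while the one-off statements stay low, even though nothing about the statement itself has changed, only its familiarity.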
Just throw a pile of lies into the media fire, repeat as necessary, and the opinions of the populace will eventually come around. It doesn't matter if the reporters in any given news outlet are liberal or conservative, not at all. What matters is what the editors and publishers decide gets published . . . and repeated . . . until the job is done.
The Window shifts some more.
Which brings us back to the world of today. More of our commercial news/infotainment media is conglomerated than ever before. And because it is commercial, it has a vested interest in not offending those advertisers that literally pay the bills.
There are also think tanks out there which purport to be non-partisan and independent. They mostly aren't. And they coordinate, even synchronize, their efforts, giving the illusion that a majority of news sources are arriving at the same conclusion simply by considering the opinions of "others." Uncritical reporters who treat these actors as sources often just run with what they read, failing to consider who is doing the talking . . . and why.
The left-hand side of the political spectrum, by the way, doesn't seem to coordinate like this, not at all. Oh, they try; they just prove ham-fisted in their attempts. That's because the left has, I believe, an inherent suspicion of propaganda. These tactics (like the Overton Window, Source Amnesia and the Repetition Effect) smack of lying, underhanded manipulation. They are, it's true; but on the left there is a delusion that intentions matter. They prefer "education" to propaganda.
(I put that word in scare quotes for a reason: If someone tells you something, it is not necessarily "education." Only teachers can educate pupils. The difference is that the pupil recognizes the teacher as a teacher. Blowhards yapping on street corners or doorbelling neighborhoods like locusts are, at best, "informing" or "persuading." Unless the listener acknowledges the speaker as an expert and accepts that expert as a teacher, there is no education.
(Sorry, pet peeve of mine. /rant.)
**This aversion to coordination has been noted by none other than Josh Trevino, founder of the blog Swords Crossed. In an entry quoted here (I would have quoted the original instead of the DailyKos version, but couldn't find it: Swords Crossed might need to clean up its user interface), Trevino noted this about the Overton strategy:
So there's your tip from the VRWC [Vast Right Wing Conspiracy] for the day. It's a methodology that could work for the left as easily as the right, although I'm not aware of a single left-wing think tank (and they are few) that operates so systemically. If you're of an analytic bent, and want to figure out where a legislative or policy strategy is heading, try constructing the scale of possibilities and the Overton window for the subject at hand. Change can happen by accident, true: but it is just as often the product of deliberation and intent, and it does all of us well to understand the mechanisms by which it occurs.
(I boldly emphasized.)
When even the guys at the right-wing think tanks are unaware of any left-wingers using their strategies, that speaks volumes about the accusation that the left is just as effective as the right, wouldn't you say?**
If that weren't enough, technology is giving vested interests new strategies. For example, think back to the HBGary/Anonymous email hack from last year, the one where the CEO of the company belittled Anonymous and drew their wrath. From some of the leaked emails, we learn that software is currently being used to enable one person to put their opinions into up to 50 individual online personas at once.
From the link:
According to an embedded MS Word document found in one of the HB Gary emails, it involves creating an army of sockpuppets, with sophisticated "persona management" software that allows a small team of only a few people to appear to be many, while keeping the personas from accidentally cross-contaminating each other. Then, to top it off, the team can actually automate some functions so one persona can appear to be an entire Brooks Brothers riot online.
For everyone not living in a cave cowering under a cot, this revelation completely changes the nature of online "reality" itself. It gives the software wielders the power to give commercial opinion the imprimatur of crowd-sourced popular opinion, clearing the way for others in the public sphere to adopt the commercial opinion as simply a popular one. And as we learned from the Asch/Berns studies, when individuals change their opinions based upon their perception of what peers say, the brain conforms perception to the opinion, not the other way around.
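Structurally, "persona management" is not exotic. Here is a minimal sketch of the concept, purely my own illustration and not the leaked software: one operator, many personas, each keeping its own isolated state so that identities never cross-contaminate.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the "persona management" concept -- NOT the
# leaked HBGary software. One operator controls many personas; each
# persona's history is kept separate so they never cross-contaminate.

@dataclass
class Persona:
    handle: str
    backstory: str
    post_history: list = field(default_factory=list)

class PersonaManager:
    def __init__(self):
        self.personas = {}
        self.active = None

    def add(self, handle, backstory):
        self.personas[handle] = Persona(handle, backstory)

    def switch(self, handle):
        """The operator changes identities; state stays per-persona."""
        self.active = self.personas[handle]

    def post(self, message):
        # Each message is logged only under the active persona,
        # so no persona "knows" what the others have said.
        self.active.post_history.append(message)

manager = PersonaManager()
for i in range(3):   # one operator, several apparent people
    manager.add(f"user_{i}", backstory=f"profile #{i}")
manager.switch("user_0")
manager.post("Totally agree with this policy!")
manager.switch("user_1")
manager.post("I came to the same conclusion independently.")
```

The point of the sketch is the isolation: because each persona carries its own consistent backstory and history, a handful of operators can pass as a crowd of independent voices, which is exactly the manufactured consensus the Asch results tell us the brain is primed to accept.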
And with a bit of selective priming, these opinions can eventually become religious opinions. From this article:
The researchers started by asking volunteers who said they believe in God to give their own views on controversial topics, such as abortion and the death penalty. They also asked what the volunteers thought were the views of God, average Americans and public figures such as Bill Gates. Volunteers' own beliefs corresponded most strongly with those they attributed to God.
Next, the team asked another group of volunteers to undertake tasks designed to soften their existing views, such as preparing speeches on the death penalty in which they had to take the opposite view to their own. They found that this led to shifts in the beliefs attributed to God, but not in those attributed to other people. . . .
Even though the team primed the participants with non-religious sources — people, in other words — the participants associated the softened religious views with God's opinion.
Priming led to source amnesia. With enough repetition, and enough time, we can change silly political opinions into dogmatic canon. All we need is a reason.
And profit is always a good one.
Addendum, the next day: The section between the twin asterisks was added after a qualifying amendment suggested by underlankers.