I will look at any additional evidence to confirm the opinion to which I have already come.
--Lord Molson, British politician (1903-1991)
The Washington Post ran an On Leadership piece on July 18 and 19, 2011, featuring responses to the Atlanta cheating scandal from Dan Ariely, Arne Duncan, Howard Gardner and Steven Pearlstein.
Despite the mounting evidence against high-stakes standardized testing, Arne Duncan and many other advocates of test-and-punish accountability want to stay the course with their "despite cheating scandals, testing and teaching are not at odds" mantra. So powerful is Duncan's need for consonance that his reaction to disconfirming evidence is to criticize, distort, and dismiss cheating as the sole responsibility of the individuals who did the cheating.
In other words, Duncan is saying, "Mistakes were made, but not by me." This mental jockeying is known as confirmation bias, and at the moment Duncan is its poster boy.
False dichotomies make choosing easy. Duncan frames his argument very carefully -- either you are for accountability (with him) or you are against accountability (against him). But in reality, the situation is far from this simple.
In her book Willful Blindness, Margaret Heffernan writes about "the ostrich instruction":
We all recognize the human desire at times to prefer ignorance to knowledge, and to deal with conflict and change by imagining it out of existence... In burying our heads in the sand, we are trying to pretend the threat doesn't exist and that we don't have to change... A preference for the status quo, combined with an aversion to conflict, compels us to turn a blind eye to problems and conflict we just don't want to deal with.

Sometimes it's the leaders with the most power and responsibility who are the most blind, because they believe they know what they are doing -- or feel they have to look like they know what they are doing.
In their book Mistakes Were Made (but Not by Me), Carol Tavris and Elliot Aronson write:
In a study of people who were being monitored by magnetic resonance imaging (MRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored. These mechanisms provide a neurological basis for the observations that once our minds are made up, it is hard to change them.
Indeed, even reading information that goes against your point of view can make you all the more convinced you are right.

In light of this, it's not surprising that when test-and-punish accountability supporters like Arne Duncan are faced with evidence showing that cheating is an inevitable and inherent feature of high-stakes testing, they simply discredit the facts and become even more committed to their own argument. To be fair to Duncan, this behavior is as predictable as it is unfortunate, especially if he believes staying the course is his only option. People become more certain they are right when a decision can't be undone, and nothing is more dangerous than an idea when it's the only one you have.
This reminds me of what Edward de Bono meant when he said:
If you never change your mind, why have one?

This is precisely why we need to listen to people like Bob Schaeffer of FairTest, who says:
The failure of NCLB and its state-level clones cannot be reversed by “staying the course,” “raising the bar” or any of the other faith-based notions frequently invoked by high-stakes testing true-believers.