Did Data-Driven Accountability Cause the Atlanta Cheating Scandal?
Ohanian Comment: The pressure for higher test scores has intimidated adults to the point that some of them have broken under it. So who are we going to blame for that? As Thompson rightly points out, Atlanta Superintendent Beverly Hall was certainly not alone in proclaiming a culture of "No exceptions. No excuses," and "data-driven instruction." Under Secretary Duncan, this is the mantra of the land, and we can expect more, not less, cheating, and more adults collapsing under the pressure.
John Thompson was an award-winning historian, lobbyist, and guerilla-gardener who became an award-winning inner city teacher after crack and gangs hit his neighborhood. He blogs at http://www.thisweekineducation.com and is writing a book on 18 years of idealistic politics in the classroom and realistic politics outside.
by John Thompson
The official report of cheating in the Atlanta Public Schools is bound to prompt a debate over whether the scandal is just an extreme version of the corruption prompted by NCLB and data-driven accountability. The report shows that some of Atlanta's behavior was qualitatively worse than the legal but dubious practices that have become common in urban districts.
Much of the behavior prompted by their "culture of fear," however, is indistinguishable from the abuses lauded elsewhere as a "culture of accountability." Atlanta Superintendent Beverly Hall was not alone in proclaiming a culture of "No exceptions. No excuses," and "data-driven instruction."
In order to motivate principals, Hall employed multi-colored charts displaying data gains that she should have known were questionable. Given the extreme turnover of principals across the nation, and the press glorifying superintendents who fire school leaders, Hall's record of replacing 90% of principals over two decades was exceptional only in degree.
The humiliation and scapegoating of educators is pervasive today, but the way it was ritualized in Atlanta seems unique. Every year at a convocation at the Georgia Dome, faculty of schools that met their targets were given seats of honor "on the floor," while teachers from low-performing schools were relegated to the back of the dome. A principal forced a teacher who did not raise scores enough to crawl under a table.
The extent of Atlanta's cover-up also seems extreme. Other districts, like Washington D.C., have instead sought to contain their cheating scandals by looking the other way. Atlanta may be unique, then, only because the story had legs, forcing the district to keep up its culture of denial over a much longer period of time.
The main findings of the state investigation sound eerily familiar to educators who were forced to meet unobtainable NCLB targets that called for 100% proficiency by 2014. A primary factor that led to cheating, the report concluded, was unreasonable growth targets. As with NCLB, the goals became increasingly unattainable because of their cumulative effect over the years. And as with NCLB, the system compared "apples with oranges," mandating growth for one class based on the scores of a different group of students.
Atlanta should be a warning to the Duncan Administration, which has pressured systems to use test score growth for the evaluation of educators. In Atlanta, 25% of principals' evaluations were based on test score growth, and principals were allowed only three years to raise scores. In other words, Atlanta reaped the harvest of the "reforms" in New York City and Washington D.C., which give principals three years or less to meet targets or be fired. Now, both cities face the decision of whether to honestly investigate the extent of their dishonesty.
The investigation described the same dynamic that Susan Headden, of the Education Sector, explained as a problem with the D.C. IMPACT evaluation system. The Atlanta investigation notes that the cumulative effect of cheating made it more difficult for honest educators to meet their growth targets. Similarly, Headden wrote,
Cheating [in D.C.], of course, distorts the playing field; the teacher who fudges the numbers on students' tests is judged against the teacher who doesn't ... The teacher who gets the same students the following year is also hurt, because she is starting from an inflated baseline.
Perhaps the best example of corruption was "question number seven," because it resembles the abuses that are common precisely because they are less clear-cut than pointing out the right answer or erasing wrong answers. Teachers were given a "tip" sheet warning that question seven, an essay about a rule that students considered unfair, was similar to a question that would appear on the test. Sure enough, students were required to write about a law that they thought was unfair. In one sense, the "question number seven" abuse was worse than practices openly promoted in most districts, because it required a person to improperly peek at a test booklet. On the other hand, many systems are like D.C. and announce an annual spring test-prep season. Was Atlanta's "tip" more unethical than the common practice of drilling students on recently released test items before the exam? D.C., for instance, proudly takes advantage of the "granular detail available to teachers" to prep for test questions.

Fundamentally, the Atlanta scandal is the logical and predictable result of data-driven reform. As Campbell's Law predicts, the more a quantitative indicator is used for high-stakes decisions, the more it invites corruption; accountability regimes that are unfair and impose unreasonable requirements make that invitation explicit. Atlanta is a legacy of the fiction that rising test scores reflect real increases in learning. Above all, it is a result of the situational ethics of today's accountability hawks. The end, helping poor children, is used to justify disgusting means: the intimidation of adults to the point where some break under the pressure. When data-driven accountability is used to intimidate adults, the poison flows down to the kids. This week's report on cheating will not be the last we hear of the unintended effects of the nation's bubble-in test craze.
National Education Policy Center (NEPC) blog