

Forging Facts to Fit "Reading First" Mandates (Part 2)

Susan Notes:


Gerald Coles shows that the reader must beware of everything, including how a control group is chosen. He shows us that "solid empirical data" can be manipulated right from the get-go.

As I wrote in my previous commentary on "forging facts," the National Reading Panel created a document of scientific misrepresentation using several methods that sculpted (i.e., bent, chiseled, contorted) data. They did this in order to make the data conform to a preconceived image of what instruction works "best." This second part of "Forging Facts" continues the description of the methods used by the Panel.

METHOD FOUR: MAKE THE MEANING OF FACTS ACQUIRE A NEW MEANING

What is "reading"? Throughout the Report the term is used profusely but its definition is ever-changing. Reading in one place might mean reading real or nonsense words and somewhere else might mean reading aloud smoothly and rapidly. Seldom does it mean comprehending text, a process most people would define as key in reading.

The Report's predominant definition of reading is reading at the word level, with a special stress on decoding skills. Nonetheless, all definitions of reading are transformed into the cryptic term "reading." For example, the Report states that an eleven-week training program for poor kindergartners, focusing on phonemic awareness activities and games that included learning to read real words, "transferred to reading," i.e., the training program benefited reading. The Report makes no mention of the inconvenient fact that the measures of "reading" success did not include the reading of sentences or tests of comprehension. "Reading" boiled down to the results of phonemic awareness and word-list tests. Nevertheless, the Report includes this study as one demonstrating that "training boosts children's reading."

Another study supposedly showing that phonemic training boosts reading was done in Sweden with kindergartners. Excluded from the Report are the following facts noted by the original researchers at the conclusion of their study: in a follow-up of the children at the end of first grade, although the students in the phonemic training program did better on phonemic awareness tests, there were no group differences in silent reading, spelling, or reading nonsense words. "Boosts" of this kind are abundant in the studies the Panel reviewed.

Most critically, for these and similar skills-training studies that, according to the Report, showed significant effects on "reading," follow-up testing done one or two years after training did not demonstrate a sustained effect on comprehension. As one researcher who did attempt to achieve such an effect concluded, "as in other studies, this amount of [skills training] did not lead to unique benefits for growth in reading comprehension," an outcome "similar to the findings of most controlled studies of explicit training even with somewhat longer training times."

Not surprisingly, this appraisal of the research facts on comprehension outcomes appears nowhere in the Report's discussion of "reading."

METHOD FIVE: REFASHION NO COMPARISON INTO AN INFORMATIVE COMPARISON

Running through the Report is the "compared with what" problem, which involves two kinds of fact fashioning.

One kind compares various sorts of skills training with no other beginning reading instruction (e.g., children who did only "coloring, cutting and sticking" and received no other instruction were used as a control group) and finds that the group receiving skills instruction obtains better test results, especially when the tests are related to the word skills that were taught. An amazing finding!

A second kind of refashioning occurs when what is really no comparison at all is presented as an informative comparison by using a control group whose reading instruction is never clearly defined. An example is a study in which the experimental group of students had a skills-training program (e.g., segmenting and blending phonemes, phonics exercises, spelling word sounds) added to their regular classroom teaching. At the end of the training period, the Report states, the training group had a significantly superior outcome in decoding words and nonwords compared to "the untreated controls," thereby demonstrating that the program "was highly effective at teaching decoding skills to disabled readers."

However, if we look beyond the Report's skimpy details, we find that the published paper on the study contains lots worth knowing. First, the children were identified as "learning disabled" and in special education classrooms in New York City in 1975. Second, the reading instruction for these youngsters, in the words of the researchers, "might best be described as eclectic," i.e., there was no uniform instruction. Various basal reading programs were used in the classrooms; three teachers used one series and two other teachers used two different series. About 75% of the teachers also used phonics materials. Worksheets too were part of the curriculum.

What we had, in other words, were special education classes using various reading programs that appeared to offer nothing "special" to the students!

No doubt the special education classes were going through the various basal readers at a slower pace, but there was no indication that the classrooms used anything other than the traditional, conventional reading programs used in the regular classes, the very programs that perhaps contributed to the students' reading failure in the first place.

Therefore, we could say that the skills-training program was helpful as an adjunct to lackluster, conventional basal reading programs used in conventional special education classes. The "control" program merely duplicated, at a slower pace, the kind of literacy program that the children used before being placed in special education.

What does this comparison demonstrate with special education youngsters other than that if skills training is added to an unimaginative, routine reading program and compared with the reading program alone, the added skills training will result in better skills outcomes? This is hardly a persuasive demonstration of the reading benefits of a given skills training program or of the need for such training over alternative reading education approaches.

Nevertheless, the National Reading Panel forged these kinds of entirely deficient "compared to what?" studies into what appeared to be solid empirical data supporting skills-heavy beginning reading instruction!

(to be continued with Method Six: Misrepresent Causation)

Amplified and documented versions of these commentaries on forging facts can be found in my book, "Reading the Naked Truth: Literacy, Legislation and Lies" (Heinemann) and in my chapter, "Forging 'Facts' to Fit an Explanation: How to Make Reading Research Support Skills-Emphasis Instruction," in the newly published 2nd edition of "Literacy as Snake Oil: Beyond the Quick Fix," Joanne Larson (ed.), Peter Lang publisher.

— Gerald Coles
The Pulse: Education's Place for Debate
2007-09-25
http://www.districtadministration.com/pulse/commentpost.aspx?news=no&postid=48315



