

NCLB Outrages

ED Ignored Early Warnings on Reading First Conflicts, Report Says

Ohanian Comment: Rarely do we see this kind of investigative reporting directed at education. It is the shame of most media that they just don't care enough, recycling press releases as "reporting."

Read this and watch the rats trying to get off the sinking Reading First ship. The sad thing is they will be successful.

Brownstein and Hicks' companion piece is linked in this article.


Officials Obscured Origins of Influential Assessment Review

By Andrew Brownstein and Travis Hicks

Washington, October 30, 2006 — Almost from the inception of Reading First, critics raised red flags about the potential of the initiative to blur the lines between commerce and policy.

Critics and supporters alike agree that the $6 billion program is unusually strict and prescriptive. With a historic expenditure of new money on reading — and requirements for qualified peer review, technical support, instruction and professional development — it created a sudden and urgent need for expertise in the fairly young field of scientifically based reading research (SBRR).

But in the world of reading, expertise of that sort comes with a price: Just about any researcher of note has links to at least one commercial program — sometimes many more. And the pool of people with sophisticated knowledge of SBRR is quite small.

The issue is important because an apparent failure to erect a strong-enough firewall between advice and salesmanship is a key factor that put Reading First on a collision course with the U.S. Department of Education’s Office of Inspector General (OIG).

In an investigative report released in September — the first of six on Reading First — the OIG documented several instances during the early implementation of the program in which warnings about potential conflicts were ignored and safeguards weren’t adequately implemented or enforced.

Enormous Influence

The most significant of these episodes — and one that is frequently overlooked — was the creation of a federal panel that reviewed testing instruments as examples to guide states on the kinds of assessments that could be used for K-3 readers.

The eight-member assessment committee, which reviewed the “technical adequacy” of 29 K-3 reading assessments between September and December 2001, wielded enormous influence on the later implementation of Reading First. The Education Department (ED) links directly to the committee’s report from its Reading First Web page, as do many state education agencies. In addition, many states cited the panel’s work in their applications for funding.

Of the 29 tools reviewed, the team found 24 assessments to have “sufficient evidence” to qualify for use as one or more of the measures states might employ to meet the program’s testing mandates. The approved assessments included all seven of the instruments designed by or linked to committee members.

The report sparked an early outcry from critics who claimed the panel was stacked in favor of the Dynamic Indicators of Basic Early Literacy Skills (DIBELS), which has become the most widely used assessment tool under Reading First. Some say DIBELS was selected because it is uniquely tailored to the needs of the program. Others note that the assessment was created at the University of Oregon and that four Oregon professors — including DIBELS creator Roland Good and panel leader Edward Kame’enui (see Sept. 2005 Title I Monitor, Reading First Under Fire) — sat on the eight-member assessment panel. Panelists did not review their own instruments, however, and publishers were given the opportunity to identify errors in the panel’s analysis.

In an e-mail to assessment committee members, Kame’enui acknowledged the potential for trouble, according to the OIG. He explained that he decided to describe the review committee’s report as a “singular effort” on his part because several members “are authors or co-authors of the assessment instruments” the panel reviewed. “The perception of a conflict of interest in shaping the final report was a concern,” he wrote.

The review was commissioned by ED and the National Institute for Literacy (NIFL), an independent government agency. But Sandra Baxter, the institute’s director, told investigators that NIFL never green-lighted publication of the report, out of concern that “releasing the report might appear as though NIFL was endorsing specific products.”

Behind the Scenes

The fact that the report exists — and has been widely disseminated — is partly the result of an elaborate behind-the-scenes effort to publish an unauthorized version of the report while concealing the source of its funding from the public, according to the OIG. Lacking the institute’s approval, Reading First officials decided to publish the report anyway. As a result, NIFL was denied a chance to review the report it had commissioned, as well as an opportunity to fully address concerns about product endorsement.

A researcher with close ties to the panel, who spoke on condition of anonymity, acknowledged that “the release of the report did not follow appropriate procedures.” The motive was “speeding up the release of accurate information” in the early days of Reading First, when states and districts were clamoring for guidance on how to apply for funds. But “the means (not securing NIFL sign off) do not justify the ends,” the researcher acknowledged in an e-mail.

In the Reading First legislation, Congress identified a role for NIFL to “disseminate information on scientifically based reading research.” However, according to the OIG, the decision to leak the assessment review demonstrated a scheme “to disregard Congress’ direction and intent.”

Everett Barnes, executive director of the RMC Research Corporation of Portsmouth, N.H., told the Monitor that NIFL and ED collaborated on the work. RMC subcontracted funding from an earlier NIFL grant to the Oregon review committee, but in practice, Barnes said, “RMC had no role in defining the work, selecting the panelists, or providing direct oversight.”

The audit highlights e-mail exchanges between Kame’enui, University of Oregon Professor Doug Carnine, and former Reading First Director Chris Doherty that illustrate a campaign by federal reading officials to obscure NIFL’s funding role and publish the report without going through the proper channels.

A decision was made in 2002 to publish the final report on the University of Oregon’s Web site without NIFL’s blessing. Doherty, in an e-mail, said that waiting for NIFL’s okay “will slow us down,” according to investigators. After receiving a copy of the report, Baxter told Doherty that, “Clearly, [the final report] is not in shape for publication as submitted. I’ll need at least a few days to read the entire report and determine just how to streamline it into a document that is useful to our intended audience.” Doherty forwarded the e-mail to his boss, former ED Assistant Secretary Susan Neuman, who responded that NIFL was dragging its feet. “At this rate, it may never get up on the [NIFL] website,” she wrote.

But the report got out to the field anyway via the University of Oregon’s Web site. The OIG quoted Carnine offering instructions to Kame’enui on how to handle questions about who funded the report: “Say you received funding from RMC. If they ask who funded RMC, say the Dept of Ed. You should not need to answer such questions in writing.”

Kame’enui asked Doherty whether to tell publishers asking for details that “the report is completed but under review or do I simply say it is not ready for public release?” According to the report, Doherty advised him to “go with...‘not ready’ for the reason that the ‘under review’ option could lead to the non-trivial question, ‘Under review by whom?’ which then gets into the next issue of, ‘is this a NIFL doc, or an ED doc?’”

State Confusion

Confusion persisted, and some states operated under the impression that the final report bore ED’s seal of approval. For instance, North Dakota’s guidance to local education agencies for Reading First assessments said the state had “adopted the ‘approved list’ of assessments found on the University of Oregon’s” Web site. Similarly, Oklahoma included in its approved application references to tests that were “[f]avorably approved by the USDE Reading First Assessment Committee.”

Part of the confusion may be attributed to mixed signals from the department and the final report itself. In a Sept. 10, 2002, letter to a publisher’s organization, then-ED Secretary Rod Paige said “that the findings of this study do not in any way represent an ‘approved list’ of assessments for use under the Reading First program. Rather, this analysis examines the technical qualities of a group of K-3 assessments, not their suitability for use in connection with Reading First.”

But during the same time period, Reading First officials were actively sowing confusion about the report’s origins as they rushed to get it published and out to the field. Moreover, an executive summary released on the University of Oregon’s Web site opens with a description of the Reading First legislation and its requirements for assessments.

The OIG noted that the report that finally appeared on the University of Oregon’s Web site — the link has since been deactivated — “does not mention any connection to the department or NIFL,” the organizations that funded the review. Lynn Reddy, NIFL’s deputy director, told auditors that the document should not have been posted.

The OIG stated that the final report never appeared on NIFL’s Web site. However, weeks after the OIG released its report, Bob Sweet, who helped write the Reading First legislation as a senior staff member for the U.S. House of Representatives, sent reporters for the Monitor a live link to the review on the institute’s Web site, http://www.nifl.gov. Minutes after the reporters informed NIFL of the discrepancy, the link went dead.

My Linh Nguyen, a NIFL spokeswoman, said the existence of the report on the agency’s Web site resulted from a misunderstanding. “We have internal written communication from Institute senior staff to our Web site staff stating that the report is on hold. We believed that the communication represented clear guidance not to post the report on our Web site. We had no reason to believe that the guidance was not followed,” she said.

Doherty resigned shortly after the OIG released its findings. He, Carnine and Neuman declined requests to discuss the assessment review episode. The department would not make Kame’enui, now commissioner of ED’s National Center for Special Education Research, available for comment. NIFL officials would not address specific questions about the OIG’s findings.

No Further Review

The assessment review panel remains controversial for other reasons. Time constraints limited the number of assessments the panel could review. Several assessments were listed in the report as instruments to be reviewed later, but no further reviews were conducted. States, then, were left to choose from the options the panel identified or to fund their own scientifically based reviews — a task most lacked the money and technical expertise to handle on their own.

In separate interviews, two members of the assessment review panel acknowledged the limited scope of the review but stood behind the integrity of the panel.

“In our first meeting, I remember personally bringing up [the conflict of interest issue] and saying that we needed to address these issues and make sure that people were not involved in reviewing their own tests,” said David Francis, chairman of the psychology department at the University of Houston. “I had no desire to participate in something [suggesting that] ‘These are approved tests and the rest of these are not approved tests,’ or ‘If you’re not on the list, you’re not approved or useful.’”

Added Joseph Torgeson, director of the Eastern Regional Reading First Technical Assistance Center at Florida State University: “The intention always was to make this an ongoing process ... but somehow that idea got diverted.”

Torgeson said he was surprised to learn that NIFL never approved the panel’s report for publication. Francis told the Monitor that he was not aware that NIFL played a role in funding the panel at all; he thought the RMC Research Corporation had contracted the review. Neither researcher was involved in the development or publication of the final report.

‘Individual Mistakes’

In response to the OIG’s findings, ED Secretary Margaret Spellings said the department would review state applications, guidance and conflict-of-interest protocols to ensure they were done appropriately. The department also has sought comments from state directors and will send a memorandum to all ED program managers “reminding them of the importance of impartiality in the performance of their duties.”

“I acknowledge that some of the actions taken by Department officials as described by the draft report reflect individual mistakes,” said Spellings. “Thus, I am disappointed by what I have read about some aspects of the early implementation of the Reading First program.”

Rep. George Miller (D-Calif.), ranking minority member of the House education committee, asked the U.S. Attorney General’s Office to determine whether the actions of department officials or others working with the agency violated federal corruption or conflict of interest laws.

Miller also has requested that the committee conduct hearings on the findings. Lindsey Mask, a spokeswoman for the committee, said the panel will explore Reading First as part of a slate of hearings during the next session linked to the reauthorization of No Child Left Behind (NCLB).

Peer Reviews

The assessment panel was not the only instance in the early implementation of the program in which questions about conflicts of interest came up. Department officials also debated the issue in connection with its selection of peer reviewers to evaluate state applications for funds.

NCLB does not require peer reviewers to be screened for conflicts of interest. Nonetheless, the report noted, the screening process ED voluntarily adopted for Reading First was “not effective.” Department officials decided to exclude from the conflict-of-interest form a sample question, submitted by an ED ethics attorney, that referred to “any other circumstances that might cause someone to question your impartiality.” Potential panelists provided ED with resumes, but according to the OIG, the resumes were not reviewed. Investigators surveyed the resumes of the 25 approved panelists and identified six that revealed “significant professional connections to a teaching methodology that requires use of a specific reading program” (see Former Reading First Director Draws Fire — and Defenders).

Some say such conflicts reflect the nature of the field, in which the pool of qualified experts on SBRR is limited. Kurt Engelmann, president of the National Association for Direct Instruction in Eugene, Ore., said of the panelists: “They had to come from somewhere. I don’t think there’s a large conflict of interest.”

Department officials complained about the OIG’s finding, arguing that investigators seemed to be suggesting that “we could have been better off just by meeting the minimal requirements of the law” and not instituting any kind of screening policy. ED officials said there was “no available information” that “any problematic behavior” ensued as a result of the selection of the six panelists with professional connections. The OIG, however, noted that due to the faulty screening process, “the Department would not have known of the potential conflict” those panelists represented.

‘Bias’ Against Reading Recovery

A few days before the department announced its list of expert reviewers, one of the panelists phoned Doherty and shared what he termed his “strong bias” against Reading Recovery — a widely used intervention that targets individual students — and his strategy for responding to any state that used the program in its application. According to investigators, Doherty responded: “I really like the way you’re viewing/approaching this, and not just because it matches my own approach, I swear!” This panelist later served as the head of the subpanel that reviewed Wisconsin’s state application, in which he included an 11-page negative review of Reading Recovery in his official comments responding to the state’s plans to use the program.

Reading Recovery came in for particular venom from Reading First’s leadership, as did “whole language” instruction and programs associated with it, such as Wright Group Literacy, once a popular reading instruction vehicle.

According to the report, Doherty instructed one staff member regarding the Wright Group: “Beat the [expletive deleted] out of them in a way that will stand up to any level of legal and [whole language] apologist scrutiny. Hit them over and over with definitive evidence that they are not SBRR [based on scientifically based reading research], never have been and never will be. They are trying to crash our party and we need to beat the [expletive deleted] out of them in front of all the other would-be party crashers who are standing on the front lawn waiting to see how we welcome these dirtbags.”

National Reading Panel

Defenders of Doherty say they see little evidence that he did anything more than aggressively — perhaps inartfully — enforce the law. Reading First rested on the scientific groundwork of a National Research Council (NRC) report on beginning reading and the National Reading Panel, whose work was intended to put an end to the “Reading Wars” that marked much of the ’80s and ’90s. Drawing on more than 100 studies showing that reading does not come automatically to children, the panel helped ground the discussion of reading in science. Phonics — the notion that students learn how to read by first sounding out letters — had been minimized in many classrooms. Members of the panel said reading achievement would not improve if teachers could introduce phonics how and when they pleased; phonics, as well as instruction in reading fluency, vocabulary, and comprehension, needed to be taught systematically, directly and explicitly.

Sweet and Reid Lyon, who had been President Bush’s unofficial “reading czar” at the National Institute of Child Health and Human Development, drafted the Reading First legislation with the panel’s findings in mind. Having lost an early legislative battle to require states and districts to use only programs with “actual evidence of effectiveness in repeated, randomized trials” — only two programs met the criteria — the pair drafted language requiring that programs be based on the elements of scientifically based reading research (see “Definition of Scientifically Based Reading Research”). The problem, Lyon said in an interview, is that without the evidence base behind them, programs could claim they had SBRR without actually implementing the required elements. “And then it merely becomes a beauty contest,” he said.

That is the confusing and sometimes hostile environment Doherty found himself in during the early days of the program. “The states are not without blame for lack of appropriate implementation,” Sweet said. “One state wanted to use Reading First funds for parking lots, another to buy only library books, and another wanted the money even though it had earlier lambasted the goals of the program. One very large state tried to pass legislation that would have avoided compliance altogether.”

“Chris was gamed day in and day out,” added Lyon. “People tried to sell him a bag of goods every day. That’s when he’d call me and say, ‘Do you have any evidence on this?’”

In Massachusetts, Doherty intervened after the state approved districts using Wright and another out-of-favor program, Rigby Literacy, the report said. A district that continued to use those programs lost its Reading First funding, according to the report. The OIG said that these examples and others, several of which were first reported in the Sept. 2005 Title I Monitor, ("Reading First Under Fire") may have violated federal prohibitions on circumventing local control and the endorsement of curriculum.

Essential Components of Reading Instruction

(Sec. 1208(3), Elementary and Secondary Education Act)

(3) Essential components of reading instruction — The term 'essential components of reading instruction' means explicit and systematic instruction in —

(A) phonemic awareness;
(B) phonics;
(C) vocabulary development;
(D) reading fluency, including oral reading skills; and
(E) reading comprehension strategies.

SBRR v. Local Control

But supporters of Doherty say he was just doing his job: making sure that states and districts adopted programs based on SBRR. Sweet stressed his belief that Reading First’s SBRR requirement trumped older portions of education law that placed limits on local control. The conflict in the law is the subject of considerable debate. Leigh Manasevit, an education attorney and a member of the Monitor’s editorial board, acknowledged that the local control statute is a “moving target” that has been substantially eroded in recent years by the sweeping federal mandates of NCLB.

If federal officials had the right to enforce SBRR at the local level, however, they often didn’t appear eager to explain themselves in a transparent fashion. In an e-mail to Lyon cited by investigators, Doherty bragged about getting Maine, Mississippi and New Jersey to drop the Wright and Rigby programs. “I think this kind of program-bashing is best done off or under the major radar screens,” he said.

In another instance cited by the OIG and first reported by the Monitor last year, Doherty told Kentucky officials that he had concerns about Reading Recovery and urged them not to use Reading First funds for the program. State officials asked him to provide the request in writing, but he declined and invited them to defend the program instead. The officials subsequently provided scientific evidence for the program, but never heard back from the department.

“It is clear in this respect that there was this animosity toward Reading Recovery, and a well-organized campaign” to undermine it, said Jady Johnson, executive director of the Reading Recovery Council of North America. “At this point in time, we have no assurance they’re going to change.” The Reading Recovery Council is one of three groups that asked the OIG to investigate the program.

State Response

ED has said that it intends to implement all the OIG’s recommendations (see OIG Recommendations). It has also solicited concerns from states about the initial review of Reading First grants in 2002 and 2003. Thus far, only a handful of states have indicated they are exploring the option.

“We know that there are certainly states considering the opportunity offered in the letter and that at least some will respond,” said Scott Palmer, an education attorney and consultant to the Council of Chief State School Officers.

Wisconsin officials, for example, are composing a letter expressing their concerns, although a spokesman for the state department of public instruction could not offer specifics. (The OIG released another audit on Oct. 20, which found that Wisconsin did not always ensure that school districts receiving Reading First funds were implementing high-quality programs.)

Interestingly, Kentucky — the lone state to formally complain to the department about conflicts of interest during the grant process — is not leaning toward sharing its concerns with ED. The state may, however, request access to peer review comments that it didn’t see originally. “It’s not [Reading First] that’s the problem ... it’s the process that we went through to get approved that was the problem for us,” said Lisa Gross, a spokeswoman for the Kentucky education department.

The parties that requested the investigation, meanwhile, are weighing their options as the OIG prepares to complete its work. Johnson said that Reading Recovery officials would ask ED to stress to states that their program is eligible for Reading First funds and that “early intervention” — a key component of Reading Recovery — is written into the statute.

Bob Slavin, chairman of the Success for All Foundation (SFA), another group that requested the investigation, said he recently had “a very positive” meeting with Joe Conaty, the new director of Reading First. Slavin said he asked Conaty to consider allowing about 100 schools that he believes were pressured to drop SFA to be given a chance to re-adopt the program.

“It’s a symbolic gesture to rectify the most egregious of the program’s effects,” Slavin said.

Definition of Scientifically Based Reading Research

(Sec. 1208(6), Elementary and Secondary Education Act)

(6) Scientifically based reading research. — The term ‘scientifically based reading research’ means research that —

(A) applies rigorous, systematic, and objective procedures to obtain valid knowledge relevant to reading development, reading instruction, and reading difficulties; and

(B) includes research that —

(i) employs systematic, empirical methods that draw on observation or experiment;

(ii) involves rigorous data analyses that are adequate to test the stated hypotheses and justify the general conclusions drawn;

(iii) relies on measurements or observational methods that provide valid data across evaluators and observers and across multiple measurements and observations; and

(iv) has been accepted by a peer-reviewed journal or approved by a panel of independent experts through a comparably rigorous, objective, and scientific review.

— Andrew Brownstein and Travis Hicks
Title 1 Online
2006-10-30
http://www.thompson.com/libraries/titleionline/news_desk/tio061030a.html



