

Special Report: Reading First Under Fire: IG Targets Conflicts of Interest, Limits on Local Control

Susan Notes: Here is research that really counts, showing why Reading First should now be termed ReadingGate. I can testify that DIBELS has taken over the world. I can't tell you how many desperate notes I get from teachers, parents, and grandparents, asking why DIBELS is ruling their lives. Read this parent letter. And then go take a look at the DIBELS tests. A sample is available online from the University of Oregon.

NOTE: This is one of two reports. The second follows this one.


By Andrew Brownstein and Travis Hicks

By January of 2003, Kentucky reading officials were frustrated. They had just been denied federal Reading First funds for the third time, and state leaders worried that they might lose the opportunity to bring in an unprecedented $90 million for reading instruction in grades K-3 over six years. Like most states strapped by budget cuts, they could not afford to lose that money.

Months before, consultants to the federal program had strongly suggested to state officials that Kentucky's choice of assessment was a major sticking point in their pursuit of the grant. According to the officials, consultants pushed them to drop the assessment they were using, Pearson's Developmental Reading Assessment (DRA), and choose the Dynamic Indicators of Basic Early Literacy Skills (DIBELS), which was quickly becoming the most widely used test under Reading First. But there was a problem: One of the consultants on the four-member team had a second job - as a trainer for DIBELS.

"I can assure you, he did not identify himself as such," said Felicia Cumings Smith, who resigned as that state's Reading First director in August to take a job at the University of Kentucky.

Seeking to retain DRA, the state turned to Chris Doherty, director of the national Reading First program for the U.S. Department of Education (ED). No luck. Toward the end of their telephone conversation, according to six participants, Doherty reinforced the consultant's position.

"The consultants said explicitly that unless we dropped the DRA and used DIBELS, we would never get funded," said Star Lewis, Kentucky's associate commissioner of teaching and learning. "We heard the same thing from Chris. The message was loud and clear." Three state officials and one retired official who participated in the conference call, in addition to Cumings Smith, attested to the conversation.

In an interview, Doherty vehemently denied he made the comments. "It absolutely, positively did not happen," he said. "Never would I suggest that a state use a brand-name product or they wouldn't get their funding."

"Amazing," Cumings Smith said when informed of his response.

It might be left as an irresolvable he said/she said scenario were it not for the fact that many state and district leaders say the episode fits a pattern. It encapsulates what many critics say have been features of Reading First since its inception: over-prescriptiveness, a lack of transparency, and conflicts of interest between consultants to the program and commercial interests.
Little in Writing

"I've had superintendents tell me stories about what the feds have asked them to do under Reading First, and I always tell them 'Get it in writing,'" said Richard Long, a lobbyist for the International Reading Association. "They try, and then they don't hear anything. I mean, are we talking urban legends here? It's that lack of transparency that is hurting the credibility of the department."

Critics of Reading First were once dismissed as ideological holdovers from the "Reading Wars" that marked much of the '80s and '90s. Now, the department is facing an unprecedented broadside from one-time supporters of the program. ED's Office of Inspector General (IG) is investigating wide-ranging complaints from three organizations that the $6 billion program has drifted away from its philosophical footing in scientifically-based research and has instead pushed states to adopt commercial programs touted by interested consultants. The organizations also allege that the department has micromanaged state decision-making on Reading First in a manner that violates Section 9526 of No Child Left Behind (NCLB), which prohibits the department from mandating state or local curriculum.

In June, Sen. Richard Lugar, R-Ind., expressed "considerable concern" in a letter to ED Secretary Margaret Spellings that "one or more officials contracted to work for the Department of Education may be working to further their own interests."

ED officials and the program's supporters say that the department has retained the best researchers as consultants, and in the field of reading, that typically means people with ties to commercial programs. "In my view, the program has been conducted just about as honorably as it possibly could have been conducted," said Joseph Torgeson, director of the Eastern Regional Reading First Technical Assistance Center (ERRFTAC) at Florida State University. "It is true that there are a relatively limited number of people around the country who would have some visibility in this area, and who strongly and passionately believe in the policy elements in instruction and assessment that Reading First is trying to institute. It is also true that those folks are very accomplished professionals who have been engaged in a whole variety of activities," including commercial products.

In a July response to Lugar, Deputy ED Secretary Ray Simon sidestepped the issue of conflicts of interest, but insisted that the department's commitment to scientifically-based reading principles was "clear and unequivocal."

The complainants in the IG investigation can only be described as strange bedfellows. They are separated by sharp differences in geography, degree of influence and approach to reading instruction.
Far-Reaching Investigation

The first was Cindy Cupp. A former state reading director in Georgia, Cupp, 56, operates "Dr. Cupp's Readers and Journal Writers" out of her home in Savannah with her sister Ginger. The program, which is administered in approximately 100 elementary schools in Georgia, made $200,000 in net profits last year - pocket change by the standards of the major publishers. Her complaints, first lodged with the state education department in March, were forwarded to the national IG.

Baltimore-based Success for All (SFA), by contrast, is an international non-profit organization with an annual budget of $50 million that operates in 46 states and five foreign countries. Co-founder Robert Slavin, 55, is recognized as one of the leading practitioners of scientifically-based reading research, with more than $30 million in federal grants to his credit. Congress recognized the success of Success for All by naming it as an exemplar of schoolwide instruction in the 1997 legislation that introduced the Comprehensive School Reform demonstration project.

Slavin filed his complaint in May. In August, he and Cupp were joined by Reading Recovery, a popular reading program based in Worthington, Ohio, with even greater reach than SFA. It is supported by powerful senators like Edward Kennedy, D-Mass., and Susan Collins, R-Maine. In its complaint, Reading Recovery alleged that ED "has supported a quiet, yet pervasive misinformation campaign against Reading Recovery, despite a large body of research demonstrating Reading Recovery's effectiveness and long-term results."

Members of the IG's investigative team met with SFA representatives for the first time in a three-hour interview in late August. The IG team said they foresaw a wide-ranging investigation that could take up to a year to complete. They told SFA representatives that they plan on visiting Georgia and New York City, where in 2002 school officials dropped their existing phonics curriculum in favor of a largely untested program designed by Voyager, Inc. of Dallas, Texas (see story). The auditors also promised to focus on the state application process and the role played by consultants contracted by ED from the RMC Research Corporation of Portsmouth, N.H., and technical advisers at regional centers in Texas and Oregon.
Strict by Design

Of course, Reading First was never meant to function like a typical government program. Advocates have long argued that "entitlement" programs like Title I failed to improve reading scores because of a lack of quality control on how the money was spent. The Reading Excellence Act, the Clinton-era program that predated Reading First, was the first piece of federal legislation to codify scientifically-based reading research. But many observers said its spotty monitoring failed to hold states accountable for results.

"Reading First needed to be this prescriptive," said Janice Dole, Utah's Reading First director and a member of the expert panel that reviewed state applications. "The Reading Excellence Act was all over the place. If you don't narrow down the choices, what you have is a little bit of everything."

If states have made progress because of Reading First, supporters say, it is likely due to the targeting of funds to specific programs backed by solid evidence. Anecdotally, states report that the program's emphasis on professional development and the presence of coaches in Reading First schools has helped guide instruction and boost student scores. In Michigan, for example, several Reading First schools have outperformed their non-program counterparts. This year, Massachusetts reported that 34 of 35 districts showed "substantial gains" in reading scores.

The typical state reading director lauds the program while expressing some frustration with aspects of its implementation. Gail Schauer, assistant director of Title I and Reading First in North Dakota, describes the program as "wonderful" and says it's responsible for some "really big, positive changes" in her state. But she described aspects of the process as "extremely frustrating." For example, despite assurances from the department that there was no "approved list" of programs, Schauer learned over the course of writing four applications that certain programs had to be in the application in order to get funded, and that funding would not come through unless other programs were removed.

"Even though there was no approved list of assessments or core programs, you don't get approved unless you have certain assessments or core programs," she said. "There must have been a list somewhere."
"The Dibelization of America"

No one doubts that the standard for approving funds was strict. But was it strict for the right reasons, and in the right way? A two-month investigation by the Monitor suggests the answer is, "Not always." Based on interviews with more than 70 educators and researchers involved with the program, and an analysis of hundreds of pages of state and federal documents, the investigation raises questions about whether the department adequately policed the program for potential conflicts of interest and enacted reading policy with sufficient transparency.

The issue is complex. Take, for example, the case of DIBELS.

A recent report from the Center on Education Policy (CEP) found that states were "remarkably consistent" in the instruments they selected to assess student progress in reading. In fact, CEP's review determined that 37 states required use of DIBELS as part or all of school district assessments, and that five additional states included it on a list of assessments from which districts could choose (see table).

Developed with funding from ED's Office of Special Education Programs, DIBELS is a test designed by University of Oregon researchers to measure student reading development. Specifically, it evaluates student performance in phonological awareness, alphabetic understanding, and fluency with connected text. According to the DIBELS Web site, the tests are "designed to be short (one minute) fluency measures used to regularly monitor the development of pre-reading and early reading skills." The instrument is used to predict how well students will read by the end of the third grade. Teachers can score the test themselves and enter the information into a database at the University of Oregon that tracks student progress.

A member of the National Reading Panel, whose 2000 report helped lay the philosophical foundations for Reading First, suggested that DIBELS is widely used because "it fits the law so well that it is difficult to justify non-DIBELS solutions."

"If you require tests of specific elements of reading, then the best thing to have are diagnostic tests rather than generalized tests that combine everything," said Timothy Shanahan, director of the Center for Literacy at the University of Illinois-Chicago. "If you need a test that has high reliability and validity statistics, then tests that have bothered to do psychometric tests are best in that context. And if you require that a test be given and re-given multiple times, than a test with multiple forms is superior to those with single forms."

DIBELS' insistence on frequent testing is a source of annoyance for many teachers, who charge that the need for ongoing assessment crowds out time needed for instruction. In some circles, it has earned the derisive nickname "dribbles." "It's an assessment tsunami," said a Reading First consultant in Colorado. "It's the dibelization of America. Everything is being dibbled."

Charlotte Postlewaite, a Kentucky schoolteacher, tracked the early implementation of Reading First in 2003. As part of her efforts, she interviewed the author of DIBELS, Roland Good, an associate professor of school psychology at the University of Oregon. In an e-mail Postlewaite shared with the Monitor, Good explained that "the level of interest" in the test was on the rise: 2,020 schools were actively using DIBELS data across 32 states and Canada. He predicted that those numbers would triple by the spring of 2004.

"All in all, DIBELS is becoming very widely used," he said.

Something about the exchange irked Postlewaite. "He told me business was booming," she said. "Well, it ought to have been, because states told me that if they didn't put DIBELS in their applications, they didn't get funded. I thought it was a bit disingenuous that he pretended not to know why use of the test was growing by leaps and bounds."

In an interview, Good said he would like to think DIBELS is widely used because it is "reliable, valid, efficient and has utility for teachers making educational decisions." Good contends that research supports this assertion: a Web address he sent to the Monitor links to over 30 technical reports that support DIBELS - roughly two-thirds authored, at least in part, by Good himself.
The Assessment Team

It's the kind of self-supporting review that has some critics up in arms. They argue that Good and his Oregon colleagues have cornered the testing market through a federal review of assessments for Reading First. The assessment panel, they say, was stacked in favor of DIBELS: Four of the panel's eight members taught at the University of Oregon. (See chart for state involvement with leading consultants.)

Led by Edward Kame'enui, former co-director of the Western Regional Reading First Technical Assistance Center (WRRFTAC), the committee evaluated the "technical adequacy" of 29 K-3 reading assessments between September and December 2001. The federally created panel developed its own criteria for evaluation and a process for applying the criteria, as well as a means for identifying and selecting the assessments to be judged. Good served on the team, as did two other authors of five tests that were analyzed.

Of the 29 tools reviewed, the team found 24 assessments to have "sufficient evidence" for use as one or more of the required measures under Reading First - including all six of the instruments designed by committee members Lynn Fuchs, Good and Torgeson of ERRFTAC. The committee recognized that it would only be able to review a "modest sample" of available assessments, but said the goal was to establish criteria for review as much as to recommend a "sample of trustworthy measures."

Not everyone considered the process to be fair, however, and some argue that a lack of transparency in the selection of committee members allowed DIBELS to thrive while other assessments received little interest or weren't favored by peer reviewers.

For instance, a group of education professors at the University of Virginia asserts that the Phonological Awareness Literacy Screening (PALS) test - an instrument they designed for Virginia's statewide reading program - was deliberately not evaluated because it might have competed with DIBELS.

PALS consists of two screening instruments - one for kindergartners and one for children in grades 1-3 - that measure a variety of skills related to reading, including phonological awareness. The major purpose of PALS is to identify those students who are learning below grade-level expectations in these areas and may be in need of additional reading instruction, according to its designers. PALS can also be used as a diagnostic tool, providing teachers with specific information regarding what students know to help guide individual instruction. And like DIBELS, PALS offers an Internet database that provides a variety of interpretive reports.

Upon request, Virginia education officials sent a copy of PALS technical manuals and assessment materials to Doug Carnine, then director of the National Center to Improve the Tools of Educators (NCITE) at the University of Oregon, sometime between 2000 and 2001. But PALS was not among the instruments reviewed by the assessment team. Marcia Invernizzi, the author of PALS and a professor at the University of Virginia, said she called Kame'enui to ask why PALS wasn't reviewed and Kame'enui informed her that he had never seen it. According to Invernizzi, Kame'enui promised that if she re-sent the material, it would be reviewed. She sent her material a second time, but the review never happened.

In a letter to Sen. George Allen, R-Va., who as Virginia's governor initiated the state's reading initiative and funded the development of PALS, Invernizzi complained that "overlooking PALS was politically motivated since we had submitted our materials to be reviewed well before the review. The University of Oregon's own assessment was thoroughly reviewed, as was the assessment in the state of Texas, and those developed by many of the large publishing corporations."

Carnine confirmed that he had asked Virginia officials for PALS materials, but said the purpose of his request was not a review by the assessment committee; instead, he was simply "interested in assessments."

Department officials declined to let reporters interview Kame'enui, who is serving through 2007 as the commissioner of the National Center for Special Education Research, about possible conflicts of interest surrounding his role in the Reading First program. In addition to heading the Reading First assessment team, Kame'enui co-authored a widely used "Consumer's Guide" to help states and school districts select programs under Reading First and, from his office at the University of Oregon, co-directed WRRFTAC. Yet, despite his prominence in Reading First until recently, ED Spokeswoman Susan Aspey said, "Commissioner Kame'enui is not involved in any work involving Reading First, and hence there are no conflicts." Carnine, who tried to contact Kame'enui at the behest of the Monitor, said his colleague was unable to talk to reporters due to the ongoing IG investigation.

Invernizzi, meanwhile, said she doesn't question the validity of DIBELS, which she called a "terrific research tool," but rather points to what appears to be "an agenda to get [all states] to use DIBELS."
Stacked?

Torgeson, director of ERRFTAC and a member of the panel, said he understood why some believe the committee was "stacked." But he justified the make-up of the panel as necessary if it was to reach conclusions in a short time-frame. "I recognize [the panel] as a bunch of like-minded individuals," he said. "That's why you have a group of people that wouldn't represent the point of view that an informal reading inventory is just as good as any other reading inventory. That would just slow down the process, and you'd be dead in the water when trying to implement a very well-defined set of policies and principles."

Carnine said time constraints created an "inflexible deadline" that limited the number of reviews, although PALS was one of several assessments listed in the final report as an instrument to be reviewed later. But no further reviews were conducted.

Some critics charge that DIBELS' widespread use in the Reading First program has kept other, potentially better assessments from getting noticed.

Michael Pressley, a researcher at Michigan State University, said a report that he is working on concludes that the Qualitative Reading Inventory (QRI) - developed at Marquette University in Milwaukee, Wisc. - is a better predictor than DIBELS of whether students will pass a demanding reading test. He said his research, which will be available in a prepublication technical report in the next few months, documents that while students are reading fast on DIBELS, they are doing so "with very low comprehension."

Laurie Borkon, government affairs manager for Renaissance Learning of Wisconsin Rapids, Wisc., expressed frustration with the limitations of the findings of the assessment panel. Work on Renaissance's STAR Early Literacy assessment wasn't completed until September 2001, so timing was certainly a factor in it not being reviewed. But without an ongoing mechanism to review tests, she said, relying on the Oregon list encourages states to stick with older assessments and discourages innovation.

"I find it very discouraging that there are early literacy assessments out there that are not only valid and reliable, but are also more usable than traditional pencil-and-paper assessments, and schools aren't being allowed to use them," she said. "If you're not on the Oregon list, what do you do? On paper, states and districts could evaluate any assessment themselves using the Oregon rubric. However, it has been reported to us that in practice this isn't the case."

For instance, state officials in North Dakota who wanted to use STAR found themselves in a Catch-22. Like many small states, North Dakota lacked the manpower and funds to finance their own scientifically-based reliability and validity assessment review - a requirement for states that chose to use tests not on the Oregon list. Sandi Jacobs, senior education specialist in the federal Reading First office, told them to seek a review from the western technical center headed by Kame'enui. But they were told that the center had other projects and that evaluating STAR was not a high priority, according to Schauer, North Dakota's Reading First director.

"We were at a standstill," she said. "It was kind of frustrating."
Under Pressure in Nevada

Other states seeking to utilize non-DIBELS assessments found the going difficult as well. Initial comments from the expert panels that reviewed both Connecticut and Nevada Reading First plans alleged a lack of valid evidence to support the use of PALS. Moreover, reviewers appeared to be more intimately familiar with DIBELS than any other instrument.

For example, Connecticut's team noted in its review of the state's second application that "there appears to be some confusion regarding the use of [DIBELS] measures, as indicated in the new materials in the chart on page 96. The chart incorrectly lists all DIBELS subtests as fluency measures. The word fluency, in the case of the DIBELS instruments, is used to indicate mastery of each of several sets of skills to the point of automatic application, in contrast to oral reading fluency, one of the essential components of reading instruction." An analysis of other expert comments found that no other assessment instrument elicited such in-depth knowledge.

Nevada, meanwhile, rebelled against the prevailing winds and elected not to adopt DIBELS as part of its Reading First program.

Joan Taylor, who wrote Nevada's grant, sought assistance from technical advisors from RMC Research Corporation, which contracted with ED as a technical assistance provider. RMC assigned two consultants: Jerry Silbert, an independent consultant affiliated with the University of Oregon; and Joe Dimino, a research associate with the Instructional Research Group in Long Beach, Calif.

During a telephone discussion of assessments, Taylor claims Silbert "strongly" recommended DIBELS, saying it was more reliable than PALS. Taylor informed Silbert that the department's university partners - housed at both the University of Nevada-Reno and the University of Nevada-Las Vegas - had reservations about DIBELS, and she asked Silbert to put his request in writing. At that point, Taylor asserts that Silbert lambasted her and asked her what she knew about reading; she responded that she had a Ph.D. in literacy.

"In terms of DIBELS, I was definitely pressured," she said.

Intrigued and somewhat put-off by the conversation, Taylor asked her 19-year-old son to search the Internet for information on Silbert; he found that Silbert was associated with the University of Oregon, which Taylor recognized as the home of DIBELS. Silbert had worked with Carnine at NCITE, but said he knew little of DIBELS prior to his work with RMC, and certainly had no link to the test. Dimino, on the other hand, is listed on the DIBELS Web page as a trainer.

Silbert said he can't remember exactly what was said, but didn't recollect specifically pushing DIBELS. "I can't say what the exact words were, but normally when I dealt with states I just told them what the various options were," he said.

Dimino, incidentally, was the same consultant and DIBELS trainer who drew the ire of state officials in Kentucky. His behavior there sparked a spirited letter from Gene Wilhoit, Kentucky's commissioner of education.

"Kentucky's team members have expressed to me their understanding that it became quite clear that [choosing DIBELS] was an expectation in order to funded," Wilhoit said in the Jan. 2003 letter. "We subsequently learned that one of our technical assistance members, Joe Dimino, is, in fact, a trainer for DIBELS, which we believe raises serious issues concerning conflicts of interest. Furthermore, the developer of DIBELS, Roland Good, is a member of the Reading First Assessment Committee."

The letter was answered by then-ED undersecretary Eugene Hickok. Curiously, Hickok addressed the complaint by noting that "members of the expert review panel have been carefully screened for real and perceived conflicts of interest. No panelist is assigned to review an application that identifies commercial products ... in which he or she has a financial interest." What makes Hickok's statement curious is that Wilhoit did not mention conflicts of interest involving the "expert review panel" that read state applications (see sidebar); Wilhoit's letter referred to conflicts among consultants recommended by the department.
Legal Questions

Nearly all ED contracts contain standard language requiring contractors to safeguard against "organizational conflicts of interest" that could create an unfair competitive advantage, according to Sheara Krvaric, staff attorney with the education law firm Brustein & Manasevit, which is representing SFA. (In the interest of full disclosure, it should be noted that members of the firm often write for the Monitor and its sister publications.)

Federal regulations define an organizational conflict of interest as occurring when "other activities or relationships" make a person "unable or potentially unable to render impartial assistance or advice to the Government" or create circumstances in which "the person's objectivity in performing the contract work is or might be otherwise impaired, or a person has an unfair competitive advantage." The goal is to prevent contractors and subcontractors from providing technical assistance for products in which they have a direct interest.

Everett Barnes, executive director of RMC, said all technical assistance consultants contracted by the company were vetted by ED for conflicts of interest. Asked to describe the process, however, Reading First's Doherty declined, citing the ongoing IG investigation.

Barnes said that to the best of his knowledge, neither Silbert nor Dimino - or anyone else on the RMC team - had a financial interest in DIBELS. "The conflict of interest part of that, I guess, is hard to interpret," Barnes said. "Usually if somebody's done that, it's related to some personal benefit that you might get."

Asked whether being paid to train others in DIBELS would constitute a financial interest, Barnes said that no state would be forced to use Dimino for training, and in fact, could receive training free at one of the regional centers. Dimino could not be reached for comment.

DIBELS trainers certainly receive compensation for their services. And the compensation is considerable: In Washington state, an individual was paid $8,400 to conduct DIBELS training for two days in September 2004, according to documents obtained by the Monitor. Whether Good or the University of Oregon stood to receive financial benefits by overseeing the national evaluation of assessments is harder to determine.

A copy of the assessment is available for free on the Web, and the University of Oregon charges $1 per student to electronically track student progress - all of which goes back to the University of Oregon, according to the DIBELS Web site. Moreover, Good said in an e-mail that all royalties from publication of DIBELS are invested in research and professional development on early literacy, and research and development on the test itself.

DIBELS is also available from Sopris West of Longmont, Colo., in published form and from Wireless Generation of New York City in electronic form. The purchased versions come prepackaged and save school officials the job of downloading, photocopying and organizing. "Many schools would rather write a check and receive the materials good to go," Good explained.

DIBELS' royalties also stem from the increasing production of reading products with tie-ins to the ubiquitous test. Voyager, which employed Good and Kame'enui as part of its design team, developed its own DIBELS Data System for use in Voyager Universal Literacy. Shanahan, too, said he is working on a DIBELS measurement tool to be used with a basal reader he is developing for Macmillan/McGraw Hill.

"Everyone is doing it," he said.
"Step Up to the Plate"

DIBELS wasn't the only topic about which states felt pressure from Reading First consultants.

Through the Freedom of Information Act, Georgia's Cupp unearthed a letter from RMC, in which the organization apologized for a comment made by two consultants who said the state should have the "courage" to "step up to the plate" and write a "short list" of approved Reading First programs. Georgia officials took umbrage at the suggestion - although it was made in an e-mail between the consultants and sent to Georgia's education department by accident - since nowhere in Reading First is this required.

RMC's Barnes characterized the exchange between technical advisors Stuart Greenberg and Jerry Silbert as "inappropriate" language and advice. "It showed a definite lack of professional discretion, in my opinion," he said in an interview.

Inadvertent or not, it may illustrate the priorities of the technical team. The episode raises the question of whether other states received similar advice, even though such actions are not required by law.

Greenberg agreed that the e-mail was "totally incorrect." He said he did not believe that similar situations had occurred elsewhere, adding that states were under no obligation to heed the consultants' advice. "Our job was to help them think through how to meet the criteria. None of the thoughts that we gave to states or shared with them were binding in any way," he said.

Difficulties arose for Cupp early in Georgia's Reading First process. Five schools had contacted her about the possibility of using her readers as a core program, but Cupp had heard rumblings that it wouldn't be approved by the state education department.

Ken Proctor, Georgia's Reading First director, told her in an e-mail that all "schools will have to wait until [ERRFTAC ] reviews the materials before they spend [Reading First] dollars on them." In a later communication, Proctor told Cupp it is "a requirement of the federal government." Officials at the national Reading First office apparently agreed with Proctor's decision: An e-mail from Proctor to staff members said he had "spoke with Sandi Jacobs [of ED's Reading First office] and she feels we are very much in line by having the material reviewed."

But not all programs were subject to such requirements. The Richmond County school district was allowed to adopt a Voyager program designed by Good, Kame'enui and other leading researchers because, Cupp argues, it had been endorsed by the University of Oregon.

"Here they're making me jump through hoops, but this program hasn't been evaluated either," she said in an interview. An e-mail from Julie Morrill, education program specialist at the Georgia department, to Proctor appears to confirm this: "We included Voyager and it wasn't evaluated by Oregon," she wrote.

Moreover, there is nothing in federal law or the Georgia grant that requires programs to be evaluated by ERRFTAC, and in May of 2004, Proctor admitted he was wrong and discontinued the requirement. Cupp lamented the bureaucratic obstacles that seem particularly to plague small companies like hers.

It is a lament that resonates with several leading reading researchers. Given the expense of doing research - not to mention marketing - on core programs, the big publishers definitely have a leg up on smaller reading companies.

"I think I probably have a greater sympathy for the woman in Georgia than Success for All," said Shanahan, the NRP member. "Can a small company compete for Reading First funds? I think the answer is largely, 'No.' That shouldn't be the case."
SFA and Reading Research

Noting that SFA was written into the legislation of Comprehensive School Reform, Shanahan said: "They benefited from that. They took off and became a big deal, as much because of their research as because it was written into the law ... My thought is, 'Wait a minute, you guys got millions of dollars from this, and now you're saying the Education Department is turning you away?'"

In response, Slavin said the debate shouldn't be about who benefits from Reading First, but why some programs are benefiting and others are not. He noted with irony that just as SFA was filing its complaint with the IG due to the practices of one part of the department, Grover "Russ" Whitehurst, director of ED's Institute of Education Sciences, praised as a "sophisticated study" a look at 38 schools demonstrating that students in SFA outperformed their counterparts in regular classrooms. Whitehurst, who is spearheading the department's push for scientifically-based research, told Education Week that the randomized field trials employed by the study used "everything the evaluation field has come to recognize as high-quality."

Slavin said he fears that the cause of reading research will be set back if Reading First stays on its present course. "Publishers now, and in the future, will feel like they don't have to adhere to these principles," he said. "All they'll have to do is get some of the key people to be authors and use their political influence, and they won't have to worry about scientifically-based reading research."

In addition to favoring certain programs, subjective scientific research criteria have been employed to diminish the status of programs that are not favored under Reading First, according to state officials in Kentucky. Last October, a monitoring team noted as an "area of concern" that two districts used Reading Recovery as an intervention program with Reading First funds. In an accompanying letter, Doherty, the director of Reading First, did not refer to Reading Recovery by name, but stated that "two of the monitored districts were implementing reading programs that did not appear to be aligned with scientifically-based reading research."

In interviews with the Monitor, and in a June 24 letter to Doherty, state officials noted that during a conference call, Doherty "stated that the Reading Recovery intervention program is not scientifically based."

"We told him that unless he put that in writing, we would not discontinue using Reading Recovery with Reading First funds," said Lewis, the associate commissioner. "He said that if we were so convinced of the effectiveness of the program, we should mount a scientific defense and present it to the department."

In June, state officials supplied three scientific studies defending the research behind Reading Recovery. As of late August, they had not heard back from the department.
Local Control

Why would Doherty name Reading Recovery in a phone call, but not in writing? Some observers claim the department is cautious because it does not want to appear to cross the threshold of Section 9526 of NCLB, which prohibits ED from mandating state or local curriculum.

But a similar episode in Massachusetts raises further questions about the department's respect for local control. In February of 2004, reading officials in three school districts received a troubling phone call from Cheryl Liebling, the new state Reading First director: Doherty, she told them, had reviewed a list of core reading programs used by Massachusetts districts that had already won their awards; nonetheless, he found that the programs in those three districts were not sufficiently scientifically based, and informed Liebling that unless they adopted more rigorous programs, the districts would lose their funding. She would be the one to carry the message to school districts.

Reading officials in two of the districts recounted the story on the condition that their names or the names of their districts not be used. In an interview, Liebling confirmed the incident. "We got a call from Washington saying that these three programs were not sufficiently scientifically based, and could we talk to the districts and ask them to reconsider," she said. She confirmed that the call came from Doherty.

Two of the districts had used "leveled readers," following the model of the Literacy Collaborative, a method affiliated with Reading Recovery that neither Doherty nor Liebling considered to be a core program at all. But the other district used the Wright Literacy Group, a McGraw-Hill product that had been reviewed by the University of Oregon and identified as a core program - although it did not score well.

Ultimately, the two districts that had been using the leveled readers switched to the popular Houghton Mifflin basal series. These districts retained their Reading First funding. But the group that used Wright Literacy saw the program as effective and refused to change. That district lost its Reading First funds at the end of the 2004 school year.

Liebling instructed the districts involved to speak with Doherty directly. At least one district adviser left both voicemail and e-mail messages, but never heard back.

Asked whether she felt Washington overstepped its bounds, Liebling, a former consultant for RMC, said, "That's a concern of mine as well. But, you know, once you buy into a program, you're kind of in it."

In an interview, Doherty at first strongly suggested that state monitors had steered Liebling to ask districts to change their programs. He also suggested that Liebling might have acted on her own, without prompting from Washington. "The states don't act at our behest," he said.

At another point, ED spokesman Chad Colby dismissed the story as "hearsay." But informed that Liebling had confirmed the episode, Reading First's Jacobs merely said, "We don't agree with that characterization." She declined to elaborate.

It could not be determined if ED regularly back-reviews district applications or if Massachusetts was a unique case.

Liebling, who had just started her job at the time the incident occurred, appeared uncomfortable with what had transpired. "I understand why I was asked to do that," she said. "Did I disagree with the request? No. Had the request not been made, would I have made it on my own? I don't know."

The Monitor encourages readers to comment on this special report. If you have thoughts or questions related to any of the articles, contact editor Andrew Brownstein at abrownstein@thompson.com. As an additional service for readers, the Monitor is making available a current list of state monitoring reports and peer reviews of state applications for Reading First. Just click on this link: http://www.titleionline.com/libraries/titleionline/news_desk/tio050826.html

— Andrew Brownstein and Travis Hicks
Thompson Title1 Monitor
8-05
http://www.titleionline.com/libraries/titleionline/news_desk/tio050826.html



