

NCLB Outrages

How Test Companies Fail Your Kids

This solid piece of reporting appeared in December 2006 and is more timely than ever.

by David Glovin and David Evans

Jerry Lee Faine Elementary School in Dothan, Alabama, starts
each day with two hours of reading and vocabulary. After that,
there's arithmetic. "If you can read, you can do anything," says Principal
Deloris Potter, a spry woman of 59 who has run the school
since 2002.

Potter, trusting the work of her teachers, was confident of passing
grades in April 2005 as students began two weeks of mandatory
standardized testing in reading and math. That July, state education
officials told Potter her school had failed the Alabama Reading and
Mathematics Test. The state warned it might fire teachers if scores
didn't improve, she says. A dozen students transferred after the substandard
rating. Faculty morale plunged. "We felt like dogs," says
Charlotte Adams, a reading specialist at the school.
In February 2006, the state said Jerry Lee Faine Elementary had
passed. Harcourt Assessment Inc., a unit of London-based Reed
Elsevier Plc and one of the world's largest test companies, had improperly
graded the exam. The snafu is at least the 30th time since
2000 that San Antonio, Texas-based Harcourt Assessment, which
also wrote the exam, has made errors such as improper scoring,
faulty instructions and questions with more than one answer.
Harcourt isnât alone. Other companies are constructing flawed
tests, administering them improperly and scoring them incorrectly,
according to lawsuits and education department records in 15 states.
In March, Pearson Assessments, a unit of London-based Pearson Plc,
the world's biggest educational publisher, had to explain to high
schoolers across the U.S. that it had erred in scoring about 5,000 SAT
college entrance exams because its scanners couldnât read answer
sheets that had expanded from humidity. The next month, education
officials in Minnesota discovered a separate issue with answer sheets
that Pearson Assessments had created for a state-mandated exam. At
least 500,000 people taking tests from 2000 through '06, from Nevada
third graders to aspiring teachers in many states, were victims
of test company mistakes, documents show.
"The errors we've seen from testing companies are probably just
the tip of the iceberg," says David Berliner, 68, Regents' Professor of
Education at Arizona State University in Tempe, who has written
more than 200 articles, books and book chapters about education and
served as president of the 25,000-member American Educational
Research Association. "State education departments
often lack the ability to adequately supervise these companies."
The U.S. is in a testing frenzy. Students in the 92,816 American
public schools will take at least 45 million standardized
reading and math exams this year. That will jump to 56 million
in the 2007-08 school year, when states begin testing science
as part of the 2002 federal No Child Left Behind law, the
most comprehensive education overhaul in half a century. Beyond
No Child, tens of millions of additional tests assess college
hopefuls, certify future stockbrokers and even evaluate
preschoolers. With the stakes for making the grade so high for
so many, errors by test companies have dramatic consequences.
Joseph Conigliaro lost his Pennsylvania teaching job after
Princeton, New Jersey-based Educational Testing Service, the
world's biggest standardized test company, incorrectly scored
three of his licensing exams. ETS, which will pay $11.1 million
to 4,100 teachers who were falsely failed, called the error an
"anomaly." Ryan Beck & Co. asked Linda Cutler to resign from
a senior associate job at the securities firm after she and 1,881
other test takers were scored incorrectly last year on the Series
7 licensing exam for securities representatives. (See "How
NASD Flunked a Pro," page 134.)
"It's an exponentially growing catastrophe," says James
Popham, an emeritus professor of education at University of
California, Los Angeles, and author of 25 books on education.
"No one knows how bad it is, and it's going to get worse."
Deputy U.S. Education Secretary Raymond Simon says
states must better oversee test companies. "The whole teaching
system is based on the results of those tests," Simon, 61, says. "If
the integrity of the testing process is called into question, that
brings into question the whole accountability system."
The national obsession with performance and measurement
means a booming business for test-producing and
grading companies. In 2005, CTB/McGraw-Hill, Educational
Testing Service, Harcourt Assessment, Pearson Assessments
and smaller firms generated $2.8 billion in revenue from testing
and test preparation, according to Boston-based research
firm Eduventures LLC. No Child tests alone produced about
$500 million in annual revenue in 2005-06.
Along with creating exams, Harcourt Assessment, Pearson
Assessments and companies such as White Plains, New
York-based Haights Cross Communications Inc. sell mass-produced
workbooks, practice tests and computer software that teachers
use year-round to prepare students for No Child and other
tests. The burgeoning test preparation industry generated $1.7
billion in annual revenue last year. The $1.1 billion testing
market and the $1.7 billion test preparation business will grow
by a combined 30 percent by the 2009-10 school year, Eduventures
predicts.
For test companies, pitching schools to buy preparation
materials after receiving a No Child contract is routine,
says Robert Schaeffer, public education director at the
National Center for Fair & Open Testing, a Cambridge,
Massachusetts-based nonprofit group. "It's standard business practice,
the equivalent of razor companies' giving away razors so
they can make money selling blades," he says. "It's where the
real profits are."
Profit margins in test preparation are as much as seven times
higher than they are for No Child tests, partly because there are
no requirements for high-quality questions on practice exams.
States leave it to schools and school districts to decide whether
the test preparation materials they're buying are sound. Haights
Cross, publisher of the Buckle Down test preparation workbooks,
reported operating margins of 21 percent in its test preparation
division for the first half of 2006.
In comparison, No Child tests, which must be custom designed
for almost every state, have pretax profit margins as
low as 3 percent, says Kurt Landgraf, chief executive officer
of Educational Testing Service. He says his not-for-profit
company lost $2.6 million on a $236 million four-year No
Child contract in California.
Richard Rizzo, chief financial officer of Measured Progress
Inc., a Dover, New Hampshire-based nonprofit firm that produces
No Child tests, says he expects to earn margins triple
those of No Child exams by selling practice questions and tests
that schools use to gear up for the actual exams. Getting a foot
in the door with a No Child contract can also lead to sales of
achievement or psychological tests not related to No Child.
"Companies could conceivably low-ball the customized test because
they know they could go in and sell the off-the-shelf products
with a 40-50 percent margin," says Rizzo, 62, referring to
tests that arenât specially designed for individual states.
Whether or not they low-ball, companies often scrimp
when they bid on No Child contracts, Eduventures analyst
Tim Wiley says. Getting a contract involves the same process
as selling supplies or cafeteria food to a school: A company
submits what it expects to be a winning package. "As with any
bidding situation, it definitely requires a lot of cost cutting,"
Wiley says. "Or, in some cases, cutting corners."
In Florida, CTB/McGraw-Hill won part of the state's testing
contract for 3,800 schools in 2005. To grade the essay portion,
the Monterey, California-based unit of McGraw-Hill
Cos. hired $10-an-hour workers from Kelly Services Inc., the
second-largest U.S. provider of temporary employees, and
other companies. Among the 2,947 graders was a person who
won the job while he was employed packing bags of potato
chips for PepsiCo Inc.'s Frito-Lay unit, applications compiled
by the Florida state senate show. Kelly spokeswoman Renee
Walker declined to comment.
Another grader was a cook in an Orlando, Florida, diner.
One essay evaluator wrote he was "layed off" from a clerical job
after working as a janitor. He graduated from Ambassador University,
a Worldwide Church of God-run school in Big Sandy,
Texas, in 1997. The school shut down that same year. Another
said that he majored in "Phylosophy/Humanity" at Mount
Angel Seminary in St. Benedict, Oregon.
Steven Weiss, vice president for communications at McGraw-Hill, said in an e-mailed statement that the company
had performed extremely well in scoring more than 90 million
documents with a total of more than 755 million essay and
short-answer questions during the past five years.
CTB/McGraw-Hill, Harcourt Assessment and Pearson Assessments
don't break down their revenue from No Child tests
and preparation materials in regulatory filings. Public records
from the Wyoming department of education show the state is
paying Harcourt, which has a $21 million, four-year No Child
contract, more than $120 per student each year.
Of that, about
half is for No Child tests, and the rest is for preparation materials
and other testing products.
Schools in Okaloosa County, Florida, pay $9.50 per student
for a series of preparation tests called Stanford
Learning First, which Harcourt Assessment renamed
Learnia. By comparison, Harcourt received $4.93 per child
from the state of Florida in 2005 to develop questions for its No
Child-mandated annual Comprehensive Assessment Test.
Harcourt Assessment's experience shows how winning a
No Child bid can be a prelude to more sales. In 2004, Harcourt
got a four-year, $44.5 million contract to develop and
score Illinoisâs No Child exams. Chicago schools then began
purchasing Harcourt materials, testing director Xavier Botana
says. The preparation products included Stanford Learning
First practice tests that measured student progress as they
prepared for No Child exams. In the 2005â06 school year, the
district spent $1.8 million on Harcourt's new Stanford
Learning First product.
Christine Rowland, a former teacher of English as a second
language who now trains colleagues at Christopher Columbus
High School in the Bronx, New York, says her pupils
didn't learn more because of increased testing. Still, she relied
on test preparation materials to help students pass the
math test. The cost of failure was too high, she says. "If I
know they are going to test six things six weeks from now,
that's what I'm going to teach," Rowland, 46, says. "It puts a
tremendous amount of pressure on. The real fear is that it
turns students off from learning."
Test companies, aware that Rowland and other teachers
are being judged by how students do on No Child exams, are
inundating schools with ads for preparation products such as
practice tests, software and banks of sample questions. Often
they say their materials are designed specifically to help students
pass the state's No Child test. "I'm getting mail from
companies I've never heard of," says Susan Friedwald, head
of teacher training at Public School 48 in the Bronx.
At Cracker Trail Elementary School in Sebring, Florida,
11-year-old Alexis Szoka took dozens of practice
exams last year leading up to the Florida Comprehensive
Assessment Test. She wound up a nervous wreck. "My
daughter has such test anxiety, she can't take a test anymore,"
says Alexis's mother, Carol Szoka.
One exam measured whether Alexis understood vocabulary
and another checked her spelling. The school tested
how well she read and whether she knew math. Some tests
compared her reading and math skills with those of other
fourth graders. Alexis was evaluated on phonics, writing
and her understanding of text on a computer. Most tests
were given two, three or four times a year. Teachers gave
chapter tests in reading and math and benchmark tests
throughout the year to see whether Alexis was progressing.
Andrew Lethbridge, Cracker Trail's vice principal, says
one test gave fourth graders practice in filling in answer sheet
bubbles on other tests. The materials came from divisions of
Harcourt Assessment, Pearson Assessments and smaller, privately
held companies.
"It was never like this," says Carol Szoka, who has two grown
children who went through the same schools in Sebring, which
is 85 miles (137 kilometers) south of Orlando. "They had an
achievement test. They just took it. They weren't prepped."
Richard Demeri, Cracker Trail's principal, says test preparation
materials have helped his students. Seven years ago, the
school was given a grade of C by the state. Now, with test scores
higher, the school has an A from the state and is no longer on
probation. "There's very little spray-and-pray teaching going
on, where you spray everybody and pray they get it," he says.
The school uses test results to analyze each student's progress.
"It's much more individualized now," he says.
Even if Demeriâs students are prepared to take No Child
tests, two Florida state senators question whether
CTB/McGraw-Hill has qualified people to grade them.
Senators Walter "Skip" Campbell and Leslie "Les" Miller Jr.
sued the state education department and CTB/McGraw-
Hill earlier this year to obtain applications of test graders.
The department had refused to release the applications, citing
confidentiality. CTB/McGraw-Hill settled the suit by providing
copies of the scorers' personnel files with personal
identifying information removed.
CTB/McGraw-Hillâs $82 million, three-year Florida
contract requires a scorer to have a bachelor's degree in
mathematics, reading, science, education or a related field.
On its Web site, the Florida Department of Education assures
parents that graders of the Florida Comprehensive Assessment
Test are professional, trained scorers.
Information the senate obtained shows one grader had an
associate's degree, which is below a bachelor's, from the
University of Delhi's School of Correspondence Courses and
Continuing Education in Delhi, India. She worked as a $7.50-an-hour cashier at a duty-free shop at O'Hare International
Airport in Chicago before being hired to grade exams, according
to the settlement documents. CTB/McGraw-Hill now says
this person never scored exams. A personal trainer with a
degree in sports science from the University of Leipzig in
Germany also graded essays, as did a convicted shoplifter who
graduated from West Virginia University with a degree in
physical education, the applications show. A person from
Hungary wrote he was a "pyshical education" major. A physical
education major from Methodist College in Fayetteville,
North Carolina, wrote that she had attended "Methidist College."
McGraw-Hill's Weiss said its scorers from the University
of Delhi met the requirements for a bachelor's degree. "Individuals
must undergo a comprehensive training process before
becoming qualified to score," Weiss wrote. "Scorers must
maintain performance quality throughout the process."
CTB/McGraw-Hill spokeswoman Kelley Carpenter says
the company subjects scorers to a rigorous three- to five-day
training program. Next year, at Florida's request, the
company will ensure that scorers have appropriate backgrounds
for the subjects they grade, she says. "They are constantly
monitored," she says. "And if they don't match the
quality performance standards, they're not retained as scorers."
Carpenter says spelling errors on an application don't disqualify
someone from being hired as a scorer. "Spelling in and
of itself is not a requirement," she says.
When Deputy Education Secretary Simon is shown misspellings
on applications of Florida scorers, he says he would
demand excellence. "It's absolutely important that the integrity
of the scorers is something the companies would be
proud of and feel comfortable with," he says. "I can't imagine
they would feel comfortable with a nonspeller."
Cornelia Orr, head of the Florida Office of Assessment
and Performance, says she reviewed about 25 percent of the
grader applications. "I felt like CTB had minimally met our expectations,"
she says. "I know there are ways they can improve."
One reason for the testing foul-ups and their dire effects is
that there's no federal oversight of the testing industry. When
the U.S. Congress authorized the No Child law it didn't create
an agency to evaluate whether the companies making and selling
the exams do an adequate job. Each state oversees its own
test contractor.
Roderick Paige, who ran the No Child program as U.S. education
secretary from 2001 to '04, says the law is a good one.
He says his concern is that testing may not be done accurately
and competently. Paige, 73, says he summoned top executives
from 20 testing companies to a conference room at the U.S.
Department of Education on Feb. 20, 2003, and demanded
better performance. In 2005, the Education Departmentâs inspector
general announced plans to study whether there's a
need for federal review to detect and prevent errors. The study
isn't yet under way, spokeswoman Catherine Grant says.
"We've got to get better testing producers," says Paige,
who's now chairman of Chartwell Education Group LLC, a
Washington-based school consulting company. "They're
making mistakes."
Harcourt Assessment is making the most errors, according
to records in 15 state education departments. In
addition to erroneously failing Jerry Lee Faine Elementary,
Harcourt wrongly flunked three other Alabama schools
because of its grading snafu. It mistakenly passed 10 Alabama
schools that should have failed, the state said.
In Connecticut, Harcourt Assessment reported the wrong
reading test scores for 355 high school students in 51 districts
last year. The state fined the company $80,000. In Georgia,
Hawaii, Illinois, Massachusetts and Virginia, Harcourt made
errors on No Child tests and achievement tests given to measure
how students compared with one another. States fined the
company hundreds of thousands of dollars.
"Employees took shortcuts," Harcourt Assessment Senior Vice President Robin Gunn wrote in a May 28, 2004, letter to
Hawaii school principals, promising stricter oversight. Gunn
has since left the company. Hawaii hired a not-for-profit firm,
Washington-based American Institutes for Research, to
develop and score the tests after discovering more errors on
Harcourtâs 2005 exams. Illinois also replaced Harcourt in the
middle of its contract; Connecticut, Massachusetts and Virginia
didn't renew their contracts with the company.
Nevada fired Harcourt in 2004, after the company mistakenly
failed hundreds of students, gave inflated scores to
thousands of others and produced tests with missing pages,
misspellings and flawed instructions, according to Nevada
Education Department records.
"It was errors, one after the other, and not to a single student
but to a large number," says Karlene Lee, the assistant
superintendent in Clark County, Nevada, which includes Las
Vegas. "In education, we don't have the luxury to say that 2 percent doesn't matter. Every child has to be accurate."
Nevada fined Harcourt Assessment $425,000 in 2002,
before firing the company. Harcourt's approximately
$290 million in revenue last year was 3 percent
of Reed Elsevier's sales, according to company filings. Reed
Elsevier reported its profit increased 62 percent in the six
months ended on June 30 to 217 million pounds ($403 million)
compared with a year earlier. The company's shares
rose 7.8 percent this year to 588.5 pence on Oct. 9.
Harcourt Assessment hired a new CEO, Michael Hansen,
who took over in July after serving as executive vice president
for corporate development at Gütersloh,
Germany-based Bertelsmann AG, Europe's largest media company.
Hansen, 45, says his company won't slip up again. He blames
errors on the enormous demand for made-to-order state
tests. "You went from an industry that was largely standardized to an industry that was highly, highly customized," Hansen
says during an interview in a conference room in his San
Antonio office suite, which is adjacent to the test production
work floor. "Our most sacred obligation is that the test results
are accurate and that they are timely."
Last year, privately held Measurement Inc., a Durham, North
Carolina-based test development and scoring company, wrongly
failed 890 students out of the 5,461 it tested on Ohioâs high
school graduation exams. The company says it scored the exams
correctly and then erred when it determined the students'
grades based on the number of questions they answered correctly.
"We had a really spotless reputation," Senior Vice President
Mike Bunch says. "This was just devastating to us."
Pearson Assessments grades 40 million exams each year.
The company has the high-profile job of scoring the SAT,
which more than 3,000 colleges and universities use as a
gauge for admitting students.
Pearson discovered its SAT scoring error in January after
two students asked that their results be handscored. Score
changes affected about 1 percent of the October 2005 test takers,
says the New York-based College Board, a nonprofit group
that represents 5,000 colleges and oversees the exam. Before
most college admission decisions were announced, the College
Board re-reported the roughly 4,400 scores that had been underscored.
"When you do 12 million tests a year, a lot of people
are involved in that," College Board President Gaston Caperton
says. "It's very hard to get perfection."
Shane Fulton, a lean youth who played soccer and tennis at
George School in Newtown, Pennsylvania, knows the pain of
an incorrect score. Fulton had his sights on attending New
York University or Lafayette College in Easton, Pennsylvania.
In June 2005, at the end of his junior year at the Quaker-run
high school, he took his first SAT. He earned a score of 1,910
out of 2,400 on the three-part test, which assesses mathematics,
reading and writing. Not satisfied with his performance on
the math portion, he took the test a second time in October.
He was shocked when the grade came back as a 1,330.
"I knew that something was wrong," says Fulton, 19, of
Yardley, Pennsylvania. He asked to have his exam graded by
hand. When the results were returned more than a month
later, his score was actually a 1,720, or 390 points higher than
initially reported.
By then, Fulton had suffered restless nights, sought sleeping
pills from his parents and broken down in tears because of
the uncertainty surrounding the scores and his future. Adding
to his anxiety, he'd taken the SAT a third time because he didn't
yet know his results on the second test. On that one, he earned
an 1,850. "Every year, there's more of an emphasis on how you
do," says Fulton, who's attending Northeastern University in
Boston and is suing Pearson Assessments and the College
Board over the error. "I was thinking I wouldnât get into any of
the colleges I applied to."
Mistakes may soon cost Pearson Assessments and other test
companies business. Educational Testing Service wants to bring
scoring in-house to reduce the chance of errors. ETS's Landgraf has directed the company to invest $50 million so it can expand
its scoring operation within three years. He estimates that will
produce $33 million in new annual revenue. Pearson shares
gained 11 percent this year as of Oct. 9 to 762 pence.
Having ETS grade his exam didn't help Pennsylvania teacher
Conigliaro, one of the 4,100 false failures on the Praxis test. Forty-four
states require the Praxis to evaluate teaching skill and knowledge
in a particular field. ETS developed the Praxis and then, in
Conigliaro's case, scored it incorrectly, multiple times.
Conigliaro, 55, an engineer and former machine shop
owner, started teaching seven years ago as an intern at
Mountain View Junior/Senior High School in Kingsley,
Pennsylvania. His employment there was contingent on his
passing the Praxis to get final certification. He took the
exam in April 2003 and was told he'd failed. He took it
again and got a second failing score. He took it a third and
a fourth time and again flunked. "I was missing by one or
two points each time," he says.
Conigliaro was fired from his teaching job and wound up
working as a bartender. "I didn't want to leave the house for a
year and a half because I was too embarrassed," he says.
ETS notified Conigliaro in July 2004 that there were scoring
errors on his tests and that he had actually passed. In a
press release that month the company cited a "statistical
anomaly" in the scoring of nine exams from January 2003 to April '04 and apologized to test takers. ETS spokesman Tom
Ewing declined to comment further.
According to court papers by teachers who later sued ETS
in federal court in New Orleans, the firm didn't start an investigation
of its scoring of short essays until an unnamed state
challenged the results. In March, the company agreed to pay
$11.1 million to the test takers to settle the lawsuit.
Conigliaro, who sued and was part of the settlement, says
he would have succeeded on at least three of the four exams he
was told he'd failed. "Yes, I'm bitter," says Conigliaro, who, after
passing the Praxis and getting his license, now teaches business
and accounting at Blue Ridge High School in New Milford,
Pennsylvania. "I was just about to get tenure, and I had to
start all over again."
Errors can occur in the earliest stages of
the test-making process and then snowball.
In 2003, the Minnesota Department
of Education found flaws in questions
proposed by Maple Grove, Minnesota-based
Data Recognition Corp., a privately held firm
that provides testing for eight states. Minnesota
school officials reviewed some questions,
which are known as items. About 6 percent
had no correct answers or multiple correct answers.
"There are other concerns about item quality
with another 60-70 percent," testing director
Reginald Allen wrote in 2003 after the
company challenged the state's decision not to
renew its contract. The flawed test questions
didn't make it onto state exams. Company lawyer
Dwight Rabuse declined to comment except
to say that the state later hired a Data
Recognition staffer to replace Allen. Minnesota
Education Department spokesman Randy
Wanke declined to comment. Minnesota now
contracts with Pearson Assessments to provide
its state tests.
"When you have an education reform agenda
that's relying so heavily on standard tests to
ensure school quality, it doesn't take so many
problems to undermine credibility," says
Thomas Toch, co-director of Washington-based
research firm Education Sector, who wrote a
2006 report on test errors.
Executives at testing companies say they strive for perfection
in the face of state demands for new tests each year, in at
least two different subjects and for seven different grades.
Stuart Kahl, president and founder of Measured Progress,
says the industry uses dozens of quality checks as companies
draft, edit, print and deliver exams; retrieve, scan and read
papers; and calculate, compare and convert raw scores into
test grades. The process may take two years from start to finish.
"There's no question there are tremendous demands
placed on the industry," Kahl says. "Obviously, when you redo
things every year, you have tremendous potential for errors."
Former Harcourt Assessment President Jeff Galt says state
education departments are sometimes to blame for errors that
they require their testing contractors to assume responsibility
for. He points to Connecticut, which is using Measurement Inc., its third testing contractor since 2003. The state got rid
of Harcourt and then parted ways with CTB/McGraw-Hill.
"You have to wonder, Is the problem with the testing company
or with the department?" says Galt, 50, who now teaches business
at the University of the Incarnate Word in San Antonio.
Connecticut Education Department spokesman Henry Garcia
declined to comment.
Harcourt Assessment's inability to follow instructions
from Alabama is what cost Jerry Lee Faine Elementary its
good name. After the school was notified of its failure to make
the required adequate yearly progress, the state placed it in
the category of School Improvement, as probation is called
under the No Child program. Newspapers publicized the designation,
and parents won permission to transfer children to
other schools. "People will not move into this community,"
says Alfreda Mays-Rogers, whose grandchild is in first grade
at the school.
The Alabama Department of Education summoned principal
Potter 100 miles north to Montgomery, she says. Officials
demanded more teacher training and insisted on additional
reading instruction. Potter says she researched curricula used by
other schools and dissected years of test data to figure out why
her pupils hadn't passed. Nothing stood out.
During Potter's crisis of confidence, Kirby Hubbard, the testing director in Etowah County, about 250 miles to the
north, discovered that Harcourt Assessment had miscalculated
his schools' No Child results. Harcourt had tallied the scores
of students who'd been absent during part of the exam week,
failing to follow Alabama's instruction to count the scores of
only students who took the entire multipart test, state Education
Superintendent Joseph Morton said in a Nov. 8, 2005,
letter to Harcourt. That same type of error affected Jerry Lee
Faine Elementary. When the state told Potter her
school had actually passed on Feb. 9, 2006, she took to
the school intercom and made the announcement. Teachers
ran into the hallways, cheering. "We were happy, happy,
happy," Potter says. "But you turn to the other side, we were
mad, mad, mad."
Along with Potter, educators in Florida, Nevada and
across the U.S. have to live with test company mistakes
every year. Boston College emeritus professor
George Madaus and researcher Kathleen Rhoades say there
should be independent oversight of crucial exams. "There's so
much error in these products," Rhoades says.
Madaus, co-author of a 2003 study on test errors, envisions
an impartial federally financed panel that would monitor
state testing programs to ensure they're well crafted and
used correctly. Such a board would analyze why there are errors
and how they can be minimized. It also may offer a seal
of approval on the test preparation products flooding the
market, which can generate such a big chunk of a test company's
earnings. "This is not anti-testing," Madaus says. "This is
an attempt to make testing better."
Potter tries not to be bitter. She notes with pride how her
school has now passed the state test for two consecutive years.
She has a message for test companies. "They're hurting students
more than anything else," she says. "Please don't make that mistake
on students. That's a reflection on our school, on my students,
on my teachers. That's a reflection on me."
It's also a reflection on the $2.8 billion test industry, which
profits from selling materials to prepare students for high-stakes
exams it has a hard time getting right.

DAVID GLOVIN covers the Manhattan federal court at Bloomberg News in New York. DAVID EVANS is a senior writer in Los Angeles.
dglovin@bloomberg.net
davidevans@bloomberg.net



Test Prep: A+ in Profits,
Incomplete in Results


From a storefront office beside a Long Island
Rail Road station in Glen Head, New York, about
25 miles east of Manhattan, Rally! Education LLC
markets its Test Rehearsal product to two dozen
U.S. states. Rally, founded in 2003, is one of as
many as 1,800 U.S. test preparation and tutoring
firms that are springing up to sell practice materials
to help schools get ready for annual No Child
Left Behind tests.
Test preparation is booming. Companies took
in $1.7 billion in revenue last year. Profit margins
on test prep materials are 20 percent or more
compared with margins as low as 3 percent on
the year-end No Child exams.
"We've been profitable since day one," Rally
Chief Executive Officer Howard Berrent says.
He says the company has annual revenue of
$5 million–$10 million.
Rally recommends that schools administer
Test Rehearsal practice exams as many as four
times a year at a cost of about $4 per student.
Test Rehearsal works, the company says. According
to a study that Rally publishes on its Web site,
scores in 25 urban New Jersey schools that used
Test Rehearsal rose an average of 17.76 percent.
What the study doesn't say is that Berrent,
the co-author of the research, is the company's
CEO. Nor does it mention that Toms River and
other New Jersey districts use prep materials
besides Test Rehearsal. Berrent says he tells
customers who ask that he's both Rally chief
executive and an author of the study.
"This is the sort of research that tells us very
little," says Richard Allington, a professor of education
at the University of Tennessee and past
president of the International Reading Association.
Further, Allington, who's the author of more
than 100 articles and books on reading and education,
says constant testing may have a negative
effect. "Often, reading gets worse," he says. "The
passages they read aren't relevant to the core
curriculum, so the kids learn less vocabulary."
Berrent agrees there's no proof that Test Rehearsal
alone pushed up scores in New Jersey.
"That's the issue with all research," says Berrent,
who previously was CEO of Harcourt Interactive
Technology, a Harcourt Education unit that developed
online classroom tests.
Rick Stiggins, founder of the Portland, Oregon–based
Assessment Training Institute,
whose assets are now owned by Educational
Testing Service, says sophisticated test preparation
materials can be helpful. He cites tests
called formative assessments, which give immediate
feedback on whether students are
grasping a new concept. The trouble is, schools
get no assurances that the test prep materials
they're buying are any good, says James
Popham, an emeritus professor of education at
the University of California, Los Angeles, and
the author of more than 25 books on education,
which he sells along with teacher training
videos. "Teachers are turning to all sorts of
false prophets," he says. "Theyâre being sold a
bill of goods."
In Wyoming, Harcourt Assessment is selling
a series of prep tests called Learnia. The product,
which costs the state $206,000, has two
parts. The first is a group of tests that educators
call benchmark assessments. These midyear
exams are designed to tell teachers how
much progress a student has made in math,
reading, science and writing. The second is a
group of what Harcourt calls formative assessments
that provide instant feedback.
Cheryl Schroeder, Wyomingâs testing director,
says Learnia is helping students master the subjects
that No Child exams test. Wyoming has
started science testing in advance of the 2007–08
federal requirement. "It's how you know the
children are making the gains they need to,"
Schroeder says.
Popham, who serves as a member of the Wyoming
advisory committee on testing, says there's
no proof that Learnia is helping students. He says
the practice tests aren't tailored to Wyoming
school standards. "There's no evidence that
they're worth a damn," he says.
Harcourt Assessmentâs new CEO, Michael
Hansen, says Learnia exams are under development.
"We have not, in any situation,
rolled out this product saying 'This is a
finished product; here's what it is,'" he says.
"They're pilots."
Learnia exams will be customized to test
what's being taught in each state, Harcourt
spokesman Russell Schweiss says. He says the
exams are of high quality.
Popham says one big hang-up in test preparation
is that no one is distinguishing the good products,
such as those that rely on formative
assessment, from the bad ones. "Test publishers
are hawking anything they can," he says. "It's absolutely
a fraud."
DAVID EVANS and DAVID GLOVIN

One Child Left Behind

Tyler Stoken was a well-behaved
fourth grader who enjoyed school, earned
A's and B's and performed well on standardized
tests. In May 2005, he'd completed
five of the six days of the
Washington State Assessment of Student
Learning exam, called WASL, part of the
state's No Child Left Behind test.
Then Tyler came upon this question:
"While looking out the window one day at
school, you notice the principal flying in
the air. In several paragraphs, write a
story telling what happens."
The 9-year-old was afraid to answer the
question about his principal, Olivia McCarthy.
"I didn't want to make fun of her," he
says, explaining he was taught to write the
first thing that entered his mind on the state
writing test. In this case, Tyler's initial
thoughts would have been embarrassing and
mean. So even after repeated requests by
school personnel, and ultimately the principal
herself, Tyler left the answer space
blank. "He didn't want them to know what
he was thinking, that she was a witch on a
broomstick," says Tyler's mother, Amanda
Wolfe, sitting next to her son in the family's
ranch home three blocks from Central Park
Elementary School in Aberdeen, Washington.
Because Tyler didn't answer the question,
McCarthy suspended him for five
days. He recalls the principal reprimanding
him by saying his test score could bring
down the entire school's performance.
"Good job, bud, you've ruined it for everyone
in the school, the teachers and the
school," Tyler says McCarthy told him.
Aberdeen School District Superintendent
Martin Kay ordered an investigation.
"My suspension was for refusal to comply
with a reasonable request, and to teach
Tyler that that could harm him in the future,"
McCarthy told an investigator. "I
never, for a second, questioned my actions."
Tyler, who's 4 feet (1.2 meters) tall
and weighs 70 pounds (32 kilograms),
hasn't been the same since, his mother
says. "He liked the principal before this,"
she says. "He cried. He didn't understand
why she'd done this to him."
Now, Tyler blows up at the drop of a hat,
his mother says. "They created a monster.
He'll never take that test again, even if I have
to take him to another state," she says.
Tyler's attitude about school changed.
He became shyer. He's afraid of all tests
and doesn't do as well in classes anymore,
his mother says.
McCarthy's May 6, 2005, letter to
Tyler's mother detailed her son's suspension.
"The fact that Tyler chose to
simply refuse to work on the WASL after
many reasonable requests is none other
than blatant defiance and insubordination,"
McCarthy wrote. In the letter, she
accused Tyler of bringing down the
average score of the other 10 students in
his class. "As we have worked so hard
this year to improve our writing skills,
this is a particularly egregious wound,"
McCarthy wrote.
Her accusation was wrong, state regulations
show. There is no averaging of the
writing scores. Each student either meets
or fails the state standard.
Tita Mallory, director of curriculum and
assessment for the Aberdeen School District,
says school officials feel tremendous
pressure because of the high-stakes tests.
While there's no academic effect on elementary
school children taking the exams,
there can be repercussions for school administrators.
When schools repeatedly fail
to show adequate yearly progress, as defined
by No Child, the principal can be fired.
"In many ways, there's too much emphasis
on the test," Mallory says. "I don't
want that kind of pressure on our kids."
Out of 74,184 fourth graders taking the
WASL test last year, 42.3 percent failed
to meet the state standard for writing.
Juanita Doyon, director of Mothers
Against WASL and author of Not With
Our Kids You Don't! Ten Strategies to
Save Our Schools
(Heinemann, 144 pages,
$14.95), says Tyler's experience is representative
of what's wrong with tests like
the WASL. "They took a student who
loved his school and crushed his spirit,"
Doyon, 46, says. "We've elevated test
scores to be the most important part of
school. The principal and teachers are so
pressured by the test that they've lost
good sense in dealing with children."
DAVID EVANS

— David Glovin and David Evans
Bloomberg Markets
2006-12-01
http://www.bloomberg.com/news/marketsmag/education.pdf

