MSA changes may have raised scores; Md. test was shorter, not easier; experts see link to unusual rise
By Liz Bowie
State education officials acknowledged yesterday for the first time that they had changed the Maryland School Assessment this year in a way that experts say contributed to an unusually large rise in student test scores.
According to experts, the test was shorter though not easier, which might have meant that students taking it were less tired. The tests have been given in grades three through eight in reading and math for the past five years.

"There was a psychological advantage," said Ronald A. Peiffer, Maryland deputy state school superintendent. "They weren't as tired this year. ... It doesn't mean it wasn't as difficult."
In addition, some test questions were more directly aligned with the state's curriculum than before. That change occurred because Maryland substituted locally written questions for a small number that had come from a national standardized test in previous years.
Outside experts said the changes were justified but did have an impact, making comparisons between this year's scores and previous results imperfect. Maryland saw significant gains on the MSA this year, particularly among black and low-income students and those learning English.
Baltimore schools posted some of the largest increases, a result celebrated this week by the governor and other elected officials.
City schools chief Andres Alonso said the news that the test had been changed did not diminish the accomplishments of Baltimore students. "Since the test had the same degree of difficulty, and the fundamental question is how our students are performing in relationship to the curriculum, the questions, to me, are irrelevant," he said.
"It has been my experience that when black and Latino children demonstrate great gains, people look to explanations other than good, hard work to explain that performance. It's what I experienced in New York. We know why that happens."
Baltimore students' scores improved not only in relation to past years but also in comparison with those of students around the state.
But Kate Walsh, who was recently appointed to the state school board, said members were not made aware of the change when they were briefed Tuesday.
"I am disappointed [that] these changes, however justified, weren't shared with the board," Walsh said.
In fifth- and seventh-grade reading, scores increased more than in any other grade or subject. The percentage of students passing the fifth-grade reading test went up 10 percentage points, from 76.7 percent to 86.7 percent. Seventh-grade reading scores rose by 11 percentage points.
Such large increases had not been seen before during the five years that the test has been administered and were a red flag to a standing panel of education testing experts - known as psychometricians - hired by the Maryland State Department of Education to advise it on the validity of test scores.
"We had a lively discussion on these results within the panel," said Huynh Huynh, a professor of statistics and education at the University of South Carolina and a member of the panel.
Long before the results were released to the public, the panel asked Harcourt Assessment Inc., the company hired by the state to oversee testing, to do further analysis.
In the end, Huynh said, the panel concluded that the test was equivalent to the one given in 2003, the year the test was first used, and to subsequent tests. But the panel also concluded that the changes in the test had contributed to the large increases in the fifth- and seventh-grade scores. How much effect the changes had on scores the panel could not estimate, he said.
In putting together the original MSA in 2002, Maryland, like a number of states, decided to take an off-the-shelf standardized test that could be purchased from companies such as Harcourt and combine it with questions that were created in-house.
The locally produced questions made up the majority of the test and were more closely aligned with the state curriculum - the material that Maryland officials say ought to be taught in classrooms. The standardized questions, on the other hand, had high reliability because they were given to tens of millions of students across the nation.
Although students had to answer about 40 questions on the standardized portion of the test, Maryland officials did not count most of them. Instead, they elected to count the questions that focused on material they cared about and those that reflected the state curriculum.
But Maryland students were not informed that some questions did not count and might have gotten bogged down on questions that covered unfamiliar material. In addition, teachers who looked at the tests when they were given each year saw material they had not focused on in class and might have been confused about what to teach the next year, according to Leslie Wilson, who heads the state's assessment office.
Dropping the standardized portion also meant that students did not have to be given two sets of directions for the two parts of the examination.
This year, the test was an hour shorter.
Reading and math testing is conducted over two weeks, with two days for each subject. The first day of testing in each subject used to last 2 1/2 hours; this year, each session was shortened by 30 minutes.
"Could part of the increase [in scores] be that the test was a little shorter and a little less confusing? The answer is yes, it could be," said Mark Moody, the retired state director of assessment who advises the state on testing.
But Moody and Huynh cautioned that the increases that Maryland students posted are not out of line.
"After the analysis by the panel, the conclusion was that the growth that appeared in 2008 is not so much out of line so as to raise concerns, and for most of the grades it is within the variation observed over previous years," Moody said.
Mary Lyn Bourque, a consultant on school testing who has not advised Maryland on the tests, said she believes that the changes might account for a significant portion of the increase.
"I think a lot of this can be looked at as the result of changes in the test item pool and the length of the test," she said. "It doesn't take much to change test scores."
She said the increases in Maryland are "really unusual."
Maryland's test also has been questioned for other reasons. When compared with state tests around the nation, Maryland's ranked 26th in difficulty, according to a report by the Fordham Institute, a nonprofit think tank in Washington.
"If already the proficiency bars are lower than half the nation, what can we really make of this?" asked Amber Winkler, Fordham's research director.