Back to School: Performance data driving education now
Remember, the data, at its best, can only relate to Standards. My One Size Fits Few: The Folly of Educational Standards has a few words on this. You can get a used copy at Amazon for $4.
My condemnation of Standards and Standardistos was ahead of its time.
By Eleanor Chute
James Turner is flooded by so much data on schools and kids that he is reminded of the ancient mariner who was stuck on a windless sea, surrounded by undrinkable salt water.
Data data everywhere
So much it's hard to think.
Data data everywhere
If only it would link.
Across America, educators, parents and students are swimming in an overload of data, much of which seems unrelated and difficult to use.
Over the past decade, education based on academic standards has become the norm. This has fueled the push for data to show how students are doing and to guide efforts to improve.
The trend has only grown since the early 2002 signing of the federal No Child Left Behind Act, mandating all students in public schools be proficient in math and reading by 2014.
Now a big buzzword is "data-driven instruction," which means using the data to try to figure out what a child needs and then trying to provide it.
Kati Haycock, director of the Education Trust, a Washington, D.C.-based nonprofit group that advocates for achievement for all children, said data "can basically take us out of the dark ages of just kinda teaching and hoping, which is what a lot of folks have done for a very long time.''
"A lot of teachers have taught their hearts out and don't have a good way of telling who's learning what and what's working and what's not."
State Education Secretary Gerald Zahorchak gave this analogy for how data can be used:
"It's like going to the hospital and getting your temperature, pulse rate, respiration rate, blood pressure data. That's then marked against where you should be. ... A medical model would have you prescribed some remedies to help you recover."
"Where this is all headed is personal learning plans on a kid-by-kid basis," said Mr. Turner, who also is director of the partnership for school district improvement at the University of Pittsburgh School of Education.
The challenge, however, is making the data meaningful and usable.
This fall, the state will give every district a new tool called Pennsylvania's Value-Added Assessment System or PVAAS. The state will pay for the analysis at a cost of about $2 a student for more than 900,000 students.
In 2002, the state Board of Education passed a resolution calling for a value-added approach, which looks at how much of a student's progress can be attributed to the school.
Since then, it has been piloted in about 100 districts. Now all districts will get at least limited data, which will be expanded in future years.
PVAAS uses statewide achievement data in math and reading from the Pennsylvania System of School Assessment.
Taking into consideration that students start a school year at different levels, it statistically compares a student to other students statewide who have similar testing histories. It then predicts how the student would do in an average school during the coming school year.
If students make more than the expected year's growth, that extra improvement can be credited to the school. If they make less, questions are raised about the school's effectiveness.
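The prediction-and-comparison idea described above can be sketched in a few lines of Python. This is a deliberately simplified illustration of the value-added concept, not the actual PVAAS/Sanders model, which uses far more sophisticated longitudinal mixed-effects statistics; the function names, the similarity window, and the sample scores here are all hypothetical.

```python
# Simplified sketch of the value-added idea: predict a student's score
# from statewide peers with similar testing histories, then compare the
# actual score to that prediction. All numbers here are made up.

def expected_score(student, cohort):
    """Predict this year's score as the average current score of
    peers whose prior test score was within 50 points."""
    similar = [p for p in cohort if abs(p["prior"] - student["prior"]) <= 50]
    return sum(p["current"] for p in similar) / len(similar)

def value_added(student, cohort):
    """Positive: more than the expected year's growth (credited to the
    school). Negative: less than expected (raises questions)."""
    return student["current"] - expected_score(student, cohort)

cohort = [
    {"prior": 1200, "current": 1260},
    {"prior": 1210, "current": 1250},
    {"prior": 1500, "current": 1530},  # dissimilar peer, excluded
]
student = {"prior": 1205, "current": 1300}
print(value_added(student, cohort))  # → 45.0 points above similar peers
```

In this toy version, only the two peers with similar prior scores inform the prediction (expected score 1255), so the student's 1300 shows growth beyond what an average school would be expected to produce.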
"Every sound business has systems in place to keep track of what's working and what's not working and focus management efforts on fixing problems," said Alan Lesgold, dean of Pitt's School of Education. "Value-added testing is the first effort in the educational world to do the same thing."
PVAAS uses a complex statistical calculation based on the work of William Sanders, who more than a decade ago began a value-added system in Tennessee.
The Sanders model is "probably one of the more widely respected models for the value-added system," said Suzanne Lane, professor of research methodology in the School of Education at Pitt and former president of the National Council on Measurement in Education.
Dr. Lane noted that for the data to be good, the underlying test needs to be one of good quality.
Pilot schools, such as South Fayette, used data from other standardized tests when PSSA data weren't available.
"It changed our district because it allows us to look not only at student achievement but also at student growth," said South Fayette Superintendent Linda Hippert.
On the 2005 PSSA, South Fayette's fifth-graders ranked No. 1 in math and No. 3 in reading statewide. Its eighth-graders came in No. 4 in math and No. 7 in reading.
That favorable report, however, missed part of the picture that PVAAS catches.
For South Fayette, PVAAS revealed that one eighth-grader scored 130 points lower in math than would have been expected and another scored 400 points higher than would have been expected.
It showed that while some fifth-graders who scored low in math didn't meet the state standard, they grew at a much faster rate than higher scorers did, suggesting the low achievers may be headed toward proficiency.
PVAAS doesn't pinpoint skills, but it opens the door to new questions and conversations.
Dr. Hippert said, "I can go into the system and say I want to see every [eighth-grader] who is predicted with less than a 60 percent chance of being proficient. Then you can pull those kids and take a look at them and begin remediating in the ninth grade."
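The kind of query Dr. Hippert describes amounts to filtering a roster by a predicted-proficiency threshold. A minimal sketch, assuming a hypothetical roster format with a precomputed probability per student (the field names and data are illustrative, not PVAAS's actual interface):

```python
# Hypothetical sketch: flag eighth-graders whose predicted chance of
# proficiency falls below 60 percent, as in Dr. Hippert's example.

students = [
    {"name": "A", "grade": 8, "p_proficient": 0.45},
    {"name": "B", "grade": 8, "p_proficient": 0.72},
    {"name": "C", "grade": 8, "p_proficient": 0.58},
]

def needs_remediation(roster, grade=8, threshold=0.60):
    """Return students in the given grade predicted below the threshold."""
    return [s for s in roster
            if s["grade"] == grade and s["p_proficient"] < threshold]

print([s["name"] for s in needs_remediation(students)])  # → ['A', 'C']
```

Those flagged students are the ones a school could "pull and take a look at" for ninth-grade remediation.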
Value-added also can be used to spot proficient and advanced students who are declining but still are at least proficient.
"The beauty of the value-added philosophy is that every child deserves at least a year's worth of growth in a year. The highest-achieving kids are just as worthy of growth as the lowest-achieving kids," said Theodore Hershberg, professor of public policy and history at the University of Pennsylvania and a longtime supporter of value-added.
Throughout the country, value-added approaches are hot.
As a pilot, the U.S. Department of Education is permitting two states -- Tennessee and North Carolina -- to be judged under the No Child Left Behind Act using value-added approaches, not just achievement levels. Other states, including Pennsylvania, are eager to be considered for such an approach.
One of the controversies is whether growth data should be used for teacher evaluation, as it is in Tennessee, or not, as in Pennsylvania.
The state will provide PVAAS data on individual students, grade levels and schools but not teachers.
James Mahoney, director of Battelle for Kids, a nonprofit that is helping Ohio implement a value-added system, said that starting at the teacher level would be "threatening, and people would see it used punitively."
He said, "We want an evolution, not a revolution."
Ms. Haycock, however, said she believes Pennsylvania has made a mistake.
"It turns out there are very, very big differences among teachers in their ability to grow kids," she said.
While the use of data can help to improve instruction, an ongoing problem is getting data quickly enough that it can be used effectively. State test results, for example, aren't available until well after the school year ends.
As a way to provide more immediate feedback, some districts have started using the 4Sight Benchmarks, which are tests based on state standards for math and reading. The tests are given near the start of the year and then quarterly to track progress.
Because the schools score the tests themselves, they have the data in time to use it to help students as they learn.
The benchmarks were created by The Success for All Foundation and the Center for Data-Driven Reform in Education at Johns Hopkins University.
This past school year, 233 districts statewide used 4Sight in grades three through eight. The districts paid $120 per class for a total of five tests each in reading and math. This fall, 4Sight will be expanded to grades 9 through 11 and will be used in 303 districts.
Wilkinsburg concentrated its efforts around the 4Sight tests, scheduling in-service days so teachers could plan teaching strategies based on the results.
From those meetings at Turner Elementary, Principal Christine French said, students were placed in flexible groups based on their needs. Each teacher specialized in a skill, enabling students to be rotated among teachers who had developed lessons aimed at teaching the needed skill.
Without the data, Ms. French said, teachers may repeat what students already know or miss what they need to learn.
Wilkinsburg math coach Diane Chessman said, "It was a much more systematic way."
Ms. French wishes she'd had the data when she started her teaching career in 1994.
"We were given a textbook and curriculum, and we went through it. You were trying to get through all the material by the end of the year. You weren't really looking at what each individual child needed."
The PSSA results for last spring show Wilkinsburg students improving on most but not all tests.
Some data have been cumbersome to use.
Mr. Turner said some districts have such huge warehouses of data that "at the end of the day it almost required an advanced degree from MIT [Massachusetts Institute of Technology] to get access to the data."
This school year, Wilkinsburg will use specialized computer software to connect, or as the data people say, link, the information so it is easier to use.
There is no one piece of data that is the holy grail. Instead, educators want an array of data.
"The more data points you have, the more confidence you have in what you're doing," said Mr. Turner. "If the data is too bad to believe or too good to be true, question it."