Education Week's Quality Counts 2004 Receives an F
In One Size Fits Few: The Folly of Educational Standards I reported in some detail on Education Week's bizarre and arbitrary rating of states and teachers. Follow the money. These so-called Quality reports are funded by the Pew Foundation, one of the biggest Standardista outfits going. When Texas gets an A and Vermont a D, you have to figure something is peculiar. That was five years ago. As Dennis reports below, the Pew system is still going strong, and it still stinks. I couldn't format his tables and so omitted them. Go to his website for the full report.
An evaluation of the eighth annual Education Week report, Quality Counts 2004: Count Me In: Special Education in an Era of Standards, by the Center for the Study of Jobs & Education in Wisconsin and the U.S. gave the report a grade of F, or unsatisfactory. Quality Counts 2003, released in January 2003, and Quality Counts 2002, released in January 2002, were also rated F by the Center. (See “Education Week’s Grading of the States Receives a Grade of F,” www.jobseducationwis.org, report 129.) The Education Week report Quality Counts is useless for comparing the quality of public education by state. It is unbelievable that anyone, including the media, could take its ratings of states for Standards and Accountability seriously.
Quality Counts 2004 is divided into two sections: The State of Special Education, and a State of the States section that provides State Report Cards and State Profiles of education policy. Only the State of the States rating of states is discussed in this report. Data for three of the categories used to rate states (Student Achievement, Standards and Accountability, and Improving Teacher Quality) were analyzed. Data for the other categories used to rate states, School Climate and Resources, were determined to be unusable for qualitative or quantitative measurement of education quality by state.
The only comparative measure of student achievement by state is the standardized 4th and 8th Grade National Assessment of Educational Progress (NAEP) exams in math, reading, and writing. No other comparative student achievement test results are available for all states and the District of Columbia. Only the reading and math exams were used for this analysis; the writing scores are unreliable and useless.
Quality Counts 2003 did not rank states on Student Achievement; proficient-or-above percentages were listed by NAEP test by state, Alabama to Wyoming, in alphabetical order. The Quality Counts 2004 report likewise did not attempt to rank states on Student Achievement, and should not have, for the following reasons.
NAEP test results: “Everyone who has ever studied the NAEP achievement levels has rejected them. The ‘everyone’ includes individual psychometricians such as Lyle Jones at the University of NC and Bob Forsyth at U of Iowa. It also includes respected groups such as the Center for Research in Evaluation, Standards, and Student Testing (CRESST, co-headquartered at U of Colorado and UCLA), the General Accounting Office, and the National Research Council. The NRC called them ‘fundamentally flawed.’” (Gerald Bracey, Phi Delta Kappan)
“NAEP’s current achievement-level-setting procedures remain fundamentally flawed. The judgment tasks are difficult and confusing; raters’ judgments of different item types are internally inconsistent; appropriate validity evidence for the cut scores is lacking; and the process has produced unreasonable results.” (National Academy of Sciences, 1998)
“The editors of Quality Counts urge readers to focus on the achievement gaps within states, rather than compare data across states, because each state has its own standards, tests, and definitions of ‘proficiency.’” (Quality Counts 2004) In reality, the test results are useless for each state because there are absolutely no validated participation rates for any category of students, including special education students.
“Ten states and the District of Columbia could not provide participation rates for any of the requested grades. Some states could not break out the rates by specific grade levels; some experienced coding problems that invalidated their data; and others could only compare test-taking rates of special education students against those for all students (including those with disabilities), not just general education students. The No Child Left Behind Act requires states, districts, and schools to test 95 percent of all students, including those with disabilities. According to Education Week's survey, 13 of the 37 states that provided participation rates for students with disabilities tested 95 percent or more of their special education students in reading and math in grades 4, 8, and 10. Overall, participation rates for students with disabilities ranged from 40 percent to 100 percent.” (Quality Counts 2004) The exams mandated under NCLB are useless for any purpose.
Table 1 shows the rank order and the at-or-above-“proficient” percentage of the top 10 states on the NAEP 4th and 8th Grade reading and math tests. The grades and ranks of the top 10 student-achievement states for 2003 and 2004 in Standards and Accountability and Efforts to Improve Teacher Quality are given. (Quality Counts 2004, January 2004)
The tables are omitted here; for them, go to the URL below.
Table 1 columns: State Ranking, NAEP Test, Percent At or Above Proficient (Quality Counts 2004)
Only 30% of United States students are proficient or above on the 4th and 8th Grade NAEP reading tests. Not one state had a statistically significant increase in 4th Grade reading from the previous administration of the exam. Two states, Texas and West Virginia (below-average-scoring states), had a statistically significant decrease in 8th Grade reading from the previous administration of the test. One state, North Dakota (an above-average-scoring state), had a statistically significant increase in 8th Grade reading. To expect that all students, including special education students, will continuously improve each year on NAEP reading tests and eventually score at proficient or above is absurd. To label individual schools as needing improvement or failing based only on test results is disgraceful.
Thirty-nine states and D.C. (at 7% proficient and above) had a statistically significant increase in 4th Grade math from the previous administration of the exam. Only 31% of students in all states scored at or above proficient on the 4th Grade NAEP math test. Sixteen states had a statistically significant increase on the 8th Grade NAEP math test, and a total of 27% of U.S. students scored at the proficient-or-above level. NAEP math tests are destructive to poor students, who are retained at grade level on the basis of a useless math test.
The Center also reviewed the student achievement ranks of eight states and Washington, D.C., which received a grade of D, D-, or F, and seven states that received an A or A- for the Standards and Accountability category in 2002, 2003, or 2004.
Table 2 States Receiving D+, D, D-, F or A, A- Grades for Standards & Accountability in 2002, 2003 or 2004
This table is also missing.
The data in Tables 1 and 2 indicate that the methodology used to determine the state rankings and grades is absurdly flawed. There is no valid statistical relationship between a state's student achievement and its Standards and Accountability ranking. In fact, by Education Week's own figures, a state's Standards and Accountability ranking is in many cases inversely related, not directly related, to its student achievement. For example, Minnesota, which was second only to Connecticut in student achievement ranks on NAEP tests in 2001, 2002, and 2003, received an F grade on Standards and Accountability in 2001 and a D- in 2002 and 2003.
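The inverse-relationship claim is the kind of thing a rank correlation can check. The sketch below is a minimal illustration of that check in Python; the six data points are hypothetical placeholders, not the actual Quality Counts figures. A Spearman rho near +1 between achievement rank (1 = best) and grade points would mean the best-achieving states tend to receive the worst Standards and Accountability grades.

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation for two equal-length lists without ties."""
    n = len(xs)

    def ranks(vals):
        # Rank 1 goes to the smallest value, rank n to the largest.
        order = sorted(range(n), key=lambda i: vals[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical data: achievement rank (1 = best state) paired with a
# numeric Standards & Accountability grade (4.0 = A ... 0.0 = F).
achievement_rank = [1, 2, 3, 4, 5, 6]
standards_grade = [1.0, 0.7, 2.0, 3.3, 3.7, 4.0]

print(round(spearman_rho(achievement_rank, standards_grade), 2))  # → 0.94
```

With these placeholder numbers, rho is strongly positive: worse-ranked (lower-achieving) states get higher grades, the inverse pattern described above. Running the same computation on the real Table 1 and Table 2 data would be the honest test of the claim.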
The unbelievable Education Week rationale for rating states low on Standards and Accountability in 2001 and 2002 is based on two specious premises. (1) “Because the quality of state assessments is so pivotal for standards-based reform to work, this year we changed our methodology to focus more on details of state assessment systems. Specifically, states get full credit only if they use tests aligned with their standards in all three grade spans—elementary, middle, and high school—and offer the tests in all four core subjects: English, math, science and social studies.” (2) “In addition, it’s important that tests go beyond multiple-choice questions to gauge students’ knowledge and skills. Therefore, Quality Counts gives more credit to states that also include short-answer questions, extended-response items (such as writing an essay), and portfolios as part of their testing systems. The more detailed analysis caused many state grades on “standards and accountability” to decline.” (Quality Counts 2002)
But the most incredible paradox of this so-called grading of the states report is that it extols state government control over local control of schools. “A solid core of local control states—including Idaho, Iowa, Nebraska and North and South Dakota—are inching toward state standards but have rejected a strong state driven accountability system. Such states do not typically do well on the Quality Counts indicators, which assume a strong state role in standards, assessments and accountability.” (Quality Counts 2001) The local control states are traditionally high student-achievement states, yet they are rated low by the Education Week experts. Meanwhile, “government schools” is a demeaning term often used by conservative school-bashers to attack American public education.
The original inspiration for this evaluation of the Education Week state-by-state study was a pair of January 2001 headlines: in Minnesota's Star Tribune, “Minnesota scores an F in school standards study,” and in the Milwaukee Journal Sentinel, “School report gives state poor grades.” Both states are leaders in student achievement. In January 2002 the abysmal headline in the Milwaukee Journal Sentinel was “State schools earn mediocre grades.” That article outrageously claimed Wisconsin schools were mediocre but well funded, based on the absurd Quality Counts 2002 report. I have not yet found any Journal Sentinel articles on the Quality Counts 2003 or 2004 reports.
Why does the American media continue to reiterate, without question, misinformation in education commentaries and bogus reports bashing American education? The question I continually ask regarding the reporting of education issues is, “Is the media educationally disadvantaged, or just lazy and naïve?” The same question might reasonably be asked of too many so-called education experts from prestigious universities and so-called think tanks. The problem is that too many of these experts are feeding well at a trough kept full by foundations and other organizations that are not the friends of American education!
The Education Week Quality Counts report may be the poorest quality education report I have ever reviewed in 49 years in the education business. An F grade is well deserved.
Education Week is welcome to challenge the grade they received in this commentary.
Unfortunately, Education Week is not alone in preparing spurious reports on American public schools. Negative reports on American public education continuously come from well-funded think tanks and even prestigious university researchers feeding at the trough. However, the United States Department of Education may be responsible for the most harmful reports and policies for American public schools; the NCLB education policies are an excellent example.
Dennis W. Redovich
Center for the Study of Jobs & Education in Wisconsin and the United States
Report 188: Education Week’s Grading of the States, Quality Counts 2004, Receives an F