RACE TO THE TOP ASSESSMENT PROGRAM
Ohanian Comment: Here [and below] is the Executive Summary of the U.S. Department of Education RTTT Assessment Program. Read and weep. Weep not because it is in any way unexpected but weep for the way your professional organizations and your proto-progressive friends will scramble to align themselves and their ideals to this corporate outrage. Money talks and Arne Duncan's operative ritual is to speak incessantly while carrying a pot of money that looks big to rank and file educationists.
The DOE has set aside up to $350 million RTTT funds for supporting states in the development of "next generation of assessments." Given the Race to the Top milieu, given the hyper-aggressive allegiance to Business Roundtable goals, this will turn out to be The Lost Generation of Assessments. But this won't dissuade the professional organizations and proto-progressives from scrambling for a place at the table--and the chance to get a hand in the money pot. Years ago, in The Ghost in the Machine and Janus: A Summing Up, Arthur Koestler had a name for table-scramblers. He named them Call Girls.
The "public input" which Arne brags about means that the DOE invites people they can rely on to stay on message. I have an idea of where they should stick their input. . . but never mind.
The tragedy here is that the DOE insists on assessments that measure every student against rigged common core standards "that build toward college and career readiness. . . . measure the extent at each grade level to which each individual student is on track."
The industrial metaphor of education as a race in which students are stuck on a track continues.
I can give you a very detailed account of what playing Hink Pinks taught me about individual children in a second grade class (plus one girl's mother, who exhibited the exact same hink pink deficiency as her brilliant daughter). I can tell you what watching a troubled, obnoxious teen play himself in Scrabble for six months taught me about his "achievement."
Of course if you bow every morning at the
Alignment altar, then you have to tell Charles, a mainstreamed 11-year-old in third grade, that he can't read Rumpelstiltskin 16 days in a row. And tell Jack he can't play Scrabble at all, never mind for six months. In reality, in his decidedly un-Aligned state, Jack was engaged in the most difficult work of all--that silent, solitary, internal task of coming to grips with one's self, with one's own needs. Jack's work meant first changing his view of himself and later figuring out where he might find a place for himself in the world.
Charles' and Jack's very different stories, stories with a common core of my idea of assessment, are leitmotifs in my professional career. That's why I tell their stories over and over. Of course Frank Smith told us long ago why students want to repeat certain activities over and over and over: It's because they are getting something unseen and important out of that activity. They aren't "wasting time," they aren't trying to get out of work the teacher deems more important. They are getting something they need.
Something they need. Got that, Arne? A schoolteacher letting students find something they need.
In Word Freak: Heartbreak, Triumph, Genius, and Obsession in the World of Competitive Scrabble Players, Stefan Fatsis shows the reader that Scrabble at the competitive level is about weirdness, extreme weirdness. It's also about linguistics, psychology, mathematics, memory, competition, doggedness. Scrabble at the national competitive level and with one out-of-kilter kid in an alternative school set up for misfits is about mastering the rules; it's about failure and it's about hope.
NOTE: Jack became a career man in the Marine Corps. I don't know what happened to Charles, a boy so damaged by his world I doubt he grew up to be self-sufficient. But he did find out, through love of another book, "The Ugly Duckling," that "It's OK to be different." Charles wrote a note about how important the book was. He wrote that he hoped I'd send his note to the newspaper and that they would publish it so everybody would know about differences.
Also NOTE: The Assessment program from Duncan and friends is designed to make "Determinations of principal and teacher effectiveness to inform evaluation and the provision of support to teachers and principals." Somehow, I doubt that the Federal Assessment program would allow me to submit Charles' note or Jack's Marine Corps acceptance as proof of my teacherly "effectiveness."
RACE TO THE TOP ASSESSMENT PROGRAM
NOTICE OF PUBLIC MEETINGS AND REQUEST FOR INPUT
October 20, 2009
By March 2010, the Secretary of Education (Secretary) intends to announce a competition for a program that would support one or more consortia of States that are working toward jointly developing and implementing common, high-quality assessments aligned with a consortium's common set of K-12 standards that are internationally benchmarked and that build toward college and career readiness by the time of high school completion. To inform the design of this program and the development of a notice inviting applications that establishes the requirements for this competition and to provide technical assistance to States, the Secretary is seeking input from States, technical experts, and members of the public through public meetings and written submissions. Following the public meetings and review of the written submissions, the Department intends to publish a notice inviting applications for such a competition.
Public meetings will have two parts:
• Input from invited panels of experts and stakeholders
Invited panelists will have a set amount of time to individually respond to the questions in the notice
Department representatives will ask questions of individual panelists and facilitate cross-panelist discussion
• Open opportunity to share input
Each meeting will have 60 to 90 minutes dedicated to opportunities for interested members of the public, who have registered to speak, to respond to the questions in the notice
Each individual scheduled to speak will have 5 minutes to provide oral input
Written submissions will also be accepted
Meeting Locations, Dates, and Topics
Boston – Nov 12-13:
General Assessment (1 day)
High School Assessments (1/2 day)
Technology and Innovation in Assessment (1/2 day)
Atlanta – Nov 17-18:
General Assessment (1 day)
Assessing Students with Disabilities (1/2 day)
Denver – Dec 1-2:
General Assessment (1 day)
Assessing English Language Learners (1/2 day)
Assessment Program Design and Questions
The Assessment Program is intended to support consortia of States working toward jointly developing and implementing a next generation of common summative assessments that are aligned with a common set of K-12 internationally benchmarked, college and career ready standards that model and support effective teaching and student learning. Such summative assessments would allow students, including students with disabilities and English language learners, to demonstrate at each grade level tested their mastery of knowledge and skills and the extent to which each student is on track to college and career readiness by the time of high school graduation.
In designing the requirements for this program, the Secretary is particularly interested in innovative and effective approaches to assessment that will assist States in creating powerful and useful systems of assessment that meet these requirements.
In the following paragraphs, we have provided a framework that outlines the characteristics we believe should be required or encouraged in assessment systems supported by a grant under this proposed program. We then list the specific questions on which we seek input, taking into account this framework. It is important to note that this proposed program, the public meetings, and the framework below would focus on the design and quality of assessment systems and not accountability policies. Given the pending reauthorization of the ESEA, we intend that the Assessment Program would support the development of the best possible assessments that could be not only appropriately used by States under the current ESEA assessment and accountability requirements, but could also serve additional purposes as outlined in the notice.
Design of Assessment Systems – General Requirements
The Department is particularly interested in supporting the development of summative assessments that measure:
• Individual student achievement as measured against standards that build toward college and career readiness by the time of high school completion;
• Individual student growth (that is, the change in student achievement data for an individual student between two or more points in time); and
• The extent to which each individual student is on track, at each grade level tested, toward college or career readiness by the time of high school completion.
At a minimum, we would expect that the common assessments would measure each of these elements in the subject areas of reading/language arts and mathematics, and would provide information for each student annually in grades 3 through 8, and provide information at the high school level about each student's college and/or career readiness. The assessments need not be limited to a single end-of-year assessment but could include multiple summative components administered at different points during the school year.
Moreover, the assessments might be viewed as replacing rather than adding to the assessments currently in use in States participating in the consortia.
Information gathered from the assessments should be useable in informing:
• Teaching, learning, and program improvement;
• Determinations of school effectiveness;
• Determinations of principal and teacher effectiveness to inform evaluation and the provision of support to teachers and principals; and
• Determinations of individual student college and career readiness, such as determinations made for high school exit decisions, college course placement in credit-bearing classes, or college entrance.
Design of Assessment Systems – Required Characteristics
With respect to the design of the assessment system, the Department would likely require that the assessments, at a minimum, meet the following characteristics:
Reflect and support good instructional practice by eliciting complex responses and demonstrations of knowledge and skills consistent with the goal of being college and career ready by the time of high school completion;
Be accessible to the broadest possible range of students, with appropriate accommodations for students with disabilities and English language learners;
Contain varied and unpredictable item types and content sampling, so as not to create incentives for inappropriate test preparation and curriculum narrowing;
Produce results that can be aggregated at the classroom, school, LEA, and State levels;
Produce reports that are relevant, actionable, timely, accurate, and displayed in ways that are clear and understandable for target audiences, including teachers, students and their families, schools, LEAs, communities, States, institutions of higher education, policymakers, researchers, and others;
Make effective and appropriate use of technology;
Be valid, reliable, and fair;
Be appropriately secure for the intended purposes;
Have the fastest possible turnaround time on scoring, without forcing the use of lower-quality assessment items; and
Be able to be maintained, administered, and scored at a cost that is sustainable over time.
In addition, the Department is particularly interested in assessment systems in which:
Teachers are involved in scoring of constructed responses and performance tasks in order to measure effectively students' mastery of higher-order content and skills and to build teacher expertise and understanding of performance expectations;
The assessment approach can be easily adapted to include summative assessments in other content areas (e.g., science, social studies) in the future;
The technology "platform" created for summative assessments supports assessment and item development, administration, scoring, and reporting that increases the quality and cost-effectiveness of assessments; and
The technology infrastructure created for summative assessments can be easily adapted to support practitioners and professionals in the development, administration, and/or scoring of high-quality interim assessments.
Design of Assessment Systems – LEA-Level Activities
With funds that are directed to LEAs under this program, the Department is interested in supporting LEA-level activities that are designed by the State consortium to support development and implementation of its assessment system. With respect to LEA-level funds, the Department would likely require that the funds be used to support the following types of activities conducted by LEAs that choose to participate:
• Pilot testing of the new assessments with different populations, including English language learners and students with disabilities;
• Designing systems to support and enable effective and consistent teacher scoring, providing professional development support for these activities, and implementing them statewide;
• Statewide transition to the consortium's K-12 common, college and career ready, internationally benchmarked standards, with new high-quality assessments (consistent with the State plans described in the notice of proposed priorities, requirements, definitions, and selection criteria for the Race to the Top Fund general program);
• Development of formative or interim assessments that align with State summative assessments as part of a comprehensive assessment system.
QUESTIONS FOR INPUT
The specific questions on which the Department seeks input are listed below.
General Assessment Questions
Propose an assessment system (that is, a series of one or more assessments) that you would recommend and that meets the general requirements and required characteristics described in the notice. Describe how this assessment system would address the tensions or tradeoffs in meeting all of the general requirements and required characteristics. Describe the strengths and limitations of your recommended system, including the extent to which it is able to validly meet each of the requirements described in the notice. Where possible, provide specific illustrative examples.
For each assessment proposed in response to question 1), describe the--
• Optimal design, including--
Type (e.g., norm-referenced, criterion-referenced, adaptive, other);
Frequency, length, and timing of assessment administrations (including a consideration of the value of student, teacher, and administrative time);
Format, item-type specifications (including the pros and cons of using different types of items for different purposes), and mode of administration;
Whether and how the above answers might differ for different grade levels and content areas;
• Administration, scoring, and interpretation of any open-ended item types, including methods for ensuring consistency in teacher scoring;
• Approach to releasing assessment items during each assessment cycle in order to ensure public access to the assessment questions; and
• Technology and other resources needed to develop, administer, and score the assessments, and/or report results.
ARRA requires that States award at least 50 percent of their Race to the Top funds to LEAs. The section of the notice entitled Design of Assessment Systems – LEA-Level Activities describes how LEAs might be required to use these funds. What activities at the LEA level would best advance the transition to and implementation of the consortium's common, college and career ready standards and assessments?
If a goal is that teachers are involved in the scoring of constructed responses and performance tasks in order to measure effectively students' mastery of higher-order content and skills and to build teacher expertise and understanding of performance expectations, how can such assessments be administered and scored in the most time-efficient and cost-effective ways?
Given the assessment design you proposed in response to question 1), what is your recommended approach to competency-based student testing versus grade-level-based student testing? Why? How would your design ensure high expectations for all students?
Given the assessment design you proposed in response to question 1), how would you recommend that the assessments be designed, timed, and scored to provide the most useful information on teacher and principal effectiveness?
Specific Technical Assessment Questions
What is the best technical approach for ensuring the vertical alignment of the entire assessment system across grades (e.g., grades 3 through 8 and high school)?
What would be the best technical approach for ensuring external validity of such an assessment system, particularly as it relates to postsecondary readiness and high-quality internationally benchmarked content standards?
What is the proportion of assessment questions that you recommend releasing each testing cycle in order to ensure public access to the assessment while minimizing linking risk? What are the implications of this proportion for the costs of developing new assessment questions and for the costs and design of linking studies across time?
High School Assessment Questions
Provide recommendations on the optimal approach to measuring each student's college and career readiness by the time of high school completion. In particular, consider:
How would you demonstrate that high school students are on track to college and career readiness, and at what points throughout high school would you recommend measuring this? Discuss your recommendations on the use of end-of-course assessments versus comprehensive assessments of college and career readiness. (Note: If you recommend end-of-course assessments, please share your input on how to reconcile the fact that college and career ready standards might not include all of the topics typically covered in today's high school courses.)
Questions on the Assessment of English Language Learners
Provide recommendations for the development and administration of assessments for each content area that are valid and reliable for English language learners. How would you recommend that the assessments take into account the variations in English language proficiency of students in a manner that enables them to demonstrate their knowledge and skills in core academic areas? Innovative assessment designs and uses of technology have the potential to be inclusive of more students. How would you propose we take this into account?
In the context of reflecting student achievement, what are the relative merits of developing and administering content assessments in native languages? What are the technical, logistical, and financial requirements?
Question on the Assessment of Students with Disabilities
Taking into account the diversity of students with disabilities who take the assessments, provide recommendations for the development and administration of assessments for each content area that are valid and reliable, and that enable students to demonstrate their knowledge and skills in core academic areas. Innovative assessment designs and uses of technology have the potential to be inclusive of more students. How would you propose we take this into account?
Questions on Technology and Innovation in Assessment
Propose how you would recommend that different innovative technologies be deployed to create better assessments, and why. Please include illustrative examples in areas such as novel item types, constructed response scoring solutions, uses of mobile computing devices, and so on.
We envision the need for a technology platform for assessment development, administration, scoring, and reporting that increases the quality and cost-effectiveness of the assessments. Describe your recommendations for the functionality such a platform could and should offer.
How would you create this technology platform for summative assessments such that it could be easily adapted to support practitioners and professionals in the development, administration, and/or scoring of high-quality interim assessments?
For the technology "platform" vision you have proposed, provide estimates of the associated development and ongoing maintenance costs, including your calculations and assumptions behind them.
Project Management Questions
Provide estimates of the development, maintenance, and administration costs of the assessment system you propose, and your calculations and assumptions behind them.
Describe the range of development and implementation timelines for your proposed assessment system, from the most aggressive to more conservative, and describe the actions that would be required to achieve each option.
How would you recommend organizing a consortium to achieve success in developing and implementing the proposed assessment system? What role(s) do you recommend for third parties (e.g., conveners, project managers, assessment developers/partners, intermediaries)? What would you recommend that a consortium demonstrate to show that it has the capacity to implement the proposed plan?