

What Can Student Drawings Tell Us About High-Stakes Testing in Massachusetts?

Susan Notes:
In case you missed this when it appeared--or in case you need a reminder--take a look at what few researchers look at--students' views of high-stakes testing.


The paper is below, but you really need to go to this url to see the students' drawings and commentary:

http://wwwcsteep.bc.edu/drawoned/mcas/mcaspaper.html


Description
This study of students' self-portraits as test-takers in Massachusetts stimulates discussion of the variation in students' responses to high-stakes testing according to students' individual idiosyncrasies, grade level, and school context.


Abstract

Many high-stakes testing policies rest on the belief that attaching consequences to test scores will persuade students of the importance of academics and will motivate them to exert greater effort to achieve at passing levels. This investigation explores that assumption through an examination of students' drawings of themselves taking the Massachusetts high-stakes test. Student drawings conveyed a range of opinions about test difficulty, length, and content. In a minority of drawings, students depicted themselves as diligent problem-solvers and thinkers. A larger percentage of drawings portrayed students as anxious, angry, bored, pessimistic, or withdrawn from testing. Students' reactions to testing varied from the elementary to secondary grades. The responses of urban students were notably different from those of non-urban students. These patterns suggest that policy-makers cannot rely on high-stakes testing to motivate all students in a uniformly positive way.

Executive Summary

Many high-stakes testing policies rest on the belief that attaching consequences to test scores will persuade students of the importance of academics and motivate them to exert greater effort to achieve at passing levels. In Massachusetts, beginning with the Class of 2003, policy requires virtually all students to pass the mathematics and English/Language Arts portions of the Massachusetts Comprehensive Assessment System (MCAS) in order to receive a high school diploma. All students in fourth, eighth, and tenth grades now take the MCAS in mathematics and English/Language Arts as well as in science and history/social studies. Students are further tested in additional grades, with variations in the subject area. Results from 1998 and 1999 have proved disappointing, with over half the tenth graders receiving "Failing" scores in at least one area.

Policy-makers in Massachusetts support MCAS, in part, because they believe that once students realize their scores will "count," they will begin to treat school work more seriously. To examine this assumption from the perspective of students themselves, we asked Massachusetts teachers to invite their students to "Draw a picture of yourself taking the MCAS." Using the sample of 411 drawings gathered from fourth, eighth, and tenth grade students, we identified superficial features evident in the drawings and developed a coding sheet that we then used to tally the features of each of the drawings. This paper summarizes the patterns that emerged as a result of our tally.

Overall, 37% of the drawings conveyed neither commentary on the test nor information about students' personal reactions to test-taking. These drawings simply showed students as test-takers, typically on their own with a test booklet containing multiple choice items with bubbles to fill in, writing responses, or both. In the remaining 63% of the drawings, students commented on MCAS itself or provided personal responses to their experiences as test-takers.

Some students used the invitation to draw themselves as test-takers to remark on test difficulty, length, and content. About one in six drawings alluded to test difficulty, and of these, drawings describing MCAS as difficult outnumbered those describing the test as easy by over five to one. Additional drawings drew attention to test content students found tricky or confusing, in self-portraits of students thinking "What?" or "Huh?" Further, students commented on the length of the test, typically describing it as excessive.

Drawings also conveyed information about students' affective responses to MCAS. The number of positive responses was a fraction of the number of negative responses. We coded about 18% of the drawings for "diligence" and 5% for "confidence." We also noted that about 7% of the drawings portrayed students as thinking or solving problems. A larger percentage of drawings portrayed students as anxious, angry, bored, pessimistic, or withdrawn from testing. About 13% of all drawings conveyed general anxiety and fears of failing, and 10% communicated anger. About 5% of the drawings portrayed students as daydreaming or sleeping during testing, and about the same percentage communicated boredom. In about 3% of the drawings students portrayed themselves as sad or disappointed with their performance, and another 4% communicated relief at the test's completion.

In Part 2 of this paper, we reflect on personal, educational, and social factors that might help us understand the variety of student responses to MCAS. Literature on motivation to learn indicates that external testing does not necessarily inspire greater student effort. Rather, student motivation depends on a complex mix of beliefs, attitudes, and feelings that students develop in the context of classroom experiences, personal relationships, and school routines. In this context, we note that students' responses to testing, as expressed in the MCAS drawings, raise a scenario that differs from the optimistic predictions of those who hope that high stakes testing will improve motivation: if the immediate costs of testing for some students include anxiety, anger, diminished confidence, and helplessness, some students may not endure these feelings but may instead withhold effort from testing as a way of preserving their sense of personal adequacy, even when they understand the consequences of doing so.

What factors might give rise to students' negative responses to high stakes testing? The individuality and diversity of drawings remind us that students react to test-taking and specific test items in highly personal ways. We consider how the mismatch between the mode of test administration and classroom learning might affect student responses. We suggest that ongoing communication with students about test items and administration should play an important role in assessing the validity of high stakes tests like MCAS. State practices like the release of scores by race from prior tests during the testing period may set African American and Hispanic students up for the effects of what Claude Steele calls "stereotype threat" and may contribute to some students' disengaging from testing.
We note that the variability of students' responses to MCAS according to their grade level parallels findings from earlier research on developmental differences in testing and motivation. Survey research over the past decade has found that many students become increasingly cynical about testing as they move beyond the elementary grades. Likewise, the MCAS drawings suggest that older students are more likely than younger students to express disillusionment, hostility, and loss of motivation in relation to testing.

Schooling policies and practices as they differ across districts may also set the stage for varying responses to testing between urban and non-urban students. In Massachusetts, urban and non-urban districts adopt distinctly different approaches to standardized testing, grade retention, and tracking and ability grouping. Students who are overexposed to these practices may experience eroded confidence and diminished motivation that contribute to varying responses to MCAS expressed in the drawings of urban and non-urban students.

Students' drawings underscore the limitations of high stakes testing policies as a means of generating uniformly positive effects on the motivation of all students. While some students may respond to the policy that links passing MCAS to high school graduation by mobilizing all their resources to improve their test scores, others may develop attitudes that are counterproductive to their success. Older students and urban students appear to be especially vulnerable to giving up, perhaps because, anticipating failure for themselves or their friends, they view MCAS less as a challenge than as a source of intimidation and humiliation.

Introduction

All rumors of [the test's] esoteric and labyrinthine questions are true, and it personally took me over 17 hours to complete the [MCAS].
--Christian Drake, Northampton, Massachusetts, Class of 2000

The questions were really hard. I don't know how they expected us to answer them.
--Sarah Dauphinais, Westfield, Massachusetts, Class of 2003

We think it's unfair because they're testing us on things we haven't even learned yet.
--Yaraliz Soto, Holyoke, Massachusetts, Class of 2002

After the first two days of tests, your fingers and your mind hurt. A lot of kids didn't try after that.
--Arthur Page, Wareham, Massachusetts, Class of 2003

Yes, the tests are challenging, but they are not unfair.
--James Peyser, Chairman, Massachusetts Board of Education




. . .In contrast to the enthusiasm of the media and policy officials, however, researchers increasingly warn that relying on standardized test scores to make educational decisions related to curriculum and instruction, allocation of resources, or students' futures is unwise (Dorn, 1998; Heubert & Hauser, 1999; Linn, 2000; Mehrens, 1998; Noble & Smith, 1994; Stake, 1998; Whitford & Jones, 2000). Widely-reviewed trade books with a focus on standardized testing further caution that such testing has a history of backfiring, resulting in narrowing rather than expanding students' opportunities to learn (Kohn, 1999; Lemann, 1999; Ohanian, 1999; Sacks, 1999). Yet despite extensive public discussion, few have attended to how those most affected, namely students, perceive and react to testing, or how students' responses might affect test results.

As a step toward addressing this gap, this paper focuses on student responses to high stakes testing in Massachusetts. Using students' drawings of themselves taking the statewide test, our study describes the range of responses of students to high stakes testing. It also provides electronic links to the drawings themselves, the coding scheme used to analyze the drawings, and brief suggestions on using drawings.



--------------------------------------------------------------------------------


The nature of statewide testing in Massachusetts

In the spring of 1998, the Massachusetts Department of Education implemented the first version of the Massachusetts Comprehensive Assessment System (MCAS), prepared under contract with Advanced Systems of New Hampshire. The MCAS is a paper-and-pencil test that includes both multiple-choice and open-response items. Scores are determined by measuring student performance against "standards" that can be set at high or low levels. Starting with the class of 2003, all Massachusetts high school students will be required to attain "passing" scores on tenth grade mathematics and English tests in order to receive a high school diploma.

MCAS is a lengthy test, taking more time than the Graduate Record Examinations or standardized tests required for medical, law, or business school. The 1998 testing schedule called for students to sit for seven sessions in English/language arts, three in mathematics, and three in science and technology, with most sessions running a minimum of one hour. The 1999 schedule required fourth graders to sit for five sessions in English, two in mathematics, two in science and technology, and one session of tryout questions in history and social studies. In 2000, the state reduced fourth grade testing time to five sessions in English, two in mathematics, and two in science and technology, but spread testing into the fifth grade for the first time, ostensibly to reduce what Board of Education Vice-Chairperson Roberta Schaefer called "testing burnout" (McFarland, 1999). Eighth and tenth grade students sat for 13 sessions, with tryout questions scheduled for sixth and seventh graders. Test times published by the Department of Education are approximate; in all grades, actual test times vary from, and often exceed, those recommended.

MCAS is meant to be a difficult test. From the beginning, policy-makers intended MCAS results to be scored at a more demanding level than nationally normed standardized tests. According to John Silber, then-Chairman of the Massachusetts Board of Education, as reported in the minutes of the Board's February 1997 meeting (http://www.doe.mass.edu/boe/minutes/97/min21097.html):

If on one of the nationally normed tests that is given in grades 4, 8 and 10, students should turn out with a B, and on the Advanced Systems test they came out with a C, we might conclude that Advanced Systems has pegged it just right with a more demanding standard, a standard that would approach international standards. On the other hand, if the situation were reversed, where a nationally normed test shows that the students were performing at about a C level and Advanced Systems had a B level, then we would know that the standards in that exam were perhaps not rigorous enough.

Since setting the "cut-off" scores for MCAS is a political matter, "passing" scores can be set at a higher level relative to other tests. Indeed, at its January 2000 meeting, the Board of Education established the official requirement that students in the graduating class of 2003 meet or exceed the threshold scaled score of 220 on the English and mathematics MCAS grade 10 tests in order to satisfy requirements of the "Competency Determination" for a high school diploma (http://www.doe.mass.edu/boe/bib/bib00/12800.html). This decision came two months after the news that nearly half the state's tenth grade students had not met this performance standard.

MCAS is also a controversial test. Despite a concerted campaign by business, media, and policy leaders to promote MCAS as a tool for accountability and reform (Guenther, 2000; Hayward, 2000b), those closer to student learning have become increasingly vocal about their disenchantment with high stakes testing in Massachusetts. Educators from all grade levels, including several former Massachusetts Teachers of the Year and MacArthur Fellow Deborah Meier, have questioned the value of such testing for student learning (Associated Press, 2000; Greenebaum, 2000; Hayward, 2000a; Lindsay, 2000; Lord, 1999; Marcus, 1999; Meier, 2000; Penniman, 2000; Sukiennik, 2000; Tantraphol, 2000a). Parents have launched a petition drive and lobbied elected officials to reject the tests as a means to determine graduation (Daley D., 2000; Downs, 1999; Walsh, 2000a, 2000b; Wilson, 2000; also, http://www.massparents.org). Finally, the Student Coalition Against MCAS (http://www.scam-mcas.org) has distributed position papers on MCAS and testified at school committee meetings. In the spring of 2000, hundreds of students from across the state organized a test boycott, rallies, and the delivery of letters to the governor protesting MCAS (Crittenden, 2000; Gentile and Sukiennik, 2000; Hayward, 2000d; Steinberg, 2000; Scherer, 2000; Shartin, 2000; Sweeney, 2000; Tantraphol, 2000b; Thesing, 2000; Vaishnav & Vigue, 2000; Vigue and Yaekel, 2000).



--------------------------------------------------------------------------------


Participation in and results of MCAS testing

In May 1998, Massachusetts administered its first round of testing in English/language arts, mathematics, and science/technology to 201,749 students. With participation required of virtually all students, including students with disabilities and those whose first language is not English, the testing pool included 96.6% of all students enrolled in the fourth, eighth, and tenth grades. Although the 1999 participation rate for students with disabilities dropped by several percentage points in each grade tested, overall participation for the second year of testing was similar to that of 1998 (Massachusetts Department of Education, October 1999a; Massachusetts Department of Education, November 1999).

MCAS results are reported in four categories: "Advanced," "Proficient," "Needs Improvement," and "Failing." Results of the first round of testing reported in December 1998 were disappointing. Only in the Grade 8 English/language arts test did more than half the students score in the "Advanced" and "Proficient" categories. In May 1999, another 215,045 students sat for the second round of testing, and results reported in December 1999 had changed little from the previous year. As in 1998, only in English/language arts in Grade 8 did more than half the students score above the "Needs Improvement" category, results Governor Cellucci called "unacceptable" (Mashberg, 1999).

Results for African American and Latino students were more disheartening. According to an analysis of 1998 results prepared by the Mauricio Gaston Institute for Latino Community Development and Public Policy at the University of Massachusetts in Boston (http://www.gaston.umb.edu), 49% of the African American and 58% of the Latino tenth graders failed the English portion of MCAS compared with 19% of the state's white tenth graders. While 80% of African American and 83% of Latino tenth graders failed mathematics, 43% of the state's white tenth graders did so (Uriarte & Chavez, 2000). The Gaston Institute's analysis also revealed that high numbers of minority students had failed MCAS in specific districts. Cities like Boston, Springfield, Worcester, New Bedford, Lowell, Holyoke, Fitchburg, Chicopee, Chelsea, and Salem posted still higher failing rates for their African American and Latino tenth graders than for those students statewide. A Department of Education analysis based on 1999 scores revealed similar patterns (http://www.doe.mass.edu/mcas/race_report99/default.html). In response, researchers, parents, and community leaders warned that without a change in the policy linking MCAS scores to graduation, urban students would begin to drop out of school in increasing numbers (Allen, 1999; National Center for Fair and Open Testing, 2000; Rodriguez, 1999; Vigue, 2000).

MCAS "Failing" scores also put large numbers of students with disabilities and students whose first language is not English at high risk of leaving high school without a diploma. According to reports released by the Massachusetts Department of Education, 64% of the tenth graders with disabilities and 59% of students with limited English proficiency received a "Failing" score on MCAS in 1998. Failing rates were still higher in 1999 when 71% of the tenth graders with disabilities and 66% of students with limited English proficiency received "Failing" grades (Massachusetts Department of Education, October 1999a; Massachusetts Department of Education, November 1999).



--------------------------------------------------------------------------------


Challenges to MCAS

Although MCAS results have been announced with great fanfare and assurances of their reliability, MCAS may, in fact, mistakenly classify competent students as ill-prepared for life after high school. Comparing the scores of students who had taken both the MCAS and one of four other standardized tests, namely the Iowa Test of Basic Skills (ITBS), the Stanford 9 Achievement Tests (SAT9), the Educational Research Bureau tests (ERB), and the Preliminary Scholastic Achievement Test (PSAT), researchers have found that many students with scores in the upper ranges on each of these four tests could readily fall into any one of the four MCAS score categories (Horn, Ramos, Blumer, & Madaus, 2000). Likewise, the state Board of Education's own technical report summary notes that student achievement as measured by national tests varies widely within MCAS categories. This October 1999 report noted that the group of fourth graders receiving "Failing" MCAS scores in 1998 included students who had received "average" scores, up to the 50th percentile, on the Grade 3 1997 Iowa Test of Basic Skills (ITBS). Further, fourth graders receiving "Needs Improvement" MCAS scores included students whose ITBS scores ranged from the 30th percentile to the 80th percentile (Massachusetts Department of Education, 1998 MCAS Technical Report Summary, Figure 2, 1999). These findings emphasize the possibility that MCAS scores may unreliably report what Massachusetts students know and can do and may result in misclassifying individual students into the "Failing" or "Needs Improvement" categories when, in fact, their achievement is above average.

Professional associations and university educators have also cautioned against the potential misuse of MCAS. In a statement to the Massachusetts legislature, Jacob Ludes III, Executive Director of the New England Association of Schools and Colleges has warned, "The notion that a single high-stakes test can be used to set policy, and reward or punish schools and the children in them, is indeed appalling" (Katz, 2000). Harvard professor Vito Perrone, summarizing an analysis of publicly-released 1998 MCAS items, has raised additional concerns about both the educational quality of MCAS questions and the effects of MCAS on student learning and engagement. Writing on behalf of the Coalition for Authentic Reform in Education (CARE), Perrone (CARE, 1998: 4) states:

In general, the conclusion [from a review of MCAS items] was that the tests were very much like all the other standardized tests we have reviewed over the years; that they might be difficult for many students, not because they are particularly rigorous or challenging, but because they are long, tedious, lacking in a genuine performance base, and filled with ambiguities; and that the tests are likely to dampen student achievement by undermining quality.

A year later, in a similar review of items made public after the second round of MCAS, Perrone (CARE, 1999: 3, 25) added:

In general, the 1999 tests are much like the previous year's tests.... MCAS tests were described as generating no genuine interest, viewed by students as tedious and ambiguous, representing a lack of trust in their abilities and commitments and constituting a waste of their time. It is not surprising that the tests didn't receive students' best efforts.



--------------------------------------------------------------------------------


Students' perceptions of standardized testing

The implication that the most important challenge to MCAS comes from the responses of students to the tests invites further inquiry into students' perceptions of MCAS. If students judge a test as unworthy of their participation, scores from even well designed assessments will not accurately reflect what they have learned in school or what they can do with their knowledge. If students do not put their best effort into testing, both test results and the value of the test as an educational tool are called into question. In addition, variations in student anxiety, stress, fatigue, and motivation to learn can compromise test results, a phenomenon known as "test score pollution" (Haladyna, Nolen, & Haas, 1991; Madaus, 1988).

Using students' drawings to explore their perceptions of high stakes testing

With a view to understanding the phenomenon of test score pollution, a few researchers have surveyed and interviewed students (Debard & Kubow, 2000; Haney & Scott, 1987; Paris, Lawton, Turner, & Roth, 1991; Paris, Herbst, & Turner, in press; Thorkildsen, 1999; Urdan & Davis, 1997). However, although surveys and interviews are valuable tools for gathering information from students, they are often difficult for teachers, school leaders, or community decision-makers to use in their classrooms, schools, and communities. In contrast, asking students to draw a picture of themselves as test-takers in order to examine their perceptions of testing is cost-effective, unobtrusive, and compatible with many classroom routines. Using drawings to elicit impressions of schooling can capture the perspectives of students for whom reading or completing survey forms might be difficult, including students with disabilities and students whose first language is not English. Teachers seeking a way to document changes in classroom organization, teaching, and learning can easily collect drawings for reflection and discussion (Haney, Russell, & Jackson, 1997; Tovey, 1996). In addition, drawings are proving to be a valid and reliable way of illuminating how individual students and groups of students understand their own learning processes (Lifford, Byron, Eckblad, & Ziemian, 2000; Russell & Haney, 1999). See "Using drawings to spur reflection and change."

Despite these advantages, we acknowledge potential drawbacks to drawings as a form of research and inquiry. Based on past experience, we know that when asked to draw aspects of their learning and school experience, students sometimes fall back on visual stereotypes, or highly unusual but memorable events. Also, we have seen that older students, in their mid-teens, sometimes decline to draw because they view drawing as childish or have been taught to doubt their own artistic ability. Additionally, we have learned that it can be hazardous to interpret the meaning of individual drawings without being able to talk with the artists who created them (see sidebar). Ideally, in order to inquire more fully into students' reactions to high stakes testing, we would like to use not just drawings, but also interviews and observations. Nonetheless, we think that this exploratory study provides a unique window on the perspectives of a group whose views are far too often ignored in debates about high stakes testing, namely the students who are subject to such testing.

Gathering drawings from Massachusetts classrooms

In May 1999, shortly after the second round of MCAS testing ended, we sent an e-mail invitation to a small listserv of Massachusetts teachers asking for their help in an exploration of students' perceptions of MCAS based on students' drawings of themselves as MCAS test-takers. Fifteen educators from fifteen schools in eight different districts responded, and in June, all asked their students to follow the simple prompt: "Draw a picture of yourself taking the MCAS." Subsequently, these teachers sent us drawings by 411 students. Of these, 303 (73.7%) were from 4th graders, 58 (14.1%) were from 8th graders, and 50 (12.2%) were from 10th graders. Disaggregated by type of community, the 411 drawings included 109 (26.5%) from classrooms described as urban, 209 (50.9%) from classrooms described as suburban, and 93 (22.6%) from those described as rural.

We asked teachers to note the student's grade level and the general location of each school (urban, suburban, or rural) on the back of each drawing. We assured teachers that any presentation of the drawings or a summary of patterns would not identify contributing students, teachers, schools, or communities. We have received their permission to use the drawings anonymously to illustrate the general patterns and findings from our sample.

Developing a coding scheme

Over the fall of 1999, we used a random sample of drawings to identify surface features apparent in the drawings, then clustered those features into a coding scheme reflecting broad categories that emerged. Our intention was not to look at the drawings through a psychoanalytic lens, but rather to describe facets of the testing situation that students chose to include in their drawings. To a great extent, then, we listed explicit aspects of the drawings, including such unambiguous features as student postures, testing materials, and the presence of other students or teachers. In addition, the coding scheme included affective responses that were clearly discernible in the drawings. Finally, we allowed for space to note specific and individual features from students' self-portraits, including comments written in thought bubbles, speech bubbles, or captions. (For a copy of the coding scheme used, see Appendix A.)
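To make the tallying step concrete, here is a minimal sketch in Python of one way a coding sheet of this kind might be represented and tallied. The category names and coded drawings below are hypothetical stand-ins invented for illustration; the authors' actual scheme appears in Appendix A.

```python
from collections import Counter

# Hypothetical category names standing in for the real coding scheme
# (see Appendix A); categories are not mutually exclusive.
CATEGORIES = [
    "student_alone", "test_booklet_visible", "bubbles_shown",
    "diligence", "anxiety", "anger", "boredom",
]

# Each coded drawing is the set of categories a rater checked off;
# a single drawing can carry several codes at once.
coded_drawings = [
    {"student_alone", "test_booklet_visible", "diligence"},
    {"student_alone", "anxiety"},
    {"student_alone", "test_booklet_visible", "bubbles_shown", "anger"},
]

# Tally how many drawings carry each code and report percentages.
tally = Counter(code for drawing in coded_drawings for code in drawing)
n = len(coded_drawings)
for category in CATEGORIES:
    print(f"{category}: {tally[category]}/{n} ({100 * tally[category] / n:.1f}%)")
```

Because the categories are not mutually exclusive, the percentages for the different features need not sum to 100, which matches how the findings are reported below.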

Coding the drawings

Following the development of the rubric, we coded each drawing individually, noting the characteristics listed in the rubric as well as any additional characteristics, including written commentary by students. Categories were not mutually exclusive, and single drawings encompassing multiple characteristics were coded for all characteristics. All drawings were coded by the first author. In addition, a graduate assistant also coded a randomly selected sample of 40 drawings to assess the inter-rater reliability of the coding scheme, and the first author recoded another set of 60 randomly selected drawings to assess intra-rater reliability. The agreement in independent ratings was greater than 90%. To study the reliability of the coding further, we had five graduate students independently code a random sample of drawings. In this study we used not just percent agreement, but also Cohen's kappa, which adjusts observed agreement for the probability of chance agreement. Across all ten pairs of comparisons, the mean Cohen's kappa was 0.67. In his methodological note on kappa in Psychological Reports, Kvalseth (1989) suggests that a kappa coefficient of 0.61 represents "reasonably good" overall agreement. Our results show that the reliability of coding the drawings surpasses the standard suggested by Kvalseth. Finally, in January 2000, we met with educators who had submitted the drawings. Reviewing the drawings together, we elicited their comments, discussed their interpretations based on their knowledge of testing conditions, and considered the implications.
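For readers unfamiliar with the statistic, Cohen's kappa can be computed directly from two raters' labels for the same items. The sketch below implements the standard definition, kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is the agreement expected by chance; the rater data are hypothetical and are not drawn from the study's sample.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement adjusted for the agreement
    expected by chance from each rater's marginal proportions."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: summed products of the raters' marginals.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in freq_a.keys() | freq_b.keys())
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes from two raters judging 10 drawings for one
# category (1 = feature present, 0 = absent).
rater_1 = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]
rater_2 = [1, 0, 1, 1, 0, 1, 0, 0, 0, 0]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # kappa = 0.58
```

With five raters, the same calculation would be run for each of the ten rater pairs and the resulting coefficients averaged, matching the mean kappa of 0.67 reported above.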

Analyzing the findings

Given our dependence on busy teacher volunteers, the sample used in our analysis is an opportunity sample and does not represent all Massachusetts students in the grades tested, or all students in urban, suburban, or rural communities. Recognizing the limitations of our sample, then, we have analyzed and reported our findings so as to reduce misinterpretation as much as possible. To this end, we have combined drawings gathered from eighth and tenth graders into a larger sample of drawings from secondary students. In addition, some of the differences in proportions we report for various categories of responses may not be real but may simply reflect the chance variation of our sampling approach. For these reasons, we generally avoid discussing differences in the proportions of drawings showing particular features when those differences are less than 0.10, or 10 percentage points.
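The 10-point rule of thumb can be motivated with a back-of-the-envelope calculation that the authors do not present, and that assumes random sampling, which an opportunity sample does not strictly satisfy. The sketch below uses the study's subsample sizes (109 urban and 209 suburban drawings) with illustrative feature rates to approximate how large chance variation in a difference of proportions can be.

```python
import math

def se_diff_proportions(p1, n1, p2, n2):
    """Approximate standard error of the difference between two
    independent sample proportions."""
    return math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

# Illustrative values: a 15% vs. 10% feature rate in the urban (n=109)
# and suburban (n=209) subsamples.
se = se_diff_proportions(0.15, 109, 0.10, 209)
margin = 1.96 * se  # approximate 95% margin of error
print(f"5-point difference; 95% margin is about ±{100 * margin:.1f} points")
```

In examples like this one the margin of error comes to roughly 8 points, so a 5-point gap is well within what chance alone could produce, while gaps of 10 points or more generally are not, consistent with the authors' threshold.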



--------------------------------------------------------------------------------


Students' description of MCAS difficulty, length and content

The drawings provide a variegated picture of how students in elementary, middle, and high school grades view high stakes testing. Overall, our sample of student drawings depicted the task of sitting through the multiple MCAS sessions as a solitary experience. Almost three-quarters of the drawings (70.7%) showed students seated alone at their desk or table. By comparison, 9.0% of the drawings included other students, and 2.7% included an adult, presumably a teacher. A little over a third (36.8%) of the drawings showed students both seated and writing, while almost half (45.9%) showed them seated but not writing. The test booklet was visible in about three quarters (76.3%) of the drawings; 41.0% of all drawings depicted the test booklet with writing, and 12.7% showed "bubbles" to be filled in.

Of the 411 drawings, 152 (37.0%) contained no evidence of what students thought or felt about taking the MCAS. These self-portraits convey a picture of students complying with the MCAS requirement with no marked reaction to the task at hand. Contrasting with these relatively neutral drawings, the remaining 259 drawings provided some explicit information about students' perceptions of MCAS. Using thought bubbles, speech bubbles, written captions, and unambiguous postures or gestures, students critiqued the test, referring to test difficulty, content, and length.

Perceptions of test difficulty

About one out of six (17.6%) drawings referred to test difficulty in captions or thought bubbles. Students were more than five times more likely to describe the test explicitly as "hard" (9.3%) than as "easy" (1.7%). Some described MCAS as a mixture of hard and easy items. Drawings from fourth graders and urban students were most likely to allude to test difficulty. Urban students (15.6%) were more likely than suburban students (5.7%) to describe MCAS as "hard."

Perceptions of test content and items

A second group of drawings conveyed students' reactions to MCAS content, including content they found "tricky" or confusing. About one out of twelve drawings (8.5%) included question marks, often in thought bubbles, sometimes without text, sometimes as part of specific questions. In some of these drawings, students pictured themselves asking for help from the teacher. In others, they raised questions about specific test items, as in one drawing that asked, "Who was Socrates? Who was Socrates? What kind of question is that." In still other drawings, students' questions conveyed simply "What? What? What?" or "What is this?" or "Huh?" A handful of drawings offered self-portraits of students who appeared to be "stuck" or "blanked out" in response to test content.

Perceptions of test length

Six percent of all of the drawings alluded to the length of the MCAS, in terms of the number of pages, the time required to take it, or both. Urban students were most likely to draw themselves as taking a test they described as "too long." While only 3.3% of suburban students and none of the rural students commented on test length, 16.5% of the drawings from urban students described MCAS as "too long."

One eighth grader, steam coming from her ears, drew herself with a test booklet of 6,021,000 pages in front of her. A fourth grader labeled a booklet of 1,000,956,902 pages with the title "Stinkn' test" and portrayed herself saying "TO (sic) MUCH TESTING." Another frowning student drew herself thinking "5 pages more." Drawings also portrayed students saying, "Is it over yet?" and, perhaps alluding to the daily repetition of the MCAS ritual, exclaiming, "Not MCAS again!" A related set of drawings, while not referring directly to test length, portrayed students as feeling tired or rushed to complete the test.



--------------------------------------------------------------------------------


Students' affective responses to MCAS

A number of students' portraits of themselves as test-takers departed from critiquing the test itself and instead delineated the wide range of affective responses students have toward MCAS. In some drawings, students' reactions to MCAS were generally positive or negative, and we coded these as "nonspecific positive" or "nonspecific negative." While 2.9% of all students drew pictures that conveyed a "nonspecific positive" response to MCAS, over six times that number (19.3%) used the drawings to communicate a "nonspecific negative" response. We considered drawings that communicated "I like MCAS" as "nonspecific positive" responses. We coded drawings with captions such as "I hate MCAS," "This test is stupid," and "This feels like jial (sic)" as "nonspecific negative" responses. Other drawings conveyed more distinct personal attitudes and feelings toward MCAS, both positive and negative.

Positive responses to MCAS


Additional drawings displayed particular positive responses to MCAS, including diligence and persistence, thinking and problem solving, and confidence, and we coded them accordingly. These categories are not mutually exclusive, and some drawings were coded in several categories.

Diligence and persistence

Many who put their faith in high stakes testing believe that attaching consequences to test scores will push otherwise lackadaisical students to take testing seriously (Manzo, 1997). For example, Martha Wise, president of the Ohio State Board of Education asserts, "Unless we say these [tests] are tremendously important for our students and then tie high stakes to them, students and others will tend to find excuses for not taking the tests [or] for not achieving high scores" (Lawton, 1997).

In our sample, 18.0% of the drawings offered portraits of students as diligent and motivated test-takers. In this category, we included drawings in which students presented themselves as thinking, solving problems, confident, or working hard for an "A" or "100." Fourth graders were most likely to portray themselves as diligent and persistent (21.5%), compared with 8.3% of eighth and tenth graders. Drawings from urban and suburban students were more likely (at 21.1% and 20.1% respectively) to suggest diligence and persistence than those of rural students (9.7%).

Thinking and solving problems

Supporters of high stakes testing in Massachusetts assert that MCAS is better than other standardized tests in that its questions stir students to use critical thinking skills. In 7.3% of the drawings, students depicted themselves thinking and solving problems. In some of these drawings, students were shown considering specific questions from the English and mathematics portions of the test. In others, they appeared engaged in a thinking process, sometimes involving the weighing of various answers to problems, sometimes using test-taking skills. Fourth graders (9.2%), suburban students (9.1%), and rural students (8.6%) were slightly more likely than the sample as a whole to draw themselves using thinking skills, solving problems, or using test-taking skills.

Confidence

Students approach testing with different views of themselves as learners and different levels of confidence. Those who view themselves as confident learners and test-takers may be more inclined to persist, even when test items are tedious and ambiguous. Confident test-takers may be more likely to ponder each test item rather than guess or search for the single right answer. Confident test-takers may also be more likely to check their work and correct mistakes.

In our sample, 5.4% of all drawings depicted "confident" test-takers. These drawings highlighted students as self-regulators of their work. Some showed no signs of students' working for specific marks on MCAS; others portrayed students anticipating an "A" or "100." Fourth graders (6.3%) and suburban students (8.6%) were slightly more likely to depict themselves as confident test-takers, but given our sample size these differences are not statistically significant.

Negative responses to MCAS

Drawings also conveyed distinct negative responses to MCAS, including anxiety, anger, pessimism, boredom, and loss of motivation.

Anxiety

Since the first administration of MCAS, reports of schools' efforts to respond to students' anxiety have circulated widely. In one suburban Boston school, where student absence, illness, and headaches have been attributed to anxiety about MCAS, teachers arranged for a yoga instructor to teach fourth graders yogic breathing before MCAS testing (Hayes, 1999). Other schools have sought to relieve student fears about testing (whether general worries about failing, lack of time to think, and "blanking out," or specific concerns about multiplying decimals, spelling, and historical dates) through after-school test preparation programs (Yaekel, 2000). Mixed messages abound as teachers prepare students in test-taking skills while reminding them "Don't stress; it's only a test." On one hand, experts counsel parents to downplay MCAS results (Meltz, 1999). On the other, Kaplan/Simon & Schuster has promoted a "No-Stress Guide to the Eighth Grade MCAS" for parents to use at home. As Kaplan publisher Maureen McMahon commented, "Anyone could see there was a need for a guide that would take some of the anxiety away from this process" (DiLorenzo, 2000).

We coded 13.4% of the MCAS drawings from all grades as showing anxiety. These included students' self-portraits showing them sweating or commenting on the test as "nerve-wracking." Other drawings coded for anxiety included thought bubbles with a prayer or wish for the arrival of help (as distinct from asking for help from the teacher). Still others alluded to fear of failing and having to go to summer school. Students at all grade levels and in all kinds of communities portrayed themselves as worried about MCAS. In our sample, the rate of drawings presenting anxiety was not dramatically different for fourth graders (14.2%) and secondary students (11.1%), or for urban (17.4%), rural (14.0%), and suburban students (11.0%).

Anger and hostility

Ten percent (10%) of the drawings portrayed students as angry about MCAS testing. These drawings went beyond the "I hate MCAS" message of the "nonspecific negative" drawings. Some portrayed students as simply "mad." Others included thought bubbles in which students were setting fire to MCAS or marching on City Hall. Some drawings detailed the reasons for some students' hostility: their encounter with "hard problems" or content they were not familiar with, the belief that time spent in testing was time stolen from learning, and the feeling that the test was designed to reveal "what you don't know."

Drawings that distinctly conveyed hostility varied considerably by grade level and kind of community. While 6.6% of the fourth graders pictured themselves as angry, 19.4% of the eighth and tenth graders portrayed themselves in this way. Urban students were four times as likely as rural students to depict themselves as angry: 17.4% and 4.3% respectively.

Boredom

In our sample, 4.9% of the drawings highlighted boredom as a response to MCAS. Some students conveyed this reaction in commentary on test questions as "easy, but boring." Others drew themselves with thought bubbles, noting "I am sooooo bored. This is really annoying."

Secondary students were generally more likely to depict themselves as bored, with 10.2% of the eighth and tenth grade drawings communicating boredom as compared with 3.0% of those from fourth graders. Eleven percent (11.0%) of urban students portrayed themselves as bored with MCAS.

Sadness, disappointment, and pessimism

Drawings also depicted students as sad or pessimistic about their experience with MCAS. We coded 2.7% of the drawings as "sad." Another 2.2% of the drawings contained explicit references to students anticipating failure, grade retention, or a poor score. A few students depicted themselves as disappointed, including one who drew a large heart cracked through the middle with the caption "Heart broken because of MCAS." Urban students were slightly more likely than others to portray themselves as sad, disappointed, or pessimistic.

Loss of motivation and withdrawal from testing

After days of testing on material that may seem confusing, ambiguous, or unfamiliar, students who feel anxious, angry, test-weary, or pessimistic about their prospects for success are not likely to put sustained effort into testing. In fact, during the testing period in May 1999, absenteeism was up in some districts, while in others, participating students "just put any old answer down," stopped answering questions, and put their heads down on their desks (Berard & Pearlman, 1999; Curtis, 1999a; Johnson, 1999; O'Shea, 1999). As one student from the first class required to pass MCAS for graduation observed, "After the first two days of tests, your fingers and your mind hurt. A lot of kids didn't try after that" (Curtis, 1999b).

MCAS drawings showed student effort stalled in various postures. Several artists described how they felt fresh and eager in the early hours of MCAS but petered out and became careless as the hours and days of testing continued. In 5.3% of the drawings, students portrayed themselves sleeping during testing, or daydreaming about things unrelated to MCAS. Secondary students (7.4%) and urban students (6.4%) were slightly more likely to show themselves as sleeping through MCAS.

Relief

A final set of drawings, representing 3.9% of all drawings, depicted students as relieved that testing had ended. Some students portrayed themselves proclaiming the test "done" while others cheered, "Yeah, it's all over!" Drawings from eighth graders (5.2%) and urban students (6.4%) were most likely to convey relief that testing had finished.



--------------------------------------------------------------------------------


Conclusion

Our study of students' self-portraits as MCAS test takers began with an invitation to Massachusetts teachers to gather drawings from their fourth, eighth, and tenth grade students following the second round of MCAS testing in 1999. The drawings generated by this invitation have allowed us to explore how students respond to MCAS testing. Although we have been working with a small opportunity sample of drawings, the patterns that emerge, we believe, raise questions about the assumptions that undergird high stakes testing policies. In particular, the drawings challenge the belief that the high stakes associated with MCAS will enhance the motivation and effort of students in a uniform way. To the contrary, the considerable range of responses to MCAS from one student to another, by grade level, and by school location suggests that the connection of high stakes testing to students' motivation is not as simple as policy-makers often assume. These variations also invite reflection on how students' perceptions of high stakes testing may interact with other aspects of their schooling. We discuss the patterns that emerge in the MCAS drawings in Part 2 of this paper.




--------------------------------------------------------------------------------


References

Allen, M. (1999). Panel blasts MCAS: Minorities don't get fair shake, group says. New Bedford Standard-Times, 16 November: A01.

Associated Press. (1999). MCAS changes considered. Northampton Gazette, 3 July: http://www.gazettenet.com/schools/07031999/13970.htm.

Associated Press. (2000). Teacher opposed to MCAS tests. Northampton Gazette, 1 April: http://www.gazettenet.com/schools/04012000/23647.htm.

Barry, S. (2000). Pressure on schools to improve. Springfield Union-News. 13 April: http://www.masslive.com/news/pstories/ae413mca.html.

Berard, D. & Pearlman, H. P. (1999). Latest MCAS scores: Good, bad, and ugly. Lowell Sun, 8 December: 01.

Coalition for Authentic Reform in Education (CARE). (1998). MCAS Review. Unpublished report, 23 November.

Coalition for Authentic Reform in Education (CARE). (1999). MCAS Review #2. Unpublished report, 10 December.

Crittenden, J. (2000). Boycotters score points as most students take MCAS, Boston Herald, 13 April: 03.

Curtis, M. J. (1999a). MCAS controls their future. New Bedford Standard-Times, 12 December: 01 (http://www.s-t.com/daily/12-99/12-12-99/a01lo005.htm).

Curtis, M. J. (1999b). Scores draw mixed reaction from parents, students. New Bedford Standard Times, 8 December: 01 (http://www.s-t.com/daily/12-99/12-08-99/a01lo013.htm).

Daley, B. & Vigue, D.I. (2000). Mass. seen as ground zero in test tussle, Boston Globe, 11 April: B01.

Daley, D. (2000). Parents lash back at MCAS, Newton Tab, 27 April.

Debard, R. & Kubow, P. K. (2000). Impact of Proficiency Testing: A Collaborative Evaluation. Report prepared for Perrysburg (OH) Schools. Bowling Green State University and Partnerships for Community Action.

Drake, C. (2000). Flaws of MCAS justify boycott. Northampton Gazette, 25 April: http://www.gazettenet.com/04252000/opinion/24535.htm.

DiLorenzo, J. (2000). A high-stakes revolution: Across the state, parents, students, and educators are saying 'no' to the controversial MCAS, 13 January: 16, 19, 21.

Dorn, S. (1998). The political legacy of school accountability systems. Educational Policy Analysis Archives 6(1), 2 January: http://olam.ed.asu.edu/epaa/v6n1.html.

Downs, A. (1999). Cambridge parents pledge to end MCAS. Boston Sunday Globe City Weekly, 31 October: 01.

Gentile, D. (2000a). Monument students working to remove MCAS requirement. Berkshire Eagle, 14 April: 01.

Gentile, D. (2000b). Some students to shun MCAS. Berkshire (MA) Eagle, 11 April: 01.

Gentile, D. & Sukiennik, G. (2000). Students at Wahconah, Monument stage 'quiet protest' against MCAS. Berkshire Eagle, 13 April: 01.

Greenebaum, M. (2000). School visits: MCAS alternative. Northampton Gazette, 14 April: http://www.gazettenet.com/04142000/opinion/24160.htm.

Groves, M. & Richardson, L. (2000). 'Test Prep' Moving Into Primary Grades, Los Angeles Times, 1 April: A1.

Guenther, W. (2000). MCAS tests skills that matter. Boston Globe, 25 April: E4.

Haladyna, T. M., Nolen, S. B., & Haas, N. S. (1991). Raising standardized achievement test scores and the origins of test score pollution. Educational Researcher 20(5), June-July: 2-7.

Haney, W., Russell, M., & Jackson, L. (1997). Using drawings to study and change education and schooling. Research proposal to the Spencer Foundation from the Center for the Study of Testing, Evaluation, and Educational Policy at Boston College, September 1997.

Haney, W. & Scott, L. (1987). Talking with children about tests: An exploratory study of test item ambiguity. In Freedle, R. O. and Duran, R. P. (Eds.). Cognitive and Linguistic Analyses of Test Performance. Norwood, NJ: Ablex.

Hayes, K. (1999). Stressed out children gaining relief in yoga. Boston Sunday Globe South Weekly, 6 June: 01.

Hayward, E. (2000a). Ex-teacher of the year now fights MCAS. Boston Herald, 27 March: 20.

Hayward, E. (2000b). Hub teachers, local leaders agree to back MCAS exam. Boston Herald, 15 April: http://www.bostonherald.com/news/local_regional/mcas04152000.htm.

Hayward, E. (2000c). Minority pupils lagging on MCAS exams. Boston Herald, 18 May: http://www.bostonherald.com/news/local_regional/mcas05182000.htm.

Hayward, E. (2000d). Schools gear up for MCAS: Boycotting kids face penalties as testing begins. Boston Herald, 12 April: http://www.bostonherald.com/news/local_regional/mcas04122000.htm.

Hellman, S. (2000). Parents unite as revolt widens against MCAS. Boston Sunday Globe West Weekly, 16 April: 01.

Heubert, J. P. & Hauser, R. M., (Eds.). (1999). High Stakes: Testing for Tracking, Promotion, and Graduation. Washington, DC: National Research Council, National Academy Press. (Available on line at http://books.nap.edu/html/highstakes/index_pdf.html.)

Horn, C., Ramos, D., Blumer, I., & Madaus, G. (2000). Cut scores: Results may vary. NBETPP Statements, Vol. 1, No. 4, National Board on Educational Testing and Public Policy, Boston College: http://www.nbetpp.bc.edu/reports.html.

Johnson, J. (1999). MCAS brings good news and bad. Norton Town Online, 9 December: http://www.townonline.com/neponset/norton/news/newNMMCAS121027.html.

Katz, M. (2000). Top educator criticizes reliance on MCAS tests. Providence Journal, 2 February.

Kohn. A. (1999). The Schools Our Children Deserve: Moving Beyond Traditional Classrooms and "Tougher Standards." Boston: Houghton Mifflin.

Kvalseth, T. O. (1989). Note on Cohen's kappa. Psychological Reports, 65: 223-226.


Lawton, M. (1997). States' boards leaders call for assessments bearing consequences. Education Week, 22 October: http://www.edweek.org/ew/1997/08nasbe.h17.

Lemann, N. (1999). The Big Test: The Secret History of the American Meritocracy. New York: Farrar Strauss & Giroux.

Lifford, J., Byron, B., Eckblad, J., & Ziemian, C. (2000). Reading, responding, reflecting. English Journal 84(4). March: 46-57.

Lindsay, D. (2000). CON-test. Education Week, 5 April, http://www.edweek.org/ew/ewstory.cfm?slug=30mass.h19.

Linn, R. (2000). Assessments and accountability. Educational Researcher 29(2), March: http://www.aera.net/pubs/er/arts/29-02/linn01.htm.

Lord, R. (1999). Harwich teacher refused to hand out MCAS test. Cape Cod Times, 3 June.

Madaus, G. F. (1988). The influence of testing on curriculum. In L.N. Tanner (Ed.), Critical issues in curriculum: Eighty-seventh yearbook of the National Society for the Study of Education (pp. 83-121). Chicago, IL: University of Chicago Press.


Manzo, K. K. (1997). High Stakes: Test Truths or Consequences. Education Week, 22 October: http://www.edweek.org/ew/1997/08nc.h17.

Marcus, J. (1999). The shocking truth about our public schools: They're better than you think. Boston Magazine, October: 70-81; 138-141.

Mashberg, T. (1999). Mass. kids join legions with poor test scores. Boston Herald, 14 November: 01.

Massachusetts Department of Education. (October 1999a). Massachusetts Comprehensive Assessment System: 1998 Technical Manual.

Massachusetts Department of Education. (October 1999b). Massachusetts Comprehensive Assessment System: 1998 MCAS Technical Report Summary.

Massachusetts Department of Education. (November 1999). Massachusetts Comprehensive Assessment System: Report of 1999 State Results.

McFarland, C. (1999). Education board increases number of MCAS tests. Worcester Telegram and Sun, 26 May: 01.

McNeil, L. (2000). Contradictions of Reform: Educational Costs of Standardized Testing. New York: Routledge.

McNeil, L. & Valenzuela, A. (2000). The harmful impact of the TAAS system of testing in Texas: Beneath the accountability rhetoric. Harvard University, The Civil Rights Project: http://www.law.harvard.edu/civilrights/conferences/testing98/drafts/mcneil_valenzuela.html.

Mehrens, W. A. (1998). Consequences of assessment: What is the evidence? Educational Policy Analysis Archives 6(13), 14 July: http://olam.ed.asu.edu/epaa/v6n13.html.

Meier, D. (2000). Will standards save public education? Boston: Beacon Press.

Meltz, B. F. (1999). How to handle MCAS scores. Boston Sunday Globe City Weekly, 11 November: 01.

National Center for Fair and Open Testing (FairTest), (2000). MCAS: Making the Dropout Crisis Worse, MCAS Alert, September: http://www.fairtest.org/care/MCAS%20Alert%20Sept.html.

Noble, A. J. & Smith, M.L. (1994). Old and new beliefs about measurement-driven reform: 'Build it and they will come.' Educational Policy 8(2): 111-136.

Ohanian, S. (1999). One Size Fits Few : The Folly of Educational Standards. Portsmouth, NH: Heinemann.

O'Shea, M. E. (1999). Absenteeism seen as a factor in state test results. Springfield Union-News, 10 December: http://www.masslive.com/news/stories/ae129atv.html.

Paris, S.G., Lawton, T.A., Turner, J.C., & Roth, J.L. (1991). A developmental perspective on standardized achievement testing. Educational Researcher 20(5), June-July: 12-20.

Paris, S. G., Herbst, J.R., & Turner, J.C. (In press.) Developing disillusionment: Students' perceptions of academic achievement tests. Issues in Education.

Penniman, B. (2000). After great expectations, hard times: Why assessment in Massachusetts undermines the early promise of education reform. The State Education Standard 1 (Journal of the National School Boards Association). Spring: 22-26.

Rodriguez, C. (1999). MCAS stirs fears that minorities will drop out. Boston Globe, 12 November: A01.

Russell, M. & Haney, W. (1999). Validity and reliability gleaned from information in student drawings. Paper presented at the annual meeting of the American Educational Research Association, Montreal.

Sacks, P. (1999). Standardized Minds: The High Price of America's Testing Culture and What We Can Do To Change It. Cambridge, MA: Perseus Books.

Scherer, M. (2000). Political statement about MCAS. Northampton Gazette, 12 April: http://www.gazettenet.com/schools/04122000/24076.htm.

Shartin, E. (2000). High stakes for making the grade. Metrowest Daily News, 16 May.

Shea, C. (2000). It's come to this. Teacher Magazine, May: http://www.teachermagazine.org/tm/tm_printstory.cfm?slug=08profit.h11.

Stake, R. (1998). Some comments on assessment in U.S. education. Educational Policy Analysis Archives, 6(14): http://olam.ed.asu.edu/epaa/v6n14.html.


Steinberg, J. (2000). Bluebooks closed, students protest state tests. New York Times, 13 April: 01.

Sukiennik, G. (2000). MCAS tests fatally flawed, teachers say. Berkshire Eagle, 26 January: 1.

Sweeney, E. (2000). Students stage MCAS walk-out. Brookline TAB, 13 April: 01.

Tantraphol, R. (2000a). Educator: Delay test-diploma link. Springfield Union-News, 15 March: http://www.masslive.com/news/pstories/ae315tan.html.

Tantraphol, R. (2000b). Students boycott 'unfair' test. Springfield Union-News, 13 April: http://www.masslive.com/news/stories/ae413not.html.

Thesing, E. (2000). Teens tell governor he is failing them. Boston Globe, 6 June: B02: http://www.boston.com/dailyglobe2/158/metro/Teens_tell_governor_he_is_failing_them-.shtml.

Thorkildsen, T. A. (1999). The way tests teach: Children's theories of how much testing is fair in school. In M. Leicester, C. Modgil & S. Modgil (Eds.) Education, culture, and values, Vol. III. Classroom Issues: Practice, Pedagogy, and Curriculum. London: Falmer.

Tovey, R. (1996). Getting kids into the picture: Student drawings help teachers see themselves more clearly. Harvard Education Letter XII(6): 5-6.

Urdan, T. & Davis, H. (1997, June). Teachers' and students' perceptions of standardized tests. Paper presented at the Interdisciplinary Workshop on Skills, Test Scores, and Inequality. The Roy Wilkins Center for Human Relations and Social Justice, University of Minnesota. Minneapolis, Minnesota.

Uriarte, M. & Chavez, L. (2000). Latino Students and the Massachusetts Public Schools. Boston: Mauricio Gaston Institute for Latino Community Development and Public Policy, University of Massachusetts; http://www.gaston.umb.edu.

Vaishnav, A. & Vigue, D. I. (2000). Hundreds of parents, teachers, students rally against MCAS. Boston Globe, 16 May: B3.

Vigue, D. I. & Yaekel, T. (2000). Hundreds of students boycott MCAS tests. Boston Globe, 13 April: A01.

Vigue, D. I. (2000). Race gap endures on MCAS results: Minority leaders, state plan talks. Boston Globe, 19 May: B01.

Walsh, J. (2000a). Board to be asked to oppose test. Springfield Union-News, 13 April: http://masslive.com/news/pstories/hf413mca.html.

Walsh, J. (2000b). Parents pass word of test opposition. Springfield Union-News, 11 April: http://www.masslive.com/news/pstories/hf411mca.html.

Walsh, J. (2000c). Test content reaches school board. 16 April. http://www.masslive.com/news/pstories/hf415mca.html.

Whitford, B.L & Jones, K. (2000). Accountability, Assessment, and Teacher Commitment: Lessons from Kentucky's reform efforts. Albany: State University of New York Press.

Wilson, C. B. (2000). Parents seek anti-MCAS stand. Northampton Gazette, 12 April: http://www.gazettenet.com/04122000/schools/24078.htm.

Yaekel, T. (2000). Students scramble, sacrifice, sweat in countdown to MCAS. Boston Globe, 10 April: B01.



--------------------------------------------------------------------------------


Acknowledgments

The authors most gratefully acknowledge the contribution of the Massachusetts teachers who generously collaborated with us on this project, gathered drawings, and provided commentary on our early analysis of findings. We also extend special thanks to Mimi Coughlin, Lauren McGrath, Christine Mills, and Genia Young for their invaluable assistance in our data input, reliability analysis, and technical production tasks, and to Christina Capodilupo, Scott Paris, Kathleen Rhoades and Alan Stoskopf for comments on earlier drafts of this paper. We alone take responsibility for our conclusions. Finally, we are grateful to the Spencer Foundation for support in the completion of this project.



--------------------------------------------------------------------------------


Appendix A: Coding matrix (available for download as a PDF)

— Anne Wheelock, Damian Bebell, and Walt Haney

September 2000
http://wwwcsteep.bc.edu/drawoned/mcas/mcaspaper.html



