

Evaluation Of The Office Of Economic Opportunity’s Performance Contracting Experiment

Susan Notes:

When Donald Rumsfeld was head of the Office of Economic Opportunity, he played a role in high-stakes testing and performance contracting. You may recall that after leaving OEO, Rumsfeld was Secretary of Defense from 1975 to 1977 and again from 2001 to 2006. While Rumsfeld was director of OEO, Nixon said of him, "He's a ruthless little bastard. You can be sure of that."

Guess who was Rumsfeld's assistant at OEO? Dick Cheney. And just note how they were experimenting on public school children.

NOTE: The 92-page report is slow to load, but it does come up eventually. This may be history, but teachers will recognize that it reads as up to date as tomorrow's headlines.

Also note: Soon after this report's findings were released, OEO's research and development activities in education were transferred to the National Institute of Education, Department of Health, Education, and Welfare.

The report provides a fascinating account of federal bumbling in local education affairs. I provide some of the highlights, but I recommend that you take a look at the whole thing.

In the introduction the report writers note:


The Texarkana school system initiated an experimental program during the 1969-70 school year directed toward reducing student dropouts. This program was funded primarily by the Department of Health, Education, and Welfare. The private educational firm conducting the program was to be paid on the basis of its success in raising student achievement levels above a minimum guaranteed level of one grade. Although initial reports on the results of the program in early 1970 indicated success, the results were later determined questionable when it was learned that the firm was "teaching to the tests."

Of course now "teaching to the tests" is regarded as "business as usual."

It's interesting that the report names the school district but not the "private educational firm conducting the program." Even more interesting and outrageous is the fact that although initially the criterion for selecting firms, as stated in the request for proposals, was to "limit the selection to those firms which had a demonstrated capability in using existing educational techniques and technologies appropriate for helping disadvantaged children in reading and mathematics," the selectors changed their minds and hired the firms demonstrating the most "innovation." Read below for a description of these firms, which reads like a script for The Three Stooges. And ask yourself if we've learned anything from history.

NOTE: The school districts were selected because of a "high degree of poverty." Forty years later, that high degree of poverty still dominates those school populations. Many instructional methods have come and gone, a zillion tests have been given, but nothing has been done to relieve the poverty.

Here is a finding that is mentioned almost as an aside:

Though OEO did not consider the length of daily class periods significant, research findings indicate that a relationship between achievement and the length of a class period does exist.

Research, in general, indicates:
--Children in primary grades do poorer with longer class periods because of eye fatigue and shorter attention spans.
--Children at the secondary level do better with longer class periods.
--Longer class periods are better for the study of mathematics than for reading.
--Differences in the length of class periods of from 10 to 30 minutes can produce achievement differences for certain target groups and subjects.





May 8, 1973

To the President of the Senate and
the Speaker of the House of Representatives

This is our evaluation of the Office of Economic Opportunity's performance contracting experiment.

WHY THE REVIEW WAS MADE

The General Accounting Office (GAO) evaluated the performance contracting experiment because of its potential impact in education and because it was the Office of Economic Opportunity's (OEO) first major experiment after its designation by the President as the primary research and development arm for the Nation's poor. . . .

BACKGROUND

"Performance contracting" has been defined an an agreement between a local education agency, such as a public school, and a private educational firm, known as a "learning-system contractor." Payment to the contractor is related to some measure of student achievement. In other words, the contractor is paid on the basis of its success in raising the grade levels of students it instructs. Performance contracting is not a program but a method of organizing programs.

Prompted by the initial reports of success of the first performance contracting project, OEO initiated a major educational experiment, both as a conscientious effort to help poor children--OEO believed education was crucial to breaking the poverty cycle--and as a way to provide useful information to the many school districts which were considering such projects.

The OEO experiment, conducted during the 1970-71 school year at an estimated cost of $6 million, was designed to assess the overall impact of remedial reading and mathematics programs conducted by private educational firms. These programs were carried out under performance contracts for students from low-income families performing well below average in the subjects relative to national norms.

The experiment included approximately 27,000 students, 18 school districts, 6 private educational firms, a management support contractor, a test and analysis contractor, and a payment computations contractor (See app. II.) . . . .

HOW WERE PAYMENTS BASED ON STUDENT ACHIEVEMENT TO BE COMPUTED?


Student achievement, as measured by standardized nationally normed pretests and posttests, was to serve as the basis for computing a maximum of 75 percent of the amount which the firms could earn under their contracts. The remaining 25 percent was to be based on the results of five interim performance objective tests which were to be administered to the experimental groups during the school year. . . .
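
For readers who want the arithmetic spelled out, here is a minimal sketch (in Python; the contract ceiling, function name, and figures are hypothetical, since the report gives only the 75/25 proportions) of how a firm's maximum earnings would divide between the two measures:

```python
# A minimal sketch of the 75/25 payment split described above. The report
# specifies only the proportions, not the exact payment formulas, so the
# contract ceiling and names here are hypothetical.

def max_payment_pools(contract_ceiling: float) -> tuple[float, float]:
    """Split a contract ceiling into the two pools the report describes:
    up to 75% earned on pretest/posttest achievement gains, and up to 25%
    earned on the five interim performance objective tests."""
    achievement_pool = 0.75 * contract_ceiling
    interim_pool = 0.25 * contract_ceiling
    return achievement_pool, interim_pool

achievement_pool, interim_pool = max_payment_pools(100_000.00)
print(f"Earnable on achievement tests: ${achievement_pool:,.2f}")  # $75,000.00
print(f"Earnable on interim tests:     ${interim_pool:,.2f}")      # $25,000.00
```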


TESTING PROBLEMS

In one of the more serious incidents, the test monitor at one school district reported that conditions among the various testing locations varied greatly. Some rooms were air conditioned, others were not. At some locations the students were much more unruly, disinterested, and unmotivated than at others, although a lack of motivation was evident at all locations. In the elementary grades, the payment and evaluation tests were given in 1 day each instead of the 2 days recommended by the test and analysis contractor.

The test monitor reported that some students simply marked answers in a purely random pattern without regard to the questions. Some students slept through a considerable part of the test or talked and annoyed their neighbors. He stated that he could verify only the existence of test scores and could not certify that they represented a true measure of each student's capability. He expressed concern over the assumptions that would be made from pretests to posttests on the comparability of the conditions under which the two sets of tests were administered, since he stated that there was no way that the conditions of the first test could be repeated for the posttest. No retesting was conducted at this school district. . . .


INTERIM PERFORMANCE OBJECTIVE TESTS

Interim performance objective tests were to be used as a basis for up to 25 percent of the final payment to the educational firms and as a supplemental measure of the impact of the individual educational firms' instructional programs. The tests were to be given to the experimental students five times during the school year to assess their mastery of the curricular materials to which they had been exposed.

Although the tests were administered, the conditions necessary to insure valid test results could not be met within the limited time before the tests were administered. As a result, the tests were virtually useless for evaluation purposes and questionable as a basis for payment to the firms.

The educational firms were required to submit to the test and analysis contractor three times the number of items (questions) required for each of the five interim performance objective tests that were to be administered. They were also to submit the curriculum objectives on which these tests were based and to document the relationship between each question and the curriculums by the first day of school.

The test and analysis contractor then had to evaluate the data and determine whether the individual test items did or did not reflect a fair and relevant test of the educational firms' curriculums. If the test items were fair and relevant, the contractor was to certify such in writing to OEO; if not, the contractor was to notify OEO and the firms why they were not and recommend improvements. OEO's project manager was to settle any disagreements between the firms and the test and analysis contractor.

After certification, the contractor was to submit the questions to the school districts. The project directors would then randomly select one-third of these questions to be included on the tests.
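
As a rough illustration of that selection step, here is a minimal sketch (in Python, with hypothetical item names, counts, and seed; the report says only that firms submit three times the required number of items and that one-third of the certified pool is randomly selected) of the random draw:

```python
import random

# Minimal sketch of the item-selection step described above: firms submit
# three times the required number of items, and project directors randomly
# select one-third of the certified pool for the actual test. The pool
# contents, counts, and seed here are hypothetical.

def select_test_items(certified_items: list[str], seed: int | None = None) -> list[str]:
    """Randomly draw one-third of the certified item pool for the test."""
    rng = random.Random(seed)
    return rng.sample(certified_items, len(certified_items) // 3)

pool = [f"item_{i:02d}" for i in range(60)]   # a 60-item certified pool
test_items = select_test_items(pool, seed=1971)
print(len(test_items))  # 20 items make up the administered test
```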

After school started, OEO realized that these requirements could not possibly be met before the scheduled test dates. Consequently, the test and analysis contractor did not evaluate and certify these tests before they were given. Moreover, on the basis of its evaluation, the test and analysis contractor concluded that most of the tests were deficient for several reasons.

Although the educational firms were requested to correct these deficiencies and all but one firm's tests were later certified as acceptable, the certifications were of little value since the deficiencies were not corrected until after the tests had been administered. [emphasis added] Moreover, one firm, with the exception of one test in one school district, never provided interim performance objective tests for its three school districts. The tests were made up at the sites by each teacher; consequently, the tests certified were not the tests administered. [emphasis added]

OEO expected these tests to measure the students' progress in the reading and mathematics programs of each educational firm. But insufficient controls over test content and administration precluded such an evaluation. The pass-fail rate for the tests varied significantly among firms and school districts.

For example, in one school district, the firm administered 1,424 mathematics tests to students in grades 7, 8, and 9, and the students failed only 108 of these tests, or less than 7.6 percent. In another school district, the firm administered 1,374 mathematics tests to its students in grades 7, 8, and 9, and the students failed 967 of these tests, or more than 70 percent.

OEO could not determine whether a high or low pass-fail rate was a result of the quality of the instructional programs or the inappropriateness of the tests given. . . . (emphasis added)

This is such an important point. The current U.S. Department of Education does not ask whether the tests given may be inappropriate. It just assumes the quality of the teacher is to blame.

OEO did not use selection criteria specified in request for proposals

Our examination of the firms' proposals indicated that none of the firms selected had the existing educational techniques and demonstrated capabilities initially deemed necessary by OEO. Initially, the criterion for selecting firms, as stated in the request for proposals, was to limit the selection to those firms which had a demonstrated capability in using existing educational techniques and technologies appropriate for helping disadvantaged children in reading and mathematics. However, OEO's evaluation of the firms' proposals was not based upon this criterion but rather upon an assessment of a firm's proposed innovative systems or approaches for helping disadvantaged students.

A brief summary of the firms' stated experience with the approaches proposed to OEO follows.

1. Firm A did not refer to any previous findings using its proposed instructional approach or to the specific curriculum materials that would be used. Moreover, the firm did not provide any information concerning its general capability, corporate and staff experience in applied educational technology and training, or a description of its approach to school-contractor cooperation, including teachers unions, as required by the request for proposals.

2. Firm B claimed to be the Nation's largest system of programed learning centers, with 66 in operation and 50 more scheduled to be opened by October 1970. The operating centers were testing over 1,000 students per month, an average of fewer than 16 students per center.

3. Firm C, established in 1967, had prior Job Corps experience and was operating one learning center which provided instruction in mathematics and reading to over 50 students per day from preschool to high school. Some of these students were from minority groups, were dropouts, and participated in remedial work and enrichment study programs.

4. Firm D listed its past experience as being in vocational skills training, adult basic education, and college remedial training and tutoring. It gave no indication as to the number of students involved and presented no discussion of previous findings, except for a brief statement on its college program. The firm stated that it was also experimenting with a small number of high school and junior high school students and dropouts, using the format of the college remedial training and tutoring program, but that no results were available yet.

5. Firm E, established in 1967, did not claim, in its proposal, to have had any previous experience in conducting remedial instruction for disadvantaged children. It claimed, however, to have developed an empirical process for creating new and modified materials which could be readily adapted to almost any subject matter. In addition, it claimed to have developed, tested, and refined an administrative planning tool which encouraged more precise design of educational planning and evaluation activities.

6. Firm F's past experience consisted of operating a Job Corps Center and a vocational rehabilitation center. The firm planned to open its first learning center in September 1970, which would offer creative learning environments and self-motivation instructional techniques for private students from ages 3 to 8.

After OEO decided not to base selection of a firm on the criteria set forth in the request for proposals, an amendment to the request for proposals should have been issued apprising all offerors of the changes. This would have notified offerors of the criteria against which their offers were to be measured and would have placed them on an equal basis as required by the Federal Procurement Regulations. . . .

There's more . . . lots, lots more. It's actually worth reading--as a small example of your federal government at work.

FINDINGS AND CONCLUSIONS

Was performance contracting more successful than traditional classroom instruction in improving the reading and mathematics skills of poor children? The answer, according to OEO, is no! OEO's report, released in June 1972, stated that "The results of the experiment clearly indicate that the firms operating under performance contracts did not perform significantly better than the more traditional school systems."

Because of a number of shortcomings in both the design and implementation of the experiment, GAO believes that the question as to the merits of performance contracting versus traditional educational methods remains unanswered.

While the OEO experiment in performance contracting was initially designed to make a reasonable comparison between educational performance contracting and traditional classroom instruction, GAO believes the information obtained from the experiment did not provide a basis for making a reliable comparison.

Although there were 6 unique experimental instructional programs involved in the 18 school districts, OEO's overall conclusion concerning the merits of the instructional programs of the educational firms was based on its comparative analysis of achievement results between experimental and control groups aggregated for all 18 school districts. . . .

The short time available during selection of school districts forced many districts to agree to participate without full knowledge of all the implications. Since negotiations took place during the summer, most school personnel did not know they would be involved until school opened, and this caused many to view the project with apprehension. Personnel in several school districts were openly critical of and hostile toward the educational firms during the school year. . . .

PRINCIPAL OFFICIALS OF THE
OFFICE OF ECONOMIC OPPORTUNITY
RESPONSIBLE FOR ACTIVITIES
DISCUSSED IN THIS REPORT


DIRECTOR                        Tenure
Howard Phillips (acting)        Jan. 1973--
Phillip V. Sanchez              Sept. 1971--Jan. 1973
Frank C. Carlucci               Dec. 1970--Sept. 1971
Donald Rumsfeld                 May 1969--Dec. 1970
Bertrand M. Harding (acting)    March 1968--May 1969
R. Sargent Shriver              Oct. 1964--March 1968

Here are the school districts involved in this study (from Appendix IV of the report):

  • McNairy County School District
    Selmer, Tennessee

  • Dallas Independent School District
    Dallas, Texas

  • Clark County School District
    Las Vegas, Nevada

  • Anchorage Borough School District
    Anchorage, Alaska

  • Clarke County School District
    Athens, Georgia

  • Unified School District No. 259
    Wichita, Kansas

  • Taft Independent School District
    Taft, Texas

  • McComb Separate School District
    McComb, Mississippi

  • Seattle School District
    Seattle, Washington

  • Grand Rapids Public Schools
    Grand Rapids, Michigan

  • Hartford School District
    Hartford, Connecticut

  • Duval County School Board
    Jacksonville, Florida

  • School Administrative District No. 5
    Rockland, Maine

  • Hammond City School District
    Hammond, Indiana

  • Portland School District
    Portland, Maine

  • Fresno City Unified School District
    Fresno, California

  • The School District of Philadelphia
    Philadelphia, Pennsylvania

  • Bronx School District No. 9
    Bronx, New York

    — The Comptroller General of the United States,
    Report to Congress, B-130515
    http://archive.gao.gov/f0202/092641.pdf



