
For further information:
Karen Hartke or Dr. Monty Neill
(857) 350-8207

For immediate release, Thursday, May 31, 2001
MCAS GAINS ARE UNRELIABLE BASIS FOR “HIGH PERFORMING” SCHOOL AWARDS;
ANNUAL SCORE INCREASES OFTEN ARE NOT SUSTAINED;
FAIRTEST ANALYSIS SHOWS YEAR-TO-YEAR CHANGES MAY RESULT FROM
“LUCK OF THE DRAW,” SMALL STUDENT POOLS, “LOSS” OF LOW-SCORERS

Annual changes in average MCAS scores are not a reliable way to identify educational improvement and should not be used to recognize “high performing” schools, according to a report released today by the National Center for Fair & Open Testing (FairTest) in Cambridge.
The study concludes that year-to-year MCAS score gains often are not sustained and may simply reflect variations in the student population, particularly at small schools. In addition, average MCAS increases can mask widening gaps between high and low scorers or an unexplained “disappearance” of poor performers from the test-taking pool.
Both the Massachusetts Department of Education and the business group MassInsight honor schools with large MCAS score increases. William Edgerly, chairman emeritus of State Street Corporation, gives $10,000 prizes to principals based on test score gains.
But the FairTest review found that only one of five 1999 Edgerly award-winning schools continued to show MCAS score gains in 2000. At the Abraham Lincoln school in Revere, for example, where 26% of students were deemed advanced or proficient in English in 1999, only 14% reached the same levels in 2000. Just 1% of Abraham Lincoln test-takers received “failing” scores in English in 1999; the very next year 5% failed. At Springfield’s Kensington Avenue school, which received an award for increasing the percentage of students in the proficient English category from 2% in 1998 to 40% in 1999, not a single student scored proficient in 2000.

Assessment experts expect considerable year-to-year score variability when small numbers of tests are administered. At many MassInsight award-winning schools, fewer than fifty students are tested annually. “Score increases may simply reflect ‘the luck of the draw,’ not better teaching and learning,” explained FairTest Executive Director Monty Neill. “In a small school, the presence of just a few students who score especially high or low can skew average scores dramatically from one year to the next. These unreliable score changes should not be the basis of rewards or sanctions.”
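The "luck of the draw" effect can be illustrated with a simple simulation. The sketch below uses invented numbers (a cohort of 40 students, each with the same 30% underlying chance of scoring proficient — assumptions for illustration, not figures from the report): even when nothing about the school changes between years, the reported proficiency rate swings substantially from cohort to cohort purely because of sampling variation.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def percent_proficient(n_students, p_proficient=0.30):
    """Simulate one year's cohort: each student is independently
    proficient with the same underlying probability."""
    proficient = sum(random.random() < p_proficient for _ in range(n_students))
    return 100.0 * proficient / n_students

# Two cohorts of 40 students drawn from an identical population:
# the school's instruction is unchanged, yet the reported rate moves.
swings = [abs(percent_proficient(40) - percent_proficient(40))
          for _ in range(1000)]
large_swings = sum(s >= 10 for s in swings) / len(swings)
print(f"share of year pairs with a 10+ point swing: {large_swings:.0%}")
```

With 40 test-takers, roughly a third of year-to-year comparisons show a ten-point swing in the proficiency rate even though the underlying school has not changed at all; with larger cohorts the same simulation produces much smaller swings.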
Score changes may also reflect the loss of weaker students from the test-taking pool. FairTest found that at award-winning Hudson High School, 157 students were enrolled in 10th grade in October 1999 but only 110 took the MCAS in spring 2000. “If low-scoring students drop out or transfer to vocational or private schools, average scores become meaningless,” noted Karen Hartke, FairTest’s Assessment Reform Director.
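The arithmetic behind this point is straightforward. The sketch below uses the Hudson headcounts from the release (157 enrolled, 110 tested) but invented scores — the score values are illustrative assumptions, not Hudson's actual results: when the lowest scorers leave the test-taking pool, the average rises with no improvement in anyone's learning.

```python
# Hypothetical cohort: 47 low scorers and 110 higher scorers
# (157 enrolled, matching the Hudson headcount; scores are invented).
scores = [210] * 47 + [240] * 110

def mean(xs):
    return sum(xs) / len(xs)

all_tested = mean(scores)            # average if every enrolled student tests
after_attrition = mean(scores[47:])  # average after the 47 low scorers leave

print(f"all tested: {all_tested:.1f}, after attrition: {after_attrition:.1f}")
```

In this example the school's average jumps about nine scaled-score points between the two scenarios even though not a single student learned anything more — the "gain" is entirely an artifact of who sat for the exam.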
Even a rising number of students scoring in the top MCAS categories may not mean that a school is doing a better job educating the majority of its population. For instance, at award-winning Nauset Regional High School on Cape Cod, the percentage rated “advanced” increased from 1998 to 2000. But so did the percentages in the “failing” and “needs improvement” categories.
“Example after example demonstrates that the MCAS is a poor tool for assessing school improvement,” FairTest’s Hartke added. “Giving out awards based on annual score increases is an exercise in cynical public relations, not education reform.”
“Despite the clear unreliability of using test score changes to make major decisions about schools, this flawed approach will be mandated by federal law if proposed testing provisions are not removed from the pending reauthorization of the Elementary and Secondary Education Act,” Neill concluded.
As an alternative to MCAS, FairTest supports a comprehensive school accountability plan designed by the Coalition for Authentic Reform in Education (CARE). That proposal relies on local assessments based on state standards, external school quality reviews, and limited standardized testing to create annual reports to communities based on multiple indicators of educational quality and opportunity.
