Inflating NAEP's Importance Would Deflate Validity of NCLB 'Yardstick'
The influential Center on Education Policy (CEP) released a report in June announcing the "good news" of rising state test scores, to the delight of NCLB proponents, who were quick to credit NCLB for rising "achievement." The study's authors took pains to warn that gains could not definitively be attributed to NCLB, but their cautions were quickly drowned out by boasts from Education Secretary Margaret Spellings, who said, "This study confirms that No Child Left Behind has struck a chord of success with our nation's schools and students. We know the law is working, so now is the time to reauthorize."
A closer look at the CEP results revealed far less than the claims suggested. The CEP report examined three years of data for elementary school math, for example, and found that in 37 of 41 states the proportion of students scoring proficient or above increased at least one percentage point per year, with most states gaining only slightly more than one point. Whether or not a one-point increase in proficiency signifies anything meaningful about the quality of teaching and learning (and there is strong evidence that it instead reflects damaging practices of teaching to the test and narrowing the curriculum), this pace of improvement is clearly far too slow to meet NCLB's mandate of 100% proficiency by 2014.
A number of analysts found the methodology CEP used for the report to be weak. John T. Yun, a professor at the University of California at Santa Barbara, said the CEP results were likely "seriously damaged" by flaws such as a bias toward selecting states with inflated results. Yun also said that the CEP authors effectively excluded states that changed testing practices after NCLB was implemented, thereby eliminating exactly the states that would matter most when gauging NCLB's impact.
Two days after the CEP report, the National Center for Education Statistics (NCES) released a study comparing minimum proficiency levels on state test results with performance on NAEP. Despite increasing scores on state tests, the study documented wide variations in the proportion of students attaining "proficiency" on NAEP in different states and said the variation "can be largely attributed to differences in the stringency of [state] standards" in setting achievement levels. This finding prompted calls by President Bush and others to increase the importance of NAEP as a common yardstick in order to pressure states to set their proficiency standards at NAEP's higher levels and thereby force higher student achievement.
NAEP is better than many state tests but is still far from the "gold standard" its proponents claim it to be. NAEP proficiency levels have been widely criticized as absurdly high, most recently in a report by Bert Stoneberg, "Using NAEP to Confirm State Test Results in the No Child Left Behind Act." Stoneberg punctures the idea that achieving "proficiency" on NAEP equates to an ability to do "grade level work," an assertion often repeated by Secretary Spellings and the media. In fact, it is NAEP's "basic" level that requires students to succeed at work typical of their actual grade level.
Even if NAEP levels were appropriate, increasing NAEP's importance, as President Bush and others recommend, would almost certainly bring Campbell's Law into play. Campbell's Law says, "The more any quantitative social indicator is used for social decisionmaking, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor." Or as testing expert Gerald Bracey wrote in the Huffington Post, "The president's plan … calls on states to post their scores side-by-side with the NAEP results. This is a great way to destroy NAEP. NAEP's integrity rests largely on the fact that people don't pay much attention to it. Attach high stakes to it and it will lose whatever utility and validity it has."
- The CEP report is at http://www.cep-dc.org
- Dr. Yun's analysis of the CEP report is at http://www.greatlakescenter.org/
- The NCES report is at http://nces.ed.gov/nationsreportcard/pubs/studies/2007482.asp
- Stoneberg's report is at http://pareonline.net/pdf/v12n5.pdf
- For more on Campbell's Law, see "Collateral Damage," Examiner, April 2007.
- See also James Pellegrino, "Should NAEP Performance Standards Be Used for Setting Standards for State Assessments?" Phi Delta Kappan, March 2007. http://www.pdkintl.org/kappan/kappan.htm