Testimony of Lisa Guisbond, Policy Analyst for the National Center for Fair & Open Testing (FairTest)

Senate Education Committee Hearing on Proposed Graduation Competency Assessments
Pennsylvania General Assembly

May 14, 2008

Since its creation in 1985 by leaders of major civil rights, education reform and student advocacy organizations, the National Center for Fair & Open Testing, Inc. (FairTest) has closely monitored the impact of state-mandated exit exams on both equity and educational quality.


More than two decades of evidence demonstrates that high school graduation tests are the wrong prescription for what ails public education. In fact, such requirements most damage the very groups proponents claim they will help.


Because of the overwhelming evidence that exit exams create more harm than good and do not improve the quality of education for underserved student populations, we oppose the proposed Graduation Competency Assessment regulations.


Across the nation, tens of thousands of students are denied diplomas each year – regardless of how well they have done in school – because they did not pass a standardized state test. Under such policies, after 12 years of playing by the rules, working hard and completing all other graduation requirements, a student’s future can hinge on just one or two points on a single standardized exam.

Misguided exit-exam mandates have increased dropout rates, especially among minority groups, and have focused classroom teaching on test preparation rather than 21st century skills. The full record in states like Massachusetts, Texas and California shows that high-stakes tests have failed to fulfill their promise of improved quality and equity for public school students. That’s why Pennsylvania civil rights and disability advocates, teachers, administrators, school board members, public school parents, and others have expressed serious concerns about Gov. Rendell’s high-stakes testing plan.

A poll by the respected Susquehanna Polling & Research group shows they are not alone. By a landslide margin of 62 to 31 percent, Pennsylvanians surveyed opposed denying diplomas to students who fail a statewide test but have passed all their classes.

The problems exit exams are meant to solve are certainly real. Pennsylvania, like most states, has gaps in educational access, quality and outcomes. But exit exams won’t cure these ills. For too many students, the cure is worse than the disease. Rather than provide better education and expanded opportunities, graduation tests add punishment – denial of a diploma – to those who most need help. Unfortunately, rather than address unequal opportunities for a high-quality education, the Board of Education proposes the false “solution” of a graduation test.

Proponents incorrectly claim that exit exams narrow achievement gaps. The National Assessment of Educational Progress reports no narrowing of achievement gaps at the high school level among racial groups for the past several decades (Neill, 2005). Nor have average high school scores increased. During this time, many states have imposed graduation tests, so that 65% of U.S. public high school students are now affected. Often, scores go up on the state exam and more students pass, but according to independent measures such as NAEP, the state gains are mere inflation, not reflecting any increase in students’ knowledge. In Texas, for example, the university system reported an increase in the numbers of students needing remediation after the state imposed its graduation test (Haney, 2000).

Real progress has been elusive because high-stakes testing, including under No Child Left Behind, undermines rather than improves education. Untested subjects are ignored, while instruction in tested subjects narrows into test-coaching programs. Because these tests are mostly multiple-choice, students focus on rote learning to identify correct answers instead of learning to think and apply their knowledge. Test prep is like holding a match to a thermostat and believing the room is warmer: scores rise on that test; real learning does not.

The most thorough independent national research also confirms a link between graduation tests and higher dropout rates. Dee and Jacob found that the tougher the tests, the more the dropout rate increased (Dee & Jacob, 2006). Warren et al. found that graduation tests have caused the national dropout rate to increase by 40,000 students per year (Warren et al., 2006). California’s dropout rate spiked in 2006, the first year students had to pass the state’s exit exam to graduate, with 24,000 seniors dropping out, more than twice as many as four years earlier (Williams, 2007). Texas introduced exit exams in 1992. Fifteen years later, Texas used test results to deny diplomas to a record 40,200 students in the Class of 2007 (Radcliffe and Mellon, 2007).

Massachusetts’ MCAS tests have been touted as among the nation’s best, and the claim has been made that the high-stakes MCAS has not caused the dropout rate to increase. The evidence says otherwise. Five years after the MCAS became a graduation requirement, dropouts are at a nine-year high. Between 1999-00, three years before the exit exam took effect, and 2006-07, the number of students who dropped out of school increased by 24 percent, from 9,199 to 11,436. Claims that exit exams ensure better schooling for minority students are belied by black and Hispanic dropout rates that are two to three times those of whites. For the state’s limited English proficient students, the annual dropout rate has risen steadily since 2003, from 7.6% to 10.4%. Further, 11th and 12th graders who have not passed the MCAS remain more than 11 times more likely to drop out of school than those who have (MA DOE, 2007).

In 2006, Boston’s annual dropout rate rose sharply, from 7.7% to 9.9%. At the same time, the city suffered a wave of youth violence. Boston City councilors, who solicited the views of local young people on why violence was rising, reported, “Students … expressed massive frustration and boredom with the endless drilling and practice of the MCAS test and test preparation… Far too many students describe their school experience as an MCAS-centric environment… [as a result] the incentive for students to remain in school is tenuous” (Ross et al., 2006).


In addition to feeding the dropout rate, tests have “measurement error,” which means some students will fail even though they know the subject (Rogosa, 2001). Allowing students to retake the test helps some of them, but does not solve this problem. There is also the well-documented problem of test anxiety: an accomplished student may freeze, do poorly on the test, and be denied a diploma (Hembree, 1988). For these reasons, among others, the Standards for Educational and Psychological Testing state that a major decision about a student should not be made on the basis of a single test score (AERA, 2000). A graduation test hurdle therefore violates the standards of the measurement profession itself.

No one wants to see youth leave school without the skills needed for success. Exam supporters say students shouldn’t get meaningless diplomas if they can’t pass the tests. But it’s a student’s overall transcript that makes a diploma truly meaningful. For example, high school grades are better predictors of college success than SAT scores. When college professors and employers are surveyed, they say projects and portfolios tell them far more about a candidate than test scores do. A standardized test is not a solid foundation for establishing meaning.

Moreover, the research shows that preparing for exit exams does not help build skills needed for college and employment. On the contrary, a focus on learning out-of-context facts to pass exit exams detracts from preparing students for the work required in college. A survey of professors and employers by Achieve, which promotes standards and tests, found many high school graduates are weak in comprehending complex reading, oral communication, understanding complicated materials, doing research, and producing quality writing (Achieve, 2005).

If exit exams really enhance equity and school quality, why are Southern states — the first to adopt graduation tests — still mired at the bottom by any measure of educational performance? Why, in short, should Pennsylvania follow the failed practices of Mississippi and Alabama?


The truth is that race and class performance gaps reflect more on what happens outside the classroom than inside. A recent analysis of high school test scores in Connecticut found socioeconomic factors alone account for about 85 percent of the variation in test scores in four subjects (Heffley, 2007). This mirrors years of studies that find family background predicts test scores best. Pennsylvania can do better than putting accountability on the backs of its children while failing to address the underlying economic and social inequalities.

The individual and societal costs of denying a diploma based on a state test score are high. Students without diplomas earn much less, are far less likely to maintain stable families, and are far more likely to end up in prison. Ironically, in the long run it would be less expensive to adequately fund schools than to pay for the costs of the resulting damage. Imposing a graduation test increases the long-term damage while pretending to solve the problems actually facing schools.

We believe it is time to rethink what students should be required to achieve before they earn a diploma, and to ensure sufficient resources to enable students to meet those goals. The state can then develop various ways for students to demonstrate this learning, and can monitor the quality of the system. Other states have avoided the exit exam route specifically because they recognized the costs can outweigh the benefits. Wyoming and Rhode Island, for example, have multiple-measures systems for determining graduation. And while New Jersey does have exit exams, it also has a robust alternative graduation system accessible to more than 10,000 graduates a year. Access to this system, called the Special Review Assessment (SRA), is a major reason New Jersey has one of the nation’s highest graduation rates, and some of the best rates for students of color, despite significant gaps with white students (Fine, 2007).


Thank you for considering my testimony. FairTest would be pleased to work with you and Pennsylvania educators, parents and citizens to craft a different approach to graduation, one that would rely on local determinations of adequate achievement but that would establish methods to ensure the quality of the local determinations. We can be reached by phone at (857) 350-8207.

http://www.fairtest.org

References

Achieve. 2005. Rising to the challenge: Are high school graduates prepared for college and work? http://www.achieve.org/sites/default/files/pollreport_0.pdf


AERA. 2000. AERA Position Statement on High-Stakes Testing in Pre-K – 12 Education. American Educational Research Association. http://www.aera.net/policyandprograms/?id=378

Dee, T.S. & Jacob, B.A. 2006. Do high school exit exams influence educational attainment or labor market performance? Social Science Research Network, April. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=900985.

Fine, M. et al. 2007. New Jersey’s Special Review Assessment: Loophole or Lifeline? http://www.edlawcenter.org/ELCPublic/elcnews_080324_SpecialReviewAssessm…


Haney, W. 2000. “The Myth of the Texas Miracle in Education,” Education Policy Analysis Archives. (August 19). epaa.asu.edu/epaa/v8n41/.


Heffley, E. 2007. What do CAPT scores really tell us? The Connecticut Economy, Summer: 14-16.

Hembree, R. 1988. Correlates, causes, effects, and treatment of test anxiety. Review of Educational Research, V58, N1, 1988.

Massachusetts Department of Education. 2007. High School Dropouts 2006-07 Massachusetts Public Schools. http://www.doe.mass.edu/infoservices/reports/dropout/0607/summary.pdf

Neill, M. 2005. School Beat: Notes on the recent NAEP test results. Beyond Chron: (August 4). http://quartz.he.net/~beyondch/news/index.php?itemid=339.

Radcliffe, J. & Mellon, E. 2007. TAKS tests cost 40,000 Texas seniors chance to graduate. Houston Chronicle (May 12).

Rogosa, D. 2001. Shoe Shopping and the Reliability Coefficient. Educational Assessment, Vol. 7, No. 4: 255-258.


Ross, M. et al. 2006. Report of the Special Committee on Youth Violent Crime Prevention: Working Together to Increase the Peace (June): 10.


Warren, J.R., Kulick, R.B., & Jenkins, K.N. 2006. High school exit examinations and state-level completion and GED rates, 1975 through 2002. Education Evaluation and Policy Analysis, V28, N2: 131-152.


Williams, J. 2007. Report: California dropouts increase in first year of exit exam, San Francisco Chronicle (November 7). http://www.sfgate.com/cgi-bin/article.cgi?file=/n/a/2007/11/07/state/n14…
