FairTest’s Monty Neill presented invited testimony on graduation tests and alternatives to the Maryland Board of Education.
October 21, 2008
James DeGraffenreidt, Chair
Members of the State Board of Education
200 West Baltimore Street
Baltimore, MD 21201
Copies sent by regular mail and e-mail
Dear Chairman DeGraffenreidt and Members of the Maryland Board of Education,
Thank you for inviting me to present written testimony on the High School Assessments and alternative approaches to determining graduation requirements.
My name is Monty Neill and I am Deputy Director of the National Center for Fair & Open Testing, Inc., also known as FairTest. Headquartered in Boston, Massachusetts, we are the nation’s only organization focused exclusively on assessment reform issues.
In the past year, we have been working with the Maryland Coalition for Fair and Equitable Accountability. This is an alliance of state and local civil rights, education and parent organizations that seeks to ensure a fair, high-quality system, rooted in equitable opportunities and employing multiple forms of assessment, for students and schools. The ideas expressed below in part come from that work, but I am not writing as a representative of the Coalition.
In my testimony, I will focus on two main points. First, I will provide reasons why Maryland should not implement a high-stakes graduation requirement. Second, I will outline a plan to assess high school students. This plan, developed with members of the Coalition, includes the state’s High School Assessments but also incorporates other, diverse measures. It will avoid the pitfalls of high-stakes testing and will provide a useful set of tools for evaluating students and schools and for enriching education in Maryland.
I. Limits and Dangers of High-Stakes Graduation Tests
Maryland’s approach, to take effect this year, will employ end-of-course exams, with a requirement that a student obtain a sufficient score on these tests to graduate. Those who do not pass the tests will be able to use an alternative, the Bridge program. However, it has become clear that the Bridge program is not yet ready for its proposed use. Further, while an alternative such as Bridge lessens the severity of the problems associated with high-stakes testing, too many of the problems remain. For most students, the HSA tests are a high-stakes hurdle, and as such they will powerfully affect curriculum, instruction and learning – but, as evidence shows, generally not for the better.
The problems that have been identified with the use of high-stakes graduation exams, such as the Maryland High School Assessments, include direct harm to individuals, particularly through increases in the dropout rate, and more general damage to educational quality. I will address each of these two general points.
Graduation Tests Are Unfair and Often Cause Direct Harm to Individuals
Measurement experts say basing a decision on a single test is wrong.
The Standards of the measurement profession and most professional education organizations state that making a decision based on a single test score is a misuse of standardized testing (AERA, APA, and NCME, 1999). The HSAs function as such a ‘single test’ for most students.
High-stakes tests cause an increase in dropout rates.
Comprehensive national studies have found that graduation tests lead to higher dropout rates while they do not improve learning for those who stay in school (Grodsky, et al., 2008; Warren, et al., 2006). For example, Grodsky, et al. (2008), found that graduation exams had no positive effects on scores on the National Assessment of Educational Progress. Dee and Jacob (2006) found the more difficult the test, the greater the likelihood of dropping out. Many students leave school before they are eligible for alternatives: the fact of failing overrides the idea of possible later success on an alternative. This has been the case in Massachusetts, despite its reputed success with graduation tests: in 2006-07, dropout rates were the highest since before the tests were introduced, with African American and Latino rates two-to-three times those of Whites and Asians (Massachusetts Department of Education, n.d.). Thus, the low passing rates for Maryland’s Latinos, African Americans, students in special education and limited English proficient students are a very clear warning sign that the dropout rate will increase.
The consequences of not having a diploma are severe. Young adults who don’t obtain a diploma earn far less, have less stable families, and are more likely to be in prison (McLaughlin, 2008). Studies also have confirmed a clear link between not having a diploma and the likelihood of criminal activity (Sum, et al., 2006). Proponents of high-stakes graduation tests often deride what they term a “meaningless” diploma, but these data demonstrate the diploma is far from “meaningless,” even if the earner lacks some of the kinds of knowledge and skills measured by standardized tests. Rather than impose huge costs on individuals and society through graduation tests, Maryland should seek alternative means to improve learning outcomes.
Standardized tests reinforce inequity.
Society should not punish students for adults’ failure to provide children with the necessary tools for success, as detailed in the excellent letter of October 20 to you from the ACLU of Maryland and the Maryland Disability Law Center. Many students – particularly those from low-income and minority families, along with English language learners and students with disabilities – attend under-funded schools or lack access to high quality educational programs necessary for their success. In at least the next year or so, the fiscal crisis facing Maryland along with many other states will make it more difficult to provide students with a strong and equitable opportunity to learn.
Standardized tests do not accurately assess many students.
Some students who successfully demonstrate learning through classroom performance do not score well on standardized tests. These often include students with test anxiety and learning disabilities as well as students whose first language is not English (Expert Panel on Assessment, 2007).
High-stakes testing produces narrow teaching to the test.
The higher the stakes, the more schools focus instruction on the tests, but the tests fail to adequately assess a rich, comprehensive curriculum. Indeed, these tests undermine the quality of the curriculum, as I will next explain. This narrowing of curriculum is most severe for low-income students.
Graduation Tests Fail to Add Value to High School Diplomas
Graduation tests do not promote the knowledge, skills and habits needed for success in college or skilled work. According to college professors and employers, high school graduates must be able to analyze conflicting explanations, support arguments with evidence, solve complex problems that have no obvious answer, reach conclusions, conduct research, and engage in the give-and-take of ideas (National Research Council, 2002). Also needed are attributes such as good study skills, time management, awareness of one’s performance and persistence. Since exit exams do not measure most of these important attributes, test scores have little value for colleges or employers (Peter D. Hart, 2008).
Graduation tests do not make high school diplomas more valuable to employers. Recent research found no positive impact on employment status or wages in states with high school exit exams. There was also no impact on numbers of high school graduates going to college (Warren et al., 2008).
Most state standards-based high school tests are not aligned with college-level work or employment. Even when the exams go beyond multiple-choice items, they include few essays or extended tasks. They rarely require students to apply their learning or engage in higher-level thinking. According to Stanford Professor Linda Darling-Hammond (2005), “Most jobs in today’s knowledge-based economy require that we find, assemble and analyze information, write and speak clearly and persuasively; and work with others to solve messy problems,” none of which are measured by multiple-choice exams. College requires similar skills.
Test preparation overshadows the development of college-level skills. A focus on learning out-of-context facts to pass exit exams detracts from preparing students for the work required in college. A survey of professors and employers by Achieve (2005), which promotes standards and tests, found many high school graduates are weak in comprehending complex reading, oral communication, understanding complicated materials, doing research, and producing quality writing. High-stakes standardized exams will not solve these problems and can exacerbate them.
The widespread adoption of exit exams has not resulted in more high school graduates prepared for college. Exit exam policies now influence the education of 65% of U.S. public high school students, yet colleges report increasing need for remedial education. Federal statistics indicate that 40% of college students take at least one remedial course, reducing their probability of graduating (National Center for Education Statistics, 2004). Texas colleges reported in-state high school graduates needed more, not less, remediation after high-stakes testing was introduced (Haney, 2000).
High school graduates would not be better prepared if schools were to “raise standards” by making exams harder. Tougher multiple-choice questions won’t address the real gap between tests and college or employment requirements. Such strategies also ignore research on human motivation, assuming that simply “raising standards” and threatening punishment (withholding diplomas) will make students and teachers work harder. Most modern businesses no longer try to boost productivity by threatening employees with punishment (Oakes and Grubb, 2007).
II. High quality assessment is an educational necessity – and possible.
Better assessment methods are needed if high schools are to develop the higher-level skills students need for college and work. Unlike standardized exit exams, the use of assessment methods such as performances, exhibitions and portfolios has been shown to promote the development of skills, knowledge and dispositions actually valued in college and employment (Wood, Darling-Hammond, Neill and Roschewski, 2007; Darling-Hammond and McCloskey, forthcoming). Employers have said they are more interested in examples of student work and problem-solving, such as portfolios, than they are in test results or grades (Peter D. Hart, 2008). Similarly, the Partnership for 21st Century Skills (n.d.) has outlined a range of knowledge and skills students should acquire, much of which clearly cannot be measured with traditional paper-and-pencil tests – but can be assessed using other means. Only with a range of strong and flexible assessments can students or schools be fairly and comprehensively evaluated and learning outcomes improved.
The Coalition also has examined national and international evidence to develop some recommendations on how graduation requirements could be re-structured. We intend these recommendations to be a starting point for discussion and planning. FairTest staff and Coalition members would be pleased to meet with you to discuss these further and work on an exact design.
End-of-course assessments would remain, but count for 15-20% of a student’s final grade. Students would have to pass their required courses to graduate. Because the exams would no longer be “sink-or-swim” barriers to graduation, this approach diminishes the unfair high-stakes nature of graduation tests while ensuring the tests match state standards. Teachers could use state-designed exams instead of their own end-of-course tests. A few states, such as Tennessee, have begun to implement such an approach.
The subject-specific exams should assess thinking and doing, reasoning, creating, and problem solving, as well as cover basic information and routine procedural knowledge. Thus, they will need to include performance tasks in addition to having a modest proportion of multiple-choice and short-answer items (e.g., 20-25%, as recommended in Principles and Indicators for Student Assessment Systems, National Forum on Assessment, 1995). This can solve the problem of narrowing the curriculum to rote, lower-level learning in tested subjects. It will take time to improve the tests in this way, so the state should commit to steady progress in doing so.
The exams can be scored locally, with spot-checking to confirm accuracy. This is the case for the New York State Regents exams and other nations’ assessment practices (Darling-Hammond and McCloskey, forthcoming). This approach can address the need to return results quickly while including complex performance components. States can build systems to review samples of answer sheets to ensure quality and accuracy in scoring. Teams of teachers from other schools might do the initial scoring or rescoring to ensure independent oversight.
Maryland, by itself or in collaboration with other states, should create banks of good assessment tasks that teachers can use at their discretion. This goes beyond the immediate use of end-of-course tests. Use of cognitively complex tasks will support teaching higher order skills. Their use also will facilitate teachers’ use of projects in the instructional process, prepare students for exams with performance components, and help build the basis for an overall stronger assessment system (Expert Panel on Assessment, 2007; Wood, et al., 2007). Other nations that do very well in international comparisons, such as Finland, Sweden, and some Australian states, rely on performance assessments, most or all of which are controlled locally (Darling-Hammond and McCloskey, forthcoming; FairTest Examiner, 2005; Wood, et al., 2007). There are many similar programs in the U.S. on which Maryland could rely (Darling-Hammond, Rustique-Forrester and Pecheone, 2005; Neill, 2008). The knowledge acquired in development of the Bridge program could be of use.
Maryland should allow schools and districts to use high-quality alternatives to end-of-course exams. For example, the New York Performance Standards Consortium (n.d.) has won the right to determine student high school success in all subjects. Students must take only the N.Y. Regents exam in English. For other subjects, Consortium members use school-based performance assessments instead of Regents tests: a series of graduation-level tasks including an analytical comparative essay in literature, a social studies research paper, an original science experiment, and an application of higher-level mathematics, as well as demonstrated proficiency in oral defenses and exhibitions of their work. This approach to assessment can lead to innovative curricular design and teaching. An expert body should approve the use of such assessments. As with using performance assessments within the HSA, this will take time to accomplish. The state should move steadily toward building the capacity to evaluate the quality of proposed local graduation assessments.
The performance tasks and school/district alternatives can together form the basis for developing a far richer, comprehensive system in which state exams are one component, or even become optional, once schools and districts are capable of compiling validated bodies of evidence of student learning. Such systems are technically feasible, as shown in the report of the Expert Panel on Assessment of the Forum on Educational Accountability (2007). In such systems, testing would primarily serve as one check on the system. If the federal Elementary and Secondary Education Act (now No Child Left Behind) does not change its current testing requirements, the HSAs would also continue to meet that mandate. If, as was recommended in draft legislation crafted by House Education Committee Chairman George Miller and Ranking Member Buck McKeon, Congress allows and encourages states to develop comprehensive systems that include local assessments, then Maryland will be ahead of the curve.
Finally, some might say that Maryland can do both: have a high-stakes graduation test and have a separate set of possibilities or requirements to assess the higher order skills that students need. But it is very hard to serve two masters. An integrated approach, not a set of separate hurdles, will enable schools to properly balance the range of knowledge and skills students need while allowing legitimate, monitored flexibility. Such an approach also would be far more congruent with what modern science understands about how people learn (Pellegrino, Chudowsky and Glaser, 2001). Constructing a system along these lines also would allow development of alternative curricula as well as assessment that could be valuable to future educational improvement. States should not stifle the development of superior approaches in the name of standardization.
Developing such an integrated, multiple-measures system would address the need to assess higher order and complex skills and knowledge that have proven impossible or too expensive to assess with statewide standardized exams. Across the nation, the limits of standardized tests have become ever more clear. Maryland has an opportunity to go beyond such tests and develop a far richer and fairer system. I strongly urge you to seize that opportunity. FairTest and the Coalition recommend that you declare a one-to-two-year moratorium on the use of the HSA tests as a graduation requirement, then use that time to construct a new system.
Finally, whatever you require of students, whatever assessment processes you decide on, Maryland must first ensure that all students have had a fair, equitable and strong opportunity to learn the material on which they are to demonstrate their competency, knowledge and skills. We know that in Maryland, as in the rest of the nation, such opportunity does not yet fully exist. For example, science standards may require lab work, but too many schools do not have adequate equipment. This problem must be addressed before the exams count toward graduation – but the state should not use this as an excuse to mandate tests that ignore standards requiring lab work. A rich set of assessments can foster opportunity, but high-stakes graduation requirements should follow in the wake of opportunity, not precede it.
Thank you for considering these concerns and proposals. My FairTest colleagues, the members of the Maryland Coalition for Fair and Equitable Accountability, and I look forward to working with you to develop a new assessment and accountability system.
Monty Neill, Ed.D.
National Center for Fair & Open Testing, Inc. (FairTest)
15 Court Square, Suite 820
Boston, MA 02108
857-350-8207 x 101
American Educational Research Association, American Psychological Association, National Council on Measurement in Education. 1999. Standards for Educational and Psychological Testing. Washington: AERA.
Achieve, 2005. “Rising to the Challenge: Are High School Graduates Prepared for College and Work?” http://www.achieve.org/node/548.
Darling-Hammond, L. 2005. “What Does It Take to Graduate? What’s in a test?” June 28.
Darling-Hammond, L., and McCloskey, L. Forthcoming. “Assessment for Learning Around the World: What Would It Mean to Be ‘Internationally Competitive?'” Phi Delta Kappan.
Darling-Hammond, L., Rustique-Forrester, E., and Pecheone, R. 2005. Multiple Measures Approaches to High School Graduation. Stanford, CA: Stanford University School Redesign Network.
Dee, T.S., and Jacob, B. 2006. “Do High School Exit Exams Influence Educational Attainment or Labor Market Performance?” April: NBER Working Paper No. W12199.
Expert Panel on Assessment. 2007, June. Assessment and Accountability for Improving Schools and Learning: Principles and Recommendations for Federal Law and State and Local Systems. Washington: Forum on Educational Accountability. Available at http://www.edaccountability.org/reports.html.
FairTest. 2005. “Queensland, Australia, ‘Rich Tasks’ Assessment Program.” Examiner, June. Available at http://www.fairtest.org/queensland-australia-rich-tasks-assessment-program.
Grodsky, E., Warren, J.R., and Kalogrides, D. 2008, June 13. “State High School Exit Examinations and NAEP Long-Term Trends in Reading and Mathematics, 1971-2004.” Educational Policy. Available at http://epx.sagepub.com/cgi/content/abstract/0895904808320678v1.
Haney, W., 2000. “The Myth of the Texas Miracle in Education.” Education Policy Analysis Archives. Aug. 19, V. 8, N. 41. http://epaa.asu.edu/epaa/v8n41/.
Massachusetts Department of Education. N.d. “High School Dropouts 2006-07.” Available at http://www.doe.mass.edu/infoservices/reports/dropout/0607/summary.pdf.
McLaughlin, J. 2008, September. “Estimating the Size of the High School Dropout Problem in the U.S. and Illinois and the Economic and Social Consequences of Dropping Out of High School.” Boston: Center for Labor Market Studies, Northeastern University. PDF document, emailed to M. Neill by J. McLaughlin, Oct. 2, 2008.
National Center for Education Statistics, 2004. Conditions of education 2004. Washington, DC: U. S. Department of Education.
National Forum on Assessment, 1995. Principles and Indicators for Student Assessment Systems. Cambridge, MA: FairTest. Available at http://www.fairtest.org/principles-and-indicators-student-assessment-syste.
National Research Council. 2002. Learning and understanding: Improving advanced study of mathematics and science in U.S. High schools. Washington, DC: National Academy Press.
Neill, M. 2008. “What Superintendents Can Do to Promote Sound Assessment in Light of NCLB.” Paper for the American Association of School Administrators. Available at http://www.fairtest.org/promoting-sound-assessment.
New York Performance Standards Consortium (n.d.). Various materials available at http://performanceassessment.org/.
Oakes, J., & Grubb, W.N. 2007, October. “‘Restoring Value’ to the High School Diploma: The Rhetoric and Practice of Higher Standards.” Education Public Interest Center and Education Policy Research Unit. http://epsl.asu.edu/epru/documents/EPSL-0710-242-EPRU.pdf.
Partnership for 21st Century Skills. N.d. Various materials available at http://p21.org/.
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Board on Testing and Assessment. National Research Council. Washington, DC: National Academy Press.
Peter D. Hart Research Associates. 2008. January 9. Available at http://www.aacu.org/advocacy/leap/documents/2008_business_leader_poll.pd…
Sum, A., Khatiwada, I., McLaughlin, J., and Tobar, P. 2006, December. “An Assessment of the Labor Market, Income, Health, Social, and Fiscal Consequences of Dropping Out of High School: Findings for Illinois Adults in the 21st Century.” Boston: Center for Labor Market Studies, Northeastern University. PDF document, emailed to author by J. McLaughlin.
Warren, J.R., Grodsky, E., & Lee, J. 2008. “State High School Exit Examinations and Post-Secondary Labor Market Outcomes.” Sociology of Education, V. 81, pp. 77-107.
Warren, J.R., Kulick, R.B., and Jenkins, K.N. 2006. “High School Exit Examinations and State-Level Completion and GED Rates, 1975 Through 2002.” Educational Evaluation and Policy Analysis, Vol. 28, No. 2, pp. 131-152.
Washington Education Association and Washington State PTA. 2005, January. “Weighted Multiple Measures Compensatory Approach to Graduation Requirements.” Policy Brief.
Wood, G., Darling-Hammond, L., Neill, M., and Roschewski, P. 2007. “Refocusing Accountability: Using Local Performance Assessments to Enhance Teaching and Learning for Higher Order Skills.” Briefing Paper Prepared for Members of the Congress of the United States. Available at http://www.fairtest.org/refocusing-accountability.