FairTest Analysis of MCAS: Massachusetts Learning Standards Assessed Locally Better Compel Student Learning
Massachusetts Learning Standards and Curriculum Frameworks implemented and assessed locally will strengthen the Massachusetts accountability system by more accurately and comprehensively measuring whether students have in fact met the standards. MCAS is a limited snapshot that fails to assess important elements of student learning. Read our analysis here:
https://fairtest.org/wp-content/uploads/2024/03/massstandardsresponsefinal.docx.pdf
Summary of State Graduation Requirements
State graduation requirements around the country can be summarized as follows:
- Credit/Course Completion Requirements. This is the most common graduation requirement. 47 states require students to pass a certain number of courses (and thus accumulate credits) that are aligned to state learning standards in prescribed disciplines to graduate with a state-issued high school diploma. The determination as to whether a student has passed a course (or accumulated credit) in each state is made at the school/local level. For the class of 2024, in 30 states, course passage is the sole substantive requirement for graduation. Ten of those states do require students to take a civics exam before graduating.
- High Stakes Exit Exams. 9 states require students to take and pass anywhere between 2 and 5 subject-matter exams in order to graduate with a state-issued diploma. The states (for the graduating class of 2024) are Florida, Louisiana, Massachusetts, Ohio, New Jersey, New York, Texas, Virginia, and Wyoming.
- End-of-Course Exams. 6 states (Georgia, Illinois, Maryland, Mississippi, Missouri and Tennessee) require students to take certain statewide end-of-course exams in order to graduate. In some of these states the end-of-course exam must count for a certain percentage of the final course grade. A final passing grade in the course as determined at the school/local level (accumulation of the credit) is a diploma requirement.
- Mastery/Proficiency/Career Readiness. In 10 states, students have an additional requirement beyond course passage or an exit exam to earn a diploma. These requirements come in various forms of demonstrating competency, mastery, or college and career readiness. In most cases the state allows for multiple ways to meet the state requirement as designed and implemented by the local school district.
- Locally designed and implemented. 6 states. In several states the state has set a standard of mastery, competency, or graduation “readiness” and has asked local districts to devise their own way to meet this standard. In Connecticut, districts can choose their own assessments to measure students’ mastery-based learning. In Rhode Island, beginning with the class of 2028, localities must meet the state Readiness-Based Graduation Requirement. In Pennsylvania, districts rely on locally developed assessments to comply with state standards. In Vermont, students must demonstrate proficiency in a locally delineated set of content knowledge and skills connected to state standards; that proficiency, supplemented with any additional locally developed requirements, qualifies a student to earn a high school diploma. The mode of assessment is locally determined. In Colorado and New Mexico, local districts can select from a menu of options, including capstones, AP or IB tests, or locally developed assessments, to meet the readiness requirement.
- State Competency Badges. 3 states. In Ohio, Indiana and Nevada, students must demonstrate a certain number of post-secondary readiness competencies from a menu of options and categories provided by the state and implemented locally.
- ACT Passage with Specific Cutoff Score Set by State. 1 state. Alabama requires that, beginning with the graduating class of 2028, students “pass” the ACT with a cutoff score to be determined by the state as an indicator of “college and career readiness.”
OVERWHELMING MAJORITY OF U.S. COLLEGES AND UNIVERSITIES REMAIN ACT/SAT OPTIONAL OR TEST-BLIND/SCORE-FREE FOR FALL 2025
for further information:
Harry Feder, Esq. 917 273-8939
Bob Schaeffer 239 699-0468
for immediate release, Wednesday, February 21, 2024
More than 80% of U.S. four-year colleges and universities will not require applicants for fall 2025 admissions to submit ACT/SAT scores according to a new tally by FairTest, the National Center for Fair & Open Testing. That’s a total of at least 1,825 of the nation’s bachelor-degree granting institutions, with more schools extending test-optional policies every week.
“Despite a media frenzy around a single Ivy League school reinstating testing requirements, ACT/SAT-optional and test-blind/score-free policies remain the ‘new normal’ in undergraduate admissions,” explained FairTest Executive Director Harry Feder. “Test-optional policies continue to dominate at national universities, state flagships, and selective liberal arts colleges because they typically result in more applicants, academically stronger applicants and more diversity.”
A recent study of ACT/SAT score submission by the Common Application group of 1000+ colleges and universities found “more and more students choosing not to report than to report. Growth is meaningfully faster over the past year for students not reporting test scores . . .” (https://www.commonapp.org/files/Common-App-Deadline-Updates-2024.02.14.pdf)
FairTest Public Education Director Bob Schaeffer added, “High school students, parents, and counselors should understand that ACT/SAT scores will not be required by an overwhelming majority of undergraduate campuses for the foreseeable future. The introduction of the digital SAT will not change that reality because the revised test is not a better or fairer predictor of undergraduate success.”
Among well-known institutions whose test-optional policies continue at least through fall 2025 are Columbia, Cornell, Emory, Harvard, Johns Hopkins, Notre Dame, Princeton, Stanford, the University of Chicago, and Vanderbilt. ACT/SAT scores will also not be required at such liberal arts colleges as Amherst, Middlebury, Swarthmore, Wesleyan, and Williams. Major public campuses in most states, including Colorado, Illinois, Missouri, Oregon, Utah, and Washington, remain test-optional for current high school seniors and juniors. The entire University of California and California State University systems are permanently test-blind/score-free.
FairTest’s frequently updated list of schools that do not require all or many applicants to submit ACT/SAT scores before admissions decisions are made is available free online at: https://fairtest.org/test-optional-list/
– 30 –
ACT/SAT-OPTIONAL & TEST-FREE UNDERGRADUATE ADMISSION BY THE NUMBERS
1,075 ACT/SAT-optional schools pre-pandemic (as of March 15, 2020)
1,700 schools did not require ACT/SAT scores for fall 2020
1,775 schools did not require ACT/SAT scores for fall 2021
1,825 schools did not require ACT/SAT scores for fall 2022
1,904 schools did not require ACT/SAT scores for fall 2023
2,025 schools did not require ACT/SAT scores for fall 2024
1,825+ schools have already extended ACT/SAT-optional or test-blind/score-free admissions
through at least fall 2025. About 200 schools have not yet announced their policies
for the next admissions cycle — more are extending every week.
1,700+ of these schools are “permanently” ACT/SAT-optional or test-blind/score-free
2,278 total number of 4-year schools per USDOE National Center for Education Statistics https://nces.ed.gov/programs/coe/indicator/csa
The Misguided War on Test Optional
Innovative Educational Assessments that Support Deeper Learning
FairTest explores efforts at the state and local levels to assess student learning in ways that are more educationally beneficial and equitable than federally and state-mandated standardized tests. There are robust examples of systems of authentic assessment that can serve as models for spurring deeper learning.
https://fairtest.org/wp-content/uploads/2023/12/PBA-Advocacy-Document.docx.pdf
Interpreting PISA Results: It’s Poverty, Stupid (With a Bit of the iPhone)
The results of PISA 2022 should, like all standardized test results, be filtered through a dose of skepticism about the claims of the test producers and administrators. We must also carefully scrutinize “Chicken Little” claims in the media, which are notorious for manufacturing and hyping education crises. Declines in standardized test scores have been the premise for all of the failed education reforms of the past forty years, from the publication of A Nation at Risk, through No Child Left Behind, the charter school movement, and now universal vouchers and privatization. We must guard against this trap yet again.
Scrutinizing the performance of the United States against other OECD countries yields the unshocking conclusion that the PISA test is largely a measure of childhood poverty rates rather than academic achievement. The United States leads the OECD in child poverty: our child poverty rate is approximately 26%, and higher by some measures. It is therefore not surprising that, as a nation, we do not perform as well as most other OECD countries on PISA. If you compare the tranche of American schools with poverty rates equal to those of other OECD countries, however, the United States does quite well.
In reading, countries at the top of the PISA list, Slovenia (499), Denmark (489), and Finland (474), all have childhood poverty rates below 10%. If one were to measure US schools with under-10% poverty rates, the average score would be 562, good enough for first globally. In mathematics, Germany, France, and the UK have child poverty rates between 15 and 18% and scores of 475, 479, and 489 respectively. If you measured US schools with childhood poverty rates of 10-25%, we would score 508, good enough for 4th in the OECD.
America’s problem on PISA is poverty and inequality, not curriculum and instruction.
PISA is a scaled-score, norm-referenced, multiple-choice test. Two-thirds of all test takers globally score between 400 and 600 on a section (math, reading and science). Only 2% score over 700. In general, these kinds of tests are set up so results will go down over a longer time frame. According to Prof. Andy Hargreaves of Boston College, “once a metric is widely used and has a competitive ranking element, gaming the system leads to overall declines in performance after an early lift, and also has negative side-effects on well being.” Not surprisingly, during the last two decades student performance in mathematics, reading and science all significantly declined in most OECD countries.
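The distributional claim above can be sanity-checked with a quick sketch. If we assume the score scale behaves like a normal distribution with mean 500 and standard deviation 100 (an illustrative assumption for this sketch; PISA’s actual scaling uses item-response-theory methods), the “two-thirds between 400 and 600” and “2% over 700” figures fall out directly:

```python
import math

def norm_cdf(x, mean=500.0, sd=100.0):
    """Cumulative probability of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

# Share of test takers scoring between 400 and 600 on one section:
# one standard deviation either side of the mean, i.e. roughly two-thirds.
middle = norm_cdf(600) - norm_cdf(400)

# Share scoring over 700: more than two standard deviations above the mean,
# i.e. roughly 2%.
top = 1.0 - norm_cdf(700)

print(f"400-600: {middle:.1%}, over 700: {top:.1%}")
```

In other words, the published shares describe the shape of the scale itself, not any fixed standard of achievement, which is why such scores rank test takers rather than certify what they know.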
If one were looking for an actual reason for this decline besides test design and use, the proliferation of technology and handheld device usage by 15-year-olds may very well be the culprit. In response to a question on the 2022 PISA, 45% of students said they felt anxious if their phone was not near them, and 65% said they felt distracted by their phones during math lessons. Prof. Sam Abrams of Teachers College, Columbia University attributes Finland’s decline in PISA scores to the introduction of the iPhone and the proliferation of its use among Finnish teens.
As an aside, the United States did not do terribly relative to other OECD countries in terms of rankings. The US moved up in the rankings for all three subjects (math, reading and science). And in aggregate scores, the United States held its ground pretty well in reading and science relative to the previous administration, while other countries’ scores went down.
The OECD is to be commended for attempting to analyze the extent to which creativity and innovation are promoted in national school systems. However, the Creativity and Innovation review did not include the United States. The PISA team found it hard to extract data about the state of creative thinking in schools in the US because education is delegated via the states to many, often small, school districts. The review did cite the work of EL Education’s network of districts and public schools, a network committed to performance-based assessment, as an example of creative and innovative schooling designed to get students to think critically, communicate clearly, and create complex work. Overall, the report stated that within the limitations of a snapshot review “it has not been possible to do justice to the rich variety of experiences in schools in the USA.”
Successful attempts have been made to measure creativity, innovation and entrepreneurship in the global economy by nation. Perhaps not surprisingly, the United States ranks near the top in those economic categories among G20 nations and has for decades. There is a disconnect between our international testing rankings and our economic ranking based on human capital. This calls into question whether the United States should be worried about its PISA rankings at all. Does PISA measure anything of importance? Not really, but you wouldn’t know it from the weight policy makers and media ascribe to the results.
Tienken and Mullen (2014) found no statistically significant relationships between indicators associated with the innovation economy and PISA. Earlier studies of PISA results suggest no statistically significant relationships, or only weak relationships, between ranks on international tests and economic output indicators such as GDP, adjusted gross income, or purchasing power parity (e.g., Baker, 2007; Ramirez et al., 2006; Tienken, 2008). International tests do not provide meaningful information about the skills most important for full participation in society in terms of socio-civic and economic outcomes in the G20 countries (Sjoberg, 2007). The information tested on international tests is not the information children will need to compete in the innovation economy, and the results do not tell us important information about student proficiency with the vocational competencies necessary to create, innovate, and pursue entrepreneurial opportunities (Zhao, 2014). (See citation below.) Perhaps the correct answer is that we really shouldn’t be paying much attention to PISA results. They don’t give us particularly useful or telling information.
Finally, given the deep dislocation in schooling and trauma caused by the COVID pandemic, drops in scores from the pre-COVID administration are not surprising. The PISA scores are some evidence of what we already knew–the pandemic was bad for kids everywhere and impacted learning.
Tienken, C.H. & Mullen, C.A. (2014). The curious case of international student assessment: Rankings and realities in the innovation economy. In S. Harris & J. Mixon (Eds.), Building cultural community through global educational leadership (pp. 146-164). Ypsilanti, MI: NCPEA Press.