Teacher Test Score Fiasco

Teacher & Employment Testing

Paul Perrea seems like just the sort of person troubled schools would love to recruit. A 44-year-old electrical engineer from Cincinnati, he decided to change careers and teach high school science in the inner city. “I thought it was time to give back,” Perrea explained.


He took the required courses and signed up for the exams mandated by the state of Ohio, tests that are part of the Praxis series produced and marketed by the Educational Testing Service (ETS). Perrea scored high on the science content exams but, much to his surprise and chagrin, received a failing mark on the Principles of Learning and Teaching (PLT) Grades 7–12 pedagogy test because of low scores on the essay section.


Without passing this exam, Perrea could never be hired for a regular classroom job in Ohio as a licensed teacher. Wearing what he calls “a scarlet letter of failure,” he went through a multitude of unsuccessful job interviews. While he redoubled his efforts to study to pass the test, Perrea suspected that his score was wrong because, “I was always a good test-taker, and I had written some of the best essays in my life on that exam.”


Shortly after completing his June 2004 retest, Perrea received a strange message on his answering machine. A caller from ETS admitted that Perrea had actually passed the PLT the first time he had taken it, and that his score had just been recalculated.

Though the voice from ETS had apologized for the “inconvenience” and promised a refund of testing fees, Perrea was angry. He assumed that others were similarly affected.


A call to FairTest and a brief investigation produced an article in The New York Times. A national torrent of stories soon followed. Perrea, it turns out, was just one of about 4,100 test-takers in 17 states who had been erroneously told by ETS that they had failed. The scoring error – in the short essay section – occurred during eight different PLT administrations over a 15-month period.


An ETS letter to Praxis test-takers again apologized for “the uncertainty and difficulty that this situation may have caused you and your family.” The company proudly explained that it had set up an 800 number to answer any questions. The letter closed by urging recipients to “enjoy your accomplishment and pursue your career dreams.”


But the “situation” was more than an “inconvenience” for most of the 4,100 people whose test results had been misreported. Without passing scores, many could not obtain full-time, tenure-track teaching positions. In addition to the financial loss from being denied an opportunity to pursue their “career dreams” for a year or more, many suffered mental anguish over their reported lack of “accomplishment.” Others wasted time and money preparing for needless retests.


Three class-action lawsuits have already been filed against ETS on behalf of test-takers who were denied employment because of the scoring scandal. One case is in state court in Pennsylvania, another in federal court in Ohio, and a third on Long Island. All seek economic damages as well as compensation for less-tangible losses.


Documents ETS will be compelled to produce during the judicial process may illuminate the cause of this particular error. However, a larger question is unlikely to be directly addressed. As policymakers invest more power in the testing industry, what assurance does the public have that any mandatory, high-stakes exam is actually fair, accurate, or valid? Almost weekly, the media report new examples of scoring errors, faulty exam questions, and other testing flaws (see article, p. 6).


At present, there is less public oversight of the tests required of teachers and their students than there is of the food we feed our dogs and cats. It is time for state and federal legislative bodies to hold hearings and determine how to enforce minimum quality-control standards over the entire testing industry, while limiting potential damage from its defective products.


Who will examine the examiners?