GMAT Error Hurts Applicants - Test-Takers Not Told of Mistake for 10 Months

University Testing

Nearly one thousand recent business school applicants from across the nation had their chances for acceptance seriously damaged by scoring errors on the Graduate Management Admission Test (GMAT) but were not told of the mistake for almost a year. A letter from the exam’s manufacturer, the Educational Testing Service (ETS), obtained and made public by FairTest, reveals that the firm had to correct the scores of “three percent of examinees who took the GMAT in February and March 2000.”


The correction increased scores on the computer-adaptive test by up to 80 points on the exam’s 200 to 800 scale, a difference that could significantly alter the chances for admission at competitive business schools. For example, a reported score of 530 is below average for GMAT test-takers, while a corrected 610, 80 points higher, exceeds the score earned by nearly three-quarters of all examinees. The ETS letter said that the problem developed when “some questions in the Quantitative section were incorrectly counted as not having been answered.”


Test-takers were not informed of the problem until mid-December 2000, ten months after some had taken the exam. That means college seniors taking the GMAT to enter business schools in the fall semester of 2000 were not notified of the mis-scoring until after admissions decisions were made and classes had begun. In fact, some test-takers may never have received notice of the error because they had long since moved from their undergraduate addresses.


At least one business school has taken action to correct errors in its admissions process caused by the GMAT scoring error. Brigham Young University invited seven applicants it had rejected in the fall 2000 admissions cycle to reapply for 2001 because their scores were 40 to 60 points higher than originally reported. Though a few test-takers threatened legal action against ETS due to damage to their careers from the scoring error, no lawsuit has yet been filed.


This is not the first time flaws in ETS’ computer-delivered exams have disrupted young people’s lives. Previous problems include “the black screen of death,” in which more than a thousand test-takers’ terminals froze after all questions had been answered but before a score was recorded during the initial months of computerization (see Examiner, Winter 1997-98). ETS also has admitted that the computer-adaptive Graduate Record Exam (GRE) produced inaccurate scores for some students (see Examiner, Fall 1999). Such errors are not surprising since internal company documents state that the computer-adaptive testing technology used for the GMAT was rushed to the marketplace for financial reasons, not because it is fairer or more accurate (see Examiner, Spring 1997).


The GMAT scoring error is yet another example of the danger of using fallible exams as a primary factor in determining admissions, promotion, or graduation. Even test-makers recognize the potential negative consequences of increasing reliance on exam scores. “As pressure grows on the use of tests for greater high-stakes decisions, incidents of mistakes or inaccuracies take on greater importance,” ETS spokesman Thomas Ewing told the Salt Lake Tribune.


• For a copy of the fact sheet Computerized Testing: More Questions Than Answers, see FairTest’s website or send a self-addressed envelope with $.34 postage to Computerized Testing, c/o FairTest, 15 Court Square, Suite 820, Boston, MA 02108.