The Cavalier Daily
Serving the University Community Since 1890

Re-examining computer exams in education

I WALKED into the Educational Testing Service center last Friday to take my Graduate Record Examination feeling a little disoriented. I was about to take a standardized test, but for the first time I hadn't brought a stack of freshly sharpened No. 2 pencils and an extra eraser. Fortunately, I didn't need them. A keyboard and a computer screen took over my testing experience, making Scantron bubble sheets seem like a thing of the past.

Once I hit the multiple-choice sections of the exam, though, I longed for a paper copy of the test and an oval-covered answer sheet. Computer-based testing as it exists today, despite its efficiency, is a poor replacement for the traditional paper-based format. ETS should re-examine the way it evaluates computer-based tests in order to bring the new format's fairness up to the level of the traditional paper versions.

ETS currently offers four exams in the computer-based format: the GMAT, GRE, TOEFL and PRAXIS. Although ETS is switching to computer-only testing for some exams, others, such as the GRE, remain available in the paper-based format. The paper-based GRE general test is given twice a year in areas of the United States where computer-testing centers are not located. Test takers pay the same $115 fee regardless of which testing method they choose.

Computer-based testing offers some clear advantages. Because the computer program calculates the score as the test taker works through the exam, scoring time decreases dramatically. The computer tests also can cut down on the test taker's time at the testing center: rather than sitting through a test administrator's directions, test takers can work at their own pace, skipping the directions and forgoing breaks between sections of the test. And because fewer people are needed to give directions for the computer-based test, ETS cuts costs by hiring fewer personnel.

The computer format also may benefit many test takers on tests that include a writing portion. For example, the GRE general test includes two writing tasks that compose the analytical writing section. The test allots 45 minutes for one essay that asks writers to present their perspective on a selected issue, and 30 minutes for a second essay that asks writers to analyze an argument. As more people become computer literate and accustomed to composing on a computer, test takers may feel they can write stronger compositions by typing rather than handwriting responses. The word processing program specially designed for the GRE general test allows writers to insert and delete text, making revision easier on the computer-based test than on the paper version.

The problems with the computer-based test, however, lie mainly in the multiple-choice portions of the exam.

Overall, the test taker has less agency when it comes to attacking the multiple-choice verbal and quantitative sections of the computer-based test. According to the GRE Information and Registration Bulletin, because the score is calculated as the test is taken, the computer presents a question of middle difficulty at the beginning of the test. "The computer scores that question and uses that information, as well as responses to any preceding questions and information about the test design, to determine which question is presented next." Questions of increasing difficulty appear as the test taker responds correctly, and incorrect answers lead to lower-difficulty questions. What this means, however, is that test takers do not have the option of skipping problems and returning to them later. Instead, each question must be answered before the next one is presented.
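To make those mechanics concrete, the following is a deliberately simplified sketch of how an adaptive multiple-choice section might pick its questions. The difficulty scale, question pool and scoring rule here are hypothetical stand-ins, not ETS's actual method, which the Bulletin describes only in general terms.

```python
import random

# Hypothetical question pool: difficulty levels 1 (easiest) to 5 (hardest),
# with several stand-in items per level. Real GRE items are not public.
QUESTION_POOL = {
    level: [f"level-{level} question #{i}" for i in range(1, 7)]
    for level in range(1, 6)
}

def run_adaptive_section(num_questions, answer_fn):
    """Present questions one at a time, adjusting difficulty after each answer.

    `answer_fn(question)` must return True (correct) or False (incorrect);
    there is no way to skip a question or revisit an earlier one, which is
    the constraint the column objects to.
    """
    difficulty = 3                  # begin with a middle-difficulty item
    answered = []                   # (question, difficulty, was_correct)
    for _ in range(num_questions):
        question = random.choice(QUESTION_POOL[difficulty])
        correct = answer_fn(question)
        answered.append((question, difficulty, correct))
        # A correct answer raises the difficulty of the next item; an
        # incorrect answer lowers it. Each response is final once given.
        if correct:
            difficulty = min(difficulty + 1, 5)
        else:
            difficulty = max(difficulty - 1, 1)
    # Toy scoring rule: harder questions answered correctly count for more.
    return sum(d for _, d, ok in answered if ok)

if __name__ == "__main__":
    # Simulate a test taker who answers correctly about 70 percent of the time.
    score = run_adaptive_section(28, lambda q: random.random() < 0.7)
    print("simulated section score:", score)
```

Even in this toy version, each answer has to be committed before the next question can be chosen, which is exactly why skipping a problem and coming back to it later is impossible.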

The paper version includes two verbal sections of 38 questions each and two quantitative sections of 30 questions each, compared with 30 verbal questions and 28 quantitative questions in the computer version. The Registration Bulletin explains that the computer-based test is an adaptive test, with questions tailored to individual performance levels. Yet knowing that each question ultimately carries more weight in the computer-based version only increases test-taker anxiety.

The computer-based test program itself is simple, and ETS provides a free GRE PowerPrep CD-ROM to all computer-based test takers to orient them to the design of the computer version. Although the format is relatively easy to follow, the way the test is scored remains a mystery to many test takers. ETS needs to re-evaluate its method for assessing these tests. A computer-based test that was not adaptive, but instead followed the basic format of the paper test, might help eliminate some test-taker anxiety. A paper test booklet could accompany a non-adaptive computer test, giving test takers the advantages of paper when it comes to working out mathematical problems, underlining reading passages and eliminating incorrect answers, while still asking them to report their answers on the computer.

ETS has made some significant strides in standardized testing through the introduction of computer-based examinations. If the organization makes some changes in the way such tests are scored, test takers eventually will be willing to throw out their Scantron sheets for good.

(Stephanie Batten's column appears Tuesdays in The Cavalier Daily. She can be reached at sbatten@cavalierdaily.com.)
