Adapted from an article that originally appeared in Thrust for Educational Leadership, May-June 2000.
Assessing computerized testing
Testing and assessment are prime candidates for technological innovation, as the pencil-and-paper, blue book, and fill-in-the-blank model of testing is nearing the end of its useful life. Just over the horizon are new methods, new mechanisms, and new modes of test development, delivery, taking, and administration.
The cost of test administration is high, particularly when accounting for test creation, reproduction, distribution, delivery, grading and evaluation, and post-test review. The time taken out of the instructional day is equally expensive: while students are taking a test, they are arguably not learning anything new. Moreover, for a test to be most effective as a learning tool, immediate review of test results by test-takers is essential.
If you thought the introduction of the ScanTron was testing nirvana, wait until you see the potential of computer-delivered testing. Already widespread in private-sector certification programs (particularly in the technology field), computerized assessment is more than just multiple-choice. From performance assessments to short answers to essay questions, computer-based testing is the wave of the future.
One example of a comprehensive computer-based testing program comes from the Educational Testing Service (ETS). ETS uses computers to deliver and evaluate several tests, including the Graduate Management Admission Test (GMAT), Graduate Record Examinations (GRE), Test of English as a Foreign Language (TOEFL), the Praxis Series of assessments for beginning teachers, parts of the National Board for Professional Teaching Standards (NBPTS) assessments, and restricted portions of the SAT I: Reasoning Test.
ETS relies on software to automate the scoring of essays by using what amounts to a sophisticated grammar checker coupled with rules for determining essay content. For more information about the ETS computer-based testing approach, visit their website.
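ETS's actual scoring engine is proprietary, but the general idea of pairing a grammar-style check with content rules can be sketched in a few lines. The sketch below is purely illustrative: the function name `score_essay`, the rules, and the weights are all invented for this example, not taken from any real scoring system.

```python
def score_essay(text, expected_keywords):
    """Return a 0-100 score from simple surface rules (illustrative only)."""
    words = text.split()
    if not words:
        return 0
    # Crude sentence segmentation: treat ! and ? like periods.
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    # Rule 1: content -- what fraction of the expected keywords appear?
    lower = text.lower()
    coverage = sum(1 for k in expected_keywords
                   if k.lower() in lower) / len(expected_keywords)
    # Rule 2: mechanics -- what fraction of sentences start with a capital?
    capitalized = sum(1 for s in sentences if s.strip()[0].isupper())
    mechanics = capitalized / len(sentences) if sentences else 0
    # Invented weighting: content counts more than mechanics.
    return round(100 * (0.7 * coverage + 0.3 * mechanics))
```

A real engine uses far richer linguistic features, but even this toy version shows both the appeal (instant, consistent scores) and the weakness (a creative essay that skips the expected keywords scores poorly).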
Yes, there have been highly publicized disasters in computerized scoring of essays, notably when software fed classics of English literature has returned failing grades. Moreover, as anyone who relies on standard word-processing grammar checkers knows, sometimes the software gets it wrong. However, with proper safeguards, even the most creative uses of language can be evaluated fairly. And the software is improving all the time.
At first blush, computerized testing may also seem expensive: all those computers cost already-scarce money. But why not leverage the technology already in place in your school or district to bring one more benefit to your students? Computerized tests can be delivered in a number of venues, including via the Internet, at single-purpose kiosks, in the classroom, or in the campus computer lab. This flexibility, coupled with the elimination of duplication and physical delivery costs and the automated scoring of tests, can actually reduce district expenses.
Recent research has uncovered a few flaws in computer-based testing and assessment, however. Among the concerns is that students without adequate keyboarding skills (generally lower than 20 words per minute) score lower on computer-based tests than on traditional pencil-and-paper tests, even when asked identical questions.
Additionally, research into human review of identical essays found that hand-written answers often receive higher scores than computer-printed responses.
These and other issues will need to be addressed, either by beefing up student test-taking skills or through a technical remedy.
Try it yourself
Vendors will probably be clamoring for your attention as products come to market to simplify computer-based test development, delivery, taking, grading and evaluation, administration, and post-test review. A researcher at the University of Southern California noted in a recent conversation with me that widespread public-education deployment of computer-based testing (multiple-choice, short answer, and essay questions, and beyond into knowledge mapping) is about five years out.
In the meantime, I have developed a quick-and-dirty multiple-choice test on ACSA's website to demonstrate how computerized testing could work. Take my pop quiz. Additional resources for web-based testing (easily adaptable to a campus network) are available at the same URL.
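The mechanics behind a quick-and-dirty multiple-choice quiz like the one described above are simple enough to sketch. The questions, answers, and helper names below are invented for illustration; they are not the actual ACSA quiz.

```python
# A toy multiple-choice quiz engine; the question data is a made-up example.
QUIZ = [
    {"question": "Which graduate-admissions test does ETS deliver by computer?",
     "options": ["GRE", "Road test", "Spelling bee"],
     "answer": "GRE"},
]

def grade(responses, quiz=QUIZ):
    """Compare responses to the answer key; return (number correct, percent)."""
    correct = sum(1 for item, resp in zip(quiz, responses)
                  if resp == item["answer"])
    return correct, round(100 * correct / len(quiz))
```

Hooked to a web form or a campus network client, the same comparison-against-a-key logic lets students see their results the moment they finish, which is exactly the immediate-review benefit discussed earlier.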
By investigating the possibilities of using the latest technology to streamline your testing and assessment program now, you will ensure a smooth transition for your schools, your districts, and your students.
Marc Elliot Hall is ACSA's Webmaster. Give him your assessment of this column by calling (916) 444-3216 or via e-mail.
Copyright 2000, Marc Elliot Hall, DBA Sensation! Services