Friday, November 18, 2016

Do Testwise Strategies Really Work? (Testwiseness Part Three)

Hell, yeah.

Here is a personal anecdote from back when I worked as a full-time Test Development Specialist.

I had been working for a few years as a Test Specialist when the senior manager suddenly realized that there was only one person on staff who knew how the software for grading our exams worked. As it occurred to him that the whole operation could grind to a halt were something to happen to that individual, he transferred me into the computer programming section as the backup programmer. (This made sense to him because I was the only person back in those days—30 years ago!—who owned his own personal computer, so I looked like a computer guy to him.) Since I had no programming training, and since all our computer programs were written in FORTRAN, my boss signed me up for a university-level correspondence course on FORTRAN programming. By the time the course materials had arrived in the mail two weeks later (this was before online courses!), however, my boss had come to his senses, hired a team of FORTRAN programmers, and switched me back to writing tests. I therefore had no further need to learn FORTRAN, and I pushed the unopened box of course materials under my bed and forgot about it.

Six months later I received a notice in the mail saying the examination for the FORTRAN course would be held at such and such a location in a week's time. And I thought, "Well, the course has already been paid for, why not take the test?" Not, you understand, that I had ever opened the box or looked at any of the course materials. I just wondered if I could pass the test on my knowledge of test design alone.

The examination, if I recall correctly, was 70% multiple-choice and 30% written response. I had no hope of answering the written-response part, since I had absolutely no idea how to write a FORTRAN program. I did repeat back the question (because some markers wrongly give at least one mark for a student filling in the space with something) and attempted a line or two of code based on examples from the multiple-choice part of the test, but since I had no idea what I was copying, I highly doubt it made any sense at all.

The multiple-choice questions, however, were a different matter. As a test specialist, I was able to examine each question for the tiny flaws that gave the answer away. Examining the alternatives for the odd one out, for one answer significantly longer than the others, and so on (see the previous post for these techniques), I gave it the old college try, even though I often had no idea what the question was asking, let alone what the correct answer might be.

Three weeks later I received a letter saying I had obtained 70% on the test, which, considering I had left the written response essentially blank, means I must have aced the multiple-choice section. Not bad for a topic I literally knew nothing about!

I subsequently wrote to the institution in question pointing out this weakness in their tests, and some time later they hired me to give a one-day workshop on the do's and don'ts of multiple-choice item writing. There is, therefore, no use asking me how to register for that course: these techniques won't work to pass that test again!

This was admittedly an extreme case, and, to be fair, such tests are not intended for people who make their living designing multiple-choice tests. Most people would not have found the flaws quite so obvious.

In sharp contrast, I sat down with a copy of Alberta's Grade 12 Mathematics Diploma Examination to see what I could get using these testwise tricks. In this case, I managed only 23%, slightly less than the score one would expect from pure chance. This test had been written by test experts, and it was impossible for me to get through it without actually knowing the course content (calculus), which I did not.
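For context, "pure chance" here just means the expected score from blind guessing. Below is a minimal simulation sketch, assuming four options per item and a hypothetical 40-item multiple-choice test (both assumptions; I don't have the exam's exact format):

```python
import random

# Minimal sketch: expected score from blind guessing on multiple choice.
# Assumes four options per item and a 40-item test (both hypothetical).
OPTIONS = 4
ITEMS = 40
TRIALS = 10_000

def blind_guess_score():
    # Each blind guess is correct with probability 1/OPTIONS.
    return sum(random.randrange(OPTIONS) == 0 for _ in range(ITEMS)) / ITEMS

average = sum(blind_guess_score() for _ in range(TRIALS)) / TRIALS
print(f"Average blind-guessing score: {average:.1%}")  # comes out around 25%
```

Run it and the average comes out around 25%, which is why scoring 23% means the testwise tricks bought me essentially nothing on that exam.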

Most classroom teachers' tests fall somewhere between these two extremes. A student will not likely be able to pass a teacher-constructed exam solely by knowing testwise strategies, but such strategies are likely to improve the testwise student's grade by a significant percentage if the teacher does not know and apply correct test-construction technique.


In that context, it is worth noting that teachers should be cautious about using the test banks that come with textbooks. Some of the larger textbook publishers do employ professional test developers as editors for their test banks, but many do not. Test banks are sometimes written by the author of the textbook (who likely has no expertise in test construction) but are more often dashed off over a weekend before the publisher's deadline by graduate students from that discipline desperate for cash (and who definitely do not have test-construction skills). In other words, don't assume the test bank is worth anything.

At a minimum, any instructor using a test bank written by someone else should edit the questions before putting them on a test, to ensure the questions are less flawed and more relevant to their own classroom. (On the other hand, starting from a flawed test bank is sometimes easier than writing one's own items from a blank page. Using a test bank as an initial rough draft can be okay, provided one guards against it being all rote memorization. You will probably need to write your own higher-level thinking questions, because publishers can't pay grad students enough to come up with those!)
