Back in January, I wrote a blog post about item-writing best practices for exams. The list was by no means complete, so now, months later, I’m offering advanced suggestions for quality exam items. As I stated previously, exam quality says a lot about your certification program.
Your exams should accurately measure the skills you outlined in your exam blueprint. If you have items that go off script and don’t really match the blueprint, your exams miss the mark and confuse candidates. Many programs, like mine at Kinaxis, publish the blueprint so candidates can see how the exam topics are broken down by percentage, and what the specific objectives are. In the Kinaxis program, we feel the candidate should know what is expected of them. We do not want to frustrate our candidates or waste their time studying topics that aren’t on the exam.
This blog post describes additional best practices for exam item construction, drawn from my industry experience. These are the same strict rules and methods I have my SMEs follow on the exams we develop at Kinaxis. I do “practice what I preach.”
Exam Item Construction Best Practices
Some additional best practices to follow for your items:
- Ensure each item is both technically and cognitively congruent with your blueprint objectives
An item is congruent with an objective if the item tests the knowledge, skill, or ability defined in the objective. To be cognitively congruent, a reasoning objective should have a reasoning item, not a recall one.
- Test your candidates on relevant knowledge
Avoid using corner-case examples or seldom-used product features as the subject of an item. Test candidates on the knowledge and skills they need to perform their daily work. Is it really important to know which year a product went to market?
- A True/False question on a certification exam, really?
Items should have four or five answer options, no more, and be 1-out-of-4, 2-out-of-4, or 3-out-of-5 questions. Sample guessing probabilities: 1 of 4 = 25%, 2 of 4 = 16.7%, 3 of 5 = 10%. A True/False question sits at 50%, a coin flip. I actually saw a 6-out-of-14 question, and it was a “Choose all that apply” rather than telling the candidate how many correct answers there are.
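The combinatorics behind these guessing odds can be sketched in a few lines. This is a minimal illustration, assuming the candidate must select exactly the stated number of answers and picks them uniformly at random (the function name `guess_probability` is mine, not from the original post):

```python
from math import comb

def guess_probability(correct: int, options: int) -> float:
    """Probability of guessing all correct answers by selecting
    exactly `correct` options at random out of `options` total."""
    # Only one of the C(options, correct) possible selections is fully right.
    return 1 / comb(options, correct)

# Guessing odds for the common item formats, plus True/False
for k, n in [(1, 4), (2, 4), (3, 5), (1, 2)]:
    print(f"{k} of {n}: {guess_probability(k, n):.1%}")
# → 1 of 4: 25.0%
# → 2 of 4: 16.7%
# → 3 of 5: 10.0%
# → 1 of 2: 50.0%
```

Note how quickly the guessing probability drops once multiple selections are required, and how a True/False item (1 of 2) is no better than a coin flip.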
- Avoid value judgment items unless you specify the judging criteria
- Eliminate words that can have multiple meanings, or meanings that are alien to some cultures. Write to a 6th-grade level of English.
Two words I learned from experience to avoid are “via” and “prone”. Many candidates from Asia had trouble with these two words, as indicated in their comments during beta exams. If you have examples of your own, please put them in the comments to share with everyone.
- Watch out for enemy items
An enemy item is one that gives away the answer to another item. If you develop multiple exam forms, enemy items need to be placed on separate forms. If you randomly pull items from a pool, then one of the pair needs to change.
- Stretch your item bank by using variants and isomorphs
A variant tests the same concept as another item but has a different answer; alternatively, it may share the same answer but use a different stem or scenario, or otherwise be a variation of the first question. An isomorph keeps the stem the same but uses a different set of answers and distractors. You could take a “Choose two” question and break it up into two single-answer isomorphs, as long as the answers and distractors differ.
- Make sure your scenario items are real scenarios
If your item can be answered with the scenario or exhibit covered up, then you have a recall item, not a reasoning item. Perhaps you need to remove some of the details contained in the stem. Also, a scenario is typically a condition, problem, or situation; it is not a short story or a treatment for a movie. Remember the candidates who have English as a second language (ESL). If you allow x seconds per item on the exam, and an item takes x+20 seconds to answer, you have a real problem with that item. I once had an SME write a scenario that took over 8 minutes to read aloud, on an exam where we allowed 90 seconds per item.
- If you use graphical exhibits, ensure the graphic is not blurry and has uniform text
I have seen graphics created in PowerPoint with the spell-check redline still showing, or with overlaid text appearing in different fonts or sizes. Also, keep the picture size to a maximum of 800 x 600 pixels so the candidate does not have to scroll horizontally or vertically to see the entire exhibit; that takes time away from the thought process.
- If you have textual exhibits, limit the length to 24 lines
Consider how long it takes someone to read and comprehend a text exhibit. Even 10 lines of code can take a long time to absorb, so keep the text large enough to read and parse, but don’t present a raw dump. Also, don’t use a graphic of text as a replacement for a text exhibit. Obey the minimum resolution requirements of your test delivery provider.
- Focus on one concept per item
- Avoid “not” questions or negative phrasing in the scenario or stem
- Do not use words like “always”, “never”, “best” (unless “best” is a documented “best practice”)
- Never use “all of the above”, “none of the above”, or “only B & C”
- Eliminate the phrase “Which of the following…”
- Double-cue multiple-answer items: state the number of correct answers in both the stem and the instruction (e.g., “Which two options…? Choose two.”)
- For multiple-choice questions, use one of these variations: 1 out of 4, 2 out of 4, or 3 out of 5
- Make your distractors plausible and parallel, and don’t make things up that are not part of your product or software
- Avoid asking trivia questions
As I stated in my last blog post, you may not agree with some of these suggestions, and that is fine. As a program leader, you get to decide for yourself how your exams will be presented. My goal is to present additional industry best practices, and maybe even to stir up some debate. Please feel free to add suggestions of your own.
Joe Cannata is the Certification Director at Kinaxis and the CEdMA Membership Trustee.