Heterogeneous Answer Choices and Non-Parallel Question Construction

Understanding Heterogeneity
Answer choices in multiple-choice assessments should be homogeneous and use parallel construction, meaning the format, length, style, category, and verb tense of the answer choices should be similar. Heterogeneous answer choices lack this parallel construction. Options that are unique or stand out are more likely to be identified as the correct answer by test-wise students, even if they lack actual knowledge of the content.
Common forms of heterogeneity include variations in length (one option significantly longer or shorter), different levels of specificity (some options general, others highly detailed), mixed formats (some numerical, others descriptive), and inconsistent grammatical structures. As Haladyna, Downing, and Rodriguez (2002) emphasize, maintaining parallelism in answer choices is a fundamental principle of effective item writing. When this principle is violated, structural cues rather than content knowledge can point students to the correct answer.
Heterogeneous answer choices introduce construct-irrelevant variance by providing clues that are unrelated to the content being assessed. This flaw allows test-wise students to identify the correct answer based on structural anomalies rather than genuine understanding, which undermines the validity of the assessment.
Consider the following example. Even with no knowledge of computer science, a test-wise student can make an educated guess at the correct answer:
Why Longer, More Detailed Answer Choices Are Usually the Correct Answer
Longer, more detailed answer choices are often correct in multiple-choice assessments because item writers, consciously or not, tend to include more qualifiers, explanations, or specific details in the correct answer to ensure its accuracy and defensibility. This results in the correct option standing out due to its length or complexity, while distractors are typically shorter and more general.
Exam writers often draft the correct answer choice first, then write the distractors. Adding specific, believable qualifiers to a plausible but incorrect distractor is much harder than adding them to the correct answer, so distractors tend to come out shorter, more generic, and less specific. The result is a heterogeneous option set that cues test-wise students to the correct response.
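The length cue described above is simple enough to demonstrate. The sketch below uses a hypothetical option set and a made-up heuristic (not ExamEval's actual analysis) to show how a "test-wise" guesser with zero content knowledge can exploit a non-parallel option:

```python
# A naive "test-wise" guesser: with no content knowledge at all,
# simply pick the longest answer choice. The options below are
# hypothetical, written to illustrate a length-heterogeneous item.
def guess_by_length(options: dict[str, str]) -> str:
    """Return the letter of the longest answer choice."""
    return max(options, key=lambda letter: len(options[letter]))

choices = {
    "A": "A stack",
    "B": "A queue, which removes elements in first-in, first-out (FIFO) order",
    "C": "A tree",
    "D": "A graph",
}

# The heavily qualified option stands out purely on length.
print(guess_by_length(choices))
```

Because the correct option carries the extra qualifiers, this content-blind heuristic lands on it, which is exactly the validity problem heterogeneous choices create.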
Example of Heterogeneous Answer Choices
Homogeneous Options Improve One-Best-Answer Question Formats
Assessment experts recommend a one-best-answer (or single-best-answer) format in which the answer choices can be ranked or ordered from least to most true along a single dimension. If answer choices are heterogeneous across multiple dimensions, this ranking or ordering process is not possible.
Consider this example in which the flawed question has heterogeneous answer choices across multiple dimensions:
Creating Homogeneous, Parallel Answer Choices
Effective answer choices should be homogeneous across multiple dimensions:
- Similar Length: Options should be approximately the same number of words or characters to avoid length bias.
- Consistent Specificity: All options should operate at the same level of detail—either all general or all specific.
- Uniform Format: Use consistent formatting, capitalization, and punctuation across all options.
- Parallel Grammar: Ensure all options follow the same grammatical pattern relative to the question stem.
- Equivalent Complexity: Options should require similar levels of sophistication to understand and evaluate.
- Same Domain or Concept: Options should all share a concept or domain (e.g., all medications, all risk factors for a disease, or all pathophysiological processes).
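Some of these checks can be partially automated. As a rough sketch (a simple heuristic of our own, not ExamEval's method), the function below flags options whose word count diverges sharply from the median of their peers, catching the "one option much longer or shorter" flaw:

```python
from statistics import median

def flag_length_outliers(options: list[str], ratio: float = 1.5) -> list[str]:
    """Flag options whose word count exceeds `ratio` times the median
    word count of the OTHER options, or falls below median / `ratio`.
    Using the median keeps one extreme option from skewing the baseline.
    """
    counts = [len(opt.split()) for opt in options]
    flagged = []
    for i, count in enumerate(counts):
        baseline = median(counts[:i] + counts[i + 1:])
        if count > baseline * ratio or count < baseline / ratio:
            flagged.append(options[i])
    return flagged

# Hypothetical option set: one option is far longer than its peers.
opts = [
    "Increased heart rate",
    "Decreased blood pressure",
    "A sustained rise in systolic blood pressure caused by increased vascular resistance",
    "Elevated temperature",
]
print(flag_length_outliers(opts))
```

A real review would still need human judgment for the other dimensions (specificity, grammar, domain), but even a length screen like this surfaces the most common and most exploitable cue.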
Assessments with heterogeneous answer choices can compromise exam validity by enabling test-wise students to guess correctly without demonstrating true content mastery, eroding confidence in exam results. ExamEval's AI-powered exam analysis platform automatically identifies and corrects this type of item-writing flaw to boost assessment reliability and improve student learning outcomes. Discover how ExamEval can support health professions educators.
References
- National Board of Medical Examiners (NBME). Item-Writing Guide. Philadelphia, PA: National Board of Medical Examiners; February 2021.
- Haladyna TM, Downing SM, Rodriguez MC. A review of multiple-choice item-writing guidelines for classroom assessment. Appl Meas Educ. 2002;15(3):309-334. doi:10.1207/S15324818AME1503_5