
Unfocused, Unclear, and Open Question Stems Undermine Assessment Quality

By Sean P. Kane, PharmD, BCPS
Published June 29, 2025

Unfocused or Ambiguous Questions Reduce Assessment Validity

The question stem serves as the foundation of any multiple-choice item. An unfocused, unclear, or open stem undermines assessment quality in one of three common ways:

  1. Unfocused stems fail to clearly specify what is being assessed, either covering multiple topics or lacking sufficient detail.
  2. Unclear stems use ambiguous language that forces students to guess the question writer's intent.
  3. Open stems use fill-in-the-blank or sentence-completion formats instead of a clear, directed question, leaving students uncertain about what is being asked and increasing cognitive load.

These problems force students to spend cognitive resources on interpretation rather than demonstrating knowledge, which reduces the validity of the assessment.

Consider this example of a poorly constructed stem:

Vague Medical Knowledge Assessment
Regarding diabetes:

A. It affects blood sugar levels
B. Lifestyle modifications are important
C. Regular monitoring is recommended ✓
D. Medications may be prescribed
This open stem "Regarding diabetes:" provides no focus or context. All answer choices could be considered correct depending on interpretation. Students must guess what aspect of diabetes is being assessed rather than applying specific knowledge.
Critical Issue

Unfocused, unclear, or open stems introduce construct-irrelevant variance by forcing students to guess the question writer's intent. This shifts the assessment from a measure of content knowledge to a test of interpretation and inference, which undermines the validity of the results.

The Cover the Options Rule

A helpful technique for identifying an unfocused stem is the "cover the options" rule. If a student can cover the answer choices and still formulate a reasonable answer to the question, the stem is likely well-focused. If the student cannot, the stem is likely unfocused and needs to be revised.

As Dell and Wantuch (2017) recommend, a well-written stem should consist of a self-contained question. This ensures that the question is clear, focused, and effectively assesses student knowledge.
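
For large question banks, the "cover the options" rule can also be approximated programmatically. The sketch below is a minimal, illustrative Python heuristic; the function name, vague-opener list, and word-count threshold are assumptions for demonstration, not a published or validated algorithm.

# Minimal heuristic sketch: flag stems that probably cannot stand alone as a
# self-contained question. The opener list and threshold are illustrative only.
VAGUE_OPENERS = ("regarding", "concerning", "with respect to",
                 "which of the following is true")

def is_likely_open_or_unfocused(stem: str) -> bool:
    """Return True if the stem likely fails the "cover the options" rule."""
    text = stem.strip().lower()
    if not text.endswith("?"):        # sentence completion, trailing colon, etc.
        return True
    if len(text.split()) < 6:         # too little context to answer without the options
        return True
    return any(text.startswith(opener) for opener in VAGUE_OPENERS)

if __name__ == "__main__":
    print(is_likely_open_or_unfocused("Regarding diabetes:"))   # True: open stem
    print(is_likely_open_or_unfocused(
        "A patient with severe kidney disease requires medication dosing "
        "adjustments. Which pharmacokinetic process is most significantly affected?"
    ))                                                          # False: self-contained question

A check like this only catches surface cues; a stem ending in a question mark can still be vague, so human review against the rule remains the reference standard.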

Example of an Unfocused Question Stem in Health Sciences Education

Flawed Question
Pharmacokinetics includes the following aspects in relation to drug therapy:

A. Absorption, distribution, metabolism, and elimination ✓
B. Drug interactions and contraindications
C. Patient compliance and monitoring
D. Dosing frequency and route of administration
This stem is vague and unfocused. "Pharmacokinetics includes the following aspects in relation to drug therapy:" doesn't specify the context or what type of consideration is being assessed. Multiple options could be correct depending on the specific pharmacokinetic situation being addressed.
Corrected Question
A patient with severe kidney disease requires medication dosing adjustments. Which pharmacokinetic process is most significantly affected and requires primary consideration?

A. Absorption from the gastrointestinal tract
B. Distribution to target tissues
C. Hepatic metabolism of the drug
D. Renal elimination of the drug ✓
The corrected version provides specific context (kidney disease, dosing adjustments) and asks for the most significantly affected process. This focuses the question on renal elimination while requiring understanding of how kidney disease impacts pharmacokinetics. When applying the cover the options rule, the stem alone makes it clear what is being asked, allowing a knowledgeable student to anticipate the required answer without seeing the options.

Open-Ended Questions with Fill-In-The-Blank

Assessment experts recommend a closed-ended format in which the stem poses a complete, direct question. An open-ended, sentence-completion format may be reasonable only if the item has a clear, focused premise and the blank falls at the end of the statement. Sentence-completion items are less preferred because they are more prone to unfocused stems to which the "cover the options" rule cannot be applied.

Blanks at the beginning or middle of the stem increase cognitive load for test-takers, often forcing students to re-read the question with each answer choice before they can identify the correct one. Consider the following example:

Flawed Question
The process of _____ is essential for the elimination of most drugs from the body.

A. absorption
B. distribution
C. metabolism
D. excretion ✓
This sentence-completion format places the blank in the middle of the sentence, forcing students to re-read the item with each answer choice to determine which fits best. This increases cognitive load and can obscure the focus of the question, making it less effective than a direct question format.
Corrected Question
Which process is essential for the elimination of most drugs from the body?

A. Absorption
B. Distribution
C. Metabolism
D. Excretion ✓
The corrected version uses a direct question format, clearly asking which process is essential for drug elimination. This reduces cognitive load and allows students to focus on applying their knowledge rather than interpreting the sentence structure.
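
A similar rough check can catch misplaced blanks automatically. The sketch below is illustrative only: it assumes a blank is written as a run of three or more underscores, and the allowed trailing punctuation is an assumption rather than a standard.

import re

def blank_position_ok(stem: str) -> bool:
    """Illustrative check: True if the stem has no blank, or the blank ends the statement."""
    match = re.search(r"_{3,}", stem)          # treat 3+ underscores as a blank (assumption)
    if match is None:
        return True                            # no fill-in-the-blank at all
    trailing = stem[match.end():].strip()
    return trailing in ("", ".", "?")          # only closing punctuation may follow the blank

if __name__ == "__main__":
    print(blank_position_ok(
        "The process of _____ is essential for the elimination of most drugs from the body."
    ))   # False: blank sits in the middle of the sentence
    print(blank_position_ok(
        "The pharmacokinetic process primarily responsible for eliminating most drugs is _____."
    ))   # True: blank falls at the end of the statement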

Correcting Stem Problems

The most effective approach to fixing poorly constructed stems involves these key strategies:

  • Single Learning Objective: Each stem should assess one clearly defined concept rather than multiple topics simultaneously
  • Realistic Scenarios: Present authentic professional situations that require knowledge application
  • Clear Task Direction: Specify exactly what students should determine, evaluate, or recommend
  • Sufficient Context: Provide enough background information to eliminate ambiguity without unnecessary details

By applying the "cover the options rule" and focusing on these core principles, question writers can create stems that accurately assess student knowledge rather than test-taking skills.
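
As one way to make these four strategies concrete during peer review, the hypothetical checklist below records each criterion as a yes/no judgment. The class and field names are illustrative, not part of any published rubric.

from dataclasses import dataclass

@dataclass
class StemChecklist:
    """Hypothetical manual-review checklist mirroring the four strategies above."""
    single_learning_objective: bool   # assesses one clearly defined concept
    realistic_scenario: bool          # presents an authentic professional situation
    clear_task_direction: bool        # states exactly what to determine, evaluate, or recommend
    sufficient_context: bool          # enough background to remove ambiguity, no excess detail

    def passes_review(self) -> bool:
        """A stem passes only if every criterion is satisfied."""
        return all((self.single_learning_objective, self.realistic_scenario,
                    self.clear_task_direction, self.sufficient_context))

if __name__ == "__main__":
    kidney_item = StemChecklist(True, True, True, True)        # the corrected pharmacokinetics item
    diabetes_item = StemChecklist(False, False, False, False)  # "Regarding diabetes:"
    print(kidney_item.passes_review(), diabetes_item.passes_review())  # True False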

This type of systematic item writing flaw detection and correction can be challenging to implement consistently across large exams. ExamEval, an AI-powered exam analysis platform, automatically identifies unfocused, unclear, and open stems while suggesting specific improvements to enhance assessment quality.

References

  1. National Board of Medical Examiners (NBME). Item-Writing Guide. Philadelphia, PA: National Board of Medical Examiners; February 2021.
  2. Haladyna TM, Downing SM, Rodriguez MC. A review of multiple-choice item-writing guidelines for classroom assessment. Appl Meas Educ. 2002;15(3):309-334. doi:10.1207/S15324818AME1503_5
  3. Rudolph MJ, Daugherty KK, Ray ME, Shuford VP, Lebovitz L, DiVall MV. Best Practices Related to Examination Item Construction and Post-hoc Review. Am J Pharm Educ. 2019;83(7):7204. doi:10.5688/ajpe7204
  4. Dell KA, Wantuch GA. Curr Pharm Teach Learn. 2017;9(1):137-144. doi:10.1016/j.cptl.2016.08.036
