Are scenario based items associated with more omitted answers in progress testing?
Are scenario-based items associated with
more omitted answers in progress testing?
AMEE Conference - Milan, September 2nd, 2014.
C.F. Collares, Maastricht University
A.M.M. Muijtjens, Maastricht University
M.M. Verheggen, Maastricht University
D. Cecilio-Fernandes, University Medical Centre Groningen
R.A. Tio, University Medical Centre Groningen
C.P.M. van der Vleuten, Maastricht University
FHML – Dept. of Educational Development and Research – School of Health Professions Education
Background
• Formula scoring (number-right score minus
penalties for wrong answers) has been useful in progress testing (PT):
- prevention of undeserved scores due to guessing
- generalizability (especially for early years)
- promotion of metacognitive skills
• International Progress Test (IPT) Committee:
increase the number of scenario-based items to expose
students early to more authentic and relevant items
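The formula-scoring rule described above can be sketched in code. The per-wrong-answer penalty of 1/(k−1) is a common convention assumed here for illustration; the slides do not state the exact penalty used in the IPT.

```python
def formula_score(n_right: int, n_wrong: int, n_alternatives: int) -> float:
    """Number-right score minus a guessing penalty per wrong answer.

    With a penalty of 1/(k-1), blind guessing has zero expected gain:
    a guess among k alternatives is right 1/k of the time (+1) and
    wrong (k-1)/k of the time (-1/(k-1)), which cancels on average.
    Omitted answers ("I don't know") score zero, so omitting is
    rational whenever confidence is below chance-corrected level.
    """
    penalty = 1.0 / (n_alternatives - 1)
    return n_right - penalty * n_wrong

# A student with 40 right, 10 wrong, and some omissions on 5-option items:
print(formula_score(40, 10, 5))  # 37.5
```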
Background
• Apparent increase in omitted answers (the “I don’t
know” option, the “question mark”)
• Student dissatisfaction, demotivation and
disengagement
• Educational utility and future of progress testing
paradoxically in jeopardy
Method
• Instrument: one edition of the International Progress Test
• Participants: 198 students
• Dependent variable: % of omitted answers for each item
• Independent variables:
- clinical scenario
- stem word count
- item number
- number of alternatives
- dummy variables related to the content domains (subscores).
• Bootstrapped multiple linear regression (SPSS)
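The analysis above (SPSS) can be sketched as a bootstrapped ordinary-least-squares regression. The data here are simulated and the variable names and effect sizes are illustrative only, not the study's data; the point is the resample-and-refit structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated item-level predictors (illustrative, not the study's data):
n_items = 200
X = np.column_stack([
    rng.integers(0, 2, n_items),      # clinical-scenario dummy (0/1)
    rng.integers(10, 120, n_items),   # stem word count
    np.arange(1, n_items + 1),        # item number (position in test)
    rng.integers(3, 6, n_items),      # number of alternatives
])
# Simulated outcome: % omitted, with an assumed scenario effect of +4
y = 5 + 4 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 2, n_items)

def ols_coefs(X, y):
    """Fit OLS with an intercept and return the coefficient vector."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

# Bootstrap: resample items with replacement, refit, collect coefficients
boot = np.array([
    ols_coefs(X[idx], y[idx])
    for idx in (rng.integers(0, n_items, n_items) for _ in range(1000))
])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5], axis=0)
print("scenario coefficient 95% CI:", ci_low[1], ci_high[1])
```

The percentile interval for the scenario dummy is the bootstrapped analogue of the coefficient's confidence interval reported by SPSS.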
Results
Regression coefficients (all years)
R = 0.577; R² = 0.333
Durbin-Watson = 2.124
ANOVA: F(20) = 4.078, p < 0.001
Results by year
• The regression coefficients of scenario-based items
on question-mark prevalence were highest, and their
p values lowest, in Year 1.
• The observed increase of omitted answers in
scenario-based items gradually decreased until it
disappeared in the fourth academic year.
Conclusion
• Scenario-based items were associated with
more omitted answers, independently of
content, number of alternatives, item length and
item position, particularly in the early years
• The current progress testing framework may be
associated with a delay in students’ engagement
with solving scenario-based items
What now?
Is progress testing as we know it
“wearing off”?
Is it time for a new progress testing
framework?
How can we keep the positive
effects of progress testing on
metacognition while enhancing
item relevance and authenticity
and ensuring accurate scores?
PROPOSAL FOR A NEW
PROGRESS TEST FRAMEWORK
It is common for mentors to coach students to
estimate their degree of “certainty” or
“confidence” in their answers before writing
them on the progress test answer sheet.
>75% confidence = go for it and answer!
50-75% = think again, leave it for later
<50% = “I don’t know” (?)
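The coaching heuristic above maps directly to a small decision function; a minimal sketch, with the thresholds taken from the slide:

```python
def answer_strategy(confidence: float) -> str:
    """Map self-estimated confidence (0-1) to the test-taking
    decision mentors commonly coach under formula scoring."""
    if confidence > 0.75:
        return "answer"          # >75%: go for it
    if confidence >= 0.50:
        return "reconsider"      # 50-75%: think again, leave for later
    return "omit (?)"            # <50%: mark "I don't know"

print(answer_strategy(0.9))  # answer
print(answer_strategy(0.6))  # reconsider
print(answer_strategy(0.3))  # omit (?)
```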
The key to progress testing future...
...might be in the past.
Separate “confidence marking”
(Leclercq, 1975; Leclercq, 1982)
Enhanced progress testing framework
• Metacognitive knowledge gets its own score
• “Confidence” levels are expressed on a Likert scale
• Each category of the scale represents a % range
• Penalties for wrong answers go to the
metacognition scores only
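The proposed split can be sketched as follows. The four-category scale, its percentage ranges, and the point values are assumptions for illustration; the slides specify only the general structure (knowledge scored without penalties, penalties moved to a separate metacognition score).

```python
# Hypothetical Likert categories mapping to confidence ranges
# (assumed for illustration; not specified on the slides):
CONFIDENCE_RANGES = {
    1: (0.00, 0.25),
    2: (0.25, 0.50),
    3: (0.50, 0.75),
    4: (0.75, 1.00),
}

def score_item(correct: bool, confidence_level: int) -> tuple[int, int]:
    """Return (knowledge_points, metacognition_points) for one item.

    Knowledge score: plain number-right, with no guessing penalty.
    Metacognition score: rewards high confidence on right answers and
    penalises high confidence on wrong ones, so the penalty for a
    wrong answer affects only the metacognition score.
    """
    knowledge = 1 if correct else 0
    metacognition = confidence_level if correct else -confidence_level
    return knowledge, metacognition

print(score_item(True, 4))   # (1, 4)
print(score_item(False, 4))  # (0, -4)
```

With this split, a confident wrong answer still counts zero (not negative) toward knowledge, while the miscalibration is captured in the metacognition score.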
Expected adjustments
• Fewer items per test
• Multiaxial blueprint with flexible item tagging
(clusters, competences, categories, disciplines) to
allow more items per subscore
• Separate norms for the metacognition scores
• Scores reported as standardized item response
theory (IRT) “theta scores” (mean = 500; SD = 100)
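The rescaling in the last bullet is a linear transformation of the usual IRT theta (mean 0, SD 1) onto the reported metric; a minimal sketch:

```python
def scale_theta(theta: float, mean: float = 500.0, sd: float = 100.0) -> float:
    """Rescale an IRT theta estimate (typically mean 0, SD 1 in the
    calibration sample) to the mean-500 / SD-100 reporting metric."""
    return mean + sd * theta

print(scale_theta(0.0))   # 500.0  (average ability)
print(scale_theta(1.5))   # 650.0  (1.5 SD above average)
```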
Expected outcomes
• Better item relevance and professional authenticity
• Strengthened metacognitive regulation
• Increased reliability (Ferrando et al., 2013)
• Less construct-irrelevant score variance (improved validity)
• Early student engagement in scenario-based items
• Authentic items promoting learning and transfer
• Better educational utility
Thank you!