Fundamentals of Assessment
Todd L. Green, Ph.D.
Associate Professor
Pharmacology, Physiology & Toxicology
PIES Seminar, 2-23-11
Use of terms
• Assessments = observations / measurements
• Evaluation = interpretations / conclusions
Uses of assessment
• Individual learner
– Educational action: Feedback
– Administrative action: Judgment on advancement
• Program
– E.g., sources of variance
– Quality improvement
– Accreditation
“Unit” of assessment
• Single interaction of student and content
• Series of encounters (observations by one teacher)
• Series of teachers (in a rotation)
• Series of teachers + exams (in a clerkship)
• Series of clerkships (in a department)
• Inter-clerkship (clinical year’s program)
PURPOSE (Why Assess)
• Did we meet goals and objectives?
• Were innovations successful?
• Determine program strengths and weaknesses
– Unanticipated strengths and problems
• Growing interest in educational outcomes
– “Residency programs should use outcome assessment in their evaluation of program educational effectiveness” [1]
• LCME, RRC require programmatic evaluation [1]
[1] Graduate Medical Education Directory, 2001-2002
PURPOSE (Why Assess)
• Competing goals of stakeholders
– “Tension between delivering the service and ensuring time for reflection and learning” [2]
• Help obtain and/or justify resources for your program
[2] Hayden, 2001
Process: Assessment & Evaluation
Goal → Domains → Standards → Methods → Data → Evaluation & Interpretation → Outcomes-based judgments
What are stakeholders and who are they?
Marshall University Joan C. Edwards School of Medicine Institutional Objectives
Patient Care: Students must be able to provide patient care that is compassionate, appropriate, and effective for the treatment of health problems and the promotion of health.
Medical Knowledge: Students must demonstrate knowledge of established and evolving biomedical, clinical, epidemiological and social-behavioral sciences, as well as the application of this knowledge to patient care.
Practice-based Learning and Improvement: Students must demonstrate the ability to investigate and evaluate their care of patients, to appraise and assimilate scientific evidence, and to continuously improve patient care based on constant self-evaluation and life-long learning.
Marshall University Joan C. Edwards School of Medicine Institutional Objectives
Interpersonal and Communication Skills: Students must demonstrate interpersonal and communication skills that result in the effective exchange of information and collaboration with patients, their families, and health professionals.
Professionalism: Students must demonstrate a commitment to carrying out professional responsibilities and an adherence to ethical principles.
Systems-based Practice: Students must demonstrate an awareness of and responsiveness to the larger context and system of health care, as well as the ability to call effectively on other resources in the system to provide optimal health care.
Undergraduate Medical Education

DURING:
• Logbooks
• Hours
• Critiques
• Student portfolios
• Review of write-ups
• Written exams
• Performance evaluation
• Grades/narratives
• Attitudinal

AFTER:
• Surveys
• Placing graduates
• Self assessment
• Exit interviews
• Site visit
• Written exams
• Performance evaluation (4th year OSCE, mini-CEX)
• 4th year medicine performance
Graduate Medical Education

BEFORE:
• Written exams
• Performance evaluation
• GPA
• Disciplinary action
• Narratives

DURING:
• Self assessment logs
• Hours
• Learner critiques
• 360 degree evaluation
• Chart review
• Attitudinal
• Written exams
• Performance evaluation
• Narratives
• Global evaluation

AFTER:
• Surveys
• Placing graduates
• Exit interviews
• Written exams
• Performance evaluation
• Research/Teaching
• National society participation
For each level (Students, Faculty, Courses/Clerkships, Overall Curriculum), ask:
• Who assesses?
• What is assessed?
• What methods are used?
• Who receives the assessment data collected?
• Who evaluates the assessment data collected?
• What decisions are made based on assessment data?
Undergraduate Medical Education
(columns: Students | Faculty | Courses/Clerkships | Overall Curriculum)
• Who assesses? Faculty | Students | Students & Faculty | LCME
• What is assessed? Knowledge | Teaching | Course organization | Course assessment data
• What methods are used? MCQ | Satisfaction survey | Satisfaction survey | USMLE Step I scores
• Who receives the assessment data collected? Course director | Department chair | Curriculum Committee | Self-study committee
• Who evaluates the assessment data collected? Course director | Department chair | Curriculum Committee | Curriculum Committee
• What decisions are made based on assessment data? Grade, promotion | Tenure | Course changes | Curriculum changes
Graduate Medical Education
(columns: Residents | Faculty | Rotations | Overall Program)
• Who assesses?
• What is assessed?
• What methods are used?
• Who receives the assessment data collected?
• Who evaluates the assessment data collected?
• What decisions are made based on assessment data?
Graduate Medical Education
(columns: Residents | Faculty | Rotations | Overall Program)
• Who assesses? Faculty | Residents | Residents | Residency Review Committee
• What is assessed? Knowledge & competencies | Teaching | Clinical productivity, complaints | Resident feedback
• What methods are used? MCQ, competency assessment | Likert scales | Satisfaction survey | Informal resident interviews
• Who receives the assessment data collected? Residency director | Residency director, Department chair | Residency director | Residency director, Department chair
• Who evaluates the assessment data collected? Residency director & Faculty | Department chair & Faculty | Residency director | Residency director, Department chair
• What decisions are made based on assessment data? Remediation, promotion to next level | Promotion, tenure | Informal discussion with faculty recommending changes | Accreditation
Pangaro HMI 07
Principles of Assessment
• Fairness
• Consistency
• Simplicity
Principle 1: Fairness
• To society: competence (P/F)
• To students: transparency and feedback
• To teachers: faculty development and protection
Principle 2: Consistency
• Validity: Are you measuring what you want?
• Reliability
– Inter-teacher and inter-clerkship: suitable for high-stakes decisions?
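Inter-teacher reliability can be put into numbers before a program relies on it for high-stakes decisions. A minimal sketch (not from the seminar; the ratings, the Pass/Fail codes, and the helper function are invented for illustration) computes Cohen's kappa, a chance-corrected agreement statistic, for two teachers rating the same students:

```python
# Hypothetical sketch: chance-corrected agreement between two raters.
# All names and data below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same learners."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of learners both raters scored the same.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters scored independently at random,
    # each keeping their own marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two teachers rating the same eight students on a Pass/Fail standard.
teacher_1 = ["P", "P", "F", "P", "F", "P", "P", "F"]
teacher_2 = ["P", "P", "F", "F", "F", "P", "P", "P"]
print(round(cohens_kappa(teacher_1, teacher_2), 2))  # → 0.47
```

Raw percent agreement here is 75%, but kappa is only about 0.47 once chance agreement is removed, which illustrates the seminar's caution: agreement that looks comfortable at face value may still be too inconsistent for high-stakes promotion decisions.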
[Figure: Reliability versus validity. A tight cluster of marks off-target is reliable but not valid; marks centered on the target are valid.]
Principle 3: Simplicity
• How do we frame the question?
• How do we ask the question?
• How do we use the answer?
Strategy
• Simplicity leads to acceptance and use
• Acceptance leads to consistency
• Consistency = fairness
Framing the question simply
• What do we expect of learners (students)?
• How do we communicate this to them and to their teachers?
In order to get the information you need from an assessment and evaluate the results, you have to ask the right questions.
Exercise
• Find the person nearest you. The person to the left is “A”; the other is “B”.
• Person A: In 30 seconds, tell the person nearest you what it takes to pass a course/clerkship you are familiar with. How is this assessed?
• Person B: In 30 seconds, tell the person nearest you what it takes to get the highest grade in a course/clerkship you are familiar with. How is this assessed?
Discussion
Based on the information you have:
• Are you able to determine your assessment system’s strengths and weaknesses?
• Can you determine whether the system has met the required levels of success for the purpose?
• Is it clear how your own program evaluation system will be assessed?