Gen Ed Assessment Critical Thinking Outcome

Multiple Choice Question (MCQ) Development Project in the Social Sciences

BASED ON SLIDES FROM DEC.

LAURA BLASI, PH.D., DIRECTOR, INSTITUTIONAL ASSESSMENT


Critical Thinking – The Gen Ed Outcome

When testing Critical Thinking in the General Education program at Valencia, we focus on the three indicators addressing (1) bias, (2) use of evidence, and (3) context. These indicators reflect a pattern in faculty work focused on Critical Thinking since 2002.

Purpose of the Multiple Choice Question (MCQ) Project

The current MCQ project:

(1) invests the money in our faculty and a homegrown test item bank that is emerging from our experience with the students in Gen Ed;

(2) increases the college’s capacity for reliability testing (in our IR office), moving away from reliance on consultants;

(3) assures that faculty concerns about external reliability using pilot data are addressed by recognized experts in the field;

(4) provides an option after Social Science faculty discovered that nationally normed exams can cost $6 per student or more.

Standards for reviewing the questions specific to the outcome

Different forms of questions are possible. Examples can be taken from the standardized tests used across the country – for example:

an excerpt of a study

an excerpt of a dialogue, speech, or current event

a premise provided to test student assumptions about the idea

Applying the Standards

Imagine you are a 2nd-year student… Notice the question begins with a reference to Political Science, but it is broad enough to be accessible to students who have not taken Political Science.

Bias (analyze others and one’s own)

Context (beyond knowing it is important – examine its relevance when presenting ideas)

Bias (beyond recognizing it, analyzing it)

Evidence

Next steps

Questions by Jan 31

Pilot February – March

Expert Analysis April

Discussion of results on Assessment Day

50 questions to develop (we can include review of those we have)

Work is distributed and faculty-led

Questions stand up to peer review applying external standards

Our “Self Test” questions hold up when applied to the items (internal standards).

The MCQ creation strategies – not discipline-specific – from the Steve Downing workshop are followed (external standards).

The timeline for the pilot takes into account student administration and the validation study by USF.

Dr. Steve Downing – Tips for Developing Multiple Choice Questions Across Disciplines (examples)

Write a clear “testing point” or objective for each item [context, bias, evidence]

Pose a clear question – review, edit, rewrite

Focus on important/essential information

Assure that the question can be answered without reading the options

Write clear, concise items; avoid superfluous information

Include most information in stem, avoiding lengthy options

Don’t use trick questions

Test higher-order cognitive knowledge (he refers to Bloom’s Taxonomy):

Application, problem solving, judgment, synthesis

Questions?