Review: Alternative Assessments
• Alternative/Authentic assessment
  • Real-life setting
  • Performance based
• Techniques:
  • Observation
  • Individual or Group Projects
  • Portfolios
  • Exhibitions
  • Student Logs or Journals
• Developing alternative assessments
• Determine purpose
• Define the target
• Select the appropriate assessment task
• Set performance criteria
• Determine assessment quality
Review: Grading
• Grading process:
• Making grading fair, reliable, and valid
  • Determine defensible objectives
  • Ability-group students
  • Construct tests which reflect objectivity
  • No test is perfectly reliable
  • Grades should reflect status, not improvement
  • Do not use grades to reward good effort
  • Consider grades as measurements, not evaluations
Objectives of instruction → Test selection and administration → Results compared to standards → Final grades
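The final step of the grading process, comparing results to standards, can be sketched as a criterion-referenced lookup. The cutoffs below are illustrative assumptions, not values from this course.

```python
# Hypothetical criterion-referenced standards (cutoffs are illustrative).
STANDARDS = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]

def assign_grade(percent_score):
    """Compare a final score against fixed standards and return a letter grade."""
    for cutoff, letter in STANDARDS:
        if percent_score >= cutoff:
            return letter
    return "F"

print(assign_grade(85))  # meets the 80 cutoff, so "B"
```

Because the standards are fixed in advance, the grade measures status against the objectives rather than improvement or effort, consistent with the recommendations above.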
Cognitive Assessments
Physical Fitness Knowledge
HPER 3150, Dr. Ayers
Test Planning
• Types
  • Mastery (driver's license)
  • Achievement (mid-term)
• Table of Specifications (content-related validity)
• Content Objectives: history, values, equipment, etiquette, safety, rules, strategy, techniques of play
• Educational Objectives (Bloom's taxonomy, 1956): knowledge, comprehension, application, analysis, synthesis, evaluation
Table of Specifications for a 33-Item Exercise Physiology Concepts Test
(Ask-PE, Ayers, 2003)
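A table of specifications can be thought of as a set of weights over content areas that apportions the total item count. The areas and weights below are purely illustrative; they are not the published Ask-PE values.

```python
# Illustrative content-area weights for a hypothetical 33-item test
# (these are NOT the actual Ask-PE specifications).
weights = {
    "history and values": 0.10,
    "rules, etiquette, safety": 0.25,
    "equipment": 0.10,
    "strategy": 0.25,
    "techniques of play": 0.30,
}

total_items = 33
plan = {area: round(w * total_items) for area, w in weights.items()}
print(plan)  # number of items to write per content area
```

Rounding each product independently can leave the total a little over or under the target, so the final counts usually need a manual adjustment pass.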
Test Characteristics
• When to test
  • Often enough for reliability, but not so often that testing becomes useless
• How many questions (pp. 185-6 guidelines)
  • More items yield greater reliability
• Format to use (p. 186 guidelines)
  • Oral (NO), group (NO), written (YES)
  • Open book/note, take-home: (dis)advantages of both
• Question types
  • Semi-objective (short-answer, completion, mathematical)
  • Objective (true/false, matching, multiple-choice, classification)
  • Essay
Semi-objective Questions
• Short-answer, completion, mathematical
• When to use (factual & recall material)
• Weaknesses
• Construction Recommendations (p. 190)
• Scoring Recommendations
Objective Questions
• True/False, matching, multiple-choice
• When to use (M-C: MOST IDEAL)
  • FORM7 (B,E).doc
  • Pp. 196-203: M-C guidelines
• Construction Recommendations (pp. 191-200)
• Scoring Recommendations
Cognitive Assessments I
• Explain one thing that you learned today to a classmate
Review: Cognitive Assessments I
• Test types
  • Mastery
  • Achievement
• Table of Specifications (value, use, purpose)
• Question Types
  • Semi-objective: short-answer, completion, mathematical
  • Objective: true/false, matching, multiple-choice
  • Essay (we did not get this far)
Figure 10.1
The difference between extrinsic and intrinsic ambiguity (A is correct). An item missed by no one is too easy; with extrinsic ambiguity, only weak students miss it; with intrinsic ambiguity, all foils are equally appealing.
Essay Questions
• When to use (definitions, interpretations, comparisons)
• Weaknesses
• Scoring
• Objectivity
• Construction & Scoring recommendations (pp. 205-7)
Administering the Written Test
• Before the Test
• During the Test
• After the Test
Characteristics of Good Test Items
• Leave little to "chance"
• Reliable
• Relevant
• Valid
• Average difficulty
• Discriminate
  • Answered correctly by more knowledgeable students; missed by less knowledgeable students
• Time consuming to write
Quality of the Test
• Reliability and Validity
• Overall Test Quality
• Individual Item Quality
Item Analysis
• Used to determine quality of individual test items
• Item Difficulty
  • Percent answering correctly
• Item Discrimination
  • How well the item "functions"
  • Also how "valid" the item is based on the total-test-score criterion
Item Difficulty

Difficulty = [(c_U + c_L) / (n_U + n_L)] × 100

where c_U and c_L are the numbers answering correctly in the upper and lower scoring groups, and n_U and n_L are the sizes of those groups.

• Range: 0 (nobody got it right) to 100 (everybody got it right)
• Goal = 50%
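The difficulty formula can be computed directly from the upper- and lower-group counts. The function name and the example counts are assumptions for illustration.

```python
def item_difficulty(c_upper, c_lower, n_upper, n_lower):
    """Percent of the combined upper and lower groups answering the item correctly."""
    return 100 * (c_upper + c_lower) / (n_upper + n_lower)

# Suppose 9 of 10 upper-group and 4 of 10 lower-group students answered correctly.
print(item_difficulty(9, 4, 10, 10))  # 65.0, reasonably close to the 50% goal
```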
Item Discrimination

Discrimination = [(c_U − c_L) / n_U] × 100

• < 20% or negative: poor
• 20-40%: acceptable
• Goal: > 40%
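Discrimination compares the same two groups used for difficulty: it is the difference in percent correct between the upper and lower groups. Again, the function name and example counts are illustrative.

```python
def item_discrimination(c_upper, c_lower, n_upper):
    """Difference in percent correct between upper and lower groups (equal group sizes)."""
    return 100 * (c_upper - c_lower) / n_upper

# Same hypothetical item: 9 of 10 upper-group vs. 4 of 10 lower-group correct.
print(item_discrimination(9, 4, 10))  # 50.0, above the 40% goal
```

A negative value means lower-scoring students outperformed higher-scoring ones on the item, which usually signals a flawed question or a miskeyed answer.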
Figure 10.4
The relationship between item discrimination and difficulty
Sources of Written Tests
• Professionally Constructed Tests (FitSmart, Ask-PE)
• Textbooks (McGee & Farrow, 1987)
• Periodicals, Theses, and Dissertations
Questionnaires
• Determine the objectives
• Delimit the sample
• Construct the questionnaire
• Conduct a pilot study
• Write a cover letter
• Send the questionnaire
• Follow up with non-respondents
• Analyze the results and prepare the report
Constructing Open-Ended Questions
• Advantages
  • Allow for creative answers
  • Allow the respondent to detail answers
  • Can be used when the set of possible categories is large
  • Probably better when complex questions are involved
• Disadvantages
  • Analysis is difficult because of non-standard responses
  • Require more respondent time to complete
  • Can be ambiguous
  • Can result in irrelevant data
Constructing Closed-Ended Questions
• Advantages
  • Easy to code
  • Result in standard responses
  • Usually less ambiguous
  • Ease of response
• Disadvantages
  • Frustration if the correct category is not present
  • Respondent may choose an inappropriate category
  • May require many categories to capture ALL responses
  • Subject to possible recording errors
Factors Affecting the Questionnaire Response
• Cover Letter: be brief and informative
• Ease of Return: you DO want it back!
• Neatness and Length: be professional and brief
• Inducements: money and flattery
• Timing and Deadlines: time of year and sufficient time to complete
• Follow-up: at least once (two is about the best response rate you will get)
The BIG Issues in Questionnaire Development
• Reliability: consistency of measurement
• Validity: truthfulness of response
• Representativeness of the sample: to whom can you generalize?
Cognitive Assessments II
• Ask for clarity on something that challenged you today