David McConnell, Geology, University of Akron, May 2005
NAGT Observing and Assessing Workshop
Hey, Technology Boy: It’s the Message, Not the Medium
Conceptest Group Members
David McConnell, University of Akron
David Steer, University of Akron
Katharine Owens, University of Akron
Jeffrey R. Knott, California State University, Fullerton
Stephen Van Horn, Muskingum College
Walter Borowski, Eastern Kentucky University
Jeffrey Dick, Youngstown State University
Annabelle Foos, University of Akron
Michelle Malone, Western Washington University
Heidi McGrew, University of Dayton
Constructivist Teaching Reform
1. Instructors organize their experiences and observations into patterns or mental models about teaching.
• Perceptions of effective teaching are shaped by observations as students, teaching assistants, and peers.
2. It is relatively easy for instructors to learn something that matches or extends an existing mental model.
• It is easier to make modest changes tied to existing procedures, e.g., using techniques such as peer instruction or the minute paper.
3. It is difficult for instructors to make substantial changes to an established mental model.
• Change requires compelling evidence: that traditional teaching methods are ineffective and that alternatives show improvement.
• Significant changes need sustainable institutional support.
Halloun, I.H. and D. Hestenes, American Journal of Physics, 1985. 53(11): p. 1043-1055.
Teaching Preconceptions Conceptest
Three instructors taught the same physics course in the same semester. All professors used the same textbook, covered the same chapters, and received similar evaluations.
Prof A - Lectures emphasized the conceptual structure of physics, careful definitions, and logical arguments (97 students).
Prof B - Demonstrations in lectures; extra time and energy in lecture preparation; two teaching awards (192 students).
Prof C - Emphasized problem solving, teaching by example and solving multiple problems in lectures (70 students).
Which class did best on a common post-test?
1. A   2. B   3. C   4. No difference
Peer instruction (& Conceptests)
Instructors present short lecture segments (10-20 minutes, the limit of student engagement)
Conceptest - a conceptual multiple-choice question used to evaluate student understanding/application of a key concept
Students consider the conceptest and signal their answers (raised hands, colored cards, classroom performance system)
Based on the proportion of correct responses (35-70% is optimal), students discuss their answers in small groups (2-4), then respond again
A student or the instructor explains the correct answer
Mazur, E., 1997, Peer instruction: A user’s manual: Prentice Hall, 253p.
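The decision point in the steps above, branching on the proportion of correct first responses, can be sketched in code. This is an illustrative sketch only: the 35-70% window and the 2-4 person group size come from the slide, while the function name and the three outcomes are hypothetical labels, not Mazur's terminology.

```python
# Hypothetical sketch of the peer-instruction decision loop described above.
# The 35-70% window and group size of 2-4 come from the slide; the function
# name and outcome labels are illustrative, not from Mazur's text.

def next_step(fraction_correct: float) -> str:
    """Decide what to do after students signal their first answers."""
    if fraction_correct < 0.35:
        # Too few correct answers: discussion is unlikely to converge,
        # so revisit the concept with another short lecture segment.
        return "relecture"
    elif fraction_correct <= 0.70:
        # Optimal range: students discuss in small groups (2-4)
        # and then signal their answers again.
        return "peer_discussion"
    else:
        # Most students already understand: briefly explain and move on.
        return "explain_and_move_on"

print(next_step(0.44))  # inside the 35-70% window -> "peer_discussion"
```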
Peer Instruction Pedagogy (Mazur)
Crouch, C.H., Mazur, E., 2001, American Journal of Physics, v. 69, #9, p.970-977
[Figure: Mazur's results for Introductory Physics using the Force Concept Inventory (FCI), 1990-1997. Plots FCI pretest scores and FCI score gains on the post-test (60-100% axis) for a traditional class vs. peer instruction classes (n = 117-216). Annotations mark when Mazur began PI, refined conceptests, changed the text, and added open-ended reading questions.]
Impact of Alternative Pedagogy
Examine the map and answer the question that follows. How many plates are present?
a. 3 (26%; 0%)
b. 4 (19%; 18%)
c. 5 (44%; 75%)
d. 6 (11%; 7%)
(Percentages: individual responses; post-discussion responses)
Geoscience Conceptest Example
Libarkin, J.C., and Anderson, S.W., 2005, Journal of College Science Teaching.
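The before/after percentages on this conceptest can be read as a shift toward the correct answer during peer discussion; a small sketch, using only the numbers from the slide:

```python
# Response percentages from the plate-counting conceptest above:
# (individual responses, post-discussion responses) for each option.
responses = {"a": (26, 0), "b": (19, 18), "c": (44, 75), "d": (11, 7)}

correct = "c"  # 5 plates, per the slide
for option, (before, after) in responses.items():
    change = after - before
    marker = " <- correct" if option == correct else ""
    print(f"{option}: {before}% -> {after}% ({change:+d} pts){marker}")
```

Discussion moved the class 31 percentage points toward the correct answer, while every distractor lost support.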
Impact of Alternative Pedagogy
[Figure: Post-test minus pre-test score (%), comparing differences between pre- and post-test scores on the Geoscience Concept Inventory (GCI) across courses; courses utilizing conceptests are highlighted.]
Geoscience Results
• GCI - Standardized measurement instrument
• 55% of classes showed no statistically significant improvement on the GCI over the course
• No information on teaching strategies for other courses
Why Peer Instruction Works
1. Encourages student-faculty contact
2. Develops cooperation among students
3. Encourages active learning
4. Provides prompt feedback
5. Emphasizes time on task
6. Communicates high expectations
7. Respects diverse talents and ways of learning
Chickering & Gamson, AAHE Bulletin, 1987
Seven Principles of Good Practice
“. . . what the student does is actually more important in determining what is learned than what the teacher does.”
Thomas Shuell
A National Conceptest Database?
Chemistry, physics, and astronomy all have extensive conceptest databases
What can we learn from their experience?
• Sharing is good – add questions to a common database
• Add another step: collect pre/post response statistics for individual conceptests to allow comparative assessment of student success with each question
• Identify bottleneck concepts vs. basic concepts – Which concepts are the most difficult to learn?
A geology conceptest question database is available on-line at: http://serc.carleton.edu/introgeo/
Impact of traditional instruction
Pre- and post-tests of student comprehension were compared for large introductory biology courses for non-majors and majors with comparable class sizes. The majors course presented more content. The mean pre-test score was 29% for the non-majors and 35% for the majors.
Predict the post-test scores in the courses for non-majors and majors.
a. 35/41%   b. 40/41%   c. 40/48%   d. 48/56%
Sundberg, M.D., M.L. Dini, and E. Li, Journal of Research in Science Teaching, 1994. 31(6): p. 679-693.
Liquid hazardous waste is disposed of by pumping it down injection wells. Which well location would be the most suitable to use for an injection well? Why?
A B C
Geoscience Conceptest Example
McConnell, D.A., et al., Journal of Geoscience Education, 2006.
Impact of Alternative Pedagogy
Student Perceptions of Conceptests
Conceptests: Useful vs. Not Useful
[Figure: % of student responses in each category, from Strongly agree to Strongly disagree (0-60% axis).]
Student perceptions of the use of conceptests as a useful teaching and learning tool in an introductory geology lab course at Western Washington University (n = 72)
(Data from Michelle Malone)
The Value of Conceptests
Students were taught key concepts using one of four methods. Student learning was assessed by the proportion of correct answers to open-ended questions on the same concepts on the final exam.
Crouch, C.H., Fagen, A.P., Callan, J.P., & Mazur, E., 2004. American Journal of Physics, v.72 #6, p. 835-838.
Teaching method / % correct answers:
No demonstration - 61
Observation of demonstration w/ explanation - 70*
Prediction prior to demonstration with a conceptest - 77*
Prediction prior to demonstration using discussion & a later conceptest - 82*
n = 158-297; * = statistically significant result vs. no demonstration
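The starred comparisons can be checked with a standard two-proportion z-test. A rough sketch follows, using n = 200 per condition as an illustrative assumption, since the slide reports only the range n = 158-297 and not per-condition counts:

```python
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Compare "no demonstration" (61% correct) against the strongest condition,
# prediction + discussion + conceptest (82% correct).
# n = 200 per condition is an assumption for illustration only.
z = two_proportion_z(0.61, 200, 0.82, 200)
print(f"z = {z:.2f}")  # well above the 1.96 cutoff for p < 0.05
```

Even at the low end of the reported sample sizes, a 21-point difference in proportions remains comfortably significant, consistent with the asterisks on the slide.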
The Value of Peer Instruction
Control group: Students took a physics test individually.
Experimental group: Students took the physics test individually, then again as a pair.
Students in both groups answered similar questions on a second exam two weeks later.
Proportion of pairs in which both students got a question wrong on the first test but correct on the “paired” test: 29%
Mean score on the second exam for the experimental group: 74%
Mean score on the second exam for the control group: 64%
Singh, C., 2005. American Journal of Physics, v.73 #3, in press.