
Teaching, Learning, & Transfer of Experimental Procedures

in Elementary School Science

David Klahr

Department of Psychology

Pittsburgh Science of Learning Center (PSLC)

Program in Interdisciplinary Education Research (PIER)

Carnegie Mellon University

Society for Research on Educational Effectiveness

First Annual Conference Dec 10 - 12, 2006

Topic: Assessing different methods for teaching experimental procedures to middle school children

More specifically: Teaching “CVS”

• In the lab

• In both “easy” & “challenging” classrooms

• To students of widely varying abilities

What is CVS?

• CVS: Control of Variables Strategy

• A simple procedure for designing unconfounded experiments:

- Vary one thing at a time (VOTAT).

• The conceptual basis for making valid inferences from data:

- Isolation of the causal path.

Why study CVS?

Theoretical issues:

• Surface vs. deep mapping during transfer of procedures and concepts at different transfer “distances”.

Practical importance:

• Topic: core topic in early science instruction

• Assessment: state standards, high-stakes assessments, NCLB to start testing science

• Best instructional approach for teaching CVS?

– Heated controversy in profession

– Legislative battles (e.g., CA and “hands on” science)

Goal: Compare different types of instruction for teaching CVS.

• Participants: 60 2nd - 4th graders

• Assessment:

– Measure learning & transfer at different “distances” from initial instruction.

• Materials: 3 different physical domains

– Springs

– Ramps

– Sinking objects.

Chen & Klahr (1999), Child Dev.

Between subjects design

Springs domain: Which attributes determine how far a spring will stretch?

Materials: 8 springs (2 lengths x 2 widths x 2 wire sizes) & 2 pairs of weights

Execution:

• Select two springs.

• Select two weights.

• Hang springs on rack hooks.

• Hang weights on springs.

• Compare amount of stretching.

Question: Does the length of a spring make a difference in how far it stretches?

An unconfounded test:

            A        B
Length:     short    long
Width:      wide     wide
Wire:       thin     thin
Weight:     light    light
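To make the CVS check concrete, here is a minimal sketch (not from the original slides; the function name and attribute dictionaries are illustrative assumptions) that tests whether a two-setup comparison is unconfounded with respect to one focal variable:

```python
def is_unconfounded(setup_a, setup_b, focal_variable):
    """Return True if this comparison is a valid CVS test of focal_variable:
    the focal variable differs between the two setups, and every other
    attribute is held constant."""
    if setup_a[focal_variable] == setup_b[focal_variable]:
        return False  # the variable under test must actually vary
    return all(setup_a[attr] == setup_b[attr]
               for attr in setup_a
               if attr != focal_variable)

# The springs comparison above: only length differs, so the test is unconfounded.
spring_a = {"length": "short", "width": "wide", "wire": "thin", "weight": "light"}
spring_b = {"length": "long",  "width": "wide", "wire": "thin", "weight": "light"}
print(is_unconfounded(spring_a, spring_b, "length"))  # True
```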

Two types of instruction (between subjects)

• Exploratory:

– Hands on: work with physical materials

– Goal provided: “find out if x makes a difference”

• Explicit = Exploratory plus:

– Training: explicit, good and bad examples

– Training: reasons why, focus on deep structure

– Probe questions: Can you tell for sure? Why?

Different transfer “distances”

• Far transfer (between domain):

– CVS tests in a different domain from training.

– Time: few days after training

– Location, context, etc.: same as training

• Near transfer (within domain):

– CVS “tests” in same domain as training, but on a different dimension.

– Time: minutes after training

– Location, context, etc.: same as training

• Remote transfer (more later)

Chart: Explicit immediately better than Exploration and remains so.

• Y-axis: % of unconfounded experiments (4 experiments per child in each phase).

• Study phases: Exploration (pre-test), Training Manipulation, Near Transfer (Day 1); Far Transfer (Day 2).

• Conditions: Exploratory vs. Explicit.

Chart: CVS mastery by individual children.

• Y-axis: % of children becoming Masters (0-100), where mastery = at least 3 out of 4 unconfounded experiments.

• Conditions: Explicit vs. Exploratory.
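As a concrete illustration of this mastery criterion (not part of the original slides), the sketch below classifies a child as a Master when at least 3 of their 4 designed comparisons are unconfounded; it assumes the illustrative is_unconfounded() helper sketched earlier in the springs example.

```python
def is_master(comparisons, threshold=3):
    """Classify a child as a CVS 'Master': at least `threshold` of their
    designed comparisons are unconfounded (the slides use 3 out of 4).
    Each comparison is a (setup_a, setup_b, focal_variable) triple scored
    with the illustrative is_unconfounded() helper defined earlier."""
    unconfounded = sum(
        1 for setup_a, setup_b, focal in comparisons
        if is_unconfounded(setup_a, setup_b, focal)
    )
    return unconfounded >= threshold
```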

1. Initial transfer measures are very close to training objectives.

2. Need a more “distant” (“authentic”?) assessment of children’s understanding.

3. Will training effects remain with such extended assessments?

Procedure

Create a more “authentic” assessment:

• Ask children to judge science fair posters.

• Score their comments and suggestions.

Extensions

1. Participants: 112 3rd & 4th graders

2. Train on CVS via Explicit or Exploration method.

3. Assess effectiveness of CVS skill.

4. Present poster evaluation task.

5. Look at how CVS skill and training condition affect poster evaluation performance.

CVS Training and Science Fair Assessments (Klahr & Nigam, 2004)

Study Design:

• Day 1: Exploration (pre-test), Training Manipulation, Near transfer

• 1 week later: Far transfer

• Poster Evaluation

Chart: % of unconfounded experiments (0%-70%) by phase and training condition.

Scoring Rubric for Children’s Poster Critiques

1. Adequacy of research design

2. Theoretical explanation

3. Controlling for confounds in: Subjects/Materials, Treatment, Experimenter bias, etc.

4. Measurement: Reliability/Variability, Error, Data Representation

5. Statistical Inferences: Sample size/population, effect size

6. Completeness of conclusion: Supported by data, Relate to hypothesis

Poster Score = all valid, non-redundant critiques about a poster

Grand Poster Score = (Pingpong Poster) + (Memory Poster)
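To make the scoring arithmetic concrete, here is a minimal sketch (not from the original slides; the data structures and the category-based redundancy check are assumptions for illustration):

```python
def poster_score(critiques):
    """Count valid, non-redundant critiques for one poster.
    Each critique is assumed to be a (rubric_category, text) pair already
    judged valid; redundancy is approximated here by keeping at most one
    critique per rubric category."""
    return len({category for category, _ in critiques})

def grand_poster_score(pingpong_critiques, memory_critiques):
    """Grand Poster Score = Pingpong Poster score + Memory Poster score."""
    return poster_score(pingpong_critiques) + poster_score(memory_critiques)

# Hypothetical example: three distinct rubric categories on one poster, two on the other.
pingpong = [("design", "no control group"),
            ("measurement", "only one trial"),
            ("conclusion", "not supported by data")]
memory = [("design", "groups differ in age"),
          ("statistics", "sample too small")]
print(grand_poster_score(pingpong, memory))  # 5
```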

Possible subtle effects of type of instruction

Do the few kids who master CVS in the Exploratory condition do better on poster evaluation than the many who master CVS in the Explicit Instruction condition?


• More specifically:

– What is the relation between Poster Scores and Path to CVS mastery?

• Method:

– Secondary analysis based on “learning paths”


Different “paths” to mastery or non-mastery of CVS

How do children following these different paths perform on poster evaluations?

Note: the following is based on combining results from two studies: the original Klahr & Nigam study plus a replication.

Chart: Poster Assessment Score (standardized) by group: Explicit Masters, Exploratory Masters, Experts, Explicit non-Masters, Exploratory non-Masters (group sizes as annotated on the chart: n = 59, 25, 15, 66, 19; chart annotations: p < .001, n.s., n.s.).

o CVS mastery is associated with high poster scores

o Non-mastery with low poster scores

o Path to mastery, or non-mastery, is irrelevant

Question for cognitive research: Why does training on CVS (narrow) lead to better poster evaluations (broad)?

• Decomposition (attention to detail)

• Focused search for causal paths

• Nature of science

• Rhetorical stance

• Science as argument

Stay tuned ….

Question for applied research: Can CVS be taught in a normal classroom setting?

(Toth, Klahr, & Chen, 2000)

Procedure (in a nutshell):

• Translate experiment “script” into teacher lesson plan.

• Teach in “normal” science classes (in high-SES schools).

Participants in Classroom Study

• 77 4th graders from 4 classrooms in two different private schools

• 2 different science teachers

• Neither school had participated in “lab” studies

What to hold and what to fold?

Keep:

• Pedagogy:

– Goal: teach CVS

– Type of teaching: explicit instruction

• Assessment:

– Same as laboratory

– Plus, some new assessments in classroom

Change & adjust:

• Context:

– Lesson plan, not “script”

– Teacher, not researcher

– Scheduling

– Student/teacher ratio

– Group work

– Record keeping

– Error and multiple trials

These are issues of “engineering design”.

Results of Classroom Implementation

Chart: % of unconfounded designs, Pretest vs. Posttest (0-100 scale).

Individual students classified as “Experts” (8 of 9 correct): Pretest 5%, Posttest 91%.

What about more challenging classrooms? (“Lesson Planning Project”, w/ Junlei Li, Stephanie Siler, Mandy Jabbour)

One facet of the Lesson Planning Project:

• Two classrooms (5th and 6th graders) in urban school

• 90% eligible for free lunch.

• Teacher is researcher (Junlei Li).

Teaching & Assessment of CVS with Urban 5th and 6th Graders (n = 42) (Klahr & Li, 2005)

• 2-Day Classroom Replication of CVS Training (Domain: Ramps)

– Dyads, Student Design, Mastery-based Formative Assessment

• 2-Day CVS Transfer & Retraining (Domain: Pendulum)

– Dyads, Focused Analogical Mapping

• 2-Week Delay: Transfer to “real world”, “high-stakes” items

– Standardized Test Items: Local (CTBS), National (NAEP), International (TIMSS)

Chart: % Correct (0%-100%) on Our CVS Tests and on the standardized test items.

Chart: % correct for various groups on a TIMSS CVS item.

Typical TIMSS CVS item:

“He wants to test this idea: The heavier a cart is, the greater its speed at the bottom of a ramp. Which three trials should he compare?”

Significance

Brief, theoretically grounded, focused instruction:

• Is highly effective for middle-class students

• In the short run & over longer durations

• On “far transfer” assessments

Path independence: “What” matters more than “how”.

BIG differences in effectiveness with different student populations. Thus, the current approach requires: Adaptation, Modification, & Individualization.

Questions to pursue (Next steps)

NCLB in “the small”:

Goal: No child who can’t understand & execute CVS

Method: Develop an “intelligent tutor” that can adapt to wide variability in children’s learning

Wide variety of individual learning patterns (from Chen & Klahr, 1999)

Small-multiple charts: score (0-4) across phases Ex, As, T1, T2 for each learning-pattern type.

TYPE      FAST GAIN  UP-DOWN-UP  GRADUAL GAIN  HIGH CONSTANT  UP & DOWN  STEADY DECLINE  LOW CONSTANT
Explicit  31%        10%         7%            7%             7%         0%              37%
Socratic  0%         0%          14%           0%             18%        7%              60%

Design a Tutor for Experimental Design (w/ Mari Strand Cary, Stephanie Siler, Junlei Li)

Thanks to

Recent & current collaborators:

• Zhe Chen, Eva Toth, Junlei Li, Mari Strand Cary, Stephanie Siler, Milena Nigam, Amy Masnick, Lara Triona

Funding $ources:

• McDonnell Foundation, NICHD, NSF, IES

END

Extras

A page from the 15-item test booklet:

Does the amount of water affect plant growth?

             A                     B
Water:       Lots of Water         A Little Water
Sunlight:    Lots of Sunlight      No Sunlight
Plant Food:  A Little Plant Food   Lots of Plant Food

Response options: Good Test / Bad Test

Remote transfer items

Why “remote”?

• Temporal: training-test interval of 7 months

• Domain: physical to biological, et al.

• Format: physical materials vs. paper-and-pencil test booklet

• Context: one-on-one with Experimenter vs. whole-class test taking

Remote Transfer Results

Chart: Mean % correct on the 15-item far transfer test (0-100), for Trained vs. Untrained 3rd and 4th graders.

Ramps Domain

Question: Does the surface of a ramp make a difference in how far a ball rolls?

              A         B
Surface:      smooth    rough
Run:          short     long
Steepness:    high      low
Ball:         golf      rubber

A completely confounded test: every attribute differs between the two ramps, so the effect of the surface cannot be isolated.
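As a quick check of this contrast (again an illustrative sketch, assuming the hypothetical is_unconfounded() helper from the springs example above is available):

```python
# Reusing the illustrative is_unconfounded() sketch from the springs example:
ramp_a = {"surface": "smooth", "run": "short", "steepness": "high", "ball": "golf"}
ramp_b = {"surface": "rough",  "run": "long",  "steepness": "low",  "ball": "rubber"}

# Every attribute differs between A and B, so this comparison cannot
# isolate the effect of the surface.
print(is_unconfounded(ramp_a, ramp_b, "surface"))  # False: confounded
```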