Transcript of "From the laboratory to the classroom: Creating and implementing a research-based curriculum around the use of comparison" · 2012. 12. 23.

Cognitive science research → Experimental classroom studies → Curriculum design → Pilot testing → Two-year randomized controlled trial

From the laboratory to the classroom: Creating and implementing a research-based curriculum around the use of comparison

Courtney Pollack, Harvard University
Dr. Jon R. Star, Harvard University

I. Abstract This poster presents a research program that seeks to improve educational practice and student learning in mathematics by developing, implementing, and testing curriculum materials built on findings from cognitive science. We convey the process of conducting experimental classroom studies built on lab-based cognitive science research and the subsequent design and implementation of a supplemental first-year algebra curriculum based on comparison.

II. Learning & comparison: Cognitive science research There is a great deal of cognitive research showing the benefits of comparison for learning in young children (e.g., Loewenstein & Gentner, 2001; Oakes & Ribar, 2005) and adults (e.g., Gentner, Loewenstein, & Thompson, 2003; Namy & Gentner, 2002). Yet, little of this type of research has been done in classrooms. Building on these findings, we engaged in small-scale experimental classroom studies to explore the benefits of comparison for students’ learning of mathematics, focusing on equation solving.

III. Experimental classroom research Our experimental classroom studies (Rittle-Johnson & Star, 2007, 2009) showed positive effects of comparison on student learning in controlled settings.

IV. Curriculum Design and Development During 2008-2009, we worked with a small group of expert teachers to transform our experimental materials (see Figure 1) into a supplementary Algebra I curriculum that embodied the principles derived from previous experimental research (see Figure 2).

VII. Conclusion Atkinson, Derry, Renkl, and Wortham (2000) acknowledge the gap between mathematics research in controlled laboratory settings and the relevance of its findings to classroom settings. They note that lab findings alone cannot improve classroom practice, but that “controlled experimental research grounded in cognitive science has substantially improved educational practice” (p. 184). We hope to illustrate one way that building on experimental research can improve educational practice and student learning: by extending the benefits of learning through comparison to authentic classroom settings.

V. Pilot testing During the 2009-2010 school year, we worked with 12 middle and high school teachers to test our materials in classrooms.

References

Atkinson, R. K., Derry, S. J., Renkl, A., & Wortham, D. (2000). Learning from examples: Instructional principles from the worked examples research. Review of Educational Research, 70(2), 181-214.

Gentner, D., Loewenstein, J., & Thompson, L. (2003). Learning and transfer: A general role of analogical encoding. Journal of Educational Psychology, 95(2), 393-408.

Loewenstein, J., & Gentner, D. (2001). Spatial mapping in preschoolers: Close comparisons facilitate far mappings. Journal of Cognition and Development, 2, 189-219.

Namy, L. L., & Gentner, D. (2002). Making a silk purse out of two sow’s ears: Young children’s use of comparison in category learning. Journal of Experimental Psychology: General, 131, 5-15.

Oakes, L. M., & Ribar, R. J. (2005). A comparison of infants’ categorization in paired and successive presentation familiarization tasks. Infancy, 7, 85-98.

Rittle-Johnson, B., & Star, J. R. (2007). Does comparing solution methods facilitate conceptual and procedural knowledge? An experimental study on learning to solve equations. Journal of Educational Psychology, 99(3), 561-574.

Rittle-Johnson, B., & Star, J. R. (2009). Compared with what? The effects of different comparisons on conceptual knowledge and procedural flexibility for equation solving. Journal of Educational Psychology, 101(3), 529-544.

Figure 1. Experimental comparison materials from Rittle-Johnson & Star (2007)

Figure 2. Worked example from comparison curriculum

• Side-by-side comparison
• Labeled solution steps
• Prompts to identify similarities & differences

How does comparison affect student learning in real classroom environments?

• “Infuse” comparison into first-year algebra classes
• Facilitate comparison of and reflection on multiple strategies

• About 80 “worked example pairs”
• Characters: Alex and Morgan
• Four comparison types:
  • Which is better?
  • Why does it work?
  • How do they differ?
  • Which is correct?

• Discussion phases: understand, compare, make connections
• Help teachers facilitate comparison conversations

•  Supplementary materials

• Beyond linear equation solving to include Algebra I content

• New comparison type: Correct versus incorrect

• 7th grade students
• Worked examples side-by-side
• Compare side-by-side or reflect sequentially
• Comparison condition: greater procedural knowledge and flexibility

• One-week summer PD
• Comparison activities
• Create own worked example pairs
• Model teaching with comparison

• Tasks
  • Use materials 1-2x per week
  • Videotape 2x per month
  • Submit log after material use (e.g., time spent, student learning, teacher satisfaction)
  • Student assessments
• Feedback
  • Ongoing feedback throughout the year
  • Student and teacher end-of-year interviews

VI. Randomized controlled trials Our revised materials are currently being tested in about 80 first-year algebra classrooms across Massachusetts.

We are currently finishing data collection of videos, logs, and student assessments for the first year of the RCT. We will continue to collect data during the 2011-2012 school year.

Figure 3. Take-away page excerpt from the revised curriculum

• 7th and 8th grade students
• What you compare matters
• Comparing solution methods: largest gains in conceptual knowledge and flexibility
• Comparing problem types: supports both, but to a lesser extent

Based on our pilot year feedback, we expanded our curriculum to include 150 worked example pairs. We also added a second page for each (see Figure 3).

3.1.2 Alex and Morgan were asked to solve: x/4 − x/5 = −2

Alex’s “eliminate the fractions” way: First I multiplied both sides of the equation by the least common multiple of the denominators, which is 20: 20(x/4 − x/5) = −2(20). Then I simplified both sides of the equation. Then I combined like terms to get the answer.

Morgan’s “find common denominators” way: First I gave the two fractions the same denominator. Then I subtracted the fractions. Then I multiplied by 20 on both sides: (20)(x/20) = −2(20). I simplified both sides of the equation to get the answer.

Which is better?

* Why did Alex multiply each term by 20 as a first step?
* Why did Morgan find a common denominator as a first step?
* What are some similarities and differences between Alex's and Morgan's ways?
* Which way is easier, Alex's way or Morgan's way? Why?
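Written out in standard notation, the two solution paths in the worked example proceed as follows; both reach the same root, x = −40:

```latex
\begin{align*}
\textbf{Alex:}\quad \frac{x}{4} - \frac{x}{5} &= -2
  & \textbf{Morgan:}\quad \frac{x}{4} - \frac{x}{5} &= -2 \\
20\left(\frac{x}{4} - \frac{x}{5}\right) &= -2(20)
  & \frac{5x}{20} - \frac{4x}{20} &= -2 \\
5x - 4x &= -40
  & \frac{x}{20} &= -2 \\
x &= -40
  & (20)\,\frac{x}{20} &= -2(20) \\
  & & x &= -40
\end{align*}
```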


When solving equations with fractions as coefficients, you can start by multiplying both sides of the equation by the LCM of the denominators, or by finding a common denominator first. You get the same answer using both methods. Multiplying both sides of the equation by the LCM of the denominators first might be easier because it eliminates the fractions.

Before you start solving a problem, you can look at the problem first and try to see which way might be easier.
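As a sanity check on this take-away, a short script (illustrative only; the `lhs` helper is ours, not part of the curriculum materials) can confirm that both methods yield the same root for x/4 − x/5 = −2 and that the root satisfies the original equation:

```python
from fractions import Fraction

# Alex's way: multiply both sides by 20 (the LCM of 4 and 5),
# giving 5x - 4x = -40, so x = -40.
x_alex = -2 * 20

# Morgan's way: common denominator 20 gives 5x/20 - 4x/20 = x/20 = -2;
# multiplying both sides by 20 also gives x = -40.
x_morgan = -2 * 20

def lhs(x):
    """Left-hand side of the original equation, in exact rational arithmetic."""
    return Fraction(x, 4) - Fraction(x, 5)

assert x_alex == x_morgan == -40
assert lhs(x_alex) == -2  # the shared root satisfies the original equation
```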

[Figure 1 reproduces sample intervention-packet pages from Rittle-Johnson & Star (2007, p. 564) for the (A) compare and (B) sequential conditions.]