DOCUMENT RESUME

ED 431 001    TM 029 827

AUTHOR          Wilson, Linda Dager; Blank, Rolf K.
TITLE           Improving Mathematics Education Using Results from NAEP and TIMSS.
INSTITUTION     Council of Chief State School Officers, Washington, DC. State Education Assessment Center.
SPONS AGENCY    National Science Foundation, Arlington, VA.
ISBN            ISBN-1-884037-56-9
PUB DATE        1999-00-00
NOTE            54p.
CONTRACT        REC-98-03080
AVAILABLE FROM  Council of Chief State School Officers, Attn: Publications, One Massachusetts Avenue NW, Suite 700, Washington, DC 20001-1431; Tel: 202-336-7016; Web site: http://www.ccsso.org ($8, including shipping and handling).
PUB TYPE        Reports - Evaluative (142)
EDRS PRICE      MF01/PC03 Plus Postage.
DESCRIPTORS     Achievement Gains; *Educational Improvement; Educational Policy; *Educational Practices; Elementary Secondary Education; Foreign Countries; International Studies; *Mathematics Achievement; *Mathematics Education; National Surveys; *Test Results
IDENTIFIERS     *National Assessment of Educational Progress; State Mathematics Assessments; *Third International Mathematics and Science Study

ABSTRACT
The findings of the National Assessment of Educational Progress (NAEP) and the Third International Mathematics and Science Study (TIMSS) are approached from the perspective of mathematics educators and education decision makers at state and local levels. Some key findings are highlighted to show problems and successes in mathematics teaching and learning, and some educational practices and policies that appear to improve student performance are identified. One half of the states improved student NAEP mathematics scores from 1990 to 1996. Students improved their performances in the algebra content areas and in geometry, and while they scored well on whole number computation and operations, they were weak in the area of number sense and estimation. Measurement is a particular weakness in U.S. mathematics at all grade levels. Well-prepared teachers appear to make a difference in student achievement. The NAEP, TIMSS, and state assessment programs have important differences in their purposes, frameworks, types of items, and methods of reporting results. Based on the analyses of NAEP and TIMSS results, recommendations are offered for educators and decision makers. An appendix contains a map of some NAEP questions for grade 8. (Contains 29 figures and 59 references.) (SLD)

Reproductions supplied by EDRS are the best that can be made from the original document.



Improving Mathematics Education Using Results from NAEP and TIMSS

LINDA DAGER WILSON AND ROLF K. BLANK


COUNCIL OF CHIEF STATE SCHOOL OFFICERS, ONE MASSACHUSETTS AVENUE, NW, SUITE 700, WASHINGTON, DC 20001-1431


COUNCIL OF CHIEF STATE SCHOOL OFFICERS
STATE EDUCATION ASSESSMENT CENTER

Improving Mathematics Education Using Results from NAEP and TIMSS

LINDA DAGER WILSON AND ROLF K. BLANK

Linda Dager Wilson is a professor of mathematics education at the University of Delaware. Rolf K. Blank is director of education indicators at the Council. The development of this paper was supported by a grant to CCSSO from the National Science Foundation (REC 98-03080).


THE COUNCIL OF CHIEF STATE SCHOOL OFFICERS

The Council of Chief State School Officers (CCSSO) is a nationwide, nonprofit organization composed of the public officials who head departments of elementary and secondary education in the states, the District of Columbia, the Department of Defense Education Activity, and five extra-state jurisdictions. CCSSO seeks its members' consensus on major educational issues and expresses their views to civic and professional organizations, federal agencies, Congress, and the public. Through its structure of standing and special committees, the Council responds to a broad range of concerns about education and provides leadership on major education issues.

Because the Council represents the chief education administrators, it has access to the educational and governmental establishment in each state and to the national influence that accompanies this unique position. CCSSO forms coalitions with many other education organizations and is able to provide leadership for a variety of policy concerns that affect elementary and secondary education. Thus, CCSSO members are able to act cooperatively on matters vital to the education of America's young people.

The State Education Assessment Center was established by chief state school officers to improve the information base on education in the United States, especially from a state perspective. The Center works to improve the breadth, quality, and comparability of data on education, including state-by-state achievement data, instructional data, indicators of quality in areas such as mathematics and science, and performance assessment of teachers and students. In collaboration with state education agencies, the federal government, and national and international organizations, the Center contributes to the development of a set of useful and valid measures of educational quality geared, when appropriate, to education standards.

Robert E. Bartman (Missouri), President
Nancy Keenan (Montana), President-Elect
Wilmer S. Cody (Kentucky), Vice President
Gordon M. Ambach, Executive Director
Wayne N. Martin, Director, State Education Assessment Center
Rolf K. Blank, Director of Education Indicators

Copies of this paper may be ordered for $8.00, including shipping and handling, from:

Council of Chief State School Officers
Attn: Publications
One Massachusetts Avenue NW
Suite 700
Washington, DC 20001-1431

Phone: (202) 336-7016
Fax: (202) 408-8072
www.ccsso.org

ISBN # 1-884037-56-9

Copyright © 1999 by the Council of Chief State School Officers, Washington, D.C. All rights reserved with the exception of reproduction for educational purposes.


TABLE OF CONTENTS

Introduction 1
Using NAEP and TIMSS Results: Key Findings 2
Trends in Mathematics Learning 5
Content Area Strengths and Weaknesses 10
A Closer Look at Item Types on NAEP 16
Using NAEP Data 20
Opportunity to Learn Mathematics 25
Major Differences Between NAEP and TIMSS and State Assessments 34
Implications for Mathematics Education 41
Conclusion 42
Appendix: Map of Selected Questions on the NAEP Mathematics Scale for Grade 8 44
References 45
References for Sample Tasks 48

LIST OF FIGURES

FIGURE 1 National Mathematics Trends on Main NAEP 1990 to 1996 6
FIGURE 2 Highest Improving States at Grades 4 and 8, NAEP 1996 6
FIGURE 3 NAEP Mathematics Achievement Levels and Results, 1996 7
FIGURE 4 TIMSS Mathematics 8
FIGURE 5 Long-term National Trends, NAEP Mathematics 8
FIGURE 6 NAEP Mathematics Trends by Quartiles 8
FIGURE 7 Cohort Growth for Selected States in Mathematics, NAEP 1992 to 1996 9
FIGURE 8 "Odometer" NAEP 1996, Grade 8 10
FIGURE 9 "Sam's Lunch" NAEP 1996, Grade 4 12
FIGURE 10 "Bicycle Accidents" TIMSS, Grade 12 12
FIGURE 11 "Geometric Proof" TIMSS, Grade 12 13
FIGURE 12 "Amount Paid" TIMSS, Grade 8 14
FIGURE 13 "Cherry Syrup" NAEP 1996, Grade 12 14
FIGURE 14 "Meaning of Equation" TIMSS, Grade 8 15
FIGURE 15 NAEP Results by Item Type 16


LIST OF FIGURES (CONTINUED)

FIGURE 16 "Two Geometric Shapes," NAEP 1996, Grade 4 17
FIGURE 17 State Math Proficiency by Measures of State Context, NAEP 1996, Grade 8 20
FIGURE 18 Score Variation within Selected States, NAEP 1996, Grade 4 21
FIGURE 19 Differences in Selected States' Mathematics Proficiency at NAEP Grade 8 by School Location, 1992 to 1996 22
FIGURE 20 Eighth-Grade Mathematics Curriculum in Selected States by NAEP Score, 1996 23
FIGURE 21 High School Mathematics Course Completed by Race/Ethnicity, by NAEP Score, 1996 24
FIGURE 22 Mathematics Preparation of Grade 7-12 Teachers in High- vs. Low-performing States, NAEP 1996 26
FIGURE 23 Improving States by NAEP Teacher Professional Development in Math 27
FIGURE 24 Topics and Time of Professional Development, K-12 Teachers 28
FIGURE 25 Mathematics Curriculum Topics, TIMSS, Grade 8 30
FIGURE 26 Mathematics Classroom Practices in Selected States, NAEP 1996, Grade 8 31
FIGURE 27 Classroom Activities Used in "Most Lessons," TIMSS; Teacher Expectations of Students in "Most Lessons," TIMSS 32
FIGURE 28 Percent of Students Whose Teachers Receive Instructional Materials and Resources They Need, NAEP 1996, Grade 4 33
FIGURE 29 Distribution of Test Items by Content Strand, Grade 8 NAEP, 1996; TIMSS 36


Improving Mathematics Education Using Results from NAEP and TIMSS

LINDA DAGER WILSON AND ROLF K. BLANK

Recent results of national and international studies have focused the attention of educators, education policymakers, and the public on the condition of mathematics and science education in our nation's schools. The 1997 findings from the Third International Mathematics and Science Study (TIMSS) provide the most comprehensive, in-depth information to date on the achievement of students in U.S. schools as compared to countries around the world. Also in 1997, the U.S. Department of Education released results of the National Assessment of Educational Progress (NAEP) in mathematics that provided the first state-by-state analysis of trends in mathematics learning in the 1990s. With these two major studies, mathematics and science educators have available a wealth of information about the knowledge and performance of students as well as details about the characteristics of their teachers and schools.

Well-publicized reports and press releases from the sponsoring agencies of these studies have focused attention on U.S. students' relatively good performance in mathematics at the elementary level and the apparent decline in math proficiency as students move into middle school and high school. TIMSS reported that the mathematics curriculum in the U.S. is very broad in relation to other countries, covering too many topics at each grade with too little depth. Many of the headlines about NAEP mathematics results have focused on most students not meeting expected levels of achievement, only small improvements in student achievement over time, and widely differing achievement results from state to state.

Analyses and interpretations of these major studies are just beginning to be more generally available. Mathematics educators are unlikely to have sufficient useful information from the NAEP and TIMSS analyses so far to guide their efforts toward improvement of teaching and curriculum. In this paper, we approach the NAEP and TIMSS results from the perspective of mathematics educators and education decision-makers at state and local levels. We highlight some of the key findings that show problems and successes in mathematics teaching and learning in schools, and we pinpoint some of the educational practices and policies that appear to improve student performance.

Before presenting our analyses, what main themes have dominated the discussion of recent NAEP and TIMSS results for mathematics? What are the messages that many educators and the public are likely to have received, so far, about the performance of our students?

"... Most students are not meeting defined current national levels of 'proficiency' in mathematics on NAEP, and long-term trends in mathematics performance show little improvement."

"... In the TIMSS mathematics results, U.S. students ranked high in 4th grade, below average in 8th grade, and almost last at grade 12."

"... Students in Midwest and Northeast states did much better on NAEP mathematics than other states because these states have fewer low-income students; NAEP scores appear to be more related to social and economic differences between students than differences in school quality, curriculum, or teaching."

"... Results on NAEP and TIMSS show U.S. students doing worse in mathematics than do the results from other large-scale assessments in mathematics, such as college entrance tests or standardized tests used in state and local testing programs."


The initial conclusions that readers draw from NAEP and TIMSS results may lead them to question the usefulness of the findings because the results appear to be based on quite different expectations or standards for mathematics education than those they know. Or, the findings may not appear to be useful because the results are reported in the aggregate at national or state levels and are not reported with sample items, analysis of performance by content areas, or supporting data on teaching practices and curriculum. Thus, NAEP and TIMSS results often do not appear to be a resource for analyzing how mathematics education can be improved or for identifying policies or practices leading to higher student achievement. In fact, reports and supporting data on NAEP and TIMSS provide very useful materials for teachers, parents, policymakers, and administrators who are looking for assistance in raising the achievement level of students in mathematics. In this paper, we look more deeply at the performance of our students in these studies, we analyze areas of strength and weakness in U.S. mathematics performance on NAEP and TIMSS, and we identify important school and classroom factors that are related to higher achievement of U.S. students in mathematics.

Using NAEP and TIMSS Results: Key Findings

The following key findings emerge from our detailed analysis of the recent results from NAEP and TIMSS as they apply to mathematics education in U.S. public schools:

One-half of states improved student scores on NAEP Mathematics from 1990 to 1996.

The 1996 NAEP results reveal trends in the progress of mathematics for each participating state. In 1990, NAEP began reporting results at the state level, and the assessment was changed to increase the focus on problem solving and to require more open-ended mathematics exercises. Only 15 percent of grade 8 students scored at the Proficient level in 1990. From 1990 to 1996, 27 states significantly improved the proportion of their students scoring at the Proficient level, and three states improved by 11 to 12 percentage points: Michigan, North Carolina, and Minnesota. In 1996, although mathematics progress was made in many states, other states did not improve, and as a nation only one fourth of grade 8 students reached the Proficient level.

U.S. students at grades 4 and 8 improved their performance in the algebra content area, and grade 8 students improved in geometry; our students scored well on whole number computation and operations, but they were weak in the area of number sense and estimation.

It is not sufficient to report only that most students in grade 8 did not reach the Proficient level, or that students at grade 4 are above the international average or twelfth graders are below the international average. There are relative strengths and weaknesses within and between different content areas of mathematics, and careful analysis of the results can help to focus improvement efforts. For example, on TIMSS our fourth graders scored above the international average on questions involving patterns, relations, and functions, and our eighth graders were right at the international average in this area. Relative to other content areas, NAEP results show improved performance in algebra since 1990 at all grade levels.


In the numbers area, our students did well on questions involving fundamental concepts of numbers, relationships between numbers, and properties of numbers, as well as in skills required for manipulating numbers and completing computations. Our students did poorly on questions requiring multi-step solutions, using new concepts, or applying number sense to new or unusual situations.

Measurement is a particular weakness in mathematics for U.S. students at all grade levels.

U.S. students scored below the international average in this content area at grades 4 and 8, and NAEP measurement scores were lower than the overall math averages at grades 8 and 12. The questions that were particularly difficult for students were those requiring unit conversions, calculations of volume and circumference, and estimation of measurements. We find that students are given too few opportunities to actually engage in the use of measuring instruments. Instead they are shown pictures of objects and the instruments chosen to measure some attribute of those objects. Students should have more hands-on experiences with measuring, including making decisions about which instrument might be appropriate for measuring a particular attribute. Emphasis should be placed on understanding the underlying concepts, rather than simply applying formulas.

Students scored higher on NAEP multiple-choice items than on open-ended items.

At all grade levels, student performance on multiple-choice items was significantly better than performance on either regular constructed-response items (i.e., open-ended) or extended constructed-response items. Constructed-response items often assess mathematical reasoning and conceptual understanding, and they demand good student communication skills and flexibility to solve non-routine problems. As more teachers use constructed-response items in class and students gain more experience in answering them, performance of our students on these items should improve.

Teacher reports on curriculum content reveal many math topics, but little focus.

Results from TIMSS teacher surveys show that Japanese eighth grade teachers spend most of their time teaching a few topics: geometry; congruence and similarity; functions, relations, and patterns; and equations and formulas. These four areas of the curriculum account for approximately 67 percent of teaching time in Japanese classrooms. In contrast, grade 8 teachers in the U.S. spread time very thinly among a wide range of topics. The majority of our teachers cover 16-18 different topics, with only one topic accounting for more than 8 percent of their teaching time.

Well-prepared teachers of mathematics make a difference.

The group of states with the highest average scores on NAEP at grade 8 is well above the national average in the proportion of teachers with a major or minor in mathematics. The group of states with the lowest NAEP scores is below or near the average level of preparation of their mathematics teachers. Analyses of teacher preparation by school characteristics show that students in high-poverty schools and schools with high minority enrollments have higher proportions of under-prepared teachers than other schools.

NAEP, TIMSS, and state assessment programs have important differences in their purposes, frameworks, types of items, and methods of reporting results.

Before interpreting the results of NAEP and TIMSS, or any large-scale assessments, it is essential to have an understanding of the purposes, design, and reporting scheme for the assessment. Each of these assessments was built from a different set of purposes and a different framework, and these contexts must be incorporated into any set of conclusions that might be drawn about the results.


Based on the results of our more detailed analyses of NAEP and TIMSS data, our paper offers a set of recommendations for educators and decision-makers. The analyses and interpretations of NAEP and TIMSS results in this paper will help mathematics educators understand specific details about the current quality and effectiveness of mathematics education in our nation's public schools. We try to go beyond the initial level of analysis about the findings, and we try to focus on some of the key findings as they reference results in states. The data we analyzed and the assessment items presented here are from reports released to the public (TIMSS: Beaton et al., 1996, 1997, 1998; NCES, 1996a, 1997a, 1998a; NAEP: Reese et al., 1997; Shaughnessy et al., 1998).

We also address some of the barriers that educators have noted in trying to use the results from NAEP and TIMSS. Although these studies are generally well known in some levels of the education community, educators and researchers have found difficulties in their use because of: (a) the high degree of complexity in how the assessments are conducted, scored, and reported; (b) overreliance on composite ratings and rankings of countries and states; (c) methods of scoring and reporting NAEP and TIMSS that differ from those used with state and local tests; and (d) results derived from state and national samples that do not appear to provide disaggregated data for analyzing issues of concern to teachers and schools. We contend that closer analyses of NAEP and TIMSS results reveal some striking messages that can be of use to all who are interested in improving mathematics education.

In the following sections, we first discuss key findings about mathematics learning from an in-depth examination of NAEP and TIMSS assessment results. We also present several examples of how analyzing variation in performance can be helpful to educators and decision makers. We illustrate the use of measures of opportunity-to-learn mathematics to help explain the wide differences in mathematics proficiency among our schools, classrooms, and states. Finally, to assist in analyzing results, we summarize some of the major differences in the purpose, design, and operation of NAEP, TIMSS, and state assessment programs in mathematics.


Trends in Mathematics Learning
WHAT NAEP AND TIMSS DATA REVEAL ABOUT MATHEMATICS TEACHING AND LEARNING IN THE U.S.

Recent results from NAEP and TIMSS can be valuable resources for in-depth analysis of mathematics education in U.S. schools. To analyze and interpret mathematics assessment results from NAEP and TIMSS, it is important to keep in mind key aspects of the purposes and designs of these assessment studies, and how they differ. Some key differences and similarities between NAEP and TIMSS with regard to mathematics assessment are outlined in the table below.

The differences in purpose, design, and structure of NAEP and TIMSS indicate that direct comparison of results is difficult. The linking study of NAEP and TIMSS recently completed by the National Center for Education Statistics (NCES, 1998b) showed that caution should be taken with comparisons from the two studies. The linking study does show where states would have performed if their students had been in TIMSS. Similarities in the assessment frameworks, which outline the content for the two studies, provide a basis for validity of comparisons between the studies. The assessment frameworks and the distribution of test items across them are discussed in part four of the paper (page 34).

GAINS IN MATHEMATICS PROFICIENCY

NAEP mathematics assessment results are expressed as percentages of students reaching each of three "achievement levels," and they are reported using a NAEP scale score. The NAEP scale ranges from 0 to 500, and the same scale is used for reporting scores at grades 4, 8, and 12.

NAEP

PURPOSE: Regular, periodic assessment and reporting of trends in student learning in schools for the nation and the states in core academic subjects.

STUDENTS TESTED: National and state representative samples of students and their teachers, based on sampling at the school level.

TEST DEVELOPMENT: Based on the NAEP mathematics assessment framework, with new and continuing items provided every four years.

FREQUENCY AND LEVEL (MATH): NATIONAL: four years (grades 4, 8, 12); STATE: four years (grades 4, 8).

ITEMS: 50 percent multiple choice, 50 percent open-ended or constructed response.

SCORING AND REPORTING: Three achievement levels and scale score by grade for the nation and each state. Report of achievement results for subject, followed by report of supporting data on background and practices.

TIMSS

PURPOSE: Cross-national research and analysis of student achievement, curriculum, and teaching in mathematics and science education with 45 participating countries.

STUDENTS TESTED: National representative samples of students and their teachers, based on sampling at the school level.

TEST DEVELOPMENT: Based on the TIMSS assessment framework; items written for TIMSS and reviewed by countries.

FREQUENCY AND LEVEL (MATH): Main data collection 1995; by country: grades 4, 8, end of secondary school.

ITEMS: 75 percent multiple choice, 25 percent open-ended.

SCORING AND REPORTING: Scale score for each nation by grade, percent correct in content areas; reports of student achievement by grade, curriculum analysis, and other studies.


NAEP achievement levels are descriptions of whatstudents should know and be able to do in mathemat-ics at each grade level. Three levels were defined foreach grade level, under supervision of the NationalAssessment Governing BoardBasic, Proficient, andAdvanced.

NAEP mathematics scores for 1996 reveal that stu-dents in grades 4 and 8 did make achievement gainsin proficiency during the 1990s. Although only onequarter of students nationally scored at the Profi-cient level and just over half scored at the Basic levelor higher, performance in mathematics improved.The data from 1996 also showed that results by statediffered significantly. About half of the states madereal gains in mathematics but the remainder showedlittle change.

With the 1990 NAEP assessment framework, the testwas significantly changed to include open-endedexercises and a broader range of mathematics con-tent. Also, in 1992 NAEP began reporting studentresults by achievement levels. Figures I and 2provide a graphic summary of key gains since 1990.

In 1996, 21 percent of grade 4 students per-formed at or above the Proficient levels and 62percent of grade 4 students performed at orabove the Basic level. Twelve states made sig-nificant improvement in the percentage of stu-dents at/above the Basic level as compared to1992, and seven states improved the percentageof students at/above the Proficient level. Gainswere largest at grade 4 mathematics in Texas,Indiana, and North Carolina, each improvingover 8 to 10 percentage points.

At grade 8 in 1996, 24 percent of students scored at or above the Proficient level and 62 percent were at/above the Basic level. From 1990 to 1996, 27 states made significant improvement in the percentage of students at/above Proficient. Michigan, Minnesota, and North Carolina made the largest gains at grade 8, with each improving 11 to 12 percentage points.

FIGURE 1
NATIONAL MATHEMATICS TRENDS ON MAIN NAEP, 1990 TO 1996
(Bar chart: percent of students scoring at/above the Basic and Proficient levels, grades 4 and 8, in 1990, 1992, and 1996.)
Source: NAEP Mathematics Report Card, 1996

FIGURE 2
HIGHEST IMPROVING STATES AT GRADES 4 AND 8, NAEP 1996

GRADE 4 IMPROVEMENT
State              Proficient, Grade 4    Change, 1992-96
TEXAS                     25%                  +10
INDIANA                   24%                   +8
NORTH CAROLINA            21%                   +8
CONNECTICUT               31%                   +7
WEST VIRGINIA             19%                   +7
TENNESSEE                 17%                   +7
COLORADO                  22%                   +5
NATION                    21%                   +3

GRADE 8 IMPROVEMENT
State              Proficient, Grade 8    Change, 1990-96
MICHIGAN                  28%                  +12
MINNESOTA                 34%                  +11
NORTH CAROLINA            20%                  +11
WISCONSIN                 32%                   +9
CONNECTICUT               25%                   +9
COLORADO                  21%                   +8
TEXAS                     23%                   +8
NATION                    24%                   +9

Source: NAEP Mathematics Report Card, 1996

THE COUNCIL OF CHIEF STATE SCHOOL OFFICERS STATE EDUCATION ASSESSMENT CENTER



The data summarized in this paper are from state-by-state NAEP mathematics results for grades 4 and 8 in 1996, including trends since 1990, which are reported in the NAEP Report Card in Mathematics (Reese, et al., 1997) and in CCSSO's State Indicators of Science and Mathematics Education (Blank, et al., 1997a). Our analysis focuses on gains in mathematics performance for each state based on the NAEP achievement levels, and uses the percentage of students at/above the Proficient level as a key benchmark. The National Education Goals Panel has released a new report with state profiles of student achievement in mathematics and science using NAEP achievement levels and results from the NAEP-TIMSS linking study (1998b).

Figure 3 shows the percentage of students that attained each of the achievement levels in 1996. Consistent with the scale score results, achievement levels have increased in all three grades since 1990 and 1992. In particular, the percentage of fourth, eighth, and twelfth graders performing at or above the Basic and Proficient levels has increased. However, the percentage of fourth and twelfth grade students achieving the Advanced level has not shown an increase since either 1990 or 1992. Additionally, approximately one third of all students are below the Basic level at all grades.

MATH GAINS IN INTERNATIONAL PERSPECTIVE

The international perspective offered by TIMSS confirms that U.S. mathematics education is still far from the goal of being "first in the world in mathematics and science achievement by the year 2000," as President Bush and 50 governors declared in 1989. Fourth graders scored above the international average on TIMSS, although students in seven countries (Singapore, Korea, Japan, Hong Kong, the Netherlands, the Czech Republic, and Austria) outperformed U.S. students (NCES, 1997a). U.S. eighth graders scored below the international average of the 41 TIMSS countries (NCES, 1996a).

The United States was the only TIMSS country for which fourth-grade results were above the average and eighth-grade results were below the average (see Figure 4). Eighth graders in Singapore, Korea, and Japan scored more than 100 points higher on the TIMSS scale than eighth graders in the U.S. This is a substantial difference, considering that the difference in performance between grades 7 and 8 is only 26 points in the U.S. (Mullis, 1998). Even for the best U.S. eighth grade students, the news is discouraging: only 5 percent would be included

FIGURE 3
MATHEMATICS ACHIEVEMENT LEVELS AND RESULTS, NAEP 1996

BASIC: Partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade.

PROFICIENT: Solid academic performance for each grade assessed. Students reaching this level have demonstrated competency over challenging subject matter, including subject-matter knowledge, application of such knowledge to real-world situations, and analytical skills appropriate to the subject matter.

ADVANCED: Superior performance.

Percent of students at/above each level:
                       Grade 4    Grade 8    Grade 12
At/above Basic           64%        62%        69%
At/above Proficient      21%        24%        16%
Advanced                  2%

Source: NAEP Mathematics Report Card





in the top 10 percent of all eighth-grade students in the 41 TIMSS countries. For Singapore the corresponding number would be 45 percent.

While the news was not good at eighth grade, it was worse for twelfth grade. U.S. twelfth graders scored below the international average, and the U.S. placed among the lowest of the 21 TIMSS nations in mathematics general knowledge in the final year of secondary school. When the eighth grade results are compared with the average of the 20 countries that participated in both the eighth and twelfth grade TIMSS, the U.S. scores are similar to the international average.

But at twelfth grade only two countries, Cyprus and South Africa, had scores below those of U.S. students.

The average U.S. twelfth grade score was 461, while the highest score (Netherlands) was 560. An assessment of advanced mathematics was given to a sample of students taking advanced course work in mathematics. The performance of these advanced students was among the lowest of the 16 countries that participated, for all three content areas assessed.

LONG-TERM NAEP TRENDS

One of the strengths of the NAEP program in the U.S. is the capacity for analyzing trends in learning over a considerable period of time. As a context for examining results from the most recent national and international assessments in the content areas, we can look at what has happened over time in Figure 5. NAEP includes a portion of the assessment that has remained the same since 1973, yielding valuable trend data for those 23 years. At all age levels, the overall trend is of increased performance over time (Campbell et al., 1997).

When these data are broken down by quartiles in Figure 6, the result is the same: every quartile has improved. For example, the lower quartile of student scores at age 13 improved from 221 to 237, a statistically significant increase over the 18-year span from 1978 to 1996. The results show that for all students, regardless of achievement level, the NAEP results have improved over time. Moreover, grade 4 and 8 students have shown statistically significant improvements since both the 1990 and the 1992 assessments.


FIGURE 4
TIMSS MATHEMATICS, INTERNATIONAL

                  Grade 4    Grade 8    Grade 12
AVERAGE             529        513        500
U.S.                545        500        461
JAPAN               597        605        N/A
CANADA              532        527        519
ENGLAND             513        506        N/A
GERMANY             N/A        509        495

Note: TIMSS scale 0 to 800 at each grade.
Source: NCES, 1996a, 1997a, 1998a

FIGURE 5
LONG-TERM NATIONAL TRENDS, NAEP MATHEMATICS
(Line chart of average scale scores, 1978 to 1996; scale shown 200 to 350.)
AGE 17: 307* in 1996
AGE 13: 264 in 1978, 274* in 1996
AGE 9:  219 in 1978, 231* in 1996
* Significant change from 1978. Source: Campbell et al., 1997

FIGURE 6
NAEP MATHEMATICS TRENDS BY QUARTILES

                1978    1990    1996
AGE 9
  UPPER          256     266     268*
  MIDDLE TWO     221     231     232*
  LOWER          178     190     191*
AGE 13
  UPPER          305     307     311*
  MIDDLE TWO     266     271     275*
  LOWER          221     234     237*
AGE 17
  UPPER          339     341     342*
  MIDDLE TWO     302     305     308*
  LOWER          260     268     270*

* Significant change from 1978. Scale range: 0 to 500.
Source: Campbell et al., 1997




GROWTH IN MATHEMATICS LEARNING FROM GRADE 4 TO GRADE 8

One of the concerns for mathematics educators arising from the TIMSS results was the finding that our students do worse at grade 8 mathematics, compared to other nations, than at grade 4. An approach to analyzing NAEP results called cohort growth analysis sheds light on the question of how much improvement U.S. students make over those four years of schooling. NAEP mathematics results by state for 1992 and 1996 can be analyzed to show the extent of improvement in the same cohort of students over time. Barton and Coley (1998) used the cohort analysis method to determine the extent of improvement in NAEP for the nation and by state from 1992 to 1996 by comparing the scores of grade 4 students in mathematics with scores four years later for grade 8 students. Thus, samples of the same cohort of students are compared. The advantage of this approach is to move closer to determining the effects of schooling in improving mathematics, because comparable samples of schools and students are compared from one period to the next.

Figure 7 shows selected states with high, average, and below average growth in terms of the number of NAEP scale points gained over four years, 1992 to 1996. The states with highest growth, Nebraska and Michigan, improved scores by 57 points: Nebraska scores rose from 226 in 1992 to 283 in 1996, while Michigan increased from 220 in 1992 to 277 in 1996. The average growth for the nation was 52 points, and six states were at this level of growth. A total of 14 states were above average growth, and 14 were below average.

The growth analysis by state allows us to see that states whose mean score was below the national average at grade 8, such as Kentucky and Arkansas, have made the same amount of improvement in mathematics learning as Maine, which is near the top in performance at grades 4 and 8. Several states with close to average scores, such as Michigan and North Carolina, were near the top in math growth from grade 4 to grade 8. Five scale points difference in growth (e.g., 57 points vs. 52 points) could be viewed as about three months difference in mathematics learning, since the average increase in NAEP scores is about 13 points per school year.
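The rule of thumb above can be sketched in a few lines of Python. The 13-points-per-year figure comes from the text; the nine-month school year is an assumption added here purely for illustration.

```python
# Convert a NAEP scale-point difference into approximate months of
# mathematics learning. Assumes ~13 scale points of growth per school
# year (from the text) and a nine-month school year (an assumption).
def points_to_months(points, points_per_year=13, months_per_year=9):
    return points * months_per_year / points_per_year

# Gap between the highest-growth states (57 points) and the nation (52).
print(round(points_to_months(57 - 52), 1))   # roughly 3.5 months
```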

The NAEP trend data tell us that scores in mathematics have been improving, steadily but gradually, over the past 23 years. Yet the TIMSS results show that, after fourth grade, U.S. students lag far behind students around the world. Within this context, we now examine results in specific content areas of mathematics, along with sample items.

FIGURE 7
COHORT GROWTH FOR SELECTED STATES IN MATHEMATICS, NAEP 1992 TO 1996

                                            NAEP Score Points Improvement*
High Growth in Math
  NEBRASKA, MICHIGAN                                     57
  NORTH DAKOTA, MINNESOTA                                56
  NORTH CAROLINA, COLORADO                               55
Average Growth
  MAINE, MARYLAND, TEXAS, TENNESSEE,
  NEW YORK, KENTUCKY, ARKANSAS, NATION                   52
Below Average Growth
  MISSISSIPPI, SOUTH CAROLINA, ALABAMA,
  LOUISIANA, HAWAII                                      48
  GEORGIA                                                47

* Difference between average grade 8 score (1996) and grade 4 score (1992)
Source: Barton and Coley, Growth in NAEP (1998)




Content Area Strengths and Weaknesses

In order to more fully understand the results from NAEP and TIMSS, it is helpful to examine some of the specific content areas covered by both assessments. On NAEP there was little variation in results from one content area to another, although some content areas showed improvement. In particular, public school students in grades 4 and 8 showed significant improvement in algebra, and grade 8 students have improved in geometry from 1992 to 1996. Since the 1990 assessment, grade 8 scores have improved in all content areas. While these findings show good news for mathematics education, we must keep in mind that the overall scores continue to be low (which is especially highlighted by the TIMSS results). The distribution of NAEP mathematics scores is negatively skewed, showing that there is an abundance of low scores that pull the mean score down.

In spite of the small amount of variation on NAEP among content areas, we can learn more about the relative strengths and weaknesses by looking at NAEP and TIMSS results by content area. Doing so enables us to make more accurate statements about what U.S. students know and can do in mathematics. Because the NAEP and TIMSS frameworks are similar, it is possible to look within content areas and analyze results from both assessments. In this next section, we will focus on those content areas that showed weaknesses at grades 4, 8, or 12. A sample of items from both assessments is included within each content area discussion.

MEASUREMENT IS A WEAKNESS AT ALL GRADES

In this content strand of mathematics, students are expected to have a conceptual and procedural understanding of measurement units, the ability to use measurement tools and instruments, and the ability to solve problems related to perimeter, area, and volume. The NAEP assessment also measured students' ability to estimate absolute and relative measurements. At fourth grade, the emphasis for NAEP was on measurement of time, money, temperature, length, perimeter, area, weight/mass, and angles. Problems for students in grades 8 and 12 were more complex and involved volume and surface area in addition to the other topics. Some questions also dealt with reasoning with proportions, which are skills required in scale drawing and map reading.

The TIMSS data show that measurement is a particular weakness of U.S. elementary and middle grades students. U.S. students scored below the international average in this content area at both grades 4 and 8. On NAEP, students' average score in measurement (mean and median) was lower than the overall average at grades 8 and 12. So, relative to other content areas on NAEP, measurement was weak in those grades. The questions that were particularly difficult for students were those requiring unit conversions, calculations of volume and circumference, and estimation of measurements.

FIGURE 8
"ODOMETER"
NAEP 1996, GRADE 8

A car odometer registered 41,256.9 miles when a highway sign warned of a detour 1,200 feet ahead. What will the odometer read when the car reaches the detour?

A. 42,456.9
B. 41,279.9
C. 41,261.3
D. 41,259.2
E. 41,257.1

OVERALL CORRECT: 26%
PROFICIENT: 50%

The Odometer item in Figure 8 is a multiple choice item that assesses the measurement strand. Using the conversion from feet to miles, the student must convert the 1,200 feet given in the problem into miles, and then add this amount to the reading on the odometer. Using a calculator,




the student could compute 1,200 divided by 5,280 and then the sum of that quotient and 41,256.9. As an alternative, the student could use number sense and estimation to solve the problem, especially since the answer is in multiple-choice format. The student could estimate that 1,200 feet is about 1/5 of a mile, which would be 2/10, and add 0.2 to 41,256.9, choosing the correct answer of "E".
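The calculator route just described can be replayed directly; this short sketch simply restates the item's arithmetic.

```python
# "Odometer" item (Figure 8): convert 1,200 feet to miles
# (5,280 feet per mile) and add the result to the odometer reading.
odometer_miles = 41_256.9
detour_feet = 1_200
detour_miles = detour_feet / 5_280               # about 0.23 miles
print(round(odometer_miles + detour_miles, 1))   # 41257.1 -- option E
```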

U.S. eighth graders did not do well on this item: only 26 percent got it correct. Another 37 percent chose option A, found by simply adding the number of feet to the odometer reading without first converting the distance to miles. Males performed significantly better than females on this item. Of students whose overall NAEP score was at the Proficient level, 50 percent got this item correct. The item touches on several weak spots in overall performance by students on NAEP: measurement, operations with decimals, and estimation.

One of the primary problems with the teaching of measurement is that, when it is taught, it is often done with proxies or simulations. Students are given too few opportunities to actually engage in the use of measuring instruments. Instead they are shown pictures of objects and the instruments chosen to measure some attribute of those objects. Students should have more hands-on experiences with measuring, including making decisions about which instrument might be appropriate for measuring a particular attribute. Emphasis should be placed on understanding the underlying concepts, rather than simply applying formulas. Measuring activities should emphasize the approximate nature of measurement, and the related issues of precision and error.

NUMBER SENSE AND ESTIMATION: STUDENTS HAVE DIFFICULTY APPLYING KNOWLEDGE TO SITUATIONS

The NAEP results show that U.S. students can do simple whole number computations but lack flexibility in applying them to new or unusual situations. The content strand of number sense and estimation covers basic arithmetic skills and concepts, which represent a significant part of the mathematics curriculum at most U.S. schools, particularly at the lower grades. The NAEP assessment reflected this emphasis, with 40 percent of the questions for students in grade 4, 25 percent of those in grade 8, and 20 percent of those in grade 12 falling within this content strand. For TIMSS, number sense and estimation was covered in sections on measurement, estimation, and number sense for students in grade 4, and within fractions and number sense for students in grade 8.

NAEP scores on number sense were below the overall average at grades 4 and 12. While fourth graders were strong in whole number computation, they showed weaknesses in number sense items. Students scoring in the Basic achievement level on NAEP (64 percent for grade 4, 62 percent for grade 8, and 69 percent for grade 12) appeared to grasp many of the fundamental concepts of numbers, relationships between numbers, and properties of numbers, as well as to display the skills required for manipulating numbers and completing computations. Questions requiring multi-step solutions or involving new concepts tended to be more difficult. Additionally, questions requiring students to solve problems and communicate their reasoning proved challenging, and often it was the communication aspect that provided the most challenge. Internationally, U.S. students were below the international average at grade 4, while eighth graders were at the international average.

Number sense involves having an intuition about numbers; it entails being able to use a variety of strategies, including mental computation, to find solutions to problems, either in a context or context-free. Questions in this content strand required students to demonstrate an understanding of number properties and operations, to generalize from numerical patterns, and to verify results. These questions also assessed student understanding of numerical relationships as expressed in ratios, proportions, and percentages. Students at all grade levels were assessed on their ability to reason mathematically and to communicate the reasoning they used to solve problems involving number sense, properties, and operations.




FIGURE 9
"SAM'S LUNCH"
NAEP 1996, GRADE 4

Sam can purchase his lunch at school. Each day he wants to have juice that costs 50¢, a sandwich that costs 90¢, and fruit that costs 35¢. His mother has only $1.00 bills. What is the least number of $1.00 bills that his mother should give him so he will have enough money to buy lunch for five days?

OVERALL CORRECT: 17%
PROFICIENT: 44%

The regular constructed-response item for fourth graders in Figure 9 measures number sense and operations with decimals. Students could use many different strategies to solve this problem. One possibility is to add the cost of juice, sandwich, and fruit, multiply by 5, and then round up to the nearest dollar. Or the cost of each food item could be multiplied by 5 and then a total sum found. Another strategy would be to estimate the number of dollars needed each day (2), then see that the change from four days (.25 times 4) makes another dollar, so the total would be 9. Regardless of the approach, the successful student must take into account the cost of the food over five days and the notion of "least" number of dollars.
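The first strategy described above (total the daily cost, multiply by five, round up to whole dollars) can be replayed in a few lines:

```python
import math

# "Sam's Lunch" item (Figure 9): least number of $1 bills for five days.
daily_cents = 50 + 90 + 35          # juice + sandwich + fruit = 175 cents
five_day_cents = 5 * daily_cents    # 875 cents over five days
bills = math.ceil(five_day_cents / 100)   # round up to whole dollars
print(bills)                        # 9
```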

This item was scored using a three-point scoring guide that allowed for partial credit. This question was difficult for most students. Ten percent did not respond to the question, and half of the students responded incorrectly. Only 17 percent of fourth graders received the highest score on this item. For students whose overall NAEP score was Proficient, 44 percent got the highest score on this item.

In the twelfth grade item shown in Figure 10, students must consider a percent of a percent to find the share of serious bicycle accidents that involve fatal head injuries. The correct answer (20 percent) is found by taking 80 percent of 25 percent. The answer could also be found by using number sense, noting that 20 is 80 percent of 25.

FIGURE 10
"BICYCLE ACCIDENTS"
TIMSS, GRADE 12

Experts say that 25% of all serious bicycle accidents involve head injuries and that, of all head injuries, 80% are fatal. What percent of all serious bicycle accidents involve fatal head injuries?

A. 16% B. 20% C. 55% D. 105%

INTERNATIONAL AVERAGE: 64%
U.S.: 57%

Twelfth grade students in the U.S. scored below the international average on this item. Only 57 percent chose the correct response, compared to the international average of 64 percent. In the Netherlands, for example, 83 percent of the students got the right answer.
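The percent-of-a-percent computation behind the item is short enough to replay directly:

```python
# "Bicycle Accidents" item (Figure 10): 80% of the 25% of serious
# accidents that involve head injuries are fatal.
head_injury_share = 0.25
fatal_given_head_injury = 0.80
fatal_share = head_injury_share * fatal_given_head_injury
print(f"{fatal_share:.0%}")   # 20% -- option B
```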

While U.S. students seem to be fairly strong in basic whole number computation, they seem to lack the flexibility to apply those skills to new or unusual situations. The emphasis in the curriculum should move beyond basic paper and pencil computations with numbers to include such topics as computational estimation and mental computation. Both are skills that are strongly needed with the increased use of technology to perform the algorithms that used to be done with pencil and paper. In a world where problem solving and reasoning are highly valued, mental strategies and the ability to judge the reasonableness of an answer are more important than ever.




GEOMETRY: FOURTH GRADERS STRONG, BUT EIGHTH AND TWELFTH GRADERS WEAK

The questions classified under this content strand centered around a conceptual understanding of geometric figures and their properties. Fourth-grade students were asked to demonstrate an understanding of the properties of shapes and to visualize shapes and figures under simple combinations and transformations, as well as to write verbal descriptions of the properties of geometric figures. Eighth grade students were asked about concepts related to properties of angles and polygons, such as symmetry, congruence and similarity, and the Pythagorean Theorem. They also had to apply reasoning skills to make and validate conjectures about combinations and transformations of shapes. Twelfth graders were expected to demonstrate knowledge of more sophisticated geometric concepts and to use more sophisticated reasoning processes. Some questions involved proportional reasoning or coordinate geometry.

Fourth graders on TIMSS scored above the international average in geometry, with only two nations scoring significantly higher. By eighth grade, however, 25 countries scored higher than the U.S., and our students were below the international average. The advanced U.S. twelfth graders scored the lowest of all countries taking the test. On NAEP, fourth graders were relatively strong in geometry, compared with other content areas. The geometry results were lower at eighth grade, and then higher at twelfth grade.

The twelfth grade item in Figure 11 represents a fairly standard geometry proof. To gain full credit, students had to exhibit some understanding of angle sums in a triangle, isosceles triangles, and possibly other concepts, such as vertical angles and supplementary angles. They also had to be able to justify each statement in the proof. The international average for this item (getting it at least partially correct) was 48 percent, while only 19 percent of U.S. advanced twelfth graders reached this level of achievement.


FIGURE 11
"GEOMETRIC PROOF"
TIMSS, GRADE 12

In the triangle ABC, shown below, the altitudes BN and CM intersect at point S. The measure of angle MSB is 40° and the measure of angle SBC is 20°. Write a proof of the following statement:

"Triangle ABC is isosceles."

INTERNATIONAL AVERAGE: 48%
U.S.: 19%
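One proof along the lines the item invites (an illustrative sketch, not the official scoring guide) uses the right angles created by the altitudes, with M on AB and N on AC, and the fact that S lies on both altitudes:

```latex
\begin{align*}
\angle BMS &= 90^\circ && \text{($CM \perp AB$ and $S$ lies on $CM$)}\\
\angle MBS &= 180^\circ - 90^\circ - 40^\circ = 50^\circ && \text{(angle sum in $\triangle MSB$)}\\
\angle ABC &= \angle MBS + \angle SBC = 50^\circ + 20^\circ = 70^\circ\\
\angle BNC &= 90^\circ && \text{($BN \perp AC$)}\\
\angle BCA &= 180^\circ - 90^\circ - 20^\circ = 70^\circ && \text{(angle sum in $\triangle BNC$, $\angle NBC = \angle SBC$)}
\end{align*}
```

Since angle ABC equals angle BCA, the sides opposite them are equal (AC = AB), so triangle ABC is isosceles.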

Geometry needs to become a more visible part of the middle grades curriculum. Too often it is ignored until high school, which delays students' experience in a highly valuable and applicable part of mathematics. Geometric thinking and spatial visualization are linked to many other areas of mathematics, such as algebra, fractions, data, and chance. Teachers should spend more time developing geometric concepts (concretely and with many different representations) and principles (in varied settings) and not merely focus on practice involving algorithmic procedures.

PROPORTIONALITY: STRONG AT GRADE 4, WEAK AT GRADES 8 AND 12

Proportions are relationships among quantities that are related by multiplication. To say that quantity a is to quantity b as quantity c is to quantity d is to describe a relationship between a, b, c, and d that is multiplicative (for example, a times d is equal to b times c). Ratios and proportions are an important part of mathematics learning, usually starting in the upper





elementary and middle grades. The TIMSS framework separates proportionality into its own content category at grades 4 and 8, while NAEP embeds notions of proportional reasoning within the other strands, particularly number sense.

Results from TIMSS show that, as in the other content areas, U.S. students fared better at grade 4 than at the higher grades. On items measuring fractions and proportionality, fourth graders scored slightly higher than the international average, with six countries scoring higher. Eighth graders scored below the international average, and 18 countries scored higher.

FIGURE 12
"AMOUNT PAID"
TIMSS, GRADE 8

Peter bought 70 items and Sue bought 90 items. Each item cost the same and the items cost $800 altogether. How much did Sue pay?

INTERNATIONAL AVERAGE: 38%
U.S.: 23%

To be successful on the eighth grade item shown in Figure 12, a student needs to find the portion of the $800 total that accounts for Sue's items. This could be accomplished by computing the total number of items (160) and calculating that Sue's portion is 90/160; her portion of the total bill would therefore be that fraction times 800. The international average for this short constructed-response item was 38 percent. Only 23 percent of eighth graders in the U.S. got this item correct. In Singapore, 83 percent of the eighth graders got this item correct.
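That proportion can be replayed in two lines:

```python
# "Amount Paid" item (Figure 12): Sue bought 90 of the 160 items,
# so she pays 90/160 of the $800 total.
total_items = 70 + 90            # 160 items altogether
sue_payment = (90 / total_items) * 800
print(sue_payment)               # 450.0 dollars
```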

Figure 13 shows a twelfth grade, regular constructed-response item that measures number sense, proportions, and operations. Students' responses were scored using a three-point scoring guide that allowed for partial credit. To earn a "satisfactory" score, or full credit, the student would have to show that Martin's mixture had a stronger cherry flavor. This would entail showing that Martin's ratio of 5 ounces of syrup to 42 ounces of water was a higher syrup-to-water ratio than Luis' mixture of 6 ounces of syrup to 53 ounces of water. In the sample response, the student showed this by dividing 5 by 42 and 6 by 53 and showing that one quotient was larger than the other. Other possible strategies would include setting up a proportion and "cross multiplying," or finding a common denominator and comparing numerators. Only 23 percent of U.S. twelfth grade students were able to reach the satisfactory level on this item, including 60 percent of those whose overall score was Proficient.
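The quotient comparison from the sample response is easy to verify:

```python
# "Cherry Syrup" item (Figure 13): compare syrup-to-water ratios.
luis = 6 / 53      # about 0.113 ounces of syrup per ounce of water
martin = 5 / 42    # about 0.119
print("Martin" if martin > luis else "Luis")   # Martin -- stronger flavor
```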

FIGURE 13
"CHERRY SYRUP"
NAEP 1996, GRADE 12

Luis mixed 6 ounces of cherry syrup with 53 ounces of water to make a cherry-flavored drink. Martin mixed 5 ounces of the same cherry syrup with 42 ounces of water. Who made the drink with the stronger flavor?

Give mathematical evidence to justify your answer.

OVERALL CORRECT: 23%
PROFICIENT: 60%

Proportional reasoning is a challenging concept for many students, and one that should be an important part of the middle grades curriculum. To be able to understand and work with proportional situations requires a multiplicative way of thinking about relationships. Teachers need to enhance the development of the topic as it is typically presented in textbooks with hands-on experiences. Initial activities should focus on the development of meaning, postponing efficient procedures until such understandings are internalized.




ALGEBRA RESULTS HIGHER THAN OTHER AREAS, BUT OVERALL KNOWLEDGE AT A LOW LEVEL

The algebra strand extends from work with simple patterns at grade 4, to basic algebra concepts at grade 8, to sophisticated analysis at grade 12. Students are expected to use algebraic notation and thinking in meaningful contexts to solve mathematical and real-world problems.

Fourth grade algebra questions on TIMSS were categorized as those involving patterns, relations, and functions. In this content area fourth graders were again above the international average, with only four nations scoring higher. Eighth graders scored just at the international average, and twelfth graders were lower. From the NAEP results we can say that, relative to the other content areas, algebra results were higher at all three grade levels. However, the overall level of algebra knowledge is relatively low. In particular, 64 percent of fourth graders, 62 percent of eighth graders, and 69 percent of twelfth graders scored well on basic algebraic representations and simple equations, as well as finding simple patterns. But only 24 percent of eighth graders and 16 percent of twelfth graders demonstrated knowledge of the linear equations, algebraic functions, and trigonometric identities expected for their grade level. Only four percent of eighth graders and two percent of twelfth graders demonstrated the ability to identify and generalize complex patterns and solve real-world problems expected for their level.

The item shown in Figure 14 asks eighth graders a conceptual question about an algebraic equation. Rather than asking students to simply manipulate symbols, this question probes a student's understanding of a variable expression. The student must be aware that n - 1, n, and n + 1 are representations of the consecutive whole numbers in the problem, and that n would necessarily represent the middle of the three numbers.

FIGURE 14"MEANING OF EQUATION"

TIMSS, GRADE 8

Brad wanted to fmd three consecutive wholenumbers that add up to 81. He wrote the equa-tion (n-1) + n + (n+1) = 81. What does n standfor?

A. The least of the three whole numbers.

B. The middle whole number.

C. The greatest of the three whole numbers.

D. The difference between the least and greatest of the three whole numbers.

INTERNATIONAL AVERAGE 37%
U.S. 32%

The international average for this item was low at 37 percent, yet the number of U.S. eighth graders choosing the correct response was below the average, at 32 percent. In Japan, 62 percent of the students chose the correct response.
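The arithmetic behind the item makes the correct interpretation concrete:

```latex
(n-1) + n + (n+1) = 3n = 81 \quad\Longrightarrow\quad n = 27
```

The three consecutive whole numbers are therefore 26, 27, and 28, and n = 27 is the middle whole number (choice B).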

Algebra has traditionally been considered a high school course, but more and more students are taking a first formal course in eighth grade. There is a need for algebraic thinking to be introduced in the early elementary grades, so that algebra becomes a natural way of expressing what is sensible in arithmetic. With this sort of foundation, the first formal courses in algebra would be a smoother continuation of ideas that had evolved naturally.

IMPROVING MATHEMATICS EDUCATION USING RESULTS FROM NAEP AND TIMSS


A Closer Look at Item Types on NAEP

The 1996 NAEP in mathematics included three types of items: multiple choice, regular constructed response, and extended constructed response. The regular constructed response items required that students provide their own answer, rather than selecting from a given set of options. The extended constructed response items, however, involved longer responses and were designed to assess higher levels of problem solving, reasoning, and mathematical communication. Typically, students are asked to explain their thinking or justify their solutions. These latter items were first included on NAEP in 1992.

The results by item type from 1992 and 1996 show wide differences. At all three grade levels, student performance on multiple choice items was significantly better than performance on constructed response items. Figure 15 shows the mean percent correct for multiple choice and regular constructed response, and the percent satisfactory or better on the extended constructed response (Dossey et al., 1993; Silver et al., 1998). Clearly students perform better on multiple choice questions than on either type of constructed response. The results are similar for 1992 and 1996. It is interesting to note that eighth graders did comparatively better on the regular constructed response items than either fourth or twelfth graders.

The most obvious conclusion from Figure 15 is that overall performance on all of the item types is too low: slightly more than half of the responses to multiple choice items are correct, but performance on the constructed response questions was worse. When students were asked to produce an answer rather than choose from a set of possible answers, their performance declined considerably. When asked to produce an extended response, their performance was abysmal.

Perhaps the tasks are too difficult

What is it about the extended constructed response items that makes the performance levels so poor? Some may wonder whether the tasks are too difficult. Let's take a look at one fourth grade released item that was on both the 1992 and the 1996 NAEP.

The task in Figure 16 shows two figures, both drawn on the same grid. Students are first asked to tell how the two figures are alike, and then to list ways that they are different. A typical response might say that they are alike because they both have four sides, or that they have the same length or base, or that they both have little squares inside. It is also true that both figures have the same height and the same area, and

FIGURE 15: RESULTS BY ITEM TYPE
NAEP 1992, 1996

            Multiple Choice   Constructed Response   Extended Constructed Response
            (% Correct)       (% Correct)            (% Satisfactory or Better)
1992
GRADE 4     50%               42%                    16%
GRADE 8     56                53                     8
GRADE 12    56                40                     9
1996
GRADE 4     54                38                     17
GRADE 8     55                49                     9
GRADE 12    60                34                     12

Source: NAEP Mathematics, 1992, 1996

THE COUNCIL OF CHIEF STATE SCHOOL OFFICERS STATE EDUCATION ASSESSMENT CENTER


they both have parallel sides. For differences, the student might notice that one figure has four equal angles, while the other does not, and that those equal angles are right angles. The figures also have different perimeters. They belong to different classes of four-sided figures, in that one is a rectangle and one is a parallelogram.

FIGURE 16: "TWO GEOMETRIC SHAPES"

NAEP 1996, GRADE 4

Think carefully about the following question. Write a complete answer. You may use drawings, words, and numbers to explain your answer. Be sure to show all of your work.

In what ways are the figures above alike? List as many ways as you can. In what ways are the figures above different? List as many ways as you can.

MINIMAL 31%
PARTIAL 29%
SATISFACTORY 11%

The scoring rubric for this task essentially counts the number of correct reasons, both for similarities and differences. To earn a Minimal response, the student needs to identify one correct reason, or give a nonspecific response. A Partial response requires two reasons, while a Satisfactory response requires three in some combination of similarities or differences. The Extended response requires some combination of four similarities or differences.
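The counting logic of this rubric can be sketched in a few lines of code (an illustrative sketch only; the function name and signature are ours, not NAEP's actual scoring procedure):

```python
def score_response(num_correct_reasons, nonspecific=False):
    """Illustrative sketch of the counting rubric described above:
    the level depends only on how many correct similarities or
    differences a student lists, in any combination."""
    if num_correct_reasons >= 4:
        return "Extended"      # four or more reasons, any combination
    if num_correct_reasons == 3:
        return "Satisfactory"  # three reasons
    if num_correct_reasons == 2:
        return "Partial"       # two reasons
    if num_correct_reasons == 1 or nonspecific:
        return "Minimal"       # one reason, or a nonspecific response
    return "Incorrect"
```

For example, a student who notes only "both have four sides" and "one is a rectangle, one is a parallelogram" would earn a Partial score.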

Nearly one third of fourth graders had responses to this item that were either incorrect or off task (or there was no response at all). Most fourth graders were only able to identify one or two reasons why the figures were either alike or different, and only about 11 percent reached the Satisfactory or Extended level.

Why the poor showing? This task does not present a complex problem solving situation. It presents two simple geometric figures and asks how they are alike or different. The scoring rubric does not differentiate between students who responded about the figures based solely on appearances ("one looks slantier than the other") versus those who used more sophisticated geometric terminology, yet most students were still unable to come up with more than two characteristics.

Perhaps we should consider other possible causes for the poor performance on this and other extended constructed response tasks.

Maybe they didn't try hard enough

Among the many reasons for the relatively poor showing on the NAEP extended constructed response tasks, one must consider motivation. On the sample item shown above, 8.5 percent of the responses were off task or omitted. The issue of motivation is more clearly seen with responses to the twelfth grade items, comparing the number of "no response" or "off task" papers in twelfth grade to those in the eighth or fourth grade. For example, on the released extended constructed response tasks from 1996, the twelfth grade results showed a range of 25-30 percent of the papers in the combined categories of "off task" or "omit." In contrast, at both fourth and eighth grades, the number of "off task" or "omit" responses on the released tasks ranged from about 6 percent to about 13 percent. Because NAEP is designed to give results at the national or state level, there are no student-level scores given. Nor are there any consequences for students based on their results. It appears that twelfth graders, aware of the lack of consequences, put less effort into these tasks than did either eighth or fourth graders.

Maybe they didn't get to the item

Another, perhaps related, reason for the poor showing on the extended constructed response tasks is the position of the tasks on the test. Each student is given a block of items to work, and this includes a combination of multiple-choice, regular constructed response, and one extended constructed response item. The extended constructed response item is always positioned at the end of the block. It


is reasonable to assume that, for some students, there is insufficient time to work the extended constructed response item. The number of "not reached" responses to the fourth grade item was only 2.5 percent. Again, the statistics for "not reached" are higher for twelfth grade students than for either eighth or fourth. When combined with lower motivation, this could account for many students either not trying at all, or not giving their best efforts to these items. After all, these are items that demand much more effort than either choosing from a set of given responses, or finding a single numerical answer.

Perhaps they have never done this kind of mathematics before

Another consideration regarding the poor showing on these items is the opportunity to learn the content embedded in the item. On the 1996 NAEP, a disproportionate number of extended constructed response items came from the content areas of data and geometry. Both of these areas have been traditionally weak in the U.S. curriculum, especially at fourth and eighth grades. Both are topics found in the last chapters of traditional elementary textbooks, and often not reached (or valued) by elementary teachers. On the comparison of geometric figures described above, typical instruction in the elementary grades might include simple definitions for certain shapes, but students are not often asked to analyze the geometric features of those shapes. In fact, a qualitative analysis of a sample of student responses showed that students had much more difficulty with the "difference" question than the "likeness" question, which may be related to the frequency with which students are asked to look for contrasts, rather than comparisons.

These items demand a different kind of reasoning

Aside from lack of motivation, insufficient time, or opportunity to learn, it is important to consider the more substantive reasons for low scores on these items. That is, these items assess higher order thinking skills, such as reasoning, making connections, analyzing, making conjectures, and writing explanations or proofs. Such tasks are not only inherently more difficult, but often students have not had many opportunities to engage in this kind of activity in mathematics class. When students spend most of their class time learning how to perform relatively simple procedures, rather than learning to understand concepts and engaging in higher order thinking, it is not surprising that they do poorly on these types of tasks. Fourth graders who may have learned to identify shapes will have a more difficult time when asked to analyze the geometric properties of those shapes.

These tasks require good communication skills

Nearly all extended constructed response tasks demand that students be able to communicate mathematical ideas clearly, precisely, and concisely. Responses might include a variety of types of communication, such as equations, graphs, tables, diagrams, charts, and words. Scoring of these responses almost always involves how clear the explanation is, or how clearly the student can demonstrate understanding of the concepts. Once again, students need practice in these types of activities. They need to be in classrooms where both verbal and written explanations are valued and form an integral part of the class. From the earliest grades, students need to gain an understanding of what constitutes a valid mathematical argument, and they need to practice explaining their ideas to others.

Problems are often non-routine, and students do not have readily available algorithms or procedures for solving them

For nearly twenty years mathematics educators have seen from the NAEP results that students have difficulty with any non-routine problems that require more than one step to solve or involve some analysis or thinking. In an interpretive report of the 1980 NAEP mathematics assessment, Carpenter et al. noted,


Part of the cause of students' difficulty with non-routine problems may lie in our overemphasis on one-step problems that can be solved by simply adding, subtracting, multiplying, or dividing. ...Instruction that reinforces this simplistic approach to problem solving may contribute to students' difficulty in solving unfamiliar problems. (1981, p. 146)

Certainly most extended constructed response tasks fit the description of non-routine, multi-step problems. It is also clear that little progress has been made in students' abilities to solve such problems successfully. Similar comments about the need for instruction in such tasks were made after the 1992 assessment (Kenney & Silver, 1997).

SUMMARY AND RECOMMENDATIONS

The poor performance on the extended constructed response items may be due to any one of, or the combination of, the reasons given here. While we might explain some of the results by test-related issues, such as motivation or the placement of the items at the end of the block, those were clearly not the major reasons for the low scores on the fourth grade item we analyzed. The lessons to be learned from these results are clear: students in mathematics classes need more opportunities to work non-routine problems, to use higher-order thinking skills, and to communicate their mathematical ideas.


Using NAEP Data: ANALYZING DIFFERENCES IN STUDENT MATHEMATICS PROFICIENCY


Too often, we find that national, state, and local assessment results are reported and interpreted only with mean scores, percentages of all students, or other statistics of central tendency. Large-scale studies such as NAEP, TIMSS, or state assessments are reported in a way that gives little idea of differences among groups of students, types of schools, or different curricula. In the 1990s, NAEP results have typically been reported using state-level averages and percentages (e.g., the percent of students at/above the Proficient level). A common use of state NAEP results is to compare where one state ranks, on average, against other states in the same region or the nation. This kind of one-statistic analysis is often too narrow, not informative, and not useful for mathematics educators or decision-makers. The analysis of results can lead to misleading interpretations. Careful study of differences within the target population can increase the usefulness of the results.

In the following section we look at four examples of how to analyze differences in NAEP results at national and state levels. The examples discussed in this paper focus on variations in NAEP results that are, in our judgment, likely to have specific policy and program interest to states.

DIFFERENCES IN STATE NAEP RESULTS BY STATE CONTEXT

Public analysis and discussion about NAEP results has often focused on attributing high or low results to differences in characteristics of the state, such as social, cultural, or economic characteristics. In Figure 17 we show NAEP results for high and low performing states by three variables commonly cited as being reasons for test score differences related to state context: amount of money spent on education, students living in poverty, and adult education level.

FIGURE 17: STATE MATH PROFICIENCY BY MEASURES OF STATE CONTEXT

NAEP 1996, GRADE 8

                         % At/Above    Children in Poverty   Expenditures per Pupil   Education of Adults
High-Performing States   Proficient    (5-17 years old)      (adjusted COL)           (% H.S. grads)
MINNESOTA                34            16%                   $5,738                   82%
NORTH DAKOTA             33            14                    5,234                    77
WISCONSIN                32            14                    6,588                    79
MONTANA                  32            18                    5,653                    81
CONNECTICUT              31            18                    7,279                    79

Low-Performing States
MISSISSIPPI              7             33%                   $4,358                   64%
LOUISIANA                7             34                    4,875                    68
ALABAMA                  12            24                    4,604                    67
ARKANSAS                 13            22                    4,804                    66
NEW MEXICO               --            29                    4,749                    75

Source: Blank, Manire, et al., 1997 (Poverty: Current Pop. Survey, 1994; Expenditures: NCES, 1994-95; Adult Education: Bureau of Census, 1990)


This kind of high/low analysis can tell us if there is a likely correlation between the characteristics of the state and student performance. The state context also informs educators about some of the barriers or disadvantages that schools and districts must address in improving mathematics in classrooms.

The data show clearly that states with high NAEP math performance results have better conditions for education. Compared to low-performing states, high-performing states have 10 to 20 percent fewer children living in poverty, 10 to 15 percent more adults with high school diplomas, and $1,000 to $1,500 higher expenditures per pupil. The difference between the extremes on NAEP student performance results shows that state context is related to student achievement in mathematics.
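The group gaps cited above can be checked directly against the Figure 17 values (a quick sketch; the variable names and state groupings are ours, taken from the figure):

```python
# Figure 17 values per state: (children in poverty %, expenditures per pupil $, adult H.S. grads %)
high = {  # high-performing states
    "MN": (16, 5738, 82), "ND": (14, 5234, 77), "WI": (14, 6588, 79),
    "MT": (18, 5653, 81), "CT": (18, 7279, 79),
}
low = {  # low-performing states
    "MS": (33, 4358, 64), "LA": (34, 4875, 68), "AL": (24, 4604, 67),
    "AR": (22, 4804, 66), "NM": (29, 4749, 75),
}

def mean(values):
    return sum(values) / len(values)

for label, idx in [("poverty %", 0), ("$ per pupil", 1), ("adult H.S. grads %", 2)]:
    gap = mean([v[idx] for v in high.values()]) - mean([v[idx] for v in low.values()])
    print(f"{label}: high minus low = {gap:+.1f}")
```

The computed gaps (about 12 points less poverty, roughly $1,420 more per pupil, and about 12 points more adults with diplomas) fall within the ranges the paragraph cites.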

VARIATION IN MATHEMATICS PROFICIENCY FROM TOP STUDENTS TO BOTTOM STUDENTS

A way of improving our analysis of student achievement is to examine the extent of variation between the students performing at high and low levels, and the degree to which students at each end of the distribution are improving their learning. Educators and policymakers need first to see the total range of performance and then examine where the average falls in the range. An average score may hide poor performance by a specific portion of students.

State percentile scores are a useful approach for analyzing variation. Percentiles give the range of performance: differences between the students learning the most mathematics and those learning the least. To obtain a picture of how much score variation there is among states we examine NAEP results from five states in Figure 18. The states were selected according to the percentage of students scoring at/above Proficient in grade 4: Connecticut (the top performing state), Michigan (11th from top), Missouri (21st), Kentucky (31st), and Louisiana (41st). The percentile breaks on NAEP are reported using the NAEP scale (0 to 500). For grade 4 students a score of 249 and higher is at/above the Proficient level, and 214 is at/above the Basic level (Reese, et al., 1997, p. 10).

The variation nationally from bottom quartile to top quartile at grade 4 mathematics is 43 points. This difference shows significant variation in mathematics learning and performance. Variation in performance of grade 4 students was similar in all five states for 1996. Thirteen points on the NAEP scale represent about one year of education, based on differences between grade 4 and grade 8 results, which means a difference of over 40 points can be viewed as greater than three years of math education.
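The three-year figure follows directly from the two numbers in the paragraph:

```latex
\frac{43\ \text{scale points}}{13\ \text{scale points per grade-year}} \approx 3.3\ \text{years of math education}
```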

It should be noted that the differences in math performance within each state are much greater than differences between states. For example, the

FIGURE 18: SCORE VARIATION WITHIN SELECTED STATES
NAEP 1996, GRADE 4
[Bar chart showing the 25th, 50th (median), and 75th percentile scale scores, on a 175-275 score axis, for Connecticut, Michigan, Missouri, Kentucky, and Louisiana.]
Source: NAEP Cross-State Compendium


difference in average scale scores between Connecticut and Louisiana is only 23 points.

NAEP results can address the effect of more grades of schooling on differences in student math performance. Above, we observed that math performance on NAEP did improve in some states, particularly at grade 8. With the data on variation, we can ask whether more schooling tends to increase or decrease the variation in math performance. The results show that variation does increase from grade 4 to grade 8 (NAEP Cross State Compendium, 1996): Connecticut increased from 39 points (grade 4) to 48 points (grade 8); Michigan from 40 points to 49; Missouri from 38 to 43; Kentucky from 39 to 42; and Louisiana from 39 to 43.

VARIATION BY TYPE OF COMMUNITY OR SCHOOL LOCATION

A second kind of analysis of differences in performance in mathematics is by type of community or location of school. From 1990 to 1996, a total of 27 states had significant improvement in the percentage of grade 8 students at/above Proficient. We can go further with the NAEP data and answer the question of how scores, and improvement, differ by the community characteristics of schools. For example, we can examine the percentage of students reaching the Proficient level by schools in central cities, suburbs, and rural areas.

In Figure 19, we show variation in 1992 and 1996 NAEP results by school location. Students in central city schools do less well in math in each state. In four of five states, students in rural or small town schools score slightly less well than suburban children, with the exception of rural/small town Connecticut schools.

The state and community location analysis shows significant improvement in suburbs/large towns in Michigan (9 percentage points) and Connecticut (6 percentage points). Math performance improved in central cities in Michigan (9 percentage points) and Kentucky (8 points). Michigan also showed the


FIGURE 19: DIFFERENCES IN SELECTED STATES' MATHEMATICS PROFICIENCY BY SCHOOL LOCATION
NAEP GRADE 8, 1992 TO 1996
[Bar chart comparing the percentage of students at/above Proficient in 1992 and 1996 for central city, suburb/large town, and rural/small town schools in Connecticut, Michigan, Missouri, Kentucky, Louisiana, and the nation.]
Source: NAEP Cross-State Compendium


most improvement in rural and small town schools (10 points). Nationally, the most improvement at grade 8 math was in rural/small town schools.

The NAEP Mathematics Cross-State Compendium report (Shaughnessy, et al., 1998) shows that other states making significant improvement in central city schools were North Carolina (11 points) and Indiana (7 points). States with significant improvement in rural schools were Rhode Island (9 points) and Wisconsin (8 points).

The data lead to several questions that states and districts could ask in order to further analyze their performance: What efforts did some states, such as Michigan, Kentucky, and North Carolina, make in central cities to raise scores as much as in suburban and rural schools? What could educators in central cities in Connecticut learn from their colleagues in suburban and rural schools that are scoring very well on NAEP?

VARIATION IN MATH PERFORMANCE BY CURRICULUM

A key finding reported from analysis of the Second International Mathematics Study (SIMS) was that schools in the U.S. really offer three or four different mathematics curricula. A primary reason for the wide variation in math performance on achievement tests is the differentiated curriculum provided to students (McKnight, et al., 1987). Analyses of recent NAEP results show that high mathematics proficiency is best explained by the level of mathematics courses students have completed (Mullis et al., 1993; Reese, et al., 1997; Jones, L.V., et al., 1986).

Figure 20 shows differences in NAEP 1996 math performance according to the percentage of students that took algebra, pre-algebra, or "regular" eighth grade math.

FIGURE 20: EIGHTH GRADE MATHEMATICS CURRICULUM IN SELECTED STATES BY NAEP SCORE, 1996
[Paired charts showing, for CT, MI, MO, KY, LA, and the nation, the percent of students taking "regular" 8th grade math, pre-algebra, and algebra, and the average NAEP score at each course level.]
Source: NAEP Cross-State Compendium


Grouping students by prior mathematics achievement has been debated by educators and researchers for years. The data for these selected states show clearly that student achievement is predicted in each state by the course students take, particularly when considering algebra vs. pre-algebra. We should also note that the proportion of students taking these different course levels differs significantly among states. Kentucky has 49 percent of students taking regular eighth grade math and 20 percent taking algebra, as compared to 35 percent of Connecticut students taking regular math and 28 percent taking algebra.

Research on patterns of student achievement has demonstrated that instructional time and course taking in math and science vary widely across U.S. schools, and that they are correlated with the socioeconomic status of students in the schools (Goodlad, 1984; Horn & Hafner, 1992; Oakes, 1990; Weiss, 1994; NCES, 1997). New research on secondary curricula shows that schools with highly differentiated (or tracked) secondary course offerings have the lowest achievement among economically disadvantaged students (Lee, et al., 1995).

The level of mathematics curriculum that students reach in high school makes a major difference in achievement for all groups of students. The table in Figure 21 shows the NAEP achievement scale score by three student ethnicity groups and by the level of high school mathematics attained by graduation.

Course-level is a strong predictor of NAEP scores for all student groups: white, black, and Hispanic students. The NAEP trends data back to 1973 show that all ethnic groups have doubled their enrollment in higher level math by graduation, e.g., algebra 2 and pre-calculus. In 1973, only 28 percent of black students took algebra 2 by graduation as compared to 45 percent in 1996. At the same time we should note that even though NAEP scores of all groups have gone up significantly since 1973, there is still a significant gap in achievement scores between white students and black and Hispanic students at all course levels. CCSSO's biennial report on State Indicators of Science and Mathematics Education (Blank, et al., 1997a) provides a detailed analysis of disparity in NAEP results by student race/ethnicity and trends since 1990.

FIGURE 21: HIGH SCHOOL MATHEMATICS COURSE COMPLETED BY RACE/ETHNICITY BY NAEP SCORE, 1996
[Table showing, for white, black, and Hispanic students, the percent taking and the NAEP scale score for each highest course completed: algebra, geometry, algebra 2, and pre-calculus/calculus.]
* Sample not sufficient for reliable estimate.
Source: NAEP Trends in Academic Progress, 1997


Opportunity to Learn Mathematics: TEACHERS, CURRICULUM, INSTRUCTIONAL PRACTICES


The international studies conducted under the International Association for Evaluation of Education Achievement (IEA) have provided survey models and research designs for analyzing students' "opportunity to learn" mathematics in classrooms and schools. The in-depth analyses of results of the Second International Mathematics Study (SIMS), such as those by McKnight, et al. (1987) and Travers (1985), provided important research findings on the dominant role of the "implemented curriculum" in explaining students' mathematics performance. The analysis showed the wide variation in classroom mathematics curriculum among and often within U.S. schools, and showed that curriculum differences were more central to explaining differences in achievement than other variables such as total instructional time, homework, class size, or teacher degrees and courses.

Recent research by Porter (1995) and McDonnell, Burstein, et al. (1995) tested the validity of a variety of measures of opportunity to learn, including content coverage, instructional practices, and materials, and demonstrated the relationship of these measures to achievement. The critical role of teacher preparation and knowledge in opportunity for learning was outlined by Stevens (1993) and Darling-Hammond (1995).

Analyses of data from the National Education Longitudinal Study (Hoffer, et al., 1995; Elliott, 1998) show the significant differences in access to learning opportunities among our students. New analyses of NAEP mathematics results by state (Raudenbush, 1998; Education Trust, 1998) show that achievement results are related to differences in math curriculum, teacher preparation, and teaching practices both within and between states, and these opportunity differences are related to student minority and economic status. Grissmer and Flanagan (1998) have conducted new analyses of long-term NAEP trends to explain improved performance of black students and low-income students and found that lower class size made a significant difference for these student populations.

In our analysis, we focus on three kinds of critical variables in opportunity to learn mathematics:

Teacher preparation in mathematics;

Implemented mathematics curriculum; and

Teaching practices in mathematics classrooms.

Measures in these areas have been used in prior studies of opportunity to learn. They are measures that can be analyzed with data available in NAEP and TIMSS, and they indicate conditions that can be affected by education policies and decisions about practice.

WELL-PREPARED TEACHERS OF MATHEMATICS MAKE A DIFFERENCE

National professional standards in mathematics and science, as well as many state standards, call for change in teaching and classroom practices to emphasize active learning by students, deep understanding of concepts, and developing skills in problem solving and reasoning (NCTM, 1989, 1991; AAAS, 1993; NRC, 1995). The standards for teaching in mathematics and science de-emphasize teacher lectures, memorizing facts and terminology, and curriculum aimed at briefly covering many topics.

One implication of states establishing challenging content standards in mathematics is that teachers need in-depth knowledge and understanding of their discipline, and skills in a variety of classroom practices that actively engage students. A problem in measuring teacher knowledge and skill in relation to standards is the lack of precision in the data. Most


of the information about teachers and teaching is from self-report surveys by teachers or administrative records (e.g., degrees, transcripts, etc.). The videotape study of eighth grade mathematics classrooms in the U.S., Germany, and Japan led by Stigler (NCES, 1997d) has revealed great differences in the approach and methods of teachers, the role of students in the classes, and the use of materials and technology in class.

For analyzing differences in performance of U.S. students in mathematics, we would ideally prefer measures of the quality of teaching in classrooms and the quality of teacher preparation. However, research has shown that traditional measures of differences among teachers in the amount of course preparation in mathematics and science are related to student performance among U.S. schools and classrooms. Although these measures are less useful for international comparisons, they are useful for explaining differences within the U.S.

An indicator often used in national and state-by-state reports is the percentage of teachers who hold an undergraduate or graduate major in the field they are assigned to teach. Research has consistently shown a positive relationship between the amount of course work preparation of U.S. teachers in science and mathematics and student learning in those fields (Shavelson et al., 1989). A recent analysis of data from the Longitudinal Study of American Youth showed that each additional mathematics course taken by mathematics teachers above the average for teachers translates into two to four percent higher student achievement (Monk, 1993).

We can compare levels of teacher preparation in mathematics with NAEP mathematics student proficiency. Figure 22 shows the 10 states with highest NAEP performance at grade 8 (from 34 to 28 percent at/above Proficient), and the 10 states with lowest performance. For each of these states we report the percentage of secondary math teachers (grades 7-12) with a major or minor in mathematics/math education (based on the 1994 Schools and Staffing Survey, NCES, 1996b).

The data on teachers' major or minor in mathematics show extensive variation among states, from 95 percent of secondary teachers well prepared in mathematics to just over 60 percent. Nationally, 80 percent of all secondary mathematics teachers (i.e., those teaching math one or more periods) have a major or minor in math, and 72 percent of secondary teachers with their main assignment in math have a major in that field. (Note: the national figure of 20 percent without adequate math preparation represents more than 46,000 secondary teachers of math.)

FIGURE 22: MATHEMATICS PREPARATION OF GRADE 7-12 TEACHERS IN HIGH- VS. LOW-PERFORMING STATES ON NAEP 1996

High-Performing States,   7-12 Teachers with    Low-Performing States,   7-12 Teachers with
NAEP, Gr. 8               Math Major/Minor      NAEP, Gr. 8              Math Major/Minor
MINNESOTA                 93%                   MISSISSIPPI              82%
NORTH DAKOTA              92                    LOUISIANA                82
WISCONSIN                 83                    ALABAMA                  87
MONTANA                   90                    ARKANSAS                 80
CONNECTICUT               88                    WEST VIRGINIA            89
NEBRASKA                  87                    SOUTH CAROLINA           71
IOWA                      92                    NEW MEXICO               81
MAINE                     71                    TENNESSEE                69
ALASKA                    57                    KENTUCKY                 79
MASSACHUSETTS             75                    HAWAII                   62
NATION                    80

Note: Standard errors for state estimates are from 2 to 6%.
Sources: NAEP 1996 Mathematics Report Card; Schools and Staffing Survey, 1994

The levels of teacher preparation in mathematics show a difference between the two groups of states. Seven of the ten high-performing states had percentages significantly above the 80 percent national average of teachers with a major or minor in mathematics or math education.

The low-performing states on NAEP had fewer well-prepared teachers in mathematics. With the exception of Alabama and West Virginia, the low-performing NAEP states were near or below the national average for well-prepared teachers in mathematics.
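The "significantly above" comparisons here rest on the reported percentages and their standard errors. A minimal sketch of how such a comparison can be approximated, assuming a two-sided z-test and illustrative standard-error values within the 2-6 point range noted under Figure 22 (the report's own significance procedure may differ):

```python
import math

def differs_from_nation(state_pct, state_se, nation_pct, nation_se=0.5):
    """Approximate two-sided z-test at roughly the 95% level for whether
    a state percentage differs from the national average, given the
    standard errors of both estimates (assumed values, not published ones)."""
    z = (state_pct - nation_pct) / math.sqrt(state_se ** 2 + nation_se ** 2)
    return abs(z) > 1.96

# Minnesota (93%) vs. the 80% national average, assuming a 2-point SE
print(differs_from_nation(93, 2.0, 80))  # True
# A state at 82% with a 3-point SE is not distinguishable from 80%
print(differs_from_nation(82, 3.0, 80))  # False
```

With standard errors of a few points, states only one or two points from the national average cannot be called different, which is why "near or below" is the appropriate reading for several of the low-performing states.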

Quality of teacher preparation in mathematics varies significantly within states by school. Other national data analyses (Ingersoll and Gruber, 1996; Weiss, 1994) have shown that schools with a high proportion of low-income or minority students have significantly lower percentages of math teachers with a major or minor in their field.

NEED FOR IMPROVED MEASURES OF TEACHER PROFESSIONAL DEVELOPMENT

Professional standards for teaching mathematics (NCTM, 1991) recommend that teachers have adequate course work preparation in the content areas they will be teaching. In addition, the professional organizations recommend ongoing professional development in the subject content and methods of teaching their assigned field and grade level. One question on the NAEP teacher questionnaire addressed the number of hours of professional development in math or math education received by teachers during the past year.

In Figure 23 we compare the extent of professional development in math for two groups of states: states with highest improvement in NAEP scores at grade 4 (1992 to 1996) and states with highest improvement at grade 8 (1990 to 1996). Nationally, 28 percent of fourth grade teachers had 16 or more hours of professional development in mathematics education, and 48 percent of eighth grade teachers had 16 or more hours.

The percentage of grade 4 teachers with high levels of professional development varies widely by state.

FIGURE 23: IMPROVING STATES BY TEACHER PROFESSIONAL DEVELOPMENT IN MATH, NAEP 1996

Grade 4 High-Improvement   Prof. Devel. Math   Grade 8 High-Improvement   Prof. Devel. Math
States                     16+ Hours           States                     16+ Hours
CONNECTICUT                22%                 MINNESOTA                  50%
TEXAS                      46                  TEXAS                      64
INDIANA                    13                  WISCONSIN                  40
COLORADO                   21                  CONNECTICUT                47
NORTH CAROLINA             19                  NEBRASKA                   36
WEST VIRGINIA              20                  MICHIGAN                   44
TENNESSEE                  19                  COLORADO                   42
NATION                     28                  NATION                     48

Note: Standard errors of state estimates are from 1 to 3 percent.
Source: NAEP 1996 Mathematics Cross-State Compendium


For example, only 13 percent of Indiana's grade 4 teachers and 19 percent of North Carolina teachers received 16 hours or more of math professional development, while 46 percent of Texas teachers received this much development over a one-year period. Only one of the high-improving states at grade 4 had more than the national average amount of mathematics professional development (more than 28 percent of teachers). At grade 8, only two of the improving states had more than 48 percent of teachers receiving over 16 hours of professional development.
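The "only one state at grade 4, only two at grade 8" counts can be reproduced directly from the Figure 23 values; a minimal sketch using the percentages as transcribed (approximate, given the condition of the source copy):

```python
# Figure 23: percent of teachers with 16+ hours of math professional
# development, for the high-improvement states at each grade.
grade4 = {"Connecticut": 22, "Texas": 46, "Indiana": 13, "Colorado": 21,
          "North Carolina": 19, "West Virginia": 20, "Tennessee": 19}
grade8 = {"Minnesota": 50, "Texas": 64, "Wisconsin": 40, "Connecticut": 47,
          "Nebraska": 36, "Michigan": 44, "Colorado": 42}

NATION_G4, NATION_G8 = 28, 48  # national averages from the figure
above_g4 = [s for s, p in grade4.items() if p > NATION_G4]
above_g8 = [s for s, p in grade8.items() if p > NATION_G8]
print(above_g4)  # ['Texas']
print(above_g8)  # ['Minnesota', 'Texas']
```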

The amount of professional development being received by teachers does not have a consistent relationship to student achievement averages for the states that showed most improvement in NAEP scores. It might be expected that states with more professional development in mathematics would show greater improvement on NAEP mathematics than states with less professional development. But this hypothesis does not hold up with the data available at the state level. A majority of the states that have shown high improvement are below the national average with respect to the percentage of teachers receiving more math professional development (i.e., 16 or more hours in the past year).

Our view is that focusing on the amount of time that teachers spend in professional development is inadequate because it does not capture the quality or effectiveness of the professional development provided. We also examined other NAEP questions pertaining to professional development, such as training in the NCTM standards, but none covered the topic well.

States, districts, or schools may offer professional development or in-service experiences for a variety of reasons, including state law or district teacher contracts. To adequately analyze professional development, mathematics educators need to monitor its content, the methods used to work with teachers, the continuity and follow-up, and the impact on the teachers. The items on amount of exposure are too general and encompass too many different methods of teacher development, and thus are unlikely to show a clear relationship of teacher development to student achievement.

OTHER DATA SOURCES ON PROFESSIONAL DEVELOPMENT

TIMSS did not include questions on either the amount of professional development or what was studied. An alternate national and state-level data source concerning professional development is the Schools and Staffing Survey (SASS), which is normally conducted by NCES every four years. The 1994 SASS included questions about the content of professional development activities, i.e., what teachers were intended to learn. Teachers reported on how much time they spent on selected topics during the previous year. The national percentages (NCES, 1996b) for elementary and secondary teachers are shown in Figure 24:

FIGURE 24: TOPICS AND TIME OF PROFESSIONAL DEVELOPMENT, K-12 TEACHERS

Topic                            1-8 hrs    > 9 hrs.
Content study in a subject       15%        15%
Teaching methods                 37         27
Methods of student assessment    40         12
Use of education technology      35         15

Source: SASS, 1994

These topics of professional development and time are reported by state and by elementary vs. secondary teachers in CCSSO's State Education Indicators with a Focus on Title I (Blank, et al., 1997c). The next SASS survey will provide data on professional development during the 1999-2000 school year.

CURRICULUM CONTENT DATA REVEAL MANY MATH TOPICS, LITTLE FOCUS

In the 1990s almost all states have developed new content standards or curriculum frameworks for core academic subjects, including mathematics (Blank, et al., 1997b; CCSSO, Key State Policies, 1998a). The state standards consistently call for focusing mathematics content on a smaller number of areas, developing mathematical abilities across content areas, and planning the curriculum across the grades. As these standards are applied in writing local curriculum, designing professional development, and developing student assessment programs, educators and policy makers are likely to need consistent, reliable information and feedback on the actual subject content that is taught in math and what students are expected to know and be able to do. CCSSO is currently studying the implementation of state reforms and standards in a project involving 11 states that employs "surveys of enacted curriculum" to measure classroom curriculum content and teaching practices. A field study of the surveys showed that the data can be collected from teachers, analyzed, and reported in ways that are useful to educators (CCSSO, SCASS Science Project, 1997). Gamoran, Porter and colleagues completed a study of the effects of different high school mathematics instruction using a similar survey approach (1997).

The most comprehensive and detailed data on "implemented curriculum" have been collected and analyzed in international studies. TIMSS measured student achievement in 41 countries based on mathematics and science assessment frameworks developed by consensus of the participating countries (NCES, 1996a, 1997a). A highlight of the TIMSS results reported in the U.S. has been the "Videotape Classroom Study" of mathematics teaching in grade 8 (NCES, 1997d). The video and accompanying compact disk provide the opportunity to analyze teaching practices by content topic, teaching approach, and student activity, and thus offer a qualitative dimension for research on how curriculum is taught. This level of detail would be very difficult and costly for states or districts to replicate on a large scale.

Therefore, we focus on data from the TIMSS teacher surveys. This approach can be used more readily by states and districts to assist in analyzing achievement results in mathematics education. TIMSS data collection included surveys of teachers and students with the goal of collecting reliable, comparable data on the content of curriculum in math and science classrooms across the participating countries. Teachers completed a survey that asked which of 35 curriculum content topics in mathematics were taught during that year and the average number of periods each topic was taught.

The results of the survey show variation across the 41 participating countries in the content of the actual curriculum taught, as well as the degree of variation in curriculum within a country. In Figure 25, we compare data on implemented curriculum in mathematics at grade 8 in Japan and the U.S. We show the percentage of teachers who covered each topic, and the average percentage of time per year spent on the topic.

As the data show, Japanese eighth grade teachers spend most of their time teaching a few topics: geometry, congruence and similarity, functions, relations and patterns, and equations and formulas. In fact, those four areas of the curriculum account for approximately 67 percent of the time they spend teaching. In contrast, teachers in the U.S. spread time very thin among a wide range of topics. The majority teach 16-18 different topics, with only one topic accounting for more than eight percent of their teaching time. That topic is fractions, which only about a quarter of Japanese teachers teach, accounting for only two percent of their time. These results point to why some have accused the U.S. mathematics curriculum, especially in the middle grades, of being "a mile wide and an inch deep." With so many different topics taught during the year, it is unrealistic to expect that any given topic will be treated at more than a superficial level.


FIGURE 25: MATHEMATICS CURRICULUM TOPICS, TIMSS, GRADE 8
(Percent of teachers who covered the topic / percent of yearly teaching time; -- marks values illegible in the available copy)

Topic                              Japan: Covered / Time    U.S.: Covered / Time
Meaning of Whole Numbers           19% / 1%                 87% / --
Fractions                          26 / 2                   98 / 17
Percentages                        19 / --                  92 / 6
Number Concepts                    26 / 2                   -- / 8
Number Theory                      14 / --                  95 / 6
Estimation/Number Sense            44 / --                  23 / --
Measurement Units/Processes        25 / --                  88 / 5
Measurement Estimation/Error       40 / 2                   61 / 2
Perimeter, Area and Volume         27 / 2                   92 / 5
Geometry, 1D & 2D                  81 / 14                  90 / 5
Symmetry/Transformations           21 / --                  56 / 2
Congruence/Similarity              98 / 23                  72 / 3
3D Geometry                        22 / --                  51 / 2
Ratio/Proportion                   66 / 3                   98 / 5
Slope/Trigonometry                 31 / 3                   38 / --
Functions, Relations, Patterns     81 / 12                  -- / 3
Equations/Formulas                 94 / 18                  90 / 8
Data/Statistics                    84 / 2                   75 / 3
Probability/Uncertainty            2 / --                   66 / 2
Sets/Logic                         10 / --                  49 / 2
Other advanced content             49 / 4                   48 / 6

Source: TIMSS Teacher Questionnaire, Population 2 (Schmidt & Cogan, unpublished data, 1998)
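The concentration contrast described above can be checked directly from the Figure 25 time shares; a minimal sketch using the Japan and U.S. percentages as transcribed (the source copy is imperfect, so treat the values as approximate):

```python
# Japan grade 8: percent of yearly teaching time for the dominant topics,
# plus fractions for the U.S. comparison (values read from Figure 25).
japan_time = {"Congruence/Similarity": 23, "Equations/Formulas": 18,
              "Geometry, 1D & 2D": 14, "Functions/Relations/Patterns": 12,
              "Fractions": 2}

# The four largest shares together cover about two thirds of teaching time.
top_four = sum(sorted(japan_time.values(), reverse=True)[:4])
print(top_four)  # 67 -- the "approximately 67 percent" cited in the text

# U.S. contrast: fractions alone takes 17 percent of U.S. teaching time,
# the only topic above eight percent, versus 2 percent in Japan.
print(17 > 8 and japan_time["Fractions"] < 8)  # True
```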

TEACHING PRACTICES DATA INDICATE MATHEMATICS CHANGE BUT ALSO TRADITIONAL PRACTICES

The NCTM curriculum standards (1989) and teaching standards (1991) recommend approaches to instruction that increase students' conceptual understanding, and their abilities to communicate mathematically, to reason and solve problems with mathematics, to make connections between math learning and real-world problems, and to learn skills and procedures. Many states have completed their own standards and curriculum frameworks in mathematics and science that suggest teaching strategies or provide examples of classroom practices that are consistent with challenging content standards (Blank, et al., 1997b). We have selected data from the NAEP mathematics teacher survey that are intended to be used in analyzing teaching in relation to standards. In Figure 26, we show data on practices in five selected states covering the range of state performance. State-by-state statistics on these teaching practices are available in CCSSO's Science and Mathematics Indicators report (Blank, et al., 1997a, pp. 47, 49).

The statistics on teaching practices for the five states at different levels of student achievement do not show any overall relationship between these teaching practices (which are consistent with NCTM standards) and state NAEP scores. There is considerable variation in these practices among the five states, and in comparison to the national averages. Connecticut teachers report higher emphasis on "developing reasoning and analytical ability," Kentucky and Connecticut teachers include more writing of answers to math problems, and Michigan and Missouri have greater use of calculators in class. The variation shown among the five states is consistent with the 50-state data (Blank, et al., 1997a).

FIGURE 26: MATHEMATICS CLASSROOM PRACTICES IN SELECTED STATES, NAEP 1996, GRADE 8
(-- marks a value illegible in the available copy)

State          Develop Reasoning/    Discuss Solutions to   Write About Math   Use a Calculator   NAEP Score
               Analytical Ability    Math Problems          (Weekly/more)      (Daily)            Average
               (Percent)             (Daily)
CONNECTICUT    59%                   44%                    42%                60%                280
MICHIGAN       48                    48                     39                 78                 277
MISSOURI       46                    43                     22                 71                 273
KENTUCKY       49                    --                     55                 60                 267
LOUISIANA      44                    44                     25                 19                 252
NATION         52                    48                     32                 57                 271

Notes: Percent = teachers' response "a lot," as compared to "some" or "none." Standard errors of state estimates are from 2 to 4 percent.
Source: Teacher questionnaire; NAEP 1996 Mathematics Cross-State Compendium

Develop reasoning and analytical ability; discuss solutions to math problems

These two practices address the problem solving and reasoning theme of the NCTM standards for mathematics education. Figure 26 shows that nationally, half the grade 8 students have teachers who report emphasis on teaching to develop reasoning, and almost half the students discuss math problems almost every day. Although these practices are quite prevalent, many teachers also do not emphasize these approaches to teaching math. Other data on grade 4 classroom practices show that 35 percent of teachers report that students discuss problems with other students almost every day. In 12 of 45 states participating in NAEP 1996, over 50 percent of students in grade 4 reported they discussed solutions to math problems with other students in class once per week or more. Often these students may be working with other students in small groups.


The fairly high frequency of students reporting they discuss solutions to math problems with other students may be somewhat surprising, given the common perception of U.S. math instruction as teacher-centered or as individuals working on their own math assignments in class. These findings may indicate that change is taking place in methods of math instruction in our schools.

Write about solving math problems

One third of eighth grade students across the nation reported that they write about how to solve math problems in class once a week or more. Applying mathematics to real-life needs and problems is a major emphasis of the NCTM standards. Many states have recommended in their standards that instruction should develop students' abilities to communicate mathematically, such as by writing about how to solve a math problem. Writing about math in grade 8 classes varied from 23 percent in Indiana, Utah, and West Virginia to 58 percent in Kentucky and 50 percent in California.

Writing about solving mathematics problems is used slightly more often in grade 4 math classes than in grade 8, according to NAEP surveys of students. At grade 4, 37 percent of students report they write about solving math problems once a week or more, compared to 32 percent of eighth grade classes.


USE OF CALCULATORS IN CLASS

Nationally, 57 percent of students reported they use calculators almost every day in grade 8 mathematics class. Twenty states had over half their students using calculators in math class almost every day. Over 76 percent of grade 8 students use calculators at least once a week, which increased from 53 percent in 1992. States with the greatest calculator use at grade 8 in 1996 were: Alaska, Department of Defense Schools, Iowa, Michigan, Minnesota, Missouri, Montana, Nebraska, Oregon, Utah, and Wisconsin.

In 1992, only 18 percent of grade 4 students across the United States were reported by their teachers as using calculators in math class once per week or more. From 1992 to 1996 the rate increased to 34 percent. Eight states had over 50 percent of their fourth-grade students using calculators in class at least once per week in 1996.

The TIMSS teacher questionnaire in mathematics includes items about how calculators are used in class. These data may be helpful for educators to analyze in relation to teaching practices. The questions in TIMSS ask for the frequency of calculator use for the following activities: (a) checking answers; (b) tests and exams; (c) routine computation; (d) solving complex problems; and (e) exploring number concepts.

CLASSROOM ACTIVITIES ANALYZED IN TIMSS

The students in the TIMSS study reported on a wide range of classroom activities using the responses "most lessons," "some lessons," or "never." Some of the class activities reported most often by U.S. students are shown in Figure 27.

The data on classroom activities indicate that in most lessons, teachers show students how to solve problems and students do individual "seat work" at both grades 4 and 8. Half of grade 8 students report they begin homework in class, but only one fourth are asked to apply everyday life in solving math problems.

The TIMSS teacher questionnaire asked for details about what students are expected to do in class. Teachers in most classes at both levels expect students to be able to explain the reasoning behind an idea in mathematics. Only about a third of the teachers expect students to write equations to solve problems, and only one in ten expects students to work on problems that do not have an immediate solution.

FIGURE 27: CLASSROOM ACTIVITIES USED IN "MOST LESSONS," TIMSS

Class Activity                            Grade 4    Grade 8
Teacher shows how to do problems          73%        78%
Individual worksheets or textbook use     55         59
Have a quiz or test                       48         39
Apply everyday life in solving problems   37         23
Begin homework in class                   36         50

Source: TIMSS Student Questionnaire

TEACHER EXPECTATIONS OF STUDENTS IN "MOST LESSONS," TIMSS

Expectation                                    Grade 4    Grade 8
Explain reasoning behind an idea               71%        67%
Work on problems with no immediate solution    7          12
Write equations to solve exercises/problems    28         38
Practice computational skills                  70         59

Source: TIMSS Teacher Questionnaire


CONDITIONS FOR TEACHING: A KEY OPPORTUNITY TO LEARN MEASURE

A problem often raised by teachers with regard to improving instruction, or support for their role as teachers, is the need for appropriate textbooks and materials. The issue is often not only the adequacy of the materials themselves but also teachers' views on the degree of support they receive from their school or district when they request assistance. The NAEP mathematics teacher questionnaire includes a question about teachers' views on the availability of instructional materials and resources. The NAEP teachers were asked, "How well are you provided with instructional materials and resources you need to teach?"

The results shown in Figure 28 indicate the availability of materials and resources to teachers at grade 4 for mathematics. The availability of resources does not by itself explain whether teachers are providing quality teaching, but it does provide evidence about one indicator of good teaching and high student performance. In all but one of the high-performing states, the percentage of students whose teachers receive all or most of the materials they need is significantly above the national average. The results indicate that teachers in high-performing states are more satisfied with the availability of materials and resources than teachers in low-performing states. Four of five low-performing states are at or below the national average with respect to this statistic.

NAEP teacher questionnaires provide state-level information about a number of school conditions that may be important for teaching mathematics, including availability of a curriculum specialist, support from parents as aides, student absenteeism, and preparation time for teachers. We have focused on data about materials and resources as one key indicator which appears related to student performance and could help mathematics educators continue to track and monitor the important elements of high quality teaching.


FIGURE 28: PERCENT OF STUDENTS WHOSE TEACHERS RECEIVE INSTRUCTIONAL MATERIALS AND RESOURCES THEY NEED, NAEP 1996, GRADE 4

High-Performing States    Receive Materials
                          Most/All    Some/None
CONNECTICUT               75%         25%
MINNESOTA                 72          28
MAINE                     67          33
WISCONSIN                 80          20
NEW JERSEY                73          27
TEXAS                     78          22
NATION                    66          34

Low-Performing States
MISSISSIPPI               68%         32%
LOUISIANA                 59          41
CALIFORNIA                58          42
ALABAMA                   64          36
SOUTH CAROLINA            66          34

Note: Standard errors from 2 to 4 percent.
Source: NAEP 1996 Mathematics Cross-State Compendium

SUMMARY ON OPPORTUNITY TO LEARN

The measures for studying opportunity to learn mathematics in NAEP and TIMSS show the differences between the two data sources. TIMSS was designed to provide many different measures of differences in classrooms and teaching in mathematics and science, as well as student learning in those subjects. NAEP was primarily designed to report on student achievement, and the number and types of measures of what happens in classrooms are limited. We did find that NAEP data are useful for examining differences across states on some key characteristics sensitive to policy and program change, such as teacher preparation, use of calculators, and classroom materials. Alternatively, TIMSS data provide much greater depth and detail about the content of mathematics that is taught and the different types of teaching practices that are used. With TIMSS, we can see nation-to-nation variations, but it is not possible to report state differences. It is possible to analyze school characteristics that are important to educators, such as location, background of students, and school and class size.


Major Differences Between NAEP and TIMSS and State Assessments

To provide context for our analysis of NAEP and TIMSS results, and to relate these results to the experience and knowledge of math educators, we can look at important differences in the assessments, including their purposes, structure, and development. Common understanding about how the tests are developed and how the results are reported will help educators and policymakers to better use these large-scale test results and compare the findings with results from their state and local mathematics assessments. Too often we begin to read and analyze statistical findings without a clear understanding and perspective on the sources of the data and what meaning can be taken from them.

PURPOSE OF THE ASSESSMENTS

The purposes need to be clearly stated and understood. Differing purposes for a student assessment study or program produce different frameworks for the content of the tests, different scope and size, different methods of selecting respondents, and varying methods of testing and collecting other research data.

NAEP. The National Assessment of Educational Progress provides a regular, periodic report on the extent of knowledge and skills of students in America's schools and tracks the progress of learning; NAEP is the "Nation's Report Card." Since the inception of NAEP under federal legislation and support in 1969, it has regularly reported on student performance in core academic subjects including mathematics, reading, science, history, and the arts. From the outset, NAEP tests were written from a framework developed independently from local curricula or textbooks and independently from state guidelines or standards for student learning. Subject specialists, educators, policymakers, parents, and other advisers develop the NAEP "assessment framework," which provides guidance for the development of the tests, scoring, and reporting of results. A representative sample of the nation's students is assessed in grades 4, 8, and 12, and starting in 1990, representative samples of students in each state, focusing on grades 4 and 8.

NAEP provides national and state-level indicators of student learning in math, disaggregation of summary scores by content topic (e.g., number sense, measurement), and a wide range of indicators of student background, teacher preparation, instruction, and school and classroom conditions. NAEP does not provide diagnostic data or curriculum analysis directly to teachers, schools, or local districts. A strength of NAEP is its capacity for providing a high-quality, comprehensive assessment of math learning which is independent of specific local curriculum, textbooks, or state policies and standards.

The NAEP assessment results and supporting questionnaires from students and teachers are based on a sample of 2,000 students per state at each assessed grade. The data do not provide a way for states to analyze student achievement for each school and district. The results, however, are still extremely valuable as indicators. NAEP results provide a way to monitor state progress in student achievement; to assess the education received by specific groups of students; and, very important, to determine the relationship of student achievement to characteristics of schools, classroom practices, and teachers, by state.
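The state sample size of 2,000 explains the few-point standard errors quoted under the figures earlier; a minimal sketch under the simplifying assumption of a simple random sample (NAEP actually uses a clustered design, so its published errors run somewhat larger):

```python
import math

def se_percentage_points(p, n):
    """Standard error, in percentage points, of a proportion p estimated
    from a simple random sample of size n. A simplification: NAEP's
    clustered sampling design yields somewhat larger errors than this."""
    return 100 * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for a 2,000-student state sample
print(round(se_percentage_points(0.5, 2000), 1))  # 1.1
```

An error of roughly one point at the student level, inflated by the clustered design and the smaller teacher samples, is consistent with the 1-6 point standard errors reported for the state estimates in this document.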

TIMSS. The purpose of the Third International Mathematics and Science Study was to measure the extent of student learning in mathematics and science in the 41 participating countries, and to determine the key factors explaining differences in student learning. The study design and U.S. data collection were supported by the National Science Foundation and the U.S. Department of Education. Student testing and data collection in schools were completed during the 1994-95 school year. The TIMSS research design included a study of countries' intended, or written, curricula; the implemented curriculum; the methods of instruction; and key variables about teachers, schools, and students. All of the participating countries had to agree to the consensus assessment framework for mathematics and science from which the tests were written, as well as the types of items to be used and the final test instruments. A representative sample of schools and classrooms was selected at each of the three grade levels of the study with the goal of providing fair and comparable samples of students for each country. In addition to student tests, TIMSS data collection included teacher questionnaires, school questionnaires, curriculum document analysis, a videotaped classroom study, and case studies.

A main strength of TIMSS as compared to NAEP is the time and effort devoted to analysis of the curriculum and methods of instruction in mathematics and science. A detailed methodology was developed for coding, categorizing, and analyzing each nation's curriculum standards, guides, and textbooks. For the U.S., the analysis was based on a representative sample of documents from states. The videotaping of instruction in a sample of 8th grade mathematics classrooms in three countries (U.S., Japan, Germany) was a second key feature of the TIMSS analysis. And, third, the TIMSS questionnaires for teachers provide much greater detail and depth about what teachers cover in math and science and what instructional practices they use to teach specific content. The main TIMSS study did not provide results by states in the U.S., but several states (Colorado, Minnesota, Missouri, and Oregon) chose to conduct the TIMSS study, and their results can be compared to the U.S. and other countries.

HOW DO THE PURPOSES OF NAEP AND TIMSS COMPARE TO STATE ASSESSMENTS?

Each state selects or develops its own assessment of learning in mathematics, generally under state law or mandate. In 1996-97, 45 states administered a state mathematics test to almost all students at one or more grade levels (CCSSO, Key State Policies, 1998a). State assessments have a variety of stated purposes, according to state directors, including accountability for schools and districts, instructional improvement, monitoring student progress, and certifying students for graduation (CCSSO, SSAP, 1998b).

The high-priority purpose that distinguishes state tests is accountability: state law requires testing and reporting the results by school and district, and in some cases by classroom or student. State assessments are given annually and, except in two states, include all students in selected grades. The great majority of states now produce a report for each school and district in the state, and many also provide reporting by classroom and individual student (CCSSO, 1998c). States report their test results within six to nine months of students taking the test.

Most state assessment programs in mathematics do not have the degree of emphasis on multiple methods of assessment found in NAEP, where 50 percent of the test score is based on open-ended or constructed-response questions. Recent information shows that 14 states do have some mathematics exercises requiring open-ended, extended responses and 12 states ask for short-answer responses (CCSSO, SSAP, 1998b). State programs do not typically include teacher, student, and school questionnaires, as in NAEP and TIMSS, that allow for detailed analysis or explanation of assessment results, although more states are now collecting some supporting data on instructional practices received by students.

Neither NAEP nor TIMSS has specific consequences or "high stakes" for students, teachers, or schools. NAEP and TIMSS results are used by some states as an important reference point about student learning, and the teacher, classroom, and school background data have been used by many states.

Participation in NAEP assessments raises issues for schools. Some states report that participation in NAEP by the state or cooperation of selected schools is a problem because the administration time adds to that of their own high-stakes tests. Also, questions have been raised about the motivation of students to do well on NAEP and other special assessments such as TIMSS, particularly for students in grade 8 or 12 who might be aware that their scores do not reflect on their own performance or that of the school.


IMPROVING MATHEMATICS EDUCATION USING RESULTS FROM NAEP AND TIMSS


ASSESSMENT FRAMEWORKS

An "assessment framework," as used with NAEP and TIMSS, provides guidelines for item construction and test development which define the subject content and expected student abilities or capacities.

NAEP. The NAEP Assessment Framework is developed under the National Assessment Governing Board, a federally supported body with appointed members representing policymakers, educators, researchers, and constituents. The Mathematics Framework for 1996 (NAGB, 1996) was written by an advisory group of 15 mathematics educators and mathematicians.

The NAEP frameworks after 1990 were strongly influenced by the NCTM mathematics standards (NCTM, 1989; CCSSO, 1988), and the recent NAEP assessments incorporate more open-ended and constructed-response items, in large part to match the content and expected student performance set in the assessment framework. Federal funding support has increased to match the costs of developing a high-quality test, including the development, piloting, and scoring processes.

The NAEP Mathematics Assessment for 1996 included items and problems from five framework mathematics strands and two domains that cross the strands:

CONTENT STRANDS:

Number sense, Measurement, Geometry, Data/Statistics, and Algebra/Functions

DOMAINS:

Mathematical Abilities (Conceptual understanding, Procedural knowledge, Problem solving); and Mathematical Power (Reasoning, Connections, Communications)

(Reese, et al., 1997, p. 2)

TIMSS. The TIMSS Assessment Framework was developed by an international panel of scholars and educators in mathematics and science. Draft frameworks and content category descriptions were reviewed by oversight committees representing participating countries. Countries were asked to determine if their curriculum and instruction matched the Framework. Items to match the TIMSS framework were drafted and submitted by any participating country, and items were reviewed by all participating countries to determine the validity of the test in relation to their curricula. The TIMSS framework for mathematics had eight Content categories and five Performance expectations categories:

CONTENT:

Numbers, Measurement, Geometry, Proportionality, Functions/relations/equations, Data representation/probability/statistics, Elementary analysis, and Validation and structure

PERFORMANCE EXPECTATIONS:

Knowing, Using routine procedures, Investigating and problem solving, Mathematical reasoning, and Communicating

(Robitaille, et al., 1993)

FIGURE 29: DISTRIBUTION OF TEST ITEMS BY CONTENT STRAND, GRADE 8: NAEP, 1996 AND TIMSS

NAEP Strand (% of Items)                            TIMSS Category (% of Items)
Number Sense, Properties, Operations (25%)          Proportionality/Ratios (7%); Fractions and Number Sense (34%)
Measurement (12%)                                   Measurement (12%)
Geometry and Spatial Sense (20%)                    Geometry (15%)
Data Analysis, Statistics, and Probability (15%)    Data Representation, Analysis and Probability (14%)
Algebra and Functions (25%)                         Algebra (18%)

Sources: Reese, 1997; NCES, 1996a


The content strands for the two frameworks were quite similar. However, the distribution of items across content strands differed significantly. The TIMSS math test for grade 8 had slightly less algebra than NAEP and slightly less geometry. TIMSS placed greater emphasis on testing fractions and proportionality and ratios. The two tests had about the same emphasis on data, statistics, and probability.

Another important difference between the mathematics tests was the types of items. On the NAEP math assessment in 1996, 50 percent of the questions were multiple choice and 50 percent were constructed response, including short-answer questions and problems requiring extended written responses and explanations. The NAEP items were written and scored to count toward more than one content strand, and each item was coded for the domain of math abilities or power that was tested. On the TIMSS mathematics test, 75 percent of items were multiple choice and one-fourth were constructed response, including short answer and extended response.

STATE FRAMEWORKS. Recent reports on state policies and reform initiatives (Zucker et al., 1998; CCSSO, Key State Policies, 1998a) indicate that states are actively working to "align" their assessment programs in mathematics and science to the new content standards developed in almost every state since 1993. A recent review of the main categories and structure of state standards and frameworks in a CCSSO study (Blank et al., 1997b) reveals that most states do have content topics and expectations for students that are modeled after the NCTM curriculum standards, and generally include the five content strands and knowledge domains outlined in NAEP and TIMSS. However, we cannot say how the assessment frameworks or the actual assessments in mathematics conducted by states match what is set out for NAEP and TIMSS.

States may want to examine the assessment frameworks used by NAEP and TIMSS. The assessment frameworks allow the public, educators, and policymakers to determine how the assessment will be written, usually providing much more specificity than either state standards or local curriculum.

The process of aligning assessments to state standards is an important task being carried out by many states, and the assessment framework step provides a way to focus the test development effort toward each priority of the standards, both in breadth and depth of coverage. In fact, alignment of assessments to standards has gained a new level of importance as state content standards have been approved as the means for assuring accountability of schools and districts. More detailed, systematic approaches to alignment analysis that can be applied to state standards are being tested with states by CCSSO (Webb, 1997) and by Achieve.

REPORTING ASSESSMENT RESULTS

NAEP assessment results are initially released to the public, press, states, and others in the NAEP Report Card. It includes the test results for the nation and states for three grade levels and a report for each state for the grades tested (e.g., grades 4 and 8 in 1996). The Report Card is released by NCES about one year after the test administration. The results are disaggregated by student demographic characteristics. Subsequent reports, such as the NAEP 1996 Mathematics Cross-State Data Compendium (Shaughnessy, et al., 1997), have provided data by state for all of the questionnaire items and further disaggregation of test results. Other research reports and analyses for NAEP are supported and produced by NCES.

TIMSS test results were released to the public and participating countries for math and science separately by grade level: eighth grade in fall 1996, fourth grade in spring 1997, and twelfth grade in fall 1997. The international reports (e.g., Beaton, et al., 1996) were released simultaneously with a report focusing on U.S. results from NCES (e.g., 1996a). Separate reports were produced and released on the TIMSS curriculum documents analysis, Many Visions, Many Aims (Schmidt, McKnight, et al., 1996), the video study, and the case studies (see the NCES website for a list of reports, www.ed.gov/NCES). Data from questionnaires and test item results were made available on the Internet and by compact disc.

State assessment programs are typically instruments for accountability of the education system. Test results are reported by school for purposes of accountability in 37 states (CCSSO, SSAP, 1998). These reports are aimed at examining the degree of education progress of each school. State tests also have implications for individual students. In 21 states high school students must pass a state assessment or "exit exam" prior to graduation (CCSSO, Key State Policies, 1998). High priorities for states are rapid scoring of the items, analysis and reporting that can be provided to policymakers and the public efficiently, and methods for reporting and tracking progress of schools and districts.

In 11 states, school-level test scores are used in the school accreditation process, and 10 states provide school awards or recognition (CCSSO, SSAP, 1998). Additionally, over 20 states have planned program interventions or sanctions for schools that do not show improvement in student achievement over a multi-year period. State assessment programs place emphasis on rapid turnaround so data can be used by educators at all levels, and they are concerned about holding down costs per student, particularly because state programs emphasize testing the universe of students. A new trend in state assessment scoring and reporting is to indicate the specific subject content or standard for learning that has been assessed and how schools, classrooms, and students performed against each standard.

USE OF ACHIEVEMENT/PROFICIENCY LEVELS IN REPORTING

NAEP achievement levels are descriptions of what students should know and be able to do in mathematics at each grade level. Under supervision of the National Assessment Governing Board, three levels were defined for each grade level: Basic, Proficient, and Advanced. A group of 75 educators, citizens, and mathematicians participates in the level-setting process by rating the kinds of assessment exercises given in NAEP for what students should know and be able to do. The NAEP achievement levels are set prior to the test, and they are established separately from the items and student scores for a given year's assessment. Each of the achievement levels is defined more narrowly for each grade level, in terms of the particular mathematical concepts and skills that apply. For example, below is the definition of the Basic level for grade 8:

Eighth grade students performing at the Basic level should exhibit evidence of conceptual and procedural understanding in the five NAEP content strands. This level of performance signifies an understanding of arithmetic operations, including estimation, on whole numbers, decimals, fractions, and percent.

Eighth graders performing at the Basic level should complete problems correctly with the help of structural prompts such as diagrams, charts, and graphs. They should be able to solve problems in all NAEP content strands through the appropriate selection and use of strategies and technological tools, including calculators, computers, and geometric shapes. Students at this level also should be able to use fundamental algebraic and informal geometric concepts in problem solving.

As they approach the Proficient level, students at the Basic level should be able to determine which of the available data are necessary and sufficient for correct solutions and use them in problem solving. However, these eighth graders show limited skill in communicating mathematically (Reese, et al., 1997).

Several major advantages are offered in using the NAEP levels. First, NAEP scores are more understandable and interpretable by educators and the public when reported according to written standards for what is expected of students at a given grade level. We know what percentage of students meet expected standards and what percentage are still below the standard. The achievement levels are widely used in reporting and analyzing NAEP results. CCSSO incorporated the NAEP levels in reporting state mathematics and science indicators starting in 1993, and other organizations such as the National Education Goals Panel have used the percentage of students meeting these levels as key indicators.

The NAEP scale is a composite of the five content strands measured in the mathematics assessment. Student responses to each question are analyzed to determine the percentage of students responding correctly (for multiple-choice) and the percentage of students responding in each of the score categories (for regular and extended constructed response). NAEP uses item response theory (IRT) scaling methods to produce an overall composite mathematics score for the nation and each state by grade level, and scale scores are produced for each math content strand. TIMSS used IRT scaling to produce overall mathematics and science scale scores for each participating country. The IRT scaling method produces a score by averaging the responses of each student, taking into account the difficulty of each item. NAEP results are reported on a composite scale from 0 to 500 that allows comparisons of scores by state, type of school, or a variety of other disaggregations. TIMSS results are reported for each grade level on a composite scale from 0 to 800. TIMSS also reported each country's "average percent correct" for each separate content category on the Mathematics test, such as fractions and proportionality.
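The idea of difficulty-weighted IRT scoring can be illustrated with a minimal one-parameter (Rasch) model. This is a simplified sketch, not the operational NAEP or TIMSS procedure (which uses richer models, conditioning, and plausible values); the item difficulties, student responses, and scale transformation below are invented for illustration:

```python
import math

def rasch_prob(theta, difficulty):
    """P(correct) for a student of ability theta on an item of the given
    difficulty, under the one-parameter (Rasch) IRT model."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def estimate_ability(responses, difficulties, steps=200, lr=0.1):
    """Maximum-likelihood ability estimate via simple gradient ascent.
    responses: 0/1 item scores; difficulties: matching item difficulties."""
    theta = 0.0
    for _ in range(steps):
        # Gradient of the Rasch log-likelihood: sum of (observed - expected).
        grad = sum(r - rasch_prob(theta, d)
                   for r, d in zip(responses, difficulties))
        theta += lr * grad
    return theta

# Hypothetical five-item block, ordered easy to hard.
difficulties = [-1.5, -0.5, 0.0, 0.8, 1.6]
student_a = [1, 1, 1, 0, 0]   # correct on the three easiest items
student_b = [0, 1, 1, 1, 1]   # misses only the easiest item

theta_a = estimate_ability(student_a, difficulties)
theta_b = estimate_ability(student_b, difficulties)

# Map ability onto a 0-500 reporting scale (an arbitrary linear
# transformation chosen here only for illustration).
def to_scale(theta):
    return 250.0 + 50.0 * theta

print(round(to_scale(theta_a)), round(to_scale(theta_b)))
```

Because the estimate accounts for which items a student answered, students who took booklets of different difficulty can be placed on the same reporting scale, which is what makes matrix-sampled designs scorable at the group level.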

IRT scaling and matrix sampling of items for students allow NAEP and TIMSS to test a wide range of mathematics content by including many more items in the total assessment pool. NAEP emphasizes reporting of student performance for the nation, by state, and for a variety of subpopulations, in relation to proficiency levels for expected mathematics knowledge and skills. Scale scores provide efficient comparisons across groups but also can be linked to expected mathematics knowledge and ability. In 1996 the NAEP mathematics report provided a unique display of scale score results by showing the kinds of math problems a student could do at any given scale score. The graphic is reproduced in the Appendix.

Large-scale student tests typically have been scored and reported using a "norm-referenced" approach, in which student scores are mainly interpreted relative to the performance of other students, other groups of students, or other states or types of schools. The "norms" for such a test are generally set by testing a nationally representative sample of students. Any group or individual can be compared to this sample score.

States have moved toward use of achievement, or proficiency, levels for state assessment programs. As of the 1996-97 school year, 39 of the 45 states with state mathematics assessments had defined performance levels for scoring and reporting (CCSSO, SSAP, 1998). The states' move toward use of performance levels is encouraged by requirements for accountability under federal Title I law. By 2000 all states will need to show the relationship between their content standards for mathematics and reading and the state assessment, and each state will need to report scores using three or more reporting levels to track progress. The intended focus of Title I programs and accountability is to improve the performance of schools that serve low-income students, the focus of Title I funds.

A significant advantage of performance levels is placing focus on the proportion of students in a school, district, or state that have met a set level of expected performance in the subject area, and then monitoring the extent to which the percentage of students meeting the level increases over time. This is a "growth-based" model for analyzing assessment results rather than an analysis based on absolute or relative scores. The state can determine whether students are accomplishing expected knowledge and skills, and whether performance of schools is improving. A focus on growth over time, using set performance standards for all students and schools, redirects priority and function to an accountability system and assessment results and away from the simple use of scores to rank or rate schools, districts, or students within the system.
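The growth-based reading of results can be made concrete with a small sketch. The cut score and school scores below are invented for illustration; the point is that progress is tracked as the change in the percentage of students at or above the level, not as a ranking by average score:

```python
# Hypothetical cut score for "meets the expected performance level".
CUT_SCORE = 299

# Invented scale scores for one school in two assessment years.
scores_by_year = {
    1996: [251, 270, 288, 301, 305, 312, 264, 295, 330, 282],
    2000: [262, 281, 299, 310, 306, 321, 270, 303, 334, 290],
}

def percent_meeting(scores, cut=CUT_SCORE):
    """Percentage of students scoring at or above the cut score."""
    return 100.0 * sum(s >= cut for s in scores) / len(scores)

for year in sorted(scores_by_year):
    pct = percent_meeting(scores_by_year[year])
    print(year, f"{pct:.0f}% at or above the level")
# Growth is read as the change in this percentage over time,
# not as a ranking of schools by average score.
```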

SAMPLING VS. ALL STUDENTS

NAEP and TIMSS results are based on scores from a representative sample of students in each participating entity (state or nation). Both use matrix sampling and up to eight different test booklets, thus allowing better coverage of the full mathematics framework. NAEP and TIMSS are much more likely to cover the content and expected abilities and skills called for in the assessment framework than traditional test designs and many of the current state assessments. The scoring and scaling of results provides weighting of scores for items with different complexity, difficulty, and item design, e.g., constructed response vs. multiple choice.

With matrix sampling, students are administered different combinations of test items, with the versions matched on item difficulty, content, and design. For example, the NAEP scores that are produced for a content strand such as algebra are an aggregation of student answers across all the different test versions. Matrix sampling increases the validity and reliability of state and national scores in relation to the assessment framework. The main disadvantage is that individual student, classroom, and school results cannot be produced and scores cannot be compared at these levels.
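A toy sketch of this booklet design, with invented items and booklets (real matrix-sampling designs use balanced incomplete block plans and sampling weights), shows why group-level strand scores can be produced even though no student takes every item:

```python
import random
from collections import defaultdict

# Hypothetical item pool, each item tagged with a content strand.
ITEM_POOL = {
    "alg1": "algebra", "alg2": "algebra", "alg3": "algebra",
    "geo1": "geometry", "geo2": "geometry",
    "num1": "number", "num2": "number",
}

# Each booklet holds only part of the pool; together the booklets
# cover every item, so the framework is covered without any one
# student taking a test of the full pool's length.
BOOKLETS = [
    ["alg1", "geo1", "num1"],
    ["alg2", "geo2", "num2"],
    ["alg3", "geo1", "num2"],
]

def administer(n_students, p_correct=0.6, seed=1):
    """Randomly assign booklets and simulate 0/1 item scores."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_students):
        booklet = rng.choice(BOOKLETS)
        results.append({item: int(rng.random() < p_correct)
                        for item in booklet})
    return results

def strand_percent_correct(results):
    """Aggregate answers across all booklets into one group-level
    score per content strand (e.g., algebra), as the text describes."""
    correct = defaultdict(int)
    attempted = defaultdict(int)
    for answers in results:
        for item, score in answers.items():
            strand = ITEM_POOL[item]
            attempted[strand] += 1
            correct[strand] += score
    return {s: correct[s] / attempted[s] for s in attempted}

scores = strand_percent_correct(administer(1000))
print(scores)  # group-level estimates; no individual-level scores exist
```

The aggregation step also makes the stated disadvantage visible: each student answered only three items, far too few to support an individual, classroom, or school score.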


Implications for Mathematics Education

From the analysis given here of the results from the most recent administration of NAEP, the state-level NAEP, and TIMSS, we can find several important messages. We have seen that there has been some improvement over time in the mathematics achievement of U.S. students, and a number of states are showing growth in student achievement. Yet the international picture is sobering, and reminds us that we are far from the goal of enabling all of our students to experience a quality education in mathematics. Some of the specific results noted in this analysis point to several major areas that need improvement. We can categorize those areas as: changing the emphasis on what is taught, attending to how it is taught, and improving the preparation of teachers.

WHAT IS TAUGHT: THE NEED FOR MORE DEPTH IN CONTENT

While basic computational skills seem to be fairly strong, the curriculum at all grades needs to put more emphasis on measurement, number sense, geometry, and proportionality. The critical factor in all of this is that the topics should not be taught in a shallow, "scattershot" approach. Fewer topics should be taught in more depth, at all levels, and with less repetition. This is especially true for the middle grades. Performance expectations need to be set higher at all levels.

Many traditional textbook series in the U.S. have placed too much emphasis on arithmetic skills as the primary target of the curriculum in grades K-8. Newer curricula, such as those developed with funding from the National Science Foundation, integrate mathematics topics from across the curriculum, emphasizing conceptual understanding. Many of these materials embed the mathematics in real contexts, where the required skills and procedures are learned as tools for solving non-routine problems.


Finally, NAEP results show clearly that the mathematics coursework that students receive is directly related to their performance. Students who are taught a more challenging curriculum in middle and high school mathematics reach higher levels of achievement.

HOW IT IS TAUGHT: THE NEED FOR MORE MATHEMATICAL PROCESSES AND DEPTH OF UNDERSTANDING

Students are reasonably successful with basic, one-step problems and routine procedures. What they need are more experiences with non-routine problem-solving situations and more opportunities to apply their skills in real-world contexts. Teachers need to choose more tasks for students that require mathematical reasoning, making conjectures, and justifying their answers. Students need more opportunities to learn to communicate mathematically, through listening, speaking, arguing, writing, reading, and explaining. Communication skills should be central to the activities in every mathematics classroom, and not simply relegated to the ubiquitous direction of "show your work." In addition, the emphasis should be on understanding concepts, rather than memorizing procedures.

TEACHER PREPARATION

If teachers are to choose the kinds of tasks described above for their students, that is, tasks that demand solving contextualized, novel problems, mathematical reasoning, and mathematical communication, the demands on teachers' knowledge are much greater than if the teacher merely demonstrates to students how to carry out routine procedures. Teachers must have sufficient depth and breadth in content knowledge to feel comfortable facilitating discussions about mathematical concepts. Often such discussions take both teachers and students into new and perhaps unfamiliar mathematical territory. A teacher whose own background and knowledge of mathematics is shallow or uneven will not feel confident about such explorations, and will tend to retreat to more familiar and less demanding tasks. Therefore, the professional development of teachers is a major concern in bringing about these changes in the way mathematics is taught.

The quality of professional development opportunities is key. Rather than attending one-day "make and take" workshops that are designed only to offer "fun" activities, teachers need ongoing, serious work in developing breadth and depth in content knowledge and pedagogical skills. They need opportunities to examine student work on extended constructed-response tasks, and to discuss with other teachers how to incorporate more communication and higher-order thinking into their courses. On a regular basis teachers should be discussing teaching and learning with their colleagues, by visiting other classes, examining student products, and viewing videos of classrooms. The released tasks and scoring guides from NAEP and the toolkit from TIMSS can be valuable sources of materials for professional development.

Conclusion

Educators and policymakers are constantly faced with findings from yet another national, state, or international study that calls the public's attention to the relative failings or successes of our schools. State assessment programs, NAEP, and TIMSS represent major investments in education research and public accountability for K-12 education. They bring a powerful focus on central questions about the health of our education system. Teachers and local educators, especially, may be suspicious of the findings of these studies because they receive significant attention for short periods of time without sustained followup for educators, and because they often are not accompanied with details on how the studies were conducted or with relevant information that can be applied to day-to-day teaching and learning.

This paper has demonstrated that a more complete and detailed picture of mathematics education in the U.S. can be obtained by studying the details and meaning behind the averages and overall trends typically reported about NAEP and TIMSS. We have shown how educators' decisions about what and how to teach make a great difference in student performance. We also have shown how averages and national and state summary statistics can mask both improvements and problems. For example, our research shows that improvement on the NAEP mathematics assessment shown in student knowledge of number sense and operations must be weighed against poor proficiency in measurement and geometry. Similarly, the difficulties U.S. students have shown in answering non-routine problems requiring open-ended responses should be faced in light of the significant gains by students at all achievement levels in basic operations and procedures. The analysis and interpretation of student difficulties with the selected NAEP and TIMSS exercises are the kind of item analysis that will help highlight instructional improvements. The sample exercises provide examples of the kinds of broad examination of student knowledge and abilities that are offered in these assessments, and illustrate ways that classroom assessments and local accountability tests can be improved.

The analysis and comparison of U.S. results on NAEP mathematics and TIMSS help educators see key differences between the achievement tests and accompanying data for interpreting the findings. We have shown some differences between the assessment frameworks and the composition of items. Also, we have examined the use of achievement levels and scale scores for reporting NAEP assessments, and shown some of the advantages and disadvantages of how the results are reported. It is important for state and local educators to be able to compare how their own tests are constructed and reported in relation to NAEP and TIMSS. The results are not generally directly comparable, but it is critical to see how the methods and emphases of these tests can be explained in relation to other results used by educators.

We have illustrated some methods of analyzing and disaggregating NAEP mathematics results to examine educational progress for different student populations and for schools and classrooms with varying characteristics. Patterns of performance vary widely at one time, and the extent of improvement over time differs. Expertise in monitoring trends for key target groups is required to make effective use of sample-based data. Issues of opportunity to learn are critical in the current era of standards-driven education reform. For all students to attain challenging standards for mathematics requires analysis of differences in opportunities in curriculum, teaching, and teacher preparation. Adequate measures of how students are offered opportunities in mathematics will be needed to plan K-12 programs aimed toward high standards. NAEP and TIMSS offer excellent resources of measures of mathematics opportunity to learn. We observed limitations of each study, and also identified fruitful measures that could be incorporated into assessment programs and evaluation studies at local and state levels.

Finally, we have offered a list of improvements that we believe are implied by the combined results of NAEP and TIMSS. We have presented specific suggestions in three critical areas of mathematics education: curriculum, teaching, and professional development. There are some clear messages from these two assessments that point to specific areas in need of attention. We believe that all U.S. students deserve the chance to experience a quality mathematics curriculum that is taught in a way that promotes understanding. To achieve this goal will mean making some changes to what mathematics is taught, how it is taught, and how teachers are supported.


Appendix: Map of Selected Questions on the NAEP Mathematics Scale for Grade 8

NAEP scale (0 to 500), with grade 8 achievement-level cut scores: Advanced 333, Proficient 299, Basic 262.

Use scale drawing to find area (375)
List all possible outcomes (371)
Compare areas of two figures (362)
Find equivalent term in number pattern (344)
Find central angle measure (337)
Find remainder in division problem (332)
Determine whether ratios are equal (329)
Use scale drawing to find distance (328)
Write word problem involving division (323)
Identify function from table values (318)
Reason about magnitude of numbers (314)
Read measurement instrument (314)
Draw lines of symmetry (311)
Compute using circle graph data (311)
Multiply two integers (302)
Find location on a grid (299)
Graph linear inequality (297)
Solve literal equation (294)
Interpret remainder in division (293)
Understand sampling technique (289)
Identify acute angles in figure (286)
Use pattern to draw path on grid (282)
Solve problem involving money (279)
Identify fractional representation (278)
Partition area of rectangle (272)
Use ruler's nonzero origin to find length (270)
Identify solution for linear inequality (265)
Find area of figure on a grid (257)
Use multiplication to solve problem (254)
Round decimals to nearest whole numbers (246)
Partition area of hexagon (245)
Find coordinate on number line (231)

Note: Position of questions is approximate, and an appropriate scale range is displayed for grade 8.

Source: NAEP 1996 Mathematics Report Card

THE COUNCIL OF CHIEF STATE SCHOOL OFFICERS STATE EDUCATION ASSESSMENT CENTER


References

American Association for the Advancement of Science (1993). Benchmarks for Science Literacy. New York: Oxford University Press.

Barton, P.E. & Coley, R.J. (1998). Growth in School: Achievement Gains from the Fourth to the Eighth Grade. Princeton, NJ: Educational Testing Service, Policy Information Center.

Beaton, A.E., Mullis, I.V.S., Martin, M.O., Gonzalez, E.J., Kelly, D.L., & Smith, T.A. (1996, 1997, 1998). Mathematics Achievement in the Middle School Years; Mathematics Achievement in the Primary School Years; Mathematics and Science Achievement in the Final Year of Secondary School: IEA's Third International Mathematics and Science Study (TIMSS). Chestnut Hill, MA: Boston College.

Blank, R.K. & Langesen, D. (1997A). State Indicators of Science and Mathematics Education 1997: State-by-State Trends and New Indicators from the 1995-96 School Year. Washington, DC: CCSSO, State Education Assessment Center.

Blank, R.K., Langesen, D., Bush, M., Sardina, S., Pechman, E., & Goldstein, D. (1997B). Mathematics and Science Content Standards and Curriculum Frameworks: States' Progress on Development and Implementation. Washington, DC: CCSSO, State Education Assessment Center.

Blank, R.K., Manise, J.G., & Brathwaite, B.C. (1997C). State Education Indicators with a Focus on Title I. Washington, DC: CCSSO, State Education Assessment Center.

Campbell, J.R., Voelkl, K.E., & Donahue, P.L. (1997). NAEP 1996 Trends in Academic Progress. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Carpenter, T., Corbitt, M., Kepner, H., Lindquist, M., & Reys, R. (1981). Results from the Second Mathematics Assessment of the National Assessment of Educational Progress. Reston, VA: National Council of Teachers of Mathematics.

Council of Chief State School Officers, State Education Assessment Center (1988). Final Report: Assessing Mathematics in 1990 by the National Assessment of Educational Progress. Washington, DC: Author.


Council of Chief State School Officers, SCASS Science Education (1997). How is Science Taught in Schools? A Five-State Preliminary Study on the Use of Enacted Curriculum Surveys in Science. Washington, DC: Author.

Council of Chief State School Officers, State Education Assessment Center (1998A). Key State Education Policies on K-12 Education. Washington, DC: Author.

Council of Chief State School Officers, State Education Assessment Center (1998B). Annual Survey of State Student Assessment Programs, 1996-97 School Year. Washington, DC: Author.

Council of Chief State School Officers, State Education Assessment Center (1998C). State Education Accountability Reports and Indicator Reports: Status of Reports Across the States 1998. Washington, DC: Author.

Darling-Hammond, L. (1995). The Role of Teacher Expertise and Experience in Students' Opportunity to Learn. Pp. 19-24 in Strategies for Linking School Finance and Students' Opportunity to Learn, edited by Patricia Brown. Washington, DC: National Governors' Association.

Dossey, J., Mullis, I., & Jones, C. (1993). Can Students Do Mathematical Problem-Solving? Results from Constructed-Response Questions in NAEP's 1992 Mathematics Assessment. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Education Trust (1998). Education Watch, 1998 State and National Data Book, Vol. II. Washington, DC: Author.

Elliott, M. (1998). School Finance and Opportunities to Learn: Does Money Well Spent Enhance Students' Achievement? Sociology of Education, Vol. 71, Number 3.

Gamoran, A., Porter, A.C., Smithson, J., & White, P.A. (1997). Upgrading High School Mathematics Instruction: Improving Learning Opportunities for Low-Achieving, Low-Income Youth. Educational Evaluation and Policy Analysis, Vol. 19, Number 4.

Goodlad, J.I. (1984). A Place Called School: Prospects for the Future. New York: McGraw-Hill.


Grissmer, D. & Flanagan, A. (1998). Improving the Data and Methodologies in Educational Research. Paper presented at the USED/RAND conference on Analytical Issues in the Assessment of Student Achievement. Washington, DC: RAND.

Hoffer, T.B., Rasinski, K.A., & Moore, W.M. (1995). Social Background Differences in High School Mathematics and Science Course Taking and Achievement. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Horn, L. & Hafner, A. (1992). A Profile of American Eighth-Grade Mathematics and Science Instruction. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Husen, T. (Ed.) (1967). International Study of Achievement in Mathematics: A Comparison of Twelve Countries (Vols. 1 and 2). New York: John Wiley.

Ingersoll, R.M. & Gruber, K. (1996). Out-of-Field Teaching and Educational Quality. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Jones, L.V., Davenport, E.C., Bryson, A., Bekhuis, T., & Zwick, R. (1986). Mathematics and Science Test Scores as Related to Courses Taken in High School and Other Factors. Journal of Educational Measurement, 23 (3), 197-208.

Kenney, P. & Silver, E. (1997). Results from the Sixth Mathematics Assessment of the National Assessment of Educational Progress. Reston, VA: National Council of Teachers of Mathematics.

Lee, V.E., Bryk, A.S., & Smith, J.B. (1993). The Effects of High School Organization on the Equitable Distribution of Learning in Mathematics and Science. Sociology of Education, 70 (2). Washington, DC: The American Sociological Association.

McDonnell, L. (1995). Opportunity to Learn as a Research Concept and a Policy Instrument. Educational Evaluation and Policy Analysis, 17: 305-22.

McKnight, C.C., Crosswhite, F.J., Dossey, J.A., Kifer, E., Swafford, J.O., Travers, K.J., & Cooney, T.J. (1987). The Underachieving Curriculum: Assessing U.S. School Mathematics from an International Perspective. Champaign, IL: Stipes Publishing.

Monk, D.H. (1993). Subject Area Preparation of Secondary Mathematics and Science Teachers and Student Achievement. Ithaca, NY: Cornell University, Department of Education.

Mullis, I.V.S., Dossey, J.A., Owen, E.H., & Phillips, G.W. (1993). NAEP 1992 Mathematics Report Card for the Nation and the States. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

National Assessment Governing Board (1996). Mathematics Framework for 1996 NAEP. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1996A). Pursuing Excellence: A Study of U.S. Eighth-Grade Mathematics and Science Teaching, Learning, Curriculum, and Achievement in International Context. Initial Findings from the Third International Mathematics and Science Study. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1996B). SASS by State, 1993-94 Schools and Staffing Survey. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1997A). Pursuing Excellence: A Study of U.S. Fourth-Grade Mathematics and Science Achievement in International Context. Initial Findings from the Third International Mathematics and Science Study. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1997B). Toward Better Teaching: Professional Development in 1993-94. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1997C). Condition of Education. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1997D). Videotape Classroom Study. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1998A). Pursuing Excellence: A Study of U.S. Twelfth-Grade Mathematics and Science Achievement in International Context. Initial Findings from the Third International Mathematics and Science Study. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (1998B). Linking the National Assessment of Educational Progress and the Third International Mathematics and Science Study: Eighth-Grade Results. Washington, DC: U.S. Department of Education.

National Council of Teachers of Mathematics (1989). Curriculum and Evaluation Standards for School Mathematics. Reston, VA: Author.


National Council of Teachers of Mathematics (1991). Professional Standards for Teaching Mathematics. Reston, VA: Author.

National Education Goals Panel (1998). Mathematics and Science Achievement State by State, 1998. Washington, DC: U.S. Government Printing Office.

National Research Council (1995). National Science Education Standards. Washington, DC: National Academy Press.

Oakes, J. (1990). Multiplying Inequalities: The Effects of Race, Social Class, and Tracking on Opportunities to Learn Mathematics and Science. Santa Monica, CA: The RAND Corporation.

Porter, A.C. (1995). Developing Opportunity-to-Learn Indicators of the Content of Instruction: Progress Report. Madison, WI: Wisconsin Center for Education Research.

Raudenbush, S.W. (1998). Synthesizing Results from the Trial State Assessment. Paper presented at the USED/RAND conference on Analytical Issues in the Assessment of Student Achievement. Ann Arbor, MI: University of Michigan, School of Education.

Reese, C.M., Miller, K.E., Mazzeo, J., & Dossey, J.A. (1997). NAEP 1996 Mathematics Report Card for the Nation and the States. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Robitaille, D.F., Schmidt, W.H., Raizen, S., McKnight, C., Britton, E., & Nicol, C. (1993). Curriculum Frameworks for Mathematics and Science. Vancouver, Canada: The Third International Mathematics and Science Study.

Rock, D., Braun, H.I., & Rosenbaum, P.R. (1985). Excellence in High School Education: Cross-Sectional Study, 1980-1982 (Final Report). Princeton, NJ: Educational Testing Service.

Schmidt, Cogan, et al. (1998). Unpublished data from TIMSS, Teacher Questionnaire, Populations 1 and 2.

Schmidt, W.H., McKnight, C.C., Valverde, G.A., Houang, R.T., & Wiley, D.E. (1996). Many Visions, Many Aims: A Cross-National Investigation of Curricular Intentions in School Mathematics. Kluwer Academic Publishers.


Sebring, P.A. (1987). Consequences of Differential Amounts of High School Coursework: Will the New Graduation Requirements Help? Educational Evaluation and Policy Analysis, Vol. 9, Number 3.

Shaughnessy, C.A., Nelson, J.E., & Norris, N.A. (1997). NAEP 1996 Mathematics Cross-State Data Compendium for the Grade 4 and Grade 8 Assessment: Findings from the State Assessment in Mathematics of the National Assessment of Educational Progress. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Silver, E.A., Alacaci, C., & Stylianou, D. (1998, in press). Students' Performance on Extended Constructed-Response Tasks. In Edward A. Silver and Patricia Ann Kenney (Eds.), Results from the Seventh Mathematics Assessment of the National Assessment of Educational Progress. Reston, VA: National Council of Teachers of Mathematics.

Shavelson, R., McDonnell, L., Oakes, J., & Carey, N. (1989). Indicator Systems for Monitoring Mathematics and Science Education. Santa Monica, CA: The RAND Corporation.

Travers, K.J. & McKnight, C.C. (1985). Mathematics Achievement in U.S. Schools: Preliminary Findings from the Second IEA Mathematics Study. Phi Delta Kappan, pp. 407-413.

Walberg, H.J. (1984). Improving the Productivity of American Schools. Educational Leadership, 41 (8).

Webb, N. (1997). Criteria for Alignment of Curricular Expectations and Assessments in Mathematics and Science Education. Washington, DC: CCSSO; National Institute for Science Education, University of Wisconsin-Madison.

Weiss, I.R. (1994). A Profile of Science and Mathematics Education in the United States: 1993. Chapel Hill, NC: Horizon Research, Inc.

Zucker, A., Shields, P., Humphrey, D., & Adelman (1998). SSI Strategies for Reform: Findings from the Evaluation of NSF's SSI Program. Palo Alto, CA: SRI International.


References for Sample Tasks

"ODOMETER"

Mitchell, J., Hawkins, E., Jakwerth, R, Stancavage, F.,

Dossey, J. (draft March 20, 1998). Student Work and

Classroom Practices in Mathematics, p. 76. NCES.

"SAM'S LUNCH"

Mitchell, J., Hawkins, E., Jakwerth, P., Stancavage, F.,

Dossey, J. (draft March 20, 1998). Student Work and

Classroom Praaices in Mathematics, p. 57. NCES.

"TWO GEOMETRIC SHAPES"

Released item from the NAEP 1996 Mathematics Report

Card for the Nation and the States: Findings from the

National Assessment of Educational Progress (4th

grade). Washington, D.C.: Office of EducationalResearch and Improvement, U.S. Department ofEducation.

"BICYCLE ACCIDENTS"

Business Coalition for Educational Reform (May, 1998).

Fornzukt for Sums: A Businers Leader's Guide to Sup-

porting Math and Science Achievement, p. 9. Wash-

ington, D.C.: U.S. Department of Education.

"GEOMETRIC PROOF"

Takahira, S., Gonzales, P, Erase, M., & Salganik, L (1998).

Pursuing Excellence: A Study of U.S. Twelfth-grade

Mathematics and Science Achievetnent in International

Context, p. 43. Washington, D.C.: U.S. Department

of Education.

5 4

"AMOUNT PAID FOR PORTION OF ITEMS"

Beaton, A., Mullis, I., Martin, M., Gonzalez, E.,Kelly, D., & Smith, T (November, 1996). Math-ematics achievement in the middle school years: lEA's

Third International Mathematics and Science Study

(TIMSS), p. 98. Boston: Center for the Study ofTesting, Evaluation, and Educational Policy, Bos-ton College.

"CHERRY SYRUP"

Reese, C., Miller, K., Mazzeo, J., & Dossey, J. (Febru-

ary, 1997). NAEP 1996 Mathematics Report Card for

the Nation and the States: Findings from the National

Assessment of Educational Progress, p.22. Washing-

ton, D.C.: Office of Educational Research andImprovement, U.S. Department of Education.

"MEANING OF EQUATION"

Mullis, I. (Draft, February 27, 1998). TIMSS as aStarting Point to Examine Mathematier Assessments:

An in-depth Look at Geometry and Algebra, p. 95.

Washington, D.C.: Office of Educational Research

and Improvement, U.S. Department of Education.
