
DOCUMENT RESUME

ED 447 473 CS 014 284

AUTHOR Donahue, Patricia L.; Finnegan, Robert J.; Lutkus, Anthony D.; Allen, Nancy L.; Campbell, Jay R.
TITLE The Nation's Report Card: Fourth-Grade Reading, 2000.
INSTITUTION National Center for Education Statistics (ED), Washington, DC.
REPORT NO NCES-2001-499
PUB DATE 2001-04-00
NOTE 140p.; In collaboration with Dave Freund, Steve Isham, Laura Jerry, Youn-hee Lim, Shari Santapau, Spence Swinton, Lois Worthington. For the highlights of this report, see CS 014 285; for a PowerPoint slide presentation related to this report, see CS 014 286.
AVAILABLE FROM ED Pubs, P.O. Box 1398, Jessup, MD 20794-1398. Tel: 877-433-7827 (Toll Free); Web site: http://www.ed.gov/pubs/edpubs.html. For full text: http://nces.ed.gov/nationsreportcard.
PUB TYPE Numerical/Quantitative Data (110) -- Reports - Research (143)
EDRS PRICE MF01/PC06 Plus Postage.
DESCRIPTORS Comparative Analysis; Grade 4; Intermediate Grades; *National Competency Tests; *Reading Achievement; Reading Research; Sex Differences; Standardized Tests; *Student Evaluation; Tables (Data); Test Results
IDENTIFIERS *National Assessment of Educational Progress

ABSTRACT

This report presents the results of the 2000 NAEP (National Assessment of Educational Progress) fourth-grade reading assessment for the nation. Results in 2000 are compared to previous NAEP reading assessments. After an introduction, Chapter 1 presents average scale scores and achievement level results for the nation. Chapter 2 presents average scale score and achievement level results for selected subgroups of the fourth-grade students. In Chapter 3, school and home contexts for learning are addressed. Chapter 4 discusses becoming a more inclusive national assessment. Major findings are: (1) the reading performance of the nation's fourth graders remained relatively stable across assessment years; (2) significant changes were evident at the upper and lower ends of the performance distribution--higher performing students made progress, and the score at the 10th percentile in 2000 was significantly lower than in 1992; (3) in 2000, the percentage of fourth-grade students performing at or above the "basic" level was 63%, and performance at or above the "proficient" level was achieved by 32% of fourth graders; (4) female fourth graders had a higher average score than their male peers; (5) white and Asian/Pacific Islander students outperformed their black, Hispanic, and American Indian peers; (6) students in the Northeast and Central regions outperformed their counterparts in the Southeast and the West; (7) students in central city schools had a lower average score than their peers in urban fringe/large town and rural/small town locations; (8) students eligible for the free/reduced lunch program had a lower average score than students ineligible for that program; (9) students attending public schools had lower average scores than their peers attending nonpublic schools; (10) students who reported reading more pages daily in school and for homework had higher average scores than students reporting reading fewer pages daily; and (11) the average score for the nation was lower in the results that included the performance of students who needed and were provided with testing accommodations. Appendixes contain an overview of the procedures used; sample text and questions; and data. (RS)

Reproductions supplied by EDRS are the best that can be made from the original document.


U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.

Minor changes have been made to improve reproduction quality.

Points of view or opinions stated in this document do not necessarily represent OERI position or policy.

[Cover: The Nation's Report Card, U.S. Department of Education, Office of Educational Research and Improvement, NCES 2001-499]


What is The Nation's Report Card?

THE NATION'S REPORT CARD, the National Assessment of Educational Progress (NAEP), is the only nationally representative and continuing assessment of what America's students know and can do in various subject areas. Since 1969, assessments have been conducted periodically in reading, mathematics, science, writing, history, geography, and other fields. By making objective information on student performance available to policymakers at the national, state, and local levels, NAEP is an integral part of our nation's evaluation of the condition and progress of education. Only information related to academic achievement is collected under this program. NAEP guarantees the privacy of individual students and their families.

NAEP is a congressionally mandated project of the National Center for Education Statistics, the U.S. Department of Education. The Commissioner of Education Statistics is responsible, by law, for carrying out the NAEP project through competitive awards to qualified organizations. NAEP reports directly to the Commissioner, who is also responsible for providing continuing reviews, including validation studies and solicitation of public comment, on NAEP's conduct and usefulness.

In 1988, Congress established the National Assessment Governing Board (NAGB) to formulate policy guidelines for NAEP. The Board is responsible for selecting the subject areas to be assessed from among those included in the National Education Goals; for setting appropriate student performance levels; for developing assessment objectives and test specifications through a national consensus approach; for designing the assessment methodology; for developing guidelines for reporting and disseminating NAEP results; for developing standards and procedures for interstate, regional, and national comparisons; for determining the appropriateness of test items and ensuring they are free from bias; and for taking actions to improve the form and use of the National Assessment.

The National Assessment Governing Board

Mark D. Musick, Chair
President
Southern Regional Education Board
Atlanta, Georgia

Michael T. Nettles, Vice Chair
Professor of Education
University of Michigan
Ann Arbor, Michigan

Moses Barnes
Secondary School Principal
Fort Lauderdale, Florida

Melanie A. Campbell
Fourth-Grade Teacher
Topeka, Kansas

Honorable Wilmer S. Cody
Former Commissioner of Education
State of Kentucky
Frankfort, Kentucky

Daniel A. Domenech
Superintendent of Schools
Fairfax County Public Schools
Fairfax, Virginia

Edward Donley
Former Chairman
Air Products & Chemicals, Inc.
Allentown, Pennsylvania

Thomas H. Fisher
Director
Student Assessment Services
Florida Department of Education
Tallahassee, Florida

Edward H. Haertel
Professor, School of Education
Stanford University
Stanford, California

Juanita Haugen
Local School Board Member
Pleasanton, California

Honorable Nancy Kopp
State Legislator
Annapolis, Maryland

Honorable Ronnie Musgrove
Governor of Mississippi
Jackson, Mississippi

Roy M. Nageak, Sr.
First Vice-Chair
Alaska Board of Education and Early Development
Barrow, Alaska

Debra Paulson
Eighth-Grade Mathematics Teacher
El Paso, Texas

Honorable Jo Ann Pottorff
State Legislator
Wichita, Kansas

Diane Ravitch
Research Professor
New York University
New York, New York

Sister Lourdes Sheehan, R.S.M.
Secretary for Education
United States Catholic Conference
Washington, DC

John H. Stevens
Executive Director
Texas Business and Education Coalition
Austin, Texas

Adam Urbanski
President
Rochester Teachers Association
Rochester, New York

Migdania D. Vega
Principal
Coral Way Elementary Bilingual School
Miami, Florida

Deborah Voltz
Assistant Professor
Department of Special Education
University of Louisville
Louisville, Kentucky

Honorable Michael E. Ward
State Superintendent of Public Instruction
North Carolina Public Schools
Raleigh, North Carolina

Marilyn A. Whirry
Twelfth-Grade English Teacher
Manhattan Beach, California

Dennie Palmer Wolf
Senior Research Associate
Harvard University Graduate School of Education
Cambridge, Massachusetts

(Ex-Officio)
Assistant Secretary of Education
Office of Educational Research and Improvement
U.S. Department of Education
Washington, DC

Roy Truby
Executive Director, NAGB
Washington, DC



National Center for Education Statistics

The Nation's Report Card: Fourth-Grade Reading 2000

Patricia L. Donahue
Robert J. Finnegan
Anthony D. Lutkus
Nancy L. Allen
Jay R. Campbell

In collaboration with
Dave Freund
Steve Isham
Laura Jerry
Youn-hee Lim
Shari Santapau
Spence Swinton
Lois Worthington

April 2001

U.S. Department of Education
Office of Educational Research and Improvement NCES 2001-499


U.S. Department of Education
Rod Paige
Secretary

National Center for Education Statistics
Gary W. Phillips
Acting Commissioner

April 2001

SUGGESTED CITATION
U.S. Department of Education. Office of Educational Research and Improvement. National Center for Education Statistics. The Nation's Report Card: Fourth-Grade Reading 2000, NCES 2001-499, by P. L. Donahue, R. J. Finnegan, A. D. Lutkus, N. L. Allen, and J. R. Campbell. Washington, DC: 2001.

FOR MORE INFORMATION
Content contact:
Sheida White
202-502-7473

To obtain single copies of this report (limited number of copies available), or for ordering information on other U.S. Department of Education products, call toll free 1-877-4ED-PUBS (877-433-7827), or write:

Education Publications Center (ED Pubs)
U.S. Department of Education
P.O. Box 1398
Jessup, MD 20794-1398

TTY/TDD 1-877-576-7734
FAX 301-470-1244

Online ordering via the Internet: http://www.ed.gov/pubs/edpubs.html
Copies also are available in alternate formats upon request.
This report also is available on the World Wide Web: http://nces.ed.gov/nationsreportcard

The work upon which this publication is based was performed for the National Center for Education Statistics, Office of Educational Research and Improvement, by Educational Testing Service.



Table of Contents

Executive Summary ix

Introduction 1

Overview of the 2000 National Assessment of Educational Progress (NAEP) 2

Framework for the 1992, 1994, 1998, and 2000 Assessments 2

The Reading Assessment Instruments 4

Description of School and Student Samples 5

Reporting the Assessment Results 5

The Setting of Achievement Levels 6

The Developmental Status of Achievement Levels 7

Interpreting NAEP Results 10

This Report 10

Chapter 1

Average Scale Score and Achievement Level Results for the Nation 11

Average Scale Score Results 12

Achievement Level Results 13

Sample Assessment Questions and Student Responses 16

Map of Selected Item Descriptions 21

Summary 23

Chapter 2

Average Scale Score and Achievement Level Results for Selected Subgroups of 4th-Grade Students 25

Gender 26

Race/Ethnicity 28

Trends in Scale Score Differences Between Selected Subgroups 33

Region of the Country 34

Type of Location 37

Eligibility for the Free/Reduced-Price Lunch Program 39

Type of School 41

Summary 45



Chapter 3

School and Home Contexts for Learning 47

Reading in and for School 48

Literacy with Family and Friends 54

Summary 62

Chapter 4

Becoming a More Inclusive National Assessment 63

Two Sets of 2000 NAEP Reading Results 64

Results for the Nation: Accommodations Not Permitted and Accommodations Permitted 68

Results by Gender: Accommodations Not Permitted and Accommodations Permitted 70

Results by Race/Ethnicity: Accommodations Not Permitted and Accommodations Permitted 70

Summary 72

Appendix A

Overview of Procedures Used for the NAEP 2000 Grade 4 Reading Assessment 73

The NAEP 2000 Grade 4 Reading Assessment 73

The Assessment Design 75

National Sample 76

Students with Disabilities (SD) and Limited English Proficient (LEP) Students 79

Data Collection and Scoring 83

Data Analysis and IRT Scaling 83

Item Mapping Procedures 86

Weighting and Variance Estimation 87

Drawing Inferences from the Results 88

Analyzing Group Differences in Averages and Percentages 89

Conducting Multiple Tests 91

NAEP Reporting Groups 93

Cautions in Interpretations 96

Appendix B

Sample Text and Questions from the NAEP 2000 Reading Assessment 97

Appendix C

Data Appendix 103

Acknowledgments 125



Figures and Tables

Figure i.1 Descriptions of the three reading purposes specified in the NAEP framework 3
Figure i.2 Descriptions of the four reading stances specified in the NAEP framework 4
Figure i.3 Policy definitions of the three achievement levels 7
Figure 1.1 Average fourth-grade reading scale scores: 1992-2000 12
Figure 1.2 Fourth-grade reading scale score percentiles: 1992-2000 13
Figure 1.3 Descriptions of reading achievement for the levels at grade 4 14
Figure 1.4 Percentage of fourth-grade students at or above reading achievement levels and within each achievement level range: 1992-2000 15
Table 1.1 Sample question 1 results (multiple-choice) 17
Table 1.2 Sample question 2 results (short constructed-response) 18
Table 1.3 Sample question 3 results (extended constructed-response) 19
Figure 1.5 Map of selected item descriptions on the National Assessment of Educational Progress reading scale for grade 4 22
Figure 1.6 Average scale score, percentile scores, and achievement level results: 1992-2000 23
Figure 2.1 Average fourth-grade reading scale scores for male and female students: 1992-2000 26
Figure 2.2 Percentages of fourth-grade male and female students at or above reading achievement levels and within each achievement level range: 1992-2000 27
Figure 2.3 Average fourth-grade reading scale scores by race/ethnicity: 1992-2000 29
Figure 2.4 Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by race/ethnicity: 1992-2000 30
Figure 2.5 Differences in average fourth-grade reading scale scores by gender and race/ethnicity: 1992-2000 33


Figure 2.6 Average fourth-grade reading scale scores by region of the country: 1992-2000 34
Figure 2.7 Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by region of the country: 1992-2000 35
Table 2.1 Average fourth-grade reading scale scores by school's type of location: 2000 37
Figure 2.8 Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by school's type of location: 2000 38
Figure 2.9 Average fourth-grade reading scale scores by student eligibility for the free/reduced-price lunch program: 1998-2000 39
Figure 2.10 Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by student eligibility for the free/reduced-price lunch program: 1998-2000 40
Figure 2.11 Average fourth-grade reading scale scores by type of school: 1992-2000 42
Figure 2.12 Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by type of school: 1992-2000 43
Figure 2.13 Summary of scale scores and achievement levels: 1992-2000 46
Table 3.1 Students' reports on the number of pages read each day in school and for homework: 1992-2000 48
Table 3.2 Students' reports on the amount of time spent doing homework each day: 1992-2000 49
Table 3.3 Students' reports on how often they write long answers to questions on tests or assignments that involved reading: 1992-2000 51
Table 3.4 Students' reports on how often their teachers help them break words into parts and help them understand new words: 1998-2000 53
Table 3.5 Students' reports on how often they read for fun on their own time: 1992-2000 55
Table 3.6 Students' reports on how often they discuss their studies at home and talk about reading with their family and friends: 1992-2000 57


Table 3.7 Students' reports on the number of different types of reading materials in the home: 1992-2000 59
Table 3.8 Students' reports on the amount of time spent watching television each day: 1992-2000 61
Figure 4.1 The two sets of NAEP results based on a split-sample design 67
Table 4.1 Average score by type of results: 1998 and 2000 68
Table 4.2 Percentages of students attaining reading achievement levels by type of results: 1998 and 2000 69
Figure A.1 1992, 1994, 1998, and 2000 NAEP framework aspects of grade 4 reading literacy 74
Table A.1 Target and actual percentage distribution of questions by reading purpose, 2000 NAEP grade 4 reading assessment 75
Table A.2 Target and actual percentage distribution of questions by reading stance, 2000 NAEP grade 4 reading assessment 75
Table A.3 Grade 4 national student sample sizes 77
Table A.4 NAEP 2000 school and student participation rates for the nation: Grade 4 public schools, nonpublic schools, and combined 78
Table A.5 Students with disabilities and limited English proficient students in the NAEP reading assessment: Grade 4 national samples where accommodations were not permitted: 1992-2000 80
Table A.6 Students with disabilities and limited English proficient students in the NAEP reading assessment: Grade 4 national samples where accommodations were permitted for identified students: 1998 and 2000 81
Table A.7 SD and LEP students assessed with accommodations in the NAEP reading assessment: Grade 4 national samples where accommodations were permitted for identified students, public and nonpublic schools combined, by type of accommodation: 1998 and 2000 82
Table A.8 FDR comparisons of average reading scale scores for race/ethnicity groups: 1994 and 2000 91
Figure A.2 States included in the four NAEP regions 93


Executive Summary

The National Assessment of Educational Progress (NAEP) is the nation's only federally mandated survey of student achievement in various subject areas. Authorized by Congress and administered by the National Center for Education Statistics in the U.S. Department of Education, NAEP regularly reports to the public on the educational progress of students in grades 4, 8, and 12. In 2000, NAEP conducted a national reading assessment of fourth-grade students.

This report presents the results of the 2000 NAEP fourth-grade reading assessment for the nation. Results in 2000 are compared to results of previous NAEP reading assessments. Students' performance on the assessment is described in terms of average scores on a 0-500 scale, and in terms of the percentage of students attaining three achievement levels: Basic, Proficient, and Advanced. The achievement levels are performance standards adopted by the National Assessment Governing Board (NAGB) as part of its statutory responsibilities. They are collective judgments of what students should know and be able to do.

As provided by law, the Commissioner of Education Statistics, upon review of a congressionally mandated evaluation of NAEP, determined that the achievement levels are to be considered developmental and should be interpreted and used with caution. However, both the Acting Commissioner and the Board believe these performance standards are useful for understanding trends in student achievement. They have been widely used by national and state officials, including the National Education Goals Panel, as a common yardstick of academic performance.

In addition to providing average scores and achievement level performance in reading for the nation's fourth-graders, this report provides results for subgroups of fourth-grade students defined by various background and contextual characteristics. A summary of major findings from the 2000 NAEP reading assessment is presented on the following pages.

Reading Scale Score and Achievement Level Results for the Nation

The reading performance of the nation's fourth-graders has remained relatively stable across assessment years. In 2000, the national average scale score of 217 was similar to that in 1992.

Although the national average scale score has remained relatively stable, significant changes are evident at the upper and lower ends of the performance distribution. Higher performing students have made progress: scores at the 75th and 90th percentiles in 2000 were significantly higher than in 1992. In contrast, the score at the 10th percentile in 2000 was significantly lower than in 1992.

In 2000, the percentage of fourth-grade students performing at or above the Basic level of reading achievement was 63 percent. Performance at or above the Proficient level, the level identified by NAGB as the level that all students should reach, was achieved by 32 percent of fourth-graders. The highest level of performance, the Advanced level, was achieved by 8 percent of fourth-graders.


In 2000, the percentages of fourth-graders performing at or above Proficient and at Advanced were higher than in 1992.

Results for Student Subgroups

Gender

In 2000, female fourth-grade students had a higher average score than their male peers. The scale-score gap between males and females widened since 1998.

The percentage of females at or above the Proficient level exceeded that of males.

The percentage of female fourth-graders at or above the Proficient level in 2000 was higher than in 1992.

Race/Ethnicity

In 2000, white and Asian/Pacific Islander students outperformed their black, Hispanic, and American Indian peers.

A significant increase was observed in the average scale score of Asian/Pacific Islander students, whose 2000 score was higher than in 1992. The 2000 average score of black students was significantly higher in comparison to 1994.

The percentages of white and Asian/Pacific Islander students at or above the Proficient level exceeded those of other racial/ethnic groups.

Only among Asian/Pacific Islander students was an increase observed in the percentage at or above Proficient since 1992.

Region

The 2000 results by region show fourth-grade students in the Northeast and Central regions outperforming their counterparts in the Southeast and the West.

Among students in the Northeast, the average scale score in 2000 was higher than in 1994.


Students in the Northeast and Central regions had higher percentages of students at or above the Proficient level than the Southeast. The Northeast region had a higher percentage of students at or above Proficient than the West.

Type of Location

Fourth-grade students in central city schools had a lower average score in 2000 than their peers who attended schools in urban fringe/large town and rural/small town locations.

Comparisons of achievement level results between locations show a lower percentage of central city students at or above the Proficient level than their peers in other types of location.

Eligibility for Free/Reduced-Price Lunch

In 2000, students who were eligible for the free/reduced-price lunch program had a lower average score than students who were ineligible for the program.

Achievement level results also show lower performance among students eligible for the program. In 2000, 14 percent of eligible students performed at or above the Proficient level in comparison to 41 percent of noneligible students.

Type of School

Consistent with past NAEP reading assessments, the 2000 results indicated that students attending public schools had lower average reading scale scores than their peers attending nonpublic schools.

A lower percentage of public school students performed at or above the Proficient level in comparison to nonpublic school students.

School and Home Contexts for Learning

Pages Read in School and for Homework

Fourth-graders who reported reading more pages daily in school and for homework had higher average scores than students who reported reading fewer pages daily.

The 2000 results indicate that more fourth-grade students are reading eleven or more pages in school and for homework on a daily basis than in 1992 and 1994.

Time Spent Doing Homework

Fourth-graders who reported spending a moderate amount of time on homework (one-half hour or one hour daily) had higher average scores than students who reported that they spent more than an hour or that they either did not have or did not do homework.

The percentage of students in 2000 who reported that they do not have homework was lower in comparison to 1992 and 1994.

Writing about Reading

Fourth-graders who reported writing long answers to questions on tests and assignments that involved reading on a weekly or monthly basis had higher average scores than students who reported doing so once or twice a year, or never or hardly ever.

Fourth-graders' reports on the 2000 assessment indicate an increase in the frequency of writing about reading on a weekly basis in comparison to 1994.



Teachers' Help with Words

Fourth-grade students who reported that their teachers never or hardly ever helped them break words into parts scored higher than their peers who reported receiving such help daily or weekly.

Fourth-graders who reported that their teachers helped them understand new words on a weekly or monthly basis scored higher than those who reported receiving this help daily or never or hardly ever.

Reading for Fun

Students who reported reading for fun on their own time every day had higher average scores than students who reported reading for fun less frequently.

In 2000, 75 percent of fourth-grade students reported reading for fun on their own time at least weekly.

Discussing Studies and Talking about Reading

Students who reported discussing their studies at home daily, weekly, or monthly had higher average scores than students who reported never or hardly ever having such discussions.

Students who reported talking about their reading with family and friends on a weekly basis had a higher average score than students who reported engaging in such conversations daily, monthly, or never or hardly ever.

In 2000, 61 percent of fourth-grade students reported talking about their reading with family or friends at least weekly.


Reading Materials in the Home

The average score for students who reported having all four types of reading materials (books, magazines, newspapers, encyclopedias) in their home was higher than for those who reported having fewer reading materials.

In 2000, a lower percentage of students reported having all four types of reading materials in the home in comparison to 1994.

Time Spent Watching Television

Students who reported watching three or fewer hours of television each day outperformed students who reported watching more television.

In 2000, the percentage of students who reported watching four or more hours of television daily was lower than in 1994, and the percentage who reported watching three hours or fewer was higher.

Transitioning to a More Inclusive NAEP

A second set of results from the 2000 NAEP reading assessment represents the performance of students when testing accommodations are permitted for special-needs students.

A comparison of the two sets of results shows that the average score for the nation was lower in the results that included the performance of students who needed and were provided with testing accommodations.

A comparison of the two sets of results for Hispanic students shows that their average score was lower in the results that included the performance of students who needed and were provided with testing accommodations.



Introduction

At the start of a new century, the importance of providing all students with an education that ensures them the means for personal, social, and economic growth remains as central to the American enterprise as it has always been. It may be, however, that the challenge inherent in this enterprise has never been greater. While the country had long anticipated technological advances, during the last decade progress seemed to accelerate and daily reality was transformed in many ways. One such transformation involved the way people read.

Traditional images of reading mainly represented an individual with a book in hand, but contemporary images more frequently include the image of an individual reading from a computer screen. Many of these individuals are our nation's students. Growing up in a time of unprecedented access to information, these students need the skills to read widely and critically in order to successfully navigate the variety of information available to them. Whether the text is print- or electronic-based, the act of reading in an increasingly complex world requires acute understanding and interpretation of all that is read.

The centrality of reading has long been recognized in the curriculum of our schools for its importance in shaping both personal selves and participants in a democratic society. The need to ensure the literate participation of our society's future citizens underlies the need to monitor our students' reading achievement.

Focus Questions: What is the NAEP reading assessment? How does the NAEP reading assessment measure and report student progress?


Introduction Contents: Overview; Reading Framework; Reading Assessment; School and Student Samples; Reporting Results; NAEP Achievement Levels; Interpreting NAEP Results; This Report


Overview of the 2000 National Assessment of Educational Progress (NAEP)

In 1969, the National Assessment of Educational Progress was authorized by Congress to collect, analyze, and report reliable and valuable information about what American students know and can do in core subject areas. Since that time, in what has come to be referred to as the long-term trend assessment, NAEP has assessed public and nonpublic school students who are 9, 13, and 17 years old. Since 1990, the more recently developed assessments, referred to as the main NAEP, have also assessed public and nonpublic school students in grades 4, 8, and 12. In 2000, student performance in mathematics and science was assessed at all three grades, and student performance in reading was assessed at grade 4 only.

All NAEP assessments are based on content frameworks developed through a national consensus process. The 2000 NAEP reading assessment was the fourth administration of an assessment based on the NAEP Reading Framework.¹ In 1992, 1994, and 1998, the NAEP reading assessment was administered to national samples of fourth-, eighth-, and twelfth-graders. In 1992 and 1994, the reading assessment was also administered to samples of fourth-graders participating in the state-by-state assessment, and in 1998 the state-by-state assessment included samples of fourth- and eighth-graders. Although the National Assessment Governing Board schedule calls for a reading assessment every four years, with the next full-scale reading assessment planned for 2002, an assessment at grade 4 only was authorized for 2000. The administration of the reading assessment in 2000 was in response to heightened concern that all children learn to read by the end of third grade. In order to closely monitor early reading achievement, the assessment was given to a national sample of fourth-graders. No state-by-state reading results were collected in 2000.

This report describes the results of the 2000 NAEP reading assessment at grade 4 and compares results in 2000 to fourth-grade performance in 1992, 1994, and 1998. The comparisons focus on 2000 results in relation to earlier results; comparisons of 1998 to 1994 and of 1994 to 1992 were made in previous report cards and therefore are neither highlighted nor discussed in this report.² Comparisons across assessment years are possible because the assessments were developed under the same framework and share a common set of reading tasks, and because the populations of students were sampled and assessed using comparable procedures.

Framework for the 1992, 1994, 1998, and 2000 Assessments

The NAEP Reading Framework³ provided the operational specifications and theoretical basis for developing all the NAEP reading assessments administered since 1992.

1 National Assessment Governing Board. Reading framework for the National Assessment of Educational Progress: 1992-2000. Washington, DC: Author.

2 Donahue, P.L., Voelkl, K.E., Campbell, J.R., & Mazzeo, J. (1999). NAEP 1998 reading report card for the nation and the states. Washington, DC: National Center for Education Statistics.
Campbell, J.R., Donahue, P.L., Reese, C.M., & Phillips, G.W. (1996). NAEP 1994 reading report card for the nation and the states. Washington, DC: National Center for Education Statistics.

3 National Assessment Governing Board. Reading framework for the National Assessment of Educational Progress: 1992-2000. Washington, DC: Author. [Also available online at http://www.nagb.org/pubs/92-2000read/toc.html]

The result of a national consensus effort, the NAEP Reading Framework reflects the ideas of many individuals involved and interested in reading education. This consensus effort was managed by the Council of Chief State School Officers (CCSSO) under the direction of the National Assessment Governing Board (NAGB).

The framework reflects research that views the act of reading as a dynamic and interactive process involving the reader, the text, and the context of the reading experience. For example, readers may employ different strategies depending on the text's structure and their purpose for reading it.⁴

Recognizing that readers vary their approach according to their purpose for reading, the framework specifies three purposes for reading to be assessed: reading for literary experience, reading to gain information, and reading to perform a task. All three purposes are assessed at grades 8 and 12, but reading to perform a task is not assessed at grade 4. The three purposes for reading as specified in the framework are described in figure i.1.

Figure i.1. Descriptions of the three purposes for reading

Reading for literary experience: Reading for literary experience entails the reading of various literary texts to enlarge our experience of human events and emotions, and to enhance both our appreciation of the world and how it is depicted through language. Literary texts used in the NAEP reading assessment included adventure stories, science fiction, and folktales.

Reading to gain information: When reading to gain information, readers are usually focused on a specific topic or point of reference. They are trying to understand and retain the text information. Informative texts used in the NAEP reading assessment included science articles, primary and secondary historical sources, sections of textbook chapters, essays, and speeches.

Reading to perform a task: Reading to perform a task involves reading various types of materials for the purpose of applying the information or directions to complete a specific task. As such, readers must focus on how they will actually use the information. The materials used to assess this purpose in the NAEP reading assessment included classified advertisements, directions for completing various projects, and a tax form. [Not assessed at grade 4.]

SOURCE: National Assessment Governing Board. Reading framework for the National Assessment of Educational Progress: 1992-2000.

4 Anderson, R.C. & Pearson, P.D. (1984). A schema-theoretic view of basic processes in reading comprehension. In P.D. Pearson (Ed.), Handbook of reading research (pp. 255-292). New York: Longman.
Pressley, M. & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading. Hillsdale, NJ: Lawrence Erlbaum Associates.
Ruddell, R.B. & Unrau, N.J. (1994). Reading as a meaning-construction process: The reader, the text, and the teacher. In R.B. Ruddell, M.R. Ruddell, & H. Singer (Eds.), Theoretical models and processes of reading (pp. 864-894). Newark, DE: International Reading Association.
Taylor, B.M. (1992). Text structure, comprehension, and recall. In S.J. Samuels & A.E. Farstrup (Eds.), What research has to say about reading instruction (pp. 220-235). Newark, DE: International Reading Association.


The framework also specifies four types of reading processes, referred to as "reading stances," that characterize ways readers respond to text. The four stances are: Initial Understanding, Developing an Interpretation, Personal Reflection and Response, and Critical Stance. The reading stances represent the changing ways readers position themselves in relation to a text, with each way contributing to the comprehension of it.⁵ The stances are not intended to be indicative of hierarchical reading skills, but rather to represent aspects of reading processes that occur at any reading ability level. The four reading stances as specified in the framework are described in figure i.2.

Figure i.2. Descriptions of the four reading stances

Initial understanding: preliminary consideration of the text as a whole. Readers are asked to consider the whole text in demonstrating an overall understanding of its meaning and function.

Developing an interpretation: discerning connections and relationships among ideas within the text. Readers are asked to build upon their initial impressions to develop a more thorough understanding of the text and the interrelationship of its parts.

Personal reflection and response: relating personal knowledge to text ideas. Readers are asked to describe how ideas in the text confirm, contradict, or compare with prior knowledge and experiences.

Critical stance: standing apart from the text to consider it objectively. Readers are asked to consider how the author conveys information, expresses ideas or feelings, and communicates a message.

SOURCE: National Assessment Governing Board. Reading framework for the National Assessment of Educational Progress: 1992-2000.

5 Langer, J.A. (1990). The process of understanding: Reading for literary and informative purposes. Research in the Teaching of English, 24(3), 229-259.

The Reading Assessment Instruments

As the only federally mandated ongoing assessment of student reading achievement on a national scale, the NAEP assessment must reflect the framework and expert perspectives and opinions about reading comprehension and its measurement. To that end, the assessment development process involves stages of review by teachers and teacher educators as well as by state officials and measurement experts. All components of the assessment are evaluated for curricular relevance, developmental appropriateness, and fairness concerns.


The reading passages used in the NAEP reading assessment are drawn from the types of books and magazines that students might encounter in or out of school. In striving for authentic reading experiences, the reading materials are neither abridged nor written for the assessment. They are reprinted in test booklets in a format that replicates as closely as possible their original publication.

At grade 4, all test booklets contain two 25-minute sections, each containing a reading passage and a set of approximately 10 comprehension questions. The questions are presented in both multiple-choice and constructed-response formats. At least half of the questions are constructed-response, which allows students to write out their own answers and explain and support their ideas. Constructed-response questions were of two types: short, requiring a one- or two-sentence answer, and extended, requiring a paragraph or full-page response.

The 2000 reading assessment at fourth grade used 8 different reading passages and comprised a total of 81 questions: 35 multiple-choice, 38 short constructed-response (scored according to a two- or three-level scoring rubric), and 8 extended constructed-response (scored according to a four-level rubric). The greater proportion of student response time was spent answering the constructed-response questions.
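As a quick check on the composition just described, the question counts can be tallied in a few lines. The dictionary and names below are our own illustrative structure, not NAEP's:

```python
# Illustrative tally of the 2000 grade 4 reading assessment's question
# mix as described above; structure and names are ours, not NAEP's.
QUESTION_COUNTS = {
    "multiple-choice": 35,
    "short constructed-response": 38,    # two- or three-level rubrics
    "extended constructed-response": 8,  # four-level rubric
}

total_questions = sum(QUESTION_COUNTS.values())
constructed = total_questions - QUESTION_COUNTS["multiple-choice"]

print(total_questions)                      # 81 questions in all
print(constructed / total_questions > 0.5)  # True: over half constructed-response
```

The tally confirms the report's arithmetic: 46 of the 81 questions (well over half) are constructed-response.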

Description of School and Student Samples

The NAEP 2000 reading assessment was administered to fourth-graders at the national level. There was no state-by-state reading assessment in 2000. The findings in this report pertain to all fourth-graders in the nation, both those who do not need accommodations as well as those who are provided accommodations for their disability or limited English proficiency. The national results presented in the first three chapters of this report are based on a nationally representative sample of fourth-graders who can be meaningfully assessed without accommodations. Chapter 4 presents national results for a representative sample that includes the performance of students who needed and were provided with accommodations.

Each selected school that participated in the assessment and each student assessed represents a portion of the population of interest. For information on sample sizes and participation rates, see appendix A.

Reporting the Assessment Results

The results of student performance on the NAEP reading assessment are presented in two ways: as average scores on the NAEP composite reading scale, and in terms of the percentage of students attaining NAEP reading achievement levels. The average scale scores represent what students know and can do; the achievement level results indicate the degree to which that performance meets set expectations of what students should know and be able to do.

Average scale score results are presented on the NAEP reading composite scale, which ranges from 0 to 500. Students' responses on the NAEP 2000 reading assessment were analyzed to determine the percentages of students responding correctly to each multiple-choice question and the percentages of students responding at each score level for the constructed-response questions. The analysis entails summarizing the results on separate subscales for each reading purpose, then combining the separate scales to form a single composite reading scale. The relative contribution of each reading purpose at grade 4, as specified in the reading framework, is 55 percent for reading for literary experience and 45 percent for reading to gain information. (A full description of NAEP scales and scaling procedures can be found in the forthcoming NAEP 2000 Technical Report.)
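The stated 55/45 weighting can be illustrated with a minimal sketch. This is not NCES's scaling code (the actual procedure uses item response theory, as the Technical Report describes); it only shows how the grade 4 weights would combine two subscale scores into a composite, and all names and numbers are our own:

```python
# Sketch of combining subscale scores into a composite using the grade 4
# weights stated in the framework (55% literary experience, 45% gaining
# information). Illustrative only: NAEP's actual scaling is IRT-based,
# not a simple weighted average of observed scores.
GRADE4_WEIGHTS = {"literary": 0.55, "information": 0.45}

def composite_score(subscales: dict) -> float:
    """Weighted average of subscale scores on the 0-500 reading scale."""
    assert abs(sum(GRADE4_WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 1
    return sum(w * subscales[name] for name, w in GRADE4_WEIGHTS.items())

# Hypothetical subscale scores of 220 (literary) and 200 (information):
print(round(composite_score({"literary": 220.0, "information": 200.0}), 1))  # 211.0
```

Because the literary subscale carries the larger weight at grade 4, the composite sits closer to the literary score than a simple average would.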

Achievement level results are presented in terms of reading achievement levels as authorized by the NAEP legislation and adopted by the National Assessment Governing Board. For each grade tested, NAGB has adopted three achievement levels: Basic, Proficient, and Advanced. For reporting purposes, the achievement level cut scores are placed on the reading scale, resulting in four ranges: below Basic, Basic, Proficient, and Advanced.

The Setting of Achievement Levels

The 1988 NAEP legislation that created the National Assessment Governing Board directed the Board to identify "appropriate achievement goals...for each subject area" that NAEP measures.⁶ The 1994 NAEP reauthorization reaffirmed many of the Board's statutory responsibilities, including "developing appropriate student performance standards for each age and grade in each subject area to be tested under the National Assessment."⁷ In order to follow this directive and achieve the mandate of the 1988 statute "to improve the form and use of NAEP results," the Board undertook the development of student performance standards (called "achievement levels"). Since 1990, the Board has adopted achievement levels in mathematics, reading, U.S. history, world geography, science, writing, and civics.

The Board defined three levels for each grade: Basic, Proficient, and Advanced. The Basic level denotes partial mastery of the knowledge and skills that are fundamental for proficient work at a given grade. The Proficient level represents solid academic performance; students reaching this level demonstrate competency over challenging subject matter. The Advanced level signifies superior performance at a given grade. For each grade, the levels are cumulative; that is, abilities achieved at the Proficient level presume mastery of abilities associated with the Basic level, and attainment of the Advanced level presumes mastery of both the Basic and Proficient levels. Figure i.3 presents the policy definitions of the achievement levels that apply across grades and subject areas. (Specific descriptions of reading achievement for the levels at grade 4 are presented in chapter 1.) Adopting three levels of achievement for each grade signals the importance of looking at more than one standard of performance. The Board believes, however, that all students should reach the Proficient level; the Basic level is not the desired goal, but rather represents partial mastery that is a step toward Proficient.

The achievement levels in this report were adopted by the Board based on a standard-setting process designed and conducted under a contract with ACT. To develop these levels, ACT convened a cross section of educators and interested citizens from across the nation and asked them to judge what students should know and be able to do relative to a body of content reflected in the NAEP assessment framework for reading. This achievement level setting process was reviewed by an array of individuals including policymakers, representatives of professional organizations, teachers, parents, and other members of the general public. Prior to adopting these levels of student achievement, NAGB engaged a large number of persons to comment on the recommended levels and to review the results.

6 Public Law 100-297. (1988). National Assessment of Educational Progress Improvement Act (20 USC 1221). Washington, DC.

7 Public Law 103-382. (1994). Improving America's Schools Act (20 USC 9010). Washington, DC.

Figure i.3. Policy definitions of the three achievement levels

Basic: This level denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade.

Proficient: This level represents solid academic performance for each grade assessed. Students reaching this level have demonstrated competency over challenging subject matter, including subject-matter knowledge, application of such knowledge to real-world situations, and analytical skills appropriate to the subject matter.

Advanced: This level signifies superior performance.

SOURCE: National Assessment Governing Board.

The results of the achievement level setting process, after NAGB approval, became a set of achievement level descriptions and a set of achievement level cut points on the 0-500 NAEP reading scale. The cut points are the scores that define the boundaries between below Basic, Basic, Proficient, and Advanced performance at grades 4, 8, and 12. The Board established these reading achievement levels in 1992 based upon the reading content framework; these levels were used for the 1992, 1994, 1998, and 2000 reading assessments.

The Developmental Status of Achievement Levels

The 1994 NAEP reauthorization law requires that the achievement levels be used on a developmental basis until the Commissioner of Education Statistics determines that the achievement levels are "reasonable, valid, and informative to the public."⁸ Until that determination is made, the law requires the Commissioner and the Board to state clearly the developmental status of the achievement levels in all NAEP reports.

In 1993, the first of several congressionally mandated evaluations of the achievement level setting process concluded that the procedures used to set the achievement levels were flawed and that the percentage of students at or above any particular achievement level cut point may be underestimated.⁹ Others have critiqued these evaluations, asserting that the weight of the empirical evidence does not support such conclusions.¹⁰

8 The Improving America's Schools Act of 1994 (20 USC 9010) requires that the Commissioner base his determination on a congressionally mandated evaluation by one or more nationally recognized evaluation organizations, such as the National Academy of Education or the National Academy of Sciences.

In response to the evaluations and critiques, NAGB conducted an additional study of the 1992 reading achievement levels before deciding to use the 1992 reading achievement levels for reporting 1994 NAEP results.¹¹ When reviewing the findings of this study, the National Academy of Education (NAE) Panel expressed concern about what it saw as a "confirmatory bias" in the study and about the inability of this study to "address the panel's perception that the levels had been set too high."¹² In 1997, the NAE Panel summarized its concerns with interpreting NAEP results based on the achievement levels as follows:

First, the potential instability of the levels may interfere with the accurate portrayal of trends. Second, the perception that few American students are attaining the higher standards we have set for them may deflect attention to the wrong aspects of education reform. The public has indicated its interest in benchmarking against international standards, yet it is noteworthy that when American students performed very well on a 1991 international reading assessment, these results were discounted because they were contradicted by poor performance against the possibly flawed NAEP reading achievement levels in the following year.¹³

The NAE Panel report recommended "that the current achievement levels be abandoned by the end of the century and replaced by new standards...." The National Center for Education Statistics and the National Assessment Governing Board have sought and continue to seek new and better ways to set performance standards on NAEP.¹⁴ For example, NCES and NAGB jointly sponsored a national conference on standard setting in large-scale assessments, which explored many issues related to standard setting.¹⁵ Although new directions were presented and discussed, a proven alternative to the current process has not yet been identified. The Commissioner of Education Statistics and the Board continue to call on the research community to assist in finding ways to improve standard setting for reporting NAEP results.

9 United States General Accounting Office. (1993). Education achievement standards: NAGB's approach yields misleading interpretations. U.S. General Accounting Office Report to Congressional Requestors. Washington, DC: Author.
National Academy of Education. (1993). Setting performance standards for achievement: A report of the National Academy of Education Panel on the evaluations of the NAEP Trial State Assessment: An evaluation of the 1992 achievement levels. Stanford, CA: Author.

10 Cizek, G. (1993). Reactions to National Academy of Education report. Washington, DC: National Assessment Governing Board.
Kane, M. (1993). Comments on the NAE evaluation of the NAGB achievement levels. Washington, DC: National Assessment Governing Board.

11 American College Testing. (1995). NAEP reading revisited: An evaluation of the 1992 achievement level descriptions. Washington, DC: National Assessment Governing Board.

12 National Academy of Education. (1996). Reading achievement levels. In Quality and utility: The 1994 Trial State Assessment in reading. The fourth report of the National Academy of Education Panel on the evaluation of the NAEP Trial State Assessment. Stanford, CA: Author.

13 National Academy of Education. (1997). Assessment in transition: Monitoring the nation's educational progress (p. 99). Mountain View, CA: Author.

14 Reckase, M.D. (2000). The evolution of the NAEP achievement levels setting process: A summary of the research and development efforts conducted by ACT. Iowa City, IA: ACT, Inc.

15 National Assessment Governing Board and National Center for Education Statistics. (1995). Proceedings of the joint conference on standard setting for large-scale assessments of the National Assessment Governing Board (NAGB) and the National Center for Education Statistics (NCES). Washington, DC: Government Printing Office.

The most recent congressionally mandated evaluation, conducted by the National Academy of Sciences (NAS), relied on prior studies of achievement levels, rather than carrying out new evaluations, on the grounds that the process has not changed substantially since the initial problems were identified. Instead, the NAS Panel studied the development of the 1996 science achievement levels. The NAS Panel basically concurred with earlier congressionally mandated studies, concluding that "NAEP's current achievement level setting procedures remain fundamentally flawed. The judgment tasks are difficult and confusing; raters' judgments of different item types are internally inconsistent; appropriate validity evidence for the cut scores is lacking; and the process has produced unreasonable results."¹⁶

The NAS Panel accepted the continuing use of achievement levels in reporting NAEP results on a developmental basis, until such time as better procedures can be developed. Specifically, the NAS Panel concluded that "...tracking changes in the percentages of students performing at or above those cut scores (or in fact, any selected cut scores) can be of use in describing changes in student performance over time."¹⁷

The National Assessment Governing Board urges all who are concerned about student performance levels to recognize that the use of these achievement levels is a developing process and is subject to various interpretations. The Board and the Acting Commissioner believe that the achievement levels are useful for reporting trends in the educational achievement of students in the United States.¹⁸ In fact, achievement level results have been used in reports by the President of the United States, the Secretary of Education, state governors, legislators, and members of Congress. The National Education Goals Panel and government leaders in the nation and in more than 40 states use these results in their annual reports.

However, based on the congressionally mandated evaluations so far, the Commissioner agrees with the National Academy's recommendation that caution needs to be exercised in the use of the current achievement levels. Therefore, the Commissioner concludes that these achievement levels should continue to be considered developmental and should continue to be interpreted and used with caution.

16 Pellegrino, J.W., Jones, L.R., & Mitchell, K.J. (Eds.). (1998). Grading the nation's report card: Evaluating NAEP and transforming the assessment of educational progress. Committee on the Evaluation of National Assessments of Educational Progress, Board on Testing and Assessment, Commission on Behavioral and Social Sciences and Education, National Research Council (p. 182). Washington, DC: National Academy Press.

17 Ibid., p. 176.

18 Forsyth, R.A. (2000). A description of the standard-setting procedures used by three standardized test publishers. In Student performance standards on the National Assessment of Educational Progress: Affirmations and improvements. Washington, DC: National Assessment Governing Board.
Nellhaus, J.M. (2000). States with NAEP-like performance standards. In Student performance standards on the National Assessment of Educational Progress: Affirmations and improvements. Washington, DC: National Assessment Governing Board.


Interpreting NAEP Results

The average scores and percentages presented in this report are estimates because they are based on representative samples of students rather than on each individual student in the population(s). As such, the results are subject to a measure of uncertainty, reflected in the standard error of the estimates. The standard errors for the estimated scale scores and percentages in this report are provided in appendix C.

The differences between scale scores and between percentages discussed in the following chapters take into account the standard errors associated with the estimates. Comparisons are based on statistical tests that consider both the magnitude of the difference between the group average scores or percentages and the standard errors of those statistics. Throughout this report, differences between scores or between percentages are pointed out only when they are significant from a statistical perspective. All differences reported are significant at the .05 level with appropriate adjustments for multiple comparisons. The term significant is not intended to imply a judgment about the absolute magnitude or the educational relevance of the differences. It is intended to identify statistically dependable population differences to help inform dialogue among policymakers, educators, and the public.
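The comparison procedure described above can be sketched in code. The sketch below is illustrative only, not NAEP's exact procedure: it treats the two group estimates as independent, forms the usual z statistic from their standard errors, and uses a Bonferroni-style adjustment as one plausible stand-in for the multiple-comparison correction, which the report does not specify here. The function names and the example scores and standard errors are hypothetical.

```python
import math

def z_critical(alpha):
    """Two-sided critical value of the standard normal, found by bisection."""
    lo, hi = 0.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        # P(|Z| > mid) for a standard normal Z
        tail = math.erfc(mid / math.sqrt(2))
        if tail > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def significantly_different(mean1, se1, mean2, se2,
                            alpha=0.05, n_comparisons=1):
    """True if two independent estimates differ at the (adjusted) alpha level."""
    z = (mean1 - mean2) / math.sqrt(se1 ** 2 + se2 ** 2)
    # Bonferroni-style adjustment (an assumption, for illustration only):
    # split alpha evenly across the number of comparisons being made.
    return abs(z) > z_critical(alpha / n_comparisons)

# Hypothetical example: a 2-point gap between average scale scores can be
# statistically indistinguishable once both standard errors are considered,
# while a 5-point gap with the same standard errors is significant.
print(significantly_different(217, 0.8, 215, 1.0))  # False
print(significantly_different(217, 0.8, 212, 1.0))  # True
```

With standard errors near one point, a two-point gap falls short of the 1.96 critical value; this is why the report flags only differences that exceed the combined sampling uncertainty of both estimates.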


Readers are cautioned against interpreting NAEP results in a causal sense. Inferences related to subgroup performance or to the effectiveness of public and nonpublic schools, for example, should take into consideration the many socioeconomic and educational factors that may also impact reading performance.

This Report

This report describes fourth-grade students' performance on the NAEP 2000 reading assessment and compares the performance to previous NAEP assessment results. Chapter 1 presents overall scale score and achievement level results and also provides a more delineated view by showing scores at percentiles across the performance distribution. Chapter 2 examines assessment results for subgroups of students by gender, race/ethnicity, region of the country, type of location, type of school, and by eligibility for the free/reduced-price lunch program. In chapter 3, students' home and school experiences as portrayed by their responses to NAEP background questions provide the context for examining student assessment scores. Chapter 4 focuses on a second set of results from the 2000 reading assessment that includes the performance of special needs students who were permitted accommodations in the test administration. The chapter presents these results for the nation and selected subgroups of students, and compares them to the results presented in chapters 1 and 2.



Average Scale Score and Achievement Level Results for the Nation

This chapter presents the national results of the NAEP 2000 reading assessment at grade 4. Student performance is described by average scale scores on the NAEP reading composite scale and in terms of percentages of students who attained each of the three reading achievement levels: Basic, Proficient, and Advanced. Results of the NAEP 2000 reading assessment are compared with fourth-grade results from three previous assessments: 1992, 1994, and 1998. This comparison is possible because the assessments share a common set of reading tasks based on the current reading framework and because the populations of students were sampled and assessed using comparable procedures.

To illustrate the reading abilities demonstrated on the NAEP assessment, sample questions and actual student responses from the 2000 assessment are included in this chapter. Three sample questions are provided to show the kind of tasks students were asked to do and how they responded.

The concluding section of this chapter presents a map of selected item descriptions. The item map provides a more comprehensive portrait of student performance by placing item descriptions at the point on the NAEP scale where the question was likely to be answered successfully by students. By mapping item descriptions in this way, the reading skills and abilities associated with different points on the scale are visually represented.

Are the nation's fourth-graders making progress in reading? What reading abilities do they have?


Chapter Contents

Scale Score Results for the Nation
Achievement Level Results for the Nation
Sample Assessment Questions and Student Responses
Item Map



Average Scale Score Results

The results of the NAEP 2000 reading assessment at grade 4 show overall stability in student performance across the assessment years: the average reading scale score for 2000 was not significantly different from the 1992, 1994, and 1998 results. Figure 1.1 presents the average reading scale scores of fourth-grade students attending both public and nonpublic schools for the four assessments between 1992 and 2000.

Another way to view students' reading ability is by looking at the scale scores attained by students across the performance distribution. The advantage of looking at the data this way is that it shows how fourth-graders with lower or higher reading ability performed in relation to the national average. In addition, the percentile data show whether trends in the national average score are reflected in scores across the performance distribution. Figure 1.2 presents the reading scale scores for fourth-grade students at the 10th, 25th, 50th, 75th, and 90th percentiles for the NAEP reading assessments from 1992 to 2000.

While the 2000 national average score of 217 was not significantly higher or lower in comparison to fourth-graders' average scores in previous assessment years, stability across years is not reflected at all the percentiles. In fact, only at the 50th percentile, among students performing around the national average, have scores remained stable across assessments.

In 2000 the scores at both the 10th and 25th percentiles are higher in relation to 1994; however, at the 10th percentile the score in 2000 is lower than it was in 1992. A different trend is evident among higher-performing fourth-graders. At the 75th and 90th percentiles, scores have increased across assessment years, resulting in a 2000 score that is higher in relation to 1992.

[Figure 1.1: Average fourth-grade reading scale scores: 1992-2000. SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.]

These data indicate that the lowest-performing students in 2000 were not performing as well as their counterparts in 1992; however, the results for the highest-performing students indicate an increase in performance between 1992 and 2000.

Achievement Level Results

The results of student performance are not only reported using scores on the NAEP reading scale, but also using reading achievement levels as authorized by the NAEP legislation and as adopted by the National Assessment Governing Board.1 The achievement levels are performance standards adopted by the Board, based on the collective judgments of experts about what students should be expected to know and be able to do in terms of the NAEP reading framework. Viewing students' performance from this perspective provides some insight into the adequacy of students' knowledge and skills and the extent to which they achieved expected levels of performance.

The Board reviewed and adopted the recommended achievement levels in 1992, which were derived from the judgments of a broadly representative panel that included teachers, education specialists, and members of the general public. For each grade assessed, the Board has adopted three achievement levels: Basic, Proficient, and Advanced. For reporting purposes, the achievement level cut scores are placed on the NAEP reading scale, resulting in four ranges: the range below Basic, Basic, Proficient, and Advanced. Figure 1.3 presents specific descriptions of reading achievement for the levels at grade four.

[Figure 1.2: Fourth-grade reading scale scores at the 10th, 25th, 50th, 75th, and 90th percentiles: 1992-2000. * Significantly different from 2000. SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.]

1 The Improving America's Schools Act of 1994 (20 USC 9010) requires that the National Assessment Governing Board develop "appropriate student performance levels" for reporting NAEP results.

The NAEP legislation requires that achievement levels be "used on a developmental basis until the Commissioner of Education Statistics determines ... that such levels are reasonable, valid, and informative to the public." A discussion of the developmental status of achievement levels may be found in the introduction to this report.

Figure 1.3: Descriptions of fourth-grade reading achievement levels.

Basic (208): Fourth-grade students performing at the Basic level should demonstrate an understanding of the overall meaning of what they read. When reading text appropriate for fourth-graders, they should be able to make relatively obvious connections between the text and their own experiences and extend the ideas in the text by making simple inferences.

For example, when reading literary text, students should be able to tell what the story is generally about, providing details to support their understanding, and be able to connect aspects of the stories to their own experiences.

When reading informational text, Basic-level fourth-graders should be able to tell what the selection is generally about or identify the purpose for reading it, provide details to support their understanding, and connect ideas from the text to their background knowledge and experiences.

Proficient (238): Fourth-grade students performing at the Proficient level should be able to demonstrate an overall understanding of the text, providing inferential as well as literal information. When reading text appropriate to fourth grade, they should be able to extend the ideas in the text by making inferences, drawing conclusions, and making connections to their own experiences. The connection between the text and what the student infers should be clear.

For example, when reading literary text, Proficient-level fourth-graders should be able to summarize the story, draw conclusions about the characters or plot, and recognize relationships such as cause and effect.

When reading informational text, Proficient-level students should be able to summarize the information and identify the author's intent or purpose. They should be able to draw reasonable conclusions from the text, recognize relationships such as cause and effect or similarities and differences, and identify the meaning of the selection's key concepts.

Advanced (268): Fourth-grade students performing at the Advanced level should be able to generalize about topics in the reading selection and demonstrate an awareness of how authors compose and use literary devices. When reading text appropriate to fourth grade, they should be able to judge text critically and, in general, give thorough answers that indicate careful thought.

For example, when reading literary text, Advanced-level students should be able to make generalizations about the point of the story and extend its meaning by integrating personal experiences and other readings with the ideas suggested by the text. They should be able to identify literary devices such as figurative language.

When reading informational text, Advanced-level fourth-graders should be able to explain the author's intent by using supporting material from the text. They should be able to make critical judgments of the form and content of the text and explain their judgments clearly.


Achievement level results for the nation's fourth-grade students are presented in figure 1.4. Results are shown in two ways: the percentage of students within each achievement level interval, and the percentage of students at or above the Basic and at or above the Proficient achievement levels. In reading figure 1.4, it is necessary to keep in mind that the levels in the upper bars are cumulative; included among students who are considered to be at or above the Basic level are those who have also achieved the Proficient and Advanced levels of performance, and included among students who are considered to be at or above Proficient are those who have attained the Advanced level of performance.

[Figure 1.4: Percentage of fourth-grade students at or above reading achievement levels and within each achievement level range: 1992-2000. At or above Basic: 62 percent (1992), 60 percent (1994), 62 percent (1998), 63 percent (2000). At or above Proficient: 29 percent* (1992), 30 percent (1994), 31 percent (1998), 32 percent (2000). * Significantly different from 2000. NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding. SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.]

As shown in figure 1.4, performance at or above the Proficient level (the achievement level identified by NAGB as the level that all students should reach) was achieved by 32 percent of students in 2000, while the highest level of performance, Advanced, was achieved by 8 percent of fourth-graders in 2000.

Viewing achievement level results across assessment years shows that in 2000 a higher percentage of students were at or above the Proficient level and at the Advanced level in comparison to 1992. Corresponding to the increase at or above the Proficient level, figure 1.4 shows that a lower percentage of students fell within the Basic level in 2000 than in 1992.

Sample Assessment Questions and Student Responses

The following pages present sample questions and student responses that portray student performance on the 2000 NAEP reading assessment. Three questions, which were asked about an informative article, were selected to exemplify the range of reading abilities demonstrated by students. The full text of the informative article, as well as additional sample questions, can be found in appendix B.


The three questions to follow include a multiple-choice, a short constructed-response, and an extended constructed-response question. The correct oval is filled in for the multiple-choice question. For constructed-response questions, a summary of the scoring criteria used to rate students' responses is provided. Actual student responses have been reproduced from assessment test booklets to illustrate representative answers. The rating assigned to each sample response is indicated.

The tables in this section present two types of percentages for each sample question: the overall percentage of students who answered successfully, and the percentage of students who answered successfully within a specific score range on the NAEP reading composite scale. The score ranges correspond to the three achievement level intervals: Basic, Proficient, and Advanced. It should be noted that the overall percentage of students shown in these tables includes students in the range below Basic, as well as students whose performance fell within the three achievement level ranges.


Informative Article:

A Brick to Cuddle Up To is an informational passage describing what colonists did to keep warm during the winter months. The author includes descriptions of how colonial homes were heated, how colonists kept warm at night, and how they heated water for baths, as well as providing details that depicted differences between contemporary and colonial American life. (See appendix B for full text of passage.)

Sample Question

In writing this article, the author mostly made use of

○ broad ideas
● specific details (correct)
○ important questions
○ interesting characters

Reading Purpose: Information

Sample question 1 (multiple-choice). Overall percentage correct and percentages correct within each achievement level range: 2000

Overall percentage correct†: 66
Basic (208-237*): 72
Proficient (238-267*): 79
Advanced (268 and above*): …

† Includes fourth-grade students who were below the Basic level.
* NAEP reading composite scale range.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.


Sample Question

Do you think "A Brick to Cuddle Up To" is a good title for this article? Using information from the article, tell why or why not.

Reading Purpose: Information
Reading Stance: Developing Interpretation

Responses to this question were scored according to a three-level rubric as Unsatisfactory, Partial, or Complete.

Overall percentage "Complete" and percentages "Complete" within each achievement level range: 2000

Overall percentage "Complete"†: 37
Basic (208-237*): 38
Proficient (238-267*): 57
Advanced (268 and above*): 16

† Includes fourth-grade students who were below the Basic level.
* NAEP reading composite scale range.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Responses scored "Complete" demonstrated understanding of how the title relates to the central theme by indicating that the article described methods of keeping warm during winter in colonial times.

Sample "Complete" Response:

Do you think "A Brick to Cuddle Up To" is a good title for this article? Using information from the article, tell why or why not.

[Handwritten student response reproduced from the test booklet; rated "Complete."]


Sample Question

Pretend that you are an early American colonist. Describe at least three activities you might do during a cold winter evening. Be specific. Use details from the article to help you write your description.

Reading Purpose: Information
Reading Stance: Developing Interpretation

Responses to this question were scored according to a four-level rubric as Unsatisfactory, Partial, Essential, or Extensive.

Overall percentage "Essential" or better and percentages "Essential" or better within each achievement level range: 2000

Overall percentage "Essential" or better†: 18
Basic (208-237*): 15
Proficient (238-267*): 29
Advanced (268 and above*): …

† Includes fourth-grade students who were below the Basic level.
* NAEP reading composite scale range.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Responses scored "Essential" demonstrated comprehension of colonial life as portrayed in the article by providing three activities, some of which are related to the need to stay warm.

Sample "Essential" Response:

Pretend that you are an early American colonist. Describe at least three activities you might do during a cold winter evening. Be specific. Use details from the article to help you write your description.

[Handwritten student response reproduced from the test booklet; rated "Essential."]


Responses scored "Extensive" demonstrated comprehension of the central theme of the article. Of the activities provided, at least three focus on the need to stay warm.

Sample "Extensive" Response:

Pretend that you are an early American colonist. Describe at least three activities you might do during a cold winter evening. Be specific. Use details from the article to help you write your description.

[Handwritten student response reproduced from the test booklet; rated "Extensive."]


Map of Selected Item Descriptions

The reading performance of fourth-graders and aspects of reading comprehension can be illustrated by a map that positions item descriptions along the NAEP reading composite scale where items are likely to be answered successfully by students.2 The descriptions used on the map focus on the reading skill or ability needed to answer the question. For multiple-choice questions, the description indicates the comprehension demonstrated by selection of the correct response option; for constructed-response questions, the description takes into account the degree of comprehension demonstrated as specified by the different levels of scoring criteria for that question. An examination of the descriptions provides some insight into the range of comprehension processes demonstrated by fourth-grade students at different score points on the NAEP scale.

For each question indicated on the map, students who scored above the scale point had a higher probability of successfully answering the question, and students who scored below the scale point had a lower probability of successfully answering the question. The map identifies where individual comprehension questions were answered successfully by at least 65 percent of the students for constructed-response questions, or by at least 74 percent of the students for multiple-choice questions.3 For example, a multiple-choice question that asks students to identify the major topic of an article maps at 224 on the scale.

This means that fourth-grade students with an average scale score of 224 or more have at least a 74 percent chance of answering this question correctly. In other words, at least 74 out of every 100 students who score at or above 224 answered this question correctly. Although students scoring above the scale point had a higher probability of successfully answering the question, it does not mean that every student at or above 224 always answers this question correctly, nor does it mean that students below 224 always answered the question incorrectly. The item maps are a useful indicator of higher or lower probability of successfully answering the question depending on students' overall ability as measured by the NAEP scale.
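The different mapping criteria for the two question types (65 percent versus 74 percent) are consistent with a standard correction for guessing on a four-option multiple-choice question. Assuming the usual formula p* = p + c(1 - p), where c = 1/4 is the chance of a correct blind guess, the 65 percent constructed-response criterion translates to about 74 percent. This derivation is an inference from the stated numbers, not the report's documented procedure.

```python
# Guessing correction for a four-option multiple-choice question
# (assumed formula, for illustration): a student who "truly" answers
# correctly with probability p is observed correct with probability
# p + chance * (1 - p), since blind guesses succeed 1 time in 4.
p_constructed_response = 0.65   # mapping criterion for constructed response
chance = 1 / 4                  # probability of guessing one of 4 options

p_multiple_choice = p_constructed_response + chance * (1 - p_constructed_response)
print(round(p_multiple_choice * 100))  # 74
```

The adjusted value, 73.75 percent, rounds to the 74 percent convention cited in the report's footnote.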

In considering the information provided by the item maps, it is important to recognize that these descriptions are based on comprehension questions that were answered about specific passages. It is possible that questions intended to assess the same aspect of comprehension, when asked about different passages, would map at different points on the scale. In fact, one NAEP study found that even identically worded questions function differently (i.e., easier or harder) when associated with different passages, suggesting that the difficulty of a question resides not only in the question itself, but also in the interaction of the question with a particular passage.4

2 Details on the procedures used to develop item maps will be provided in the forthcoming NAEP 2000 Technical Report.

3 The probability convention is set higher, at 74 percent, for multiple-choice questions to correct for the possibility of answering correctly by guessing.

4 Campbell, J.R., & Donahue, P.L. (1997). Students selecting stories: The effects of choice in reading assessment. Washington, DC: National Center for Education Statistics.


Figure 1.5: Map of selected item descriptions on the National Assessment of Educational Progress (NAEP) reading scale at grade 4.

324  Use text ideas to elaborate a hypothetical situation (Sample Question 3)
312  Use metaphor to compare story characters
311  Contrast historical information to present day (Sample Question 7)
300  Provide and explain an alternative ending
299  Interpret story action to provide lesson characters learned
299  Describe character's changing feelings and explain cause
291  Use character trait to connect with prior knowledge
287  Extract relevant examples to support statement (Sample Question 5)
286  Compare story characters using text details
280  Use text description and prior knowledge to support opinion
274  Explain purpose of direct quotations
273  Identify author's illustration of theme through story action
Advanced (268)
264  Use text evidence to evaluate title (Sample Question 2)
263  Use different parts of text to provide supporting examples
259  Explain author's statement with text information
257  Infer character motivation from story setting
255  Evaluate author's presentation of information (Sample Question 6)
246  Identify main theme of story
Proficient (238)
240  Explain character's motivation
240  Identify author's use of specific details (Sample Question 1)
231  Use prior knowledge to make text-related comparison
230  Compare text ideas using specific information
225  Recognize meaning of specialized vocabulary from context
224  Identify main topic of article
218  Locate and provide explicitly stated information
212  Recognize a description of character's motivation
210  Recognize accurate description of character's feelings
Basic (208)
206  Identify defining character trait
200  Recognize explicitly stated information
192  Provide a personal reaction with minimal supporting explanation
188  Recognize genre of story
184  Identify main reason for reading article (Sample Question 4)
170  Identify character's main dilemma
166  Provide plot-level lesson characters learned

NOTE: Each grade 4 reading question was mapped onto the NAEP 0-500 reading scale. The position of the question on the scale represents the scale score attained by students who had a 65 percent probability of successfully answering a constructed-response question or a 74 percent probability of correctly answering a four-option, multiple-choice question. Only selected questions are presented. Scale score ranges for reading achievement levels are referenced on the map. (In the original figure, regular type denoted a constructed-response question and italic type a multiple-choice question.)
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.


Summary

This chapter presented overall results from the NAEP reading assessment for fourth-grade students. Results examined included the national average score, the percentile scores across the performance distribution, and attainment of the reading achievement levels.

The following figure displays the major findings presented in this chapter. In each line of the display, the average reading score, the percentile score, or the percentage of students at or above achievement levels is compared to that in the first assessment year under the current reading framework. An arrow pointing upward (↑) indicates a significant increase, a horizontal arrow (↔) indicates no significant change, and an arrow pointing downward (↓) indicates a significant decrease. For example, the first section of the display shows that the national average for fourth-grade students has not changed significantly since 1992, whereas the first line in the next section indicates that the score at the 90th percentile has increased since 1992.

National Average
↔ Scale score since 1992

Percentile Scores
↑ 90th percentile score since 1992
↑ 75th percentile score since 1992
↔ 50th percentile score since 1992
↔ 25th percentile score since 1992
↓ 10th percentile score since 1992

Achievement Level Results
↑ Percentage at Advanced since 1992
↑ Percentage at or above Proficient since 1992
↔ Percentage at or above Basic since 1992

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992 and 2000 Reading Assessments.


Average Scale Score and Achievement Level Results for Selected Subgroups of Fourth-Grade Students

Implicit in the call for high standards is the goal that all children, regardless of background or where they attend school, should be held to the same expectations and given the same opportunity for achievement. Within this larger context of reform, the performance results in this chapter may prove useful as one indicator of progress toward the nation's educational goals. Presented in this chapter are results for six subgroups of students with different demographic characteristics. Results are reported by: gender, race/ethnicity, region of the country, school's type of location, eligibility for the free/reduced-price lunch program, and type of school. For all subgroups except type of location and free/reduced-price lunch eligibility, results of the 2000 assessment can be compared to the results of previous assessments under the current framework: 1992, 1994, and 1998.

Results are presented both in terms of students' scores on the NAEP reading composite scale and in terms of the percentages of students attaining achievement levels. Scale score results represent students' actual performance on the assessment, and the achievement level results view that performance in relation to set expectations of what students should know and be able to do.

Are selected subgroups of students making progress in reading?


Chapter Contents

Gender
Race/Ethnicity
Trends in Scale Score Differences
Region of the Country
Type of Location
Eligibility for the Free/Reduced-Price Lunch Program
Type of School

CHAPTER 2 READING REPORT CARD 25


The reading achievement levels, Basic, Proficient, and Advanced, used to report NAEP results were established by the National Assessment Governing Board (NAGB) in 1992 for the content framework that provides the basis for the reading assessments. Descriptions of abilities associated with the three levels are presented in figure 1.3 in chapter 1 of this report.

The NAEP legislation (The Improving America's Schools Act of 1994) requires that achievement levels be "used on a developmental basis until the Commissioner of Education Statistics determines ... that such levels are reasonable, valid, and informative to the public." A discussion of the developmental status of the achievement levels is included in the introduction to this report.

Differences reported in this chapter between demographic subgroups for the 2000 assessment and between previous assessments are based on statistical tests that consider both the magnitude of the difference between the group average scores or percentages and the standard errors of those statistics. Differences between groups and between assessment years are discussed only if they have been determined to be statistically significant.
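The logic of such a test can be sketched as follows. This is a simplified illustration only, not NAEP's actual procedure (NAEP's reported standard errors account for its complex sampling design, and the report applies additional procedures when making many comparisons at once); the function name and the example values are hypothetical.

```python
import math

def significant_difference(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Two-sided z-test for the difference between two group means.

    Returns the score difference and whether it exceeds what could
    plausibly arise from sampling variability alone (roughly the
    0.05 level when z_crit is 1.96).
    """
    diff = mean_a - mean_b
    # Standard error of the difference between two independent means
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    return diff, abs(diff) / se_diff > z_crit

# Illustrative (hypothetical) values, not actual NAEP estimates:
diff, significant = significant_difference(222, 1.2, 212, 1.3)
```

The key point is that a difference of a few scale points may or may not be "real" depending on the standard errors of the two estimates, which is why apparent differences discussed in this chapter are not always statistically significant.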

Gender

Figure 2.1 presents average reading scale scores for male and female students across assessment years. In the 2000 assessment, no significant increase or decrease was observed for either male or female fourth-grade students. The increase achieved by male students between 1994 and 1998 did not continue or lead to a higher score in 2000. The average score for female fourth-graders has remained relatively consistent across assessment years. In 2000, results show female students continuing to outperform their male counterparts.

[Figure 2.1: Scale score results by gender — average fourth-grade reading scale scores, 1992-2000. Male: 213 (1992), 209 (1994), 214 (1998), 212 (2000); Female: 221 (1992), 220 (1994), 220 (1998), 222 (2000).]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



The percentages of male and female students at or above the reading achievement levels and within each achievement level range are presented in figure 2.2. Among male students, apparent changes in the percentages at or above any of the achievement levels across assessment years were not statistically significant. In 2000, however, the percentage of females at or above the Proficient level (36 percent) was higher in comparison to 1992 (32 percent). The achievement level results are consistent with the scale score results, in that they also show female students outperforming male students. Comparison of male and female performance in 2000 shows higher percentages of female fourth-graders at or above Basic, at or above Proficient, and at the Advanced level.

[Figure 2.2: Achievement level results by gender — percentages of fourth-grade students at or above the reading achievement levels, 1992-2000. Male — at or above Basic: 58% (1992), 55% (1994), 59% (1998), 58% (2000); at or above Proficient: 25% (1992), 26% (1994), 28% (1998), 27% (2000).]



[Figure 2.2 (continued): Female — at or above Basic: 67% (1992), 66% (1994), 65% (1998), 67% (2000); at or above Proficient: 32%* (1992), 34% (1994), 33% (1998), 36% (2000).]

*Significantly different from 2000.

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.

Race/Ethnicity

The background questionnaire administered with the 2000 NAEP reading assessment asked students to indicate the racial/ethnic subgroup that best described them. In the 1992, 1998, and 2000 reading assessments the mutually exclusive subgroup categories were: White, Black, Hispanic, Asian/Pacific Islander, and American Indian (including Alaskan Native). In 1994, a similar background question was asked, the only difference being that Asian and Pacific Islander were two separate response categories. To analyze changes in performance across assessment years, the separate response categories used in 1994 were combined into a single category, Asian/Pacific Islander, as used in the other assessment years.

Across assessment years, some changes are evident in the population of fourth-graders sampled by NAEP. The percentage of Hispanic fourth-grade students in the NAEP sample has increased from 9 percent in 1992 to 15 percent in 2000, while the percentage of white fourth-grade students has decreased from 71 percent in 1992 to 66 percent in 2000. The across-year percentages for all racial/ethnic subgroups can be found in table C.9 in appendix C.

The average reading scale scores for students by racial/ethnic subgroup across assessment years are presented in figure 2.3. Of the five racial/ethnic subgroups, only Asian/Pacific Islander students showed overall gains since 1992. The 2000 average score of students in each of the other subgroups did not differ significantly from the 1992 average score.

Although black and Hispanic students' average scores had declined in 1994, slight rebounds since that time are evident for both groups. For black students only, however, this resulted in a 2000 average score that was significantly higher than that in 1994. Any apparent differences between

[Figure 2.3: Scale score results by race/ethnicity — average fourth-grade reading scale scores for White, Black, Hispanic, Asian/Pacific Islander, and American Indian students, 1992-2000. *Significantly different from 2000.]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


years in the average scores of white or American Indian students were not statistically significant. Comparisons of subgroup performance in 2000 show white and Asian/Pacific Islander students outperforming their black, Hispanic, and American Indian peers.

Achievement level results for racial/ethnic subgroups are presented in figure 2.4. Consistent with scale score results, achievement level results show that the only group of students with significant gains was Asian/Pacific Islander students. A higher percentage of Asian/Pacific Islander students were at or above the Proficient level in 2000 than in 1992. For other racial/ethnic subgroups, any apparent differences from past assessments observed in the percentages of students at or above the Basic and Proficient achievement levels were not statistically significant. Comparisons of achievement level results between subgroups in 2000 show higher attainment for white and Asian/Pacific Islander students. The percentages of white and Asian/Pacific Islander students at or above Proficient and at or above Basic were higher than the percentages for all other racial/ethnic groups.

[Figure 2.4: Achievement level results by race/ethnicity — percentages of fourth-grade students at or above the reading achievement levels, 1992-2000. White — at or above Basic: 71% (1992), 71% (1994), 73% (1998), 73% (2000); at or above Proficient: 35% (1992), 37% (1994), 39% (1998), 40% (2000).]



[Figure 2.4 (continued): Black — at or above Basic: 33% (1992), 31% (1994), 36% (1998), 37% (2000); at or above Proficient: 8% (1992), 9% (1994), 10% (1998), 12% (2000). Hispanic — at or above Basic: 44% (1992), 36% (1994), 40% (1998), 42% (2000); at or above Proficient: 16% (1992), 13% (1994), 13% (1998), 16% (2000).]



[Figure 2.4 (continued): Asian/Pacific Islander — at or above Basic: 59% (1992), 75% (1994), 69% (1998), 78% (2000); at or above Proficient: 25%* (1992), 44% (1994), 37% (1998), 46% (2000). American Indian — at or above Basic: 53% (1992), 48% (1994), 47% (1998), 43% (2000); at or above Proficient: 18% (1992), 18% (1994), 14% (1998), 17% (2000).]

*Significantly different from 2000.

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Trends in Scale Score Differences Between Selected Subgroups

As with previous NAEP reading assessments, results from the 2000 NAEP continue to show performance differences between racial/ethnic subgroups and between male and female students. Such differences between the average assessment scores of student groups should be interpreted cautiously. The average score of a selected group does not represent the entire range of performance within that group. Furthermore, differences between groups of students cannot be attributed solely to group identification. A complex array of educational and social factors interacts to affect average student performance.

Scale score differences between selected subgroups of students from 1992 to 2000 are displayed in figure 2.5. In the 2000 assessment, white students outperformed their black and Hispanic peers by 33 and 29 scale points, respectively. Female students outperformed male students by 10 scale points.

Little change is seen since 1992 in the magnitude of the gap between these groups of students. The gap between white and black, and between white and Hispanic students, has varied somewhat across the assessment years, but none of the changes were statistically significant. The gap between male and female students has also fluctuated slightly; however, the 4-point increase in score differences between males and females between 1998 and 2000 represents a statistically significant widening of the gender gap.

[Figure 2.5: Scale score differences between selected subgroups — differences in average fourth-grade reading scale scores (Female-Male, White-Black, and White-Hispanic), 1992-2000. In 2000, the Female-Male difference was 10 points, the White-Black difference 33 points, and the White-Hispanic difference 29 points. *Significantly different from 2000.]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Region of the Country

This section examines results for four regions of the country: Northeast, Southeast, Central, and West. A listing of the states that are within these regions is provided in appendix A.

Figure 2.6 presents scale score results by region. Most of the gains made between 1994 and 1998 by fourth-graders in the Northeast were maintained; their average score in 2000 was higher than in 1994, but not significantly different from 1992. For

[Figure 2.6: Scale score results by region — average fourth-grade reading scale scores for the Northeast, Southeast, Central, and West, 1992-2000. *Significantly different from 2000.]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



the other regions, student performance in 2000 varied only slightly from previous assessment years, and none of the apparent variations were statistically significant.

Comparisons between the regions in the 2000 assessment show fourth-graders in the Northeast and Central regions outperforming their peers in the Southeast and the West.

As shown in figure 2.7, achievement level results were stable for all the regions. No increases or decreases were observed in the percentages of students at or above any of the achievement levels. While the percentage of students in the Northeast who were at or above the Basic level had increased between 1994 and 1998, the percentage at or above Basic in 2000 was not significantly higher than in past assessments.

Comparisons of achievement level results between the regions show higher attainment by fourth-grade students in the Northeast and Central regions. The Northeast region had a higher percentage of students at the Proficient level and at the Advanced level than the Southeast. The percentages of Northeast students at or above Proficient and at or above the Basic level were higher than both the Southeast and the West. The Central region had higher percentages of students at Proficient, as well as at or above Proficient and at or above Basic, than the Southeast.

[Figure 2.7: Achievement level results by region — percentages of fourth-grade students at or above the reading achievement levels, by region of the country, 1992-2000. Northeast — at or above Basic: 66% (1992), 61% (1994), 70% (1998), 67% (2000); at or above Proficient: 34% (1992), 31% (1994), 38% (1998), 37% (2000).]



[Figure 2.7 (continued): Southeast — at or above Basic: 58% (1992), 55% (1994), 56% (1998), 57% (2000); at or above Proficient: 24% (1992), 25% (1994), 25% (1998), 26% (2000). Central — at or above Basic: 66% (1992), 66% (1994), 68% (1998), 66% (2000); at or above Proficient: 30% (1992), 34% (1994), 35% (1998), 35% (2000).]



[Figure 2.7 (continued): West — at or above Basic: 59% (1992), 59% (1994), 57% (1998), 61% (2000); at or above Proficient: 27% (1992), 29% (1994), 27% (1998), 30% (2000).]

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.

Type of Location

The schools of students who participate in the NAEP assessments are classified according to type of location. Based on Census Bureau definitions of metropolitan statistical areas, population size, and density, the three mutually exclusive categories are: central city, urban fringe/large town, and rural/small town. Due to a change in how locations were identified in 2000, comparisons to previous assessment years based on type of location are not possible. An explanation of this change can be found in appendix A.

[Figure: Scale score results by type of location — average fourth-grade reading scale scores by type of school's location, 2000. Central city: 209; Urban fringe/large town: 222; Rural/small town: 218.]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.


Consistent with past performance patterns when comparing location types, students in central city schools had a lower average reading score than their peers who attend schools in other types of location.

Figure 2.8 presents achievement level results by type of location. Comparisons of achievement level results between locations show lower percentages of central city students at or above the Proficient level and at or above the Basic level of performance than their peers in urban fringe and rural locations. The slight differences between locations in the percentage of students who attained the Advanced level were not statistically significant.

[Figure 2.8: Achievement level results by type of location — percentages of fourth-grade students at or above the reading achievement levels, by type of school's location, 2000. Central city — at or above Basic: 53%; at or above Proficient: 26%. Urban fringe/large town — at or above Basic: 68%; at or above Proficient: 36%. Rural/small town — at or above Basic: 65%; at or above Proficient: 32%.]

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.



Eligibility for the Free/Reduced-Price Lunch Program

Funded by the U.S. Department of Agriculture (USDA) as part of the National School Lunch Program (NSLP), the free/reduced-price lunch program is available to public schools, nonprofit private schools, and residential childcare institutions. Eligibility for free or reduced-price lunch is determined through the USDA Income Eligibility Guidelines, whereby students from families near or below the poverty line can be assured of having a wholesome meal at school.1

While NAEP first collected information on student eligibility for the federally funded NSLP in 1996, it was not until 1998 that this indicator of poverty was reported for students participating in the NAEP reading assessment. Thus, comparisons can only be made between the two most recent assessments. As shown in figure 2.9, there was no change in average scores between the two assessment years for noneligible students, and the apparent decrease for eligible students was not statistically significant. In 2000, the score for eligible students remained significantly lower than that for students not eligible to receive a free or reduced-price lunch.

[Figure 2.9: Scale score results by free/reduced-price lunch eligibility — average fourth-grade reading scale scores by student eligibility for the free/reduced-price lunch program, 1998-2000. Eligible: 198 (1998), 196 (2000); Not eligible: 227 (1998), 227 (2000); Eligibility status not available: 227 (1998), 228 (2000).]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.

1 U.S. General Services Administration. (1999). Catalogue of federal domestic assistance. Washington, DC: Executive Office of the President, Office of Management and Budget.



As shown in figure 2.10, no change was observed since 1998 in the percentages of eligible and noneligible students attaining specific achievement levels. As with average scale score results, achievement level results also show lower performance among students eligible for the program. Fourteen percent of eligible students performed at or above the Proficient achievement level, in comparison to 41 percent of noneligible students. Among fourth-graders who were eligible for the program, 60 percent were below the Basic level.

[Figure 2.10: Achievement level results by free/reduced-price lunch eligibility — percentages of fourth-grade students at or above the reading achievement levels, by student eligibility for the free/reduced-price lunch program, 1998-2000. Eligible — at or above Basic: 42% (1998), 40% (2000); at or above Proficient: 13% (1998), 14% (2000). Not eligible — at or above Basic: 73% (1998), 74% (2000); at or above Proficient: 40% (1998), 41% (2000).]



[Figure 2.10 (continued): Eligibility status not available — at or above Basic: 73% (1998), 74% (2000); at or above Proficient: 40% (1998), 42% (2000).]

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.

Type of School

The schools of students who take the NAEP reading assessment are first classified overall as either public or nonpublic. A further distinction is then made within the nonpublic classification between schools that are Catholic and other nonpublic schools. Differences in performance between public and nonpublic schools surveyed and reported by the NAEP reading assessment have consistently shown that students attending the various types of nonpublic schools outperform their public school peers. Despite this consistent pattern in performance results, readers are cautioned against making assumptions about the comparative quality of instruction in public and nonpublic schools. Socioeconomic and sociological factors that may affect student performance should be considered when interpreting these results.

Average reading scale scores by type of school are presented in figure 2.11. In 2000, the average scores of students attending any of the types of school did not differ significantly from 1998 or from past assessment years.

Comparison of scale score results between the types of schools in 2000 shows nonpublic school students outperforming public school students. Students at Catholic and other nonpublic schools also had higher average scores than fourth-graders attending public schools.



[Figure 2.11: Scale score results by type of school — average fourth-grade reading scale scores for public, nonpublic, nonpublic Catholic, and other nonpublic schools, 1992-2000.]

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Achievement level results by type of school are presented in figure 2.12. As with scale score results, achievement level results in 2000 show no significant increases or decreases in the percentages of students at or above the Basic or Proficient achievement levels as compared to past assessment years.

Comparison of 2000 achievement level results between types of schools shows fourth-grade students in nonpublic schools outperforming their peers in public schools. All classifications of nonpublic schools had higher percentages of students at or above Basic, at Proficient, and at or above Proficient than did public schools. Among schools classified as "nonpublic," a higher percentage of students attained the Advanced level of achievement than did their public school peers. Within the types of nonpublic schools, only those classified as "other nonpublic" had a higher percentage of students at the Advanced level than did public schools. The apparent difference between the percentages of students at the Advanced level in public and Catholic schools was not statistically significant.

[Figure 2.12: Achievement level results by type of school — percentages of fourth-grade students at or above the reading achievement levels, 1992-2000. Public — at or above Basic: 60% (1992), 59% (1994), 61% (1998), 60% (2000); at or above Proficient: 27% (1992), 28% (1994), 29% (1998), 30% (2000).]


Page 58: ERIC - Education Resources Information Center ...285; for a PowerPoint slide presentation related to this report, see CS 014 286. AVAILABLE FROM ED Pubs, P.O. Box 1398, Jessup, MD

Figure 2.12. Achievement Level Results (continued). Percentages of fourth-grade students within and at or above reading achievement levels, by type of school: 1992-2000

Nonpublic
  At or above Basic:      79% (1992), 77% (1994), 78% (1998), 80% (2000)
  At or above Proficient: 45% (1992), 43% (1994), 46% (1998), 47% (2000)

Nonpublic: Catholic
  At or above Basic:      76% (1992), 76% (1994), 79% (1998), 78% (2000)
  At or above Proficient: 41% (1992), 42% (1994), 46% (1998), 44% (2000)


Figure 2.12. Achievement Level Results (continued). Percentages of fourth-grade students within and at or above reading achievement levels, by type of school: 1992-2000

Other Nonpublic
  At or above Basic:      84% (1992), 80% (1994), 76% (1998), 82% (2000)
  At or above Proficient: 53% (1992), 46% (1994), 46% (1998), 51% (2000)

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP),

1992, 1994, 1998, and 2000 Reading Assessments.

Summary

This chapter presented results from the NAEP reading assessment for fourth-grade students in different subgroups. The subgroups examined were gender, race/ethnicity, region of the country, type of location, eligibility for the free/reduced-price lunch program, and type of school.

The following figure displays the major findings presented in this chapter. In each line of the display, the 2000 average reading score of a particular group, or the percentage of students at or above the Proficient achievement level for a particular group, is compared to that in the first assessment year under the current reading framework or to the first year that data were available. Arrows pointing upward indicate significant increases, horizontal arrows indicate no significant change, and arrows pointing downward indicate significant decreases. For example, the first section under gender indicates that there has been no significant increase since 1992 in the average score for either male or female fourth-graders, but there has been an increase since 1992 in the percentage of females at or above the Proficient achievement level.


[Summary display: 2000 average scale scores and percentages of students at or above Proficient compared with the first assessment year, shown by arrows for each subgroup. Gender: female, male (since 1992). Race/Ethnicity: White, Black, Hispanic, Asian/Pacific Islander, American Indian (since 1992). Region: Northeast, Southeast, Central, West (since 1992). Eligibility for the Free/Reduced-Price Lunch Program: eligible, not eligible (since 1998). Type of School: public, nonpublic, nonpublic: Catholic, other nonpublic (since 1992).]

NOTE: Years are shown in boldface when the comparison year is different from the initial assessment year because earlier data are not available.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992 and 2000 Reading Assessments.


School and Home Contexts for Learning

This chapter presents information on the two contexts that largely form a child's world and contribute most to learning: school and home. Aspects of both school and home are considered because what students do at school and at home contributes, in different ways, to reading development. What students learn and do while at school may be reinforced in the home when literacy-related activities are interwoven into daily life. Both classrooms and homes can be print-rich environments that can accommodate talking about books or reading aloud from them. Quiet time to read on one's own or to write down thoughts about what has been read ideally occurs not only at school, but also at home. Such connections between the worlds of school and home communicate the value of learning and encourage students' reading development.1

Have school and home factors related to learning changed since 1992?

The information in this chapter is based on students' responses to background questions administered as part of the NAEP 2000 reading assessment. Results from the 2000 assessment are compared to 1992, 1994, and 1998 results. The percentage of students who selected each response option and the average scale scores are presented for each contextual variable reported. Thus, it is possible to examine the relationship between students' school and home experiences and their performance on the NAEP assessment. Readers of this report are reminded that the relationship between a contextual variable and reading performance is not necessarily causal.

Chapter Focus: How do certain school and home factors relate to reading progress?

1 Baker, L. (1999). Opportunities at home and in the community that foster reading engagement. In J. Guthrie and D. Alvermann (Eds.), Engaged reading: Processes, practices, and policy implications (pp. 105-133). New York: Teachers College.


Students' Reports on Their Reading in and for School

Students' Reports on Literacy with Family and Friends


Reading in and for School

As with the acquisition of many skills, practice is important to reading development. Fluency, vocabulary, and comprehension may improve in relation to the amount of reading accomplished. While guided oral reading has been shown to effectively enhance a variety of reading skills, that increased silent reading leads to higher reading achievement has not been confirmed by research studies. Yet, it is generally agreed that practice in reading develops better readers.2

Pages Read in School and for Homework. Students' reports in 2000 indicate a consistent relationship between the daily amount of reading done in school and for homework and reading performance.

Table 3.1
Students' reports on the number of pages read each day in school and for homework: 1992-2000

As shown in table 3.1, 80 percent of the students reported reading at least 6 pages daily and 60 percent reported reading 11 or more pages each day. Higher numbers of pages read daily are associated with higher average reading scale scores. Fourth-graders who reported reading 11 or more pages daily had the highest average score, outperforming their peers who reported reading fewer pages.

Looking across assessment years indicates that fourth-graders currently are reading

Grade 4                         1992   1994   1998   2000

11 or more pages
  Percentage of Students          56*    54*    57     60
  Average Scale Score            222    220    221     --

6 to 10 pages
  Percentage of Students          22     22     22     20
  Average Scale Score            217    214    217    215

5 or fewer pages
  Percentage of Students          21     --     21     19
  Average Scale Score            203    201    207    202

Students who read the most pages daily scored highest.

* Significantly different from 2000.

NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.

2 National Reading Panel. (2000). Report of the National Reading Panel: Teaching children to read: An evidence-based assessment of the scientific research on reading and its implications for reading instruction: Reports of the subgroups (pp. 3-15 to 3-38). Washington, DC: National Institute of Child Health & Human Development, National Institutes of Health.


more pages on a daily basis. The percentage of fourth-graders who reported reading the most pages, 11 or more daily, was higher in 2000 than the percentages had been in both 1992 and 1994.

Time Spent Doing Homework. While 2000 results showed students who reported more daily reading in school or for homework outperforming their peers, students' reports on the amount of time spent each day on homework do not necessarily indicate that more time is better.

Table 3.2
Students' reports on the amount of time spent doing homework each day: 1992-2000

As shown in table 3.2, 72 percent of fourth-grade students reported spending one-half to one hour a day on homework. These students outperformed their peers who reported spending more than one hour on homework each day and those who reported either not doing or not having homework. Students who reported not having any homework or spending more than one hour on homework had similar scores.

[Table 3.2 data, partially legible: the percentage of students and average scale score are reported for 1992, 1994, 1998, and 2000 for five response options. Legible values include: More than one hour, 15-16 percent of students, average scores 208-213; One hour, 28-31 percent, average scores 218-221; One-half hour, including 39* percent in one year, average scores 216-221; Do not do homework, 2-3 percent, with the lowest average scores (172-196); Do not have homework, 8-16 percent, average scores 212-220.]

Most students spent one-half to one hour doing homework each day.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Some variation can be observed in students' reports about time spent daily on homework since previous NAEP assessments. In 2000, the percentage of fourth-graders who reported spending one-half hour on homework each day was higher in comparison to both 1992 and 1994. While the percentages of students who reported not having homework have been relatively small across assessments, in 2000 this percentage of students was significantly lower than in 1992 and 1994.

Writing About Reading. Writing about reading has become a valued part of instruction that provides learners with ways of thinking about their ideas and exploring content domains.3 As the framework that underlies the NAEP assessment views reading as an active process, many of the assessment questions ask students to write out their own answers rather than to select a multiple-choice option. How prepared are fourth-graders to write out their own answer about reading by having been asked to do so in school?

Fourth-graders who took the NAEP reading assessment were asked how frequently during the school year they had been asked to write long answers to questions on tests and assignments that involved reading. As shown in table 3.3, 81 percent of fourth-grade students reported writing long answers about reading at least once or twice a month. Students who reported being asked to produce such answers on a monthly basis had a higher average score than fourth-graders asked to do so weekly and those asked to do so less frequently. While 2000 results show the highest performance among students who were asked to engage in this activity once or twice monthly, students asked to write out long answers weekly outperformed their peers who reported doing so yearly or never or hardly ever. The lowest reading performance was associated with never or hardly ever writing long answers about reading.

The increase in the frequency of this activity at grade 4 observed in 1998 was maintained by the most recent student reports. In 2000, the percentage of students who reported being asked to write long answers to questions that involved reading at least weekly was again higher in comparison to 1994. A decrease since 1994 was observed in the small percentage of students reporting never being asked to write long answers to questions about reading.

3 McGinley, W., & Tierney (1989). Traversing the topical landscape. Written Communications, 6(3), 243-269.


Table 3.3
Students' reports on how often they write long answers to questions on tests or assignments that involved reading: 1992-2000

Grade 4                         1992   1994   1998   2000

At least once a week
  Percentage of Students          51     48*    53     --
  Average Scale Score            220    217    218    217

Once or twice a month
  Percentage of Students          28     31*    30     --
  Average Scale Score            221    221    223    225

Once or twice a year
  Percentage of Students          13     12     10     11
  Average Scale Score            209    209    212    210

Never or hardly ever
  Percentage of Students           9      9*     8      8
  Average Scale Score            202    198    199    199

Students who wrote long answers about reading monthly scored highest.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Teachers' Help with Words. The last school factors examined in this chapter look at the frequency of two distinctly different types of word-level activity and their relationship to fourth-grade reading performance. In 1998 and 2000, students who took the NAEP assessment were asked to report how often their teacher helped them to break words into parts and how often their teacher helped them to understand new words. Results indicate that these two activities had a different relationship to performance.

Help with breaking words into parts had a consistent negative relation to fourth-grade reading performance as demonstrated on the NAEP assessment. As shown in table 3.4, the more frequently students received this help, the lower their average score; whereas fourth-graders who reported that their teachers never helped them break words into parts had the highest average score. These results are consistent with research findings that have shown this type of word-level reading instruction most effective in kindergarten and the 1st grade, before children have learned to read independently, and also effective in helping older struggling readers.4 The 53 percent of fourth-graders who reported never or rarely needing this type of help demonstrated the highest comprehension skills, outperforming those fourth-graders who were still focusing more on recognizing individual words rather than on comprehending the meaning of the text as a whole. If by the fourth grade students still require daily or weekly help breaking words into parts, it is likely that these students are having difficulty learning to read.

While breaking words into parts is useful in the early stages of learning to read, learning new words is a part of reading at all stages of development. By fourth grade, students are beginning to explore content areas and may encounter vocabulary that is specific to that content. Vocabulary instruction, either by explaining new words prior to reading or by explaining them in the context of the reading experience, can facilitate fourth-graders' comprehension of new content material.5 Broadly stated, help breaking words into parts is associated with learning to read, and help with new words is more closely linked with reading to learn. As shown in table 3.4, students who reported receiving a moderate amount of help with new words, weekly or monthly, had higher average scores than fourth-grade students who reported receiving this help either every day or never/hardly ever. It may be that the fourth-graders who required help weekly or monthly are reading independently and encountering texts that challenge them with some new words.

4 National Reading Panel. (2000). Report of the National Reading Panel: Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups (pp. 2-81 to 2-130). Washington, DC: National Institute of Child Health & Human Development, National Institutes of Health.

Foorman, B., Francis, D., Fletcher, Schatschneider, C., & Mehta, P. (1998). The role of instruction in learning to read: Preventing reading failure in at-risk children. Journal of Educational Psychology, 90, 37-55.

5 National Reading Panel. (2000). Report of the National Reading Panel: Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups (pp. 4-15 to 4-25). Washington, DC: National Institute of Child Health & Human Development, National Institutes of Health.

Brett, A., Rothlein, L., & Hurley, M. (1996). Vocabulary acquisition from listening to stories and explanations of target words. Elementary School Journal, 96(4), 415-422.


Table 3.4
Students' reports on how often their teachers help them break words into parts and help them understand new words: 1998-2000

Grade 4                         1998   2000

How often their teachers help them break words into parts
Every day
  Percentage of Students          25     25
  Average Scale Score            210    209
Once or twice a week
  Percentage of Students          23     22
  Average Scale Score            217    217
Never or hardly ever
  Percentage of Students          52     53
  Average Scale Score            226     --

How often their teachers help them understand new words
Every day
  Percentage of Students          49     48
  Average Scale Score            217    217
Once or twice a week
  Percentage of Students          24     23
  Average Scale Score            224     --
Once or twice a month
  Percentage of Students          14     14
  Average Scale Score            223     --
Never or hardly ever
  Percentage of Students          12*    14
  Average Scale Score            219    216

Students who were helped with new words weekly or monthly scored highest.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Literacy with Family and Friends

Learning is not limited to the classroom but is a process that continues in, and is shaped by, home environments and social interactions. Families use print for various activities on a daily basis, and different cultural groups have unique ways of integrating oral and written language with daily social life.6 Occasions for learning and for enhancing reading skills ideally coincide with the behaviors of everyday life. This section of the chapter looks at literacy-related activities that occur outside of school and their relationship to student performance on the 2000 NAEP reading assessment.

Reading for Fun. As noted earlier, the amount of reading students do in and for school on a daily basis had a positive relationship to performance on the NAEP reading assessment. If students, however, regard reading only as a school-related activity, as a duty rather than a pleasure, their future prospects for reading to understand themselves and the world are limited. Reading on one's own not only extends comprehension skills, but also enhances the understanding of what happens in life. While reading stories, for instance, children use literature to make connections and comparisons, providing them with a perspective that may exceed the boundaries of their immediate experience.7 When the act of reading extends beyond the schoolroom and becomes part of daily life, ongoing literacy is on its way to becoming a reality.

As shown in table 3.5, 75 percent of fourth-grade students reported reading for fun at least once or twice a week. More frequent reading for fun showed a positive relationship to scores. Students who reported reading for fun daily had higher average scores than students who reported reading for fun less frequently.

While students' reports indicate relatively stable percentages of students engaged in reading on their own across assessment years, the percentage of students in 2000 who reported never reading for fun on their own was significantly higher than in 1994. This contrasts with students' reports on school-related reading, noted earlier in this chapter, which showed higher percentages of students reading more pages daily in school and for homework than in 1992 and 1994.

6 Gee, J. (1990). Social linguistics and literacies. London: Falmer.

7 Wolf, S.A., & Heath, S.B. (1992). The braid of literature: Children's world of reading. Cambridge, MA: Harvard University Press.


Table 3.5
Students' reports on how often they read for fun on their own time: 1992-2000

Grade 4                         1992   1994   1998   2000

Every day
  Percentage of Students          44     45     --     43
  Average Scale Score            223    223    222    218

Once or twice a week
  Percentage of Students          32     32     32     32
  Average Scale Score            218    213    219     --

Once or twice a month
  Percentage of Students          12     12     12     12
  Average Scale Score            210    208    216    216

Never or hardly ever
  Percentage of Students          --     --     --     13
  Average Scale Score            199    197    203    202

Students who read for fun daily scored highest. Three-quarters of the students read for fun at least weekly.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Discussing Studies and Talking About Reading. Reading, whether for school or for fun, is only one aspect of literacy development. Equally important is the social dimension that brings books off the shelves and out of bookbags into the dynamics of daily life. Social interactions that provide students with the opportunity to talk with others about their reading enhance literacy skills and encourage further reading activity.8 Fourth-grade students in the NAEP assessment were asked to indicate how frequently they discussed their studies at home or talked about reading with family and friends. As shown in table 3.6, students' reports suggest a distinction between the two types of discourse. Students who discussed their studies at home, however frequently, had higher average reading scores than students who reported never discussing their studies at home. There was no significant difference between the scores of those who did so daily, weekly, or monthly.

Somewhat more frequent talk with family and friends about their reading showed a positive relation to student performance. Fourth-graders who responded that they engaged in conversation about their reading with family or friends on a weekly basis had a higher average score than students who reported engaging in such talk daily, monthly, or never. Students who reported never or rarely doing so had the lowest average score.

The percentages of students reporting various frequencies of talking about studies or their reading outside of school have remained relatively stable across assessment years. No significant change was observed in student reports of how often they discuss studies at home. An increase since 1994 was observed, however, in the percentage of students who reported never or hardly ever talking about their reading with family or friends. Considering the positive relationship this activity has with reading achievement, it is disappointing to note that nearly one-quarter of students reported never or hardly ever doing so.

8 Guthrie, Shafer, W.D., Yang, Y.Y., & Afflerbach, P. (1995). Relationships of instruction on reading: An exploration of social, cognitive and instructional connections. Reading Research Quarterly, 30, 8-25.


Table 3.6
Students' reports on how often they discuss their studies at home and talk about reading with their family and friends: 1992-2000

Grade 4                         1992   1994   1998   2000

Discuss studies at home
Almost every day
  Percentage of Students          54     55     54     --
  Average Scale Score            221    219    220    221
Once or twice a week
  Percentage of Students          22     22     23     --
  Average Scale Score            220    215    222    219
Once or twice a month
  Percentage of Students           6      6      6     --
  Average Scale Score            215    208    213    217
Never or hardly ever
  Percentage of Students          17     17     18     17
  Average Scale Score            202    199    205    201

Talk about reading with family or friends
Almost every day
  Percentage of Students          26     28     27     --
  Average Scale Score            215    213    211    213
Once or twice a week
  Percentage of Students          36     36     35     --
  Average Scale Score            224    223    223    227
Once or twice a month
  Percentage of Students          15     15     15     --
  Average Scale Score            219    214    222    220
Never or hardly ever
  Percentage of Students          23     21*    23     24
  Average Scale Score            209    207    214    209

Students who discussed studies at home, however frequently, outperformed students who never did so. Students who talked about reading with family or friends weekly scored highest.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Reading Materials in the Home. One indication that a household values literacy as an inherent part of everyday life is the presence of a variety of reading materials. Students who participated in the NAEP reading assessment were asked to indicate if their family regularly received a newspaper or magazines, if there was an encyclopedia at home, and to approximate how many books were in the home. Students were classified as having books in the home if they reported having more than 25 books. Reports on these four different types of reading materials are presented in table 3.7.

Results of the 2000 NAEP reading assessment are consistent with the findings of previous assessments that showed higher average scores among students who reported having more types of reading materials at home.9 Students who reported having all four types of materials in their home had higher average scores than students who reported having three types of materials; those who indicated the presence of three types had in turn a higher average score than students indicating the presence of two or fewer types of reading materials.

Perhaps the availability of newspapers, magazines, and encyclopedias in electronic form has influenced the number of different types of reading materials in the home reported by students. Although there was no significant change since 1998, a lower percentage of students in 2000 reported having all four types of reading materials at home in comparison to 1994 reports.

9 Donahue, P.D., Voekl, K.R., Campbell, J.R., & Mazzeo, J. (1999). NAEP's 1998 reading report card for the nation and the states. National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Campbell, J.R., Donahue, P.D., Reese, C.M., & Phillips, G.W. (1996). NAEP's 1994 reading report card for the nation and the states. National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.


Table 3.7
Students' reports on the number of different types of reading materials in the home: 1992-2000

Grade 4                         1992   1994   1998   2000

Four
  Percentage of Students          37     38*    37     --
  Average Scale Score            226    227    228    229

Three
  Percentage of Students          32     34     33     --
  Average Scale Score            219    216    220    219

Two or fewer
  Percentage of Students          31     29*    30     32
  Average Scale Score            204    197    204    203

Two-thirds of students had three or more types of reading materials in their homes. Students with more types of materials scored higher.

* Significantly different from 2000.NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Time Spent Watching Television. The last home variable to be considered in this chapter is the amount of time students reported watching television daily. While the availability of technologies including television and computers has been associated with reading engagement,10 and although many educationally oriented programs are aired, concern remains that television watching interferes with the time students spend on more active literacy pursuits.

Students' reports on the amount of time they spend watching television are presented in table 3.8. Results from the NAEP 2000 reading assessment once again show that watching many hours of television daily has a negative relationship to reading performance. Students who reported watching the most television, six hours or more, had the lowest average reading score, and those who reported watching four to five had the next lowest score. Fourth-grade students who reported watching less television, either two or three hours or an hour or less daily, had higher and similar scores.

A clear trend in whether students are watching more or less television is not evident. Although a significantly higher percentage of fourth-grade students in 2000 than in 1998 reported watching six or more hours of television daily, this percentage is lower than it was in 1994. However, results also show increases in the percentages of students watching three or fewer hours of television daily. The percentage of students in 2000 who reported watching two or three hours was higher than in 1994, and the percentage who reported watching one hour or less was higher in comparison to both 1992 and 1994.

10 Baker, L. (1999). Opportunities at home and in the community that foster reading engagement. In J. Guthrie & D. Alvermann (Eds.), Engaged reading: Processes, practices, and policy implications (pp. 105-133). New York: Teachers College Press.


Table 3.8
Students' reports on the amount of time spent watching television each day: 1992-2000, Grade 4

(Sidebar: Students watching the most television daily scored lowest.)

                               1992   1994   1998   2000
Six hours or more
  Percentage of Students        20     21*    16*    18
  Average Scale Score          199    194    198    ...
Four or five hours
  Percentage of Students        22*    22*    19*    ...
  Average Scale Score          216    216    216    213
Two or three hours
  Percentage of Students        40     38*    41     40
  Average Scale Score          224    222    223    ...
One hour or less
  Percentage of Students        ...    ...    ...    ...
  Average Scale Score          221    220    222    ...

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Summary

While no causal relationship can be inferred between student reports on school and home experiences and their reading performance, results presented in this chapter do suggest that active engagement with words and meanings may be an essential factor for literacy development.

Students who reported reading more pages daily in school and for homework and students who reported being asked to write about their reading were higher achievers. Activities outside of school, such as reading for fun, discussing studies at home, or talking about reading with family and friends, also showed a positive influence on fourth-grade reading achievement as measured by the NAEP assessment.

Considering that students' reports indicating more involvement with language at school and at home were positively related to scale scores, it is not surprising that access to more types of literacy materials in the home, materials that might stimulate reading activity, was also reported by higher-achieving students. The inverse of active engagement in literacy-related activities is reflected in students' reports on television viewing. Students who reported watching the most television daily were the lowest achievers.

Some of the factors that had a positive relationship with students' scores on the NAEP reading assessment have shown encouraging change across assessment years. In comparison to 1992 and 1994, higher percentages of students in 2000 reported reading more pages in school and for homework on a daily basis; also, in comparison to 1994, higher percentages of students reported being asked to write about their reading weekly or monthly.


While these school-related factors indicate students' increased involvement with language activity, some home factors related to literacy development do not reflect a positive trend. In comparison to 1994, small increases can be observed in the percentages of students who reported never or hardly ever reading for fun on their own time or talking about their reading with family and friends. Although two-thirds of students reported having three or four types of reading materials in their homes, more students than in 1994 reported having two or fewer types. No clear pattern over time was observed in students' reports on how much television they watch daily; however, it is encouraging that in comparison to 1992 and 1994 a higher percentage of fourth-graders reported watching one hour or less of television each day.

Overall, it may be said from the results in this chapter that high percentages of fourth-grade students are engaged in literacy-related activities on a fairly regular basis. For example, three-quarters of fourth-graders reported reading for fun weekly and talking about reading with family and friends at least monthly, and over three-quarters of the students reported discussing their studies at home. Students' reports on school activities show 60 percent of fourth-graders reading eleven or more pages daily in school or for homework and 72 percent spending one-half to one hour daily on homework.


Becoming a More Inclusive National Assessment

How would the NAEP results differ if accommodations were permitted for special-needs students?

Legislation at the federal level now mandates the inclusion of all students in large-scale academic assessments.1 As a consequence, the majority of states have assessment programs that must make provisions to include special-needs students, provisions that include the allowance of testing accommodations when appropriate. Assessing as representative a sample of the nation's students as possible is particularly important for NAEP's mission to serve as a key indicator of the academic achievement of the nation's students. This mission can be satisfactorily accomplished only if the assessment results include data gathered from all groups of students, including those classified as having special needs.

Although the intent of NAEP has consistently been to include special-needs students in its assessments to the fullest degree possible, the implementation of the assessment has always resulted in some exclusion of students with disabilities and students with limited English proficiency who could not be assessed meaningfully without accommodations.

Participating schools have been permitted to exclude certain students who have been classified as having a disability under the Individuals with Disabilities Education Act, based upon

1 Goals 2000, Elementary and Secondary Education Act (ESEA), Improving America's Schools Act (IASA), Individuals with Disabilities Education Act (IDEA). See also: Title VI of the Civil Rights Act, Equal Educational Opportunities Act, Section 504 of the Rehabilitation Act.



their Individualized Education Programs (IEP) and Section 504 of the Rehabilitation Act of 1973. Similarly, schools have been permitted to exclude some students they identify as being limited English proficient. Exclusion decisions are made in accordance with explicit criteria provided by the NAEP program.

In order to move the NAEP assessments toward more inclusive samples, the NAEP program began to explore the use of accommodations with special-needs students during the 1996 and 1998 assessments. An additional impetus for this change was to keep NAEP consistent with state and district testing policies that increasingly offered accommodations so that more special-needs students could be assessed. In both 1996 and 1998, the national NAEP sample was split so that part of the schools sampled were permitted to provide accommodations to special-needs students and the other part was not. This sample design made it possible to study the effects on NAEP results of including special-needs students in the assessments under alternate testing conditions. Technical research papers have been published with the results of these comparisons.2 Based on the outcomes of these technical analyses, the 1998 results of those NAEP assessments that used new test frameworks (writing and civics), and hence also began new trend lines, were reported with the inclusion of data from accommodated special-needs students.

The results presented in the 1998 reading report card included the performance of students with disabilities (SD) and those with limited English proficiency (LEP) who were assessed without the possibility of accommodations. The results did not include the performance of students for whom accommodations were permitted because of the need to preserve comparability with the results from 1992 and 1994. Students in those earlier assessments had not had accommodations available to them. However, in both the 1998 and 2000 reading assessments, the NAEP program used the split-sample design, so that trends in students' reading achievement could be reported across all the assessment years and, at the same time, the program could continue to examine the effects of including students tested with accommodations.

Two Sets of 2000 NAEP Reading Results

This report card is the first to display two different sets of NAEP results based on the split-sample design: (1) those that reflect the performance of regular and special-needs students when accommodations were not permitted, and (2) those that reflect the performance of regular and special-needs students, both those who were accommodated and those who could test without accommodations, when accommodations were permitted. It should be noted that accommodated students make up a small proportion (about 3 percent) of the total weighted number of students assessed (see table A.6 in appendix A for details). Making accommodations

2 Olson, J.F., & Goldstein, A.A. (1997). The inclusion of students with disabilities and limited English proficient students in large-scale assessments: A summary of recent progress (NCES Publication No. 97-482). Washington, DC: National Center for Education Statistics.

Mazzeo, J., Carlson, J.E., Voelkl, K.E., & Lutkus, A.D. (1999). Increasing the participation of special needs students in NAEP: A report on 1996 research activities (NCES Publication No. 2000-473). Washington, DC: National Center for Education Statistics.


available may change the overall assessment results in subtle and different ways. For example, when accommodations are permitted, there may be some occurrences of students being accommodated who might have taken the test under standard conditions if accommodations were not permitted. This could lead to an overall increase in the average assessment results, if it can be assumed that accommodations increase special-needs students' performance. Conversely, when accommodations are permitted, special-needs students who could not have been tested without accommodations could be included in the sample. Assuming that these are generally lower-performing students, their inclusion in the sample, even with accommodations, may result in an overall lower average score. The findings in the NAEP research reports cited previously suggest that these two opposite influences on the total assessment results may balance out to result in little change in the overall assessment averages.

The first three chapters of this report are based on the first set of results (no accommodations offered). This final chapter presents an overview of the second set of results: results that include students who were provided accommodations during the test administration. By including these results, the NAEP program begins a phased transition toward a more inclusive reporting sample. Future assessment results will be based solely on a student and school sample in which accommodations are permitted.

The two sets of results presented in this chapter were obtained by administering the assessment to a nationally representative sample of students and schools. In half of the schools sampled, no accommodations were permitted; all students were tested under the same conditions that were the basis for reporting results from the 1992, 1994, and 1998 NAEP reading assessments. In the other half of the schools sampled, accommodations were permitted for students with disabilities and limited English proficient students who normally receive accommodations in their district or state testing programs. Most accommodations that schools routinely provide for their own testing programs were permitted. The permitted accommodations included, but were not limited to:

- one-on-one testing,
- small-group testing,
- extended time,
- oral reading of directions,
- signing of directions,
- use of magnifying equipment, and
- use of an aide for transcribing responses.

The program did not allow some of the accommodations that are permitted in certain states. In particular, some states allow questions and, in some instances, reading passages to be read aloud to the students. Such "read-aloud accommodations" were viewed by the program as changing the nature of the construct being measured and, hence, were not permitted. Because the NAEP program considers the domain of its reading assessment to be "reading in English," no attempt was made to provide an alternate-language version of the instrument, and the use of bilingual dictionaries was not allowed. (See appendix A, table A.7 for greater detail on numbers and percentages of students accommodated by accommodation type in the 1998 and 2000 assessments.) The "read-aloud" accommodation is permitted, however, in NAEP subjects other than reading.


Figure 4.1 provides a visual representation of how the two sets of results were based on the two half-samples in 1998 and 2000. Included in both sets of results (accommodations not permitted and accommodations permitted) are those students from both half-samples of schools who were not identified as having a disability (SD) or limited English proficiency (LEP). In addition, the first set of results (accommodations not permitted) includes SD and LEP students from the half-sample of schools where accommodations were not permitted (see middle portion of figure 4.1). This is the set of results that allows for trend comparisons back to 1992 and is reported in the first three chapters of this report.

The second set of results, accommodations permitted (see bottom portion of figure 4.1), includes SD and LEP students from the half-sample of schools where accommodations were permitted. This is the set of results that forms the new, more inclusive baseline for future reporting of trend comparisons for the NAEP reading assessment.

In the NAEP 2000 samples where accommodations were not permitted, 15 percent of the students were identified by their schools as having special needs (i.e., either as students with disabilities or limited English proficient students). In the other half-sample, where accommodations were offered, 17 percent of the students were identified as having special needs. In the sample where accommodations were not permitted, 48 percent of the special-needs students (7 percent of all students; see appendix A, table A.5) were excluded from NAEP testing by their schools. In the sample where accommodations were offered, only 35 percent of the special-needs students were excluded from testing (6 percent of the total sample). Thus, as seen with other NAEP subjects, offering accommodations leads to greater inclusion of special-needs students.
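The exclusion percentages above are mutually consistent arithmetic: the share of all students excluded is the identification rate multiplied by the exclusion rate among identified students. A minimal sketch of that check (the function and variable names are invented for illustration; the figures come from the report's text):

```python
# Hedged sketch: check how the reported exclusion percentages relate.
# All names are illustrative; only the percentages come from the report.

def excluded_share_of_all(special_needs_rate: float, excluded_rate: float) -> float:
    """Fraction of ALL students excluded = fraction identified as
    special-needs times the fraction of those students who were excluded."""
    return special_needs_rate * excluded_rate

# Half-sample where accommodations were NOT permitted:
# 15% identified as special-needs, 48% of them excluded -> ~7% of all students.
no_accom = excluded_share_of_all(0.15, 0.48)

# Half-sample where accommodations WERE offered:
# 17% identified, 35% of them excluded -> ~6% of all students.
accom = excluded_share_of_all(0.17, 0.35)

print(round(no_accom * 100), round(accom * 100))  # 7 6
```

Rounded to whole percentages, this reproduces the 7 percent and 6 percent figures reported above.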


Figure 4.1
The two sets of NAEP results based on a split-sample design

Split-sample design. The national sample was split. In half of the schools, accommodations were not permitted for students with disabilities (SD) and students with limited English proficiency (LEP). In the other half of the schools, accommodations were permitted for SD and LEP students who routinely received them in their school assessments.

Accommodations-not-permitted results. The accommodations-not-permitted results include the performance of students from both half-samples who were not classified as SD or LEP and the performance of SD and LEP students from the half-sample in which no accommodations were permitted.

Accommodations-permitted results. The accommodations-permitted results also include the performance of students from both half-samples who were not classified as SD or LEP; however, the SD and LEP students whose performance is included in this set of results were from the half-sample in which accommodations were permitted. Since students who required testing accommodations could be assessed and represented in the overall results, it was anticipated that these results would include more special-needs students and reflect a more inclusive sample.
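The reporting rule in figure 4.1 can be sketched in code. This is a minimal illustration of the sample-assembly logic only; the record fields and function names (score, sd_or_lep, accom_half) are invented for this sketch and are not NAEP's actual data layout:

```python
# Minimal sketch of the split-sample reporting rule from figure 4.1.
# Field names are invented for illustration, not NAEP's data layout.

from dataclasses import dataclass

@dataclass
class Student:
    score: float
    sd_or_lep: bool   # classified as SD (disability) or LEP
    accom_half: bool  # sampled in the half where accommodations were permitted

def accommodations_not_permitted(sample):
    """Non-SD/LEP students from BOTH halves, plus SD/LEP students
    from the half where no accommodations were permitted."""
    return [s for s in sample if not s.sd_or_lep or not s.accom_half]

def accommodations_permitted(sample):
    """Non-SD/LEP students from BOTH halves, plus SD/LEP students
    from the half where accommodations were permitted."""
    return [s for s in sample if not s.sd_or_lep or s.accom_half]

sample = [
    Student(220, False, False), Student(225, False, True),  # regular: in both sets
    Student(200, True, False),   # SD/LEP, no-accommodation half: set 1 only
    Student(195, True, True),    # SD/LEP, accommodation half: set 2 only
]

set1 = accommodations_not_permitted(sample)
set2 = accommodations_permitted(sample)
print(len(set1), len(set2))  # 3 3
```

Both result sets share the regular students; they differ only in which half-sample contributes the SD/LEP students, which is why the two averages can be compared in the tables that follow.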


The focus of this chapter is a comparison of data from the two sets of results: (1) when accommodations were not permitted, and (2) when accommodations were permitted. Because the split-sample design was used in both 1998 and 2000 for the NAEP reading assessment, both sets of results are presented for both years. Overall results are provided for the nation, as well as for student subgroups by gender and by race/ethnicity. These results are discussed in terms of statistically significant differences between the two sets of results in each year, changes between assessment years, and differences between subgroups of students within each set of results. Throughout this chapter, the assessment results that include SD and LEP students for whom accommodations were not permitted will be referred to as the "accommodations-not-permitted" average score. The set of results that includes SD and LEP students for whom accommodations were permitted will be referred to as the "accommodations-permitted" average score.

Results for the Nation: Accommodations Not Permitted and Accommodations Permitted

Table 4.1 displays the average reading scale scores for the nation in 1998 and 2000 for the two sets of results: (1) accommodations not permitted, and (2) accommodations permitted. In the 1998 reading assessment at grade 4, there was no statistically significant difference between the two average scores. However, in 2000 the accommodations-permitted average score was two points lower than the accommodations-not-permitted average score. An important comparative context for this finding is that NAEP's previous research in the science and mathematics subjects in the 1996 assessments found no significant differences in the average scores of the two sets of results at grades 4, 8, or 12.3

Table 4.1
Average score by type of results: 1998 and 2000

                                 1998   2000
Accommodations not permitted      211    217
Accommodations permitted          216    215*

* Significantly different from the sample where accommodations were not permitted.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.

3 Mazzeo, J., Carlson, J.E., Voelkl, K.E., & Lutkus, A.D. (1999). Increasing the participation of special needs students in NAEP: A report on 1996 research activities (NCES Publication No. 2000-473). Washington, DC: National Center for Education Statistics.


As noted in the introduction to this chapter, NAEP has always sought to include special-needs students proportional to their representation in the U.S. population. Offering accommodations tends to reduce exclusion rates for special-needs students and therefore allows NAEP to offer a fairer and more accurate picture of the status of American education. Because special-needs students are typically "classified" as eligible for special educational services after having shown some difficulty in the regular learning environment, it may be assumed that the academic achievement of special-needs students will be lower than that of students without such needs. This assumption appears to have been justified in the observed difference between the two sets of fourth-grade reading results in 2000, where the accommodations-permitted results, which included slightly more special-needs students because of the availability of accommodations, were lower than the accommodations-not-permitted results. It is important to examine the percentages of students attaining the NAEP achievement levels to see if the percentages in the lower performance categories (i.e., below Basic and at Basic) were higher in the set of results where accommodations were offered.

Table 4.2 shows the percentages of students attaining each of the achievement levels. The percentages are similar across the two sets of 2000 results; the differences between the accommodations-not-permitted and the accommodations-permitted results were not significant. Similarly, the achievement level distributions for the two sets of results in the 1998 reading assessment showed no significant differences.

Table 4.2
Percentages of students attaining reading achievement levels by type of results (accommodations not permitted and accommodations permitted): 1998 and 2000

                    Below   At     At          At        At or above  At or above
                    Basic   Basic  Proficient  Advanced  Basic        Proficient
1998
  Not permitted      38      32     24          7         62           31
  Permitted          39      31     23          3         61           31
2000
  Not permitted      37      31     24          8         63           32
  Permitted          39      30     23          7         61           31

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Results by Gender: Accommodations Not Permitted and Accommodations Permitted

In past NAEP assessments of reading, female students outperformed male students.4 The average reading scale scores by gender for both sets of results in 1998 and 2000 are provided in table C.30 in appendix C. The small differences within gender groups between the accommodations-not-permitted and accommodations-permitted average scores were not significant for either 1998 or 2000.

Female students outperformed male students by 10 points in 2000, regardless of whether accommodations were permitted. In addition, the gap in average scores between males and females was larger in 2000 than in 1998 in both sets of results.

For male students, the accommodations-permitted average score declined from 214 to 210 between 1998 and 2000. However, the small decline between 1998 and 2000 in the accommodations-not-permitted average score was not statistically significant. For female students, no significant score difference between 1998 and 2000 is evident in either set of results.

The percentages of male and female students attaining the Basic, Proficient, and Advanced levels are provided in table C.31 in appendix C. Comparing the two sets of results in both 1998 and 2000, no significant differences were found in the percentages of students attaining each of the achievement levels. Neither set of results shows significant change between 1998 and 2000 for either males or females.

Results by Race/Ethnicity: Accommodations Not Permitted and Accommodations Permitted

NAEP assessments across academic subjects have typically reported large score differences according to race and ethnic group membership. To the extent that students with disabilities or limited English proficient students may be overrepresented in a particular racial or ethnic group, that group's assessment scores may decrease. Table C.32 in appendix C provides the average reading scale scores for each of the race/ethnicity categories for the two sets of results in 1998 and 2000. Of the race/ethnicity categories, "Hispanic" was the only one in which a significant difference was observed between the accommodations-not-permitted and accommodations-permitted average scores. In the 2000 assessment, the results for Hispanic students that included those who received accommodations had an average scale score that was seven points lower than the results for Hispanic students that did not include those who received accommodations. In 1998, however, the difference between these two sets of results for Hispanic students was smaller and was not statistically significant.

4 Mullis, I.V.S., Campbell, J.R., & Farstrup, A.E. (1993). NAEP 1992 reading report card for the nation and the states. Washington, DC: National Center for Education Statistics.

Campbell, J.R., Donahue, P.D., Reese, C.M., & Phillips, G.W. (1996). NAEP 1994 reading report card for the nation and the states. Washington, DC: National Center for Education Statistics.

Donahue, P.L., Voelkl, K.R., Campbell, J.R., & Mazzeo, J. (1999). The NAEP 1998 reading report card for the nation and the states. Washington, DC: National Center for Education Statistics.


As noted in chapter 2, a pattern of performance differences by race/ethnicity can be seen in the accommodations-not-permitted results in 2000. Both white and Asian/Pacific Islander students scored higher than black, Hispanic, or American Indian students. The same pattern can be observed in the accommodations-permitted results. The only difference in the pattern of performance by race/ethnicity between the two sets of results was that in the accommodations-permitted results, American Indian students scored higher than Hispanic students. This was not the case in the accommodations-not-permitted results.

The percentages of students in each race/ethnicity category who attained the Basic, Proficient, and Advanced levels are provided in table C.33 in appendix C. No significant differences were found between the accommodations-not-permitted results and the accommodations-permitted results for the percentages of students attaining each of the achievement levels in 1998 or 2000. While, as noted above, there was a scale score difference for Hispanic students between the two sets of results in the 2000 assessment, a similar statistically significant difference was not observed for Hispanic students in terms of the achievement levels.


Summary

This chapter compared the reading performance of the nation's fourth-graders when accommodations were not permitted and when accommodations were permitted for special-needs students in the 1998 and 2000 NAEP assessments. Both samples being compared in this chapter included non-special-needs students and special-needs students who were tested without accommodations. In the sample with accommodations permitted, those students who customarily receive accommodations in their school's testing received them in the NAEP testing. While NAEP's previous research in the 1996 science and mathematics assessments found no significant differences between the two types of results at grades 4, 8, or 12, in the 2000 reading assessment the accommodations-permitted average score at grade 4 was lower than the accommodations-not-permitted average score.5

Within the accommodations-permitted results, the average score for male students declined between 1998 and 2000. For Hispanic students, the accommodations-permitted average score was lower than the accommodations-not-permitted average score in 2000. The comparable difference between the two sets of results for Hispanic students in 1998 was not statistically significant. No significant difference between the two sets of results was evident for any other racial/ethnic subgroup of students in 1998 or 2000.

5 Mazzeo, J., Carlson, J.E., Voelkl, K.E., & Lutkus, A.D. (1999). Increasing the participation of special needs students in NAEP: A report on 1996 research activities (NCES Publication No. 2000-473). Washington, DC: National Center for Education Statistics.


Appendix A

Overview of Procedures Used for the NAEP 2000 Grade 4 Reading Assessment

This appendix provides an overview of the NAEP 2000 grade 4 reading assessment's primary components: framework, development, administration, scoring, and analysis. A more extensive review of the procedures and methods used in the reading assessment will be included in the forthcoming NAEP 2000 Technical Report.

The NAEP 2000 Grade 4 Reading Assessment

The reading framework underlying the NAEP 2000 assessment originated from a consensus among an array of individuals interested in education about the nature of reading comprehension. This framework was also used in the 1992, 1994, and 1998 reading assessments, permitting analyses of trends in reading performance.

The framework's purpose was to provide a definition of reading on which to base the NAEP assessment. Developing this framework and the specifications that guided development of the assessment involved the critical input of many people, including representatives of national education organizations, teachers, parents, policymakers, business leaders, and members of the general public. This consensus process was managed by the Council of Chief State School Officers for the National Assessment Governing Board.


APPENDIX A READING REPORT CARD 73

Page 88: ERIC - Education Resources Information Center ...285; for a PowerPoint slide presentation related to this report, see CS 014 286. AVAILABLE FROM ED Pubs, P.O. Box 1398, Jessup, MD

The framework sets forth a broad definition of "reading literacy" that entails not only being able to read, but also knowing when to read, how to read, and how to reflect on what has been read. In addition, the framework views reading as an interactive process in which the reader's abilities, interests, and prior knowledge interact with the text and the context of the reading situation as meaning construction occurs.

The aspects of grade 4 reading literacy described by the reading framework, including purposes for reading and reading stances, are presented in figure A.1. This figure also provides examples of the types of questions that were used to assess the different purposes for reading via the four reading stances.

Figure A.1. 1992, 1994, 1998, and 2000 NAEP framework: aspects of grade 4 reading literacy

Constructing, extending, and examining meaning

Initial understanding: Requires the reader to provide an initial impression or unreflected understanding of what was read.
  Sample questions: What is the story/plot about? How would you describe the main character? What does this article tell you about ___?

Developing an interpretation: Requires the reader to go beyond the initial impression to develop a more complete understanding of what was read.
  Sample questions: How did the plot develop? How did this character change from the beginning to the end of the story? What caused this event? What does the author think about this topic? In what ways are these ideas important to the topic or theme?

Personal reflection and response: Requires the reader to connect knowledge from the text with his/her own personal background knowledge. The focus here is on how the text relates to personal knowledge.
  Sample questions: How did this character change your idea of ___? Is this story similar to or different from your own experience? What current event does this remind you of? Does this description fit what you know about ___? Why?

Demonstrating a critical stance: Requires the reader to stand apart from the text and consider it.
  Sample questions: Rewrite this story with ___ as a setting or ___ as a character. How does this author's use of ___ (irony, personification, humor) contribute to ___? How useful would this article be for ___? Explain. What could be added to improve the author's argument?

SOURCE: National Assessment Governing Board. Reading framework for the National Assessment of Educational Progress: 1992-2000.



The assessment framework specified not only the particular aspects of reading literacy to be measured, but also the percentage of the assessment questions that should be devoted to each. The target percentage distributions of reading purposes and reading stances as specified in the framework, along with the actual percentage distributions in the assessment, are presented in tables A.1 and A.2. The actual content of the assessment has varied from the targeted distribution, with personal response and critical stance questions falling below the target proportions in the framework. The reading instrument development panel overseeing the development of the assessment recognized this variance, but felt strongly that assessment questions must be sensitive to the unique elements of the authentic reading materials being used. Thus, the distribution of question classifications will vary across reading passages and reading purposes.

Table A.1. Target and actual percentage distribution of questions by reading purpose, 2000 NAEP grade 4 reading assessment

Reading purpose         Grade 4 target    Actual
Literary experience     55%               50%
Gain information        45%               50%

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Table A.2. Target and actual percentage distribution of questions by reading stance, 2000 NAEP grade 4 reading assessment

                   Initial understanding/
                   developing an interpretation    Personal response    Critical stance
Grade 4 target     33%                             33%                  33%
Actual             59%                             16%                  25%

NOTE: Actual percentages are based on the classifications agreed upon by NAEP's Instrument Development Panel. It is recognized that making discrete classifications for these categories is difficult and that independent efforts to classify NAEP questions have led to different results.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

The Assessment Design

Students participating in the assessment received a booklet containing a set of general background questions, reading materials and comprehension questions, reading-specific background questions, and questions about their motivation and familiarity with the assessment tasks. Reading materials that served as stimuli and their corresponding questions were assembled into sets or "blocks." Students were given two 25-minute blocks of reading passages and questions.

The grade 4 assessment consisted of eight 25-minute blocks: four blocks of literary materials and questions, and four blocks of informative materials and questions. Each block contained one passage corresponding to one of the reading purposes and 9 to 12 questions in both multiple-choice and constructed-response formats. In each block, one of the constructed-response questions required an extended response. As a whole, the fourth-grade assessment consisted of 35 multiple-choice questions, 38 short constructed-response questions, and 8 extended constructed-response questions.

The assessment design allowed for maximum coverage of reading abilities at grade 4, while minimizing the time burden for any one student. This was accomplished through the use of matrix sampling of items, in which representative samples of students took various portions of the entire pool of assessment questions. Individual students were required to take only a small portion of the assessment, but the aggregate results across the entire assessment allowed for broad reporting of reading abilities for the targeted population.

In addition to matrix sampling, the assessment design utilized a procedure for distributing booklets that controlled for position and context effects. Students received different blocks of passages and comprehension questions in their booklets according to a procedure called "partially balanced incomplete block (PBIB) spiraling." This procedure assigned blocks of questions so that every block appears in the first and second position within a booklet an equal number of times. Every block of questions was paired with every other block with the same reading purpose, and every block was paired with some block having the other reading purpose. The spiraling aspect of this procedure cycles the booklets for administration, so that typically only a few students in any assessment session receive the same booklet.
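The pairing and position-balancing rules described above can be sketched in a few lines of Python. This is a simplified illustration, not NAEP's actual spiraling algorithm, and the block labels are hypothetical:

```python
from collections import Counter
from itertools import permutations

# Hypothetical block labels: four literary (L) and four informative (I) blocks.
literary = ["L1", "L2", "L3", "L4"]
informative = ["I1", "I2", "I3", "I4"]

# Pair every block with every other block of the same reading purpose, in both
# orders, so each block serves equally often as the first and second block.
booklets = list(permutations(literary, 2)) + list(permutations(informative, 2))

# Pair every block with some block of the other purpose; adding both orders of
# each cross-purpose pair preserves the position balance.
for lit, inf in zip(literary, informative):
    booklets.append((lit, inf))
    booklets.append((inf, lit))

# Verify the balancing property described in the text: each block appears in
# the first and second position an equal number of times.
first = Counter(b[0] for b in booklets)
second = Counter(b[1] for b in booklets)
assert all(first[b] == second[b] == 4 for b in literary + informative)
```

Spiraling then cycles through this booklet list during administration so that adjacent students in a session receive different booklets.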

In addition to the student assessment booklets, three other instruments provided data relating to the assessment: a teacher questionnaire, a school questionnaire, and a Students with Disabilities/Limited English Proficiency (SD/LEP) questionnaire. The SD/LEP student questionnaire was completed by a school staff member knowledgeable about those students who were selected to participate in the assessment and who were identified as (1) having an Individualized Education Plan (IEP) or equivalent plan (for reasons other than being gifted or talented) or (2) being limited English proficient (LEP). An SD/LEP student questionnaire was completed for each identified student regardless of whether the student participated in the assessment. Each SD/LEP questionnaire took approximately three minutes to complete and asked about the student and the special programs in which he or she participated.

The National Sample

The results presented in this report are based on a nationally representative probability sample of fourth-grade students. The sample was selected using a complex multistage design that involved sampling students from selected schools within selected geographic areas across the country. The sample design had the following stages:

1. selection of geographic areas (a county, group of counties, or metropolitan statistical area);

2. selection of schools (public and nonpublic) within the selected areas; and

3. selection of students within selected schools.



Each selected school that participated in the assessment and each student assessed represents a portion of the population of interest. Sampling weights are needed to make valid inferences between the student samples and the respective populations from which they were drawn. Sampling weights account for disproportionate representation due to the oversampling of students who attend schools with high concentrations of black and/or Hispanic students and students who attend nonpublic schools. Among other uses, sampling weights also account for lower sampling rates for very small schools.
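The role of sampling weights can be illustrated with a small sketch. The selection probabilities and scores below are invented for illustration only; NAEP's actual weighting includes further nonresponse and post-stratification adjustments:

```python
# A student's base weight is the inverse of his or her overall selection
# probability (school selection probability x within-school student
# selection probability). Oversampled students carry smaller weights.
def base_weight(p_school: float, p_student: float) -> float:
    return 1.0 / (p_school * p_student)

# Hypothetical: a student in an oversampled school vs. a typically sampled one.
w_oversampled = base_weight(0.10, 0.50)  # selected with probability 0.05
w_typical = base_weight(0.02, 0.50)      # selected with probability 0.01

# A weighted estimate divides the weighted sum by the sum of weights, so each
# student contributes in proportion to the population share represented.
def weighted_mean(scores, weights):
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)
```

Without the weights, oversampled groups would be overrepresented in every national estimate.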

A special feature of the 1998 and 2000 national assessments of reading was the collection of data from samples of students where accommodations were not permitted and samples of students where accommodations were permitted.

The inclusion rules were applied, and accommodations were offered only when a student had an Individualized Education Plan (IEP) for reasons other than being gifted and talented or was identified as limited English proficient (LEP); all other students were asked to participate in the assessment.

Table A.3 shows the number of students included in the national samples for the NAEP reading assessments. For the 1998 and 2000 assessments, the table includes the number of students in the sample where accommodations were not permitted and the number of students in the sample where accommodations were permitted. The table shows that the same non-SD/LEP students were included in both samples. Only the SD/LEP students differed between the two samples.

Table A.3. Grade 4 national student sample sizes

                                  1992         1994         1998                      2000
                                  Accomm.      Accomm.      Accomm.      Accomm.      Accomm.      Accomm.
                                  not          not          not          permitted    not          permitted
                                  permitted    permitted    permitted    sample       permitted    sample
                                  sample       sample       sample                    sample

Non-SD/LEP students assessed      6,051        6,783        7,232        7,232        7,484        7,484
SD/LEP students assessed
  without accommodations          263          599          440          413          430          476
SD/LEP students assessed who
  required and received
  accommodations                  NA           NA           NA           167          NA           114
Total sample assessed             6,314        7,382        7,672        7,812        7,914        8,074

NA = Not applicable. No accommodations were permitted in this sample.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.



Table A.4 provides a summary of the national school and student participation rates for the reading assessment samples where accommodations were not permitted and where accommodations were permitted. Participation rates are presented for public and nonpublic schools, individually and combined. The first rate is the weighted percentage of schools participating in the assessment before substitution. This rate is based only on the number of schools that were initially selected for the assessment. The numerator of this rate is the sum of the number of students represented by each initially selected school that participated in the assessment. The denominator is the sum of the number of students represented by each of the initially selected schools that had eligible students enrolled. The initially selected schools include those that participated and those that did not.

The second school participation rate is the weighted participation rate after substitution. The numerator of this rate is the sum of the number of students represented by each of the participating schools, whether originally selected or substituted. The denominator is the same as that for the weighted participation rate for the initial sample. The denominator for this participation rate, as well as for the rate before substitution of schools, is the number of eligible students from all schools with eligible students within the nation. Because of the common denominators, the weighted participation rate after substitution is at least as great as the weighted participation rate before substitution.

Also presented in table A.4 are weighted student participation rates. The numerator of this rate is the sum across all students assessed (in either an initial session or a makeup session) of the number of students that each represents. The denominator of this rate is the sum across all eligible sampled students in participating schools of the number of students that each represents. The overall participation rates take into account the weighted percentage of school participation before or after substitution and the weighted percentage of student participation after makeup sessions.
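Under these definitions, the overall rate is the product of the school rate and the student rate. A quick sketch, checked against the combined grade 4 figures reported for the accommodations-not-permitted sample (84 percent school participation before substitution, 87 percent after, 96 percent student participation):

```python
def overall_rate(school_rate: float, student_rate: float) -> float:
    """Overall participation = weighted school participation x weighted
    student participation, both expressed as proportions."""
    return school_rate * student_rate

# Combined grade 4 sample, accommodations not permitted (table A.4):
before = overall_rate(0.84, 0.96)  # before school substitution
after = overall_rate(0.87, 0.96)   # after school substitution
assert round(before, 2) == 0.81    # matches the 81% reported
assert round(after, 2) == 0.84     # matches the 84% reported
```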

Table A.4. NAEP 2000 school and student participation rates for the nation: Grade 4 public schools, nonpublic schools, and combined

Weighted school participation
              Percentage before    Percentage after    Total number
              substitutes          substitutes         of schools
Public        84                   87                  325
Nonpublic     86                   89                  108
Combined      84                   87                  433

Samples where accommodations were not permitted
              Weighted percentage      Total number of      Overall participation rate
              student participation    students assessed    Before substitution    After substitution
Public        96                       5,945                80                     83
Nonpublic     96                       1,969                83                     86
Combined      96                       7,914                81                     84

Samples where accommodations were permitted
Public        96                       6,095                81                     83
Nonpublic     97                       1,979                83                     86
Combined      96                       8,074                81                     84

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.



Students with Disabilities (SD) and Limited English Proficient (LEP) Students

It is NAEP's intent to assess all selected students from the target population. Therefore, every effort is made to ensure that all selected students who are capable of participating in the assessment are assessed. Some students sampled for participation in NAEP can be excluded from the sample according to carefully defined criteria. These criteria were revised in 1996 to more clearly communicate a presumption of inclusion except under special circumstances. According to these criteria, students with Individualized Education Programs (IEPs) were to be included in the NAEP assessment except in the following cases:

1. The school's IEP team determined that the student could not participate, OR

2. The student's cognitive functioning was so severely impaired that she or he could not participate, OR

3. The student's IEP required that the student had to be tested with an accommodation or adaptation and that the student could not demonstrate his or her knowledge without that accommodation.

All LEP students receiving academic instruction in English for three years or more were to be included in the assessment. Those LEP students receiving instruction in English for less than three years were to be included unless school staff judged them as being incapable of participating in the assessment in English.

For the 1998 and 2000 national assessments in reading, for one type of sample, the assessment was conducted using these criteria with no provisions made for accommodations. The results for this sample are presented in chapters 1, 2, and 3. For another type of sample, the assessment was conducted using these criteria, with provisions made for accommodations for identified students. These results are presented in chapter 4. The accommodations provided by NAEP were meant to match those specified in the student's IEP or those ordinarily provided in the classroom for testing situations.

Currently, NAEP is in the process of changing the way that assessments are conducted to better reflect the purpose of the 1997 Individuals with Disabilities Education Act (IDEA).1 Permitting accommodations for identified students in the 1998 and 2000 samples laid the groundwork for future NAEP reading assessments in which the provision of accommodations will be standard program practice. Also, the NAEP 1998 and 2000 reading assessments included samples where no accommodations were permitted. These samples were comparable to samples from previous assessments so that trend results could be reported.

1 Office of Special Education Programs. (1997). Nineteenth annual report to Congress on the implementation of the Individuals with Disabilities Education Act. Washington, DC: U.S. Department of Education.



Participation rates for students with disabilities and LEP student samples are presented in table A.5 for samples where accommodations were not permitted and in table A.6 for samples where accommodations were permitted for identified students. As can be seen from the data in table A.6, three percent of the students sampled for the 1998 and 2000 reading assessments were provided with accommodations. Ninety-seven percent of the sampled students (including SD/LEP and non-SD/LEP students) took the assessment under standard conditions.

Table A.5. Students with disabilities and limited English proficient students in the NAEP reading assessment: Grade 4 national samples where accommodations were not permitted: 1992-2000

                      1992                      1994                      1998                      2000
                      Number of    Weighted %   Number of    Weighted %   Number of    Weighted %   Number of    Weighted %
                      students     of students  students     of students  students     of students  students     of students
                                   sampled                   sampled                   sampled                   sampled

SD and LEP students
  Identified          2,013        10           1,624        13           985          16           823          15
  Excluded            1,750        6            1,025        5            545          9            393          7
  Assessed            263          4            599          8            440          7            430          8

SD students only
  Identified          1,149        7            1,039        10           490          11           524          11
  Excluded            990          4            685          4            247          6            295          6
  Assessed            159          3            354          6            243          5            229          5

LEP students only
  Identified          945          3            623          4            527          6            356          5
  Excluded            835          2            368          1            323          3            141          2
  Assessed            110          1            255          2            204          2            215          3

SD = Students with Disabilities (the term previously used was IEP)
LEP = Limited English Proficient students
NOTE: The combined SD/LEP portion of the table is not a sum of the separate SD and LEP portions because some students were identified as both SD and LEP. Such students would be counted separately in the bottom portions but counted only once in the top portion. Within each portion of the table, percentages may not sum properly due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Table A.6. Students with disabilities and limited English proficient students in the NAEP reading assessment: Grade 4 national samples where accommodations were permitted for identified students: 1998 and 2000

                              1998                      2000
                              Number of    Weighted %   Number of    Weighted %
                              students     of students  students     of students
                                           sampled                   sampled

SD and LEP students
  Identified                  973          15           906          17
  Excluded                    393          6            316          6
  Assessed                    580          9            590          11
    Assessed without
    accommodations            413          6            476          9
    Assessed with
    accommodations            167          3            114          3

SD students only
  Identified                  558          11           510          12
  Excluded                    246          5            193          4
  Assessed                    312          6            317          7
    Assessed without
    accommodations            179          4            209          5
    Assessed with
    accommodations            133          3            108          2

LEP students only
  Identified                  446          5            446          6
  Excluded                    167          2            159          2
  Assessed                    279          3            287          4
    Assessed without
    accommodations            238          3            273          4
    Assessed with
    accommodations            41           1            14           #

# Percentage is between 0.0 and 0.5.
SD = Students with Disabilities (the term previously used was IEP)
LEP = Limited English Proficient students
NOTE: The combined SD/LEP portion of the table is not a sum of the separate SD and LEP portions because some students were identified as both SD and LEP. Such students would be counted separately in the bottom portions but counted only once in the top portion. Within each portion of the table, percentages may not sum properly due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.

Table A.7 displays the number and the percentages of SD and LEP students assessed with the variety of available accommodations. It should be noted that students assessed with accommodations typically received some combination of accommodations. For example, students assessed in small groups (as compared to standard NAEP sessions of about 30 students) usually received extended time. In one-on-one administrations, students often received assistance in recording answers and were afforded extra time. Extended time was considered the primary accommodation only when it was the sole accommodation provided. The assessment did not, however, allow some accommodations that were permitted in certain states in past assessments. Some states have allowed questions and, in some cases, reading passages to be read aloud to the students. In designing the 2000 reading assessment, reading aloud as an accommodation was viewed as changing the nature of the construct being measured and, hence, was not permitted. Because NAEP considers the domain of its reading assessment to be reading in English, no attempt was made to provide an alternate language version of the assessment, and the use of bilingual dictionaries was not permitted.

Table A.7. SD and LEP students assessed with accommodations in the NAEP reading assessment: Grade 4 national samples where accommodations were permitted for identified students, public and nonpublic schools combined, by type of accommodation: 1998 and 2000

                        1998                      2000
                        Number of    Weighted %   Number of    Weighted %
                        students     of students  students     of students
                                     sampled                   sampled

SD and LEP students
  Large print           0            0            1            0.06
  Extended time         63           1.07         41           0.86
  Small group           90           1.94         61           1.48
  One-on-one            9            0.23         9            0.27
  Scribe or computer    1            0.05         1            0.03
  Other                 4            0.09         1            0.01

SD students only
  Large print           0            0            1            0.06
  Extended time         43           0.78         41           0.86
  Small group           76           1.70         55           1.36
  One-on-one            9            0.23         9            0.27
  Scribe or computer    1            0.05         1            0.03
  Other                 4            0.09         1            0.01

LEP students only
  Large print           0            0            0            0
  Extended time         22           0.31                      0.01
  Small group           19           0.32         11           0.20
  One-on-one            0            0            1            0.01
  Scribe or computer    0            0            0            0
  Other                 0            0            1            0.01

SD = Students with Disabilities (the term previously used was IEP)
LEP = Limited English Proficient students
NOTE: The combined SD/LEP portion of the table is not a sum of the separate SD and LEP portions because some students were identified as both SD and LEP. Such students would be counted separately in the bottom portions but counted only once in each line of the top portion. Within each portion of the table, percentages may not sum properly due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.



Data Collection and Scoring

The 2000 reading assessment was conducted from January through March 2000, with some makeup sessions in early April. As with all NAEP assessments, data collection for the 2000 assessment was conducted by a trained field staff. For the national assessment, this was accomplished by Westat, Inc. staff.

Materials from the 2000 assessment were shipped to National Computer Systems, where trained staff evaluated the responses to the constructed-response questions using scoring rubrics or guides prepared by the Educational Testing Service. Each constructed-response question had a unique scoring rubric that defined the criteria used to evaluate students' responses. The extended constructed-response questions were evaluated with four-level rubrics, and many of the short constructed-response questions were rated according to three-level rubrics that permitted partial credit. Other short constructed-response questions were scored as either acceptable or unacceptable.

For the 2000 reading assessment, approximately 123,075 constructed responses were scored. This number includes rescoring to monitor inter-rater reliability and trend reliability. In other words, scoring reliability was calculated within year (2000) and across years (1994, 1998, and 2000). The within-year average percentage of agreement for the 2000 national grade 4 reliability sample was 88 percent. The percentage of agreement across the assessment years for the national inter-year (1994 to 2000) reliability sample was 89 percent.
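The reliability figures above are exact-agreement rates between independent scorings, which can be sketched as follows (the rubric scores are invented for illustration, not NAEP data):

```python
def percent_agreement(first_scores, second_scores):
    """Share of responses on which two independent scorings assigned
    the same rubric level, as a percentage."""
    assert len(first_scores) == len(second_scores)
    matches = sum(a == b for a, b in zip(first_scores, second_scores))
    return 100.0 * matches / len(first_scores)

# Hypothetical rescoring of eight constructed responses on a 3-level rubric:
original = [1, 2, 3, 2, 1, 3, 2, 2]
rescored = [1, 2, 3, 1, 1, 3, 2, 2]
assert percent_agreement(original, rescored) == 87.5  # 7 of 8 agree
```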

Data Analysis and IRT Scaling

Subsequent to the professional scoring, all information was transcribed to the NAEP database at ETS. Each processing activity was conducted with rigorous quality control. After the assessment information had been compiled in the database, the data were weighted according to the population structure. The weighting for the national sample reflected the probability of selection for each student as a result of the sampling design, adjusted for nonresponse. Through post-stratification, the weighting assured that the representation of certain subpopulations corresponded to figures from the U.S. Census and the Current Population Survey.2

Analyses were then conducted to determine the percentages of students who gave various responses to each cognitive and background question. In determining these percentages for the cognitive questions, a distinction was made between missing responses at the end of a block (i.e., missing responses subsequent to the last question the student answered) and missing responses prior to the last observed response. Missing responses before the last observed response were considered intentional omissions. Missing responses at the end of the block were considered "not reached" and treated as if the questions had not been presented to the student. In calculating response percentages for each question, only students classified as having been presented the question were included in the denominator of the statistic.

2 These procedures are described more fully in the section "Weighting and Variance Estimation." For additional information about the use of weighting procedures in NAEP, see Johnson, E.G. (1989, December). Considerations and techniques for the analysis of NAEP data. Journal of Education Statistics, 14(4), 303-334.



It is standard ETS practice to treat all nonrespondents to the last question in a block as if they had not reached the question. For multiple-choice and short constructed-response questions, this practice produces a reasonable pattern of results, in that the proportion reaching the last question is not dramatically smaller than the proportion reaching the next-to-last question. However, for blocks that ended with extended constructed-response questions, the standard ETS practice would result in extremely large drops in the proportion of students attempting the final question. Therefore, for blocks ending with an extended constructed-response question, students who answered the next-to-last question but did not respond to the extended constructed-response question were classified as having intentionally omitted the last question.
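The omitted / not-reached distinction, including the special rule for blocks ending in an extended constructed-response question, can be sketched as follows (a simplified illustration of the rules described above, not ETS's actual processing code):

```python
def classify_responses(responses, ends_with_extended=False):
    """responses: one entry per question in a block; None marks a missing
    answer. Missing answers before the last observed response count as
    intentional omissions; trailing missing answers are "not reached"."""
    answered = [i for i, r in enumerate(responses) if r is not None]
    last = answered[-1] if answered else -1
    status = ["answered" if r is not None
              else ("omitted" if i < last else "not reached")
              for i, r in enumerate(responses)]
    # Special rule: if the block ends with an extended constructed-response
    # question and the student answered the next-to-last question, a missing
    # final response counts as an intentional omission.
    if (ends_with_extended and len(responses) >= 2
            and responses[-1] is None and responses[-2] is not None):
        status[-1] = "omitted"
    return status

assert classify_responses(["A", None, "B", None, None]) == \
    ["answered", "omitted", "answered", "not reached", "not reached"]
assert classify_responses(["A", "B", None], ends_with_extended=True)[-1] == "omitted"
```

Only students whose status for a question is "answered" or "omitted" enter the denominator when response percentages are computed.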

Item response theory (IRT) was used to estimate average reading scale scores for the nation and for various subgroups of interest within the nation. IRT models the probability of answering a question in a certain way as a mathematical function of proficiency or skill. The main purpose of IRT analysis is to provide a common scale on which performance can be compared across groups such as those defined by characteristics, including gender and race/ethnicity.

The results for 1992, 1994, 1998, and 2000 are presented on the grade 4 NAEP reading scale. In 1992, a scale ranging from 0 to 500 was created to report performance for the literary and information reading purposes at grade 4.3 The scales summarize student performance across all three types of questions in the assessment (multiple-choice, short constructed-response, and extended constructed-response). Results from subsequent reading assessments (1994, 1998, and 2000) are reported on these scales.

Each reading scale was initially based on the distribution of student performance across all three grades in the 1992 national assessment (grades 4, 8, and 12). In that year, the scales had an average of 250 and a standard deviation of 50. In addition, a composite scale was created as an overall measure of students' reading performance. At grade 4, this composite scale is a weighted average of the two separate scales for the two reading purposes assessed. The weight for each reading purpose is proportional to the relative importance assigned to the reading purpose by the specifications developed through the consensus planning process and given in the framework. (See target percentages in table A.1.)
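As a sketch, the grade 4 composite can be computed as below, taking the table A.1 target percentages (55/45) as the proportional weights; the subscale values are invented for illustration, and the operational NAEP procedure involves additional steps:

```python
# Framework target weights for the two grade 4 reading purposes (table A.1).
WEIGHTS = {"literary": 0.55, "information": 0.45}

def composite_score(subscale_scores: dict) -> float:
    """Weighted average of the purpose subscales on the 0-500 reading scale."""
    return sum(WEIGHTS[purpose] * score
               for purpose, score in subscale_scores.items())

# Hypothetical subscale averages for a subgroup:
score = composite_score({"literary": 224.0, "information": 216.0})
```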

In producing the reading scales, three distinct IRT models were used. Multiple-choice questions were scaled using the three-parameter logistic (3PL) model; short constructed-response questions rated as acceptable or unacceptable were scaled using the two-parameter logistic (2PL) model; and short constructed-response questions rated according to a three-level rubric, as well as extended constructed-response questions rated on a four-level rubric, were scaled using a generalized partial-credit (GPC) model.⁴ Developed by ETS and first used in 1992, the GPC model permits the scaling of questions scored according to multipoint rating schemes. The model takes full advantage of the information available from each of the student response categories used for these more complex constructed-response questions.

3 A third purpose, reading to perform a task, was not assessed at grade 4.

4 Muraki, E. (1992). A generalized partial credit model: Application of an EM algorithm. Applied Psychological Measurement, 16(2), 159-176.

APPENDIX A • READING REPORT CARD 97
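The three scaling models named above can be sketched as response-probability functions. This is an illustrative sketch with made-up item parameters (the a, b, c, and step values d are invented, not NAEP's operational estimates), and the 1.7 scaling constant is a common IRT convention rather than something stated in this report:

```python
import math

def p_3pl(theta, a, b, c):
    """3PL model (multiple-choice): probability of a correct response given
    proficiency theta, discrimination a, difficulty b, and guessing c."""
    return c + (1 - c) / (1 + math.exp(-1.7 * a * (theta - b)))

def p_2pl(theta, a, b):
    """2PL model (dichotomous constructed response): as above, but with no
    guessing parameter."""
    return 1 / (1 + math.exp(-1.7 * a * (theta - b)))

def p_gpc(theta, a, b, d, k):
    """Generalized partial-credit model: probability of scoring in category k
    (0..m) of a polytomous item, with step parameters d = [d_1, ..., d_m]."""
    z = [0.0]  # cumulative logits; the category-0 term is fixed at 0
    for dv in d:
        z.append(z[-1] + 1.7 * a * (theta - b + dv))
    expz = [math.exp(v) for v in z]
    return expz[k] / sum(expz)
```

At theta = b, the 2PL probability is exactly .5, while the 3PL probability is c + (1 − c)/2, which is why a guessing-corrected mapping convention (discussed later in this appendix) is applied to multiple-choice items.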

One natural question about the reading scales concerns the amount of information contributed by each type of question. Unfortunately, this question has no simple answer for the NAEP reading assessment, due to the complex procedures used to form the composite reading scale. The information provided by a given question is determined by the IRT model used to scale the question. It is a function of the item parameters and varies by level of reading proficiency.⁵ Thus, the answer to the query "How much information do the different types of questions provide?" will differ for each level of reading performance. When considering the composite reading scale, the answer is even more complicated. The reading data are scaled separately by the purposes of reading (reading for literary experience, reading to gain information, and reading to perform a task).⁶ The composite scale is a weighted combination of these subscales. IRT information functions are only strictly comparable when they are derived from the same calibration. Because the composite scale is based on two separate calibrations, there is no direct way to compare the information provided by the questions on the composite scale.

Because of the PBIB-spiraling design used by NAEP, students do not receive enough questions about a specific topic to provide reliable information about individual performance. Traditional test scores for individual students, even those based on IRT, would lead to misleading estimates of population characteristics, such as subgroup means and percentages of students at or above a certain scale-score level. Consequently, NAEP constructs sets of plausible values designed to represent the distribution of performance in the population. A plausible value for an individual is not a scale score for that individual, but may be regarded as a representative value from the distribution of potential scale scores for all students in the population with similar characteristics and identical patterns of item response. Statistics describing performance on the NAEP reading scale are based on the plausible values. Under the assumptions of the scaling models, these population estimates will be consistent, in the sense that the estimates approach the model-based population values as the sample size increases, which would not be the case for population estimates obtained by aggregating optimal estimates of individual performance.⁷
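The logic of estimating a group statistic from plausible values can be sketched as follows. This is a simplified illustration (hypothetical draws, a plain mean as the statistic), not NAEP's operational machinery:

```python
def group_mean_from_pvs(pv_matrix):
    """Estimate a subgroup's average scale score from plausible values.

    pv_matrix: one row per student, each row holding that student's
    plausible-value draws.  The statistic (here, the mean) is computed
    once per draw, and the per-draw results are then averaged; no single
    draw is treated as an individual's score.
    """
    n_students = len(pv_matrix)
    n_draws = len(pv_matrix[0])
    per_draw = [
        sum(row[m] for row in pv_matrix) / n_students
        for m in range(n_draws)
    ]
    return sum(per_draw) / n_draws

# two hypothetical students, two draws each
estimate = group_mean_from_pvs([[200.0, 210.0], [220.0, 230.0]])  # 215.0
```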

5 Donoghue, J.R. (1994). An empirical examination of the IRT information of polytomously scored reading items under the generalized partial credit model. Journal of Educational Measurement, 31(4), 295-311.

6 Only two purposes, literary and information, were used in the 2000 grade 4 reading assessment. These are described in the introduction of this report.

7 For theoretical and empirical justification of the procedures employed, see Mislevy, R.J. (1991). Randomization-based inferences about latent variables from complex samples. Psychometrika, 56(2), 177-196.

For computational details, see the forthcoming NAEP 2000 technical report: National Assessment of Educational Progress. (2000). NAEP 2000 technical report [forthcoming]. Princeton, NJ: Educational Testing Service.



Item Mapping Procedures

To map items to particular points on the reading proficiency scale, a response probability convention had to be adopted that would divide those who had a higher probability of success from those who had a lower probability. Establishing a response probability convention has an impact on the mapping of the test items onto the reading scale. A lower boundary convention maps the reading items at lower points along the scale, and a higher boundary convention maps the same items at higher points on the scale. The underlying distribution of reading skills in the population does not change, but the choice of a response probability convention does have an impact on the proportion of the student population that is reported as "able to do" the items on the reading scales.

There is no obvious choice of a point along the probability scale that is clearly superior to any other point. If the convention were set with a boundary at 50 percent, those above the boundary would be more likely to get an item right than get it wrong, while those below the boundary would be more likely to get the item wrong than right. Although this convention has some intuitive appeal, it was rejected on the grounds that having a 50/50 chance of getting the item right shows an insufficient degree of mastery. If the convention were set with a boundary at 80 percent, students above the criterion would have a high probability of success with an item. However, many students below this criterion show some level of reading ability that would be ignored by such a stringent criterion. In particular, those in the range between 50 and 80 percent correct would be more likely to get the item right than wrong, yet would not be in the group described as "able to do" the item.

In a compromise between the 50 percent and the 80 percent conventions, NAEP has adopted two related response probability conventions: 74 percent for multiple-choice questions (to correct for the possibility of answering correctly by guessing) and 65 percent for constructed-response questions (where guessing is not a factor). These probability conventions were established, in part, based on an intuitive judgment that they would provide the best picture of students' reading skills.
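A response probability convention translates into a mapped scale point by inverting the item response function. The sketch below uses a hypothetical 3PL item (the parameters a, b, c are invented for illustration) and simple bisection; it is not NAEP's operational mapping code:

```python
import math

def p_3pl(theta, a=1.0, b=0.0, c=0.25):
    # hypothetical multiple-choice item under the 3PL model
    return c + (1 - c) / (1 + math.exp(-1.7 * a * (theta - b)))

def map_point(target, prob_fn, lo=-6.0, hi=6.0):
    """Scale point at which the response probability reaches `target`
    (prob_fn must be increasing in theta), found by bisection."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if prob_fn(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# a stricter convention maps the same item to a higher scale point
theta_74 = map_point(0.74, p_3pl)
theta_80 = map_point(0.80, p_3pl)
```

Because any item response function is increasing in proficiency, theta_80 exceeds theta_74, which is the sense in which a higher boundary convention maps the same items higher on the scale.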

Some additional support for the dual conventions adopted by NAEP was provided by Huynh.⁸ He examined the IRT information provided by items, according to the IRT model used in scaling NAEP questions. ("Information" is used here in a technical sense. See the forthcoming NAEP 2000 Technical Report for details.) Following Bock,⁹ Huynh decomposed the item information into that provided by a correct response [P(q)·I(q)] and that provided by an incorrect response [(1 − P(q))·I(q)]. Huynh showed that the item information provided by a correct response to a constructed-response item is maximized at the point along the reading scale at which the probability of a correct response is two thirds (for multiple-choice items, the information provided by a correct response is maximized at the point at which the probability of getting the item correct is .74). It should be noted, however, that maximizing the item information I(q), rather than the information provided by a correct response [P(q)·I(q)], would imply an item mapping criterion closer to 50 percent.

8 Huynh, H. (1994, October). Some technical aspects of standard setting. Paper presented at the Joint Conference on Standard Setting for Large-Scale Assessment, Washington, DC.

9 Bock, R.D. (1972). Estimating item parameters and latent ability when responses are scored in two or more latent categories. Psychometrika, 37, 29-51.
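Huynh's decomposition can be checked numerically for a dichotomous item. The sketch below uses a hypothetical 2PL item (so guessing plays no role): total information peaks where the response probability is one half, while the correct-response share P(q)·I(q) peaks where the probability is about two thirds.

```python
import math

def p_2pl(theta, a=1.0, b=0.0):
    # hypothetical constructed-response item under the 2PL model
    return 1.0 / (1.0 + math.exp(-1.7 * a * (theta - b)))

def info(theta, a=1.0, b=0.0):
    # Fisher information of a 2PL item: (1.7a)^2 * P * (1 - P)
    p = p_2pl(theta, a, b)
    return (1.7 * a) ** 2 * p * (1 - p)

grid = [t / 1000.0 for t in range(-4000, 4001)]
theta_info = max(grid, key=info)                              # total information peak
theta_correct = max(grid, key=lambda t: p_2pl(t) * info(t))   # correct-response share peak
# p_2pl(theta_info) is about 1/2; p_2pl(theta_correct) is about 2/3
```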

The results in this report are presented in terms of the composite reading scale. However, the reading assessment was scaled separately for the two purposes for reading at grade 4. The composite scale is a weighted combination of the two subscales for purposes for reading. To obtain the item map information presented in this report, a procedure by Donoghue was used.¹⁰ This method models the relationship between the item response function for the subscale and the subscale structure to derive the relationship between the item score and the composite scale (i.e., an item response function for the composite scale). This item response function is then used to derive the probability used in the mapping.

Weighting and Variance Estimation

A complex sample design was used to select the students who were assessed. The properties of a sample selected through a complex design could be very different from those of a simple random sample, in which every student in the target population has an equal chance of selection and in which the observations from different sampled students can be considered to be statistically independent of one another. Therefore, the properties of the sample for the complex data collection design were taken into account during the analysis of the assessment data.

One way that the properties of the sample design were addressed was by using sampling weights to account for the fact that the probabilities of selection were not identical for all students. All population and subpopulation characteristics based on the assessment data used sampling weights in their estimation. These weights included adjustments for school and student nonresponse.

Not only must appropriate estimates of population characteristics be derived, but appropriate measures of the degree of uncertainty must be obtained for those statistics. Two components of uncertainty are accounted for in the variability of statistics based on student ability: (1) the uncertainty due to sampling only a relatively small number of students, and (2) the uncertainty due to sampling only a relatively small number of cognitive questions. The first component accounts for the variability associated with the estimated percentages of students who had certain background characteristics or who answered a certain cognitive question correctly.

Because NAEP uses complex sampling procedures, conventional formulas for estimating sampling variability that assume simple random sampling are inappropriate. NAEP uses a jackknife replication procedure to estimate standard errors. The jackknife standard error provides a reasonable measure of uncertainty for any student information that can be observed without error. However, because each student typically responds to only a few questions within any purpose for reading, the scale score for any single student would be imprecise. In this case, plausible values methodology can be used to describe the performance of groups and subgroups of students, but the underlying imprecision involved in this step adds another component of variability to statistics based on NAEP scale scores.¹¹ (Appendix C provides the standard errors for the results presented in this report.)

10 Donoghue, J.R. (1997, March). Item mapping to a weighted composite scale. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
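The replication idea behind the jackknife can be illustrated with the simplest delete-one variant for an unweighted mean. NAEP's operational procedure works with replicate weights formed from paired sampling units rather than individual deletions; this is a sketch of the principle only:

```python
def jackknife_se(values):
    """Delete-one jackknife standard error of the sample mean.

    Each replicate estimate recomputes the mean with one observation
    removed; the spread of the replicates estimates the sampling
    variability of the full-sample statistic.
    """
    n = len(values)
    total = sum(values)
    replicates = [(total - v) / (n - 1) for v in values]
    rep_mean = sum(replicates) / n
    variance = (n - 1) / n * sum((r - rep_mean) ** 2 for r in replicates)
    return variance ** 0.5
```

For the mean, this reproduces the textbook standard error s/√n, a standard check that a replication scheme is calibrated correctly.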

Typically, when the standard error is based on a small number of students or when the group of students is enrolled in a small number of schools, the amount of uncertainty associated with the standard errors may be quite large. Throughout this report, estimates of standard errors subject to a large degree of uncertainty are followed by the "!" symbol. In such cases, the standard errors, and any confidence intervals or significance tests involving these standard errors, should be interpreted cautiously. Additional details concerning procedures for identifying such standard errors are discussed in the forthcoming NAEP 2000 Technical Report.

The reader is reminded that, as with findings from all surveys, NAEP results are subject to other kinds of error, including the effects of imperfect adjustment for student and school nonresponse and unknowable effects associated with the particular instrumentation and data collection methods. Nonsampling errors can be attributed to a number of sources: inability to obtain complete information about all selected schools in the sample (some students or schools refused to participate, or students participated but answered only certain questions); ambiguous definitions; differences in interpreting questions; inability or unwillingness to give correct information; mistakes in recording, coding, or scoring data; and other errors in collecting, processing, sampling, and estimating missing data. The extent of nonsampling error is difficult to estimate; and, because of their nature, the impact of such errors cannot be reflected in the data-based estimates of uncertainty provided in NAEP reports.

Drawing Inferences from the Results

Because the percentages of students in these subpopulations and their average scale scores are based on samples rather than on the entire population of fourth graders in the nation, the numbers reported are estimates. As such, they are subject to a measure of uncertainty, reflected in the standard error of the estimate. When the percentages or average scale scores of certain groups are compared, the standard error should be taken into account, and observed similarities or differences alone should not be relied on. Therefore, the comparisons discussed in this report are based on statistical tests that consider the standard errors of those statistics and the magnitude of the difference among the averages or percentages.

Using confidence intervals based on the standard errors provides a way to take into account the uncertainty associated with sample estimates, and to make inferences about the population averages and percentages in a manner that reflects that uncertainty. An estimated sample average scale score plus or minus two standard errors approximates a 95 percent confidence interval for the corresponding population quantity. This statement means that one can conclude with approximately a 95 percent level of confidence that the average performance of the entire population of interest (e.g., all fourth-grade students in public and nonpublic schools) is within plus or minus two standard errors of the sample average.

11 For further details, see Johnson, E.G., & Rust, K.F. (1992). Population inferences and variance estimation for NAEP data. Journal of Educational Statistics, 17(2), 175-190.

As an example, suppose that the average reading scale score of the students in a particular group was 256 with a standard error of 1.2. A 95 percent confidence interval for the population quantity would be as follows:

Average ± 2 standard errors

256 ± 2 × 1.2

256 ± 2.4

(253.6, 258.4)

Thus, one can conclude with a 95 percent level of confidence that the average scale score for the entire population of students in that group is between 253.6 and 258.4.
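The arithmetic of this worked example can be expressed directly; the small helper below is illustrative only (NAEP's published intervals come from the jackknife and plausible-values machinery described above):

```python
def approx_95ci(mean, se):
    """Approximate 95 percent confidence interval: mean ± 2 standard errors."""
    half_width = 2.0 * se
    return (mean - half_width, mean + half_width)

# the example from the text: average 256, standard error 1.2
lo, hi = approx_95ci(256.0, 1.2)
```

Here lo and hi come out to 253.6 and 258.4, matching the interval above.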

Similar confidence intervals can be constructed for percentages, if the percentages are not extremely large or extremely small. Extreme percentages should be interpreted with caution. Adding or subtracting the standard errors associated with extreme percentages could cause the confidence interval to exceed 100 percent or go below 0 percent, resulting in numbers that are not meaningful. (The forthcoming NAEP 2000 Technical Report will contain a more complete discussion of extreme percentages.)

Analyzing Group Differences in Averages and Percentages

The statistical tests determine whether the evidence, based on the data from the groups in the sample, is strong enough to conclude that the averages or percentages are actually different for those groups in the population. If the evidence is strong (i.e., the difference is statistically significant), the report describes the group averages or percentages as being different (e.g., one group performed higher than or lower than another group), regardless of whether the sample averages or percentages appear to be approximately the same. Occasionally, if an apparent difference is quite large but not statistically significant, this report will point out that fact.

The reader is cautioned to rely on the results of the statistical tests rather than on the apparent magnitude of the difference between sample averages or percentages when determining whether the sample differences are likely to represent actual differences among the groups in the population.

To determine whether a real difference exists between the average scale scores (or percentages of a certain attribute) for two groups in the population, one needs to obtain an estimate of the degree of uncertainty associated with the difference between the averages (or percentages) of these groups for the sample. This estimate of the degree of uncertainty, called the standard error of the difference between the groups, is obtained by taking the square of each group's standard error, summing the squared standard errors, and taking the square root of that sum.


Standard Error of the Difference = SE(A−B) = √(SE(A)² + SE(B)²)

Similar to how the standard error for an individual group average or percentage is used, the standard error of the difference can be used to help determine whether differences among groups in the population are real. The difference between the averages or percentages of the two groups plus or minus two standard errors of the difference represents an approximate 95 percent confidence interval. If the resulting interval includes zero, there is insufficient evidence to claim a real difference between the groups in the population. If the interval does not contain zero, the difference between the groups is statistically significant (different) at the 0.05 level.

As an example of comparing groups, consider the problem of determining whether the average reading scale score of group A is higher than that of group B. Suppose that the sample estimates of the average scale scores and standard errors were as follows:

Group    Average scale score    Standard error

A        218                    0.9
B        216                    1.1

The difference between the estimates of the average scale scores of groups A and B is two points (218 − 216). The standard error of this difference is

√(0.9² + 1.1²) ≈ 1.4

Thus, an approximate 95 percent confidence interval for this difference is plus or minus two standard errors of the difference:

2 ± 2 × 1.4

2 ± 2.8

(−0.8, 4.8)

The value zero is within the confidence interval; therefore, there is insufficient evidence to claim that group A outperformed group B.
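The comparison procedure above reduces to a few lines of arithmetic. The sketch below reproduces the group A versus group B example; it is illustrative only and ignores the adjustments that partially overlapping or dependent groups would require:

```python
def compare_groups(mean_a, se_a, mean_b, se_b):
    """Two-group comparison using the standard error of the difference,
    sqrt(se_a^2 + se_b^2).  The difference is significant at the .05
    level when the ±2-standard-error interval excludes zero."""
    diff = mean_a - mean_b
    se_diff = (se_a ** 2 + se_b ** 2) ** 0.5
    interval = (diff - 2.0 * se_diff, diff + 2.0 * se_diff)
    significant = not (interval[0] <= 0.0 <= interval[1])
    return diff, se_diff, interval, significant

# the example from the text: A (218, 0.9) vs. B (216, 1.1)
diff, se_diff, interval, significant = compare_groups(218.0, 0.9, 216.0, 1.1)
# diff = 2.0, se_diff ≈ 1.4, interval ≈ (-0.8, 4.8), significant = False
```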

In some cases, the differences between groups were not discussed in this report. This happened for one of two reasons: (a) if the comparison involved an extreme percentage (as defined above); or (b) if the standard error for either group was subject to a large degree of uncertainty (i.e., the coefficient of variation is greater than 20 percent, denoted by "!" in the tables).¹² In either case, the results of any statistical test involving that group need to be interpreted with caution; and so, the results of such tests are not discussed in this report.

12 As was discussed in the section "Weighting and Variance Estimation," estimates of standard errors subject to a large degree of uncertainty are designated by the symbol "!". In such cases, the standard error, and any confidence intervals or significance tests involving these standard errors, should be interpreted with caution.


Conducting Multiple Tests

The procedures in the previous section and the certainty ascribed to intervals (e.g., a 95 percent confidence interval) are based on statistical theory that assumes that only one confidence interval or test of statistical significance is being performed. However, in chapters 2 and 3 of this report, many different groups are being compared (i.e., multiple sets of confidence intervals are being analyzed). In sets of confidence intervals, statistical theory indicates that the certainty associated with the entire set of intervals is less than that attributable to each individual comparison from the set. To hold the significance level for the set of comparisons at a particular level (e.g., 0.05), adjustments (called "multiple comparison procedures"¹³) must be made to the methods described in the previous section. One such procedure, the False Discovery Rate (FDR) procedure,¹⁴ was used to control the certainty level.

Unlike other multiple comparison procedures (e.g., the Bonferroni procedure) that control the familywise error rate (i.e., the probability of making even one false rejection in the set of comparisons), the FDR procedure controls the expected proportion of falsely rejected hypotheses. Furthermore, familywise procedures are considered conservative for large families of comparisons.¹⁵ Therefore, the FDR procedure is more suitable for multiple comparisons in NAEP than other procedures. A detailed description of the FDR procedure appears in the forthcoming NAEP 2000 Technical Report.

To illustrate how the FDR procedure is used, consider the comparisons of 2000 and 1994 grade 4 average reading scale scores for all five of the race/ethnicity groups presented in table A.8. Note that the

FDR comparisons of average reading scale scores for race/ethnicity groups: 1994 and 2000

                              1994                2000
                        Average  Standard  Average  Standard  Difference    Standard       Test     Percent
                          scale     error    scale     error      in      error of    statistic  confidence
                          score              score              averages  difference
White                       224       1.3      226       1.0       2.08        1.62       1.29          20
Black                       187       1.7      193       1.7       6.31        2.36       2.68           1
Hispanic                    191       2.6      197       1.7       6.63        3.08       2.15           4
Asian/Pacific Islander      229       4.4      232       4.6       3.24        6.35        .51          62
American Indian             201       3.4      196       4.7      -5.51        5.81       -.95          35

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

13 Miller, R.G. (1966). Simultaneous statistical inference. New York: Wiley.

14 Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, Series B, 57(1), 289-300.

15 Williams, V.S.L., Jones, L.V., & Tukey, J.W. (1994, December). Controlling error in multiple comparisons with special attention to the National Assessment of Educational Progress. Research Triangle Park, NC: National Institute of Statistical Sciences.


difference in average scale scores and the standard error of the difference are calculated in a way comparable with that of the example in the previous section. The test statistic shown is the difference in average scale scores divided by the standard error of the difference.

The difference in average scale scores and its standard error can be used to find an approximate 95 percent confidence interval as in the example in the previous section, or they can be used to identify a confidence percentage. In the example in the previous section, because an approximate 95 percent confidence interval was desired, the number 2 was used to multiply the standard error of the difference to create the approximate confidence interval. In the current example, the test statistic is treated like the number 2, and the matching percent confidence for the related confidence interval is identified from statistical tables. Instead of checking to see if zero is within the 95 percent confidence interval, the percent confidence from the statistical tables can be directly compared to 100 − 95 = 5 percent.

If the comparison of average scale scores across two years were made for only one of the race/ethnicity groups, there would be a significant difference between the average scale scores for the two years if the percent confidence were less than 5 percent. However, because we are interested in the difference in average scale scores across the two years for all five of the race/ethnicity groups, comparing each of the percents of confidence to 5 percent is not adequate. In this report, comparisons of average scale score differences across years for all race/ethnicity groups are discussed together. Groups of students defined by shared characteristics, such as race/ethnicity groups, are treated as sets or families when making comparisons. However, comparisons of average scale scores for each pair of years were treated separately. So the steps described in this example were replicated for the comparison of 2000 and 1998 average scale scores and the comparison of 2000 and 1992 average scale scores.

To use the FDR procedure to take into account that all five race/ethnicity comparisons are of interest to us, the percents of confidence in the example are ordered from largest to smallest: 62, 35, 20, 4, and 1. In the FDR procedure, the 62 percent confidence for the Asian/Pacific Islander comparison would be compared to 5 percent, the 35 percent for the American Indian comparison would be compared to 5 × (5 − 1)/5 = 4 percent, the 20 percent for the White comparison would be compared to 5 × (5 − 2)/5 = 3 percent, the 4 percent for the Hispanic comparison would be compared to 5 × (5 − 3)/5 = 2 percent, and the 1 percent (actually slightly smaller than 1 prior to rounding) for the Black comparison would be compared to 5 × (5 − 4)/5 = 1 percent. The last of these comparisons is the only one for which the percent confidence is smaller than the FDR procedure value. The difference in 2000 and 1994 average scale scores for the Black students is significant; for all of the other race/ethnicity groups, average scale scores for 2000 and 1994 are not significantly different from one another.
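The stepwise rule just described is the Benjamini-Hochberg procedure, sketched below. The percent-confidence values are treated as p-values on a 0-1 scale, and the 0.0099 entry for the Black comparison reflects the text's note that its value is slightly smaller than 1 percent before rounding:

```python
def fdr_reject(p_values, alpha=0.05):
    """Benjamini-Hochberg FDR procedure: sort the p-values, compare the
    k-th smallest to alpha * k / m, and reject every hypothesis up to
    the largest k that passes its threshold."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    cutoff = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= alpha * rank / m:
            cutoff = rank
    return {order[k] for k in range(cutoff)}

groups = ["White", "Black", "Hispanic", "Asian/Pacific Islander", "American Indian"]
p = [0.20, 0.0099, 0.04, 0.62, 0.35]              # percent confidence / 100
significant = {groups[i] for i in fdr_reject(p)}  # only the Black comparison
```

Note that the Hispanic comparison, with 4 percent confidence, would have been significant under an unadjusted 5 percent rule but fails its FDR threshold of 2 percent, matching the discussion above.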


NAEP Reporting Groups

In this report, results are provided for groups of students defined by shared characteristics: region of the country, gender, race or ethnicity, school's type of location, eligibility for the Free/Reduced-Price School Lunch program, and type of school. Based on participation rate criteria, results are reported for subpopulations only when sufficient numbers of students and adequate school representation are present. The minimum requirement is at least 62 students in a particular subgroup from at least five primary sampling units (PSUs).¹⁶ However, the data for all students, regardless of whether their subgroup was reported separately, were included in computing overall results. Definitions of the subpopulations referred to in this report are presented below.

States included in the four NAEP regions

Northeast: Connecticut, Delaware, District of Columbia, Maine, Maryland, Massachusetts, New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, Vermont, *Virginia

Southeast: Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, *Virginia, West Virginia

Central: Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, Wisconsin

West: Alaska, Arizona, California, Colorado, Hawaii, Idaho, Montana, Nevada, New Mexico, Oklahoma, Oregon, Texas, Utah, Washington, Wyoming

* NOTE: The part of Virginia that is included in the Washington, DC metropolitan area is included in the Northeast region; the remainder of the state is included in the Southeast region.

16 For the national assessment, a PSU is a selected geographic region (a county, group of counties, or metropolitan statistical area). For the state assessment program, a PSU is most often a single school. Further details about the procedure for determining minimum sample size appear in the 1998 NAEP Technical Report.


Region

Results in NAEP are reported for four regions of the nation: Northeast, Southeast, Central, and West. Figure A.2 shows how states are subdivided into these NAEP regions. All 50 states and the District of Columbia are listed. Territories and the two Department of Defense Educational Activities jurisdictions are not assigned to any region.

Gender

Results are reported separately for males and females.

Race/Ethnicity

The race/ethnicity variable is derived from two questions asked of students and from school records, and it is used for race/ethnicity subgroup comparisons. Two questions from the set of general student background questions were used to determine race/ethnicity:

If you are Hispanic, what is your Hispanic background?

○ I am not Hispanic
○ Mexican, Mexican American, or Chicano
○ Puerto Rican
○ Cuban
○ Other Spanish or Hispanic background

Students who responded to this question by filling in the second, third, fourth, or fifth oval were considered Hispanic. For students who filled in the first oval, did not respond to the question, or provided information that was illegible or could not be classified, responses to the following question were examined to determine their race/ethnicity.


Which best describes you?

○ White (not Hispanic)
○ Black (not Hispanic)
○ Hispanic ("Hispanic" means someone who is Mexican, Mexican American, Chicano, Puerto Rican, Cuban, or other Spanish or Hispanic background)
○ Asian or Pacific Islander ("Asian or Pacific Islander" means someone who is from a Chinese, Japanese, Korean, Filipino, Vietnamese, Asian American, or from some other Asian or Pacific Islander background.)
○ American Indian or Alaskan Native ("American Indian or Alaskan Native" means someone who is from one of the American Indian tribes or one of the original people of Alaska.)
○ Other (specify)

Students' race/ethnicity was then assigned on the basis of their responses. For students who filled in the sixth oval ("Other"), provided illegible information or information that could not be classified, or did not respond at all, race/ethnicity was assigned as determined by school records.

Race/ethnicity could not be determined for students who did not respond to either of the demographic questions and whose schools did not provide information about race/ethnicity.

Details of how race/ethnicity classifications were derived are presented so that readers can determine how useful the results are for their particular purposes. Also, some students indicated that they were from a Hispanic background (e.g., Puerto Rican or Cuban) and that a racial/ethnic category other than Hispanic best described them. These students were classified as Hispanic based on the rules described above. Furthermore, information from the schools did not always correspond to how students described themselves.

Therefore, the racial/ethnic results presented in this report attempt to provide a clear picture based on several sources of information.

In the 1992, 1998, and 2000 NAEP reading assessments the mutually exclusive racial/ethnic categories were: white, black, Hispanic, Asian/Pacific Islander, and American Indian (including Alaskan native). In the 1994 NAEP reading assessment, the Asian/Pacific Islander category was divided into separate Asian and Pacific Islander categories. To make comparisons of performance across assessments, the separate Asian and Pacific Islander categories used in 1994 have been collapsed into a single category to report results.

Type of Location

Results from the 2000 assessment are reported for students attending schools in three mutually exclusive location types: central city, urban fringe/large town, and rural/small town:

Central City: This category includes central cities of all Standard Metropolitan Statistical Areas (SMSA) as defined by the Office of Management and Budget. Central City is a geographical term and is not synonymous with "inner city."

Urban Fringe/Large Town: The urban fringe category includes all densely settled places and areas within SMSAs that are classified as urban by the Bureau of the Census, but which do not qualify as Central City. A Large Town is defined as a place outside an SMSA with a population greater than or equal to 25,000.

Rural/Small Town: Rural includes all places and areas with populations of less than 2,500 that are classified as rural by the Bureau of the Census. A Small Town is defined as a place outside an SMSA with a population of less than 25,000, but greater than or equal to 2,500.
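Taken together, these definitions reduce to a few tests on SMSA membership and population. A minimal sketch under simplified assumptions (the actual assignment draws on the Census Bureau's classification of each place, not just these three inputs):

```python
# Illustrative sketch of the three reporting categories defined above.
# Inputs are simplifying assumptions, not the actual CCD assignment logic.

def location_type(in_smsa: bool, is_central_city: bool, population: int) -> str:
    if in_smsa:
        # Central City is a geographic designation within an SMSA;
        # other urban places inside an SMSA fall in the urban fringe.
        return "Central city" if is_central_city else "Urban fringe/large town"
    # Outside an SMSA: large towns (25,000 or more) report with urban fringe;
    # small towns (2,500-24,999) and rural places (under 2,500) are grouped.
    return "Urban fringe/large town" if population >= 25000 else "Rural/small town"
```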

In this report, results for each type of location are not compared across years. This is because NCES adopted new methods for identifying the type of location assigned to each school in the Common Core of Data (CCD). The new methods were put into place to improve the quality of the assignments, and they take into account more information about the exact physical location of the school.

Eligibility for the Free/Reduced-Price School Lunch Program

Based on available school records, students were classified as either currently eligible for the free/reduced-price lunch component of the Department of Agriculture's National School Lunch Program or not eligible. The classification applies only to the school year when the assessment was administered (i.e., the 1999-2000 school year) and is not based on eligibility in previous years. If school records were not available, the student was classified as "Information not available." If the school did not participate in the program, all students in that school were classified as "Information not available."



Type of School

Results are reported by the type of school that the student attends: public or nonpublic. Nonpublic schools include Catholic and other private schools. Although Bureau of Indian Affairs (BIA) schools and Department of Defense Domestic Dependent Elementary and Secondary Schools (DDESS) are not included in either the public or nonpublic categories, they are included in the overall national results.

Cautions in Interpretations

As described earlier, the NAEP reading scale makes it possible to examine relationships between students' performance and various background factors measured by NAEP. However, a relationship that exists between achievement and another variable does not reveal its underlying cause, which may be influenced by a number of other variables. Similarly, the assessments do not capture the influence of unmeasured variables. The results are most useful when they are considered in combination with other knowledge about the student population and the educational system, such as trends in instruction, changes in the school-age population, and societal demands and expectations.




Appendix B
Sample Text and Questions from the NAEP 2000 Reading Assessment

This appendix contains the reading passage released from the NAEP 2000 reading assessment. In addition, sample questions are presented to supplement those in chapter 1. The tables accompanying each question present two types of percentages: the overall percentage of students who answered successfully, and the percentage of students who answered successfully within the achievement level score ranges. For each question, the reading purpose and reading stance is indicated. For multiple-choice questions, the correct answer is marked. For constructed-response questions, the rating assigned to a response and a summary of the scoring criteria used to rate responses is provided. Sample student responses have been reproduced from actual test booklets to illustrate answers that demonstrated at least adequate comprehension. To review additional passages and questions from previous NAEP assessments, please visit NAEP on the internet at http://nces.ed.gov/nationsreportcard.

Appendix Contents: Sample Text, Sample Questions, Student Responses


A Brick to Cuddle Up To
By Barbara Cole

Imagine shivering on a cold winter's night. The tip of your nose tingles in the frosty air. Finally, you climb into bed and find the toasty treat you have been waiting for: your very own hot brick.

If you had lived in colonial days, that would not sound as strange as it does today. Winters were hard in this New World, and the colonists had to think of clever ways to fight the cold. At bedtime, they heated soapstones, or bricks, in the fireplaces. They wrapped the bricks in cloths and tucked them into their beds. The brick kept them warm at night, at least for as long as its heat lasted.

Before the colonists slipped into bed, they rubbed their icy sheets with a bed warmer. This was a metal pan with a long wooden handle. The pan held hot embers from the fireplace. It warmed the bedding so well that sleepy bodies had to wait until the sheets cooled before climbing in.

Staying warm wasn't just a bedtime problem. On winter rides, colonial travelers covered themselves with animal skins and warm blankets. Tucked under the blankets, near their feet, were small tin boxes called foot stoves. A foot stove held burning coals. Hot smoke puffed from small holes in the stove's lid, soothing freezing feet and legs. When the colonists went to Sunday services, their foot stoves, furs, and blankets went with them. The meeting houses had no heat of their own until the 1800s.

At home, colonist families huddled close to the fireplace, or hearth. The fireplace was wide and high enough to hold a large fire, but its chimney was large, too. That caused a problem: Gusts of cold air blew into the house. The area near the fire was warm, but in the rest of the room it might still be cold enough to see your breath.

Reading or needlework was done by candlelight, or by the light of the fire. During the winter, animal skins sealed the drafty windows of some cabins and blocked out the daylight. The living area inside was gloomy, except in the circle of light at the hearth.

Early Americans did not bathe as often as we do. When they did, their "bathroom" was the kitchen, in that toasty space by the hearth. They partially filled a tub of cold water, then warmed it up with water heated in the fireplace. A blanket draped from chairs for privacy also let the fire's warmth surround the bather.



The household cooks spent hours at the hearth. They stirred the kettle of corn pudding or checked the baking bread while the rest of the family carried on their own fireside activities. So you can see why the fireplace was the center of a colonial home.

The only time the fire was allowed to die down was at bedtime. Ashes would be piled over the fire, reducing it to embers that might glow until morning.

By sunrise, the hot brick had become a cold stone once more. An early riser might get dressed under the covers, then hurry to the hearth to warm up.

Maybe you'd enjoy hearing someone who kept warm in these ways tell you what it was like. You wouldn't need to look for someone who has been living for two hundred years. In many parts of the country the modern ways didn't take over from the old ones until recently. Your own grandparents or other older people might remember the warmth of a hearthside and the joy of having a brick to cuddle up to.

Used by permission of Highlights for Children, Inc., Columbus, OH. Copyright © 1991. Illustrations by Katherine Dodge.

Sample Question

You would probably read this article if you wanted to know how the colonists

○ cooked their food
○ traveled in the winter
○ washed their clothes
● kept warm in cold weather (correct answer)

Reading Purpose: Information

Sample Question Results (Multiple-Choice)

Overall percentage correct and percentages correct within each achievement level range: 2000

Overall percentage correct¹   Basic 208-237*   Proficient 238-267*   Advanced 268 and above*
85 (0.8)                      91 (2.0)         97 (1.1)              100 (****)

¹ Includes fourth-grade students who were below the Basic level.
* NAEP Reading composite scale range.
Standard errors of the estimated percentages appear in parentheses.
(****) Standard error estimates cannot be accurately determined.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.



Sample Question

Give two reasons stated in the article why the hearth was the center of the home in colonial times.

Reading Purpose: Information
Reading Stance: Developing Interpretation

Responses to this question were scored according to a three-level rubric as Unsatisfactory, Partial, or Complete.

Sample Question Results (Short Constructed-Response)

Overall percentage "Complete" and percentages "Complete" within each achievement level range: 2000

Overall percentage "Complete"¹   Basic 208-237*   Proficient 238-267*   Advanced 268 and above*
20 (1.2)                         16 (1.8)         37 (3.2)              58 (4.9)

¹ Includes fourth-grade students who were below the Basic level.
* NAEP Reading composite scale range.
Standard errors of the estimated percentages appear in parentheses.
(****) Standard error estimates cannot be accurately determined.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Responses scored "Complete" demonstrated understanding of a major aspect of the article by providing two reasons for the hearth being the center of the colonial home.

Sample "Complete" Response:

Give two reasons stated in the article why the hearth was the center of the home in colonial times.



Sample Question

Does the author help you understand what colonial life was like? Use examples from the article to explain why or why not.

Reading Purpose: Information
Reading Stance: Critical Stance

Responses to this question were scored according to a three-level rubric as Unsatisfactory, Partial, or Complete.

Sample Question Results (Short Constructed-Response)

Overall percentage "Complete" and percentages "Complete" within each achievement level range: 2000

Overall percentage "Complete"¹   Basic 208-237*   Proficient 238-267*   Advanced 268 and above*
20 (1.4)                         19 (3.1)         29 (3.6)              35 (6.2)

¹ Includes fourth-grade students who were below the Basic level.
* NAEP Reading composite scale range.
Standard errors of the estimated percentages appear in parentheses.
(****) Standard error estimates cannot be accurately determined.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Responses scored "Complete" provided an opinion of the author's presentation of information and an example from the text to illustrate that opinion.

Sample "Complete" Response:

Does the author help you understand what colonial life was like? Use examples from the article to explain why or why not.

[Handwritten student response reproduced in the original document]



Sample Question

Some of the ways that colonists kept warm during the winter were different from the ways that people keep warm today. Tell about two of these differences.

Reading Purpose: Information
Reading Stance: Personal Reflection and Response

Responses to this question were scored according to a three-level rubric as Unsatisfactory, Partial, or Complete.

Sample Question Results (Short Constructed-Response)

Overall percentage "Complete" and percentages "Complete" within each achievement level range: 2000

Overall percentage "Complete"¹   Basic 208-237*   Proficient 238-267*   Advanced 268 and above*
17 (1.1)                         15 (2.4)         27 (3.7)              46 (6.5)

¹ Includes fourth-grade students who were below the Basic level.
* NAEP Reading composite scale range.
Standard errors of the estimated percentages appear in parentheses.
(****) Standard error estimates cannot be accurately determined.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Responses scored "Complete" connected text descriptions to prior knowledge by comparing two ways colonists kept warm during winter to the ways people keep warm today.

Sample "Complete" Response:

Some of the ways that colonists kept warm during the winter were different from the ways that people keep warm today. Tell about two of these differences.



Appendix C
Data Appendix

This appendix contains complete data for all the tables and figures presented in this report, including average scores, achievement level results, and percentages of students. In addition, standard errors appear in parentheses next to each scale score and percentage. The comparisons presented in this report are based on statistical tests that consider the magnitude of the difference between group averages or percentages and the standard errors of those statistics. Because NAEP scores and percentages are based on samples rather than the entire population(s), the results are subject to a measure of uncertainty reflected in the standard errors of the estimates. It can be said with 95 percent certainty that for each population of interest, the value for the whole population is within plus or minus two standard errors of the estimate for the sample. As with the figures and tables in the chapters, significant differences between results of previous assessments and the 2000 assessment are highlighted.

Appendix Contents: Average Scores, Achievement Level Results, Percentages of Students, Standard Errors
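The plus-or-minus-two-standard-errors rule and the group comparisons described above can be sketched in a few lines. This is a simplification for illustration only: NAEP's actual procedures use design-adjusted standard errors and multiple-comparison adjustments, which are omitted here.

```python
import math

# Illustrative sketch of the inference rules described above; a
# simplification, not NAEP's actual testing procedure.

def ci95(estimate: float, se: float) -> tuple:
    """95 percent confidence interval: estimate plus or minus two SEs."""
    return (estimate - 2 * se, estimate + 2 * se)

def significantly_different(x1: float, se1: float, x2: float, se2: float) -> bool:
    """Two independent estimates differ significantly (at about the 95
    percent level) when the gap exceeds 1.96 SEs of the difference."""
    return abs(x1 - x2) > 1.96 * math.sqrt(se1 ** 2 + se2 ** 2)

# The 2000 national average, 217 with a standard error of 0.8, gives an
# interval of roughly 215.4 to 218.6.
low, high = ci95(217, 0.8)
```

For example, the 2000 female average of 222 (0.9) versus the male average of 212 (1.1), from the gender table below, gives a 10-point difference against a standard error of the difference of about 1.4, comfortably significant by this criterion.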


Average fourth-grade reading scale scores: 1992-2000

1992    217 (0.9)
1994    214 (1.0)
1998    217 (0.8)
2000    217 (0.8)

Standard errors of the estimated scale scores appear in parentheses.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.

Percentile Distribution

Fourth-grade reading scale score percentiles: 1992-2000

        Mean        10th        25th        50th       75th        90th
1992    217 (0.9)   170 (1.9)*  194 (1.1)   219 (1.3)  242 (1.1)*  261 (1.4)*
1994    214 (1.0)   159 (1.5)*  189 (1.0)*  219 (1.2)  243 (1.3)   263 (1.6)
1998    217 (0.8)   167 (1.5)   193 (0.9)   220 (1.3)  244 (0.9)   263 (0.9)
2000    217 (0.8)   163 (1.9)   193 (0.9)   221 (1.1)  245 (0.8)   264 (0.9)

Standard errors of the estimated scale scores appear in parentheses.

* Significantly different from 2000.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range: 1992-2000

        Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
1992    38 (1.1)     34 (0.9)*  22 (0.9)       6 (0.6)*     62 (1.1)           29 (1.2)*
1994    40 (1.0)     31 (0.7)   22 (0.8)       7 (0.7)      60 (1.0)           30 (1.1)
1998    38 (0.9)     32 (0.7)   24 (0.7)       7 (0.5)      62 (0.9)           31 (0.9)
2000    37 (0.8)     31 (0.9)   24 (0.8)       8 (0.5)      63 (0.8)           32 (0.9)

Standard errors of the estimated percentages appear in parentheses.
* Significantly different from 2000.
NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Data Table: Sample Question Results (Multiple-Choice)

Overall percentage correct and percentages correct within each achievement level range: 2000

        Overall    Basic     Proficient  Advanced
2000    66 (1.6)   72 (3.5)  79 (3.3)    84 (5.1)

Standard errors of the estimated percentages appear in parentheses.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Data Table: Sample Question Results (Short Constructed-Response)

Overall percentage "Complete" and percentages "Complete" within each achievement level range: 2000

        Overall    Basic     Proficient  Advanced
2000    37 (1.6)   38 (3.6)  57 (3.9)    76 (4.1)

Standard errors of the estimated percentages appear in parentheses.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Data Table: Sample Question Results (Extended Constructed-Response)

Overall percentage "Essential" or better and percentages "Essential" or better within eachachievement level range: 2000

        Overall    Basic     Proficient  Advanced
2000    18 (1.0)   15 (1.9)  29 (3.2)    40 (4.9)

Standard errors of the estimated percentages appear in parentheses.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.



Percentage of students and average fourth-grade reading scale scores for male and female students: 1992-2000

        Male        Female
1992    51 (0.6)    49 (0.6)
        213 (1.2)   221 (1.0)
1994    51 (0.7)    49 (0.7)
        209 (1.3)   220 (1.1)
1998    50 (0.6)    50 (0.6)
        214 (1.1)   220 (0.7)
2000    50 (0.7)    50 (0.7)
        212 (1.1)   222 (0.9)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.

Percentages of fourth-grade male and female students at or above reading achievement levels and within each achievement level range: 1992-2000

Male
        Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
1992    42 (1.6)     32 (1.3)   20 (1.1)       5 (0.7)      58 (1.6)           25 (1.4)
1994    45 (1.4)     30 (1.1)   20 (0.9)       6 (0.8)      55 (1.4)           26 (1.3)
1998    41 (1.4)     31 (1.0)   22 (0.9)       6 (0.6)      59 (1.4)           28 (1.2)
2000    42 (1.2)     31 (1.1)   21 (1.2)       6 (0.5)      58 (1.2)           27 (1.1)

Female
        Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
1992    33 (1.3)     35 (1.4)   24 (1.2)       8 (0.8)      67 (1.3)           32 (1.4)*
1994    34 (1.2)     32 (1.1)   25 (1.5)       9 (0.9)      66 (1.2)           34 (1.5)
1998    35 (1.0)     32 (1.1)   25 (0.9)       8 (0.6)      65 (1.0)           33 (1.0)
2000    33 (1.2)     31 (1.5)   26 (1.0)       10 (0.8)     67 (1.2)           36 (1.2)

Standard errors of the estimated percentages appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Percentage of students and average fourth-grade reading scale scores by race/ethnicity: 1992-2000

        White       Black       Hispanic    Asian/Pacific Islander   American Indian
1992    71 (0.2)    16 (0.1)    9 (0.1)     2 (0.3)                  2 (0.2)
        225 (1.2)   193 (1.6)   201 (2.1)   214 (3.3)*               207 (4.6)
1994    69 (0.2)    15 (0.2)    12 (0.2)    3 (0.2)                  2 (0.2)
        224 (1.3)   187 (1.7)*  191 (2.6)   229 (4.4)                201 (3.4)
1998    67 (0.5)    16 (0.2)    13 (0.5)    2 (0.2)                  2 (0.2)
        227 (0.8)   194 (1.7)   196 (1.8)   225 (2.7)                202 (3.1)
2000    66 (0.4)    14 (0.2)    15 (0.3)    3 (0.2)                  2 (0.2)
        226 (1.0)   193 (1.7)   197 (1.7)   232 (4.6)                196 (4.7)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

" Significantly different from 2000.SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by race/ethnicity: 1992-2000

White
        Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
1992    29 (1.3)     36 (1.4)   27 (1.3)       8 (0.9)      71 (1.3)           35 (1.7)
1994    29 (1.2)     34 (0.8)   27 (1.1)       9 (0.9)      71 (1.2)           37 (1.4)
1998    27 (1.1)     34 (0.8)   29 (0.9)       10 (0.7)     73 (1.1)           39 (1.1)
2000    27 (1.1)     33 (1.3)   29 (1.0)       11 (0.7)     73 (1.1)           40 (1.2)

Black
1992    67 (2.3)     25 (2.8)   8 (1.3)        1 (0.4)      33 (2.3)           8 (1.4)
1994    69 (2.5)     22 (2.2)   8 (0.9)        1 (0.4)      31 (2.5)           9 (1.0)
1998    64 (1.7)     26 (1.6)   9 (1.0)        1 (0.5)      36 (1.7)           10 (1.0)
2000    63 (1.6)     25 (1.5)   10 (1.2)       2 (0.6)      37 (1.6)           12 (1.3)

Hispanic
1992    56 (2.2)     28 (2.0)   13 (1.6)       3 (0.8)      44 (2.2)           16 (1.8)
1994    64 (2.6)     22 (2.1)   11 (1.5)       2 (0.6)      36 (2.6)           13 (1.6)
1998    60 (1.9)     26 (1.5)   11 (1.2)       2 (0.4)      40 (1.9)           13 (1.2)
2000    58 (1.8)     26 (1.6)   13 (1.4)       3 (0.6)      42 (1.8)           16 (1.3)

Asian/Pacific Islander
1992    41 (4.8)     34 (4.2)   21 (4.7)       4 (1.8)      59 (4.8)           25 (4.7)*
1994    25 (3.9)     31 (4.0)   31 (3.5)       13 (4.9)     75 (3.9)           44 (5.5)
1998    31 (4.2)     32 (4.7)   25 (3.4)       12 (2.9)     69 (4.2)           37 (3.9)
2000    22 (5.0)     32 (3.9)   29 (3.8)       17 (4.6)     78 (5.0)           46 (4.4)

American Indian
1992    47 (6.6)     35 (6.5)   15 (4.5)       3 (2.1)      53 (6.6)           18 (4.5)
1994    52 (4.4)     30 (4.2)   15 (3.8)       3 (2.1)      48 (4.4)           18 (3.8)
1998    53 (5.5)     33 (5.2)   12 (4.5)       2 (1.2)      47 (5.5)           14 (3.8)
2000    57 (4.2)     26 (6.1)   16 (4.6)       2 (****)     43 (4.2)           17 (4.8)

Standard errors of the estimated percentages appear in parentheses.

* Significantly different from 2000.
(****) Standard error estimates cannot be accurately determined.
NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Data Table: Differences Between Selected Subgroups

Differences in average fourth-grade reading scale scores by gender and race/ethnicity: 1992-2000

        Female/male  White/black  White/Hispanic
1992    8 (1.6)      32 (2.0)     23 (2.4)
1994    10 (1.7)     37 (2.1)     33 (2.9)
1998    6 (1.3)*     33 (1.9)     31 (2.0)
2000    10 (1.4)     33 (2.0)     29 (2.0)

Standard errors of the estimated differences in scale scores appear in parentheses.
* Significantly different from 2000.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.

Percentage of students and average fourth-grade reading scale scores by regions of the country: 1992-2000

        Northeast   Southeast   Central     West
1992    21 (1.1)    23 (1.0)    27 (0.5)    28 (0.8)
        222 (3.6)   213 (2.3)   219 (1.4)   214 (1.4)
1994    23 (0.9)    23 (1.1)    25 (0.7)    29 (0.8)
        215 (2.1)*  210 (2.0)   220 (2.4)   212 (2.0)
1998    22 (1.0)    26 (1.1)    24 (0.6)    28 (1.3)
        226 (1.4)   211 (1.3)   222 (2.0)   212 (1.9)
2000    22 (0.7)    22 (1.5)    25 (0.5)    31 (1.5)
        222 (1.7)   211 (1.9)   220 (1.8)   214 (1.6)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by regions of the country: 1992-2000

Northeast
        Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
1992    34 (3.6)     32 (2.2)   24 (2.6)       9 (2.4)      66 (3.6)           34 (4.3)
1994    39 (2.1)     30 (1.7)   23 (1.7)       8 (1.4)      61 (2.1)           31 (2.4)
1998    30 (1.7)     32 (1.3)   28 (1.3)       10 (1.2)     70 (1.7)           38 (1.7)
2000    33 (2.0)     30 (2.0)   26 (1.6)       11 (1.1)     67 (2.0)           37 (1.8)

Southeast
1992    42 (3.1)     34 (2.3)   19 (2.0)       5 (1.0)      58 (3.1)           24 (2.6)
1994    45 (2.3)     30 (1.7)   19 (1.7)       7 (0.9)      55 (2.3)           25 (2.1)
1998    44 (1.6)     31 (1.2)   20 (1.0)       5 (0.7)      56 (1.6)           25 (1.4)
2000    43 (1.8)     31 (1.7)   20 (1.3)       6 (0.7)      57 (1.8)           26 (1.8)

Central
1992    34 (1.7)     36 (1.7)   24 (1.6)       6 (1.1)      66 (1.7)           30 (2.1)
1994    34 (2.6)     33 (1.5)   26 (2.0)       8 (1.1)      66 (2.6)           34 (2.5)
1998    32 (2.2)     34 (1.1)   26 (1.5)       8 (0.9)      68 (2.2)           35 (1.9)
2000    34 (2.2)     31 (1.8)   26 (1.9)       8 (1.2)      66 (2.2)           35 (2.4)

West
1992    41 (1.7)     32 (1.5)   21 (1.7)       6 (0.7)      59 (1.7)           27 (1.7)
1994    41 (2.1)     30 (1.5)   22 (1.5)       7 (0.8)      59 (2.1)           29 (1.8)
1998    43 (2.3)     30 (1.9)   21 (1.5)       6 (0.8)      57 (2.3)           27 (2.0)
2000    39 (1.3)     30 (1.3)   23 (1.3)       8 (0.9)      61 (1.3)           30 (1.3)

Standard errors of the estimated percentages appear in parentheses.

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.



Data Table: 2000 Results by Type of Location

Percentage of students and average reading scale scores by school's type of location: 2000

        Central city   Urban fringe/large town   Rural/small town
2000    32 (1.7)       45 (2.4)                  23 (2.1)
        209 (1.6)      222 (1.8)                 218 (1.8)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.

Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by school's type of location: 2000

                          Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
Central city              47 (1.9)     27 (1.6)   20 (1.1)       6 (0.9)      53 (1.9)           26 (1.7)
Urban fringe/large town   32 (1.6)     32 (1.7)   26 (1.2)       10 (1.1)     68 (1.6)           36 (1.8)
Rural/small town          35 (1.7)     33 (1.9)   25 (1.9)       8 (0.8)      65 (1.7)           32 (2.3)

Standard errors of the estimated percentages appear in parentheses.

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2000 Reading Assessment.



Percentage of students and average fourth-grade reading scale scores by student eligibility for the free/reduced-price lunch program: 1998-2000

        Eligible    Not eligible   Information not available
1998    35 (1.2)    54 (1.8)       12 (1.9)
        198 (1.2)   227 (0.9)      227 (2.8)
2000    34 (1.2)    51 (1.9)       15 (2.2)
        196 (1.2)   227 (1.2)      228 (1.9)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by student eligibility for the free/reduced-price lunch program: 1998-2000

                        Eligible             Not eligible         Information not available
                        1998      2000       1998      2000       1998      2000
Below Basic             58 (1.4)  60 (1.3)   27 (1.2)  26 (1.2)   27 (2.7)  26 (2.0)
At Basic                29 (1.4)  26 (1.2)   33 (1.2)  34 (1.4)   33 (2.4)  32 (1.9)
At Proficient           11 (1.1)  12 (0.9)   30 (1.0)  30 (1.0)   29 (3.1)  30 (2.1)
At Advanced              2 (0.4)   2 (0.3)   10 (0.9)  11 (0.8)   11 (1.6)  12 (1.3)
At or above Basic       42 (1.4)  40 (1.3)   73 (1.2)  74 (1.2)   73 (2.7)  74 (2.0)
At or above Proficient  13 (1.2)  14 (0.9)   40 (1.3)  41 (1.3)   40 (3.8)  42 (2.1)

Standard errors of the estimated percentages appear in parentheses.

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Percentage of students and average fourth-grade reading scale scores by type of school: 1992-2000

Public Nonpublic Nonpublic: Catholic Other Nonpublic

1992 89 (1.0) 11(1.0) 8 (0.8) 4 (0.7)

215 (1.0) 232 (1.7) 229 (2.2) 238 (2.9) !

1994 90 (0.9) 10 (0.9) 7 (0.8) 4 (0.6)

212 (1.1) 231 (2.5) 229 (3.3) 234 (3.7)

1998 89 (1.2) 11(1.2) 7 (1.0) 4 (0.6)

215 (0.8) 233 (2.3) 233 (2.5) 232 (4.5)

2000 89 (0.7) 11(0.7) 6 (0.6) 5 (0.5)

215 (0.9) 234 (1.7) 231 (2.6) 237 (2.1)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

! The nature of the sample does not allow accurate determination of the variability of the statistic.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Percentages of fourth-grade students at or above reading achievement levels and within each achievement level range by type of school: 1992-2000

                       Below Basic  At Basic    At Proficient  At Advanced  At or above Basic  At or above Proficient
Public
  1992                 40 (1.1)     33 (1.0)    21 (1.0)        6 (0.6)     60 (1.1)           27 (1.3)
  1994                 41 (1.1)     30 (0.8)    21 (1.0)        7 (0.7)     59 (1.1)           28 (1.2)
  1998                 39 (1.0)     31 (0.8)    23 (0.8)        6 (0.5)     61 (1.0)           29 (0.9)
  2000                 40 (0.9)     31 (1.0)    22 (0.8)        7 (0.6)     60 (0.9)           30 (1.0)
Nonpublic
  1992                 21 (1.9)     34 (2.2)    33 (2.2)       12 (1.3)     79 (1.9)           45 (2.4)
  1994                 23 (2.4)     34 (2.2)    31 (2.0)       13 (1.8)     77 (2.4)           43 (3.0)
  1998                 22 (2.6)     32 (2.1)    32 (2.1)       14 (1.5)     78 (2.6)           46 (2.9)
  2000                 20 (1.8)     32 (1.8)    34 (1.8)       14 (1.4)     80 (1.8)           47 (2.4)
Nonpublic: Catholic
  1992                 24 (2.7)     35 (2.2)    30 (2.0)       10 (1.5)     76 (2.7)           41 (2.7)
  1994                 24 (3.2)     34 (2.9)    30 (2.8)       12 (2.2)     76 (3.2)           42 (3.9)
  1998                 21 (2.9)     33 (2.5)    32 (2.5)       13 (1.7)     79 (2.9)           46 (3.3)
  2000                 22 (2.9)     33 (2.2)    33 (2.3)       11 (1.9)     78 (2.9)           44 (3.1)
Other Nonpublic
  1992                 16 (2.7) !   31 (4.1) !  38 (4.5) !     15 (2.9) !   84 (2.7) !         53 (4.4) !
  1994                 20 (4.2)     34 (3.0)    32 (2.6)       14 (2.9)     80 (4.2)           46 (4.0)
  1998                 24 (4.6)     30 (3.0)    31 (3.6)       16 (2.9)     76 (4.6)           46 (5.0)
  2000                 18 (2.1)     31 (2.7)    35 (2.3)       16 (1.8)     82 (2.1)           51 (2.9)

Standard errors of the estimated percentages appear in parentheses.

! The nature of the sample does not allow accurate determination of the variability of the statistic.
NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Students' reports on the number of pages read each day in school and for homework: 1992-2000

1992 1994 1998 2000

11 or more pages 56 (1.2) * 54 (1.1) * 57 (1.1) 60 (1.2)

222 (1.1) 220 (1.3) 221 (0.8) 222 (0.9)

6 to 10 pages 23 (0.7) * 23 (0.7) * 22 (0.6) 20 (0.8)

217 (1.3) 214 (1.3) 217 (1.3) 215 (1.5)

5 or fewer pages 21(1.0) 23 (0.8) * 21(0.8) 19 (0.8)

203 (1.4) 201 (1.2) 207 (1.3) 202 (2.1)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.
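The asterisk flags in tables like this one mark estimates that are significantly different from 2000. A rough way to reproduce such a flag from the published numbers is a two-sample z-test on the estimates and their standard errors; this is only an approximation, since NAEP's published procedure also applies multiple-comparison adjustments and works from unrounded estimates:

```python
import math

def approx_sig(est1, se1, est2, se2, z_crit=1.96):
    """Approximate significance check for the difference between two
    independent estimates, using their reported standard errors.
    Illustration only: NAEP's actual procedure also adjusts for
    multiple comparisons."""
    z = (est1 - est2) / math.sqrt(se1 ** 2 + se2 ** 2)
    return abs(z) > z_crit

# "11 or more pages" row: 1992, 56 (1.2) vs. 2000, 60 (1.2) -> flagged *
print(approx_sig(56, 1.2, 60, 1.2))   # True

# Same row: 1998, 57 (1.1) vs. 2000, 60 (1.2) -> not flagged
print(approx_sig(57, 1.1, 60, 1.2))   # False
```

Because the printed values are rounded, a borderline comparison recomputed this way can occasionally disagree with the published flag.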

Students' reports on the amount of time spent doing homework each day: 1992-2000

1992 1994 1998 2000

More than one hour 15 (0.6) 15 (0.6) 16 (0.6) 16(0.7)

208 (1.9) 208 (2.1) 213 (1.7) 212 (1.7)

One hour 28 (0.9) 30 (0.7) 31(0.8) 29 (0.6)

221 (1.1) 218 (1.4) 221 (1.2) 222 (1.2)

One-half hour 39 (1.2) * 39 (1.0) * 43 (1.0) 43 (0.9)

217 (1.3) 216 (1.3) 219 (1.1) 219 (1.1)

Do not do homework 2 (0.2) 3 (0.3) * 2 (0.2) 2 (0.2)

196 (3.7) 183 (3.0) 188 (3.4) 172 (4.1)

Do not have homework 16 (1.6) * 13 (0.9) * 8 (0.8) 10 (0.9)

220 (1.6) 216 (2.2) 213 (2.7) 212 (3.0)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Students' reports on how often they write long answers to questions on tests or assignments that involved reading: 1992-2000

1992 1994 1998 2000

At least once a week 51(1.0) 48 (0.8) * 53 (0.8) 53 (1.1)

220 (1.1) 217 (1.0) 218 (0.8) 217 (1.0)

Once or twice a month 28 (0.8) 31(0.7) * 30 (0.6) 28 (0.9)

221 (1.2) 221 (1.4) 223 (1.0) 225 (1.2)

Once or twice a year 13 (0.5) 12 (0.5) 10 (0.5) 11 (0.4)

209 (2.2) 209 (2.2) 212 (1.8) 210 (2.5)

Never or hardly ever 9 (0.5) 9 (0.4) * 8(0.4) 8 (0.4)

202 (2.1) 198 (2.8) 199 (2.1) 199 (2.6)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Students' reports on how often their teachers help them break words into parts and help them understand new words: 1998-2000

How often their teachers help them break words into parts

1998 2000

Every day 25 (0.7) 25 (0.9)

210 (1.2) 209 (1.3)

Once or twice a week 23 (0.6) 22 (0.7)

217 (1.3) 217 (1.2)

Never or hardly ever 52 (0.8) 53 (0.9)

226 (1.0) 226 (1.1)

How often their teachers help them understand new words

Every day 49 (0.9) 48 (0.9)

217 (0.9) 217 (1.1)

Once or twice a week 24 (0.7) 23 (0.6)

224 (1.2) 224 (1.1)

Once or twice a month 14 (0.5) 14 (0.5)

223 (1.9) 224 (1.9)

Never or hardly ever 12 (0.5) * 14 (0.5)

219 (1.8) 216 (2.1)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Students' reports on how often they read for fun on their own time: 1992-2000

1992 1994 1998 2000

Every day 44 (0.9) 45 (0.7) 43 (0.7) 43 (0.9)

223 (1.2) 223 (1.2) 222 (1.1) 223 (1.1)

Once or twice a week 32 (0.8) 32 (0.7) 32 (0.6) 32 (0.7)

218 (1.2) 213 (1.1) 219 (1.0) 218 (0.9)

Once or twice a month 12 (0.4) 12 (0.5) 12 (0.6) 12 (0.5)

210 (1.6) 208 (2.1) 216 (1.7) 216 (1.6)

Never or hardly ever 13 (0.5) 12 (0.4) * 13 (0.5) 14 (0.5)

199 (1.8) 197 (1.9) 203 (1.4) 202 (2.8)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Discussing Studies and Talking About Reading

Students' reports on how often they discuss their studies at home and talk about reading with their family and friends: 1992-2000

Discuss studies at home

1992 1994 1998 2000

Almost every day 54 (0.8) 55 (0.8) 54 (0.6) 54 (0.7)

221 (1.0) 219 (1.0) 220 (0.8) 221 (1.1)

Once or twice a week 22 (0.7) 22 (0.5) 23 (0.6) 23 (0.6)

220 (1.5) 215 (1.7) 222 (1.3) 219 (1.2)

Once or twice a month 6 (0.3) 6 (0.4) 6 (0.3) 6 (0.3)

215 (1.8) 208 (2.3) 213 (2.2) 217 (3.5)

Never or hardly ever 17 (0.6) 17 (0.5) 18 (0.5) 17 (0.5)

202 (1.5) 199 (1.7) 205 (1.3) 201 (1.7)

Talk about reading with family or friends

Almost every day 26 (0.6) 28 (0.6) 27 (0.5) 27 (0.6)

215 (1.4) 213 (1.3) 211 (1.2) 213 (1.6)

Once or twice a week 36 (0.9) 36 (0.6) 35 (0.7) 34 (0.7)

224 (1.1) 223 (1.2) 223 (1.0) 227 (1.0)

Once or twice a month 15 (0.6) 15 (0.5) 15 (0.5) 15 (0.5)

219 (1.6) 214 (2.1) 222 (1.5) 220 (1.8)

Never or hardly ever 23 (0.8) 21 (0.6)* 23 (0.7) 24 (0.6)

209 (1.4) 207 (1.6) 214 (1.3) 209 (1.1)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Students' reports on the number of different types of reading materials in the home: 1992-2000

1992 1994 1998 2000

Four 37 (0.9) 38(0.8)* 37 (1.0) 34 (0.8)

226 (1.3) 227 (1.1) 228 (1.1) 229 (1.3)

Three 32 (0.7) 34 (0.7) 33 (0.7) 34 (0.8)

219 (1.3) 216 (1.2) 220 (1.1) 219 (1.0)

Two or fewer 31(0.8) 29 (0.9) * 30 (0.8) 32 (1.0)

204 (0.9) 197 (1.4) 204 (1.1) 203 (1.3)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Students' reports on the amount of time spent watching television each day: 1992-2000

1992 1994 1998 2000

Six hours or more 20 (0.7) 21 (0.7)* 16 (0.6) * 18 (0.6)

199 (1.5) 194 (1.4) 198 (1.5) 196 (1.7)

Four or five hours 22 (0.8) * 22 (0.7) * 19 (0.7) * 17(0.6)

216 (1.3) 216 (1.7) 216 (1.4) 213 (2.2)

Two or three hours 40 (0.8) 38 (0.7) * 41(0.9) 40 (0.7)

224 (1.0) 222 (1.1) 223 (0.9) 224 (1.0)

One hour or less 19 (0.8) * 19 (0.7) * 24 (0.7) 25 (0.8)

221 (1.6) 220 (1.9) 222 (1.3) 224 (1.4)

The percentage of students is listed first with the corresponding average scale score presented below.

Standard errors of the estimated percentages and scale scores appear in parentheses.

* Significantly different from 2000.
NOTE: Percentages may not add up to 100 due to rounding.
SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, and 2000 Reading Assessments.


Average score by type of results: 1998-2000

Accommodations not permitted  Accommodations permitted

1998 217 (0.8) 216 (0.9)

2000 217 (0.8) 215 (1.0) †

Standard errors of the estimated scale scores appear in parentheses.

† Significantly different from the sample where accommodations were not permitted.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Percentages of students attaining reading achievement levels by type of results: 1998-2000

                                      Below Basic  At Basic  At Proficient  At Advanced  At or above Basic  At or above Proficient
1998: Accommodations not permitted    38 (0.9)     32 (0.7)  24 (0.7)       7 (0.5)      62 (0.9)           31 (0.9)
1998: Accommodations permitted        39 (1.0)     31 (0.8)  23 (0.8)       8 (0.5)      61 (1.0)           31 (0.9)
2000: Accommodations not permitted    37 (0.8)     31 (0.9)  24 (0.8)       8 (0.5)      63 (0.8)           32 (0.9)
2000: Accommodations permitted        39 (1.1)     30 (0.9)  23 (0.9)       7 (0.6)      61 (1.1)           31 (0.9)

Standard errors of the estimated percentages appear in parentheses.

NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Average scores by gender and type of results (Accommodations Not Permitted and Accommodations Permitted): 1998-2000

        Male                           Female
        Not permitted   Permitted      Not permitted   Permitted
1998    214 (1.1)       214 (1.2) *    220 (0.7)       219 (1.1)
2000    212 (1.1)       210 (1.0)      222 (0.9)       220 (1.2)

Standard errors of the estimated scale scores appear in parentheses.

* Significantly different from 2000.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Percentages of students attaining reading achievement levels by gender and by type of results: 1998-2000

                                      Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
Male
  1998: Accommodations not permitted  41 (1.4)     31 (1.0)   22 (0.9)        6 (0.6)     59 (1.4)           28 (1.2)
  1998: Accommodations permitted      41 (1.2)     30 (1.0)   22 (1.0)        6 (0.7)     59 (1.2)           28 (1.1)
  2000: Accommodations not permitted  42 (1.2)     31 (1.1)   21 (1.2)        6 (0.5)     58 (1.2)           27 (1.1)
  2000: Accommodations permitted      44 (1.2)     30 (1.4)   20 (1.0)        6 (0.6)     56 (1.2)           26 (1.1)
Female
  1998: Accommodations not permitted  35 (1.0)     32 (1.1)   25 (0.9)        8 (0.6)     65 (1.0)           33 (1.0)
  1998: Accommodations permitted      36 (1.2)     31 (1.0)   24 (1.1)        9 (0.8)     64 (1.2)           33 (1.1)
  2000: Accommodations not permitted  33 (1.2)     31 (1.5)   26 (1.0)       10 (0.8)     67 (1.2)           36 (1.2)
  2000: Accommodations permitted      34 (1.3)     30 (1.0)   26 (1.2)        9 (1.0)     66 (1.3)           35 (1.2)

Standard errors of the estimated percentages appear in parentheses.
NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.

Average scores by race/ethnicity and type of results (Accommodations Not Permitted and Accommodations Permitted): 1998-2000

        White                      Black                      Hispanic                   Asian/Pacific Islander     American Indian
        Not permitted  Permitted  Not permitted  Permitted   Not permitted  Permitted   Not permitted  Permitted   Not permitted  Permitted
1998    227 (0.8)      226 (1.0)  194 (1.7)      194 (1.8)   196 (1.8)      193 (2.6)   225 (2.7)      220 (3.8)   202 (3.1)      199 (3.0)
2000    226 (1.0)      225 (1.0)  193 (1.7)      193 (1.4)   197 (1.7)      190 (2.5) † 232 (4.6)      229 (4.3)   196 (4.7)      201 (4.3)

Standard errors of the estimated scale scores appear in parentheses.

† Significantly different from the sample where accommodations were not permitted.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Percentages of students attaining reading achievement levels by race/ethnicity and by type of results: 1998-2000

                                      Below Basic  At Basic   At Proficient  At Advanced  At or above Basic  At or above Proficient
White
  1998: Accommodations not permitted  27 (1.1)     34 (0.8)   29 (0.9)       10 (0.7)     73 (1.1)           39 (1.1)
  1998: Accommodations permitted      28 (1.3)     33 (1.1)   29 (1.2)       10 (0.8)     72 (1.3)           39 (1.2)
  2000: Accommodations not permitted  27 (1.1)     33 (1.3)   29 (1.0)       11 (0.7)     73 (1.1)           40 (1.2)
  2000: Accommodations permitted      29 (1.3)     33 (1.2)   29 (1.2)       10 (0.8)     71 (1.3)           39 (1.2)
Black
  1998: Accommodations not permitted  64 (1.7)     26 (1.6)    9 (1.0)        1 (0.5)     36 (1.7)           10 (1.0)
  1998: Accommodations permitted      63 (2.1)     26 (2.0)    9 (1.2)        1 (0.4)     37 (2.1)           10 (1.1)
  2000: Accommodations not permitted  63 (1.6)     25 (1.5)   10 (1.2)        2 (0.6)     37 (1.6)           12 (1.3)
  2000: Accommodations permitted      63 (1.7)     26 (1.4)    9 (1.1)        1 (0.6)     37 (1.7)           11 (1.1)
Hispanic
  1998: Accommodations not permitted  60 (1.9)     26 (1.5)   11 (1.2)        2 (0.4)     40 (1.9)           13 (1.2)
  1998: Accommodations permitted      63 (2.5)     25 (1.8)   11 (1.2)        2 (0.4)     37 (2.5)           13 (1.3)
  2000: Accommodations not permitted  58 (1.8)     26 (1.6)   13 (1.4)        3 (0.6)     42 (1.8)           16 (1.3)
  2000: Accommodations permitted      62 (2.3)     24 (2.4)   12 (1.3)        2 (0.5)     38 (2.3)           14 (1.4)
Asian/Pacific Islander
  1998: Accommodations not permitted  31 (4.2)     32 (4.7)   25 (3.4)       12 (2.9)     69 (4.2)           37 (3.9)
  1998: Accommodations permitted      37 (5.0)     29 (4.4)   23 (3.4)       11 (2.8)     63 (5.0)           34 (3.7)
  2000: Accommodations not permitted  22 (5.0)     32 (3.9)   29 (3.8)       17 (4.6)     78 (5.0)           46 (4.4)
  2000: Accommodations permitted      26 (4.4)     30 (4.8)   28 (4.0)       16 (3.0)     74 (4.4)           44 (5.0)
American Indian
  1998: Accommodations not permitted  53 (5.5)     33 (5.2)   12 (4.5)        2 (1.2)     47 (5.5)           14 (3.8)
  1998: Accommodations permitted      58 (5.4)     29 (4.3)   12 (3.5)        1 (0.6)     42 (5.4)           13 (3.2)
  2000: Accommodations not permitted  57 (4.2)     26 (6.1)   16 (4.6)        2 (****)    43 (4.2)           17 (4.8)
  2000: Accommodations permitted      51 (5.1)     29 (6.9)   18 (4.6)        2 (****)    49 (5.1)           20 (4.8)

Standard errors of the estimated percentages appear in parentheses.

(****) Standard error estimates cannot be accurately determined.
NOTE: Percentages within each reading achievement level range may not add to 100, or to the exact percentages at or above achievement levels, due to rounding.

SOURCE: National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1998 and 2000 Reading Assessments.


Acknowledgments

This report is the culmination of the effort of many individuals who contributed their considerable knowledge, experience, and creativity to the NAEP 2000 reading assessment. The assessment was a collaborative effort among staff from the National Center for Education Statistics (NCES), the National Assessment Governing Board (NAGB), Educational Testing Service (ETS), Westat, and National Computer Systems (NCS). Most importantly, NAEP is grateful to the students and school staff who made the assessment possible.

The 2000 NAEP reading assessment was funded through NCES, in the Office of Educational Research and Improvement of the U.S. Department of Education. The acting Commissioner of Education Statistics, Gary W. Phillips, and the NCES staff (Peggy Carr, Patricia Dabbs, Arnold Goldstein, Carol Johnson, Andrew Kolstad, Holly Spurlock, and Sheida White) worked closely and collegially with the authors to produce this report.

The NAEP project at ETS is directed by Stephen Lazer and John Mazzeo. Sampling and data collection activities were conducted by Westat under the direction of Rene Slobasky, Nancy Caldwell, Keith Rust, and Dianne Walsh. Printing, distribution, scoring, and processing activities were conducted by NCS under the direction of Brad Thayer, Patrick Bourgeacq, William Buckles, Mathilde Kennel, Linda Reynolds, and Connie Smith.

The complex statistical and psychometric activities necessary to report results for the NAEP 2000 reading assessment were directed by Nancy Allen with assistance from Spence Swinton. The numerous computer programs associated with these activities were carried out by Steve Isham and Lois Worthington. The analyses presented in this report were produced by Dave Freund, Steve Isham, Laura Jerry, Youn-hee Lim, and Lois Worthington.

The design and production of this report was overseen by Loretta Casalaina. Joseph Kolodey and Carol Erickson contributed invaluable design and production expertise to the effort. Assistance in the production of the reports was also provided by Barbette Tardugno. Wendy Grigg coordinated the documentation and data checking procedures with assistance from Alice Kass. Shari Santapau coordinated the editorial and proofreading procedures. The web version of this report was coordinated by Pat O'Reilly with assistance from Rick Hasney. These activities, as well as statistical and psychometric activities, were conducted at ETS.

Many thanks are due to the numerous reviewers, both internal and external to NCES and ETS. The comments and critical feedback of the following reviewers are reflected in the final version of this report: Peter Afflerbach, Mary Crovo, Patricia Dabbs, Lawrence Feinberg, Steve Gorman, David Grissmer, Carol Johnson, Janet Johnson, Andrew Kolstad, Marilyn McMillen, Holly Spurlock, Alan Vanneman, and Sheida White.



U.S. Department of Education
Office of Educational Research and Improvement (OERI)
National Library of Education (NLE)
Educational Resources Information Center (ERIC)

NOTICE

This document is covered by a signed "Reproduction Release (Blanket)" form (on file within the ERIC system), encompassing all or classes of documents from its source organization and, therefore, does not require a "Specific Document" Release form.

This document is Federally-funded, or carries its own permission to reproduce, or is otherwise in the public domain and, therefore, may be reproduced by ERIC without a signed Reproduction Release form (either "Specific Document" or "Blanket").

EFF-089 (9/97)