Afterschool Counts!


Transcript of Afterschool Counts!

Page 1: Afterschool Counts!

Afterschool Counts!

By Leila Fiester with Policy Studies Associates, Inc. Prepared for the After School Project of The Robert Wood Johnson Foundation

A Guide to Issues and Strategies for Monitoring Attendance in Afterschool and Other Youth Programs

Page 2: Afterschool Counts!

CONTENTS

EXECUTIVE SUMMARY i
I. Why Track Afterschool Attendance? 1
II. Overview of Effective Strategies 4
III. Using Data 5
IV. Collecting Data 10
  Essential and Optional Data Elements 10
  Types of Attendance to Track 11
  When and How Often to Take Attendance 12
  How Program Design Affects Data Collection 12
  Ensuring Data Quality 12
  Merging Data from More than One System 13
V. Processing Data 14
  Technology Needs, Costs, and Complexity 14
  Data Entry 14
  Cleaning Data 15
VI. Analyzing Data 17
  Selecting a Denominator for Attendance Rate 17
  Measuring Dosage 17
  Interpreting and Comparing Results 18
  Using Attendance Data to Assess Demand 19
  Reporting Data 19
VII. Creating Capacity for Data Tracking 20
  Department of Youth and Community Development (New York, NY) 20
  Boston After School Enterprise (Boston, MA) 21
  Chicago Public Schools (Chicago, IL) 22
VIII. Necessary Resources 23
  Staff Time and Capacity 23
  Money and Expertise for Technology 23
IX. Lessons and Conclusions 24
  Choosing a System 24
  Implementing the Process 25
  Living with the Consequences 26
  Conclusions 26
Appendix A: Interviewees 27
Appendix B: Profiles of Selected Strategies 28
Appendix C: Resources 41
Acknowledgments 42

Page 3: Afterschool Counts!

ABOUT THE AFTER SCHOOL PROJECT

The Robert Wood Johnson Foundation created the After School Project in 1998 as a five-year, three-city demonstration aimed at connecting significant numbers of young people in low-income neighborhoods with responsible adults during out-of-school time. To that end, the Project focuses on developing: (1) consistent, dedicated revenues to support afterschool programs in low-income communities; (2) an array of developmental opportunities for youth, including physical activities, sports, and educational, social, and recreational programs; and (3) strong local organizations with the necessary resources, credibility, and political clout to bring focus and visibility to the youth development field.

After School Project staff provide technical assistance and tools to sites that receive RWJF afterschool grants. In the area of attendance monitoring, the Foundation has a particular interest in helping cities to develop the capacity needed for monitoring youth participation in all programs that occur during nonschool hours.

Page 4: Afterschool Counts!

EXECUTIVE SUMMARY

THE PUBLIC PROFILE of out-of-school activities for children and youth has grown dramatically in the last 15 years, along with the number of afterschool programs. With that growth in programming (and, by necessity, funding) came a greater emphasis on accountability. That pressure, in turn, underscored for program directors, funders, and researchers the need for better systems of counting and tracking afterschool attendance and participation.

This guide—based on interviews with people working on the front lines to staff, direct, or evaluate afterschool programs—is intended to raise awareness of the issues and options involved in tracking attendance and participation. It describes the challenges inherent in using, collecting, processing, and analyzing data, and it offers practical examples of ways that program directors and youth-serving organizations are trying to meet the need for attendance data.

Options for collecting data on student attendance range from a pen-and-paper approach to web-based data systems and swipe card technology. The “right” method for collecting, organizing, and analyzing data depends on how program leaders expect to use the data—what questions they need to answer, and for whom—as well as the program’s size, structure, and resources.

USING DATA

Data on afterschool attendance and/or participation can be used to:

■ Fulfill accountability requirements (e.g., to determine the daily cost per child; verify grantees’ compliance with a targeted level of service; and substantiate reimbursement claims made to city, state, or federal funders)

■ Monitor the quality and effectiveness of an overall initiative

■ Evaluate student outcomes—for instance, by comparing attendance data to students’ school performance records to measure the effect of participation

■ Support program-level planning and management decisions, including how many staff to hire and how to deploy them, how much space to obtain, and how many snacks or supplies to order

■ Facilitate case management for participants with special needs

■ Support student rewards, incentives, stipends, and sanctions

■ Gauge demand for services, overall and for specific activities

■ Facilitate staff self-reflection, training, and education

■ Advocate for more funding or for the use of specific strategies

Examples of all of these uses can be found in Chapter III.

COLLECTING DATA

Researchers and program directors suggest that the following data elements are essential to collect:


Page 5: Afterschool Counts!

■ Site name (if part of a multi-site initiative or citywide database)

■ Total number of students enrolled

■ Total head count per day, week, month

■ Student names

■ Individualized student number, such as a school- or district-assigned ID

■ Age and/or grade in school

■ Each student’s first and last date of enrollment

■ Each student’s demographic information

Just as there are several types of data to track, there are several ways to measure attendance: (1) by the total number of children who come in the door (average daily attendance, or ADA, for the program overall); (2) by the number of days that each child attends; and (3) by the kinds of activities that children participate in (e.g., homework help, sports, arts). Many programs measure only the total number of children who attend the program. Those that want to understand what low or high ADA rates mean, or to link attendance to specific outcomes, will also record whether an individual student attends on a given day.
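The three ways of measuring attendance described above can be sketched from a simple daily sign-in log. This is a minimal illustration, not a system described in this guide; all names, dates, and activities are invented.

```python
from collections import Counter

# Hypothetical daily sign-in log transcribed from paper rosters:
# (date, student, activity). All names and dates are invented.
log = [
    ("2003-10-01", "Ana", "homework help"),
    ("2003-10-01", "Ben", "sports"),
    ("2003-10-02", "Ana", "arts"),
    ("2003-10-02", "Ben", "homework help"),
    ("2003-10-02", "Cal", "sports"),
]

# (1) Unduplicated head count per day, averaged into ADA.
student_days = {(date, student) for date, student, _ in log}
per_day = Counter(date for date, _ in student_days)
ada = sum(per_day.values()) / len(per_day)  # 5 student-days / 2 days = 2.5

# (2) Number of days that each child attended.
days_per_child = Counter(student for _, student in student_days)

# (3) Attendance by kind of activity.
per_activity = Counter(activity for _, _, activity in log)
```

Even this toy log shows why the choice matters: the program-level ADA of 2.5 says nothing about whether the same children return each day, which only the per-child counts reveal.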

Program directors who want to know which program model produces the most impact or is of the highest quality also need to know what students do after they walk in the door. Multi-site initiatives and citywide data collection systems face an additional issue: Will they accept data in multiple formats or insist that each site use the same format so that data quality is consistent? Very few site-based programs have the technical ability to translate data from one system into another, and it may be hard to justify that level of effort.

Program design further affects data collection, because the way in which students move through the day can make one method or level of data collection more appropriate than another. Some design elements to consider:

■ If the program is school-based, do students leave the building between the regular and afterschool day?

■ Do some activities occur offsite? If so, a centralized sign-in location won’t capture data on all students.

■ Do all students participate in the same activity at the same time, or do groups of students rotate through activities?

PROCESSING DATA

Electronic processing systems make it easier and faster to analyze and use data later on, but they also carry financial and human resource costs. Choices to make about processing data include:

■ Should you automate the attendance data or stick to a simpler, but perhaps more time-consuming, system of hard copy storage and calculations done by hand? The inherent tradeoff: financial cost, investment in staff training, and ongoing commitments of staff time vs. easier data analysis and reporting, greater flexibility, and reduced staff time and effort to create reports and analyses.

■ If the system is automated, how will you support sites that lack the necessary equipment?


Page 6: Afterschool Counts!

Examples that address these questions can be found in Chapter V.

Accurate data entry is especially important because erroneous or missing information can distort findings and thus undermine the quality of the database. The most common problems include invalid entries for a specific variable, such as attendance recorded for a date on which the program did not operate, and missing data on a student’s demographic characteristics. Other common issues include: verifying the student’s enrollment date, clarifying the dropout date, checking that the recorded dates of attendance were indeed dates on which the program operated, and obtaining a unique student ID number for each participant.
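A minimal cleaning pass might flag the problems described above before records enter the database. This is a hypothetical sketch; the field names, IDs, and dates are invented, not taken from any program in this guide.

```python
# Reference data a program would draw from its own records (invented here).
operating_days = {"2003-10-01", "2003-10-02"}   # days the program actually ran
enrolled_ids = {"1001", "1002"}                 # valid IDs from enrollment records

records = [
    {"date": "2003-10-01", "id": "1001", "grade": "4"},
    {"date": "2003-10-05", "id": "1002", "grade": "5"},  # program was closed that day
    {"date": "2003-10-02", "id": "9999", "grade": ""},   # unknown ID, missing grade
]

def problems(rec):
    """Return a list of data-quality flags for one attendance record."""
    flags = []
    if rec["date"] not in operating_days:
        flags.append("attendance recorded on a non-operating day")
    if rec["id"] not in enrolled_ids:
        flags.append("no matching student ID in enrollment records")
    if not rec["grade"]:
        flags.append("missing demographic field: grade")
    return flags

flagged = {f"{r['date']}/{r['id']}": problems(r) for r in records}
```

Records with an empty flag list pass; the others would be routed back to the site for correction rather than silently entered.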

ANALYZING DATA

Attendance data usually are collected daily but tracked monthly. In other words, at the end of each month the daily data are aggregated so programs can calculate overall enrollment and average daily attendance (for the program and, sometimes, per child). Other useful analyses include:

■ Number of days that each child attended

■ Individual children’s attendance rates

■ Duration of enrollment—the average number of weeks or months that a child remains in the program and whether a child or group of children attends for more than one enrollment period (i.e., semester or year)

■ The total number of children served during the year, versus the number who attend for a specified duration

■ Maximum and minimum number of children served on any given day

The attendance rate can be calculated in several ways. Some programs base their analysis on the percent of available days that an enrolled child (or group of children) actually attended. Others interpret average daily attendance as the number of students present in a given site, on an average day, over the course of a month.

Some programs also measure “dosage,” which refers to the level of exposure to a program or activity that a child receives. Students who attend for more hours or days are presumed to receive a higher dosage than those whose experience is more limited. Dosage is most relevant for programs that seek to show that participation in an afterschool program produces specific outcomes. The main challenge here is to establish a standard for the minimum dosage level needed to produce the desired results. If one measures solely the percent of available days of service that a child or group attends, and the number of available days is very small, interpretations of participation might be falsely high. Combining the duration of a child’s attendance with his or her attendance rate provides a better measure of dosage.
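The caution above, that a rate alone can look falsely high when few days were available, can be shown with invented figures. The sketch pairs the attendance rate with the raw day count, as the text recommends.

```python
def dosage(days_attended, days_available):
    """Return (attendance_rate, days_attended) as a combined dosage measure.
    The rate alone can mislead when days_available is very small."""
    return days_attended / days_available, days_attended

# A child attending 2 of 2 available days has a perfect rate...
short_run = dosage(2, 2)       # rate 1.0, but only 2 days of exposure
# ...yet far less exposure than one attending 80 of 100 available days.
long_run = dosage(80, 100)     # rate 0.8, but 80 days of exposure
```

Reporting both numbers keeps the short-duration child from appearing to have the higher dosage.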

The final steps in analyzing data involve interpreting and comparing results. Some important questions to consider include:

■ Are students required to attend every day that the program operates and stay for the full session, or are they allowed to come and go at will? For a drop-in program it’s fine if the data show that students don’t stay for a full session, but for a more structured program—especially one with academic goals—that finding suggests a problem.


Page 7: Afterschool Counts!

■ Is the program offered every day of the week? In order to reach conclusions about exposure, you need to know how many hours per day and week the program operates.

■ How are certain services defined? For initiatives that aim to increase the number of afterschool slots available to youth, for instance, how does one define “new” or “expanded” services?

■ Are there competing activities that prevent children from attending? A survey or interviews may be the best way to understand the impact of contextual factors on attendance patterns.

CREATING CITYWIDE CAPACITY FOR DATA TRACKING

A few cities and school systems are developing data systems that span a variety of youth-serving programs. Most are still evolving, so it is difficult to find full-blown models ready for replication. However, Chapter VII of this guide features three efforts that illustrate some promising attempts to address the challenge.

The Department of Youth and Community Development (DYCD) in New York City is building on the state’s Results-Oriented Management and Accountability framework to create a Web-based system for linking the participation of youth and adults in specific afterschool activities to outcomes. Currently, outcomes are limited to program attendance. As the system moves forward, program attendance will continue to be tracked but will be monitored only to determine whether a program is serving the projected number of children. The system will be piloted at the city’s Beacon Schools. All DYCD contract officers will be trained to use the database to generate reports for funders and other stakeholders.

Building Boston’s After-School Enterprise (BASE), supported by The Robert Wood Johnson Foundation, is developing a citywide data system that will provide up-to-date information on afterschool supply and demand, resources for parents and service providers, and state-of-the-field reports. BASE staff approached city agencies and intermediary organizations that already compile information about individual sites and asked them to cooperate in creating a single data collection tool. That instrument, currently in development, will track broad program-level data such as type of service(s), enrollment capacity, actual enrollment, participants’ demographic characteristics, staffing configurations, and funding streams.

In 2002-03, the afterschool planning division and the budget office of Chicago Public Schools (CPS) began working on a citywide system for tracking attendance in afterschool programs. The effort was driven by the need to document the number of afterschool participants eligible for TANF reimbursement and by concerns about preserving the programs with heaviest usage during a time of budget cuts. The Web-based afterschool data system builds on two bases: CPS’s system for tracking teacher payroll and an online system created by the city’s summer jobs initiative for youth. Data are entered at each afterschool site by the school’s payroll clerk, who has only to enter a student’s identification number to trigger the underlying school database and find the child’s name and vital information.

NECESSARY RESOURCES

What does it take to develop the data systems described in this guide? Staff time and capacity for data activities, and money and expertise for technology. Typical costs for a data system include:

■ Purchase and installation of onsite computers, if not already available

■ Development or purchase of a management information system (MIS)


Page 8: Afterschool Counts!

■ Training for the staff who will collect and/or enter data

■ Internet access

■ For swipe cards, at least one scanner and a serial connection

■ Purchase or lease of a software package

■ Customization of the MIS software, if desired

■ Expertise to analyze and interpret the data (e.g., using statistical software or reports generated by the systems)

Software costs often vary, depending on how many sites (or, in the case of swipe cards, classrooms) are involved and how much customization is required to meet the program’s information needs. (Typical costs are presented in Chapter VIII.)

LESSONS AND CONCLUSIONS

Although data needs vary from program to program, the following lessons apply to most systems:

■ Keep it simple. If the program doesn’t really need lots of bells and whistles, opt for something streamlined and easy to use.

■ Make sure the people who are directly responsible for data collection play a role in designing the system. Frontline program staff can give system designers a realistic sense of what data can and can’t be collected.

■ If you want to link afterschool attendance to educational outcomes, you’ll probably need to establish a data partnership with the school district. Agreements on what data will be provided, how, and with whom, pave the way for data sharing.

■ Build trust for how the data will be used. Make an effort to convince people that they need good data to strengthen their programs and communicate their successes.

■ Be consistent in requiring that sites submit data, but be as flexible as possible in how they collect and submit the information. Don’t sacrifice standardization, however, because that will jeopardize your ability to compare across sites.

■ Urge sites to collect data daily. Don’t wait until you need to generate a report, and don’t try to reconstruct data retroactively.

■ Take time to troubleshoot data discrepancies.

■ Create a mechanism for addressing data collection issues, such as a committee or online troubleshooter.

■ Think creatively about the resources available to you for implementation. Are there school staff or equipment that can be put to use with minimal effort? Is there another program with an existing data system that your program can piggyback onto?

■ Commit to the goal of managing data and keep pushing toward it. That often means keeping key players motivated to stay at the table.

■ Create incentives for collecting and entering data. Some programs require instructors to turn in their attendance sheets before they can be paid, and at least one state links funding for afterschool education and safety programs to average daily attendance.


Page 9: Afterschool Counts!

Some of the people interviewed for this guide had struggled to put tracking systems in place, sometimes trying several options before finding a system that worked. But they all understood the importance of not being daunted by the challenges. Unfortunately, there is no such thing as a flawless data system or a step-by-step recipe for choosing a system, because of the great variation in program models, capacities, and data needs. But the lessons outlined above and the examples profiled in the guide offer a starting point. Start small; you can always build on the system. Keep it efficient and simple. Focus on how the data will be useful to you and your program staff—to improve program design or management, perhaps, or to fulfill accountability requirements. And don’t be intimidated by the sometimes obscure language of technology and analysis. The important thing, experts agree, is to jump in and give one of the methods a try.


Page 10: Afterschool Counts!

OVER THE LAST 15 YEARS, the number of youth-serving programs in America that operate during nonschool hours has more than doubled.1 At least two-thirds of public schools now offer optional afterschool programs.2

There are many reasons for this growth, but among the most compelling is the belief that attendance in afterschool programs matters to the healthy development of children and youth. Research has linked participation in such programs to positive academic outcomes, “as measured through test scores, absenteeism, school dropout rates, homework completion, and school grades”; to “multiple aspects of youth’s friendships, including the number of friends, the quality of those friendships, and who those friends are”; and to mental health, including “fewer feelings of loneliness and depression and less problem behavior.”3 Research also shows that the level of participation in an afterschool program is directly related to improved student outcomes, including higher school attendance and achievement on standardized math and reading tests.4

The growth in afterschool programs has increased public pressures for accountability, with student attendance and participation often cited as an indicator of a program’s success. (See the box on p. 3 for definitions of “attendance,” “participation,” and other terms.) The increased attention to data, in turn, is prompting efforts to find more universal, comprehensive, and statistically valid systems for tracking afterschool attendance.

Knowing how often children attend a program or specific activities helps program directors understand what “dosage” of participation their program generates and what dosage is needed to produce intended results. High or low attendance figures may suggest high or low program quality and give directors insight into needed improvements. And fluctuations in attendance patterns may offer insights into staffing and programming that will help directors make the best use of limited resources.

Program funders increasingly require attendance data as a form of grantee accountability or to qualify for reimbursements from city or federal funding streams. Some, like The Robert Wood Johnson Foundation, also see program-level attendance data as a first step toward building a citywide, cross-program database that can yield useful information about youth experiences and opportunities (see pp. 20-22). And, on the most practical level, competition for limited resources among nonprofit organizations and public agencies has increased the need for data on program quality and results. Yet program directors and staff sometimes avoid monitoring attendance because of the time commitment, the cost, and the “intrusiveness” of taking roll (which makes


“The question of who is participating, and how often, is important for good practice, because it enables [staff] to choose content and instruction that will retain participants. It’s important for accountability, especially to funders whose contributions are tied to attendance. And it’s important to researchers, so we can answer questions about program quality and explain patterns in program utilization.”

—Elizabeth Reisner, Evaluator, Policy Studies Associates

I. Why Track Afterschool Attendance?

1 DeAngelis & Rossi, 1997. Cited in Northwest Regional Educational Laboratory (1999). After school programs: Good for kids, good for communities. Retrieved August 18, 2003 from http://www.nwrel.org/request/jan99/article4.html.

2 Belden, Russonello, & Stewart Research and Communications (2001). Principals and after-school programs: A survey of preK-8 principals. Washington, DC: National Association of Elementary School Principals. Cited in National Institute on Out-of-School Time 2003 Fact Sheet. Making the Case: A Fact Sheet on Children and Youth in Out-of-School Time. Retrieved August 18, 2003 from http://www.niost.org/publications/Factsheet_2003.PDF.

3 Simpkins, S. (2003). “Does youth participation in out-of-school time activities make a difference?” In The Evaluation Exchange IX(1). Cambridge, MA: Harvard Family Research Project.

4 Welsh, M.E., et al. (2002). Promoting learning and school attendance through after-school programs: Student-level changes in educational performance across TASC’s first three years. Washington, DC: Policy Studies Associates. See also Huang, D. et al. (2000). A decade of results: The impact of the LA’s BEST after school enrichment program on subsequent student achievement and performance. Los Angeles: UCLA Center for the Study of Evaluation, Graduate School of Education & Information Studies.

Page 11: Afterschool Counts!

the afterschool program seem more like the school day).

This guide is intended to raise awareness among program directors, funders, policymakers, and researchers of the issues and options involved in tracking attendance and participation in out-of-school activities. It both describes the challenges inherent in data tracking and offers practical examples of ways that program directors and youth-serving organizations are trying to meet the need for attendance data.

We hope that the guide encourages people to track attendance data despite the challenges—and that greater use of attendance systems at the program level will ultimately lay the groundwork for citywide systems that track young people’s activities.

The information presented here comes primarily from interviews with people working on the front lines to staff, direct, or evaluate afterschool programs. We sought information from sources at many of the major, national afterschool initiatives as well as at individual, small-scale programs and within city agencies (see Appendix A).

We profiled some sites in extra depth, based on:

■ Their ability, collectively, to represent the major options for attendance tracking and analysis that are available to afterschool programs of various sizes and resources;

■ Their ability to illustrate experiences and program factors that are common to most sites;

■ Their use of techniques that are relatively easy to administer and portable to many settings; or

■ Their use of innovative approaches to foster widespread data collection and use.

Those profiles are contained in Appendix B. Throughout the guide, we also include examples from programs whose conditions or practices are somewhat unique, are still in the early stages of development, or are more relevant at the cross-site level than for individual programs.

Chapter II of the guide briefly outlines the variety of methods for monitoring attendance that are available to afterschool programs, which are detailed in subsequent chapters. Chapters III through VII describe issues and solutions involved in using, collecting, processing, and analyzing data and in building citywide capacity for tracking attendance across programs. Chapter VIII identifies resources necessary to support data systems and analysis. Chapter IX suggests lessons and conclusions that can guide readers’ attempts to establish their own attendance monitoring systems. Appendix A lists the individuals interviewed for the guide. Appendix B profiles four sites and the data systems they use. Appendix C lists resources for more information.


Page 12: Afterschool Counts!

DEFINITIONS

Note: Information about program attendance falls into two basic classes: counts and rates. Attendance figures can also refer to the afterschool program as a whole or to the behavior of individual students.

Afterschool programs: Youth-serving programs that operate during nonschool hours. This guide uses the term broadly, to include programs that operate before and after the school day, on weekends, and during the summer (comparable to the term “out-of-school time”).

Attendance: Student presence or absence at an afterschool program. Attendance can be measured in a variety of ways (e.g., daily, weekly, monthly; by activity or for all activities combined; by individual child or for all enrolled students as a group). Attendance offers a snapshot of participation at one or several points in time, focusing on whether the organization has attained its service goals, while participation (defined below) focuses more on the over-time experiences of individual children. The individual student attendance rate usually is derived from the number of days a student attended a program during a year, divided by the number of days it was possible for that student to attend (with the calculation repeated for each student). Or, one might divide the number of days a student attended during a year by the number of days the program was open that year (which requires less effort than determining the number of days possible for each student but may result in deflated attendance rates, because students who were not enrolled for the entire year are counted as absent even for days they were not enrolled in the program).
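The two denominator choices just described can be illustrated with invented figures; note how the days-open denominator deflates the rate for a student who enrolled mid-year.

```python
# Hypothetical mid-year enrollee (all figures invented): the program was
# open 180 days that year, but the student was enrolled for only the last
# 90 and attended 81 of them.
days_attended = 81
days_possible = 90    # days the student was enrolled and could have attended
days_open = 180       # days the program operated that year

rate_possible = days_attended / days_possible  # 0.90
rate_open = days_attended / days_open          # 0.45, deflated by absences
                                               # "charged" before enrollment
```

The same attendance record yields a 90 percent rate under one denominator and 45 percent under the other, which is why a program should state which calculation it uses.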

Average Daily Attendance (ADA): A type of count commonly used by afterschool programs and school systems to refer to the unduplicated count of students attending a program each day. This count is averaged over a specified period of time (a week, month, specified “count” days, or the full school year). Project managers find ADA useful for documenting the magnitude of their services, the number of children served, and the hours and days of services provided. ADA also can help determine the number of staff needed to maintain the targeted staff-to-student ratio, whether larger or smaller facilities are needed, and whether staff are allocated to the appropriate grade levels. ADA can be calculated in various ways, and the denominator used for calculation has implications for the findings’ accuracy (see p. 17).

ADA rate: This rate can be calculated in several ways, depending on the denominator used. One approach is to divide ADA by the number of students enrolled, which reveals the proportion of enrolled students who attend on a typical day. This rate can be interpreted as an indication of how attractive the program is to students and their parents. The ADA rate can also be calculated by dividing the ADA by the enrollment target set by the program’s mission or funders. This calculation helps to gauge whether the program is serving the intended number of students. A site with an ADA rate smaller than 1.0 is serving fewer students than it is funded to serve, while a site with an ADA rate greater than 1.0 is exceeding its goal.
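Both ADA-rate calculations can be sketched with invented figures, one dividing by enrollment and one by the funded target.

```python
# Hypothetical site (all figures invented).
ada = 45.0        # average daily attendance
enrolled = 60     # students signed up
target = 50       # enrollment target set by the program's funders

# Share of enrolled students present on a typical day.
rate_vs_enrolled = ada / enrolled   # 0.75
# Attendance relative to the funded target; below 1.0 means the site
# is serving fewer students than it is funded to serve.
rate_vs_target = ada / target       # 0.90
```

Here the same site looks quite different under the two rates: three-quarters of enrollees attend daily, yet the site still falls short of its funded target.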

Data elements: The categories of student information entered into a database, such as name, student ID number, birthdate, date of enrollment, etc.

Dosage: The level of exposure to a program or activity that a child receives by attending; students who attend for more hours or days are presumed to receive a higher “dosage.” Dosage is most relevant for programs that seek to link levels of exposure to specific outcomes.

Enrollment: The number of children signed up to attend an afterschool program. Enrollment figures are not always accurate indicators of actual attendance or participation.

Participation: Active student involvement in an afterschool program. While attendance is primarily an administrative function, participation offers a glimpse into how children and youth view and value the program. Participation can be measured at the individual student level or for subgroups of enrolled students (e.g., all girls), and it often requires measurement of attendance in specific activities. A common way of analyzing participation is to examine patterns over time.


Page 13: Afterschool Counts!

OPTIONS FOR COLLECTING DATA on student attendance fall on a continuum from systems that use technology only minimally to those that use it extensively. For example:

■ In the traditional pen-and-paper approach, someone—usually an instructor, program assistant, or parent volunteer—makes a mark on a hard copy of the enrollment roster for every student who attends on a given day. Alternatively, students may sign in every time they attend, and the sign-in sheets are collected daily. Some programs simply collect these hard copies “for the record” but don’t use the data because it takes too long to organize and synthesize the information by hand. Or, a program assistant, school clerk, or administrator at the program’s central office may enter data that are collected manually into an electronic database, such as a simple electronic spreadsheet file.

■ Programs that aggregate daily attendance data by the week, month, and/or year need to enter their data into some kind of database that allows the numbers to be combined and manipulated. The simplest of these is housed on a personal computer (PC). Several types of software support databases that range from relatively simple (e.g., Excel or Microsoft Access) to complex (e.g., KidTrax, YouthServices.net). The capacities and limitations of various software are discussed in Chapter III and in the site profiles and examples.

■ Web-based systems offer users more choices and flexibility in analyzing and reporting data, and they are a growing trend in afterschool data tracking because they make program-level data available to a broader audience, such as the entity that manages multiple sites. Because these systems link electronically to a central database, program sites do not need to purchase expensive software (although they do require access to a computer and the Internet, and licensing costs are usually incurred). The major systems that are marketed to afterschool programs (YouthServices.net, QSP, and KidTrax) customize the system’s data elements, level of analysis, and reporting formats to the needs of each customer.

■ Programs that want to collect detailed data with minimal burden on staff sometimes use swipe cards. Students receive ID cards with individual barcodes, which they present to electronic scanners as they enter and exit each day. Some programs also require students to swipe their cards as they enter and exit each activity. The data can be stored on Web-based systems or in stand-alone PCs. This system requires the purchase of proprietary software (e.g., KidTrax) and at least one scanner and computer per site, so it is the most resource-intensive option.

The “right” data collection method for a given program depends on several factors: What questions are the data expected to answer, and how precise do the answers need to be? How large is the program (i.e., how many children need to be tracked)? How much money and staff time are available for data tracking, and how important is it for the program to invest resources in the activity? (Invest too little in data collection, and the effort will not produce the information needed for accountability reports or program improvement. Invest too much, and scarce resources may be wasted.) Do staff have the necessary expertise? Do activities occur away from the program site? Do children leave the building between the school and afterschool day? Do children move individually or as a group from one activity to the next? How often do activities change? The relevance of these factors is discussed in Chapter III.

Strategies for tracking data from multiple programs, at the citywide level, are different from those available to programs. They require more complex databases that can accommodate multiple identification codes for each child, variations in the way that programs define participation, and, in some cases, confidentiality protections when data are shared across programs. Some of the options profiled in this guide, however—especially the Web-based systems YouthServices.net and KidTrax—illustrate monitoring and reporting features that would be similar at either the program or citywide level.


II. Overview of Effective Strategies

Page 14: Afterschool Counts!

MOST PEOPLE THINK about data in this sequence: collecting, processing, analyzing, and finally using. But thinking about how you will use the data first helps to answer questions about what data you need to collect and how you might obtain, organize, and analyze the information. In fact, some evaluators recommend not collecting attendance data until you can describe how the data will be used, because your goals for using data will determine the complexity of the system you choose, the degree to which frontline staff buy into the need to take attendance, and whether you will need to:

■ Monitor attendance (the number of bodies coming in the door every day, week, or month), participation (the frequency, or level of intensity, with which they attend), or both;

■ Know only the aggregate number of attendees or also the attendance of individual children;

■ Collect data on the types of activities children attend (and, if so, whether you need the data for every day, every week, or every month and whether you should monitor the activities attended by all children or by each child individually); and

■ Merge your attendance data into a broader database, such as one maintained by the school system or city youth authority, or use them only for analysis within the program.

Data on afterschool attendance and/or participation can be used for the following purposes:

To fulfill accountability requirements. Program funders typically use attendance data to determine the daily cost per child; to verify grantees’ compliance with a targeted level of service (the “utilization” rate); and to substantiate reimbursement claims made to city, state, or federal funding streams.

For example, The After-School Corporation (TASC), which sponsors 242 school-based projects (187 of which are in New York City and are required to submit attendance/enrollment data), is reimbursed by three different city agencies based on attendance figures. In order to prepare invoices, TASC needs the names and school identification numbers of the children who attend TASC-sponsored programs each day. In the case of one funder (the city’s Human Resource Administration, which manages child care funding for parents in welfare-to-work programs), TASC must cross-tabulate its attendance data with various city databases to determine which children are eligible for reimbursement. TASC also uses attendance data to determine how much money to disburse to each site. A TASC program officer examines the data collected from sites to monitor the sites’ performance in meeting their enrollment and attendance targets. At mid-year and between each contract year, TASC adjusts grants up or down to match the number of participants actually being served on a daily basis, using funding formulas that have evolved over time.
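The accountability arithmetic described above (a daily cost per child and a utilization rate against a targeted service level) can be sketched in a few lines. This is an illustration only, not TASC’s actual funding formula; the figures and function names are invented.

```python
def utilization_rate(avg_daily_attendance: float, funded_slots: int) -> float:
    """Share of the funded (targeted) slots actually used on a typical day."""
    return avg_daily_attendance / funded_slots

def daily_cost_per_child(annual_budget: float, avg_daily_attendance: float,
                         operating_days: int) -> float:
    """A funder's-eye view: dollars spent per child per program day."""
    return annual_budget / (avg_daily_attendance * operating_days)

# Illustrative numbers only.
rate = utilization_rate(avg_daily_attendance=85, funded_slots=100)
cost = daily_cost_per_child(annual_budget=170_000,
                            avg_daily_attendance=85,
                            operating_days=200)
# rate is 0.85 (85 percent utilization); cost is 10.0 dollars per child per day.
```

A funder adjusting grants to match actual participation, as TASC does, would compare the utilization rate against the contracted target and scale the grant accordingly.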

To monitor the quality and effectiveness of an overall initiative. LA’s BEST, which encompasses 117 afterschool programs in the Los Angeles Unified School District, uses data on participants’ start and end dates and their daily participation to examine variations in attendance patterns across sites. When those data are combined with an internal performance monitoring system, administrators can begin to understand the relation among program implementation, program outcomes, and student attendance.

One of the goals of Team-Up for Youth, an afterschool youth sports intermediary in the San Francisco Bay Area that is funded by The Robert Wood Johnson Foundation, is to “level the playing field” for underserved populations, including girls and low-income children. That initiative uses data on participants’ demographic characteristics to determine whether its goal is being met.

The Safe and Sound Campaign in Baltimore,


III. Using Data

“Data in and of [themselves are] not that interesting or useful. It’s what you do with data that matters.”
—Eric Bruns, Evaluation Coordinator, Baltimore’s After-School Strategy

Page 15: Afterschool Counts!

whose afterschool programs serve about 4,000 K-12 students every day, combines attendance data with feedback from staff and youth to test the theory of change for its multi-site afterschool strategy. The theory assumes that the initiative “will increase utilization, increase the number of afterschool slots available [i.e., create new openings], and increase the quality of the funded programs,” explains Evaluation Coordinator Eric Bruns. Using data on attendance rates, “we have found that, in certain areas, quality is immensely improved. However, utilization [rates] and [the number of available] slots have increased only modestly. The increase is encouraging, but there’s still work to be done.”

To evaluate student outcomes. Both the TASC and Baltimore Safe and Sound initiatives compare attendance data to students’ school performance records to measure the effect of participation on student outcomes. Policy Studies Associates, the firm evaluating TASC, investigates whether students who attend the afterschool program frequently have better school outcomes, including better test scores and school attendance, than those who attend irregularly or not at all. Safe and Sound evaluators look primarily for a relationship between afterschool attendance and achievement test scores (which they have not yet found).

Several school districts around the country use Quality School Portfolio (QSP), a free tool provided by the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), to match students’ attendance in afterschool programs to test scores, classroom performance assessments, parent involvement, and teacher experience level. (QSP does not track attendance on a daily basis, but it will store cumulative attendance data.)

To support program-level planning and management. Attendance data can help managers determine how many staff to hire, how much space to obtain, how many snacks or supplies to order, and so on. Information on attendance by grade level or age group also can help managers figure out how to configure staff and other resources. For example, if the program serves grades K-8 but attendance data show that half the students who attend on a typical day are in grades 3 or 5, then it makes sense to concentrate resources on those grades.
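The K-8 example above boils down to tallying attendance by grade and looking at each grade’s share of a typical day. A minimal sketch, with invented sign-in data:

```python
from collections import Counter

# Invented sign-in records for one day: (student_id, grade).
day_signins = [("s01", 3), ("s02", 5), ("s03", 3), ("s04", 8),
               ("s05", 5), ("s06", 3), ("s07", 1), ("s08", 5)]

attendance_by_grade = Counter(grade for _, grade in day_signins)
total = sum(attendance_by_grade.values())

# Each grade's share of the day's attendance, e.g. to decide where
# to concentrate staff, space, and supplies.
shares = {grade: count / total for grade, count in attendance_by_grade.items()}
# Here grades 3 and 5 together account for 6 of 8 sign-ins (75 percent).
```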

Citizen Schools, which in 2003-04 served about 1,500 young people (age 9-14) at 20 afterschool sites in Boston, the surrounding area, and selected other cities, monitors attendance closely during the first three weeks of the school year.


“High attendance is a good preliminary measure of program quality. To some degree, students and families vote with their feet. A high attendance rate is a necessary but not sufficient condition, however. It is not an end in itself.”
—Richard White, Evaluator, Policy Studies Associates

USING ATTENDANCE DATA TO ASSESS PROGRAM AND INITIATIVE QUALITY

To monitor quality at the program level, contract managers at the Family League of Baltimore City use attendance data to verify whether Safe and Sound sites are reaching the number of students they proposed to serve. “We also use the data to look at our strategy as a whole,” says evaluator Eric Bruns. “What are the patterns of high and low utilization across programs? What are the characteristics of programs [in each category]? Some things suggested by the data need to be investigated further—such as whether it’s true that programs located in schools have different utilization rates than programs located in community centers.”

Safe and Sound also cross-tabulates attendance and enrollment data with budgetary data to learn the cost per participant. “We compare that [information] to national norms in the field to try to keep our strategy on track,” Bruns says. “That’s important right now because the [initiative] has had significant changes in the amount of overall funds available, so they have to make hard decisions about who gets funding in the future.”

Page 16: Afterschool Counts!

The program’s administrators and site directors use that information to make staffing adjustments across campuses. Using SPSS statistical software, Citizen Schools also examines the data for differences in attendance between subgroups of participants. Do older students (seventh- and eighth-graders) have higher or lower attendance rates than fourth- through sixth-graders? Are rates increasing or decreasing? Do girls attend more often than boys? The information gleaned from those analyses guides decisions about what activities to offer.
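Citizen Schools runs these subgroup comparisons in SPSS; the same questions can be answered with a few lines of any scripting language. A sketch in Python, using invented participant records:

```python
from statistics import mean

# Invented records: one row per participant.
participants = [
    {"gender": "F", "grade": 7, "attendance_rate": 0.91},
    {"gender": "M", "grade": 5, "attendance_rate": 0.78},
    {"gender": "F", "grade": 4, "attendance_rate": 0.88},
    {"gender": "M", "grade": 8, "attendance_rate": 0.70},
]

def mean_rate(rows, keep):
    """Average attendance rate over the subgroup selected by `keep`."""
    rates = [r["attendance_rate"] for r in rows if keep(r)]
    return mean(rates) if rates else None

older = mean_rate(participants, lambda r: r["grade"] >= 7)       # 7th-8th graders
younger = mean_rate(participants, lambda r: 4 <= r["grade"] <= 6)
girls = mean_rate(participants, lambda r: r["gender"] == "F")
```

With real rosters the comparisons would run over hundreds of records, but the shape of the analysis is the same: filter to a subgroup, average, compare.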

To facilitate case management. The afterschool drop-in center operated by the P.F. Bresee Foundation, in East Hollywood, Los Angeles, uses magnetically encoded swipe cards to track every activity a participant engages in each day. Bresee’s intake process is extensive, so the database also includes a wealth of information about participants’ school activities, hobbies, special interests, career goals, medical needs, and other factors. When administrators examine the attendance data, they also consider these other variables. “If the youth isn’t improving, is it because [he’s] not attending consistently? Or are there learning disabilities?” asks (former) Associate Director John Wolfkill. “We can then target intervention to specific youth.”

Similarly, individual attendance data trigger efforts by Citizen Schools leaders to identify problems in the program or at home. “If campus directors don’t have 80 percent attendance, we troubleshoot with their supervisor,” says (former) Research Director Charlie Schlegel. “That quickly becomes a conversation about specific kids’ issues and what can be done to follow up with parents and teachers.”

To support student rewards, incentives, and sanctions. The Bresee Foundation uses afterschool attendance data to support incentives for participating in educational activities. Every activity that has educational value is worth a certain number of points. When a participant signs into the activity using his or her swipe card, the attendance is recorded in the database. Upon completion of the activity, points are awarded and later recorded in the student’s file. He or she can then redeem the points for goods and services (see box).

After School Matters, which operates afterschool programs at 35 sites in Chicago with support from The Robert Wood Johnson Foundation, offers apprenticeship programs in the arts, technology, communication, and sports (for which students are paid a stipend) and drop-in clubs (which do not carry a stipend). “We have to be sticklers about attendance because we’re paying the kids,” notes Executive Director Nancy Wachs. Teachers at the school sites enter attendance data into an online database, and school clerks use the information to generate students’ paychecks.


USING ATTENDANCE DATA TO CREATE PARTICIPATION INCENTIVES

Youth who attend the Bresee Foundation’s afterschool program earn points for participating in educational activities. Completing a computer tutorial is worth 300 or 400 points, for instance; a half-hour of homework earns about 50 points.

Students receive a statement (which looks like a bank check) of the points they earn for each designated activity, and it is signed by the staff overseeing the activity to verify that the student not only attended but fully participated in the activity. Students deposit their “checks” at Bresee’s “Youth Bank” and enter the amount into the program’s database. Points are redeemable for school supplies, food, candy, soccer balls, PlayStation games—even electronic equipment. A can of soda from the center’s store costs 50 points, roughly the same as a half hour of homework; a Sony PlayStation recently “sold” to a high-attending student for 20,000 points.

Bresee offers a financial literacy class, and instructors use the attendance-based incentive system to make points about saving and building assets. Youth can also learn about credit by obtaining “loans,” with automatic payback through the point system.
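The point mechanics described in this box amount to a simple ledger. The sketch below uses the point values mentioned above (a tutorial worth about 300 points, a half-hour of homework worth about 50, a soda priced at 50, a PlayStation at 20,000); the class and method names are invented, not Bresee’s actual system.

```python
# Illustrative point values drawn from the box above.
POINT_VALUES = {"computer_tutorial": 300, "homework_half_hour": 50}
STORE_PRICES = {"soda": 50, "playstation": 20_000}

class YouthBank:
    """A toy ledger for attendance-based incentive points."""

    def __init__(self):
        self.balance = 0

    def deposit(self, activity: str) -> None:
        """Credit verified participation in a designated activity."""
        self.balance += POINT_VALUES[activity]

    def redeem(self, item: str) -> bool:
        """Spend points on a store item if the balance covers it."""
        price = STORE_PRICES[item]
        if self.balance < price:
            return False
        self.balance -= price
        return True

account = YouthBank()
account.deposit("computer_tutorial")
account.deposit("homework_half_hour")
bought_soda = account.redeem("soda")          # True; 300 points remain
saved_enough = account.redeem("playstation")  # False; balance too low
```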

Page 17: Afterschool Counts!

To help gauge demand for services (in general and for specific activities). Attendance data are a quick indicator of how attractive a program is to children and parents. “If [staff] plan a special activity and attendance doesn’t go up, chances are the activity didn’t work well,” observes Richard White, an evaluator at Policy Studies Associates. Directors can also use attendance data to identify services that cause a drop-off in participation. “Even without data on specific activities, if they know a big performance or event is going to occur in February but the attendance rates in January and February aren’t any different from other months, then you can assume the event didn’t serve as an attendance draw,” White says. Or, if the director knows a group of children is working with a specific teacher, and that group’s attendance rate is lower than that of another group working with another teacher, the first teacher’s practices may need improvement.

At Citizen Schools, where programming changes weekly, Schlegel compares the attendance rate on days when students work on academic activities with days when they take field trips. He uses data on attendance in special three- and four-week programs to figure out what schedule is best for attracting students. Annually, Citizen Schools looks at how many students chose to continue in the program, overall and at an individual level. “We’ve learned that retaining kids throughout a whole year and across years is important...to their academic growth and to our impact,” Schlegel says. “Something like 80 percent of kids who were with us at the start of the year were still with us at the end of the year, and we only know it through attendance tracking.”

For staff self-reflection, training, and education. TASC staff examine weekly attendance reports for each site to see if the individual project


CHECKLIST FOR UNDERSTANDING DATA NEEDS

IF YOU WANT DATA SO YOU CAN…                YOU MAY NEED TO…

■ Fulfill the legal responsibility of keeping participants safe
✔ Collect data on the time each student arrives and departs
✔ Collect attendance data on each enrolled student, not just the group as a whole

■ Create payroll records (e.g., for participants who receive stipends) or award incentives
✔ Collect data on the specific activities students participate in (if some are paid and some are unpaid) and the number of hours worked

■ Describe program operations
✔ Collect student demographic data
✔ Collect data on instructors’ capacities

■ Evaluate participant outcomes
✔ Collect student-level attendance data
✔ Forge agreements for data sharing with the school system
✔ Obtain parent consent for data sharing
✔ Establish a unique identifier for each student that corresponds to the one used by the school system

■ Analyze the “dosage” (frequency and/or intensity of services) that participants receive
✔ Collect data on when each student enrolled, how many hours and/or days s/he attends, and (possibly) in what specific activities s/he participates

■ Target services to specific subgroups of the student population
✔ Collect student-level attendance data
✔ Collect data on demographic characteristics

Page 18: Afterschool Counts!

is reaching the program’s standards. If a site is not reaching its attendance target, TASC program officers will talk with the site director or coordinator about possible barriers and solutions. Rahan Uddin, administrator of TASC’s database and coordinator of its help desk, also uses attendance data to help site-based staff evaluate their programs. For example, if a program director thinks attendance is meeting the target but the data show it is actually much lower, Uddin encourages the director to examine the detailed report of attendance by date. “They may be getting 200 kids on Monday and Tuesday but by Friday they’re only getting 100,” Uddin explains. “That tells you to schedule some different activities on Fridays.” Some site staff also use the data to determine which activities are most popular with students (a data component that is optional in the TASC system).

To advocate for more funding or for the use of specific strategies. Baltimore’s Safe and Sound Campaign uses geocodes to array afterschool attendance data and program capacity by U.S. Census tract, which enables leaders to identify which neighborhoods have (and are filling) the most afterschool slots. Then, they compare those data to information on the risks faced by young residents of each neighborhood. Using the data in that way helped Safe and Sound obtain a $1 million grant to serve children in several specific geographic areas.

Similarly, TASC leaders present attendance data at conferences to generate support for the practice of universal enrollment (i.e., opening the program to all students in a school), to funders as an argument for program expansion, and with principals at potential sites to illustrate the high demand for afterschool services in specific neighborhoods.

Despite these examples of how attendance data can be used, observers agree that many afterschool programs don’t use the data as effectively as they might. The key, says Bresee’s Wolfkill, is that attendance data “have to help you improve your program. Don’t just have data for data’s sake.”


Page 19: Afterschool Counts!

THE MOST COMMON METHODS of collecting attendance data are the simplest: sign-in sheets or roster check-offs, usually posted at the front door but not in every classroom or activity. “It’s basically a record of students’ presence or absence for the day,” explains Richard White of Policy Studies Associates (PSA), who evaluates the TASC afterschool program, including those programs funded jointly with the 21st Century Community Learning Centers program. “Most of the programs I’ve encountered try to collect it by the day, not by reflecting back on the week or month.”

The main issues involved in collecting data include:

■ Concerns about the reliability of data. If students are allowed to sign themselves in, there’s always a chance that a truant student will be marked present by his or her friends or that a student will attend but fail to sign in. Or, a busy staff person may fall behind on the daily records and end up entering several weeks’ worth of daily records retrospectively.

■ The administrative burden on program staff. “It’s not so hard to take attendance, but at some point data have to be entered into the computer,” says researcher Carolyn Marzke. “Lack of expertise about data and technology at the program level can make that difficult.” The data intake process may be especially daunting for programs that have no automated record of basic student data. “At some point, they are going to have to do data entry,” acknowledges Ananda Roberts, president of the firm that sells KidTrax software. “We’ve found it takes five to seven minutes [to enter intake data] per child, for someone with average skills. You can enter 450 to 500 kids in a 40-hour work week.” Another system, YouthServices.net, helps users by generating printable registration forms, templates, and checklists to help with data intake.

■ Managing overlapping data requirements. Most programs receive money from more than one funder, each of whom has a different set of data definitions, categories, and reporting requirements. Several large-scale programs are seeking ways to merge the requirements into a single set of data collection forms and reports.

As program directors sort through these issues, several overarching questions emerge: Which data elements are essential, and which are optional? What types of attendance should be tracked? When and how often should attendance be taken? How does program design affect data collection? How can data quality be ensured? How can data from more than one management information system (MIS) be merged?

ESSENTIAL AND OPTIONAL DATA ELEMENTS

Complex data systems are harder to “feed” and maintain than simple ones, so it’s best to keep data collection as simple as possible. At the same time, you don’t want to forgo collecting data that you’ll need later to draw conclusions about your program. Complex programs like the San Francisco Beacon Initiative, for instance, which offers myriad activities in health, education, youth leadership, arts, and recreation to more than 5,000 children and youth at eight sites, must collect an array of data to satisfy their stakeholders’ special interests.

Researchers and program directors suggest that the following data elements represent a good minimum standard:

■ Site name (if part of a multi-site initiative or citywide database)

■ Total number of students enrolled

■ Total head count per day, week, month

■ Student names

■ Individualized student number, such as a school- or district-assigned ID

■ Age and/or grade in school

■ Each student’s first and last date of enrollment

■ Each student’s demographic information
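These minimum elements map naturally onto a per-student record. The sketch below expresses one as a Python data structure; the field names are invented, and a real system would keep such records in a database rather than in memory.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class StudentRecord:
    """One enrolled participant, mirroring the minimum elements above."""
    student_id: str                     # individualized ID (e.g., district-assigned)
    name: str
    grade: int                          # and/or age
    site_name: str                      # matters for multi-site initiatives
    enrolled_on: date                   # first date of enrollment
    dropped_on: Optional[date] = None   # last date; None while still enrolled
    demographics: dict = field(default_factory=dict)
    days_present: set = field(default_factory=set)  # dates attended

    def mark_present(self, day: date) -> None:
        self.days_present.add(day)

rec = StudentRecord("123456", "Ana P.", 5, "Main Street site",
                    enrolled_on=date(2004, 9, 13))
rec.mark_present(date(2004, 9, 14))
```

Daily head counts and enrollment totals then fall out of the records by aggregation rather than being tracked separately.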


IV. Collecting Data

“I don’t think there’s any magic set of data that you need. What you collect depends on what project leaders, the project sponsor, and/or evaluator wants to know.”
—Richard White, Evaluator, Policy Studies Associates

Page 20: Afterschool Counts!

The need for demographic data varies depending on the program’s target population. If the program is supposed to target students who are having academic difficulty, low-income children, or youth of a particular race/ethnicity, the system should collect data on those characteristics. For example, the TASC program, which aims to serve all of the children in its host schools, must collect data on participants’ race, age, and gender to know whether sites are attracting (or failing to attract) a particular subpopulation. Similarly, programs in cities with rapidly changing demographics, such as San Francisco’s Beacon Initiative, want to know about their participants’ linguistic and ethnic backgrounds so they can provide culturally relevant activities.

Start and drop dates are essential for calculating attendance rates across an entire year. “Some kids don’t start at the beginning of the year, or they leave the program before the end of the year,” explains Christina Russell, a researcher on the TASC evaluation. “If those factors aren’t accounted for, it could suppress your attendance rate because it will look like some kids are absent when in fact they are no longer enrolled in the program.”
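Russell’s point can be made concrete: the denominator of each student’s attendance rate should be the program days that fall inside his or her enrollment window, not the whole year. A sketch, which simplifies by treating every weekday as a program day and ignoring holidays:

```python
from datetime import date, timedelta

def program_days(start, end):
    """All weekdays from start to end, inclusive (a simplification:
    a real calendar would exclude holidays and closure days)."""
    days, d = [], start
    while d <= end:
        if d.weekday() < 5:   # Monday-Friday
            days.append(d)
        d += timedelta(days=1)
    return days

def attendance_rate(days_attended, enrolled_on, dropped_on, year_start, year_end):
    """Rate over the days the student was actually enrolled, so that
    late starters and early leavers don't suppress the figure."""
    window_start = max(enrolled_on, year_start)
    window_end = min(dropped_on or year_end, year_end)
    eligible = program_days(window_start, window_end)
    if not eligible:
        return 0.0
    return sum(1 for d in days_attended if d in set(eligible)) / len(eligible)

# A student who joined mid-period and attended 3 of her 5 eligible days
# gets 0.6, not the 0.3 that a whole-period denominator would report.
rate = attendance_rate(
    days_attended={date(2004, 1, 12), date(2004, 1, 13), date(2004, 1, 14)},
    enrolled_on=date(2004, 1, 12), dropped_on=None,
    year_start=date(2004, 1, 5), year_end=date(2004, 1, 16))
```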

It’s important to use a unique identifier for each participant so you can isolate his or her data, but it isn’t essential to use the school system’s ID number—which can be difficult to learn, since many students don’t have them memorized—unless the program is trying to merge with the school system’s database (for example, to link afterschool participation with students’ academic outcomes). In fact, Citizen Schools uses an original, six-digit ID number for each child, rather than his or her name, because there are so many participants who share the same name.

Optional data elements collected by the programs featured in this guide include students’ participation in specific activities, reasons for dropping out, parent consent given for field trips, and the name of the assigned team leader.

TYPES OF ATTENDANCE TO TRACK

There are several ways to measure attendance:

■ By the total number of children who come in the door (average daily attendance, or ADA, for the program overall)

■ By the number of days that each child attends

■ By the kinds of activities that children participate in (e.g., homework help, sports, arts)

Many programs measure only the total number of children who attend the program. Those that want to understand what low or high ADA rates mean, or to link attendance to specific outcomes, will also record whether an individual student attends on a given day. “In order to have an accurate [analysis at the] group level you need to know how kids are moving in and out of the group,” explains Policy Studies Associates’ Russell. “Otherwise, the group of children included in the analysis might be changing day to day or month to month.”
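The difference between the two levels of measurement is easy to see in code. With invented data, average daily attendance looks stable while the underlying group churns:

```python
# Invented data: who attended on each of three days.
attendees_by_day = {
    "Mon": {"s1", "s2", "s3"},
    "Tue": {"s1", "s4", "s5"},
    "Wed": {"s1", "s2", "s6"},
}

# Group-level measure: total head count divided by days of operation.
ada = sum(len(kids) for kids in attendees_by_day.values()) / len(attendees_by_day)

# Individual-level records reveal the churn that ADA hides.
distinct_students = set().union(*attendees_by_day.values())
everyday_students = set.intersection(*attendees_by_day.values())
# ADA is 3.0, yet six different children produced it, and only s1
# attended every day.
```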

Program directors who want to know which program model produces the most impact or is of the highest quality also need to know what students do after they walk in the door. But few track those data; it requires a significant investment in staff time and/or technology to post data recorders at every activity, and it can be confusing when the data show a student attending some but not all of the program day.

Multi-site initiatives and citywide data collection systems face an additional issue: Will they accept data in multiple formats or insist that each site use the same format so that data quality is consistent? Very few site-based programs have the


“People’s eyes glaze over when they realize there are an unlimited number of data points that we can track. But you don’t have to track everything. Start with the basics, based on what your program is focusing on, and slowly build up. Your staff will follow along.”
—Ananda Roberts, President, nFocus Software (KidTrax)

Page 21: Afterschool Counts!

technical ability to translate data from one system into another, and it may be hard for the people who require data to justify that level of effort.

For in-depth examples of the types of attendance afterschool programs track, see Appendix B.

WHEN AND HOW OFTEN TO TAKE ATTENDANCE

Many afterschool programs take attendance once a day when children are gathered in large groups, such as snack time, or they assign the task to the instructor for each group of children. Drop-in programs, such as those operated by Boys & Girls Clubs, may station a “greeter” or a sign-up sheet at the front door to catch all incoming students. But data collection is more difficult for programs in which groupings and activities vary throughout the day and week. Programs usually reach compromises based on when staff are most available to take attendance.

HOW PROGRAM DESIGN AFFECTS DATA COLLECTION

The way in which students move through an afterschool program can make one method or level of data collection more appropriate than another. Some design elements to consider:

■ If the program is school-based, do students leave the building between the regular and afterschool day? If so, the afterschool program can’t necessarily rely on data collected by the school program (e.g., through the school system’s scanners and swipe cards) and will have to re-enter its own data.

■ Do some activities occur offsite? If so, a centralized sign-in location at the main program site won’t be sufficient to capture data on all students.

■ Do all students participate in the same activity at the same time, or do groups of students rotate through activities? If everyone does the same thing at the same time, you may be able to gauge the effects of participation even if you only know which students walked in the door at the beginning of each day, because that alone will tell you what each student did each day.

For in-depth examples of the way that program design affects data collection, see Appendix B.

ENSURING DATA QUALITY

The reliability of ADA data varies considerably across afterschool sites, for several reasons. If the program funder uses attendance data to calculate grant amounts, or if the data are used to justify incentives to participating students, sites are motivated to make sure their data are complete. But there also is an incentive to show high attendance rates. “At some sites, when you disaggregate the data you may find identical individual data for all 50 kids, and you know they made up their data,” one researcher says. “The problem is not that people are venal but whether they’re really collecting what they’re supposed to be collecting,” agrees a funder.

Drop-in visits by program directors or funders, to verify reported attendance figures, are one strategy for quality control. Researchers also advise programs to keep data collection processes simple so staff will do the task consistently. Researcher Charlie Schlegel, who works with many programs that serve middle-school students, further urges program directors to pay special attention to verifying the attendance records of older youth. “They may say a parent asked them to leave early when that isn’t the case. Why, exactly, the student was absent from the program is pretty important to the analysis of participation data,” he says.

For more on techniques for ensuring data quality, see the “Issues and Challenges” section of the profiles in Appendix B.


“Quality is a huge issue, because this is labor intensive. I don’t want to do something that isn’t methodologically meaningful, because then people are doing a lot of work for nothing….At the same time, we want the programs to think of this as an important way for them to make better decisions.”
—Rachel Baker, Deputy Director, Team-Up for Youth

Page 22: Afterschool Counts!

MERGING DATA FROM MORE THAN ONE SYSTEM

One way to maximize the amount of student data available to afterschool programs, without complicating data collection on the front lines, is to combine the program’s data with information on the same participants culled from the MISs of other youth-serving institutions. For example, some large afterschool initiatives have agreements with school districts that enable them to submit the names and identification codes of afterschool participants and receive data on those students’ test scores and school attendance.

Three issues complicate this type of data collection:

■ Inconsistent methods for identifying individual students. Foundations, Inc. operates elementary and high school programs in multiple sites. Each district has a different system for identifying students, which makes it difficult to merge the data into a single database. Consequently, for elementary school programs, Foundations, Inc. collects individual attendance data but usually only reports it at the group level. Other programs, which individually identify participants by a code other than the school system’s student ID number, have found it hard to verify that they are, in fact, getting data for the right students.

■ Confidentiality concerns. Some school systems are willing to release student data, with the appropriate caveats and guidelines for their use; others are not. Typically, afterschool programs or their evaluators must obtain parent consent for the data (from either database) to be shared. Thus one essential element of the afterschool program’s database may be whether participants have a signed consent form on file.

■ Delays in obtaining key data. School test data for a particular year usually are not available to researchers until the following school year, which can delay analyses of afterschool impacts.
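As a rough illustration of the matching step, a merge that honors both the consent requirement and the ID-matching problem described above might look like the sketch below. This is not drawn from any actual system; the field names, IDs, and records are invented.

```python
# Hypothetical program roster; "consent_on_file" mirrors the signed-consent
# element described above. All data are invented for illustration.
program_roster = [
    {"student_id": "1001", "name": "A.", "consent_on_file": True},
    {"student_id": "1002", "name": "B.", "consent_on_file": False},
    {"student_id": "1003", "name": "C.", "consent_on_file": True},
]

# Hypothetical extract from a district MIS, keyed by student ID.
district_records = {
    "1001": {"test_score": 78, "school_attendance_rate": 0.94},
    "1003": {"test_score": 85, "school_attendance_rate": 0.91},
}

def merge_records(roster, district):
    """Attach district data only for students with signed consent and a
    verifiable ID match; flag everyone else for follow-up rather than
    silently dropping them."""
    merged, flagged = [], []
    for student in roster:
        sid = student["student_id"]
        if not student["consent_on_file"]:
            flagged.append((sid, "no consent"))
        elif sid not in district:
            flagged.append((sid, "no district match"))
        else:
            merged.append({**student, **district[sid]})
    return merged, flagged

merged, flagged = merge_records(program_roster, district_records)
```

The flagged list is the point: unmatched IDs and missing consent forms surface as work items instead of quietly shrinking the analysis sample.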

A few school systems and city youth authorities are developing systems for collecting and storing data from a variety of youth-serving programs, and some will incorporate attendance data. See pp. 20-22 for examples of these systems and the issues they face.


Page 23: Afterschool Counts!

COLLECTING DATA is only the first step in monitoring afterschool attendance. If you want to use your data for any of the purposes outlined in Chapter III, you’ll need to enter and maintain the data in a system capable of organizing and manipulating information. “Having an electronic system is important because it enables you to sort by various data points,” explains Charlie Schlegel, former research director at Citizen Schools in Boston. “For example, kids often are grouped by program elements or activities rather than alphabetically.”

Usually, the attendance data captured on daily rosters or sign-in sheets are entered into an electronic tracking system, which can range in complexity from a simple electronic spreadsheet to a custom MIS. All systems, however, face similar choices about technology needs, costs, and complexity and procedures for entering, “cleaning,” and maintaining the data.

TECHNOLOGY NEEDS, COSTS, AND COMPLEXITY
Electronic processing systems make it easier and faster to analyze and use data later on. But they carry financial costs—in the form of computers, software, and sometimes Internet access—and human resource costs, in the form of technical skills needed by the staff who perform data entry. (For the amount of money needed to obtain some of the most popular systems, see p. 24.) Choices include the following:

■ Should you automate the attendance data or stick to a simpler, but perhaps more time-consuming, system of hard copy storage and calculations done by hand? The inherent tradeoff: financial cost, investment in staff training, and ongoing commitments of staff time vs. easier data analysis and reporting, greater flexibility, and reduced staff time and effort to create reports and analyses.

■ If the system is automated, how will you support sites that lack the necessary equipment? LA’s BEST provided its sites with computers, scanners, and printers to support the data system. For Baltimore’s Safe and Sound Campaign, where one parent organization sponsored multiple afterschool sites, it was most efficient to have each site send hard copies of attendance records to a central location that had the staff, equipment, and expertise to process data. TASC also followed that model in its first year, although TASC later purchased computer equipment, a database system, and training for sites.

Rachel Baker, deputy director of Team-Up for Youth (an afterschool youth sports intermediary in the San Francisco Bay Area that receives funding from The Robert Wood Johnson Foundation), has grappled with this issue. Although Team-Up asks its grantees to provide data, “we haven’t created a standard [process] so it’s hard to compare across sites,” Baker notes. “We want to standardize more but also be respectful of the burden by trying to build on the systems already in place.”

DATA ENTRY
Accurate data entry is important because erroneous or missing information can distort findings and thus undermine the quality of the database. The most common problems include “stray values” (i.e., invalid entries for a specific variable, such as attendance recorded for a date on which the program did not operate) and missing data on a student’s demographic profile (e.g., age or sex), according to John Lee, senior researcher for the National Center for Research on Evaluation, Standards, and Student Testing (CRESST). Many data systems, including CRESST’s Quality School


“In one year, LA’s BEST grew 187 percent. That had major implications for how the organization documented and maintained student attendance. An automated system not only ensures a higher level of accuracy—frontline staff receive forms with preprinted student information—but also allows the organization to analyze attendance patterns across sites.”
—Tiffany Berry, LA’s BEST, Evaluation Consultant

V. Processing Data

Page 24: Afterschool Counts!

Profile, have built-in filters that identify invalid entries and prompt users to fill in missing information. (See also the section on cleaning data, below.)
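A minimal sketch of the kind of screening such filters perform, checking for the two common problems described above. The field names, required fields, and calendar of program dates are invented for illustration, not taken from any actual system.

```python
# Illustrative master calendar of dates the program operated.
PROGRAM_DATES = {"2004-03-01", "2004-03-02", "2004-03-03"}

# Example demographic-profile fields a system might require.
REQUIRED_FIELDS = ("student_id", "age", "sex")

def screen_record(record):
    """Return a list of problems in one attendance record: stray values
    (attendance recorded for a date the program did not operate) and
    missing demographic data."""
    problems = []
    if record.get("date") not in PROGRAM_DATES:
        problems.append("stray value: program not open on %s" % record.get("date"))
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append("missing: %s" % field)
    return problems
```

A data-entry screen built on a check like this can refuse to save a record until its problem list is empty, which is essentially what the built-in prompts described above do.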

Still, data entry can be a fairly sophisticated task. For example, the PC-based data “shell” designed by the Family League of Baltimore City had separate screens for entering students’ demographic information, attendance, and many other pieces of data. After the data were entered, site-based staff generated a summary file. By the second Friday of each month, they were expected to send data from the previous month via email or diskette to the organization responsible for all 89 sites, which aggregated the data.

The initiative provided all of its sites with the Microsoft Access-enabled data program and training on how to use it. But, as an observer notes, “What they didn’t do as well was make sure each site had someone competent to do the data entry. A lot of the sites…didn’t even have computers. It was quite a culture shock.” (Partly in response to these challenges, the Family League of Baltimore City now uses an online system that eliminates the need for site-based entry shells and monthly transfers of data to the central database.)

CLEANING DATA
“Cleaning” the data means checking to make sure the data are complete, valid, and consistent with the definitions established by the program or its evaluator. Problems with data are especially prevalent during the beginning and end of the school year, when routines are in transition. Other common issues include:

■ Verifying the student’s enrollment date. Evaluators of one major afterschool initiative found many reports of a student attending a program a month or more before his or her official start date. In other cases, they found a lapse of several months between the enrollment date and the time the student showed up in attendance records. Both discrepancies made it difficult to draw conclusions about the student’s exposure to the program and any effects it might have produced. “Enrollment can be interpreted as when the parent fills out an application form or as when the kid first comes to the program,” observes an evaluator. “We ended up establishing the first day of attendance as the enrollment date.”

■ Clarifying the dropout date. Often, attendance records indicate a student’s enrollment date but not his or her end date—yet many students don’t show continuous attendance until the end of the school year. This confusion over end or “drop” dates becomes an issue for evaluators attempting to measure exposure and outcomes. Evaluators of TASC, whose database contained records on more than 37,000 children in 2002,


WHO DOES DATA ENTRY?

At AFTER SCHOOL MATTERS in Chicago, a teacher at each of the 24 school sites enters afterschool attendance data. “You could assign the task to kids, but that takes them away from what they could be doing in the program,” observes Executive Director Nancy Wachs. “We could have each [afterschool] instructor do it, but they don’t have a computer in each room.”

At the P.F. BRESEE FOUNDATION’S drop-in program in Los Angeles, which serves about 1,600 children and 1,500 youth annually, each participant scans his or her swipe card while entering the building. Students scan in again at every classroom and activity lab, where the system asks for information on what they are doing there. Youth workers stationed at the activities verify the information. Activities that occur offsite use a sign-up sheet, and staff enter their students’ ID numbers into the center’s MIS.

LA’S BEST is beginning to use an automated process called Teleform. Afterschool staff take daily attendance on standardized forms, which they scan weekly into a central database. There is no manual data entry, although someone verifies the scanned information to make sure it is complete.

At the 40 afterschool programs funded by FOUNDATIONS, INC., site-based staff take a daily headcount. At the end of each month, they calculate average daily attendance and send the number to the central office for entry into a Microsoft Excel database.

Page 25: Afterschool Counts!

designated the seventh day after the last recorded date of absence as the student’s end date. However, that solution would not work for youth-serving programs that use a drop-in model, such as Boys & Girls Clubs.

■ Validating attendance dates. It isn’t unusual to find records of student attendance on weekends, holidays, and other dates when a program is closed, so attendance data need to be compared to a master calendar of possible service dates. “Between the day [students] first attend and the last day they attend, we compute the number of days it was possible to attend the program—in TASC’s case, the number of weekdays minus holidays and other school closings,” explains PSA’s Richard White. “We go through day by day and take out the unclean data. We also check to see whether the number of [reported] days attended is larger than the number of days possible. It’s pretty easy to do if the data are in a spreadsheet and the dates are columns in the spreadsheet; you can just delete everything in a column representing a date that the program wasn’t open.”

■ Obtaining a unique student ID number. If data analysis involves matching attendance data to information on individual students extracted from other databases, each student will need a unique identifier. There are two issues here: establishing an identifier that will elicit the data you need from the database, and making sure the identifier is entered accurately. School system-assigned student ID numbers are an obvious choice, but they are notorious for causing confusion when a student forgets his or her number or deliberately submits the wrong one. After School Matters allows teens to use either their student ID number or Social Security number. “A child may give us one number the first year and the other number the next, and we have no way of knowing if it’s a different [person],” says Wachs. “We offer both these options because they may not know their student ID or they may not go to one of the city’s high schools. For many teens, this is the first time they’ve used a Social Security number.” To address this problem, someone may need to check every student ID number for accuracy before the attendance data can be merged with other sources.

■ Categorizing excused absences. Some programs have considered tracking excused absences (e.g., due to illness, a family commitment, participation in another afterschool activity) because they tend to depress attendance rates. It’s a good idea, because it can yield valuable information about competing demands on children’s time, but one that is hard to execute. It may require the child’s family to send written verification of excused absences to the program; someone has to decide whether to accept the excuse; and someone has to enter the right code for the absence in the data system. “It usually becomes a bigger burden than it’s worth,” says White, recalling that TASC created more than a dozen subcodes to categorize student absences but abandoned the practice when sites interpreted the codes inconsistently. TASC’s attendance takers now simply mark participants either present or absent.
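Several of the cleaning rules described above can be sketched in a few lines of code. This is an illustration only, with invented dates and a simplified calendar: enrollment is derived from the first valid day of attendance, the end date is set seven days after the last recorded date (following the TASC-style rule described above), and possible service days are weekdays minus holidays, per White’s description.

```python
from datetime import date, timedelta

# Illustrative program closures; a real calendar would list them all.
HOLIDAYS = {date(2004, 3, 17)}

def is_open(d):
    """The program operates on weekdays that are not holidays."""
    return d.weekday() < 5 and d not in HOLIDAYS

def clean_student(attendance_dates):
    """Apply the cleaning rules sketched above to one student's records:
    drop dates the program was closed, derive enrollment from the first
    valid attendance, set an end date seven days after the last recorded
    date, and check attendance against the number of possible days."""
    valid = sorted(d for d in set(attendance_dates) if is_open(d))
    if not valid:
        return None
    enrolled, last = valid[0], valid[-1]
    possible = sum(1 for n in range((last - enrolled).days + 1)
                   if is_open(enrolled + timedelta(days=n)))
    assert len(valid) <= possible, "more days attended than possible"
    return {"enrolled": enrolled, "end": last + timedelta(days=7),
            "days_attended": len(valid), "days_possible": possible}
```

The weekend, holiday, and duplicate entries simply fall out of the valid list, which is the programmatic equivalent of deleting the spreadsheet columns for dates the program wasn’t open.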

Most data systems developed for evaluation purposes have built-in mechanisms that help with the cleaning process. Program directors who use YouthServices.net can generate reports that identify missing data elements for each child and summarize the percentage of children who have validated entries on file in each data category. At the initiative level, managers use the reports to learn which sites are having trouble entering data and what kinds of data they’re struggling with. Some YouthServices.net users respond very rigorously to those data gaps. The City of San Francisco, for example, considers services for any child to be “invalid” (not countable) if any of the data elements pertaining to the child’s teacher, such as his or her primary language, is missing.

Similarly, nFocus Software (developer of the KidTrax system) checks to see which sites are failing to enter data and follows up with those clients, at both the initiative and site level. “The system is designed to send automatic emails to remind [clients] of the upcoming reporting cycle, to alert them to the fields that are blank, or to trigger our staff to make calls,” says company president Ananda Roberts. That feature makes systems like KidTrax especially useful for loosely organized groups of programs that receive funding from the same organization but don’t have a standardized data reporting process.


Page 26: Afterschool Counts!

ATTENDANCE DATA usually are collected daily but tracked monthly. In other words, at the end of each month the daily data are aggregated so programs can calculate overall enrollment and average daily attendance (for the program and, sometimes, per child). Other useful analyses include:

■ Number of days that each child attended

■ Individual children’s attendance rates

■ Duration of enrollment—the average number of weeks or months that a child remains in the program and whether a child or group of children attends for more than one enrollment period (i.e., semester or year)

■ The total number of children served during the year, versus the number who attend for a specified duration

■ Maximum and minimum number of children served on any given day
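Most of these aggregates fall directly out of a simple daily log. The sketch below is a toy example with invented student IDs and a three-day month, just to show the arithmetic.

```python
from collections import defaultdict

# Illustrative daily log: (student_id, date) pairs for one month.
log = [("s1", "03-01"), ("s2", "03-01"), ("s1", "03-02"),
       ("s2", "03-02"), ("s3", "03-02"), ("s1", "03-03")]
days_open = 3  # the program operated three days in this toy month

days_per_child = defaultdict(int)   # number of days each child attended
served_per_day = defaultdict(int)   # number of children served each day
for sid, day in log:
    days_per_child[sid] += 1
    served_per_day[day] += 1

average_daily_attendance = len(log) / days_open
max_served = max(served_per_day.values())   # busiest day
min_served = min(served_per_day.values())   # slowest day
total_children = len(days_per_child)        # unduplicated count
```

Individual attendance rates follow the same pattern: divide each child’s days attended by the days it was possible to attend.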

All of these analyses are based on knowing students’ enrollment date, so it is essential to have a consistently applied definition of enrollment. TASC uses the term to mean the target number of students to be served; other programs use the total number of children served during the year, even if they only come for one day. “We always look at the month that had the highest attendance and consider that the total enrollment for the program,” says one researcher. “If we looked just at numbers for all the kids who ever attended, it would be an inflated enrollment because of the large number of kids dropping in and out.”
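The gap between the two definitions is easy to see in a toy example. The IDs and months below are invented; the point is only that the peak-month figure and the everyone-who-ever-attended figure can diverge sharply.

```python
# Illustrative unduplicated attendees per month for one program year.
monthly_attendees = {
    "Oct": {"s1", "s2", "s3"},
    "Nov": {"s1", "s2", "s4", "s5"},
    "Dec": {"s1", "s6"},
}

# "Enrollment" as the researcher describes it: the highest monthly count.
enrollment = max(len(kids) for kids in monthly_attendees.values())

# The inflated alternative: everyone who ever attended, even for one day.
ever_attended = len(set.union(*monthly_attendees.values()))
```

Here the peak-month enrollment is 4, while counting every child who ever showed up yields 6, inflated by the children dropping in and out.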

Other data analysis issues include: selecting a denominator for attendance rate, measuring dosage, interpreting and comparing results, and using attendance data to assess demand for services.

SELECTING A DENOMINATOR FOR ATTENDANCE RATE
Attendance rate can be calculated in several ways. Some programs base their analysis on the percent of available days that an enrolled child (or group of children) actually attended. Policy Studies Associates used that method during the first few years of the TASC evaluation; evaluators established a rate of 60 percent (attendance three out of five weekdays) as an indicator of “active attendance” and 80 percent (attendance four out of five weekdays) as an indicator of “high attendance.” But they later realized those rates didn’t account for the varying duration of students’ involvement in the program. Now, PSA defines an “active” attendance rate as at least 60 percent and 60 days; a “high” attendance rate is at least 80 percent and 80 days.

“To pick those definitions, we looked at the median number of days kids attended. Sixty days was the cutoff for the 30th percentile, meaning that 30 percent of kids attended fewer than 60 days,” explains Richard White. “We’re therefore cutting out the kids who weren’t enrolled for very long but whose high attendance might throw off our analysis. The 80 percent/80 days definition is an even more stringent measure.”
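PSA’s two-part definitions amount to a simple classification rule, sketched here with the thresholds quoted above. This is an illustration of the logic, not PSA’s actual code.

```python
def classify(rate, days):
    """Apply PSA-style attendance bands: 'high' requires at least an
    80 percent rate AND 80 days; 'active' requires at least 60 percent
    AND 60 days. Rate is a fraction of possible days attended."""
    if rate >= 0.80 and days >= 80:
        return "high"
    if rate >= 0.60 and days >= 60:
        return "active"
    return "low"
```

The two-part test is what screens out the distortion White describes: a child with a perfect rate over only 10 days classifies as “low,” because the duration requirement is not met.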

Alternatively, Baltimore’s Safe and Sound program, which uses average daily attendance to hold afterschool sites accountable for their service targets, interprets ADA as the number of students present in a given site, on an average day, over the course of a month.

MEASURING DOSAGE
“Dosage” refers to the level of exposure to a program or activity that a child receives by attending. Students who attend for more hours or days are presumed to receive a higher dosage than those whose experience is more limited. Dosage is most relevant for programs that seek to show that participation in an afterschool program produces specific outcomes.

The main challenge is to establish a standard for the minimum dosage level needed to produce the desired results. If one measures solely the percent of available days of service that a child or group attends, and the number of available days is very small, interpretations of participation might be falsely high. For instance, a student who attends 100 percent of 10 days has achieved a high attendance rate, but the actual exposure he or she received to the program’s social, cognitive, and


VI. Analyzing Data

Page 27: Afterschool Counts!

academic benefits is very different from that of a student who attends at a rate of 80 percent spread over 60 days. Combining the duration of a child’s attendance with his or her attendance rate provides a better measure of dosage—as TASC evaluators did, for example, in establishing 80 percent and 80 days as the standard for high attendance.

An attendance system that captures the information needed to compute dosage also enables a program director or evaluator to explore whether there is a relationship between dosage and desired outcomes, whether a minimum threshold of dosage must be met before benefits can be expected, and whether there is a maximum dosage after which no additional benefits occur.

Minimum dosage standards tend to vary according to the age of program participants. Evaluators of Citizen Schools, whose programs serve middle-school students, established 80 percent attendance as the minimum level needed to achieve positive outcomes, based on an intuitive sense about what can be expected from youth in grades 6-8. “We’re recognized for having over 80 percent attendance, which is high for this age group. But for a program serving kindergarten through third grade, I would not be satisfied with 80 percent,” Schlegel says.

Another challenge is to eliminate duplicate records. Let’s say a program in San Francisco’s Chinatown plans to offer three seasonal activities: a volleyball program, followed by dragon dancing, followed by Brazilian martial arts. “The question is, how many kids participate in each program and how many participate in more than one?” asks Rachel Baker. “If a girl participates in two seasons, how do you make sure she’s not counted twice?” To control for duplication and make sure it is comparing “apples to apples,” Baker’s program has considered measuring dosage by the number of service hours a child receives during the year.
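Unduplicated counting across seasons can be sketched as below. The rosters and hours are invented; the approach simply keys everything to a stable participant ID, which is what makes both the unduplicated count and the hours-per-year dosage measure possible.

```python
# Illustrative season rosters mapping participant ID to hours of service.
seasons = {
    "volleyball":   {"g1": 20, "g2": 18},
    "dragon_dance": {"g2": 25, "g3": 30},
    "martial_arts": {"g1": 15, "g3": 10},
}

participants = set()   # unduplicated set of everyone served this year
hours = {}             # total service hours per child across all seasons
for roster in seasons.values():
    for kid, h in roster.items():
        participants.add(kid)
        hours[kid] = hours.get(kid, 0) + h

unduplicated_count = len(participants)
multi_season = [k for k in hours
                if sum(k in roster for roster in seasons.values()) > 1]
```

Summing the per-season totals would count six participants; the set yields the true three, and the hours dictionary gives the service-hours dosage measure Baker’s program has considered.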

INTERPRETING AND COMPARING RESULTS
Data interpretation is both a science and an art, and this guide does not attempt to teach either one. However, some important questions to consider as you examine attendance data include:

■ Are students required to attend every day that the program operates and to stay for the full session, or are they allowed to come and go at will? For a drop-in program like the Boys & Girls Clubs, it’s fine if the data show that students come to play basketball for a while and then leave when the game is over. But for a more structured program, especially one with academic goals, that finding suggests a problem.

■ Is the program offered every day of the week? In order to reach conclusions about exposure, you need to know how many hours per day and week the program operates. Programs like KidTrax can analyze the busiest days of the month or year and the busiest hours during those days. “Maybe Tuesdays are slow in terms of numbers [of participants] but the kids stay longer, so total hours of service on those days are higher per activity,” explains nFocus Vice President Don Pruitt.

■ How are certain services defined? For initiatives like Team-Up for Youth and Safe and Sound, which aim to increase the number of afterschool slots available to youth, how does one define “new” or “expanded” services? Does it mean services provided to a person who has never played sports before? To someone who is newly enrolled at the program in question? Or expansion of a program’s capacity to offer a certain number of new slots? “This is important, because a girl in one program might be getting 12 more months of [services] while a girl in another program is getting only 12 more weeks,” says Team-Up’s Rachel Baker. Both girls’ data show expanded services, but the impact will be different in each case.

■ Are there competing activities that prevent children from attending? A survey or interviews may be the best way to understand the impact of contextual factors on attendance patterns.


Page 28: Afterschool Counts!

USING ATTENDANCE DATA TO ASSESS DEMAND
Can attendance data be used to gauge demand for services? Yes…and no, experts say. The data can offer insights into the preferences of attending students. For example, if the typical student attends three days a week or less, there probably isn’t much demand for services five days a week. Data also can indicate situations that warrant more analysis—such as a site with the capacity to serve 100 students that has only 20 participants, or one that consistently operates at overflow capacity. But attendance data can’t reveal much about what prevents non-attending students from participating or whether those students want to attend in the first place.

REPORTING DATA
Experience suggests three especially useful ways of reporting afterschool attendance data. One summarizes “process” information, such as the number of students served and the types of services they receive; one shows relationships among data from different sources; and one summarizes outcome data. For example:

■ YouthServices.net can generate a periodic Units of Service Report for each site that tabulates the number of students served, how many times each student shows up, and the cumulative hours of service. “We will then break it down to the number of kids, encounters, or hours of service per category, such as academic support or recreation,” says software developer Mark Min. Reports can be tailored to show the participation of a subgroup of students. The KidTrax system has similar capabilities.

■ YouthServices.net’s Youth Participation Report can track the attendance history of a specific child and provide a one-page summary of the child’s participation, including number of days attended, cumulative hours of service, types of services received, referrals to other programs, and the names of staff who worked with the child.

■ KidTrax provides more than 150 standard reports that analyze attendance, turnover, weekly usage, and participants’ demographic characteristics. Trend analyses identify usage patterns (positive and negative) over time, which helps users evaluate progress toward outcomes. A wizard-based Web survey tool takes outcome measurement a step further by helping users gather information on participants’ experiences and attitudes toward programming.

■ Quality School Portfolio (QSP) has a set of built-in reports that capture the relationship between data elements, such as student math scores based on teacher’s education level and parent’s involvement.
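A units-of-service tabulation of the kind Min describes can be sketched from a simple encounter log. This is an illustration, not YouthServices.net’s actual logic; the IDs, categories, and hours are invented.

```python
from collections import defaultdict

# Illustrative service log: one (student_id, category, hours) row per encounter.
encounters = [
    ("s1", "academic support", 1.5), ("s2", "academic support", 1.5),
    ("s1", "recreation", 2.0), ("s3", "recreation", 2.0),
    ("s1", "academic support", 1.0),
]

# Per-category rollup: unduplicated kids, encounter count, service hours.
report = defaultdict(lambda: {"kids": set(), "encounters": 0, "hours": 0.0})
for sid, category, hrs in encounters:
    row = report[category]
    row["kids"].add(sid)
    row["encounters"] += 1
    row["hours"] += hrs
```

Filtering the log before the rollup (by age group, site, or any other attribute carried on the record) produces the subgroup-tailored reports described above.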

When reporting some data, it is important to acknowledge the external factors that might influence them. Rachel Klein, evaluation director for After School Matters, cautions that attendance data for older students can be especially sensitive. Low-attending teenagers “tell us they have other demands on their time,” she says. “Overwhelmingly, [poor attendance] is because they got sick, had to do homework, had to take care of their siblings, or needed to make more money. Those are things we can’t control, and we need to present that information to help manage the expectations [of our audience].”


Page 29: Afterschool Counts!

A FEW CITIES AND SCHOOL SYSTEMS are developing data tracking systems that span a variety of youth-serving programs.

Because most are still evolving, there are few well-established models available for replication. The three efforts outlined below, however, illustrate some promising attempts to address the challenge. (For more detailed profiles of two other citywide systems, located in San Francisco and Louisville, Kentucky, please see Appendix B.)

DEPARTMENT OF YOUTH AND COMMUNITY DEVELOPMENT (NEW YORK, NY)
The Department of Youth and Community Development (DYCD) in New York City distributes federal Community Development Block Grant allocations made to local community-based organizations. To meet federal grant reporting requirements, New York State created a Results-Oriented Management and Accountability framework (ROMA). ROMA proved to be such a powerful tool that it has become the agency standard for outcomes tracking. In 2003, DYCD obtained funding for a demonstration project to develop a ROMA-derived outcome framework for youth and adults who participate in Beacon afterschool programs, as a first step in introducing ROMA to all of DYCD’s afterschool programs. The software will allow for extensive analysis of program effectiveness, across a variety of demographic groups, through password-protected Web access—available through virtually any computer connected to the Internet, using a one-of-a-kind, Web-based query feature.

Because of ROMA’s emphasis on results, the city’s tracking system will focus on linking participation levels in specific activities to outcomes, according to Janice Molnar, DYCD Deputy Commissioner for Program Operations. Currently, outcomes are limited to program attendance. As the system moves forward, program attendance will continue to be tracked but will be monitored only to determine whether a program is serving the projected number of children.

The software that will power DYCD’s system is being adapted from tools developed by The Rensselaerville Institute, a nonprofit national education center based in upstate New York, to help other organizations in the state track ROMA outcomes. One of the first steps was to decide what to include in the system. The current list of potential data fields includes name, address and other identifying information, demographic data, hours of program attendance, and every activity the student participates in each day. Although the number of required data fields is small, programs that use the system may add additional fields to organize data required by their other funders.

Initially, the system will be implemented in the city’s 80 Beacon Schools. Site directors will enter descriptions of all afterschool activities available to participants and their hours of operation. Each site will use its own preferred method of recording student attendance and activity participation (most Beacons currently use sign-in “activity sheets,” but there is no standardized method). The data from each site will be entered electronically onsite and submitted via the Web to a centralized database at a location still to be determined.

All DYCD contract officers will be trained to use the database to generate reports for funders and other stakeholders—for example, to track which children are eligible for TANF reimbursements. Having the data on all programs available electronically in one database, in an easy-to-use format, should improve contract managers’ efficiency, notes Darryl Rattray, Special Assistant to the Assistant Commissioner for Afterschool Programs.

The Rensselaerville Institute will work with DYCD staff to define program outcomes and identify indicators of them, so the system can be used to gauge the effect of afterschool programs on the children and communities they serve. “If one of our goals is academic enhancement, for example, an outcome could be [improved] report card scores for [afterschool participants],” Rattray explains. “Or, we might look at participant activity in tutoring and see if those who have a high rate of tutoring participation do better in school.”

Individual programs and schools will be able to access their own data to add activity descriptions or new participant information, run queries on spe-


VII. Creating Capacity for Data Tracking

Page 30: Afterschool Counts!

cific participants or groups of participants, create site report cards, and enter data from their own surveys. The system is password-protected, so each site or program has access only to its own data.

The DYCD data system is still in the planning stage. The 80 Beacon sites should be fully entered into the system by fall 2004. Ultimately, DYCD hopes to implement the ROMA framework for all afterschool programs funded by DYCD.

BUILDING BOSTON’S AFTER-SCHOOL ENTERPRISE (BOSTON, MA)
Building Boston’s After-School Enterprise (BASE) is a project of Boston’s After-School for All Partnership, a 15-member funding collaborative funded by The Robert Wood Johnson Foundation. In developing a citywide data system, BASE (which operates under a separate RWJF grant) seeks to create a new data collection and analysis system that will improve the development of afterschool programs. The system will provide up-to-date information on supply and demand, resources for parents and service providers, and state-of-the-field reports.

“This project is particularly important because Boston is a city of neighborhoods, and by and large the neighborhoods are low- to moderate-income, with families of color,” explains Debra McLaughlin, managing director of Boston’s After-School for All Partnership (BASAP). “We need to understand where these kids are going and what’s available to them in order for us to realign our resources and decision-making to ensure that as many kids as possible receive the support they need.”

BASE staff approached city agencies and major intermediary organizations that already compile information about individual sites and asked them to cooperate in creating a single data collection tool. That instrument, which is in development, will initially track broad program-level data: type of service(s), enrollment capacity, actual enrollment, participants’ demographic characteristics (aggregated), staffing configurations, funding streams, type of outcome measurement, connections with other organizations or schools, and possibly information about facilities.

“Our goal in this first cut is to get as many programs as possible to answer general questions. Then we expect to do some intensive surveys on issues of special concern, such as funding or staffing or transportation, and also surveys that tap into specific types of [activities],” says BASE Project Director Lisa Jackson. Ultimately, she expects the system to have the following benefits:

■ Data will be collected using a standard, Web-accessible form.

■ Programs will be asked to provide data less frequently. The reduced burden will give program staff time to conduct the work they do best.

■ Data for research, policy making, and systemic improvement will be more easily available to people working in the afterschool sector.

For now, each intermediary organization will work with the programs it supports to complete a data profile. The data will be entered in a central database, either onsite via the Web or by being scanned at a central location. Frontline data collection techniques have been left up to the intermediaries, although BASE is trying to identify useful data collection resources. At least temporarily, the database will be housed on Boston’s After-School for All Partnership’s website.

Database planners are grappling with several issues:

■ BASE hopes to capture data on all afterschool programs serving children from kindergarten through high school—but programs that serve youth over age 13 are exempt from state licensing requirements, which means there is no centralized, formal record of their existence.

■ The spectrum of program types is broad, and it will be a challenge to include the informal network of volunteer providers, such as coaches and mentors, faith organizations, recreational leagues, and clubs, in addition to formal programs for homework help and enrichment.

■ People in some sites may need help using the system. “There will probably be a subset of organizations—the mom-and-pop groups that may not even have an office—that will need additional support,” McLaughlin acknowledges. “Will we have to get them computers? What types of technical assistance will they need?”

CHICAGO PUBLIC SCHOOLS (CHICAGO, IL)
Chicago Public Schools (CPS) supports about 450 school-based afterschool sites serving an estimated 40,000 students in kindergarten through eighth grade. In 2002-03, the CPS afterschool programming division and the budget office began working to create a citywide system for tracking attendance in afterschool programs. The effort was driven by the need to document the number of afterschool participants eligible for TANF reimbursement and by concerns about preserving the programs with heaviest usage during a time of budget cuts.

The afterschool data system builds on two bases: CPS’s existing system for tracking teacher payroll, and an online system created by the city’s summer jobs initiative for youth. The jobs program created a central database that youth can access via the Internet to learn about available jobs, complete a detailed questionnaire (with fields for a range of demographic data), and submit applications; employers use the system to identify job candidates. Working with Edge Technological Resources (ETR), Inc., CPS similarly adapted the payroll system, which was already familiar to school data-entry staff, to track attendance in afterschool programs.

Beatriz Rendon, then head of CPS afterschool programs, and Diane Fager, director of Policy and Program Development, worked with a technology consulting firm and a programmer to create the Web-based After School Attendance Reporting (ASAR) data system. Since ASAR piggybacks onto the school district’s payroll system, it resides on the payroll department’s server (supplemented by the addition of a second server). Afterschool attendance data are entered at each site by the school’s payroll clerk, and for that reason a senior payroll official was also consulted during the system’s development. The clerks only have to enter a student’s identification number to trigger the underlying school database and find the child’s name and vital information.

ASAR only collects data on a student’s daily attendance in or absence from an afterschool program. The CPS Office of Accountability sends forms to each afterschool program, which teachers or program coordinators use to collect individual student data. Every week, for each day that a student was absent, payroll clerks enter into the system one of several codes that identify the reason (e.g., sick, excused, transferred). Blank boxes are presumed to mean the student was present.
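The presence rule described above can be sketched in a few lines; the single-letter codes shown are illustrative examples, not the full CPS code set:

```python
# Illustrative absence codes; the actual CPS code set is not listed in full
# in this guide, so these strings are assumptions.
ABSENCE_CODES = {"S": "sick", "E": "excused", "T": "transferred"}

def days_present(week_cells):
    """Count presumed-present days in one student's weekly row.

    week_cells holds one entry per program day: an absence code for a
    missed day, or "" (a blank box), which is presumed to mean present.
    """
    return sum(1 for cell in week_cells if cell == "")

# A five-day week in which the student was sick on Wednesday:
print(days_present(["", "", "S", "", ""]))  # 4
```

Because clerks enter data only for absences, a week with no entries at all is indistinguishable from perfect attendance—one reason, as later chapters note, that data-quality checks matter.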

The system is password-protected, and only payroll clerks and principals have access to the database. They can view data for their program but not for another school.

ASAR relies on human and technological capacity at each school, which has caused some glitches. Some schools had old computers with inadequate memory; ASAR developers arranged for technical assistance to remove unused programs and, in some cases, upgrade memory. When CPS began using the new system in the middle of the 2002-03 school year, school payroll clerks received stipends for their help. The stipends stopped in 2003-04, because the data entry process is now considered a regular part of the clerks’ job. CPS shifted from monthly to weekly data entry after realizing that breaking the task into smaller pieces reduced the burden on school staff.

Chicago has an advantage in that taxing authorities that are legally independent of the City of Chicago (e.g., Chicago Public Schools, Chicago Park District) come together to share a common vision of how to provide services for young people. Cooperation helps align their priorities and makes it easier to mobilize action. Still, system developers say that the process of creating a centralized system for multiple data users can reveal competing interests. “If you’re going to make all these people work together, you have to quickly identify the most common data points and put them in a standardized structure that appeals to [frontline data producers and users],” notes Jaison Morgan, senior policy analyst for the Mayor’s Office and a creator of the summer jobs data system.


VIII. Necessary Resources

WHAT DOES IT TAKE to develop the data systems described in this guide? Two ingredients are paramount: staff time and capacity for data activities, and money and expertise for technology.

STAFF TIME AND CAPACITY
Program directors and staff routinely bemoan the amount of time and effort it takes to collect and enter attendance data. And when data must be compiled from dozens of sites, the task becomes even greater. The central administration at the Family League of Baltimore City, for example, at one time had three full-time staff devoted to data work.

Staff who are being asked to take on unfamiliar data tasks need training. When LA’s BEST introduced its data system, the initiative convened all site coordinators to introduce the process and then held hands-on, small-group training sessions. Follow-up training occurred throughout the next year.

One appeal of automated data collection systems, such as swipe cards and scanners, is that they minimize the effort required by staff to take attendance. In addition, some software companies assist their clients with data “crunching” on a regular basis. For example, CitySpan Technologies, working for TASC, produces a weekly summary report that TASC can review and refer to as needed.

MONEY AND EXPERTISE FOR TECHNOLOGY
Typical costs for a data system include:

■ Purchase and installation of onsite computers, if not already available

■ Development or purchase of a managementinformation system (MIS)

■ Staff training to collect and/or enter data

■ Internet access

■ For swipe cards, at least one scanner and a serial connection

■ Purchase or lease of a software package

■ Customization of the MIS software, if desired

■ Expertise to analyze and interpret the data

Software costs often vary, depending on how many sites (or, in the case of swipe cards, classrooms) are involved and how much customization is required to meet the program’s information needs. The major customizable systems are YouthServices.net and KidTrax. Software for YouthServices.net costs $500 per year for a single-site license and unlimited technical support. Programs with more than 10 sites get a price break, but customization of the software costs extra.

A single-site KidTrax license has a one-time fee of $1,850 (which includes one scanner) and an annual fee of $449, which covers unlimited technical support, product updates, and data storage and hosting. Additional scanners cost $550 each. Multi-user licenses (i.e., when three to five people will be logging onto the system) cost $2,250 upfront.

Typically, KidTrax can be customized at no additional cost. Other customized software, however, can cost from a few thousand dollars for a small program to more than $100,000 for a citywide data system.
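Using the figures quoted above, a rough multi-year comparison is simple arithmetic. This sketch assumes the annual fees apply every year, including the first; actual pricing (volume discounts, customization charges, fee schedules) may differ:

```python
# Rough cost arithmetic based on the single-site figures quoted in the text.
# These are illustrative calculations, not vendor quotes.

def youthservices_cost(years, sites=1):
    """YouthServices.net: $500 per site per year at the single-site rate."""
    return 500 * sites * years

def kidtrax_cost(years, extra_scanners=0):
    """KidTrax single-site license: $1,850 one-time (includes one scanner)
    plus $449 per year; additional scanners cost $550 each."""
    return 1850 + 550 * extra_scanners + 449 * years

# Three-year comparison for a single site:
print(youthservices_cost(3))  # 1500
print(kidtrax_cost(3))        # 3197
```

The comparison shifts with the time horizon: KidTrax front-loads its cost in the one-time license, while YouthServices.net accrues the same amount every year.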

Some database planners say that the cost of developing the system is not as significant as the investment required to monitor and analyze the data, which can be a large and continuous job. Chicago’s citywide database of summer jobs for youth (see p. 22), for instance, has a committee that meets monthly to analyze data on which neighborhoods have high demand for jobs and which are being under-supplied by employers. A special team, augmented by staff at city agencies, monitors data quality and modifies the system as needed. For instance, if a large proportion of students begin an application but don’t complete it, the team may contact the principal of the school that serves the students and start a campaign to improve completions. Still, observers say the electronic database probably saves Chicago money because the city doesn’t need to hire people to process thousands of paper applications, and because it takes less time to categorize and search the information on electronic applications.


IX. Lessons and Conclusions

THE PEOPLE WHOSE KNOWLEDGE and experiences shaped this guide represent a variety of data tracking approaches. All, however, had encountered similar challenges, including:

■ Identifying the information needs that should drive the selection or development of a tracking system

■ Balancing the need for detailed data with the financial and human resources available to collect data

■ Learning how to use the system: understanding how each data element is defined, how to make queries, and how to analyze the reports generated from queries

■ Ensuring that data are entered correctly and completely

■ Keeping attendance records up to date

■ Tracking attendance in unusual services, especially those that occur offsite or are especially intensive (e.g., mental health counseling)

■ Tracking attendance by activity, especially when activities change several times during the day

■ Interpreting data accurately, especially when the data show a dramatic drop in expected attendance levels

■ Protecting the confidentiality of individual students and, sometimes, of specific programs when the data become part of a large, broadly accessed database

Their engagement with these issues (presented in Chapters III-VII and Appendix B) produced hard-won lessons about choosing a data system, implementing it, and living with the consequences.

CHOOSING A SYSTEM
What’s the best system for monitoring data on afterschool attendance and participation? There is no simple answer, because it depends on how the program is designed. An initiative that operates several very similar sites might do well hiring site-based data staff. An initiative with a very heterogeneous mix of sites, however, might find it more efficient to have sites upload their data to a central location for processing and analysis.

In programs where students stay in groups and move among activities with a group leader, a pen-and-paper approach might be sufficient. In a program that allows students to select activities and move from one to the next individually, swipe cards might be a good method—especially if the afterschool program is based in a school or center that has already purchased the equipment. Swipe card technology requires more technical support than other methods, however, and the expense goes up if you need to track every activity. (When relatively small numbers of participants are involved, a single scanner can track multiple activities during a single period, says Ananda Roberts, but high-traffic sites may need a scanner for each room.)

For many programs, scanner technology costs are offset by the time and effort saved. For other programs, however, there are political factors to consider: “[Swipe cards] would be controversial in the Bay Area because a lot of kids come from undocumented immigrant families, and there’s a real fear about how information might be used,” says Team-Up for Youth’s Rachel Baker.

Choices about data systems also depend on the program’s data needs. A program that only needs to collect a headcount and do simple calculations of average daily attendance at the group level will do fine using hard-copy rosters and entering the data into an Excel or Access database. A program that needs more detailed, child-level data or wants to analyze outcomes will need a more robust system for storing and manipulating data.
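A group-level calculation like the one described above needs nothing more than a column of daily headcounts; in code it is a one-line mean (in a spreadsheet, a single AVERAGE formula):

```python
def average_daily_attendance(daily_headcounts):
    """Group-level average daily attendance: the mean headcount
    across the days the program was in session."""
    return sum(daily_headcounts) / len(daily_headcounts)

# One week of headcounts transcribed from a hard-copy roster:
print(average_daily_attendance([24, 26, 22, 25, 23]))  # 24.0
```

The hard part, as Chapter VI notes, is not the arithmetic but choosing what counts as the denominator and entering the headcounts reliably.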

Although data needs vary from program to program, the following lessons apply to most systems:


“The key in making decisions is: What do the sites really need to know? What does the program office need to know? And what does the evaluator need to know? Everybody has to keep in mind how much work each additional item adds for the site.”—Richard White, Policy Studies Associates


1 Keep it simple. Program designers and evaluators admit that they get excited about a system’s most advanced technological capacities. If the program doesn’t really need those capacities, however, you probably should opt for something streamlined and easy to use.

2 Make sure the people who are directly responsible for data collection play a role in designing the system. Frontline program staff can give system designers a realistic sense of what data can and can’t be collected.

3 If you want to link afterschool attendance to educational outcomes, you’ll probably need to establish a data partnership with the school district. Agreements on what data will be provided, how, and with whom, pave the way for data sharing.

IMPLEMENTING THE PROCESS
A few interviewees described breakdowns in the data process as it moved from planning to implementation. “There was a disconnect between the people who had a vision for what the system would do, the MIS people, and the researchers, all of whom were in different places,” one observer said:

It was murky who the leadership was. The other problem was the idea that you could just tell a bunch of sites to [collect data] and it would happen. We didn’t give them enough money to make them want to do it, and then there were a lot of people who did want to comply but didn’t know how. It’s now 2003 and we’re just getting a system that people are beginning to use well.

Moreover, the quality of data tracking can vary across sites if the oversight organization or evaluator doesn’t require sites to use a standardized process and format. Without a standardized data system, it’s hard to compare data. School- and center-based programs have an easier time addressing this issue. But it’s much more challenging at a program such as a sports league with many separate teams, whose participants don’t come to any central location.

Eight major lessons emerge from these experiences:

1 Build trust in how the data will be used. Make an effort to convince people that they need good data to strengthen their programs and communicate their successes.

2 Be consistent in requiring that sites submit data, but be as flexible as possible in how they collect and submit the information. Don’t sacrifice standardization, however, because that will jeopardize your ability to compare across sites.

3 Urge sites to collect data daily. Don’t wait until you need to generate a report, and don’t try to reconstruct data retroactively.

4 Take time to troubleshoot data discrepancies.

5 Create a mechanism for addressing data collection issues, such as a committee or online troubleshooter.

6 Think creatively about the resources available to you for implementation. Are there school staff or equipment that can be put to use with minimal effort? Is there another program with an existing data system that your program can piggyback onto?


FACTORS TO WEIGH WHEN CHOOSING A METHOD

✔ How comprehensive the database needs to be

✔ The time burden that will fall on frontline staff

✔ The cost of developing or adapting the system

✔ User friendliness, especially in terms of generating and customizing reports

✔ How complicated it is to maintain the system (frequently a hidden cost of time, if not money)

✔ Whether someone is available in-house to analyze data or whether that capacity must be purchased

✔ How dispersed the sites are (which affects the potential for losing data during transfers)


7 Commit to the goal of managing data and keep pushing toward it. That often means keeping key players motivated to stay at the table.

8 Create incentives for collecting and entering data. Some programs require instructors to turn in their attendance sheets before they can be paid. The state of California ties funding for afterschool education and safety programs to average daily attendance, at least for elementary and middle schools.

LIVING WITH THE CONSEQUENCES
The July/August 2003 issue of Youth Today drew attention to afterschool programs that have lost funding because attendance data showed low participation and, consequently, a high cost per child. The pattern of sites getting “defunded” because of poor attendance makes this a legitimate concern for site and initiative directors. But, as one data expert points out, “There’s an adjustment period that has to occur when agencies [start using true numbers]. We have to value those numbers and move forward and not get all hung up on the difference between old and new attendance numbers.”

The best way to respond to poor attendance results, experts say, is to do more analysis to uncover the factors. Is the problem that students don’t find the program interesting and are voting with their feet? If so, it may make sense to improve program quality rather than to eliminate or replace the program.

Is the problem caused by incomplete data, which causes too many student records to get thrown out during the data cleaning process? Make sure the system is simple and useful to the people charged with collecting the data, so they will be inclined to use it. Citizen Schools’ spreadsheet, for instance, records not only student attendance but also information on students’ school schedules and how they are transported to and from campus—information the frontline staff need and use.

CONCLUSIONS
This guide seeks to help program directors, funders, policymakers, and researchers understand the issues and options involved in tracking attendance and participation in out-of-school activities. The people interviewed for the guide had first-hand knowledge of the challenges inherent in using, collecting, analyzing, and maintaining data at the program and citywide level. Some had struggled mightily to put tracking systems in place, and some tried more than one option before they found a system that worked for them. But they all understood the importance of not being daunted by the challenges.

In gathering information for this guide, we did not discover any flawless data systems or practices. We also realized, regrettably, that we could not offer a step-by-step recipe for choosing a system because of the great variation in program models, capacities, and data needs. We did, however, hear a consistent theme: JUST DO IT. The lessons outlined above, and the examples in Appendix B and throughout the guide, provide a starting point for “doing” attendance data. Program directors can mix and match a variety of options, from the simple to the high-tech, to find a system that works for them.

Start small; you can always build on the system. Keep it efficient and simple. Focus on how the data will be useful to you and your program staff—to improve program design or management, perhaps, or to fulfill accountability requirements. And don’t be intimidated by the sometimes obscure language of technology and analysis. The important thing, experts agree, is to jump in and give one of the methods a try.


“Try to learn what it is about a certain day that causes differences in attendance. Is it every Friday that attendance drops dramatically? Are Mondays really high?”—Christina Russell, Policy Studies Associates
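The day-of-week analysis Russell describes can be sketched as follows, assuming nothing more than dated daily headcounts; the sample figures are invented for illustration:

```python
from collections import defaultdict
from datetime import date

def attendance_by_weekday(records):
    """records: list of (date, headcount) pairs.
    Returns the mean headcount for each weekday that appears."""
    totals, counts = defaultdict(int), defaultdict(int)
    for day, headcount in records:
        name = day.strftime("%A")  # e.g., "Monday"
        totals[name] += headcount
        counts[name] += 1
    return {name: totals[name] / counts[name] for name in totals}

# Two Mondays and two Fridays of hypothetical headcounts:
records = [(date(2003, 9, 8), 30), (date(2003, 9, 12), 18),
           (date(2003, 9, 15), 28), (date(2003, 9, 19), 16)]
print(attendance_by_weekday(records))  # {'Monday': 29.0, 'Friday': 17.0}
```

A persistent gap between weekdays, like the Monday/Friday split above, is the kind of pattern worth investigating before concluding that a program is underattended.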

Appendix A: Interviewees

Rachel Baker
Deputy Director
Team-Up for Youth
San Francisco, CA

Martin Bell
Deputy Superintendent
Jefferson County Public Schools
Louisville, KY

Tiffany Berry
Evaluator of LA’s BEST
Los Angeles, CA

Eric Bruns
Evaluation Coordinator for Baltimore Safe and Sound Campaign
University of Maryland
Baltimore, MD

Nani Coloretti
Director of Planning and Budget
Department of Children, Youth and Families
San Francisco, CA

Lara Fabiano
Senior Research Associate
Policy Studies Associates, Inc.
Washington, DC

Hugo Fernandez
Site Director
Bronx College Town at P.S./M.S. 306
New York, NY

Michael Funk
Director
Sunset Beacon Center
San Francisco, CA

Margaret Heritage
Web QSP Director
National Center for Research on Evaluation, Standards, and Student Testing
Los Angeles, CA

Lisa Jackson
Project Director
Building Boston’s After-School Enterprise
Boston, MA

Tonia Kizer
Administrative Assistant to the Director of Policy and Program Development
Chicago Public Schools
Chicago, IL

Rachel Klein
Director of Evaluation
After School Matters
Chicago, IL

Andrew Lappin
Site Director
Extended Day at the Heritage School
New York, NY

Suzanne Le Menestrel
PPAS System Project Director
Promising Practices in Afterschool
Academy for Educational Development
Washington, DC

Carolyn Marzke
(Former) Senior Research Associate
Policy Studies Associates, Inc.
Washington, DC

Mark Min
Principal and Founder
CitySpan Technologies
Berkeley, CA

Debra McLaughlin
Managing Director
Boston’s After-School for All Partnership
Boston, MA

Rhe McLaughlin
Director of Evaluation
Foundations, Inc.
New Jersey

Janice Molnar
Deputy Commissioner for Program Operations
Department of Youth and Community Development
New York, NY

Jaison Morgan
Senior Policy Analyst and Project Manager
Mayor’s Office of Programs
Chicago, IL

Ellen Pechman
Senior Research Associate
Policy Studies Associates, Inc.
Washington, DC

Don Pruitt
Vice President
nFocus Software
Phoenix, AZ

Darryl Rattray
Special Assistant to the Assistant Commissioner for Afterschool Programs
Department of Youth and Community Development
New York, NY

Elizabeth Reisner
Principal and Co-Founder
Policy Studies Associates, Inc.
Washington, DC

Ananda Roberts
President
nFocus Software
Phoenix, AZ

Christina Russell
Research Associate
Policy Studies Associates, Inc.
Washington, DC

Lisa Saenz
Unit Director
Boys & Girls Club of Corpus Christi
Corpus Christi, TX

Charlie Schlegel
(Former) Research Director
Citizen Schools
Boston, MA

Don Shaw
Executive Director
Salvation Army Boys & Girls Clubs
Louisville, KY

Cheryl Taylor
Managing Consultant
Edge Technological Resources (ETR), Inc.
Chicago, IL

Rahan Uddin
Database Administrator and Help Desk Coordinator
The After-School Corporation
New York, NY

Nancy Wachs
Executive Director
After School Matters
Chicago, IL

Karen Walker
Evaluator
Public/Private Ventures
Philadelphia, PA

Christopher Whipple
Vice President for Operations
The After-School Corporation
New York, NY

Richard White
Managing Director
Policy Studies Associates, Inc.
Washington, DC

Virginia Witt
Director
San Francisco Beacon Initiative
San Francisco, CA

John Wolfkill
(Former) Associate Director
P.F. Bresee Foundation
Los Angeles, CA

Benson Wong
Director
Chinatown Beacon Center
San Francisco, CA

Joan Wynn
Evaluator
Chapin Hall Center for Children
Chicago, IL


Appendix B: Profiles of Selected Strategies

Evolving from Electronic Spreadsheets to a Web-based Tracking System
THE AFTER-SCHOOL CORPORATION

Linking Afterschool Attendance Data to Students’ School Records
JEFFERSON COUNTY (KY) PUBLIC SCHOOLS

Tracking Afterschool “Encounters” Across an Entire City
SAN FRANCISCO DEPARTMENT OF CHILDREN, YOUTH AND FAMILIES

Collecting Afterschool Attendance and “Dosage” Data for Evaluation Purposes
THE BEACON INITIATIVE

Evolving from Electronic Spreadsheets to a Web-based Tracking System
THE AFTER-SCHOOL CORPORATION

CONTEXT

The After-School Corporation (TASC) supports afterschool projects, primarily in New York City, that serve students in grades K-12. TASC projects are housed in schools but sponsored and operated by community-based and other private nonprofit organizations. TASC selects grantees, administers the program, and provides funding and technical assistance. All sites share certain core components and values developed by TASC, although service delivery models vary widely. In 2002, the TASC attendance database included 37,229 participants at 160 sites. (In 2003, the total number of TASC sites grew to 187. TASC does not track attendance in sites located outside New York City nor in a few other sites that receive only a small portion of their funding from TASC.)

TASC began tracking attendance in 1998, using a simple electronic spreadsheet filled out by site-based instructors. In 2001, the organization began shifting sites to a Web-based system that uses YouthServices.net to centralize and manipulate data.

TASC had four reasons for tracking student attendance, says Chris Whipple, TASC’s Vice President for Operations:

■ Out of a recognition that a program cannot substantially affect children’s outcomes or public policy if children do not attend the program regularly;

■ To support the program evaluator’s efforts to link students’ afterschool “dosage” to outcomes;

■ To enable TASC to adjust the budgets of sites that exceeded or fell short of projected enrollment and daily attendance rates; and

■ To satisfy the reporting requirements of TASC’s biggest funder, the city’s Department of Youth and Community Development, which based its level of contribution on attendance figures.

THE STRATEGY

TASC’s original attendance tracking system was developed by program evaluator Policy Studies Associates. Because most sites lacked access to computers and the Internet, it was designed to be very simple. It consisted of a simple Microsoft Excel spreadsheet that captured students’ names, Board of Education identification numbers, whether the student had enrollment forms on file, and all of the dates that the program would operate that year. At the beginning of each daily session, site-based instructors—known as “group leaders”—recorded on hard-copy forms a “P” (for present) or an “A” (for absent) next to the name of each child, on every day of service. Once a week, another staff member (often located at the sponsoring community organization) entered the data and saved it on a floppy disc. Site directors mailed the discs to TASC, where a programmer pulled the data into a Microsoft Access database.

That system collected the core data that TASC and its evaluators needed, but it was cumbersome. There were many opportunities for data to be lost or entered incorrectly, the data collection burden on sites’ limited staff was high, and once the data had been submitted to TASC the sites had no opportunities to do their own analyses.

Will Corbin, a computer programming consultant, helped to turn the rudimentary spreadsheet into a more systematic, flexible, powerful reporting tool. He provided site-based staff training and support, and he helped TASC program officers figure out what reports would be most useful for ongoing program management. After a few years, TASC decided to adopt a Web-based system that could help sites collect “clean” data with a minimum of effort, store large amounts of data easily, and use the data more effectively to manage individual projects and the overall program. TASC selected the YouthServices.net system, developed by CitySpan Technologies, to fulfill those purposes. (CitySpan’s data system has been adapted to serve many large-scale afterschool programs and initiatives, including the Children and Youth Investment Trust Corporation in Washington, DC, in addition to others profiled in this report.)

Data Collection
“First and foremost in our thinking,” Whipple says, “was: What’s the minimum amount of information that needs to be inputted and what’s the simplest possible way for sites to indicate whether a child is present or absent each day? Those became the mandatory [data entry] fields, and everything else is an optional management tool.”

After each TASC project is assigned a unique log-in and password to the system, the first step is to establish a record for each student and consolidate the students into groupings that reflect their afterschool experiences. (All TASC projects group participants into what are essentially homeroom classes. The exception is at high schools, where students are grouped by activity.) In the data system, each grouping encompasses 20 to 30 children, and each has a unique name (e.g., Group A, Group B).

The required data elements for student records include: student name, address, Board of Education identification number, birthdate, and TASC enrollment date. Site staff may also fill several optional fields, including the child’s sex, race/ethnicity, grade in school, language spoken at home, receipt of special educational services, homeroom teacher during the school day, and transportation services to and from the afterschool program. Some site directors, such as Andrew Lappin of the Extended Day Program at the Heritage School in Manhattan, also include informal notes, such as “This girl is super computer literate—would be great for yearbook class,” or “Needs extra time for taking exams.”

Project directors then tell the data system what the schedule of activities is for each group and create a student roster for each group. Once a week, they print out a blank roster—essentially, a manual attendance sheet—for each group and for every day that the group meets (usually five times per week). The directors give the rosters to group leaders, who place an “X” in the box beside the name of each student who is present, every day. (A blank box is presumed to indicate absence.) Either the site coordinator or an administrative assistant collects the rosters at the end of the week, accesses the database via the Internet, and enters the data. Data entry is a fairly simple process, because the electronic form looks just like the rosters and users need only click on the appropriate box to indicate attendance. At that point, the data are in the system and can be used by the site director, TASC program officers, and the evaluator.

Some sites vary the process slightly. For example:

■ Hugo Fernandez, director of Bronx College Town afterschool program at P.S./M.S. 306, a TASC project sponsored by the Committee for Hispanic Children and Families, has his group leaders take attendance several times: when students first come into class, during the supper provided by the program, and during special activities held at the end of the day. He also generates a daily roster that has a blank line next to each student’s name, which he uses as a sign-out sheet as parents pick up their children.

■ Andrew Lappin, whose high school afterschool program at the Heritage School is sponsored by Teachers College of Columbia University, waits about two weeks after the school year starts before creating rosters, to give students time to switch classes or drop out of the program. Teachers keep the rosters for their classes in their desks or mail boxes for the whole week; Lappin also keeps copies in a file folder. On Fridays, two student assistants collect rosters from the teachers, and on Monday Lappin and the assistants begin entering data. Then, Lappin generates a Missing Attendance Report (a feature offered by most electronic database systems) that identifies the percentage of data missing for each afterschool class, per day, week, or month. Lappin tracks down the teachers whose classes have incomplete data to find out what still needs to be entered.
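The percentage in a report like Lappin’s reduces to simple arithmetic; this sketch assumes the system can count how many daily rosters were actually entered against how many were expected for a class:

```python
def missing_attendance_pct(days_entered, days_expected):
    """Percent of expected attendance days with no data entered,
    loosely mirroring the Missing Attendance Report described above.
    The two counts are assumed inputs, not actual system fields."""
    return 100.0 * (days_expected - days_entered) / days_expected

# A class with four of its five daily rosters entered this week:
print(missing_attendance_pct(4, 5))  # 20.0
```

Run per class and per week, a figure like this tells a director exactly whose rosters to chase down, which is how Lappin uses the report.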

Development of the Data System

YouthServices.net was a preexisting software and technical assistance package, so TASC did not need to develop the system. Rollout occurred slowly after a one-year demonstration run by the system’s developer. Twenty TASC

APPENDIX B / 29

“With the Excel system, we tried to fool ourselves and say it was a database. But the reality was, it wasn’t doing anything for the sites. They were just plugging in values and sending it to us. It served the purpose of letting us know how many kids were attending, but it was only giving us attendance and it wasn’t giving the sites anything more… Also, every time we wanted to generate a report or invoice, or look at data in various ways, the consultant [programmer] would have to do the query from the ground up.”
—Rahan Uddin, TASC Database Administrator


sites piloted YouthServices.net in 2001, and it expanded to another 40 sites in 2002. All TASC sites are expected to be online in the 2003-04 school year.

The phased rollout gave both sites and TASC some time to work out the kinks. Observes Lappin, who is a physics teacher and former mechanical engineer, “It even took me a while to get up to snuff with the program…. When we first started, it was harder to create classes [in the system] because I had no programming experience. We didn’t call a math class ‘math,’ for example; we had to call it ‘M$1’ or ‘M$B.’ I didn’t know how those codes corresponded to the classes I knew. Now [after adjustments to the system], I can use my own labels instead of their codes. The system is much more language-based now, whereas before it was symbol-based.”

Using Data

Rahan Uddin and YouthServices.net generate three weekly reports for the TASC program officers, president, and other key leaders. The primary purpose of the reports is to support operations management. Staff and directors examine the reports individually to see if their sites are reaching enrollment and attendance targets. If they aren’t, especially as the school year gets into full swing, TASC program officers will intervene with site directors. If by midyear a site is still not meeting projected enrollment, TASC will adjust the target to a more realistic number and reduce the site’s budget allocation. Sites that are exceeding enrollment targets and maintaining high daily attendance rates usually get a budget increase.

The first weekly report presents the number of days each program has been open that year, the program’s enrollment rate, the attendance rate, and the number of days for which data are missing. TASC uses these reports not only to assess participation levels but also to flag possible management problems. For example, if TASC program officers notice that any site is open less than the number of days school has been open, they will discuss with the site staff reasons for the closure (or for incomplete data) and encourage the director to remain open every regular school day.

The second report presents similar data grouped by elementary, middle (or combined elementary/middle), or high school. It gives each group’s enrollment rate, attendance rate based on targeted enrollment, and actual head count.

The third report is a four-week history for each site, which provides a “snapshot” of trends in enrollment and attendance. This report is likely to be phased out under the new system, which is capable of providing such analyses on demand.

In past years, TASC consultants generated the reports from the raw data in the system. Beginning in 2003, the developers of YouthServices.net created a tool that TASC can use to generate the reports in-house.

TASC asks site directors to periodically run a Missing TASC-Required Data Report to identify the type and amount of data missing for each student. Those data include the elements required for TASC’s invoices to the Department of Youth and Community Development, such as valid Board of Education identification numbers and enrollment forms on file. Site directors may also generate their own reports of average daily attendance, number of days attended by each participant, attendance for specific populations (e.g., all girls), and attendance totals by date. For each report, the user can specify the timeframe.

Lappin, for example, might look at attendance patterns by ZIP Code. His school is in East Harlem, but a significant number of students come from Washington Heights, Brooklyn, and Queens. “I can look to see if the kids from Brooklyn have been attending regularly by punching in their ZIP Code. If they’re not, I can ask them why. Maybe it takes too long to get here by subway.”
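The kind of ZIP-Code lookup Fernandez and Lappin describe could look roughly like this; the record layout, the ZIP Codes, and the 60 percent cutoff are all illustrative assumptions, not the system’s actual query interface:

```python
# Sketch: list participants from a given ZIP Code whose attendance rate
# has dropped below a cutoff, so staff can follow up with them.
def low_attenders_by_zip(students, zip_code, cutoff=0.60):
    return [s["name"] for s in students
            if s["zip"] == zip_code
            and s["days_attended"] / s["days_offered"] < cutoff]

students = [
    {"name": "J", "zip": "11216", "days_attended": 20, "days_offered": 40},
    {"name": "K", "zip": "11216", "days_attended": 35, "days_offered": 40},
    {"name": "L", "zip": "10029", "days_attended": 10, "days_offered": 40},
]
print(low_attenders_by_zip(students, "11216"))  # ['J']
```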

COSTS

Shifting from the Excel format to YouthServices.net imposed startup, licensing, and training costs on TASC (see p. 24 for information on typical YouthServices.net prices, provided by CitySpan Technologies). Since the program is Web-based, it requires access to a computer and the Internet. TASC provided one laptop computer to each site, but Uddin points out that many site directors can also access the Internet in school libraries, school computer labs, their sponsoring community agency, or even their homes.

Now that TASC can generate its own data reports, Uddin expects the cost of using data to be lower than in previous years, when consultants had to be hired for every programming need. In those years, it was not unusual to spend about $175,000 on data programming for 120 to 130 sites.

After the initial student information has been entered, the system’s cost in terms of personnel time is minimal. Fernandez estimates that it takes about five minutes to enter information for each new child, assuming the user is fairly computer-literate. That translates into about two days of data entry for a project with 200 participants. It then takes an experienced user about 10


“At one point I had to give a survey and I wanted to pick kids who had attended at least 100 days. I generated a query and found all kids whose attendance totals were in three digits… Sometimes I want to do things like figure out how many children receive public assistance, or learn who is bused and who gets to walk home. It isn’t the simplest thing to create queries on your own, but YouthServices is always willing to help.”
—Hugo Fernandez, TASC site director


minutes to print rosters and another hour to input the attendance once a week. (This does not include the time it takes to take daily attendance.)
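The arithmetic behind those estimates is easy to verify; a minimal check using only the figures quoted in the text:

```python
# Back-of-envelope check of the data-entry estimate quoted above.
minutes_per_child = 5      # Fernandez's per-child estimate
participants = 200         # example project size from the text

total_hours = minutes_per_child * participants / 60
print(round(total_hours, 1))  # 16.7 — roughly two full working days
```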

ISSUES AND CHALLENGES

Keeping records up to date. With all that site directors and their staff members need to do onsite, it’s easy for them to fall behind in entering data. Thus TASC has built an incentive into its process: If a site is 15 or more service days (three weeks) behind in submitting data, payments are withheld.

Keeping track of rosters. The two-stage process used by TASC—marking up rosters and then entering the data into the database—means there is potential for data to be lost between the time instructors take attendance and the time it is entered electronically, especially when data entry only occurs once a week. Site directors usually designate a place to keep the attendance sheets so they don’t get lost, such as a mail box or binder.

Tracking attendance at the activity level. YouthServices.net has the capacity for users to assign a service category (e.g., academic enrichment, technology, community service) to each student grouping or activity and to track participation in more than one activity per day. TASC has not used that option yet because of concerns that it places a burden on the group workers who collect attendance data. “It’s important to keep in mind what the agency’s reporting needs are and also what the sites’ capabilities are,” Uddin says. “Do they have the necessary technology to do what we’re asking? Will they likely have the staff power to do it?”

Tracking offsite activities. At Lappin’s project and other TASC sites, athletic and gym classes occur off the school campus, which makes it harder to collect attendance data from instructors. The swimming teacher meets participants at the swimming pool, for example, and may not visit the school for several days. Lappin tries to impress on all teachers that attendance data drive the project’s funding, but it is a constant effort to obtain offsite attendance data in a timely manner.

Analyzing and interpreting data. Although TASC leaders view their program as an everyday activity, they realize that afterschool programs are not mandatory and participants may have competing interests that cause them to miss days—especially as youth become older and make commitments to sports teams and afterschool jobs. Therefore, TASC established attendance expectations that vary according to the child’s level of schooling. In elementary schools (K-5 or K-6), 70 percent of students are expected to attend a minimum of three days per week. In combined schools (K-8), 65 percent of students are expected to maintain that attendance rate. In middle schools, the proportion is 60 percent, and in high schools it drops to 50 percent. TASC takes those differing grade-level expectations into account when analyzing data within and across projects.
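TASC’s grade-level expectations can be expressed as a simple check. The thresholds below come from the text; the data shape and function are illustrative, not part of TASC’s actual system:

```python
# Sketch: check a site's attendance against TASC's grade-level expectations.
EXPECTED_SHARE = {          # share of students expected to attend >= 3 days/week
    "elementary": 0.70,     # K-5 or K-6
    "combined":   0.65,     # K-8
    "middle":     0.60,
    "high":       0.50,
}

def meets_expectation(school_type, days_attended_per_week):
    """days_attended_per_week: list of weekly attendance counts, one per student."""
    regular = sum(1 for d in days_attended_per_week if d >= 3)
    share = regular / len(days_attended_per_week)
    return share >= EXPECTED_SHARE[school_type]

# 6 of these 10 middle schoolers attended 3+ days, so the 60% bar is met.
print(meets_expectation("middle", [5, 4, 1, 3, 0, 3, 2, 4, 3, 2]))  # True
```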

OBSERVATIONS AND LESSONS

1 For a large initiative, the Web-based system is an improvement over the simpler database, although there is still room for improvement.

Site Director Hugo Fernandez says YouthServices.net makes it much easier and faster to enter data because he can choose to indicate only the students who are absent, rather than having to enter either presence or absence for every child. He also likes having the system on the Web rather than relying on email or postal services to send data to and from TASC’s headquarters; that increases the likelihood that data will be kept current, he says.

Lappin also likes the system, although he’s eager to add additional data elements—such as the type of music that each student likes. “I’d love to be able to generate a report of all the kids who are into rap so I can tell them when a great rap artist is coming to visit,” he says.

Both TASC directors say their system would also work for a smaller initiative or a single site, although it might not be worth the data entry time for a small program whose director knows all the participants personally. When asked whether such a system requires an intermediary like TASC to provide staff time and assistance, Uddin says that a full-time administrator is unnecessary as long as site directors receive training in how to use the system and guidance on their responsibilities for entering data and generating reports.

2 Keep it simple.

Project staff won’t necessarily embrace all of the bells and whistles that appeal to program designers and evaluators, Uddin cautions:

“They may just be focused on getting the basic data into the system, and you have to understand that when you’re developing the system so you don’t go overboard. Do they have a staff person dedicated to doing this type of thing, or will one staff person or even the site coordinator be taking time out to do [data collection and entry]? What is the end user’s knowledge of computers and databases? We tried to make our system look simple and friendly; we gave it nice colors so people wouldn’t be intimidated.”

3 Take time to troubleshoot data discrepancies.

For accountability (and liability) reasons, it is helpful to know whether a student who fails to show up for afterschool activities was absent or present during the school day. As soon as the daily rosters are completed, Fernandez has a staff person check whether the students missing that day had attended school. If they did, the staff person immediately calls the child’s family member to report the child’s absence and find out what happened. Fernandez also has the school’s main office generate a list of children who were absent during the school day, and he compares it to his list of attendees to see if any afterschool participants skipped the school day.

4 Be patient.

The process of entering initial data for every student can be time-consuming and tedious, but the effort pays off. Lappin hires two senior students to help with data entry. (To address confidentiality concerns, he selects top academic performers and screens them carefully. They do not have access to the system password and are not allowed to work on the database when Lappin is not in the room.)

5 Build trust for how the data will be used.

Lappin, who has entered into YouthServices all of the student data from his project’s intake form and the school’s own hard-copy records, meets privately with each afterschool student during the year to show him or her what personal data are in the system. He asks for corrections and also whether it is acceptable to include all of the data. “The school is extremely diverse culturally, and some cultures don’t like giving out information, especially about Social Security numbers and household income,” Lappin notes.

Linking Afterschool Attendance Data to Students’ School Records
JEFFERSON COUNTY (KY) PUBLIC SCHOOLS

CONTEXT

Jefferson County Public Schools (JCPS), which serves 97,000 children and youth through 175 schools, is the 27th largest school district in the United States. In 2001, school officials became interested in learning more about the connections between student achievement and the experiences that children have out of school. What community activities are children participating in, especially those students who do or don’t perform well in school? How might community experiences be enhanced for students who are struggling in school?

“We wanted to know if afterschool participation makes a difference for kids in school,” explains Deputy Superintendent Martin Bell. “From the time a child is born until he or she graduates from high school, the school system has them 13 percent of the hours they’re awake. [For] 87 percent of the time, they’ve been at home, at church, at a Boys & Girls Club, on street corners.”

The curiosity was driven in part by Superintendent Stephen Daschner, whose background in academic research infused JCPS with the sense that “if the data don’t support it, it isn’t true.” JCPS also had collaborated for 10 years with the city departments of family and child protective services, health and mental health, and drug abuse treatment. Their collective effort to make services more efficient and to improve family outcomes laid a foundation for sharing data across service systems.

JCPS administrators began conversations about collecting and sharing data with the heads of several community organizations, including the YMCA, United Way, and Salvation Army Youth Centers (which operate Boys & Girls Clubs), and with city officials.

Don Shaw, executive director of the Salvation Army Boys & Girls Clubs, responded immediately. He was already tracking the afterschool participation of 5,000 children and youth in his clubs, and he thought it would help immensely if community programs had centralized access to participants’ school attendance, test scores, and disciplinary actions. “We were going from school to school to collect the data by hand,” Shaw recalls.

The Boys & Girls Clubs were using KidTrax swipe cards and software, developed by Phoenix-based nFocus Software, to track student attendance, and Shaw recommended the system to Bell. After investigating the system, Bell and his partners concluded that KidTrax offered the simplicity, flexibility, and data accuracy they needed.

By fall 2003, about 25 afterschool programs—including churches, community schools, and Boys & Girls Clubs—were using KidTrax and specially developed “middleware” to link afterschool and school data. The city has begun requiring youth-serving organizations that receive city funds for afterschool programs to adopt KidTrax, and planners expect that ultimately an estimated 300 programs will feed data into the system.

THE STRATEGY

Data Collection

KidTrax uses swipe cards to collect data on individual students and their activities. Students receive identification cards, known in Louisville as “city cards,” that contain unique barcodes. They present the cards to electronic scanners as they enter and exit their afterschool program each day. The data collected automatically by the scanners are housed in a database on the school system’s server.
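The scan-in/scan-out flow might be modeled along these lines; the data model and function names are illustrative, not KidTrax’s actual design:

```python
# Sketch of a swipe-card attendance log: each card carries a unique barcode,
# and the scanner records an entry and an exit timestamp per day.
from datetime import datetime

attendance_log = {}   # (barcode, date) -> {"in": timestamp, "out": timestamp}

def record_scan(barcode, ts):
    key = (barcode, ts.date())
    rec = attendance_log.setdefault(key, {"in": None, "out": None})
    if rec["in"] is None:
        rec["in"] = ts     # first scan of the day = entry
    else:
        rec["out"] = ts    # a later scan = exit
    return rec

record_scan("CITY-0042", datetime(2003, 10, 6, 15, 2))
rec = record_scan("CITY-0042", datetime(2003, 10, 6, 17, 45))
print(rec["out"].hour)  # 17
```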

The afterschool programs’ database includes each student’s name, birthdate, emergency contact information, grade in school, afterschool attendance record, sex, race/ethnicity, and annual household income. Using “middleware” created for JCPS, programs can download other information on students from the school system’s database, including a student’s grade point average, test scores, school attendance, and suspensions and disciplinary referrals.

Developing the Data System

JCPS leaders’ interest in afterschool attendance data was sparked in part by a national conference on afterschool programs convened in Denver, Colorado by the Academy for Educational Development. Bell and others from Louisville who attended the conference were asked to identify the afterschool outcomes they would like to measure. After a day and a half discussing the topic with representatives from city and county government, the private sector, and the school system, the Louisville team came up with a handful of outcomes that all afterschool programs ought to be focused on, including improved school attendance, self-discipline, school performance, and connections to a caring adult.

“We came back from that conference and put together a working group to figure out how to measure the outcomes,” Bell recalls. Key participants included Bell and Shaw, the president of the Urban League, the city’s deputy mayor, and the president of Metro United Way.

One of the first things team members realized was that they needed a centralized, easily accessible place for the data. The school system’s data warehouse seemed like a good option, because it already contained all the data on student achievement.

The next question was how to match data on afterschool participation with school data to see if school outcomes were affected. “We had to know who was participating in these programs,” Bell says. “We invited other programs in to look at our indicators and outcomes, and we offered to make our data available to them in exchange.” Then the work group began looking for software that could match the data from multiple databases and ensure they all pertained to the same children.

Work group members began talking to United Parcel Service administrators, on the assumption that “if you could track a package you could track a kid.” That piqued their interest in electronic scanning methods, which ultimately led to the selection of KidTrax.

The school system, which was footing the bill for software development, began looking for someone to develop middleware linking KidTrax with the school database. After balking at estimates of $300,000 to $400,000, planners learned from a UPS contact that the computer applications class at the University of Louisville sometimes took on special projects at no charge. Two professors agreed to lead the software development project, and soon students were meeting with representatives of community organizations and agencies to understand the system requirements. Then the aspiring programmers divided into four software “companies,” each of which would develop software and try to “sell” the database work group on their approach. The university students’ culminating project involved producing a skeleton program and a marketing plan. After hearing the presentations, the work group selected one of the plans (and ultimately hired one of its developers to the school district’s full-time technology resource staff).

The middleware created by the university students is a point-and-click system called the Connectedness Analysis Reporting System (CARS). It enables users to query the schools’ database and present the results in a useful format. For about $4,000, KidTrax developer nFocus Software wrote a bridge between CARS and the school district’s data warehouse.

JCPS piloted the new system at six Boys & Girls Clubs and six community schools. After ironing out a few glitches in uploading data from the sites to the school database, they expanded the system to all youth-serving organizations that had purchased KidTrax software and scanners.

Using Data

This system is used for analyses at many levels: of individual students, groups of students, individual programs, and across programs. Findings are used to identify patterns and to trigger interventions. A student whose grades are slipping and afterschool attendance is erratic may get a home visit. Or the Salvation Army’s Don Shaw might “drill down” through the middleware to learn “how many kids had attendance of 90 percent or more for the year at a specific Boys & Girls club or at a certain grade level.”
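A drill-down like Shaw’s could be sketched as a simple filter; the record layout and the site names are illustrative, not the actual CARS/KidTrax schema:

```python
# Sketch: count members of one site whose yearly afterschool attendance
# rate is at or above a threshold (90% in Shaw's example).
records = [
    {"student": "A", "site": "Shawnee Club", "days_attended": 170, "days_open": 180},
    {"student": "B", "site": "Shawnee Club", "days_attended": 120, "days_open": 180},
    {"student": "C", "site": "Portland Club", "days_attended": 175, "days_open": 180},
]

def high_attenders(records, site, threshold=0.90):
    return [r["student"] for r in records
            if r["site"] == site
            and r["days_attended"] / r["days_open"] >= threshold]

print(high_attenders(records, "Shawnee Club"))  # ['A']
```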

One of Shaw’s recent analyses revealed that reading scores were slipping among eleventh-graders at one site. Shaw urged the site director to implement a daily tutoring program in reading skills, and the students’ grades have picked up. “Those eleventh-graders could have gone on and on, and we would never have known about their poor attendance,” Shaw notes. “[This system] allows us to do so much at the organizational level to correct problems before they get so bad you can’t correct them, because I could check the data daily if I wanted to.”

COSTS

Participating organizations must purchase a KidTrax software license and scanner. (The school system underwrites those costs for afterschool sites located in public schools.) The KidTrax hardware, software license, and technical support cost about $2,200 to $2,500 per site. Every site also needs computer and Internet access, although the system was designed to operate with a connection as slow as a 56K dial-up. JCPS does not charge users for CARS or for access to the public schools’ database.

Of the estimated 300 youth-serving agencies in Louisville, about half are very small organizations with budgets of an estimated $30,000 to $40,000 per year. For those that must purchase a computer in addition to the software and scanner, the cost of participating in this system may be prohibitive.

The use of swipe cards does make some procedures less time-consuming and therefore less costly, however. When a student enrolls in afterschool activities, program staff no longer need to conduct a lengthy intake process. The person entering data simply types in the student’s name and the relevant information appears onscreen.

ISSUES AND CHALLENGES

Scanner logistics. The scanning method of attendance-taking requires that all participants flow through a single point of entry and exit, unless the facility owns more than one scanner. That wasn’t a problem for Boys & Girls Clubs, which are set up with one front desk, but schools had to designate one scanning spot and get all participants to use it properly.

Training. Many frontline staff of community organizations lack computer savvy—some community school coordinators, for instance, had never turned on a computer—and had to be taught how to oversee the card scanning process. KidTrax provided special training on the system for school system and community organization staff, and one of the students who developed CARS trained users on the middleware application.

The shock of true numbers. Many nonprofit organizations rightfully fear that accurate attendance data may reveal they’re not serving as large a population as everyone thought they were.

“Our organization has been around about 65 years and we’ve always thought we’re doing a good job, but this thing was going to be pretty darn accurate. There was going to be no way to fudge numbers!” recalls Shaw. His program, which had been estimated to serve about 5,300 children, turned out to reach about 4,800 after the more accurate data collection system eliminated duplicated counts.

Organizational confidentiality. JCPS addressed the community organizations’ concerns about confidentiality by ensuring that users have access only to their own program’s data unless the data are aggregated to the district level. Unlike other jurisdictions, the JCPS school system was not reluctant to share its data broadly. “Our superintendent is very data driven, and he opened the doors,” says Shaw. And after the city bought into the idea of using KidTrax in every afterschool program, other leaders have come on board, too.

Privacy of individual students. Using a universal card to collect data on all of a child’s activities—the YMCA, Boy Scouts, school, clubs—has overtones of Big Brother, Shaw concedes. JCPS overcame that perception by allowing parents to specify if they don’t want data analysts to use data on their child except at the aggregate level. Very few—perhaps one in a thousand—do so. (On the other hand, Shaw sometimes gets requests from parents who want him to examine data on their child, especially if the child seems troubled at home or in school.)

The system is also password-protected, and the director of each organization decides which of his employees can have access to the password. Each community organization has to sign an agreement that it will use the data only for its own purposes, will not give unauthorized people access to individual-level data, and will not misuse the data.

Bugs in the system. Opening up the school system’s database has inevitably made the system more vulnerable to electronic attacks. “All the computer viruses and worms out there have impacted our ability to operate for the last two months,” notes Bell. “We’ve had to shut down and open up many times, and that affects our ability to provide a smooth data flow.”

OBSERVATIONS AND LESSONS

1 The openness and uniformity of Louisville’s school system smoothed the way for successful data sharing.

JCPS is described by its partners as decidedly unbureaucratic and willing to collaborate. The fact that the entire city and county is represented by only one school district is also an advantage, because it means there are fewer data systems to align and fewer turf issues to resolve when it comes to linking and sharing data.

2 Good data tracking systems don’t happen overnight.

Planners have to stay committed to thegoal of managing data and keep pushing


“Data can be used for good or evil, and you always worry about someone misusing or misconstruing the data. But if you are working together to benefit kids, the data will not be misused.”
—Martin Bell, Deputy Superintendent, Jefferson County Public Schools

“We’re getting over [privacy] hurdles; people are seeing if you use outcome data as a tool to improve…the funders stick with you.”
—Don Shaw, Executive Director, Salvation Army Boys & Girls Clubs of Louisville


toward it. That often means keeping key players motivated to stay at the table.

3 Partners must be open and honest with each other.

At first, some community organizations were reluctant to collect data on outcomes that had potential to show they weren’t making a difference. “You have to approach this work by convincing people they need data to strengthen their programs and to [communicate with] funders,” Bell advises.

Tracking Afterschool “Encounters” Across an Entire City
DEPARTMENT OF CHILDREN, YOUTH AND FAMILIES, SAN FRANCISCO

CONTEXT

The San Francisco Department of Children, Youth and Families (DCYF) funds about 200 community programs that provide child care, afterschool activities, education, youth employment, and many other services to children from birth through age 17. Its grantees range from small neighborhood-based organizations to the YMCA, Boys & Girls Clubs, Beacon Initiative, and a school-based literacy program operated by community organizations in 11 neighborhoods. The amount of money DCYF gives to each grantee is based in part on the number of children, youth, or families participating in the program.

DCYF is a city department but it operates more like a public foundation. Its $45 million budget comes from a property tax set-aside and from the city’s and county’s general fund allocations. Ninety-three percent of the money flows through DCYF to community-based organizations, and about $3 million goes to other city departments that create programs for children and youth, such as the library bookmobile and satellite health clinics that work only with youth.

In 2001, DCYF commissioned an outcome evaluation of the department from Research Development Associates, and the evaluator suggested that DCYF needed a better way of tracking child-level data. The system would do double duty as a repository for programs’ intake assessment tools and for the evaluation’s consumer satisfaction survey and staff and parent surveys. And the system would help DCYF fulfill its obligation to base funding decisions on evaluation data.

In 2002, DCYF began working on a Web-based, citywide data system with Mark Min of CitySpan Technologies, who three years earlier had designed the department’s system for contract management, invoicing, and outcomes reporting. Almost 180 of 220 DCYF grantees now use the database, and all are expected to be online by early 2004.

THE STRATEGY

Data Collection

The Web-based system tracks every encounter with every child served by every program funded by DCYF. Data elements for participants include:

■ ZIP Code and address

■ Ethnicity/race (with 18 options to choose from, including all Central American ethnicities)

■ Age

■ Type of service received, organized into 30 specified categories

■ Special needs (e.g., eligibility for TANF reimbursement, parenting teen, homelessness)

■ Length of service per encounter

■ Intake and dropout assessment

■ Consumer satisfaction survey

■ Parent/caregiver survey

■ Program staff survey

The system was also designed to accommodate other data that individual programs want to track for their own purposes, although DCYF is just beginning to train grantees on that option.

Program staff at each site collect and enter the data using a computer with Internet access. To reduce the burden on sites, DCYF allows people to enter encounter data either individually or for an entire group of students who all receive the same service. Sites can also uncheck absent students rather than checking off every student who is present. Some multi-site initiatives consolidate their data and enter it for all of their sites; others have each site submit data separately.
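The “uncheck the absent” entry style described above can be sketched as follows: start from the full group roster marked present, then remove only the absentees. The function and field names are illustrative, not the actual DCYF system:

```python
# Sketch of group-level encounter entry: everyone on the roster is presumed
# present except the students who were unchecked as absent.
def group_attendance(roster, absentees, service, date):
    present = set(roster) - set(absentees)
    return [{"student": s, "service": service, "date": date}
            for s in sorted(present)]

encounters = group_attendance(
    roster=["Ana", "Ben", "Chris", "Dara"],
    absentees=["Ben"],
    service="academic enrichment",
    date="2003-10-06",
)
print([e["student"] for e in encounters])  # ['Ana', 'Chris', 'Dara']
```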

Developing the Data System

Previous work for DCYF had familiarized Mark Min with the department and its grantees. He had visited many program sites, and he understood both their data needs and their limitations. Min worked closely with a program officer from DCYF and evaluators at Research Development Associates, who helped to design the data tracking components that were relevant for their work.

Using Data

DCYF uses the intake and encounter data for its evaluation and annual report. Staff also use the data to ascertain the number of children served by DCYF grantees, which is one of the agency’s performance measures for local government funding, and for program management purposes.

As soon as the data are entered, an email goes automatically to the appropriate DCYF program officer, reminding him or her to check the data against the site’s invoice (at an aggregate level, for confidentiality reasons). The data validation process is rigorous. In order for a child to appear on an invoice, data on the child and the staff who serve him or her have to meet a standard that has 40 different requirements. If a child is enrolled in a program whose staff data are missing a required element (e.g., the instructor’s primary language), the service is considered invalid and the child’s encounter will not be counted. “It may sound extreme, but we’re developing a dataset for research evaluation, and all the pieces have to be there for the research to work properly,” says Min.
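The validation rule just described might be sketched like this; the two-field requirement lists stand in for the 40 actual requirements, which the text does not enumerate:

```python
# Sketch: an encounter is counted on an invoice only if both the child's
# record and the serving staff member's record pass every required check.
REQUIRED_CHILD_FIELDS = ["name", "birthdate"]         # illustrative subset
REQUIRED_STAFF_FIELDS = ["name", "primary_language"]  # illustrative subset

def encounter_is_valid(child, staff):
    child_ok = all(child.get(f) for f in REQUIRED_CHILD_FIELDS)
    staff_ok = all(staff.get(f) for f in REQUIRED_STAFF_FIELDS)
    return child_ok and staff_ok

child = {"name": "Ana", "birthdate": "1994-03-12"}
staff_complete = {"name": "Mr. Lee", "primary_language": "Cantonese"}
staff_incomplete = {"name": "Mr. Lee"}  # missing primary language

print(encounter_is_valid(child, staff_complete))    # True
print(encounter_is_valid(child, staff_incomplete))  # False
```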

After checking the invoice, the program officer sends it to DCYF’s fiscal staff for processing. Meanwhile, Min downloads the data into easily digestible monthly reports. DCYF staff and evaluators examine the data at the aggregate level and by grantee organization. “Whenever we do these runs, we find some incongruencies in the data—missing items or erroneous entries,” says Nani Coloretti, DCYF’s Director of Planning and Budget. A Data Working Group, composed of DCYF program officers, the database project manager, and a member of the evaluation team, meets weekly to monitor data quality and troubleshoot problems.

The “incongruencies” have raised some programmatic issues for DCYF. For example, will the department allow a grantee to submit an invoice if all the encounter data haven’t been entered? DCYF staff don’t want to undermine either the quality of their data or the solvency of their grantees. Thus, they compromised by deciding that a program must show a minimum of 15 encounters per program, per month, in order to submit an invoice. “We set the bar really low because we didn’t want to cause cash flow problems for the sites, and we have other ways of monitoring quality through site visits and our technical assistance providers,” Coloretti explains.

COSTS

DCYF’s encounter database was built onto an existing system, and the cost of developing the new system was wrapped into DCYF’s existing contracts with CitySpan Technologies and Research Development Associates. Those factors make it difficult to determine the exact cost of the encounter database, but it is likely more than $100,000.

ISSUES AND CHALLENGES

Technology requirements. Although DCYF’s encounter database is highly automated and easy to use, many of the department’s grantees are very small service providers; their only access to a computer may be at the parent organization’s central office, not at the site where services are actually delivered. And frontline staff often require very basic computer training before they feel comfortable using the system. Database developers spent hours discussing whether DCYF should require small programs that lacked technology to participate in data collection and what tools the department would need to provide to them.

Lower-than-expected participation figures. Analysis of DCYF’s encounter data instantly revealed that the total number of children and youth served by grantees is far fewer than was thought—about 18,000, compared with previous estimates of 50,000. DCYF leaders aren’t sure what to make of the discrepancy at this point. Does it mean that sites that receive funding from many sources are only reporting the children they think DCYF is paying for? Are they serving fewer children overall than reported previously? Or is it just taking a long time to get all participants entered into the system?

The system developer attributes the lower counts to several factors. First, DCYF’s grantees are still adjusting to the new system’s more strenuous data collection requirements. Programs that serve children in many different ways may be successfully capturing some, but not all, participants. As the DCYF grantees develop better methods for documenting all of their services, the participation numbers will likely go back up. Second, before the new system was put in place grantees often included informal, temporary contact with children—such as one-time presentations at schools—in their record of service encounters. Those contacts, however, did not generate the full set of data needed to qualify a participant as “valid” in the new data system (e.g., intake forms, attendance records). The exclusion of informal service recipients from the database produced a significant drop in participation figures.

DCYF now plans to develop two counts: one that captures “fully documented services” and a separate one for “partially documented” services. Combined, these two counts are expected to bring overall service numbers back up to 50,000.

Analysis of costs per encounter. Research Development Associates provides DCYF with data on the cost per encounter, but those results are considered very controversial. It is impossible to tell from the data whether programs are incorporating money from other sources into their services. It is also inaccurate to compare encounters solely on the basis of the amount of time invested in them, Coloretti notes.

Confidentiality. DCYF’s database was originally intended for invoicing purposes and to make the department more efficient; it was not conceived as a repository for personal data on individual participants. As the system’s role has expanded, DCYF has had to formulate rules about who sees what data and when, to protect children’s confidentiality. DCYF staff are not allowed to view individual-level data, and data are scrambled when transmitted electronically.

“We had to weigh the cost to each program of entering the data against the potential benefit we’d get from the evaluation. What we did not have a chance to do, and hope to do in our next round of grantmaking, is get a user group to comment on [the system].”
—Nani Coloretti, Director of Planning and Budget, Department of Children, Youth and Families

OBSERVATIONS AND LESSONS

1 The only way to ensure that sites submit data is to link data collection to invoice submissions.

DCYF neglected to link intake assessments to invoicing, and it now appears that some of this vital baseline data may be missing. The department now has to decide whether to “defund” the agencies that failed to collect the data.

2 A mechanism for addressing data collection issues is necessary.

DCYF’s evaluators and programmer have visited grantees to help them set up data collection tools, and the department has a heavily used online “Issue Tracker.” People with questions about the system—usually DCYF program officers who have been contacted by grantees—use the Issue Tracker to seek help and to review responses given to others with similar problems.

3 Analysis of data helps system leaders define the scope, reach, and quality of services.

DCYF’s analysis of program data has underscored the differences among services and their implications for both programming and evaluation. For example, the department supports a youth-to-youth peer talk line that receives 10,000 calls per year. That service reports a very different encounter history than a program that serves the same 100 children every day but may provide equally vital services. “We’re starting to refine our definitions for what we’re counting [as encounters], and we’ve created a category for education, outreach, and talk line [encounters] that is separate from other services,” Coloretti says.

Furthermore, the data show that DCYF-funded programs are helping many 18- and 19-year-olds, yet the department’s mandate is only to serve youth through age 17. DCYF program officers are now trying to learn how many of the older participants are actually parenting teens, whose services are covered by DCYF’s scope.

Finally, the encounter data have confirmed that the racial/ethnic composition of staff in sites funded by DCYF does not match the race/ethnicity of program clients. “That [dimension] may not be important to young children, but it’s important to youth to see people of the same ethnicity in leadership positions,” Coloretti notes. “I don’t know that it will change right away, but I think the programs will be interested to know what we’ve found.”

Collecting Afterschool Attendance and “Dosage” Data for an Evaluation
THE BEACON INITIATIVE

CONTEXT

The San Francisco Beacon Initiative, which began in 1996, is based on the Beacon model developed in New York City. Each school-based Beacon center is operated by a “lead agency”—a community-based or other nonprofit organization—that has deep roots in the community and neighborhood. A specially created oversight organization raises funds, promotes quality standards, convenes site directors and lead agency staff periodically for peer learning, and manages an evaluation.

The centers serve students in kindergarten through twelfth grade and their family members. Each center plans and implements activities independently based on its clients’ needs and interests, but all centers address five core competencies of youth development: health; arts and recreation; leadership; career development; and educational support, which encompasses such activities as academic enrichment, homework help, tutoring, and English as a Second Language.

In 2003, there were eight Beacons in San Francisco: one at an elementary school, one at a high school, and six in middle schools. Each center must serve a minimum of 600 youths per year; collectively, the centers served 5,370 youth and 1,391 adults in 2002-03. Among those participants were large populations of Chinese, Filipino, other Asian, Latino, African American, and White residents.

This profile features two Beacon centers whose data experiences illustrate both the differences that exist in a multi-site initiative and the commonalities of data users in all sites:

The Sunset Neighborhood Beacon Center (SNBC) provides structured afterschool programs at five locations (four schools and one storefront) on the west side of San Francisco. Its services, including afterschool programming, case management, youth leadership, community media production, and mental health services, reached 1,370 children and youth in 2002. The school-based afterschool programs had a total of about 550 participants who attended five days per week. SNBC Director Michael Funk is an experienced data user who also co-chairs the state’s afterschool advisory program evaluation committee.

Chinatown Beacon Center (CBC) is operated by Wu Yee Children’s Services, an organization that has a long history of providing child, youth, and family services to bi- or monolingual Chinese residents of Chinatown, many of whom are recent immigrants. CBC is the only Beacon center located in an elementary school. Its focus is on services to children between kindergarten and eighth grade and child care for infants, toddlers, and preschoolers, with a special emphasis on English language development. It serves more than 700 children and 250 adults per year. Activities for adults include English as a Second Language, computer literacy, Web design, and Web-based parent education.

Data tracking was part of the San Francisco Beacon Initiative from the program’s inception, primarily as a tool to collect data for the evaluation by Public/Private Ventures, which sought to link students’ afterschool participation rates and dosage levels to school outcomes.

THE STRATEGY

The San Francisco Beacon Initiative hired programmer and software designer Mark Min (CitySpan Technologies) to develop a database that Beacon centers use to track data on students’ attendance (by activity) and demographic characteristics. An arrangement with the school district also enables the Beacon data system to collect data on students’ school attendance and performance (although the depth of school data varies by Beacon center according to the relationship between the center’s director and the principal of the host school).

Data Collection
The required data elements in the system Min designed are relatively simple: intake information, including student’s age, address, emergency contact, home school, race/ethnicity, first language; and participation information, including dates of attendance and activities attended. There are also some optional data fields, such as eligibility for public assistance. Evaluation surveys administered to students, staff, and family members can cross-reference data in the system but aren’t entered into the system itself.

Parents or caregivers complete an intake form for each child upon entry into the program; because the form contains parental consents, it requires a caregiver’s signature. A Beacon staff person enters the intake information into the database using a computer connected to the Internet.

After students sign up for activities, the system generates rosters for each class that are used to take attendance. At Sunset Neighborhood Beacon Centers, staff use the rosters as sign-in sheets; at Chinatown Beacon Center, instructors check off the students as present or absent. Someone at the center or the lead agency’s headquarters then enters the data for each class into the Web-based system. To speed the process, data entry staff can choose an “all present” category to indicate that there were no absences.
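The “all present” shortcut amounts to treating presence as the default, so that staff record only the exceptions. A small sketch of that idea follows; the names and structures are hypothetical, not the Beacon system’s actual interface.

```python
# Sketch of roster-based attendance with presence as the default.
# Roster names and the function shape are illustrative assumptions.
def take_attendance(roster, absentees=None):
    """Mark everyone present by default; only absentees need to be listed."""
    absent = set(absentees or [])
    return {student: (student not in absent) for student in roster}

roster = ["Ana", "Ben", "Chen", "Dara"]

# "All present" shortcut: no absentees listed, one entry covers the class.
print(take_attendance(roster))
# Typical day: staff record only the absent students.
print(take_attendance(roster, absentees=["Ben"]))
```

The design choice mirrors the text: when most students attend most days, recording exceptions is far less work than checking off every student who is present.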

Some centers augment the attendance tracking system with other informal methods. The Chinatown Beacon Center, for instance, holds a monthly community dinner that attracts about 250 residents. Staff use a handheld “clicker” to gauge the number of diners so they can identify changes in attendance patterns.

Development of the Data System
The system developer worked closely with leaders of the San Francisco Beacon Initiative and evaluators from Public/Private Ventures to identify the data elements and reporting formats the system needed to encompass to serve the data needs of evaluators, program officers, and center directors. Public/Private Ventures negotiated with the school district to get access to students’ school records, which enabled the evaluation to compare Beacon participants to children who didn’t attend the program and to ascertain the effects produced by different levels of exposure to Beacon activities.

The system developed for the Beacon Initiative eventually became the pilot data system for all afterschool youth programs funded by the city’s Department of Children, Youth and Families (DCYF). DCYF subsequently contracted with Min to upgrade the system, making it more comprehensive and making data retrieval more flexible. In 2003, rather than similarly upgrading the original system, leaders of the Beacon Initiative decided to “migrate” their data into the DCYF system.

Using the DataSNBC Di rector Funk uses the data sys-tem to generate re p o rts on how manyc h i l d ren are invo l ved in a specific cate-g o ry of activity, often filtering the analy-sis through a demographic va r i a b l e .Thus he might examine the number ofLatino girls invo l ved in youth leadershipactivities, or the number of third - g r a d eb oys engaged in arts and re c reation. Healso compares the center’s average dailyattendance rate from month to month,and compares the demographic bre a k-d own of SNBC participants with thedemographic characteristics of the com-munity at large, to see if the program isattracting its target population.

Chinatown Beacon Director Ben Wong uses the system to generate monthly reports that help him track changes in the program; his staff use it mostly to retrieve students’ emergency contact information or to get addresses of family members they want to notify of upcoming events. “It is important for us to know what attendance looks like, globally, so when someone—especially a funder—asks how many kids we see a year, we know,” Wong explains. “We also calculate average daily attendance because one of our goals is to improve the frequency of young people’s attendance.” Understanding that interactions with caring adults are a factor in healthy youth development, Wong looks to see if the average daily attendance for youths is increasing. “Ideally, we’d like to be able to follow up with kids who haven’t shown up for a while,” he says.

Wong finds it more useful to review average daily attendance than to look at total numbers of children served, because he’s looking for patterns in attendance—especially increases in the proportion of children who come to the center on a regular basis. He also looks for attendance patterns by children’s sex, ZIP Code, and age. “[Those data] help me in giving accurate descriptions of what this agency is, what it does, what it looks and feels like. It helps in our grant proposals, and definitely in outreach to families,” Wong says. The data also give Wong early warning on potential service gaps. An influx of 14-year-old participants, for instance, indicates that planners had better prepare for next year’s 15-year-olds.
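The two measures Wong weighs, an unduplicated count of children served versus average daily attendance, answer different questions, which a short sketch on invented daily rosters can make concrete:

```python
# Invented data: one set of attendee IDs per program day.
daily_rosters = [
    {"k01", "k02", "k03"},
    {"k01", "k02"},
    {"k01", "k03", "k04"},
]

# Unduplicated count: how many distinct children ever attended.
total_served = len(set().union(*daily_rosters))

# Average daily attendance: how full the program is on a typical day.
avg_daily = sum(len(day) for day in daily_rosters) / len(daily_rosters)

print(total_served)  # 4 distinct children
print(avg_daily)
```

Here four children were served, but average daily attendance is under three; a rising average with a flat unduplicated count is exactly the pattern of regular attendance Wong looks for.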

Funk likes to examine attendance data by activity. Although the system captures participation by activity, data on dosage levels per activity are not easy to retrieve through a simple query. For example, the SNBC middle-school program offers project-based activities, such as filmmaking, from 4 to 5 p.m. every day. If the database indicates that a child attended on Tuesday during project-based activities, one can assume the child received one hour of dosage, but the exact amount of time the child was present is not in the system. (This issue may be resolved as the Beacon Initiative begins using the more up-to-date DCYF data system.)
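The dosage assumption described here, crediting each attendance with the activity’s full scheduled duration because exact in and out times are not stored, might be sketched as follows. The activity names and hours are invented for illustration.

```python
# Dosage estimated from scheduled duration, since exact time present
# is not recorded. Activity names and hours are illustrative assumptions.
SCHEDULED_HOURS = {"project-based": 1.0, "homework help": 1.5}

def estimated_dosage(attendance_log):
    """Sum the scheduled hours for each (day, activity) a child attended."""
    return sum(SCHEDULED_HOURS[activity] for _, activity in attendance_log)

log = [("Tue", "project-based"), ("Wed", "project-based"), ("Wed", "homework help")]
print(estimated_dosage(log))  # 3.5 hours credited
```

The sketch makes the limitation visible: a child who stayed ten minutes and a child who stayed the full hour receive the same dosage credit, which is why exact in/out times would improve the evaluation.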

COSTS

The main cost to Beacon centers is for data entry. Each of the centers profiled here has one part-time staff person dedicated to data entry. Funk’s assistant puts in about 35 hours a week to enter data from five sites; Wong’s assistant spends 15 to 20 hours per week and earns about $12 an hour. “Getting everyone to fill out registration forms and attendance sheets and fill in missing data takes a lot of time but the burden is spread around to many instructors,” Wong notes.

ISSUES AND CHALLENGES

Understanding how to use the system. Users of any data system need to understand how each data element is defined and how to read the reports generated from queries. “You may print a Unit of Service Report and know you’ve entered the data one way, but when you print it out it doesn’t look the way you expected,” explains Wong. “For example, a class may not print as a class unless you’ve entered information on the instructor and time of day. Or you may have completed that portion of the screen but it doesn’t print out right. And the word ‘unduplicated’—does it mean for this class, for this site, for this month, or for this year? There’s a lot of learning how to use the system to make it do what you want.”
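Wong’s point about the word “unduplicated” can be made concrete: the same records yield different counts depending on the scope of deduplication. A hypothetical sketch on invented records:

```python
# Each record is (student_id, site, month); the data are invented.
records = [
    ("k01", "sunset", "Jan"), ("k01", "sunset", "Feb"),
    ("k01", "storefront", "Jan"), ("k02", "sunset", "Jan"),
]

def unduplicated(records, key):
    """Count distinct values of `key` across the records."""
    return len({key(r) for r in records})

# Unduplicated for the year: students k01 and k02, so 2.
print(unduplicated(records, key=lambda r: r[0]))
# Summed per-site unduplicated counts: k01 appears at two sites, so 3.
print(unduplicated(records, key=lambda r: (r[0], r[1])))
# Summed per-month unduplicated counts: k01 appears in two months, so 3.
print(unduplicated(records, key=lambda r: (r[0], r[2])))
```

Three different numbers from four records is why a report labeled simply “unduplicated” invites misreading unless its scope is stated.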

Ensuring that data are entered correctly. Missing data can cause the system to generate information that users might misinterpret. Centralizing all data entry with one person helps to build proficiency with the system and reduce the number of errors, Wong advised.

Two-stage data collection. Like many systems featured in this report, the Beacon Initiative collects data in two stages: by hand on the front lines of service, followed by entry into an electronic format. At sites that serve more than 600 participants a day, entering every record twice is burdensome. Funk’s centers have considered converting to a wireless system (using handheld or tablet computers) that would eliminate the second step, which he believes would save enough money to justify the cost of the new system.

Tracking unusual services. Some of the most intensive services that Beacons provide, especially those involving mental health, are the most difficult to track. “For kids who are in structured clubs or projects, there’s a predictable rhythm to services,” notes Funk. “But for a young person in case management, whose case manager hangs out with the kids [in informal settings], it’s harder to capture.” Even though the Beacon Initiative designed a password-protected area of the database for storing records that are considered confidential, the problem of how to reliably collect those data remains unsolved.

Extracting data from the system. Funk, who has more technological savvy than the average system user, has been frustrated by his efforts to use the Beacon system for strategic planning. He often wants to formulate his own queries rather than relying on the packaged reporting formats—to learn how many participants in the afterschool DJ club are also involved in case management through the juvenile justice system, for example. As the Beacon system migrates to the more modern DCYF system, this problem may be resolved.

OBSERVATIONS AND LESSONS

1 People need an incentive to collect and enter data.

Everyone agrees that people need a significant reason to collect and turn in attendance data or they won’t make the effort. Funk stipulates that his program instructors have to turn in their attendance sheets before they can be paid. At the program level, the state of California recently acted to tie funding for afterschool education and safety programs to average daily attendance, at least for elementary and middle schools.

“It’s really important for organizations to be able to measure their impact—it’s the key to being sustainable. But it can be very intimidating for nonprofits to think about developing [a data tracking system]. For people drawn into the nonprofit field, it’s not typically their forte.”
—Virginia Witt, Executive Director, San Francisco Beacon Initiative

2 Web-based systems are easy to access and use.

Users of the Beacon system generally agree that it is easy to use, and because it is Web-based it can be accessed from a variety of convenient locations. “If we’re behind in entering data, I can put three people on the job in the computer lab, or someone can work on it from home,” Funk says. “When it comes time for software upgrades, the process is all centralized. And when evaluators access the data, we don’t have to do anything [at individual sites].”

3 A data partnership with the school district is crucial if you want to link attendance to educational outcomes.

Afterschool programs that have an educational component should collaborate with the local school district(s) on data collection and sharing, so that both entities understand students’ needs and outcomes and so afterschool providers have all of the tools they need to support what happens in school.

4 One size does not fit all when it comes to data systems.

The Beacon Initiative is complex, because the centers offer an array of activities and have a large number of stakeholders who want to know about results in their area of interest. Thus the Beacon system has to collect a broad menu of data. Other programs may have much more limited data needs. “[Designing a system] is about what you and your stakeholders need to know to run programs more effectively,” says initiative director Virginia Witt.

“Systems have an impact on everything around them,” adds Wong. As the Beacons move their data onto the DCYF system, for example, they will need to expand data collection to satisfy that system’s required data elements. “We don’t have a structure for tracking interactions with participants every 15 minutes in the Beacons,” Wong says. “That [requirement] in effect changes our [data collection] forms, because we’ve designed them around what’s capturable and storable [in our current system].”

5 The people who are directly responsible for data collection should help design the system.

Frontline program staff can give system designers a realistic sense of what data can and can’t be collected—and a more comprehensive perspective on participants’ experiences. Data experts “want to know that a kid moved from tutoring to basketball to arts” over the course of a day, notes Wong. “Program people don’t think that way; we try to think more holistically. In open environments like a Beacon center, there may be many things going on and a young person may choose to go to several activities for five or 10 minutes or half an hour. Life doesn’t happen in 15-minute, linear increments.”


RESEARCH AND INFORMATION

Afterschool Alliance
1616 H St. NW
Washington, DC 20006
(202) 347-1022
www.afterschoolalliance.org

Afterschool.gov
Room 7104
1800 F St. NW
Washington, DC 20405
(202) 208-1309
www.afterschool.gov

Fight Crime: Invest in Kids
Suite 240
2000 P St. NW
Washington, DC 20036
(202) 776-0027
www.fightcrime.org

Harvard Family Research Project
Harvard Graduate School of Education
3 Garden St.
Cambridge, MA 02138
(617) 495-9108
www.gse.harvard.edu

National Association of Elementary School Principals (NAESP)
1615 Duke St.
Alexandria, VA 22314
(800) 386-2377
www.naesp.org

National Institute on Out-Of-School Time
106 Central St.
Wellesley, MA 02481
(781) 283-2547
www.niost.org

National PTA
Suite 2100
330 N. Wabash Ave.
Chicago, IL 60611
(800) 307-4782
www.pta.org

Wisconsin Center for Education Research
School of Education
University of Wisconsin-Madison
Suite 785
1025 West Johnson St.
Madison, WI 53706
(608) 263-4200
www.wcer.wisc.edu

ATTENDANCE TRACKING SYSTEMS

CitySpan Technologies (YouthServices.net)
2437 Durant Ave.
Suite 206
Berkeley, CA 94704
(510) [email protected]

nFocus Software (KidTrax)
2400 E. Arizona Biltmore Circle
Building One, Suite 1170
Phoenix, AZ 85016
(602) 954-9557
www.nFocus.com

Quality School Portfolio (QSP)
National Center for Research on Evaluation, Standards, and Student Testing
Margaret Heritage, Program Director
University of California-Los Angeles
(310) [email protected]


ACKNOWLEDGMENTS

THIS PROJECT WAS CONCEIVED AND DIRECTED BY Carol Glazer, Senior Program Consultant to the After School Project. Elizabeth Reisner of Policy Studies Associates (PSA) helped to shape the content and identify people and programs with useful experiences to share. Floyd Morris of The Robert Wood Johnson Foundation increased the project’s depth and breadth by urging the author to include information on citywide data systems and their special challenges. JoAnne Vellardita of the After School Project provided valuable guidance on the guide book’s scope and tone. Lisa Weiner of PSA helped with the research, and PSA staff members Richard White, Christina Russell, Elizabeth Reisner, Ellen Pechman, Carolyn Marzke, and Lara Fabiano shared the knowledge they have accumulated from evaluating afterschool programs.

This guide book benefited from the insights of many people whose varied roles converge around data collection and use: afterschool program directors and staff, evaluators, representatives of city government and community-based organizations, and developers of data tracking systems and software. They were all generous with their time and willing to speak candidly in the hope that their experiences might help others navigate the challenging, but rewarding, realm of student attendance data. (For a full list of names and affiliations, please see Appendix A.)

Special thanks are due to the representatives of the After School Project sites interviewed for this project: Rachel Baker of Team-Up for Youth, Debra McLaughlin of Boston’s After-School for All Partnership, Lisa Jackson of Building Boston’s After-School Enterprise, and Nancy Wachs and Rachel Klein of After School Matters.


For additional copies, please contact:
The After School Project
180 West 80th Street
New York, NY 10024
e-mail: [email protected]