Embedded Assessment for Citizen Science (EA4CS)


Transcript of Embedded Assessment for Citizen Science (EA4CS)

Page 1: Embedded Assessment for Citizen Science (EA4CS)

Embedded Assessment for Citizen Science (EA4CS)

Research Questions

From exemplary efforts
Method: CS project interviews & document review

From evaluators & researchers

Method: focus groups

From scientists
Method: Survey

From CS field
Method: Literature review & CS leader survey

What are...
• Targeted skills?
• Training on these skills?
• Assessment of these skills?
• Data validation efforts?
• Strategies for underserved?

Methods - Forest View

Some Results and Insights

Methods - Tree View

Partners

Summary

Some Challenges

EA4CS is an AISL Pathways project that explores the use of embedded assessment (EA) to measure participant science inquiry skill development within the context of citizen science (CS) projects. This proof-of-concept project consists of two concurrent strategies to study and enhance the use of EA in these informal-science-learning settings. With a “forest view,” we are scanning the CS community to understand assessment of targeted science inquiry skills. With our “tree view,” we are conducting case studies to develop, pilot, and analyze EAs of inquiry skills across three diverse CS projects, including documenting the process. An advisory committee is providing guidance and feedback on the process and products.

Our project will result in the following:

• Deeper understanding of relevant science inquiry skills and issues around assessment of these skills within environmental-science-based CS
• Sample EA tools aligned with a subset of specific inquiry skills
• Description of the process to develop these skill-based EA tools

Our two professional target audiences are (1) CS leaders, evaluators, and researchers and (2) the informal science learning community.

• Most CS projects do not assess volunteers’ skill gains, and some have not articulated targeted skills or strategies to address these skills.

• CS projects that do not prioritize high-quality data may define science inquiry skills differently than projects that do prioritize high-quality data (e.g., emphasizing knowledge of skills over ability to perform skills).

• CS projects vary dramatically, and thus it is challenging to determine a one-size-fits-all EA solution.

• For this exploratory grant, we provided EAs by adapting existing activities of pilot CS projects; we have not explored the process of creating a unique EA measure.

EA4CS project co-PIs
• Cathlyn Stylinski (University of Maryland Center for Environmental Science)
• Rachel Becker-Klein (PEER Associates)
• Karen Peterman (Karen Peterman Consulting)

EA4CS project consultants
• Amy Grack Nelson (Science Museum of Minnesota)
• Tina Phillips (Cornell University)

EA4CS evaluation/advisory committee
• Martin Storksdieck (Oregon State University)
• Leslie Goodyear (Education Development Center)
• Chris Lysy (Westat)

Forest view: needs assessment

Tree view: case studies

• What are common and unique CS science-inquiry skills?
• How are these skills measured to document project impact?
• What are opportunities and challenges for EA of these skills?
• How does EA help CS leaders better understand project impacts on these skills?

[Diagram labels: Activities, Goals, EA; formative reflection on project’s theory of action]

The primary benefit of the EA process will be the assessment itself. However, the process has also produced unanticipated benefits; that is, developing EAs has served as a valuable reflection point for case-study sites, which will strengthen both project and evaluation design.

Page 2: Embedded Assessment for Citizen Science (EA4CS)

NSF DRL 1422099


urbanecologycenter.org

Description: National, online phenology program in which participants regularly record observations of plants and animals to generate long-term datasets

Construct: Paying close attention

Measure: A rubric is used to score an observation and journaling activity of selected plant and animal species

Description: Out-of-school-time program that works with underserved youth ages 6-18 in Milwaukee, WI

Construct: Creating/interpreting graphs and formulating conclusions

Measure: A rubric is used to score a 30-minute inquiry activity that includes graphing and describing data and conclusions

Description: Inquiry-based out-of-school science discovery program for students in grades 4-8 in Raleigh, NC

Construct: Precision in data collection

Measure: Field-based dragonfly counts are compared to those of a peer who observed the same area
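To illustrate how a peer-comparison measure like the one above could be scored, here is a minimal Python sketch that computes a simple agreement score between a participant’s dragonfly counts and a peer’s counts for the same survey area. The function name, scoring rule, and sample values are illustrative assumptions only; they are not the EA4CS instrument or its scoring scheme.

```python
def count_agreement(volunteer_counts, peer_counts):
    """Percent agreement between paired counts from the same survey area.

    For each paired observation, agreement is 1 minus the relative
    difference between the two counts (0 when one observer recorded
    nothing and the other recorded something). Returns the mean across
    pairs on a 0-100 scale. Hypothetical scoring rule for illustration.
    """
    if len(volunteer_counts) != len(peer_counts):
        raise ValueError("Counts must be paired observation-for-observation")
    scores = []
    for v, p in zip(volunteer_counts, peer_counts):
        larger = max(v, p)
        scores.append(1.0 if larger == 0 else 1.0 - abs(v - p) / larger)
    return 100.0 * sum(scores) / len(scores)


# Example: counts of three dragonfly species recorded by a student and a peer
student = [4, 0, 7]
peer = [5, 1, 6]
print(f"Agreement: {count_agreement(student, peer):.1f}%")
```

A higher agreement score would suggest greater precision in data collection, the construct this case study targets.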

Progress in the process: DEVELOP EA → VALIDATION → FIELD TEST → EA TEMPLATE

DEVELOP EA
• Identify construct
• Explore existing activities
• Refine existing or create new activities
• Create scoring

VALIDATION
• Expert review
• Revision 1
• Think-alouds
• Pilot-testing in context
• Revision 2

FIELD TEST
• Within context
• Formative analysis
• Summative analysis

High-quality data requires skilled volunteers, and efforts to validate data are related to skill gain assessments. Thus, EA might be integrated into data validation efforts.

• Evidence of skill gains (Example: Matching pH from volunteers and independent analyses)

• Data validation (Example: Split-samples, volunteer and independent analyses; see the sketch below)

• Skilled volunteers (Example: Properly sample and submit pH data)

• High-quality data (Example: Stream pH)

Tree graphic courtesy of Tracy Saxby, IAN/UMCES, ian.umces.edu/imagelibrary/
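To make the split-sample idea concrete, here is a minimal Python sketch that flags volunteer stream-pH readings that differ from an independent analysis of the same sample by more than a chosen tolerance. The function name, the 0.2-unit tolerance, and the sample values are assumptions for illustration; they are not part of any EA4CS or CS project protocol.

```python
def compare_split_samples(volunteer_ph, lab_ph, tolerance=0.2):
    """Compare paired volunteer and independent lab pH readings.

    Returns a list of (volunteer value, lab value, within_tolerance)
    tuples plus the overall agreement rate. The 0.2-unit tolerance is
    an assumed example threshold, not a published standard.
    """
    results = []
    for v, lab in zip(volunteer_ph, lab_ph):
        results.append((v, lab, abs(v - lab) <= tolerance))
    agreement = sum(ok for _, _, ok in results) / len(results)
    return results, agreement


# Example: three split samples from the same stream sites (hypothetical values)
volunteer = [7.1, 6.8, 7.4]
lab = [7.0, 7.2, 7.4]
pairs, rate = compare_split_samples(volunteer, lab)
for v, lab_val, ok in pairs:
    print(f"volunteer={v:.1f} lab={lab_val:.1f} {'OK' if ok else 'FLAG'}")
print(f"Agreement rate: {rate:.0%}")
```

Agreement on split samples like these could serve both as data validation and as evidence of volunteer skill gains, which is the link the diagram above suggests.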

[Bar chart: Percentage of Citizen Science Projects addressing each science inquiry skill (skills reviewed by scientists). Skills shown: None of the above; Evaluate validity of study/claim; Critically read science literature; Estimate uncertainty in data; Develop hypotheses; Use statistics and probability; Develop a testable question; Recognize bias in data; Design investigations; Use data to address a claim; Create graphs and maps of data; Visually interpret graphs/maps; Communicate results; Train other volunteers; Make and record measurements; Enter data; Work effectively, efficiently and safely; Use instructions to collect data; Make/record observations.]

We uncovered a rich diversity of possible volunteer science-inquiry skills. However, most CS projects are limited to collecting and submitting data.