Julie Quinn, Computer-Based Assessments Specialist, Utah State Office of Education
27 Multiple Choice CRTs
◦ Grades 3–11 English language arts
◦ Grades 3–7 math, Pre-Algebra, Algebra 1, Geometry, Algebra 2
◦ Grades 4–8 science, Earth Systems Science, Physics, Chemistry, Biology
Direct Writing Assessment
◦ Grades 5 and 8
◦ Plus formative tool available year-round, grades 5 & 8
Utah Test Item Pool Service (UTIPS)
◦ Formative tool – USOE item pool and/or educator items
◦ Available year-round in all content areas, K-12
◦ Facilitates local benchmark/interim tests
41 districts, 81 charter schools
530,000 students
Lowest per-pupil spending in the nation
Infrastructure
◦ 50% Windows, 40% Macintosh, 10% Linux
◦ Strong technical skills among LEAs
◦ Wireless, thin clients, multiplied workstations
Utah Education Network
◦ ISP for districts and secondary schools, some charter schools
◦ Few elementary schools, with a single T1 line
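The single-T1 constraint above can be made concrete with a quick capacity estimate. A T1 line carries 1.544 Mbps; the per-student bandwidth and headroom figures below are illustrative planning assumptions, not published USOE requirements:

```python
# Back-of-envelope check of how many students a single T1 line can
# realistically support for online testing at once.
# T1 capacity is the standard 1.544 Mbps; the other numbers are assumed.

T1_KBPS = 1544            # T1 line capacity in kbps
PER_STUDENT_KBPS = 30     # assumed sustained load per active test session
HEADROOM = 0.5            # assume half the line stays free for other traffic

concurrent_testers = int(T1_KBPS * HEADROOM // PER_STUDENT_KBPS)
print(concurrent_testers)  # roughly one computer lab's worth of testers
```

Under these assumptions, a single-T1 elementary school can test only about one lab at a time, which is why participation had to stay "tied to what is physically possible."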
Year        Participation Rate   Number of CRTs Administered
2001-2006   4–8%                 Max 90,000
2007        8%                   92,000
2008        50%                  495,000
2009        66%                  659,000
2010        80% (projected)      815,000 (projected)
Year         Key Events
2001         All 27 CRTs available online
2004         UTIPS available online
2004 & 2007  One-time legislative funding, focused on hardware acquisition
2007         CBT Summit – to define state vision
2009         Change in CBT vendor
2009 & 2010  CAT pilot available as local LEA assessment option
2010         Change in CRT development vendor (ELA & math)
2010         Shorter CRTs, embedded pilot items
2010         Text-to-speech pilot, embedded within CRTs
2010         Innovative item research & small-scale pilot
2010         DWA online with AI scoring
Hardware + Software + Test items & forms + Bandwidth + Local configurations + Student preparation + Test administration procedures
= Testing experience
It’s not just a new test – it’s an ambitious technology implementation project
Different skills needed to support testing
◦ Cleaning answer documents vs. technical support
◦ Different and more preparation prior to testing
Low tolerance for interruptions
◦ Browser loading of pages
◦ System interruptions
Aging infrastructure
◦ One-time funding creates “bubbles”
◦ HVAC, electrical upgrades needed
◦ Participation tied to what is physically possible
Balancing innovation with stability
◦ Item types and accessibility impact on system
◦ What are LEAs purchasing? Can it be supported?
What is standardized presentation?
◦ PBT version of the CBT format
◦ Change in vendor/software
◦ LEA configurations (e.g., screen resolution)
What is comparable?
◦ Year to year
◦ Form to form
Redesigning processes to be CBT-centric, while still producing PBT
Development QA timeline is different
Require industry best practices for software development and deployment
Clear communication with all parties
◦ Assessment and Technology brainstorming, preparing, and resolving problems together
Plan for crisis management
◦ There will be problems
◦ Philosophy shift to “not if, but when”
Set clear expectations for participation
◦ What is voluntary? Flexibility for LEAs?
◦ Each school CAN do something
All efforts focused on lowest risk implementation
Solid LEA and school readiness checklists
◦ Compare system technical specifications to LEA-reported configurations to what is actually used
Strong support for issue resolution
◦ Separate policy issues from system training and technical troubleshooting issues
◦ Well-defined tier 1, 2, and 3 support
◦ Local configuration vs. system-wide problems
◦ How to respond to administration anomalies
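The readiness-checklist comparison above can be sketched in a few lines. This is a hypothetical illustration, not USOE's actual checklist: the field names (`ram_mb`, `screen_width`, `bandwidth_kbps`) and minimums are assumed, but the idea is the same three-way comparison of published minimum specs, LEA-reported configurations, and what is actually observed on the workstation:

```python
# Hypothetical readiness check: flag workstations that fall below the
# minimum spec, and cases where the LEA-reported value disagrees with
# what was actually observed. All fields and minimums are illustrative.

MIN_SPECS = {
    "ram_mb": 512,          # assumed minimum memory
    "screen_width": 1024,   # assumed minimum horizontal resolution
    "bandwidth_kbps": 128,  # assumed minimum per-workstation bandwidth
}

def readiness_gaps(reported, actual):
    """Return human-readable gaps for one workstation."""
    gaps = []
    for key, minimum in MIN_SPECS.items():
        if actual.get(key, 0) < minimum:
            gaps.append(f"{key}: actual {actual.get(key, 0)} below minimum {minimum}")
        if key in reported and reported[key] != actual.get(key):
            gaps.append(f"{key}: reported {reported[key]} != actual {actual.get(key)}")
    return gaps

# Example: the LEA reported 1024-wide screens, but the lab is set to 800.
reported = {"ram_mb": 1024, "screen_width": 1024, "bandwidth_kbps": 256}
actual = {"ram_mb": 1024, "screen_width": 800, "bandwidth_kbps": 256}
print(readiness_gaps(reported, actual))
```

Catching the reported-vs-actual mismatch, not just the below-spec machines, is the point of the checklist: it routes a local configuration issue to tier 1 support before it surfaces as a system-wide problem on test day.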
Long-term vision for assessments
◦ More options for validly assessing students
◦ Students more engaged
◦ Student results in teacher hands faster
◦ Technology resources available to support instruction
CBT shines light on many issues
◦ Test administration processes and ethics
◦ Appropriate accommodations
◦ SIS system and course scheduling
◦ Better picture of technology infrastructure
More time spent acting on the data instead of generating it
◦ Automatic scoring & use of artificial intelligence
Increases assessment literacy
◦ What do good questions look like?
◦ How can we make our questions better?
Easier to tailor assessments to instruction and student needs
Encourages conscious alignment of individual assessments to curriculum, K-12
◦ Why am I asking this question?
Julie Quinn, Computer-Based Assessments Specialist
Utah State Office of Education
[email protected]
http://schools.utah.gov/assessment