Transcript of ATP 2012

1

Incorporating Best-Practice Design and Development Principles into Testing Programs

Sally Valenzuela, CTB/McGraw-Hill

2

Abstract

• Evidence-based item development approaches require item authors to make explicit how items provide validity evidence to support claims.

• Advances in technology require increased consideration of accessibility and interoperability during item authoring.

• Refined item development processes are needed to meet these requirements.

3

Why now?

• Market demand for transparency
• Evidence requirements articulated by CCSS assessment consortia
• Technical advances in assessment (e.g., CAT, diagnostic models)
• Interoperability requirements (accessibility, tagging)
• Increased need for cost-effective development

4

Interoperability

• Multiple delivery platforms
• Accessibility for all learners
• New item types
• Transparency of development

5

Discussion

What other considerations drive item development today?

6

CTB CADDS

• Serves as a bridge between standards and assessment
• Provides opportunities for systematic unpacking of standards for item development
• Documents steps in reaching assessment goals
• Maintains focus at all steps on evidence to support interpretations and uses of test information

7

CTB CADDS

• Details information about the content to be assessed
• Describes item types that will be used
• Provides detailed specifications for each item/task
  – cognitive tasks and processes
  – proficiency level targets
• Provides increased direction for item writers

8

CADDS

1. Define the intended inferences and decisions to be based on test scores.
2. Define the achievement construct.
3. Draft performance level descriptors (expectations of students).
4. Define the evidence to be elicited by the item pool.

9

CADDS

5. Complete item writer assignments to meet the item pool specification requirements.
6. Complete item creation (authoring and editing).
7. Field test assessment items and tasks.
8. Implement the operational test.
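As a rough illustration only, the eight CADDS steps could be encoded as an ordered structure for tracking where a program stands. This is a hypothetical sketch, not an actual CTB tool; the split of steps 1-4 into design and steps 5-8 into development is an assumption drawn from the CADDS diagram.

```python
from enum import IntEnum

class CaddsStep(IntEnum):
    """The eight CADDS steps in order (hypothetical encoding for illustration)."""
    DEFINE_INFERENCES = 1    # intended inferences and decisions based on scores
    DEFINE_CONSTRUCT = 2     # the achievement construct
    DRAFT_PLDS = 3           # performance level descriptors
    DEFINE_EVIDENCE = 4      # evidence to be elicited by the item pool
    WRITER_ASSIGNMENTS = 5   # item writer assignments against pool specs
    ITEM_CREATION = 6        # authoring and editing
    FIELD_TEST = 7           # field test items and tasks
    OPERATIONAL = 8          # implement the operational test

def phase(step: CaddsStep) -> str:
    """Steps 1-4 are design activities; steps 5-8 are development (assumed split)."""
    return "design" if step <= CaddsStep.DEFINE_EVIDENCE else "development"

print(phase(CaddsStep.DRAFT_PLDS))     # design
print(phase(CaddsStep.ITEM_CREATION))  # development
```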

10

[Diagram: the CADDS cycle. Design phase: Inference about students (Step 1) → Achievement Construct (Step 2) → Performance Level Descriptors (Step 3) → Item Pool Evidence (Step 4). Development phase: Item Writer Assignments (Step 5) → Assessment item/task creation (Step 6) → Pilot, field, and operational testing (Steps 7/8).]

11

CTB CADDS

1. Define intended inferences and uses of the assessment data.
   – Adoption of CCSS has created a need for transition assessments.
   – Test designs are incorporating new item types.
   – Reporting requirements are changing.
   – “Alignment” issues abound.

12

CTB CADDS

2. Define the test construct(s) that will become assessment targets.
   – What cognitive tasks are required by the standard?
   – How do we consider students’ progression of learning?
   – What is the instructional context for a given standard?
   – How do we incorporate performance levels?

13

CTB CADDS

3. Develop initial proficiency level descriptors to guide development and interpretation of test scores.
   – Identify the source of PLDs.
   – Define the role of PLDs in item specifications and authoring.

14

CTB CADDS

4. Define the evidence.
   – Item pool or test blueprint
   – Specifications
   – Item/task templates
   – Instructions to item writers

15

CTB CADDS

5. Develop items and performance tasks based on specifications.
   – specifications templates
   – cognitive task frameworks
   – tagging for accessibility and interoperability

16

CTB CADDS

6. Refine items and tasks through collaborative review by stakeholders.

7. Field test items and tasks in appropriate small- or large-scale settings.

8. Implement the operational test and continue the design and specification validation process.

17

Discussion

18

Cognitive Task Frameworks

• Explicate response demands in assessment
• Response demands can be
  – Intended
  – Additional construct-relevant
  – Construct-irrelevant
  – Other

19

Cognitive Task Frameworks

• Traditional (DOK, Bloom’s)
• Within CCSS (Conley)
  – Mathematical Practice Standards
  – Selected ELA standards
• Cognitive Rigor Matrix (Hess)

20

Implementation

• Expansion of the item specifications template
• Articulation of cognitive tasks during item development

21

Specifications Elements

• Performance description
• Performance level descriptors
• Grade level placement
• Learning progressions
• Assessment targets/standards
• 21st century skills
• Cognitive rigor
• Problem/processing type

22

Specifications Elements

• Rules for source materials
• Rules for item or task problem
• Rules for response requirements
• Administration requirements
• Accessibility requirements
• Administrator directions
• Technology requirements
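The specification elements above amount to a structured record per item or task. A minimal sketch of such a record follows; the schema and field names are hypothetical illustrations, not CTB's actual specification format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ItemSpecification:
    """One item/task specification record; fields mirror the specification
    elements listed in the slides (hypothetical schema for illustration)."""
    performance_description: str
    performance_level_descriptors: List[str]
    grade_level: int
    assessment_targets: List[str]          # e.g., CCSS standard codes
    cognitive_rigor: str                   # e.g., Bloom level, Webb DOK
    source_material_rules: List[str]
    response_requirement_rules: List[str]
    accessibility_requirements: List[str] = field(default_factory=list)

# Example record, drawn loosely from the grade 6 ELA example later in the deck.
spec = ItemSpecification(
    performance_description="Evaluate texts in multiple presentations and compare their ideas.",
    performance_level_descriptors=["Proficient", "Advanced"],
    grade_level=6,
    assessment_targets=["CCSS 6.RI-7", "CCSS 6.RI-2"],
    cognitive_rigor="DOK Level 3: Strategic Thinking and Reasoning (Webb)",
    source_material_rules=["Stimulus must meet the grade 6 text specifications."],
    response_requirement_rules=["Distractors must be plausible misinterpretations."],
)
print(spec.grade_level)  # 6
```

A structured record like this is what makes the tagging for accessibility and interoperability, mentioned earlier, mechanically checkable rather than a matter of reviewer memory.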

23

Item Authoring

• Clearly defined elements for authors
  – construct evidence requirements
  – parameters for student responses
  – options for item type, cognitive demand, and other variable factors
• Articulated focus on evidence, validity, and accessibility

24

[Diagram: CCSS Standard, Modeling with Geometry (G-MG): Apply geometric concepts in modeling situations. Three Mathematical Practices (Model with mathematics; Make sense of problems; Attend to precision) are each mapped to Items 1-3, which together provide evidence for the Proficient level.]

25

Item Development Implications

• Explicit articulation of evidence to support claims
• Rigorous specifications development
• More direction for item writers
• Clearer distinction among item attributes
• Refinement of item development plans

26

27

• Performance description: Students will demonstrate their ability to evaluate texts in multiple presentations and compare their presentations of ideas.
• Performance Level Descriptors: Proficient, possibly Advanced
• Grade level placement: 6
• Assessment targets: CCSS 6.RI-7: Integrate information presented in different media or formats (e.g., visually, quantitatively) as well as in words to develop a coherent understanding of a topic or issue. CCSS 6.RI-2: Determine a central idea of a text and how it is conveyed through particular details.
• Cognitive Rigor: Analyze and Evaluate (Bloom); DOK Level 3: Strategic Thinking and Reasoning (Webb)

28

• Problem/processing type: ELA/Literacy Claim 1: Students can read closely and critically to comprehend a range of increasingly complex literary and informational texts. Central ideas: summarize central ideas. Analyze relationships: analyze how presentation format reveals author interpretation of a topic/idea. Analyze or interpret author’s craft. Evaluate text to determine central ideas.
• Rules for source materials: The stimulus must meet the grade 6 text specifications. Text stimuli must be well-crafted informational texts with a clear central idea. Visual or quantitative text should have a connection to the informational text, such as a shared theme, details, or ideas. The connection does not have to be immediately transparent (e.g., a text about storms and a photo of a storm), but it should be discernible by grade 6 students. Text should be readable in 5-7 minutes. Visually impaired students will be provided with a description of the explicit details of the visual stimuli, but the description will not present judgments about what the visual may suggest.

29

• Rules for item or task problem: The item will focus on how each medium presents similar information and will require an analysis of both texts. The visual stimulus will be presented before the item stem, and students will be directed to “Look at the [stimulus].” The stem will explicitly direct students to consider both texts. The stem will pose a question for students to consider regarding both texts. All parts of the item must use grade-appropriate language.
• Rules for response requirements: The distractors will be plausible to students who have misunderstood or misinterpreted the texts. A distractor may:
  – relate to one text only
  – misrepresent the central idea of one or both texts
  All answer choices must use grade-appropriate language.

30

Which statement best captures an idea that the reader can infer from both the photograph and President Obama’s speech?

a. It is important to stay focused to reach the top.
b. To do well, becoming physically fit is necessary.
c. When in doubt, seeking help is a sign of strength.
d. It takes effort and determination to achieve goals.

31

Look at the table Claire made to organize quadrilaterals by their attributes.

32

• Content Standard: 3.G.1. Understand that shapes in different categories (e.g., rhombuses, rectangles, and others) may share attributes (e.g., having four sides), and that the shared attributes can define a larger category (e.g., quadrilaterals). Recognize rhombuses, rectangles, and squares as examples of quadrilaterals, and draw examples of quadrilaterals that do not belong to any of these subcategories.
• Practice Standards: M.P.3. Construct viable arguments and critique the reasoning of others. M.P.6. Attend to precision.
• Cognitive Rigor: Moderate. For this task, a student first uses a fundamental concept, in this case attributes of a shape, to classify a shape within given categories (DOK 2). The student then needs to explain clearly and precisely how and why he or she made the classification decision (DOK 3).
• Response requirements: Students will respond using a keyboard and mouse. The student shows evidence of classifying shapes, both within overlapping subcategories and within the larger category, by determining that even though a shape may share an attribute with another shape, both shapes may not fit under the larger category. The student also shows evidence of reasoning and explaining precisely, using mathematical language, why shapes should be classified in certain ways and thus moved to, placed in, and/or removed from certain categories based on particular attributes, such as equal angles within the shape or equal-length sides of the shape.
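The classification logic the 3.G.1 task targets, shapes sharing attributes that define overlapping subcategories, can be sketched in a few lines. This is a hypothetical illustration of the underlying concept, not part of the assessment itself; the function name and attribute checks are invented for the sketch.

```python
def classify_quadrilateral(sides, angles):
    """Classify a quadrilateral by its side lengths and interior angles
    (in degrees), mirroring the 3.G.1 subcategories: a square shares the
    defining attributes of both the rhombus and the rectangle."""
    assert len(sides) == 4 and len(angles) == 4
    equal_sides = len(set(sides)) == 1
    right_angles = all(a == 90 for a in angles)
    if equal_sides and right_angles:
        return "square"       # in both subcategories at once
    if equal_sides:
        return "rhombus"
    if right_angles:
        return "rectangle"
    return "quadrilateral"    # the larger category

print(classify_quadrilateral([2, 2, 2, 2], [90, 90, 90, 90]))  # square
print(classify_quadrilateral([3, 2, 3, 2], [90, 90, 90, 90]))  # rectangle
```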

33

Questions?

34

Sally Valenzuela
[email protected]