
Transcript of Collaborative Common Assessments:

Page 1: Collaborative Common Assessments:

Collaborative Common Assessments: A Learning Tool for Success

Cassandra Erkens Anam Ċara Consulting, Inc.

[email protected]

Page 2: Collaborative Common Assessments:

Session outcomes:

• Identify the process and strategies of developing and using collaborative common assessments in ways that both monitor and promote continued learning for all

• Explore key factors to consider when developing a balanced and meaningful collaborative assessment system within teams

• Apply tools to review and respond to assessment data in ways that support instructional agility for individual teachers on a collaborative team

• Explore options for significant school improvement at the classroom level

Agenda:

9:30 – 11:30 AM

• Collaborative common assessments in the context of learning communities
• Assessment literacy
• Collaborative common assessment design

2:15 – 4:15 PM

• Collaborative common assessment design
• Collaborative common assessment delivery systems
• Effective data use
• Systems implications

Collaborative common assessments in the context of learning communities:

What we know today:

“Collaboration and the ability to engage in collaborative action are becoming increasingly important to the survival of the public schools. Indeed, without the ability to collaborate with others, the prospect of truly improving schools is not likely.” (Schlechty, 2005, p. 22)

We know collaboration is critical to our success. “It starts when a group of teachers meet regularly as a team to identify essential and valued student learning, develop common formative assessments, analyze current levels of achievement, set achievement goals, and then share and create lessons and strategies to improve upon those levels.” (Schmoker, 2004, p. 48)

In troubled schools that achieve rapid gains in student achievement, teachers work collaboratively using common assessment results for critical decision-making. Together, they use current, local evidence to identify which students need support, to isolate differences that ultimately inform instructional practice, and to plan targeted responses. (Chenoweth, 2009)

Page 3: Collaborative Common Assessments:

Reflection and professional learning are most powerful when they are collaborative activities informed by emerging evidence (Ainsworth, 2007; DuFour, 2015; DuFour, Eaker, and DuFour, 2010; Hattie, 2008; Hargreaves and Fullan, 2012; Fullan, 2011; Odden and Archibald, 2009; Reeves, 2004).

Yet . . . in a recent Gates-funded Boston Consulting Group study, “Teachers Know Best: Teachers’ Views on Professional Development,” the researchers noted:

• Front-line educators prefer PD that helps them plan and improve their instruction, is teacher-driven, includes hands-on strategies relevant to their classrooms, is sustained over time, and recognizes they are professionals with valuable insights.

• Teachers also said the least beneficial kind of PD was “professional learning communities,” which they described as “just another meeting,” a place to “share their frustrations,” or “a social hour.” (as cited in Marshall Memo 609, Oct. 26, 2015)

Let’s Get Clear . . .

The Engine of the PLC

A collaborative common assessment is any assessment, formative and/or summative in design, that is team created or team endorsed by all of the teachers who share the same standard expectations. The common assessments must be designed in advance of instruction and administered in close proximity by all instructors who share a role in administering that assessment. Those who designed or endorsed the assessment must then collaboratively examine the results for consistent scoring and shared, instructionally sensitive responses that address the following:

• Error analysis and appropriate intervention planning for individual learners, and
• Curriculum, instruction, and/or assessment modifications.

Page 4: Collaborative Common Assessments:

Common assessments are the engine of a PLC. The process of using common assessments is integral to every question a PLC must explore together:

• What is it we expect them to learn?
• How will we know when they have learned it?
• How will we respond when they don’t learn?
• How will we respond when they do learn?

DuFour, Eaker, & DuFour, 2010.

Talk Partners

Begin with the End in Mind: Typically, team norms highlight a code of professional conduct that should already be in place (e.g., show up on time, turn off digital distractors, agree to participate, etc.). An alternative set of norms is offered below. What observations can you make about the quality/purpose of the following list?

Sample Team Norms

Collective promises to hold each other safe when the data hit the table:

• We make decisions by consensus: When the will of the group emerges, we agree to abide by it, contribute to it, and gather data to make improvements or alterations as necessary.
• We never sabotage it, passively or overtly. If we need to change a team decision, we take it to the group for review and approval.
• We use data rather than opinion or personal experience to frame team decisions. If we don’t have existing data, we commit to gather it.
• We commit to share strategies, practices, tools, and resources to support the success of all members of the team. We make agreements about which materials we use and how we gather data regarding effectiveness of those materials.
• We maintain confidentiality. Data are public, but what’s said here remains private as we work through the complex issues and craft knowledge of teaching.
• We never speak from the place of evaluation—positively or negatively—about our individual results; instead, we speak of which learning targets, which students, and which instructional strategies are required to support continued learning.
• We speak about learners respectfully, supportively, and positively at all times—as if they are in our midst during our discussion. We focus on their assets and capabilities as we strive to address gaps or concerns.
• We never sandbag a colleague: We equally disperse the blessings and challenges amongst all of our classrooms, and we view all of our learners as our team responsibility.

Draw a conclusion and be prepared to share it: What makes the list above different from the professional courtesies that are so often used as team norms? Which type of norms would help you keep yourself and your team safe during challenging data conversations in which vulnerability is necessary and conflict might be present? Explain your thinking.

Page 5: Collaborative Common Assessments:

Our Thoughts: Playing PLC Lite

From                                                                   To

• Professional courtesy norms (show up on time, turn off phone, participate…)
• Forced agendas / lock-step processes
• Compliance
• Book study
• Round-robin sharing of ideas
• Adhering strictly to curriculum materials
• Waiting for someone to deliver the test or test results
• Assessment for the sake of assessment
• Using test scores to sort/label learners (or teachers)
• Your ideas?

What does this mean for me? My team? My school?

Page 6: Collaborative Common Assessments:

Assessment literacy

What we know today:

“Knowledge of the curriculum and how to teach it effectively must accompany the greater knowledge of the interpretation and use of assessment information” (H. Timperley, 2009, p. 23).

“What we know about teacher learning, in parallel to student learning, is that teachers need the opportunity to construct their own understandings in the context of their practice and in ways consistent with their identity as a thoughtful professional (rather than a beleaguered bureaucrat). Teachers need social support and a sense of purpose, hence the appeal of communities of practice (although mandated communities of practice may undo the intended meaning)” (p. xxii).

Teachers can generalize their learning and findings in group discussions to be transferable to their individual classrooms “once they get the hang of it. . .” (L. Shepard, 2013, p. xxi).

While the work of backward design has been recognized in the educational literature since the 1990s (Hayes-Jacobs, 1997; McTighe and Ferrara, 1997; Wiggins and McTighe, 1998), it is still not a prevalent practice. Research indicates that the data from classroom assessments are not currently driving instructional decision-making at a level of specificity that is supportive of a learner’s continued growth (Ruiz-Primo & Li, 2011; Schneider & Gowan, 2011; Andrade, 2013).

The Assessment Tenets – created by Erkens, Schimmer, and Vagle (2015) – represent the consistent and overarching findings from the 2013 SAGE Handbook of Research on Classroom Assessment: Assessment practices must build hope, efficacy, and achievement for learners and teachers. In this learning environment, the following tenets ground all assessment policies and practices:

1. Student investment occurs when assessment and self-regulation have a symbiotic relationship.
2. The communication of assessment results must generate productive responses from learners and all the stakeholders who support them.
3. Assessment architecture is most effective when it is planned, purposeful, and intentionally sequenced in advance of instruction by all of those responsible for the delivery.
4. Assessment purposes (formative and summative) must be interdependent to maximize learning and verify achievement.
5. Instructional agility occurs when emerging evidence informs real time modifications within the context of the expected learning.
6. The interpretation of assessment results must be accurate, accessible, and reliable.

A learning rich culture provides opportunities for risk taking, productive failure, and celebrated successes.

Page 7: Collaborative Common Assessments:

Our Thoughts: Instructional Agility

Precision:
• Standards
• Learning Progressions
• Assessment Maps
• Error Analysis
• Quality Questions
• Sufficient Evidence
• Accurate Inferences

Flexibility:
• Instructional Tool Kit (teaching and re-teaching)
• Differentiation Strategies
• Malleable Curriculum
• Educational Triage - Prioritization Skills

Collaboration Required

C. Erkens, T. Schimmer, & N. Vagle, 2015

How can the use of collaborative common assessments promote individual instructional agility? What does this mean for me? My team? My school?

Page 8: Collaborative Common Assessments:

Collaborative common assessment design

What we know today:

As noted by Dr. Richard Stiggins, education has traded away the understanding of the design and use of quality assessments to textbook and testing companies for decades (2007). “A generalized finding was that, by and large, teachers lack expertise in the construction and interpretation of assessments they design and used to evaluate student learning” (McMillan, 2013, p. 5). “Knowledge of the curriculum and how to teach it effectively must accompany the greater knowledge of the interpretation and use of assessment information” (H. Timperley, 2009, p. 23).

“… it has become increasingly clear over the past twenty years that the contents of standardised tests and examinations are not a random sample from the domain of interests. In particular, these timed written assessments can assess only limited forms of competence, and teachers are quite able to predict which aspects of competence will be assessed. Especially in high-stakes assessments, therefore, there is an incentive for teachers and students to concentrate only on those aspects of competence that are likely to be assessed. To put it crudely, we start out with the intention of making the important measurable, and end up making the measurable important. The effect of this has been to weaken the correlation between standardised test scores and the wider domains for which they are claiming to be an adequate proxy” (D. Wiliam, 1998, p. 1).

“The hope, too, is that next-generation assessments will more faithfully represent the new standards than has been the case for large-scale assessments in the past. While the knowledge exists to make it possible to construct much more inventive assessments, this could more easily be done in the context of small-scale curriculum projects than for large-scale, high-stakes accountability tests” (L. Shepard, 2013, p. xxi).

Collaborative Common Assessment Design (accurate assessments)

1. The assessment is collaboratively developed.
2. The assessment is aligned with the priority standards (most important learning expectations).
3. The assessment is tied tightly to clearly identified learning targets within the priority standards.
4. The assessment is designed to meet challenging expectations (e.g., identified levels of rigor or depths of knowledge) as outlined by the district, school, and/or team itself.
5. The assessment is designed for accuracy, and the selected method is appropriate for the target requirements.
6. Any supporting assessment tools (rubrics, exemplars, etc.) align with any quality, focused indicators of learning as established in the standards and/or outlined in team-identified expectations.
7. The assessment is designed to avoid sources of bias that distort results.
8. The assessment itself, or the overall assessment plan for the intended learning, gathers sufficient evidence to indicate mastery of student learning.

Page 9: Collaborative Common Assessments:

Let’s Try It

With the introduction of next-generation standards, even teachers who do not share common subjects or grade levels, but do share common processes or spiraled processes (technical reading, technical writing, problem-solving, etc.), can engage in the work of collaborative common assessments.

21st Century Skills and Processes

Reasoning Skills / Modern Literacies: Creating • Designing • Producing • Information Literacy • Global Literacy • Data Literacy

Reading (Informational Texts): Key ideas and details • Craft and structure • Integration of knowledge and ideas • Range of reading and level of text complexity

Science: Inquiry/investigation • Cause/effect • Observing • Interpreting results • Compare/contrast • Hypothesizing

Speaking and Listening: Expressing ideas • Communicating thinking • Productive nonverbals

Math: Computation • Problem solving • Communicating thinking • Measurement • Graphing • Mathematical reasoning

Social Studies: Sequencing • Pattern recognition • Prediction • Advocacy • Information literacy • Global literacy

Our Thoughts: Task: Write a single-item exit ticket as a common formative assessment. Reference the appropriate standard(s) and strive for a Level 3 Depth of Knowledge (Strategic Thinking) question or task (see the elementary and secondary options below).

Depth of Knowledge

DOK (Webb, Vesperman, & Ely, 2005) is based on the cognitive demands of the task. Look at the verb in the context of the task.

Level 1: Recall: basic recall of concepts, facts, definitions, and processes
Level 2: Skills and concepts: engagement of some mental processing beyond recall or producing a response
Level 3: Strategic thinking: deep understanding as exhibited through planning, using evidence, and cognitive reasoning
Level 4: Extended thinking: investigation that requires time to think and process multiple conditions of the problem; high cognitive demand

Page 10: Collaborative Common Assessments:

Elementary Task: Write a single item for an exit ticket. Reference the appropriate standard(s) and strive for a Level 3 Depth of Knowledge (Strategic Thinking) question or task.

At Soaring Elementary School, grades K–5 wanted to develop common assessments using a math rubric for 1) computational accuracy, 2) mathematical communication, and 3) mathematical problem-solving. First, they created the rubric.

Then, they decided that they would check their learners’ skills with mathematical communication. Each grade level will write a single-question exit ticket for Friday based on the standards listed below. Select a grade level and write the exit ticket item that will match their standard and give them quality information about the learners’ ability to communicate mathematically.

K–5 Math Rubric (a work in progress)

Levels: Level 1 = Insufficient Progress; Level 2 = Making Progress; Level 3 = Met Standard; Level 4 = Exceeds Expectation

Mathematical Concepts and Procedures (Computation Accuracy): The student understands mathematical concepts and performs related operations, chooses the appropriate math operations, and performs computations correctly.
  Level 1: I couldn't get started, I don't know how to begin.
  Level 2: I have part of the solution, but now I don't know what operation to use. I could complete simple calculations but cannot do complex computations.
  Level 3: I can select the proper operation. I can identify important information and solve the problem with accuracy.
  Level 4: I can complete the problem with accuracy. I can solve the problem in multiple ways to confirm accuracy.

Mathematical Communication: The student explains the process, reasoning, and strategy used in solving the problem.
  Level 1: I did not explain how I solved the problem. My explanation is mostly restating the problem.
  Level 2: I explained part of the process and I explained my answer but not my thinking. Someone will need to add additional information for my explanation to make sense.
  Level 3: I clearly explained the process I used and my solution to the problem using numbers, words, pictures or diagrams.
  Level 4: I can explain a meaningful academic application to this task across contents.

Mathematical Problem Solving: The student selects and carries out a strategy to find a solution, and checks results for reasonableness.
  Level 1: I'm not sure what the problem asked me to do. I didn't know which strategy to use.
  Level 2: I understand parts of the problem and I got started but I couldn’t finish. My strategy seemed to work at the beginning, but did not work well for the whole problem.
  Level 3: I understood the problem and had an appropriate solution. All parts of the problem are addressed. I checked my solution for reasonableness.
  Level 4: My explanation can be read by others and easily understood.

Page 11: Collaborative Common Assessments:

Kindergarten: Number & Operations in Base Ten. Work with numbers 11–19 to gain foundations for place value. CCSS.Math.Content.K.NBT.A.1 Compose and decompose numbers from 11 to 19 into ten ones and some further ones, e.g., by using objects or drawings, and record each composition or decomposition by a drawing or equation (such as 18 = 10 + 8); understand that these numbers are composed of ten ones and one, two, three, four, five, six, seven, eight, or nine ones.

Grade 1: Operations & Algebraic Thinking. Represent and solve problems involving addition and subtraction. CCSS.Math.Content.1.OA.A.2 Solve word problems that call for addition of three whole numbers whose sum is less than or equal to 20, e.g., by using objects, drawings, and equations with a symbol for the unknown number to represent the problem.

Grade 2: Measurement and Data. Work with time and money. CCSS.Math.Content.2.MD.C.8 Solve word problems involving dollar bills, quarters, dimes, nickels, and pennies, using $ and ¢ symbols appropriately. Example: If you have 2 dimes and 3 pennies, how many cents do you have?

Grade 3: Geometry. Reason with shapes and their attributes. CCSS.Math.Content.3.G.A.1 Understand that shapes in different categories (e.g., rhombuses, rectangles, and others) may share attributes (e.g., having four sides), and that the shared attributes can define a larger category (e.g., quadrilaterals). Recognize rhombuses, rectangles, and squares as examples of quadrilaterals, and draw examples of quadrilaterals that do not belong to any of these subcategories.

Grade 4: Number and Operations – Fractions. Understand decimal notation for fractions, and compare decimal fractions. CCSS.Math.Content.4.NF.C.7 Compare two decimals to hundredths by reasoning about their size. Recognize that comparisons are valid only when the two decimals refer to the same whole. Record the results of comparisons with the symbols >, =, or <, and justify the conclusions, e.g., by using a visual model.

Grade 5: Measurement and Data. Represent and interpret data. CCSS.Math.Content.5.MD.B.2 Make a line plot to display a data set of measurements in fractions of a unit (1/2, 1/4, 1/8). Use operations on fractions for this grade to solve problems involving information presented in line plots. For example, given different measurements of liquid in identical beakers, find the amount of liquid each beaker would contain if the total amount in all the beakers were redistributed equally.

Page 12: Collaborative Common Assessments:

Secondary Task: Write a single item for an exit ticket. Reference the appropriate standard(s) and strive for a Level 3 Depth of Knowledge (Strategic Thinking) question or task. You may have to find or create content/topics in order to write an item. Check to confirm that the learners are able to introduce a precise, knowledgeable claim relevant to the learning of that week and back it with evidence. Reference the content below to write your item.

At Champion Park High School, Grade 11 teachers of History, Chemistry, Spanish, and American Literature decided to pilot the common assessment process across all of their disciplines. They picked the following writing standard for argumentation:

CCSS.ELA-Literacy.W.11-12.1 Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence.

CCSS.ELA-Literacy.W.11-12.1.a Introduce precise, knowledgeable claim(s), establish the significance of the claim(s), distinguish the claim(s) from alternate or opposing claims, and create an organization that logically sequences claim(s), counterclaims, reasons, and evidence.

CCSS.ELA-Literacy.W.11-12.1.b Develop claim(s) and counterclaims fairly and thoroughly, supplying the most relevant evidence for each while pointing out the strengths and limitations of both in a manner that anticipates the audience's knowledge level, concerns, values, and possible biases.

Then each individual teacher named the upcoming unit of instruction during which he or she could experiment with the work.

History, Grade 11: European Exploration and Settlement, Beginnings to 1763

Early European exploration and colonization resulted in the redistribution of the world's population as millions of people from Europe and Africa voluntarily and involuntarily moved to the New World. Exploration and colonization initiated worldwide commercial expansion as agricultural products were exchanged between the Americas and Europe. In time, colonization led to ideas of representative government and religious toleration that over several centuries would inspire similar transformations in other parts of the world.

Chemistry, Grade 11, Chemical Reactions and Solutions

• Chemical reactions can be described by writing balanced equations.
• The quantity of one mole is set by defining one mole of carbon-12 atoms to have a mass of exactly 12 grams.
• One mole equals 6.02 x 10^23 particles (atoms or molecules).
• The molar mass of a molecule can be determined from its chemical formula and a table of atomic masses.
• Hess’s law is used to calculate enthalpy change in a reaction.

Page 13: Collaborative Common Assessments:

Grade 11 American Literature: Reading

The student will read, comprehend, and analyze relationships among American literature, history, and culture.

a) Describe contributions of different cultures to the development of American literature.
b) Compare and contrast the development of American literature in its historical context.
c) Discuss American literature as it reflects traditional and contemporary themes, motifs, universal characters, and genres.
d) Analyze the social or cultural function of American literature.
h) Explain how an author’s specific word choices, syntax, tone, and voice support the author’s purpose.
i) Read and analyze a variety of American dramatic selections.

Grade 11, Spanish III, Understanding Culture

Cultural comparisons help one to understand the world by developing tolerance and appreciation of other cultures. Cultural comparisons help students understand that language is a tool that can be used to communicate with others.

1. What is considered polite and/or impolite behavior in the two cultures?
2. What are typical pastimes in the foreign culture? How are they similar to or different from those in America?
3. What celebrations do the two cultures share, and which ones do they not share? What traditions have influenced the English-speaking world?
4. Are there settlements and geographical evidence in the U.S. that point to the foreign country?
5. How does the study of language help to improve global relations?

Once they agreed on what they would look for and how they would measure it, they identified the measurement scale they would all use:

What does this mean for me? My team? My school?

WRITING

Argumentative

Grades 11–12

Score 4.0 In addition to score 3.0 performance, the student demonstrates in-depth inferences and applications that go beyond what was taught.

Score 3.5 In addition to score 3.0 performance, partial success at score 4.0 content

Score 3.0 The student will write grade-appropriate arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence (W.11–12.1):

• Introduce precise, knowledgeable claims, establish the significance of the claims, distinguish the claims from alternate or opposing claims, and create an organization that logically sequences claims, counterclaims, reasons, and evidence (W.11–12.1a)

• Develop claims and counterclaims fairly and thoroughly, supplying the most relevant evidence for each while pointing out the strengths and limitations of both in a manner that anticipates the audience’s knowledge level, concerns, values, and possible biases (W.11–12.1b)

• Use words, phrases, and clauses as well as varied syntax to link the major sections of the text, create cohesion, and clarify the relationships between claims and reasons, between reasons and evidence, and between claims and counterclaims (W.11–12.1c)

• Establish and maintain a formal style and objective tone while attending to the norms and conventions of the discipline in which they are writing (W.11–12.1d)

• Provide a concluding statement or section that follows from and supports the argument presented (W.11–12.1e)

Score 2.5 No major errors or omissions regarding score 2.0 content, and partial success at score 3.0 content

Score 2.0 The student will recognize or recall specific vocabulary, such as:

• Alternate, anticipate, argument, audience, bias, claim, clarify, clause, cohesion, concluding statement, convention, counterclaim, discipline, evidence, fair, formal style, introduce, limitation, link, logical, norm, objective tone, opposing, organization, phrase, precise, reason, reasoning, relationship, relevant, sequence, significance, strength, support, syntax, thorough, text, topic, valid, value

The student will perform basic processes, such as:

• Identify claims and counterclaims from teacher-provided examples
• Articulate specified patterns of logical sequence for argumentation
• Establish a claim and provide relevant evidence for the claim
• Write arguments using a teacher-provided template (which includes all of the 3.0 elements)

Score 1.5 Partial success at score 2.0 content, and major errors or omissions regarding score 3.0 content

Score 1.0 With help, partial success at score 2.0 content and score 3.0 content

Score 0.5 With help, partial success at score 2.0 content but not at score 3.0 content

Score 0.0 Even with help, no success

Page 14: Collaborative Common Assessments:

Collaborative common assessment delivery systems

What we know today:

Teachers have not had the necessary training or experience with designing accurate assessments at rigorous levels and then extrapolating meaningful learning from the results (R. Stiggins and M. Herrick, 2007). Once it is discovered that some learners do require additional time and support, re-engagement strategies should be integrated into the instructional day, and those efforts should not be punitive in tone (DuFour et al., 2008; DuFour et al., 2010; Buffum et al., 2012).

“The goal is to plan respectful tasks–which include high expectations for all students with activities that equally engage each learner. . .” (Kramer, 2014, p. 22). “. . . Differentiation is not a set of strategies, but is instead a way of thinking about the teaching and learning process. Differentiation is not about who will learn what but rather how will you teach so that all students have access to, and support and guidance in, mastering the district and state curriculum” (Kramer, 2014, p. 17).

Collaborative Common Assessment Delivery (effectively used)

9. All staff members are aware of and supportive of the assessment plan.
10. The team delivers the common assessments in the same time frame.
11. The team’s focus is results-oriented by learning target to measure whether or not students are learning, and the results empower learners in addressing their own gaps through the intervention strategies.
12. All of the team’s efforts – before, during, and after the assessment is given – are based on determining ways for teachers/staff to identify children needing interventions and/or enrichments.
13. The team employs tools, processes, and policies that allow for student involvement in responding to the results (data interpretation, self-assessment, goal-setting, intervention planning, and reflection).
14. The team’s assessment plan promotes continued learning with formative opportunities and additional assessments to monitor for achievement.
15. Staff and procedures are in place to monitor the execution of the plan.

Page 15: Collaborative Common Assessments:

Our Thoughts: In your discussion, respond to the following statements of (traditional) practice and conclude with a belief statement to guide your delivery systems:

• Reteaching is never fun for the students or the teachers.
• Some students will never be successful, so some students will never have fun.
• Only the learners who qualify for enrichment will have fun.

Our statement of belief regarding responsive, productive delivery systems:

What does this mean for me? My team? My school?

Page 16: Collaborative Common Assessments:

Effective data use

What we know today:

Research indicates that the data from classroom assessments are not currently driving instructional decision-making at a level of specificity that is supportive of a learner’s continued growth (Ruiz-Primo & Li, 2011; Schneider & Gowan, 2011; Andrade, 2013). Andrade (2013) states, “research indicates that classroom teachers do not know . . . how to use assessment information to adjust instruction to address student learning needs. . . This should come as no surprise, given the documented lack of attention to assessment by most teacher preparation programs” (p. 20).

“In a balanced assessment system, teachers use classroom assessment, interim assessment, and year-end assessments to monitor and enhance student learning in relation to the state standards and to the state’s goals for student proficiency” (Schneider et al., 2013, p. 61).

Collaborative Common Assessment Data (monitoring achievement)

16. The data are gathered and analyzed in a timely fashion for immediate responses.
17. The data shared with learners are presented as meaningful feedback and/or information designed to engage the learners in motivated responses to support continued learning.
18. All decisions regarding the data are aligned with proficiency levels that have been predetermined by the district, school, and/or team itself.
19. Practices and protocols are utilized to guarantee common data result from the use of common assessments (collaborative scoring is used to calibrate all scores to be consistent with team expectations).
20. The data are arranged in a manner that enables teaching teams to target appropriate interventions for specific classrooms and students.
21. The data are arranged in a manner that enables teaching teams to identify appropriate program (curriculum, instruction, and assessment) modifications.
22. The data report requires a display of the data, a reflection of team learning, and a response plan to address the results with clearly determined ways for teachers/staff to respond to learners needing interventions.
23. The data are used to monitor progress toward achieving SMART goals.
24. The data are shared for school-wide involvement to support learning when/as necessary.
25. The data are monitored for celebrations of student and teacher learning (and are not used to judge teacher performance).

Page 17: Collaborative Common Assessments:

Our Thoughts: Talk Partners

Right and Wrong Template: There are right and wrong ways to analyze student achievement data. What makes the right side ‘right’ and the left side ‘wrong’?

Wrong Ways / Right Ways

• Wrong: Use percentages. Right: Use proficiency (scale) scores with descriptors.
• Wrong: Look at the whole rather than the parts. Right: Look target by target.
• Wrong: Use grading-based cut scores (e.g., 80% is passing). Right: Dig deeper to examine target-specific needs and analyze errors.
• Wrong: Provide scores to students for review and acceptance. Right: Engage students in self-analysis and decision making.
• Wrong: Regroup based on general categories (this student must relearn all of ‘inference’). Right: Develop strategic interventions within target areas based on types of errors (reteaching, coaching, error analysis with students, outside/companion skill teaching – e.g., vocabulary development, if that was the actual reading issue and not the skill at hand).

What makes the wrong ways wrong? What makes the right ways right?

What generalizations can you make regarding the differences between the two lists for examining student achievement data?

Which data analysis practices are you already doing well? Which would you add? Remove?

Data Conversations

Page 18: Collaborative Common Assessments:

Every time a team gathers to examine collaborative common assessment data, the team needs at least four things on the table:

1. Team norms to keep the conversation safe.
2. A data protocol to guide the conversation thoroughly and accurately through all of the key parts of the discussion.
3. The data, already aggregated and ready for analysis and easy identification of trends and anomalies.
4. The student work/evidence behind the data to support error analysis.

Collaborative Common Assessment Data Protocol

1. Examine data, and explore the following areas for team discussion.

• As a team, which targets from the assessment require more attention?
• As a team, which students require what support? Use a data form to list the learners requiring re-teaching, coaching or practice, and enrichment.
  o Which students did not master which targets?
  o Which students are close, but may require more focused practice or some additional coaching?
  o Which students are ready for enrichment?

2. Analyze evidence or artifacts, and explore the following areas for team discussion.

• Within each category of learners, look at samples of the students’ work for ‘error analysis.’ Subdivide the ‘did not master’ student work into groupings of common errors.
  o Identify the problem represented in each pile, and for each type of error, identify the possible solutions in the following areas:
    • instructionally sensitive responses
    • curriculum modifications/additions and supports for each response
  o Looking at all of the identified ‘error’ piles, identify the following:
    • overall curriculum and instruction modifications for future applications
    • item analysis and proposed revisions to those items that contributed to the frequency of errors
  o Again, look at samples of the students’ work, this time seeking ‘accuracy synthesis.’ In other words, what knowledge, skills, and abilities are ready for refinement and extension?
  o Identify possible instructional next steps in the following areas:
    • next steps in progressions of the learning targets
    • exemplars for future use

Page 19: Collaborative Common Assessments:

o Develop enrichment / extension option(s) that meet the following criteria: tied to the current learning targets, engaging, exciting, relevant to the core of the work

3. Analyze the individual and collective results. Reflect on strengths and opportunities for growth in the areas of curriculum, instruction, and assessment.

• As an individual teacher, which is my growth area, and how can I best improve?
• How can my colleagues help me?
• What are my strengths, and how might I help my colleagues?

4. Develop a plan of action: What will be our team’s specific plan of action to address our findings and results?

• What needs to be retaught, and how will we reteach it?
• How will we coach or conduct error analysis for those who do not require reteaching but would benefit from more guided practice?
• How will we extend and enrich for those who have it?

Common Assessment: Grade 7 Team Results for Social Studies Argumentation Essay Exam

Standard: CCSS.ELA-Literacy.WHST.6-8.1 Write arguments focused on discipline-specific content.

Target 1: CCSS.ELA-Literacy.WHST.6-8.1a Introduce claim(s) about a topic or issue, acknowledge and distinguish the claim(s) from alternate or opposing claims, and organize the reasons and evidence logically.

Target 2: CCSS.ELA-Literacy.WHST.6-8.1b Support claim(s) with logical reasoning and relevant, accurate data and evidence that demonstrate an understanding of the topic or text, using credible sources.

Target 3: CCSS.ELA-Literacy.WHST.6-8.1c Use words, phrases, and clauses to create cohesion and clarify the relationships among claim(s), counterclaims, reasons, and evidence.

Target 4: CCSS.ELA-Literacy.WHST.6-8.1d Establish and maintain a formal style.

Target 5: CCSS.ELA-Literacy.WHST.6-8.1e Provide a concluding statement or section that follows from and supports the argument presented.

Unit Assessment Map (assessments across the unit: HW, Task, HW, Task, Q, HW, HW, Q, Task, Paper, Essay Exam):

Target 1: items 1, 4, 6, 9, 10 | Rubric | Rubric | Rubric | Rubric | Scale | Scale
Target 2: items 2, 3, 5, 8, 7 | Rubric | 1-5 | Rubric | Rubric | Rubric | Rubric
Target 3: items 6-10 | Rubric | Rubric
Target 4: Rubric
Target 5: Rubric | Rubric | Rubric

Page 20: Collaborative Common Assessments:

Standard: CCSS.ELA-Literacy.WHST.6-8.1 Write arguments focused on discipline-specific content.

Common Summative Assessment Task

Context

Bullying has been a well-known problem in middle schools. With today’s social media, cyber-bullying has served to intensify the problem. But who’s responsible for addressing it when it happens, especially when it happens outside of school hours? Parents? Police? Schools? Students? No one seems to have an answer regarding who’s responsible for addressing the problem or ideas on how to reduce or completely stop cyber-bullying.

Task

You are being summoned by the state legislature to help solve the problem. You will need to prepare a formal argument on how to address the problem of cyber-bullying for young adults. Consider the following texts and media:

• School Cyber Bullying Victims Fight Back with Lawsuits (article)
• Dealing with Cyberbullying: Tips for Kids and Parents to Prevent and Stop Cyberbullying (www.helpguide.org)
• Digital Citizenship (7 min. video)

Be sure to develop a well-thought-out argument that is rich with evidence to support your stance. Feel free to include other references to materials we have studied throughout the entire year so far (e.g., Bill of Rights, US Constitution, civil rights documents, etc.).

Criteria

This essay will be scored using the following traits. This is the same argumentation rubric we have used in class on all of our past assessments:

• Focused Claim (intro and conclusion)
• Organization
• Elaboration of Evidence
• Language and Vocabulary

Rubric follows.

Page 21: Collaborative Common Assessments:

Common Summative Assessment Rubric: Argumentation – used throughout the unit and on Summative Essay (Modified from Smarter Balanced Argumentative Writing Rubric, Grades 6–11)

Focused Claim (intro tied to conclusion)

Score 4: The response is fully sustained and consistently and purposefully focused from intro to conclusion: claim is clearly stated, focused, and strongly maintained
• alternate or opposing claims are clearly addressed*
• claim is introduced and communicated clearly within the context

Score 3: The response is adequately sustained and generally focused: claim is clear and for the most part maintained, though some loosely related material may be present
• context provided for the claim is adequate

Score 2: The response is somewhat sustained and may have a minor drift in focus:
• may be clearly focused on the claim but is insufficiently sustained
• claim on the issue may be somewhat unclear and unfocused

Score 1: The response may be related to the purpose but may offer little relevant detail: may be very brief
• may have a major drift
• claim may be confusing or ambiguous

Organization

Score 4: The response has a clear and effective organizational structure creating unity and completeness:
• effective, consistent use of a variety of transitional strategies
• logical progression of ideas from beginning to end
• effective introduction and conclusion for audience and purpose
• strong connections among ideas, with some syntactic variety

Score 3: The response has an evident organizational structure and a sense of completeness, though there may be minor flaws and some ideas may be loosely connected:
• adequate use of transitional strategies with some variety
• adequate progression of ideas from beginning to end
• adequate introduction and conclusion
• adequate, if slightly inconsistent, connection among ideas

Score 2: The response has an inconsistent organizational structure, and flaws are evident:
• inconsistent use of basic transitional strategies with little variety
• uneven progression of ideas from beginning to end
• conclusion and introduction, if present, are weak
• weak connection among ideas

Score 1: The response has little or no discernible organizational structure:
• few or no transitional strategies are evident
• frequent extraneous ideas may intrude

Page 22: Collaborative Common Assessments:

Argumentation Rubric – used throughout the unit and on Summative Essay (Modified from Smarter Balanced Argumentative Writing Rubric, Grades 6–11)

Elaboration of Evidence

Score 4: The response provides thorough and convincing support/evidence for the writer’s claim that includes the effective use of sources, facts, and details. The response achieves substantial depth that is specific and relevant:
• use of evidence from sources is smoothly integrated, comprehensive, relevant, and concrete
• effective use of a variety of elaborative techniques

Score 3: The response provides adequate support/evidence for the writer’s claim that includes the use of sources, facts, and details. The response achieves some depth and specificity but is predominantly general:
• some evidence from sources is integrated, though citations may be general or imprecise
• adequate use of some elaborative techniques

Score 2: The response provides uneven, cursory support/evidence for the writer’s claim that includes partial or uneven use of sources, facts, and details, and achieves little depth:
• evidence from sources is weakly integrated, and citations, if present, are uneven
• weak or uneven use of elaborative techniques

Score 1: The response provides minimal support/evidence for the writer’s claim that includes little or no use of sources, facts, and details:
• use of evidence from sources is minimal, absent, in error, or irrelevant

Language & Vocabulary

Score 4: The response clearly and effectively expresses ideas, using precise language:
• use of academic and domain-specific vocabulary is clearly appropriate for the audience and purpose

Score 3: The response adequately expresses ideas, employing a mix of precise with more general language:
• use of domain-specific vocabulary is generally appropriate for the audience and purpose

Score 2: The response expresses ideas unevenly, using simplistic language:
• use of domain-specific vocabulary may at times be inappropriate for the audience and purpose

Score 1: The response’s expression of ideas is vague, lacks clarity, or is confusing:
• uses limited language or domain-specific vocabulary
• may have little sense of audience and purpose

Page 23: Collaborative Common Assessments:


Common Summative Assessment Data Grade 7 SS

Count of Students at Each Level

Level   Focused Claim   Organization   Elaboration of Evidence   Language and Vocab
1       30              28             60                        17
2       82              110            120                       95
3       193             162            133                       189
4       30              35             22                        34
Total   335             335            335                       335

% of Students at Each Level

Level   Focused Claim   Organization   Elaboration of Evidence   Language and Vocab
1       9%              8%             18%                       5%
2       24%             33%            36%                       28%
3       58%             48%            40%                       56%
4       9%              10%            7%                        10%
Total   100%            100%           100%                      100%

Data by Teacher

Count of Focused Claim – rubric levels

Teacher       1    2    3    4    Grand Total
Teacher A     3    14   24   0    41
Teacher B     7    20   44   3    74
Teacher C     12   24   57   10   103
Teacher D     8    24   68   17   117
Grand Total   30   82   193  30   335

Count of Elaboration of Evidence – rubric levels

Teacher       1    2    3    4    Grand Total
Teacher A     13   19   8    1    41
Teacher B     15   32   25   2    74
Teacher C     18   52   28   5    103
Teacher D     14   17   72   14   117
Grand Total   60   120  133  22   335
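For teams that keep their common assessment results in a simple spreadsheet export, tallies like the ones above can be generated automatically rather than by hand. The short Python sketch below is not part of the original handout; the teacher names, trait labels, and score records in it are illustrative placeholders, and the counts it prints depend entirely on the data a team supplies.

from collections import Counter, defaultdict

# Each record is (teacher, trait, rubric level 1-4) for one student.
# Placeholder records; a real team would load its own results.
scores = [
    ("Teacher A", "Focused Claim", 3),
    ("Teacher A", "Focused Claim", 2),
    ("Teacher A", "Elaboration of Evidence", 1),
    ("Teacher B", "Focused Claim", 3),
    ("Teacher B", "Elaboration of Evidence", 2),
]

# Count of students at each level, by trait (mirrors "Count of Students at Each Level").
by_trait = defaultdict(Counter)
for teacher, trait, level in scores:
    by_trait[trait][level] += 1

for trait, counts in by_trait.items():
    total = sum(counts.values())
    print(trait)
    for level in range(1, 5):
        pct = 100 * counts[level] / total
        print(f"  Level {level}: {counts[level]:3d} students ({pct:.0f}%)")

# Count by teacher for a single trait (mirrors the "Data by Teacher" tables).
by_teacher = defaultdict(Counter)
for teacher, trait, level in scores:
    if trait == "Focused Claim":
        by_teacher[teacher][level] += 1

for teacher in sorted(by_teacher):
    counts = by_teacher[teacher]
    row = "  ".join(f"{counts[level]:2d}" for level in range(1, 5))
    print(f"{teacher}: {row}   total {sum(counts.values())}")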

Page 24: Collaborative Common Assessments:

Collaborative Common Assessments, p. © C. Erkens, 2015

2

Data by Teacher A and Students

Student      Focused Claim   Organization   Elaboration of Evidence   Language and Vocab
Student 1    3               3              3                         4
Student 2    2               1              1                         3
Student 3    2               2              1                         2
Student 4    1               1              1                         2
Student 5    3               3              3                         3
Student 6    3               2              3                         2
Student 7    3               2              1                         2
Student 8    3               2              1                         2
Student 9    2               1              3                         3
Student 10   2               1              2                         3
Student 11   2               1              3                         3
Student 12   3               2              1                         2
Student 13   3               2              2                         3
Student 14   3               3              4                         3
Student 15   3               3              2                         3
Student 16   3               2              2                         3
Student 17   3               2              2                         3
Student 18   3               2              2                         3
Student 19   2               2              1                         2
Student 20   3               4              3                         4
Student 21   3               3              2                         3
Student 22   3               3              2                         3
Student 23   2               1              2                         3
Student 24   3               1              3                         3
Student 25   2               2              2                         2
Student 26   3               2              2                         3
Student 27   3               2              2                         2
Student 28   1               1              2                         2
Student 29   1               2              2                         2
Student 30   3               2              2                         3
Student 31   2               1              1                         3
Student 32   3               2              2                         2
Student 33   3               2              2                         2
Student 34   3               3              3                         3
Student 35   2               1              1                         3
Student 36   3               2              1                         3
Student 37   2               1              1                         2
Student 38   2               1              1                         3
Student 39   2               1              1                         3
Student 40   3               2              2                         3
Student 41   2               2              2                         2
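A team might also sort a student-level table like the one above directly into the three columns of Response Planning Template 1 (page 25). The sketch below is illustrative only and is not from the original handout; the cut-offs it uses (level 1 = needs support, level 2 = needs practice, levels 3 and 4 = needs enrichment) are an assumed mapping for demonstration rather than a rule stated in the handout, and the student names and scores are placeholders.

# Student-level scores for one focus learning target; names and levels are placeholders.
target_scores = {
    "Student 1": 3,
    "Student 2": 1,
    "Student 3": 2,
    "Student 4": 1,
    "Student 5": 4,
    "Student 6": 2,
}

# Assumed mapping for illustration only: 1 = needs support, 2 = needs practice,
# 3 or 4 = needs enrichment. Teams would set their own cut-offs.
groups = {"Needs Support": [], "Needs Practice": [], "Needs Enrichment": []}
for student, level in target_scores.items():
    if level <= 1:
        groups["Needs Support"].append(student)
    elif level == 2:
        groups["Needs Practice"].append(student)
    else:
        groups["Needs Enrichment"].append(student)

for column, students in groups.items():
    names = ", ".join(students) if students else "none"
    print(f"{column} ({len(students)}): {names}")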

Page 25: Collaborative Common Assessments:


Team or Classroom Response Planning: Response Template 1 (Modified from Vagle, 2014)

Use this document to prepare plans by target area:

FOCUS LEARNING TARGET: ____________________________________________

Columns: Needs Support | Needs Practice | Needs Enrichment

Each column captures:
• Students (jot down their names as a placeholder for more exploration with Template 2)
• Instructional plans
• Resources needed
• Who will facilitate the instruction/process?
• Date initiated:          Date completed:

Page 26: Collaborative Common Assessments:


TEAM or CLASSROOM Response Planning Template 2 (Modified from Vagle, 2014)

Where are our concerns? Based on our assessment results, who needs additional support?

Columns: Learning Target | Learning Target | Learning Target

For each learning target, record:
• # and names of students
• What types of errors were made with this target? (Number of error categories? Names of error categories?)
• For each error category, a paired row: Error: __________   Instructional Fix: __________ (repeated for as many error categories as the team identifies)

Page 27: Collaborative Common Assessments:


Tools to Monitor the Work of Engaged PLCs

The following 3 criteria and accompanying proficiency levels (1. engaging thinking, 2. activating participants as collaborative resources, and 3. making connections beyond the current context) frame the basis of what ‘engaged learning’ looks like. The tools can be used to monitor team-based learning conversations – from the classroom, to team meetings, to coaching conversations, and so on. The framework (using all 3 scales together) includes the work of formative assessment (Wiliam, 2011; Clarke, 2009), quality instruction (Fisher and Frey, 2015; Newmann et al., 2009), and Rigor and Relevance (Webb, 2005; Daggett, 2008; Erkens, 2015).

Criteria 1: Engaging Thinking

4

Almost all participants, almost all of the time, are verbally engaged in the high cognitive demand tasks of strategic thinking and extended thinking. It is clear that the participants are demonstrating a deep understanding as exhibited through planning, using evidence, and cognitive reasoning (Assess, Revise, Critique, Draw Conclusions, Differentiate, Formulate, Hypothesize, Cite Evidence). It is clear that participants have engaged in investigation that requires time to think and process multiple conditions of the problem (Synthesize, Analyze, Prove, Connect, Design, Apply Concepts). All conversations are sustained at a high level of challenge or difficulty.

3

The majority of the participants are engaged in at least one major activity during the lesson in which they are verbally engaged in the high cognitive demand task of strategic thinking. There is a large body of evidence during the activity that participants are demonstrating a deep understanding as exhibited through planning, using evidence, and cognitive reasoning (Assess, Revise, Critique, Draw Conclusions, Differentiate, Formulate, Hypothesize, Cite Evidence). Most of the conversations are sustained at a high level of challenge or difficulty, but some of the conversations are sustained at a level of medium difficulty.

2

Participants are primarily engaged in skills and concepts. A little of their conversation might require participants to engage in strategic thinking (one or two of the following: Assess, Revise, Critique, Draw Conclusions, Differentiate, Formulate, Hypothesize, Cite Evidence), but the majority of the time they are processing knowledge and skills at a level of application (Infer, Identify Patterns, Modify, Predict, Distinguish, Compare). Most of the conversations are sustained at a medium level of challenge or difficulty, but some of the conversations are sustained at a low level of difficulty.

1

Participants are beginning to demonstrate a shift from basic recall of concepts, facts, definitions, and processes (Recite, Recall, Label, Naming, Define, Identify, Match, List, Draw, Calculate), to engaging in skills and concepts that draw upon recalled knowledge (Infer, Identify Patterns, Modify, Predict, Distinguish, Compare). Most of the conversations are sustained at a medium level of challenge or difficulty, but some of the conversations are sustained at a low level of difficulty.

© C Erkens, 2015

Page 28: Collaborative Common Assessments:


Criteria 2: Activating Learners as Collaborative Resources Rubric

4

In addition to all of level 3, participants engage in productive, collaborative group work. They are synergistic, capitalizing on and blending each other’s strengths and weaknesses:

• The dialogue builds coherently on participants’ ideas to promote improved collective understanding of a theme or topic.

• Participants expand their collective insights and repertoire of skills and strategies to address errors and gaps in understanding.

• Participants establish a social norm of excellence for all, relying on social pressure and collective support to motivate and encourage all participants to achieve mastery.

3

Participants collaboratively increase understanding by engaging in disciplined inquiry, rich dialogue, and active debate. In doing so, they challenge each other’s thinking, extend current thinking, and create new possibilities. The conversation involves sharing of ideas and is not completely scripted or controlled by one party (as in teacher/leader/coach recitation). In addition, participants provide formal and informal peer feedback in the moment that aligns with teacher/leader/coach expectations and clarifies gaps in understanding, misconceptions in concepts, or errors in reasoning.

2

Participants are beginning to engage in dialogue and debate without depending on provided protocols. Some of the participants have begun challenging each other’s thinking, extending current thinking, or creating new possibilities, but not all participants are fully invested in the learning component of the conversations. The mechanics (protocols and procedures) of the discussion are still focal points for many of the participants. Participants can provide formal and informal peer feedback in the moment that aligns with teacher/leader/coach expectations but may not go deep enough to clarify gaps in understanding, misconceptions in concepts, or errors in reasoning.

1

Participants require guidance and instructional steps in order to participate in a dialogue or debate. The instructional focus is on the activity rather than the learning that might emerge because of the activity. The conversation is completely scripted or controlled by one party (as in teacher/leader/coach recitation). Any feedback that is offered is orchestrated by the teacher/leader/coach, and the conversations are isolated to a specific set of criteria. The quality of any feedback that is offered is contingent upon the individual offering it and that individual’s personal criteria or interpretation of the teacher/leader/coach’s criteria.

© C Erkens, 2015

Page 29: Collaborative Common Assessments:


Criteria 3: Making Connections Beyond the Current Context Rubric

4

Participants study or work on an authentic, current unstructured challenge that has no known solution at the point of initiation. Participants recognize the connections between learning environment knowledge and situations outside the learning environment. They leverage their own knowledge and skills, often blending disciplines as they explore these connections beyond the learning environment. The meaning and significance generated as a result of their efforts and findings is strong enough to lead participants to influence a larger audience beyond their learning environment in one of the following ways: communicating knowledge to others (including within the school), advocating solutions to social problems, providing assistance to people, or creating performances or products with utilitarian or aesthetic value.

3

Participants study or work on a topic, problem, or issue that the teacher/leader/coach and participants see as connected to their personal experiences or actual contemporary public situations. Participants recognize the connections between learning environment knowledge and situations outside the learning environment. They explore these connections in ways that create personal meaning and significance for the knowledge. However, there is no effort to use the knowledge in ways that go beyond the learning environment to actually influence a larger audience.

2

Participants study a topic, problem, or issue that the teacher/leader/coach succeeds in connecting to participants’ actual experiences or to contemporary public situations. Participants recognize some connections between learning environment knowledge and situations outside the learning environment, but they do not explore the implications of these connections, which remain abstract or hypothetical. There is no effort to actually influence a larger audience.

1

Participants encounter a topic, problem, or issue that the teacher/leader/coach tries to connect to participants’ experiences or to contemporary public situations; that is, the teacher/leader/coach informs participants that there is potential value in the knowledge being studied because it relates to the world beyond the learning environment. For example, participants are told that understanding Middle East history is important for contemporary politicians trying to bring peace to the region; however, the connection is unspecified and there is no evidence that participants make the connection.

© C Erkens, 2015

What does this mean for me? My team? My school?

Page 30: Collaborative Common Assessments:


Rubric Title _______________________________________________________ Event 1: Score: _____

Evidence to back the score:

Event 2: Score: _____

Evidence to back the score:

Page 31: Collaborative Common Assessments:


Creating Better Interventions by Addressing Common Errors & Misconceptions: An Example With Assessing Evidence

• First, what is the definition of the concept done well? In this case, define “assessing evidence”: I can assess evidence. This means when reading, writing, listening, or speaking, I can identify and examine the evidence provided to me or by me to determine whether or not the evidence is of quality. Having the concept defined in student-friendly terms makes the task of identifying the types of errors learners make so much easier.

• Second, examine student work or draw on past experience to identify and label the specific errors found in the samples and classify the various types of errors (simple mistakes, concept errors, reasoning errors) that occur when learners are unable to demonstrate the accurate application of assessing evidence.

Label Specific Errors | Classify Type of Error | Plausible Re-Engagement Strategies

Identifying Evidence:
• Omission error—skip details.
• Misrepresentation errors—take details out of context; add details that were not provided.
• Over-generalization error—assume everything in print is accurate and logical.
• Faulty logic error—confuse claims with facts.

Evaluating the Evidence for Quality:
• Evidence is inaccurate.
• Evidence is insufficient.
• Evidence is irrelevant.
• Evidence is not representative of or generalizable to a greater reality.
• Evidence does not build a cohesive and whole logical argument.

• Third, with that error in mind, create plausible instructional actions for re-engaging the learners in the learning.

Page 32: Collaborative Common Assessments:


Systems implications

What we know today:

Healthy organizations – learning organizations – tenaciously pursue their own internal brutal truths in an effort to attack problems and improve systems (Senge, 2006; Pfeffer and Sutton, 2006; DuFour, DuFour, & Eaker, 2008; Catmull and Wallace, 2014).

Our Thoughts: In regard to PLCs and the work of common assessments

What is it we want our teams to know and be able to do?

How will we know when they are doing it?

What will we do when they are not? What will we do for those who already have it?

Implementation ideas or future discussion needs: