
Performance Assessments, Portfolio Assessments, Rubrics, and STEAM - Final Team Project

Chrystine Alhona, Marisa Drogan, Sean McKenna, Tim Paccione

Marist College

EPSY 605


Performance Assessments, Portfolio Assessments, Rubrics, and STEAM - Final Team Project

Introduction:

For our final project, we were tasked with exploring the topics of Performance

Assessments, Portfolio Assessments, and Rubrics. In addition, we were to examine these areas

through the lens of a particular grade level and content area of our choosing, for which we chose

a 9th grade science, technology, engineering, arts, and mathematics (STEAM) course. This

interdisciplinary classroom setting gave us an opportunity to examine our topics from a complex,

unique perspective.

For each of our core topics, and for STEAM as well, we conducted a thorough literature

review, which we then expanded upon to develop best practices for educators through a “theory-

into-practice” process. Those literature reviews and best practices discussions represent the bulk

of this paper. In addition, we developed a presentation in which we will share our key findings,

as well as provide workshop activities designed to support viewer understanding. For each of

our workshops, we created handouts, which can be found in Appendix A of this paper. Finally,

Appendix B includes a rubric that we have provided to our instructor, which she will use to grade our overall project.


A STEAM Curriculum:

Literature Review:

It is easy to push the arts and humanities aside when concentrating on a STEM curriculum. However, incorporating the arts and humanities, creating STEAM, has been shown to enhance the stimulation, diversity, and richness of learning (Madden et al., 2013). STEAM curricula foster creativity in business and science industries by educating students to address complex problems facing human society. This type of instruction moves beyond single disciplines toward multiple modes of inquiry and viewpoints (Quigley & Herro, 2016). Quigley and Herro state that STEAM students not only strengthen their learning within the separate content areas but also search for opportunities to make connections with art, music, and design.

STEAM broadens the pool of prospective mathematics and science learners to include students who are not predisposed to or interested in STEM. Integrating the arts and humanities leads learners down a different, often more enriching, route (Howes, Kaneva, Swanson & Williams, 2013). A study conducted at a secondary school in Egypt concluded that a STEAM education is directly linked to higher brain activity. The research showed that STEAM emphasizes both hemispheres of the brain: the right, associated with creativity, and the left, associated with logic and academics (Sickel & Witzig, 2017).

A study from Hanyang Elementary School in Seoul, Korea, states that creativity is the most essential competency to be developed in schools (Kwon, 2015). This emphasis is also prevalent in SUNY Potsdam’s curriculum, which revolves around student self-reflection so that students take responsibility for their own learning. The curriculum also fosters multiple levels of divergent thinking: it teaches students to think broadly when evaluating a problem or content area while forming a unique personal connection to the material.


SUNY Potsdam has stated that its creative STEAM approach has positively impacted its graduates in the fields of industry, business, and science (Madden et al., 2013).

A STEAM curriculum is made possible through projects and applied learning that focus on real-life applications, problems, and interdisciplinary work (Madden et al., 2013). The curriculum can shift around the students’ interests and aim toward topics such as science in society (Howes, Kaneva, Swanson & Williams, 2013). By promoting this deep engagement, the curriculum builds on experiences and relationships between people and places through collaboration. Using real-life problems is essential because it creates communities that provide stimulation, diversity, and richness of experience (Madden et al., 2013).

Integrating a STEAM curriculum takes understanding, experience, and knowledge. Educators need to understand the importance of authentic assessments and how they differ from traditional paper-and-pencil methods. Experience in various content areas is also necessary. According to Sickel and Witzig (2017), one of the hardest challenges in creating STEAM assessments is incorporating engineering. Teachers need to be creative within their content areas in order to incorporate all aspects of STEAM and elicit the highest level of higher-order thinking.

Theory-Into-Practice:

STEAM assessments often revolve around problem-based learning, technology, twenty-first-century skills, student choice, and authentic assessment (Quigley & Herro, 2016). A study conducted at Hanyang Elementary School in Seoul, Korea, showed the positive effects of STEAM portfolios in the classroom. The purpose of the portfolio was to promote a higher level of creative and critical thinking in the classroom, as well as a higher level of enjoyment in


assessments across multiple content areas (Kwon, 2015). The study found positive results for all three desired outcomes.

Rubrics are a common method for grading STEAM authentic assessments, projects, and e-portfolios. STEAM rubrics often include criteria for group collaboration, reflection, and content knowledge (Kwon, 2015). Higher-order thinking skills, peer assessment, and self-assessment are often present in STEAM rubrics as means of monitoring and evaluating students’ learning (Sickel & Witzig, 2017).

Sickel and Witzig (2017) recommend using the “WHERETO” acronym as a means of evaluating learning. The “W” stands for two questions: “Where are we going?” and “What is expected?” The “H” represents “How will the students become engaged?” “E” refers to the students’ expected performance. “R” refers to rethinking and revising. The second “E” represents “self-evaluation and reflection of learning.” “T” stands for accommodations for learning styles, interests, and needs, and “O” is the organization of the assessment and learning (Sickel & Witzig, 2017, p. 84).


Performance Assessments:

Literature Review:

The educational study of student performance assessments is complex and multi-faceted, and a significant body of research has been conducted to determine which strategies best meet the needs of students. In particular, creating effective mathematics and science assessments has been a challenge for secondary education instructors. As a result, an immense amount of literature has been written on these topics, which can be used to inform possible best practices for teachers.

The National Academy of Sciences has conducted research in the area of student

performance assessments within the subjects of science, technology, engineering, and

mathematics (STEM). One such endeavor included a meta-analysis of 225 related studies that

compared how students perform in STEM courses taught under traditional instructional methods

with how they perform in STEM courses taught using active learning methods. Active learning

methods, as defined by the study, include constructivist teaching that allows the students to

discover the knowledge themselves through performance. This approach is examined in contrast

to traditional teaching methods, which consist predominantly of academic lectures and paper-and-pencil tests (Freeman et al., 2014).

The null hypothesis for the study stated that traditional methods maximize learning performance for students in STEM courses, and the alternative hypothesis supposed the opposite, that active learning methods were more beneficial. The analysis completed within the

study supported the alternative hypothesis and therefore the researchers’ theory that increasing

the number of students receiving active learning instruction in STEM could lead to a correlated

increase in student performance. Strategies such as group problem-solving, authentic


worksheets, and tutorials produced better outcomes than lectures on average (Freeman et al., 2014).

The results from the meta-analysis included findings which indicate that active learning

can lead to an increase in examination performance and raise students’ average grades to a

statistically significant degree. Additionally, failure rates under traditional lecturing methods

were found to be 55% higher than rates observed under active learning. Based on this

information and previous literature within the subject area, the researchers at the National Academy of Sciences concluded that an increase in active learning within STEM instruction could lead to an increase in overall student academic performance (Freeman et al., 2014).

Pandey (1990) discusses authentic mathematics assessments and their ability to meet the

needs of students. He notes the growing consensus among educators that the goal of mathematics education should be to help students solve problems in everyday life. Whether

that application comes through employment, community outreach, or personal finance, students

must be prepared to perform mathematics throughout society. Pandey looks to determine what

role performance assessments can play in that preparation, and he explores the types of authentic

mathematics assessments that focus on concepts and analytical skillsets in order to test his

theories.

Throughout his discussion, Pandey (1990) argues that authentic assessments meet the

needs of instructional goals because, through situational lessons, real-life questions,

and mock investigations, students are able to develop their thinking and reasoning skills in

authentic contexts. At the same time, these activities influence student attitudes towards

mathematics and can increase their ability to collaborate with peers. He believes the key value


offered by authentic assessments is that they require students to formulate problems, devise

solutions, and interpret results (Pandey, 1990).

Pandey (1990) goes on to explain that the goal of providing real-life learning experiences

can be accomplished in numerous ways, including through open-ended questions, short

investigations (60 to 90 minute tasks), multiple-choice questions that emphasize combining

mathematical concepts, and portfolios. He concludes by observing that as national and state

standards focus more and more on authentic learning, consistent exposure and practice for

students in these areas plays a critical part in developing their ability to perform.

Further discussion of mathematics performance assessments took place in the Canadian Journal of Behavioural Science. In their study, the researchers examined the relationship between students’ attitudes and dispositions towards mathematics and their ability to succeed when assessed. They reviewed previous literature on the subject and found mixed results. In an effort to find a more definitive answer to their queries, they reexamined the relationship through the lens of gender. Their study looked at the differences in mathematics achievement, problem solving, and other performance areas that can be tied to gender (Randhawa & Hunter, 2001).

Gender dispositions had previously been found to have significant relationships with

educational variables, and the researchers set out to determine if these factors might be at play

within the area of mathematics performance. The results of the study showed that the differences

in student performance and achievement that can be attributed to gender differences have declined

over time. However, a measurably higher success rate does still exist for males, albeit to a

smaller extent than in the past. The researchers interpreted their findings to determine that a

gender differential in performance should be factored in when evaluating student scores, and


should also be accounted for within instructional strategies and performance assessment planning

(Randhawa & Hunter, 2001).

As with mathematics, science assessments have been the focus of various studies aimed

at their improvement. Shymansky, Enger, Chidsey, and Yore (1997) explored this area in an

attempt to determine the feasibility of combining the expertise of science teachers, science

educators, and test developers in order to build creative and authentic performance assessments

within the content area. As with mathematics, a focus on having science students perform the

subject in a real-world context was tested for its ability to produce higher scores for the average

student. For their study, researchers asked three questions: “Is performance-based assessment

doable within the framework of a large-scale testing program,” “Do responses to performance-

based assessments reveal important information about learners not revealed through traditional

multiple-choice assessments,” and “How do students perform on these performance-based

tasks?” (Shymansky et al., 1997, p. 172).

The study examined the statistical relationship between four performance assessments

that were designed for 9th grade students and tracked the performance of those students in the

Iowa Tests of Educational Development for science. The performance assessments were

designed to supplement the students’ traditional science tests, which were typically limited to

norm-referenced, multiple-choice questions. The quantitative and statistical results from the

study were inconclusive, but the qualitative insights gained from the experiment showed positive

themes that supported the transformation of science classrooms into areas where students “do

science and develop habits of mind” (Shymansky et al., 1997, p. 182).


Theory-Into-Practice:

By examining the literature concerning student performance assessments, educators can glean valuable lessons in order to develop best practices, particularly in the areas of science and mathematics. A common theme among the studies and discussions was the value

of authenticity within assessments. Pandey (1990) emphasized how important it is to develop

mathematics performance assessments that mimic real-life situations and give students legitimate

practice in everyday scenarios. In the study conducted by the National Academy of Sciences, it was determined that the most effective learning tasks for students are those that involve active learning (Freeman et al., 2014). Finally, Shymansky, Enger, Chidsey, and Yore (1997) stressed

the importance of practicing science in real world contexts as a way to improve student test

scores. With authenticity being such a consistent, positive contributing factor in student

performance, it is critical for effective teachers to include it in their curriculums, and

performance assessments represent a perfect opportunity in which to do so. For example, if a

mathematics teacher wants to maximize his students’ performance on their upcoming Regents

exam, it would be beneficial for him to administer an authentic assessment that requires his

students to problem-solve in real-world scenarios.

A second lesson that can be learned from the performance assessment literature review

deals with the factor of gender disposition. Through the studies, it can be seen quite clearly that

gender dispositions are a factor. Though not as much as in days past, female students are still at

a disadvantage when it comes to their predisposed attitudes towards science and mathematics

and the effect those attitudes have on their relative performances (Randhawa and Hunter, 2001).

Knowing this, teachers must account for the disparity in their instruction and keep a close eye on

their female students to watch for any signs of negative attitudes or perspectives that can be


rectified. If a science teacher hears his female students making remarks that imply the subject

area is more suitable for males, he must be prepared to intercede and provide a positive example

of why that does not have to be the case. Developing performance assessments that involve

strong female characters in authentic scientific scenarios can help counter those myths as well.

The final theme that can be observed throughout the literature on student performance

assessments is simply that performance itself matters. Each study found either quantitative or

qualitative evidence to support the theory that performing is a major part of student learning. It

is not enough for students to learn their content and master learning objectives if they cannot

show evidence of their understanding when asked to do so in authentic situations. It is the

instructor’s responsibility to provide a setting and opportunity for the students to perform that is

as fair and equitable as possible. Additionally, teachers must effectively prepare their students

for success using appropriate practice. For teachers, this also means creating performance

assessment questions that truly assess what they are intended to assess. That is no small task, and it requires hard work and discipline from the teacher, but the resulting gains in student achievement and success make it worth the effort.


Portfolio Assessments:

Literature Review:

Portfolios can serve a powerful purpose in all content areas, including STEM. It is

important that teachers guide the content of the portfolio so that it reflects the standards and learning objectives they teach. When properly implemented, portfolios allow students to reflect on their

course work and guide their future goals. Working with students in creating their portfolios

provides teachers with an opportunity to improve communication and understanding of their

students.

Teachers can aid students in creating clear guidelines for portfolio content. The contents

of the portfolio should be reflective of the lesson or unit’s coordinated learning objectives and

content standards. Teachers can provide these guidelines in the form of an outline or rubric

which clearly indicates each component of the portfolio and its purpose. In a STEM course like

physics, this may include components like lab reports, data from experiments and essays

(Whitworth & Bell, 2013). These guidelines help ensure that a student’s final portfolio will contain

components aligned with the learning standards and objectives.

While teachers should create clear expectations for portfolio contents, it is also important

that students are afforded an opportunity to reflect on their work within the portfolio. Students

may be asked to create an introduction to their portfolio at the beginning of the unit or lesson and

corresponding conclusion or reflection at the end. Students may also be afforded some choice in

which pieces of work are included in their portfolio. When used appropriately, many students

report that portfolios improve their understanding of their past and current coursework and help

them connect ideas to future content and learning goals (Cruz & Zambo, 2013). This is particularly useful


in STEM subjects when the complexity of content can leave students wondering how their

course work will be useful to them in their future lives.

While the benefits of effective portfolio implementation are clear for students, there are

also benefits for teachers themselves. Teachers often find that their use of portfolios provides a

source of communication that helps them gain a better understanding of their students. Over

time, teachers can reflect on student portfolios and solidify their knowledge and ability to

connect standards to their learning objectives and assessments. These skills are imperative for

teacher success, especially in current times as teachers are being held accountable for student

performance and standardized test scores. Portfolios provide a useful tool for helping teachers

embrace the use of standards and gain understanding of their connection to student learning

(Kim & Yazdian, 2014).

Theory-Into-Practice:

Portfolio use benefits students and teachers by guiding learning and helping them reflect

upon results. Teachers must create guidelines for the components of portfolios so that students

can have the ability to reflect on their final work. The ability to reflect on their course work

helps students create future learning goals for themselves. Teachers also benefit from effective

portfolio use as they progress in their own professional development. In STEM subjects,

portfolios can be useful tools for improving student interest in content areas that are often viewed

as difficult and intimidating. Portfolios generally fall into two categories, either growth or best

works (Nitko, 2015).

Growth portfolios provide examples of students’ work over the course of a lesson or unit

learning plan. They contain samples of student work in the form of formative assessments as


well as feedback from the teacher and student notes. By displaying the progress of work over

time, student development and understanding can be observed. This is a useful tool for teachers

in tracking student progress and making necessary adjustments and accommodations for

students. By reflecting often on the growth portfolio, teachers can help assure that all students

demonstrate the ability to meet the high expectations of the unit learning objectives and

standards.

An example of a growth portfolio in a STEM subject like geometry would include

various components developed throughout a lesson or unit plan, such as student practice problems and samples of formative homework, along with the feedback the teacher provided to correct errors in student work. Further formative work in the portfolio would then demonstrate the student’s ability to improve and perform the skills laid out in the learning objectives.

Best works portfolios contain pieces of students’ final work or products from the lesson

or learning unit. In this way, they serve more of a summative purpose rather than the formative

focus of growth portfolios. These portfolios may be used to provide evidence of mastery of the

learning standards as required for giving students a final grade and allowing them to progress to

the next level or graduate from a program. They also provide an opportunity for students to

show their work to parents or future teachers.

In a STEM content area like robotics this may take the form of a video demonstration of

the robot and its abilities. This final polished exhibit would not include all the formative pieces

of work that took place along the way, as seen in a growth portfolio. Rather, it would showcase

the student’s understanding of the learning standards and objectives in a finalized form of


evidence. Best works portfolios should still include a component of student metacognitive skills,

for example, in the form of a final self-reflection entry or questionnaire.

In both growth and best works portfolios it is imperative that coordination with the

standards and learning objectives is demonstrated. This requires continued effort on the part of

the teacher in creating guidelines for the portfolio that reflect these standards and objectives. It

also requires a continued understanding of the standards by the teacher and the ability to write

learning objectives that accurately reflect those goals. When implemented appropriately, both

forms of portfolios can work together to aid in student development over time and to provide final

evidence of their ability to perform the important skills reflected in the learning standards.


Rubrics:

Literature Review:

While not used as often as in English or history courses, rubrics can meet the needs of

educators in assessing students in STEM courses. Learning science, technology, engineering,

and mathematics requires much more than memorizing content. In the 21st century, it requires

demonstrating critical thinking skills, which can be difficult to assess without the use of a rubric.

These higher-order thinking skills include cognitive skills such as application, analysis,

synthesis, evaluation, and creativity. Students are no longer passive participants in STEM

courses; they must take an active role in learning. This typically occurs through the use of

formative assessments and summative assessments such as performance assessments and

portfolio development. These assessments and the skills they attempt to measure are perfect

opportunities to introduce the use of rubrics into STEM courses.

Criteria for STEM rubrics should fall within four categories: content knowledge, higher-

order thinking skills, communication skills, and science literacy (Kishbaugh et al., 2012).

Content knowledge represents the facts, concepts, and theories that are taught specifically in the

subject area. Higher-order thinking skills, as described above, are typically required for reports,

research papers, and visual materials that students create. Communication skills, including verbal, written,

and visual, must also be assessed, but they will vary depending on the assessment. For science

courses, aspects of science literacy, or the nature of science, should also be assessed by the

rubric. This category covers the “philosophy and sociology of science” (Kishbaugh et al., 2012,

p. 269), ensuring that students understand that the nature of science is empirical, tentative,

inferential, theory-laden, embedded in a wider culture, founded on no specific scientific method,


and creative. Rubrics for any long-term, summative, or complex assessment should cover each

of these major categories.

STEM rubrics can also assess short-essay or constructed-response prompts in addition to entire assessments. For a rubric to be suitable, the prompts should assess not only recall or understanding but also higher-order thinking skills. A well-designed general rubric for a constructed-response task was provided in Tang, Coffey, and Levin’s (2015) case study of high school biology rubrics. In general, students’ responses should demonstrate that they synthesize information, include supporting details, integrate ideas, use scientific language accurately, and, if applicable, apply concepts to authentic situations.

In courses such as mathematics and physics, assessments that focus on problem solving may require different criteria in a rubric. Hull et al. (2013) identified a five-step problem-solving strategy that students should use to approach such problems. The five steps are: visualize the problem; describe the problem in mathematical or scientific terms; plan a solution; execute the plan; and check and evaluate the answers. This strategy can be applied to a variety of STEM critical-thinking problems. To assess students’ mastery of this process, Hull et al. recommend including the following criteria on the rubric: evidence of conceptual understanding, description of the problem, appropriate equations used, reasonable plan, logical progression, and correct calculations. This can be simplified for problems that are not overly complex.

A good rubric is not only a reliable and valid way to assess students, but it can be a way

to help students understand their teacher’s expectations and standards. Rubrics can “clarify

learning goals, build complex understandings, and encourage intellectual risk-taking” (Siegel et al., 2011, p. 30). For a rubric to have such a positive effect on student performance, it must be

well thought out, distributed and explained in advance, and supplemented with checklists and


instructions that further clarify expectations. Rubrics are helpful only if students engage with

them and understand what the rubric is attempting to communicate. One way to help students

understand their rubric is to ask them to rewrite it in their own words. When a student can

describe the teacher’s expectations in their own words, they will understand what is required to

meet the standards. Another strategy is to have students evaluate one another’s drafts or a

sample assignment using the rubric. By practicing applying the rubric, they will be able to fine-

tune their own work.

Some teachers have found that when they give students a rubric in advance, students will

do the minimum required to meet expectations and not exhibit the creativity and innovation that

the assessment calls for. Siegel et al. (2011) suggest two remedies for this situation. First, they

suggest including an extra performance level, typically labeled “exceeding expectations” or

“beyond expectations.” Reaching this performance level can earn students extra credit or round

up their grade. They also suggest requiring students to reflect on their learning and performance.

By focusing students on their learning goals, rather than simply their grades, students will

exercise their higher-order thinking skills through reflection and introspection.

Finally, rubrics can be implemented in conjunction with formative assessments for long-

term projects, tasks, or portfolios. To help students manage the workload, teachers can

implement checkpoints throughout the assessment to provide students with feedback on their

progress. In this situation, it is important to teach students how to interpret comments and

feedback so that they will not be discouraged (Capraro, Capraro, & Morgan, 2013).


Theory-Into-Practice:

For STEM educators and others incorporating STEM into their curricula, it is vital to

assess higher-order thinking skills. While it is possible to do this through traditional

assessments, portfolios and performance tasks are great ways to assess these skills. These

assessments commonly require students to create a product or follow a complicated process.

Grading and providing feedback can be challenging, however. Rubrics provide a clear grading

standard for both teachers and students, increasing the validity and reliability of these alternative

assessments.

Because STEM assessments can address many different higher-order thinking, critical

thinking, and problem solving skills, rubrics’ flexible nature makes them the ideal assessment

tool in most circumstances. While task-specific, analytic rubrics are most common, hybrid rubrics or multiple rubrics can be used if needed. To be effective, a STEM rubric must

match the assessment and the teacher’s expectations. Measuring content knowledge is not

enough, however. The rubric must measure those higher-order thinking skills that students must

demonstrate through their portfolios or performance assessments.

In addition to being a grading tool, rubrics should be used to communicate expectations

to students. They are useful in helping students understand exactly what the teacher expects in

terms of the scope and quality of work, while leaving flexibility for students to display their

creativity. Rubrics provide a common language with which to discuss levels of proficiency. Due to

the long-term nature of many STEM performance and portfolio assessments, the rubric can serve

as a checklist or in conjunction with a checklist to guide students through the various

requirements of the assessment. An enriching STEM course will undoubtedly require rubrics to

assess higher-order thinking skills.


References

Capraro, R. M., Capraro, M. M., & Morgan, J. R. (2013). STEM project-based learning: An

integrated science, technology, engineering, and mathematics (STEM) approach (2nd ed.). Rotterdam: Sense Publishers.

Cruz, H. L., & Zambo, D. (2013). Student data portfolios give students the power to see their

own learning. Middle School Journal, 44(5), 40-47.

Freeman, S., Eddy, S., McDonough, M., Smith, M., Okoroafor, N., Jordt, H., & Wenderoth, M.

(2014). Active learning increases student performance in science, engineering, and

mathematics. Proceedings of the National Academy of Sciences of the United States of

America, 111(23), 8410-8415. doi: 10.1073/pnas.1319030111

Howes, A., Kaneva, D., Swanson, D., & Williams, J. (2013). Re-envision STEM education:

Curriculum, assessment and integrated, interdisciplinary studies. The University of

Manchester.

Hull, M. M., Kuo, E., Gupta, A., & Elby, A. (2013). Problem-solving rubrics revisited: Attending

to the blending of informal conceptual and formal mathematical reasoning. Physical

Review Special Topics - Physics Education Research, 9(1).

doi:10.1103/PhysRevSTPER.9.010105

Kim, Y., & Yazdian, L. S. (2014). Portfolio assessment and quality teaching. Theory Into

Practice, 53(3), 220-227.

Kishbaugh, T. L. S., Cessna, S., Jeanne Horst, S., Leaman, L., Flanagan, T., Graber Neufeld, D.,

& Siderhurst, M. (2012). Measuring beyond content: A rubric bank for assessing skills in


authentic research assignments in the sciences. Chem. Educ. Res. Pract, 13(3), 268-276.

doi:10.1039/C2RP00023G

Kwon, H. (2015). Instructions innovation for creative convergence: Based on e-portfolio of

Hanyang elementary school. International Information Institute (Tokyo). Information,

18(5), 1919-1924.

Madden, M. E., Baxter, M., Beauchamp, H., Bouchard, K., Habermas, D., Ladd, B.,... Plague, G.

(2013). Rethinking STEM education: An interdisciplinary STEAM curriculum. Procedia Computer Science, 20, 541-546.

Pandey, T. (1990, November 30). Authentic Mathematics Assessment. ERIC/TM Digest.

Retrieved April 01, 2017, from https://eric.ed.gov/?id=ED354245

Quigley, C. F. & Herro, D. (2016). Finding the joy in the unknown: Implementation of STEAM

teaching practices in middle school science and math classrooms. Journal of Science

Education and Technology, 25(3), 410-426.

Randhawa, B. S., & Hunter, D. M. (2001). Validity of performance assessment in mathematics

for early adolescents. Canadian Journal of Behavioural Science, 33(1), 14-24.

Shymansky, J. A., Enger, S., Chidsey, J. L., Yore, L. D., et al. (1997). Performance assessment

in science as a tool to enhance the picture of student learning. School Science and

Mathematics, 97(4), 172-183.

Sickel, A. J., & Witzig, S. B. (2017). Designing and teaching the secondary science methods course: An international perspective. Dordrecht: Sense.

Siegel, M. A., Halverson, K., Freyermuth, S., & Clark, C. G. (2011). Beyond grading: A

series of rubrics for science learning in high school biology courses. The Science Teacher,

78(1), 28-33.


Tang, X., Coffey, J., & Levin, D. M. (2015). Reconsidering the use of scoring rubrics in biology

instruction. The American Biology Teacher, 77(9), 669-675. doi:10.1525/abt.2015.77.9.4

Whitworth, B. A., & Bell, R. L. (2013). Physics portfolios. The Science Teacher, 80(8),

38-43.


Appendix A - Handouts

STEAM Handout:

What is STEAM education?

STEAM stands for science,

technology, engineering, art and math. It is

similar to STEM, as it is an interdisciplinary

approach to learning. With an

interdisciplinary curriculum, students are

better able to appreciate how each content area is

necessary for success in complex, real world

situations.

STEAM seeks to prepare students

for successful careers in the 21st Century.

Through STEAM, students will become

creative, innovative critical thinkers who are

able to adapt to change. Assessments in

STEAM are commonly in the form of

authentic assessments, project-based

assignments, and group collaboration.

STEAM Resources:

Educationcloset.com/steam

Edutopia.org/stem-to-steam-resources

Steamtosteam.org/resources

STEAM Lesson Assessment:

- Plant cells: (Grades 4-6)

Objective: to gain a working knowledge of

a plant cell through inquiry and solution,

hands-on creation of a plant cell cross-

section, exploration of plant cell

terminology, and digital documentation of

learning.

- Science:

- Technology:

- Engineering:

- Arts:

- Math:


Performance Assessments Handout:

**Highlights provided for classroom discussion

Teacher Voice: In Defense of Standardized Testing tnscore.org · by James Aycock · May 20, 2014

It seems like every day there’s another article about the horrors of standardized testing. Just this

week comedian Louis CK raised the issue to seemingly new heights when he first took to the

Twittersphere to rail against testing and then followed up with an appearance on Letterman. His

daughter took New York’s state tests last week, and he wasn’t happy.

New York was not alone. Across our state, schools administered the Tennessee Comprehensive

Assessment Program (TCAP) this past week. Schools in other states did the same with their

respective state tests.

And Louis CK is not the only one speaking out against testing. It’s become a trend.

But I’M GOING TO TAKE THE UNPOPULAR STANCE OF DEFENDING STANDARDIZED

TESTING.

Why Test?

First, let’s look at the purpose of tests – or, as many educators prefer, assessments. And, to do

so, let’s consider what school would be like without assessments.

How would we know what kids know without assessments? That’s the purpose of testing kids –

to figure out what they know and are able to do.

Assessments also give us data to inform instruction. If I teach something, but my class still

hasn’t mastered it, then as a teacher I need to examine how I taught it the first time in order to

teach it better next time. Likewise, if my class already knows something, I don’t need to teach it

to them; we can move on to other things. Maybe most of my class have mastered a skill, but a


handful need more time. Either way, I need data to inform my teaching – and that data comes

from assessments.

In sum, TESTING LETS US KNOW WHAT KIDS KNOW AND CAN DO, WHICH HELPS US

TEACH THEM BETTER.

Why Standardized Tests?

Okay, so maybe we do need to test kids. BUT IS IT NECESSARY TO TAKE STANDARDIZED

TESTS? That’s a fair question, so let’s look at the purpose of tests being standardized.

Not all classes are equal. We all know this, right? Some teachers are better than others, some

classes are harder (and some easier), etc. As a result, not all tests are equal.

Teacher A, the veteran master teacher, will probably write better tests than Teacher B, the

rookie who has never written a test before. Likewise, Teacher C, the hardworking young teacher

with high expectations, will probably write a much more difficult test than Teacher D, the

veteran who has been in the infamous “dance of the lemons” and has taught at five schools in

the past five years.

IT IS FOR THIS REASON, BECAUSE NOT ALL TESTS ARE EQUAL, THAT WE NEED A

COMMON TEST. That’s what it means for a test to be standardized, after all – that everyone

takes the same test.

If everyone is taking different tests, then you can’t compare scores. If you can’t compare scores,

then you can’t measure teachers, schools, or districts.

If Teacher A’s students achieve 1.5 years of growth in a single school year, then we need to know

what she is doing and share it with others. If Teacher B’s students down the hall only grow 0.75

years, then he probably needs extra coaching and support. The same with a school or a district;

those achieving growth should be celebrated, while those not achieving growth should be

supported. Either way, we can only determine objective growth data if tests are standardized.


One more point here. Critics often talk about income, race, native language, disability status, etc.

Well, the great thing about standardized tests is that everyone takes the same test, no matter of

any of that. A standardized test is an equal playing field. When they’re graded, no one is looking

at income or zip code.

A Few Caveats

Now, just because I’m in favor of standardized tests in general does not mean that I’m in favor of

all standardized tests. SOME TESTS ARE BAD. Case in point: Tennessee’s TCAP. Multiple

choice has its place, but an all-multiple choice test like TCAP is not the best. Any

Reading/Language Arts test that doesn’t require short answer and/or essays is a bad test, and

any Math test that doesn’t require you to show work is a bad test. You just can’t assess deep

knowledge and understanding with multiple choice. (That’s why the recent decision to delay

PARCC, a clear upgrade over TCAP, by our state legislature makes no sense.)

Once we accept this, it becomes clear that it’s not enough to be in favor of standardized tests

– WE HAVE TO BE IN FAVOR OF good ONES.

Critics often like to talk about teaching to the test. Louis CK used this argument on Letterman

the other day. Well, he’s wrong for two reasons.

First, the test is not to blame! It’s not like tests have agency. A test can’t make a teacher do

anything. If you have a problem with teaching to the test, then blame the teacher who is writing

the lessons, blame the principal who is probably directing the teacher to teach that way, blame

the superintendent who is pressuring the principal to focus on the test. There’s blame to go

around, but none of that is the fault of the test.

Secondly, what’s wrong with teaching to the test anyway – if it’s a good test? You know how

good teachers plan? They start planning a unit by writing the test they want to use at the end.

Good teachers define the end goal first and then plan backwards from there. They plan their

lessons based on the test they wrote. That’s good teaching.


THE PROBLEM IS NOT TEACHING TO THE TEST. THE PROBLEM IS WITH BAD

TESTS. Anytime you teach to a test that is only multiple choice, you are setting the bar too low.

Good teachers in Tennessee are transitioning to the Common Core State Standards (CCSS),

whether or not the PARCC test starts next year, because CCSS are better than the current TN

State Standards. We taught Common Core this year at my school, and our kids were still

prepared for TCAP. And I’m convinced that teaching the more rigorous CCSS will produce

higher TCAP scores anyway.

Conclusion

TCAP wasn’t even talked about at our school this year. The first time I remember our principal

even mentioning it was after Spring Break, when she told us that TCAP was coming soon but

that we were doing a great job, that we already had data to show that our scholars had grown

tremendously, and that our kids are more than test scores.

We didn’t do a bunch of test prep, and we didn’t have any big TCAP pep rally. We just went

about the business of good teaching and learning. Test day was just another day, no big deal.

And, when I asked several scholars who struggled this year how they did on TCAP, they

responded that “our teachers taught us all the hard stuff, so the test was pretty easy.”

We’ll see how they did soon enough. But it felt good. The test was important, very important, but

nothing to get worked up over.

That’s how it should be.

This blog was first published at Bluff City Education on May 12, 2014.

tnscore.org · by James Aycock · May 20, 2014


Portfolio Assessments Handout:

Group One – Earth Science

Based on the Next Gen Science Standard listed below, give some examples of different components that would be found in both a growth portfolio and a best works portfolio. You do not necessarily have to address each standard; just provide some examples that come to mind.

Record your answers on the paper provided.

1-ESS1 Earth's Place in the

Universe

Students who demonstrate understanding can:

1-ESS1-1. Use observations of the sun, moon, and stars to describe patterns that can

be predicted. [Clarification Statement: Examples of patterns could include that

the sun and moon appear to rise in one part of the sky, move across the sky, and

set; and stars other than our sun are visible at night but not during the day.]

[Assessment Boundary: Assessment of star patterns is limited to stars being seen

at night and not during the day.]

1-ESS1-2. Make observations at different times of year to relate the amount of

daylight to the time of year. [Clarification Statement: Emphasis is on relative

comparisons of the amount of daylight in the winter to the amount in the spring

or fall.] [Assessment Boundary: Assessment is limited to relative amounts of

daylight, not quantifying the hours or time of daylight.]


Group Two – Physical Science

Based on the Next Gen Science Standard listed below, give some examples of different components that would be found in both a growth portfolio and a best works portfolio. You do not necessarily have to address each standard; just provide some examples that come to mind.

Record your answers on the paper provided.

1-PS4 Waves and Their Applications

in Technologies for Information

Transfer

Students who demonstrate understanding can:

1-PS4-1. Plan and conduct investigations to provide evidence that vibrating materials

can make sound and that sound can make materials vibrate. [Clarification

Statement: Examples of vibrating materials that make sound could include tuning

forks and plucking a stretched string. Examples of how sound can make matter

vibrate could include holding a piece of paper near a speaker making sound and

holding an object near a vibrating tuning fork.]

1-PS4-2. Make observations to construct an evidence-based account that objects in

darkness can be seen only when illuminated. [Clarification Statement: Examples

of observations could include those made in a completely dark room, a pinhole

box, and a video of a cave explorer with a flashlight. Illumination could be from

an external light source or by an object giving off its own light.]

1-PS4-3. Plan and conduct investigations to determine the effect of placing objects

made with different materials in the path of a beam of light. [Clarification

Statement: Examples of materials could include those that are transparent (such as

clear plastic), translucent (such as wax paper), opaque (such as cardboard), and

reflective (such as a mirror).] [Assessment Boundary: Assessment does not include

the speed of light.]

1-PS4-4. Use tools and materials to design and build a device that uses light or

sound to solve the problem of communicating over a distance.* [Clarification

Statement: Examples of devices could include a light source to send signals, paper

cup and string “telephones,” and a pattern of drum beats.] [Assessment Boundary:

Assessment does not include technological details for how communication devices

work.]


Group Three – Life Science

Based on the Next Gen Science Standard listed below, give some examples of different components that would be found in both a growth portfolio and a best works portfolio. You do not necessarily have to address each standard; just provide some examples that come to mind.

Record your answers on the paper provided.

K-LS1 From Molecules to

Organisms: Structures and

Processes

Students who demonstrate understanding can:

K-LS1-1. Use observations to describe patterns of what plants and animals (including

humans) need to survive. [Clarification Statement: Examples of patterns could

include that animals need to take in food but plants do not; the different kinds of

food needed by different types of animals; the requirement of plants to have light;

and, that all living things need water.]

Students who demonstrate understanding can:

1-LS1-1. Use materials to design a solution to a human problem by mimicking how

plants and/or animals use their external parts to help them survive, grow,

and meet their needs.* [Clarification Statement: Examples of human problems

that can be solved by mimicking plant or animal solutions could include designing

clothing or equipment to protect bicyclists by mimicking turtle shells, acorn

shells, and animal scales; stabilizing structures by mimicking animal tails and

roots on plants; keeping out intruders by mimicking thorns on branches and

animal quills; and, detecting intruders by mimicking eyes and ears.]

1-LS1-2. Read texts and use media to determine patterns in behavior of parents and

offspring that help offspring survive. [Clarification Statement: Examples of

patterns of behaviors could include the signals that offspring make (such as

crying, cheeping, and other vocalizations) and the responses of the parents (such

as feeding, comforting, and protecting the offspring).]


Rubrics Handout:

STEM Rubric Reminders

Bloom’s Taxonomy

Sample Criteria for STEM Rubrics:

- Synthesis of information
- Use of supporting details
- Use of accurate scientific terms
- Application of information
- Evidence of conceptual understanding
- Communication skills (written, oral, visual)
- Applying correct formulas
- Reflection of learning
- Evaluation of data/sources
- Making predictions based on evidence


Your Situation

You are a 9th grade biology teacher. Your class is beginning a unit on

plants and photosynthesis. You have decided to plan a performance

assessment for your students. They will attempt to grow lima beans from seeds

in a variety of different environments to determine what factors are necessary

for the beans to sprout. At the end of the assessment, students will submit a

report explaining their findings using concepts from the unit. You haven’t

planned all the details of the assessment yet, but you want to address higher-

order thinking skills. The first thing you decide to do is to create a rubric upon

which you will base your instructions.

Your Challenge

Pick a higher-order thinking skill to assess: ________________________

Create a rubric criterion (category) to measure it:

How would you describe the highest performance level:


Your Situation

You are a 9th grade technology teacher. Your class is beginning a unit on

simple machines, forces, and energies (kinetic and potential). You have decided

to plan a performance assessment for your students. They will build catapults

using their knowledge from the unit and attempt to launch objects for

distance, height, and accuracy. At the end of the assessment, students will

demonstrate and verbally explain how to increase height, distance, and

accuracy using concepts from the unit. You haven’t planned all the details of

the assessment yet, but you want to address higher-order thinking skills. The

first thing you decide to do is to create a rubric upon which you will base your

instructions.

Your Challenge

Pick a higher-order thinking skill to assess: ________________________

Create a rubric criterion (category) to measure it:

How would you describe the highest performance level:


Your Situation

You are a 9th grade algebra 1 teacher. Your class is beginning a unit on

statistics and probability. You have decided to plan a performance assessment

for your students. They will compute the mean, median, mode, and standard deviation, and

create frequency tables based on the colors of M&M’s. At the end of the

assessment, students will submit a visual representation of these concepts

displaying their findings. You haven’t planned all the details of the assessment

yet, but you want to address higher-order thinking skills. The first thing you

decide to do is to create a rubric upon which you will base your instructions.

Your Challenge

Pick a higher-order thinking skill to assess: ________________________

Create a rubric criterion (category) to measure it:

How would you describe the highest performance level:
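For the algebra scenario above, the short Python sketch below illustrates the statistics the students would compute. The bag of M&M colors is made-up sample data and is not part of the original handout.

# Hypothetical sample data: colors drawn from one imaginary bag of M&M's.
from collections import Counter
from statistics import mean, median, mode, stdev

colors = ["red", "red", "blue", "green", "yellow", "blue", "brown",
          "orange", "red", "blue", "green", "green", "brown", "blue"]

# Frequency table: how many candies of each color.
frequency = Counter(colors)
print("Frequency table:", dict(frequency))

# Descriptive statistics computed on the per-color counts.
counts = list(frequency.values())
print("Mean:", round(mean(counts), 2))
print("Median:", median(counts))
print("Mode:", mode(counts))
print("Standard deviation:", round(stdev(counts), 2))

Students would typically produce the same figures by hand or in a spreadsheet; the point is that each quantity named in the assessment maps to a concrete calculation they can display visually.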


Summary Handout:

Insert Handout here


Appendix B – Presentation Rubric

Each criterion is scored at one of four performance levels (Did Not Meet Expectations, Developing Proficiency, Approaching Proficiency, Proficient), with a Score recorded per criterion.

Outline
  Did Not Meet Expectations: Outline not submitted. (1 pt)
  Developing Proficiency: Outline submitted late. (5 pts)
  Approaching Proficiency: Incomplete outline submitted. (7 pts)
  Proficient: Complete outline submitted on time. (10 pts)
  Score: ____

Paper - Evidence
  Did Not Meet Expectations: Most articles discussed do not apply to topic and fewer than ten articles included. (10 pts)
  Developing Proficiency: Less than ten articles are discussed, and up to three are off topic. (25 pts)
  Approaching Proficiency: Ten articles discussed, but up to three are off topic. (45 pts)
  Proficient: A minimum of ten articles were discussed, all applicable to the topic. (60 pts)
  Score: ____

Paper - Commentary
  Did Not Meet Expectations: No connections made between articles and topic. (10 pts)
  Developing Proficiency: More than three articles are not discussed in depth and connected to topic. (25 pts)
  Approaching Proficiency: Up to three articles are not discussed in depth and/or connected to topic. (45 pts)
  Proficient: Thorough connections made between at least ten articles and topic. (60 pts)
  Score: ____

Paper - Best Practices
  Did Not Meet Expectations: No recommendations made. (10 pts)
  Developing Proficiency: Recommendations are not detailed and do not apply to topic. (25 pts)
  Approaching Proficiency: Recommendations are not detailed or do not apply to topic. (45 pts)
  Proficient: Practical, detailed recommendations for educators provided. (60 pts)
  Score: ____

Paper - Quality of Writing
  Did Not Meet Expectations: Significant errors and/or informal language, or no APA formatting, or late submission. (5 pts)
  Developing Proficiency: Many spelling/grammatical and/or significant APA formatting errors. (10 pts)
  Approaching Proficiency: Few spelling/grammatical errors or minor APA formatting errors. (15 pts)
  Proficient: Formal language, minimal spelling/grammatical errors, APA formatting, uploaded by deadline. (20 pts)
  Score: ____

Rubric
  Did Not Meet Expectations: Rubric missing significant components. (10 pts)
  Developing Proficiency: Point values do not match instructions. (20 pts)
  Approaching Proficiency: Unclear performance levels and/or irrelevant criteria. (30 pts)
  Proficient: Point values match instructions. Clear criteria and performance levels. (40 pts)
  Score: ____

Presentation - Information
  Did Not Meet Expectations: Information provided was off topic. (5 pts)
  Developing Proficiency: Significant component(s) missing. (15 pts)
  Approaching Proficiency: Relevant information provided, but discussion was superficial. (20 pts)
  Proficient: Information was relevant and thoroughly discussed. (25 pts)
  Score: ____

Presentation - Engagement
  Did Not Meet Expectations: Presenters did not engage with audience. (5 pts)
  Developing Proficiency: Minimal engagement with audience. (15 pts)
  Approaching Proficiency: Inconsistent engagement with audience. (20 pts)
  Proficient: Held audience’s attention and thoroughly involved them in the workshop. (25 pts)
  Score: ____

Presentation - Handouts/Visuals
  Did Not Meet Expectations: Minimal handouts or visuals incorporated into workshop. (5 pts)
  Developing Proficiency: Materials do not relate to or contribute to the workshop. (15 pts)
  Approaching Proficiency: Materials are disorganized, include errors, and/or are cluttered. (20 pts)
  Proficient: Clear, professional-looking materials that provide relevant information. (25 pts)
  Score: ____

Presentation - Other
  Did Not Meet Expectations: Late submission of materials. (5 pts)
  Proficient: All materials ready for presentation and uploaded before deadline. (25 pts)
  Score: ____

Comments:

Total Score (out of 350):
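As a quick arithmetic check of the point scheme above, this hypothetical sketch sums the Proficient-level point values for each criterion; the total matches the 350-point maximum listed in the rubric.

# Proficient-level point values copied from the rubric above;
# summing them confirms the 350-point maximum.
max_points = {
    "Outline": 10,
    "Paper - Evidence": 60,
    "Paper - Commentary": 60,
    "Paper - Best Practices": 60,
    "Paper - Quality of Writing": 20,
    "Rubric": 40,
    "Presentation - Information": 25,
    "Presentation - Engagement": 25,
    "Presentation - Handouts/Visuals": 25,
    "Presentation - Other": 25,
}
print(sum(max_points.values()))  # 350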