Developing Rubrics for Measuring Learning Outcomes
Dr. Síxifo Falcones, FIEC – ESPOL
2013
ABET Criteria [1]
• Criterion 1. Students
• Criterion 2. Program Educational Objectives
• Criterion 3. Student Outcomes
• Criterion 4. Continuous Improvement
• Criterion 5. Curriculum
• Criterion 6. Faculty
• Criterion 7. Facilities
• Criterion 8. Institutional Support
• Program Criteria
CRITERION 4. CONTINUOUS IMPROVEMENT [1]
This section of your self-study report should document your processes for regularly assessing and evaluating the extent to which the student outcomes are being attained. This section should also document the extent to which the student outcomes are being attained. It should also describe how the results of these processes are being utilized to effect continuous improvement of the program.
ABET Terminology [2]
Source: ABET Webinar, March 2010. Copyright 2010 G. Rogers, ABET, Inc. (grogers@abet.org)
ABET Terms — 2010 Definitions

Program Educational Objectives: Broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve.

Program Outcomes: Statements that describe what students are expected to know and be able to do by the time of graduation.

Performance Criteria: Specific, measurable statements identifying the performance(s) required to meet the outcome; confirmable through evidence.

Assessment: Processes that identify, collect, and prepare data that can be used to evaluate achievement.

Evaluation: Process of reviewing the results of data collection and analysis and making a determination of the value of findings and action to be taken.

ABET Terms — 2011 Definitions

Program Educational Objectives: Broad statements that describe what graduates are expected to attain within a few years after graduation.

Student Outcomes: Student outcomes describe what students are expected to know and be able to do by the time of graduation. These relate to the knowledge, skills, and behaviors that students acquire as they progress through the program.

Performance Criteria: Specific, measurable statements identifying the performance(s) required to meet the outcome; confirmable through evidence.

Assessment: Assessment is one or more processes that identify, collect, and prepare data to evaluate the attainment of student outcomes and program educational objectives. Effective assessment uses relevant direct, indirect, quantitative and qualitative measures as appropriate to the objective or outcome being measured. Appropriate sampling methods may be used as part of an assessment process.

Evaluation: Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment processes. Evaluation determines the extent to which student outcomes and program educational objectives are being attained. Evaluation results in decisions and actions regarding program improvement.
ABET Student Outcomes [1]
a) An ability to apply knowledge of mathematics, science and engineering
b) An ability to design and conduct experiments, as well as to analyze and interpret data
c) An ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability
d) An ability to function on multidisciplinary teams
e) An ability to identify, formulate, and solve engineering problems
f) An understanding of professional and ethical responsibility
g) An ability to communicate effectively (3g1 orally, 3g2 written)
h) The broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context
i) A recognition of the need for, and an ability to engage in, life-long learning
j) A knowledge of contemporary issues
k) An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.
Examples of Performance Criteria for Assessing Student Outcome "d" [2]
From program educational objective to performance criteria (teamwork example):

Program Educational Objective: Graduates will be able to demonstrate ability to solve complex problems and participate in a team-based environment.

Program Outcome: Ability to function effectively on a team.

Performance Criteria:
• Researches and gathers information
• Fulfills duties of team roles
• Shares in work of team
• Listens to other teammates
Sample peer-evaluation form for Student Outcome d) "An ability to function on multidisciplinary teams":

Please rate each member of the team on the following scale: Unsatisfactory (1), Developing (2), Satisfactory (3), Exemplary (4).

For each team member, the rater scores four attributes and an average score is recorded:
• Researched and gathered information
• Fulfilled team role's duties as assigned
• Shared in the work of the team
• Listened to other teammates' points of view
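A form like this is typically scored by averaging the 1–4 ratings a rater gives each teammate across the four attributes. A minimal sketch; the attribute list comes from the form above, while the sample ratings are made up:

```python
# Sketch: averaging peer-evaluation ratings on the 1-4 scale
# (Unsatisfactory=1 ... Exemplary=4). The ratings below are hypothetical.

ATTRIBUTES = [
    "Researched and gathered information",
    "Fulfilled team role's duties as assigned",
    "Shared in the work of the team",
    "Listened to other teammates' points of view",
]

def average_score(ratings):
    """Average the 1-4 ratings one rater gave a teammate."""
    if set(ratings) - {1, 2, 3, 4}:
        raise ValueError("ratings must be on the 1-4 scale")
    return sum(ratings) / len(ratings)

# One rater's form for one teammate: a rating per attribute.
form = dict(zip(ATTRIBUTES, [3, 4, 3, 2]))
print(average_score(list(form.values())))  # 3.0
```

In practice each student's final score would average the forms from all teammates, but the per-form computation is the same.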
Bloom's Taxonomy: Levels of the Cognitive Domain [2]
The verbs used to define performance criteria should be chosen according to the level at which the course sits.
Developing performance criteria — two essential parts:
• Content referent: the subject content that is the focus of instruction (e.g., steps of the design process, chemical reaction, scientific method)
• Action verb: directs students to a specific performance (e.g., "list," "analyze," "apply")

The cognitive levels map onto stages of the curriculum:
• Knowledge — NOVICE — INTRODUCE
• Comprehension, Application, Analysis, Synthesis — NOVICE to INTERMEDIATE to ADVANCED — REINFORCE
• Synthesis, Evaluation — ADVANCED — DEMONSTRATE/CREATE
Bloom's Taxonomy: Levels of the Cognitive Domain [3]
Program Outcomes and Performance Criteria

Performance criteria are a means to focus on specific expectations of a program. They facilitate the curriculum delivery strategies and assessment procedures. There is an important first step that must come before the development of performance criteria: deciding on program outcomes. These are usually communicated to students in the program description, and are stated in terms that inform the students about the general purpose of the program and the expectations of the faculty. The primary difference between program outcomes and performance criteria is that program outcomes are intended to provide general information and thus are not measurable, while performance criteria indicate concrete, measurable expectations. Performance criteria are developed from program outcomes.
Sample program outcomes:
• Students will have an understanding of the social influences that affected technology in culture.
• Students will work effectively as a member of a team.
• Students can apply the principles of math and science to a technical problem.
• Students will have an appreciation for the need to be lifelong learners.
Performance criteria indicate what concrete actions the student should be able to perform as a result of participation in the program, and state the minimum criteria for evaluation. Once program outcomes have been identified, the knowledge and skills necessary for the mastery of these outcomes should be listed. This allows the desired behavior of the students to be described, and eliminates ambiguity concerning demonstration of expected competencies. Performance criteria are made up of at least two main elements: an action verb and content (referent). The expected behavior must be specified by name, using an observable action verb such as demonstrate, interpret, discriminate, or define.

Sample performance criteria:
• Students will know of a professional code of ethics. (knowledge)
• Students will be able to locate technical information independently. (comprehension)
• Students will solve research problems through the application of scientific methods. (application)
COGNITIVE domain: learning is demonstrated by knowledge recall and the intellectual skills: comprehending information, organizing ideas, analyzing and synthesizing data, applying knowledge, choosing among alternatives in problem-solving, and evaluating ideas or actions. For each level: illustrative verbs, definition, and example.

Knowledge
• Illustrative verbs: arrange, define, describe, duplicate, identify, label, list, match, memorize, name, order, outline, recognize, relate, recall, repeat, reproduce, select, state
• Definition: remembering previously learned information
• Example: memory of specific facts, terminology, rules, sequences, procedures, classifications, categories, criteria, methodology, principles, theories, and structure

Comprehension
• Illustrative verbs: classify, convert, defend, describe, discuss, distinguish, estimate, explain, express, extend, generalize, give examples, identify, indicate, infer, locate, paraphrase, predict, recognize, rewrite, report, restate, review, select, summarize, translate
• Definition: grasping the meaning of information
• Example: stating a problem in one's own words, translating a chemical formula, understanding a flow chart, translating words and phrases from a foreign language

Application
• Illustrative verbs: apply, change, choose, compute, demonstrate, discover, dramatize, employ, illustrate, interpret, manipulate, modify, operate, practice, predict, prepare, produce, relate, schedule, show, sketch, solve, use, write
• Definition: applying knowledge to actual situations
• Example: taking principles learned in math and applying them to figuring the volume of a cylinder in an internal combustion engine

Analysis
• Illustrative verbs: analyze, appraise, break down, calculate, categorize, compare, contrast, criticize, diagram, differentiate, discriminate, distinguish, examine, experiment, identify, illustrate, infer, model, outline, point out, question, relate, select, separate, subdivide, test
• Definition: breaking down objects or ideas into simpler parts and seeing how the parts relate and are organized
• Example: discussing how fluids and liquids differ, detecting logical fallacies in a student's explanation of Newton's 1st law of motion

Synthesis
• Illustrative verbs: arrange, assemble, categorize, collect, combine, comply, compose, construct, create, design, develop, devise, explain, formulate, generate, integrate, manage, modify, organize, plan, prepare, propose, rearrange, reconstruct, relate, reorganize, revise, rewrite, set up, summarize, synthesize, tell, write
• Definition: rearranging component ideas into a new whole
• Example: writing a comprehensive report on a problem-solving exercise, planning a program or panel discussion, writing a comprehensive term paper

Evaluation
• Illustrative verbs: appraise, argue, assess, attach, choose, compare, conclude, contrast, defend, describe, discriminate, estimate, evaluate, explain, judge, justify, interpret, relate, predict, rate, select, summarize, support, value
• Definition: making judgments based on internal evidence or external criteria
• Example: evaluating alternative solutions to a problem, detecting inconsistencies in the speech of a student government representative
Rubrics in the Continuous Improvement Process [2]
[Diagram: Assessment for Continuous Improvement (Gloria Rogers, ABET, Inc.). Mission → Program Educational Objectives → Program (Student) Outcomes → Measurable Performance Criteria → Educational Practices/Strategies → Assessment (collection and analysis of evidence) → Evaluation (interpretation of evidence) → Feedback for Continuous Improvement, with constituent/stakeholder input. Rubrics support the assess/evaluate steps.]
Performance Criteria
• Performance criteria are specific, measurable statements identifying the performances required to meet the outcome; confirmable through evidence
– ABET criteria are silent about the issue of performance criteria
– They define the program outcomes and focus the data collection process in ways that are systematic
– They are high-level indicators of achievement of the program outcomes
What Are Rubrics? [2]
• They are a way to define specifically what is expected of student performance.
• They establish the characteristics of the various levels of performance.
• They can be used to generate a total or partial grade, but in a more specific, detailed and disaggregated way than a single grade.
• They provide those being evaluated with clear information about their performance, and a clear indication of what they need to achieve to improve it.
What Is a Rubric? [2]
• Rubrics normally contain three components:
– Dimensions (performance criteria)
– Scale (performance levels)
– Descriptors (concrete indicators for each level)
Components of a Rubric [3]

The scale runs across the columns (Excellent / Met Well / Met), the dimensions run down the rows, and each cell holds a descriptor:

Preparation
• Excellent: Good preparation process; shows depth in the development of the topic.
• Met Well: Completes the presentation of the summaries; uses the time for clarifications.
• Met: Presents the summary and the planned activity succinctly.

Theoretical Grounding
• Excellent: Masters the proposed topic; manages to connect it and explain it in its different aspects. The assessment manages to analyze the topic.
• Met Well: Manages to explain the topic, relating its different aspects. The assessment takes into account the various aspects presented.
• Met: Knows the topic superficially; manages to explain the points raised. The assessment activity is not very adequate.

Leading the Discussion
• Excellent: Well led; sparks controversy and participation.
• Met Well: Organized; can answer the various questions.
• Met: Leads it, but does not highlight the most important points and does not reach conclusions.

Participation
• Excellent: Pertinent and active; is fundamental to the good development of each of the topics.
• Met Well: Timely; contributes good elements; pays attention to the other contributions.
• Met: Is present; pays little attention to the other contributions.
Types of Rubrics [2]
Analytic Rubric
• Analytic scales tend to focus on important dimensions of student performance related to performance criteria.
• Dimensions are presented in separate categories and rated individually.
• Points with associated descriptors are assigned for performance on each of the dimensions.
Work Effectively in Teams — sample analytic rubric:

Research & Gather Information
• Unsatisfactory (1): Does not collect any information that relates to the topic.
• Developing (2): Collects very little information; some relates to the topic.
• Satisfactory (3): Collects some basic information; most relates to the topic.
• Exemplary (4): Collects a great deal of information; all relates to the topic.

Fulfill Team Role's Duties
• Unsatisfactory (1): Does not perform any duties of the assigned team role.
• Developing (2): Performs very few duties.
• Satisfactory (3): Performs nearly all duties.
• Exemplary (4): Performs all duties of the assigned team role.

Share in Work of Team
• Unsatisfactory (1): Always relies on others to do the work.
• Developing (2): Rarely does the assigned work; often needs reminding.
• Satisfactory (3): Usually does the assigned work; rarely needs reminding.
• Exemplary (4): Always does the assigned work without having to be reminded.

Listen to Other Teammates
• Unsatisfactory (1): Is always talking; never allows anyone else to speak.
• Developing (2): Usually doing most of the talking; rarely allows others to speak.
• Satisfactory (3): Listens, but sometimes talks too much.
• Exemplary (4): Listens and speaks a fair amount.
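Results charts like the ones that follow are produced by counting, for each rubric dimension, the fraction of students rated at or above a target level. A minimal sketch; the target level (Satisfactory = 3) and the sample ratings are assumptions for illustration:

```python
# Sketch: percent of students meeting a target level per rubric dimension.
# Target (Satisfactory = 3 or better) and the sample data are illustrative.

TARGET = 3

# One 1-4 rating per student for each dimension (made-up data).
scores = {
    "Research & Gather Information": [4, 3, 3, 2, 4],
    "Fulfill Team Role's Duties":    [2, 3, 1, 3, 2],
    "Share in Work of Team":         [1, 2, 3, 2, 1],
    "Listen to Other Teammates":     [2, 1, 2, 3, 1],
}

def percent_meeting_target(ratings, target=TARGET):
    """Percentage of ratings at or above the target level."""
    return 100 * sum(r >= target for r in ratings) / len(ratings)

for dim, ratings in scores.items():
    print(f"{dim}: {percent_meeting_target(ratings):.0f}%")
```

The same counts, broken out per level instead of thresholded, give the stacked-bar view shown further below.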
[Bar chart: Example of Results — "Work effectively in teams": percentage of students performing at a level expected for a student who will graduate, on a 0%–100% axis.]
Holistic Rubric
• Advantages:
– They are often written generically and can be used with many tasks.
– They save time by minimizing the number of decisions raters must make.
– Trained raters tend to apply them consistently, resulting in more reliable measurement.
• Disadvantages:
– They do not provide specific feedback about the strengths and weaknesses of student performance.
– Performances may meet criteria in two or more categories, making it difficult to select the one best description. (If this occurs frequently, the rubric may be poorly written.)
– Criteria cannot be differentially weighted.
[Bar chart: Teaming Skills — percent of students meeting target performance per criterion (Research Information, Fulfill Roles, Share in Work, Listens); the values shown include 81%, 54%, 38%, and 25%.]
[Stacked bar chart: Teaming Skills — distribution of students across the performance levels (Exemplary, Satisfactory, Developing, Unsatisfactory) for each criterion: Research and gather information, Fulfill roles and duties, Shares in work, Listens.]
Which Type to Use: Holistic or Analytic? [2]
• Use a holistic rubric when:
– A quick snapshot of performance is sufficient
– A single dimension is adequate to understand the student's performance
• Use an analytic rubric when:
– Strengths and weaknesses need to be analyzed
– Detailed feedback is needed to make improvements
– Complex skills need to be assessed
– You want students to self-assess their own understanding or performance
• The number of levels in the scale should be no more than 6; otherwise inter-rater and intra-rater reliability is compromised.
Example of a Flawed Rubric

Student Outcome "c": Design systems, components, or processes under realistic constraints.

First measurement: students were assessed on the design and simulation of the project circuit, presented during the first part of the course. The following rubric was used:

• Low: The student shows little judgment in designing the project circuit. Cannot turn the block diagram into a schematic diagram. The design does not work on a protoboard and a correct simulation is not achieved.
• Medium: The student demonstrates judgment in designing the project circuit. Can turn the block diagram into a schematic diagram. The design works on a protoboard with errors, and a correct simulation is achieved.
• High: The student demonstrates judgment in designing the project circuit. Can turn the block diagram into a schematic diagram. The design works perfectly on a protoboard, and a correct simulation is achieved.

Numeric mapping of the levels (out of 10): Low — students who scored ≤ 7; Medium — students who scored between 8 and 9; High — students who scored 10.
Example of an Improved Rubric [4]

Each criterion (with its weight in parentheses) is rated Initial (0), Developing (1), Developed (2) or Excellence (3), and the weighted scores are summed:

Selects a design project (2)
• Initial (0): The amount of design is minimal.
• Developing (1): Contains some design.
• Developed (2): Mostly consists of design.
• Excellence (3): Everything must be designed.

Understands the theoretical foundation (1)
• Initial (0): Understands the operation of the circuit superficially.
• Developing (1): Understands the operation of a portion of the circuit in detail.
• Developed (2): Understands the operation of most of the circuit in detail.
• Excellence (3): Understands the operation of the whole circuit in detail.

Uses the simulation tool (1)
• Initial (0): The schematic is complete but the simulation has not been run.
• Developing (1): The simulation has been run for only a portion of the circuit.
• Developed (2): The simulation was run but its result has not been used.
• Excellence (3): The simulation results are used to verify the operation of the circuit.

Implements the circuit on a protoboard (1)
• Initial (0): The circuit is partially assembled and does not work.
• Developing (1): The circuit is fully assembled but does not work.
• Developed (2): Most of the circuit works.
• Excellence (3): The whole circuit works.

Overall performance (TOTAL required score): Initial 0–4, Developing 5–9, Developed 10–13, Excellence 14–15.
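The total in this style of rubric is the weighted sum of the per-criterion levels (0–3), giving a 0–15 range that the last row maps to an overall rating. A sketch under those assumptions; the English criterion names and the sample student are illustrative:

```python
# Sketch: scoring a weighted analytic rubric. Each criterion is rated 0-3,
# multiplied by its weight, and the total (0-15 here) maps to an overall
# level per the "required score" row. Names and ratings are illustrative.

WEIGHTS = {
    "Selects a design project": 2,
    "Understands the theoretical foundation": 1,
    "Uses the simulation tool": 1,
    "Implements the circuit on a protoboard": 1,
}

# (lower bound, level) pairs, highest first, from the required-score row.
LEVELS = [(14, "Excellence"), (10, "Developed"), (5, "Developing"), (0, "Initial")]

def total_score(ratings):
    """Weighted sum of 0-3 ratings keyed by criterion name."""
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

def overall_level(total):
    """Map a weighted total to the overall performance level."""
    for lower, level in LEVELS:
        if total >= lower:
            return level

ratings = {  # hypothetical student
    "Selects a design project": 2,
    "Understands the theoretical foundation": 3,
    "Uses the simulation tool": 2,
    "Implements the circuit on a protoboard": 3,
}
t = total_score(ratings)    # 2*2 + 1*3 + 1*2 + 1*3 = 12
print(t, overall_level(t))  # 12 Developed
```

Note how the weighting lets "Selects a design project" count double without changing the 0–3 scale the rater works with.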
Recommendations
• Make sure students feel motivated to do the work or answer the question used for the assessment (e.g., place the question in the first part of the problem and in the first pages of the exam, and give it a significant weight).
• When designing the questions, if possible, try to assess only one criterion at a time.
• Use only the four levels established by the institution: Initial, Developing, Developed and Excellence.
• Show evidence of having graded only the question or piece of work used for the assessment, preferably in electronic form (e.g., scan the solution to the question at a quality that requires little storage). Only 4 students, 1 per level.
• The rubric must be unified across all sections of a course, and therefore agreed upon by all the instructors.
• Measure each learning outcome only once per semester.
• If the course must assess more than one learning outcome, distribute the measurements throughout the semester (e.g., one in the first exam and the other in the second exam).
• Verify that the text of the learning outcome being assessed matches the one approved by the Board of Directors (e.g., "An ability to apply knowledge of mathematics, science and engineering").
• ABET requires that the measurements be based on rubrics.
• ABET also requires that, in the portfolios, the analysis of the results and the follow-up performed (e.g., determining whether previously proposed changes have had an effect) be written in English.
References
[1] ABET Self-Study Questionnaire: Template for a Self-Study Report, 2013–2014 Review Cycle.
[2] Gloria Rogers, Ph.D., "Developing Rubrics", Professional Services, ABET, Inc.
[3] Assessment matrix designed and used by Professor María del Pilar Aguirre in the twelfth-grade Spanish class at Colegio Bolívar, Cali, Colombia.
[4] Connie M. Schroeder, "Design Project Assessment Rubric (sample analytic rubric)", University of Wisconsin-Milwaukee.