Web-Based Self- and Peer-Assessment of Teachers’ Educational Technology Competencies

Presentation at the ICWL 2011 conference, 9 December 2011, Hong Kong.

Hans Põldoja, Terje Väljataga, Kairit Tammets, Mart Laanpere

Tallinn University, Estonia

This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.


Outline

• Problem statement

• Related work: existing competency frameworks

• Design challenges

• Design methodology

• Current prototypes

• Conclusions and future work

Research context

• Importance of educational technology competencies

• Generic ICT competency frameworks (e.g. ICDL) do not cover all the competencies needed for educational use of ICT

• Educational Technology Competency Model (ETCM) for Estonian teachers

• DigiMina project for assessing teachers’ educational technology competencies

Research problem

To what extent and how could teachers’ educational technology competencies be assessed using a Web-based tool?

Existing competency frameworks

• ECDL / ICDL (International Computer Driving License)

  • Used in 148 countries

  • Focused on ICT usage

  • Standardized testing

• UNESCO ICT Competency Framework for Teachers

  • Launched 2008, revised 2011

  • Guidelines for creating national competency models

  • 6 subdomains

• ISTE NETS-T 2008 (National Educational Technology Standards for Teachers)

  • 20 competencies in 5 competency groups

  • Used in Norway, the Netherlands, Finland, etc.

Design challenges

• How to define performance indicators for all competencies in the ETCM?

• How to select appropriate methods and instruments for assessing competencies?

• How to implement selected assessment methods in a Web-based tool?

Educational Technology Competency Model for Estonian Teachers

• Based on ISTE NETS-T 2008

• 5-level assessment rubric developed by a local expert group

Assessment rubric example

3.1. Demonstrate fluency in technology systems and the transfer of current knowledge to new technologies and situations

Level 1: Creates a user account in a web-based system and creates/uploads resources; uses common software/web environments/hardware with the help of a user manual; uses presentation tools and a printer; saves/copies files to an external drive.

Level 2: Manages access rights to resources published on the web.

Level 3: Independently solves problems that occur during the use of ICT tools (using help, manuals, FAQs, or forums when needed); combines different tools; changes the settings of a web-based system.

Level 4: Transfers working methods from a known web environment/software to an unknown environment.

Level 5: Chooses (compares, evaluates) the most suitable tool for a given task.
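The slides do not show how the rubric is represented inside DigiMina; as one minimal illustration, here is a hypothetical TypeScript model of a competency with its five-level rubric (all type and field names are invented for this sketch):

```typescript
// Hypothetical data model for one ETCM competency and its 5-level rubric.

interface CompetencyRubric {
  id: string;    // e.g. "3.1"
  title: string; // competency statement (based on ISTE NETS-T)
  levels: [string, string, string, string, string]; // indicators for levels 1-5
}

const fluencyExample: CompetencyRubric = {
  id: "3.1",
  title:
    "Demonstrate fluency in technology systems and the transfer of " +
    "current knowledge to new technologies and situations",
  levels: [
    "Creates a user account in a web-based system and creates/uploads resources...",
    "Manages access rights to resources published on the web.",
    "Independently solves problems that occur during the use of ICT tools...",
    "Transfers working methods from a known environment to an unknown one.",
    "Chooses (compares, evaluates) the most suitable tool for a given task.",
  ],
};
```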

Measuring Educational Technology Competencies

• Assessment methodology and instruments must be reliable, valid, and flexible, but also affordable with respect to time and cost.

• Four levels of measuring competencies (Miller, 1990):

• knows

• knows how

• shows how

• does

Web-based Assessment of Competencies

• Five-dimensional framework for authentic assessment (Gulikers et al., 2004):

• tasks: meaningful, relevant, typical, complex, ownership of problem and solution space;

• physical context: similar to professional work space and time frame, professional tools;

• social context: similar to social context of professional practice (incl. decision making);

• form: demonstration and presentation of professionally relevant results, multiple indicators;

• criteria: used in professional practice, related to realistic process/product, explicit

Design methodology

Research-based design methodology

Adapted from Leinonen et al. (2008):

• Contextual Inquiry

  • Personas

• Participatory Design

  • Scenario-based design

  • Participatory design sessions

  • Concept mapping

• Product Design

  • User stories

  • Paper prototyping

  • Information architecture

  • High-fidelity prototyping

• Software Prototype as Hypothesis

  • Agile sprints

Personas

• Teacher training master student

• Novice teacher

• Experienced teacher

• Educational technologist of a school

• Training manager (in a national organization)

(Cooper et al., 2007)

Scenarios

• Master student is evaluating her educational technology competencies

• Peer assessment of problem solving tasks

• Educational technologist of a school is getting an overview of teachers’ educational technology competencies

• Training manager is compiling a training group with a sufficient level of competencies

(Carroll, 2000)

Participatory design sessions

• 2 sessions

• Discussing the scenarios

• Drawing the sketches

Main concepts

Competency Test

• The competency test can be taken several times to measure progress

• Usability issue: a large number of tasks (20 competencies × 5 levels)

• Solutions:

  • A test can be saved and continued later

  • Self-evaluation sets the starting level (see the sketch below)
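A save/resume session with a self-evaluated starting level could look roughly like the following TypeScript sketch; the types and function names are hypothetical, not DigiMina’s actual code:

```typescript
// Minimal sketch of a resumable competency test session (hypothetical API).

type Level = 1 | 2 | 3 | 4 | 5;

interface TestSession {
  userId: string;
  answers: Record<string, unknown>; // taskId -> stored answer
  currentLevel: Level;              // level at which the test continues
  completed: boolean;
}

// Self-evaluation sets the starting level, so users skip tasks
// far below their self-assessed competency.
function startSession(userId: string, selfEvaluatedLevel: Level): TestSession {
  return { userId, answers: {}, currentLevel: selfEvaluatedLevel, completed: false };
}

// A partial session is serialized so the test can be continued later.
function saveSession(session: TestSession): string {
  return JSON.stringify(session);
}

function resumeSession(saved: string): TestSession {
  return JSON.parse(saved) as TestSession;
}
```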

Tasks

• Three types of tasks:

• automatically assessed self-test items (29)

• peer assessment tasks (23)

• self-reflection tasks (41)

• Need to increase the number of competencies that can be assessed with a self-test

• Peer assessment requires blind review by two users at the same or a higher competency level (see the sketch below)
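As an illustration of that reviewer constraint, here is a hypothetical selection routine (all names invented for this sketch):

```typescript
// Hypothetical assignment of two blind reviewers for a peer assessment task:
// both must be at the same or a higher competency level than the author.

interface Reviewer {
  id: string;
  level: number; // assessed level (1-5) for the competency in question
}

function pickReviewers(author: Reviewer, candidates: Reviewer[]): [Reviewer, Reviewer] | null {
  const eligible = candidates.filter(
    (c) => c.id !== author.id && c.level >= author.level
  );
  if (eligible.length < 2) return null; // not enough qualified reviewers yet

  // Naive random pick; a production system would also balance reviewer load.
  const shuffled = [...eligible].sort(() => Math.random() - 0.5);
  return [shuffled[0], shuffled[1]];
}
```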

Tasks

• Tasks are authored using TATS, an IMS QTI-compatible test authoring tool (Tomberg & Laanpere, 2011)

• 5 IMS QTI question types are used (modeled in the sketch after this list):

• choiceInteraction (multiple choice)

• choiceInteraction (multiple response)

• orderInteraction

• associateInteraction

• extendedTextInteraction
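The five interaction types could be modeled as a discriminated union on the client side; this is an illustrative sketch, not the TATS or IMS QTI data model:

```typescript
// Hypothetical client-side model of the five QTI interaction types used.
// In QTI, multiple choice and multiple response are both choiceInteraction,
// distinguished by the maxChoices attribute.

type Interaction =
  | { kind: "choiceInteraction"; prompt: string; choices: string[]; maxChoices: number }
  | { kind: "orderInteraction"; prompt: string; items: string[] }
  | { kind: "associateInteraction"; prompt: string; pairs: [string, string][] }
  | { kind: "extendedTextInteraction"; prompt: string }; // free text

// Only closed-form interactions can be scored automatically;
// free-text answers go to peer assessment or self-reflection.
function isAutoAssessable(task: Interaction): boolean {
  return task.kind !== "extendedTextInteraction";
}
```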

Competency Profile

• Competency levels are displayed as a diagram

• Users can compare their competency levels with the average levels of various groups (see the sketch below)

• Privacy settings (private, group members, public)

• The profile can be linked to or embedded in external web pages
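The group comparison behind the diagram might reduce to a per-competency average; a minimal sketch with invented names:

```typescript
// Hypothetical computation behind the profile diagram: one user's level per
// competency compared with a group's average level for that competency.

type Profile = Record<string, number>; // competencyId -> level (1-5)

function groupAverage(profiles: Profile[], competencyId: string): number {
  const levels = profiles
    .map((p) => p[competencyId])
    .filter((l): l is number => l !== undefined);
  return levels.length === 0
    ? 0
    : levels.reduce((sum, l) => sum + l, 0) / levels.length;
}

// Example: compare one user with a group on competency "3.1".
const me: Profile = { "3.1": 4 };
const group: Profile[] = [{ "3.1": 3 }, { "3.1": 5 }, { "3.1": 2 }];
console.log(me["3.1"], groupAverage(group, "3.1")); // 4 vs. ~3.33
```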

Group

• Typically created for a school or a group of teacher training students

• Group owner can see competency profiles of all members

• Anybody can create a group

• Groups can be set up as private or public

Competency Requirements

• A large number of competency profiles would make DigiMina a valuable planning tool

• Competency requirements can be created by the training manager, teacher trainer, and group owner (see the sketch below)

• Will be implemented in a later phase
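One way such requirements could be matched against profiles, e.g. when a training manager compiles a group with a sufficient competency level, is sketched below; the IDs and names are invented:

```typescript
// Hypothetical matching of competency profiles against requirements.

type Requirements = Record<string, number>; // competencyId -> minimum level

function meetsRequirements(
  profile: Record<string, number>,
  requirements: Requirements
): boolean {
  return Object.entries(requirements).every(
    ([competencyId, minLevel]) => (profile[competencyId] ?? 0) >= minLevel
  );
}

// Example with invented competency IDs and teachers.
const required: Requirements = { "3.1": 3, "2.4": 2 };
const teachers = [
  { id: "anna", profile: { "3.1": 4, "2.4": 2 } },
  { id: "peeter", profile: { "3.1": 2, "2.4": 5 } },
];
const eligible = teachers.filter((t) => meetsRequirements(t.profile, required));
console.log(eligible.map((t) => t.id)); // ["anna"]
```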

Current prototype

Conclusions and future work

• DigiMina as a component of a larger digital ecosystem

• Involving expert teachers in creating and evaluating assessment tasks

• Integrating DigiMina with the national teachers’ portal

• DigiMina as a framework for assessing various competency models

References

• Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A Five-Dimensional Framework for Authentic Assessment. Educational Technology Research & Development, 52(3), 67–86.

• Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63–S67.

• Leinonen, T., Toikkanen, T., & Silfvast, K. (2008). Software as Hypothesis: Research-Based Design Methodology. In Proceedings of the Tenth Anniversary Conference on Participatory Design 2008 (pp. 61–70). Indianapolis, IN: Indiana University.

• Cooper, A., Reimann, R., & Cronin, D. (2007). About Face 3: The Essentials of Interaction Design. Indianapolis, IN: Wiley Publishing, Inc.

• Carroll, J. M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions. Cambridge, MA: The MIT Press.

• Tomberg, V., & Laanpere, M. (2011). Implementing Distributed Architecture of Online Assessment Tools Based on IMS QTI ver.2. In F. Lazarinis, S. Green, & E. Pearson (Eds.), Handbook of Research on E-Learning Standards and Interoperability: Frameworks and Issues (pp. 41–58). IGI Global.

Acknowledgements

This research was supported by

• EDUKO program of Archimedes Foundation

• Estonian Ministry of Education and Research targeted research grant No. 0130159s08

• Tiger University Program of the Estonian Information Technology Foundation

Thank You!

Hans Põldoja

Research Associate

Tallinn University, Estonia

hans.poldoja@tlu.ee

http://www.hanspoldoja.net

http://www.slideshare.net/hanspoldoja

http://trac.htk.tlu.ee/digimina/