These Confusing Technology Times: Making Decisions About What to Assess and Evaluate


Dr. Curtis J. Bonk Indiana University and CourseShare.com

http://php.indiana.edu/~cjbonk

cjbonk@indiana.edu


Confusion Reigns
1. How to allocate time?
2. When to assess?
3. How to assess?
4. How to grade teamwork?
5. Whose work is it?

Other Issues?
1. …

Student Assessment: Product Focus

Traditional Assessment Methods

Most often, students are assessed in one of the following knowledge-focused ways:
Objective test questions
Essay test questions
Papers/Reports
Projects

All are product-oriented in nature

Most Assessment Tools
Focus on tests
  Automatic grading/feedback
  Test pools
  Timing
Favor objective questions
Few tools to facilitate other forms of assessment
  File exchange/dropbox

Focus of Assessment?

1. Basic Knowledge, Concepts, Ideas

2. Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork

3. Both of the Above!!!
4. Other…

Technology Assessments Possible

Online Portfolios of Work
Discussion/Forum Participation
Online Mentoring
Weekly Reflections
Tasks Attempted or Completed, Usage, etc.

Sample Portfolio Scoring Dimensions

(10 pts each; see http://php.indiana.edu/~cjbonk/p250syla.htm)

1. Richness
2. Coherence
3. Elaboration
4. Relevancy
5. Timeliness
6. Completeness
7. Persuasiveness
8. Originality

1. Insightful
2. Clear/Logical
3. Original
4. Learning
5. Feedback/Responsive
6. Format
7. Thorough
8. Reflective
9. Overall Holistic
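
As a concrete illustration of how such a rubric totals up (10 points per dimension), here is a minimal Python sketch using the first of the two dimension lists above. The data structure and the clamping to the 0-10 range are assumptions added for illustration, not part of the original syllabus.

```python
# Minimal sketch of totaling a portfolio rubric (10 pts per dimension).
# Dimension names come from the slide; the data structure is illustrative only.

RUBRIC = ["Richness", "Coherence", "Elaboration", "Relevancy",
          "Timeliness", "Completeness", "Persuasiveness", "Originality"]
MAX_PER_DIMENSION = 10

def score_portfolio(ratings: dict[str, int]) -> int:
    """Total a portfolio score, assuming each dimension is rated 0-10."""
    total = 0
    for dimension in RUBRIC:
        points = ratings.get(dimension, 0)                 # unrated dimensions earn 0
        total += min(max(points, 0), MAX_PER_DIMENSION)    # clamp to the 0-10 range
    return total

# Example: a portfolio strong on content but submitted late.
example = {"Richness": 9, "Coherence": 8, "Elaboration": 8, "Relevancy": 9,
           "Timeliness": 4, "Completeness": 7, "Persuasiveness": 8, "Originality": 9}
print(score_portfolio(example), "/", MAX_PER_DIMENSION * len(RUBRIC))  # 62 / 80
```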

More Possible Assessments

Quizzes and Tests
Peer Feedback and Responsiveness
Cases and Problems
Group Work
Web Resource Explorations & Evaluations

E-Case Analysis Evaluation

Peer Feedback Criteria (1 pt per item; 5 pts/peer feedback)

(a) Provides additional points that may have been missed.

(b) Corrects a concept, asks for clarification where needed, debates issues, disagrees & explains why.

(c) Ties concepts to another situation or refers to the text or coursepack.

(d) Offers valuable insight based on personal experience.

(e) Overall constructive feedback.

Possible Methods of Assessment

Review of online group work spaces
  Evidence of regular and substantial contributions
Self and peer assessment
  Have students rate team members on various dimensions
  Have students indicate where the work plan was followed/not followed
Student reflection
  Have students write brief reflections on their group process, indicating what they might change the next time

E-Peer Evaluation Form

Peer Evaluation. Name: ____________________
Rate on Scale of 1 (low) to 5 (high):

___ 1. Insight: creative, offers analogies/examples, relationships drawn, useful ideas and connections, fosters growth.

___ 2. Helpful/Positive: prompt feedback, encouraging, informative, makes suggestions & advice, finds, shares info.

___ 3. Valuable Team Member: dependable, links group members, there for group, leader, participator, pushes group.

___ Total Rec’d Contribution Pts (out of 15)
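
A small sketch of how the form's numbers can roll up: each peer rates a teammate 1-5 on the three dimensions (15 points maximum), and the totals a member receives are then combined. Averaging across raters is an illustrative assumption; the slide only specifies the 15-point total per evaluation.

```python
# Hedged sketch of tallying the e-peer evaluation form (three 1-5 dimensions, 15 pts max).
# Averaging across raters is an illustrative choice, not prescribed by the slide.

DIMENSIONS = ("Insight", "Helpful/Positive", "Valuable Team Member")

def evaluation_total(ratings: dict[str, int]) -> int:
    """Sum one peer's ratings across the three dimensions (max 15)."""
    return sum(min(max(ratings.get(d, 1), 1), 5) for d in DIMENSIONS)  # clamp to 1-5

def contribution_points(evaluations: list[dict[str, int]]) -> float:
    """Average the totals a student received from all peers who rated them."""
    totals = [evaluation_total(e) for e in evaluations]
    return sum(totals) / len(totals) if totals else 0.0

# Example: two peers rate the same team member.
received = [
    {"Insight": 4, "Helpful/Positive": 5, "Valuable Team Member": 4},
    {"Insight": 3, "Helpful/Positive": 4, "Valuable Team Member": 5},
]
print(contribution_points(received))  # 12.5 out of 15
```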

Assessment Issues

Issues to Consider…

1. Bonus pts for participation?
2. Peer evaluation of work?
3. Assess improvement?
4. Is it timed? Give unlimited time to complete?
5. Allow retakes if the connection is lost? How many retakes?

Issues to Consider…

6. Cheating? Is it really that student?

7. Authenticity?
8. Negotiate tasks and criteria?
9. How to measure competency?
10. How do you demonstrate learning online?

Catching da cheaters!

Increasing Cheating Online

($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, "Plagiarism: IT-Enabled Tools for Deceit?")

http://www.academictermpapers.com/
http://www.termpapers-on-file.com/
http://www.nocheaters.com/
http://www.cheathouse.com/uk/index.html
http://www.realpapers.com/
http://www.pinkmonkey.com/ ("you'll never buy Cliffnotes again")

Reducing Cheating Online

Ask yourself, why are they cheating?
Do they value the assignment?
Are tasks relevant and challenging?
What happens to the task after it is submitted—reused, woven in, posted?
Due at end of term? Real audience?
Look at pedagogy b4 calling the plagiarism police!

Reducing Cheating Online
Proctored exams
Vary items in the exam
Make the course too hard to cheat
Try Plagiarism.com ($300)
Use mastery learning for some tasks
Random selection of items from the item pool
Use test passwords, rely on IP# screening
Assign collaborative tasks
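
Two of these tactics, varying exam items and drawing them at random from an item pool, can be sketched in a few lines. This is a hedged illustration rather than a feature of any particular testing tool; seeding the draw with the student ID, so the same form can be regenerated for grading, is an added assumption.

```python
# Illustrative sketch of drawing a per-student exam from an item pool
# ("vary items in the exam", "random selection of items from the item pool").
import random

def draw_exam(item_pool: list[str], student_id: str, n_items: int) -> list[str]:
    """Pick n_items for one student; the same ID always yields the same form."""
    rng = random.Random(student_id)          # per-student, reproducible seed (assumption)
    return rng.sample(item_pool, n_items)    # sample without replacement, random order

pool = [f"Q{i:02d}" for i in range(1, 41)]   # a 40-item pool
print(draw_exam(pool, "student_123", 10))    # 10 items unique to this student
```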

Reducing Cheating Online
($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, "Plagiarism: IT-Enabled Tools for Deceit?")

http://www.plagiarism.org/ (resource)
http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
http://www.canexus.com/ (software; essay verification engine, $19.95)
http://www.plagiserve.com/ (free database of 70,000 student term papers & cliff notes)
http://www.academicintegrity.org/ (assoc.)
http://sja.ucdavis.edu/avoid.htm (guide)

Turnitin Testimonials

"Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type a paragraph or two of their work in myself if I suspect plagiarism. Every time, there was a "hit." Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!”

Online Assessment Concerns
Problem: Cheating on tests
  Copying from a neighbor
  Copying from course materials
  Someone else taking the test
Problem: Plagiarism
  Submitting someone else's paper (previous class)
  Copying from (online) sources
  Buying a paper online
Both are product-oriented concerns

Assessment: Process Focus (Vanessa Dennen, Sept 2002)

Assessing Process

Easy to do
Many technology tools will archive student work/interactions
Students create a document trail in the process
Helps students develop metacognitive knowledge
Instructors structure/model/encourage productive work processes
Students learn how to manage their own work processes

Why Assess Process?

For the instructor…
Provides formative feedback on the course (e.g., helps gather data about why students have difficulty with product-oriented assessments)
Provides a sense of instructor guidance
Clarifies who is doing most of the work in small group assignments
Helps prevent cheating

Why Assess Process?

For the student…
Typically improves the quality of their products
Helps them develop productive work processes
Puts them on a schedule
Shows that you care about individual growth

Assessment Project Cycle
From Classroom Assessment Techniques by Angelo & Cross (1993)

Step 1: Plan
  Choose class
  Focus on assessable question
  Design project to answer question

Assessment Project Cycle [2]
Step 2: Implement
  Teach target lesson
  Collect assessment data
  Analyze data
Step 3: Interpret results
  Communicate results
  Evaluate assessment project

I. Term Papers

How to do it online:
Have students each start their own thread and post a topic of interest
Peers and instructors give feedback
Students post thesis statements, research sources, etc., with iterations of feedback
The final paper is posted

Term Paper Assessments
Product: the paper
Process: quality and timeliness of student work from the time the paper is assigned

Process: quality and timeliness of feedback provided to peers

Process: responsiveness to feedback received from instructor and peers

II. Discussion Assignments

1. Chain of Thought
Have students develop a solution to a problem
Have students indicate what led them to a particular conclusion, method, or approach
Can be done in a discussion board

Discussion Assignments

2. Theory to Practice
Have students match up theories you are learning about to actual problems
Present students with problems and have them explain what theories they would use to solve these problems and how they would approach it
Debrief the assignment

Discussion Assignment

3. Synthesizer
Have students take turns being the weekly synthesizer of class discussion

Add a “meta” level in which students narrate their own experiences while reading the weekly discussion

III. Group Projects

Tools used
Chat: brainstorming ideas, making group decisions, a regular way to feel connected (should be archived)
Discussion board: commenting on drafts
E-mail: quick feedback
File exchange: sharing project files
MS Word: Track Changes

HINT: If you don’t have a tool that will work, refer students to yahoo groups: http://www.groups.yahoo.com

Group Project Assessments

Product: project files that are turned in
Process: online archive demonstrating
  Who contributed what
  Who provided peer feedback
  Who worked in a timely manner
  How collaborative a group was
Process: peer ratings
Process: interim instructor consultations
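
Because the archive itself is the evidence, a short script can summarize who contributed what and when. The sketch below assumes a generic list of archived posts (author, timestamp, kind of contribution); it is not tied to any particular courseware export format.

```python
# A minimal sketch (assumed data shape, not a specific LMS export) of summarizing a
# group's archived posts: who contributed, how much, and whether work was timely.
from collections import Counter
from datetime import datetime

posts = [  # each archived message: author, timestamp, and kind of contribution
    {"author": "Ana", "when": "2002-10-01 14:03", "kind": "draft"},
    {"author": "Ben", "when": "2002-10-02 09:41", "kind": "feedback"},
    {"author": "Ana", "when": "2002-10-06 23:55", "kind": "draft"},
]
deadline = datetime(2002, 10, 5)   # an illustrative interim milestone

by_author = Counter(p["author"] for p in posts)
feedback_by_author = Counter(p["author"] for p in posts if p["kind"] == "feedback")
late = [p for p in posts if datetime.strptime(p["when"], "%Y-%m-%d %H:%M") > deadline]

print("Posts per member:", dict(by_author))
print("Peer feedback per member:", dict(feedback_by_author))
print("Posts after the milestone deadline:", len(late))
```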

III. Project Assignments
1. Work Plans

Have students develop a plan of work for their project

Make them outline topic, schedule, resources needed, division of labor and anticipated form of final deliverables

At end of project, have students evaluate how well they followed their own plan and how useful it was

Project Assignments

2. Research Trail
Have students document the steps they took in the research process and the results

Ask for a brief reflection on how effective their process was and what they might change the next time

Project Assignments

3. Process Presentations
Have students focus on their process as well as their product in class presentations

To maintain focus, ask them to share 3 main lessons learned

Might ask for some process documents to be shared, like an early draft

Project Assignments

4. Design Journal
Have students maintain a journal of all ideas related to their project
Encourage sketches, lists, organizational charts, etc.
Require journals to be turned in with final projects

IV. Reflection Assignments

Have students keep a weekly journal of their thoughts on readings and course content AND real-world related instances that they noticed

May make these public, with each student having their own discussion thread

Making it Happen

Learners need to see that process is valuable:
Model appropriate processes
Provide students with scaffolding (guide sheets) to structure their processes
Give students feedback on their process
Require students to reflect on their processes
Grade students on process

Online Testing Tools

Choice: Select companies that specialize in online assessment.
Or: Use what the courseware package gives ya…

Test Selection Criteria

(Hezel, 1999)

Easy to Configure Items and Tests
Handles Symbols
Scheduling of Feedback (immediate?)
Provides feedback for each response
Randomize Answers Within a Question
Weighting of Answer Options
Supports multiple item types: multiple choice, true-false, essay, keyword

More Test Selection Criteria
Recording of Multiple Submissions
Comprehensive Statistics
Summarize in Portfolio and Gradebook
Confirmation of Test Submission
Incorporates graphic or audio elements?
Timed Tests

Flexible scoring—score first, last, or average submission

Flexible reporting—by individual or by item and cross tabulations.

Control over number of times students can submit an activity or test

Provides item analysis statistics (e.g., Test Item Frequency Distributions).
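
The flexible-scoring criterion above (score the first, last, or average submission) is easy to picture in code. The sketch below is illustrative only; the function and policy names are not drawn from any listed product.

```python
# Hedged sketch of "flexible scoring" across multiple submissions: score the first,
# the last, or the average attempt. Names and policies are illustrative.

def flexible_score(attempts: list[float], policy: str = "last") -> float:
    """Reduce a student's submission history to one score under a chosen policy."""
    if not attempts:
        return 0.0
    if policy == "first":
        return attempts[0]
    if policy == "last":
        return attempts[-1]
    if policy == "average":
        return sum(attempts) / len(attempts)
    raise ValueError(f"unknown scoring policy: {policy}")

attempts = [62.0, 78.0, 85.0]                        # three submissions of the same quiz
for policy in ("first", "last", "average"):
    print(policy, flexible_score(attempts, policy))  # 62.0, 85.0, 75.0
```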

More Test Selection Criteria

(Perry & Colon, 2001)

Web Resource: http://www.indiana.edu/~best/

Online Survey Tools for Assessment

Sample Survey Tools
Zoomerang (http://www.zoomerang.com)
SurveyMonkey (http://www.surveymonkey.com/)
QuestionMark (http://www.questionmark.com/home.html)
SurveyShare (http://SurveyShare.com; from Courseshare.com)
Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
Infopoll (http://www.infopoll.com)

Web-Based Survey Advantages

Faster collection of data
Standardized collection format
Computer graphics may reduce fatigue
Computer-controlled branching and skip sections
Easy to answer by clicking
Wider distribution of respondents

Web-Based Survey Problems: Why Lower Response Rates?

Low response rate
Lack of time
Unclear instructions
Too lengthy
Too many steps
Can't find URL

Survey Tool Features
Support different types of items (Likert, multiple choice, forced ranking, paired comparisons, etc.)
Maintain email lists and email invitations
Conduct polls
Adaptive branching and cross tabulations
Modifiable templates & library of past surveys
Publish reports
Different types of accounts—hosted, corporate, professional, etc.
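
Adaptive branching simply means that the next question depends on the previous answer. The sketch below shows minimal skip logic with made-up question IDs; real survey tools configure this through their own interfaces rather than code.

```python
# Illustrative sketch of "adaptive branching" (skip logic): the next question
# depends on the previous answer. Question IDs and wording are invented.

survey = {
    "q1": {"text": "Have you taken an online course?",
           "next": lambda ans: "q2" if ans == "yes" else "q4"},
    "q2": {"text": "How many online courses have you completed?",
           "next": lambda ans: "q3"},
    "q3": {"text": "Rate your overall satisfaction (1-5).",
           "next": lambda ans: None},
    "q4": {"text": "What has kept you from taking one?",
           "next": lambda ans: None},
}

def run(answers: dict[str, str]) -> list[str]:
    """Walk the branching survey, returning the questions this respondent saw."""
    path, qid = [], "q1"
    while qid is not None:
        path.append(qid)
        qid = survey[qid]["next"](answers.get(qid, ""))
    return path

print(run({"q1": "no"}))              # ['q1', 'q4']  (q2 and q3 are skipped)
print(run({"q1": "yes", "q2": "3"}))  # ['q1', 'q2', 'q3']
```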

Web-Based Survey Solutions: Some Tips…
Send a second request
Make the URL link prominent
Offer incentives near the top of the request
Shorten the survey; make it attractive and easy to read
Disclose purpose, use, and privacy
E-mail cover letters
Prenotify of intent to survey

Evaluation…

Champagne & Wisher (in press)

“Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders.” (e.g., does it work as effectively as the standard instructional approach).

Evaluation Purposes
Cost Savings
Improved Efficiency/Effectiveness
Learner Performance/Competency Improvement/Progress
  What did they learn?
Assessing learning impact
  How well do learners use what they learned?
  How much do learners use what they learn?

Kirkpatrick’s 4 Levels

Reaction
Learning
Behavior
Results

Figure 26. How Respondent Organizations Measure Success of Web-Based Learning According to the Kirkpatrick Model
[Bar chart: x-axis lists Kirkpatrick's evaluation levels (learner satisfaction; change in knowledge, skill, attitude; job performance; ROI); y-axis shows percent of respondents, 0-90.]
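
Reading the four levels against the measures in Figure 26 yields a compact planning checklist. The pairing below is only a sketch; the evidence examples in parentheses are assumptions added for illustration.

```python
# Hedged sketch pairing Kirkpatrick's four levels with the success measures named in
# Figure 26; the parenthetical evidence examples are illustrative assumptions.

KIRKPATRICK = {
    1: ("Reaction", "learner satisfaction surveys"),
    2: ("Learning", "change in knowledge, skill, attitude (e.g., pre/post tests)"),
    3: ("Behavior", "job performance (transfer back on the job)"),
    4: ("Results", "organizational results such as ROI"),
}

def evaluation_checklist(highest_level_planned: int) -> list[str]:
    """List the evidence an evaluation plan would gather up to a chosen level."""
    return [f"Level {n} ({name}): {evidence}"
            for n, (name, evidence) in KIRKPATRICK.items()
            if n <= highest_level_planned]

for line in evaluation_checklist(3):   # a plan that stops short of ROI
    print(line)
```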

My Evaluation Plan…

Considerations in Evaluation Plan

1. Student
2. Instructor
3. Training
4. Task
5. Tech Tool
6. Course
7. Program
8. University or Organization

What to Evaluate?
1. Student—attitudes, learning, jobs.

2. Instructor—popularity, course enrollments.

3. Training—internal and external.

4. Task--relevance, interactivity, collaborative.
5. Tool--usable, learner-centered, friendly, supportive.

6. Course—interactivity, completion rates.

7. Program—growth, long-range plans.

8. University—cost-benefit, policies, vision.

1. Measures of Student Success

(Focus groups, interviews, observations, surveys, exams, records)

Positive Feedback, Recommendations
Increased Comprehension, Achievement
High Retention in Program
Completion Rates or Course Attrition
Jobs Obtained, Internships
Enrollment Trends for Next Semester

1. Student Basic Quantitative
Grades, Achievement
Number of Posts, Participation
Computer Log Activity—peak usage, messages/day, time on task or in system

Attitude Surveys
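
The computer log activity measures above (messages/day, peak usage) are straightforward to compute once timestamps are available. Below is a small sketch with an assumed log format; real course platforms export their own formats.

```python
# A small sketch (assumed log format) of the quantitative measures listed above:
# messages per day and peak usage hour, computed from forum post timestamps.
from collections import Counter
from datetime import datetime

timestamps = [
    "2002-09-30 10:15", "2002-09-30 22:40", "2002-10-01 09:05",
    "2002-10-01 10:30", "2002-10-01 10:55", "2002-10-02 14:20",
]
parsed = [datetime.strptime(t, "%Y-%m-%d %H:%M") for t in timestamps]

per_day = Counter(t.date() for t in parsed)    # how many messages each day
per_hour = Counter(t.hour for t in parsed)     # usage by hour of day

print("Messages/day:", {str(d): n for d, n in sorted(per_day.items())})
print("Average messages/day:", len(parsed) / len(per_day))        # 2.0
peak_hour, peak_count = per_hour.most_common(1)[0]
print(f"Peak usage: {peak_count} messages in the {peak_hour}:00 hour")
```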

1. Student High-End Success

Message complexity, depth, interactivity, questioning
Collaboration skills
Problem finding/solving and critical thinking
Challenging and debating others
Case-based reasoning, critical thinking measures
Portfolios, performances, PBL activities

2. Instructor Success

High student evals; more signing up

High student completion rates
Utilizes the Web to share teaching
Course recognized in tenure decisions
Varies online feedback and assistance techniques

3. Training: Outside Support
Training (FacultyTraining.net)
Courses & Certificates (JIU, e-education)
Reports, Newsletters, & Pubs
Aggregators of Info (CourseShare, Merlot)
Global Forums (FacultyOnline.com; GEN)
Resources, Guides/Tips, Link Collections, Online Journals, Library Resources

Certified Online Instructor Program

Walden Institute—12 Week Online Certification (Cost = $995)

2 tracks: one for higher ed and one for online corporate trainers
Online tools and purpose
Instructional design theory & techniques
Distance ed evaluation
Quality assurance
Collab learning communities

http://www.utexas.edu/world/lecture/

3. Training: Inside Support…
Instructional Consulting
Mentoring (strategic planning $)
Small Pots of Funding
Facilities
Summer and Year-Round Workshops
Office of Distributed Learning
Colloquiums, Tech Showcases, Guest Speakers
Newsletters, guides, active learning grants, annual reports, faculty development, brown bags

Technology and Professional Dev: Ten Tips to Make it Better (Rogers, 2000)

1. Offer training
2. Give technology to take home
3. Provide on-site technical support
4. Encourage collegial collaboration
5. Send to prof development conferences
6. Stretch the day
7. Encourage research
8. Provide online resources
9. Lunch bytes, faculty institutes
10. Celebrate success

RIDIC5-ULO3US Model of Technology Use

4. Tasks (RIDIC):
Relevance
Individualization
Depth of Discussion
Interactivity
Collaboration-Control-Choice-Constructivistic-Community

RIDIC5-ULO3US Model of Technology Use

5. Tech Tools (ULOUS):
Utility/Usable
Learner-Centeredness
Opportunities with Outsiders Online
Ultra Friendly
Supportive

6. Course Success
Few technological glitches/bugs
Adequate online support
Increasing enrollment trends
Course quality (interactivity rating)
Monies paid
Accepted by other programs

7. Program Considerations

Enrollment trends
Relevant and current technology
Number of graduates and graduation rates
Sense of community
Format: self-paced, collaborative, PBL, mentored, performance-based, individual, etc.

How are costs calculated in online programs???

7. Online Program or Course Budget (i.e., how it is paid for, how large the course is, tech fees charged, # of courses, tuition rate, etc.)

Indirect Costs: learner disk space, phone, accreditation, integration with existing technology, library resources, on site orientation & tech training, faculty training, office space

Direct Costs: courseware, instructor, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage
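
A back-of-the-envelope way to approach the cost question: total the direct and indirect costs and divide by enrollment. Every dollar figure in the sketch below is invented for illustration; only the cost categories come from the slide.

```python
# Back-of-the-envelope sketch of the cost question above. All dollar amounts and the
# enrollment figure are invented for illustration; the categories come from the slide.

direct = {      # per-course direct costs
    "courseware license": 4000, "instructor": 6000, "help desk": 800,
    "bandwidth and server": 1200, "course developers": 3000,
}
indirect = {    # allocated share of program-level costs
    "faculty training": 900, "library resources": 600,
    "accreditation and integration": 1500, "office space": 1000,
}

enrollment = 30
tuition_per_student = 750

total_cost = sum(direct.values()) + sum(indirect.values())
cost_per_student = total_cost / enrollment
revenue = enrollment * tuition_per_student

print(f"Total cost: ${total_cost:,}")                    # $19,000
print(f"Cost per student: ${cost_per_student:,.2f}")     # $633.33
print(f"Net for the course: ${revenue - total_cost:,}")  # $3,500
```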

8. Institutional Success

E-Enrollments from new students, alumni, existing students
Press, publication, partners, attention
Additional grants
Making Money: Cost-Benefit model
Faculty and student attitudes
Acceptable policies (ADA compliant)

Final advice…whatever you do…