Featuring content from
A MAGNA PUBLICATION
Assessing Online Learning: Strategies, Challenges and Opportunities
Assessing Online Learning: Strategies, Challenges and Opportunities • www.FacultyFocus.com
Assessing Online Learning: Strategies, Challenges and Opportunities
As online education moves from the fringes to the mainstream, one question still persists: “How do I know what my online students have learned?” There are no simple answers, just as there aren’t in face-to-face courses, but with a little creativity and flexibility, you soon discover that the online learning environment opens up a host of new student assessment possibilities. And, just as with traditional courses, the trick is finding the right combination that works best for your particular course.
This special report features 12 articles from Online Classroom that will cause you to examine your current methods of online assessment, and perhaps add something new to your assessment toolbox. It even talks about some of the common assessment mistakes you’ll want to avoid.

Take a look at some of the articles you will find in Assessing Online Learning: Strategies, Challenges and Opportunities:
• Authentic Experiences, Assessment Develop Online Students’ Marketable Skills
• Four Typical Online Learning Assessment Mistakes
• Assessing Whether Online Learners Can DO: Aligning Learning Objectives with Real-world Applications
• Strategies for Creating Better Multiple-Choice Tests
• Assessing Student Learning Online: It’s More Than Multiple Choice
• Using Self-Check Exercises to Assess Online Learning
• Measuring the Effectiveness of an Online Learning Community
• Ongoing Student Evaluation Essential to Course Improvement
Online courses enable a strong student-centered approach to learning and, as a result, assessment. We hope this report helps you design and develop online assessment strategies that take full advantage of the many formal and informal assessment tools now at your fingertips.

Rob Kelly
Editor
Online Classroom
Table of Contents
Four Typical Online Learning Assessment Mistakes
Authentic Experiences, Assessment Develop Online Students’ Marketable Skills
Assessing Whether Online Learners Can DO: Aligning Learning Objectives with Real-world Applications
Strategies for Creating Better Multiple-Choice Tests
Assessing Student Learning Online: It’s More Than Multiple Choice
To Plan Good Online Instruction, Teach to the Test
Using Self-Check Exercises to Assess Online Learning
Assessment for the Millennial Generation
Self-Assessment in Online Writing Course Focuses Students on the Learning Process
Using Online Discussion Forums for Minute Papers
Four Typical Online Learning Assessment Mistakes

By Patti Shank, PhD, CPT

The goal of learning assessments should be to measure whether actual learning outcomes match desired learning outcomes. Here’s an analogy. Your freezer stops keeping foods frozen, so you call the appliance repair folks. They show up on schedule and charge you exactly what they estimated on the phone. Is that enough information for you to know if the desired outcome (frozen food) has been achieved? No, of course not.

We use freezers to achieve specific outcomes. We build instruction to achieve specific outcomes as well. Well-written instructional objectives describe the desired outcomes of instruction and are critical to designing good courses and assessments.

A freezer that works means the food stays frozen as expected. Instruction that works means people learn as expected. Adequate learning assessments tell us whether the instruction we built works and provide us with data to adjust our efforts.

We measure whether instruction “works” by seeing if the instruction we build actually helps people achieve the learning objectives. I’d even argue that we cannot be considered competent builders of instruction if we can’t show that what we build helps learners learn. Some might say that’s a big “duh,” but I’m guessing a fair number of people who build instruction haven’t really thought about it.

Here’s a scenario for us to consider. Lana Mercer, a new instructor, has just finished teaching her online course, Introduction to Computer Graphics. Here are the three most critical terminal objectives for this course (these are reasonably well written, unlike most of the objectives I see, which makes it far easier to determine what assessments are needed):

• Analyze common uses for these computer graphics methods: 2-D representation and manipulation, 3-D representation and manipulation, animation, and image processing and manipulation.
• Describe methods for defining, modeling, and rendering of 2-D and 3-D objects.
• Determine the best tools to use for defining, modeling, and rendering of 2-D and 3-D objects.

Mercer graded students based on weekly homework assignments (10 percent of the grade), two projects (20 percent of the grade), and a final test (70 percent of the grade). More than a third of the students got a C or lower on the final and, as a result, received low grades for the course, because the final was such a large percentage of the final grade. Lana didn’t understand why students were upset, because final grade scores reflected a bell curve, so the range of grades was as she expected. See any assessment problems? (Yep, you should.)

Four typical mistakes

People who build instruction make some typical but unfortunate mistakes when designing learning assessments, and these mistakes compromise both their competence as designers of instruction and the quality of the instruction they build. These mistakes include:

1. Expecting a bell curve
2. The wrong type of assessment
3. Not valid (enough) assessments
4. Poorly written multiple-choice tests

Expecting a bell curve

Benjamin Bloom (1968), a distinguished educational psychologist, proposed that a bell curve model, with most students performing in the middle and a small percentage performing very well and very poorly (i.e., a normal or bell curve), is the wrong model of expected outcomes from most instruction. The bell curve model is what might be expected without instruction. Instruction should be specifically designed to provide the instruction, practice,
feedback, and remediation needed to bring about achievement of the desired outcomes. His “mastery” model assumes that most students will be high achievers and that the instruction needs to be fixed if this does not occur.

Mercer should not have expected her students’ final grades to fall on a bell curve. A mastery model assumes that most students will achieve the desired outcomes, and therefore, most will achieve higher grades.
The wrong type of assessment

There are two primary learning assessment formats: performance assessments and “test” assessments. The former involves assessing performance in a more realistic way (in situations), and the latter involves paper or computer-based forms with multiple-choice, matching, fill-in-the-blank, and short- and long-answer (i.e., essay) questions. Test assessments are by their nature a less authentic way of assessing learning but are very practical and are therefore commonly used.

The optimal assessment type depends primarily on whether the objective is declarative (facts: name, list, state, match, describe, explain…) or procedural (task: calculate, formulate, build, drive, assemble, determine…). Research shows that there is a big difference between these two types: the difference between knowing about and knowing how (practical application to real-world tasks).

Let’s take, for example, a biomedical technology course. A biomedical technician needs to know the names of a cardiac monitor’s parts (declarative objective) in order to find applicable information in the troubleshooting manual. But knowing part names only goes so far. Knowing how to troubleshoot the cardiac monitor (procedural objective) involves far deeper skills. So asking biomedical technicians to name parts or even list the troubleshooting steps on a final test is an inadequate assessment of their troubleshooting skills. The bottom line is whether they can, in fact, troubleshoot, and that requires a performance assessment.

When it comes to designing adequate assessments, it’s inadequate to determine only whether learners know about if you need to determine whether they actually can perform in the real world. Many higher education instructors don’t adequately infuse their courses with real-world implications and skills, and I believe this is a mistake.

Mercer’s objectives are a mix of declarative and procedural, but her assessment scheme is heavily weighted toward achievement of declarative objectives (and the tests used to assess them). That made her grading scheme unbalanced and inappropriate. A more balanced and appropriate grading scheme would have given more weight to projects that show achievement of procedural objectives.
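To see how much the weighting decision matters, here is a small sketch. The 10/20/70 weights are Mercer’s scheme from the scenario above; the rebalanced weights and the sample student scores are hypothetical illustrations, not figures from the article.

```python
# Sketch of how assessment weights drive a final course grade.
# Mercer's 10/20/70 split comes from the scenario above; the
# rebalanced weights and the sample scores are hypothetical.

def final_grade(scores, weights):
    """Weighted average of component scores on a 0-100 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[part] * weights[part] for part in weights)

# A student who does well on authentic work but poorly on the final test.
scores = {"homework": 90, "projects": 92, "final_test": 68}

mercer     = {"homework": 0.10, "projects": 0.20, "final_test": 0.70}
rebalanced = {"homework": 0.10, "projects": 0.60, "final_test": 0.30}

print(round(final_grade(scores, mercer), 1))      # 75.0 (the test dominates)
print(round(final_grade(scores, rebalanced), 1))  # 84.6 (projects dominate)
```

The same student moves roughly a full letter grade depending only on how the components are weighted, which is why the weighting should follow the objectives, not grading convenience.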
Not valid (enough) assessments

The gold standard for assessment quality is validity. A valid assessment measures what it claims to measure. For example, a biomedical equipment troubleshooting assessment should measure the skills of the person doing actual or simulated troubleshooting. It’s easier than you might think to design assessments that measure something other than what is intended.

Let’s say the biomedical equipment troubleshooting assessment primarily asks students to match parts, functions, and typical problems. Is this a valid assessment of troubleshooting skills? Unlikely. And what if another assessment is written at too high a reading level? What is it measuring? For one thing, reading skills. Both tests are likely less valid than is necessary. The best way to establish validity is to carefully match objectives and assessments, as explained in the last mistake.

Lack of validity impacts course quality and fairness. And if the results of assessments impact passing the course (as they usually do), invalid assessments are not only unfair but potentially illegal.

Objectives and assessments in Mercer’s class didn’t match up. Because students in Mercer’s class needed a passing grade in order to take higher-level graphics courses, she needs to rethink the validity of her assessments, starting with mapping assessment types to objective types.
Poorly written multiple-choice tests

Many assessments, even if they are the right kind, are poorly written. Two of the most common mistakes for multiple-choice questions are confusing or ambiguous language and implausible distractors (wrong alternatives from which the learner selects the correct answer[s]). A poorly written multiple-choice question automatically lowers the validity of the assessment.

Final thoughts

Inadequate learning assessments are at best frustrating. At worst, they can damage students and institutions. Adequate learning assessments are one of the hallmarks of competence in building good instruction and markedly improve the quality of instruction.

The final assessment for the Introduction to Computer Graphics course suffered from all the mistakes
listed, even though the instructor was well-intentioned. In an online course, where students often require extra feedback and motivation, unintended frustrations and unfairness can cause many problems, including complaints, reduced enrollments, and lack of persistence.

Writing good performance assessments and test questions is a skill that takes training, time, and feedback.
References/Resources

American Educational Research Association, American Psychological Association (APA), & National Council on Measurement in Education (NCME). (1999). Standards for educational and psychological testing.

Bloom, B. (1968). Learning for mastery. Evaluation Comment, 1(2), 1–5.

Bond, L. A. (1995, August). Norm-referenced testing and criterion-referenced testing: The difference in purpose, content, and interpretation of results. Oak Brook, IL: North Central Regional Educational Laboratory. (ERIC Document Reproduction Service No. ED402327).

Shank, P. (2005). Developing learning assessments for classroom, online, and blended learning. Workshop materials. Denver, CO: Learning Peaks.

Shrock, S., & Coscarelli, W. (2000). Criterion-referenced test development. Silver Spring, MD: International Society for Performance Improvement.
Patti Shank, PhD, CPT, is a widely recognized instructional designer and technologist, writer, and author who teaches and helps others teach online. She can be reached through her website: http://www.learningpeaks.com/.
Authentic Experiences, Assessment Develop Online Students’ Marketable Skills

By Rob Kelly

Maureen Colenso, an instructor at Northeast Wisconsin Technical College, sums up her assessment philosophy for her online courses as follows: “Whatever it is students are going to need to be doing on the job, then that’s how they need to be assessed for the classroom. I start with the assumption that the competencies for the class represent marketable skills, and I find that there’s pretty much always a way to figure out how to have students do a project that will be representative of the type of work they would do for an employer.”

Colenso didn’t come to teaching from a traditional path and says she has a pragmatic approach in her online courses and the ways she assesses learning outcomes. “I teach at a community college, and we frequently have returning adult students who need to be trained for a new career. They need marketable skills, and we need to know that they really have learned these skills,” Colenso says.

The courses Colenso teaches online lend themselves to authentic assessment. Colenso teaches documentation, training, database theory, database applications, and several entry-level software skills courses. In all these courses, she finds ways to have students produce authentic products for local businesses.

Students produce electronic products that are easily delivered online and that can be shown to prospective employers. “It gives students something real. It makes a world of difference in a competitive job market if the applicant, instead of just saying, ‘Yes, I can write a program that will suit your needs,’ can say, ‘Oh yes, here’s an example,’” Colenso says. “For example, I encourage students to build a custom database in the database class and then use the documentation class to provide good documentation for that system and use the training class to develop training materials to teach somebody how to use that system, and then they have a whole package they can show a prospective employer.”
Working with real clients adds to the authenticity of the learning experience. “It inspires much better work. To me that’s one of the real benefits. For one thing, it’s hard for a student to be really motivated when he or she knows that all that’s going to happen is a certain amount of points being awarded and nobody’s ever going to think about it again. If something is real, if a local business is going to be using [the product] and if they’re going to be sharing the results with other members of the class, it just spurs them to make a much greater effort,” Colenso says.

The experience of working online adds another element of authenticity because in many work situations workers are called on to collaborate online with their coworkers. When Colenso began teaching online six years ago, she found that the difference distinguishing online from face-to-face learning is that students have the additional challenge of negotiating distance communication methods. “But since that’s very much a part of what happens in the working world today, I think that provides a better experience,” Colenso says.

The benefits of this experience are so strong that Colenso has begun incorporating online collaboration in her face-to-face courses. “Once I taught online, I thought the traditional students weren’t really getting their full measure, because they should be having these experiences too.

“Online students might be more likely to pick a client for their project where they’re going to be communicating more by e-mail and attachments and things like that. I do find that where I want to have students work in collaborative groups, there are some additional logistical challenges, but all of these software packages that are used for online delivery have things like virtual classrooms.”

Even courses that are more theory-based include some form of authentic assessment. For example, in a recently created database theory course, Colenso includes an assignment that has the students work with a local business to analyze its needs, design an information-level database, and make a recommendation as to which platform would be best for that customer to implement this database. The students then write a proposal, which may or may not lead to that database being created, “but it still provides that interaction,” Colenso says. “It’s still real work. When it comes to lower-level classes, I’m more inclined to put together lab practicals that represent what businesses need.”

Course quality also figures into Colenso’s use of authentic assessment. “We received a grant to make all of our courses available online, and I was somewhat horrified when a few people implied that the courses would somehow be watered-down versions of what students did in the classroom. It turned out that that was a common assumption. I said, ‘I’ll show you. These aren’t going to be even remotely watered-down.’”

Because they do not represent anything that students will be asked to do on the job, Colenso does not use objective, multiple-choice tests in her courses, except as a way to encourage students to complete the readings and review basic facts.

One of the selling points of the college’s online computer science program is the reputation the program has among the local IT community. “This is a small community. Everybody in the IT community kind of knows everybody else. If word gets out that some of our students can get all the way through and graduate our program and not be able to perform on the job, that would be bad. Part of how we sell our program is the good employment results at the end,” Colenso says.

Business clients get free services from the students, but they also help by providing mentoring for the students. Before working with business clients, it’s important that everybody knows the nature of this relationship, how students will be evaluated, and the client’s role in that evaluation.
Assessing Whether Online Learners Can DO: Aligning Learning Objectives with Real-world Applications

By Patti Shank, PhD, CPT

Many of the criticisms of education come from the fact that much instruction isn’t designed so learners can DO. For example, it’s not good enough to design math instruction so learners can do the problems at the back of each chapter. They need to be able to calculate discounts, determine the implications of different interest rates, and so on, in the real world. And instructors need to assess if learners can DO, which commonly involves designing real or realistic performance assessments.

In this article, I’ll first explain how the type of assessment should match the level of learning objectives and then describe how checklists and rating scales can be built to assess learning objectives that require demonstration of more complex skills. A foundational skill for developing good performance assessments is writing good learning objectives.

Two levels of objectives, two types of assessments

There are two levels of objectives, and each level is ideally assessed using a different type of assessment. Declarative objectives ask learners to remember or recall facts and concepts. In other words, they ask if learners know. These objectives use verbs such as name, list, state, match, describe, and explain. Procedural objectives ask learners to apply what they know in realistic or real situations. They ask if learners can DO. They use task-oriented verbs such as calculate, formulate, build, drive, assemble, and determine.

Most, if not all, terminal learning objectives in our courses should be procedural because we need learners to be able to DO real tasks in the real world. Knowing is foundational to doing, but it doesn’t go far enough. Courses with mostly declarative terminal objectives probably don’t go far enough either.

To illustrate the match between objective type and assessment type, let’s analyze three objectives in a personal wellness course. As a result of course activities, learners will

1. explain how the seven dimensions of wellness impact daily living,
2. analyze how deficits in nutrition impact human performance and longevity, and
3. plan a personal diet and fitness program based upon a personal diet and fitness assessment and personal goals.

Objective #1 is declarative, asking the learner to recall facts and concepts about the seven dimensions of wellness. Test questions are an appropriate and efficient way to assess if the learner can recall how physical, intellectual, emotional, social, occupational, environmental, and spiritual wellness impact daily living.

Objective #2 is a fairly uncomplicated procedural objective that asks learners to be able to reach conclusions based on a limited amount of information. Although a performance assessment might be optimal, simpler procedural objectives that don’t involve much complexity can be assessed quite nicely (and efficiently) using well-written case or scenario-based multiple-choice test questions.

Objective #3 is a more complicated procedural objective that requires learners to deal with myriad factors in a realistic way. How can you assess if learners can develop an appropriate diet and fitness plan without having them develop the plan to see how well it was done? You can’t. A test isn’t adequate.
Performance assessments

Performance assessments assess actual or simulated performance. For example, if we want to see if a computer science student can write a real program, we provide the student with a programming challenge and then assess his or her performance.

In order to reduce subjectivity and improve validity when assessing performance, we can build a checklist or rating scale. A performance checklist or rating scale lists specific behaviors or results to be demonstrated. Checklists and rating scales will often include a mechanism for tallying a final “score.” Many instructors share these in advance with students so they know how they will be assessed (good idea!). These checklists and rating scales are also commonly called rubrics.

Many instructors use authentic activities but haven’t considered adding checklists or rating scales to assess them. Checklists and rating scales reduce subjectivity and increase reliability and validity. They help learners achieve the desired success.

Let’s look at examples of portions of common types of performance assessment checklists and rating scales.
Checklist

A checklist lists the criteria and specifies the presence or absence of the behavior or results. A checklist is often the best verification instrument to use when unqualified presence is important (and it’s also easy to assess) because it doesn’t require the rater to make in-between judgments.

Rating Scale with Descriptors

A rating scale with descriptors provides a list of criteria with a description of what the behavior or results look like for each level of performance. There is no specific number of levels to use in a rating scale. It is better to start with fewer levels and increase them if needed, because with more levels judgments become finer and finer, and the likelihood of rater error increases. Rating scales with descriptors are best when there are clearly more than two levels of performance.

Rating Scale with Descriptors and Weighted Score

When building a rating scale, some behaviors or results may need more weight than others. In that case, a weighted rating scale may be appropriate.
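As a sketch of how a weighted rating scale might be tallied: the criteria, weights, and ratings below are hypothetical placeholders for a diet-and-fitness-plan rubric, not an instrument from the article.

```python
# Weighted rating-scale (rubric) tally. The criteria, weights, and
# ratings below are hypothetical examples, not taken from the article.

# Each criterion maps to (weight, rating), with ratings on a 1-4
# descriptor scale (4 = best level of performance).
rubric = {
    "Plan addresses all assessment findings": (3, 4),
    "Diet plan meets nutritional guidelines": (2, 3),
    "Fitness plan matches personal goals":    (2, 4),
    "Plan is realistic and sustainable":      (1, 2),
}

MAX_RATING = 4
earned = sum(weight * rating for weight, rating in rubric.values())
possible = sum(weight * MAX_RATING for weight, _ in rubric.values())

print(f"score: {earned}/{possible} ({earned / possible:.0%})")
```

Weighting the first criterion at 3 means a weak showing there costs three times as much as a weak showing on the last, which is the point of a weighted scale: the score mirrors which results matter most.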
Performance assessment online

Some online instructors dumb down their activities and assessments because online course systems make grading tests easy. This shortchanges learners and learning.

Consider what’s needed to make real or realistic performance a possibility and how to assess it (ask other online instructors and the instructional designers for ideas). For example, in an online presentation skills course I taught, learners peer reviewed each other’s presentation plans and slides using a checklist I devised. Learners gave their presentations to a real audience and were assessed by that audience using the rating scale I devised. The rating scales were faxed back to me directly.

Providing checklists and rating scales for download is the same as providing other types of print-based materials. The real challenge is designing real or realistic activities that allow learners to DO, and building clear and specific enough checklists and rating scales to assess them. Think out of the box. Remember that not all activities have to occur online (and many shouldn’t).
Patti Shank, PhD, CPT, is a widely recognized instructional designer and technologist, writer, and author who builds and helps others build good online courses and facilitate learning. She can be reached through her website: http://www.learningpeaks.com/.
Strategies for Creating Better Multiple-Choice Tests

By Patti Shank, PhD, CPT

Multiple-choice tests are commonly used to assess achievement of learning objectives because they can be efficient. Despite their widespread use, they’re often poorly designed. Poorly written multiple-choice tests are equally damaging in classroom-based and online courses, but in online courses learners often have to contend with more challenges, and poor assessments can add insult to injury. In addition, poorly written tests can be more visible to the world when they are placed online.

In this article, I’ll look at the plusses and minuses of multiple-choice tests, when they are appropriate and less appropriate, typical mistakes writers of multiple-choice questions make, and how to avoid these mistakes.

Some plusses and minuses

Multiple-choice tests can be developed for many different types of content and, if the test items are well written, can measure achievement of multiple levels of learning objectives, from simple recall and comprehension to more complex levels, such as the ability to analyze a situation, apply principles, discriminate, interpret, judge relevance, select best solutions, and so on.

Multiple-choice tests are easy to administer and can be improved using item analysis in order to eliminate or correct poorly written items. They are easy to score and less susceptible to scoring subjectivity than short-answer or essay-type items. They don’t measure writing ability (which can be a plus or minus) and often do assess reading ability (another potential plus or minus, but in reality often a minus). They are more subject to guessing than many other types of learning assessments.

Multiple-choice tests are often promoted as “objective.” Although scoring them doesn’t involve subjectivity, humans do judge what questions to ask and how to ask them. These are very subjective decisions!

When multiple-choice is appropriate

Multiple-choice test items call for learners to select an answer or answers from a list of alternatives. Because they do not ask learners to construct an answer or actually perform, they tend to measure knowing about rather than knowing how.

Multiple-choice items cannot assess learners’ ability to construct, build, or perform. They are best used for objectives that can be assessed by selecting the correct answer from a list of choices rather than supplying the answer or performing a task. Think for a moment about how different selecting is from constructing and performing and you’ll recognize the limitations of multiple-choice testing.

Consider the following learning objectives (Table 1) and decide if you think a multiple-choice test is a good assessment option. My answers follow, so you may wish to cover these up before proceeding.

Table 1

Learning objective             Is multiple choice a good assessment option?
1. Identify risk factors…      yes / no    reason:
2. Detect errors…              yes / no    reason:
3. Explain the purpose…        yes / no    reason:
4. Create a plan…              yes / no    reason:

My answers: Learning objectives one and two can be assessed with multiple-choice items because the response can be selected effectively. Multiple choice is not the best way to assess learning objectives three and four because they require the learner to either construct a response or perform a task.
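The item analysis mentioned earlier can be sketched in a few lines. This is a generic illustration with made-up response data, not a procedure from the article; the discrimination measure shown is the common upper-lower index (difficulty among top scorers minus difficulty among bottom scorers).

```python
# Minimal item-analysis sketch with hypothetical response data.
# Difficulty = proportion of students answering the item correctly.
# Discrimination (upper-lower index) = difficulty among top scorers
# minus difficulty among bottom scorers; near-zero or negative values
# flag items to rewrite or drop.

# 1 = correct, 0 = incorrect; one row per student, sorted by total score.
responses = [
    [1, 1, 0],  # highest total score
    [1, 1, 1],
    [1, 0, 0],
    [0, 1, 1],
    [0, 0, 1],  # lowest total score
]

n = len(responses)
group = max(1, n // 3)  # size of the upper and lower groups
upper, lower = responses[:group], responses[-group:]

results = {}
for item in range(len(responses[0])):
    difficulty = sum(row[item] for row in responses) / n
    discrimination = (sum(row[item] for row in upper) / group
                      - sum(row[item] for row in lower) / group)
    results[item + 1] = (difficulty, discrimination)
    print(f"item {item + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:+.2f}")
```

In this made-up data, item 3 is answered correctly mainly by low scorers (negative discrimination), the classic signature of a miskeyed, ambiguous, or confusing item.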
Item parts and problems

It’s important to use multiple-choice tests wisely and to write good test items when you use them. Let’s look at the parts of a multiple-choice item and common problems with those parts. (Table 2)

A multiple-choice test item has two parts: a stem and multiple alternatives. The stem initiates the item with a question, incomplete statement, or situation. The alternatives are a list of possible answers or solutions. They include the right answer(s), also known as the key, and inferior or wrong answers, known as distractors.

Let’s look at a few examples to see these problems in action.
Example 1

Plants that are bona fide annuals, rather than half-hardy annuals, biennials, or perennials:
a. Live for one growing season
b. Bloom during their growing season
c. Live and bloom for multiple years
d. Are more attractive than other flowers

The stem is potentially confusing and the directions do not tell us to select all that apply. Both a and b are correct, and b is also true for the other types of plants described. And d is clearly implausible.
A better item:
Which of the following statements are true about annuals? Select all that apply.
a. They grow and bloom during one season.
b. Many bloom throughout their growing season.
c. Many bloom in the early spring only.
d. They commonly have shorter growing seasons than annuals.
The writing is clearer, and a and b are unambiguously correct. Distractors c and d are plausible to those who don’t know the content.
Example 2
A vendor offers a staff member two tickets to the World Series. Based on the rules listed in the vendor gift policy, he or she cannot accept them:
a. Unless the tickets have no street value
b. If the vendor is expecting “quid pro quo”
c. If the gift is worth more than $25 or is considered to be an inducement
d. Unless the vendor is also a friend
e. None of the above

The language requires understanding of “quid pro quo” and “inducement” and includes confusing negatives. Alternative c combines two answers in one; d is obviously implausible; and e, “None of the above,” is not recommended as a choice.
A better item:
Under what circumstances can a staff member accept a gift from a vendor, based on the vendor gift policy? Select the best answer.
a. A gift can be accepted if the gift is not considered valuable.
b. A gift can be accepted if the gift’s actual value is under $25.
c. A staff member may never accept a gift from a vendor.

The writing is clearer, and the distractors are plausible to those who don’t know the content.
Writing better multiple-choice items
Confusing and ambiguous language and poorly written or implausible distractors are very common errors when writing multiple-choice test items. Here’s a to-do list to help you avoid these mistakes and write better multiple-choice test items.
Table 2

Part 1: Stem
Description: the question to be answered, incomplete statement to be completed, decision to be made, or problem or situation to be resolved.
Common problems: confusing or ambiguous language; inadequate instructions.

Part 2: Alternatives (key and distractors)
Description: the alternatives from which the learner selects the correct answer(s); the key is the correct answer(s), and the distractors are the incorrect answers.
Common problems: confusing or ambiguous language; no clear right answer; poorly written or implausible distractors.
Use clear, precise language
1. Provide clear directions. Group questions with the same directions together.
2. Include as much of the question as possible in the stem, and reduce wordiness of alternatives. Include words in the stem that would otherwise be repeated in each of the alternatives.
3. Make sure language is precise, clear, and unambiguous. Include qualifiers as needed, but don’t add unnecessary information or irrelevant sources of difficulty.
4. Avoid highly technical language or jargon unless technical knowledge and jargon are part of the assessment.
5. Avoid negatives and these words: always, often, frequently, never, none, rarely, and infrequently. When a negative is used, it should be CAPITALIZED, underlined, or bolded to call attention to it.
6. Don’t use double negatives or double-barreled questions (asking two things in one question).
Write good alternatives/distractors
1. If alternatives include best and not-as-good alternatives (“Select the best answer…”), provide enough detail to differentiate best from not-as-good.
2. Make sure that each item has an unambiguously correct answer or answers.
3. Make sure distractors are plausible, especially to those with lower skills or less knowledge. These are the best types of plausible-but-incorrect distractors:
   a) Common errors and commonly held myths or misconceptions (for those with less knowledge or skill)
   b) Statements that are true but do not answer this question
   c) Content that is paraphrased incorrectly
4. Avoid distractors that combine distractors (“b and c”), “all of the above,” and “none of the above.”
5. Avoid giving away the correct answer in the stem or alternatives. Common problems:
   a) Alternatives that do not follow grammatically from the stem indicate an incorrect response.
   b) An alternative that uses the exact same terminology as the stem indicates a correct response.
   c) Two options that are synonymous or almost identical indicate two incorrect or two correct responses.
   d) An alternative that is much longer than the others indicates a correct response.
   e) Don’t give away the answer in the wording of another question!
6. Vary the placement of the correct answer. (The most common placement of the correct answer is c, and test-wise learners know this.)
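For computer-delivered quizzes, point 6 can be automated. The sketch below is a hypothetical illustration, not part of the article: it assembles an item with the key shuffled into a random slot, and the plant-themed content is just a placeholder.

```python
import random

def build_item(stem, key, distractors, rng=random):
    """Assemble a multiple-choice item, shuffling the key into a random
    position so learners cannot bank on the answer always being 'c'."""
    alternatives = [key] + list(distractors)
    rng.shuffle(alternatives)
    letters = "abcdefgh"[:len(alternatives)]
    return {
        "stem": stem,
        "alternatives": dict(zip(letters, alternatives)),
        "key": letters[alternatives.index(key)],  # letter of the correct answer
    }

item = build_item(
    stem="Plants that are bona fide annuals:",
    key="Live for one growing season",
    distractors=["Live and bloom for multiple years",
                 "Bloom in the early spring only"],
)
assert item["alternatives"][item["key"]] == "Live for one growing season"
```

Shuffling alternatives per delivery also makes answer-sharing between students less useful, since the key’s letter varies from attempt to attempt.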
Your turn
Get together with other multiple-choice item writers (faculty, instructional designers, etc.) and use this guidance to critique existing test items and suggest improvements.

Poorly written multiple-choice test items are almost as ubiquitous as multiple-choice tests, but it doesn’t have to be this way. There are many ways to frustrate learners in online courses, but this shouldn’t be one of them. Although it takes time and practice to write good items, the time and effort is certainly well spent.
References/resources
Shank, P. (2006). Developing Learning Assessments for Classroom, Online, and Blended Learning. Workshop Materials. Centennial, CO: Learning Peaks.
Patti Shank, PhD, CPT, is a widely recognized instructional designer and technologist, writer, and author who builds and helps others build good online courses and facilitate learning. She can be reached through her website: http://www.learningpeaks.com/. @
As more and more instructors move their courses into the online environment, one consistent question that arises is, “How do I know what the students have learned?” The answer is not simple, but it can be effectively addressed with some common sense and a little bit of creativity.
How do I know what the students have learned?
The best place to start to resolve this question is your course syllabus. Aligning assessment strategies with your learning objectives allows you to check whether or not students have met the objectives of the course. Any course, online or face-to-face, should have clear, measurable objectives. An instructor need only pick the assessment technique that best matches each learning objective.

For example, if the objective states that students will be able to describe a concept, ask them to write a paper, post information on a discussion board, or create a flowchart. If they need to be able to identify or locate particular elements, they can complete an objective quiz, post relevant URLs, or submit digital pictures. If you have taken the time up front to identify clear, measurable objectives, determining assessment strategies becomes much simpler (Florida State University, 2006).
Typical assessment strategies such as objective tests are easily administered through the various course management systems. Once online, these objective tests can incorporate video, audio, and other media. Other assessment strategies commonly used in the traditional classroom, such as discussions and the submission of written papers, essays, or reports, can also be easily moved into the online environment.

With a bit of creativity, effort, and basic technical skills you can add 1) simulations and activities where students can virtually arrange elements, manipulate variables in experiments and equations, and even diagnose virtual patients; 2) group projects where students can address case studies, collaborate on presentations or reports, and develop original programs, lesson plans, or other materials; and 3) role plays where each person has a role, position, or character to represent in course discussions.

Allow students the opportunity to practice with the technology prior to a formal assessment. Practice allows the students, and you, to become comfortable with the required technology; thus, results are less likely to be skewed by students’ technical skills.
How can you actively engage students in the assessment process?
Assessment strategies are most successful if they replicate something that the student will do in his or her profession, are clearly relevant to the course, and are useful in demonstrating his or her knowledge and abilities. This type of assessment strategy, known as authentic assessment, actively engages students and demonstrates to the instructor that they not only understand the concepts but can also apply them in real-life scenarios (Mueller, 2006).

In addition to using appropriate and, when possible, authentic assessments in an online environment, it is important to keep students actively engaged in the “classroom.” This is best accomplished by requiring frequent, small assessments that require the student to access the course two or three times a week. For example, Wednesday might require a short, objective quiz on the assigned reading material; Friday, a written summary of the week’s activity and a posting on the discussion board reflecting on what was learned through the assignment; and finally, Sunday, a reply to a fellow student’s reflection. Frequent, small assessments let students know how they are doing in the course and provide ample room for improvement or adjustment in study habits, if necessary.
How do you know it’s the student’s work?
How can I be sure that this is really the student’s work? Unfortunately, there is no guarantee; though if you honestly think about a traditional classroom, you typically can’t be sure there either. While this may seem like a dire situation, there are ways instructors can limit the amount of dishonesty in their courses.

Assessing Student Learning Online: It’s More Than Multiple Choice
By Elizabeth Reed Osika, PhD

One of the easiest ways is to assess higher-order and critical-thinking skills and to limit the use of objective assessments such as multiple-choice tests. Objective assessments are the easiest for students to manipulate dishonestly. Use multiple-choice or other objective questions to assess information that provides a foundation for the course, the information students will have to use later to demonstrate their understanding through an authentic assessment. This gives the instructor the opportunity to make sure that each student is familiar with the most important aspects of the text.
If you use objective assessments as
a primary assessment strategy, assume your students will use their books and discuss the concepts with friends, and write your questions accordingly. In fact, encourage students to use all relevant resources to complete a test, as this best reflects what occurs in real life. If we are faced with a question we don’t know the answer to, we Google the topic, seek knowledge from others, and, if necessary, pull out texts to determine the answer. Why not encourage this problem-solving strategy in your students by posing complex questions that aim at higher-level thinking skills (Carneson, Delpierre, & Masters, 1996)?

Use a variety of assessments. If you have only one or two large assessments in the course, the pressure on students to succeed is enormous, and many may revert to less-than-honest activities. However, if you provide a variety of assessments that build on each other, the pressure is reduced drastically. Using frequent assessment points also allows the instructor to get to know students’ work and personalities, making the identification of dishonest work easier.
References
Carneson, J., Delpierre, G., & Masters, K. (1996). Designing and Managing Multiple Choice Questions. Available online at http://web.uct.ac.za/projects/cbe/mcqman/mcqman01.html.
Florida State University (2006). Information about behavioral objectives and how to write them. Available online at www.med.fsu.edu/education/FacultyDevelopment/objectives.asp.
Mueller, J. (2006). Authentic Assessment Toolbox. Available online at http://jonathan.mueller.faculty.noctrl.edu/toolbox/.
Elizabeth Reed Osika, PhD, is an assistant professor at Chicago State University. @
Building effective instruction involves multiple tasks, but planning is one of the most critical. For online courses, planning is especially important because even under the best of circumstances, online learners often struggle to understand what’s expected of them. At a distance, they can get unbelievably frustrated (or worse) and stop trying. That’s one of the best reasons for using a systematic approach to planning your instruction.

One of the best planning strategies for good instruction is teaching to the test. You have likely heard the words “teaching to the test” uttered contemptuously, but it can be a very good thing indeed. I’m going to take a bit of a circuitous route in explaining why so you can understand my logic.

Objectives are the cornerstone for planning effective instruction, and good assessments determine if the objectives have been met. You might consider these the “bookends” of planning effective instruction.
ADDIE who?
Instructional designers (people who typically have specialized training in using cognitive and other principles to design effective instruction) call the practice of systematically planning instruction “instructional design.” There are numerous philosophies of instructional design, but all have certain things in common, including following a list of tasks that ensures better end results.

Here is a list of typical instructional planning tasks, in order:
1. Identify learning objectives
2. Design assessments
3. Design content and activities
4. Select media and delivery options
5. Develop the course materials
6. Implement the course
7. Evaluate and revise

To Plan Good Online Instruction, Teach to the Test
By Patti Shank, PhD, CPT
If you have worked with instructional designers or read articles or books on instructional design, you may be familiar with the ADDIE model, one of the most common models for the systematic design of instruction. ADDIE is an acronym for Analysis, Design, Development, Implementation, and Evaluation. Following a systematic process such as ADDIE can help prevent some of the typical problems that happen when instruction isn’t well planned, including instruction that doesn’t seem to have a clear goal; quirky (not in a good way) or deficient course content, activities, and assessments; and poor evaluations for the course and instructor.

Notice that identifying learning objectives is first on the list of tasks, and designing assessments is next, for good reason.
Design assessments after identifying learning objectives
Designing assessments should optimally occur right after identifying learning objectives. That’s because assessments should measure if the objectives were met. If the learning objectives are well written, appropriate methods of assessment are generally quite clear.

See in TABLE 1 how the appropriate assessment matches the learning objective. If you design assessments as an afterthought at the end of designing the instruction (a common but unfortunate mistake), you are likely to design the wrong content, and the course activities and assessments are likely to be far less meaningful or appropriate. In other words, designing the assessment (test) right after identifying the learning objectives 1) makes the needed assessment very obvious and 2) provides clear cues about what content and activities are needed.

Design content and activities after designing assessments
I’ve finally made my way to telling you to design to the test. First identify the learning objectives and matching assessment (test). The learning objectives should clearly state what the learner should be able
TABLE 1

1. If the learning objective is: Learners will label the parts of the human respiratory system, including the trachea, bronchi, lungs, thoracic cavity, and diaphragm.
   A matching assessment could be: An illustration of the human respiratory system prompting learners to label the trachea, bronchi, lungs, thoracic cavity, and diaphragm.

2. If the learning objective is: Learners will demonstrate three elements of a proper phone greeting.
   A matching assessment could be: Demonstration(s) of the three elements of a proper phone greeting.

3. If the learning objective is: Learners will use the three-criteria model to evaluate a play.
   A matching assessment could be: Verbal or written application of the three-criteria model to evaluate a play.
TABLE 2

1. If the learning objective is: Learners will label the parts of the human respiratory system, including the trachea, bronchi, lungs, thoracic cavity, and diaphragm.
   A matching assessment could be: An image of the human respiratory system prompting learners to label the trachea, bronchi, lungs, thoracic cavity, and diaphragm.
   Matching content and activities:
   • Content: Images of the different parts, separately and together
   • Activity: Practice labeling the parts

2. If the learning objective is: Learners will demonstrate three elements of a proper phone greeting.
   A matching assessment could be: Demonstration(s) of the three elements of a proper phone greeting.
   Matching content and activities:
   • Content: Discussion of the three elements
   • Content: Examples of the three elements in use
   • Activity: Practice using the three elements

3. If the learning objective is: Learners will use the three-criteria model to evaluate a play.
   A matching assessment could be: Verbal or written application of the three-criteria model to evaluate a play.
   Matching content and activities:
   • Content: Discussion of the three-criteria model
   • Content: Examples of the three-criteria model applied to a variety of plays
   • Activity: Practice applying the three-criteria model to a variety of plays
to do, and the assessment (test) should measure if they can, in fact, do that. The content and activities should then be designed specifically so that the learner can pass the test, because that means they have met the learning objectives. And that’s the goal of effective instruction.

Let’s look once again, in TABLE 2, at the three objectives and matching assessments to see what content and activities make sense.

As you can see, a well-written objective and matching assessment provide pretty clear cues about what content and activities are needed. This makes the instruction not only more effective, but also easier to design. Better instruction and less work. Terrific!
A few more words about activities
Some people ask me whether content plus assessments is enough for a good online course (for example, PowerPoint slides and tests). Aside from the fact that this would be unengaging for learners, this approach is not instruction. Activities and feedback are needed for instruction. In fact, I’d go so far as to say that the purpose of instructional content is to support instructional activities. Activities allow learners to reflect on and apply the content and make it personally meaningful. When we don’t do this, we’re likely teaching at only a surface level, preparing learners to do nothing with the content other than forget about it once the test is over.
Your turn
If activities are the opportunities for learners to reflect on and apply the content so that it becomes meaningful to them, now would be a good time for you to do that with the content in this article! Using my previous articles on writing objectives and developing assessments, if needed, see if you can write two good learning objectives and then match assessments, content, and activities. Try swapping your work with someone else (another faculty member or maybe even an instructional designer) to get feedback.

Some people think it’s hard or even impossible to create meaningful online activities, but that’s not so. In fact, an asynchronous online learning environment provides opportunities for activities that would be hard to do in person.
References
Shank, P. (2006). Developing Learning Assessments for Classroom, Online, and Blended Learning. Workshop Materials. Centennial, CO: Learning Peaks.
Smith, P.L., & Ragan, T.J. (2005). Instructional Design, 3rd ed. San Francisco: John Wiley & Sons, Inc.
Patti Shank, PhD, CPT, is a widely recognized instructional designer, instructional technologist, and writer who builds and helps others build good online and blended courses and facilitate learning. She can be reached through her website: www.learningpeaks.com/. @
The intermediate statistics class I took quite a number of years ago had two types of learners at the outset: those who were worried about passing the course and those who were sure they couldn’t pass it. The professor clearly understood the “fear-of-stats” phenomenon and used a number of instructional techniques to help learners gain confidence and skills.

One especially valuable technique was consistent use of self-check exercises. These were handed out at the end of each class along with an answer sheet. Class started each time with a question-and-answer period about the self-check exercises from the previous session. Doing the exercises was optional and they weren’t handed in or graded, but nearly everyone did them, and those who did easily gained confidence and passed the course.

What are self-check exercises, exactly? They are problems (with answers) given to learners that allow them to assess how they are doing on an ongoing basis. Doing them online with self-grading provides immediate feedback. Links to additional
Using Self-Check Exercises to Assess Online Learning
By Patti Shank, PhD, CPT
materials can be provided to help anyone who is having difficulties. Online learners can do these exercises and submit questions they have, which the instructor can aggregate and respond to for the benefit of all learners.

Studies show that these types of activities help learners keep tabs on their progress and adjust their efforts, know when to seek help, and stay on track. These outcomes are especially important in online courses.

Some of the most important benefits of self-check exercises for online learning include:
• Helping learners determine what they do and do not understand so they can target where extra study is needed;
• Providing immediate feedback to learners and an option to link to additional materials (which may reduce the number of unfocused questions sent to the instructor);
• Providing feedback to the instructor about where learners are having difficulties so immediate interventions can be implemented; and
• Increasing learner satisfaction with the instructor and the course.
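A self-check with immediate feedback and remediation links takes very little code to build. The sketch below is a hypothetical illustration, not an excerpt from any course; the question, answer, and review URL are placeholders.

```python
# Hypothetical self-check item: a prompt, the expected answer, and a
# remediation link shown when the learner answers incorrectly.
SELF_CHECK = [
    {
        "prompt": "An annual plant lives for how many growing seasons?",
        "key": "one",
        "review": "https://example.edu/annuals",  # placeholder link
    },
]

def grade(responses):
    """Return immediate per-question feedback instead of a delayed grade."""
    feedback = []
    for item, answer in zip(SELF_CHECK, responses):
        correct = answer.strip().lower() == item["key"]
        note = "Correct!" if correct else (
            "Not quite. The answer is '%s'. Review: %s"
            % (item["key"], item["review"])
        )
        feedback.append({"correct": correct, "note": note})
    return feedback

print(grade(["one"])[0]["note"])  # prints "Correct!"
```

Because the feedback arrives the moment the learner answers, the exercise doubles as a study aid: wrong answers route the learner straight to the relevant material rather than to an instructor email.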
Getting Started
Consider how self-check exercises can be used in the courses you teach. Are there concepts that learners consistently have problems understanding? Are there terms that learners need to memorize or concepts that they need to understand? These might be the best places to start.
Patti Shank, PhD, CPT, is a widely recognized instructional designer, instructional technologist, and writer who builds and helps others build good online and blended courses and facilitates learning. She can be reached through her website: www.learningpeaks.com/. @
Fresh assessment techniques are needed to gauge student learning in online classes, Julie Giuliani says. Giuliani is the executive dean of the Virtual College at Florida Community College-Jacksonville (FCCJ), which serves some 39,000 distance students. Giuliani has what she refers to as “a passion for assessment.” She has been working with assessment for years, since she was a graduate student researching classroom assessment techniques.

“There’s been volumes of research that says there’s no significant difference in terms of grades between students on land and online,” Giuliani says. “But there isn’t any formative kind of mechanism to gauge online student learning progress. We really can’t validate how well our students are learning online.”
She believes that the new wave of so-called Millennial students calls for what she refers to as “new millennium assessment approaches.” By this she means assessment techniques that are specifically designed for online media, that are part of the structure of the course, and in which students perform self-assessment as they participate in course activities.

One of the first things Giuliani did when she came into her position was to familiarize herself with the quality of teaching and learning in the online courses. As she went through the process, it became evident to her that, while they were doing many things right, they still weren’t where they needed to be in terms of creating a culture of systematic observation of student learning.

The students were already familiar with social networking sites like YouTube and MySpace, with podcasts and blogs, and with devices like cell phones and iPods. She came to believe that to engage these students in the process of assessment, your best bet was to use the same kinds of media. FCCJ had been forward-looking in the emphasis it gave to multimedia in its online program. The school had created a faculty resource center where any faculty member, full-time or adjunct, could produce their own multimedia tools.

Giuliani soon began to look for an
opportunity to try out her ideas. The school’s interim associate dean happened to also be an online adjunct history professor. Giuliani proposed to her that they use her American history course as a laboratory to experiment with multimedia assessment tools.

Assessment for the Millennial Generation
By Christopher Hill
Podcasts and self-assessment
The first thing that they tried was a podcast of a re-enactment of a moment in American history. They scripted a podcast in which a 14-year-old girl described her family’s journey west on the Oregon Trail in the 19th century. They recruited a part-time actress from the resource center staff to read the script, and they created a link to the podcast on the course’s website. The online students did the assigned reading about the westward expansion, and then listened to the podcast’s first-person account of pioneer life. The instructor set up discussion board questions to assess how much the students had gotten from both the text reading and the podcast. In addition, the instructor designed a discussion board rubric, so that students not only discussed and gave feedback on their learning experience but were also encouraged to use the rubric to self-assess what they had learned as a result of the multimedia experience.
Using the discussion board for self-assessment
Giuliani and her partner found a way to use the discussion board for a more formative assessment purpose. The idea was that students would read the learning module, participate in the activities (such as the podcast), and then do what is called a 3-2-1 evaluation on the discussion board. The students were directed to state three themes or concepts they learned during the unit, two questions that they still had, and one idea they wanted to share with others. This is beneficial on two levels, Giuliani explains. Not only do students interact with and get different perspectives and ideas from each other; it also gives the instructor feedback on their pedagogical strategies. “They can say, ‘Wow, I need to change my approach for the next module because obviously they didn’t get it.’ Or ‘Wow, this really worked and I’m going to grow this idea when I design the next module.’”
The 3-2-1 assessment can be applied to any content area. The first two stages of the process are objective, while the final item involves a more subjective response. That difference in the questions helps the instructor gauge how much the students have absorbed from the unit and make adjustments based on the student responses.

Giuliani and her associates call such practices VOATs: virtual online assessment techniques. Creative online instructors acquire a “toolbox” of different VOATs. An instructor tries one out and sees how it works, tinkers with it a little, and tries again.
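If a course site collected 3-2-1 posts through a form, the expected three-two-one shape could be checked automatically before a post is accepted. The sketch below is hypothetical, not part of FCCJ’s actual tooling, and its sample content is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ThreeTwoOne:
    """A 3-2-1 evaluation: three concepts learned, two open questions,
    and one idea to share with classmates."""
    concepts: list
    questions: list
    idea: str

    def is_complete(self):
        # Enforce the 3-2-1 shape before the post reaches the discussion board.
        return (len(self.concepts) == 3
                and len(self.questions) == 2
                and bool(self.idea.strip()))

post = ThreeTwoOne(
    concepts=["westward expansion", "the Oregon Trail", "pioneer daily life"],
    questions=["Why did families take the risk?", "How long did the journey take?"],
    idea="Compare the podcast account with a primary-source diary.",
)
assert post.is_complete()
```

Structuring the post this way also makes the instructor’s follow-up easier: the “two questions” fields can be aggregated across the class to spot the concepts a module failed to land.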
online courses, in Giuliani’s view, is
to teach students how to use technol-ogy effectively. Giuliani and hercolleague developed a VOAT for that,too. The course had a section on theiron mining and smelting industry inthe United States. In the first weeksof the class, the instructor asked thestudent, after reading this segment,to take their cell phones or digitalcamera and go out in the field antake a picture of something that wasmade of iron--making sure thestudents themselves were in the shot.They would then send that picturevia their cell phone to the course site.As an alternative, if you didn’t knowhow to use your cell phone as acamera, or didn’t have a digitalcamera, you could go online and findsome pictures of iron products of thatera, and then send them to the in-structor.
A culture of multimedia assessment
“My hope is that through this model course, eventually we’ll be able to do more training with our adjuncts, using this as a model of best practices,” Giuliani says.

Giuliani hopes that eventually full-time faculty, not just adjuncts, will become interested in the multimedia assessment practices she is initiating and come to the faculty resource center to create their own tools.

“[The history course] was our pilot test,” Giuliani says. “One of the first goals that was given to me was to work with quality assurance. And so I saw assessment and using the tools of the new generation as two of the supporting drivers to achieve the goal of quality and integrity of courses.” @
Worth Weller, continuing studies lecturer in Indiana University-Purdue University Fort Wayne’s department of English and linguistics, believes in the efficacy of online learning, but he doesn’t agree with the cliché, “I feel I get to know my online students better than my face-to-face students.”

“I think you get a lot of utopian discussion about teaching online. I do think it’s wonderful, but I sense a real disconnect with my students. Unless you make a superior effort, you don’t have the type of community that you have in the classroom,” Weller says.

This disconnect makes it harder for online instructors to have the same level of contact with students that is necessary for students to master the writing process. To compensate for the shortcomings of the online classroom, Weller uses sequenced self-assessment in his online composition courses. (Weller also uses this technique in his face-to-face writing courses, but, he says, it is particularly important in his online courses.)
a chat session that asks students todiscuss assessment criteria. Hedivides students into groups and asksthem to come up with criteria theythink would distinguish an A college
paper. After 15 or 20 minutes hebrings the students back into themain chat room and asks a represen-tative of each group to present thegroup’s ideas and asks them why toexplain why each item is important.This first session instills in students
the notion that they will be involvedwith assessment. Students have fourpapers to write during the semester.There are 10 points throughout thesemester in which students assesstheir own writing and consult withWeller through synchronousdiscussions.For each writing assignment, each
student sends his or her paper topeers for review. Weller asks studentsto focus on structure – things liketopics sentences and the use of quotes– rather than grammar. “I don’t like toeavesdrop on the peer review process,but I train them well and tell themwhat the expectations are,” Wellersays.After the peer review, each student
writes a half-page reflective memoabout his or her paper and the peer-review process. The students submittheir memos to Weller (via a dialoguebox in WebCT), which prepares themfor the next step in the process: thesynchronous conference with the in-structor.
Having students write the reflectivememo prior to the conference helpsstudents create an agenda for the con-ference and tends to reduce thechance of the instructor dominatingthe conference, Weller says.During the conference, the student
and Weller “talk” about the paper,and Weller develops a revisionstrategy. (Weller thinks it might bemore effective if the studentsdeveloped their own revision plans,something he is thinking of doing.)After the conference, Weller sends
the student a memo that highlightswhat was accomplished in the confer-ence as well as a provisional grade(the grade the student would receiveif no revisions are made).Rather than marking up the paper,
Weller comments on the good pointsand shortcomings of the paper, oftenquoting sections of the paper asexamples. To help students under-stand some common errors, Wellerwill direct them to resources such aswebsites and help documents.Weller also requires students to
send a short note to him reflecting onthe conference and outlining therevisions they intend to make.Students have until the end of the
semester to revise their papers. Inaddition to the revisions, Weller hasstudents submit a memo describingthe changes they made. These memosserve several purposes: They getstudents to think about the processand serve as documentation thatenables Weller to more easily gradestudents’ work.In addition to the self-assessments
for each assignment, students alsosubmit mid-term and final self-assess-ments, asking them to reflect on theirperformance and what they thinktheir overall grade should be and why.
Assessing Online Learning: Strategies, Challenges and Opportunities • www.FacultyFocus.com
Self-Assessment in OnlineWriting Course FocusesStudents on the LearningProcess
By Rob Kelly
PAGE 20�
20
All of these self-assessmentdocuments and drafts of their papersare compiled in a portfolio along witha cover letter and are also graded. Forstudents’ final grades, Weller usesthis thorough documentation to makesure they did the revisions they saidthey would.The ongoing self-assessment and
portfolio make it unlikely thatstudents will plagiarize, Weller says.“With self-assessment it’s muchharder to turn in a bogus paperbecause you’re calling on them to doall this writing about their writing.”Weller says his use of sequenced
self-assessment in his courses helpsstudents take control of their writing.“A lot of students who come to 100-level writing courses hate writing.
Even students who write well oftensay, ‘I hate writing. I really strugglewith this.’ When they learn that theyare in control of their own writingthey feel a sense of freedom, whichtakes a lot of anger out of it. Andthey learn about learning,” Wellersays. @
Using Online Discussion Forums for Minute Papers

By Debra Vredenburg-Rudy

Most of us are familiar with the informal assessment tool called the minute paper, in which students write a short narrative about what they have learned about a particular topic covered in class. Many faculty use the minute paper at the end of a class period to gauge student understanding of the material. But there have been many successful modifications of the basic strategy. A number of them are reported in the well-known book Classroom Assessment Techniques by Tom Angelo and Pat Cross, who first proposed the technique.

I had used the minute-paper strategy previously and found it a useful assessment tool, so I decided to change the format and make the minute paper online and interactive. In my courses, I upload documents to the university’s Blackboard website. One of the features of Blackboard is the communication link, which allows instructors to create online discussion boards where students can post comments and reply to other students’ remarks. This feature presents the perfect opportunity to assess student learning via technology.

In a psychology research methods course, I used the minute paper during the last 15 minutes of a lab period. The lab portion of the course was held in a computer lab, which made Blackboard easily accessible to the students. At the completion of a lecture on identifying variables, I showed a video about the nonverbal signs of attraction during dating, a topic of interest to most college students. After the video and class discussion, students were required to post a comment on the discussion board and reply to a fellow classmate’s remarks. I gave them the following instructions: 1) describe what you learned today, either from the video or the class discussion; 2) describe one thing that you found fascinating about human behavior; and 3) reply to one of your classmates’ comments.

To my pleasant surprise, students were so occupied with this exchange with their peers that I had to remind them that the period was about to end. Even students who were shy and unlikely to speak during class were now communicating with others who were less inhibited about speaking in public. I learned that an online minute paper not only serves as an assessment tool for student learning; it can also be an effective means of stimulating classroom participation. The experience has renewed my respect for this simple but valuable feedback tool.

Debra Vredenburg-Rudy is an associate professor at Millersville University.
Magna Publications, Inc.
2718 Dryden Drive
Madison, Wisconsin 53704
USA
www.magnapubs.com