The Incorporation of Hands-On Tasks in an Online Course


This article was downloaded by: [Columbia University], [Columbia University]
On: 26 September 2011, At: 08:56
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Interactive Learning Environments
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/nile20

The incorporation of hands-on tasks in an online course: an analysis of a blended learning environment
Thomas Chandler (a), Yoon Soo Park (b), Karen L. Levin (c) & Stephen S. Morse (c)
a Center for Public Health Preparedness, Columbia University, New York, NY, USA
b National Center for Disaster Preparedness, Columbia University, New York, NY, USA
c Mailman School of Public Health, Columbia University, New York, NY, USA

Available online: 30 Aug 2011

To cite this article: Thomas Chandler, Yoon Soo Park, Karen L. Levin & Stephen S. Morse (2011): The incorporation of hands-on tasks in an online course: an analysis of a blended learning environment, Interactive Learning Environments, DOI: 10.1080/10494820.2011.593524

To link to this article: http://dx.doi.org/10.1080/10494820.2011.593524


Full terms and conditions of use: http://www.tandfonline.com/page/terms-and-conditions

This article may be used for research, teaching and private study purposes. Any substantial or systematic reproduction, re-distribution, re-selling, loan, sub-licensing, systematic supply or distribution in any form to anyone is expressly forbidden.

The publisher does not give any warranty express or implied or make any representation that the contents will be complete or accurate or up to date. The accuracy of any instructions, formulae and drug doses should be independently verified with primary sources. The publisher shall not be liable for any loss, actions, claims, proceedings, demand or costs or damages whatsoever or howsoever caused arising directly or indirectly in connection with or arising out of the use of this material.


The incorporation of hands-on tasks in an online course: an analysis of a blended learning environment

Thomas Chandler (a*), Yoon Soo Park (b), Karen L. Levin (c) and Stephen S. Morse (c)

a Center for Public Health Preparedness, Columbia University, New York, NY, USA; b National Center for Disaster Preparedness, Columbia University, New York, NY, USA; c Mailman School of Public Health, Columbia University, New York, NY, USA

(Received 19 February 2010; final version received 4 May 2011)

This article describes the design and evaluation of a blended online/face-to-face course completed by more than 6000 learners throughout the United States of America and internationally. The educational impact was monitored using a variety of evaluation strategies. The results, in terms of achieved knowledge and overall satisfaction, indicate that a focus on online instruction combined with face-to-face, hands-on activities showed statistically significant improvement in the learners' understanding of the course material, while also validating the impact of the curriculum in their workplace. As illustrated through the blended course design, this study further showed that online learners with greater improvement in their pre- and posttest scores also exhibited significantly greater likelihood in demonstrating competency in several areas during the hands-on portion of the course. In particular, participants working in the information systems field exhibited the highest mean difference score (21.49) on the pre- and the posttests, while those working in the laboratory had the lowest (12.17). Likewise, the odds that participants who reviewed the course contents sought to further understand their job roles were 58.2 times greater for those in information systems, while they were only 19.0 times greater for laboratory staff, than those who did not review their job roles.

Keywords: blended learning; online learning; competency; constructivism

Introduction

Since the late 1990s, there have been numerous efforts to incorporate distance learning content into pedagogical initiatives (Garrison & Kanuka, 2004; Weiss, Knowlton, & Speck, 2000). Such efforts have often been well received, primarily because online instruction has been considered to be a cost-effective way of improving learning outcomes (Gabriel, 2011; Hiltz & Turoff, 2002). The inclusion of virtual content allows participants to practice tasks asynchronously at their own pace, while also receiving immediate feedback which is often constructive as well as personalized. Likewise, online instruction can be highly visual, providing mechanisms for learning that are not available in most face-to-face classroom environments (Bransford, Brown, & Cocking, 2000; Christensen, Horn, & Johnson, 2008).

*Corresponding author. Email: [email protected]

Interactive Learning Environments, 2011, 1–13, iFirst article
ISSN 1049-4820 print/ISSN 1744-5191 online
© 2011 Taylor & Francis
DOI: 10.1080/10494820.2011.593524
http://www.informaworld.com

Yet, while online instruction does clearly offer a wide range of features not readily accessible in the physical classroom, challenges often arise in enabling participants to actually demonstrate what they know. Instead, online instruction focusing on acquisition of a particular skill is usually "slide based," with learners progressing from one concept to the next in a fairly linear manner. What is often lacking from this process is the sense that the participant is experiencing a complex life-like scenario (Hron & Friedrich, 2003). As noted by Bonk and Graham (2005), the absence of meaningful face-to-face social interaction may even impede the entire online learning process.

Consequently, due to the challenges of mimicking what may actually occur in the physical world, it is necessary to develop ways in which a richer level of understanding can be fostered online. Although one option is to develop high-end software simulations, a more realistic and cost-effective solution is to provide users with downloadable competency-based documents that can be printed out and incorporated into face-to-face training sessions, in conjunction with other online activities. An essential element of this process is the facilitation of instructive and formative assessment of hands-on tasks, along with ways in which new understandings can be reinforced through online mechanisms (Johnson & Johnson, 2004). In short, a blended approach is often a necessity.

To examine the distinctions between online and hands-on learning, this study reports on the findings from a course that incorporated both virtual and face-to-face elements. The research question upon which this study was based is as follows:

• Did online learners with greater improvement on their pre- and posttest scores exhibit significantly greater likelihood in demonstrating competency during the hands-on portion of the course?

Theoretical framework

The justification for the incorporation of hands-on activities within an online course exists in educational theory. The most common pedagogical approach within online education has traditionally been objectivism. With an objectivist instructional approach, the goal of learning is to gain the knowledge that is transmitted. For example, learners are often expected to read a series of slides and then complete an online multiple-choice examination. Yet, this means of conveying knowledge is not likely to be effective when focusing on complex problem-solving tasks involving the actual demonstration of what people know (Gagne, Briggs, & Wager, 1992).

Constructivism is an alternative educational paradigm, holding that knowledge is not conveyed directly from one learner to another, but is "constructed" by individuals. Within this epistemology, learning is considered to be an active process in which meaning comes from experience. In this type of classroom environment, students learn from sharing perspectives with others, often in social contexts, and adjust their previously held notions accordingly in order to respond to new perspectives (Duffy & Jonassen, 1992). Therefore, following constructivist reasoning, learning should take place in an authentic, hands-on manner in which students have many opportunities to build their understanding and to be able to practice transferring their knowledge and skills to new situations (Cho & Schunn, 2003).

More specifically, Lave and Wenger (1991) argued that learning should be considered as a situational activity that has at its foundation a process referred to as "legitimate peripheral participation." In this regard, novices have the opportunity to learn how to respond to a given situation based on watching and learning with an expert who has a high level of knowledge and skill, as specified by the given situation. They asserted that learning could then become more social and interactive, with challenging tasks that can be best accomplished through guided instruction. Likewise, Lave and Wenger's (1991) primary unit of analysis was not the individual as learner, or the institutions in which learners interacted, but the "communities of practice" which people formed as they pursued shared enterprises. Taking each of the defining characteristics of this theory into consideration, Wenger (1998) then defined the notion of a community of practice as being characterized by coherence through mutual engagement in a meaningful joint enterprise, thereby producing interdependent relationships sustained by mutual accountability. Within the community, social practices were said to emerge as resources for negotiating meaning. Practices included routines, tools, ways of doing things, shared stories, and gestures.

Although communities of practice can exist online, such as within discussion boards and newsgroups, several researchers have indicated that entirely online learning environments generally fail to provide the communication mechanisms that learners often need in order to advance (Bonk & Graham, 2005). Rather, what is called for is the inclusion of blended approaches, which have elements of authentic participation in online environments. But how can this be achieved?

As noted by Garrison and Kanuka (2004), this fusion of hands-on and online learning can result in an enhanced learning experience, primarily because course participants are able to better develop critical thinking skills while also collaborating with peers in a manner that is difficult, if not impossible, to foster with just one learning modality. Examples of this blended approach's success in the literature are predominantly found in undergraduate level degree programs. For instance, at the University of Central Florida, Dziuban, Hartman, Juge, Moskal, and Sorg (2006) asserted that students participating in more than 100 blended courses performed at a higher level, while also indicating a more significant degree of satisfaction than learners in solely face-to-face courses and fully online courses. Further, faculty teaching the blended courses also noted these performance and attitudinal differences. Dirkx and Smith (2004) also found that learners in solely online courses are often reluctant, frustrated, and dissatisfied with virtual collaborative learning methods, especially when working within small online groups, because they "struggle with the development of a sense of interdependence and intersubjectivity within their online groups, but end up holding fast to subjective, individualistic conceptions of learning" (p. 134). The authors additionally suggested that these aspects can be exacerbated in entirely online environments, due to the difficulty in providing the emotional dynamics, which are often cited as being a critical element of the collaborative learning process. Likewise, Kirkley and Kirkley (2005) argued that difficulties might be more likely to occur when online learners try to reach a consensus in online group work, since there are no verbal or facial cues to help resolve possible conflicts. An adequate solution is, thus, to enhance online modalities, when possible, with traditional elements of face-to-face instruction.


Although less pronounced at the K-12 level as a curriculum trend, blended learning has still been gaining a foothold in US schools. In recent years, an emphasis on Web 2.0 technologies combined with more technologically oriented preservice education programs has resulted in new instructional modalities. Many educators and administrators have thus looked to blended learning more so than in the past. For instance, in Disrupting Class, Christensen et al. (2008) note that by 2019, 50% of all high-school courses in the United States are likely to be delivered online, with blended learning being a key component. The implications of this trend are promising, given that the technologies available for the incorporation of face-to-face curricula in online courses continue to improve, with more educators becoming aware of how to combine both modalities and leverage the benefits.

In addition to new realizations within the literature on the need for blended learning, there have also been efforts in evaluating the effectiveness of such endeavors.

To assess effectiveness, a regression-based model can be applied to evaluate the association between students' scores in a test instrument, in order to predict their subsequent performance (Allen & Yen, 2001; Crocker & Algina, 2006). Regression-based approaches examine the association between two measures where, for example, a student's test score is assumed to predict an outcome (i.e. subsequent performance); that is, the variability in the outcome measure is explained using the test score. This means that if there is a greater proportion of variance accounted for by the test score, then the regression-based model implies that the test achieved a level of predictive validity. In fact, many large-scale assessments use this method to strengthen the predictive validity of their tests. For example, the Scholastic Aptitude Test (College Board, 2008) treats the assessment as a measurement of students' high-school scholastic ability and uses students' first-year college grade point average as the outcome variable to measure their performance in the future. From a blended-learning environment, Lynch and Dembo (2004) conducted a study that examined the predictive effect of self-regulation on the final grades of students. Using correlations and regression-based analyses, they examined whether self-regulatory attributes were predictive of distance learner success. They concluded that inferences based on strong associations between the two were implications of predictive validation for the effectiveness of the assessment.
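The regression-based validation logic described above amounts to regressing a later outcome on an earlier test score and inspecting the share of outcome variance (R²) the score explains. A minimal sketch, using invented scores and outcomes (nothing below comes from the study's data):

```python
import numpy as np

# Hypothetical test scores and a later outcome (e.g. subsequent performance).
score = np.array([55., 60., 65., 70., 75., 80., 85., 90.])
outcome = np.array([2.1, 2.4, 2.5, 2.9, 3.0, 3.3, 3.4, 3.8])

# Least-squares line: outcome predicted from score.
slope, intercept = np.polyfit(score, outcome, 1)
predicted = slope * score + intercept

# R^2: proportion of outcome variance accounted for by the test score.
ss_res = ((outcome - predicted) ** 2).sum()
ss_tot = ((outcome - outcome.mean()) ** 2).sum()
r_squared = 1 - ss_res / ss_tot
```

In this framework, a large R² would be read as evidence of predictive validity; a small one, as evidence that the test says little about later performance.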

The implications of this model for the evaluation of blended learning environments are noteworthy. Although traditional online evaluation procedures are designed to rank students by their overall performance, it is also necessary in blended environments to implement further evaluation components that can link pre- and postassessments and course knowledge to fine-grained hands-on tasks that are additionally measured in the face-to-face environment. As such, this extends beyond broad domain-based scores that have been found to be difficult to interpret in conjunction with demonstrated performance. In other words, based on test and hands-on performance, learner profiles (i.e. a list of competencies that a specific participant has demonstrated the ability to perform well) can be created to provide diagnostic feedback for further instruction and training purposes. It is this dual-assessment component of the online and face-to-face sections that adds to the effectiveness of the blended environment.

Recently, studies have also evaluated the blended course to establish both validity and reliability for online learning environments. In Barnard, Lan, To, Paton, and Lai's (2009) recent study, a confirmatory factor analysis was used to separately test the validity and the reliability of online learners and blended learners on their measurement of self-regulation. Confirmatory factor analysis allows a reliable measure of a directly unobserved construct (e.g. students' performance, ability, or psychological perception) using several observable measures. This approach has been used for assessing psychological measures due to its capability of addressing both reliability and validity issues (see Bollen, 1989). They found that their approach supported validity for a higher-order factor that was measured in their study. They noted that further validation utilizing information from both online and blended learning environments will be needed as the demand for courses that deviate from traditional face-to-face environments increases.

The aforementioned evaluation approaches provide the analytical framework for this study. The online portion of the course can be thought of as the assessment, which measures the effectiveness of the course, whereas the face-to-face component of the course can be considered as a measurement of future performance. The treatment of the online course and the hands-on section to follow the framework of validity studies is unique to this study and is discussed further in the "results" section.

Blended course design

The blended course examined for this study was specifically designed for the public health workforce. It was freely available as a resource from March 2005 until March 2009. The first portion was an online training program providing the knowledge required to prepare learners to respond to natural and human-made disasters. It included a pretest, interactive slides, and then a posttest. The second phase involved the incorporation of hands-on face-to-face activities, such as the inclusion of a downloadable follow-up evaluation. Figure 1 illustrates the curricular flow of the online course.

Figure 1. Curricular flow of the online course.

The printed follow-up evaluation portion of the training needed to be completed at the participant's workplace and was entirely hands-on. Participants were asked to work with colleagues to determine how they would solve the problem during an emergency, what their new functional role might be, what type of equipment they would need to use, and who they should contact for back-up support. As a means of verification, the learner's supervisor then took on the responsibility of approving the completed electronic form and verifying in the online learning management system that the participant's performance was satisfactory. The supervisor's approval was key to the reliability of the follow-up evaluation. It not only provided face-to-face validity of the employee's response but also functioned as a source of information to confirm the curricular value of the online course. At this point, the learner could also receive an online certificate of completion.

Methods

Results from the follow-up evaluation were used to assess the effectiveness of the blended course. In other words, it served as validation of the course's impact. Current studies on the effectiveness of online courses rely on the pre- and posttest scores of participants. The method of analysis conducted was a combination of linear and logistic regressions (Agresti, 2002).

To account for pretest and posttest changes as predictors for participants' responses in the follow-up evaluation, a difference score between the pre- and the posttest scores was first calculated. Here, the difference scores were assumed to be the educational change resulting from the course. Using the difference score as a predictor, a logistic regression was fit for items that had dichotomous responses (items 3 and 4 from the follow-up evaluation), and a linear regression was fit for items with continuous outcomes (item 5 from the follow-up evaluation). Figure 2 shows the selected items from the follow-up survey used for this analysis. Three items from the follow-up survey were selected as outcome variables for the analysis; these items represented quantifiable variables, whereas the remaining items were qualitative. Item 3 asked respondents whether they understood their division's role, and item 4 asked whether they had practiced their functional roles. The logistic regressions modeled the likelihood that the respondents answered "yes" given their difference score. Item 5 asked respondents to check a list of eight communication equipment items that they were supposed to know how to use. This was converted to a score out of 8, and this score was regressed on the difference score to predict the number of communication equipment items.
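As a concrete, and entirely hypothetical, sketch of the dichotomous-item analysis, the following fits a one-predictor logistic regression of a yes/no response on the difference score. The data are invented, and the hand-rolled Newton-Raphson fitter is merely a stand-in for whatever statistical package was actually used:

```python
import numpy as np

def fit_logistic(x, y, n_iter=25):
    """Fit logit P(y=1) = b0 + b1*x by Newton-Raphson; returns (b0, b1)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))        # predicted P(yes)
        w = p * (1.0 - p)                          # IRLS weights
        hessian = X.T @ (X * w[:, None])
        beta = beta + np.linalg.solve(hessian, X.T @ (y - p))  # Newton step
    return beta

# Invented data: difference scores and a dichotomous "I understand my
# division's role" response (1 = yes, 0 = no).
diff = np.array([0., 5., 10., 15., 20., 25., 30., 35., 40., 45.])
yes = np.array([0., 0., 0., 1., 0., 1., 1., 1., 1., 1.])

b0, b1 = fit_logistic(diff, yes)
odds_ratio = np.exp(b1)  # multiplicative change in odds per score point
```

Exponentiating the slope converts it to an odds ratio, which is how the e^B column of Table 1 and the "times greater" statements in the results are read.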

The regression models attempted to predict the likelihood that the respondent had a greater role in understanding his or her job after completing the course. Figure 3 shows a path diagram that illustrates the analytical framework discussed. The primary assumption in this analysis was that the greater the difference score, the greater the likelihood of a participant saying "yes" and subsequently indicating knowing more.

Data sources

This online course was developed and funded through a partnership between the Centers for Disease Control (CDC) and a large university in the northeastern United States, with all of the data being downloaded from a secure MySQL database. Based on over 6000 participants who completed the online course, 1676 respondents, representing 719 unique workplaces from over 12 countries and 42 states within the United States, completed the follow-up evaluation to be used for this study.

Results

Based on results from the follow-up evaluation, this study provides an analysis of participants' performance on their pretest (M = 74.85, SD = 18.70) and posttest (M = 91.62, SD = 10.41) scores, with posttest scores showing a significant improvement from the pretest scores, t(1676) = 41.53, p < 0.001.
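The pre/post comparison above is a paired (repeated-measures) t-test. A sketch with invented scores for eight learners (the study's raw data are not reproduced here; the direction of the gain merely echoes the reported result):

```python
import numpy as np
from scipy import stats

# Invented pre/post scores for eight learners; not the study's data.
pre = np.array([60., 70., 75., 80., 85., 72., 68., 90.])
post = np.array([85., 88., 92., 95., 96., 90., 84., 98.])

# Paired t-test: the same learners measured twice.
t_stat, p_value = stats.ttest_rel(post, pre)
mean_gain = float((post - pre).mean())
```

The pairing matters: each learner serves as his or her own control, so the test is on the within-learner gains rather than on two independent groups.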


Figure 4 shows a scatter plot and the best linear fit of the pretest and the posttest scores. The estimated slope, which indicates the incremental change between the pretest and posttest scores, was 1.17 points, p < 0.001, showing a significant change in the online students' scores.

Figure 2. Selected items from the follow-up survey.


The reliability of the pre- and the posttest items was 0.73 using Cronbach's alpha coefficient, showing that the test was a reliable measurement of the participants' ability using the reliability threshold set by Nunnally (1978).

Figure 3. A path diagram to measure the effectiveness of the blended course.

Figure 4. Scatter plot of pretest and posttest scores.

The pretest and posttest data show that on average, as the participants' pretest scores increased, their posttest scores increased by 0.265 points (p < 0.001). However, to use the printed "hands-on" follow-up evaluation as a validation tool, we used the participant responses as the outcome variable and the difference scores as the predictor, to test whether the entire blended online/face-to-face course was effective. Consequently, as the difference score between the pre- and the posttest increased, we sought to determine if there was a greater likelihood that the participant actually demonstrated his or her knowledge while participating in the hands-on portion of the course.
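A Cronbach's alpha of the kind reported above is computed from the examinee-by-item response matrix using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch on an invented 0/1 matrix (six examinees, four items; the value produced bears no relation to the study's 0.73):

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (examinees x items) score matrix."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                           # number of items
    item_vars = responses.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = responses.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented right/wrong (1/0) responses; not the study's data.
scores = np.array([[1, 1, 1, 1],
                   [1, 1, 1, 0],
                   [1, 1, 0, 0],
                   [1, 0, 0, 0],
                   [0, 1, 1, 1],
                   [0, 0, 0, 0]])
alpha = cronbach_alpha(scores)
```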

Results showed that the blended course was significantly effective when assessing the respondents' application of skills. The likelihood that a participant understood his or her division's role was 1.01 times greater (p < 0.025) than a participant who did not understand his or her role for every increase in the difference score, controlling for the effect that the participant reviewed the content of the course. When controlling for the difference score, the likelihood that a worker who reviewed the course contents actually understood his or her division's role was 13.80 times greater than someone who did not review its contents. Further, when shown their functional roles, the likelihood that participants had practiced these roles was again 1.01 times greater (p < 0.022) as their difference score increased (see Table 1).
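The "times greater" figures in this paragraph are exponentiated logit coefficients. Taking the B values reported in Table 1, the conversion is simply exp(B):

```python
import math

# Coefficients (log-odds scale) as reported in Table 1.
b_difference = 0.009  # per one-point increase in the difference score
b_reviewed = 2.624    # reviewed the course contents (yes vs. no)

odds_per_point = math.exp(b_difference)  # roughly 1.009, the "1.01 times"
odds_reviewed = math.exp(b_reviewed)     # roughly 13.8 times greater

# Per-point odds ratios compound, so a 20-point gain multiplies the odds
# by exp(20 * 0.009).
odds_20_points = math.exp(20 * b_difference)
```

The per-point ratio looks negligible in isolation; compounding it over a realistic gain of many points, as above, is what gives the difference score practical meaning.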

Item 5 inquired about the participant's knowledge of using communication equipment. Among the eight possible types of equipment listed, as the difference score increased, the results indicated that the participant gained competency in using 0.016 more types of equipment (p < 0.001) (see Table 2).

Discussion

Findings from this study indicate that the incorporation of face-to-face tasks in an online course adds considerable value, through improved learning outcomes and knowledge gained. The results from this analysis showed not only the effectiveness of the blended design but also the predictive validity of the online course through the follow-up hands-on assessment. In particular, the item analysis of the proportion of correctly answered test questions illustrates that item 2, pertaining to the learners' understanding of the incident command system (ICS) – an organizational structure used in the public health field to improve emergency response operations (Gebbie, Valas, Merrill, & Morse, 2006) – exhibited the greatest point difference in improvement. The ICS portion of this course was intended to provide learners with an enhanced understanding of their functional roles during an emergency situation. Since these functional roles were often different than what learners might do during day-to-day activities, a proper understanding of the new tasks involved was a fundamental aim of the course.

Table 1. Summary of logistic regression for selected items in the follow-up evaluation.

I understand my division's role
  Predictor                       B        SE B     e^B
  Difference score                0.009    0.004     1.009
  I have reviewed its contents    2.624    0.138    13.797
  Constant                       -0.078    0.113
  χ² = 437.500, df = 2.000, % Yes = 76.910

I have practiced the functional roles
  Predictor           B           SE B     e^B
  Difference score    0.007*      0.003     1.007
  Constant            0.971***    0.077
  χ² = 5.270, df = 1.000, % Yes = 69.990

Note: e^B = exponentiated B. Items coded as 1 for yes and 0 for no.
*p < 0.05; **p < 0.01; ***p < 0.001.

Further, when controlling for the difference score, the likelihood that a participant who reviewed the course contents actually understood his or her division's ICS role for the hands-on portion of the course was 13.80 times greater than someone who did not review its contents. This latter finding suggests that learners were able to transfer and synthesize the knowledge gained from the online, didactic portion of the course (consisting of the pre- and posttests) to the hands-on, face-to-face activity. This was evident for all of the job roles, ranging from clinicians to those in information technology. More specifically, when considering that the primary aim of this blended course was to enable participants to gain a stronger familiarity with their potential functional roles during an emergency response, it is clear that this objective was achieved.

Additionally, when examining these findings in relation to the participants' job roles, the implications are also illuminating. For instance, while participants working in the information systems field exhibited the highest mean difference score (21.49) on the pre- and the posttests, those working in the laboratory had the lowest (12.17). Likewise, the likelihood that participants who reviewed the course content sought to further understand their job roles was 58.2 times greater for those in information systems, while it was only 19.0 times greater for laboratory staff, than those who did not review their job roles. These findings suggest that the participants' job role was a key indicator of performance on the hands-on portion of the course. More specifically, the blended learning modality seems to have benefited those working at computer work stations more so than those performing various procedures in a laboratory setting. These distinctions need to be analyzed more closely and will be the subject of future research. It is certainly possible that the information systems staff may have been more adept at completing and submitting the online form, and receiving subsequent follow-up information electronically from their supervisors. In other words, the information systems staff were most likely working in an environment that was more conducive to distance learning, while also possessing the self-regulatory attributes that are often predictive of distance learner success.

Table 2. Summary of regression analysis for variables predicting the number of communication equipment items in the follow-up evaluation.

Outcome: number of communication equipment items

Variable            B       SE B    β
Difference score    0.016   0.004   0.102***
Constant            5.855   0.088

R²                  0.011
F                   17.750

Note: *p < 0.05; **p < 0.01; ***p < 0.001.
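The model summarized in Table 2 is a simple regression with the difference score as the sole predictor of the equipment count. A sketch on simulated data follows; the data and fitted coefficients are illustrative and will not reproduce the published values (B = 0.016, constant = 5.855).

```python
# Ordinary least-squares sketch of the Table 2 regression:
# equipment items ~ constant + B * difference score, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
diff = rng.uniform(0, 40, size=200)                 # simulated difference scores
items = 5.9 + 0.016 * diff + rng.normal(0, 1, 200)  # simulated equipment counts

X = np.column_stack([np.ones_like(diff), diff])     # intercept column + predictor
beta, *_ = np.linalg.lstsq(X, items, rcond=None)    # beta = [constant, slope]
print(beta)
```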


The example presented using strata comparison by position shows another advantage of the blended course design. Repeated measures of the effectiveness of the course content, using both online and hands-on features, revealed differences that may not have been apparent when only the online portion of the course was evaluated. Further, as indicated in our review of the literature, the intent to evaluate both online and hands-on mediums in a single course is not very common. Although a confirmatory factor analysis model is often applied in validity studies to evaluate the association between students' scores on printed test instruments and their subsequent performance, this approach is rarely integrated into evaluations of blended courses.

Conclusion

When online courses include hands-on, face-to-face tasks in which students need to transfer their knowledge and skills to new situations (Cho & Schunn, 2003), the evaluation of the entire learning process is critical. In short, how can we ensure that knowledge transfer from one modality to another is actually taking place? This study evaluated this process from the virtual to the hands-on, by first considering the effectiveness of the overall blended course design, and then the predictive validity of the online course through the follow-up hands-on assessment.

Within the larger realm of pedagogical theory, this form of analysis can help answer the perennial question: How do people learn best in blended online/face-to-face environments? Although this study only examined the probability that an online learner would demonstrate an enhanced understanding of a hands-on skill, the findings nonetheless provide a window into the knowledge transfer process from objectivist to hands-on tasks.

The evaluation approach outlined in the preceding paragraphs is also particularly important, since most learning situations are inherently social activities (Lave & Wenger, 1991), with actual participation as a primary component. In this regard, the hands-on component of this online course was an essential mechanism for bringing the online learners closer to new realizations about their functional roles during emergency response. Although these same realizations could have occurred in an entirely online course, the learners' ability to practice in their own "face-to-face" work environment clearly added value. Thus, by analyzing the effectiveness of the online and face-to-face elements in a single course, this study has the potential to help other educators better design and facilitate other instructional modalities. Blended learning is an emerging area of research, and one in which new evaluation approaches are sorely needed.

In the future, the authors intend to conduct further analysis of the hands-on elements of this learning environment, in order to assess the ways in which specific competencies were understood by different groups of learners. For instance, since participants in some job roles were much less likely than others to engage in the hands-on tasks, more attention needs to be paid to the reasons why this occurred. The continued use of a regression-based evaluation model to predict subsequent performance on the hands-on assessment will make this possible.

Notes on contributors

Dr. Thomas Chandler is an Associate Research Scientist at Columbia University, and an Adjunct Assistant Professor of Communication, Computing, and Technology in Education at Teachers College, Columbia University. His research focuses on blended learning environments, cognition and knowledge representations, and fostering metacognition through "just in time" instruction. Dr. Chandler has more than 10 years of experience in instructional design, evaluation, and distance learning. He is also an author of the Teaching the Levees curriculum guidebook pertaining to the disaster response during and after Hurricane Katrina.

Dr. Yoon Soo Park is a Senior Research Project Manager at the Educational Testing Service. He is interested in statistical models that describe psychological and social processes. In particular, his research focuses on refining psychometric models that involve raters. He is also interested in applied statistical methods that use clustered data, multiple imputation, and Bayesian estimation techniques.

Karen L. Levin, RN, BSN, MPH, MCHES, is the Director of Columbia University's Regional Learning Center for Preparedness and Emergency Response, at the National Center for Disaster Preparedness. She is a nurse, an epidemiologist, and a health educator. Karen has written and developed several online courses for the public health workforce and has served on the CDC's Expert Panel for Rapid Needs Assessment post disasters, while also being a panelist for the Federal Education Interagency Group for National Consultation. She has more than 15 years of experience designing, developing, and evaluating public health learning projects in both distance learning and face-to-face environments.

Dr. Stephen S. Morse is a Professor of Clinical Epidemiology and Founding Director of the Center for Public Health Preparedness, Columbia University. Dr. Morse's interests focus on the epidemiology of infectious diseases and improving disease early warning systems. He was the founding chair of ProMED (the nonprofit international Program to Monitor Emerging Diseases) and was one of the originators of ProMED-mail, an international network inaugurated by ProMED in 1994 for outbreak reporting and disease monitoring using the Internet. He is currently Director of the PREDICT Program, sponsored by USAID. His book, Emerging Viruses (Oxford University Press), was selected by American Scientist for its list of "100 Top Science Books of the 20th Century".

References

Agresti, A. (2002). Categorical data analysis (2nd ed.). Hoboken, NJ: Wiley.

Allen, M.J., & Yen, W.M. (2002). Introduction to measurement theory. Prospect Heights, IL: Waveland Press.

Barnard, L., Lan, W.Y., To, M.Y., Paton, V.O., & Lai, S.-L. (2009). Measuring self-regulation in online and blended learning environments. Internet and Higher Education, 12, 1–6.

Bollen, K.A. (1989). Structural equations with latent variables. New York, NY: Wiley.

Bonk, C., & Graham, C. (2005). Handbook of blended learning: Global perspectives, local designs. New York, NY: Jossey Bass.

Bransford, J.D., Brown, A., & Cocking, R. (Eds.). (2000). How people learn: Mind, brain, experience and school (Expanded ed.). Washington, DC: National Academy Press.

Cho, K., & Schunn, C. (2003). Successful factors in networked collaboration. In R. Alterman & D. Kirsh (Eds.), Proceedings of the 25th Annual Conference of the Cognitive Science Society. Mahwah, NJ: Erlbaum. Retrieved from http://csjarchive.cogsci.rpi.edu/proceedings/2003/mac/index.html

Christensen, C.M., Horn, M.B., & Johnson, C.W. (2008). Disrupting class: How disruptive innovation will change the way the world learns. New York, NY: McGraw-Hill.

College Board. (2008). SAT validity studies. Retrieved February 12, 2011, from http://professionals.collegeboard.com/profdownload/Validity_of_the_SAT_for_Predicting_First_Year_College_Grade_Point_Average.pdf

Crocker, L., & Algina, J. (2006). Introduction to classical and modern test theory. Pacific Grove, CA: Wadsworth.

Dirkx, J.M., & Smith, R.O. (2004). Thinking out of a bowl of spaghetti: Learning to learn in online collaborative groups. In T.S. Roberts (Ed.), Online collaborative learning: Theory and practice (pp. 132–159). Hershey, PA: Information Science Publishing.


Duffy, T.M., & Jonassen, D.H. (Eds.). (1992). Constructivism and the technology of instruction: A conversation. Hillsdale, NJ: Lawrence Erlbaum.

Dziuban, C., Hartman, J., Juge, F., Moskal, P., & Sorg, S. (2006). Blended learning enters the mainstream. In C.J. Bonk & C.R. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs (pp. 195–208). San Francisco, CA: Pfeiffer.

Gabriel, T. (2011, April 6). More pupils are learning online, fueling debate on quality. The New York Times, p. A1.

Gagne, R.M., Briggs, L.J., & Wager, W.W. (1992). Principles of instructional design (4th ed.). Belmont, CA: Wadsworth/Thomson Learning.

Garrison, R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. Internet and Higher Education, 7, 95–105.

Gebbie, K.M., Valas, J., Merrill, J., & Morse, S. (2006). Role of exercises and drills in the evaluation of public health in emergency response. Prehospital Disaster Medicine, 21, 173–182.

Hiltz, S.R., & Turoff, M. (2002). What makes learning effective? Communications of the ACM, 45, 56–59.

Hron, A., & Friedrich, H.F. (2003). A review of web-based collaborative learning: Factors beyond technology. Journal of Computer Assisted Learning, 19, 70–79.

Johnson, D.W., & Johnson, R.T. (2004). Cooperation and the use of technology. In D.H. Jonassen (Ed.), Handbook of research on educational communications and technology (2nd ed., pp. 785–811). Mahwah, NJ: Lawrence Erlbaum Associates.

Kirkley, S., & Kirkley, J. (2005). Creating next generation blended learning environments using mixed reality, video games and simulations. TechTrends: Linking Research & Practice to Improve Learning, 49, 42–89.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

Lynch, R., & Dembo, M. (2004). The relationship between self-regulation and online learning in a blended learning context. International Review of Research in Open and Distance Learning, 5, 1–16.

Nunnally, J.C. (1978). Psychometric theory (2nd ed.). New York, NY: McGraw-Hill.

Weiss, R.E., Knowlton, D.S., & Speck, B.W. (Eds.). (2000). Principles of effective teaching in the online classroom. San Francisco, CA: Jossey-Bass.

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge: Cambridge University Press.
