
ORIGINAL REPORTS

The Virtual-Patient Pilot: Testing a New Tool for Undergraduate Surgical Education and Assessment

Rachel L. Yang, BA, Daniel A. Hashimoto, BA, Jarrod D. Predina, MD, Nina M. Bowens, MD, Elizabeth M. Sonnenberg, BA, Emily C. Cleveland, BA, Charlotte Lawson, BA, Jon B. Morris, MD, and Rachel R. Kelz, MD

Department of Surgery, University of Pennsylvania Health System, Philadelphia, Pennsylvania

BACKGROUND: The virtual patient (VP) is a web-based tool that allows students to test their clinical decision-making skills using simulated patients.

METHODS: Three VP cases were developed using commercially available software to simulate common surgical scenarios. Surgical clerks volunteered to complete VP cases. Upon case completion, an individual performance score (IPS, 0-100) was generated and a 16-item survey was administered. Surgery shelf exam scores of clerks who completed VP cases were compared with a cohort of students who did not have exposure to VP cases. Descriptive statistics were performed to characterize survey results and mean IPS.

RESULTS: Surgical clerks felt that the VP platform was simple to use, and both the content and images were well presented. They also felt that VPs enhanced learning and were helpful in understanding surgical concepts. Mean IPS at conclusion of the surgery clerkship was 69.2 (SD 26.5). Mean performance on the surgery shelf exam for the student cohort who had exposure to VPs was 86.5 (SD 7.4), whereas mean performance for the unexposed student cohort was 83.5 (SD 9).

DISCUSSION: The VP platform represents a new educational tool that allows surgical clerks to direct case progression and receive feedback regarding clinical-management decisions. Its use as an assessment tool will require further validation. (J Surg 70:394-401. Published by Elsevier Inc. on behalf of the Association of Program Directors in Surgery)

KEY WORDS: virtual patient, simulation, medical education, assessment, web-based learning

All authors have read and approved the manuscript.

This work is previously unpublished.

*Correspondence: Inquiries to Rachel R. Kelz, MD, MSCE, Department of Surgery, UPHS, 4 Silverstein, 3400 Spruce Street, Philadelphia, PA 19104; fax: 215 662 7476; e-mail: [email protected]


COMPETENCIES: Patient Care, Medical Knowledge, Practice-Based Learning and Improvement

INTRODUCTION

Educational technology can aid cognitive development, not merely serving as an alternative form of information delivery.1

As such, medical educators, administrators, and students are eager to identify new medical education technology that can be implemented within the medical school curriculum. The virtual patient (VP), "an interactive computer simulation of real-life clinical scenarios for the purpose of healthcare and medical training, education or assessment", permits the application of clinical reasoning to the care of simulated patients.2 VPs have been implemented in undergraduate medical education3,4 and have been explored in surgical education.5

Surgeons appreciate the complex clinical decision-making that occurs outside of the operating room, but surgical clerks often do not recognize the elaborate cascade of events that takes someone safely through a surgical encounter. There are few currently available educational tools that thoroughly address the students' need to apply clinical reasoning and receive appropriate feedback on their skills. Furthermore, close scrutiny of patient outcomes has limited opportunities for medical students to test their clinical decision-making skills prior to graduation. The virtual surgical patient provides a platform for medical students to direct case progression without risking harm to a real patient.

The VP can also be used as an assessment tool. High-fidelity simulators and standardized patients are often used for assessment, yet VPs are infrequently implemented for this purpose. Simulation-based assessment is commonly used in graduate medical education,6 but at the undergraduate medical education level it has mostly been reserved for high-stakes exams.7-9 Currently, little is known about the use of the VP simulation as a cognitive-assessment tool for medical students.10


We report the results of a pilot study aimed to test the feasibility of developing VP cases for surgical clerks, including opinions regarding the usability of the platform, the VP concept, and information on clerk performance.

FIGURE 1. Branching decision trees were generated for each question embedded in the VP cases, allowing for student-guided case progression. Point deductions were assigned based on the severity of clinical consequences resulting from each incorrect answer choice. The sample item shown asks, "A lump is noted in the left breast on physical exam. What should be done next?" with the answer choices Left breast mammogram (-10 points), Bilateral breast mammogram, Bilateral breast MRI (-20 points), and Core Biopsy (-30 points).

METHODS

Platform Development

Using the commercially available Discourse LLC learning platform (JC 2009-2012), we developed 3 VP cases. Medical students, residents, and a faculty advisor worked together to accomplish this task. The cases were designed for the undergraduate medical student, specifically the surgical clerk, and focused on decision-making during the care of the surgical patient. The first 3 cases were chosen to address common surgical problems: the outpatient presentation of a breast mass, the emergency room presentation of a small-bowel obstruction related to Crohn's disease, and the inpatient deterioration of a postoperative patient experiencing sepsis.

Each case was designed with case-specific learning objectives (Table 1). For each learning objective, key clinical decisions were identified to direct development of the patient scenarios. Free-response and multiple-choice questions based on the key clinical decisions were developed. Branching decision trees based on student responses to the questions were used to individualize progression through each case (Fig. 1). Faculty experts selected for their clinical expertise and interest in medical education reviewed the content of each case for accuracy and appropriateness.
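To make the branching structure in Figure 1 concrete, the following sketch shows one way a decision node with graded point deductions could be represented. It is a minimal illustration in Python, not the authors' actual Discourse LLC implementation; all names (Option, DecisionNode, breast_workup) are hypothetical, and the sample choices are transcribed from Figure 1.

# Hypothetical sketch of a branching decision node with point
# deductions, modeled on the Figure 1 example; not actual platform code.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Option:
    text: str                  # answer choice shown to the student
    deduction: int = 0         # points deducted if this choice is selected
    next_node: Optional["DecisionNode"] = None  # branch taken next

@dataclass
class DecisionNode:
    prompt: str
    options: List[Option] = field(default_factory=list)

# The Figure 1 sample question: only the correct choice carries no
# deduction; clinically worse choices cost more points.
breast_workup = DecisionNode(
    prompt="A lump is noted in the left breast on physical exam. "
           "What should be done next?",
    options=[
        Option("Left breast mammogram", deduction=10),
        Option("Bilateral breast mammogram"),  # correct: no deduction
        Option("Bilateral breast MRI", deduction=20),
        Option("Core biopsy", deduction=30),
    ],
)

Because each Option can point to its own next node, a student's answer determines both the point deduction and the path the case takes, which is what permits student-guided case progression.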

The clinical scenarios and questions were then converted into a web-based VP interface. Students were able to access the VP platform through a web browser using a login and password. After logging on to the platform, students could choose the assigned case from a case library. Once the case was selected, students were presented with a brief clinical history and an image of the patient to personalize the "doctor-patient" experience (Fig. 2).

TABLE 1. Case-Specific Learning Objectives

Breast Case
(1) Generate a differential diagnosis for a patient with a breast mass, based on demographics and personal risk factors.
(2) Guide the appropriate workup of a breast mass, including laboratory tests, imaging studies, and tissue biopsy.
(3) Understand surgical management of different breast neoplasms.

IBD Case
(1) Recognize the presentation of a small-bowel obstruction and be able to guide the workup for confirming the diagnosis.
(2) Appropriately manage a patient with a small-bowel obstruction.
(3) Identify and apply key principles in the surgical management of Crohn's disease.

Sepsis Case
(1) Guide the initial workup for a patient with postoperative fever.
(2) Identify indications for higher level clinical care and invasive monitoring.
(3) Determine the initial steps in treating the infectious source of a patient with sepsis.

For the breast disease case, students were introduced to the patient in the outpatient clinic; for the inflammatory bowel disease case, students were introduced to the patient in the emergency room; and for the sepsis case, students were introduced to the patient in the inpatient ward. The students could advance through the case by choosing amongst opportunities to gather pertinent medical history, perform aspects of the physical examination, order laboratory tests or radiographic studies, and execute discrete interventions. Students would make key clinical decisions (multiple-choice and free-response questions) regarding further diagnostic workup or therapeutic interventions based on the information gathered under their direction. Students could choose to skip through the physical exam, laboratory tests, or radiographic studies, but without gathering this vital information they would not be able to correctly answer the assessment questions. Upon case completion, an individual performance score (IPS, 0-100) was generated and incorrect answers were explained. After viewing their IPS, students were directed to a case-specific educational document for review.

FIGURE 2. After selection of the VP case, students meet the patient and are provided with a brief clinical history as shown above. The student can then choose to gather pertinent medical history, perform aspects of the physical examination, order laboratory tests or radiographic studies, and progress through the case by executing discrete interventions.

From start to finish, development of the 3 cases took approximately 4 months. All 3 VP cases were successfully implemented. The sepsis case had a technical issue during the orientation that precluded scoring of the case for preclerkship performance and analysis. The case was fully functional by the final week of the clerkship.

Scoring System

The initial scoring system was designed to provide an uncomplicated evaluation of student performance for this pilot study. Performance scores were based on answers to the free-response and multiple-choice questions testing appropriate diagnostic and therapeutic maneuvers. Students were given 100 points each at the start of each case. In essence, each student was assumed competent at the start and rewarded with the privilege of caring for the VP. However, when an incorrect decision was made, points were deducted in a step-wise fashion to hold the student accountable for the erroneous decision. No extra credit was given for correct answers, but those students who selected appropriate answers benefitted indirectly by not having points deducted. The system of point deduction was intended to mimic real-life clinical decision-making, in which physicians are held accountable for medical decisions that deviate from the standard of care or result in patient harm. The grading schema was geared toward the learning objectives for each case, with point deductions chosen with consensus from the authors. Points were deducted only around items preordained as key clinical decisions. For example, in a case with objectives centered on diagnostic accuracy, points were deducted for explicit outcome measures such as the wrong diagnosis. Errors that would cause significant harm to the patient led to termination of the case and directed students to their IPS with explanations regarding their incorrect answers.
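As a worked illustration of this schema, the sketch below scores one pass through a case: the student starts at 100 points, loses points only at preordained key decisions, and an error causing significant harm terminates the case. All names are hypothetical; this is a minimal sketch, not the platform's actual code.

# Hypothetical sketch of the deduction-based IPS calculation
# described above; not the actual platform code.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Response:
    deduction: int           # 0 for a correct choice at a key decision
    critical: bool = False   # True if the error would cause significant harm
    explanation: str = ""    # feedback shown upon case completion

def score_case(responses: List[Response]) -> Tuple[int, List[str]]:
    """Return the IPS (0-100) and explanations for incorrect answers."""
    score = 100  # each student starts assumed competent
    explanations = []
    for r in responses:
        if r.deduction > 0:
            score -= r.deduction              # step-wise deduction
            explanations.append(r.explanation)
        if r.critical:
            break  # harmful errors terminate the case immediately
    return max(score, 0), explanations        # IPS is bounded below at 0

ips, feedback = score_case([
    Response(0),  # correct answer: no deduction, no extra credit
    Response(20, explanation="Bilateral MRI is not the initial study."),
    Response(30, critical=True,
             explanation="Proceeding to biopsy here risks patient harm."),
])
print(ips)  # 50

Note that there is no reward for correct answers; students benefit only by avoiding deductions, mirroring the accountability rationale described above.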

Student-Case Completion and Survey

We implemented our pilot study with 1 block of surgery clerkship students. At the introduction of the surgery clerkship, 27 of the 33 clerkship students were able to complete VP cases, and 31 of 33 participated during the final week of the clerkship. During the surgery clerkship orientation, each student was assigned an anonymous username to access the VP. Students were asked to complete 1 randomly assigned case during orientation. In the final week of the clerkship, surgical clerks were asked to complete the same VP case they had completed preclerkship, and an additional randomly assigned new case.

Following completion of the case(s), an anonymous, voluntary, 16-item survey was distributed. The survey queried 3 basic domains: platform usability, self-reported confidence with clinical topics, and perceptions regarding the role of the VP in surgical education and assessment. The responses were rated using a 5-point Likert scale. Students rated the platform's ease of use, clarity of content, and presentation of content and images (1, below average; 5, excellent). They rated their comfort in managing surgical patients, critically ill patients, patients with inflammatory bowel disease, and patients with breast disease (1, not comfortable; 5, very comfortable). Lastly, they rated their agreement with statements regarding the use of VPs as a method for surgical education and assessment (1, strongly disagree; 5, strongly agree).

Performance on National Board of Medical Examiners Shelf Examination

We obtained aggregate National Board of Medical Examiners (NBME) surgery shelf scores for the cohort of students who completed the VP modules. We also obtained aggregate NBME surgery shelf scores for a similar-sized cohort of students who completed their surgery core clerkship 3 months prior and were not exposed to the VP modules. This unexposed cohort of students served as our control group.

Statistical Analysis

Descriptive statistics were performed to characterize the survey results and IPS. Medians were used to describe the categorical data (survey results), and means were used to describe the continuous data (IPS). Students' self-reported ratings of confidence in managing surgical patients were compared preclerkship with postclerkship using a Wilcoxon-Mann-Whitney test. Ratings of student confidence were converted into a categorical variable [0 if not comfortable (1/5-2/5), 1 if comfortable (4/5-5/5)], and mean IPS were compared by student confidence using a Student's t-test. Mean IPS for the aggregate of all VP cases was compared between preclerkship cases, postclerkship new cases, and postclerkship repeat cases using an ANOVA. For those cases completed postclerkship and not seen previously, mean individual performance was compared between each of the 3 cases using an ANOVA. The overall mean performance at clerkship completion and the group's aggregate mean performance on the NBME surgery shelf examination were described. A Student's t-test was used to compare mean surgery shelf exam performance of the study group with mean surgery shelf exam performance of the prior clerkship block that was not exposed to VP cases. Data management was performed using SAS Version 9.2 (SAS Institute Inc. 2009, Cary, NC) and statistical analyses were performed using Stata/SE Version 11.1 (StataCorp LP 2009, College Station, TX). A p-value of ≤0.05 was considered significant in all analyses.

FIGURE 3. Self-reported student confidence in managing surgical patients (overall, SICU patients, IBD, and breast disease; rated from 1, not comfortable, to 5, very comfortable) was compared preclerkship to postclerkship using a Wilcoxon-Mann-Whitney test. *P-value < 0.01.
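For illustration, the same analyses can be sketched in Python with SciPy (the authors used SAS and Stata). All data values below are placeholders, not study data, and the handling of neutral ratings (3/5) is our assumption, since the text does not state how they were treated.

# Python/SciPy analogue of the analyses described above; the authors
# used SAS and Stata, and every data value here is a placeholder.
from scipy import stats

pre_confidence = [2, 1, 3, 2, 2, 3]   # placeholder preclerkship ratings
post_confidence = [4, 4, 5, 3, 4, 5]  # placeholder postclerkship ratings

# Wilcoxon-Mann-Whitney comparison of pre- vs postclerkship confidence
u_stat, p_confidence = stats.mannwhitneyu(pre_confidence, post_confidence)

# Dichotomize confidence: 0 if not comfortable (1/5-2/5), 1 if
# comfortable (4/5-5/5); neutral 3/5 ratings map to None (assumption).
def dichotomize(rating):
    return {1: 0, 2: 0, 4: 1, 5: 1}.get(rating)

post_categories = [dichotomize(r) for r in post_confidence]

# Student's t-test of mean IPS by confidence category (placeholder IPS)
ips_not_comfortable = [58.0, 42.0, 66.0, 51.0]
ips_comfortable = [79.0, 67.0, 73.0, 70.0]
t_stat, p_ips = stats.ttest_ind(ips_not_comfortable, ips_comfortable)

# One-way ANOVA of IPS across preclerkship, postclerkship-new, and
# postclerkship-repeat cases (placeholder IPS values)
f_stat, p_anova = stats.f_oneway(
    [55.0, 60.0, 50.0],   # preclerkship
    [65.0, 70.0, 68.0],   # postclerkship, new case
    [72.0, 69.0, 71.0],   # postclerkship, repeat case
)

print(p_confidence, p_ips, p_anova)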

RESULTS

Usability and Opinions Regarding the Platform

The survey response rate was 100% amongst participants. Median ratings of the VP interface were the same at both time points, so we present results from the completion of the rotation. Students reported that the VP interface had good clarity of content (median 4/5), good presentation of content (median 4/5), good presentation of images (median 4/5), and good ease of use (median 4/5).

TABLE 2. Congruency Between Student Confidence and VP Performance

Student Confidence                    Mean ± SD IPS (Postclerkship)   P-value

Comfort Managing Breast Disease       Breast Case                     0.14
  Not Comfortable (1/5 or 2/5)        58.3 ± 46.5
  Comfortable (4/5 or 5/5)            79.1 ± 21.7

Comfort Managing IBD                  IBD Case                        0.11
  Not Comfortable (1/5 or 2/5)        42.8 ± 35.1
  Comfortable (4/5 or 5/5)            67.0 ± 23.3

Comfort Managing SICU Patients        Critical Care Case              0.34
  Not Comfortable (1/5 or 2/5)        66.3 ± 29.6
  Comfortable (4/5 or 5/5)            73.0 ± 23.9

Students reported that the VP enhanced learning (median 4/5), was useful in understanding surgical concepts (median 4/5), should supplement didactics (median 4/5), and reflected real patients (median 4/5). Students disagreed with entirely replacing the surgery clerkship shelf exam (median 2/5) or clinical evaluations (median 2/5) with a VP-based assessment.

Student Confidence

At the beginning of the surgery clerkship, students were not comfortable managing patients with breast disease, patients with inflammatory bowel disease, or critically ill patients (Fig. 3). However, the postclerkship survey revealed that students' confidence in managing surgical patients had risen for all types of patients (p < 0.01, Fig. 3). At completion of the clerkship, students who had self-reported confidence in managing surgical patients tended to perform better on the corresponding VP case, although this difference was not statistically significant (Table 2).

Performance

Student performance on all 3 VP cases ranged from zero to 100 at each time point. For all cases combined, mean IPS at the start of the surgical rotation was 56.7 (SD 34.8). Cases that were repeated at the completion of the rotation resulted in a mean IPS of 70.8 (SD 25.9). Cases that were completed at the end of the surgery clerkship but were seen for the first time generated a mean IPS of 67.6 (SD 27.5). IPS did not differ significantly between preclerkship and postclerkship time points (p = 0.24).

Upon completion of the clerkship, mean IPS for the breast case, if seen anew by the student, was 78.5 (SD 27.9). For the inflammatory bowel disease case the mean IPS was 51.9 (SD 24.5), and for the sepsis case the mean IPS was 68.9 (SD 26.2, Fig. 4). Mean performance did not differ significantly by individual case (p = 0.12).

FIGURE 4. Mean and standard deviation of performance on the VP cases completed at clerkship conclusion (Breast Case, IBD Case, Sepsis Case, and All Cases) are reported. Mean and standard deviation of performance on the NBME surgery shelf exam are reported for the same student cohort.

The aggregate mean performance on the surgery NBME shelf exam for the student cohort exposed to the VP platform was 86.5 (SD 7.4, N = 33). Mean surgery shelf performance for the prior block of clerkship students who were unexposed to VPs was 83.5 (SD 9, N = 36). There was not a statistically significant difference between the two groups (p = 0.14).
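As a quick check, this comparison can be reproduced from the reported summary statistics alone. A minimal SciPy sketch, assuming the equal-variance Student's t-test named in the Methods:

# Reproducing the shelf exam comparison from the reported summary
# statistics (mean, SD, N); assumes an equal-variance Student's t-test.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=86.5, std1=7.4, nobs1=33,  # cohort exposed to VP cases
    mean2=83.5, std2=9.0, nobs2=36,  # unexposed control cohort
)
print(round(p_value, 2))  # prints 0.14, matching the reported p-value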

DISCUSSION

The VP has great potential as an instructional tool. Although VPs are not new to medical education,3,11 they have been understudied in undergraduate surgical education. As such, there is a paucity of evidence to guide educators in using the VP as an adjunct to the traditional surgical curriculum. Furthermore, few studies have investigated the correlation between VP simulation and surgery shelf exam scores, as the currently available literature focuses on student perception of VP modules.5,12

Our pilot study suggests that the development of virtual surgical patients for undergraduate surgical education is feasible. The development of clear learning objectives for each case allowed for the creation of clinical scenarios and question stems. This approach was previously adopted by the developers of the Computer-assisted Learning in Pediatrics Project (CLIPP) cases. They followed the guideline that "case development should proceed from clearly written learning objectives".13 We included rich descriptions of the clinical context to engage students in the complex cognitive processes characteristic of true clinical practice. In addition, the cases were designed with multiple-choice questions and free-response items geared to assess both recognition and recall of correct answers in clinical reasoning. To assure that students benefitted from mistakes made while using a computer-based learning tool,14,15 the platform was designed for students to view errors along with corresponding explanations upon case completion.

We aimed to evaluate the use of the VP as an educational tool. Our students reported that the interface was user-friendly. It has been demonstrated that learners are most satisfied with web-based learning when the interface is easy to use and the download speed is fast.16 In this era of technologically facile students, it is essential that new electronic learning tools have visual appeal and are also intuitive in their manipulation. Surgery clerkship students were enthusiastic about the opportunity to use VPs as an addition to their didactic curriculum. A study examining the use of web-based learning for family medicine and internal medicine residents found that the learning tools were well received by learners, and students reported higher ratings of satisfaction than with traditional learning tools.17

A recent evaluation of the Web Initiative for Surgical Education (WISE-MD) found that students had an improvement in clinical reasoning when the WISE-MD cases were implemented in the surgery clerkship.5 Furthermore, studies have shown that web-based learning often improves knowledge about the subject and is usually as efficient at educating as traditional curricula.17,18 Huang et al. reported that students appear to engage in higher critical thinking when learning through VPs as compared with a paper case.16

We hope to direct future efforts at expanding the use of the VP not only as an educational tool but also as a means of assessment. However, the process of validating a new assessment tool is an elusive one, especially in the domain of VPs. A review from 2006 reported that during the prior year 232 articles were published on "simulation in medical education", yet only 13 addressed the use of simulation in assessment.10 Nevertheless, there is much literature regarding assessment in medical education, though not specific to simulation. One key goal in developing a new assessment tool is determining its capacity to discriminate in student knowledge. We found that all 3 of our cases were able to discriminate relative overall strengths and weaknesses, as evidenced by the wide range of performance on each case. As we continue with development of the VP platform, we plan to establish construct validity by comparing scores of faculty and resident experts with those of student novices.

Other authors have stressed the importance of medical education assessment tools having "various domains of competence assessed in an integrated, coherent, and longitudinal fashion".19 The cases we developed attempted to assess multiple domains, including clinical reasoning and fund of knowledge, but there remains potential for assessing a greater scope of subject matter, including clinical efficiency and cost-effectiveness.

Unlike traditional methods of assessment, our VP cases focus on evaluating clinical reasoning and thus are a novel approach to medical student assessment. A prior study found that internal medicine clerkship students viewed VPs as a good evaluation tool, one that emphasized the clinical reasoning process and thus was intrinsically a better tool than traditional exams.20 Furthermore, realistically simulating hard-to-handle clinical cases is helpful in improving the learner's ability to cope with similar situations in real life.21 Navigation of these critical decisions when caring for ill and unstable patients is an issue central to the field of surgery; thus, surgical education is arguably the field most in need of a simulation-based tool to evaluate medical students' clinical reasoning without risking harm to the patient.

Our pilot study was limited by the small number of subjects, resulting in underpowered analyses of the assessment data. However, the validation of a new assessment tool takes rigorous comparison testing and was beyond the scope of this pilot study. Our study was limited to evaluation of only the 3 VP cases developed for the pilot. Although these 3 cases spanned a wide range of subject areas within general surgery, we ideally would want to test student performance using a larger bank of VP cases. Furthermore, it would be ideal to compare mean shelf performance of students who were exposed to VP cases with that of students who did not use the cases over an entire year. Due to anonymity of scores, we were unable to directly link individual VP performance to students' shelf exam scores and clinical evaluations. As such, we neither could assess the relationship between individuals' VP performance and clerkship performance, nor could we account for students' unique clinical experiences based on rotation assignments that might have affected their fund of knowledge tested by the VP cases. Lastly, it is possible that performance on VP modules following clerkship completion was falsely elevated for students who had seen the same VP case at the beginning of the clerkship.

Overall, the pilot study suggests that the VP cases were feasible to develop and well received. The cases permit medical students to apply their factual knowledge to clinical scenarios where they can safely direct case progression. The potential applications of the VP include primary education as well as assessment. The pilot study's findings support the value of additional work to increase the VP library and further refinement of the VP system to develop its use as an educational and assessment tool.

REFERENCES

1. Evgeniou E, Loizou P. The theoretical base of e-learning and its role in surgical education. J Surg Educ. 2012;69(5):665-669.

2. Ellaway R, Masters K. AMEE guide 32: e-learning in medical education Part 1: Learning, teaching and assessment. Med Teach. 2008;30(5):455-473.

3. Chumley-Jones HS, Dobbie A, Alford CL. Web-based learning: sound educational method or hype? A review of the evaluation literature. Acad Med. 2002;77(10 Suppl):S86-S93.


4. Shah H, et al. Interactive virtual-patient scenarios: an evolving tool in psychiatric education. Acad Psychiatry. 2012;36(2):146-150.

5. Kalet AL, et al. Preliminary evaluation of the Web Initiative for Surgical Education (WISE-MD). Am J Surg. 2007;194(1):89-93.

6. Dillon GF, et al. The introduction of computer-based case simulations into the United States medical licensing examination. Acad Med. 2002;77(10 Suppl):S94-S96.

7. Carroll JD, Messenger JC. Medical simulation: the new tool for training and skill assessment. Perspect Biol Med. 2008;51(1):47-60.

8. Rosen KR. The history of medical simulation. J Crit Care. 2008;23(2):157-166.

9. Hawkins R, et al. Assessment of patient management skills and clinical skills of practising doctors using computer-based case simulations and standardised patients. Med Educ. 2004;38(9):958-968.

10. Cook DA. Where are we with web-based learning in medical education? Med Teach. 2006;28(7):594-598.

11. Zary N, et al. Development, implementation and pilot evaluation of a web-based virtual patient case simulation environment—Web-SP. BMC Med Educ. 2006;6:10.

12. Gormley GJ, et al. A virtual surgery in general practice: evaluation of a novel undergraduate virtual patient learning package. Med Teach. 2011;33(10):e522-e527.

13. Fall LH, et al. Multi-institutional development and utilization of a computer-assisted learning program for the pediatrics clerkship: the CLIPP project. Acad Med. 2005;80(9):847-855.

14. Gordon JA, Oriol NE, Cooper JB. Bringing good teaching cases "to life": a simulator-based medical education service. Acad Med. 2004;79(1):23-27.

15. Adams AM. Pedagogical underpinnings of computer-based learning. J Adv Nurs. 2004;46(1):5-12.

16. Huang G, Reynolds R, Candler C. Virtual patient simulation at US and Canadian medical schools. Acad Med. 2007;82(5):446-451.

17. Bell DS, et al. Self-study from web-based and printed guideline materials. A randomized, controlled trial among resident physicians. Ann Intern Med. 2000;132(12):938-946.


18. Botezatu M, et al. Virtual patient simulation for learning and assessment: superior results in comparison with regular course exams. Med Teach. 2010;32(10):845-850.

19. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387-396.


20. Botezatu M, Hult H, Fors UG. Virtual patient simulation: what do students make of it? A focus group study. BMC Med Educ. 2010;10:91.

21. Ziv A, Ben-David S, Ziv M. Simulation based medical education: an opportunity to learn from errors. Med Teach. 2005;27(3):193-199.
