
UMEÅ UNIVERSITY ODONTOLOGICAL DISSERTATIONS New series No. 99, ISSN 0345-7532, ISBN 978-91-7264-293-5

___________________________________________________________________________________________________

SIMULATION SUPPORTED TRAINING IN ORAL RADIOLOGY Methods and Impact on Interpretative Skill

Tore Nilsson

Department of Odontology, Oral and Maxillofacial Radiology Umeå University

Umeå 2007


Department of Odontology, Oral and Maxillofacial Radiology Umeå University SE-901 87 Umeå, Sweden Copyright© by Tore Nilsson 2007 ISSN 0345-7532 ISBN 978-91-7264-293-5 Printed by Print & Media, Umeå, 2007


Contents Preface ...................................................................................................4

Abstract .................................................................................................5

Definitions and abbreviations ...............................................................6

Introduction ...........................................................................................7

Aim......................................................................................................19

Material and methods..........................................................................20

Results .................................................................................................29

Discussion ...........................................................................................34

Conclusions .........................................................................................40

Acknowledgements .............................................................................41

References ...........................................................................................42

Appendix .............................................................................................50

Papers I-IV ..........................................................................................53


Preface This text is based on the following papers which will be referred to in the text by their Roman numerals.

I. Nilsson T, Ahlqvist J, Johansson M, Isberg A. Virtual reality for simulation of radiographic projections: validation of projection geometry. Dentomaxillofac. Radiol. 2004; 33: 44-50.

II. Nilsson T, Hedman L, Ahlqvist J. Visual-spatial ability and interpretation of three-dimensional information in radiographs. Dentomaxillofac. Radiol. 2007; 36: 86-91.

III. Nilsson TA, Hedman LR, Ahlqvist JB. A Randomized Trial of Simulation Based vs. Conventional Training of Dental Student Skill at Interpreting Spatial Information in Radiographs. Submitted for publication.

IV. Nilsson T, Hedman L, Ahlqvist J. Retention of Dental Student Skill at Interpreting Spatial Information in Radiographs after Radiology Simulator Training. Manuscript.

Papers I and II are reproduced with kind permission of the British Institute of Radiology.


Abstract Simulation is an important tool when training is hazardous, time consuming, or expensive. Simulation can also be used to enhance reality by adding features normally not available in the real world. The aim of this work has been to develop and evaluate methods that could improve learning in oral radiology utilising a radiation-free simulator environment.

Virtual reality software for radiographic examinations was developed. The virtual environment consisted of a model of a patient, an x-ray machine, and a film. Simulated radiographic images of the patient model could be rendered as perspective projections based on the relative position between the individual models. The software was incorporated in an oral radiology simulator with a training program for interpretation of spatial relations in radiographs. Projection geometry was validated by comparing length dimensions in simulated radiographs with the corresponding theoretically calculated distances. The results showed that projection error in the simulated images never exceeded 0.5 mm.

Dental students participated in studies on skill in interpreting spatial information in radiographs utilising parallax. Conventional and simulator based training methods were used. Training lasted for 90 minutes. Skill in interpreting spatial information was assessed with a proficiency test before training, immediately after training, and eight months after training. Visual-spatial ability was assessed with the mental rotations test, version A (MRT-A). Regression analysis revealed a significant (P<0.01) association between visual-spatial ability and proficiency test results after training. With simulator training, proficiency test results immediately after training were significantly higher than before training (P<0.01). Among students with low MRT-A scores, improvement after simulator training was higher than after conventional training. Eight months after simulator training, proficiency test results were lower than immediately after training. The test results were, however, still higher than before training.

In conclusion, the simulation software produces simulated radiographs of high geometric accuracy. Acquisition of skill to interpret spatial relations in radiographs is facilitated for individuals with high visual-spatial ability. Simulator training improves acquisition of interpretative skill and is especially beneficial for individuals with low visual-spatial ability. The results indicate that radiology simulation can be an effective training method.

Key words: Virtual reality, simulation, simulator, radiology, radiography, learning, skill acquisition, visual-spatial ability, parallax


Definitions and abbreviations Computed tomography (CT)

Tomographic image produced by computer calculation of the variation in radiographic attenuation in a predefined object layer. The result is presented as a digital image of the object layer.

Parallax The apparent shift of an object against a background caused by a change in observer position.

Plain film radiography Radiographic image projected on a film or digital detector after the passage of an x-ray beam through a patient.

Six degrees of freedom (6DOF)

In this context the definition is limited to translations in three directions and rotations around three axes.

Tomography Radiographic examination of a predetermined object layer.

Virtual environment (VE)

Three-dimensional data set describing an environment based on real-world or abstract objects and data.

Virtual Reality (VR) A technology which allows a user to interact with a computer-simulated environment.


Introduction

Background Reports concerning oral radiology have shown that inferior radiographic image quality is not unusual and that it can affect diagnostic outcome and treatment planning.1-4 Inferior image quality can be the result of deficits in knowledge and skills concerning radiographic imaging techniques, which in turn indicates that educational goals have not been met. A way to reduce such deficits is to improve efficacy in radiology education, especially in subjects related to image quality. Education on this topic is complicated by the inherent fact that details in the volume to be imaged are more or less hidden from view, and that the resultant images are transparent representations that cannot be interpreted as easily as ordinary pictures. These circumstances make radiographic imaging principles demanding to understand and require a great effort on the part of the learner. An important teaching goal therefore is to find methods that facilitate the learning process.

The advent of advanced computer graphics and virtual reality (VR) technique gave rise to the idea of developing a radiology simulator in which the user could perform and analyse simulated radiographic examinations without the use of ionising radiation. A literature search produced no evidence of such a simulator. Therefore the ‘Virtual Radiography’ project was initiated. The main goal for the project was to develop a simulator for radiology training and the project was divided into the following parts:

o development of methods for simulated radiographic projections
o design of a simulator interface
o creation of exercises in oral radiology
o evaluation of learning outcome after training in the simulator

This thesis is based on results from the Virtual Radiography project.

Radiology X-rays are a type of electromagnetic radiation which is emitted in the rearrangement of electron shells of atoms. The radiation can produce ionisations at interaction with matter.5 X-ray radiation will produce free electrons at interaction that will initiate a number of events starting with ionisations, radical reactions, and molecular and biochemical reactions, ending up with biological effects. The effects can range from harmless to lethal to the cells and sometimes lesions of the DNA genome occur.6


A radiation protection system has been developed to reduce the risk of harmful radiation effects. The three basic principles of radiological protection are justification of activities that could cause or affect radiation exposures, optimisation of protection in order to keep doses as low as reasonably achievable, and the use of dose limits. In order to reduce potential harmful health effects from ionising radiation used in diagnostic radiology, these principles must be considered before a decision is made to perform any radiographic examination.7

Radiographic examinations X-rays can be used for plain film radiography and for tomography. In plain film radiography the resultant image is a transparent perspective projection of the imaged object on the detector. In tomography a slice of the object is visualised (Figure 1). In computed tomography (CT) the image is calculated from a large number of x-ray transmission measurements of the object layer of interest.8,9

Figure 1. Plain film radiograph and CT image of the head. A) In the plain film radiograph all structures are superimposed on each other. The depth relation of the structures cannot be distinguished. B) The CT slice represents a layer of the skull. In this image slice thickness was 1.25 mm. The level of the CT slice is indicated by the horizontal line in figure A.

The formation of CT images by a CT scanner involves both data acquisition and data processing. The process ends up with a digital image with a defined slice thickness. Each pixel in the image is assigned a CT number which is related to the attenuation of the tissue volume representing the actual pixel. A CT examination typically consists of a stack of contiguous scans of the object representing a volume of the examined object (Figure 2). The stack can be regarded as a 3D set of images where 3D imaging systems can be applied for further processing and visualisation of the data set.9,10

Figure 2. A) Stack of CT slices. The stack is defined by pixel size in each slice, slice thickness, and slice spacing. B) A simplified illustration of a stack of digital images. The pixel size is increased. The figures represent pixel values.
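The stack description above maps naturally onto a three-dimensional array. The following Python sketch is an illustration only (array dimensions and pixel size are assumed example values, not taken from the thesis; the 1.25 mm slice thickness echoes the CT example in Figure 1). It shows how a CT stack can be held as a voxel volume and how a voxel index is converted to a physical position.

```python
import numpy as np

# Assumed example geometry: 256 x 256 pixels per slice, 100 contiguous slices.
PIXEL_SIZE_MM = 0.4        # in-plane pixel size (assumed)
SLICE_THICKNESS_MM = 1.25  # slice thickness, as in the CT example in Figure 1
SLICE_SPACING_MM = 1.25    # contiguous slices: spacing equals thickness

# A CT examination as a stack of slices: axis 0 = slice, axes 1-2 = rows/columns.
# Each element is a CT number related to the attenuation of the tissue in that voxel.
ct_stack = np.zeros((100, 256, 256), dtype=np.int16)

def voxel_to_position_mm(slice_idx: int, row: int, col: int) -> np.ndarray:
    """Convert a voxel index to a physical position (x, y, z) in millimetres."""
    return np.array([col * PIXEL_SIZE_MM,
                     row * PIXEL_SIZE_MM,
                     slice_idx * SLICE_SPACING_MM])

# Example: CT number and physical position of one voxel in the stack.
print(ct_stack[50, 128, 128], voxel_to_position_mm(50, 128, 128))
```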

Image clarity The term image clarity has been suggested to describe the visibility of diagnostically important details in a radiograph. It depends on a number of technical factors such as the quality and quantity of x-ray radiation, detector properties, the properties of the imaged object itself, and radiographic imaging geometry. Image brightness, contrast, mottle, sharpness, and resolution of the radiographic image are to a great extent influenced by the technical factors,11 but depiction of anatomical or pathological structures depends also on the projection geometry used.

In plain film radiography, where object details are superimposed on each other, standard projections are important tools for image clarity.12 However, overlapping of anatomic structures can negatively affect image clarity when standard procedures are performed. In such cases modifications of the standard procedures are necessary. Such modifications demand a thorough knowledge of anatomy and a good understanding of radiographic imaging principles.

Interpretation of spatial relations in images The x-ray image is the pattern of information that the x-ray beam acquires as it passes through and interacts with the patient.11 Radiographs as well as ordinary pictures are two-dimensional representations of three-dimensional objects. The images, however, represent the objects in different ways.

In pictures and photographs, the image is built up by reflected light from the surfaces of the objects depicted. Pictorial depth cues such as interposition, shading, shadows, relative brightness, and linear perspective make it easy to perceive depth relations between the items portrayed. These cues are also called monocular cues because they not only appear in pictures but are also available when only one eye is used to view a scene.13 In other words, human beings can use their natural senses for correct interpretation of spatial relations between objects in photographic images.

Plain film radiographs are transparent images of objects where a gray scale or a pseudo-colour scale14 represents the total attenuation of transmitted x-ray radiation through the object volume. This implies that there is no information about depth relations in a single radiograph. Radiographic images therefore must be interpreted within a framework relying on the properties of such images. Applying natural human perception for analysis of radiographic images can therefore be misleading. For example, a bright object in a radiograph can falsely be perceived as being closer to the viewer than darker objects, but the correct interpretation is that the attenuation of the total volume in that area is higher than in the surrounding area (Figure 3).

Figure 3. Example of two intra-oral radiographs exposed at slightly different angles of embedded tooth 15. In the upper radiograph, the crown of tooth 15 (black arrow) and the root of tooth 14 (white arrow) are superimposed. The brightness of the crown can make one assume that the crown is closer to the observer than the root. A single radiograph gives, however, no information of the relative depth relations of the two objects. An additional radiograph exposed from another projection angle (lower radiograph) adds information necessary for determination of the actual depth relation between the crown and the root.
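The parallax principle illustrated in Figure 3 can be made concrete with a small numerical sketch. The geometry below is a deliberately simplified one-dimensional model with assumed distances (not values from the thesis): two objects at different depths are projected from two tube positions, and the object farther from the detector (closer to the tube) shifts more in the image than a reference object near the detector, so its shift relative to the reference reveals the depth relation.

```python
# Simplified parallax sketch: detector along y = 0, x-ray focus at height D above it.
# All positions in millimetres; the numbers are assumed for illustration only.

D = 200.0  # focus-detector distance

def project(focus_x: float, obj_x: float, obj_height: float) -> float:
    """Central projection of a point onto the detector plane (y = 0)."""
    # Ray from (focus_x, D) through (obj_x, obj_height); scale it to reach y = 0.
    t = D / (D - obj_height)
    return focus_x + t * (obj_x - focus_x)

reference = (0.0, 5.0)   # object close to the detector (e.g. a tooth root)
unknown   = (0.0, 25.0)  # object farther from the detector (closer to the tube)

for focus_x in (0.0, 20.0):  # expose from two tube positions 20 mm apart
    ref_img = project(focus_x, *reference)
    unk_img = project(focus_x, *unknown)
    print(f"tube at {focus_x:5.1f} mm: reference image {ref_img:6.2f} mm, "
          f"unknown image {unk_img:6.2f} mm, separation {unk_img - ref_img:6.2f} mm")

# The unknown object's image shifts farther than the reference image and, relative to
# the reference, in the direction opposite to the tube shift, showing that it lies
# closer to the tube (farther from the detector) than the reference structure.
```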

Simulation According to the Oxford English Dictionary15 the word simulation can be defined as ‘The technique of imitating the behavior of some situation or process (whether economic, military, mechanical, etc.) by means of a suitably analogous situation or apparatus, esp. for the purpose of study or personnel training.’ The same source defines the word simulator as ‘An apparatus designed to simulate the behavior of a more complicated system; esp. one for training purposes that simulates the response of a vehicle, craft, or the like, having a similar set of controls and giving the illusion to the operator of responding like the real thing.’ In addition to the above mentioned purposes, entertainment is also an important area for simulation.16 A more health care directed definition is given by Gaba who claims that ‘Simulation is a technique–not a technology–to replace or amplify real experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive manner.’17

Motives for use of simulation Simulation is an important tool when training on the actual system is hazardous, time consuming, or expensive, but it can also be used to enhance reality by adding features normally not available in the real world.18 In medicine it is used for the purpose of patient safety and patient care and to make education more effective.17,19 The US Institute of Medicine recommends that health care organisations and teaching organisations participate in the development and use of simulation for training novice practitioners as a part of the creation of safety systems in health care organisations.20 When learning invasive procedures, students and residents are not comfortable initiating learning of such procedures on patients and demand simulated training prior to practicing on patients.21 In clinical settings, most patients will allow supervised medical students to perform minor procedures,22 indicating that there is patient acceptance for being part of the students’ training. This acceptance can be maintained if patients are satisfied with the treatment, and one necessary prerequisite is that the students are well prepared before they perform their first procedures.

The cost of implementing simulation depends on the kind of simulation. In cases where simulation training replaces existing training the costs can be reduced. Benefits such as reduced risk to the patient or improved knowledge, skills and attitudes are difficult to measure.17,23

History Simulation was in use before the advent of computer based simulators. In dentistry, for example, drilling in plastic teeth to acquire skill in preparing teeth to be filled is a kind of technically simple but high fidelity simulation. Radiographic examination of manikins before performing an examination on real patients is also a kind of simulated training aimed at improving the trainees’ skills. Advanced simulation technology is well established in flight simulations, war games, business management games, and nuclear power plant operations, as well as modelling of biological effects.24-27 In medicine an early patient simulator for anaesthesia training, referred to as Sim One, was developed in the late 1960s.28,29 It has been followed by more sophisticated patient simulators such as ‘Harvey’, the cardiology patient simulator, which is a life-sized manikin with a complete curriculum of cardiovascular diseases.26,30

Today, medical training simulators are found in a number of areas such as patient simulators,28,31-35 simulators for training in anaesthesia,32,36 laparoscopy and endoscopy,26,37-39 endonasal surgery,40 radiology,41,42 arthroscopy,43 and anatomy education.44

Classifications A generally accepted classification of simulations cannot be found. Simulators are often described from a technical perspective, but other perspectives are also in use. Surgical simulations are classified as model-based or mechanical simulations, computer-based or virtual reality simulations, and hybrid simulations, which are based on combinations of physical models and computers.23,45 Taxonomies based on the skill to be trained are proposed by Satava and Kneebone,23,46 and a classification in terms of simulation complexity is used by Liu et al.47 In airplane simulation, classification by objective fidelity sets a basis from which the training community can identify the specific simulation device that is optimised for their needs.25 For team training, a three category typology with case studies/role plays, part task trainers, and full mission simulations is proposed.48

Virtual environments and virtual reality for simulation A virtual environment (VE) is a three-dimensional data set describing an environment based on real-world or abstract objects and data. Usually VE and virtual reality (VR) are used synonymously, but some authors reserve VE for an artificial environment that the user interacts with.49 VR can be described as a way of using computers to create images of 3D scenes with which one can navigate and interact.18

The interfaces of VR simulators can vary, depending on the goals for the simulation, from multimodal input and output devices to ordinarily equipped standard personal computers. The most advanced full flight simulators have a physical cockpit with instruments which provide realistic sensory feedback. This high fidelity interface is regarded as necessary to achieve realistic training in order to reach flight safety goals.25 The interface of a surgical simulator for training minimally invasive procedures can be equipped with real instruments, and the user interacts with a VE where both visual and sensory feedback can be provided. Depending on the sophistication of the simulation, the learning goal can vary from psychomotor training to procedural training.45 At the low technical end, simulation can be performed on ordinary computer hardware. This kind of simulation is used for gaming as well as for simulations for improvement of knowledge and skills.44,50,51

Almost all newly developed simulation technology relies on VR computer technology. Rapid development in computer hardware has made it possible to use advanced VR applications on standard personal computers. Stereoscopic vision can be created by looking at a single monitor screen presenting two separate images, one for each eye. The two images are updated separately. Shutter glasses synchronised with the updating frequency of the monitor open and close their lenses at the same frequency as the images are presented on the monitor. When the image intended for the left eye is displayed, the glasses shut the right-hand lens, and vice versa for the right image. The result is a realistic 3D effect.18 Stereo effects can also be obtained by using polarised light or auto-stereo displays.

The 3D mouse and tracking system are input devices associated with VR systems. A 3D mouse can communicate three directions of translation and rotations around three axes, thus having six degrees of freedom (6DOF). It is therefore a useful tool for navigation in VEs, especially in desktop applications. Tracking is a technology that has emerged to capture the motion of objects such as humans. In VR, trackers are used as input devices to monitor the real time 3D position and orientation of an object such as a hand or an instrument. Tracking technology has a wide variety of applications for monitoring the position of instruments in simulators. Trackers can also be used as intuitive 6DOF interaction tools in VEs, where they allow the user to grab and freely move an object in 3D space. Haptic devices can, in addition to monitoring 3D positions, also return force information. They provide sensory feedback when the user touches or manipulates a virtual object. Haptics is used to increase fidelity in, for example, surgical simulators.18,49
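As an illustration of the kind of 6DOF data such devices deliver, the sketch below (a minimal example, not code from the simulator described later) composes a rigid-body transform from three translations and three rotations and applies it to a model point, which is essentially what is needed to place a virtual patient, x-ray machine, or detector in a scene.

```python
import numpy as np

def pose_matrix(tx, ty, tz, rx, ry, rz):
    """4x4 homogeneous transform from 6DOF input: a translation plus
    rotations (radians) about the x, y and z axes, applied in that order."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    T[:3, 3] = [tx, ty, tz]    # translation
    return T

# Move an object 10 mm along x and rotate it 30 degrees about the z axis,
# then transform one of its model points into the scene.
pose = pose_matrix(10.0, 0.0, 0.0, 0.0, 0.0, np.radians(30))
point = np.array([5.0, 0.0, 0.0, 1.0])  # homogeneous model coordinates
print(pose @ point)
```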

In order to determine the effectiveness of a VE, human performance in the VE must be assessed.52 In a series of experiments by Arnold et al performance of a perceptual-motor task was assessed in both the real world and in stereoscopically presented VEs with no haptic feedback. The results showed that fine motor tasks are much more difficult to perform in VEs compared to the corresponding task in the real world. The VE task was more difficult both in terms of speed and accuracy.53 Waterworth has in a similar experiment demonstrated that stereoscopic vision is important for both accuracy and speed of task completion in VEs.54

Transfer of training The degree of transfer from the simulated to the real environment is generally measured as effectiveness of subsequent performance on the actual task. Transfer should be best when training mimics performance as closely as possible. The optimal way to learn a specific procedure is to learn the procedure precisely as it will be tested.55-57

The match between simulated and real environments is the training environment’s fidelity. With perfect fidelity a training environment would be indistinguishable from the actual task environment. There is limited knowledge about what kind and degree of fidelity is required for a given performance; however, there is evidence that a better match between VE and real world tasks improves transfer.48,55,58

Learning In order to understand and describe the learning process, learning has been studied from different perspectives. The results of this research have been presented as a number of learning theories. The theories focus on different aspects of the educational process and are not necessarily in conflict with each other. The content of this section refers to ‘Psychology Applied to Teaching’ by Snowman and Biehler57 when other references are not given.

Bloom’s taxonomy Bloom’s taxonomy of educational objectives can be applied in the analysis of development of a professional competence in radiology. It describes the prerequisites for change in the cognitive domain of knowledge and skill.

Knowledge means remembering previously learned information in areas such as facts, terms, procedures, and principles. In the context of interpreting radiographs, knowledge of anatomy, pathology, projection geometry, and radiation physics is necessary. In order to gain new knowledge effectively, information must be reliable, readily accessible, and possible to process efficiently.


Skills refer to organised modes of operation and generalised techniques for dealing with materials and problems. The skills objective emphasises the mental processes of organising and reorganising material to achieve a particular purpose. They include comprehension, application, analysis, synthesis, and evaluation.23,59 Examples of fundamental skills in radiology are comprehension of the nature of the radiographic representation and the skill to analyse and deduce spatial relationships.

Operant conditioning The theory of operant conditioning, which is one of the behavioural theories, was introduced by BF Skinner. The theory attempts to explain how behaviours are learned. The basic idea behind the theory is that all behaviours are accompanied by certain consequences which strongly influence whether these behaviours are repeated or not. In general, the consequences that follow behaviour are either pleasant or unpleasant. Depending on conditions, these consequences either increase or decrease the likelihood that the preceding behaviour will recur under the same or similar conditions. When consequences strengthen a preceding behaviour, reinforcement has taken place and when consequences weaken a preceding behaviour, punishment and extinction have occurred.

Computer programs for educational purposes can be designed as programs of stimuli and consequences. Computer based instruction (CBI) has evolved from Skinner’s programmed instruction. Instructional programs generally fall into one of three categories: drill and practice programs, simulation programs, or tutorial programs. Research on the effects of CBI on high school and college students has revealed that simulation programs have a beneficial effect on science achievement. CBI is more effective when computers are used individually and when they are used to supplement traditional instruction.60-62

Information-processing theory Operant conditioning emphasises the role of external factors in learning. In cognitive psychology, non-observable behaviour such as thought processes is studied. The information-processing theory seeks to understand and describe how the internal processes function when people acquire new information, store information, and recall it from memory. It also deals with how what is already known guides and determines what will be learned. The information-processing theory therefore adds the nature of the learner as another perspective on learning.


According to the information-processing theory there are different memory systems involved in processing and storing information. The first system is recognised as the sensory register, which receives the incoming environmental stimuli such as visual, auditory, and tactile stimuli. The sensory register holds the information just long enough (about one to three seconds) for the individual to decide if the stimuli should be attended to. Once information is attended to, it is transferred to the second system, the short-term memory or working memory. Information in short-term memory is what we are currently thinking about. The short-term memory can hold memories from seconds to hours. The third system is the long-term memory, where information is stored for hours to months, and in the long-lasting memory information is permanently stored.63

In order to move information from the short-term memory to the long-term memory individuals must utilise certain strategies such as elaborative rehearsal. Elaborative rehearsal means that individuals consciously relate new information to knowledge already stored in the long-term memory. Elaboration occurs when individuals use information stored in the long-term memory to add new details of information, to clarify the meaning of a new idea, construct visual images, and create analogies. Meaningful learning is a powerful strategy to facilitate the transfer of information from short-term to long-term memory. Meaningful learning occurs when a learner encounters clear, logically organised material and consciously tries to relate the new material to ideas and experiences stored in the long-term memory.64

Retrieval of information from long-term memory is rapid and accurate. Most cognitive psychologists believe that the storage capacity in long-term memory is unlimited and that it contains a permanent record of what is learned, although some doubt about the latter exists.65

Constructivist learning theory According to Kolb ‘Learning is the process whereby knowledge is created through the transformation of experience.’ He emphasises that learning is an active process based on reflection on existing knowledge and new experiences.66 In problem solving, memory must be searched for information that can be used to fashion a solution. Using information can mean experimenting, questioning, reflecting. This process of creating knowledge is referred to as constructivism, as individuals construct an interpretation of how and why things are by filtering new ideas and experiences through existing knowledge structures.67 Constructivism focuses on how people build a meaning that is relevant to them. Its essence is the individual’s internal construction of reality.23

An early constructivist perspective is the concept of discovery learning. The principal content of what is to be learned is not given but must be discovered by the learner before it can be meaningfully incorporated into the student’s cognitive structure. The learner must rearrange information, integrate it, and reorganise the integrated information to generate a desired end-product. After discovery learning itself is completed, the discovered content is made meaningful in much the same way as if it had been presented to the learner in its final form.64 Discovery learning is a psychologically complex and time consuming process. It is increasingly being done with computer simulation programs, for example for learning science concepts and skills. The effect of such programs has been inconsistent; it is believed that many students do not have the self-regulation skills to cope with the demands of discovery learning.

Conditions fostering constructivism include a cognitive apprenticeship between student and teacher, a use of realistic problems and conditions, and an emphasis on multiple perspectives. The first condition, cognitive apprenticeship, means that the teacher provides the student with enough help to complete a task and gradually decreases the help as the student becomes able to work independently. The second, often called situated learning, means that students are given learning tasks set in realistic contexts. The third condition fostering constructivism is that students should view ideas and problems from multiple perspectives. The rationale for this is that problems often are multifaceted and the knowledge base of experts is a network of interrelated ideas.

Skill retention and decay Skill decay refers to the loss or decay of trained or acquired skill after a period when the skill is not used. The degree of decay is influenced by the length of time during which the skill is not used, and is more pronounced for cognitive skills compared to physical skills.68 Skill retention has been well investigated, but only a limited number of publications addressing skill retention after training in medical simulators have been found. The results show that acquired skills can be retained if training engages in deliberate practice of the skill,69-71 or if repeated feedback is given.72-74 When skill decay is recorded, most skill loss occurs shortly after training.73


Visual-spatial ability Spatial ability may be defined as the ability to generate, retain, retrieve, and transform well-structured visual images.75 As spatial abilities mostly rely on visual information, they are often referred to as visual-spatial ability. Spatial problems tend to be solved by generating a mental representation of a 2D or 3D structure and then assessing its properties or performing a transformation of the representation.76 For assessment of spatial ability there are a number of psychometric tests. Mostly they test the ability to recognise objects, but problems can also be formulated in words. Visual tasks range from recognition of shapes and recognition of shapes in different perspectives to rotation of asymmetrical 3D objects and paper folding tests.77

There are two aspects to be considered when testing spatial ability. The first is the distinction between abilities and skills. Abilities are conceived as broad traits while skills reflect performance in specific tasks. In adults abilities are considered to be relatively fixed, but it is believed that skills can be improved through training. Although several studies suggest that appropriate training can improve performance in psychometric tests of spatial ability, the question of whether anyone can improve spatial ability with adequate training is not definitely answered.76 The second aspect is that variation in spatial test results is not only the result of the individual’s spatial ability but also of other abilities such as general reasoning ability.75


Aim The overall aim of this work has been to develop and evaluate methods that could improve learning in oral radiology. The usability of a simulator as a facilitator for learning depends on technical solutions as well as the topic for training and the design of training programmes. The challenge when designing the first training program in the simulator was to find a well defined task, relevant for the education of students, which was possible to implement and evaluate. Interpretation of spatial relations in radiographs utilising parallax78 was regarded as meeting the demands. It is a truly cognitive skill which students can find hard to acquire. Training results were thought to be an indicator of the potential of the simulator concept as an instrument in radiology education and training. There is, however, limited knowledge about factors of importance for acquiring the skill to interpret spatial relations in radiographs. Therefore improvement of knowledge on that topic was included in the aim. The specific aims were to

o develop a method for simulation of radiographic images
o evaluate the accuracy in projection geometry in simulated radiographs
o develop a radiology simulator with an oral radiology training program on the topic interpretation of spatial relations in radiographs utilising parallax
o investigate whether skill in interpretation of spatial information in radiographs utilising the parallax phenomenon is associated with visual-spatial ability
o compare the learning outcome immediately after training object depth localisation in the simulator with the learning outcome immediately after conventional training
o compare the corresponding long-term effects on learning outcome
o evaluate the impact of visual-spatial ability on learning outcome in simulator based and conventional training


Material and methods

Simulation of radiographic images, validation of projection geometry (Paper I) Virtual reality software was developed with models of a patient, an x-ray machine, and a detector. The objects had 6DOF. This implies that the user can move each object individually and arrange the objects in any desired position. Based on the individual position of each object, an x-ray image can be calculated.

The patient model was visualised as a transparent custom made torso with the teeth visualised inside as polygon models. In addition there was an invisible part of the patient model which consisted of a CT data set of a dry skull. The visual model of the teeth was created by segmentation and surface rendering of the CT data set according to a method developed specifically for the project.79 The teeth model was geometrically congruent with the CT data from which it emanated. The congruency was maintained during interaction and navigation.

The model of the x-ray machine could emit simulated x-rays from a defined point. The beam was collimated according to the shape and direction of the outer collimator of the model. The visual model of the detector was a rectangular plane. Simulated radiographic images of the patient model could be rendered as perspective projections based on the relative position between the individual models.

The projection geometry of the software was validated by replacing the patient CT data set with an artificial CT data set containing high attenuation points as objects to be imaged. The distance between the two test points was 21.65 mm. Series of simulated radiographs of the artificial CT data set were rendered by the software in a systematic way. The distances between the two dots, representing the projected test points in the simulated radiographic images, were measured and compared to theoretical calculations of the corresponding distances using traditional mathematical tools. The error in simulated projection distance was calculated as the deviation of the simulated projected distance relative to the corresponding calculated distance. The uncertainty in reading the position of the dots in the simulated images was estimated to be of the order of the size of one pixel, which was 0.25 mm.
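A hedged sketch of the kind of comparison described above: for two test points lying in a plane parallel to the detector, the theoretically projected distance is the true distance multiplied by the magnification (focus-detector distance divided by focus-object distance). The 21.65 mm separation and 0.25 mm pixel size are taken from the text, and the 0.6 m/0.8 m distances from Figure 6; the 'measured' value below is invented for illustration.

```python
TRUE_DISTANCE_MM = 21.65   # separation of the two test points (from the text)
PIXEL_SIZE_MM = 0.25       # pixel size in the simulated radiograph (from the text)

def projected_distance(true_distance_mm, focus_object_mm, focus_detector_mm):
    """Projected distance for points in a plane parallel to the detector."""
    magnification = focus_detector_mm / focus_object_mm
    return true_distance_mm * magnification

# Geometry as in Figure 6: focus-object 600 mm, focus-detector 800 mm.
calculated = projected_distance(TRUE_DISTANCE_MM, 600.0, 800.0)

# A hypothetical distance measured between the two dots in a simulated radiograph.
measured = 28.75
error_percent = 100.0 * (measured - calculated) / calculated

print(f"calculated {calculated:.2f} mm, measured {measured:.2f} mm, "
      f"error {error_percent:+.2f}% "
      f"(reading uncertainty about one pixel = {PIXEL_SIZE_MM} mm)")
```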


The radiology simulator

Hardware including human-computer interface The developed software was implemented as core technology in a prototype radiology simulator. The simulator is shown in Figure 4. The hardware was a standard PC equipped with an Intel® Pentium 4 1.7 GHz CPU with 512 MB memory, a NVIDIA Quadro® DCC graphics card with 64 MB graphics memory, and the Microsoft Windows® 2000 operating system. The VE presented a scene with three objects: a patient, an x-ray machine, and a detector. No other objects were present. Interaction with the VE was performed with a 3D mouse (Spaceball 2003™, 3Dconnexion Inc., San Jose, Ca, USA) and a tracker system (Fastrack®, Polhemus, Colchester, Vt, USA), both with 6DOF. The 3D mouse was used for navigation in the VE, which made it possible for the viewer to watch the scene from any perspective and distance. The tracker, which was incorporated in a pencil-like pointer, was used for interaction with the separate objects in the scene.54 In order to facilitate interaction with the individual objects the VE was displayed stereoscopically on a separate monitor for active stereo. The users therefore wore shutter glasses (Chrystal Eyes®, StereoGraphics Corporation, San Rafael, Ca, USA) in order to get stereoscopic vision. A second non-stereo monitor was used for displaying the rendered radiographs. Neither a physical manikin nor any replica of the x-ray machine or detector was used as part of the simulator, and no haptic feedback was provided. From a technical point of view, the simulator can be regarded as a desktop simulator with a 3D human-computer interface.

Figure 4. The radiology simulator. The left monitor displayed the virtual environment (VE) stereoscopically. It included a patient, an x-ray machine, and a film. Navigation in the VE was performed with a 3D mouse (left hand). Interaction with the individual objects in the VE was performed with a tracker (right hand). The right monitor displayed rendered radiographs and feedback information.

Training program A training program for object depth localisation was developed. The learning goal was to improve the trainees’ skill in interpreting spatial information in radiographs utilising the parallax phenomenon, which is a truly cognitive task. The rationale for the object localisation procedure derives from the manner in which the relative positions of the radiographic images of two separate objects change when the projection angle at which the images were made is changed.80,81 In Figure 3, two radiographs exposed at different angles depict an embedded tooth in the upper jaw. The depth position of the tooth crown relative to the adjacent tooth roots can be deduced if the difference in projection between the radiographs is known. If the position of the x-ray machine is not known, comparison of the relative positions of anatomy details displayed in the images helps distinguish changes in projection.78

The training program had four structured exercises, each emphasising different aspects of the localisation procedure. The exercises could be selected freely. Before an exercise was started the user selected an area to be examined. All exercises had a stepwise design. In the first step, which was identical in all exercises, a simulated radiograph was displayed of the selected area. Simultaneously the corresponding position of the x-ray machine and film relative to the patient were displayed in the VE. In Figure 5 an example of the training program workflow is outlined. The following exercises were available:

ANALYSE BEAM DIRECTION is aimed at acquiring skill in deducing a change in projection angle from analysis of the relative position of anatomy details in radiographs. After the initial radiograph was displayed, a second one was shown over the same area. The projection angle for the second image was randomly chosen by the simulation program. However, the new position of the x-ray machine and film (a minor movement) was not displayed in the VE. The user was asked to move the x-ray machine from the original position to the position where it was thought to be when the second radiograph was exposed. A third radiograph reflecting the actual position of the x-ray machine was displayed after the user had moved the x-ray machine to the new position.



Figure 5. Illustration of the workflow in the ‘Ordinary radiography’ training program.

A. The user starts training by selecting the area to be examined. The x-ray machine is positioned by the program.

B. The simulator presents a radiograph over the selected area. A radiopaque object is visible in the radiograph.

C. By moving the x-ray machine to a new position the user can expose a second radiograph from a slightly different projection angle. From the information in the two radiographs the depth position of the radiopaque object relative to the teeth can be deduced.

D. In the next step a blue sphere is visualised inside the mouth of the patient. The task for the user is to position the blue sphere in the position where the radiopaque object is thought to be situated from analysis of the radiographs.

E, F. The correct position is revealed by visualisation of a white sphere which has been depicted in the radiographs as the radiopaque object.

G. In addition to visual feedback (E and F) text based feedback is given.


Feedback was given as angulation error and as visual comparison between the three simulated radiographs.

ORDINARY RADIOGRAPHY is aimed at acquiring a skill to deduce the relative depth position of an object displayed in pairs of radiographs when the difference in projection angle is known. The initial radiograph displayed an artificial spherical radiopaque object situated in a random position in the jaw. The sphere was not visualised in the VE. The user was asked to expose a second radiograph from a new projection angle. Then the user was urged to analyse the radiographic information and deduce the three dimensional position of the sphere in the jaw. Thereafter he/she was to take a blue marking sphere from a neutral position and place it in the correct position in the jaw. The operation was facilitated by the fact that all the teeth in the jaw were visualised. The simulator gave immediate feedback revealing the correct position of the radiopaque sphere in the jaw as well as the distance error.
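The distance error reported as feedback amounts to the Euclidean distance between the position chosen by the trainee and the true position of the radiopaque sphere. A minimal sketch, with invented coordinates:

```python
import numpy as np

true_sphere = np.array([12.0, -4.5, 30.0])    # position set by the simulator (example)
placed_sphere = np.array([10.5, -3.0, 27.0])  # position chosen by the trainee (example)

# Euclidean distance between the chosen and the true position, in millimetres.
distance_error_mm = float(np.linalg.norm(placed_sphere - true_sphere))
print(f"distance error: {distance_error_mm:.1f} mm")
```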

FLUOROSCOPY had the same aim and basic design as “ordinary radiography” with one important difference. In fluoroscopy dynamic radiographs were rendered when the x-ray machine was moved. When the exercise started two identical radiographs were presented. When the x-ray machine was moved one of the radiographs was continuously updated. It was therefore possible to follow the change in relative position between object details in real-time and compare them with the initial image. The exercise was finished when the sphere was positioned in the jaw.

The fourth exercise was OBJECT LOCALISATION. In this exercise the first and the second exercise were fused together into one unit. The fluoroscopy function was available in part of the exercise.

In all exercises the region for examination was chosen by the user, and for every new task the position of the radiopaque sphere was randomly changed.

Experiments on development of skill in interpreting spatial information in radiographs when training object depth localisation (Papers II, III, and IV) Three groups of dental students participated as test subjects. The studies were designed in accordance with the ethical principles of the Helsinki Declaration and approved by the University Ethical Board. Informed consent was obtained after the participants received information regarding study design and protocol.


Inclusion criteria and study population To be included in a study, the students were required to have passed the final anatomy exams of the dental program. It was also required that they had taken the instruction lessons on object depth localisation utilising parallax (the tube shift technique),78 which are part of the fourth semester oral radiology course. The seventh and ninth semester students were assumed to be more skilled in radiology as they had passed the fourth semester oral radiology course, while the fourth semester students were still struggling with that course. In total 86 individuals participated, 58 women and 28 men. The median age was 25 years, ranging from 21 to 45 years. Characteristics of the individual student groups are presented in Table 1.

Table 1. Characteristics of the study population. N=86

                          S4 (N=29)1    S7 (N=29)1    S9 (N=28)1
Mean age ± SD             25.0±4.9      27.5±6.1      26.1±2.4
Mean MRT-A ± SD           9.9±4.3       8.0±4.5       9.1±3.9
Gender, N (%)   Women     24 (83)       18 (62)       16 (57)
                Men       5 (17)        11 (38)       12 (43)

1 S denotes semester level

The fourth semester students participated in what became a pilot study. During the experiment it was found that the usability of the simulator training program was insufficient, and it was therefore redesigned. The other two student groups used the redesigned program.

In paper II, all students were included in the study (N=86). In paper III, the study population comprised only students from the seventh and ninth semesters (N=57). Paper IV was a follow-up to paper III and comprised the same study population as in paper III. Drop-out reduced the population (N=45).

Assessment of visual-spatial ability All participants were tested using the redrawn Vandenberg and Kuse mental rotations test version A (MRT-A).82 The test items are made up of asymmetric 3D objects. The task was to mentally rotate the figures and find those that were identical to a target figure. The test was chosen for two reasons. The first was the resemblance between a skill for interpreting 3D information in radiographs and the mental rotation of 3D test figures, and the second was that the MRT-A test is a well recognised test. Table 1 presents test results by student group.


Proficiency testing A proficiency test instrument that comprised three separate subtests was designed. Each subtest dealt with a certain aspect considered to be of importance in interpreting spatial relations in radiographs utilising parallax. The subtests were called the principle subtest, the projection subtest, and the radiography subtest. Examples of the test instrument are presented in the appendix.

The aim of the principle subtest was to test the understanding of the principles for determining spatial relations utilising parallax. It was a paper and pencil version of training equipment designed for learning the principles of the tube shift technique, used in the introductory course. The projection subtest tested skill in identifying differences in beam direction based on a combination of anatomical knowledge and understanding the principles for determining spatial relations utilising parallax. The radiography subtest was identical to the ordinary procedure for analysing spatial relations in pairs of radiographs using the tube shift technique. It was therefore regarded as a highly valid test.

The proficiency test result was calculated as the sum of the scores for the individual subtests. Proficiency testing was performed before training, immediately after training, and eight months after training. The design was identical before and immediately after training but content was altered. At testing eight months after training, the same test instrument as before training was used. For assessment of the test instrument’s construct validity,83,84 the proficiency test was given to seven experts in oral radiology.

Interaction training Interaction with a VE and the use of 3D interaction tools were regarded to be new experiences for most of the subjects. It was assumed there was a risk that operating the simulator itself would give rise to technical problems that could distract attention from the actual training task, thereby reducing transfer of training. Therefore the simulator’s basic functions needed to be introduced to the subjects before the actual radiology training started.

Two interaction training exercises were designed to give the users an opportunity to get acquainted with the VE and learn how to interact with the simulator before the actual training. The exercises were part of the ordinary training program which was modified to be used for interaction training. There was no radiology training included in the interaction training.


Intervention Each student group was treated separately at randomisation and intervention. Randomisation to experimental and control group was based on proficiency test results before training with the aim of creating equivalent groups in regard to interpretative skills. Before training all students performed interaction training for 20 minutes each. The interaction training had two purposes; to prepare the subjects in the experimental group to interact with the simulator, and to collect a substantial amount of data for later usability analysis of the interface.
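The thesis states only that allocation was based on the pre-training proficiency test with the aim of creating equivalent groups; the exact mechanism is not described. The sketch below shows one common way of achieving such balance, matched-pair randomisation on the pre-test score, and is an assumption for illustration, not the procedure actually used.

```python
import random

def matched_pair_randomisation(pretest_scores, seed=0):
    """Sort subjects by pre-test score, then randomly split each adjacent pair
    between the experimental and control groups."""
    rng = random.Random(seed)
    ranked = sorted(pretest_scores, key=pretest_scores.get, reverse=True)
    experimental, control = [], []
    for i in range(0, len(ranked), 2):
        pair = ranked[i:i + 2]
        rng.shuffle(pair)
        experimental.append(pair[0])
        if len(pair) > 1:
            control.append(pair[1])
    return experimental, control

# Hypothetical pre-test results for six students.
scores = {"s1": 14, "s2": 9, "s3": 11, "s4": 8, "s5": 13, "s6": 10}
print(matched_pair_randomisation(scores))
```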

The subjects trained interpretation of spatial information in radiographs utilising parallax for 90 minutes. The subjects in the experimental group trained individually. Training was divided into two sessions on the assumption that this would reduce the risk of VR-illness.85 The subjects in the control group utilised the ordinary educational material in one session. Training was performed individually or in small groups. Discussions with the teacher and fellow students were encouraged in the control group.

Proficiency testing was performed immediately after completion of the 90 minutes of training. The seventh and ninth semester students were retested eight months later.

Statistical analyses

Paper I

No formal statistical evaluations were made.

Paper II

The Mann-Whitney U test (Wilcoxon rank-sum test) was used for testing differences between two independent samples, and the Wilcoxon signed-ranks test was used for testing differences between two related samples.

Multiple linear regression analysis was used to analyse the effect of student training characteristics on radiography subtest and principle subtest results.

Papers III and IV

The one-sample Kolmogorov-Smirnov test was used to test whether a variable was normally distributed. The paired-samples t test was used for testing differences between two related samples. When data did not meet the normality assumptions required for the t test, the Wilcoxon signed-ranks test was used. The independent-samples t test was used for assessing differences between two independent samples.

Analysis of variance (ANOVA) was applied to determine if the effect of the training method was modified by the MRT-A level. Effect size (ES) estimates86,87 were calculated for comparison of mean test results for the experimental and control groups.

All tests were two-sided. P-values less than 0.05 were regarded as statistically significant. All analyses were performed with SPSS 13.0 for Windows (SPSS Inc., Chicago, Illinois).
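The reported tests are all standard procedures. As a rough illustration, the sketch below shows how the same tests could be run in Python with SciPy; the data and variable names are synthetic placeholders, since the actual analyses were performed in SPSS.

import numpy as np
from scipy import stats

# Synthetic example data (placeholders, not the study data)
rng = np.random.default_rng(0)
control = rng.normal(11.5, 3.2, 25)        # e.g. proficiency test scores
experimental = rng.normal(11.0, 3.2, 20)
before = rng.normal(11.0, 3.0, 20)
after = rng.normal(13.0, 3.0, 20)

# Mann-Whitney U / Wilcoxon rank-sum test: two independent samples (Paper II)
print(stats.mannwhitneyu(control, experimental, alternative="two-sided"))

# Wilcoxon signed-ranks test: two related samples (Papers II-IV)
print(stats.wilcoxon(before, after))

# One-sample Kolmogorov-Smirnov test of normality (Papers III and IV)
diff = after - before
print(stats.kstest(diff, "norm", args=(diff.mean(), diff.std())))

# Parametric alternatives when the normality assumption holds
print(stats.ttest_rel(before, after))            # paired-samples t test
print(stats.ttest_ind(control, experimental))    # independent-samples t test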


Results

Accuracy in projection geometry (Paper I)

Simulated and calculated projected distances between the projected dots, and the error in simulated projected distance, were analysed as functions of focus-object distance, rotation of the object, and rotation of the detector. The simulated projected distances were almost the same as those calculated. The distance error ranged from -1.1% to 1.7%, and the largest errors occurred for extreme projections. The differences between measured and calculated distances never exceeded the size of two pixels (0.5 mm) in the simulated radiograph. Figure 6 shows projected distances and error in projected distances as a function of rotation of the detector.

Figure 6. Plots of projected distances between two points as a function of rotation of the object, and error in projected distances. Focus-object distance 0.6 m, focus-detector distance 0.8 m. A) Simulated and calculated projected distances (mm) versus rotational angle (deg). B) Error in simulated projected distance, in per cent of the theoretically calculated distance, versus rotational angle (deg). ○ Simulated projected distance, × Calculated projected distance.
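The theoretically calculated projected distances that the simulated images were compared against follow from elementary central-projection (similar-triangle) geometry. The sketch below assumes two points rotating about an axis at the object position, with the detector plane perpendicular to the central ray; the dot separation used here is an assumed value, since the phantom geometry is described in Paper I.

import numpy as np

FOCUS_OBJECT = 0.600     # m, focus-object distance (as in Figure 6)
FOCUS_DETECTOR = 0.800   # m, focus-detector distance (as in Figure 6)
DOT_SEPARATION = 0.025   # m, separation of the two dots (assumed value)

def calculated_projected_distance(theta_deg, separation=DOT_SEPARATION):
    """Projected distance (mm) between two points after rotating them by theta
    about an axis through the object position, using an ideal point source and
    central projection onto the detector plane."""
    theta = np.radians(theta_deg)
    half = separation / 2.0
    # (lateral offset x, depth z from the focus) for the two points
    points = np.array([[ half * np.cos(theta), FOCUS_OBJECT + half * np.sin(theta)],
                       [-half * np.cos(theta), FOCUS_OBJECT - half * np.sin(theta)]])
    x_projected = points[:, 0] * FOCUS_DETECTOR / points[:, 1]   # similar triangles
    return abs(x_projected[0] - x_projected[1]) * 1000.0

def error_percent(simulated_mm, calculated_mm):
    """Error in the simulated distance, in per cent of the calculated one."""
    return 100.0 * (simulated_mm - calculated_mm) / calculated_mm

for angle in (0, 45, 80):
    print(f"theta = {angle:3d} deg -> {calculated_projected_distance(angle):6.2f} mm")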

Proficiency test validation

Students' proficiency test results before training and the experts' test results are presented in Table 2. Students scored significantly lower than the experts. Immediately after training and eight months after training, the differences in test results between students and experts were still significant, except for the radiography subtest, where the differences were borderline significant (P-values 0.06 and 0.05, respectively).

Table 2. Proficiency test results (mean and standard deviation) for students (before training) and for experts.

                        Students (N=86)     Experts (N=7)
                        Mean      SD        Mean     SD      P-value1
Proficiency test        11.16     3.20      16.00    1.00    <0.01
Principle subtest        3.94     2.41       6.00    0.00     0.01
Projection subtest       4.14     1.30       5.29    0.49     0.01
Radiography subtest      3.08     1.24       4.71    0.76    <0.01
1 Wilcoxon rank-sum test

Association between skills in interpretation of spatial information in radiographs and visual–spatial ability (Paper II)

Radiography subtest median scores increased from 60% before training to 86% after training; the difference was statistically significant (P<0.01). Principle subtest median results were 83% before training and 75% after training; this difference was not statistically significant. At radiography subtest testing after training, median scores were 71% for women and 86% for men. The difference was statistically significant (P=0.03). No other gender differences were found at proficiency testing.

Test results displayed in subgroups with low and high MRT-A scores are presented in Figure 7. Before training, the radiography subtest results were almost equal for the subgroups with low and high MRT-A scores. After training there was a significant difference between the subgroups (P<0.01). The principle subtest results showed significant differences between the subgroups both before (P=0.01) and after training (P=0.01).

Figure 7. Box-plots presenting test results before and after training by groups with low and high MRT-A scores. A) Radiography subtest. B) Principle subtest.


In Table 3 the results from the multiple regression analyses are presented. The regression models included MRT-A, age, and level of dental curriculum as independent variables. Gender was excluded from the analyses since it was not significant in any of the regression models. The MRT-A regression coefficient was significantly different from zero in the regression equation with radiography subtest results after training; the adjusted R2 was 0.28. The MRT-A regression coefficient was also statistically significant in the equations with the principle subtest results before and after training. The adjusted R2 was lower for the principle subtest than for the radiography subtest.

Table 3. Regression coefficients (B) and P-values for the independent variables MRT-A, age, and level of dental curriculum, and adjusted R squares, in multiple linear regression models with radiography subtest results and principle subtest results as dependent variables. N=86.

                         MRT-A            Age              Level
Test result              B      P         B      P         B       P      Adj R2
Radiography subtest
  Before training        0.79   0.22      0.45   0.44      5.86    0.32   0.00
  After training         2.18   <0.01    -1.46   <0.01     9.77    0.03   0.28
Principle subtest
  Before training        2.57   0.01     -2.34   0.01     -2.56    0.77   0.14
  After training         2.90   <0.01    -0.03   0.97     20.07    0.02   0.11
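A regression of the kind summarised in Table 3 could be fitted as sketched below with statsmodels; the data frame is synthetic, the coefficients are chosen only to loosely echo the after-training row of Table 3, and the 0/1 coding of curriculum level is an assumption made for illustration.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 86
df = pd.DataFrame({
    "mrt_a": rng.integers(0, 25, n),        # MRT-A score
    "age": rng.normal(24.0, 3.0, n),
    "level": rng.integers(0, 2, n),         # assumed coding: 0 = 4th, 1 = 7th/9th semester
})
# Synthetic outcome loosely echoing the after-training coefficients in Table 3
df["radiography_after"] = (40 + 2.2 * df["mrt_a"] - 1.5 * df["age"]
                           + 10 * df["level"] + rng.normal(0, 15, n))

model = smf.ols("radiography_after ~ mrt_a + age + level", data=df).fit()
print(model.params)        # regression coefficients (B)
print(model.pvalues)       # P-values
print(model.rsquared_adj)  # adjusted R2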

Effects of simulation based training on skill in interpreting spatial information in radiographs (Paper III)

Figure 8 displays the test results before and immediately after training. For the experimental group, the results on both the proficiency test and the radiography subtest were significantly higher immediately after training than before (P<0.01 for both tests). The corresponding differences for the control group were not significant.

Figure 8. Box-plots presenting test results for the trial groups before and after training. A) Proficiency test results. B) Radiography subtest results.

A subgroup analysis of the variable improvement after training is displayed in Table 4. For the low MRT-A category, a significantly higher improvement in the proficiency test was observed for the experimental group compared with the control group (P=0.02); among participants in the medium and high MRT-A categories, however, no significant differences between the training groups were observed. This indication that MRT-A category acts as an effect modifier was supported by ANOVA, which revealed a significant interaction effect (P=0.03) between training group and MRT-A score level. The corresponding analysis with the difference in improvement on the radiography subtest showed the same pattern, but the interaction effect was not statistically significant (P=0.31). The ES comparing the experimental and control groups in the low MRT-A category was 1.15 for improvement on the proficiency test and 0.99 for improvement on the radiography subtest.

Table 4. Improvement after training for the proficiency test and the radiography subtest (mean and standard deviation) in the control and experimental groups, by MRT-A group.

                      Control                  Experimental
                      N     Mean    SD         N     Mean    SD      P-value1
Proficiency test
  Low MRT-A           13   -0.31    3.64       7     3.86    3.34    0.02
  Medium MRT-A        10    1.60    2.41       9     2.44    2.17    0.44
  High MRT-A           6    2.33    3.88      12     0.50    3.37    0.32
Radiography subtest
  Low MRT-A           13   -0.23    1.54       7     1.29    1.70    0.06
  Medium MRT-A        10    1.00    1.63       9     0.89    1.76    0.89
  High MRT-A           6    1.17    1.94      12     1.08    2.19    0.94
1 Independent-samples t test.
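The ES definition follows the cited literature;86,87 the exact formula is not restated here, but a plain pooled-SD Cohen's d computed from the Table 4 summary statistics comes close to the reported values of 1.15 and 0.99 for the low MRT-A category, as the sketch below shows.

from math import sqrt

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardised mean difference using the pooled standard deviation."""
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Improvement scores in the low MRT-A category (experimental vs. control, Table 4)
print(cohens_d(3.86, 3.34, 7, -0.31, 3.64, 13))   # proficiency test, approx. 1.18
print(cohens_d(1.29, 1.70, 7, -0.23, 1.54, 13))   # radiography subtest, approx. 0.95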


Retention of interpretative skill at long-term follow-up (Paper IV)

Test results throughout the trial are summarised in Figure 9. The experimental group's test results decreased during the non-training period. The difference between the proficiency test results after training and eight months after training was statistically significant (P=0.01); the corresponding difference for the radiography subtest was not significant. For the control group, the proficiency test results decreased but the radiography subtest results increased, and there were no significant differences between test results after training and eight months after training.

Figure 9. Test results before training (BT), immediately after training (AT), and eight months after training (8AT). Solid line: experimental group; broken line: control group. Error bars represent SEM. A) Proficiency test results. B) Radiography subtest results.

A comparison between the pre-training test results and the test results eight months after training is displayed in Table 5. A significant difference in test results was seen only for the experimental group on the radiography subtest.

Table 5. Proficiency test and radiography subtest results (mean and standard deviation) before training and at follow-up, by training method.

                        Before training      Follow-up
                     N   Mean     SD         Mean     SD      P-value1
Proficiency test
  Control group      25  11.52    3.44       12.04    3.17    0.39
  Experimental group 20  11.05    3.20       11.70    3.20    0.42
Radiography subtest
  Control group      25   3.40    1.16        3.92    1.32    0.14
  Experimental group 20   3.20    1.40        4.00    0.86    0.02
1 Wilcoxon signed-ranks test


Discussion

Brief summary of results

A method for simulation of radiographic images was developed and evaluated. The evaluation showed that the software produced simulated x-ray images with only small deviations in projection geometry compared with theoretically calculated geometries (Paper I). The proficiency test validation showed that students scored significantly lower than experts on the individual subtests as well as on the proficiency test as a whole. Assessment of the skill to interpret spatial relations in radiographs before and after training revealed that the principle subtest results did not change with training. Principle subtest scores were associated with MRT-A scores both before and after training. Radiography subtest results increased after training, and the score after training was associated with MRT-A score (Paper II).

Simulator training led to a significant increase in proficiency test results immediately after training (Paper III). Repeated testing eight months after training revealed a reduction in test results compared with the results immediately after training, although the radiography subtest results remained at a significantly higher level than before training (Paper IV). The increases in test results recorded immediately after and eight months after conventional training were not statistically significant (Papers III, IV). Among individuals with low visual-spatial ability, the simulator training group had higher proficiency test results immediately after training than the conventional training group (Paper III).

The simulator (Paper I)

The high accuracy in projection geometry of the rendered radiographic images was an important result in the development of radiology simulation. Implementing the method in training programs would make it possible to reduce the use of ionising radiation for training purposes, thereby reducing the risk of harmful biological effects to patients and students.

At validation it was found that the error in projected distance never exceeded 0.5 mm, or two pixels. The results were achieved from projection through a CT data matrix with 0.4 mm pixel size, 2 mm thick slices, and 1 mm slice spacing. A subjective comparison between simulated and real intra-oral radiographs revealed that resolution and image clarity were lower in the simulated radiographs. It was, for example, in some instances difficult to distinguish low-contrast object details such as thin roots, which reduced the possibility of interpreting the simulated images. The lower image clarity was a result of the limited spatial resolution of the CT data matrix used for the simulated images. The simulated radiographs were, however, regarded to be of sufficient quality for training purposes. The quality of the simulated radiographs reflects the hospital CT scanner's resolution and the power of computer graphics available at the start of the Virtual Radiography project. There is a risk that the lower image clarity reduced the value of the simulator training because of the difficulty of identifying important anatomical details.

Modern hospital CT scanners and micro-CT scanners can produce data matrices with higher resolution.88 With the simulation method, a higher resolution in the data matrix produces higher resolution and better image clarity in the rendered radiographs. After completion of these studies, new CT data with higher resolution have been implemented in the software. Together with better-performing computer graphics boards, this has given a substantial improvement in image clarity. Figures 4 and 5 are based on the updated version of the simulator.
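In general terms, a simulated radiograph is rendered by casting rays from the focus through the CT data matrix to each detector pixel and accumulating the attenuation values along each ray, which is why a finer data matrix directly yields a sharper rendered image. The sketch below illustrates only this general principle; it is not the implementation used in the Virtual Radiography software, and all names and parameters are illustrative.

import numpy as np

def render_simulated_radiograph(volume, focus, detector_origin, detector_u,
                                detector_v, n_u=64, n_v=64, n_samples=256):
    """Sum attenuation values along rays from the focus to each detector pixel.
    volume is a 3-D array of attenuation values with unit voxel spacing."""
    image = np.zeros((n_v, n_u))
    for i in range(n_v):
        for j in range(n_u):
            pixel = detector_origin + (j + 0.5) * detector_u + (i + 0.5) * detector_v
            ts = np.linspace(0.0, 1.0, n_samples)[:, None]
            samples = focus + ts * (pixel - focus)           # points along the ray
            idx = np.round(samples).astype(int)              # nearest-voxel lookup
            inside = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=1)
            image[i, j] = volume[idx[inside, 0], idx[inside, 1], idx[inside, 2]].sum()
    return image

# Tiny synthetic example: a cube of attenuating material inside a 64^3 volume
vol = np.zeros((64, 64, 64))
vol[24:40, 24:40, 24:40] = 1.0
img = render_simulated_radiograph(vol,
                                  focus=np.array([-200.0, 32.0, 32.0]),
                                  detector_origin=np.array([128.0, 0.0, 0.0]),
                                  detector_u=np.array([0.0, 1.0, 0.0]),
                                  detector_v=np.array([0.0, 0.0, 1.0]))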

Interaction training was intended to prepare the subjects to interact with the simulator and to collect data for later usability analysis; the training was therefore standardised to 20 minutes. The collected data have not been scientifically evaluated, but a substantial variation in performance at the end of the training was noted. A few individuals did not reach a level where they interacted easily with the simulator during the interaction training, while most of the subjects reached a stable performance level. These observations indicate that the simulator's usability needs further analysis. The inferior performance at interaction training might also have detracted attention from the actual training task, thereby reducing the training effect.

Proficiency assessment

Transfer of training was measured using a test instrument designed by the authors. Since the test instrument was new, its validity needed to be analysed. Skill in interpreting spatial relations in radiographs utilising parallax depends on knowledge of topographic anatomy and radiographic projection geometry, and on an understanding of the nature of the radiographic representation and of the parallax phenomenon. The content of the test instrument reflected these areas and was in that respect estimated to be valid. In addition, the difference between student and expert performance on the proficiency test as well as on the individual subtests gave evidence for the test instrument's construct validity.83,84

The radiography subtest presented tasks that replicated the clinical task, so transfer from the test instrument to the real environment must be considered high.55 The radiography subtest results were therefore of particular interest to report. The low number of tasks (five), in combination with the dichotomous response alternatives, made the subtest sensitive to random effects because of the high probability of guessing correctly. With more tasks these effects would have been smaller. The decision on the number of tasks was influenced by the fact that the radiography subtest had the same design as the training material for the control group. In order to reduce potential training effects from the test instrument that could benefit one of the trial groups (the control group), the number of tasks in the radiography subtest was limited. Although there was a risk of random effects, the radiography subtest was regarded as a highly valid test.
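To put a number on this guessing risk, assume (our assumption, for illustration) that each of the five dichotomous tasks is guessed correctly with probability 0.5; the chance of a respectable score by guessing alone is then non-negligible.

from math import comb

n_tasks, p_guess = 5, 0.5   # five dichotomous tasks, 50% chance per guess (assumed)

def prob_at_least(k, n=n_tasks, p=p_guess):
    """Probability of getting at least k of n tasks right by guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"P(>= 3 correct by guessing) = {prob_at_least(3):.3f}")   # 0.500
print(f"P(>= 4 correct by guessing) = {prob_at_least(4):.3f}")   # 0.188
print(f"P(all 5 correct by guessing) = {prob_at_least(5):.3f}")  # 0.031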

Radiography subtest results immediately after training showed that a substantial number of subjects achieved the top score. This reflects a ceiling effect and indicates that the radiography subtest had a reduced capacity to differentiate between high-performing individuals; the validity of the radiography subtest was therefore reduced under those conditions. The proficiency test results did not show any ceiling effects, and their validity was therefore considered not to be affected by performance level.

Interpretation of spatial relations in radiographs utilising parallax (Paper II)

Skill in interpreting spatial relations in radiographs after training, as assessed by the radiography subtest, was better for individuals with high visual-spatial ability as assessed by the MRT-A test instrument. The results were obtained among students at two levels of radiology training (4th semester vs. 7th and 9th semester students), in combination with conventional training and two versions of radiology simulator training (pilot and final version). These background variables were included in the linear regression analyses. The final model explained one fourth of the variation in test results, which suggests that most of the variation in the development of skill in interpreting spatial relations in radiographs was due to factors other than visual-spatial ability. The observed gender difference on the radiography subtest was probably an effect of the well-known gender differences in visual-spatial ability, which were also observed in this material.82,89


The difference in principle subtest results between the low and high MRT-A groups, together with the regression models showing an association between principle subtest results and MRT-A score, suggests an association between visual-spatial ability and understanding of the parallax phenomenon. The regression models, however, explained only ten to fourteen per cent of the variation.

Skill in interpreting spatial relations in radiographs utilising parallax is regarded as dependent on knowledge of topographic anatomy and radiographic projection geometry, and on understanding the nature of the radiographic representation and the parallax phenomenon. Earlier studies have found a high correlation between spatial visualisation aptitude and interpretation of cross-sectional images,90 and students with higher visual-spatial ability have been found to be more adept at learning anatomy.51 These reported findings and our results suggest that anatomy knowledge and understanding of the parallax phenomenon are important factors for interpretation of spatial relations in radiographs.

Simulator vs. conventional training (Papers III and IV)

Conventional training adhered closely to the radiography subtest. Although there was thus better fidelity between conventional training and the test instrument, there was no significant improvement in radiography subtest results after training. This indicates that conventional training had a limited effect on interpretative skill at this level of education.

Simulator training was distributed over two sessions. Subsequent retention is often better if practice is distributed rather than massed; the effect is, however, less pronounced with complex tasks such as the interpretative skill, and its influence on skill retention over longer periods is unknown.91-93 The distributed training might have benefited the training results, but the effect might have been reduced because the students had to recall how to use the simulator at the second training session, which to a varying degree reduced the effective training time. These effects must be considered when interpreting the results, and they add some uncertainty to the interpretation.

The pros and cons of the two training modalities for skill acquisition can be analysed within the framework of learning theories. In operant conditioning theory the basic idea is that all behaviours are accompanied by certain consequences; the feedback given during training is therefore of interest to analyse. The simulator feedback was visual and text based, whereas the conventional training feedback was only text based. The simulator feedback was therefore more detailed, which could reinforce the progress a student makes during training. On the other hand, if the student could not control the basic functions of the simulator, such as navigation and interaction, the subsequent training would be obscured by problems outside the real task, thereby reducing the training effect. A simulator must therefore be designed for easy user interaction.

Information processing theory seeks to describe how internal processes function when people acquire new information, store it, and recall it from memory. There was no reason to suspect group-level differences in already stored information, since randomisation was based on pre-training proficiency test results. Thus, possible differences between the groups must be related to differences in training modality. Simulator training is considered to provide more new information for elaboration than conventional training: it added visualisation of the internal structures of the jaw, visualisation of the exact position of the object to be localised, and the possibility of interaction and experimentation with the fluoroscopic function. The extent to which these functions were used was not analysed in the present study. In conclusion, simulator training gave more opportunity for elaborative rehearsal, which would be beneficial for skill acquisition. In addition, the accurate projection geometry of the simulator supported meaningful learning.64

From a constructivist perspective, the conventional training group benefited from a cognitive apprenticeship that was not available to the simulator students. The skill to be trained was a kind of problem solving for which different tools were available to the two groups. In conventional training, information that could be used to solve the problem was gained from discussions with other students or the teacher. The simulator, on the other hand, was built to visualise as much as possible of the internal structures depicted in the radiographs and to allow experimentation with x-ray projections. The simulator thus provided a large amount of reliable information that could be used for the individual's internal construction of how spatial information in radiographs could be interpreted. It is clear that the simulator tasks provided information from more perspectives than the conventional training tasks. It is therefore assumed that simulator training promoted a deeper understanding of the procedure for determining spatial relations in radiographs utilising parallax and thereby made the students less dependent on memory rules such as 'the SLOB rule' or 'the buccal object rule'.78,81

The finding that individuals with low visual-spatial ability benefited most from simulator training is important, and the effect size87 for simulator training as compared with conventional training was of crucial importance here. This implies that the present simulator training could become a valuable training method for individuals who are likely to underperform with conventional training methods. As simulator training relies on richer visual information, it is assumed that the simulator effect is mediated by the enhanced visualisation, which gives perceptual and cognitive support that is especially valuable for these individuals.

The short-term results show that simulator training was an effective training modality. The acquired skill was, however, subject to decay: the training and practice subsequent to the training session were not enough to maintain the acquired interpretative skill. In spite of the decay, there were strong indications that the skill retained after simulator training was still better than before training. It is assumed that test results immediately after training to some degree reflect content in working memory, which can be lost without further training.63 Experience with other medical simulators shows that repeated practice and feedback improve skill retention.68-70,72,73,94


Conclusions

o The method developed for simulation of radiographic projections produces simulated radiographs of high geometrical accuracy. The simulation method can therefore be utilised in the construction of radiology training simulators.

o Understanding the parallax phenomenon is associated with visual-spatial ability.

o High visual-spatial ability facilitates acquisition of skill in interpreting spatial information in radiographs utilising parallax.

o Skill in interpreting spatial information in radiographs is significantly improved by training in a radiology simulator. The training effect is most pronounced for individuals with low visual-spatial ability.

o Interpretative skill improvement is partly retained eight months after simulator training. Conventional training does not result in a corresponding skill improvement.

A general conclusion drawn from the results is that training in a radiology simulator can be an effective tool for skill acquisition in oral radiology.


Acknowledgements

This work would not have been completed without the help and commitment of several individuals and organisations. I want to express my sincere gratitude to:

Jan Ahlqvist, my tutor and friend, for his enormous enthusiasm and never-ending support from the very beginning of this project.

Annika Isberg, adviser and co-author, for introducing me to research and for sharing her experience and ideas.

Magnus Johansson, computer programmer and co-author, for writing all the radiology simulator programs.

Leif Hedman, co-author, for introducing me to the world of psychology.

Kenneth Bodin, VRlab Umeå, for his commitment and support in the development of the radiology simulator.

Per-Olof Söderström, County Council of Västerbotten, for believing in the simulation ideas and for valuable support.

Fredrik Wiklund, statistician, for valuable statistical guidance.

Fredrik Bryndahl, colleague and fellow PhD student, for discussions about almost everything, for doing illustrations, and sometimes for sharing the lunch hour.

Ulrika Bertilssson, research engineer, for professional help with illustrations.

Lillemor Hägglund, secretary, for keeping practical issues in excellent order.

Eva Levring Jäghagen, colleague, for her advice and support.

Per Erik Legrell, colleague, for his encouragement and helpful criticism of texts and manuscripts.

Alyce Norder, for prompt and skilful editing of the English text.

All the dental students who were volunteer test subjects.

The staff at the Department of Oral and Maxillofacial Radiology, Umeå, for their support throughout the years.

This project was supported by grants from the Knowledge Foundation in Sweden, the Västerbotten County Council, Sweden, and EU Structural Funds, Objective 1, Northern Norrland, Sweden.

Last but not least, I want to thank my family. I am especially thankful to my beloved wife, Inga Greta, for having patience with me. I also want to thank Erik, Olov and Karin, Lars and Malin, for their interest and support, and Alma, Runa, Greta, and Arthur for being my lovely grandchildren.


References

1. Svenson B, Eriksson T, Kronström M, Palmqvist S. Image quality of intraoral radiographs used by general practitioners in prosthodontic treatment planning. Dentomaxillofac Radiol 1994;23:46-8.

2. Svenson B, Eriksson T, Kronström M, Palmqvist S. Quality of intraoral radiographs used for prosthodontic treatment planning by general dentists in the Public Dental Health Service. Swed Dent J 1995;19:47-54.

3. Helminen SE, Vehkalahti M, Wolf J, Murtomaa H. Quality evaluation of young adults' radiographs in Finnish public oral health service. J Dent 2000;28(8):549-55.

4. Hellén-Halme K, Johansson PM, Håkansson J, Petersson A. Image quality of digital and film radiographs in applications sent to the Dental Insurance Office in Sweden for treatment approval. Swed Dent J 2004;28(2):77-84.

5. Knoll GF. Radiation detection and measurement, 2nd edn. New York: Wiley; 1989, pp 1-2.

6. Tubiana M, Dutreix J, Wambersie A. Introduction to radiobiology. London: Taylor & Francis; 1990.

7. ICRP. 1990 Recommendations of the International Commission on Radiological Protection, ICRP Publication 60. Amsterdam: Elsevier; 1991.

8. Hounsfield GN. Nobel Award address. Computed medical imaging. Med Phys 1980;7(4):283-90.

9. Seeram E. Computed Tomography: Physical Principles, Clinical Applications, and Quality Control, 2nd ed. Philadelphia: W.B. Saunders Company; 2001.

10. Udupa JK. Three-dimensional visualization and analysis methodologies: a current perspective. Radiographics 1999;19(3):783-806.

11. Curry III TS, Dowdey JE, Murry Jr RC. Christensen's Physics of Diagnostic Radiology, 4th edn. Philadelphia: Lea & Febiger; 1990.

12. Rådberg C, Welander U. Maxillo-facial radiography. Stockholm: Siemens-Elema.

13. Coren S, Ward LM, Enns JT. Sensation and Perception, 6th edn. Hoboken, NJ: Wiley; 2004, pp 256-90.


14. Shi XQ, Li G, Yoshiura K, Welander U. Perceptibility curve test for conventional and colour-coded radiographs. Dentomaxillofac Radiol 2004;33(5):318-22.

15. Simpson J, Weiner E. The Oxford English Dictionary, 2nd edn. Oxford: Oxford university press; 1989.

16. Badiqué E et al. Entertainment applications of virtual environments. In: Stanney KM. Handbook of virtual environments: design, implementation, and applications. Mahwah, NJ: Lawrence Erlbaum Associates; 2002. p. 1143-66.

17. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004;13 Suppl 1:i2-i10.

18. Vince J. Essential Virtual Reality Fast: How to Understand the Techniques and Potential of Virtual Reality. London: Springer-Verlag; 1998.

19. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27(1):10-28.

20. Kohn LT, Corrigan JM, Donaldson MS. Creating safety systems in health care organizations. To err is human: building a safer health system. Washington, D.C: National Academy Press; 2000, pp 155-98.

21. Greene AK, Zurakowski D, Puder M, Thompson K. Determining the need for simulated training of invasive procedures. Advances in health sciences education: theory and practice 2006;11(1):41-9.

22. Santen SA, Hemphill RR, Spanier CM, Fletcher ND. 'Sorry, it's my first time!' Will patients consent to medical students learning procedures? Med Educ 2005;39(4):365-9.

23. Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ 2003;37(3):267-77.

24. Davis PK. Distributed interactive simulation in the evolution of DoD warfare modeling and simulation. Proceedings of the IEEE 1995;83(8):1138-55.

25. Rehmann AJ, Mitman RD, Reynolds MC. A handbook on flight simulation fidelity requirements for human factors research. Atlantic City, NJ: Federal Aviation Administration; 1995.

26. Issenberg SB, McGaghie WC, Hart IR, Mayer JW, Felner JM, Petrusa ER et al. Simulation technology for health care professional skills training and assessment. JAMA 1999;282(9):861-6.

27. Breier R, Bohm R, Kopani M. Simulation of radiation damage to lung cells after exposure to radon decay products. Neuroendocrinol Lett 2006;27 Suppl 2:86-90.

28. Good ML. Patient simulation for training basic and advanced clinical skills. Med Educ 2003;37 Suppl 1:14-21.

29. Abrahamson S, Denson JS, Wolf RM. Effectiveness of a simulator in training anesthesiology residents. 1969. Qual Saf Health Care 2004;13(5):395-7.

30. Ewy GA, Felner JM, Juul D, Mayer JW, Sajid AW, Waugh RA. Test of a cardiology patient simulator with students in fourth-year electives. J Med Educ 1987;62(9):738-43.

31. Gordon MS, Ewy GA, Felner JM, Forker AD, Gessner I, McGuire C et al. Teaching bedside cardiologic examination skills using "Harvey", the cardiology patient simulator. Med Clin North Am 1980;64(2):305-13.

32. Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment: re-creating the operating room for research and training. Anesthesiology 1988;69(3):387-94.

33. Issenberg SB, Pringle S, Harden RM, Khogali S, Gordon MS. Adoption and integration of simulation-based learning technologies into the curriculum of a UK Undergraduate Education Programme. Med Educ 2003;37 Suppl 1:42-9.

34. van Meurs WL, Couto PM, Couto CD, Bernardes JF, Ayres-de-Campos D. Development of foetal and neonatal simulators at the University of Porto. Med Educ 2003;37 Suppl 1:29-33.

35. Pugh CM, Youngblood P. Development and validation of assessment measures for a newly developed physical examination simulator. J Am Med Inform Assoc 2002;9(5):448-60.

36. Nyssen AS, Larbuisson R, Janssens M, Pendeville P, Mayné A. A comparison of the training value of two types of anesthesia simulators: computer screen-based and mannequin-based simulators. Anesth Analg 2002;94(6):1560-5

37. Kneebone RL, Nestel D, Moorthy K, Taylor P, Bann S, Munz Y et al. Learning the skills of flexible sigmoidoscopy - the wider perspective. Med Educ 2003;37 Suppl 1:50-8.


38. Wilhelm DM, Ogan K, Roehrborn CG, Cadeddu JA, Pearle MS. Assessment of basic endoscopic performance using a virtual reality simulator. J Am Coll Surg 2002;195(5):675-81.

39. Ritter EM, McClusky DA, Lederman AB, Gallagher AG, Smith CD. Objective psychomotor skills assessment of experienced and novice flexible endoscopists with a virtual reality simulator. J Gastrointest Surg 2003;7(7):871-7.

40. Caversaccio M, Eichenberger A, Häusler R. Virtual simulator as a training tool for endonasal surgery. Am J Rhinol 2003;17(5):283-90.

41. Anderson J, Chui CK, Cai Y, Wang Y, Li Z, Ma X et al. Virtual reality training in interventional radiology: the Johns Hopkins and Kent Ridge Digital Laboratory experience. Semin Intervent Radiol 2002;19(2):179-85.

42. Tokuyasu T, Yamamoto M, Okamura K, Yoshiura K. Development of a training system for intraoral radiography. Proceedings of the 2006 IEEE International Conference on Robotics and Automation. 2006. pp 3286-91.

43. Ström P, Kjellin A, Hedman L, Wredmark T, Felländer-Tsai L. Training in tasks with different visual-spatial components does not improve virtual arthroscopy performance. Surg Endosc 2004;18(1):115-20.

44. Nicholson DT, Chalk C, Funnell WR, Daniel SJ. Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model. Med Educ 2006;40(11):1081-7.

45. Halvorsen FH, Elle OJ, Fosse E. Simulators in surgery. Minim Invasive Ther Allied Technol 2005;14(4):214-23.

46. Satava RM. Virtual reality surgical simulator. The first steps. Surg Endosc 1993;7(3):203-5.

47. Liu A, Tendick F, Cleary K, Kaufmann C. A Survey of Surgical Simulation: Applications, Technology, and Education. Presence: Teleoperators and Virtual Environments 2003;12(6):599-614.

48. Beaubien JM, Baker DP. The use of simulation for training teamwork skills in health care: how low can you go? Qual Saf Health Care 2004;13 Suppl 1:i51-6.

49. Blade RA, Padgett ML. Virtual environments standards and terminology. In: Stanney KM. Handbook of virtual environments: design, implementation, and applications. Mahwah, NJ: Lawrence Erlbaum Associates; 2002. pp. 15-27.

50. Garg A, Norman GR, Spero L, Maheshwari P. Do virtual computer models hinder anatomy learning? Acad Med 1999;74(10 SUPPL):S87-9.

51. Garg AX, Norman G, Sperotable L. How medical students learn spatial anatomy. Lancet 2001;357(9253):363-4.

52. Stanney KM, Mourant RR, Kennedy RS. Human Factors Issues in Virtual Environments: A Review of the Literature. Presence: Teleoperators and Virtual Environments 1998;7(4):327-51.

53. Arnold P, Farrell MJ, Pettifer S, West AJ. Performance of a skilled motor task in virtual and real environments. Ergonomics 2002;45(5):348-61.

54. Waterworth JA. Dextrous and Shared Interaction with Medical Data: stereoscopic vision is more important than hand-image collocation. In: Westwood JD, Hoffman HM, Robb RA, Stredney D. Medicine Meets Virtual Reality 02/10. Amsterdam: IOS Press; 2002. pp. 560-6.

55. Lathan CE, Tracey MR, Sebrechts MM, Clawson DM, Higgins GA. Using virtual environments as training simulators: measuring transfer. In: Stanney KM. Handbook of virtual environments: design, implementation, and applications. Mahwah, NJ: Lawrence Erlbaum Associates; 2002. pp. 403-14.

56. Cox BD. The rediscovery of the active learner in adaptive contexts: A developmental-historical analysis of transfer of training. Educ Psychol 1997;32(1):41-55.

57. Snowman J, Biehler R. Psychology applied to teaching, 11th edn. Boston: Houghton Mifflin; 2006.

58. Kenyon RV, Afenya MB. Training in virtual and real environments. Ann Biomed Eng 1995;23(4):445-55.

59. Bloom BS, Engelhart MD, Furst EJ, Hill WH, Krathwohl DR. Taxonomy of educational objectives. The classification of educational goals. Handbook 1. Cognitive domain. New York: David McKay Company, Inc; 1956.

60. Bayraktar S. A Meta-analysis of the Effectiveness of Computer-Assisted Instruction in Science Education. Journal of Research on Technology in Education 2001/2002;34(2):173-88.

61. Christmann E, Badgett J, Lucking R. Microcomputer-based computer-assisted instruction within differing subject areas: a statistical deduction. Journal of Educational Computing Research 1997;16(3):281-96.

62. Christmann E, Badgett J. A Comparative Analysis of the Effects of Computer-Assisted Instruction on Student Achievement in Differing Science and Demographical Areas. Journal of Computers in Mathematics and Science Teaching 1999;18(2):135-43.

63. McGaugh JL. Memory–a century of consolidation. Science 2000;287(5451):248-51.

64. Ausubel DP, Novak JD, Hanesian H. Educational psychology: a cognitive view. 2nd edn. New York: Holt, Rinehart and Winston; 1978.

65. Schunk DH. Learning theories: an educational perspective, 4th edn. Upper Saddle River, N.J.: Pearson/Merrill/Prentice Hall; 2004.

66. Kolb DA. Experiential learning: Experience as the source of learning and development. Englewood Cliffs, N.J: Prentice-Hall; 1984.

67. Haney JJ, McArthur J. Four case studies of prospective science teachers' beliefs concerning constructivist teaching practices. Sci Educ 2002;86(6):783-802.

68. Arthur Jr W, Bennett Jr W, Stanush PL. Factors That Influence Skill Decay and Retention: A Quantitative Review and Analysis. Human Performance 1998;11(1):57-101.

69. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004;79(10 SUPPL):S70-81.

70. Wayne DB, Siddall VJ, Butter J, Fudala MJ, Wade LD, Feinglass J et al. A longitudinal study of internal medicine residents' retention of advanced cardiac life support skills. Acad Med 2006;81(10 SUPPL):S9-S12.

71. Wayne DB, Butter J, Siddall VJ, Fudala MJ, Lindquist LA, Feinglass J et al. Simulation-Based Training of Internal Medicine Residents in Advanced Cardiac Life Support Protocols: A Randomized Trial. Teach Learn Med 2005;17(3):202-8.

72. Kovacs G, Bullock G, Ackroyd-Stolarz S, Cain E, Petrie D. A randomized controlled trial on the effect of educational interventions in promoting airway management skill maintenance. Ann Emerg Med 2000;36(4):301-9.


73. Stefanidis D, Korndorffer JR, Sierra R, Touchard C, Dunne JB, Scott DJ. Skill retention following proficiency-based laparoscopic simulator training. Surgery 2005;138(2):165-70.

74. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS et al. Laboratory based training in urological microsurgery with bench model simulators: a randomized controlled trial evaluating the durability of technical skill. J Urol 2004;172(1):378-81.

75. Lohman DF. Complex information processing and intelligence. In: Sternberg RJ. Handbook of intelligence. Cambridge, UK: Cambridge university press; 2000.pp 285-340.

76. Carpenter PA, Just MA. Spatial Ability: An information processing approach to psychometrics. In: Sternberg RJ. Advances in the Psychology of Human Intelligence. Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers; 1986. 3, p. 221.

77. Gardener H. De sju intelligenserna. 3:e upplagan [Frames of mind - the theory of multiple intelligences]. Jönköping: Brain Books AB; 1994.

78. White SC, Pharoah MJ. Oral radiology: principles and interpretation, 5th edn. St. Louis, Missouri: Mosby; 2004: 91-3.

79. Jutterström M. Pilot study and specification of The Virtual Mouth. Umeå: Umeå University, Dept. of Computing Science; 1999.

80. Clark CA. A method of ascertaining the relative position of unerupted teeth by means of film radiographs. Proc R Soc Med 1910;3(PART III):87-90.

81. Richards AG. The Buccal Object Rule. J Tenn State Dent Assoc 1953;33:263-8.

82. Peters M, Laeng B, Latham K, Jackson M, Zaiyouna R, Richardson C. A redrawn Vandenberg and Kuse mental rotations test: different versions and factors that affect performance. Brain Cogn 1995;28(1):39-58.

83. Carmines EG, Zeller RA. Reliability and validity assessment. Beverly Hills: Sage, cop.; 1979.

84. Gallagher AG, Ritter EM, Satava RM. Fundamental principles of validation, and reliability: rigorous science for the assessment of surgical education and training. Surg Endosc 2003;17(10):1525-9.


85. Kennedy RS, Kennedy KE, Bartlett KM. Virtual environments and product liability. In: Stanney KM. Handbook of virtual environments: design, implementation, and applications. Mahwah, NJ: Lawrence Erlbaum Associates; 2002. pp 543-53.

86. Cohen J. Statistical Power Analysis for Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988, pp 8-27.

87. Hojat M, Xu G. A visitor's guide to effect sizes: statistical significance versus practical (clinical) importance of research findings. Advances in health sciences education : theory and practice 2004;9(3):241-9.

88. Dowker SEP, Davis GR, Elliott JC. X-ray microtomography. Oral Surg Oral Med Oral Pathol 1997;83:510-6.

89. Voyer D, Voyer S, Bryden MP. Magnitude of sex differences in spatial abilities: a meta-analysis and consideration of critical variables. Psychol Bull 1995;117(2):250-70.

90. Berbaum KS, Smoker WRK, Smith WL. Measurement and prediction of diagnostic performance during radiology training. AJR Am J Roentgenol 1985;145(6):1305-11.

91. Rohrer D, Taylor K. The effects of overlearning and distributed practise on the retention of mathematics knowledge. Appl Cognit Psychol 2006;20(9):1209-24.

92. Kerfoot BP, DeWolf WC, Masser BA, Church PA, Federman DD. Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial. Med Educ 2007;41(1):23-31.

93. Donovan JJ, Radosevich DJ. A Meta-Analytic Review of the Distribution of Practice Effect: Now You See It, Now You Don't. J Appl Psychol 1999;84(5):795-805.

94. Wayne DB, Butter J, Siddall VJ, Fudala MJ, Wade LD, Feinglass J et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med 2006;21(3):251-6.


Appendix

Principle subtest

[Diagrams for the principle subtest, showing arrangements of the symbols O and V in six numbered figures, are not reproduced here.]

Instruction (translated from Swedish): Determine the positions of the different objects relative to each other. Closest to the viewer = 1, then 2, 3, and so on. Write the number in the corresponding figure below.


Projection subtest


Radiography subtest (translated from Swedish)

Image 1 (264). Where is the crown of tooth 13 located in relation to the root of tooth 12? Buccally or palatally? …………………………

Image 2 (291). A. How were the two images taken in relation to each other? Use the term 'orthoradial' and one of the terms 'mesially eccentric' or 'distally eccentric'. The upper image is ………………………… The lower image is …………………………
B. Where is the tooth germ of 15 located in relation to the mesiobuccal root of 16? Buccally or palatally? …………………………

Image 3 (584). A. Where is the upward-directed extra tooth germ located in relation to the root of tooth 11? Buccally or palatally? …………………………
B. Where is the downward-directed extra tooth germ located in relation to the root of tooth 21? Buccally or palatally? …………………………

Accompanying radiographs were displayed on a monitor.
