Clinical Instructor Academy
Module 4: Assessment of Clinical Performance
Transcript

©2011 ASRT. All rights reserved. For educational and institutional use. This transcript is licensed for noncommercial, educational in-house or online educational course use only in educational and corporate institutions. Any broadcast, duplication, circulation, public viewing, conference viewing or Internet posting of this product is strictly prohibited. Purchase of the product constitutes an agreement to these terms. In return for the licensed use, the Licensee hereby releases and waives any and all claims and/or liabilities that may arise against ASRT as a result of the product and its licensing.


CIA Module Navigation

This guide is designed to help you view the Clinical Instructor Academy modules. The transcript corresponds to the audio and visual components of the module and replaces any closed captioning. Topic headings are represented in bold in the script and the text is the narration.

Navigation controls

1. Pause/Play: Pauses and restarts the presentation.
2. Back: Returns you to a previous section of the module.
3. Fast Forward: Click to advance at 2x speed; click a second time to advance at 4x speed. Click again to return to normal speed.
4. Slider Bar Control: Drag the indicator along the bar to navigate to any point in the presentation.
5. Mute: Turns the audio track on and off.
6. Section Indicator: Shows the current section and total number of module sections.


Clinical Instructor Academy Module 4 – Assessment of Clinical Performance

Slide 1 American Society of Radiologic Technologists essential education

-----------------------------------------------------------------------------------------------

Slide 2 Clinical Instructor Academy Module 4 - Assessment of Clinical Performance

Andrew Woodward, M.A., R.T.(R)(QM)(CT)
Clinical Assistant Professor
University of North Carolina
Chapel Hill, North Carolina

License Agreement
For individual use only. This product is licensed for use by the individual purchaser only. Institutional, educational, community, library or public use of this product is not permitted. This product may not be duplicated, resold, rented, transmitted in any form or used in any other unauthorized manner. Purchase of the product constitutes an agreement to these terms. In return for the licensed use, the Licensee hereby releases and waives any and all claims and/or liabilities that may arise against ASRT as a result of the product and its licensing.

Disclaimer
The carefully researched information contained in this activity is generally accepted as factual at the time of production. The ASRT and the presenter disclaim any responsibility for new or contradictory data that may become available before the next revision. Radiologic technologists must take into account existing state statutes and institutional policies as they relate to the information presented. This activity may be available in multiple formats or from different sponsors. ARRT regulations state that an individual may not repeat a self-learning activity for credit if it was reported in the same or any subsequent biennium.

-----------------------------------------------------------------------------------------------

Slide 3 Objectives Hello. My name is Andrew Woodward, and this video is going to talk about the assessment of clinical performance. The objectives we're going to look at are:


Define assessment and evaluation.

Describe an assessment rubric.

Identify why clinical performance is assessed.

Describe the difference between competent and proficient.

Describe when an assessment should take place.

Discuss the formative assessment process.

Describe the role of feedback in the assessment process.

Discuss the summative assessment process.

Relationships The student needs to interact with you as the instructor and not the “enforcer” of rules.

To start off with, we need to talk about relationships that you will need to establish as the clinical faculty. In order for clinical assessment to be valued and effective, the student needs to interact with you as an instructor and not as the enforcer of rules. In other words, you and the student need to be allies and not adversaries.

You may spend more time acting as the “enforcer” instead of the instructor. Unfortunately, you may find you spend more time acting as an enforcer and not as an instructor, and it's a difficult role that you'll have to juggle in the process.

Being a successful instructor carries a great personal reward. I can assure you, being a successful instructor carries with it a greater personal reward than being the successful enforcer.

Assessment vs. Evaluation

Assessment – a continuous process of gathering information related to a student’s knowledge, attitudes, behaviors and skills.

What's the difference between assessment and evaluation? Assessment is a relatively continuous process where you're gathering information related to the student's knowledge, attitudes, behaviors and skills. The information that you gather is going to be used to provide feedback to the student so that they may improve their performance. This is an important piece, because if you don't give feedback to the student, they're not going to know what they need to do to improve. You don't want to wait until the end of a semester to give feedback, because at that point they've had time to develop bad habits.

Evaluation – occurs at the end of a specified time frame and is the result of an analysis of the information contained in assessments.

An evaluation, on the other hand, is a process that occurs at the end of a specified time frame and is the result of an analysis of the information contained in all of the assessments completed over that time frame. The distinction is that assessment is ongoing – you're watching, you're taking notes, meeting with your student, providing them with feedback. At the end, what you're going to do is an all-encompassing evaluation that takes into account all of the assessment data you've gathered on the student.

A challenge for the clinical instructor is to gather information over a contiguous period of time instead of at intervals or “snapshots.”


A challenge for the clinical instructor is the ability to gather information over this contiguous period of time, in essence back-to-back, instead of at intervals or "snapshots." If you see a student one week a month, at the end of the semester you're trying to build a big picture when it would have been more useful to have more closely spaced pieces of observed data. In other words, don't work with snapshots; work with closely spaced pieces of information so that you don't end up with an inaccurate assessment because you caught the student on a bad day.

Assessment Rubric A tool designed to standardize the assessment process.

How are you going to do all this? First let's take a look at an assessment rubric. Rubric’s a fancy word for a tool that's designed to standardize your assessment process. It basically is your map for how you're going to be able to assess the student, or I should say multiple students, consistently.

Standardization helps reduce the subjectivity of the assessment. This standardization helps to reduce the subjectivity of your assessment. The map is telling you where you want to go.

The contents of a rubric may be generic or very specific. The contents in that rubric can be very generic, or you can make it very specific, depending upon what the goal is of your assessment.

Let's take a look at this first example. There're several to follow, but it's one of those things that will help you build a picture for why this will make it easier in the evaluation process.

In this particular grid, across the top you see we have a Performance Level 3, a Level 2 and a Level 1. Basically, Level 3 would be the highest; Level 1 would be the lowest. Then on the left-hand column you have different characteristics. I just happened to pick these to use as examples, but what you need to do is pick items that are relevant to your program. If we look at professionalism, each box below a performance level would have a descriptor. You want to create the descriptors for what you feel to be the most important level for Level 3, Level 2 and Level 1.


In this case, for professionalism, a Level 1 you might describe as the student is "frequently disrespectful and inconsiderate or does not accept responsibility for errors or does not adhere to dress code." For teamwork you may have a student who's very good (this is the same student), where they are consistently conscientious, they routinely demonstrate extra effort to be an integral team member and they earn the respect of support staff. Then when we look at knowledge, you have a performance level of 2, where the student demonstrates appropriate knowledge and consistently and correctly applies it in the clinical setting. This would be one where you would go through and check the box. You can choose to put a point value in that box or not; that's up to you. If you put the point value in there, it makes it easier to score at the end. The key with all of these rubrics is to make sure that you share them with the students so that they know what to expect.
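The point-value option described above can be sketched in code. This is a minimal, hypothetical illustration, not part of the module; the characteristic names, level labels and point values are illustrative assumptions.

```python
# Hypothetical sketch: a three-level rubric with point values, as described above.
# Characteristic names, level labels and points are illustrative assumptions.

POINTS = {"Level 3": 3, "Level 2": 2, "Level 1": 1}  # Level 3 highest, Level 1 lowest

def score_rubric(selections):
    """Sum the points for the performance level checked in each characteristic row."""
    return sum(POINTS[level] for level in selections.values())

# One assessment: the instructor checks one performance level per row.
assessment = {
    "Professionalism": "Level 1",
    "Teamwork": "Level 3",
    "Knowledge": "Level 2",
}

total = score_rubric(assessment)                  # 1 + 3 + 2 = 6
maximum = len(assessment) * max(POINTS.values())  # 3 rows x 3 points = 9
print(f"Score: {total}/{maximum}")                # → Score: 6/9
```

With point values filled in, scoring at the end of a rotation reduces to a simple tally; without them, the checked boxes still serve as qualitative feedback.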

Let's take a look at this next rubric. The top one is about patient care skills; the bottom one is positioning skills. If we look at the patient care skills, you want to define what you are looking for: Did they support the patient emotionally? Did they make sure the patient was comfortable? Were they aware of what was going on with the patient and the patient's condition? Did they protect the patient using lead shielding or lead strips? In this case, you'd score it on a scale of 1 to 5, circling the score. Very straightforward and simple; you don't have to worry about what a description really means. When we go down to evaluating the positioning skills, you're going to look at it and ask: Was there proper centering? Was the positioning adapted to the patient's condition? Was there anything in the patient history that might make you alter how you position the patient? And does the student understand the protocol for the procedure being done?


Two rubrics are shown here. The top one you could use as an overall evaluation, maybe at mid-semester or maybe at the end of the week. Once again, it has a high score of 5 and a low score of 1. In this particular case, you're going to evaluate clinical adaptability, professional appearance, the efficient use of educational opportunities and overall clinical performance. Make sure you share it with the student so they know what's going to be used to assess them. For all of these you're going to have an area for comments.

A different type of rubric, what you see at the bottom of the screen, is one where I've used the terms novice, advanced beginner and competent. A little bit different terminology, but it's terminology that Dreyfus and Dreyfus have identified in their writings in regard to a person learning a skill. All of us can relate to the terms novice, advanced beginner and competent with respect to what it was like when we first enrolled in a radiography program compared to where our skill set is now. The nice thing with this is you would expect to see growth across the scale from novice to advanced beginner to competent. A first-semester student, when you talk about tube and Bucky manipulation, is going to be a novice. They're kind of clunky in moving the tube and the Bucky and lining everything up. They get to the control panel, and maybe they're very good there, so you'd mark them as an advanced beginner. You put them on a portable machine and they run into the wall or knock into things; that would be novice. After a week in the OR, you would hope that they're going to be an advanced beginner, maybe competent. That's hard to say; we'll talk about that coming up. What's their level of skill in operating the radiology information system, or RIS, and how comfortable are they with the whole PACS routine? We really do need to include those in the assessment process.
At the bottom of this, you're going to have an area where you can write comments. That's an important piece, comments. You don't want to hold onto this for two weeks. You want to try to get back to that student as quickly as possible so that you can identify the things that they're doing right and you can work with them on changing the things that need to be corrected.


Here's another rubric, using the terminology, once again, of the novice, the advanced beginner and competent. In this particular case, you might use this when you evaluate the student while they're doing a radiographic procedure. Because you look at it in terms of marker placement, central ray to the receptor alignment, collimated exposure field, part centering to the exposure field, technique selection vs. exposure indicator, artifacts, part positioning. What you want to see as the student progresses, when you hit that final semester and you're ready to send them on their way, you want to start to go through and see all competents. That is letting you know that your feedback has worked and the student is doing what needs to be done.

-----------------------------------------------------------------------------------------------

Slide 4 Make Your Own Rubric

Using a rubric created by someone else can lead to frustration. Let's talk about making your own rubric. While it may seem easier to use a rubric designed by someone else, it's actually likely to lead to frustration on your part unless you have the same criteria for grading as the creator of that rubric. You want to make sure that you understand the terms that the person who created the rubric had in mind if you're going to use somebody else's. What I would suggest is you search for any number of rubrics that are available.

Gather several examples; select the best element from each example. You can get on the World Wide Web, and also look at the resources with this video, and gather together several examples; put together a rubric that contains what you like best from each example. Make it your own work. Each example just serves as a guide so that you can come up with a tool that really allows you to assess and provide the feedback that you want.

Consider your rubric a work in progress. You want to treat your rubric like a work in progress. Once you've designed it, it's not done. It's really important that you realize that rubric is going to have to evolve as your teaching and evaluation skills evolve. You may at the end of a year want to sit down, re-evaluate that tool and say, "You know, I really think that we need to change what it is in this particular assessment area."

Why Assess Clinical Performance? Now we get to the big question: Why assess clinical performance? Sometimes we say, “Well, look, that's a given; we know why we want to do that.” It's important that when we do this, we understand the importance of the assessment and making sure that our assessments and our evaluations are a true reflection of student behaviors, student performance. Documentation of:

Safe clinical practice.

Competent clinical skills.

Proficient clinical skills. First and foremost, we're assessing to provide documentation of the following things: safe clinical practice. If your student is not safe in the clinical setting, it's not appropriate for us to keep moving them along. No one wants to go into the hospital and, you know, be the recipient of unsafe clinical skills because of the injury possibility. When we look at safe clinical practice, we want to make sure that the student is demonstrating competent clinical skills and proficient clinical skills.

Acceptable professional behavior: Integrity. Interpersonal relationships. Dependability. Appearance.

In addition to that, we want to make sure that we are documenting and promoting acceptable professional behavior. The student spends more time in clinical than they do in the classroom, so the development of acceptable professional behavior is most influenced by the time in clinical, not the time in the classroom with the instructor. So what do you want to define in terms of acceptable professional behavior? I've listed some terms here, but this is something that everybody associated with the program needs to sit down and agree on: "This is what we want to have." In this case, you're looking at the integrity of the person; what their interpersonal relationships are like, whether with fellow students, the technologists on the floor or the support staff; their dependability; and their appearance. The list goes on and on. This gives you an idea of the things that you want to spell out with descriptors and scores, so the student knows, "This is how we're going to evaluate you." It's important to realize that the program needs to define the behaviors and the competency skills; it's not the student's role to say, "Well, I don't think that's what this should be." It's really up to the program to define what it wants.

Competent

Possessing a level of knowledge, skill and attitude that permits one to perform a specific procedure.

What is competent? There's a whole big discussion on this, and as I looked at this, I was like, what would make the most sense? When we look at competent, putting together a definition, it's the student possessing a level of knowledge, skill and attitude that permits them to perform a specific procedure.


When you were saying the student is competent to do a chest radiograph, they're competent to do that chest radiograph on the same type of patient, in the same room, with the same patient conditions. That doesn't mean they're competent to do every chest radiograph.

Dreyfus and Dreyfus (1986) identified five steps to becoming an expert: Novice. Advanced beginner. Competence. Proficiency.

If we look at this phase of developing skills or the stages of developing skills, Dreyfus and Dreyfus identified five steps to becoming an expert. You may look at this source and say, "Gee, that's 1986… that's a long time ago." Well, the fact of the matter is how we learn a skill really has not changed. If you were to talk to someone who went through a radiography program in 1960 and said, "How did you learn how to do this?", the steps they went through are going to be the same steps that you went through or I went through or students in 10 years will go through. That comes down to basically five steps. The steps that are most relevant to the development of the clinical skills of our student radiographers are, number 1, they're a novice, day one in the program, they're a novice; and then they become an advanced beginner, they're a little bit better; and after some time at that advanced beginner level, they start to demonstrate competence. Our end goal in this is to hopefully have a graduate who's going to be proficient. They're not going to be proficient in every single exam. We can produce a graduate that is going to be competent. They have a level of knowledge and skill and attitude that's going to permit them to perform specific procedures. If they've never done mastoids the entire time they're in the program, you can't say that they're competent to do mastoids at graduation. But several years after graduation, if they happen to be in a situation where mastoids are performed routinely, they will have developed competence in that procedure and quite possibly be proficient.

Competence Let's talk some more about competence. You started off as a novice. With the novice, you have the student in the lab and you say, "OK. This is how a PA hand is done: You get a 10 x 10, you have a right marker for a right hand, and you're going to place the part in the middle of the cassette." You have a very specific set of rules. The student memorizes those rules. They memorize those rules out of the context of variations in patients and variations in equipment, but they've locked them away in their brain: "I know what I need to do to do a hand."

Competence occurs only after going through the novice and advanced beginner stage. Achieving competence occurs only after going through the novice phase. From there, you move into the advanced beginner phase, where the student's a little bit better, and it's through that process that the skill develops.

The novice learns the rules related to a given skill. During that novice phase, the student learns the rules related to whatever that skill is – positioning a PA hand, positioning a KUB. They understand how to follow those rules in the laboratory setting, which is not real life. In essence, they may be great with a classmate to position the part with that x-ray tube and that table and your color cassettes, but that doesn't necessarily translate to them doing real well in their actual clinical setting.


The advanced beginner starts to understand how to apply the rules. They'll leave your lab and then move into the clinical setting. When they get into the clinical setting, they start to transform into what we're going to call that advanced beginner. They learn the rules. They're just beginning to understand how to apply the rules in the context of a real-life situation. You've taken them from a lab, because, as we all know, what a student will often hear in the clinical setting is, "Well, that's what you learned in the classroom, this is real life," in clinical. Which is very true; that which you learn in the classroom is completely out of context to what is going to occur in the real-life setting. But you still have to know the novice rules and the advanced beginner rules before you can function in the real-life clinical setting.

Novice/Advanced Beginner Example A novice is likely to say, “The book says this is how to position for a hand.”

If we look at what a novice or advanced beginner is likely to say to you – a novice is likely to state to the technologist, "The book says this is how to position for a hand: Put the central ray at the third metacarpophalangeal joint, and it has to be 45 degrees for the oblique." They can give you the exact rules, but that doesn't take into account that the patient you're doing the hand on might have rheumatoid arthritis and significant hand deformity. That changes how things are done.

An advanced beginner is likely to say, “During our lab practical exam I positioned the part like this."

The advanced beginner is more likely to say, "Well, gee, during our lab practical exam last semester, this is how I positioned to do this." When you look at this advanced beginner, they're saying, "I'm trying to apply this to the situation I'm in right now, and it isn't necessarily working." A piece here that you, as the clinical instructor, as well as the staff in the department, need to realize is that the student isn't questioning what you're doing. They're trying to apply what they learned to the current real-life situation, and they're not sure how to do that because they haven't demonstrated competence. More importantly, they don't have the proficiency or the mastery of a technologist who's been working on the floor for several years.

Competence Students develop competence after considerable clinical experience.

We come back to competence again. And the student develops competence after considerable experience with applying the rules they learned to their own real clinical experiences. If they haven't done the real exam in that real-life clinical situation, we can't say, “Yes, you are competent.” You could say, “Yes, you're competent in the lab, but being competent in the lab is not the same as being competent with a live patient in the departmental radiographic room.”

Students understand the application of rules and make decisions regarding the rules in the context of the procedure in real time.

The competent stage is going to see the student develop an understanding – and this is the important piece – they're developing an understanding of the application of the rules and they're making decisions regarding the rules in the context of completing the procedure in real time. That's very important and you need to communicate this to the student in letting them know, day one, you're a novice. Depending upon how many procedures you've done of a particular thing, you go from novice to advanced beginner. Then that student is going to say to you, "Well, OK, I want to do my competency,"


"Well, let's go in and do this," and they may not do very well because the exam that they've chosen doesn't match the situations that they've been in prior to with all their practice. It's important to let the student do that because they're going to gain experience from it, but you can't turn and say, "You're competent now," if they didn't do the exam the way you've defined.

-----------------------------------------------------------------------------------------------

Slide 5 Proficient

In order to be considered proficient, a student must demonstrate increased knowledge, skills and attitudes during the performance of a procedure.

What about proficient? One of the things that all programs like to say is that our graduates are going to be competent and proficient. Proficient is that next step, and at the end we would like to see all of our graduates be proficient. Before, we just said a specific level of knowledge, skills and attitudes; now, in order to be considered proficient, a student must demonstrate increased knowledge, skills and attitudes during the performance of a procedure. This begins its development as the student does a greater variety of procedures – or really, what it comes down to is a greater variety of patients with varying conditions in different radiographic rooms that influence what they're going to have to do to perform the procedure.

A student may not be deemed proficient until they have demonstrated that they are competent.

A student may not be deemed proficient until they have demonstrated that they are competent. These are a series of sequential steps: once again, you start as a novice, and from novice you become the advanced beginner. Once the student starts to feel very comfortable and can demonstrate the specific knowledge, skills and attitudes for a particular exam, they're going to say, "Assess my competence," and you're going to say either "You are competent" or "You're not." As you get toward the end of the program – and this is where your students are working more and more with indirect supervision – you're going to start to see that their level of skill has increased. They have much greater experience in performing a wide variety of exams, and you're starting to see that they're becoming proficient.
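The strictly sequential nature of these stages can be sketched as a simple ordered check. This is a hypothetical illustration, not part of the module; only the stage names, taken from Dreyfus and Dreyfus (1986), come from the source.

```python
# Hypothetical sketch: the Dreyfus and Dreyfus (1986) stages as an ordered sequence.
STAGES = ["novice", "advanced beginner", "competent", "proficient", "expert"]

def may_be_deemed(current_stage, target_stage):
    """A student may only be deemed the next stage in the sequence;
    e.g. not proficient until competence has been demonstrated."""
    return STAGES.index(target_stage) == STAGES.index(current_stage) + 1

print(may_be_deemed("competent", "proficient"))          # → True
print(may_be_deemed("advanced beginner", "proficient"))  # → False (skips competent)
```

The point of the ordered list is that no stage can be skipped: a student who has not demonstrated competence cannot be deemed proficient, no matter how comfortable they appear.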

Proficiency Proficiency is the result of increased exposure to a wider variety of examinations that permit the student to practice the application of their competency skills quite often with intuition (Dreyfus and Dreyfus, 1986).

If we were to spell out proficiency a little bit more, and the bottom line here is proficiency is a result of increased exposure to a wider variety of examinations that permit the student to practice the application of their competency skills quite often with intuition. They're no longer running through the rules of positioning in their head while they do the exam. They are actually starting to intuitively know, “This is what I need to do during this procedure.”

The student is able to: Recognize how to complete the examination. Use prior learning experiences. Apply guidelines.


The student is able to evaluate the examination and recognizes how to complete the examination based on their prior learning experiences in the application of the guidelines they've learned. When we talk about proficiency, we're putting everything together. The student has the big picture. This is a very important conversation to have with your students about the clinical experience. Many times they come in figuring it's "see one, do one, teach one," and now you're ready to go. That isn't necessarily going to be the case. It's also really important to understand at this point that every student will progress differently, because they're all processing differently the things that are going on in the clinical experience.

Beyond Proficient Expertise is the fifth level of the Dreyfus Model of Skill Acquisition.

What about beyond proficient? Just about anyone can relate to this when we look at the acquisition of a skill. When we go beyond proficient, expertise is the fifth level of the Dreyfus Model of Skill Acquisition.

Expertise is developed after extensive knowledge, skills and attitudes are learned through the performance of many procedures.

Expertise is developed after extensive knowledge, skills and attitudes are learned through the performance of many procedures. You're looking at someone who has years of experience, but not only those years of experience, but they've done many, many, many exams. You may have someone who's only worked in a doctor's office or someone who's worked in a very busy hospital, and you're going to have two different levels of expertise. But in either case, you're going to be dealing with an expert. How do you recognize the expert?

The expert radiographer intuitively knows how to complete a procedure and generally does not stop and analyze the situation to determine how best to complete the procedure.

If you think back to when you were a student, or if you look around the departments that you're in now, the expert radiographer intuitively knows how to complete a procedure and does not stop and analyze the situation to determine how best to complete that procedure. This is an important piece, because students get really frustrated when the expert makes it look simple. The expert gets frustrated when the student asks questions, because then the expert has to stop and think about why they did what they did. They intuitively know what to do in terms of tube movement, tube angle, part placement, rotation – the fact that we're not going to be able to do this supine, we're going to have to do this prone, or let's do this at the upright Bucky vs. doing this with a grid cassette. It makes for a real challenge, because their experience has allowed them to develop this intuitive sense of what to do.

It is not realistic to expect a student to demonstrate expertise during their enrollment in school.

An important piece of this is it's not realistic for us, as educators, to expect a student to demonstrate expertise during their enrollment in school. You're not going to graduate experts. It's just not going to happen. You're not going to be able to take a recent graduate and drop them into a setting and they're going to know how to do everything. You're going to graduate somebody who is safe in their clinical practice, they have the knowledge of clinical practice, they have the skills for clinical practice and they have the attitudes for clinical practice. But they're not experts. They're not going to become experts until they have developed this extensive knowledge and extensive set of skills and this extensive set of attitudes that they're going to be able to apply across the board in their examinations.


-----------------------------------------------------------------------------------------------

Slide 6 What Is Assessed?

Clinical courses are unique in that they require assessment of a student’s cognitive, affective and psychomotor skills.

What do we want to assess? In clinical, there's any number of things to assess, but clinical is unique in that your assessments of a student occur in the cognitive, affective and psychomotor domains. What does that mean?

Cognitive – thinking. When we talk about the cognitive domain, you are assessing the thinking skills of the student. It's one of those times when you get really frustrated, and you look at the student and say, "Didn't you think about what you were going to do?" Quite possibly they didn't, because as a novice, they're not thinking about things. They are merely memorizing the rules and how that rule applies, not the impact of not following that rule.

Affective – attitudes, emotions, feelings. When we talk about assessing the affect, you're looking at the person's attitudes, their emotions, their feelings, body language, did they roll their eyes, those types of things.

Psychomotor – hands-on. Last but not least is psychomotor. When we look at psychomotor, we're looking at the hands-on piece. This is one of those things you've heard before – and if you haven't, I assure you you'll hear it in your future: the student is great in the classroom, but they just don't get it in the clinical setting. In this case, in the classroom, what's assessed? Cognitive. In a cognitive situation in the classroom – a paper-and-pencil or oral question situation – you find the student can really think. But when you send them out to the clinical setting, you have a great thinker, but when it comes time to put their hands on the patient and apply what they're going to do, it's very, very difficult. In the process, you're going to see the affect of the individual. What's their level of frustration? Do they all of a sudden become defensive? Do they burst into tears when things become way too challenging? In clinical, all three of these are occurring simultaneously when you evaluate a student. In the classroom, we don't look at all three at the same time. Clinical is much more difficult to complete the evaluation process because you have all three of these, and all three interact with each other. If you have a situation that is high stress, you may find the student doesn't do well thinking through the process, and their psychomotor skills aren't the greatest. This whole cognitive, affective and psychomotor assessment is linked in with all of the assessments that you're going to do. The competency – what happens if they make a mistake in the middle of that competency? Do they fall to pieces? Are they unable to think about where they want to go? As you create your tools, you're also going to have different levels of scoring through this.

When To Assess? Before an assessment, provide a detailed rubric to the student.

The next question is when do you assess? At this point, prior to completing an assessment, it is best to provide a detailed rubric, which we talked about earlier. You're going to have a detailed rubric of the assessment process, and you're going to give this rubric to the student. This is not the same thing as handing the student a written test before they take the test and saying, "Hey, read through this so you know what's on it so you can put the answers in your head." We're not assessing in that way. You're basically letting the student see the criteria that are going to be used to assess their performance, to assess their skills. You may be thinking, "Why in the world would I want to let the student know what they're being assessed on?" Well, the fact of the matter is, it's only fair. I don't think you will find anyone who does not want to know what they're going to be evaluated on. If somebody came to you and said, "Oh, gee, by the way, you failed this part of your assessment," and you had no idea you were being assessed on that, the likelihood is you'd be upset.

Providing the rubric helps to: Define roles in the assessment process. Identify how the assessment works. Identify what is going to be assessed. Reduce uncomfortable situations.

In this particular case, sharing this with the student and explaining it to the student will do the following things: Number 1, it keeps all parties aware of their role in the assessment process; it lets you know this is what you're supposed to do, and it lets the student know this is what I'm supposed to be doing. It identifies for everyone how the assessment works. It identifies what is going to be assessed. Ultimately, it's going to reduce uncomfortable situations that may arise – or, I can promise you, will arise – when the assessment results don't match the student's expectations. If you decided to add a piece to the rubric but didn't share with the student that it was added, they may in fact not be too happy about that, and you could have a confrontation with the student. Things become uncomfortable because you hadn't let them know that this is part of the rubric. Having all parties completely informed really is important.

Types of Assessment Formative: Continuous. Formal. Informal. Promotes growth and progress.

What types of assessments do we have? First off, there's formative. Formative assessments are basically continuous types of assessment. They are ongoing – a once-a-week or every-day type of thing. In this continuous process, formative assessments can be formal or informal. The goal of a formative assessment – and this is really important, and the biggest challenge that I've experienced – is to help the student grow and make progress. With formative assessments, we need to be providing feedback that is going to help the student grow and progress toward the goals of the program. We do it in both a formal and an informal way.

Summative: Time based. Formal. Used for grade assignment.

Besides that formative assessment, we have the summative assessment. The easiest way to look at the summative assessment is that it's time based. It's a very formal assessment, and generally speaking, it's going to be used for grade assignment. Your summative assessments are something like, OK, here's your midterm progress report, here is your end-of-semester progress report, and it sums up everything. Whereas your formative assessments are going to be the evaluations, or assessments, of the student throughout the entire time that they're with you. If we look at formative assessment a little bit more, ideally it is continuous – and this is a challenge, because as the clinical instructor, people are going to call you, interrupt you, pull you away from that student. Things are going to be interrupted, you may have one student who gets more time than another, and it's a juggling act to balance this all out and make sure that you're fair and equitable to everybody that you have in your department.

Formative Assessment Formative assessment occurs continuously and with continuity.

Ideally, formative assessment occurs continuously and with continuity. Continuity means everything is linked together; they're not all spread apart. You're doing this in order to provide the student with helpful feedback regarding their clinical performance. If the feedback is not going to be helpful, I don't think you should give feedback to the student. We want to give them feedback that's going to help them get better and to improve their performance.

The goal of a formative assessment is to help the student reach a goal. The goal of formative assessment should be to help the student reach a goal. Your feedback shouldn't be "You're not reaching your goal" and nothing more; it should be "This is what you're going to need to do to reach your goal."

Two ways to deliver feedback: Informal. Formal.

As an instructor you're going to have two ways in which to deliver the feedback to the student: You can deliver it informally, and you can deliver it formally. Both work very well, when we talk about this.

Informal Feedback Informal assessment is a teachable moment.

If I talk about informal feedback: informal assessment is not done at a specific time but is done when an opportunity presents itself, allowing the instructor to assess the student in a spontaneous situation that quite likely can't be replicated. This is one of those teachable moments.

Unique opportunity for feedback. You've gone over to the emergency room with the student and a portable machine. The requisition says "PA hand," and the indication is "patient fell." When you get there, you discover that the examination is going to be very challenging, so you walk through this exam with the patient and with the student. At that point, you're able to really provide feedback to the student – maybe they forgot to put the RAD bag over the cassette and the hand is very light. It's a very informal feedback point.

Spontaneous situation unlikely to be replicated. In another scenario, you're working with a student during their performance of an uncommon exam, a double-contrast barium enema. You have the opportunity here to provide feedback at specific points during this exam. Your feedback with them is very spontaneous. Maybe they're doing zygomatic arch views, or maybe it's axiolateral mandibles. In either case, in this situation with the informal feedback, it's providing you with an opportunity to help the student get better. The other piece here that you need to realize is that it's not likely you're going to replicate that scenario. That's why it becomes informal – how many double-contrast barium enemas come through your department that you're going to be able to have with all students? Or how many times do we have zygomatic arches come through, or axiolateral mandibles, or mastoids, for that matter? In either case, it's these unique opportunities where we're able to spontaneously work with the student. You want to make anecdotal notes regarding the interaction with the student and do a little summary at the end of the day to say, "OK. Let's go back and revisit what I did with you earlier today."

Formal Feedback Formal assessment is done at a specific time.

Done in a specific situation. Formal feedback is different. This formal feedback or formal assessment is done at a specific time and in a specific situation that allows the instructor to assess the student in a relatively controlled situation, one that quite often can be replicated with the same student or a different student. What do we mean by this? The scenario here would be that you spend the day with each student in the radiographic room that's dedicated to outpatient chest radiography. The only exam done in this room will be chest radiographs. The equipment is the same. Everything in the room is the same; all the interfaces are the same. The only thing that's going to change happens to be the patient. In this particular instance, you can basically count on patients who match up with the same level of difficulty throughout the day, so you can effectively evaluate how each of your students does.

The situation can be replicated. Importantly, it's a very specific situation that you are relatively in control of. It's not taking the portable machine over to the emergency department during a trauma exam, because no two traumas are going to be alike. When we look at your radiographic room that's dedicated to outpatient chest radiography, there's a lot of consistency in the types of patients that you're going to see coming through there.

-----------------------------------------------------------------------------------------------

Slide 7 Timing of Feedback

Establish a time and place to deliver feedback. Let's talk about timing of feedback, because this entire assessment process is only going to be effective if we provide the student with good, timely feedback. You need to establish a timeline of when and where you are going to provide feedback to the student. Timing is critical because you want it to be as close to the moment of the assessment as possible, so that everything is fresh in everybody's memory. But you also need to be concerned about where you are going to provide this feedback. You don't want to provide feedback – especially feedback that could be construed as negative – in an environment surrounded by a bunch of people. We want to have some privacy when we deliver feedback to the student. Certainly, don't deliver your feedback in front of the patient unless it's something like, "You did a wonderful job. That was very good," as opposed to, "Well, you completely misplaced these items and we're going to have to redo those images." You want to make sure that when and where you give feedback is appropriate.

Be consistent when delivering feedback. The other thing is the continuity and consistency. Continuity and consistency are important when delivering feedback in order to help the student grow. If you assess me in the first week of the semester and don't tell me about that assessment until four weeks later, the likelihood is whatever bad behaviors I demonstrated in week one have now become fully ingrained by week four, so I'm going to have to unlearn all of those bad habits. That can be particularly challenging. For you, the continuity and consistency in delivering the feedback is very challenging to achieve.

Develop a realistic timeline to deliver feedback. As you look at what you're going to do for your assessments of students during their clinical rotations at your site, you need to be realistic about what you can deliver. It's fine to develop rubrics ranging from very simple to very elaborate and to have a very elaborate formative assessment process with both informal and formal components. But if you don't have the time to do that with all of your students, you're going to end up not being consistent in the application of your assessments across students. You will get frustrated, and the students will get frustrated, because you haven't been consistent. The continuity piece is important because the continuity of the experience and the continuity of the feedback are really going to help that student grow.

Content of Feedback Ask the students how they think they are doing.

What needs to be in your feedback? This is important: feedback is more about helping the student grow than pointing out all the things they've done wrong and have to fix. It's also about the things they've done right and should continue. Ask the student how they think they're doing. Say, "How do you feel about this past week's performance? Tell me about your strengths this week. Tell me about your weaknesses this week. What opportunities did you have to grow?" Ask them if there are any barriers to their being able to grow. Start off by having them feed back to you what they think they did.

Base your feedback on direct firsthand observation. Next in line, you really need to base your feedback on direct, firsthand observation. Not everyone in the department is going to have the same level of evaluation skill that you have. When you're going to assess the student, sit the student down and provide them with feedback, the best feedback you can provide is that which is based on what you saw firsthand. You never know what may have occurred between a student and a technologist in the department. If you're basing a lot of your feedback on information coming from the staff in the department, you need to make sure that it's valid and reliable information.

Refer to goals you established with the student. Next in line, you want to make sure that you refer back to the goals that you established with the student before it all started. Remember, at the beginning of that week, or at the beginning of that semester, you said to the student, "These are the goals. These are the things that you're going to need to be able to do." And throughout that timeframe, you've done your assessments to help the student achieve that goal. Now, the thing you have to realize is that you don't know whether that student is going to do positive things with the feedback – in essence, improve themselves – ignore your feedback and make no change in their behaviors, or decide to do things that are actually detrimental to their being able to achieve the goals. But it's important that you review the goals with the student.

Avoid using judgmental terminology. When you provide your feedback to the student, it's best to avoid judgmental terminology when you describe the student's performance – in essence, avoid saying, "Well, you were completely wrong in the placement of that central ray, and because you did that, bad things are going to happen to the patient." In this circumstance, when you talk about wanting to help the student change their behaviors, stating it in a way that does not cause the student to become defensive is really important. Instead of saying, "Well, you picked the wrong size cassette and that's why your images were poor," it would be better to say, "A better choice in cassette size …" or "A better choice for placing the central ray would be here … in this particular circumstance." Remember, in this process you're providing feedback regarding the student's cognitive, or thinking, skills. You're providing feedback regarding their affective behaviors, their feelings and their attitudes. And you're also providing feedback on their hands-on ability: how well they did applying, in the hands-on environment, the skills they've filed away. For the affect, you might say, "You may find that when dealing with distraught patients, or upset patients, or patients who walk slowly, this particular approach works better," vs. saying, "What you did was wrong," which, automatically, in most instances, is going to cause somebody to become defensive. Once you become adversaries, your feedback isn't going to be heard. The student is going to basically block you out and become very upset, and at that point you are no longer really in the role of instructor providing assessment feedback. You're an enforcer. You're pointing out problems.
Yes, there are going to be times when you're going to talk about things the student has done wrong, but we also want to talk about times when the student has done things right. But add in, "In order to perform this examination better, you could perform it this way." The other thing about this feedback: show the correct way of doing something, and try not to dwell on what was done wrong. Inevitably, what happens when you have a student who isn't meeting their goals is that your list of items becomes everything they've done wrong, and you haven't created the list of what they've done right. It's really important that the person can realize, "I can do things right, but I'm also doing things wrong." You want to work with them to try to get them to do things the right way. Showing the proper way to do it, the correct way to do it, the alternative way to do it that will achieve the good image, is really important.

Give specific feedback. The other thing is that you really need to make sure you are specific in what you say and stay away from generalizations. What I mean by that is, "Well, you're always drinking coffee in the morning." "Well, no, I don't always drink coffee in the morning." What you want to say is, "I've noticed that on Mondays you drink four or five cups of coffee before you even go get a patient. But on Tuesday, Wednesday, Thursday and Friday, you are in here first thing, bright and early, ready to go, no coffee. Can you explain, or help me understand, what's going on Monday mornings that you need to have five cups of coffee before you're able to go in and work with your patients after the start time of the day?"

Limit the amount of feedback. The other thing – and this is critical, and this applies to everybody – is to limit the amount of feedback you give. Don't create a list of 25 items that the student needs to work on. You will overload their capability to process the information you've given them. Too much feedback is not a good thing. It may be a situation where you break it up and say, "Let's talk about your affective behavior," or "Let's talk about your thinking during this procedure," or "Let's talk about your hands-on performance. While pushing the stretcher down the hallway, you managed to hit the wall two times. You'll want to learn how to lock the wheels so that you can steer it," and those types of things. But to hand the student a 25-item list of all the feedback will certainly be overwhelming for the student, and possibly a very daunting task for you to get through. Also, have the student talk about what they do when they have to complete similar studies. Say you've gone in and evaluated them doing a portable chest in the ICU. You've done your assessment. Before you go over the assessment with them, you inquire, "So tell me how you normally approach the ICU patient who's on the ventilator with multiple central lines." After the student has explained what they've done, you then go through your feedback regarding their performance through that procedure.

Summative Assessment A formal assessment completed at the end of a module, rotation or course.

At this point, let's take a look at summative assessment. When we talk about summative assessment, it's basically a summary. We're going to wrap things up for the student, so it's a formal assessment completed at the end of a module, the end of a rotation, or the end of a course, and you're going to use it to determine a grade.

Often easier to complete than formative assessments. Oftentimes what you're going to find with summative assessments is that they're easier to complete than the formative assessments, because each is a final assessment vs. an ongoing one. You're going to put it all together and say, "Here we are," and you don't have the extensive amount of time, with continuity and consistency, that you have with formative. In this case your final assessment is of the course, and you're summarizing all of your feedback and all of the student's performances at the end. You can identify trends at this point and say, "I met with you at the beginning of the semester, and we talked about making sure you have accurate shield placement and collimation and these other items, and we've reached the end of the semester and all of these things have improved," which lets you know that your feedback was useful to the student. Or you may have a student with whom you talked about shield placement and collimation and marker placement and central ray alignment, and it has not improved, even though you have pointed these things out throughout the entire semester while you've met with them. That's important because you've documented your formative assessment process; the documentation of student performance runs throughout the semester. If you can consistently document that the student is performing or not performing, when it comes to the end of the course, the assignment of that grade is much easier. No one ever complains about the A grade. I've yet to have a student ever complain about receiving an A. But any time a student receives the grade of F, they're immediately in your office saying, "How did I possibly fail this course?" That is when you're able to pull out all of your formative assessments and document for the student – and possibly during a grievance hearing about the grade – that this is the level of performance the student achieved by the end of the course. We provided this feedback consistently and with continuity during the entire semester. They were not unaware of their lack of performance. And as a result, this is the grade that they earned.

-----------------------------------------------------------------------------------------------

Slide 8 The icons on the following page link to audio recordings from fellow educators. Click on each icon to listen to the recording. Once you have listened to all recordings, click on the continue button to go to the last section of this module.

-----------------------------------------------------------------------------------------------

Slide 9


-----------------------------------------------------------------------------------------------

Slide 10 “Your Students will rise to your expectations.”

“My single greatest bit of advice to a new educator is to let them know that your students will rise to your expectations. Treat them with respect and they’ll show you the same. Try different ways of teaching ideas. Students like variety, but remember that you’re not there for their entertainment. Let them help with the learning process.”

-----------------------------------------------------------------------------------------------


Slide 11 “It was really hard to take someone’s dream from them.”

“The day I had to dismiss a student. I felt so bad for the student, but the kindest thing to do was to just tell them that this was really not the field for them. To string the student along would have been wrong. It would have been wrong for the student, it would have been wrong for me, and it would have been wrong for the patients the student would have treated. It was really hard to take someone’s dream from them.”

-----------------------------------------------------------------------------------------------

Slide 12 “I kind of felt like I let the student down.”

“My least favorite experience as an educator was my first test failure. I kind of felt like I let the student down.”

-----------------------------------------------------------------------------------------------


Slide 13 Panel Discussion

Kevin: It’s time once again to spend a few minutes talking with our panel of subject matter experts. I’d like to start by thanking Andrew for his very thoughtful presentation on assessment of clinical instruction. All of you on the panel have been associated with your educational programs for several years, and I’m sure over that period of time you’ve had many clinical instructor positions turn over. Andrew, I’d like to address my first question to you. What do you see as the most common error new clinical instructors make when assessing student performance in the clinical setting?

Andrew: What I’ve found to be the most common error is that the new clinical instructor, generally speaking, is too nice. They’re really not willing to point out to the student the areas where they need to improve and the areas where they’re strong. Everything is always good, instead of saying, “This is what you’re doing well; these are the things you really need to improve on,” and being very specific about how they should improve.

Kevin: Barbara, do you have that same situation?

Barbara: Yes, we have too-nice clinical instructors. We also have the other extreme, the ones who grade rather harshly. A lot of times these instructors need to learn, in the beginning, how to appropriately judge or evaluate the student for the level they’re at.

Kevin: Angie, when do you discover that there is this real separation between the too-hard and the too-easy clinical instructor?

Angela: You usually find out from the students. Either you hear directly from the student who is being handled very critically, because they’ll come to you thinking they’re a huge failure, or you hear it by listening to groups of students comparing their experiences at the different clinical sites. “I wish I was at that site because the instructor is way easy,” or “Man, I’m glad I’m not at Hospital ABC, because that instructor’s really mean.”

Kevin: Nancy, in your situation, what kind of techniques do you recommend to overcome this?

Nancy: There are a number of different things you can do. When evaluating performance, I like to take not just a personal perspective but to get other people’s input: what do the other staff think? It gives the evaluation more credibility, and it might help with both the nice guy and the harsh grader, because it balances things out. A broad perspective is better for the clinical instructor who is going to be evaluating performance.

Kevin: Angie, you mentioned before the importance of having the clinical instructor learn the tools that are used for evaluating student performance.

Angela: Yeah, exactly. It’s really important that, up front, the clinical instructor, especially if they’re new, takes the time to learn the tools, know the objectives, understand the grading criteria and know the timelines for when the evaluations need to be completed.

Kevin: That brings me to my next question. I’m sure for many clinical instructors who are new to the position, it’s hard to get a sense of the criteria used in the grading rubrics, especially rubrics that were created by somebody else. Nancy, what approach do you use to help the clinical instructor in this area?

Nancy: The program has an obligation to orient the new clinical instructor, especially if you want them to use the tools correctly. When you don’t have control over what the tool is, as an educator, it’s difficult to adopt somebody else’s criteria. I really think you have to do an orientation to let the clinical instructor know what that grading rubric means and how to use it.

Kevin: Andrew, I know you have many clinical affiliates and a lot of different clinical instructors. What’s your approach to helping the clinical instructors understand the metrics used in your grading rubrics?

Andrew: Most of the time, actually all of the time, because I’m out in the clinical setting once a week, I’ll stop and make sure there aren’t any unanswered questions, go over the evaluation tools, talk with them about student performance and provide them with examples of what would warrant a particular score.

Kevin: Barbara, you indicated before that you conduct workshops for your clinical instructors.

Barbara: Yes. Every fall we have a workshop for our clinical instructors who are out in the hospitals. They send in requests for things to be covered, and we also try to cover all of the paperwork to make sure they understand how the grading process works. It’s a good review, even for the ones who have been doing this for a long time.

Kevin: Angie, I know you’ve moved from paper to an electronic format for your clinical assessments. What strategies have you used?

Angela: Because of the electronic records, you’ll find that the fields are small, so the criteria can’t be spelled out in full on the electronic form. We came up with handbooks. We have two separate ones: one for the staff, covering the evaluations the staff may have to do at the clinical site; the other is a more detailed manual for the clinical instructors. Every evaluation and assessment tool is explained in great detail, the objectives are spelled out completely and the grading criteria are defined. Copies are also included in our student manual, so the students are on the same page as the clinical instructors.

Kevin: Have you found that to be helpful in terms of consistency?

Angela: It’s helped a lot since we first started with the electronic records, yes.

Kevin: Nancy, you mentioned that you use an on-campus computer lab to help orient your clinical instructors. Can you give us an overview of that?

Nancy: We, like Angie, have changed to an electronic clinical record. All the clinical performance evaluations are online. But the difficulty is in using it, or knowing how to use it. We bring the clinical instructors in for meetings, and part of one of those meetings each year is to go into the computer lab and open up the system. We have practice students in there (they’re not real students), so the CIs can actually complete an evaluation and see how the grade is going to turn out. That way they don’t have to experiment with real, live students and try to make adjustments. It gives us an opportunity to help them use the performance assessment correctly and without making any changes that shouldn’t be there.

Kevin: One of the things we talked about previously in our presentation was the difficulty, especially for new clinical instructors, in balancing their time between the clinical instructor role and the staff technologist role. Oftentimes new clinical instructors find themselves at the end of a grading period without having had enough time with a given student, or without having observed the student frequently enough to judge their performance or its growth. Andrew, what strategies do you think would best help a new clinical instructor who is coming to the end of a grading period and hasn’t observed somebody enough to make an honest appraisal of their performance?

Andrew: At that point you want to work with that clinical instructor to develop a core group of technologists in the department who can be their eyes and ears regarding a student’s performance. You don’t want to use everybody, because that dilutes whatever information you’re going to get back. You get a core group and say, “These are the things I need you to watch for with the students.” That way, if you haven’t had as much one-on-one time as you’d like, you can go to these core technologists and ask, “Tell me about this student’s performance,” and get some good, objective feedback. And last but not least, if in doubt, you can squeeze in some one-on-one time at the end of the semester just to verify how accurate your core people are in the information they give you about that student.

Kevin: Angie, you’ve stressed the idea of building relationships in the clinical setting. Can you expand upon that for us?

Angela: It’s really important that the clinical instructor has a relationship with each and every staff member at that clinical site, so that there’s an open, honest dialogue between them, where the staff member can feel comfortable giving honest feedback to the clinical instructor


knowing that it’s going to be kept confidential, and where the clinical instructor feels they can approach the staff member and ask questions about a student’s performance.

Kevin: Nance, what problems do you see when there are gaps in the clinical instructor’s ability to observe or critique the students, and they then rely on the staff as the primary source of evaluations?

Nancy: The problem sometimes is that staff don’t critically evaluate; they like to be the nice guy, so the feedback doesn’t help with improvement and doesn’t tell the clinical instructor how things should be done. It’s really important that we look at that, help evaluate it and get some consistency.

Kevin: Barb, in your program, what technique do you use to reinforce that kind of relationship between the staff and the clinical instructor in the clinical setting?

Barbara: Our clinical coordinator, like the others here, goes out frequently, every week, to different sites. Going to staff in-services occasionally helps. Maybe they have a monthly staff meeting, and our clinical coordinator can go out and do a quick overview of the grading and how to evaluate students in that general staff meeting.

Kevin: The last area I’d like to cover is especially for the new clinical instructor. There are times when they feel like they’re on an island all by themselves, physically separated from the educational program. What techniques or strategies do you employ in your program to help the new clinical instructor feel connected to the educational program? Let’s start with Angie.

Angela: We like to reinforce that we want to keep the lines of communication open. We make sure new clinical instructors know that they can call the full-time faculty anytime, send them an e-mail, call us at home if they need to, and know that we’ll get back to them in a timely manner. And if they have questions, we’re there for them.

Kevin: Barb?

Barbara: We like to have monthly meetings with all the clinical instructors from all of our hospitals. That way they can socialize with each other, talk about similar problems and different problems, and get opinions from other hospitals about how they do things. It gives them a better feel for the college, because we from the college attend these meetings too, so it’s more collegial.

Kevin: Do you serve food?

Barbara: Sometimes we bring cookies.

Kevin: One of the greatest things you can do to make for a positive meeting is to serve food or beverages.

Barbara: Yes.

Kevin: Nancy, how about your program?


Nancy: Our program, similar to Barbara’s, has monthly clinical instructor meetings. And because of the size of the program, we have three to four clinical coordinators. The coordinators also meet monthly to coordinate and make sure the clinical instructors are all getting the same information, even outside of our clinical instructor meetings. The other component we use, in a mentoring or coaching fashion, is that we always provide the clinical instructors with a list of all the clinical instructors and their contact information. This way we can have some peer mentoring and networking going on. In the meantime, if something goes wrong or a CI has questions, especially if they are new, they have a contact person besides somebody at the college, someone they might feel more comfortable discussing the issue with.

Kevin: Andrew, with your large program, what kind of techniques do you employ?

Andrew: We have a Web page that we keep updated so the staff can go to it at any point in time. I have a dedicated clinical cell phone that staff can call at any time, as well as e-mail, and then weekly, or possibly biweekly, visits. I don’t think we can have too much communication. In essence, the more we’re out there to support the clinical staff, the better that clinical experience is going to be. We don’t want anybody to be out there feeling, “I’m here alone; nobody’s back there to support me,” because then things tend to fall apart. The Web, e-mail, text messaging, phone calls. And they don’t necessarily have to call just me; they can call the program director or any of the other faculty in case they can’t get in touch with me. Just to make sure the lines of communication are always open.

Kevin: I want to thank you all very much for your comments. This brings our panel discussion for this module to a conclusion.

-----------------------------------------------------------------------------------------------

Slide 14 Bibliography Dreyfus HL, Dreyfus SE. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York, NY: The Free Press; 1986.

-----------------------------------------------------------------------------------------------


Slide 15 Production Credits

-----------------------------------------------------------------------------------------------

Slide 16 Clinical Instructor Academy Assessment of Clinical Performance Close this window to return to your study area.

-----------------------------------------------------------------------------------------------