Transcript of: Automatic Recognition of Surgical Motions Using Statistical Modeling for Capturing Variability (17 slides)

Page 1:

Carol E. Reiley¹, Henry C. Lin¹, Balakrishnan Varadarajan², Balazs Vagvolgyi¹, Sanjeev Khudanpur², David D. Yuh³, Gregory D. Hager¹

¹ Engineering Research Center for Computer-Integrated Surgical Systems and Technology, The Johns Hopkins University
² Center for Speech Language Processing, The Johns Hopkins University
³ Division of Cardiac Surgery, The Johns Hopkins Medical Institutions

MMVR January 31st, 2008

Automatic Recognition of Surgical Motions Using Statistical Modeling for Capturing Variability

Page 2:

Introduction

• Our Goal

• Automatically segment and recognize core surgical motion segments (surgemes)

• Capture the variability of a surgeon’s movement techniques using statistical methods

Page 3:

Introduction

• Given a surgical task, a single user tends to use similar movement patterns

(Lin et al., MICCAI 2005)

Page 4:

Introduction

• Different users show more variability when completing the same surgical task

• Our goal is to distinguish core surgical motions from error/unintentional motion

Page 5:

Related Work

Low level surgical modeling: Imperial College-ICSAD

High level surgical modeling: University of Washington-Blue Dragon

Low level surgical modeling: MIST-VR

• Prior work focuses on surgical metrics for skill evaluation

• High level (applied force and motion)

• Low level (motion data)

• Our work aims to automatically identify fundamental motions

Page 6:

Our Approach

• Surgeme: elementary portions of surgical motion

[Example surgemes shown: Reaching for Needle, Positioning Needle, Pull Suture with Left Hand]

Page 7:

Motion Vocabulary

Label   Description
A       Reach for Needle (gripper open)
B       Position Needle (holding needle)
C       Insert Needle/Push Needle Through Tissue
D       Move to Middle With Needle (left hand)
E       Move to Middle With Needle (right hand)
F       Pull Suture With Left Hand
G       Pull Suture With Right Hand*
H       Orient Needle With Two Hands
I       Right Hand Assisting Left While Pulling Suture*
J       Loosen Up More Suture*
K       End of Trial, Idle Motion

*Added based on observed variability of technique
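For illustration, the vocabulary above can be written down as a simple label-to-description mapping; this is a hypothetical Python encoding, not the authors' code.

    # Hypothetical encoding of the surgeme vocabulary (not the authors' implementation).
    SURGEMES = {
        "A": "Reach for Needle (gripper open)",
        "B": "Position Needle (holding needle)",
        "C": "Insert Needle/Push Needle Through Tissue",
        "D": "Move to Middle With Needle (left hand)",
        "E": "Move to Middle With Needle (right hand)",
        "F": "Pull Suture With Left Hand",
        "G": "Pull Suture With Right Hand",                     # added for observed variability
        "H": "Orient Needle With Two Hands",
        "I": "Right Hand Assisting Left While Pulling Suture",  # added for observed variability
        "J": "Loosen Up More Suture",                           # added for observed variability
        "K": "End of Trial, Idle Motion",
    }

    # Numeric indices are convenient for classifiers that expect integer targets.
    LABEL_TO_INDEX = {label: idx for idx, label in enumerate(sorted(SURGEMES))}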

Page 8:

Our Approach

[Block diagram, "Extraction of Structure": Signal Processing → Feature Processing → Classification/Modeling]
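As an illustration of how these stages might chain together on a stream of kinematic frames, here is a minimal sketch; the stage bodies are placeholder choices, not the authors' implementation.

    import numpy as np

    def signal_processing(frames):
        # Placeholder: normalize each kinematic channel to zero mean, unit variance.
        return (frames - frames.mean(axis=0)) / (frames.std(axis=0) + 1e-8)

    def feature_processing(frames, projection):
        # Placeholder: project each frame to a lower-dimensional space
        # (the slides use LDA for this step).
        return frames @ projection

    # Toy example: 100 frames of 12 hypothetical kinematic values, projected to 3-D.
    rng = np.random.default_rng(0)
    raw = rng.normal(size=(100, 12))
    projection = rng.normal(size=(12, 3))    # stand-in for a learned LDA basis
    features = feature_processing(signal_processing(raw), projection)
    # 'features' is what the classification/modeling stage consumes.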

Page 9:

Data Collection
The da Vinci Surgical Robot System

Courtesy of Intuitive Surgical

With the increasing use of robotics in surgical procedures, a new wealth of data is available for analysis.

Recorded parameters at 23 Hz (patient and master side):
• Joint angles, velocities
• End effector position, velocity, orientation
• High-quality stereo vision
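To make the recorded stream concrete, one per-frame sample might be organized as below; the field names, shapes, and the idea of flattening everything into a single feature vector are assumptions for illustration, since the slide only lists the parameter categories.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class KinematicSample:
        # One 23 Hz sample from one manipulator (patient or master side).
        # Field names and shapes are illustrative, not the actual API layout.
        joint_angles: np.ndarray       # per-joint angles
        joint_velocities: np.ndarray   # per-joint velocities
        tip_position: np.ndarray       # end effector (x, y, z)
        tip_velocity: np.ndarray       # end effector (vx, vy, vz)
        tip_orientation: np.ndarray    # end effector rotation (e.g. 3x3 matrix)

        def as_vector(self) -> np.ndarray:
            # Flatten everything into the per-frame feature vector consumed by
            # the LDA/GMM/HMM classifiers on the later slides.
            return np.concatenate([
                self.joint_angles, self.joint_velocities,
                self.tip_position, self.tip_velocity,
                np.ravel(self.tip_orientation),
            ])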

Page 10:

Experimental Study

Subject   Medical Training   Da Vinci Training   Hrs
1         -                  -                   10-15
2         -                  -                   100+
3         X                  X                   100+
4         -                  X                   100+
5         -                  X                   <10
6         -                  X                   <10
7         -                  -                   <1

• Users had varied levels of experience

• Each user performed five trials

• Each trial consisted of a four-throw suturing task

Page 11:

Classification Methods

• Linear Discriminant Analysis (LDA) with Single Gaussian

• LDA + Gaussian Mixture Model (GMM) (see the sketch after this list)

• 3-state Hidden Markov Model (HMM)

• Maximum Likelihood Linear Regression (MLLR)

• Supervised

• Unsupervised
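A minimal sketch of the LDA + GMM variant from the list above, using scikit-learn as a stand-in for the authors' implementation; the reduced dimensionality, mixture count, and diagonal covariances are assumptions, not values reported in the work.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.mixture import GaussianMixture

    def fit_lda_gmm(X, y, n_lda_dims=3, n_mixtures=2):
        # X: (n_frames, n_features) kinematic feature vectors; y: per-frame surgeme labels.
        # Project frames with LDA, then fit one GMM per surgeme on the projected data.
        lda = LinearDiscriminantAnalysis(n_components=n_lda_dims).fit(X, y)
        Z = lda.transform(X)
        classes = np.unique(y)
        gmms = {c: GaussianMixture(n_components=n_mixtures, covariance_type="diag",
                                   random_state=0).fit(Z[y == c]) for c in classes}
        priors = {c: float(np.mean(y == c)) for c in classes}
        return lda, gmms, priors

    def predict_lda_gmm(X, lda, gmms, priors):
        # Assign each frame the surgeme whose GMM gives the highest posterior score.
        Z = lda.transform(X)
        classes = sorted(gmms)
        scores = np.stack([gmms[c].score_samples(Z) + np.log(priors[c]) for c in classes],
                          axis=1)
        return np.array(classes)[np.argmax(scores, axis=1)]

Functions like these would simply be plugged into the cross-validation protocols on the Results slides; the 3-state HMM adds temporal structure over comparable per-surgeme observation densities.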

Page 12:

Results

• Leave-one-trial-out cross-validation per user (protocol sketched after these bullets)
• MLLR not applicable
• Percent classifier accuracy (average):
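The per-user, leave-one-trial-out protocol could be organized as in the sketch below; scikit-learn's LeaveOneGroupOut is used purely as a convenient grouping utility, and fit/predict stand in for any classifier from the previous slide.

    import numpy as np
    from sklearn.model_selection import LeaveOneGroupOut

    def leave_one_trial_out_accuracy(X, y, user_ids, trial_ids, fit, predict):
        # For each user, train on four trials and test on the held-out trial,
        # then average frame-level accuracy over all folds and users.
        accuracies = []
        for user in np.unique(user_ids):
            m = user_ids == user
            Xu, yu, trials = X[m], y[m], trial_ids[m]
            for train_idx, test_idx in LeaveOneGroupOut().split(Xu, yu, groups=trials):
                model = fit(Xu[train_idx], yu[train_idx])
                predictions = predict(model, Xu[test_idx])
                accuracies.append(np.mean(predictions == yu[test_idx]))
        return float(np.mean(accuracies))

Averaging per-fold frame accuracies in this way is one route to a single "percent classifier accuracy (average)" number.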

Page 13:

Results

• Example classifier result compared to manual segmentation

Page 14:

Results

• We repeated the analysis, this time leaving one user out

• Supervised: Surgeme start/stop events manually defined

• Unsupervised: Surgeme start/stop events automatically derived

[Bar chart: average classification accuracy (%) by statistical method, leave-one-user-out. LDA 67.21, GMM 67.49, HMM 67.62, sup. MLLR 70.94, unsup. MLLR 70.34; y-axis 65-72]
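For context on the two MLLR conditions: MLLR adapts the Gaussian means of the trained surgeme models to the held-out user via an affine transform estimated from that user's data. This is the standard MLLR formulation, not a detail stated on the slide:

    \hat{\mu}_k = A\,\mu_k + b,
    \qquad
    (A, b) = \arg\max_{A,\,b} \; p\!\left( O_{\text{adapt}} \;\middle|\; \left\{ \mathcal{N}\!\left(A\mu_k + b,\; \Sigma_k\right) \right\} \right)

Supervised MLLR uses the manually defined surgeme start/stop events of the adaptation data; unsupervised MLLR derives them automatically, typically from the unadapted model's own output.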

Page 15:

Conclusions

• Preliminary results show the potential for identifying core surgical motions

• User variability has a significant effect on classification rates

• Future work:

• Use contextual cues from video data

• Filter class decisions (e.g., majority vote) to eliminate class jumping (see the sketch after this list)

• Apply to data from live surgery (e.g., prostatectomy)
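A sliding-window majority vote over frame-level labels, as suggested in the filtering item above, could look like the following sketch; the window length is an arbitrary illustrative choice.

    import numpy as np
    from collections import Counter

    def majority_vote_filter(labels, window=15):
        # Replace each frame's label with the most common label in a centered
        # window, suppressing brief spurious switches between surgeme classes.
        labels = np.asarray(labels)
        half = window // 2
        smoothed = labels.copy()
        for i in range(len(labels)):
            lo, hi = max(0, i - half), min(len(labels), i + half + 1)
            smoothed[i] = Counter(labels[lo:hi].tolist()).most_common(1)[0][0]
        return smoothed

Short excursions into another class are replaced by the surrounding label, which addresses the brief "class jumping" between surgemes.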

Page 16:

Acknowledgements

• Intuitive Surgical

• Dr. Chris Hasser

• This work was supported in part by:

• NSF Grant No. 0534359

• NSF Graduate Research Fellowship

Page 17:

References