
Scientia Iranica B (2019) 26(6), 3369–3374

Sharif University of Technology
Scientia Iranica
Transactions B: Mechanical Engineering
http://scientiairanica.sharif.edu

Research Note

Simulation in virtual reality: Robotic training and surgical applications

A.J.R. Almusawi (a), L.C. Dülger (b,*), and S. Kapucu (b)

a. Department of Mechatronics Engineering, University of Baghdad, Iraq.
b. Department of Mechanical Engineering, Faculty of Engineering, Gaziantep University, Gaziantep, Turkey.

Received 13 February 2018; received in revised form 24 July 2018; accepted 2 September 2019

KEYWORDS: Robotic Surgery (RS); Robotic Training (RT); Virtual Reality (VR); Virtual Reality Model (VRM); Thoracoscopic surgery.

Abstract. Two case studies are considered in this study: one with a 4-DOF robotic system and the other with a 6-DOF industrial robot arm. Both robot arms are operational in the Mechatronics Laboratory, Gaziantep University. Different motion trajectories were designed and implemented for training, medical tasks, and surgical operations. Simulations were carried out using the VR Toolbox in Matlab®. A virtual reality environment was developed through Simulink with real-time examples. The motions and trajectories necessary for training and surgical applications were directly observed, enabling the training of surgeons with many benefits, such as a shorter training period through greater control during tasks and the possibility of error-free tasks.

© 2019 Sharif University of Technology. All rights reserved.

1. Introduction

In recent years, haptic feedback in robot-assisted surgery and bilateral telemanipulation has been applied. Robotic surgical systems, telerobotics, and virtual fixtures are required to assist a surgeon. Medical simulators or surgical systems based on virtual reality require software engineering, robotics, mechatronic engineering, and microtechnology. Efficient interaction is performed in real time using the senses and skills of the user. Virtual reality can be delivered through visualization platforms such as computer monitors, head-mounted displays, and large-screen projection systems. Virtual training of medical students provides patient safety and an increase in operating room performance. Apprenticeship is vital for surgical operations.

*. Corresponding author.
E-mail addresses: eng [email protected] (A.J.R. Almusawi); [email protected] (L.C. Dülger); [email protected] (S. Kapucu)

doi: 10.24200/sci.2019.50451.1702

The students must spend a great amount of time on qualified training. Different virtual tasks can be assigned to the sensorimotor functions. Surgical training is very expensive; therefore, reliable training is possible with VR technology, which is used as supplementary training material with great promise [1]. Virtual reality is introduced by surgeons to ensure reliable training by improving their surgical skills. Repeated training can be performed with no harm to a patient. Simulators can train nurses, medical students, etc., while requiring no supervision. A platform is therefore provided that causes no harm to a patient, and repeated training can be easily performed. Virtual Reality (VR) based systems have been at play since the early 1980s. Different studies have been conducted in this regard since then, some of which are included here to show their importance in medical training [2].

Khor et al. [3] carried out a study on Augmented Reality (AR) and Virtual Reality (VR) for use in the digital surgical environment. In surgical health care, anatomical evaluation, broadcasting and recording of surgery, telementoring and education, and operational and patient-wise benefits are considered, which are of course subject to limitations.


Al-Mashhadany [4] studied a SCARA robot by modeling and simulating its virtual reality model. Buckley et al. [5] showed that surgical skills, hand-eye coordination, intuitive movements, and the ability to work with images could be improved by simulators; a surgeon can thus be prepared for the operating theatre in a safe environment. Nooshabadi et al. [6] developed an effective surgical training system that simulates tool-tissue interaction by a meshless method. Almusawi et al. [7] performed a simulation study of a robotic arm (4 degrees of freedom) with a Virtual Reality Model (VRM). Then, Almusawi [8] studied haptic teaching and virtual reality for thoracoscopic surgery in his thesis. Later, Almusawi et al. [9] combined the aforementioned ideas using neural networks.

The purpose of this study is to show the interface between an operator and the robotic arm in the virtual world. The paper is structured in four sections. The robotic systems are described in Section 2 together with the Matlab® robotic toolbox. The Virtual Reality Models for the OWI-535 and Denso robot arms are given in Section 3. Conclusions are included in Section 4.

2. Description of robotic systems

Two robotic systems are presented here. The OWI-535 robot arm is an articulated manipulator for performing pick-and-place tasks in a laboratory environment. This system was built at the Mechatronics Laboratory, Gaziantep University. The arm has four revolute joints with three links and a gripper. All joints are actuated by five small DC motors; specifically, the base rotates 270°, the shoulder 180°, the elbow 300°, the wrist 120°, and the gripper opens 1.7 inches [10]. The OWI-535 robot is shown in Figure 1. Some modifications were applied to the OWI-535 robot arm to meet possible requirements during the laboratory tests by integrating angular position sensors with each joint. A computer interface system was designed to drive the 5 DC motors using the angular joint positions on a PC. The interface system is a microcontroller board with a USB connection. Control software was then designed with Matlab® GUIDE, covering D-H model execution, forward and inverse kinematics, control, and trajectory generation [11,12].

The Denso 6-axis robot system is a 6-DOF robotic arm. The Denso robot arm has six encoders that measure the angular positions of the six motors. The overall arm length is 420 mm, the payload is 2 kg, and the position repeatability in each of the X, Y, and Z directions is ±0.02 mm. The position detectors are absolute encoders, and the robot weighs approximately 14 kg. The controller is a Quanser custom Ethernet-based unit. The maximum composite speed is 3900 mm/s. Figure 2 shows the world coordinate frame and the joint coordinate frames used to define the kinematics, inverse kinematics, and Jacobian matrix [12].

Figure 1. Robotic systems.

Figure 2. Coordinate frames of the OWI-535 robotic arm links.


2.1. Coordinate representations

The robot manipulator OWI-535 has four links. The coordinate frames of each link are shown in Figure 2. A homogeneous transformation matrix, T_4^0, expresses the position and orientation of the robot:

T_4^0 = \begin{bmatrix} R_4^0 & O_4^0 \\ 0 & 1 \end{bmatrix},    (1)

where R_4^0 is the 3 × 3 rotation matrix and O_4^0 is the position vector of the end-effector in the base frame. The Denavit-Hartenberg (D-H) homogeneous transformation of each link, A_i, is represented as a product of four basic transformations [13]. The homogeneous transformation matrix of the four-axis robot is T_4^0. The D-H parameters of the links are shown in Table 1.

A_i = \mathrm{Rot}_{z,\theta_i}\,\mathrm{Trans}_{z,d_i}\,\mathrm{Trans}_{x,a_i}\,\mathrm{Rot}_{x,\alpha_i},    (2)

T_4^0 = A_1 A_2 A_3 A_4.    (3)
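As an illustration of Eqs. (2) and (3), the short Matlab sketch below builds the single-link D-H transform and chains the four link matrices of the OWI-535 using the parameters of Table 1. It is a minimal, self-contained example in plain Matlab (no toolbox required); the joint angles are arbitrary values chosen for illustration, and the function names are ours, not part of the original software.

function demo_owi535_fk
    % Example joint angles (rad); arbitrary values for illustration only.
    theta = [0.3, -0.5, 0.8, 0.2];
    d     = [7, 0, 0, 0];          % link offsets d_i (cm), Table 1
    a     = [0, 9, 11.2, 11.2];    % link lengths a_i (cm), Table 1
    alpha = [pi/2, 0, 0, 0];       % twist angles alpha_i (rad), Table 1

    T = eye(4);
    for i = 1:4
        T = T * dh_link(theta(i), d(i), a(i), alpha(i));   % Eq. (3): T_4^0 = A1*A2*A3*A4
    end
    disp(T(1:3, 4));               % end-effector position O_4^0 in the base frame (cm)
end

function A = dh_link(theta, d, a, alpha)
    % Standard D-H link transform, Eq. (2): Rot_z * Trans_z * Trans_x * Rot_x.
    A = [cos(theta), -sin(theta)*cos(alpha),  sin(theta)*sin(alpha), a*cos(theta);
         sin(theta),  cos(theta)*cos(alpha), -cos(theta)*sin(alpha), a*sin(theta);
         0,           sin(alpha),             cos(alpha),            d;
         0,           0,                      0,                     1];
end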

The relationship between the joint angles and the position and orientation of the end-effector is described by the kinematics [11]. The Denso robot kinematics is required for the teaching control method, with the transformation matrices used to control the system. The Denso robotic system is a six-DOF serial link arm with six revolute joints. The kinematic analysis follows the coordinate frame assignment of the system. The coordinate frame O_0, X_0, Y_0, Z_0 is fixed to the base and serves as the base frame. The other coordinate frames are attached to the corresponding links. The reference coordinate frame is shown in Figure 3.

The homogeneous transformation matrix represents the position and orientation of the end-effector with respect to the base coordinate frame; T_6^0 is then given for the overall system:

T_6^0 = \begin{bmatrix} R_6^0 & P_6^0 \\ 0 & 1 \end{bmatrix},    (4)

where R_6^0 is the 3 × 3 rotation matrix and P_6^0 is the position vector of the end-effector in the base frame. Forward and inverse kinematics relate the joint angles to the position and orientation of the end-effector, and the Denavit-Hartenberg (D-H) convention is again used to analyze the Denso robot.

Table 1. D-H parameters of links (OWI-535).

Link no.   θ_i     d_i (cm)   a_i (cm)   α_i
1          θ_1     7          0          π/2
2          θ_2     0          9          0
3          θ_3     0          11.2       0
4          θ_4     0          11.2       0

Figure 3. Reference coordinate frame for the Denso robot system.

The single-link homogeneous transformation matrix, A_i, is as follows:

A_i = \mathrm{Rot}_{z,\theta_i}\,\mathrm{Trans}_{z,d_i}\,\mathrm{Trans}_{x,a_i}\,\mathrm{Rot}_{x,\alpha_i},    (5)

A_i = \begin{bmatrix} C\theta_i & -S\theta_i C\alpha_i & S\theta_i S\alpha_i & a_i C\theta_i \\ S\theta_i & C\theta_i C\alpha_i & -C\theta_i S\alpha_i & a_i S\theta_i \\ 0 & S\alpha_i & C\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix},    (6)

where i is the link number, Sθ_i = sin θ_i, Cθ_i = cos θ_i, θ_i is the joint rotation angle, a_i is the link length, α_i is the twist angle, and d_i is the link offset. The system has six links and a gripper. The homogeneous transformation matrix is calculated by the multiplication of matrices in Eq. (7):

T_6^0 = A_1 A_2 A_3 A_4 A_5 A_6 A_{\mathrm{gripper}}.    (7)

A_gripper is the transformation matrix of the gripper. The joint angles and the gripper transformation matrix are used to calculate the position and orientation of the gripper's finger. Table 2 gives the D-H parameters of the Denso robot.

Table 2. D-H parameters of the Denso robot.

Link no.   θ_i     d_i (mm)   a_i (mm)   α_i
1          q_1     125        0          π/2
2          q_2     0          210        0
3          q_3     0          -75        -π/2
4          q_4     210        0          π/2
5          q_5     0          0          -π/2
6          q_6     70         0          0
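As a sketch of how Table 2 maps onto the Robotics Toolbox [13], the lines below build a SerialLink model of the Denso arm and evaluate its forward kinematics and base-frame Jacobian. This assumes Peter Corke's toolbox (release 9/10 syntax, which may differ between versions); lengths are converted from millimetres to metres, the joint configuration is an arbitrary example, and the tool transform stands in for the gripper matrix A_gripper of Eq. (7) with a hypothetical 50 mm offset.

% D-H rows of Table 2 entered as standard links (lengths in metres).
L(1) = Link('d', 0.125, 'a',  0,     'alpha',  pi/2);
L(2) = Link('d', 0,     'a',  0.210, 'alpha',  0);
L(3) = Link('d', 0,     'a', -0.075, 'alpha', -pi/2);
L(4) = Link('d', 0.210, 'a',  0,     'alpha',  pi/2);
L(5) = Link('d', 0,     'a',  0,     'alpha', -pi/2);
L(6) = Link('d', 0.070, 'a',  0,     'alpha',  0);
denso = SerialLink(L, 'name', 'Denso VP6242');
denso.tool = transl(0, 0, 0.05);     % placeholder gripper offset standing in for A_gripper

q   = [0, pi/6, -pi/4, 0, pi/3, 0];  % example joint configuration (rad)
T06 = denso.fkine(q);                % forward kinematics, Eq. (7)
J0  = denso.jacob0(q);               % geometric Jacobian in the base frame
denso.plot(q);                       % quick visual check of the pose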


2.2. Kinematics using the Matlab® robotic toolbox

The robotic toolbox includes many functions for simulating robotic arms. Forward and inverse kinematics, forward and inverse dynamics, and trajectory generation are performed, and trajectories are designed for training and surgical operations. In this study, the toolbox is used for the OWI-535 and Denso robotic arms [11]. The dynamic equation of the robot is generated by the Euler-Lagrange method as follows:

M(q)\ddot{q} + C(q,\dot{q})\dot{q} + G(q) = \tau - B\dot{q},    (8)

where M(q) is the mass and inertia matrix of the n-degree-of-freedom arm; C(q, q̇) is the matrix of centripetal and Coriolis effects; G(q) is the vector of gravitational torques and forces; B is the joint friction matrix; τ represents the joint torques and forces; and (q, q̇, q̈) are the joint positions, velocities, and accelerations. The graphical representations of the OWI-535 and Denso robots generated in MATLAB using the Robotics Toolbox are shown in Figure 4(a) and (b).
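To show where Eq. (8) sits in the toolbox workflow, the fragment below continues the Denso SerialLink sketch given after Table 2: placeholder inertial parameters are attached to each link and the inverse dynamics are evaluated along a quintic joint-space trajectory. The masses, centres of mass, and inertias are assumptions for illustration, not identified values for either robot; the functions named are those of the Robotics Toolbox [13].

% Placeholder dynamic parameters (illustrative only, not identified values).
for i = 1:denso.n
    denso.links(i).m = 1.0;            % assumed link mass (kg)
    denso.links(i).r = [0 0 0];        % assumed centre of mass (m)
    denso.links(i).I = 1e-3 * eye(3);  % assumed link inertia (kg*m^2)
end

q0 = zeros(1, 6);
qf = [0, pi/4, -pi/4, 0, pi/6, 0];
[q, qd, qdd] = jtraj(q0, qf, 50);      % quintic trajectory: positions, velocities, accelerations

tau = denso.rne(q, qd, qdd);           % inverse dynamics: tau = M(q)*qdd + C(q,qd)*qd + G(q)
M0  = denso.inertia(q0);               % mass matrix M(q) at the start pose
g0  = denso.gravload(q0);              % gravity load G(q) at the start pose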

Figure 4. Representation of robotic systems.

3. Building virtual reality model

The robot dynamics is modeled by Simulink blocks and visualized in a Virtual Reality Model (VRM). Figure 5 shows the Virtual Reality Model (VRM) for the OWI-535 robot. The signals are transferred between the Simulink model and the VRM via the 3 × 3 link rotation matrices [13,14]. The rotation matrices are converted to the equivalent rotation-axis and angle form used by the Virtual Reality Modeling Language (VRML) to set the orientation of bodies. The structure of the virtual model is built by defining every node in a tree-object structure, such as box nodes, transform nodes, and material nodes. The trajectories are synchronized with the physical hardware environment of the OWI-535 robot in the Mechatronics Laboratory. The arm was built and then modified with a card for further communication and control. A Virtual Reality Model (VRM) was also prepared, and programs were developed with a Graphical User Interface (GUI). A Simulink model was built for the arm. Similarly, CAD drawings of the Denso robot, the F/T sensor, and the gripper were made in SolidWorks. The CAD model is connected with Simulink to create a visualized Denso system using Simulink 3D Animation.
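The conversion mentioned above, from a link rotation matrix to the axis-angle form that a VRML Transform node expects in its rotation field, can be sketched as follows. This is a minimal plain-Matlab version written for illustration (Simulink 3D Animation also ships vrrotmat2vec for the same purpose, if available); the near-180° case is omitted for brevity.

function rotfield = rotm_to_vrml(R)
    % Convert a 3x3 rotation matrix R into the VRML rotation field [x y z angle].
    angle = acos(max(-1, min(1, (trace(R) - 1) / 2)));   % rotation angle in [0, pi]
    if abs(sin(angle)) < 1e-9
        ax = [0 0 1];   % degenerate case (angle ~ 0; angle ~ pi not handled in this sketch)
    else
        ax = [R(3,2)-R(2,3), R(1,3)-R(3,1), R(2,1)-R(1,2)] / (2*sin(angle));
    end
    rotfield = [ax, angle];
end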

Figure 5. Robot Virtual Reality Model (RVRM).


Figure 6. CAD model of the Denso system.

The virtual model is used to display the teaching path and motion in real time. Figure 6 shows the CAD model of the Denso system together with a surgical operation.

4. Conclusion

Two robotic systems were built and used during the study, and the VRM of both systems was studied. The OWI-535 system was built first for the initial experiments and training purposes. The VRM of the Denso robot was particularly used for thoracoscopic surgery as an example. The Robotics Toolbox was used for the robotic analysis, and kinematic analysis of both systems was performed together with the trajectories necessary during operations. Simulated human models can also be used. Studies of VR continue in the Mechatronics Laboratory for the purpose of rehabilitation. Surgical simulators and VR models will have a great role in microsurgery and nanosurgery.

Acknowledgements

This study was supported by the Scientific Research Projects Governing Unit (BAPYB) under Grant No. MF.14.02 and the Mechatronics Research Laboratory at the University of Gaziantep. We would like to thank them for their help and support. The authors are thankful to Prof. Dr. Maruf Şanlı (Faculty of Medicine, Gaziantep University) for his valuable suggestions during this study.

References

1. Fulcar, V.N. and Shivramwar, M.V. "Applications of haptics technology in advance robotics", IEEE 2010, pp. 273-277 (2010).

2. Staub, C., Can, S., Jensen, B., Knoll, A., and Kohlbecher, S. "Human computer interfaces for interaction with surgical tools in robotic surgery", The 4th IEEE RAS/EMBS Int. Conf. on Biomedical Robotics and Biomechatronics, Italy, IEEE, pp. 81-86 (2012).

3. Khor, W.S., Baker, B., Amin, K., Chan, A., Patel, K., and Wong, J. "Augmented and virtual reality in surgery - the digital surgical environment: applications, limitations and legal pitfalls", Annals of Translational Medicine, 4(23), p. 454 (2016).

4. Al-Mashhadany, Y.I. "Scara robot: Modeled, simulated, and virtual-reality verified", Communications in Computer and Information Science, CCIS 330, Springer-Verlag, pp. 94-102 (2016).

5. Buckley, C.E., Nugent, E., Ryan, D., and Neary, P.C. "Virtual reality - A new era in surgical training", Chapter 7, in Virtual Reality in Psychological, Medical and Pedagogical Applications, InTech, pp. 139-166 (2012).

6. Nooshabadi, Z.S., Abdi, E., Farahmand, F., Narimani, R., and Chizari, M. "A meshless method to simulate the interactions between a large soft tissue and a surgical grasper", Scientia Iranica, 23(1), pp. 295-300 (2016).

7. Almusawi, A.R.J., Dülger, L.C., and Kapucu, S. "Robotic arm dynamic and simulation with virtual reality model (VRM)", CoDIT'16, IEEE, Malta, pp. 335-340 (2016).

8. Almusawi, A.R.J. "Implementation of learning motion to control a robotic arm using haptic technology", PhD Thesis, Gaziantep University (2016).

9. Almusawi, A.R.J., Dülger, L.C., and Kapucu, S. "A new artificial neural network approach in solving inverse kinematics of robotic arm (Denso VP6242)", Computational Intelligence and Neuroscience, pp. 1-10 (2016).

10. Robotic Arm Edge OWI-535 Robot, Product Instruction Manual (2008).

11. Quanser.com [accessed July 2018]. http://www.quanser.com/products/denso

12. Leon, E.D., Nair, S.S., and Knoll, A. "User friendly Matlab-toolbox for symbolic robot dynamic modeling used for control design", IEEE Int. Conf. on Robotics and Biomimetics, pp. 2181-2188 (2012).

13. Corke, P., Robotics, Vision and Control: Fundamental Algorithms in MATLAB, Springer Tracts in Advanced Robotics, Vol. 73, Springer-Verlag, Berlin-Heidelberg, pp. 191-205 (2011).

14. MathWorks, "Virtual Reality Modeling Language (VRML)", MATLAB/Simulink documentation, www.mathworks.com.

Biographies

Ahmed J.R. Almusawi received the BS and MS degrees in Mechatronics Engineering from the University of Baghdad, Iraq, in 2003 and 2009. He is currently working toward the PhD degree in Robotics and Haptic Technology at the University of Gaziantep, Turkey (2016).


He is a Lecturer at the Mechatronics Engineering Department, University of Baghdad, Iraq. His research activities are focused on mechatronic systems and artificial intelligence algorithms.

Lale Canan Dülger received her BS and MS degrees in Mechanical Engineering from Middle East Technical University (METU), Turkey, in 1986 and 1988. She received her PhD in Mechanical Engineering from Liverpool Polytechnic, England, in 1992. She is a Professor in Mechanical Engineering at İzmir University of Economics.

Her research interests include control design, modeling, artificial intelligence, and robotics.

Sadettin Kapucu received his BS and MS degrees in Mechanical Engineering from Middle East Technical University (METU), Turkey, in 1987 and 1990. He received his PhD in Mechanical Engineering from the University of Gaziantep, Turkey, in 1994. He is a Professor in Mechanical Engineering at Gaziantep University. His research interests include robotics, robot vision, industrial hydraulic power control, and system identification.