Imitation Control for Biped Robot Using Wearable Motion Sensor

Tao Liu (e-mail: [email protected]), Yoshio Inoue, Kyoko Shibata
Department of Intelligent Mechanical Systems Engineering, Kochi University of Technology, 185 Miyanokuchi, Tosayamada-Cho, Kami-City, Kochi 782-8502, Japan

Contributed by the Mechanisms and Robotics Committee of ASME for publication in the Journal of Mechanisms and Robotics. Manuscript received June 11, 2008; final manuscript received October 5, 2009; published online February 23, 2010. Assoc. Editor: Frank C. Park.

In conventional imitation control, optical tracking devices have been widely adopted to capture human motion and control robots in a laboratory environment. Wearable sensors are attracting extensive interest in the development of lower-cost human-robot control systems free of the constraints imposed by stationary motion analysis devices. We propose an ambulatory human motion analysis system based on small inertial sensors to measure body segment orientations in real time. A new imitation control method was developed and applied to a biped robot using human joint angle data obtained from a wearable sensor system. An experimental study was carried out to verify the method of synchronous imitation control for a biped robot. By comparing the results obtained from direct imitation control with an improved method based on a training algorithm, which includes a personal motion pattern, we found that the accuracy of imitation control was markedly improved: the tri-axial average errors of the x-, y-, and z-moving displacements relative to leg length were 12%, 8%, and 4%, respectively. Experimental results support the feasibility of the proposed control method. [DOI: 10.1115/1.4001097]

Keywords: imitation control, biped robot, wearable motion sensor, human motion analysis

1 Introduction

Humanoid robots are gradually appearing in daily life, with key technology being addressed and resolved for safe coexistence with humans, interactive communication with humans, and efficient manipulation of objects in human space [1-4]. Integrating human motion analysis with existing robot control technology to implement a friendly human-robot system will be an interesting research focus in biomedical, industrial, and aerospace fields. Recently, considerable research work has concentrated on the development of humanoid biped robots [5-8]. However, in order for humanoid robots to assist people with work in human-centered environments, it is not only important to equip them with the manipulative, perceptive, and communication skills necessary for real-time interactions in a working environment, but it is also indispensable to train robots to interact accurately and efficiently with humans so as to make robots cooperate with different hosts.

Many researchers have recently become interested in robot imitation learning [9-11]. A generic framework for solving the correspondence problem, action learning via imitation corresponding embodiments (ALICE), was proposed in Ref. [12], in which the problem of different embodiments between the learner and the instructor in imitation control was addressed. Ferreira et al. [13] used acquired human gait data to control a biped robot, in which computer vision techniques were adapted to extract kinematic features of human motion and obtain gait signatures in the sagittal plane. Ude et al. [14] proposed a method for transforming human motion information captured by an optical tracking device into a high-dimensional trajectory for a humanoid robot and used B-spline wavelets for noise reduction in offline analysis.
These methods based on stationary motion capture devices are difficult to use in real-time control and are limited to laboratory environments, because optical motion analysis requires considerable work space and high-speed graphical signal processing. The devices are expensive, and precalibration experiments and offline analysis of recorded pictures are especially time consuming. We are focusing on human dynamics analysis to develop intelligent walk support machines or robots, with real-time measurements of human joint orientations and ground reaction forces for controlling a rehabilitation robot to support human walking [15,16]. The goal of the work in this paper is twofold: first, a wearable sensor system is developed to implement human motion analysis without environmental restrictions; second, real-time control of a biped robot is implemented to imitate human motion based on measurements from the sensor system.

2 Materials and Methods

2.1 Wearable Sensor System for Estimating Segment Orientations. As shown in Fig. 1, a sensor module was designed by integrating a gyroscope (Murata ENC-03J, size of 15.5 × 8.0 × 4.3 mm³, weight of 10 g) and an accelerometer chip (ADXL202). The gyroscope's principle of operation is based on measurement of the Coriolis acceleration, which is generated when a rotational angular velocity is applied to an oscillating piezoelectric bimorph. These inertial sensors work with low energy consumption (4.6 mA at 5 V) and are suitable for ambulatory measurements. The signals from the gyroscopes and accelerometer are amplified and low-pass filtered (cutoff frequency of 25 Hz) to remove electronic noise; frequencies outside the passband are irrelevant to the study of human kinetics.

As shown in Fig. 2, the wearable sensor system includes three sensor modules, each composed of a gyroscope and a bi-axial accelerometer. Two sensor modules are attached to the thigh and shank, respectively, and a sensor module together with the data sampling system is attached to the waist. Three local coordinate systems were defined for the sensor modules mounted on the thigh, shank, and waist. We aligned the y-direction along the line connecting the segment's proximal and distal joints, let the x-direction be the anterior-posterior direction, and chose the z-axis such that the resulting coordinate system would be right handed. The gyroscopes were used to measure the angular velocities of the waist, thigh, and shank segments; each sensing axis was aligned with the medial-lateral direction (the local z-axis) so that the angular velocity in the sagittal plane could be detected. The bi-axial accelerometer was attached to the base board of the sensor module to measure the two components of acceleration along the x-axis and y-axis. In our data analysis system, the data from the accelerometer were fused with the data collected from the gyroscopes for cyclic system calibration, by supplying the initial angular displacement of the attached leg segments.
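To illustrate this fusion step, the following is a minimal sketch of one standard way to combine an integrated gyroscope angle with the accelerometer's inclination estimate, using a complementary filter. The filter design, the 100 Hz sampling rate, and the gain alpha are assumptions made for illustration; the authors' actual estimation algorithm is the one given in Ref. [16].

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0  # assumed sampling rate in Hz; the paper does not state it

def lowpass(x, cutoff=25.0, fs=FS, order=2):
    """Zero-phase low-pass filter mirroring the 25 Hz cutoff in the text."""
    b, a = butter(order, cutoff / (0.5 * fs), btype="low")
    return filtfilt(b, a, x)

def segment_angle(gyro_z, acc_x, acc_y, alpha=0.98):
    """Sagittal-plane segment angle from one gyro axis and a bi-axial
    accelerometer, fused with a complementary filter.

    The gain alpha and this particular fusion scheme are illustrative
    assumptions; the accelerometer supplies the slowly varying
    inclination that corrects the integrated gyro angle.
    """
    gyro_z = lowpass(np.asarray(gyro_z, dtype=float))
    acc_x = np.asarray(acc_x, dtype=float)
    acc_y = np.asarray(acc_y, dtype=float)
    dt = 1.0 / FS
    theta = np.empty(gyro_z.size)
    theta[0] = np.arctan2(acc_x[0], acc_y[0])  # initial inclination
    for k in range(1, gyro_z.size):
        predicted = theta[k - 1] + gyro_z[k] * dt   # gyro integration
        measured = np.arctan2(acc_x[k], acc_y[k])   # gravity reference
        theta[k] = alpha * predicted + (1.0 - alpha) * measured
    return theta
```

A knee angle in the sagittal plane can then be taken as the difference between the thigh and shank segment angles.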

In a simplified human model, the body segments were assumed to be rigid. We analyzed the motion of the sensor module fixed to a body segment by dividing it into the linear motion of the segment's rotation point and the angular motion of the sensor module around that point. Therefore, to estimate the joint angle, we calculate the two components of acceleration of each joint in their local coordinate systems using measurements of the sensor modules fixed to the body segments, and combine these with the measurements of the gyroscopes in the sensor modules. The detailed calculation method for estimating segment orientations is given in Ref. [16]. In this study, the human leg model was simplified to a three-joint linkage in which the knee joint has one degree of freedom, and the hip joint and ankle joint each have two degrees of freedom. With the motion sensor system, we can measure the three joint angles in the sagittal plane (x-y plane) and the roll angles of the hip joint and ankle joint around the x-axis. The three-dimensional (3D) displacement vector pointing from the ankle joint to the hip joint can be used to control a biped robot moving in 3D space.
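As a concrete reading of this model, the sketch below assembles the ankle-to-hip vector from the measured segment angles. The axis conventions and the roll handling are our assumptions for illustration (they make the roll recoverable as atan(ry/rz), matching Eq. (5) of Sec. 2.2); the authors' full derivation is in Ref. [16].

```python
import numpy as np

def ankle_to_hip_vector(theta_shank, theta_thigh, roll, L_shank, L_thigh):
    """Illustrative forward kinematics for the simplified leg model.

    theta_shank, theta_thigh: sagittal-plane segment angles from the
    vertical (rad); roll: leg-plane roll about the x-axis (rad);
    L_shank, L_thigh: segment lengths. Conventions are assumptions.
    """
    # In-plane components: x anterior-posterior, "up" within the leg plane.
    forward = L_shank * np.sin(theta_shank) + L_thigh * np.sin(theta_thigh)
    upward = L_shank * np.cos(theta_shank) + L_thigh * np.cos(theta_thigh)
    # Rolling the leg plane about the x-axis splits the in-plane "up"
    # component into lateral (ry) and vertical (rz) parts.
    return np.array([forward,                 # rx
                     upward * np.sin(roll),   # ry
                     upward * np.cos(roll)])  # rz
```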

2.2 Synchronous Imitation Control for Biped Robot. Imitation control was implemented on a biped robot using the developed wearable sensor system and algorithm. We developed a robot with integrated angular position control for real-time motion control from human inputs. As shown in Fig. 3, a small biped robot "R1" (height of 330 mm, mass of 2 kg) was developed for the experimental study; each leg has six degrees of freedom. In this robot, 12 servo-motors (Kondo Kagaku Co. Ltd.) were used as actuators for all the robot joints, and the specification of the robot is given in Table 1. A microcomputer controller (PIC16F877A by Microchip Co., Chandler, AZ) was built into the robot; it outputs the 12 channels of pulses for the drive motors and connects with a wireless module to communicate with a master computer.
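The paper does not describe the controller firmware; purely as an illustration of how each of the 12 channels might turn a joint angle command into a servo pulse, one common hobby-servo mapping looks like this (the 0.5-2.5 ms pulse range and the ±90 deg span are assumed, not taken from the paper):

```python
def angle_to_pulse_us(angle_deg, min_us=500.0, max_us=2500.0,
                      min_deg=-90.0, max_deg=90.0):
    """Map a joint angle command to a servo pulse width in microseconds.

    All numeric limits are typical hobby-servo values assumed for
    illustration; the actual PIC16F877A firmware parameters are unknown.
    """
    angle_deg = max(min_deg, min(max_deg, angle_deg))  # clamp to range
    fraction = (angle_deg - min_deg) / (max_deg - min_deg)
    return min_us + fraction * (max_us - min_us)

# Example: a 30 deg command maps to about 1833 us on one channel.
```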

Fig. 1 Motion sensor module

Fig. 2 Wearable motion sensor system. A strap system was designed to attach the sensor modules to the human body segments.

A simplified model of the right leg of the biped robot is given in Fig. 4. We calculated the robot joint angles from inputs of the subject's foot displacements through an inverse kinematics analysis. An algorithm was developed to produce the biped robot motion according to the displacement vector (rx, ry, rz). Geometric relations for the robot links are shown in Fig. 5, from which we can resolve the joint angles for motor control [17].

First, the robot pitch angle can be calculated as follows:

Z^2 = A^2 + B^2 - 2AB\cos(\pi - \theta_1)    (1)

Fig. 3 Biped robot

Table 1 Specification of the robot

Dimensions: height 300 mm; width 150 mm; foot size 130 × 70 mm²
DOF (single leg): yaw axis 1; pitch axes 3; roll axes 2
Weight: 1.8 kg (including battery)

Fig. 4 Simplified model of right leg of the biped robot


Z = \sqrt{r_x^2 + r_y^2 + r_z^2}    (2)

\theta_1 = \cos^{-1}\left(\frac{Z^2 - A^2 - B^2}{2AB}\right)    (3)

Second, we can use trigonometric relations and the law of sines to derive the robot ankle roll angle θ2 and ankle pitch angle θ3 according to

\alpha = \sin^{-1}\left(\frac{A}{C}\sin(\pi - \theta_1)\right)    (4)

\theta_2 = \tan^{-1}\left(\frac{r_y}{r_z}\right)    (5)

\theta_3 = \tan^{-1}\left(\frac{r_x}{\sqrt{r_y^2 + r_z^2}}\right) + \alpha    (6)

Finally, to maintain the standing balance of the robot, we constrained the remaining joint angles, the hip roll angle θ4 and the hip pitch angle θ5, with the following equations:

\theta_4 = -\theta_2    (7)

\theta_5 = \pi - \theta_1 + \theta_3    (8)

Fig. 5 Geometric relations of the robot's segment links

By using Eqs. (1)-(8), an inverse kinematics algorithm can be used to produce predesigned motion trajectories such as walking, standing, and rotation. We want to design cyclic joint angle trajectories that are synchronous with the output of the motion sensors mounted on the human body segments. Figure 6 shows the architecture of the synchronous imitation control system for the biped robot. A database of robot movement trajectories can be added in real time using the results of measurements obtained from the wearable motion analysis system. The acceleration data (ax, az) for the two leg segments and the waist mounted with the sensor modules can be obtained and used to estimate the joint angles of human subjects, and the calculated displacement vectors pointing from the ankle joint to the hip joint were scaled to the saved displacement vectors of the robot, (rx, ry, rz)left and (rx, ry, rz)right. Therefore, the differences in segment length ratios and joint degrees of freedom between the human being and the robot can be accounted for when driving imitation control in a training process.
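Equations (1)-(8) translate almost directly into code. The sketch below is a minimal transcription, in which we read the length C in Eq. (4) as the ankle-to-hip distance Z (our interpretation of Fig. 5); A and B denote the thigh and shank link lengths. In use, the human ankle-to-hip vector would first be scaled by the robot-to-human leg length ratio, as described above.

```python
import math

def leg_joint_angles(rx, ry, rz, A, B):
    """Inverse kinematics of one robot leg following Eqs. (1)-(8).

    (rx, ry, rz): displacement vector from the ankle joint to the hip
    joint; A, B: thigh and shank link lengths. C in Eq. (4) is taken
    to be the ankle-to-hip distance Z, an assumption based on Fig. 5.
    Returns (theta1, ..., theta5) in radians.
    """
    Z = math.sqrt(rx**2 + ry**2 + rz**2)                        # Eq. (2)
    theta1 = math.acos((Z**2 - A**2 - B**2) / (2.0 * A * B))    # Eqs. (1), (3)
    alpha = math.asin((A / Z) * math.sin(math.pi - theta1))     # Eq. (4)
    theta2 = math.atan2(ry, rz)                                 # Eq. (5): ankle roll
    theta3 = math.atan2(rx, math.sqrt(ry**2 + rz**2)) + alpha   # Eq. (6): ankle pitch
    theta4 = -theta2                                            # Eq. (7): hip roll
    theta5 = math.pi - theta1 + theta3                          # Eq. (8): hip pitch
    return theta1, theta2, theta3, theta4, theta5
```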

3 Results

An experimental study was carried out to verify the synchronous imitation control method proposed for the biped robot. As shown in Fig. 7, the three components of waist movement were sampled as inputs for the synchronous imitation control system, which generated control trajectories to determine the motor angular positions. Starting from a predesigned motion trajectory of the biped robot for the case with no human motion inputs, we could control the robot motion from real-time orientation information of the human lower limbs.

Fig. 6 Architecture of the synchronous imitation control system

Fig. 7 Three-directional waist movement imitation


To quantitatively validate the performance of the robot motion control system based on the wearable sensor measurements and the control method, we compared the measurements of the motion sensor system, the robot motion responses, and the control outputs of the synchronous imitation control system, which were synchronously measured using a commercial optical motion analysis system, Hi-DCam (NAC Image Technology, Japan). The motion analysis system (Hi-DCam) tracked and measured the 3D trajectories of retroreflective markers placed on the subject's body segments and the robot joints. Cameras with a sampling frequency of 100 Hz were used to track the marker positions with an accuracy of 1 mm.
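The paper does not give an explicit error formula; one plausible reading of the tri-axial figures quoted in the abstract is a per-axis mean absolute error between the synchronized human and robot trajectories, normalized by leg length, as sketched below:

```python
import numpy as np

def triaxial_error_percent(human_xyz, robot_xyz, leg_length):
    """Per-axis mean absolute tracking error as a percentage of leg length.

    human_xyz, robot_xyz: (N, 3) arrays of synchronized displacement
    samples (e.g., from the 100 Hz optical system); leg_length is in
    the same units. The averaging scheme is assumed for illustration.
    """
    err = np.abs(np.asarray(human_xyz, float) - np.asarray(robot_xyz, float))
    return 100.0 * err.mean(axis=0) / leg_length  # [x%, y%, z%]
```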

4 Discussion

In this paper, to examine the proposed synchronous imitation control method, the 3D motion of a human being was sampled and analyzed to control a biped robot. We compared the three components of displacement of the human subject and of the controlled robot with the sensor motion control system outputs in the first motion test of a training experiment. As shown in Fig. 8, the responses of the three components of waist displacement were almost the same for the human subject and the robot, but there were large differences in motion pattern between the robot and the human subject because the segment length ratios and joint degrees of freedom were significantly different, which may also reflect the measurement precision of the wearable motion sensor system in real-time robot control. This shows that there are large errors if the imitation control is based only on measurements of the motion sensors; for example, the displacement error was larger than 150 mm along the z-axis at one point in time.

Fig. 8 Training results of the waist motion imitation control experiment

A synchronous imitation control system was developed to address this problem by saving the first series of human motion data to a motion trajectory database as reference input for robot imitation control. As shown in Fig. 9, after taking the personal pattern of motion into account, we implemented a waist motion imitation control experiment on the same subject, and the results proved the feasibility of the synchronous imitation control system. By comparing the results obtained from the direct imitation control method and the improved method


based on a training algorithm, which incorporates the personal pattern of motion, we found that the accuracy of imitation control was markedly improved: the tri-axial average errors of the x-, y-, and z-displacements relative to leg length were 12%, 8%, and 4%, respectively.

Fig. 9 Waist motion imitation control experiment after training
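The training step described above stores the first recorded series of human motion as a personal reference pattern. The sketch below shows one way such a motion trajectory database could be realized; the phase indexing and the blending of live sensor data with the stored pattern are our own illustrative guesses, not the authors' algorithm.

```python
import numpy as np

class TrajectoryDatabase:
    """Store the first recorded motion series as a personal reference
    pattern and use it to correct later imitation commands (sketch)."""

    def __init__(self):
        self.reference = None  # (N, 3) ankle-to-hip displacement samples

    def train(self, first_series):
        """Save the first series of human motion data (training pass)."""
        self.reference = np.asarray(first_series, dtype=float)

    def correct(self, live_sample, phase, weight=0.5):
        """Blend a live sensor-derived displacement with the stored
        pattern at the same gait phase (0 <= phase < 1); the blending
        weight is an assumption, not taken from the paper."""
        idx = int(phase * len(self.reference)) % len(self.reference)
        return (weight * np.asarray(live_sample, dtype=float)
                + (1.0 - weight) * self.reference[idx])
```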

References

[1] Azad, P., Asfour, T., and Dillmann, R., 2007, "Stereo-Based 6D Object Localization for Grasping With Humanoid Robot Systems," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems.
[2] Dillmann, R., 2004, "Teaching and Learning of Robot Tasks Via Observation of Human Performance," Robot. Auton. Syst., 47, pp. 109-116.
[3] Asfour, T., Azad, P., Vahrenkamp, N., Regenstein, K., Bierbaum, A., Welke, K., Schröder, J., and Dillmann, R., 2008, "Toward Humanoid Manipulation in Human-Centered Environments," Robot. Auton. Syst., 56, pp. 54-65.
[4] Allison, B., Nejat, G., and Kao, E., 2009, "The Design of an Expressive Humanlike Socially Assistive Robot," ASME J. Mech. Rob., 1, p. 011001.
[5] Nishiwaki, K., Sugihara, T., Kagami, S., Kanehiro, F., Inaba, M., and Inoue, H., 2000, "Design and Development of Research Platform for Perception-Action Integration in Humanoid Robots: H6," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1559-1564.


[6] Sakagami, S., Watanabe, T., Aoyama, C., Matsunaga, S., Higaki, N., and Fujimura, K., 2002, "The Intelligent ASIMO: System Overview and Integration," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2478-2483.
[7] Cheng, G., Hyon, S., Morimoto, J., Ude, A., and Jacobsen, S., 2006, "A Humanoid Research Platform for Exploring Neuroscience," Proceedings of the IEEE/RAS International Conference on Humanoid Robots.
[8] Park, I. W., Kim, J. Y., Lee, J., and Oh, J., 2005, "Mechanical Design of Humanoid Robot Platform KHR-3 (KAIST Humanoid Robot-3: HUBO)," Proceedings of the IEEE/RAS International Conference on Humanoid Robots.
[9] Montesano, L., Lopes, M., Bernardino, A., and Santos-Victor, J., 2008, "Learning Object Affordances: From Sensory-Motor Coordination to Imitation," IEEE Trans. Rob. Autom., 24, pp. 15-26.
[10] Alissandrakis, A., Nehaniv, C. L., and Dautenhahn, K., 2007, "Correspondence Mapping Induced State and Action Metrics for Robotic Imitation," IEEE Trans. Syst., Man, Cybern., Part B: Cybern., 37(2), pp. 299-307.
[11] Alissandrakis, A., Nehaniv, C. L., Dautenhahn, K., and Saunders, J., 2005, "An Approach for Programming Robots by Demonstration to Manipulate Objects: Considerations on Metrics to Achieve Corresponding Effects," Proceedings of the Sixth IEEE International Symposium CIRA, pp. 61-66.
[12] Alissandrakis, A., Nehaniv, C. L., and Dautenhahn, K., 2002, "Imitation With ALICE: Learning to Imitate Corresponding Actions Across Dissimilar Embodiments," IEEE Trans. Syst., Man, Cybern., Part A: Syst. Humans, 32(4), pp. 482-496.

[13] Ferreira, J. P., Crisostomo, M. M., and Coimbra, A. P., 2009, "Human Gait Acquisition and Characterization," IEEE Trans. Instrum. Meas., 58(9), pp. 2979-2988.
[14] Ude, A., Atkeson, C. G., and Riley, M., 2004, "Programming Full-Body Movements for Humanoid Robots by Observation," Robot. Auton. Syst., 47(2-3), pp. 93-108.
[15] Liu, T., Inoue, Y., Shibata, K., and Morioka, H., 2006, "Development of Wearable Sensor Combinations for Human Lower Extremity Motion Analysis," Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1655-1660.
[16] Liu, T., Inoue, Y., and Shibata, K., 2009, "Development of a Wearable Sensor System for Quantitative Gait Analysis," Measurement, 42(7), pp. 978-988.
[17] Vukobratovic, M., Borovac, B., Surla, D., and Stokic, D., 1990, Biped Locomotion: Dynamics, Stability, Control and Application, Springer-Verlag, Berlin.
