ZMP Trajectory from Human Body Locomotion Dynamics Evaluated by Kinect-based Motion Capture System
Conference Paper · February 2016 · DOI: 10.5220/0005726001600166



ZMP trajectory from human body locomotion dynamics evaluated by Kinect-based motion capture system

Igor Danilov, Bulat Gabbasov, Ilya Afanasyev, and Evgeni Magid
Intelligent Robotic Systems Laboratory (LIRS), Innopolis University, Universitetskaya 1, Innopolis, 420500, Russia

{i.danilov, b.gabbasov, i.afanasyev, e.magid}@innopolis.ru

Keywords: Motion Capture (MoCap), Zero Moment Point (ZMP), Kinect sensor, Biomechanical analysis of human locomotion, Multi-depth sensor video recording

Abstract: This article presents methods of zero moment point (ZMP) trajectory evaluation for human locomotion by processing biomechanical data recorded with a Kinect-based motion capture (MoCap) system. Our MoCap system consists of four Kinect 2 sensors and uses the commercial iPi Soft markerless tracking and visualization technology. We apply iPi Mocap Studio software to multi-depth sensor video recordings, acquiring visual and biomechanical human gait data, including the linear and angular coordinates, velocities, accelerations and center of mass (CoM) position of each joint. Finally, we compute the ZMP and ground projection of the CoM (GCoM) trajectories from human body dynamics in MATLAB by two methods, where the human body is treated as (1) a single mass point, and (2) multiple mass points (with subsequent ZMP calculation via the inertia tensor). The further objective of our research is to reproduce a human-like gait with the Russian biped robot AR-601M.

1 INTRODUCTION

The history of humanoid robotics is directly correlated with the development of active exoskeletons. Even though one of the first exoskeletons was created almost 50 years ago, in 1969, under the leadership of the Yugoslav scientist Miomir Vukobratovic (Vukobratovic and Juricic, 1969), humanoid robotics research topics keep attracting significant attention from scientists, as humanoid locomotion is still far from human walking stability and energy efficiency (Larsen and Stoy, 2011).

Both humans and humanoids are underactuated systems with no actuation between a foot and the supporting surface (Dasgupta and Nakamura, 1999). In order to develop energy-efficient locomotion algorithms for a humanoid, a human gait analysis should be performed (Asano and Luo, 2008). Taking into account the significant differences between a human and a humanoid, it is not feasible to apply a human gait directly to a robot (Field et al., 2009). These differences include distinct numbers of Degrees of Freedom (DoFs) and skeletons, different mass distribution and CoM position, and the limited capabilities of humanoids relative to humans in terms of joint constraints (position, velocity and acceleration). Thus, there is no direct mapping of human relative positions to the robot, and the kinematic mismatch requires kinematic corrections with calculation of the joint angle trajectories. At the same time, to keep locomotion balance, an advanced control should be applied to the robot, overcoming the problems of underactuation and dynamic mismatch (Dasgupta and Nakamura, 1999; Naksuk et al., 2005).

Nevertheless, a number of research teams have reported successful automatic generation of robot walking from human walking data through different mathematical models of locomotion control: analysis of MoCap data of human locomotion provided certain locomotion outputs, which were further implemented in robotic bipedal locomotion. Sinnet et al. (Sinnet et al., 2011) introduced canonical human walking functions, which were used to form a linear spring-damper system to estimate human-walking behavior. Chalodhorn et al. (Chalodhorn et al., 2007) described a mapping of human gait data (obtained from MoCap) onto a body skeleton, applying an inverse kinematics procedure; this was followed by using principal component analysis (PCA) to reduce the dimensionality of the joint space, building a 3D motion space representation of kinematic postures.

Since the terms of static (GCoM) and dynamic (ZMP) stability (Mrozowski et al., 2007) apply to both biped and human locomotion balance research, we are going to apply human locomotion data obtained with a MoCap system to humanoid robot balancing. This could be realized by adapting human gait parameters (joint angles and angular moments, GCoM and ZMP trajectories, etc.) to the robot ones, taking into account relevant constraints with kinematic and dynamic corrections.

MoCap systems are usually based on marker or markerless motion capture technologies, using sensors with different physical principles: optical, magnetic, inertial, mechanical, and even acoustic (Field et al., 2009). The most precise technologies are also the most expensive, like mechanical exoskeletons with direct joint angle tracking (Wong et al., 2015), inertial sensors (e.g., the XSENS system1) with acceleration and rotational velocity measurements from triaxial accelerometers and gyroscopes (Wong et al., 2015), and optical MoCap based on a multi-camera system (e.g., VICON2), which captures 2D videos from several cameras and then merges them into a 3D model by triangulation (Field et al., 2009).

In our research we use a low-cost markerless optical MoCap system based on four Kinect 2 depth sensors and iPi Soft software. We calculate ZMP and GCoM trajectories from human body dynamics and estimate the accumulated errors. The static and dynamic criteria will further be applied for re-projecting human data onto a human-like gait of the Russian biped robot AR-601M, yielding its stable and natural locomotion.

The paper is organized as follows. Section 2 describes our system setup, consisting of the Kinect-based MoCap system, iPi Mocap Studio software and the AR-601M robot. Section 3 considers human body approximation with single and multiple mass points and introduces two ZMP evaluation methods. Section 4 presents the results of ZMP and GCoM trajectory calculations based on MoCap measurements with accuracy estimation. Finally, we conclude and discuss our future research activities.

2 SYSTEM SETUP

2.1 Kinect-based Motion Capture

For human gait registration we use a markerless optical MoCap system based on four Kinect 2 sensors, which cover a total area of 23 sq.m with a walking area of 4.5 m length (Fig. 1, 2). Each Kinect sensor contains an RGB camera and a depth sensor. The MoCap system analyzes the depth sensor video recordings

1 XSENS is an inertial 3D motion tracking technology based on body-mounted MEMS inertial sensors and multi-sensor data fusion algorithms, www.xsens.com

2 VICON is a passive optical motion capture technology based on a high-resolution multi-camera system and advanced real-time computer vision algorithms, www.vicon.com

to track human motion. Next, in order to reproduce human locomotion with a human-like bipedal robot, a human skeleton animation is built and a human 3D model with linear and angular joint motion parameters is reconstructed. The MoCap system acquires human motion data in three stages: 1) MoCap calibration; 2) human motion tracking; 3) non-real-time processing with iPi Soft software and MATLAB. These stages were executed with iPi Soft, which acquired the MoCap data and provided non-real-time processing by merging depth sensor video data from the four Kinect sensors, correcting the data and filtering out the noise. Each Kinect sensor was connected to an individual Windows PC, forming a distributed system with one master PC, which synchronized the three other PCs and the corresponding depth sensor records (Gabbasov et al., 2015).

Figure 1: Kinect-based MoCap system: The scene view.

Figure 2: MoCap system: The scene configuration.

Detailed information on MoCap calibration, measurement technique, data acquisition and processing with iPi Soft can be found in (Gabbasov et al., 2015) and the iPi Docs resource3. The MoCap calibration evaluates the ground position and mutual Kinect sensor

3 iPi Docs: User Guide for Multiple Depth Sensors Configuration, http://docs.ipisoft.com/User Guide for Multiple Depth Sensors Configuration


localization. During the calibration process we had a significant issue with incorrect detection of the glowing marker position by iPi Mocap Studio, which was possibly caused by the weak contrast of the glowing marker relative to the laboratory walls. We overcame this problem by performing the calibration in a dark room. Unfortunately, this approach increased the average calibration error to 0.045 m (Gabbasov et al., 2015), and thus decreased the MoCap measurement accuracy and also impacted the GCoM and ZMP errors.

In our further calculations of human GCoM and ZMP trajectories we use the data obtained from the iPi Biomech Add-on plugin4, such as CoM coordinates and accelerations. We exported the biomechanical characteristics into MATLAB and selected a time period where the human walked forward, matching the X-axis with the forward walking direction and the Z-axis with the left-hand direction (by default, the Y-axis points upward).

2.2 iPi Soft package

iPi Soft software5 uses 3D depth sensors to track human joints and produce 3D animations, providing human pose data with centimeter-level accuracy off-line (Kim et al., 2015). It consists of the free iPi Recorder and iPi Mocap Studio software. iPi Recorder acquires depth sensor video data from four Kinect 2 sensors, and then iPi Mocap Studio processes the multiple sensor video records off-line, reconstructing a 3D model of human locomotion by applying an inverse kinematics approach. Afterwards we use the iPi Biomech Add-on plugin to calculate joint coordinates and angles, linear and angular velocities, accelerations, and CoM positions over time.

Finally, we analyze the human gait in MATLAB to identify key features of human locomotion, collecting statistically significant data to create an adequate mathematical model of the human gait, which could be adapted to the AR-601M robot gait mathematical model.

2.3 AR-601M robot description

The AR-601M biped robot6 (Fig. 3) is a full-size humanoid with a height of 144 cm and a weight of 65 kg, having 43 active DoFs (including 12 DoFs in the robot legs). Currently, the robot supports slow locomotion with the GCoM trajectory lying within the support feet

4 iPi Biomech Add-on, http://docs.ipisoft.com/iPi Biomech Add-on

5 Motion capture software supporting markerless technology from the Russian company iPi Soft, http://ipisoft.com

6 The AR-601M robot is being developed by the Russian company Androidnaya Tehnika (Android Technics), http://en.npo-at.com/products/ar-600

during walking (Khusainov et al., 2016).

Figure 3: Android Technics AR-601M robot.

3 MOCAP-BASED HUMAN ZMP CALCULATION

3.1 ZMP from human body dynamics as a single mass point

The biped/human gait is statically stable when the GCoM trajectory lies within the foot support area and the corresponding support polygon (Goswami, 1999), whereas the criterion of dynamic stability is described with the zero moment point (ZMP) concept (Vukobratovic and Borovac, 2004). The ZMP of a properly balanced gait coincides with the Center of Pressure (CoP), representing a point under the foot where the ground reaction force fully compensates the effects of forces and moments acting on the foot from the whole body (Vukobratovic and Borovac, 2004). The ZMP is considered the dynamic equivalent of the GCoM: a body with the ZMP located under the foot is stable, otherwise it is not.

To calculate the ZMP from body dynamics as a single mass point we used a rough approximation of biped locomotion by the so-called cart-table model (Kajita et al., 2003), which evaluates the ZMP as a function of CoM position and acceleration, fixing the CoM height y during locomotion:

\[
\begin{cases}
x_{zmp}(t) = x_{com}(t) - \dfrac{y}{g}\,\ddot{x}_{com}(t)\\[4pt]
z_{zmp}(t) = z_{com}(t) - \dfrac{y}{g}\,\ddot{z}_{com}(t)
\end{cases}
\tag{1}
\]


where \(x_{zmp}\), \(z_{zmp}\) are the coordinates of the ZMP. We applied equation (1) to the data exported into MATLAB from the iPi Biomech Add-on plugin.
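As a sketch of how equation (1) can be applied to the exported CoM data (the paper performs this step in MATLAB; the Python function name and array layout below are our assumptions):

```python
import numpy as np

def zmp_cart_table(com_pos, com_acc, g=9.81):
    """Cart-table ZMP of eq. (1): the body is a single mass point
    and the CoM height y is treated as constant during locomotion.

    com_pos: (N, 3) CoM positions [x, y, z] per frame (Y is vertical)
    com_acc: (N, 3) CoM accelerations per frame
    Returns an (N, 2) array of ground-plane ZMP coordinates [x_zmp, z_zmp].
    """
    x, y, z = com_pos[:, 0], com_pos[:, 1], com_pos[:, 2]
    ax, az = com_acc[:, 0], com_acc[:, 2]
    x_zmp = x - (y / g) * ax  # sagittal component of eq. (1)
    z_zmp = z - (y / g) * az  # frontal component of eq. (1)
    return np.column_stack([x_zmp, z_zmp])
```

Note that for a body at rest (zero CoM acceleration) the formula reduces to the ground projection of the CoM, i.e. the ZMP coincides with the GCoM.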

3.2 ZMP from human body dynamics as multiple mass points

We represent the human body as a model with 12 mass points mi placed in the middle of the corresponding body parts (a head, a torso, two shoulders, two forearms, two thighs, two shins and two feet); the human mass distribution over the body parts (Table 1) follows (Haley et al., 1988).

Table 1: Human mass distribution of the body parts (Haley et al., 1988).

Segment    Segment mass
Head       6%
Torso      48%
Shoulder   2.5%
Forearm    2.5%
Thigh      12%
Shin       4.5%
Foot       1.5%
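Table 1 can be encoded directly; the paired segments (shoulders, forearms, thighs, shins and feet) each occur twice in the 12-point model, so the fractions sum to 100%. A minimal sketch (the names below are ours, not from the paper):

```python
# Mass fraction per segment from Table 1 (Haley et al., 1988).
SEGMENT_MASS_FRACTION = {
    "head": 0.06,
    "torso": 0.48,
    "shoulder": 0.025,  # each of two
    "forearm": 0.025,   # each of two
    "thigh": 0.12,      # each of two
    "shin": 0.045,      # each of two
    "foot": 0.015,      # each of two
}
PAIRED = ("shoulder", "forearm", "thigh", "shin", "foot")

def segment_masses(body_mass_kg):
    """Return the 12 point masses m_i (kg) for a given total body mass."""
    masses = [body_mass_kg * SEGMENT_MASS_FRACTION["head"],
              body_mass_kg * SEGMENT_MASS_FRACTION["torso"]]
    for name in PAIRED:
        # One point mass per body side.
        masses.extend([body_mass_kg * SEGMENT_MASS_FRACTION[name]] * 2)
    return masses
```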

The ZMP equations with multiple mass points via the inertia tensor in the sagittal and frontal planes are calculated as follows (Ha and Choi, 2007):

\[
x_{zmp}(t) = \frac{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)\,x_i \;-\; \sum\limits_{i=1}^{12} m_i\,\ddot{x}_i\,y_i}{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)}
\]
\[
z_{zmp}(t) = \frac{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)\,z_i \;-\; \sum\limits_{i=1}^{12} m_i\,\ddot{z}_i\,y_i}{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)}
\tag{2}
\]

where \(x_{zmp}\), \(z_{zmp}\) are the coordinates of the ZMP and \(m_i\) is the point mass of the i-th body part.
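A sketch of equation (2) in Python (the paper's computation was done in MATLAB; the function and variable names here are illustrative):

```python
import numpy as np

def zmp_multi_mass(m, pos, acc, g=9.81):
    """ZMP from 12 point masses, per eq. (2) (Ha and Choi, 2007).

    m:   (12,) point masses m_i
    pos: (12, 3) segment CoM positions [x_i, y_i, z_i] (Y is vertical)
    acc: (12, 3) segment CoM accelerations [ddx_i, ddy_i, ddz_i]
    Returns (x_zmp, z_zmp).
    """
    x, y, z = pos[:, 0], pos[:, 1], pos[:, 2]
    ax, ay, az = acc[:, 0], acc[:, 1], acc[:, 2]
    denom = np.sum(m * (ay + g))
    x_zmp = (np.sum(m * (ay + g) * x) - np.sum(m * ax * y)) / denom
    z_zmp = (np.sum(m * (ay + g) * z) - np.sum(m * az * y)) / denom
    return x_zmp, z_zmp
```

With all segment accelerations zero the expression reduces to the mass-weighted GCoM, which is the expected static limit.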

3.3 CoM and ZMP accuracy estimation technique

Kinect-based MoCap brings a stochastic error to the true measured characteristics, such as the coordinates and accelerations of the i-th body part (Gabbasov et al., 2015), and therefore the MoCap-measured values should be treated as a non-stationary stochastic process. We assume that the total measurement error consists of several components: the Kinect-based MoCap calibration error (i.e. the error in cross-localization of the four Kinect sensors with regard to each other), the accuracy of human motion approximation with iPi Soft, the error in distance estimation between the human body and the sensors, etc. Moreover, measurement error analysis is quite a difficult task, because it is impossible to take into account all Kinect sensor configurations and the accuracy of human motion approximation with iPi Soft, which depends on the scene background, individual anatomical features, clothes' color, etc. Therefore, according to the Central Limit Theorem, we can assume that the measurement errors are normally distributed and the variance of the stochastic process is time-independent. Thus, the probability density of the measured values (coordinates and accelerations of the i-th body part's CoM) is defined as:

\[
p(x,t) = \frac{1}{\sqrt{2\pi}\,\sigma}\,\exp\!\left(-\frac{(x-\mu(t))^2}{2\sigma^2}\right)
\tag{3}
\]

where x is a measured value, \(\mu(t)\) is the time-dependent mathematical expectation (e.g., the true coordinate or acceleration of a body part's CoM), and \(\sigma\) is the standard deviation of the measured value.

In our study, the absolute measurement error can be estimated as a standard deviation from the time-dependent mean. For example, the measurement error of the x-coordinate of the CoM is computed as the unbiased standard deviation:

\[
\Delta x_{com} = \sqrt{\frac{\sum_{i=1}^{N}\big(x_i - M(x,t)\big)^2}{N-1}}
\tag{4}
\]

where \(M(x,t)\) is the mathematical expectation of the x-coordinate of the CoM. The mathematical expectation of the CoM of the i-th body part changes over time along the x-coordinate during human locomotion (e.g., the foot's CoM acceleration is shown in Fig. 4, blue curve). Therefore we smooth the MoCap-measured data (blue curve in Fig. 4) with a moving average to estimate the mathematical expectation (brown curve in Fig. 4).

Figure 4: Measured foot’s CoM acceleration along the x-axis (blue curve) and its smoothing (brown curve).
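The smoothing and the error estimate of equation (4) can be sketched as below (the paper does not state the moving-average window length; the 15-frame window is an arbitrary choice of ours):

```python
import numpy as np

def moving_average(signal, window=15):
    """Centered moving average used as the time-dependent expectation
    M(x, t); edge positions are normalized by the number of samples
    actually inside the window."""
    kernel = np.ones(window)
    num = np.convolve(signal, kernel, mode="same")
    den = np.convolve(np.ones_like(signal), kernel, mode="same")
    return num / den

def measurement_error(signal, window=15):
    """Unbiased standard deviation of the signal about its moving
    average, per eq. (4)."""
    residual = signal - moving_average(signal, window)
    return np.sqrt(np.sum(residual ** 2) / (len(signal) - 1))
```

A constant signal yields zero error, while for a slowly varying signal with additive noise the estimate approaches the noise level.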

We use the same method to calculate the smoothed mathematical expectation of the CoM accelerations and coordinates both for the human body as a single mass point (the first method, Section 3.1) and for all body parts as multiple mass points (the second method, Section 3.2). Since we perform indirect ZMP measurements, Error Theory should be applied to calculate the ZMP measurement error \(\Delta ZMP_x\) with the equation:

\[
\Delta ZMP_x = \sqrt{\sum_{i=1}^{n}\left(\frac{\partial ZMP}{\partial x_i}\,dx_i\right)^2}
\tag{5}
\]

where \(\partial ZMP/\partial x_i\) is the partial derivative of the ZMP function with respect to one of the variables, and \(dx_i\) is the estimate of the absolute measurement error for this variable.

Therefore, the total ZMP error for the first method is calculated as follows:

\[
\begin{cases}
\Delta ZMP_x = \sqrt{\Delta x_{com}^2 + \left(\frac{y}{g}\,\Delta a_{x,com}\right)^2}\\[4pt]
\Delta ZMP_z = \sqrt{\Delta z_{com}^2 + \left(\frac{y}{g}\,\Delta a_{z,com}\right)^2}
\end{cases}
\tag{6}
\]

where y is the average CoM position along the vertical Y-axis.
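Equations (6) are straightforward to evaluate; a minimal sketch in Python (the function and argument names are ours, not from the paper):

```python
import numpy as np

def zmp_error_single_mass(dx_com, dax_com, dz_com, daz_com, y_mean, g=9.81):
    """Total ZMP error of the single-mass (cart-table) model, eq. (6).

    dx_com, dz_com:   CoM coordinate errors
    dax_com, daz_com: CoM acceleration errors
    y_mean:           average CoM height along the vertical Y-axis
    """
    dzmp_x = np.sqrt(dx_com ** 2 + (y_mean / g * dax_com) ** 2)
    dzmp_z = np.sqrt(dz_com ** 2 + (y_mean / g * daz_com) ** 2)
    return dzmp_x, dzmp_z
```

The coordinate and acceleration error terms add in quadrature, so whichever contribution is larger dominates the total ZMP error.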

The total ZMP error for the second method is calculated as follows:

\[
\Delta ZMP_x = \frac{\sqrt{\sum\limits_{i=1}^{12}\big(m_i(\ddot{y}_i+g)\,\Delta x_i\big)^2+\sum\limits_{i=1}^{12}\big(m_i x_i\,\Delta\ddot{y}_i\big)^2}+\sqrt{\sum\limits_{i=1}^{12}\big(m_i y_i\,\Delta\ddot{x}_i\big)^2+\sum\limits_{i=1}^{12}\big(m_i \ddot{x}_i\,\Delta y_i\big)^2}}{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)\,x_i-\sum\limits_{i=1}^{12} m_i\,\ddot{x}_i\,y_i}+\frac{\sqrt{\sum\limits_{i=1}^{12} m_i\,\Delta\ddot{y}_i}}{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)}
\]

\[
\Delta ZMP_z = \frac{\sqrt{\sum\limits_{i=1}^{12}\big(m_i(\ddot{y}_i+g)\,\Delta z_i\big)^2+\sum\limits_{i=1}^{12}\big(m_i z_i\,\Delta\ddot{y}_i\big)^2}+\sqrt{\sum\limits_{i=1}^{12}\big(m_i y_i\,\Delta\ddot{z}_i\big)^2+\sum\limits_{i=1}^{12}\big(m_i \ddot{z}_i\,\Delta y_i\big)^2}}{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)\,z_i-\sum\limits_{i=1}^{12} m_i\,\ddot{z}_i\,y_i}+\frac{\sqrt{\sum\limits_{i=1}^{12} m_i\,\Delta\ddot{y}_i}}{\sum\limits_{i=1}^{12} m_i(\ddot{y}_i+g)}
\tag{7}
\]

where \(x_i, y_i, z_i, \ddot{x}_i, \ddot{y}_i, \ddot{z}_i\) correspond to the coordinates and accelerations of the i-th body part, and \(\Delta x_i, \Delta y_i, \Delta z_i, \Delta\ddot{x}_i, \Delta\ddot{y}_i, \Delta\ddot{z}_i\) are the absolute error estimates for the coordinates and accelerations of the i-th body part.

4 ZMP TRAJECTORY AND ACCURACY ANALYSIS

4.1 ZMP and GCoM trajectories from human body dynamics

ZMP and GCoM trajectories on the ground plane were calculated from human body data, where the human body was approximated by the single and multiple mass point methods, according to equations (1) and (2) respectively. Figure 5 presents the human ZMP trajectories obtained by the single mass point method (red curve) and the multiple mass points method (green curve), whereas the blue curve shows the GCoM trajectory and the orange ellipses illustrate the human footprints. The figure shows that both the ZMP and GCoM trajectories are located close to the footprints (and consequently to the corresponding support polygons). This satisfies the static (GCoM) and dynamic (ZMP) stability criteria (Mrozowski et al., 2007) and proves that the human gait was properly balanced. Moreover, the ZMP trajectory calculated with the multiple mass point method lies slightly closer to the GCoM trajectory than the one calculated with the single mass point method. Since both methods process the same MoCap data, this means they have different accuracy. Only the last segment of the ZMP trajectories lies outside the support area (footprint), at the coordinate of 2 m. This reflects the balancing changes before the human stops at the end of the MoCap walking area.

Figure 5: Human ZMP trajectory from the single mass point (red curve) and multiple mass points (green curve) methods, GCoM trajectory (blue curve) and footprints (orange ellipses) on the ground plane. The X-axis is oriented in the human walking direction, and the Z-axis orthogonal to it (lateral direction).

Figures 6 and 7 show the ZMP trajectories vs. time in the sagittal and frontal planes respectively, calculated from human body locomotion dynamics. The significant deviations of the ZMP trajectory in the frontal plane in the vicinity of the footprint positions arise from the limitations of the MoCap system measurement accuracy.

Figure 6: Human ZMP trajectory vs. time in the sagittal plane for the single mass point (red curve) and multiple mass points (green curve) methods.


Figure 7: Human ZMP trajectory vs. time in the frontal plane for the single mass point (red curve) and multiple mass points (green curve) methods.

4.2 CoM and ZMP accuracy estimation

We estimated the GCoM measurement error as a standard deviation applying equation (4), which gave the typical values:

\[
\begin{cases}
\Delta x_{com} \approx 1\ \text{cm}\\
\Delta z_{com} \approx 1\ \text{cm}
\end{cases}
\tag{8}
\]

For the human body model built with the single mass point method the total ZMP errors are time-independent due to the model's simplicity (Gabbasov et al., 2015), and their values are computed according to equations (6):

\[
\begin{cases}
\Delta ZMP_x \approx 7\ \text{cm}\\
\Delta ZMP_z \approx 7\ \text{cm}
\end{cases}
\tag{9}
\]

For the 12 mass point human body model the total ZMP errors were calculated according to equations (7). The calculated total ZMP error in the walking direction \(\Delta ZMP_x\) varies from 1 mm to 35 mm (where the value \(\Delta ZMP_x = 35\) mm corresponds to the MoCap walking area of 2 m length, i.e. ±1 m from the center of the MoCap scene in Fig. 2), whereas the ZMP error in the lateral direction \(\Delta ZMP_z\) varies in the range from 1 to 7 mm:

\[
\begin{cases}
\Delta ZMP_x \approx 3.5\ \text{cm}\\
\Delta ZMP_z \approx 0.7\ \text{cm}
\end{cases}
\tag{10}
\]

Moreover, the ZMP error in the walking direction \(\Delta ZMP_x\) strongly depends on the x-coordinate of the ZMP trajectory (i.e., on the human position relative to the center of the Kinect-based MoCap system), which results in the minimal total ZMP error at the center of the MoCap walking area and the maximal error at the boundaries of the MoCap capture zone. The growth of \(\Delta ZMP_x\) is nonlinear from the MoCap capture zone center to its boundaries. While \(\Delta ZMP_x\) was approximately the same for both ZMP evaluation methods at the boundaries of the ±1.5 m interval of the MoCap capture zone (centered in the middle of the 4.5 m walking zone), for the full MoCap capture zone of 4.5 m (which corresponds to the start and stop lines in Fig. 2) the second ZMP evaluation method (Section 3.2) gives a much higher total ZMP error \(\Delta ZMP_x\) than the first method (Section 3.1). Therefore, to eliminate the influence of acceleration and deceleration within the first steps, and thus to minimize the total ZMP error, we restricted the MoCap active capture zone to ±1 m from the center of the MoCap scene for the second method.

Figures 8 and 9 demonstrate the ZMP and GCoM trajectories for the single and multiple mass point methods respectively. The total ZMP error is shown with red bars. The comparison of the total ZMP errors for the human body models as single and multiple mass points (equations (9) and (10) respectively) demonstrates that the multiple mass point method is more accurate for the shortened MoCap capture zone of 2 m length, and could be better suited to the task of human locomotion analysis by the static (GCoM) and dynamic (ZMP) stability criteria of human gait balance.

Figure 8: ZMP trajectory (red curve) and the total ZMP errors (red bars), calculated with the single mass point method, and GCoM trajectory (blue curve).

Figure 9: ZMP trajectory (red curve) and the total ZMP errors (red bars), calculated with the multiple mass points method, and GCoM trajectory (blue curve).

5 CONCLUSION AND FUTURE WORK

This paper focused on the analysis of human locomotion. The locomotion data was recorded with a markerless MoCap tracking and visualization technology, based on four Kinect 2 sensors and non-real-time processing with iPi Soft software and MATLAB. Human gait balance was analyzed applying the static (GCoM) and dynamic (ZMP) stability criteria.

Human ZMP and GCoM trajectories were calculated by two methods, which approximate the human body with a single mass point and with multiple mass points. Our calculations demonstrated close localization of the analyzed GCoM and ZMP trajectories to the human's footprints and the corresponding support polygons. This means that the human walking had static and dynamic stability, proving that the gait was properly balanced. The comparison of the total ZMP errors for the two models demonstrated that the latter method is significantly more accurate for the limited MoCap walking zone of 2 m length (the maximal ZMP error was 3.5 cm along the walking direction and 0.7 cm in the lateral direction) than the former one, and thus should be preferred for human gait estimation. The quite significant ZMP trajectory deviations in the vicinity of the footprints' positions arise from the limitations of the MoCap system measurement accuracy.

Finally, we use the MoCap system and analyze human locomotion to identify key features of human walking, collecting statistically significant data to create an adequate mathematical model of the human gait, which could be adapted to the Russian AR-601M robot simulation model, yielding its statically and dynamically stable and more natural locomotion.

ACKNOWLEDGEMENTS

This work was supported by the Russian Ministry of Education and Science and our industrial partner Android Technics under the Scientific and Technological Research and Development Program of the Russian Federation for 2014-2020 (research grant RFMEFI60914X0004). Special thanks to the iPi Soft company for providing temporary access to iPi Soft Mocap Studio.

REFERENCES

Asano, F. and Luo, Z.-W. (2008). Energy-efficient and high-speed dynamic biped locomotion based on principle of parametric excitation. IEEE Transactions on Robotics, 24(6):1289–1301.

Chalodhorn, R., Grimes, D. B., Grochow, K., and Rao, R. P. N. (2007). Learning to walk through imitation. In Proc. of 20th Int. Joint Conference on Artificial Intelligence, IJCAI'07, pages 2084–2090, San Francisco, CA, USA. Morgan Kaufmann Publishers Inc.

Dasgupta, A. and Nakamura, Y. (1999). Making feasible walking motion of humanoid robots from human motion capture data. In IEEE ICRA, volume 2, pages 1044–1049.

Field, M., Stirling, D., Naghdy, F., and Pan, Z. (2009). Motion capture in robotics review. In IEEE Int. Conf. on Control and Automation, pages 1697–1702.

Gabbasov, B., Danilov, I., Afanasyev, I., and Magid, E. (2015). Toward a human-like biped robot gait: Biomechanical analysis of human locomotion recorded by Kinect-based motion capture system. In Proc. of 10th Int. Symposium on Mechatronics and its Applications.

Goswami, A. (1999). Foot rotation indicator (FRI) point: A new gait planning tool to evaluate postural stability of biped robots. In IEEE ICRA, pages 47–52.

Ha, T. and Choi, C.-H. (2007). An effective trajectory generation method for bipedal walking. Robotics and Autonomous Systems, 55(10):795–810.

Haley, J. et al. (1988). Anthropometry and mass distribution for human analogues. In Military Male Aviators, Volume 1, Aerospace Medical Research Lab., Wright-Patterson AFB, Ohio, USA, Tech. Rep., pages 34–38.

Kajita, S., Kanehiro, F., Kaneko, K., Fujiwara, K., Harada, K., Yokoi, K., and Hirukawa, H. (2003). Biped walking pattern generation by using preview control of zero-moment point. In IEEE ICRA, volume 2, pages 1620–1626.

Khusainov, R., Afanasyev, I., and Magid, E. (2016). Anthropomorphic robot modelling with virtual height inverted pendulum approach in Simulink: step length and robot height influence on walking stability. In Int. Conf. on Artificial Life and Robotics (in press).

Kim, H., Lee, S., Lee, D., Choi, S., Ju, J., and Myung, H. (2015). Real-time human pose estimation and gesture recognition from depth images using superpixels and SVM classifier. Sensors, 15(6):12410–12427.

Larsen, J. C. and Stoy, K. (2011). Energy efficiency of robot locomotion increases proportional to weight. Procedia Computer Science, Proc. 2nd European Future Technologies Conference and Exhibition, 7:228–230.

Mrozowski, J., Awrejcewicz, J., and Bamberski, P. (2007). Analysis of stability of the human gait. Journal of Theoretical and Applied Mechanics, 45:91–98.

Naksuk, N., Lee, C., and Rietdyk, S. (2005). Whole-body human-to-humanoid motion transfer. In 5th IEEE-RAS Int. Conf. on Humanoid Robots, pages 104–109.

Sinnet, R. W., Powell, M. J., Jiang, S., and Ames, A. D. (2011). Compass gait revisited: A human data perspective with extensions to three dimensions. In Proc. of 50th IEEE Conf. on Decision and Control, and European Control Conf. (CDC-ECC), pages 682–689.

Vukobratovic, M. and Borovac, B. (2004). Zero-moment point: thirty five years of its life. International Journal of Humanoid Robotics, 1(1):157–173.

Vukobratovic, M. and Juricic, D. (1969). Contribution to the synthesis of biped gait. IEEE Transactions on Biomedical Engineering, 16(1):1–6.

Wong, C., Zhang, Z.-Q., Lo, B., and Yang, G.-Z. (2015). Wearable sensing for solid biomechanics: A review. IEEE Sensors Journal, 15(5):2747–2760.