
Jörg Stückler, Sven Behnke

Dynamaid, an Anthropomorphic Robot for Research on Domestic Service Applications

Domestic tasks require three main skills from autonomous robots: robust navigation, object manipulation, and intuitive communication with the users. Most robot platforms, however, support only one or two of the above skills.

In this paper we present Dynamaid, a robot platform for research on domestic service applications. For robust navigation, Dynamaid has a base with four individually steerable differential wheel pairs, which allow omnidirectional motion. For mobile manipulation, Dynamaid is additionally equipped with two anthropomorphic arms that include a gripper, and with a trunk that can be lifted as well as twisted. For intuitive multimodal communication, the robot has a microphone, stereo cameras, and a movable head. Its humanoid upper body supports natural interaction. It can perceive persons in its environment, and recognize and synthesize speech.

We developed software for the tests of the RoboCup@Home competitions, which serve as benchmarks for domestic service robots. With Dynamaid and our communication robot Robotinho, our team NimbRo@Home took part in the RoboCup German Open 2009 and RoboCup 2009 competitions, in which we came in second and third, respectively. We also won the innovation award for innovative robot design, empathic behaviors, and robot-robot cooperation.

Key words: domestic service robots, mobile manipulation, human-robot interaction


1 INTRODUCTION

When robots leave industrial mass production to help with household chores, the requirements for robot platforms will change. While industrial production requires strength, precision, speed, and endurance, domestic service tasks demand different capabilities from the robots. The three most important skills for an autonomous household robot are: robust navigation in indoor environments, object manipulation, and intuitive communication with the users.

Robust navigation requires a map of the home, navigational sensors, such as laser-range scanners, and a mobile base that is small enough to move through the narrow passages found in domestic environments. At the same time, the base must have a large enough support area to allow for a human-like robot height, which is necessary both for object manipulation and for face-to-face communication with the users. Object manipulation requires dexterous arms that can handle the payload of common household objects. To detect and recognize such objects, the robot needs appropriate sensors like laser range finders and cameras. Intuitive communication with the users requires the combination of multiple modalities, such as speech, gestures, facial expressions, and body language.

While, in our opinion, none of the three requirements is optional, most available domestic robot systems support only one or two of the above skills. In this paper, we describe the integration of all three skills in our robot Dynamaid, which we developed for research on domestic service applications. We equipped Dynamaid with an omnidirectional drive for robust navigation, two anthropomorphic arms for object manipulation, and with a communication head. In contrast to most other service robot systems, Dynamaid is lightweight, inexpensive, and easy to interface.

In AUTOMATIKA - Journal for Control, Measurement, Electronics, Computing and Communications, Special Issue on ECMR’09, 2010

Fig. 1. Our anthropomorphic service robot Dynamaid fetching a drink.

We developed software for solving the tasks of the RoboCup@Home competitions. These competitions require fully autonomous robots to navigate in a home environment, to interact with human users, and to manipulate objects.

Our team NimbRo@Home participated with great success at the RoboCup German Open in April 2009. Overall, we took second place. We also evaluated our system at RoboCup 2009, which took place in July in Graz, Austria. In this competition, our robots came in third and won the innovation award.

After describing the mechanical and electrical details of our robot in the next section, we cover algorithms and software integration for perception and autonomous behavior control in Sections 3-8. In Section 9, we report the experiences made during the RoboCup competitions. After reviewing related work in Section 10, the paper concludes with a discussion of some ideas on the next steps in the development of capable domestic service robots.

2 HARDWARE DESIGN

We focused Dynamaid’s hardware design on low weight, sleek appearance, and high mobility. These are important features for a robot that interacts with people in daily life. In particular, the low weight is important for safety, because it requires only limited actuator power and thus makes Dynamaid inherently safer than a heavy-weight robot. The total weight of our robot is only 20 kg, an order of magnitude lower than that of other domestic service robots. The slim torso and the anthropomorphic arms strengthen the robot’s pleasant appearance. With its omnidirectional driving and human-like reaching capabilities, Dynamaid is able to perform a wide variety of mobile manipulation tasks. Its humanoid upper body supports natural interaction with human users.

2.1 Omnidirectional Drive

Dynamaid’s mobile base (see Fig. 2) consists of four individually steerable differential drives, which are attached to the corners of a rectangular chassis of size 60×42 cm. We constructed the chassis from light-weight aluminum sections. Each pair of wheels is connected to the chassis with a Robotis Dynamixel RX-64 actuator, which can measure the heading angle, and which is also used to control the steering angle. Both wheels of a wheel pair are driven individually by Dynamixel EX-106 actuators. The Dynamixel intelligent actuators communicate bidirectionally via an RS-485 serial bus with an Atmel ATmega128 microcontroller at 1 Mbps. Via this bus, the control parameters of the actuators can be configured. The actuators report back position, speed, load, temperature, etc. The microcontroller controls the speed of the EX-106 actuators and smoothly aligns the differential drives to target orientations at a rate of about 100 Hz. The main computer, a Lenovo X200 ThinkPad notebook, communicates over an RS-232 serial connection at 1 Mbps with the microcontroller. It implements omnidirectional driving by controlling the linear velocities and orientations of the differential drives at a rate of 50 Hz.

For navigation purposes, the base is equipped with a SICK S300 laser range finder. It provides distance measurements of up to 30 m in an angular field-of-view of 270°. The standard deviation of a measurement is approximately 8 mm. Two ultrasonic distance sensors cover the blind spot in the back of the robot.

Overall, the mobile base weighs only about 5 kg. Its maximum payload is 20 kg.

Fig. 2. Omnidirectional base with four individually steerable differential drives.


2.2 Anthropomorphic Upper Body

Dynamaid’s upper body consists of two anthropomorphic arms, a movable head, and an actuated trunk. All joints are also driven by Dynamixel actuators. Each arm (see Fig. 3) has seven joints. We designed the arm size, joint configuration, and range of motion to resemble human reaching capabilities. Each arm is equipped with a 2 degree of freedom (DOF) gripper. Its maximum payload is 1 kg. From trunk to gripper, an arm consists of a 3 DOF shoulder, a 1 DOF elbow, and a 3 DOF wrist joint. The shoulder pitch joint is driven by two EX-106 actuators in synchronous mode to reach a holding torque of 20 Nm and a maximum rotational speed of 2.3 rad/s. Single EX-106 servos actuate the shoulder roll, the shoulder yaw, and the elbow pitch joint. The wrist consists of RX-64 actuators (6.4 Nm, 2 rad/s) in the yaw and pitch joints and an RX-28 servo (3.8 Nm, 2.6 rad/s) in the wrist roll joint. Both joints in the gripper are actuated by RX-28 servos.

All servos connect via a serial RS-485 bus to an Atmel ATmega128 microcontroller which forwards joint configurations like target joint angles, maximum torque, and target velocity from the main computer to the actuators. It also reports measured joint angles to the main computer.

The gripper contains four Sharp infrared (IR) distance sensors. With these sensors, Dynamaid is able to directly measure the alignment of the gripper towards objects. They measure distance in the range of 4 cm to 30 cm. One sensor is attached at the bottom of the wrist to measure objects like the table, for instance. Another sensor in the wrist perceives objects inside the hand. Finally, one sensor is attached at the tip of each finger. The sensor readings are AD-converted by another ATmega128 microcontroller in the wrist. This microcontroller is connected as a slave to the RS-485 network and forwards filtered measurements to the master microcontroller, which communicates them to the main computer.

Fig. 3. 7 DOF human-scale anthropomorphic arm with 2 DOF gripper.

Fig. 4. Overview of the control modules within Dynamaid’s behavior architecture.

In the trunk, Dynamaid is equipped with a Hokuyo URG-04LX laser range finder. The sensor is mounted on an RX-28 actuator that twists it around its roll axis, which is very useful for detecting objects in the horizontal as well as in the vertical plane. The trunk is additionally equipped with two joints. One trunk actuator can lift the entire upper body by 1 m. This allows for object manipulation at different heights. In the lowest position, the trunk laser is only 4 cm above the ground; hence, Dynamaid can pick up objects from the floor. In the highest position, Dynamaid is about 180 cm tall and can grab objects from 100 cm high tables. The second actuator allows twisting the upper body by ±90°. This extends the working space of the arms and allows the robot to face persons with its upper body without moving the base.

The head of Dynamaid consists of a white human face mask, a directional microphone, a time-of-flight camera, and a stereo camera on a pan-tilt neck built from two Dynamixel RX-64 actuators. The stereo camera consists of two PointGrey Flea2-13S2C-C color cameras with a maximum resolution of 1280×960 pixels. A MESA SwissRanger SR4000 camera is located between the two stereo cameras. It measures distance to objects within a range of 10 m.

Overall, Dynamaid currently has 35 joints, which can be accessed from the main computer via USB. The robot is powered by Kokam 5 Ah lithium polymer cells, which last for about 30 min of operation.

3 BEHAVIOR CONTROL ARCHITECTURE

Domestic service tasks require highly complex coordination of actuation and sensing. Thus, for successful system integration, a structured approach to behavior design is mandatory. Dynamaid’s autonomous behavior is generated in a modular multi-threaded control architecture. We employ the inter-process communication infrastructure of the Player/Stage project [1]. The control modules are organized in four layers, as shown in Fig. 4.

On the sensorimotor layer, data is acquired from the sensors, and position targets are generated and sent to the actuating hardware components. The kinematic control module, for example, processes distance measurements of the IR sensors in the gripper and feeds back control commands for the omnidirectional drive and the actuators in torso and arm.

The action-and-perception layer contains modules for person and object perception, safe local navigation, localization, and mapping. These modules use sensorimotor skills to achieve reactive action, and they process sensory information to perceive the state of the environment. For example, the local navigation module perceives its close surroundings with the SICK S300 LRF to drive safely to target poses.

Modules on the subtask layer coordinate sensorimotor skills, reactive action, and environment perception to achieve higher-level actions like mobile manipulation, navigation, and human-robot interaction. For example, the mobile manipulation module combines motion primitives for grasping and carrying of objects with safe omnidirectional driving and object detection.

Finally, at the task layer, the subtasks are further combined to solve complex tasks that require navigation, mobile manipulation, and human-robot interaction. One such task in the RoboCup@Home competition is to fetch one specific object out of a collection of objects from a shelf. The object is selected by a human user who is not familiar with the robot’s usage. Together, the subtask and the task layer form a hierarchical finite state machine.
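The layering above can be illustrated with a minimal sketch of such a two-level hierarchical finite state machine. All state, subtask, and task names below are hypothetical stand-ins for illustration, not Dynamaid's actual software:

```python
# Minimal sketch of a two-level hierarchical finite state machine in the
# spirit of the subtask/task layering described above. All names are
# illustrative, not Dynamaid's actual modules.

class StateMachine:
    def __init__(self, transitions, start):
        # transitions: {state: handler}; a handler returns the next state,
        # or None to signal that the machine has finished.
        self.transitions = transitions
        self.state = start

    def step(self):
        handler = self.transitions.get(self.state)
        if handler is None:
            return False
        self.state = handler()
        return self.state is not None

def make_navigate_subtask():
    # Subtask layer: a trivial "navigate" machine that finishes immediately.
    return StateMachine({"drive": lambda: None}, "drive")

def run_fetch_task():
    # Task layer: fetching an object = navigate, then grasp, then deliver.
    log = []

    def navigate():
        sub = make_navigate_subtask()   # delegate to a subtask machine
        while sub.step():
            pass
        log.append("navigated")
        return "grasp"

    def grasp():
        log.append("grasped")
        return "deliver"

    def deliver():
        log.append("delivered")
        return None                     # terminal state of the task machine

    task = StateMachine(
        {"navigate": navigate, "grasp": grasp, "deliver": deliver}, "navigate")
    while task.step():
        pass
    return log

print(run_fetch_task())  # → ['navigated', 'grasped', 'delivered']
```

A real task machine would additionally branch into recovery states on subtask failure.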

Our modular architecture design reduces the complexity of high-level domestic service tasks by successive abstraction through the layers. Lower-layer modules inform higher-layer modules in compact, abstracted form about the current state of the system. Higher-layer modules configure lower-layer modules through abstract interfaces. Also, while lower-layer modules need more frequent and precise execution timing, higher-layer modules are executed at lower frequency and precision.

4 SENSORIMOTOR SKILLS

High mobility is an important property of a domestic service robot. It must be able to maneuver close to obstacles and through narrow passages. To manipulate objects in a typical domestic environment, the robot needs the ability to reach objects over a wide range of heights, at large distances, and in flexible postures to avoid obstacles. Dynamaid’s mobile base is very maneuverable. It can drive omnidirectionally at arbitrary combinations of linear and rotational velocities within its speed limits. Its 7 DOF anthropomorphic arm is controlled with redundant inverse kinematics [2]. With its human-like mechanical design, it can reach objects over a wide range of heights at diverse arm postures. We implemented motion primitives, e.g., to grasp objects at arbitrary positions in the gripper’s workspace.

4.1 Control of the omnidirectional drive

We developed a control algorithm for Dynamaid’s mobile base that enables the robot to drive omnidirectionally. Its driving velocity can be set to arbitrary combinations of linear and rotational velocities. The orientation of the four drives and the linear velocities of the eight wheels are controlled kinematically such that their instantaneous centers of rotation (ICRs) coincide with the ICR that results from the velocity commands for the center of the base. The drives are mechanically restricted to a 270° orientation range. Thus, it is necessary to flip the orientation of a drive by 180° if it is close to its orientation limit. The main computer sends the target orientations and linear velocities to the microcontroller that communicates with the Dynamixel actuators. The microcontroller in turn implements closed-loop control of the wheel velocities. It smoothly aligns the drives to their target orientations by rotating simultaneously with the yaw actuators and the wheels. If a drive deviates significantly from its target orientation, the base slows down quickly and the drive is realigned.
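The per-drive kinematics can be sketched as follows. The ±135° steering limit corresponds to the 270° orientation range mentioned above; the drive mount positions are assumptions based on the 60×42 cm chassis, not the exact parameters of Dynamaid's controller:

```python
import math

# Sketch of the per-drive kinematics: the drive mount positions are assumed
# from the 60x42 cm chassis; the +/-135 degree limit follows from the
# 270 degree orientation range. Exact controller details differ.
DRIVE_POSITIONS = [(0.30, 0.21), (0.30, -0.21), (-0.30, 0.21), (-0.30, -0.21)]

def drive_commands(vx, vy, omega):
    """Return (steering angle [rad], wheel speed [m/s]) for each drive such
    that all drives are consistent with the commanded base twist (and thus
    share its instantaneous center of rotation)."""
    commands = []
    for px, py in DRIVE_POSITIONS:
        # Rigid-body velocity at the drive mount point.
        dvx = vx - omega * py
        dvy = vy + omega * px
        angle = math.atan2(dvy, dvx)
        speed = math.hypot(dvx, dvy)
        # Near the mechanical limit, flip the drive by 180 degrees and
        # reverse the wheel speed instead.
        if abs(angle) > math.radians(135):
            angle -= math.copysign(math.pi, angle)
            speed = -speed
        commands.append((angle, speed))
    return commands

# Pure forward motion: all drives point straight ahead at the same speed.
print(drive_commands(0.5, 0.0, 0.0))  # → [(0.0, 0.5), (0.0, 0.5), (0.0, 0.5), (0.0, 0.5)]
```

For a pure rotation, all four drives align tangentially around the base center with equal wheel speeds.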

4.2 Control of the anthropomorphic arm

The arm is controlled using differential inverse kinematics to follow trajectories of either the 6 DOF end-effector pose or the 3 DOF end-effector position:

θ̇ = J#(θ) ẋ − α (I − J#(θ)J(θ)) ∂g(θ)/∂θ,   (1)

where θ are the joint angles, ẋ is the velocity of the end-effector state variable x, J and J# are the Jacobian of the arm’s forward kinematics and its pseudoinverse, respectively, and α is a step size parameter. Redundancy is resolved using nullspace optimization [3] of the cost function g(θ) that favors convenient joint angles and penalizes angles close to the joint limits.
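One such differential-IK step can be sketched numerically on a hypothetical planar 3-link arm (redundant for a 2-D position task) rather than the real 7 DOF arm; the link lengths, the rest posture used in g(θ), and all gains below are illustrative assumptions:

```python
import numpy as np

# Numerical sketch of one differential-IK step with nullspace optimization,
# on a hypothetical planar 3-link arm. All parameters are illustrative.
L = np.array([0.3, 0.3, 0.2])            # link lengths [m]
THETA_REST = np.array([0.0, 0.5, 0.5])   # "convenient" joint angles for g

def fk(theta):
    a = np.cumsum(theta)                 # absolute link angles
    return np.array([np.sum(L * np.cos(a)), np.sum(L * np.sin(a))])

def jacobian(theta):
    a = np.cumsum(theta)
    J = np.zeros((2, 3))
    for i in range(3):
        # Column i: effect of joint i on x and y (all links from i onward).
        J[0, i] = -np.sum(L[i:] * np.sin(a[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(a[i:]))
    return J

def ik_step(theta, x_dot, alpha=0.1):
    """theta_dot = J# x_dot - alpha (I - J# J) dg/dtheta,
    with g(theta) = 0.5 ||theta - THETA_REST||^2."""
    J = jacobian(theta)
    J_pinv = np.linalg.pinv(J)
    dg = theta - THETA_REST              # gradient of the cost g
    nullspace = np.eye(3) - J_pinv @ J   # projector onto the self-motion space
    return J_pinv @ x_dot - alpha * nullspace @ dg

# Drive the end effector toward a target while relaxing toward the rest pose.
theta = np.array([0.8, -0.4, 0.2])
target = np.array([0.5, 0.3])
for _ in range(200):
    theta = theta + 0.05 * ik_step(theta, 2.0 * (target - fk(theta)))
print(np.round(fk(theta), 3))  # ≈ target
```

Because J(I − J#J) = 0, the nullspace term changes the arm posture without disturbing the end-effector motion to first order.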

We implemented several motion primitives for grasping, carrying, and handing over of objects. These motion primitives are either open-loop motion sequences or use feedback like the distance to objects as measured by the IR sensors, e.g., to adjust the height of the gripper over surfaces or to close the gripper when an object is detected between the fingers.
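The feedback-based gripper-closing primitive can be sketched as follows; the 4-30 cm range matches the Sharp IR sensors described earlier, while the detection threshold is an illustrative assumption:

```python
# Sketch of a feedback-based primitive: close the gripper once the IR sensor
# between the fingers reports an object. The 4-30 cm range matches the Sharp
# sensors described above; the detection threshold is illustrative.

def close_on_detect(ir_readings_cm, detect_threshold_cm=8.0):
    """Scan a stream of wrist-IR distance readings and return the index at
    which the gripper would be commanded to close, or None."""
    for i, distance in enumerate(ir_readings_cm):
        # A short, valid reading inside the hand means an object sits
        # between the fingers.
        if 4.0 <= distance <= detect_threshold_cm:
            return i
    return None

# An object slides into the hand; the gripper closes at the fourth reading.
print(close_on_detect([25.0, 18.0, 12.0, 6.5, 6.4]))  # → 3
```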

5 PERCEPTION OF OBJECTS AND PERSONS

Domestic service applications necessarily involve the interaction with objects and people. Dynamaid is equipped with a variety of sensors to perceive its environment. Its main sensors for object and person detection are the SICK S300 LRF, the Hokuyo URG-04LX LRF, and the stereo camera. The stereo camera is further used to recognize objects and people.

5.1 Object detection and localization

We primarily use the Hokuyo URG-04LX LRF for object detection and localization. As the laser is mounted on an actuated roll joint, its scan is not restricted to the horizontal plane. In horizontal alignment, the laser is used to find objects for grasping. The laser range scan is first segmented based on jump distance. Segments with specific size and Cartesian width are considered as potential objects. By filtering detections at a preferred object position over successive scans, objects are robustly tracked. In the vertical scan plane, the laser is very useful for detecting objects like tables and for estimating their distance and height. We use both types of object perception for mobile manipulation. We demonstrated their successful usage in the Fetch & Carry task at the RoboCup@Home competition.
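The jump-distance segmentation step can be sketched as follows; the jump threshold and the example scan are illustrative:

```python
import math

# Sketch of jump-distance segmentation of a horizontal laser scan. The jump
# threshold and the example scan are illustrative values.

def segment_scan(ranges, angle_min, angle_inc, jump=0.10):
    """Split a scan wherever consecutive ranges differ by more than `jump`
    meters; return each segment as a list of Cartesian points."""
    segments, current = [], []
    for i, r in enumerate(ranges):
        if current and abs(r - ranges[i - 1]) > jump:
            segments.append(current)
            current = []
        a = angle_min + i * angle_inc
        current.append((r * math.cos(a), r * math.sin(a)))
    if current:
        segments.append(current)
    return segments

def segment_width(segment):
    """Cartesian width: distance between a segment's first and last point."""
    (x0, y0), (x1, y1) = segment[0], segment[-1]
    return math.hypot(x1 - x0, y1 - y0)

# A narrow object at ~0.5 m in front of a wall at ~2 m yields three segments;
# the middle one has a small Cartesian width and is a grasp candidate.
segs = segment_scan([2.0, 2.01, 0.52, 0.50, 0.51, 2.02, 2.0], -0.06, 0.02)
print(len(segs), round(segment_width(segs[1]), 3))  # → 3 0.023
```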

5.2 Object recognition

The locations of the detected objects are mapped into the image plane (see Fig. 5a). In the rectangular regions of interest, both color histograms and SURF features [4] are extracted. The combined descriptor is compared to a repository of stored object descriptors. For each object class, multiple descriptors are recorded from different view points during training, in order to achieve a view-independent object recognition. If the extracted descriptor is similar to a stored descriptor, an object hypothesis is initialized in egocentric coordinates, as shown in Part b) of Fig. 5. This hypothesis is maintained using a multi-hypothesis tracker, which integrates object observations over time. When a recognition hypothesis is confirmed over several frames, it becomes confident and the robot can use this information, e.g., to grasp a specific object.
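The descriptor-matching step can be sketched using only normalized color histograms compared by histogram intersection (the actual system additionally combines SURF features); the labels, bin count, and similarity threshold below are illustrative assumptions:

```python
import numpy as np

# Sketch of descriptor matching against a repository of per-class views,
# using only color histograms and histogram intersection. Labels, bin
# counts, and the similarity threshold are illustrative.

def hist_intersection(h1, h2):
    # Both histograms are normalized to sum to 1, so the result is in [0, 1].
    return float(np.minimum(h1, h2).sum())

def recognize(query, repository, threshold=0.7):
    """Return the best-matching class label, or None if no stored view of
    any class is similar enough."""
    best_label, best_sim = None, threshold
    for label, views in repository.items():
        for stored in views:
            sim = hist_intersection(query, stored)
            if sim > best_sim:
                best_label, best_sim = label, sim
    return best_label

def normalize(h):
    return h / h.sum()

rng = np.random.default_rng(0)
cereal = normalize(rng.random(16))
cup = normalize(rng.random(16))
repo = {"cereal_box": [cereal], "cup": [cup]}       # one view per class here
query = normalize(cereal + 0.01 * rng.random(16))   # noisy view of the box
print(recognize(query, repo))  # → cereal_box
```

With several stored views per class, the same loop yields the view-independent recognition described above.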

5.3 Person detection and tracking

To interact with human users, a robot requires situational awareness of the persons in its surroundings. For this purpose, Dynamaid detects and keeps track of nearby persons. We combine both LRFs, on the base and in the torso, to track people. The SICK S300 LRF on the base detects legs, while the Hokuyo URG-04LX LRF detects trunks of people. Detections from the two sensors are fused with a Kalman filter which estimates position and velocity of a person. It also accounts for the ego-motion of the robot. In this way, we can robustly track a person in a dynamic environment, as we demonstrated in the Follow Me task at the RoboCup@Home competition.
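The fusion step can be sketched as a constant-velocity Kalman filter over [x, y, vx, vy] that accepts position detections from either laser, each with its own measurement noise; all noise magnitudes here are illustrative assumptions:

```python
import numpy as np

# Sketch of the two-sensor fusion: a constant-velocity Kalman filter over
# [x, y, vx, vy]. All noise magnitudes are illustrative assumptions.
DT = 0.1
F = np.array([[1, 0, DT, 0], [0, 1, 0, DT], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)  # both sensors observe (x, y)
Q = 0.01 * np.eye(4)                               # process noise
R_LEGS = 0.05 * np.eye(2)                          # base LRF (leg detections)
R_TRUNK = 0.02 * np.eye(2)                         # torso LRF (trunk detections)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    # Standard Kalman update with the sensor-specific noise R.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P

# A person walking in +x at 1 m/s, observed alternately by the two sensors.
x, P = np.zeros(4), np.eye(4)
for k in range(1, 51):
    x, P = predict(x, P)
    z = np.array([k * DT * 1.0, 0.0])
    x, P = update(x, P, z, R_LEGS if k % 2 else R_TRUNK)
print(np.round(x, 2))  # position ≈ (5, 0), velocity ≈ (1, 0)
```

The robot's ego-motion would additionally be subtracted in the prediction step when measurements are expressed in robot-centric coordinates.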

Fig. 5. Object recognition: a) the camera image with rectangular object regions that are computed from LRF object detections. The image also shows extracted SURF features (yellow dots) and the recognized object class. b) recognized objects are tracked in egocentric coordinates by a Kalman filter.

Dynamaid keeps track of multiple persons in her surroundings with a multi-hypothesis tracker. A hypothesis is initialized from detections in the trunk laser range finder. Both types of measurements, from the laser on the base and in the trunk, are used to update the hypotheses. For associating measurements to hypotheses, we use the Hungarian method [5].
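The association step amounts to a minimum-cost matching between track hypotheses and detections. The paper uses the Hungarian method [5]; for the handful of people present in a home, the exhaustive search below is an adequate stand-in, with an illustrative gating distance:

```python
import itertools, math

# Sketch of data association: assign detections to person-track hypotheses
# by minimizing summed distance. Exhaustive search over permutations stands
# in for the Hungarian method here; the gating distance is illustrative.

def associate(tracks, detections, gate=1.0):
    """Return {track index: detection index} minimizing total distance;
    pairs farther apart than `gate` are left unassociated."""
    n = min(len(tracks), len(detections))
    best, best_cost = {}, math.inf
    for perm in itertools.permutations(range(len(detections)), n):
        cost, pairs = 0.0, {}
        for ti, di in zip(range(len(tracks)), perm):
            d = math.dist(tracks[ti], detections[di])
            if d <= gate:
                cost += d
                pairs[ti] = di
            else:
                cost += gate  # an unmatched track pays the gating cost
        if cost < best_cost:
            best, best_cost = pairs, cost
    return best

tracks = [(0.0, 0.0), (2.0, 1.0)]
detections = [(2.1, 1.0), (0.1, -0.1)]
print(associate(tracks, detections))  # → {0: 1, 1: 0}
```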

Not every detection from the lasers is recognized as a person. Dynamaid verifies that a track corresponds to a person by looking towards the tracks and trying to detect a face in the camera images with the Viola & Jones algorithm [6].

5.4 Person identification

For the Who-is-Who and the Party-Bot tests, Dynamaid must introduce itself to persons and recognize these persons later on. Using the VeriLook SDK, we implemented a face enrollment and identification system. In the enrollment phase, Dynamaid approaches detected persons and asks them to look into the camera. The extracted face descriptors are stored in a repository. If Dynamaid meets a person later, she compares the new descriptor to the stored ones, in order to determine the identity of the person.

6 NAVIGATION

Most domestic service tasks are not carried out at one specific location, but require the robot to safely navigate in its environment. For this purpose, it must be able to estimate its pose in a given map, to plan obstacle-free paths in the map, and to drive safely along the path despite dynamic obstacles. Finally, the robot needs the ability to acquire a map of a previously unknown environment with its sensors.


Fig. 6. Top: Arena of the 2009 RoboCup@Home competition in Graz. Bottom: Map of the arena generated with GMapping [8].

6.1 Simultaneous Localization and Mapping

To acquire maps of unknown environments, we apply a FastSLAM2 [7] approach to the Simultaneous Localization and Mapping (SLAM) problem. In this approach, the posterior over trajectory and map given motion commands and sensor readings is estimated with a set of weighted particles. By factorization of the SLAM posterior, Rao-Blackwellization can be applied to the SLAM problem: the particles contain discrete trajectory estimates and individual map estimates in the form of occupancy grid maps. We apply the GMapping implementation [8], which is contained in the OpenSLAM open source repository. Fig. 6 shows the RoboCup@Home arena at RoboCup 2009 in Graz and a generated map.
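The occupancy grid maps carried by the particles are commonly updated per laser beam with a log-odds inverse sensor model. The following sketch shows that standard update with illustrative increments; it is not GMapping's actual code:

```python
import numpy as np

# Sketch of the occupancy grid representation each particle carries: cells
# hold log-odds of occupancy, updated per beam with a standard inverse
# sensor model. The increments are illustrative (not GMapping's code).
L_OCC, L_FREE = 0.85, -0.4   # log-odds increments for hit / traversed cells

def update_beam(grid, cells_along_beam):
    """Mark all but the last cell on a beam as free, the last as occupied."""
    for r, c in cells_along_beam[:-1]:
        grid[r, c] += L_FREE
    r, c = cells_along_beam[-1]
    grid[r, c] += L_OCC

def occupancy(grid):
    """Convert log-odds back to occupancy probabilities."""
    return 1.0 - 1.0 / (1.0 + np.exp(grid))

grid = np.zeros((5, 5))
# One beam from cell (0,0) hitting an obstacle at cell (0,3), seen 3 times.
for _ in range(3):
    update_beam(grid, [(0, 0), (0, 1), (0, 2), (0, 3)])
p = occupancy(grid)
print(np.round(p[0, :4], 2))  # → [0.23 0.23 0.23 0.93]
```

Repeated observations accumulate additively in log-odds space, which is what makes the maps robust to occasional spurious readings.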

6.2 Localization

In typical indoor environments, large parts of the environment, like walls and furniture, are static. Thus, once the robot has obtained a map of the environment through SLAM, it can use this map for localization. We apply a variant of adaptive Monte Carlo Localization [9] to estimate the robot’s pose in a given occupancy grid map. The robot’s main sensor for localization is the SICK S300 laser range finder. The particles are sampled from a probabilistic motion model which captures the noise in the execution of motion commands. When new laser sensor readings are available, the particles are weighted with the observation likelihood of the laser scan given the robot’s pose. We use the end-point model for laser range scans, as it can be implemented efficiently through look-up tables and it is more robust than the ray-cast model to small changes in the environment.
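The end-point model can be sketched as follows: each beam endpoint is projected into the map and scored against a precomputed table of distances to the nearest obstacle. The map, resolution, and noise parameter below are illustrative assumptions:

```python
import math

# Sketch of the end-point ("likelihood field") model: beam endpoints are
# scored against a precomputed distance-to-nearest-obstacle look-up table.
# Map, resolution, and noise parameter are illustrative.
SIGMA = 0.1   # assumed endpoint noise [m]
RES = 0.5     # grid resolution [m/cell]
DIST = [      # distance [m] to the nearest obstacle; obstacle in cell (0, 1)
    [0.5, 0.0, 0.5],
    [0.7, 0.5, 0.7],
    [1.0, 0.9, 1.0],
]

def weight(pose, beams):
    """Multiply per-beam endpoint likelihoods for one particle pose."""
    px, py, ptheta = pose
    w = 1.0
    for r, bearing in beams:
        # Project the beam endpoint into the map frame.
        ex = px + r * math.cos(ptheta + bearing)
        ey = py + r * math.sin(ptheta + bearing)
        d = DIST[int(ey / RES)][int(ex / RES)]
        w *= math.exp(-d * d / (2 * SIGMA ** 2))
    return w

# A particle whose predicted endpoint lands on the obstacle cell scores far
# higher than one whose endpoint lands one cell off.
good = weight((0.75, 0.75, -math.pi / 2), [(0.5, 0.0)])
bad = weight((0.25, 0.75, -math.pi / 2), [(0.5, 0.0)])
print(good > bad)  # → True
```

Because the table is precomputed once per map, weighting a particle costs only one look-up and one multiplication per beam.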

6.3 Path planning

To navigate in its environment, the robot needs the ability to plan paths from its estimated pose in the map to target locations. We apply A* search [10] to find short obstacle-avoiding paths in the grid map. As heuristic, we use the Euclidean distance to the target location. The traversal cost of a cell is composed of the traveled Euclidean distance and an obstacle-density cost which is inversely proportional to the distance to the closest obstacle in the map. To be able to treat the robot as a point, we inflate the obstacles in the map by the robot radius. In this way, obstacle-free paths are found that trade off shortness against distance to obstacles. The resulting path is compressed to a list of waypoints. We generate waypoints when a specific travel distance to the previous waypoint has been reached, or at locations on the path with high curvature.
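The planner described above can be sketched as grid-based A* whose step cost adds an obstacle-density term to the traveled distance; the map and weights are illustrative, and the map is assumed to be already inflated by the robot radius:

```python
import heapq, math

# Sketch of grid-based A* with the combined cost described above: traveled
# distance plus an obstacle-density term inversely proportional to the
# distance to the closest obstacle. Map and weights are illustrative; the
# map is assumed to be already inflated by the robot radius.
GRID = [  # 0 = free, 1 = obstacle
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def nearest_obstacle_dist(r, c):
    return min((math.dist((r, c), (i, j))
                for i, row in enumerate(GRID)
                for j, v in enumerate(row) if v), default=math.inf)

def plan(start, goal, density_weight=0.5):
    """A* over 4-connected cells; returns the path as a list of cells."""
    open_set = [(math.dist(start, goal), 0.0, start, None)]
    came_from, closed = {}, set()
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in closed:
            continue
        closed.add(cell)
        came_from[cell] = parent
        if cell == goal:                       # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]) and not GRID[nr][nc]:
                # Step cost: unit travel + penalty for hugging obstacles.
                step = 1.0 + density_weight / max(nearest_obstacle_dist(nr, nc), 0.5)
                heapq.heappush(open_set, (g + step + math.dist((nr, nc), goal),
                                          g + step, (nr, nc), cell))
    return None

path = plan((0, 0), (2, 4))
print(path)  # a 7-cell path around the obstacle block
```

Waypoint compression would then thin this cell list, keeping points at fixed travel intervals or high curvature.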

6.4 Safe local navigation

The path planning module only considers obstacles which are represented in the map. To navigate in partially dynamic environments, we implemented a module for local path planning and obstacle avoidance. From the sensor readings, we estimate a local occupancy grid map. Again, the obstacles are enlarged by the robot’s shape. A path through the visible obstacle-free area is planned to the next waypoint by A*, which also uses obstacle-density as a path cost component. The omnidirectional driving capability of our mobile base simplifies the execution of the planned path significantly.

7 MOBILE MANIPULATION

To robustly solve mobile manipulation tasks, we integrate object detection, safe navigation, and motion primitives. Dynamaid can grasp objects, carry them, and hand them to human users. To grasp an object from a specific location, Dynamaid first navigates roughly in front of the object through global navigation. Then, it uses vertical object detection to determine the distance to and height of the horizontal surface to manipulate on. With its torso lifting actuator, it adjusts the height of the torso such that the trunk LRF measures objects shortly above the surface in the horizontal scan plane. It approaches the table as closely as possible through safe local navigation. Next, it detects the object to manipulate in the horizontal plane. If necessary, it aligns to the object in sideways direction using safe local navigation again. Finally, it performs a motion primitive to grasp the object at the perceived location.

Fig. 7. Dynamaid delivers a cereal box during the Supermarket test at RoboCup 2009 in Graz.

When Dynamaid hands over an object to a human, the user can trigger the release of the object either by a speech command or by simply taking the object. As the arm actuators are back-drivable and support moderate compliance, the user can easily displace them by pulling the object. The actuators measure this displacement, and the arm complies with the motion of the user. When the pulling persists, Dynamaid opens her gripper and releases the object. Fig. 7 shows how Dynamaid hands an object to a human user during RoboCup 2009.

8 HUMAN-ROBOT INTERACTION

Dynamaid mainly communicates with humans through speech. For both speech synthesis and recognition, we use the commercial system from Loquendo. To make communication with Dynamaid more intuitive, Dynamaid gazes at tracked people by using its pan-tilt neck.

Loquendo’s speech recognition is speaker-independent and recognizes predefined grammars even in noisy environments. The Loquendo text-to-speech system supports expressive cues to speak with natural and colorful intonation. It also supports modulation of pitch and speed, and special sounds like laughing and coughing. Qualitatively, the female synthesized speech is very human-like.

We also implemented several gestures that Dynamaid performs with her arms and her head, such as greeting people and pointing to objects. One advantage of using anthropomorphic robots is that the human users can use a human motion model to predict the robot’s motion. We utilize this, for example, by gazing in the driving direction and by gazing at objects prior to grasping.

9 SYSTEM EVALUATION

Benchmarking robotic systems is difficult, as videos of robot performances captured in one’s own lab are frequently impressive but hard to compare. In recent years, robot competitions, such as the DARPA Grand and Urban Challenges and RoboCup, have therefore come to play an important role in assessing the performance of robot systems. At such a competition, the robot has to perform tasks defined by the rules of the competition, in a given environment, at a predetermined time. The simultaneous presence of multiple teams allows for a direct comparison of the robot systems by measuring objective performance criteria, and also by subjective judgment of the scientific and technical merit by a jury.

The international RoboCup competitions, best known for robot soccer, now also include the @Home league for domestic service robots. The rules of the league require fully autonomous robots to robustly navigate in a home environment, to interact with human users using speech and gestures, and to manipulate objects that are placed on the floor, in shelves, or on tables. The robots can show their capabilities in several predefined tests, such as following a person, fetching an object, or recognizing persons. In addition, there are open challenges and the final demonstration, where the teams can highlight the capabilities of their robots in self-defined tasks.

9.1 RoboCup German Open 2009

Our team NimbRo [11] participated for the first time in the @Home league at RoboCup German Open 2009 during the Hannover Fair.

In Stage I, we used our communication robot Robotinho for the Introduce task. In this test, the robot has to introduce itself and the development team to the audience. It may interact with humans to demonstrate its human-robot interaction skills. The team leaders of the other teams judge the performance of the robot on criteria like quality of human-robot interaction, appearance, and robustness of mobility. Robotinho explained itself and Dynamaid and interacted with a human in a natural way. The jury awarded Robotinho the highest score of all robots in this test.

For the Follow Me test, we used Dynamaid. She was able to quickly follow an unknown human through the arena, outside into an unknown, dynamic, and cluttered environment, and back into the arena again. She could also be controlled by voice commands to stop, to move in given directions, and to start following the unknown person. Performance criteria in this test are human-robot interaction, safe navigation, and robust person following. Dynamaid achieved the highest score in this test. Dynamaid also accomplished the Fetch & Carry task very well. In this test, a human user asks Dynamaid to fetch an object from one of five locations. The user is allowed to give a hint about the location through speech. Dynamaid reliably delivered the requested object and again achieved the highest score for her human-robot interaction and manipulation skills.

Dynamaid, an Anthropomorphic Robot for Research on Domestic Service Applications J. Stückler, S. Behnke

In Stage II, Dynamaid performed the Walk & Talk task perfectly. A human showed her five places in the apartment that she could visit afterwards as requested by spoken commands. Shortly before the run, the apartment is modified to test the ability of the robots to navigate in unknown environments. In the Demo Challenge, Dynamaid demonstrated her skills as a waitress: multiple users could order different drinks, which she fetched quickly and reliably from various places in the apartment.

In the final, Robotinho gave a tour through the apartment while Dynamaid fetched a drink for a guest. The score in the final is composed of the team's previous performance in Stage I and Stage II and an evaluation score by independent researchers who judge scientific contribution, originality, usability, presentation, multi-modality, difficulty, success, and relevance. A video of the robots' performance is available on our web page (http://www.NimbRo.net/@Home). Overall, the NimbRo@Home team reached second place, only a few points behind the b-it-bots [12].

9.2 RoboCup 2009

Some of Dynamaid's features, such as object recognition, face recognition, the second arm, and the movable trunk, became operational for the RoboCup 2009 competition, which took place in July in Graz, Austria. Fig. 6 shows both of our robots in the arena, which consisted of a living room, a kitchen, a bedroom, a bathroom, and an entrance corridor. 18 teams from 9 countries participated in this competition.

In the Introduce test in Stage I, Robotinho explained itself and Dynamaid, while she cleaned up an object from the table. Together, our robots reached the highest score in this test. Dynamaid successfully performed the Follow Me and the Who-is-Who tests. In Who-is-Who, the robot must detect three persons, approach them, ask for their names, remember their faces, and recognize them again when they leave the apartment. Dynamaid detected one person and recognized this person among the three persons leaving. She reached the second highest score in this test. Both robots reached the second highest score in the Open Challenge, where Robotinho gave a home tour to a guest while Dynamaid delivered a drink.

In Stage II, Dynamaid performed the Walk & Talk test very well. In the Supermarket test, the robot must fetch three objects from a shelf on spoken command of a user. The robot must first explain its operation to the user. Dynamaid recognized all requested objects, fetched two of them from different levels, and handed them to the user. This yielded the highest score in the test. Dynamaid also performed the Partybot test very well. She detected a person, asked for the person's name, went to the fridge, recognized the person when he or she ordered a drink, and delivered the drink. Both robots were used in the Demo Challenge. The theme of the challenge was 'in the bar': Robotinho offered snacks to the guests while Dynamaid detected persons, approached them, offered drinks, took the orders, fetched the drinks from the kitchen table, and delivered them to the guests. The jury awarded 90% of the reachable points for this performance.

After Stage II, our team had the second most points, almost on par with the leading team. In the final, Dynamaid detected persons and delivered drinks to them. Overall, our team reached third place in the @Home competition. We also won the innovation award for "Innovative robot body design, empathic behaviors, and robot-robot cooperation".

10 RELATED WORK

An increasing number of research groups worldwide are working on complex robots for domestic service applications. For example, the Personal Robot One (PR1) [13] has been developed at Stanford University. Its design couples a differential drive with torso rotation to approximate holonomic motion. The robot has two 7-DOF arms for teleoperated manipulation. The authors discuss safety issues: compared, e.g., to a Puma 560 industrial robot, the risk of serious injury is reduced dramatically. The successor PR2 is currently being developed by Willow Garage. It has four individually steerable wheels, similar to our robot.

The U.S. company Anybots [14] developed the robot Monty (170 cm, 72 kg), which has one fully articulated hand (driven by 18 motors) and one gripper, and balances on two wheels. The robot is supplied externally with compressed air. Videos are available online in which the robot manipulates household objects under teleoperation.

At Waseda University in Japan, the robot Twendy-One [15] is being developed. Twendy-One is 147 cm tall and weighs 111 kg. It moves on an omnidirectional wheeled base and has two anthropomorphic arms with four-fingered hands. The head contains cameras, but is not expressive. Several videos captured in the lab are available in which the robot manipulates various objects, presumably teleoperated.

One impressive piece of engineering is the robot Rollin' Justin [16], developed at DLR, Germany. Justin is equipped with larger-than-human, compliantly controlled lightweight arms and two four-finger hands. The upper body is supported by a four-wheeled mobile platform with individually steerable wheels, similar to our design. While Justin is able to give impressive demonstrations, e.g. at CeBIT 2009, the robot does not yet seem to be capable of autonomous operation in a home environment, as required for RoboCup@Home. The DLR arms have also been used in the DESIRE project [17].

The Care-O-bot 3 [18] is the latest version of the domestic service robots developed at Fraunhofer IPA. The robot is equipped with four individually steerable wheels, a 7-DOF industrial manipulator from Schunk, and a tray for interaction with persons. Objects are not passed directly from the robot to persons, but placed on the tray.

Pineau et al. [19] developed a domestic service robot for assistance of the elderly. In contrast to Dynamaid, the robot does not possess manipulation capabilities. It demonstrated robust human-robot interaction to guide and assist persons in daily activities. Their experiments indicate that the use of autonomous mobile systems in such applications is feasible and accepted by the elderly participants. Cesta et al. [20] evaluated the acceptance of a similar domestic service robot in tasks such as object finding, medical treatment reminding, and safety observation. Overall, the authors report that test subjects react positively to assistive robots in their homes. While such cognitive assistance can be of great value to the cognitively impaired, complex robots with manipulation capabilities may further extend the self-sustained living of elderly people.

11 CONCLUSION

The experiences gained at RoboCup German Open 2009 and RoboCup 2009 in Graz clearly demonstrate our success in integrating robust navigation, mobile manipulation, and intuitive human-robot interaction in our domestic service robot Dynamaid. In contrast to other systems [12, 21], which consist mainly of a differential drive and a small Katana manipulator, Dynamaid's omnidirectional base allows for flexible motion in the narrow passages found in home environments. The anthropomorphic design of its upper body and the torso lift actuator enable it to handle objects in a wide range of everyday household situations. Its anthropomorphic body scheme facilitates natural interaction with human users. In addition to robust navigation and object manipulation, Dynamaid's human-robot interaction skills were also well received by the high-profile juries.

In contrast to the systems described in Section 10, Dynamaid is lightweight, inexpensive, easy to interface, and fully autonomous. To construct the robot, we relied on the intelligent actuators that we previously used to construct the humanoid soccer robots which won the RoboCup 2007 and 2008 soccer tournaments [22]. In our modular control architecture, we implemented task execution in hierarchical finite state machines. The abstraction through the behavior layers reduces the complexity in the design of states and state transitions. In this way, robust task execution can be achieved.
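The way hierarchical finite state machines reduce design complexity can be illustrated with a small sketch. The layer, state, and event names below are invented for illustration and do not reflect the actual NimbRo controller; the point is only that each layer reasons over a few abstract events rather than the full state space.

```python
class StateMachine:
    """Minimal finite state machine: `transitions` maps each state to
    the events it reacts to and the successor state for each event."""

    def __init__(self, transitions, start):
        self.transitions = transitions
        self.state = start

    def handle(self, event):
        # Unknown events leave the current state unchanged.
        self.state = self.transitions.get(self.state, {}).get(event, self.state)
        return self.state


# Subtask layer: fetching an object is a state machine of its own
# (hypothetical states and events).
fetch = StateMachine(
    {"approach": {"at_location": "grasp"},
     "grasp": {"grasped": "done"}},
    start="approach")

# Task layer: delivering a drink treats the whole fetch subtask as a
# single state and only sees its abstract outcome ("fetch_done"),
# so it never deals with the subtask's internal events.
deliver = StateMachine(
    {"fetch": {"fetch_done": "hand_over"},
     "hand_over": {"handed_over": "done"}},
    start="fetch")
```

Because each layer handles only a handful of abstract events, the number of states and transitions the designer must consider stays small at every level.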

Dynamaid's navigation capabilities mainly rely on 2D environment representations perceived through 2D laser range finders. Localization in such maps works reliably when large parts of the environment remain static. Safe, collision-free navigation is possible when obstacles can be measured in one of the scan planes of the LRFs. Dynamaid demonstrated successful mobile manipulation in the RoboCup setting. Our approach to coordinating object perception, safe local navigation, and kinematic control yields high success rates. For communication with users, Dynamaid primarily utilizes speech. Dynamaid's dialogue system guides the conversation to achieve the task goal. Her humanoid upper body scheme helps the user to perceive the robot's attention. Gaze control strategies make the actions of the robot predictable for users.

In future work, we will implement planning on the task and subtask layers to achieve more flexible behavior and to relieve the behavior designer from foreseeing the course of actions. The perception and representation of semantic knowledge is also required for high-level planning. To improve navigation, 3D perception with time-of-flight cameras and gaze strategies can be used for obstacle avoidance [23]. The representation of movable objects like doors [24] and furniture will further improve the robustness of localization and mapping. For mobile manipulation in less constrained environments, for example, for grasping obstructed objects from shelves or for receiving objects from users, real-time motion planning techniques will be required.

In its current state, Dynamaid is not finished. We plan to equip Dynamaid with an expressive communication head, as demonstrated on Robotinho [11, 25]. This will make intuitive multimodal communication with humans easier.

ACKNOWLEDGMENT

This work has been partially supported by grant BE 2556/2-3 of the German Research Foundation (DFG).

REFERENCES

[1] B. Gerkey, R. T. Vaughan, and A. Howard, "The Player/Stage project: Tools for multi-robot and distributed sensor systems," in Proc. of the Int. Conf. on Advanced Robotics (ICAR), 2003.

[2] J. Burdick, "On the inverse kinematics of redundant manipulators: characterization of the self-motion manifolds," in Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), 1989.

[3] J. Hollerbach and K. Suh, "Redundancy resolution of manipulators through torque optimization," IEEE Journal of Robotics and Automation, vol. 3, no. 4, pp. 308–316, 1987.

[4] H. Bay, T. Tuytelaars, and L. Van Gool, "SURF: Speeded up robust features," in Proc. of the 9th European Conference on Computer Vision (ECCV), 2006.

[5] H. Kuhn, "The Hungarian method for the assignment problem," Naval Research Logistics Quarterly, vol. 2, no. 1, pp. 83–97, 1955.

[6] P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features," in Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 2001.

[7] M. Montemerlo, S. Thrun, D. Koller, and B. Wegbreit, "FastSLAM 2.0: An improved particle filtering algorithm for simultaneous localization and mapping that provably converges," in Proc. of the Int. Joint Conf. on Artificial Intelligence (IJCAI), pp. 1151–1156, 2003.

[8] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.

[9] D. Fox, "Adapting the sample size in particle filters through KLD-sampling," Int. J. of Robotics Research (IJRR), vol. 22, no. 12, 2003.

[10] P. Hart, N. Nilsson, and B. Raphael, "A formal basis for the heuristic determination of minimal cost paths," IEEE Transactions on Systems Science and Cybernetics, vol. 4, no. 2, pp. 100–107, 1968.

[11] S. Behnke, J. Stückler, and M. Schreiber, "NimbRo@Home team description," in RoboCup Team Descriptions, Graz, Austria, 2009.

[12] D. Holz, J. Paulus, T. Breuer, G. Giorgana, M. Reckhaus, F. Hegger, C. Müller, Z. Jin, R. Hartanto, P. Ploeger, and G. Kraetzschmar, "The b-it-bots," in RoboCup Team Descriptions, Graz, Austria, 2009.

[13] K. Wyrobek, E. Berger, H. Van der Loos, and K. Salisbury, "Towards a personal robotics development platform: Rationale and design of an intrinsically safe personal robot," in Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), 2008.

[14] Anybots, Inc., "Balancing manipulation robot Monty," 2009. http://www.anybots.com.

[15] Sugano Laboratory, Waseda University, "Twendy-One: Concept," 2009. http://twendyone.com.

[16] C. Ott, O. Eiberger, W. Friedl, B. Bäuml, U. Hillenbrand, C. Borst, A. Albu-Schäffer, B. Brunner, H. Hirschmüller, S. Kielhöfer, R. Konietschke, M. Suppa, T. Wimböck, F. Zacharias, and G. Hirzinger, "A humanoid two-arm system for dexterous manipulation," in Proc. of the IEEE-RAS Int. Conf. on Humanoid Robots (Humanoids), 2006.

[17] P. Plöger, K. Pervölz, C. Mies, P. Eyerich, M. Brenner, and B. Nebel, "The DESIRE service robotics initiative," KI Zeitschrift, vol. 3, 2008.

[18] C. Parlitz, M. Hägele, P. Klein, J. Seifert, and K. Dautenhahn, "Care-O-bot 3 - rationale for human-robot interaction design," in Proc. of the 39th Int. Symposium on Robotics (ISR), Seoul, Korea, 2008.

[19] J. Pineau, M. Montemerlo, M. Pollack, N. Roy, and S. Thrun, "Towards robotic assistants in nursing homes: Challenges and results," Robotics and Autonomous Systems, vol. 42, no. 1, 2003.

[20] A. Cesta, G. Cortellessa, M. Giuliani, F. Pecora, M. Scopelliti, and L. Tiberio, "Psychological implications of domestic assistive technology for the elderly," PsychNology Journal, vol. 5, no. 3, pp. 229–252, 2007.

[21] S. Schiffer, T. Niemüller, M. Doostdar, and G. Lakemeyer, "AllemaniACs team description," in RoboCup Team Descriptions, Graz, Austria, 2009.

[22] S. Behnke and J. Stückler, "Hierarchical reactive control for humanoid soccer robots," Int. J. of Humanoid Robots (IJHR), vol. 5, no. 3, 2008.

[23] D. Droeschel, D. Holz, J. Stückler, and S. Behnke, "Using time-of-flight cameras with active gaze control for 3D collision avoidance," in Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), 2010.

[24] M. Nieuwenhuisen, J. Stückler, and S. Behnke, "Improving indoor navigation of autonomous robots by an explicit representation of doors," in Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), 2010.

[25] F. Faber, M. Bennewitz, C. Eppner, A. Görög, C. Gonsior, D. Joho, M. Schreiber, and S. Behnke, "The humanoid museum tour guide Robotinho," in Proc. of the 18th IEEE Int. Symposium on Robot and Human Interactive Communication (RO-MAN), 2009.