
IEEE TRANSACTIONS ON ROBOTICS, VOL. X, NO. X, MONTH YEAR 1

Human-Inspired Robotic Grasp Control with Tactile Sensing

Joseph M. Romano, Student Member, IEEE, Kaijen Hsiao, Member, IEEE, Gunter Niemeyer, Member, IEEE, Sachin Chitta, Member, IEEE, and Katherine J. Kuchenbecker, Member, IEEE

Abstract—We present a novel robotic grasp controller that allows a sensorized parallel jaw gripper to gently pick up and set down unknown objects once a grasp location has been selected. Our approach is inspired by the control scheme that humans employ for such actions, which is known to centrally depend on tactile sensation rather than vision or proprioception. Our controller processes measurements from the gripper's fingertip pressure arrays and hand-mounted accelerometer in real time to generate robotic tactile signals that are designed to mimic human SA-I, FA-I, and FA-II channels. These signals are combined into tactile event cues that drive the transitions between six discrete states in the grasp controller: Close, Load, Lift and Hold, Replace, Unload, and Open. The controller selects an appropriate initial grasping force, detects when an object is slipping from the grasp, increases the grasp force as needed, and judges when to release an object to set it down. We demonstrate the promise of our approach through implementation on the PR2 robotic platform, including grasp testing on a large number of real-world objects.

Index Terms—robot grasping, tactile sensing

I. INTRODUCTION

As robots move into human environments, they will need to know how to grasp and manipulate a very wide variety of objects [1]. For example, some items may be soft and light, such as a stuffed animal or an empty cardboard box, while others may be hard and dense, such as a glass bottle or an apple. After deciding where such objects should be grasped (finger placement), the robot must also have a concept of how to execute the grasp (finger forces and reactions to changes in grasp state). A robot that operates in the real world must be able to quickly grip a wide variety of objects firmly, without dropping them, and delicately, without crushing them (Fig. 1).

Non-contact sensors such as cameras and laser scanners are essential for robots to recognize objects and plan where to grasp them, e.g., [2], [3]. Recognition and planning may also instruct the grasping action, for example providing an object's expected stiffness, weight, frictional properties, or simply the required grasp forces. But to safely handle unknown objects as well as to remain robust to inevitable uncertainties, any such a priori information must be complemented by real-time tactile sensing and observations. Indeed, tactile sensors are superior to other modalities at perceiving interactive events, such as the slip of an object held in the fingers, a glancing

Manuscript received August 1, 2010; revised April 12, 2011.

J. M. Romano and K. J. Kuchenbecker are with the Department of Mechanical Engineering and Applied Mechanics, University of Pennsylvania, Philadelphia, PA, 19104 USA. e-mail: {jrom, kuchenbe}@seas.upenn.edu

K. Hsiao, G. Niemeyer and S. Chitta are with Willow Garage, Menlo Park, CA, 94025 USA. e-mail: {hsiao, gunter, sachinc}@willowgarage.com

Fig. 1. The Willow Garage PR2 robot using our grasp controller to carefully handle two sensitive everyday objects.

collision between the object and an unseen obstacle, or the deliberate contact when placing the object. Understanding contact using tactile information and reacting in real time will be critical skills for robots to successfully interact with real-world objects, just as they are for humans.

A. Human Grasping

Neuroscientists have thoroughly studied the human talent for grasping and manipulating objects. As recently reviewed by Johansson and Flanagan [4], human manipulation makes great use of tactile signals from several different types of mechanoreceptors in the glabrous (non-hairy) skin of the hand, with vision and proprioception providing information that is less essential. Table I provides a list of the four types of mechanoreceptors in human glabrous skin. Johansson and Flanagan divide the seemingly effortless action of picking up an object and setting it back down into seven distinct states: reach, load, lift, hold, replace, unload, and release. In the first phase, humans close their grasp to establish finger contact with the object. Specifically, the transition from reach to load is known to be detected through the FA-I (Meissner) and FA-II (Pacinian) afferents, which are stimulated by the initial fingertip contact. FA signifies that these mechanoreceptors are fast-adapting; they respond primarily to changes in mechanical stimuli, with FA-I and FA-II having small and large receptive


TABLE I
MECHANORECEPTORS IN GLABROUS HUMAN SKIN [4], [6]–[8].

Mechanoreceptor              Receptive Field (mm²)   Response (Hz)
SA-I (Merkel Disc)           2–4                     0–5
SA-II (Ruffini Endings)      >20                     0–5
FA-I (Meissner Corpuscle)    3–5                     5–50
FA-II (Pacinian Corpuscle)   >20                     20–1000

fields, respectively. Once contact has been detected, humans increase their grasp force to the target level, using both pre-existing knowledge about the object and tactile information gathered during the interaction. This loading process is regulated largely by the response of the SA-I (Merkel) afferents, which are slowly-adapting with small receptive fields. The load phase ends when the target grasp force is reached with a stable hand posture.

Once the object is securely grasped, humans use their arm muscles to lift up the object, hold it in the air, and possibly transport it to a new location. Corrective actions (typically increases in grip force) are applied during the lifting and holding phases when the tactile feedback does not match the expected result. Srinivasan et al. [5] showed that the FA-I and FA-II signals are the primary sources of information for detecting both fingertip slip and new object contact. Slip is of critical importance for rejecting disturbances in the lifting and holding phases. When it comes time to place the item back on the table, contact between the object and the table must be detected to successfully transition to unloading. The SA-I afferents are again important during unload to properly set the object down before full release.

These tactile sensing capabilities and corrective reactions enable humans to adeptly hold a very wide range of objects without crushing or dropping them. Indeed, humans typically apply a grip force that is only 10–40% more than the minimum amount needed to avoid slip [4], thereby achieving the dual goals of safety and efficiency.

B. Our Approach: Human-Inspired Robotic Grasp Control

Inspired by the fluidity of human grasp control, this article presents a set of methods that enable a robot to delicately and firmly grasp real-world objects. We assume that fingertip placement as well as hand and body movements have already been determined using non-contact sensors and appropriate planners. We describe robotic sensing methods that use finger-mounted pressure arrays and a hand-mounted accelerometer to mimic the important tactile signals provided by human FA-I, FA-II, and SA-I mechanoreceptors. Noticeably absent in our approach are the SA-II mechanoreceptors, which are known to respond to tangential loading such as skin stretch. We omit this channel because our current experimental system cannot measure such signals, though our approach could be expanded to include them if they were available. The three sensory channels that we construct allow us to create a high-level robotic grasp controller that emulates human tactile manipulation: in the words of Johansson and Flanagan, our controller is "centered on mechanical events that mark transitions between consecutive action phases that represent subgoals of the overall task" [4]. As diagrammed in Fig. 2, our approach separates robotic grasping into six discrete states: Close, Load, Lift and Hold, Replace, Unload, and Open.

These states purposefully mirror those of human grasping, although we have combined Lift and Hold because their control responses are nearly identical for the hand. Each state defines a set of rules for controlling a robotic gripper to perform the specified behavior based on the tactile sensations it experiences. In addition to creating this approach to robotic grasp control, we implemented our methods on the standardized hardware and software of the Willow Garage PR2 robot; our goal was to enable it to perform two-fingered grasps on typical household objects at human-like speeds, without crushing or dropping them.
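The six states form an event-driven state machine: each state runs until a tactile event triggers a transition. The following minimal Python sketch illustrates this structure; the state names come from Fig. 2, but the event names and the flat transition table here are simplifying assumptions of ours, not the full rule set developed in Section V.

```python
# Minimal event-driven sketch of the six-state grasp controller.
# State names follow Fig. 2; the event names below are simplified
# assumptions -- the full transition rules are defined in Section V.

TRANSITIONS = {
    ("Close", "both_fingers_in_contact"): "Load",
    ("Load", "stable_contact"): "Lift and Hold",
    ("Lift and Hold", "place_requested"): "Replace",
    ("Replace", "slip_or_vibration"): "Unload",
    ("Unload", "grasp_force_zero"): "Open",
}

class GraspStateMachine:
    def __init__(self):
        self.state = "Close"  # closing is the starting point for any grasp

    def handle_event(self, event):
        """Advance to the next state if (state, event) names a transition."""
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is not None:
            self.state = nxt
        return self.state

sm = GraspStateMachine()
sm.handle_event("both_fingers_in_contact")   # Close -> Load
sm.handle_event("stable_contact")            # Load -> Lift and Hold
```

Encoding the transitions as a table keeps the tactile-event logic (Section V) separate from the per-state control laws (Section IV).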

Section II summarizes previous work in the area of tactile robotic grasping and substantiates the novelty of our approach. Section III describes pertinent attributes of the PR2 platform, while Section IV defines our robotic SA-I, FA-I, and FA-II tactile channels and the low-level position and force control strategies we created for the PR2's high-impedance gripper. Section V expounds on the control diagram of Fig. 2 by carefully defining each control rule and state transition. As described in Section VI, we validated our methods through experiments with the PR2 and a large collection of everyday objects under a variety of challenging test conditions. We conclude the article and discuss our plans for future work in

[Fig. 2 is a state diagram with six states: Close, Load, Lift and Hold, Replace, Unload, and Open. Each state contains a position- or force-controller block, and transitions are triggered by tactile events such as LeftContact && RightContact, StableContact, Slip, Place(), Slip || Vibration, and F_g,des == 0.]

Fig. 2. The state diagram for our robotic grasp controller. State transitions occur only after specific tactile events are detected. The details of this controller are presented in Sections IV and V. Constant-valued parameters, such as VCLOSE, are defined in Table II.


Section VII.

II. BACKGROUND

The development of tactile sensors for robotic hands has been a very active area of research, as reviewed by Cutkosky et al. in 2008 [9]. Among the wide range of sensors one could use, Dahiya et al. [10] present a strong case for the importance of having tactile sensors capable of reproducing the rich response of the human tactile sensing system. Along these lines, some recent sensors have even achieved dynamic response [11] and spatial coverage [12] that are comparable to the human fingertip. Using tactile sensory cues with wide dynamic range as the trigger for robotic actions was first proposed by Howe et al. over two decades ago [13]. Unfortunately, most such sensors still exist only as research prototypes and are not widely available. The pressure-sensing arrays used in our work represent the state of the art in commercially available tactile sensors, and they are available on all PR2 robots. High-bandwidth acceleration sensing is far more established, though such sensors are rarely included in robotic grippers.

In this work, we present a fully autonomous PR2 robot that can perceive its environment, pick up objects from a table, and set them back down in a new location. We build on the overall system described in [14], focusing on methods for using tactile feedback (pressure and acceleration) to improve the gripper's contact interactions with grasped objects. Prior versions of this system have used pre-defined grasp forces and were limited to either crushing or dropping many objects; hence, our primary goal is to enable two-fingered grasps that are gentle but secure.

Several other research groups have developed autonomous pick-and-place robotic systems, such as [15]–[17]. While these other works often use tactile sensing to guide adjustments in hand position during the pre-grasp stage, they rely mainly on pre-defined grasp forces or positions and the mechanical compliance of their gripper in order to hold objects. Such strategies do not work well for the high-impedance parallel jaw grippers with which many robots are equipped. In other previous work, Yussof et al. [18] showed promising results using a custom-built tabletop manipulator with custom optical fingertip sensors that measure normal and shear forces. Takahashi et al. [19] calculated robot fingertip slip information using a pressure array similar to that used in our system. However, neither of these systems has yet been validated with a wide set of real-world objects, and we believe there is room for new approaches to processing and responding to tactile signals.

Another large but scattered body of work is devoted to understanding fingertip sensor signals at and during contact, e.g., [20]–[23]. Again, these algorithms are typically developed with custom hardware and validated against only a small and ideal set of objects, which makes it hard to draw conclusions about their general utility. In some of our own previous work, we found that sensor information such as slip [24] and contact stiffness [25] is useful in well-controlled situations but can be extremely susceptible to changes in object texture, sensor orientation, and other such factors that will naturally vary when implemented on a fully autonomous robot that needs to interact with unknown objects in human environments.

This paper aims to develop robust tactile sensory signals that will apply to as many objects as possible in order to create a tactile-event-driven model for robotic grasp control. Several researchers [10], [13] have noted the importance of such an approach, but to our knowledge this is among the first implementations of such methods in a fully autonomous robot that has not been built for the sole purpose of grasping.

III. ROBOT EXPERIMENTAL SYSTEM

Robots have great potential to perform useful work in everyday settings, such as cleaning up a messy room, preparing and delivering orders at a restaurant, or setting up equipment for an outdoor event [1]. Executing such complex tasks requires hardware that is both capable and robust. Consequently, we use the Willow Garage PR2 robotic platform. As shown in Fig. 1, the PR2 is a human-sized robot designed for both navigation and manipulation. It has an omni-directional wheeled base, two seven-degree-of-freedom arms, and two one-degree-of-freedom parallel-jaw grippers. Its extensive non-contact sensor suite includes two stereo camera pairs, an LED pattern projector, a high-resolution camera, a camera on each forearm, a head-mounted tilting laser range finder, a body-mounted fixed laser range finder, and an IMU.

Figure 3 shows the parallel jaw gripper that is mounted on each of the PR2's arms. The gripper's only actuator is a brushless DC motor with a planetary gearbox and an encoder. This motor's rotary motion is converted to the parallel jaw motion through a custom internal mechanism in the body of the gripper. Unlike many of the robot hands currently being developed for grasping, e.g., [15], [17], the PR2 gripper has a high mechanical impedance due to the large gear ratio of the actuation system; note that it can be slowly back-driven by applying a large force at the fingertips. Motor output torque can be specified in low-level software, but the transmission does not include any torque or force sensors. Instead, motor effort is estimated using current sensing; the transmission's significant friction prevents this signal from corresponding well with the force that the gripper applies to an object during grasping.

A high-bandwidth Bosch BMA150 digital accelerometer is embedded in the palm of each gripper, as illustrated in Figure 3. This tiny sensor measures triaxial acceleration over a range of ±78 m/s² with a nominal resolution of 0.15 m/s². The accelerometer has a sampling rate of 3 kHz and a bandwidth from DC to 1.5 kHz. Successive triaxial acceleration measurements are grouped together in sets of three (nine total values) and made available to the controller at a rate of 1 kHz.

Each of the gripper's two 2.3 cm × 3.7 cm × 1.1 cm fingertips is equipped with a pressure sensor array consisting of 22 individual cells. The 22 cells are divided between a five-by-three array on the parallel gripping surface itself, two sensor elements on the end of the fingertip, two elements on each side of the fingertip, and one on the back (see Fig. 3). These capacitive sensors (manufactured by Pressure Profile Systems, Inc.) measure the perpendicular compressive force applied in each sensed region, and they have a nominal resolution of 6.25 mN. As shown in Fig. 3, the entire sensing surface is


[Fig. 3 shows the gripper with labeled parts: the 5×3 array of elements on the front, two elements at the tip, two elements on each side, one element at the tip of the back, the fingertip pressure sensors, and the palm accelerometer.]

Fig. 3. The PR2 robot gripper. The accelerometer is rigidly mounted to a printed circuit board in the palm, and the pressure sensors are attached to the robot's fingertips under the silicone rubber coating.

covered by a protective layer of silicone rubber that provides the compliance and friction needed for successful grasping. All pressure cells are sampled simultaneously at a rate of 24.4 Hz. Due to manufacturing imperfections and residual stresses in the deformed rubber, each sensor cell has a unique non-zero reading when the fingertips are subjected to zero force. We compensate for this non-ideality by averaging the first 0.25 seconds of each cell's pressure measurements before each grasp and subtracting this offset from subsequent readings.
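This per-cell offset compensation amounts to a simple tare operation. The sketch below illustrates it in plain Python; the function names are ours (not the PR2 driver's), and at the 24.4 Hz sampling rate roughly six samples cover the 0.25 s warm-up window.

```python
# Sketch of per-cell zero-force offset compensation (names are ours).
# At 24.4 Hz, ~6 samples cover the first 0.25 s before a grasp.

def compute_offsets(warmup_samples):
    """Average each cell's readings over the zero-force warm-up window."""
    n_cells = len(warmup_samples[0])
    n = len(warmup_samples)
    return [sum(s[i] for s in warmup_samples) / n for i in range(n_cells)]

def compensate(raw_reading, offsets):
    """Subtract each cell's stored offset from its raw reading."""
    return [r - o for r, o in zip(raw_reading, offsets)]

# Example: three cells with residual-stress offsets at zero force.
warmup = [[0.125, 0.25, 0.0]] * 6        # ~0.25 s of zero-force data
offsets = compute_offsets(warmup)
print(compensate([0.625, 0.25, 0.0], offsets))  # -> [0.5, 0.0, 0.0]
```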

We use the open-source ROS software system for all experiments conducted with the PR2 robot. The implemented gripper controllers are run inside a 1 kHz soft-real-time loop. Information from the tactile sensors (accelerometer and pressure cells) is available directly in this environment, as is the gripper's encoder reading.

IV. LOW-LEVEL SIGNALS AND CONTROL

Individual sensor readings and actuator commands are far removed from the task of delicately picking up an object and setting it back down on a table. Consequently, the high-level grasp controller diagrammed in Fig. 2 rests on an essential low-level processing layer that encompasses both sensing and acting. Here, we describe the three tactile sensory signals that we designed to mimic human SA-I, FA-I, and FA-II afferents, along with the position and force controllers needed for the gripper to move smoothly and interact gently with objects. All signal filtering and feedback control was done within the 1 kHz soft-real-time loop.

A. Fingertip Force (SA-I)

The SA-I tactile sensory channel is known to play a primary role in the human hand's sensitivity to steady-state and low-frequency skin deformations up to approximately 5 Hz [4]. As listed in Table I, this type of afferent has a small receptive field and responds to low frequency stimulation. The cumulative response of many SA-I nerve endings gives humans the ability to both localize contact on the fingertip and discern the total amount of force applied [7].

The pressure arrays on the PR2 fingertips provide information that is similar to human SA-I signals, though spatially far less dense. A signal similar to the SA-I estimate of total fingertip force can be obtained by summing the readings from all fifteen elements in the pad of one finger on the gripper:

F_gl = Σ_{i=1..3} Σ_{j=1..5} f_l(i,j)    (1)

The value f_l(i,j) represents the force acting on the left fingerpad cell at location i, j. The same procedure is used to calculate F_gr for the right finger using f_r(i,j). The mean grip force is calculated by averaging the force experienced by the two fingers, F_g = (F_gl + F_gr)/2. Figure 4 shows an example of these fingertip force signals during object contact, along with examples of the other tactile signals described below.
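As a concrete sketch, the SA-I force estimate of (1) and the mean grip force reduce to a few lines of plain Python, with each 3×5 fingerpad stored as a nested list (the function names are ours):

```python
# Sketch of the SA-I fingertip force estimate of (1): sum the 15
# fingerpad cells of each finger, then average the two fingers.

def fingerpad_force(cells):
    """Total force on one 3x5 fingerpad (rows i = 1..3, cols j = 1..5)."""
    return sum(sum(row) for row in cells)

def mean_grip_force(left_cells, right_cells):
    """F_g = (F_gl + F_gr) / 2."""
    return 0.5 * (fingerpad_force(left_cells) + fingerpad_force(right_cells))

left  = [[0.25] * 5 for _ in range(3)]    # uniform 0.25 N per cell -> 3.75 N
right = [[0.125] * 5 for _ in range(3)]   # uniform 0.125 N per cell -> 1.875 N
print(mean_grip_force(left, right))       # -> 2.8125
```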

We also attempted to obtain localization information about the fingertip contact using the methods described in [19], but we were not able to achieve satisfactory results. We hypothesize that differences in the fingertip shape (the PR2 has a flat fingertip that leads to multiple contact locations, as opposed to the rounded fingertips used by Takahashi et al.) and the lower number of tactile cells in our sensors were the main reasons we could not localize contacts with the PR2.

B. Fingertip Force Disturbance (FA-I)

Human FA-I signals are believed to be the most important indicator of force-disturbance events during bare-handed manipulation. As listed in Table I, they have small receptive fields and are attuned to mid-range frequencies. Force disturbances occur at many instants, including initial object contact, object slippage, impacts between a hand-held object and the environment, and the end of object contact.

We process the data from the PR2's pressure arrays to create a signal similar to the human FA-I channel. Our chosen calculation is to sum a high-pass-filtered version of the forces detected in all fifteen fingertip cells:

Ḟ_gl(z) = Σ_{i=1..3} Σ_{j=1..5} H_F(z)·f_l(i,j)(z)    (2)

The force measured in each cell f_l(i,j) is subjected to a discrete-time first-order Butterworth high-pass filter H_F(z) with a cutoff frequency of 5 Hz, designed for the 24.4 Hz sampling rate of the pressure signals. The resulting filtered signals are then summed to obtain an estimate of the >5 Hz force disturbances Ḟ_gl acting on the entire left fingerpad. The process is repeated for the right finger to obtain Ḟ_gr.

C. Hand Vibration (FA-II)

FA-II afferents are known to be the primary tactile channel by which humans sense interactions between a handheld tool and the items it is touching. As listed in Table I, these mechanoreceptors have wide receptive fields and are attuned to high frequencies. During object grasping and manipulation, FA-II signals are particularly useful for detecting contact between handheld objects and other things in the environment, such as a table surface.

We create a robotic analog to the FA-II sensory channel by processing data from the PR2's hand-mounted accelerometer,


[Fig. 4 plots, over a 30 s interaction: grip aperture x_g (m), left and right gripper forces F_gl and F_gr (N), the left and right gripper disturbance signals Ḟ_gl and Ḟ_gr (N), high-frequency gripper acceleration a_h (m/s²), and gripper motor effort E (%). The Close, Load, Lift and Hold, Replace, Unload, and Open states are marked along the top, with tactile events such as LeftContact && RightContact, StableContact, Slip, Place(), Slip || Vibration, and F_g,des == 0 marked below.]

Fig. 4. Time history data for an interaction between the gripper and an Odwalla juice bottle. Controller states are marked with bars at the top, and important signals (state transitions and tactile events) are indicated with arrows at the bottom.

as follows:

a_h(z) = √[ (H_a(z)·a_h,x)² + (H_a(z)·a_h,y)² + (H_a(z)·a_h,z)² ]    (3)

The hand vibration signal a_h is calculated by taking the magnitude of the high-pass-filtered three-dimensional acceleration vector. The filter applied to each of the three Cartesian acceleration components (a_h,x, a_h,y, a_h,z) is a discrete-time first-order Butterworth high-pass filter H_a(z) with a 50 Hz cutoff frequency, designed for the 3 kHz sampling rate of our acceleration data stream.
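The computation of (3) can be sketched as follows; the filter design (bilinear transform) and all names are ours, and a useful sanity check is that a constant gravitational acceleration is rejected, so a_h settles toward zero for a stationary hand.

```python
import math

# Sketch of the FA-II hand-vibration signal of (3): high-pass filter
# each Cartesian acceleration axis at 50 Hz (3 kHz sampling), then
# take the vector magnitude. Filter design and names are ours.

class HighPass:
    """First-order Butterworth high-pass filter (bilinear transform)."""
    def __init__(self, cutoff_hz, sample_hz):
        k = math.tan(math.pi * cutoff_hz / sample_hz)
        self.b0 = 1.0 / (1.0 + k)
        self.a1 = (k - 1.0) / (1.0 + k)
        self.x1 = self.y1 = 0.0

    def step(self, x):
        y = self.b0 * (x - self.x1) - self.a1 * self.y1
        self.x1, self.y1 = x, y
        return y

axis_filters = [HighPass(50.0, 3000.0) for _ in range(3)]

def hand_vibration(accel_xyz):
    """a_h: magnitude of the high-pass-filtered acceleration vector."""
    return math.sqrt(sum(f.step(a) ** 2
                         for f, a in zip(axis_filters, accel_xyz)))

# Gravity (a constant 9.81 m/s^2 on one axis) is rejected over time.
for _ in range(2000):
    ah = hand_vibration([0.0, 0.0, 9.81])
print(ah < 0.01)  # -> True
```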

D. Position and Force Control

In addition to the rich tactile sensations described above, humans excel at manipulation because they can move competently through free space but quickly transition to regulating grasp force during object contact [4]. Replicating the fluidity of human grasping with a high-impedance parallel jaw gripper requires well designed position and force controllers. Both of these controllers appear several times in the high-level state diagram of Fig. 2; each controller block is labeled with its type, along with the desired motion or force output. To facilitate a generic presentation of our approach, the mathematical constants used in this article are designated with an all-capitalized naming convention.

The PR2 gripper is a geared mechanism, so it lends itself well to position control. We define its position x_g in meters and its velocity v_g in meters per second. The position is zero when the fingers touch and positive otherwise, so that the position value corresponds to the grip aperture. The gripper velocity follows the same sign as position, with positive values indicating that the hand is opening. We found that we could achieve good position tracking via a simple proportional-derivative controller with an additional velocity-dependent term to overcome friction:

E = KP·(x_g − x_g,des) + KD·(v_g − v_g,des) − sign(v_g,des)·EFRICTION    (4)

Here, E is the motor effort (N), KP is the proportional error gain (N/m), KD is the derivative error gain (Ns/m), and x_g,des and v_g,des are the desired gripper position (m) and velocity (m/s), respectively. EFRICTION is a scalar constant for feed-forward friction compensation, applied to encourage motion in the direction of v_g,des. Note that motor effort is defined to be positive in the direction that closes the gripper, which is opposite the sign convention for the motion variables. Table II lists values and units for all of the constants used in our controllers, including KP, KD, and EFRICTION.

We created a force controller on top of this position controller to enable the PR2 to better interact with delicate objects. This controller requires access to the fingertip force signals F_g described above in Section IV-A. Forces that compress the fingertips are defined to be positive, so that positive motor effort has the tendency of creating positive fingertip forces. We have developed a force controller that drives the desired position and velocity terms based on the error between the desired force and the actual force [26]:

F_g,min = min(F_gl, F_gr)    (5)

v_g,des = KF·(F_g,min − F_g,des)    (6)

KF = KFCLOSE if F_g,min − F_g,des < 0, KFOPEN otherwise    (7)

We servo on the minimum of the two finger forces, F_g,min, as defined in (5), to ensure dual finger contact. Errors in tracking the desired force F_g,des are multiplied by the constant gain KF to yield the desired velocity for the position controller (6). This


desired velocity is integrated over time to provide the position controller with a desired grip aperture x_g,des. Experimental testing revealed that high values of the gain KF improved force tracking but also caused the commonly encountered force-controller effect of chattering. We found that an asymmetric gain definition (7), where KFCLOSE is greater than KFOPEN, allows for the best balance of stability and responsiveness during grasping.
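Equations (5)-(7) amount to a proportional force loop with an asymmetric gain, which can be sketched as follows (gain values from Table II; the function name is ours):

```python
# Sketch of the force controller of (5)-(7): servo on the minimum
# finger force with an asymmetric gain, producing the desired gripper
# velocity that the position controller integrates and tracks.

KFCLOSE = 0.0013  # gain when more force is needed (m/Ns)
KFOPEN = 0.0008   # gain when force should be reduced (m/Ns)

def desired_velocity(f_left, f_right, f_des):
    """v_g,des from the two fingertip forces and the desired force."""
    f_min = min(f_left, f_right)            # (5) ensure dual finger contact
    error = f_min - f_des
    kf = KFCLOSE if error < 0 else KFOPEN   # (7) asymmetric gain
    return kf * error                       # (6)

# Under-squeezing by 2 N yields a negative (closing) desired velocity:
print(desired_velocity(f_left=1.0, f_right=2.0, f_des=3.0))  # -> -0.0026
```

Because KFCLOSE > KFOPEN, the controller closes on an under-squeezed object more aggressively than it backs off from an over-squeezed one, which is one way to trade responsiveness against chattering.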

V. ROBOTIC GRASP CONTROL

This section describes the high-level grasp controller we developed to enable a robotic gripper to mimic the human capacity for picking up and setting down objects without crushing or dropping them. This control architecture is diagrammed in Fig. 2 and explained sequentially below. It makes use of all three of the low-level tactile signals and both of the low-level controllers defined in the previous section.

A. Close

Closing is the starting point for any grasp. In this phase the hand goes from having no contact with the object to contacting the object with both fingers. The Close state begins when a client software module requests a grasp, denoted by the Grasp() signal in Fig. 2. It is assumed that prior to this request the gripper has been maneuvered so that its fingers are on opposing sides of an object, approximately equidistant to both surfaces, and that closing the gripper’s jaws will result in stable two-fingered contact with the object. Selection of such poses without requiring a model of the object is an ongoing area of research, e.g., [3], [14], as is correcting for errors in execution of the selected pose, e.g., [14], [17]. Here, we assume a reasonable starting arm pose and focus solely on controlling the gripper’s one degree of freedom.

After the Grasp() request has been received, we use the position controller defined in (4) to close the gripper with a constant desired velocity vg,des = −VCLOSE. This closing movement continues until contact is detected on both the left and right fingers:

LeftContact = (Fgl > FLIMIT) ‖ (F̃gl > DLIMIT)          (8)

RightContact = (Fgr > FLIMIT) ‖ (F̃gr > DLIMIT)          (9)

Each fingertip force signal is compared with the constant FLIMIT, and each fingertip force disturbance signal is compared with the constant DLIMIT. Note that a finger’s contact signal is true if the threshold on either its Fg or its F̃g has

TABLE II
VALUES CHOSEN FOR CONTROLLER CONSTANTS.

ATHRESH    4.2 m/s²        KHARDNESS   0.027 m/s
DLIMIT     0.02 N          KP          20,000 N/m
EFRICTION  7.0 N           KSLIP       1.08
FBPTHRESH  0.25 N          SLIPTHRESH  0.01
FLIMIT     0.75 N          TSETTLE     0.05 s
FTHRESH    0.15 N          TUNLOAD     0.20 s
KD         5,000 Ns/m      VCLOSE      0.04 m/s
KFCLOSE    0.0013 m/Ns     VOPEN       0.05 m/s
KFOPEN     0.0008 m/Ns     VTHRESH     0.001 m/s

been exceeded, and recall that the second of these signals is merely a high-pass-filtered version of the first. Through implementation on the PR2, we found that DLIMIT can be much smaller than FLIMIT because F̃gl is not sensitive to the pressure cells’ low-frequency fluctuations; consequently, the force disturbance condition is almost always first to trigger a contact event. A sample instance of LeftContact && RightContact is indicated in Fig. 4.
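The per-finger contact test of (8) and (9) reduces to a pair of threshold checks. The sketch below assumes the thresholds from Table II; the function names are hypothetical.

```python
# Sketch of the contact tests of Eqs. (8)-(9). F is the fingertip force and
# F_dist its high-pass-filtered disturbance; thresholds from Table II.
# Function names are illustrative, not from the PR2 source.

F_LIMIT = 0.75  # N, threshold on the raw fingertip force
D_LIMIT = 0.02  # N, threshold on the force disturbance signal

def finger_contact(F, F_dist):
    """A finger reports contact when either signal crosses its threshold."""
    return (F > F_LIMIT) or (F_dist > D_LIMIT)

def both_fingers_in_contact(F_l, F_l_dist, F_r, F_r_dist):
    """Close ends when LeftContact && RightContact holds."""
    return finger_contact(F_l, F_l_dist) and finger_contact(F_r, F_r_dist)
```

Because D_LIMIT is far smaller than F_LIMIT, the disturbance term usually fires first, matching the behavior reported above.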

While several groups [5], [13] have noted that higher frequency vibrations are useful in detecting contact, we could not find a reliable indication of such events in our ah signal. This difficulty is likely due to the specific hardware of the PR2. A large distance separates the accelerometer from the fingertips, and the compliant silicone fingertip coverings soften impacts considerably. Furthermore, significant high-frequency vibration noise occurs whenever the gripper motor is turning, which masks out any small tactile cues that might occur (see the ah signal during the Close and Open phases in Fig. 4). Fortunately, the tactile signals derived from the pressure cells are very reliable at detecting the start of contact, providing a consistent cue to transition to the Load phase.

B. Load

The goal of the Load phase is to apply a grip force that is appropriate for the target object, so that the object can be lifted and manipulation can begin. Selecting this grip force level is challenging when the robot does not have a detailed mechanical model of the object or prior experience in gripping it. Consequently, we designed a novel method for choosing a reasonable starting grasp force; many alternatives were tested, and here we report only the most reliable one.

After contact is detected, the robot pauses for a short period of time (TSETTLE) while trying to hold the position of the first contact xc. This pause serves two purposes: first, many objects take a short time to mechanically respond to the initial contact, and second, the pressure sensor cells update at a rate that is much slower than the controller rate (24.4 Hz vs. 1000 Hz). We have found that the force response during this contact settling time is a very useful indicator of how firmly an object should be grasped. Thus, the robot records the maximum average force seen by the gripper fingers during this time and calculates the target force to hold the object as:

Fc = max_t(Fg) · KHARDNESS / VCLOSE          (10)

As one would expect, the force response of an object strongly depends on the speed of the finger impact. We remove this dependence on velocity by dividing the gain KHARDNESS by VCLOSE, the speed at which the fingers are commanded to impact the object. This approach enables the controller to calculate a contact force Fc that is relatively independent of the closing speed. The values of KHARDNESS and VCLOSE both contribute strongly to the robot’s ability to delicately grasp objects. Large values of VCLOSE give the gripper high momentum during closing, which can prevent the robot from stopping quickly upon object contact, so we chose a more moderate closing speed. KHARDNESS has been tuned to estimate the initial grasp force for a generic set of everyday objects, but it


tends to be too low for very soft heavy objects, and too high for stiff delicate objects, occasionally leading to object drops or breakage, respectively.
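The hardness-based force selection of (10) can be sketched as follows, assuming the force history is the sequence of average fingerpad forces sampled during the TSETTLE window; the names are illustrative.

```python
# Sketch of the initial grip force selection of Eq. (10). force_history is
# assumed to be the average fingerpad forces sampled during the settling
# window; the function name and packaging are illustrative.

K_HARDNESS = 0.027  # m/s (Table II)
V_CLOSE = 0.04      # m/s, commanded closing speed (Table II)

def target_grip_force(force_history):
    """Peak settling-window force, scaled by K_HARDNESS / V_CLOSE so the
    estimate is roughly independent of the commanded impact speed."""
    return max(force_history) * K_HARDNESS / V_CLOSE
```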

At the end of the brief pause after the start of contact, the robot transitions to force control, using the computed Fc value as the desired force Fg,des. If the robot had a priori knowledge of the appropriate force with which to grasp a certain object, that value would be used in conjunction with Fc to develop an estimate for Fg,des. The force control mode continues until the gripper achieves StableContact, which we define as:

StableContact = (|Fg,min − Fg,des| < FTHRESH) && (|vg| < VTHRESH)          (11)

This condition requires that the smaller of the two fingertip forces Fg,min be within the tolerance FTHRESH of the command force Fg,des, and that the gripper speed be below the tolerance VTHRESH. Through testing, we have found that grasps become stable very quickly on most everyday objects. Slower stabilizations occur with very hard objects, which require additional effort after contact to ramp up the steady-state force command, and extremely soft objects, which need some time before the velocity settles to zero.
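The StableContact test of (11) is a simple conjunction of two tolerances; a sketch with the Table II constants and an illustrative function name:

```python
# Sketch of the StableContact condition of Eq. (11), using Table II values.

F_THRESH = 0.15   # N, force tracking tolerance
V_THRESH = 0.001  # m/s, gripper speed tolerance

def stable_contact(F_min, F_des, v_g):
    """True when the weaker finger tracks the commanded force and the
    gripper has essentially stopped moving."""
    return abs(F_min - F_des) < F_THRESH and abs(v_g) < V_THRESH
```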

C. Lift and Hold

After StableContact is achieved, the controller transitions to the next grasp phase: Lift and Hold. In this phase the robot holds the object between its fingertips and moves its arm to accomplish higher-level tasks. It is desirable for the grasp controller to hold the object firmly enough to avoid slipping, but gently enough to avoid crushing. As with FA-I signals in human grasping, the high-pass filtered force signal F̃g is a strong indicator of slip. This signal is more reliable than Fg itself, since it does not vary significantly with low-frequency robot motion and reorientation of the object with respect to gravity. We calculate the Slip condition as follows:

Slip = (|F̃g| > Fg · SLIPTHRESH) && (FBPg < FBPTHRESH)          (12)

Our Slip condition is met only when both subsidiary sensory comparisons evaluate to true. First, the magnitude of the force disturbance signal F̃g must exceed a threshold defined by the product of the total force Fg and a constant SLIPTHRESH; using this product rather than a constant value makes the robot less sensitive to force variations as the grip force increases. This approach was again inspired by human capabilities: human force perception is known to follow Weber’s Law, where a stimulus must vary by a certain percent of its magnitude for the change to be detectable [27].

Second, slips are considered only when the average grasp force is not changing very quickly. We evaluate this force stability condition by treating the average fingerpad force Fg with a first-order Chebyshev discrete-time band-pass filter with a pass-band from 1 to 5 Hz to produce FBPg. We compare this signal with the empirically tuned constant FBPTHRESH. The lower frequency cutoff of 1 Hz removes the mean value from the grip force signal, while the higher cutoff of 5 Hz removes the slip effects that are seen in F̃g. This condition prevents the formation of a feedback loop when the controller increases its grip force to stop slip events, as discussed below.

Every time Slip occurs, the controller increases the desired grip force Fc by a small percentage of its current value, such that Fc = Fc · KSLIP. Several alternative methods of responding to slip were tested, such as increasing the desired grip force proportional to the magnitude of the slip event, as done by Takahashi et al. [19]. However, we found that our system does not exhibit a strong correlation between the amount of slip (in either speed or distance) and the F̃g signal; instead, this signal depends somewhat on the properties of the grasped object, so we use it to detect only that the grasp has been disturbed. In contrast to [19], we found that allowing the grip force to decay over time resulted in a higher percentage of dropped objects. This difference is most likely due to the slower response time of the PR2 gripper when compared with the hardware developed in [19]. This approach does occasionally cause objects to be held tighter than the minimum necessary grasp force when transient slip conditions are experienced, e.g., after incidental contact between the object and environment.
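The Slip condition of (12) and the multiplicative force response can be sketched as below. The paper specifies a first-order Chebyshev 1–5 Hz band-pass for FBPg; the filter here is approximated as the difference of two one-pole low-passes, which is an assumption rather than the authors’ exact design, and all names are illustrative.

```python
# Sketch of the Slip condition of Eq. (12) and the Fc = Fc * KSLIP response.
# The band-pass below approximates the paper's first-order Chebyshev 1-5 Hz
# filter as a difference of two one-pole low-passes (an assumption).
from math import exp, pi

SLIP_THRESH = 0.01   # Weber-like fraction of the grip force (Table II)
F_BP_THRESH = 0.25   # N, force stability threshold (Table II)
K_SLIP = 1.08        # multiplicative grip force increase (Table II)
DT = 1.0 / 24.4      # s, pressure cell update period

class BandPass:
    """Approximate 1-5 Hz band-pass: fast low-pass minus slow low-pass."""
    def __init__(self, f_lo=1.0, f_hi=5.0, dt=DT):
        self.a_lo = exp(-2.0 * pi * f_lo * dt)
        self.a_hi = exp(-2.0 * pi * f_hi * dt)
        self.y_lo = 0.0
        self.y_hi = 0.0

    def step(self, x):
        self.y_lo = self.a_lo * self.y_lo + (1.0 - self.a_lo) * x
        self.y_hi = self.a_hi * self.y_hi + (1.0 - self.a_hi) * x
        return self.y_hi - self.y_lo   # retains roughly 1-5 Hz content

def slip(F_g, F_dist, F_bp):
    """Eq. (12): the force disturbance exceeds a fraction of the grip force
    while the band-passed mean force indicates a steady grasp."""
    return abs(F_dist) > F_g * SLIP_THRESH and F_bp < F_BP_THRESH

def respond_to_slip(F_c):
    """Every Slip event scales the target force: Fc = Fc * KSLIP."""
    return F_c * K_SLIP
```

A constant input drives the band-pass output back toward zero, so steady force changes suppress slip detection, which is the feedback-loop guard described above.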

D. Replace

The transition from Lift and Hold to Replace occurs when

a client software module sends the Place() signal. The robot should enter this mode only when it is ready to put the held object down, which is typically after it has moved the gripper to within a few centimeters of the desired placement location. After issuing the Place() command, the higher-level robot controller moves the gripper toward the target surface at a moderate speed. During this time, the low-level force controller holds the final target force from the previous phase, Fc, and the Replace controller monitors the Slip and Vibration signals to detect contact between the object and environment. We define the Vibration condition as a threshold on the high-frequency hand acceleration signal:

Vibration = (ah > ATHRESH) (13)

Reducing ATHRESH improves the robot’s ability to detect light contacts with the world, but it also makes the PR2 more sensitive to false-positive triggers due to mechanical vibrations from its own drivetrain. In practice we have found that this value can be lowered to 1.25 m/s² during slow arm motion. When either Slip or Vibration becomes true, the robot assumes that contact has occurred between the object and the target surface, and it moves into the Unload phase.

It is necessary to observe both the Slip and Vibration conditions to account for the variety of contact scenarios that can occur. In the case of very light objects, which are appropriately held lightly during the Lift and Hold phase, the object slips between the robot’s fingers, so the hand does not experience significant vibrations. In the case of heavy objects, which are held firmly enough to prevent slip, no Slip signal is detected, but the impact vibrations are easily apparent in the ah signal. For many objects between these two extremes, both Slip and Vibration conditions are often true at contact.
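Combining (13) with the Slip cue gives the Replace-phase contact test; a sketch with illustrative names:

```python
# Sketch of the Replace-phase contact test: Eq. (13) on hand acceleration,
# OR-ed with the Slip cue. Names and packaging are illustrative.

A_THRESH = 4.2  # m/s^2 (Table II); can be lowered during slow arm motion

def vibration(a_h):
    """Eq. (13): threshold on the high-frequency hand acceleration."""
    return a_h > A_THRESH

def placement_contact(slip_event, a_h):
    """Either cue signals contact between the held object and the surface,
    triggering the transition to Unload."""
    return slip_event or vibration(a_h)
```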


E. Unload

The Unload phase is entered automatically after the held object contacts the target surface. The goal of this phase is simply to let go, but our controller performs this unloading gradually to avoid abrupt transitions. The desired grip force is linearly reduced to zero over a set period of time using:

Fg,des = Fc − Fc · (t − ts) / TUNLOAD          (14)

Here, t represents the present time, and ts represents the starting time for the Unload state. TUNLOAD is a constant that determines the unloading duration. The state is exited when Fg,des reaches zero, which occurs when t − ts = TUNLOAD.
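The linear unloading ramp of (14) can be sketched as a single function; the clamp at zero mirrors the state exit when the desired force reaches zero. Names are illustrative.

```python
# Sketch of the Unload ramp of Eq. (14).

T_UNLOAD = 0.20  # s, unloading duration (Table II)

def unload_force(F_c, t, t_s):
    """Desired grip force, ramped linearly from F_c to zero over T_UNLOAD
    seconds starting at time t_s; clamped at zero thereafter."""
    return max(0.0, F_c - F_c * (t - t_s) / T_UNLOAD)
```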

F. Open

Once the robot has released the object on the surface, it proceeds to open the gripper. This movement is accomplished by the position controller, using a constant positive desired velocity of VOPEN.

VI. EXPERIMENTAL VALIDATION

This section first presents a method for tuning our two main controller constants, KSLIP and KHARDNESS, which helps show how these parameters affect grasp success. We then discuss experiments that test the performance of the novel aspects of our controller: the grip force chosen during Load, the response to Slip during Lift and Hold, and the effect of Slip and Vibration signals during Replace. To understand how our approach compares to simpler grasping solutions, we conducted a more general test of the PR2’s capabilities using a collection of fifty everyday household objects.

A. Parameter Tuning

One challenge in implementing the grasp controller of Section V is to select appropriate values for KSLIP and KHARDNESS. These two parameters play a major role in the successful manipulation of objects. Proper tuning will result in delicate manipulation, while improper tuning will result in high rates of object crushing or slippage. We define crushing to be a deformation of 10 mm beyond the initial surface contact. We define slippage in two forms: translation (greater than 5 mm) or rotation (greater than 0.175 radians).

For the PR2 robot, our parameter tuning experiments proceeded as follows. We chose a cylindrical plastic Coffee-Mate canister with a mass of 0.45 kg for our parameter tuning study. This object was selected for its heavy weight, which necessitates a strong grasp to prevent object slip, and its flexible body, which necessitates a delicate grasp to avoid crushing the object. First, the canister was placed on a table in front of the robot, and the open robot gripper was positioned around the object. Next, the Grasp() command was sent to the robot to initiate the event-driven grasping sequence. Finally, after the StableContact signal was achieved, the robot lifted the object 0.3 meters upward and rotated it 1.05 radians. During the task the grasp aperture was continuously monitored to detect crushing conditions. After the task, the experimenter measured the translation and rotation of the

[Fig. 5 graphic: grid of grasp outcomes (Crush / Success / Slip) over the tested values of KSLIP (vertical axis) and KHARDNESS (horizontal axis, m/s).]

Fig. 5. The effect of varying KSLIP and KHARDNESS. As KHARDNESS is decreased away from the tuned value, objects tend to slip within the grasp due to inadequate initial grip forces. As KHARDNESS is increased, the grip force is overestimated, and crushing occurs. As KSLIP is decreased, the robot does not respond strongly enough to slip events, and the object is allowed to slip an unacceptable amount. As KSLIP is increased, the robot tends to overcompensate and crush objects.

object to determine whether slip had occurred. This experiment was repeated ten times for all combinations of five values of KSLIP and five values of KHARDNESS, for a total of 250 trials. The five different values for KSLIP and KHARDNESS were selected such that they caused the controller response to vary between the two extremes of crushing and slipping. The resulting state of each grasping attempt was recorded as either crushed, slipped (which included object drops), or successful. These results are presented in Fig. 5. Setting KSLIP and KHARDNESS to the central values, as listed in Table II, yields a system that is very likely to succeed at grasping. Small deviations away from the ideal parameter values can often yield somewhat successful grasp behavior, but they typically increase the likelihood of slip and/or crush results. Large deviations from the ideal parameter values result in completely unsuccessful grasps, as defined by our strict crushing and slipping metrics. A very large value of KHARDNESS will cause the robot to saturate its grip force, resulting in the tightest possible grasp. In practice, this parameter space can be sparsely sampled during tuning, but it should be explored completely to obtain a thorough understanding of the parameter interaction as seen here.

B. Grip Force Estimation During Loading

As described in Section V-B, the controller in the Load phase selects the target grip force based on the maximum force measured during the gripper’s initial contact with the object, normalized by contact speed. We evaluated this technique through grasp testing on eight different objects: a paper cup, a paperboard tea box, a ripe banana, an empty soda can, a raw chicken egg, a tennis ball, a glass bottle, and a full soda can.

We began the experiment by obtaining ground-truth measurements of the minimum grip force necessary for the PR2 to lift each object. These tests were done by placing each object in a known location and orientation on a table. The robot then


[Fig. 6 graphic: bar chart of grip force (N), 0 to 14 N, for eight objects (paper cup, tea box, banana, empty can, egg, tennis ball, glass bottle, full can), showing minimum force, chosen grip force, and crush force.]

Fig. 6. The target grip force chosen by our Load method when grasping eight everyday objects. The gain KHARDNESS was empirically tuned to yield a grip force (red bar) that is consistently above the minimum grip force necessary to lift the object (blue bar). This calculation provides a good estimate for a large range of objects, but one can see its tendency to overestimate the force necessary to hold objects that are both hard and light, such as the egg. The red × symbol marks the crushing force for all objects that can be crushed by the robot gripper.

closed its gripper on the object using only the force controller described in equations (5)–(7), with the desired force Fg,des set to a small value (starting between 0.5 N and 6 N, depending on the object). The robot then used its arm joints to move the gripper up by 10 cm. If slip occurred during lifting, the trial was repeated with the grasp force incremented by 0.1 N. If the object did not slip, the desired grip force was recorded as the minimum grip force needed for lifting. This entire process was repeated eight times per object. The blue “Minimum Force” bars in Fig. 6 show the mean and standard deviation of the eight ground truth measurements for each of the eight objects.

The experiment was then repeated using the grasp controller described in this paper. We performed eight trials with each of the same eight objects, located in the same position and orientation as before. For each trial, the desired loading force Fc was recorded, as calculated with (10). The dark red “Grip Force” bars in Fig. 6 show the mean and standard deviation of the eight grip force levels that the robot chose during the Load phase for each of the eight test objects.

Lastly, we determined the force necessary to crush each object (if crushing was possible) by successively incrementing the force controller’s desired grip force by 0.1 N. Only a single recording was made of the crushing force because this operation damages the object. These crush force measurements appear as red ×’s with the other results in Fig. 6. In all cases, our controller chose a grip force above the minimum level needed to avoid slip. For crushable objects, it chose grip forces well below the crush limit for all objects except the egg, which it crushed in three of the eight trials.

C. Slip Response During Lift and Hold

We conducted a separate experiment to test slip compensation in the Lift and Hold phase. As described in Section V-C, Lift and Hold uses the force controller to try to maintain a constant target grasp force; it watches for Slip events, which

[Fig. 7 graphic: grip force (N) versus object weight (N) for three trials, with the minimum grip force shown as a red dashed line.]

Fig. 7. Slip test results for three different trials. The glass cup was repeatedly filled with marbles to promote slip at a variety of grip force levels. The ground truth data (red dashed line) indicates the minimum grip force needed to prevent slip. As seen here, our Lift and Hold controller has been designed to grasp objects more tightly when it detects slip. This behavior reduces the likelihood of dropping an object without requiring unnecessarily high grasp forces.

are derived from the pressure transducer data, and it responds by increasing the target grasp force by a small percentage.

We sought to understand our system’s slip response by having the gripper hold a smooth straight-sided object that periodically increased in weight. At the start of this experiment, the cylindrical section of a glass cup was placed in the robot gripper, as seen in the inset of Fig. 7. The weight of the cup was measured to be 0.6 N, and it was oriented vertically. The experimenter began a trial by activating the Lift and Hold controller with an initial desired grip force of 5 N. Batches of fifteen marbles (about 0.6 N per batch) were then added to the cup at intervals of three seconds. The gripper was lightly shaken for two seconds after each batch of marbles was added, during which time the controller reacted to any detected Slip events. The final selected grip force value was recorded in software before the experimenter added another batch of marbles. The marbles were added five times to give the cup a final weight of approximately 3.7 N. This procedure was repeated three times to produce the data shown in the solid traces of Fig. 7.

This test’s ground truth data was obtained for each of the six cup weights using the force controller of Section IV-D. The controller’s desired grip force was started at 1.0 N. After the cup was grasped by the robot, the experimenter lightly shook the gripper to emulate the slight disturbances that occur during arm motion. If the cup fell out of the gripper or slipped after two seconds of shaking, the trial was repeated with a grasp force incremented by 0.1 N. The grasp force needed to hold the cup at each of the six weights is shown by the red dashed “Minimum Grip Force” line in Fig. 7. One can see that this value increases up to an object weight of about 2.5 N and then levels off at approximately 8.3 N. The controller always chose a grip force value above the level needed to prevent slip. The variation between the three trials is primarily due to differences in how the experimenter shook the gripper; stronger external disturbances cause more corrective actions and higher grip force levels.


Fig. 8. The 50 objects used in our robustness test. These objects were chosen with no criterion other than covering a wide range of properties, including hard/soft, heavy/light, sticky/slippery, and brittle/elastic.

TABLE III
OUTCOMES OF GRASP TESTING WITH EVERYDAY OBJECTS.

                         100% Motor Effort   Our Methods
Crushed                  30/30               1/30
Rotated Within Grasp     5/50                9/50
Translated Within Grasp  2/50                4/50
Dropped                  2/50                4/50

D. Grasping Robustness

Beyond testing specific aspects of our controller’s performance, we wanted to understand how the methods proposed in this paper would work on their intended subject: everyday real-world objects. We thus gathered the collection of 50 common objects shown in Fig. 8, purposefully seeking out items that could be challenging for a robot to grasp. The only requirement on these objects is that they all be within the robot’s perception and manipulation capabilities (have at least one grasp location smaller than the maximum gripper aperture, weigh less than the arm’s maximum payload, etc.). The objects included in the collection are as follows: apple, banana (rotten), Band-Aid box, beer bottle (empty), beer bottle (full), can of Spam, can of peas, candle, cereal box (empty), Coffee-mate bottle, duct tape roll, foam ball, gum container, Jello cup, juice box, large plastic bowl, magic marker, masking tape roll, medicine bottle, milk carton (empty), office tape dispenser (heavy), ointment tube (full), peach (soft), plastic juice bottle (empty), plastic juice bottle (full), plum (rotten), rubber football, Solo plastic cup, Saran wrap box, ShiKai shampoo bottle (empty), small wooden bowl, soap bottle (empty), soap box (full), soda can (empty), soda can (full), soup can (full), stress ball, stuffed bear, stuffed elephant, Suave shampoo bottle (empty), sunglasses case, tea box (metal), tea box (paperboard), tennis ball, thin plastic cup, Tide bottle (full), towel, Vasoline container (full), water bottle (empty), and wood plank.

The robot’s task for this experiment was to pick up each object from a table and set it down in a different location. We used the object perception and motion planning code of Hsiao et al. to enable the PR2 to accomplish this task autonomously [14]. The grasp selection and reactive grasping components of [14] were used to select stable grasps, with the object centered within the grasp before starting, as per our assumptions for the Close phase. After grasping the object,

the robot lifted the object off the table, then moved it from above the table to the side of the robot, so that the arm would not interfere with perception for object placement. The object was then moved to the opposite side of the table for placement. This motion was planned using a randomized joint-angle planner and typically involved a great deal of object rotation, as is typical of many complex pick-and-place operations. During such motions, the robot would ideally prevent the object from rotating or slipping out of the grasp, while continuing to avoid crushing the object.

Each object was tested under two grasp control conditions. The first condition was the original manipulation code described in [14], which takes a naive approach by always closing the gripper with 100% motor effort. The second condition was a portion of the human-inspired robotic grasp controller described in this paper, which included our Close, Load, and Lift and Hold phases. The Replace, Unload, and Open phases could not be included due to code integration difficulties at the time of this experiment, but they have since been included in our grasping pipeline, and the following section describes relevant results. Our controller was tested first because the naive controller tends to damage crushable objects.

During testing, the experimenter presented all 50 objects to the robot one by one and manually recorded the outcome of each trial. As tabulated in Table III, four different types of errors occurred: the robot might crush a crushable object (30 of the 50 objects were crushable; crush statistics are reported out of these 30), it might let the object rotate or translate significantly within its grasp, and it might drop the object either by failing to raise it off the table at the start of the Lift and Hold phase or by allowing it to slip from its grasp later in Lift and Hold.

The naive controller was found to crush all thirty of the crushable objects, while our controller crushed only one (the rubber football). This drastic improvement in handling delicate objects is balanced by a higher incidence of objects that rotate, slip, and/or drop; our controller commits all three of these errors about twice as often as the naive controller. The two most challenging items were the rubber football and the full Tide bottle; both controllers allowed them to slip within the grasp and fall on the table. This was due to the large size of both objects: the grasp location automatically selected (using [14]) for the football was larger than the maximum gripper diameter, and the position selected for the Tide bottle was within 1 cm of the maximum diameter.

Quantitative data was also recorded for our controller’s grasp of each of the 50 objects. Fig. 9 presents a histogram of the grip force chosen during the Load phase for the 49 objects that were successfully lifted. These values range from 2.5 N to 27.5 N, with most objects below 7.5 N. The objects with the lowest initial grasp force were the towel, the Coffee-mate bottle, the large plastic bowl, and the stress ball, in ascending order. The objects with the highest initial grasp force were the wood plank, the duct tape roll, and the sunglasses case, in descending order. From this we observe that soft objects generally receive lower initial grip forces than hard objects, as one would expect from the design of our controller.



Fig. 9. Histogram for initial grasp force in object marathon.


Fig. 10. Histogram for motor effort in object marathon.

Fig. 10 provides a histogram of the average motor effort required during Lift and Hold for the 46 objects that were not dropped. One can quickly see that our controller is much more efficient than the naive controller, which always uses 100% motor effort. Because the PR2 gripper has high internal friction and compliant fingertips, it can securely hold objects such as the foam ball and the Band-Aid box with zero motor effort. At the other end of the spectrum, the wood plank, the duct tape roll, and the full beer bottle all had an average motor effort of 100%.

E. Contact Sensing During Replace

Lastly, we carried out a set of tests to evaluate the effectiveness of the Slip and Vibration signals used to detect contact between the held object and the table surface during Replace, as described in Section V-D. Using the same grasping procedure as the experiment with fifty objects, the PR2 robot was programmed to detect, grasp, lift, and replace a full soda can on a table. A sheet of pressure-sensitive film (Pressurex-micro 2-20 PSI film) was laid on the table at the placement location to record the pressure between the can and table. The film darkens in color with the application of additional pressure from 13,800 to 138,000 N/m².

The first test consisted of 25 trials using the standard non-contact-sensing approach to object placement. The height of the table was found from stereo camera and scanning laser sensor data. Once the object was grasped, it was assumed to be rigidly attached to the hand, and the robot attempted to place the object back down at the exact height of the table and release it. The second set of 25 trials used our contact-sensing replace strategy to release the object once the Slip or Vibration signal was detected. The resulting pressure-sensitive film images can be seen in Fig. 11.

Qualitative inspection of the resulting films suggests that our human-inspired controller is significantly more gentle than

(a) Standard Approach (b) Our Approach

Fig. 11. Scans of the pressure-sensitive film used to capture the forces applied by the robot to the object during object placement. Darker color indicates regions of higher pressure. (a) The standard approach assumes perfect knowledge of the table position and object position within the grasp, which both contain errors that lead to forceful impacts. (b) Our approach releases the object when table contact is detected, yielding gentler placements.


Fig. 12. Elbow joint torque (top) and hand acceleration (bottom) comparing the two different approaches. The standard approach exerts significantly greater torque after the initial contact, pushing the object down into the table.

the standard approach. When using the non-contact-sensing approach, the robot released the object in the air (prior to contact with the table) in 4 of the 25 trials, which caused the object to drop onto the table. In the remaining 21 trials with the standard controller, the robot pressed the object down into the table very firmly because the visually estimated target location was beneath the surface of the actual table. In contrast, our contact-sensing controller uses the tactile sensors to detect contact with the table and release the object, as shown in Fig. 12. The Slip signal triggered in 4 of our controller’s 25 trials, and the Vibration signal triggered in the remaining 21. One Vibration-triggered release occurred prior to contact between the object and the table due to mechanical noise on the accelerometer, highlighting the opportunity for further improvements in tactile signal processing during robot motion.


VII. CONCLUSION

This article introduced a set of sensory signals and control approaches that attempt to mimic the human reliance on cutaneous sensations during grasping tasks. We presented a framework that highlights how tactile feedback can be used as a primary means of completing a manipulation task, with encouraging results. While it is clear to the authors that not all tasks can be completed solely via tactile information, we feel that it is a promising and underutilized tool in robotic manipulation systems.

In future work we hope to add additional sensing modalities to our object handling framework, including estimates of the necessary object grip force from visual and laser recognition, audio feedback about crushing and damaging objects, weight sensing after an object has been lifted off the table, grasp disturbance prediction from arm motion data, and grasp quality information based on the fingerpad contacts, combining these data into higher-level percept information. Our current estimate of the initial grip force necessary to lift an object depends solely on the hardness information gleaned during contact. While it has proven to be a strong indicator for many everyday objects, it does have certain failure cases where hardness information is deceptive, such as soft but heavy objects (e.g., a heavy trash bag or a stuffed animal) and light but hard objects (e.g., an egg or a thin wine glass). Supplementing this information with additional object data will likely lead to a superior grip force estimator.

Our signal for detecting slip was successful, but the force response to slip events could be improved by using a gripper with superior dynamic response. The PR2 gripper is admittedly inferior to several of the compliant and well-modeled designs in the literature; we hypothesize that these methods would be even more successful if implemented on these alternative research systems. Furthermore, Takahashi et al. [19] have shown that it is possible to obtain useful centroid-of-contact information with spherical fingertips. This is a feature we were unable to reproduce with the PR2's flat fingertips, but we may attempt to redesign the fingertip shape in the future if it proves highly beneficial. The addition of shear-force sensing capabilities to the fingertip may also prove an important indicator of slip, and it is a close parallel to an important mechanoreceptor of the hand we do not currently mimic, the SA-II channel, which detects skin stretch.

The implementation presented in this paper deals with the case of a two-fingered parallel-jaw gripper. The grasp strategies available with such a hand are limited, which is one reason why we have focused so narrowly on simple pick-and-place tasks. As more advanced multi-fingered robot hands come into use, we see the contact signals we have described being extremely useful in less task-specific ways. Research is currently underway to develop a tactile-event-driven state machine for object interaction, which should allow for rich feedback during a wider variety of interesting multi-fingered interactions and arm motions. The work presented in this paper is actively used in multiple PR2 robots around the world. As we continue to refine this system and increase the range of

objects it can handle, all relevant code is freely available at https://code.ros.org/, and we encourage interested researchers to contact us about porting these capabilities to additional robot platforms.

ACKNOWLEDGMENTS

The authors thank Matei Ciocarlie for his help in adding the described grasp controller to the PR2 object manipulation framework, and they thank Derek King for his assistance in troubleshooting the PR2 accelerometer signals.

REFERENCES

[1] C. C. Kemp, A. Edsinger, and E. Torres-Jara, "Challenges for robot manipulation in human environments: Developing robots that perform useful work in everyday settings," IEEE Robotics and Automation Magazine, pp. 20–29, March 2007.

[2] R. B. Rusu, A. Holzbach, R. Diankov, G. Bradski, and M. Beetz, "Perception for mobile manipulation and grasping using active stereo," in Proc. IEEE-RAS International Conference on Humanoid Robots, Paris, Dec. 2009.

[3] A. Saxena, J. Driemeyer, and A. Y. Ng, "Robotic grasping of novel objects using vision," International Journal of Robotics Research, vol. 27, no. 2, pp. 157–173, Feb. 2008.

[4] R. S. Johansson and J. R. Flanagan, "Coding and use of tactile signals from the fingertips in object manipulation tasks," Nature Reviews Neuroscience, vol. 10, pp. 345–359, May 2009.

[5] M. A. Srinivasan, J. M. Whitehouse, and R. H. LaMotte, "Tactile detection of slip: Surface microgeometry and peripheral neural codes," Journal of Neurophysiology, vol. 63, pp. 1323–1332, 1990.

[6] J. Bell, S. Bolanowski, and M. H. Holmes, "The structure and function of Pacinian corpuscles: A review," Progress in Neurobiology, vol. 42, no. 1, pp. 79–128, January 1994.

[7] G. A. Gescheider, J. H. Wright, and R. T. Verrillo, Information-Processing Channels in the Tactile Sensory System: A Psychophysical and Physiological Analysis. New York: Psychology Press: Taylor & Francis Group, 2009.

[8] K. O. Johnson, "The roles and functions of cutaneous mechanoreceptors," Current Opinion in Neurobiology, vol. 11, pp. 455–461, 2001.

[9] M. R. Cutkosky, R. D. Howe, and W. R. Provancher, Springer Handbook of Robotics, B. Siciliano and O. Khatib, Eds. Springer, 2008.

[10] R. S. Dahiya, M. Gori, G. Metta, and G. Sandini, "Better manipulation with human inspired tactile sensing," in Proc. of Robotics Science and Systems (RSS 2009) Workshop on Understanding the Human Hand for Advancing Robotic Manipulation, 2009, pp. 36–37.

[11] A. Schmitz, M. Maggiali, M. Randazzo, L. Natale, and G. Metta, "A prototype fingertip with high spatial resolution pressure sensing for the robot iCub," in Proc. IEEE-RAS International Conference on Humanoid Robots, December 2008, pp. 423–428.

[12] T. Martin, R. Ambrose, M. Diftler, R. Platt, and M. J. Butzer, "Tactile gloves for autonomous grasping with the NASA/DARPA Robonaut," in International Conference on Robotics and Automation, 2004.

[13] R. D. Howe, N. Popp, P. Akella, I. Kao, and M. R. Cutkosky, "Grasping, manipulation, and control with tactile sensing," in International Conference on Robotics and Automation, 1990.

[14] K. Hsiao, S. Chitta, M. Ciocarlie, and E. G. Jones, "Contact-reactive grasping of objects with partial shape information," in International Conference on Intelligent Robots and Systems, 2010.

[15] L. Natale and E. Torres-Jara, "A sensitive approach to grasping," in Proc. International Conference on Epigenetic Robots, Paris, France, September 2006.

[16] A. Jain and C. C. Kemp, "El-E: an assistive mobile manipulator that autonomously fetches objects from flat surfaces," Autonomous Robots, vol. 28, pp. 45–64, 2010.

[17] A. M. Dollar, L. P. Jentoft, J. H. Gao, and R. D. Howe, "Contact sensing and grasping performance of compliant hands," Autonomous Robots, vol. 28, pp. 65–75, 2010.

[18] H. Yussof, M. Ohka, H. Suzuki, N. Morisawa, and J. Takata, "Tactile sensing-based control architecture in multi-fingered arm for object manipulation," Engineering Letters, vol. 16, no. 2, 2009.

[19] T. Takahashi, T. Tsuboi, T. Kishida, Y. Kawanami, S. Shimizu, M. Iribe, T. Fukushima, and M. Fujita, "Adaptive grasping by multi-fingered hand with tactile sensor based on robust force and position control," in International Conference on Robotics and Automation, 2008.


[20] B. Eberman and J. K. Salisbury, "Application of change detection to dynamic contact sensing," The International Journal of Robotics Research, vol. 13, p. 369, 1994.

[21] D. Dornfeld and C. Handy, "Slip detection using acoustic emission signal analysis," in International Conference on Robotics and Automation, 1987.

[22] N. Wettels, V. J. Santos, R. S. Johansson, and G. E. Loeb, "Biomimetic tactile sensor array," Advanced Robotics, vol. 22, pp. 829–849, 2008.

[23] A. Kis, F. Kovacs, and P. Szolgay, "Grasp planning based on fingertip contact forces and torques," in Proceedings of Eurohaptics, 2006.

[24] J. M. Romano, S. R. Gray, N. T. Jacobs, and K. J. Kuchenbecker, "Toward tactilely transparent glove: Collocated slip sensing and vibrotactile actuation," in World Haptics, 2009.

[25] S. Chitta, M. Piccoli, and J. Sturm, "Tactile object class and internal state recognition," in International Conference on Robotics and Automation, 2010.

[26] D. Whitney, "Historical perspective and state of the art in robot force control," International Journal of Robotics Research, vol. 6, no. 1, pp. 3–14, 1987.

[27] S. Allin, Y. Matsuoka, and R. Klatzky, "Measuring just noticeable differences for haptic force feedback: Implications for rehabilitation," in Proc. IEEE Haptics Symposium, March 2002, pp. 299–302.

Joseph M. Romano is a Ph.D. student in the Department of Mechanical Engineering and Applied Mechanics (MEAM) at the University of Pennsylvania. Joe works in Dr. Kuchenbecker's Haptics Research Group, a part of the GRASP Laboratory, investigating the relationship between kinesthetic large-scale motion information and localized tactile and cutaneous signals to improve human and machine haptic systems. He completed a Master's in MEAM at the University of Pennsylvania in May of 2010, and B.S. degrees in both Computer Science and Engineering Mechanics at the Johns Hopkins University in 2007.

Kaijen Hsiao is a research scientist at Willow Garage, Inc. Her research interests lie in grasping and manipulation, particularly with respect to dealing with uncertainty, incorporating information from diverse sensors, and adding robustness to robot capabilities. Dr. Hsiao received her Ph.D. in Computer Science from MIT in 2009, and she has been at Willow Garage since then.

Gunter Niemeyer is a senior research scientist at Willow Garage, Inc. and a consulting professor of Mechanical Engineering at Stanford University. His research examines physical human-robot interactions, force sensitivity and feedback, teleoperation with and without communication delays, and haptic interfaces. This involves efforts ranging from real-time motor and robot control to user interface design. Dr. Niemeyer received his M.S. and Ph.D. from MIT in the areas of adaptive robot control and bilateral teleoperation, introducing the concept of wave variables. He also held a postdoctoral research position at MIT developing surgical robotics. In 1997 he joined Intuitive Surgical, Inc., where he helped create the da Vinci Minimally Invasive Surgical System. He joined the Stanford faculty in the Fall of 2001, directing the Telerobotics Lab and teaching dynamics, controls, and telerobotics. He has been a member of the Willow Garage research group since 2009.

Sachin Chitta is a research scientist at Willow Garage, Inc. His research interests are in motion planning, control, and sensing for mobile manipulation, with a particular interest in making robots useful in unstructured indoor environments. Prior to coming to Willow Garage, Dr. Chitta received his Ph.D. in Mechanical Engineering and Applied Mechanics from the University of Pennsylvania in 2005, and he was a postdoctoral researcher working on learning for quadruped locomotion and modular robotics in the GRASP Lab at Penn from 2005 to 2007. At Willow Garage, he has worked on motion planning and control for mobile manipulation tasks like door opening, cart pushing, and object retrieval, and on tactile sensing for reactive grasping and object state recognition with the PR2 mobile manipulation platform.

Katherine J. Kuchenbecker is the Skirkanich Assistant Professor of Innovation in Mechanical Engineering and Applied Mechanics at the University of Pennsylvania. Her research centers on the design and control of haptic interfaces, and she directs the Penn Haptics Group, which is part of the GRASP Robotics Lab. Dr. Kuchenbecker serves on the program committee for the IEEE Haptics Symposium, and she has won several awards for her research, including an NSF CAREER award in 2009 and a spot on the Popular Science Brilliant 10 list in 2010. Prior to becoming a professor, she was a postdoctoral fellow at the Johns Hopkins University, and she earned her Ph.D. in Mechanical Engineering from Stanford University in 2006.