Hybrid and Hierarchical Bayesian Data Fusion for Cooperative Human-Robot Perception
Nisar Ahmed, Ph.D., Assistant Professor, Aerospace Engineering Sciences
University of Colorado at Boulder
September 11, 2014

Transcript of "Hybrid and Hierarchical Bayesian Data Fusion for Cooperative Human-Robot Perception"

Page 1:

Hybrid and Hierarchical Bayesian Data Fusion for Cooperative Human-Robot Perception

Nisar Ahmed, Ph.D., Assistant Professor, Aerospace Engineering Sciences

University of Colorado at Boulder

September 11, 2014

Page 2: A Lesson from Aerospace History: Spam in a Can? [Sheridan, 2002, p. 154, paraphrasing]

• Draper proclaimed at the outset of the Apollo program that "The astronauts are to be passive passengers... All the essential control activities will be performed by the automation."

• It turned out he was wrong. Many routine sensing, pattern recognition and control functions had to be performed by the astronauts, and in several instances they countermanded the automation and saved the mission.

[Photos: C.S. Draper with Wernher von Braun in front of the Apollo Guidance Computer; Tom Sheridan, human factors consultant to the Apollo Program and Professor Emeritus of Aero-Astro at MIT]

Page 3: Autonomous Robots Are Everywhere

• Doing (almost) anything
• Tremendous technical leaps
  – Better sensors, actuators
  – Faster/cheaper mobile computing
  – AI, communication networks

[Images: Baxter (ReThink Robotics); PackBot (iRobot); Kiva Systems; Hermes UAVs; Robot Restaurant (Harbin, China); PR2 (Willow Garage and GA Tech's Charlie Kemp); shark-tracking AUVs (Harvey Mudd College); Google Car (Google); PETMAN (Boston Dynamics)]

Page 4: Everybody Needs Somebody Sometimes…

• Machines don’t know everything, and can fail unexpectedly

• Humans are key interactive counterparts: supervisors, partners, helpers, clients

[Images: Autonomous City Explorer (TU Munich); Roomba (iRobot); DARPA Urban Challenge 2007: Team Cornell's "Skynet" (the victim), Team MIT's "Talos" (the reckless Boston driver), Team Oshkosh's "Terramax"]

Page 5: …But Nobody's Perfect!

• Human limitations
  – irrationality, biases, memories…
  – speed, precision, fragility…

• Reasons to make “smart” robots in the first place!

How to strike the right balance?

Page 6: Sheridan's Scales of Human-Machine Interaction [Parasuraman, Sheridan, and Wickens, IEEE SMC-B 2000]

[Figure: scale of automation levels ranging from complete human control to complete machine control; "adjustable autonomy": intelligent adaptation?]

Page 7: Autonomy and Intelligence: What's "Intelligent"?

• Humans and robots must live in a world full of uncertainties

– Physical: Noise, disturbances, modeling errors, sensor/actuator limits

– Each other: What is other trying to do/say? What does other know?

• Good uncertainty models → robustness, safety
• Good uncertainty reduction → efficiency, performance

[Diagram: human sensing/control and robot sensing/control loops closed around the world; perception and understanding on the sensing side, planning and decision making on the control side]

Page 8: Example: Human-Robot Search Missions [Murphy, et al. 2004; 2008; 2011], [Goodrich et al., 2009], [Lewis et al. 2009]

• How to cope with highly coupled uncertainties?

[Diagram: the same human/robot sensing and control loop, annotated with task coordination and information management between human and robot; cognitive loading and situational awareness on the human side; localization, survivor tracking, and planning and navigation on the robot side]

Page 9: Probabilistic Target Search [Bourgault 2006]

• Object location X=[x,y] with stochastic uncertainty p(X)

– Robot: find object as quickly as possible via camera

– Human supervisor: watch video feed, assist object ID

• Update p(X) with camera sensor data via Bayes’ Rule

[Figure: Gaussian mixture prior pdf over (x, y); the robot's binary sensor likelihood for "No target detected" over its vision cone; and the resulting Gaussian mixture posterior pdf. Same general idea behind Kalman filters for state estimation.]
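The update described here can be illustrated with a short sketch (grid size, sensor geometry, and detection probability are assumptions for illustration, not the code used in the experiments):

```python
# Minimal sketch of the slide-9 Bayes update: a gridded Gaussian-mixture prior
# p(X) is multiplied by a "No target detected" likelihood that suppresses
# probability inside the camera's vision cone, then renormalized.
import numpy as np

# Discretize the search space
xs, ys = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))

def gaussian2d(mx, my, var):
    return np.exp(-((xs - mx) ** 2 + (ys - my) ** 2) / (2 * var)) / (2 * np.pi * var)

# Gaussian-mixture prior p(X): two hypothesized target locations
prior = 0.6 * gaussian2d(-2.0, 1.0, 0.5) + 0.4 * gaussian2d(3.0, -2.0, 0.8)
prior /= prior.sum()

# Binary sensor model: P(no detection | X) is low inside the vision cone
robot_xy, heading, fov, max_range, p_detect = np.array([0.0, 0.0]), 0.0, np.pi / 3, 3.0, 0.9
dx, dy = xs - robot_xy[0], ys - robot_xy[1]
in_cone = (np.hypot(dx, dy) < max_range) & (np.abs(np.arctan2(dy, dx) - heading) < fov / 2)
lik_no_detect = np.where(in_cone, 1.0 - p_detect, 1.0)

# Bayes' rule: posterior proportional to likelihood x prior, then renormalize
posterior = lik_no_detect * prior
posterior /= posterior.sum()
```

Repeating this update as the robot moves carves probability mass out of everywhere the camera has looked without seeing the target.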

Page 10: Single Robot Indoor Search Experiment

• Greedy search on posterior distributions

– Plan path to largest peak (MAP estimate) via D* Lite [Koenig and Likhachev, 2005]

[Figures: diffuse p(X) prior vs. "bad"/misinformed p(X) prior over a cluttered search space with targets. Platform: autonomous Pioneer 3D-X with Hokuyo lidar (obstacle avoidance), Unibrain Fire-i camera, Mini ITX (Intel Core 2) computer, and Vicon motion tracking.]

Page 11: How to Improve Performance?

• Greedy planning is simple, but inefficient

– slow info gain, especially for bad p(X)

– “back and forth” paths (scattering)

• Better control for given sensors?

– model predictive control [Ryan, et al. 2010, Bourgault, 2005]

‣ expensive, still struggle with bad p(X)

– human/mixed-initiative control

‣ cognitive load [Lewis, et al 2009]

‣ time delay instabilities [Sheridan 1992]

• Better sensing for given control?

– augment robot’s sensing horizon

– human sensor: update p(X) even if robot not at X

– still let robot decide its own path

[Example human reports: "No targets detected"; "There is nothing in the far east or west corridors"; "There was something behind that wall"]

Page 12: Cooperative Human-Robot Intelligence: Any Robot, Any Human, Any Information

• How to combine robot perception with human perception?

• How to coordinate information from multiple humans and robots?

• Probabilistic modeling and state estimation

– Domain knowledge, stochastic variables

– Bayesian inference:

autonomous data-driven reasoning

‣ Hybrid: continuous + discrete uncertainties

‣ Hierarchical: model uncertainties

[Diagram: hidden state X inferred from multiple data streams, including robot data and human output data]

Page 13: Cooperative Human-Robot Perception: Overview [Ahmed, et al. TRO 2013; GNC 2012; GNC 2011; ACC 2011; ICRA 2010]

• Goal: improve feedback control and planning using “human sensors”

• Previous: very high/low-level [Lewis, et al. 2009; Bourgault, et al. 2008; Kaupp 2007]

• More natural semantic observations of physical states? [Hall and Jordan, 2010]

‣ i.e. positions, velocities, temperature, dimensions,…

• Challenge: humans are not oracles! [Walter, et al. 2013; Matuszek, et al. 2013; Rosenthal, et al. 2011]

• Can we build Kalman filters with semantic human inputs?

[Example observations: robot-style measurements such as "Target is at range 10 m, bearing 10 deg" and "Target at range 100 m and bearing 30 deg." vs. human reports such as "No target detected," "That is a truck," "There's a small truck behind the trees, quickly moving North," and "I think I see something nearby you"]

Page 14: Semantic Human Sensors [Sample, Ahmed and Campbell, GNC 2012]

• Real semantic range data example: calibration for different human subjects

Dk = object is “next to” robot

Dk = object is “nearby” around robot

Dk = object is “far away” from robot

[Figure: semantic range reports from Human A and Human B plotted against X1, distance to right of robot (m), and X2, distance to front of robot (m), over a ±5 m workspace. Noisy classification of 2D position, i.e. conversion of continuous states to discrete labels.]

Page 15: Softmax Models for Semantic Data

• Probabilistic classifiers: machine learning [Bishop, 2006]

• Softmax likelihood model for m discrete convexly separable classes

– class “boundaries” are linear

???
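For reference, the standard softmax likelihood for m classes with linear boundaries takes the form below (a generic statement of the model, with assumed notation rather than the slide's):

```latex
P(D = j \mid X = x) \;=\; \frac{\exp(\mathbf{w}_j^{\top} x + b_j)}{\sum_{c=1}^{m} \exp(\mathbf{w}_c^{\top} x + b_c)}, \qquad j = 1, \dots, m
```

Because the log-odds between any two classes are linear in x, the class "boundaries" are hyperplanes, which is why plain softmax handles only convexly separable classes.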

Page 16: MMS Models for Semantic Data [Ahmed and Campbell, Expert Systems w. Applications 2012; ACC 2008]

• Generalize to arbitrary non-convexly separable classes?
• One possibility: multimodal softmax (MMS)
  – piecewise-linear class boundaries

???
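A generic way to write a multimodal softmax model, assuming each class j is the union of a set σ(j) of softmax subclasses (notation assumed, not transcribed from the slide):

```latex
P(D = j \mid X = x) \;=\; \frac{\sum_{s \in \sigma(j)} \exp(\mathbf{w}_s^{\top} x + b_s)}{\sum_{c=1}^{S} \exp(\mathbf{w}_c^{\top} x + b_c)}
```

Summing over subclasses gives piecewise-linear class boundaries, so non-convex regions (e.g. "nearby," which surrounds "next to") can be represented.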

Page 17: Example: Semantic Range-Only Model

• MMS models learned from real data via maximum likelihood [Ahmed and Campbell, ACC 2008; ESwA 2012]; fully Bayesian learning in [Ahmed and Campbell, TSP 2011]

[Figure: learned semantic range-only models overlaid on the sensor data from Human A and Human B]

???
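A minimal sketch of maximum-likelihood fitting for a softmax semantic-sensor model (plain gradient ascent on the log-likelihood; the synthetic data, labels, and step size are placeholders, and the papers' actual optimizer may differ):

```python
# Fit softmax parameters (W, b) to labeled human reports by maximum likelihood.
# X: (N, d) continuous states; y: (N,) integer class labels in {0, ..., m-1}.
import numpy as np

def fit_softmax(X, y, m, iters=2000, lr=0.1):
    N, d = X.shape
    W = np.zeros((m, d))
    b = np.zeros(m)
    Y = np.eye(m)[y]                       # one-hot labels, shape (N, m)
    for _ in range(iters):
        logits = X @ W.T + b               # (N, m)
        logits -= logits.max(axis=1, keepdims=True)
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)  # class probabilities
        G = Y - P                          # gradient of log-likelihood w.r.t. logits
        W += lr * (G.T @ X) / N
        b += lr * G.mean(axis=0)
    return W, b

# Example with synthetic "next to" / "nearby" / "far" range labels
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(500, 2))
r = np.hypot(X[:, 0], X[:, 1])
y = np.digitize(r, [1.5, 3.5])             # 0: next to, 1: nearby, 2: far
W, b = fit_softmax(X, y, m=3)
```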

Page 18: Bayesian Human-Robot Sensor Fusion

• Suppose human reports: “Something is <D> the robot”

– Range: “Next To”, “Nearby”, “Far”

– Bearing: “Front Of”, “Left Of”, “Behind”, “Right Of”

• Do Bayesian update on pdf after fusing robot data z using MMS model for D

• But exact posterior not closed-form:

???
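In generic notation (assumed here, not copied from the slide), the update being described is

```latex
p(X \mid D, z) \;\propto\; P(D \mid X)\, p(X \mid z)
```

where p(X | z) is the (Gaussian or Gaussian-mixture) pdf after fusing the robot's data z, and P(D | X) is the MMS likelihood of the human's semantic report D. The product of a Gaussian and a softmax-type likelihood has no closed-form normalization, hence the need for an approximation.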

Page 19: Variational Bayes Fusion [Ahmed and Campbell, 2010]

• Must find a suitable approximation to the exact posterior

  – grid-based [Bourgault et al., 2008]: curse of dimensionality
  – particle filter [Arulampalam et al., 2002]: inefficiencies, degeneracies
  – ideally: meshes with robot sensor data fusion (e.g. Gaussian Kalman filter)

• Useful fact:
  – let the prior be Gaussian with mean μ and covariance Σ
  – then the joint pdf (Gaussian prior) x (softmax likelihood) is always unimodal

[Figure: Gaussian prior, softmax likelihood, and the resulting unimodal joint pdf = (prior) x (likelihood)]

???

Page 20: Variational Bayes Fusion [Ahmed and Campbell, 2010]

• This pdf is well approximated by an unnormalized variational Gaussian:
• Renormalize the approximation to get a Gaussian posterior pdf
  – generalizes [Murphy, 1999]: convex optimization, but variance is underestimated

[Figure: variational softmax likelihood and VB joint pdf = (prior) x (variational likelihood); VB posterior pdf compared with true posterior pdf]

???

Page 21: VB Importance Sampling (VBIS) [Ahmed et al., IEEE T-RO 2013; ACC 2011b]

• Fix overconfident VB with Monte Carlo importance sampling (IS)
  – Use VB result to draw Ns i.i.d. samples
  – Use weighted samples to get new pdf estimate

[Figure: VB posterior pdf vs. true posterior pdf; samples drawn from the IS proposal pdf q(X) are weighted and combined to give the new VBIS posterior pdf, whose estimated variance is more accurate]

???
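A minimal 1D sketch of the importance-sampling correction (an assumed Gaussian prior, a two-class softmax likelihood, and a stand-in Gaussian proposal playing the role of the VB result; not the paper's implementation):

```python
# Importance sampling to correct an overconfident Gaussian approximation of
# posterior p(x | D) ~ N(x; mu0, var0) * P(D = 1 | x), with a softmax likelihood.
import numpy as np

rng = np.random.default_rng(1)
mu0, var0 = 0.0, 4.0                        # Gaussian prior on x
w, b = 1.5, -1.0                            # softmax (logistic) parameters

def lik(x):                                 # P(D = 1 | x), two-class softmax
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

def prior(x):
    return np.exp(-(x - mu0) ** 2 / (2 * var0)) / np.sqrt(2 * np.pi * var0)

# Stand-in for the VB Gaussian result, used as the IS proposal q(x)
mu_q, var_q = 1.0, 2.0
Ns = 5000
samples = rng.normal(mu_q, np.sqrt(var_q), Ns)
q = np.exp(-(samples - mu_q) ** 2 / (2 * var_q)) / np.sqrt(2 * np.pi * var_q)

# Importance weights proportional to (prior * likelihood) / proposal, normalized
wts = prior(samples) * lik(samples) / q
wts /= wts.sum()

# Weighted moments give a less overconfident posterior mean and variance
mean_vbis = np.sum(wts * samples)
var_vbis = np.sum(wts * (samples - mean_vbis) ** 2)
```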

Page 22: VBIS for Gaussian Mixtures (GMs) [Ahmed et al., IEEE T-RO 2013; Ahmed et al., ACC 2011b]

• Complex non-Gaussian priors p(X) given by GMs
  – naturally recursive GM approximation: track multiple "state hypotheses"
    ‣ much more robust to "surprises" than particle filter [Arulampalam et al., 2002]
  – hybrid generalization of GM Kalman filter [Sorenson and Alspach, 1972; Kotecha and Djuric, 2003]
    ‣ easily parallelized
    ‣ scales well with state dimension
    ‣ unified state feedback estimation with semantic human + numerical robot data

???
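One generic way to write the mixture-level bookkeeping (assumed notation; each component is updated with the single-Gaussian VBIS routine):

```latex
p(X) = \sum_{m} w_m\, \mathcal{N}(X;\, \mu_m, \Sigma_m)
\;\;\Longrightarrow\;\;
p(X \mid D) \approx \sum_{m} \tilde{w}_m\, \mathcal{N}(X;\, \tilde{\mu}_m, \tilde{\Sigma}_m),
\qquad
\tilde{w}_m \propto w_m\, \widehat{P}(D \mid m)
```

Each prior component m is fused with the semantic observation D to obtain its updated mean and covariance, and its weight is rescaled by that component's estimated likelihood of the observation.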

Page 23: Joint Human-Robot Search Experiments

• Now the human voluntarily sends data to the robot via a GUI

– fixed dictionary, 2 Hz video (~0.5 sec delay)

– Human Sensing Only

– Human + Robot Sensing

[Figure: Human Observation GUI with camera view and situation map showing the robot position, detected target, and semantic labels such as "Behind" and "Front"]

Page 24: Results for Trials with 16 Human Participants [Sample, Ahmed, and Campbell, GNC 2012]

• Pre-training, then 4 randomized missions, 2 false targets, 7 mins
  – undetected targets still well-localized
  – human's negative + corrective info significantly reduces time and distance traveled
  – confidence weighting: reduces number of messages and MAP variance

[Chart: MAP target position error (m) by test condition: baseline, robot only (no human); individual MMS, with confidence; individual MMS, no confidence; general MMS, with confidence; general MMS, no confidence. Largest GM peaks are very close to the correct target locations.]

Page 25: Ongoing Work (NSF C-UAS): Decision-Theoretic Active Human-Robot Sensing

• Bayesian decision-theoretic querying for target search

– UAS decides how/when to get operator input

– respect (dynamic) constraints on operator’s time & attention

– value of information [Kaupp, et al. 2010]: “is what could be learned worth it?”

[Decision network with target location belief X, decision node D, utility node U, and nodes sr, sh, ch, pr, eh (robot position and sensor data; human sensor and task engagement models)]

• Individual and team utility/cost functions: information gain, time to detection, operator task engagement
• Decisions/actions with costs: questions for the human sensor given target beliefs, e.g.
  – "What's going on in Sector 3?" (global SA)
  – "Is anything nearby that shed?" (local SA)
  – "Is this anomaly the target?" (ATR validation)
• Execute human query a* only if expected utility exceeds the cost of a*
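A minimal value-of-information sketch of the query-selection rule stated above (discrete belief over grid cells, a single yes/no question, and assumed human reliability and cost parameters; the actual C-UAS formulation is richer):

```python
# Ask the operator a yes/no question ("Is the target in region R?") only if the
# expected reduction in belief entropy exceeds the cost of interrupting them.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_utility_of_query(belief, region, p_correct=0.9):
    """Expected entropy reduction from a yes/no query about `region` (bool mask)."""
    p_in = belief[region].sum()
    # Probability the human answers "yes"/"no" under an imperfect-answer model
    p_yes = p_correct * p_in + (1 - p_correct) * (1 - p_in)
    expected_post_entropy = 0.0
    for ans, p_ans in (("yes", p_yes), ("no", 1 - p_yes)):
        lik = np.where(region, p_correct, 1 - p_correct)
        if ans == "no":
            lik = 1 - lik
        post = belief * lik
        post /= post.sum()
        expected_post_entropy += p_ans * entropy(post)
    return entropy(belief) - expected_post_entropy   # expected info gain (nats)

belief = np.full(100, 0.01)                # uniform belief over 100 cells
region = np.zeros(100, dtype=bool)
region[:20] = True                         # candidate query: "is it in cells 0-19?"
query_cost = 0.05                          # operator attention cost (assumed units)
if expected_utility_of_query(belief, region) > query_cost:
    print("ask the operator")
```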

Page 26: COHRINT Modeling and Data Fusion: Any Robot, Any Human, Any Information

• How to combine robot perception with human perception?

• How to coordinate information from multiple humans and robots?

• Probabilistic modeling and state estimation

– Domain knowledge, stochastic variables

– Bayesian inference:

autonomous data-driven reasoning

‣ Hybrid: continuous + discrete uncertainties

‣ Hierarchical: model uncertainties

[Diagram: hidden state X inferred from multiple data streams, including robot data and human output data]

Page 27: Sketching Information for Target Search

• Incident Commanders in Wilderness Search and Rescue: subjective estimates of location probabilities [Adams, et al. 2009]
  – "Mental data fusion" of reports → priority search map

• Can objective probabilities be obtained from local sketch data?

  – How to account for loosely structured positive/negative information (e.g. from experts and non-experts)?

– How to account for human sensor model uncertainties?

Page 28: Simple Experimental Testbed: Large-Scale Human-Robot "Easter Egg Hunt"

• Network of specialist and non-specialist humans, autonomous Segway RMP 50XL robots
• Find hidden objects around campus

[Hardware: SICK lidar (obstacle avoidance), EnGenius WiFi, Septentrio GPS, 3 x FireFly cameras, Mimo touchscreen, onboard PC; human interfaces: Lenovo Tablet PCs, Android smartphones, iPod, iPad. Search area: Cornell campus]

Page 29: Preliminary Human-Only Experiments

• 6 humans searching for partially buried key chain

• 8 search scenarios, 15 min each

– vague “clues” about target location

Page 30: Preliminary Human-Only Experiments: Sample Sketch Data

[Images: sample sketch data collected from participants]

Page 31: How to Derive Information from Human Sketches?

• Discretize search space X and sketches into NL cells

– correlated occupancy grid: only one X cell actually “true”

  – Sin/out = "1": "target may be here" / "target may not be here"
  – Sin/out = "0": "no new information" (implicit)

["Outside" sketches and an "inside" sketch overlaid on the discretized search space X]
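A minimal sketch of the discretization step (rasterizing a free-hand sketch polygon onto the search-space grid; the grid resolution and the use of matplotlib.path are assumptions, not the experimental pipeline):

```python
# Convert a sketched polygon into the set of occupancy-grid cells it covers,
# producing the binary label S = 1 ("target may (not) be here") per cell.
import numpy as np
from matplotlib.path import Path

# Discretize the search space into NL = nx * ny cells (cell centers)
nx, ny = 50, 50
xs, ys = np.meshgrid(np.linspace(0, 100, nx), np.linspace(0, 100, ny))
centers = np.column_stack([xs.ravel(), ys.ravel()])

# A hand-drawn "inside" sketch, approximated by its polygon vertices
sketch_vertices = [(20, 20), (45, 25), (40, 55), (15, 50)]
inside = Path(sketch_vertices).contains_points(centers)   # boolean per cell

S_in = inside.astype(int)          # 1 = cell labeled by the "inside" sketch
print(S_in.sum(), "of", nx * ny, "cells labeled")
```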

Page 32: Sketch Cell Observation Likelihoods

• Consider all possible target states for each Sin/out
  – Params (ai, bi) define the ith human sensor's likelihood for a single cell s:

  s = 1, target in cell:       ai       ("true detection")
  s = 1, target not in cell:   bi       ("false alarm")
  s = 0, target in cell:       1 - ai   ("false negative")
  s = 0, target not in cell:   1 - bi   ("true negative")

Page 33: Cell Dependencies via Data Association Uncertainty

For each Sin/out with Nin/out labeled cells, either:
• exactly one cell is correct and the (Nin/out – 1) others are false, or
• all Nin/out cells are false

"X either is here OR is here OR…" ~ summation rule [Bailey, et al. 2012]
"X not here AND not here AND…" ~ product rule [Bailey, et al. 2012; Ferris, et al. 2006]
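A minimal sketch of fusing one positive ("inside") sketch into a grid belief under the cell likelihoods above (the per-cell likelihood structure is an assumption consistent with the slides, not a transcription of the authors' equations; negative "outside" sketches would use the complementary probabilities via the product rule):

```python
# Fuse one positive ("inside") sketch into a discrete target-location belief.
# Assumption: every sketched cell c carries observation s_c = 1; if the target
# is in cell j, cell j is a true detection (prob a) and every other sketched
# cell is a false alarm (prob b); if the target is in an unsketched cell, all
# sketched cells are false alarms. Unsketched cells carry no new information.
import numpy as np

def fuse_inside_sketch(belief, sketched, a, b):
    """belief: (NL,) prior over cells; sketched: (NL,) boolean mask; a, b in (0, 1)."""
    n = sketched.sum()
    lik_target_inside = a * b ** (n - 1)     # one true detection, n-1 false alarms
    lik_target_outside = b ** n              # all n sketched cells are false alarms
    lik = np.where(sketched, lik_target_inside, lik_target_outside)
    post = belief * lik
    return post / post.sum()

# Example: uniform prior over 100 cells, sketch covering cells 10-29
belief = np.full(100, 0.01)
sketched = np.zeros(100, dtype=bool)
sketched[10:30] = True
post = fuse_inside_sketch(belief, sketched, a=0.8, b=0.1)
```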

Page 34: Hybrid and Hierarchical Bayesian Data Fusion for Cooperative Human-Robot Perceptionecee.colorado.edu/~beto2469/dc-seminars/2014/Nisar_… ·  · 2014-09-15Hybrid and Hierarchical

• Capture uncertainty and

“meta-uncertainty” in

parameters and X

• Uncertainty over parameter

space not necessarily uniform

Probabilistic Graphical Model

for Bayesian Parameter Learning and Sketch Data Fusion

34

Discretized target location (known/unknown)

“Meta-uncertainties”:parameter priors and hyperpriors

“Evidence”:Discretized labeled sketches

vosesoftware.com

Page 35: Hybrid and Hierarchical Bayesian Data Fusion for Cooperative Human-Robot Perceptionecee.colorado.edu/~beto2469/dc-seminars/2014/Nisar_… ·  · 2014-09-15Hybrid and Hierarchical

• Capture uncertainty and

“meta-uncertainty” in

parameters and X

Probabilistic Graphical Model

for Bayesian Parameter Learning and Sketch Data Fusion

35

Discretized target location (known/unknown)

“Meta-uncertainties”:parameter priors and hyperpriors

“Evidence”:Discretized labeled sketches

Supervised “offline” learning:

Semi-/unsupervised“online” learning:

vosesoftware.com
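One plausible concrete form of such a hierarchical model (the Categorical/Beta choices and Iverson-bracket likelihood are assumptions for illustration, not the slide's actual priors):

```latex
X \sim \mathrm{Categorical}(\pi_0), \qquad
a_i \sim \mathrm{Beta}(\alpha_a, \beta_a), \qquad
b_i \sim \mathrm{Beta}(\alpha_b, \beta_b),
\\
s_{i,c} \mid X, a_i, b_i \;\sim\; \mathrm{Bernoulli}\!\big(a_i\,[c = X] + b_i\,[c \neq X]\big)
```

Supervised learning conditions on the known X and the sketch labels to get posteriors over each agent's (ai, bi); unsupervised learning treats X as hidden and infers the target location and the parameters jointly.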

Page 36: Unsupervised Learning Results with Humans Only: Target Location Belief

• Mission 1

[Figure: posterior target location belief over the discretized search space]

Page 37: Unsupervised Learning Results with Humans Only: Target Location Belief

• Mission 4
  – different Xtrue than Mission 1, different evidence

Page 38: Unsupervised Learning Results with Humans Only: Target Location Belief

• Mission 7
  – same Xtrue as Mission 4, but different evidence

Page 39: Unsupervised Learning Results with Humans Only: Posterior Sketch Likelihood Parameters

• Mission 1: agents that provide more negative info are more "trustworthy": fewer false alarms → lower expected bi; note: ai and wi not so observable

[Chart: posterior distributions of ai, bi, and wi for human agents i = 1 to 6]

Page 40: Unsupervised Learning Results with Humans Only: Posterior Sketch Likelihood Parameters

• Mission 7: similar overall, but some agents' values shifted with respect to Mission 1; true X unknown and posterior still diffuse, so ai and wi still poorly observable

[Chart: posterior distributions of ai, bi, and wi for human agents i = 1 to 6]

Page 41: Supervised Learning Results with Humans Only: Posterior Sketch Likelihood Parameters

• Pool data from all 7 missions with true X given in each case: improved observability; agent 5 "most trustworthy on average" (used most negative info)

[Chart: posterior distributions of ai, bi, and wi for human agents i = 1 to 6]

Page 42: What Does "Fully Bayesian" Estimation Buy Us?

[Figure: posterior p(Q|data) over the parameter space Q. MAP/ML point estimates make sense for a sharply peaked, unimodal posterior, but not for more general cases.]

What to do in general? Either:
• keep around multiple hypotheses, until evidence/data forces one to "win", or
• average over (marginalize out) all hypotheses, if you'll never know "for sure"
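A tiny numerical illustration of the difference (the bimodal posterior over a detection parameter is invented for illustration):

```python
# Compare a MAP point estimate with full marginalization when the posterior over
# a detection probability "a" is bimodal (e.g. "liar" vs. "oracle" hypotheses).
import numpy as np

a_grid = np.linspace(0.01, 0.99, 99)
posterior = 0.55 * np.exp(-0.5 * ((a_grid - 0.1) / 0.05) ** 2) \
          + 0.45 * np.exp(-0.5 * ((a_grid - 0.9) / 0.05) ** 2)
posterior /= posterior.sum()

a_map = a_grid[np.argmax(posterior)]          # single "winning" hypothesis
p_detect_map = a_map                          # prediction from the MAP point estimate
p_detect_bayes = np.sum(posterior * a_grid)   # prediction averaging over all hypotheses

print(f"MAP: {p_detect_map:.2f}   fully Bayesian: {p_detect_bayes:.2f}")
```

The point estimate commits to one mode, while the fully Bayesian prediction hedges across both hypotheses until the data resolve them.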

Page 43: What Does "Fully Bayesian" Estimation Buy Us?

• Supervised/unsupervised MAP point estimates for agent 6 in Mission 1:
  – Unsupervised: complete liar, start inverting! p[false neg.] = 1 and p[true det.] = 0
  – Supervised: oracle! p[false neg.] = 1 - ai = 0 and p[true det.] = ai = 1
• Full Bayes: integrates over total parameter uncertainty to account for info sensitivity (~Bayes point machines w.r.t. classification costs [Herbrich, et al., 2001])

Page 44: Ongoing Work (AFRL Collaboration): Human Sensor Fusion for Road Network Surveillance

Localize moving targets with event-based UGS and HDO data

• stochastic hybrid target dynamics along road network (with exits)

• delayed UGS/HDO data: collected out of order by UAV

• extra hard: data association uncertainties

[Images: unattended ground sensors (UGSs) and human dismount operators (HDOs) along the road network]

Page 45: Other Related Work: Bayesian Decentralized Data Fusion [Ahmed, MFI 2014; Ahmed, et al. RSS 2012; Ahmed and Campbell, TSP 2012]

• How to share local information sets Z among many decentralized agents?
  – "Ideal": centralized server: pool all Z, then process a single pdf; not robust or scalable
  – Bayesian message passing: process Z locally, then share pdfs; robust, scalable, O(N) convergence
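For reference, the standard Bayesian rule for fusing two agents' locally computed pdfs while removing the information they share (the generic decentralized data fusion update, stated with assumed notation rather than taken from the cited papers):

```latex
p(X \mid Z_a \cup Z_b) \;\propto\; \frac{p(X \mid Z_a)\, p(X \mid Z_b)}{p(X \mid Z_a \cap Z_b)}
```

Each agent processes its own data Z locally and only exchanges pdfs; the denominator term prevents double-counting of common information.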

Page 46: Acknowledgments

• Prof. Mark Campbell and Cornell ASL members:
  – Eric Sample, Chuck Yang, Ahmed El Samadisi, Art Sullivan, Ken Ho, Tauhira Hoossainy, Lucas de la Garza, Ke Hu, Conan Lao, Kai Wang, Cordelia Lee
  – Dr. Danelle Shah, Dr. Jon Schoenberg, Daniel Lee, Rina Tse

• Collaborators:

– GMU ARCH Lab (Prof. Raja Parasuraman, Prof. Tyler Shaw, et al)

– MIT ACL (Prof. Jon How, et al)

– Air Force Research Lab (Scott Galster, David Casbeer, Derek Kingston )

• Intrepid human subjects

• Sponsors:

– National Science Foundation GRFP

– Air Force Office of Scientific Research

– Army Research Office