Engineering Statistics for Multi-Object Tracking
Ronald Mahler, Lockheed Martin NE&SS Tactical Systems
Eagan, Minnesota, USA / 651-456-4819 / [email protected]
IEEE Workshop on Multi-Object Tracking, Madison, WI
June 21, 2003
This work was supported by the U.S. Army Research Office under contracts DAAH04-94-C-0011 and DAAG55-98-C-0039, and by the U.S. Air Force Office of Scientific Research under contract FF49620-01-C-0031. The content does not necessarily reflect the position or the policy of the Government. No official endorsement should be inferred.
Short Tutorial
Top-Down vs. Bottom-Up Data Fusion
usual “bottom-up” approach to data fusion:
– diverse, ill-characterized data
– separate low-level functions: data association, tracking, I.D., detection, fuzzy logic, sensor mgmt, data fusion
– heuristically integrate a patchwork of optimal or heuristic algorithms that address specific functions (mostly done by operators)
“top-down,” system-level approach:
– multiple sensors, multiple targets = single system
– specify states of system = multisensor-multitarget
– Bayesian-probabilistic formulation of problem
– reformulate in “Statistics 101,” “engineering-friendly” terms (requires random set theory)
– low-level functions subsumed in optimally integrated algorithms
• Systematic probabilistic foundation for multisensor, multitarget problems
• Preserves the “Statistics 101” formalism that engineers prefer
• Corrects unexpected difficulties in multitarget information fusion
• Potentially powerful new computational techniques (1st-order moment filtering)
• Unifies much of multisource, multitarget, multiplatform data fusion
detection, tracking, identification, performance estimation, information theory, control theory, sensor management, group targets
Bayes, fuzzy logic, Dempster-Shafer, rules
Our Topic: “Finite-Set Statistics” (FISST)
FISST
unified Bayesian data fusion
Mathematics of Data Fusion, by I.R. Goodman, Ronald P.S. Mahler, and Hung T. Nguyen; Kluwer Academic Publishers, Theory and Decision Library (book, out of print)
Random Sets: Theory and Applications, John Goutsias, Ronald P.S. Mahler, Hung T. Nguyen, editors; Springer, Volume 97 (hardcover proceedings)
SPIE Milestone Series Volume MS 124
An Introduction to Multisource-Multitarget Statistics and its Applications, Ronald Mahler, March 15, 2000 (monograph)
Handbook of Multisensor Data Fusion, edited by David L. Hall and James Llinas (book chapter 14)
Invited presentations and scientific workshops: ATR Working Group, USAF Correlation Symp., SPIE AeroSense Conf., Nat'l Symp. on Data Fusion, IEEE Conf. Dec. & Contr., Optical Discr. Alg's Conf., ONR Workshop on Tracking, U. Wisconsin, 1995 ONR/ARO/LM Workshop, Invited Session: SPIE AeroSense '99, MDA/POET, AFIT, NRaD, Harvard, Johns Hopkins, U. Massachusetts, New Mexico State, IDC2002, Lockheed Martin
FISST-Related Books and Monographs
Current FISST Technology Transition
(legend: basic research project / applied research project / completed research project)
finite-set statistics
Projects: scientific performance estimation for Level 1 fusion; robust SAR ATR against ground targets; unified collection & control for UAVs; BMD technology; anti-air multisource radar fusion; robust INTELL fusion; scientific performance estimation for Levels 2, 3, 4 data fusion; robust joint tracking and NCTI using HRRR and track radar; basic research in data fusion, tracking, & NCTI; robust multitarget ASW/ASUW fusion & NCTI; robust ASW acoustic target ID
Sponsors: AFRL, AFOSR, MRDEC, LMTS, MDA, ARO, SSC/ONR
AFOSR = Air Force Office of Scientific Research; AFRL = Air Force Research Laboratory; ARO = Army Research Office; LMTS = Lockheed Martin Tactical Systems; MDA = Missile Defense Agency; MRDEC = Army Missile Research, Dev & Eval Center; ONR = Office of Naval Research; SSC = SPAWAR Systems Center
Topics
1. Introducing “Dr. Dogbert,” Critic of FISST
2. What is “engineering statistics”?
3. Finite-set statistics (FISST)
4. Multitarget Bayes Filtering
5. Brief introduction to applications
multitarget calculus, multitarget statistics, multitarget modeling
Dr. Dogbert, Critic of FISST
What is extraordinary about this assertion is not the ignorance it displays regarding FISST but, rather, the ignorance it displays regarding Bayes’ Rule!
The multisource-multitarget engineering problems addressed by FISST actually require nothing more complicated than Bayes’ rule…
…which means that FISST is of mere theoretical interest at best. At worst, it is nothing more than pointless mathematical obfuscation.
The Bayesian Iceberg
Bayes’ rule
non-standard application
Nothing deep here!
The optimality and simplicity of the Bayesian framework can be taken for granted only within the confines of standard applications addressed by standard textbooks. When one ventures out of these confines one must exercise proper engineering prudence—which includes verifying that standard textbook assumptions still apply. This is especially true in multitarget problems!
How Multi- and Single-Object Statistics Differ
Bayes’ rule
I sit astride a mountaintop!
– the multitarget maximum likelihood estimator (MLE) is defined, but the multitarget maximum a posteriori (MAP) estimator is not!
– multitarget Shannon entropy cannot be defined!
– need explicit methods for constructing provably true multisensor-multitarget likelihood functions from explicit sensor models!
– need explicit methods for constructing provably true multitarget Markov transition densities from explicit motion models!
– new multitarget state estimators must be defined and proved optimal!
– multitarget expected values are not defined!
– the multitarget L2 (i.e., RMS distance) metric cannot be defined!
Dr. Dogbert (Ctd.)
I, the great Dr. Dogbert, have discovered a powerful new research strategy! Strip off the mathematics that makes FISST systematic, general, and rigorous, copy its main ideas, change notation, and rename it Dogbert Theory!
A mere change of name and notation does not constitute a great advance. It is easy to claim invention of a simple and yet elastically all-subsuming theory if one does so by neglecting, avoiding, or glossing over the specifics required to actually substantiate puffed-up boasts of generality and “first principles”.
This is a great advance because it’s much simpler: It doesn’t need all that ugly math (Blech!) that no one understands anyway!
Dogbert Theory is straightforward precisely because it avoids specifying explicit general methods and foundations—and yet is still fully general and derived from first principles!
• Engineering statistics, like all engineering mathematics, is a tool and not an end in itself. It must have two (inherently conflicting) characteristics:
– Trustworthiness: Constructed upon systematic, reliable mathematical foundations, to which we can appeal when the going gets rough.
– Fire and forget: These foundations can be safely neglected in most applications, leaving a parsimonious and serviceable mathematical machinery in their place.
What is “Engineering Statistics”?
• The “Bar-Shalom Test” defines the dividing line between the serviceable and the simplistic:
– “Things should be as simple as possible—but no simpler!” (Y. Bar-Shalom, quoting Einstein)
• If foundations are:
– so mathematically complex that they cannot be taken for granted in most situations, then they are shackles and not foundations!
– so simple that they repeatedly lead us into engineering blunders, then they are simplistic, not simple!
Single-Sensor / Target Engineering Statistics
• Progress in single-sensor, single-target applications has been greatly facilitated by the existence of a systematic, rigorous, and yet practical engineering statistics that supports the development of new concepts in this field.
random observation vector Z, random state vector X
observations z, states x
sensor measurement model: Zk = hk(x) + Wk
target Markov motion model: Xk+1 = φk(x) + Vk
probability-mass functions: pk(S) = Pr(Zk ∈ S), pk+1|k(S) = Pr(Xk+1 ∈ S)
integro-differential calculus: ∫ · dx, d/dx
likelihood function fk(z|x); Markov transition density fk+1|k(y|x)
posterior & prior densities: fk|k(x|Zk), fk+1|k(x|Zk)
Bayes-optimal filtering
Bayes-optimal state estimators: x̂k|k
expectations & other moments: E[Z], E[X]
miss distance: ||z1 − z2||, ||x1 − x2||
What is Single-Target Bayes Statistics?
Classical statistics: unknown state-vector x of the target is a fixed parameter
Bayesian statistics: the unknown state is always a random vector X even if we do not explicitly know either it or its distribution f0 (the prior). The single-sensor, single-target likelihood function is a conditional probability density.
* equalities temporarily assume discrete state & observation spaces for ease of argument
random state-vector X, drawn from the state space X: prior distribution* f0(x) = Pr(X = x)
random observation-vector Z, drawn from the observation space Z: sensor likelihood function* Lz(x) = f(z|x) = Pr(Z = z | X = x)
Single-Target Bayes Statistics: Distributions
X: specify the single-target state space and its measure structure
Z: specify the single-sensor/target measurement space and its measure structure
random variables (e.g., random vectors)
f0(x) = Pr(X = x) signifies a random draw from a single state-space
f(z|x) = Pr(Z = z | X = x) signifies a random draw from a single measurement space
1. time-update step via prediction integral
The Recursive Bayes Filter
interim target state-change: random target state Xk|k = x, predicted random state Xk+1|k = y
true Markov transition density: Xk+1|k = φk(x) + Vk (motion model), so fk+1|k(y|x) = fVk(y − φk(x))
prediction integral: fk+1|k(y|Zk) = ∫ fk+1|k(y|x) fk|k(x|Zk) dx
(the current posterior, conditioned on the old data-stream Zk = {z1,…, zk}, is predicted to the time of the new datum z)
assuming fk+1|k(y|x, Zk) = fk+1|k(y|x)
The Recursive Bayes Filter
2. data-update step via Bayes’ rule
sensor observation zk+1: predicted random state Xk+1|k, data-updated random state Xk+1|k+1
true likelihood function: Zk = hk(x) + Wk (measurement model), so fk(z|x) = fWk(z − hk(x))
Bayes’ rule: fk+1|k+1(x|Zk+1) ∝ fk+1(zk+1|x) fk+1|k(x|Zk)
(new posterior, conditioned on all data Zk+1 = {z1,…, zk+1}; the prior is the predicted posterior)
assuming fk+1(z|x, Zk) = fk+1(z|x)
The Recursive Bayes Filter
3. estimation step via Bayes-optimal state estimator
current best estimate of xk+1, computed from Xk+1|k and Xk+1|k+1:
maximum a posteriori (MAP): x̂k+1|k+1 = arg supx fk+1|k+1(x|Zk+1)
expected a posteriori (EAP): x̂k+1|k+1 = ∫ x fk+1|k+1(x|Zk+1) dx
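A minimal numerical sketch of the three-step recursion above (prediction integral, Bayes'-rule data update, MAP/EAP estimation) on a one-dimensional grid. The random-walk motion model, Gaussian noise levels, grid bounds, and measurement stream are illustrative assumptions, not part of the original slides.

```python
import numpy as np

# Discretized single-target Bayes filter: predict, update, estimate.
# Illustrative assumptions: 1-D state on a grid, random-walk motion model
# x_{k+1} = x_k + V_k, measurement z_k = x_k + W_k, Gaussian V_k and W_k.

x_grid = np.linspace(0.0, 10.0, 501)          # state-space grid
dx = x_grid[1] - x_grid[0]

def gaussian(u, sigma):
    return np.exp(-0.5 * (u / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def predict(posterior, sigma_v=0.3):
    """Time update: f_{k+1|k}(y) = ∫ f_{k+1|k}(y|x) f_{k|k}(x) dx."""
    markov = gaussian(x_grid[:, None] - x_grid[None, :], sigma_v)   # f(y|x)
    return markov @ posterior * dx

def update(prior, z, sigma_w=0.5):
    """Data update (Bayes' rule): f_{k+1|k+1}(x) ∝ f(z|x) f_{k+1|k}(x)."""
    likelihood = gaussian(z - x_grid, sigma_w)                      # f(z|x)
    unnorm = likelihood * prior
    return unnorm / (unnorm.sum() * dx)

def estimate(posterior):
    """MAP and EAP state estimates from the posterior."""
    x_map = x_grid[np.argmax(posterior)]
    x_eap = np.sum(x_grid * posterior) * dx
    return x_map, x_eap

# One filtering run starting from a diffuse prior.
f = np.full_like(x_grid, 1.0 / (x_grid[-1] - x_grid[0]))
for z in [4.1, 4.4, 4.9]:                     # simulated measurement stream
    f = update(predict(f), z)
print(estimate(f))
```

With linear-Gaussian models such as these, replacing the grid by a Gaussian parameterization of fk|k recovers the Kalman filter as the familiar special case.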
Basic Issues in Single-Target Bayes Filtering
sensor models f(z|x):
– without a “true” likelihood for both target & background, the Bayes-optimal claim is hollow
– “over-tuned” likelihoods may be non-robust w/r/t deviations between model & reality
motion models fk+1|k(x|y):
– poor motion-model selection leads to performance degradation
– without a “true” Markov density, good model selection is wasted
state estimation/convergence x̂k|k:
– without guaranteed convergence, an approximate filter may be unstable and non-robust
– poorly chosen state estimators may have hidden inefficiencies (erratic, inaccurate, slowly convergent, etc.)
computability of fk|k(y|Zk):
– filtering equations are computationally nasty
– “textbook” approaches create computational inefficiencies: e.g., numerical instability due to central finite-difference schemes
Bayes-Optimal State Estimators
pre-specified cost function C(x, y)
state estimator x̂(z1,…, zm): a family of state-valued functions of the observations z1,…, zm, m > 0
the state estimator x̂ is “Bayes-optimal” with respect to cost C if it minimizes the Bayes risk
without a Bayes-optimal estimator, we aren’t Bayes-optimal!
RC(x̂, m) = ∫ C(x, x̂(z1,…, zm)) f(z1,…, zm | x) f0(x) dz1 ⋯ dzm dx
Bayes risk = expected value of the total posterior cost
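A small discrete illustration of the Bayes-risk definition above. The three-state prior, two-valued likelihood, and 0-1 cost are made-up numbers used only to show the bookkeeping; under 0-1 cost the MAP estimator should come out with the smaller risk.

```python
import numpy as np

# Discrete illustration of Bayes risk R_C(x̂) = Σ_{x,z} C(x, x̂(z)) f(z|x) f0(x).
# Made-up numbers: x ∈ {0,1,2}, z ∈ {0,1}, 0-1 cost C(x,y) = 1[x ≠ y].

f0 = np.array([0.5, 0.3, 0.2])                  # prior f0(x)
f_z_given_x = np.array([[0.9, 0.1],             # rows: x, cols: z
                        [0.4, 0.6],
                        [0.2, 0.8]])

def bayes_risk(estimator):
    """Expected 0-1 cost of a state estimator x̂(z), averaged over x and z."""
    risk = 0.0
    for x in range(3):
        for z in range(2):
            cost = 1.0 if estimator(z) != x else 0.0
            risk += cost * f_z_given_x[x, z] * f0[x]
    return risk

def map_estimator(z):
    """MAP estimate: argmax_x f(z|x) f0(x)."""
    return int(np.argmax(f_z_given_x[:, z] * f0))

print("Bayes risk of MAP estimator:", bayes_risk(map_estimator))
print("Bayes risk of 'always guess 0':", bayes_risk(lambda z: 0))
```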
True Likelihood Functions & Markov Densities
Step 1: construct model.  Step 2: use probability-mass function to describe model statistics.  Step 3: construct “true” density function that faithfully describes model.
Bayesian modeling of OBSERVATIONS:
measurement model: Zk = hk(x) + Wk (random observation = deterministic measurement model + sensor noise process)
probability-mass function: pk(S|x) = Pr(Zk ∈ S) = probability that the observation is in S
true likelihood function: fk(z|x) = dpk/dz = Radon-Nikodým derivative of pk
Bayesian modeling of MOTION:
motion model: Xk+1|k = φk+1|k(x) + Vk (random state of moved target = deterministic motion model + motion noise process)
probability-mass function: pk+1|k(S|x) = Pr(Xk+1|k ∈ S) = probability that the moved target is in S
true Markov transition density: fk+1|k(y|x) = dpk+1|k/dy = Radon-Nikodým derivative of pk+1|k
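A hedged sketch of Steps 1-3 for the additive-noise case: with a deterministic measurement function hk and motion function φk plus known noise densities, the "true" densities reduce to fk(z|x) = fWk(z − hk(x)) and fk+1|k(y|x) = fVk(y − φk(x)). The particular h, φ, and noise levels below are illustrative.

```python
import numpy as np

# Additive-noise case of the three-step Bayesian modeling recipe:
#   measurement model Z_k = h_k(x) + W_k   ->  f_k(z|x)      = f_W(z - h_k(x))
#   motion model    X_{k+1} = φ_k(x) + V_k ->  f_{k+1|k}(y|x) = f_V(y - φ_k(x))
# The specific h, φ, and noise standard deviations are illustrative only.

def h(x):                       # deterministic measurement model (range-like)
    return np.sqrt(1.0 + x ** 2)

def phi(x):                     # deterministic motion model (constant drift)
    return x + 0.5

def gaussian(u, sigma):
    return np.exp(-0.5 * (u / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def likelihood(z, x, sigma_w=0.2):
    """True likelihood induced by the measurement model."""
    return gaussian(z - h(x), sigma_w)

def markov_density(y, x, sigma_v=0.1):
    """True Markov transition density induced by the motion model."""
    return gaussian(y - phi(x), sigma_v)

print(likelihood(z=2.3, x=2.0), markov_density(y=2.6, x=2.0))
```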
Finite-Set Statistics (FISST)
• One would expect that multisensor, multitarget applications rest upon a similar engineering statistics
• Not so, despite long existence of point process theory (PPT)
• FISST is in part an “engineering friendly” version of PPT
random observation-set Σ, random state-set Ξ
multisensor-multitarget observations Z, multitarget states X
multitarget measurement model: Σk = hk(X) ∪ Ck
multitarget Markov motion model: Ξk+1 = Φk(X) ∪ Bk
belief-mass functions: βk(S) = Pr(Σk ⊆ S), βk+1|k(S) = Pr(Ξk+1 ⊆ S)
FISST integro-differential calculus: set integral ∫ · δX, set derivative δ/δX
multitarget likelihood fk(Z|X); multitarget Markov transition density fk+1|k(Y|X)
multitarget posterior & prior densities: fk|k(X|Z(k)), fk+1|k(X|Z(k))
Bayes-optimal filtering
Bayes-optimal multitarget state estimators: X̂k|k
PHDs & other moments: D(z), D(x)
multi-observation miss distance d(Z1, Z2); multitarget miss distance d(X1, X2)
FISST, I: a systematic, probabilistic framework for modeling uncertainties in data
common probabilistic framework: random closed sets
– statistical data (e.g., radar report): random set model, likelihood function f(z|x) with ∫ f(z|x) dz = 1
– fuzzy data (English-language report), imprecise data (datalink attribute), contingent rules: random set model, generalized likelihood function ρ(z|x) for which ∫ ρ(z|x) dz need not equal 1
– poorly characterized likelihoods: random set model
Example: Observation = “Gustav is NEAR the tower”
interpretations of “NEAR” (random set model of a natural-language statement), with probabilities p1, p2, p3, p4 for each interpretation
Dealing With Ill-Characterized Data
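A minimal sketch of scoring the "Gustav is NEAR the tower" observation above against a candidate target state: the random set Θ equals one of several interpretation regions of "NEAR" with probabilities p1,…, p4, and the generalized likelihood of a state x is taken here as the probability that x falls inside Θ. The regions and probabilities are invented for illustration.

```python
import numpy as np

# Random-set model of "Gustav is NEAR the tower": Θ is a random region around
# the tower, equal to interpretation region i with probability p_i.
# Generalized likelihood ρ(Θ|x) = Pr(x ∈ Θ).  Regions/probabilities are invented.

tower = np.array([0.0, 0.0])
interpretations = [                      # (radius in km, probability)
    (0.5, 0.4),                          # "very near"
    (1.0, 0.3),
    (2.0, 0.2),
    (5.0, 0.1),                          # "loosely near"
]

def generalized_likelihood(x):
    """ρ(Θ|x) = Σ_i p_i · 1[x inside interpretation region i]."""
    r = np.linalg.norm(np.asarray(x) - tower)
    return sum(p for radius, p in interpretations if r <= radius)

print(generalized_likelihood([0.3, 0.2]))   # close to the tower: high score
print(generalized_likelihood([3.0, 0.0]))   # far from the tower: low score
```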
FISST, II
sensors
targets
diverse observations
reformulate multi-object problem as generalized single-object problem
“meta-sensor”, “meta-observation”, “meta-target”
random observation-set, random state-set
multisensor state: X*k = {x*1,…, x*s}
multisensor-multitarget observation: Z = {z1,…, zm}
multitarget state: X = {x1,…, xn}
FISST, III
single-sensor/target: sensor, target, vector observation z, vector state x; derivative dpZ/dz, integral ∫ f(x) dx; prob.-mass func. pZ(S), likelihood fZ(z|x), prior PDF f0(x); information theory, filtering theory
multi-sensor/target, multisource-multitarget “Statistics 101”: meta-sensor, meta-target, finite-set observation Z, finite-set state X; set derivative δβ/δZ, set integral ∫ f(Z) δZ; belief-mass func. β(S), multitarget likelihood f(Z|X), multitarget prior PDF f0(X); multitarget information theory, multitarget filtering theory
Almost-Parallel Worlds Principle (APWOP)
detection, tracking, identification, performance estimation, information theory, control theory, sensor management, group targets
Bayes, fuzzy logic, Dempster-Shafer, rules
FISST, IV
FISST: unified Bayesian data fusion
systematic, rigorous foundation for:
– ill-characterized data & likelihoods
– multitarget filtering and estimation
– multigroup filtering & sensor management
– computational approximation
Classical statistics: unknown state-set X of the target is a fixed parameter
What is Bayes Multitarget Statistics?
Bayesian statistics: the unknown state-set is always a random finite set Ξ even if we do not explicitly know either it or its distribution f0 (the prior). The multisensor-multitarget likelihood function is a conditional probability density.
* equalities temporarily assume discrete state & observation spaces for ease of argument
random state-set Ξ, drawn from the state space X: multitarget prior distribution* f0(X) = Pr(Ξ = X)
random observation-set Σ, drawn from the observation space Z: multisensor-multitarget likelihood function* LZ(X) = f(Z|X) = Pr(Σ = Z | Ξ = X)
Multitarget Bayes Statistics: Distributions
multitarget likelihoods and distributions must have a specific form
strictly speaking: re-writing f0(X) as a family of ordinary densities (one for each choice of target number), f0,0, f0,1(x1), f0,2(x1, x2), …, f0,n(x1,…, xn), …, is not Bayesian because it then does not describe a random draw from a single multitarget state-space.
X*: specify the multitarget state space and its measure structure (e.g., finite subsets of the single-target state space). Subtle issues are involved here!
Z*: specify the multisensor-multitarget measurement space and its measure structure—e.g., finite subsets of Z = Z1 ∪ … ∪ Zs (the union of the measurement spaces of the individual sensors)
random variables (e.g., random sets)
f(Z|X) = Pr(Σ = Z | Ξ = X) signifies a random draw from a single multisensor-multitarget measurement space
f0(X) = Pr(Ξ = X) signifies a random draw from a single multitarget state-space
Point Processes / Random Finite Sets
a point process is a random finite multi-set
{x1,…, x1, …, xn,…, xn} (ν1 copies of x1, …, νn copies of xn): finite multi-set (= set with repetition of elements)
ν1·δx1(x) + … + νn·δxn(x): Dirac mixture density
ν1·δx1(S) + … + νn·δxn(S): counting measure, where δx(S) = ∫S δx(y) dy
δ(ν1,x1)(T) + … + δ(νn,xn)(T): simple counting measure on pairs
δ(ν1,x1)(ν, x) + … + δ(νn,xn)(ν, x): simple Dirac mixture density on pairs
{(ν1, x1),…, (νn, xn)}: finite set of pairs
Set Integral (Multi-Object Integral)
Set integral:
∫ f(Y) δY = Σn≥0 (1/n!) ∫ f({y1,…, yn}) dy1 ⋯ dyn
(a sum over all possible numbers of targets; for each n, an ordinary integral over all multi-object states with n objects)
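A numerical sanity check of the set-integral definition, assuming a Poisson multi-object density f({x1,…, xn}) = e^(−λ) D(x1)⋯D(xn) with a Gaussian-shaped intensity D(x); its set integral should come out to 1. The intensity and truncation point are illustrative choices.

```python
import numpy as np
from math import factorial

# Numerical check that the set integral of a multi-object density equals 1.
# Assumed example: Poisson RFS with intensity D(x) = λ·N(x; 0, 1), whose
# density is f({x1,...,xn}) = exp(-λ) · D(x1)···D(xn).

lam = 2.0
x_grid = np.linspace(-8.0, 8.0, 2001)
dx = x_grid[1] - x_grid[0]
D = lam * np.exp(-0.5 * x_grid ** 2) / np.sqrt(2 * np.pi)   # intensity D(x)

mass_D = D.sum() * dx                      # ∫ D(x) dx  (≈ λ)

# Set integral: Σ_n (1/n!) ∫...∫ f({x1,...,xn}) dx1...dxn
#             = exp(-λ) · Σ_n (∫ D dx)^n / n!
set_integral = sum(np.exp(-lam) * mass_D ** n / factorial(n) for n in range(30))
print(set_integral)                        # ≈ 1.0
```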
Set Derivative (Multi-Object Derivative)
Frechét derivative of a functional G[h] (of a function-variable h(x)):
(∂G/∂g)[h] = limε→0 (G[h + ε·g] − G[h]) / ε, where g ↦ (∂G/∂g)[h] is continuous and linear
Functional derivative (from physics):
(δG/δx)[h] = limε→0 (G[h + ε·δx] − G[h]) / ε, where δx = Dirac delta function concentrated at x
Set derivative of a set-function β(S) = G[1S], for X = {x1,…, xn}:
(δβ/δX)(S) = (δnG / δx1 ⋯ δxn)[1S]
Probability Law of a Random Finite Set
G[h] = E[hΞ]: probability generating functional (p.g.fl.), where hX = Πx∈X h(x) and h is a bounded real-valued test function
β(S) = G[1S] = Pr(Ξ ⊆ S): belief-mass function = probability that all targets are in S (Choquet capacity theorem)
f(X) = (δG/δX)[0] = (δβ/δX)(∅) = “Pr(Ξ = X)”, X = {x1,…, xn}: multitarget probability distribution (= Janossy densities)
Multitarget Posterior Density Functions
multitarget posterior fk|k(X|Z(k)), as a function of the multitarget state X:
fk|k(∅|Z(k)) (no targets)
fk|k({x1}|Z(k)) (one target)
fk|k({x1, x2}|Z(k)) (two targets)
…
fk|k({x1,…, xn}|Z(k)) (n targets)
measurement-stream: Z(k) = {Z1,…, Zk}
multisensor-multitarget measurements: Zk = {z1,…, zm(k)} = individual measurements collected at time k
normality condition: ∫ fk|k(X|Z(k)) δX = 1
The Multitarget Bayes Filter
multitarget Bayes filter: fk|k(X|Z(k)) → fk+1|k(X|Z(k)) → fk+1|k+1(X|Z(k+1))
random state-sets Ξk|k → Ξk+1|k → Ξk+1|k+1: multi-target motion in state space (e.g., five targets → three targets)
random observation-sets Z produced by the targets in observation space; new observation-set Zk+1
Issues in Multitarget Bayes Filtering
sensor models f(Z|X):
– what does a “true” multisensor-multitarget likelihood even mean?
– how do we construct multisensor-multitarget measurement models and multisensor-multitarget likelihood functions?
motion models fk+1|k(X|Y):
– multitarget motion models must account for changes and correlations in target number and motions
– how do we construct multitarget motion models and true multitarget Markov densities?
– what does a “true” multitarget Markov density even mean?
state estimation X̂k|k:
– multitarget analogs of the usual Bayes-optimal estimators are not even defined!
– must define new multitarget state estimators and show that they are well-behaved
computability of fk|k(Y|Z(k)):
– new mathematical tools are even more essential than in the single-target case
True Multitarget Likelihoods & Markov Densities
Step 1: construct model.  Step 2: use belief-mass function to describe model statistics.  Step 3: construct “true” density function that faithfully describes model.
Bayesian modeling of OBSERVATIONS:
multitarget measurement model: Σk = Tk(X) ∪ Ck (random observation-set = target-generated measurements ∪ clutter process)
belief-mass function: βk(S|X) = Pr(Σk ⊆ S) = probability that the observation-set is in S
true likelihood function: fk(Z|X) = (δβk/δZ)(∅|X) = set derivative of βk
Bayesian modeling of MOTION:
multitarget motion model: Ξk+1|k = Φk+1|k(X) ∪ Bk (random state-set of old targets ∪ random state-set of new targets)
belief-mass function: βk+1|k(S|X) = Pr(Ξk+1|k ⊆ S) = probability that the moved targets are in S
true Markov transition density: fk+1|k(Y|X) = (δβk+1|k/δY)(∅|X) = set derivative of βk+1|k
Failure of Classical Bayes Estimators
simple posterior distribution: no-target state ∅ with posterior probability f(∅) = 1/2; position states 0 ≤ x ≤ 2 km with density f(x) = 1/4 km⁻¹
Some problems go away if continuous states are discretized into cells, BUT:
– the approach is no longer comprehensive
– the power of continuous mathematics is lost
New multitarget state estimators must be defined and proved optimal, consistent, etc.!
naïve MAP: compare f(∅) = 1/2 with f(x) = 1/4 km⁻¹ (incommensurable!)
naïve EAP: ∫ X f(X) δX = f(∅)·∅ + ∫ x f(x) dx = ½·(∅ + 1 km) ?? (incommensurable!)
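A one-line arithmetic check of why the naïve MAP comparison above is meaningless: f(∅) = 1/2 is a unitless probability while f(x) = 1/4 km⁻¹ is a density, so merely changing the distance unit changes which one "wins". Only the unit conversions are added; the numbers are the slide's own.

```python
# Naive MAP on the slide's posterior: f(∅) = 1/2 (unitless probability),
# f(x) = 1/4 per km on 0 ≤ x ≤ 2 km (a density).  Comparing them directly
# depends on the distance unit, so the "estimate" is not well defined.

f_empty = 0.5                    # probability of "no target"
f_x_per_km = 0.25                # density, km^-1

for unit, km_per_unit in [("km", 1.0), ("m", 1e-3), ("unit of 4 km", 4.0)]:
    f_x = f_x_per_km * km_per_unit          # same density re-expressed per new unit
    winner = "no target" if f_empty > f_x else "target present"
    print(f"in {unit}: f(x) = {f_x} per {unit} -> naive MAP says {winner}")
```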
SAR features, LRO features, MTI features, ELINT features, SIGINT features
Application 1: Poorly Characterized Data
interpretation by operators, sent out on a coded datalink: alphanumeric-format messages corrupted by unknowable variations (“fat-fingering”, fatigue, overloading, differences in training & ability)
model “ambiguous data” as random closed subsets of observation space
choose a specific modeling technique: vague (fuzzy), imprecise (Dempster-Shafer), contingent (rules), general
Modeling Ill-Characterized Data
generalized (i.e., unnormalized) likelihood function: ρ(Θ|x) = Pr(x ∈ Θ)
practical implementation via “matching models” (CAUTION! partially heuristic)
single- or multi-target Bayes filters (CAUTION! non-Bayesian)
IMA/SAR/22.0F/29.0F/14.9F/ Y / - / Y / - / - / T / 6 / - //
SAR line from formatted datalink message
Generalized Likelihood of Datalink Message Line
convert features to fuzzy sets: g1, g2, …, g11
compare to fuzzy templates in database: f1,x, f2,x, …, f11,x
compute attribute likelihoods: ℓ1(x), ℓ2(x), …, ℓ11(x)
generalized likelihood of SAR message line: ρSAR(SAR|x) = ℓ1(x) ∧ ℓ2(x) ∧ ⋯ ∧ ℓ10(x) ∧ ℓ11(x)
the ‘∧’ operation models statistical correlation intermediate between perfect correlation and independence:
a ∧ b = ab / (a + b − ab)
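A hedged sketch of the attribute-likelihood fusion just described, assuming the '∧' operation is a ∧ b = ab/(a + b − ab), which lies between the independence product a·b and the perfect-correlation minimum min(a, b). The attribute likelihood values are placeholders.

```python
from functools import reduce

# Sketch of fusing per-attribute generalized likelihoods into a message-line
# likelihood, assuming the '∧' operation a ∧ b = ab / (a + b - ab), which sits
# between independence (a·b) and perfect correlation (min(a, b)).
# The attribute likelihood values below are placeholders.

def conj(a, b):
    """Correlation-hedging conjunction of two attribute likelihoods."""
    if a == 0.0 and b == 0.0:
        return 0.0
    return a * b / (a + b - a * b)

attribute_likelihoods = [0.9, 0.7, 0.95, 0.6, 0.8]   # ℓ_1(x), ..., ℓ_m(x)

line_likelihood = reduce(conj, attribute_likelihoods)
print(line_likelihood)
```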
Application 2: Poorly Characterized Likelihoods
statistically uncharacterizable “extended operating conditions” (EOCs):
wet mud
dents
irregular placement of standard equipment (e.g. grenade launchers)
turret articulations
non-standard equipment (e.g. hibachi welded on motor)
Modeling Ill-Characterized Likelihoods
nominal likelihood function: ℓz,i(c) = f(zi|c) = likelihood value for the intensity value zi in pixel i
error bars of increasing width: best-guess error bar, somewhat hedged error bar, more hedged error bar, very hedged error bar
random error bar on pixel likelihood (random interval): each ℓz,i(c) is replaced by an interval around its nominal value
random error bar on image likelihood (interval arithmetic): combine the per-pixel error bars ℓz,1(c), …, ℓz,N(c) over all N pixels
fuzzy error bar on image likelihood
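A rough sketch, under stated assumptions, of the "hedged error bar" idea: each pixel's nominal likelihood value is widened into an interval, and the image-level likelihood becomes an interval computed by interval arithmetic. The nominal values and hedging factors are illustrative, and a product combination across pixels is assumed.

```python
import numpy as np

# Sketch of "hedged" likelihoods: each pixel's likelihood value is replaced by
# an interval around the nominal value, and the image likelihood becomes an
# interval obtained by interval arithmetic (products of per-pixel intervals).
# Nominal values, the product combination, and hedging factors are assumptions.

nominal = np.array([0.8, 0.6, 0.9, 0.7])        # ℓ_{z,i}(c) per pixel

def hedged_image_likelihood(nominal, hedge):
    """Interval-valued image likelihood: product of per-pixel intervals.
    hedge = 0 reproduces the nominal (point) likelihood; a larger hedge widens
    the error bar."""
    lo = np.clip(nominal * (1.0 - hedge), 0.0, 1.0)
    hi = np.clip(nominal * (1.0 + hedge), 0.0, 1.0)
    return lo.prod(), hi.prod()

for hedge in (0.0, 0.1, 0.3, 0.6):              # best-guess ... very hedged
    print(hedge, hedged_image_likelihood(nominal, hedge))
```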
ordinary metrics: miss distance, I.D. miss, track purity
metrics on the multisensor-multitarget algorithm itself: mathematical information produced by the algorithm (multitarget Kullback-Leibler), multitarget miss distance
Application 3: Scientific Performance Estimation
Multitarget Miss Distance
ground truth targets G; multi-target tracker data: n tracks, X
Oliver Drummond et al. have proposed optimal truth-to-track association for evaluating multitarget tracker algorithms: find the best association between truths gi and tracks xi
the “natural” approach works only when the numbers of truths and tracks are equal:
d(G, X) = minσ (1/n) Σi=1..n ||gi − xσ(i)||
the Wasserstein distance rigorously and tractably generalizes this intuitive approach
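A minimal sketch of the optimal truth-to-track association distance for the equal-cardinality case, using an assignment solver for the minimization over permutations; the general Wasserstein construction additionally handles unequal numbers of truths and tracks via fractional assignment weights, which is not shown here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Optimal truth-to-track assignment distance for equal numbers of truths and
# tracks: d(G, X) = min over permutations σ of (1/n) Σ_i ||g_i - x_σ(i)||.

def assignment_miss_distance(truths, tracks):
    truths, tracks = np.asarray(truths), np.asarray(tracks)
    cost = np.linalg.norm(truths[:, None, :] - tracks[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)     # optimal truth-to-track pairing
    return cost[rows, cols].mean()

ground_truth = [[0.0, 0.0], [5.0, 1.0], [9.0, 3.0]]
tracker_out  = [[5.2, 0.8], [0.3, -0.1], [8.5, 3.4]]
print(assignment_miss_distance(ground_truth, tracker_out))
```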
Application 4: Group-Target Filtering
group target 2 (tank brigade)
group target 1 (armored infantry)
individual targets
detect (i.e., this target group is actually a group target)
track (e.g., this group target is moving towards objective A)
classify (e.g., this group target is an armored brigade)
multi-group Bayes filter: fk|k(Xk|Z(k)) → fk+1|k(Xk+1|Z(k)) → fk+1|k+1(Xk+1|Z(k+1))
random group-sets Ξk|k undergoing multi-group motion in state space (e.g., five groups → three groups)
random observation-sets Z, produced by the targets in the group targets, in observation space
Approximate 1st-Order Multi-Group Bayes Filter
random group-sets Ξk|k → Ξk+1|k → Ξk+1|k+1
1st-moment “group PHD” filter: Dk|k(g, x|Z(k)) → Dk+1|k(g, x|Z(k)) → Dk+1|k+1(g, x|Z(k+1))
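A compact grid-based sketch of 1st-order-moment (PHD) filtering for ordinary targets rather than the group-PHD variant named above; the group filter carries a (group, state) argument but has the same predictor/corrector structure. All models and constants below are illustrative assumptions.

```python
import numpy as np

# Grid-based sketch of the PHD (1st-moment) filter: the intensity D(x) is
# predicted through survival, motion, and births, then corrected against the
# measurement set.  Models and constants are illustrative assumptions.

x = np.linspace(0.0, 20.0, 401)
dx = x[1] - x[0]
p_S, p_D = 0.95, 0.9                   # survival & detection probabilities
clutter_intensity = 0.1 / 20.0         # κ(z): 0.1 clutter returns over [0, 20]
birth = np.full_like(x, 0.05 / 20.0)   # birth PHD: 0.05 expected new targets

def gaussian(u, sigma):
    return np.exp(-0.5 * (u / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def phd_predict(D):
    markov = gaussian(x[:, None] - (x[None, :] + 0.5), 1.0)   # drift + diffusion
    return birth + p_S * (markov @ D) * dx

def phd_update(D, measurements, sigma_w=0.5):
    updated = (1.0 - p_D) * D                   # missed-detection term
    for z in measurements:
        Lz = gaussian(z - x, sigma_w)           # likelihood f(z|x)
        denom = clutter_intensity + p_D * np.sum(Lz * D) * dx
        updated += p_D * Lz * D / denom
    return updated

D = np.full_like(x, 2.0 / 20.0)                 # initial PHD: about 2 targets
for Z in ([4.0, 12.1], [4.3, 12.0, 17.5]):      # measurement sets over 2 scans
    D = phd_update(phd_predict(D), Z)
print("expected target number:", D.sum() * dx)
```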
Application 5: Sensor Management
multiple sensors on multiple platforms (data, attributes, language, rules): all sensors on all platforms regarded as a single system
multisensor-multitarget observation: all observations regarded as a single observation
multitarget system: all targets regarded as a single system (possibly undetected & low-observable)
predicted multitarget system
sensor and platform controls, chosen to best detect, track, & identify all targets
fk|k(X|Z(k)) → fk+1|k(X|Z(k)) → fk+1|k+1(X|Z(k+1))
Multi-Sensor / Target Mgmt Based on MHC’s
multitarget Bayes filter: Ξk|k → Ξk+1|k → Ξk+1|k+1 (multitarget motion in the single-target state space; observations in observation space)
use a multi-hypothesis correlator (MHC) as the filter part of sensor mgmt:
MHC-like algorithm (step k|k) → prediction → MHC-like algorithm (step k+1|k) → data-update → MHC-like algorithm (step k+1|k+1)
multisensor motion: X*k|k → X*k+1|k → X*k+1|k+1
optimize geometric objective function: multisensor FoV & predicted tracks
pD,i(x, x*i) = FoV (detection probability) of the ith sensor
multisensor FoV: pD(x) = 1 − (1 − pD,1(x, x*1)) ⋯ (1 − pD,s(x, x*s))
optimize Csiszár information objective function
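A toy sketch of the geometric objective-function idea: the multisensor field-of-view detection probability pD(x) = 1 − Π(1 − pD,i(x, x*i)) is scored against predicted tracks to compare candidate sensor placements. The footprint model, candidate placements, and predicted tracks are illustrative assumptions; a Csiszár information objective would replace this simple average-detection score.

```python
import numpy as np

# Sketch of the geometric objective: multisensor probability of detection
#   p_D(x) = 1 - Π_i (1 - p_D,i(x, x*_i)),
# scored against predicted tracks to compare candidate sensor placements.
# The footprint model and candidate placements are illustrative assumptions.

def single_sensor_pd(x, sensor_pos, radius=3.0):
    """Simple field-of-view model: detect with prob 0.9 inside the footprint."""
    return 0.9 if abs(x - sensor_pos) <= radius else 0.0

def multisensor_pd(x, sensor_positions):
    miss = 1.0
    for s in sensor_positions:
        miss *= 1.0 - single_sensor_pd(x, s)
    return 1.0 - miss

predicted_tracks = [2.0, 9.5, 15.0]             # predicted target positions

def objective(sensor_positions):
    """Average detection probability over the predicted tracks."""
    return np.mean([multisensor_pd(x, sensor_positions) for x in predicted_tracks])

candidates = [(3.0, 12.0), (5.0, 9.0), (2.0, 14.0)]
best = max(candidates, key=objective)
print("best placement:", best, "objective:", objective(best))
```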
Summary / Conclusions
• “Finite-set statistics” (FISST) provides a systematic, Bayes-probabilistic foundation and unification for multisource, multitarget data fusion
• Is an “engineering friendly” formulation of point process theory, the recognized theoretical foundation for stochastic multi-object systems
• Has led to systematic foundations for previously murky problems: e.g. group-target fusion
• Is leading to new computational approaches, e.g. PHD filter; multistep look-ahead sensor management
• Currently being investigated by other researchers: Defense Sciences Technology Office (Australia), U. Melbourne (Australia), Swedish Defense Research Agency, Georgia Tech Research Institute, SAIC Corp.