8/14/2019 detection1.pdf
http://slidepdf.com/reader/full/detection1pdf 1/15
Detection Theory
Examples
1- Radar
2- Communications
3- Speech
4- Sonar
5- Control
6- . . .
Definition
Assume a set of data {x[0], x[1], . . . , x[N − 1]} is available. To arrive at a decision, we first form a function of the data, T(x[0], x[1], . . . , x[N − 1]), and then make a decision based on its value. Determining the function T and its mapping to a decision is the central problem addressed in detection theory.
Introduction
Example: BPSK phase detection
Introduction
Detection Problem
The simplest detection problem is to determine whether a signal is present or not. Note that such a signal is always embedded in noise. This type of detection problem is called a binary hypothesis testing problem. Denoting the received data at time n by x[n], the signal by s[n], and the noise by w[n], the binary hypothesis testing problem is defined as follows
H0 : x[n] = w[n]
H1 : x[n] = s[n] + w[n]
Note that if the number of hypotheses is more than two, then the problem becomes
a multiple hypothesis testing problem. One example is detection of different digits
in speech processing.
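The two hypotheses above can be simulated directly. A minimal sketch in Python (the function name and the illustrative parameters are my choices, not from the source):

```python
import random

def generate_data(N, signal, sigma, hypothesis, seed=None):
    """Generate x[0..N-1] under H0 (noise only) or H1 (signal plus noise)."""
    rng = random.Random(seed)
    noise = [rng.gauss(0.0, sigma) for _ in range(N)]
    if hypothesis == "H0":
        return noise                               # x[n] = w[n]
    return [s + w for s, w in zip(signal, noise)]  # x[n] = s[n] + w[n]

# same seed => same noise realization, so the H1 data is the H0 data plus the signal
x0 = generate_data(5, [1.0] * 5, 1.0, "H0", seed=0)
x1 = generate_data(5, [1.0] * 5, 1.0, "H1", seed=0)
```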
Detection Problem
Example continued
Consider a single observation x[0], with H0 : x[0] = w[0] and H1 : x[0] = 1 + w[0], where w[0] is Gaussian noise with mean 0 and variance σ². The probability density function of x[0] under each hypothesis is as follows
p(x[0]; H0) = (1/√(2πσ²)) exp(−x²[0]/(2σ²))
p(x[0]; H1) = (1/√(2πσ²)) exp(−(x[0] − 1)²/(2σ²))
In deciding between H0 and H1, we are essentially asking whether x[0] has been generated according to the pdf p(x[0]; H0) or the pdf p(x[0]; H1).
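As a numerical check, the two densities can be coded directly. A sketch (function names are illustrative, and σ² = 1 is an assumed default):

```python
import math

def pdf_h0(x, sigma2=1.0):
    """p(x[0]; H0): N(0, sigma^2) density (noise only)."""
    return math.exp(-x**2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def pdf_h1(x, sigma2=1.0):
    """p(x[0]; H1): N(1, sigma^2) density (unit signal plus noise)."""
    return math.exp(-(x - 1.0)**2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
```

By symmetry the two densities are equal at x[0] = 1/2, which is where a threshold test on the single sample naturally sits.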
Chi-Squared (Central)
A chi-squared pdf arises as the pdf of x = Σ_(i=1)^v x_i², where the x_i are independent standard normal random variables. The chi-squared pdf with v degrees of freedom is defined as
p(x) = (1 / (2^(v/2) Γ(v/2))) x^(v/2 − 1) exp(−x/2)   for x > 0
p(x) = 0                                              for x ≤ 0
and is denoted by χ²_v. Here v is assumed to be an integer with v ≥ 1. The function Γ(u) is the Gamma function, defined as
Γ(u) = ∫_0^∞ t^(u−1) exp(−t) dt
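The pdf translates directly into code using the standard library's Gamma function. The following sketch (names and the integration range are my choices) also checks numerically that the density integrates to one:

```python
import math

def chi2_pdf(x, v):
    """Central chi-squared pdf with v degrees of freedom."""
    if x <= 0:
        return 0.0
    return x**(v / 2 - 1) * math.exp(-x / 2) / (2**(v / 2) * math.gamma(v / 2))

# crude Riemann sum over (0, 50]; the tail beyond 50 is negligible for v = 4
dx = 0.001
area = sum(chi2_pdf(k * dx, 4) * dx for k in range(1, 50001))
```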
Chi-Squared (Noncentral)
If x = Σ_(i=1)^v x_i², where the x_i are independent Gaussian random variables with means µ_i and variance σ² = 1, then x has a noncentral chi-squared pdf with v degrees of freedom and noncentrality parameter λ = Σ_(i=1)^v µ_i². The pdf then becomes
p(x) = (1/2) (x/λ)^((v−2)/4) exp(−(x + λ)/2) I_(v/2−1)(√(λx))   for x > 0
p(x) = 0                                                        for x ≤ 0
where I_(v/2−1) is the modified Bessel function of the first kind of order v/2 − 1.
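The modified Bessel function in the pdf can be computed from its power series, making the density self-contained in plain Python. A sketch (series truncation length and all names are my choices), with Riemann-sum sanity checks that the mass is ~1 and the mean is ~v + λ:

```python
import math

def bessel_i(nu, z, terms=40):
    """Modified Bessel function I_nu(z) via its power series."""
    return sum((z / 2)**(2 * k + nu) / (math.factorial(k) * math.gamma(k + nu + 1))
               for k in range(terms))

def ncx2_pdf(x, v, lam):
    """Noncentral chi-squared pdf: v degrees of freedom, noncentrality lam."""
    if x <= 0:
        return 0.0
    return (0.5 * (x / lam)**((v - 2) / 4) * math.exp(-(x + lam) / 2)
            * bessel_i(v / 2 - 1, math.sqrt(lam * x)))

# sanity checks for v = 3, lam = 2 over (0, 60]
dx = 0.01
xs = [k * dx for k in range(1, 6001)]
ps = [ncx2_pdf(x, 3, 2.0) for x in xs]
area = sum(p * dx for p in ps)            # should be ~1
mean = sum(x * p * dx for x, p in zip(xs, ps))  # should be ~v + lam = 5
```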
Neyman-Pearson Theorem
Detection performance measures
The detection performance of a system is measured mainly by two quantities:
1. Probability of false alarm: P_FA = P(H1; H0)
2. Probability of detection: P_D = P(H1; H1)
Note that sometimes, instead of the probability of detection, the probability of miss, P_M = 1 − P_D, is used.
Neyman-Pearson Theorem
Problem statement
Assume a data set x = [x[0], x[1], . . . , x[N − 1]]^T is available. The detection problem is defined as follows
decide H0 if T(x) < λ
decide H1 if T(x) > λ
where T is the decision function and λ is the detection threshold. Our goal is to design T so as to maximize P_D subject to the constraint P_FA ≤ α.
Neyman-Pearson Theorem
To maximize P_D for a given P_FA = α, decide H1 if
L(x) = p(x; H1) / p(x; H0) > λ
where the threshold λ is found from
P_FA = ∫_{x : L(x) > λ} p(x; H0) dx = α
The function L(x) is called the likelihood ratio, and the entire test is called the likelihood ratio test (LRT).
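For the earlier single-sample Gaussian example (means 0 and 1, variance σ²), the LRT is a one-liner. A sketch with illustrative names and defaults:

```python
import math

def norm_pdf(x, mu, sigma2):
    """Gaussian density with mean mu and variance sigma2."""
    return math.exp(-(x - mu)**2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def lrt_decide(x, thresh=1.0, sigma2=1.0):
    """Decide H1 iff L(x) = p(x; H1)/p(x; H0) > thresh (means 0 and 1)."""
    L = norm_pdf(x, 1.0, sigma2) / norm_pdf(x, 0.0, sigma2)
    return "H1" if L > thresh else "H0"
```

With thresh = 1 and σ² = 1 the ratio simplifies to exp(x − 1/2), so the decision boundary sits at x = 1/2.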
Neyman-Pearson Theorem
Example: DC level in WGN
Consider the following signal detection problem
H0 : x[n] = w[n],        n = 0, 1, . . . , N − 1
H1 : x[n] = s[n] + w[n], n = 0, 1, . . . , N − 1
where the signal is s[n] = A for A > 0 and w[n] is WGN with variance σ². The NP detector decides H1 if
[ (1/(2πσ²)^(N/2)) exp(−(1/(2σ²)) Σ_(n=0)^(N−1) (x[n] − A)²) ] / [ (1/(2πσ²)^(N/2)) exp(−(1/(2σ²)) Σ_(n=0)^(N−1) x²[n]) ] > λ
Taking the logarithm of both sides and simplifying results in
(A/σ²) Σ_(n=0)^(N−1) x[n] > ln λ + NA²/(2σ²)
Since A > 0, we finally have
(1/N) Σ_(n=0)^(N−1) x[n] > (σ²/(NA)) ln λ + A/2 = λ′
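The derived test compares the sample mean to λ′. A minimal sketch (the function name and the example values are mine):

```python
import math

def dc_detector(x, A, sigma2, lam):
    """NP detector for a DC level A > 0 in WGN: decide H1 iff mean(x) > lambda'."""
    N = len(x)
    lam_prime = (sigma2 / (N * A)) * math.log(lam) + A / 2
    xbar = sum(x) / N
    return xbar > lam_prime, lam_prime

# with lam = 1 (so ln lam = 0) the threshold reduces to A/2
decide_h1, thr = dc_detector([1.0] * 10, A=1.0, sigma2=1.0, lam=1.0)
```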
Neyman-Pearson Theorem
Example continued
The NP detector compares the sample mean x̄ = (1/N) Σ_(n=0)^(N−1) x[n] to a threshold λ′. To determine the detection performance, we first note that the test statistic T(x) = x̄ is Gaussian under each hypothesis, with distribution
T(x) ∼ N(0, σ²/N)   under H0
T(x) ∼ N(A, σ²/N)   under H1
We then have
P_FA = Pr(T(x) > λ′; H0) = Q(λ′ / √(σ²/N))
and
P_D = Pr(T(x) > λ′; H1) = Q((λ′ − A) / √(σ²/N))
P_D and P_FA are related to each other according to the following equation
P_D = Q(Q⁻¹(P_FA) − √(NA²/σ²))
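The closed-form P_FA expression can be verified by simulation under H0. A sketch with illustrative parameters (N = 10, σ = 1, target P_FA = 0.1 are my choices):

```python
import math
import random
from statistics import NormalDist

N, sigma, target_pfa = 10, 1.0, 0.1
# invert P_FA = Q(lambda' / sqrt(sigma^2 / N)) to get the threshold
lam_prime = NormalDist().inv_cdf(1 - target_pfa) * math.sqrt(sigma**2 / N)

rng = random.Random(0)
trials = 20000
# under H0 the data is pure noise; count how often the sample mean exceeds lambda'
false_alarms = sum(
    sum(rng.gauss(0.0, sigma) for _ in range(N)) / N > lam_prime
    for _ in range(trials)
)
est_pfa = false_alarms / trials  # should be close to target_pfa
```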
Receiver Operating Characteristics
An alternative way of summarizing the detection performance of an NP detector is to plot P_D versus P_FA. This plot is called the receiver operating characteristic (ROC). For the preceding DC level detection example, the ROC is plotted for NA²/σ² = 1.
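Using P_D = Q(Q⁻¹(P_FA) − √(NA²/σ²)), the curve can be reproduced pointwise. A sketch with NA²/σ² = 1 as on the slide (the sampled P_FA values are my choices):

```python
import math
from statistics import NormalDist

def Q(x):
    """Right-tail probability of the standard normal."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def roc_point(pfa, d2=1.0):
    """P_D at a given P_FA for deflection NA^2/sigma^2 = d2."""
    return Q(NormalDist().inv_cdf(1 - pfa) - math.sqrt(d2))

curve = [(pfa, roc_point(pfa)) for pfa in (0.01, 0.1, 0.5, 0.9)]
```

Every point lies above the chance diagonal P_D = P_FA, as any useful detector must.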
Minimum Probability of Error
Assume the prior probabilities of H0 and H1 are known and denoted by P(H0) and P(H1), respectively. The probability of error, P_e, is then defined as
P_e = P(H1) P(H0|H1) + P(H0) P(H1|H0) = P(H1) P_M + P(H0) P_FA
Our goal is to design a detector that minimizes P_e. It can be shown that the following detector is optimal in this case: decide H1 if
p(x|H1) / p(x|H0) > P(H0) / P(H1) = λ
When P(H0) = P(H1), the threshold is λ = 1 and the detector is called the maximum likelihood (ML) detector.
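For the single-sample Gaussian example (means 0 and 1, unit variance), the minimum-P_e rule can be sketched as follows (function names and priors are illustrative):

```python
import math

def norm_pdf(x, mu, sigma2=1.0):
    """Gaussian density with mean mu and variance sigma2."""
    return math.exp(-(x - mu)**2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def min_pe_decide(x, p0=0.5, p1=0.5):
    """Decide H1 iff p(x|H1)/p(x|H0) > P(H0)/P(H1)."""
    return "H1" if norm_pdf(x, 1.0) / norm_pdf(x, 0.0) > p0 / p1 else "H0"
```

With equal priors the threshold is 1 (the ML detector) and the boundary sits at x = 1/2; making H0 more likely a priori (p0 > p1) pushes the boundary toward H1's mean, demanding stronger evidence before declaring a signal.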