
An Introduction to Signal Detection and Estimation - Second Edition

Chapter IV: Selected Solutions

H. V. Poor
Princeton University

April 26, 2005

Exercise 1:

a. The posterior is proportional to $e^{-\theta|y|}$ on $[1, e]$, which is decreasing in $\theta$, so the maximum is attained at the left endpoint:

$$\hat{\theta}_{MAP}(y) = \arg\max_{1 \le \theta \le e}\, e^{-\theta|y|} = 1.$$

b.

$$\hat{\theta}_{MMSE}(y) = \frac{\int_1^e \theta\, e^{-\theta|y|}\, d\theta}{\int_1^e e^{-\theta|y|}\, d\theta} = \frac{1}{|y|} + \frac{e^{-|y|} - e\, e^{-e|y|}}{e^{-|y|} - e^{-e|y|}}.$$
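As a quick numerical sanity check of part (b) (not part of the original solutions): for a posterior proportional to $e^{-\theta|y|}$ on $[1, e]$, the posterior mean can be computed by trapezoidal quadrature and compared with the closed-form expression. The sample values of $|y|$ below are arbitrary.

```python
import math

def mmse_closed_form(a):
    # Closed-form posterior mean, with a = |y|:
    # 1/a + (e^{-a} - e*e^{-e a}) / (e^{-a} - e^{-e a})
    num = math.exp(-a) - math.e * math.exp(-math.e * a)
    den = math.exp(-a) - math.exp(-math.e * a)
    return 1.0 / a + num / den

def mmse_numeric(a, n=200000):
    # Ratio of trapezoidal sums for the two integrals over [1, e];
    # the common step size cancels in the ratio.
    h = (math.e - 1.0) / n
    num = den = 0.0
    for i in range(n + 1):
        t = 1.0 + i * h
        w = 0.5 if i in (0, n) else 1.0
        num += w * t * math.exp(-a * t)
        den += w * math.exp(-a * t)
    return num / den

for a in (0.3, 1.0, 2.5):
    assert abs(mmse_closed_form(a) - mmse_numeric(a)) < 1e-6
```

The assertions check agreement to about $10^{-6}$; the posterior mean must also lie inside the prior interval $(1, e)$.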

Exercise 3:

$$w(\theta|y) = \frac{\theta^y e^{-\theta}\, e^{-\lambda\theta}}{\int_0^\infty \theta^y e^{-\theta}\, e^{-\lambda\theta}\, d\theta} = \frac{\theta^y e^{-(\lambda+1)\theta}\,(\lambda+1)^{y+1}}{y!}.$$

So:

$$\hat{\theta}_{MMSE}(y) = \frac{(\lambda+1)^{y+1}}{y!} \int_0^\infty \theta^{y+1} e^{-(\lambda+1)\theta}\, d\theta = \frac{y+1}{\lambda+1};$$

$$\hat{\theta}_{MAP}(y) = \arg\max_{\theta > 0}\, \left\{ y \log\theta - (\lambda+1)\theta \right\} = \frac{y}{\lambda+1};$$

and $\hat{\theta}_{ABS}(y)$ solves

$$\int_0^{\hat{\theta}_{ABS}(y)} w(\theta|y)\, d\theta = \frac{1}{2},$$

which reduces to

$$\sum_{k=0}^{y} \frac{\left[(\lambda+1)\hat{\theta}_{ABS}(y)\right]^k}{k!} = \frac{1}{2}\, e^{(\lambda+1)\hat{\theta}_{ABS}(y)}.$$

Note that the series on the left-hand side is the truncated power series expansion of $\exp\{(\lambda+1)\hat{\theta}_{ABS}(y)\}$, so that $\hat{\theta}_{ABS}(y)$ is the value at which this truncated series equals half of its untruncated value when the truncation point is $y$.
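A numerical sketch of the ABS condition (not part of the original solution): the posterior here is Gamma with shape $y+1$ and rate $\lambda+1$, whose cdf is the standard Erlang expression $1 - e^{-a}\sum_{k=0}^{y} a^k/k!$ with $a = (\lambda+1)t$. Bisecting for the conditional median and checking the truncated-series identity, with arbitrary illustrative values $\lambda = 2$, $y = 4$:

```python
import math

lam, y = 2.0, 4  # arbitrary prior parameter and observed count

def posterior_cdf(t):
    # Erlang cdf of Gamma(shape y+1, rate lam+1) at t:
    # 1 - exp(-a) * sum_{k=0}^{y} a^k / k!,  a = (lam+1) t
    a = (lam + 1.0) * t
    s = sum(a ** k / math.factorial(k) for k in range(y + 1))
    return 1.0 - math.exp(-a) * s

# Bisection for the conditional median, i.e. the ABS estimate.
lo, hi = 0.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if posterior_cdf(mid) < 0.5:
        lo = mid
    else:
        hi = mid
t_abs = 0.5 * (lo + hi)

# The median should satisfy the truncated-series equation
# sum_{k=0}^{y} [(lam+1) t]^k / k! = (1/2) e^{(lam+1) t}.
a = (lam + 1.0) * t_abs
lhs = sum(a ** k / math.factorial(k) for k in range(y + 1))
assert abs(lhs - 0.5 * math.exp(a)) < 1e-6

# Mode < median < mean ordering: MAP <= ABS <= MMSE for this posterior.
assert y / (lam + 1.0) < t_abs < (y + 1) / (lam + 1.0)
```

The last assertion also illustrates that the MAP, ABS, and MMSE estimates fall in the usual mode–median–mean order for this unimodal posterior.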


Exercise 7:

We have

$$p_\theta(y) = \begin{cases} e^{-y+\theta} & \text{if } y \ge \theta \\ 0 & \text{if } y < \theta \end{cases};$$

and

$$w(\theta) = \begin{cases} 1 & \text{if } \theta \in (0,1) \\ 0 & \text{if } \theta \notin (0,1) \end{cases}.$$

Thus,

$$w(\theta|y) = \frac{e^{-y+\theta}}{\int_0^{\min\{1,y\}} e^{-y+\theta}\, d\theta} = \frac{e^{\theta}}{e^{\min\{1,y\}} - 1},$$

for $0 \le \theta \le \min\{1,y\}$, and $w(\theta|y) = 0$ otherwise. This implies:

$$\hat{\theta}_{MMSE}(y) = \frac{\int_0^{\min\{1,y\}} \theta\, e^{\theta}\, d\theta}{e^{\min\{1,y\}} - 1} = \frac{[\min\{1,y\} - 1]\, e^{\min\{1,y\}} + 1}{e^{\min\{1,y\}} - 1};$$

$$\hat{\theta}_{MAP}(y) = \arg\max_{0 \le \theta \le \min\{1,y\}}\, e^{\theta} = \min\{1,y\};$$

and $\hat{\theta}_{ABS}(y)$ solves

$$\int_0^{\hat{\theta}_{ABS}(y)} e^{\theta}\, d\theta = \frac{1}{2}\left(e^{\min\{1,y\}} - 1\right),$$

which has the solution

$$\hat{\theta}_{ABS}(y) = \log\left(\frac{e^{\min\{1,y\}} + 1}{2}\right).$$
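A quick numerical check of Exercise 7 (not part of the original solution): for a posterior density $e^{\theta}/(e^{m}-1)$ on $[0, m]$ with $m = \min\{1, y\}$, the ABS formula should place exactly half the posterior mass below it, and the MMSE formula should match direct quadrature of $\theta e^{\theta}$. The test values of $y$ are arbitrary and cover both $y < 1$ and $y > 1$.

```python
import math

def mmse(yv):
    # [(m - 1) e^m + 1] / (e^m - 1),  m = min{1, y}
    m = min(1.0, yv)
    return ((m - 1.0) * math.exp(m) + 1.0) / (math.exp(m) - 1.0)

def abs_est(yv):
    # log((e^m + 1) / 2),  m = min{1, y}
    m = min(1.0, yv)
    return math.log((math.exp(m) + 1.0) / 2.0)

for yv in (0.4, 0.9, 3.0):
    m = min(1.0, yv)
    # Half of the posterior mass should lie below the ABS (median) estimate.
    half_mass = (math.exp(abs_est(yv)) - 1.0) / (math.exp(m) - 1.0)
    assert abs(half_mass - 0.5) < 1e-12
    # Posterior mean by trapezoidal quadrature of theta * e^theta on [0, m].
    n = 100000
    h = m / n
    num = h * sum((0.5 if i in (0, n) else 1.0) * (i * h) * math.exp(i * h)
                  for i in range(n + 1))
    assert abs(num / (math.exp(m) - 1.0) - mmse(yv)) < 1e-6
```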

Exercise 8:

a. We have

$$w(\theta|y) = \frac{e^{-y+\theta}\, e^{-\theta}}{e^{-y} \int_0^{y} d\theta} = \frac{1}{y}, \quad 0 \le \theta \le y,$$

and $w(\theta|y) = 0$ otherwise. That is, given $y$, $\theta$ is uniformly distributed on the interval $[0, y]$. From this we have immediately that

$$\hat{\theta}_{MMSE}(y) = \hat{\theta}_{ABS}(y) = \frac{y}{2}.$$

b. We have

$$MMSE = E\{Var(\theta|Y)\}.$$

Since $w(\theta|y)$ is uniform on $[0, y]$, $Var(\theta|Y) = Y^2/12$. We have

$$p(y) = e^{-y} \int_0^{y} d\theta = y\, e^{-y}, \quad y > 0,$$

from which

$$MMSE = \frac{E\{Y^2\}}{12} = \frac{1}{12} \int_0^\infty y^3 e^{-y}\, dy = \frac{3!}{12} = \frac{1}{2}.$$
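A Monte Carlo sketch of part (b) (not part of the original solution): under the model read off from part (a), $\theta \sim$ exponential(1) and $Y = \theta + W$ with $W \sim$ exponential(1) independent, so the posterior mean $Y/2$ should attain mean-squared error $1/2$. The seed and sample size are arbitrary.

```python
import random

random.seed(0)
N = 200000
se = 0.0
for _ in range(N):
    theta = random.expovariate(1.0)      # prior draw
    yv = theta + random.expovariate(1.0) # observation Y = theta + W
    se += (theta - yv / 2.0) ** 2        # squared error of the MMSE estimate
mse = se / N
assert abs(mse - 0.5) < 0.02
```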


c. In this case,

$$p_\theta(y) = \prod_{k=1}^{n} e^{-y_k + \theta}, \quad \text{if } 0 < \theta \le \min_{1 \le k \le n} y_k.$$


Exercise 15:

We have

$$p_\theta(y) = \frac{\theta^y e^{-\theta}}{y!}, \quad y = 0, 1, \ldots.$$

Thus

$$\frac{\partial}{\partial\theta} \log p_\theta(y) = \frac{\partial}{\partial\theta}\left(-\theta + y \log\theta\right) = -1 + \frac{y}{\theta},$$

and

$$\frac{\partial^2}{\partial\theta^2} \log p_\theta(y) = -\frac{y}{\theta^2}.$$


So the CRLB is

$$\frac{2}{\sum_{k=1}^{n} \frac{s_k^4}{(1+\theta s_k^2)^2}}.$$

c. With $s_k^2 \equiv 1$, the likelihood equation yields the solution

$$\hat{\theta}_{ML}(y) = \frac{1}{n} \sum_{k=1}^{n} y_k^2 - 1,$$

which is seen to yield a maximum of the likelihood function.

d. We have

$$E\left\{\hat{\theta}_{ML}(Y)\right\} = \frac{1}{n} \sum_{k=1}^{n} E\left\{Y_k^2\right\} - 1 = \theta.$$

Similarly, since the $Y_k$'s are independent,

$$Var\left(\hat{\theta}_{ML}(Y)\right) = \frac{1}{n^2} \sum_{k=1}^{n} Var\left(Y_k^2\right) = \frac{1}{n^2} \sum_{k=1}^{n} 2(1+\theta)^2 = \frac{2(1+\theta)^2}{n}.$$

Thus, the bias of the MLE is 0 and the variance of the MLE equals the CRLB. (Hence, the MLE is an MVUE in this case.)
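A Monte Carlo sketch of parts (c)–(d) (not part of the original solutions): with $Y_k \sim \mathcal{N}(0, 1+\theta)$ i.i.d. (the $s_k^2 \equiv 1$ case), the estimate $\frac{1}{n}\sum_k y_k^2 - 1$ should be unbiased with variance near $2(1+\theta)^2/n$. The values of $\theta$, $n$, and the trial count are arbitrary.

```python
import random

random.seed(1)
theta, n, trials = 1.5, 50, 20000       # arbitrary illustration values
sd = (1.0 + theta) ** 0.5               # Y_k ~ N(0, 1 + theta)
ests = []
for _ in range(trials):
    est = sum(random.gauss(0.0, sd) ** 2 for _ in range(n)) / n - 1.0
    ests.append(est)
mean = sum(ests) / trials
var = sum((e - mean) ** 2 for e in ests) / trials
crlb = 2.0 * (1.0 + theta) ** 2 / n     # CRLB for the s_k^2 = 1 case
assert abs(mean - theta) < 0.02         # empirically unbiased
assert abs(var - crlb) < 0.02           # empirical variance near the CRLB
```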

Exercise 22:

a. Note that

$$p_\theta(y) = \frac{1}{\theta^n} \exp\left\{ \sum_{k=1}^{n} \log F(y_k)/\theta \right\} \prod_{k=1}^{n} \frac{f(y_k)}{F(y_k)},$$

which implies that the statistic $\sum_{k=1}^{n} \log F(y_k)$ is a complete sufficient statistic for $\theta$ via the Completeness Theorem for Exponential Families. We have

$$E\left\{ \sum_{k=1}^{n} \log F(Y_k) \right\} = n E\{\log F(Y_1)\} = \frac{n}{\theta} \int_{-\infty}^{\infty} \log F(y_1)\, [F(y_1)]^{(1-\theta)/\theta} f(y_1)\, dy_1.$$

Noting that $d \log F(y_1) = \frac{f(y_1)}{F(y_1)}\, dy_1$, and that $[F(y_1)]^{1/\theta} = \exp\{\log F(y_1)/\theta\}$, we can make the substitution $x = -\log F(y_1)$ to yield

$$E\left\{ \sum_{k=1}^{n} \log F(Y_k) \right\} = -\frac{n}{\theta} \int_0^{\infty} x\, e^{-x/\theta}\, dx = -n\theta.$$

Thus, we have

$$E\left\{ \hat{\theta}_{MV}(Y) \right\} = \theta,$$

which implies that $\hat{\theta}_{MV}$ is an MVUE since it is an unbiased function of a complete sufficient statistic.
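A Monte Carlo sketch of Exercise 22 (not part of the original solution), assuming the estimator implied by the computation is $\hat{\theta}_{MV}(y) = -\frac{1}{n}\sum_{k=1}^{n} \log F(y_k)$: taking the base cdf $F$ to be that of a uniform variable on $(0,1)$, the density $(1/\theta)[F(y)]^{1/\theta - 1} f(y)$ becomes $(1/\theta) y^{1/\theta - 1}$ on $(0,1)$, which is sampled exactly as $Y = U^{\theta}$ for $U \sim$ Uniform$(0,1)$. The estimator should then be unbiased for $\theta$; the seed and sample size are arbitrary.

```python
import math
import random

random.seed(2)
theta, n = 0.7, 200000                 # arbitrary illustration values
total = 0.0
for _ in range(n):
    u = 1.0 - random.random()          # uniform on (0, 1]
    yv = u ** theta                    # Y = U^theta has cdf y^(1/theta) on (0, 1)
    total += math.log(yv)              # log F(Y) = log Y for the uniform base cdf
est = -total / n                       # -(1/n) sum log F(Y_k)
assert abs(est - theta) < 0.01
```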


and

$$\bar{y}_s = \frac{1}{n} \sum_{k=1}^{n} y_k \sin\left(\frac{k\pi}{2}\right) = \frac{1}{n} \sum_{k=1}^{n/2} (-1)^{k+1} y_{2k-1}.$$

b. Appending the prior to the above problem yields the MAP estimates:

$$\hat{\theta}_{MAP} = \hat{\theta}_{ML},$$

and

$$\hat{A}_{MAP} = \frac{\hat{A}_{ML} + \sqrt{\hat{A}_{ML}^2 + \frac{8\sigma^2(1+\gamma)}{n}}}{2(1+\gamma)},$$

where $\gamma \triangleq \frac{2\sigma^2}{n\sigma_A^2}$.

c. Note that, when $\sigma_A^2 \to \infty$ (and the prior diffuses), the MAP estimate of $A$ does not approach the MLE of $A$. However, as $n \to \infty$, the MAP estimate does approach the MLE.
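A quick numerical check (not part of the original solutions) of the identity behind $\bar{y}_s$, namely $\sum_{k=1}^{n} y_k \sin(k\pi/2) = \sum_{k=1}^{n/2} (-1)^{k+1} y_{2k-1}$ for even $n$: it holds because $\sin(k\pi/2)$ cycles through $1, 0, -1, 0, \ldots$, so only the odd-indexed samples survive, with alternating signs. The test vector is arbitrary.

```python
import math
import random

random.seed(3)
n = 8
yvec = [random.gauss(0.0, 1.0) for _ in range(n)]   # yvec[0] holds y_1
# Left side: (1/n) sum_k y_k sin(k*pi/2).
lhs = sum(yvec[k - 1] * math.sin(k * math.pi / 2.0)
          for k in range(1, n + 1)) / n
# Right side: (1/n) sum_k (-1)^(k+1) y_{2k-1}, odd samples with alternating sign.
rhs = sum((-1) ** (k + 1) * yvec[2 * k - 2]
          for k in range(1, n // 2 + 1)) / n
assert abs(lhs - rhs) < 1e-12
```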
