Lecture 16: Vector RVs, Functions of Several RVs
ECE 340 Probabilistic Methods in Engineering
M/W 3-4:15
Prof. Vince Calhoun

Outline:
• Section 5.8-5.9: Functions of two RVs
• Section 6.1-6.2: Vector RVs; Functions of several RVs
• Section 6.3-6.4: Expected value of vector RVs; Jointly Gaussian RVs
Functions of Several Random Variables

Types of transformations:
– A single function of n RVs:  Z = g(X)
– n functions of n RVs:  Z_k = g_k(X),  1 ≤ k ≤ n
Functions of Several Random Variables

Functions of n RVs:

    Z_k = g_k(X) = g_k(X_1, X_2, X_3, ..., X_n),   1 ≤ k ≤ n.

Assume invertibility, so that

    X_k = h_k(Z) = h_k(Z_1, Z_2, Z_3, ..., Z_n),   1 ≤ k ≤ n.
pdf of Linear Transformations

We consider first the linear transformation of two random variables:

    V = aX + bY
    W = cX + eY,

or, in matrix form,

    [V; W] = [a b; c e] [X; Y].

Denote the above matrix by A. We will assume A has an inverse, so each point (v, w) has a unique corresponding point (x, y) obtained from

    [x; y] = A⁻¹ [v; w].     (4.56)

In Fig. 4.15, the infinitesimal rectangle and the parallelogram are equivalent events, so their probabilities must be equal. Thus

    f_{X,Y}(x, y) dx dy ≅ f_{V,W}(v, w) dP,

where dP is the area of the parallelogram. The joint pdf of V and W is thus given by

    f_{V,W}(v, w) = f_{X,Y}(x, y) / (dP/dx dy),     (4.57)

where x and y are related to (v, w) by Eq. (4.56). It can be shown that dP = |ae − bc| dx dy, so the "stretch factor" is

    dP/dx dy = |ae − bc| = |A|,

where |A| is the determinant of A.

More generally, let the n-dimensional vector Z be

    Z = AX,

where A is an invertible n × n matrix. The joint pdf of Z is then

    f_{Z₁,...,Z_n}(z₁, ..., z_n) = f_{X₁,...,X_n}(x₁, ..., x_n) / |A|, evaluated at x = A⁻¹z;

that is,

    f_Z(z) = f_X(A⁻¹z) / |A|.     (4.58)
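The change-of-variables formula (4.58) is easy to sanity-check numerically. The sketch below (assuming NumPy, with a hypothetical 2 × 2 matrix A and X a pair of independent standard Gaussians) compares f_Z computed from Eq. (4.58) against the N(0, AAᵀ) density evaluated directly:

```python
import numpy as np

# If X is N(0, I) and Z = A X, then Z is N(0, A A^T); the density from
# Eq. (4.58) must therefore agree with the N(0, A A^T) density.

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # an arbitrary invertible example matrix

def f_X(x):
    # joint pdf of two independent N(0, 1) RVs
    return np.exp(-0.5 * x @ x) / (2 * np.pi)

def f_Z_via_458(z):
    # change-of-variables formula (4.58): f_X(A^{-1} z) / |det A|
    x = np.linalg.solve(A, z)
    return f_X(x) / abs(np.linalg.det(A))

def f_Z_direct(z):
    # bivariate N(0, C) density with C = A A^T
    C = A @ A.T
    q = z @ np.linalg.solve(C, z)
    return np.exp(-0.5 * q) / (2 * np.pi * np.sqrt(np.linalg.det(C)))

z = np.array([0.7, -0.3])
print(f_Z_via_458(z), f_Z_direct(z))  # the two values agree
```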
EXAMPLE 4.36 Linear Transformation of Jointly Gaussian Random Variables

Let X and Y be jointly Gaussian random variables, and let V and W be obtained from (X, Y) by

    [V; W] = (1/√2) [1 1; −1 1] [X; Y] = A [X; Y].

Find the joint pdf of V and W.

Here |A| = 1, and the inverse transformation is

    [X; Y] = (1/√2) [1 −1; 1 1] [V; W],

so

    X = (V − W)/√2  and  Y = (V + W)/√2.

Thus

    f_{V,W}(v, w) = f_{X,Y}((v − w)/√2, (v + w)/√2),

where

    f_{X,Y}(x, y) = (1/(2π(1 − ρ²)^{1/2})) exp{−(x² − 2ρxy + y²)/(2(1 − ρ²))}.

By substituting for x and y, the argument of the exponent becomes

    −[(v − w)²/2 − ρ(v² − w²) + (v + w)²/2] / (2(1 − ρ²)) = −[v²/(2(1 + ρ)) + w²/(2(1 − ρ))].

Thus

    f_{V,W}(v, w) = (1/(2π(1 − ρ²)^{1/2})) exp{−[v²/(2(1 + ρ)) + w²/(2(1 − ρ))]}.
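The factored form of f_{V,W} says the rotation decorrelates X and Y. A quick Monte Carlo sketch (NumPy assumed; the sampling construction for correlated Gaussians is our own, not from the slides) checks that V and W come out uncorrelated with variances 1 + ρ and 1 − ρ:

```python
import numpy as np

# Sample unit-variance Gaussians X, Y with correlation rho, apply the
# rotation V = (X + Y)/sqrt(2), W = (-X + Y)/sqrt(2), and compare the
# sample moments to the factored pdf of Example 4.36.

rng = np.random.default_rng(0)
rho = 0.6
n = 200_000

# build correlated unit-variance Gaussians from independent ones
g1 = rng.standard_normal(n)
g2 = rng.standard_normal(n)
X = g1
Y = rho * g1 + np.sqrt(1 - rho**2) * g2

V = (X + Y) / np.sqrt(2)
W = (-X + Y) / np.sqrt(2)

var_V, var_W = V.var(), W.var()
cov_VW = np.mean(V * W) - V.mean() * W.mean()
print(var_V, var_W, cov_VW)  # ~ 1 + rho, ~ 1 - rho, ~ 0
```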
pdf of General Transformations

Let the RVs V and W be defined by two nonlinear functions of X and Y:

    V = g₁(X, Y)  and  W = g₂(X, Y).     (4.59)

Assume that the functions v(x, y) and w(x, y) are invertible; then

    x = h₁(v, w)  and  y = h₂(v, w).

In Fig. 4.17(b), make the approximation

    g_k(x + dx, y) ≅ g_k(x, y) + (∂g_k(x, y)/∂x) dx,   k = 1, 2,

and similarly for the y variable. The probabilities of the infinitesimal rectangle and the parallelogram are approximately equal; therefore

    f_{X,Y}(x, y) dx dy = f_{V,W}(v, w) dP,

where dP is the area of the parallelogram, and

    f_{V,W}(v, w) = f_{X,Y}(h₁(v, w), h₂(v, w)) / (dP/dx dy).     (4.60)

The "stretch factor" at the point (v, w) is given by the determinant of a matrix of partial derivatives:

    J(x, y) = det [∂v/∂x  ∂v/∂y; ∂w/∂x  ∂w/∂y].

The determinant J(x, y) is called the Jacobian of the transformation. The Jacobian of the inverse transformation is given by

    J(v, w) = det [∂x/∂v  ∂x/∂w; ∂y/∂v  ∂y/∂w].

It can be shown that

    J(v, w) = 1 / J(x, y).

We therefore conclude that the joint pdf of V and W can be found using either of the following expressions:

    f_{V,W}(v, w) = f_{X,Y}(h₁(v, w), h₂(v, w)) / |J(x, y)|     (4.61a)
                  = f_{X,Y}(h₁(v, w), h₂(v, w)) |J(v, w)|.     (4.61b)
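The reciprocal relation J(v, w) = 1/J(x, y) can be checked numerically. The sketch below uses a hypothetical invertible map v = eˣ, w = x + y (an illustration of ours, not one of the lecture's examples) and finite-difference Jacobians:

```python
import numpy as np

# Central-difference estimate of the 2x2 Jacobian determinant of a map f
# at a point p; used to verify J(v, w) * J(x, y) = 1.

def jacobian_det(f, p, eps=1e-6):
    p = np.asarray(p, dtype=float)
    J = np.empty((2, 2))
    for j in range(2):
        dp = np.zeros(2)
        dp[j] = eps
        J[:, j] = (np.array(f(*(p + dp))) - np.array(f(*(p - dp)))) / (2 * eps)
    return np.linalg.det(J)

fwd = lambda x, y: (np.exp(x), x + y)          # v = g1(x,y), w = g2(x,y)
inv = lambda v, w: (np.log(v), w - np.log(v))  # x = h1(v,w), y = h2(v,w)

x0, y0 = 0.4, -1.2
v0, w0 = fwd(x0, y0)

Jxy = jacobian_det(fwd, (x0, y0))  # J(x, y)
Jvw = jacobian_det(inv, (v0, w0))  # J(v, w)
print(Jxy * Jvw)  # ~ 1
```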
Functions of Several Random Variables (Gaussian Example)

X and Y are two i.i.d. zero-mean Gaussian RVs:

    f_{XY}(x, y) = (1/(2πσ²)) e^{−(x² + y²)/(2σ²)}.

Define

    V = √(X² + Y²)  and  W = arctan(Y/X).

Inversions:

    X = V cos(W)  and  Y = V sin(W).

Jacobian:

    J(v, w) = det [∂x/∂v  ∂x/∂w; ∂y/∂v  ∂y/∂w] = det [cos(w)  −v sin(w); sin(w)  v cos(w)].

Then:

    |J(v, w)| = |v| = v.

Thus

    f_{VW}(v, w) = v f_{XY}(v cos(w), v sin(w))
                 = (v/(2πσ²)) e^{−[(v cos w)² + (v sin w)²]/(2σ²)}
                 = (v/(2πσ²)) e^{−v²/(2σ²)} U(v),   0 < w < 2π.

Marginals:

    f_V(v) = ∫₀^{2π} f_{VW}(v, w) dw = (v/σ²) e^{−v²/(2σ²)} U(v)   — Rayleigh RV

    f_W(w) = ∫₀^∞ f_{VW}(v, w) dv = 1/(2π),   0 < w < 2π   — Uniform RV
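These marginals are easy to verify by simulation. The sketch below (NumPy assumed) draws i.i.d. Gaussians and checks two implied moments: a Rayleigh V has mean σ√(π/2), and a uniform W on (0, 2π) has mean π:

```python
import numpy as np

# Draw i.i.d. zero-mean Gaussians X, Y; form the magnitude V and the angle W
# (arctan2 mapped into [0, 2*pi)), and compare sample means to the Rayleigh
# and uniform marginals derived above.

rng = np.random.default_rng(1)
sigma = 2.0
n = 200_000

X = sigma * rng.standard_normal(n)
Y = sigma * rng.standard_normal(n)

V = np.sqrt(X**2 + Y**2)
W = np.mod(np.arctan2(Y, X), 2 * np.pi)

print(V.mean(), sigma * np.sqrt(np.pi / 2))  # Rayleigh mean
print(W.mean(), np.pi)                       # mean of uniform on (0, 2*pi)
```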
Consider the problem of finding the joint pdf for n functions of n random variables X = (X₁, ..., X_n):

    Z₁ = g₁(X),  Z₂ = g₂(X),  ...,  Z_n = g_n(X).

We assume as before that the set of equations

    z₁ = g₁(x),  z₂ = g₂(x),  ...,  z_n = g_n(x)     (4.62)

has a unique solution given by

    x₁ = h₁(z),  x₂ = h₂(z),  ...,  x_n = h_n(z).

The joint pdf of Z is then given by

    f_{Z₁,...,Z_n}(z₁, ..., z_n) = f_{X₁,...,X_n}(h₁(z), ..., h_n(z)) / |J(x₁, ..., x_n)|     (4.63a)
                                 = f_{X₁,...,X_n}(h₁(z), ..., h_n(z)) |J(z₁, ..., z_n)|,     (4.63b)

where J(x₁, ..., x_n) and J(z₁, ..., z_n) are the determinants of the transformation and of the inverse transformation, respectively:

    J(x₁, ..., x_n) = det [∂g₁/∂x₁ ... ∂g₁/∂x_n; ... ; ∂g_n/∂x₁ ... ∂g_n/∂x_n]

and

    J(z₁, ..., z_n) = det [∂h₁/∂z₁ ... ∂h₁/∂z_n; ... ; ∂h_n/∂z₁ ... ∂h_n/∂z_n].
4.5 MULTIPLE RANDOM VARIABLES

Joint Distributions

The joint cumulative distribution function of X₁, X₂, ..., X_n is defined as the probability of an n-dimensional semi-infinite rectangle associated with the point (x₁, ..., x_n):

    F_{X₁,...,X_n}(x₁, ..., x_n) = P[X₁ ≤ x₁, X₂ ≤ x₂, ..., X_n ≤ x_n].     (4.38)

The joint cdf is defined for discrete, continuous, and random variables of mixed type.
EXAMPLE 4.27

Let the event A be defined as follows:

    A = {max(X₁, X₂, X₃) ≤ 5}.

Find the probability of A.

The maximum of three numbers is less than or equal to 5 if and only if each of the three numbers is less than or equal to 5; therefore

    P[A] = P[{X₁ ≤ 5} ∩ {X₂ ≤ 5} ∩ {X₃ ≤ 5}] = F_{X₁,X₂,X₃}(5, 5, 5).
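As a concrete check, assume for illustration (the slides don't fix a distribution) that X₁, X₂, X₃ are independent Uniform(0, 10), so F(5, 5, 5) = (1/2)³ = 0.125; a Monte Carlo estimate of P[max ≤ 5] should match:

```python
import numpy as np

# Estimate P[max(X1, X2, X3) <= 5] for three independent Uniform(0, 10)
# RVs and compare to F(5, 5, 5) = (1/2)^3 = 0.125.

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=(200_000, 3))

p_max = np.mean(x.max(axis=1) <= 5)
print(p_max)  # ~ 0.125
```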
The joint probability mass function of n discrete random variables is defined by

    p_{X₁,...,X_n}(x₁, ..., x_n) = P[X₁ = x₁, X₂ = x₂, ..., X_n = x_n].     (4.39)

The probability of any n-dimensional event A is found by summing the pmf over the points in the event:

    P[(X₁, ..., X_n) in A] = Σ_{x in A} p_{X₁,...,X_n}(x₁, ..., x_n).     (4.40)

The one-dimensional pmf of X_j is found by adding the joint pmf over all variables other than x_j:

    p_{X_j}(x_j) = P[X_j = x_j] = Σ_{x₁} ··· Σ_{x_{j−1}} Σ_{x_{j+1}} ··· Σ_{x_n} p_{X₁,...,X_n}(x₁, ..., x_n).     (4.41)

The marginal pmf for X₁, ..., X_{n−1} is given by

    p_{X₁,...,X_{n−1}}(x₁, ..., x_{n−1}) = Σ_{x_n} p_{X₁,...,X_n}(x₁, ..., x_n).     (4.42)
A family of conditional pmf's is obtained from the joint pmf by conditioning on different subsets of the random variables:

    p_{X_n}(x_n | x₁, ..., x_{n−1}) = p_{X₁,...,X_n}(x₁, ..., x_n) / p_{X₁,...,X_{n−1}}(x₁, ..., x_{n−1})     (4.43a)

if p_{X₁,...,X_{n−1}}(x₁, ..., x_{n−1}) > 0. Repeated applications of Eq. (4.43a) yield

    p_{X₁,...,X_n}(x₁, ..., x_n) = p_{X_n}(x_n | x₁, ..., x_{n−1}) × p_{X_{n−1}}(x_{n−1} | x₁, ..., x_{n−2}) × ··· × p_{X₂}(x₂ | x₁) p_{X₁}(x₁).     (4.43b)
EXAMPLE 4.28

A computer system receives messages over three communications lines. Let X_j be the number of messages received on line j in one hour. Suppose that the joint pmf of X₁, X₂, and X₃ is given by

    p_{X₁,X₂,X₃}(x₁, x₂, x₃) = (1 − a₁)(1 − a₂)(1 − a₃) a₁^{x₁} a₂^{x₂} a₃^{x₃},   x₁ ≥ 0, x₂ ≥ 0, x₃ ≥ 0.

Find p(x₁, x₂) and p(x₁), given that 0 < a_i < 1.

Summing over x₃:

    p_{X₁,X₂}(x₁, x₂) = Σ_{x₃=0}^∞ (1 − a₁)(1 − a₂)(1 − a₃) a₁^{x₁} a₂^{x₂} a₃^{x₃}
                      = (1 − a₁)(1 − a₂) a₁^{x₁} a₂^{x₂}.

Similarly, summing over x₂:

    p_{X₁}(x₁) = Σ_{x₂=0}^∞ (1 − a₁)(1 − a₂) a₁^{x₁} a₂^{x₂} = (1 − a₁) a₁^{x₁}.
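The marginals can be confirmed by summing the joint pmf numerically, truncating the geometric tails at a point where the remainder is negligible (the a_i values below are arbitrary):

```python
import numpy as np

# Sum the joint pmf of Example 4.28 over x3 (and then x2), and compare
# against the closed-form marginals derived above.

a1, a2, a3 = 0.3, 0.5, 0.8
N = 200  # truncation point; a_i**N is negligible for these values

def joint(x1, x2, x3):
    return ((1 - a1) * (1 - a2) * (1 - a3)) * a1**x1 * a2**x2 * a3**x3

x1, x2 = 2, 3
k = np.arange(N)

p12 = joint(x1, x2, k).sum()                 # sum over x3
p1 = sum(joint(x1, j, k).sum() for j in k)   # sum over x2 and x3

print(p12, (1 - a1) * (1 - a2) * a1**x1 * a2**x2)
print(p1, (1 - a1) * a1**x1)
```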
If the RVs X₁, X₂, ..., X_n are jointly continuous random variables, then the probability of any n-dimensional event A is

    P[(X₁, ..., X_n) in A] = ∫···∫_{x in A} f_{X₁,...,X_n}(x'₁, ..., x'_n) dx'₁ ··· dx'_n,     (4.44)

where f_{X₁,...,X_n}(x'₁, ..., x'_n) is the joint probability density function. The joint cdf of X is obtained from the joint pdf by integration:

    F_{X₁,X₂,...,X_n}(x₁, x₂, ..., x_n) = ∫_{−∞}^{x₁} ··· ∫_{−∞}^{x_n} f_{X₁,...,X_n}(x'₁, ..., x'_n) dx'₁ ··· dx'_n.     (4.45)

The joint pdf (if the derivative exists) is given by

    f_{X₁,X₂,...,X_n}(x₁, ..., x_n) = ∂ⁿ F_{X₁,...,X_n}(x₁, ..., x_n) / (∂x₁ ··· ∂x_n).     (4.46)

The marginal pdf for a subset of the random variables is obtained by integrating the other variables out. The marginal of X₁ is

    f_{X₁}(x₁) = ∫_{−∞}^∞ ··· ∫_{−∞}^∞ f_{X₁,...,X_n}(x₁, x'₂, ..., x'_n) dx'₂ ··· dx'_n.     (4.47)
The marginal pdf for X₁, ..., X_{n−1} is given by

    f_{X₁,...,X_{n−1}}(x₁, ..., x_{n−1}) = ∫_{−∞}^∞ f_{X₁,...,X_n}(x₁, ..., x_{n−1}, x'_n) dx'_n.     (4.48)

The pdf of X_n given the values of X₁, ..., X_{n−1} is given by

    f_{X_n}(x_n | x₁, ..., x_{n−1}) = f_{X₁,...,X_n}(x₁, ..., x_n) / f_{X₁,...,X_{n−1}}(x₁, ..., x_{n−1})     (4.49a)

if f_{X₁,...,X_{n−1}}(x₁, ..., x_{n−1}) > 0. Repeated applications of Eq. (4.49a) yield

    f_{X₁,...,X_n}(x₁, ..., x_n) = f_{X_n}(x_n | x₁, ..., x_{n−1}) × f_{X_{n−1}}(x_{n−1} | x₁, ..., x_{n−2}) × ··· × f_{X₂}(x₂ | x₁) f_{X₁}(x₁).     (4.49b)
EXAMPLE 4.29

The RVs X₁, X₂, and X₃ have the joint Gaussian pdf

    f_{X₁,X₂,X₃}(x₁, x₂, x₃) = (√2 / (2π)^{3/2}) exp{−(x₁² − √2 x₁x₂ + x₂² + x₃²/2)}.

Find the marginal pdf of X₁ and X₃.

    f_{X₁,X₃}(x₁, x₃) = (√2 e^{−x₃²/2} / (2π)^{3/2}) ∫_{−∞}^∞ e^{−(x₁² − √2 x₁x₂ + x₂²)} dx₂.

The above integral was carried out in Example 4.13 with ρ = 1/√2, giving

    f_{X₁,X₃}(x₁, x₃) = (e^{−x₁²/2}/√(2π)) (e^{−x₃²/2}/√(2π)).
Independence

X₁, ..., X_n are independent if

    P[X₁ in A₁, ..., X_n in A_n] = P[X₁ in A₁] ··· P[X_n in A_n]

for any one-dimensional events A₁, ..., A_n. It can be shown that X₁, ..., X_n are independent if and only if

    F_{X₁,...,X_n}(x₁, ..., x_n) = F_{X₁}(x₁) ··· F_{X_n}(x_n)     (4.50)

for all x₁, ..., x_n. If the random variables are discrete,

    p_{X₁,...,X_n}(x₁, ..., x_n) = p_{X₁}(x₁) ··· p_{X_n}(x_n)   for all x₁, ..., x_n.

If the random variables are jointly continuous,

    f_{X₁,...,X_n}(x₁, ..., x_n) = f_{X₁}(x₁) ··· f_{X_n}(x_n)   for all x₁, ..., x_n.
4.6 FUNCTIONS OF SEVERAL RANDOM VARIABLES

One Function of Several Random Variables

Let the random variable Z be defined as a function of several random variables:

    Z = g(X₁, X₂, ..., X_n).     (4.51)

The cdf of Z is found by first finding the equivalent event of {Z ≤ z}, that is, the set R_z = {x = (x₁, ..., x_n) such that g(x) ≤ z}; then

    F_Z(z) = P[X in R_z] = ∫···∫_{x in R_z} f_{X₁,...,X_n}(x'₁, ..., x'_n) dx'₁ ··· dx'_n.     (4.52)
EXAMPLE 4.31 Sum of Two Random Variables

Let Z = X + Y. Find F_Z(z) and f_Z(z) in terms of the joint pdf of X and Y.

The cdf of Z is

    F_Z(z) = ∫_{−∞}^∞ ∫_{−∞}^{z−x'} f_{X,Y}(x', y') dy' dx'.

The pdf of Z is

    f_Z(z) = (d/dz) F_Z(z) = ∫_{−∞}^∞ f_{X,Y}(x', z − x') dx'.     (4.53)

Thus the pdf for the sum of two random variables is given by a superposition integral. If X and Y are independent random variables, then by Eq. (4.21) the pdf is given by the convolution integral of the marginal pdf's of X and Y:

    f_Z(z) = ∫_{−∞}^∞ f_X(x') f_Y(z − x') dx'.     (4.54)
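Equation (4.54) can be evaluated on a grid. For two independent Exponential(1) RVs the convolution should reproduce the known Erlang density f_Z(z) = z e^{−z}, z ≥ 0 (a sketch, assuming NumPy):

```python
import numpy as np

# Evaluate the convolution integral (4.54) by a Riemann sum for two
# independent Exponential(1) pdfs and compare to z * exp(-z).

def f_exp(t):
    # Exponential(1) pdf; clip avoids overflow in exp for very negative t
    return np.where(t >= 0, np.exp(-np.clip(t, 0, None)), 0.0)

dx = 0.001
x = np.arange(0, 50, dx)

def f_Z(z):
    # f_Z(z) = integral of f_X(x') f_Y(z - x') dx'
    return np.sum(f_exp(x) * f_exp(z - x)) * dx

for z in (0.5, 1.0, 2.0):
    print(f_Z(z), z * np.exp(-z))
```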
EXAMPLE 4.32 Sum of Nonindependent Gaussian Random Variables

Find the pdf of the sum Z = X + Y of two zero-mean, unit-variance Gaussian random variables with correlation coefficient ρ = −1/2.

From Eq. (4.53),

    f_Z(z) = ∫_{−∞}^∞ f_{X,Y}(x', z − x') dx'
           = ∫_{−∞}^∞ (1/(2π√(1 − ρ²))) exp{−[x'² − 2ρx'(z − x') + (z − x')²]/(2(1 − ρ²))} dx'
           = ∫_{−∞}^∞ (1/(2π√(3/4))) exp{−(x'² − x'z + z²)/(3/2)} dx'.

After completing the square of the argument in the exponent we obtain

    f_Z(z) = (1/√(2π)) e^{−z²/2}.
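A simulation check of this result (the correlated-sampling construction below is ours, not from the slides): Z = X + Y should have variance 1 + 1 + 2ρ = 1, consistent with the standard normal pdf above:

```python
import numpy as np

# Sample zero-mean, unit-variance Gaussians with correlation rho = -1/2
# and check that Z = X + Y looks standard normal in its first two moments.

rng = np.random.default_rng(3)
rho = -0.5
n = 200_000

g1 = rng.standard_normal(n)
g2 = rng.standard_normal(n)
X = g1
Y = rho * g1 + np.sqrt(1 - rho**2) * g2

Z = X + Y
print(Z.mean(), Z.var())  # ~ 0, ~ 1
```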
Let Z = g(X, Y), and suppose we are given that Y = y; then Z = g(X, y) is a function of the single random variable X, with conditional pdf f_Z(z | Y = y). The pdf of Z is found from

    f_Z(z) = ∫_{−∞}^∞ f_Z(z | y') f_Y(y') dy'.
EXAMPLE 4.34

Let Z = X / Y. Find the pdf of Z if X and Y are independent and both exponentially distributed with mean one.

Assume Y = y; then Z = X/y is a scaled version of X, so

    f_Z(z | y) = |y| f_X(yz | y).

The pdf of Z is therefore

    f_Z(z) = ∫_{−∞}^∞ |y'| f_X(y'z | y') f_Y(y') dy' = ∫_{−∞}^∞ |y'| f_{X,Y}(y'z, y') dy'.

For the independent exponential RVs this gives

    f_Z(z) = ∫₀^∞ y' e^{−y'z} e^{−y'} dy' = 1/(1 + z)²,   z > 0.

Transformations of Random Vectors

Let X₁, ..., X_n be random variables associated with some experiment, and let the random variables Z₁, ..., Z_n be defined by n functions of X = (X₁, ..., X_n):

    Z₁ = g₁(X),  Z₂ = g₂(X),  ...,  Z_n = g_n(X).

The joint cdf of Z₁, ..., Z_n at the point z = (z₁, ..., z_n) is equal to the probability of the region of x where g_k(x) ≤ z_k for k = 1, ..., n:

    F_{Z₁,...,Z_n}(z₁, ..., z_n) = P[g₁(X) ≤ z₁, ..., g_n(X) ≤ z_n].     (4.55a)

If X₁, ..., X_n have a joint pdf, then

    F_{Z₁,...,Z_n}(z₁, ..., z_n) = ∫···∫_{x' : g_k(x') ≤ z_k} f_{X₁,...,X_n}(x'₁, ..., x'_n) dx'₁ ··· dx'_n.     (4.55b)

EXAMPLE 4.35

Let the random variables W and Z be defined by

    W = min(X, Y)  and  Z = max(X, Y).

Find the joint cdf of W and Z in terms of the joint cdf of X and Y.

    F_{W,Z}(w, z) = P[{min(X, Y) ≤ w} ∩ {max(X, Y) ≤ z}].

If z > w, this probability is the probability of the semi-infinite rectangle defined by the point (z, z) minus the square region denoted by A, i.e. F_{W,Z}(w, z) = F_{X,Y}(z, z) − P[A].
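For independent X and Y the square region gives P[A] = (F_X(z) − F_X(w))(F_Y(z) − F_Y(w)), so F_{W,Z}(w, z) = F_X(z)F_Y(z) − (F_X(z) − F_X(w))(F_Y(z) − F_Y(w)) for z > w. A quick Monte Carlo sketch with Uniform(0, 1) variables (an assumed example, not from the slides):

```python
import numpy as np

# Check the min/max joint cdf of Example 4.35 for independent Uniform(0,1)
# X and Y, where F_X and F_Y are the identity on [0, 1].

rng = np.random.default_rng(4)
n = 200_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)

W = np.minimum(X, Y)
Z = np.maximum(X, Y)

w, z = 0.3, 0.7
emp = np.mean((W <= w) & (Z <= z))
theory = z * z - (z - w) * (z - w)  # F_{X,Y}(z, z) - P[A]
print(emp, theory)  # ~ 0.33 each
```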