SHANNON INFORMATION THEORY
TUTORIAL 7
ENG. SALLY NAFIE
DISCRETE MEMORYLESS CHANNEL:

$$ X = \{x_0, x_1, \ldots, x_{J-1}\} \;\longrightarrow\; \text{Channel} \;\longrightarrow\; Y = \{y_0, y_1, \ldots, y_{K-1}\} $$

DISCRETE: finite set of input ($X = \{x_0, x_1, \ldots, x_{J-1}\}$) and output ($Y = \{y_0, y_1, \ldots, y_{K-1}\}$) alphabets.

MEMORYLESS: the current output symbol $y_k$ depends only on the current input symbol $x_j$.
NOISELESS CHANNEL:

Transmitted: x0 = 0, x1 = 1.  Received: y0 = 0, y1 = 1.

CONDITIONAL PROBABILITY:
P(y0/x0) = P(0/0) = 1
P(y1/x1) = P(1/1) = 1
P(y0/x1) = P(0/1) = 0
P(y1/x0) = P(1/0) = 0
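Restated compactly in the transition-matrix notation introduced later in this tutorial (a worked restatement using only the values on this slide): the noiseless channel's matrix is the identity,

$$ P = \begin{bmatrix} P(y_0/x_0) & P(y_1/x_0) \\ P(y_0/x_1) & P(y_1/x_1) \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, $$

so every transmitted symbol is received unchanged.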
CONDITIONAL PROBABILITY:

The conditional probability P(yk/xj) is the probability of receiving a certain symbol yk given that a certain symbol xj was transmitted.

Ex: In a noiseless channel:
The probability of receiving a 0 given that a 0 was transmitted = P(0/0) = 1
The probability of receiving a 0 given that a 1 was transmitted = P(0/1) = 0
The probability of receiving a 1 given that a 0 was transmitted = P(1/0) = 0
The probability of receiving a 1 given that a 1 was transmitted = P(1/1) = 1
NOISY CHANNEL:

Transmitted: x0 = 0, x1 = 1.  Received: y0 = 0, y1 = 1.

CONDITIONAL PROBABILITY:
P(y0/x0) = P(0/0) = 1 − Pe
P(y1/x1) = P(1/1) = 1 − Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe
CHANNEL (TRANSITION) MATRIX:

For the noisy binary channel above (x0 = 0, x1 = 1; y0 = 0, y1 = 1), arrange the conditional probabilities with a fixed input per row and a fixed output per column:

$$ P = \begin{bmatrix} P(y_0/x_0) & P(y_1/x_0) \\ P(y_0/x_1) & P(y_1/x_1) \end{bmatrix} = \begin{bmatrix} 1-P_e & P_e \\ P_e & 1-P_e \end{bmatrix} $$
CHANNEL (TRANSITION) MATRIX:

In general, for $X = \{x_0, \ldots, x_{J-1}\}$ into the channel and $Y = \{y_0, \ldots, y_{K-1}\}$ out of it, the channel matrix is $J \times K$, again with a fixed input per row and a fixed output per column:

$$ P = \begin{bmatrix} P(y_0/x_0) & \cdots & P(y_{K-1}/x_0) \\ \vdots & \ddots & \vdots \\ P(y_0/x_{J-1}) & \cdots & P(y_{K-1}/x_{J-1}) \end{bmatrix} $$

Since some output symbol is always received, each row sums to 1: $\sum_{k=0}^{K-1} P(y_k/x_j) = 1$ for every $j$.
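A minimal computational sketch of this representation (numpy assumed available; the crossover value is hypothetical): build the binary channel matrix, check the row sums, and pass symbols through the channel one at a time.

```python
import numpy as np

Pe = 0.1  # hypothetical crossover probability

# Channel (transition) matrix: row = fixed input x_j, column = fixed output y_k
P = np.array([[1 - Pe, Pe],
              [Pe, 1 - Pe]])

# Every row must sum to 1: some output symbol is always received
assert np.allclose(P.sum(axis=1), 1.0)

rng = np.random.default_rng(0)

def send(xj: int) -> int:
    """Pass one input symbol through the channel: sample from row xj."""
    return int(rng.choice(len(P[xj]), p=P[xj]))

# Memoryless: each symbol is disturbed independently of all the others
received = [send(x) for x in [0, 1, 1, 0, 1]]
print(received)
```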
PROBABILITY TERMS:

A. PRIOR PROBABILITY:
The probability of each symbol emitted from the source at the transmitter side.
P(xj) = P(X = xj)

B. CONDITIONAL PROBABILITY:
The probability of receiving a certain symbol yk given that a certain symbol xj was transmitted.
P(yk/xj) = P(Y = yk / X = xj)
PROBABILITY TERMS:

C. JOINT PROBABILITY:
The probability of sending a certain symbol xj and receiving a certain symbol yk.
P(xj, yk) = P(X = xj, Y = yk)
         = P(Y = yk / X = xj) · P(X = xj)
         = P(yk/xj) · P(xj)   (conditional probability × prior probability)
PROBABILITY TERMS:

D. MARGINAL PROBABILITY:
The probability of receiving a certain symbol yk, regardless of which symbol was transmitted:

$$ P(y_k) = P(Y = y_k) = \sum_{j=0}^{J-1} P(x_j, y_k) \quad\text{or}\quad P(y_k) = \sum_{j=0}^{J-1} P(y_k/x_j)\,P(x_j) $$
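As an illustrative check (not on the original slide), applying this to the binary noisy channel with prior $P(x_0) = P_0$:

$$ P(y_0) = P(y_0/x_0)P(x_0) + P(y_0/x_1)P(x_1) = (1-P_e)\,P_0 + P_e\,(1-P_0). $$

This same marginal reappears in the capacity derivation later in the tutorial.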
BAYES' RULE:

P(xj, yk) = P(yk, xj)
P(yk/xj) P(xj) = P(xj/yk) P(yk)

$$ P(x_j/y_k) = \frac{P(y_k/x_j)\,P(x_j)}{P(y_k)} = \frac{P(y_k/x_j)\,P(x_j)}{\sum_{j=0}^{J-1} P(y_k/x_j)\,P(x_j)} $$
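A short numerical sketch of the four probability terms together, for the binary channel (the prior and crossover values are hypothetical; numpy assumed):

```python
import numpy as np

Pe = 0.1                      # hypothetical crossover probability
prior = np.array([0.7, 0.3])  # hypothetical P(x0), P(x1)
P = np.array([[1 - Pe, Pe],   # channel matrix: P[j, k] = P(yk/xj)
              [Pe, 1 - Pe]])

joint = prior[:, None] * P    # P(xj, yk) = P(yk/xj) P(xj)
marginal = joint.sum(axis=0)  # P(yk) = sum_j P(xj, yk)
posterior = joint / marginal  # Bayes: P(xj/yk) = P(xj, yk) / P(yk)

print("P(yk):   ", marginal)
print("P(xj/yk):\n", posterior)  # column k holds P(x0/yk), P(x1/yk)
```

Each column of `posterior` sums to 1, as a conditional distribution over the transmitted symbols must.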
CLASSICAL CHANNELS:

A. BINARY SYMMETRIC CHANNEL:

Transmitted: x0 = 0, x1 = 1.  Received: y0 = 0, y1 = 1.

P(y0/x0) = P(0/0) = 1 − Pe
P(y1/x1) = P(1/1) = 1 − Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe

CHANNEL MATRIX:

$$ P = \begin{bmatrix} 1-P_e & P_e \\ P_e & 1-P_e \end{bmatrix} $$
CLASSICAL CHANNELS:

B. ERASURE CHANNEL:

Transmitted: x0 = 0, x1 = 1.  Received: y0 = 0, y1 = 1, y2 = e (erasure).

P(y0/x0) = P(0/0) = 1 − q
P(y2/x0) = P(e/0) = q
P(y1/x1) = P(1/1) = 1 − q
P(y2/x1) = P(e/1) = q

CHANNEL MATRIX:

$$ P = \begin{bmatrix} 1-q & 0 & q \\ 0 & 1-q & q \end{bmatrix} $$
SOURCE ENTROPY:
The average information transmitted over the channel per symbol:

$$ H(X) = \sum_{j=0}^{J-1} P(x_j)\log_2\frac{1}{P(x_j)} $$

CONDITIONAL ENTROPY:
The average information lost due to the channel per symbol, given that a certain symbol $y_k$ is received:

$$ H(X/y_k) = \sum_{j=0}^{J-1} P(x_j/y_k)\log_2\frac{1}{P(x_j/y_k)} $$
CONDITIONAL ENTROPY:
The mean of the entropy over all the received symbols:

$$ H(X/Y) = \sum_{k=0}^{K-1} P(y_k)\,H(X/y_k) = \sum_{k=0}^{K-1}\sum_{j=0}^{J-1} P(y_k)\,P(x_j/y_k)\log_2\frac{1}{P(x_j/y_k)} $$

or, since $P(y_k)\,P(x_j/y_k) = P(x_j, y_k)$,

$$ H(X/Y) = \sum_{k=0}^{K-1}\sum_{j=0}^{J-1} P(x_j, y_k)\log_2\frac{1}{P(x_j/y_k)} $$

H(X/Y) is called the equivocation of X with respect to Y.
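A minimal sketch computing H(X), the per-symbol entropies H(X/yk), and the equivocation H(X/Y) for the binary channel of the earlier slides (prior and crossover values hypothetical; numpy assumed):

```python
import numpy as np

def entropy(p):
    """H = sum p log2(1/p), skipping zero-probability terms."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1 / p)))

Pe = 0.1
prior = np.array([0.7, 0.3])                # hypothetical P(xj)
P = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])  # P[j, k] = P(yk/xj)

joint = prior[:, None] * P                  # P(xj, yk)
p_y = joint.sum(axis=0)                     # P(yk)
posterior = joint / p_y                     # P(xj/yk), one column per yk

H_X = entropy(prior)
# H(X/Y) = sum_k P(yk) H(X/yk): average the per-received-symbol entropies
H_X_given_Y = sum(p_y[k] * entropy(posterior[:, k]) for k in range(len(p_y)))

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(X/Y) = {H_X_given_Y:.4f} bits (equivocation)")
```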
SIMILARLY:

$$ H(Y/x_j) = \sum_{k=0}^{K-1} P(y_k/x_j)\log_2\frac{1}{P(y_k/x_j)} $$

$$ H(Y/X) = \sum_{j=0}^{J-1} P(x_j)\,H(Y/x_j) $$

or

$$ H(Y/X) = \sum_{j=0}^{J-1}\sum_{k=0}^{K-1} P(x_j, y_k)\log_2\frac{1}{P(y_k/x_j)} $$

H(Y/X) is called the equivocation of Y with respect to X.
MUTUAL INFORMATION:
The average information the receiver receives per symbol:

I(X,Y) = H(X) − H(X/Y)

RECEIVED INFORMATION = TRANSMITTED INFORMATION − LOST INFORMATION
$$ I(X,Y) = H(X) - H(X/Y) = \sum_{j=0}^{J-1} P(x_j)\log_2\frac{1}{P(x_j)} \;-\; \sum_{j=0}^{J-1}\sum_{k=0}^{K-1} P(x_j,y_k)\log_2\frac{1}{P(x_j/y_k)} $$

Since $\sum_{k=0}^{K-1} P(y_k/x_j) = 1$, the first term can be rewritten as a double sum:

$$ \sum_{j} P(x_j)\log_2\frac{1}{P(x_j)} = \sum_{j}\sum_{k} P(y_k/x_j)\,P(x_j)\log_2\frac{1}{P(x_j)} = \sum_{j}\sum_{k} P(x_j,y_k)\log_2\frac{1}{P(x_j)} $$

Therefore:

$$ I(X,Y) = \sum_{j}\sum_{k} P(x_j,y_k)\left[\log_2\frac{1}{P(x_j)} - \log_2\frac{1}{P(x_j/y_k)}\right] = \sum_{j}\sum_{k} P(x_j,y_k)\log_2\frac{P(x_j/y_k)}{P(x_j)} $$
PROPERTIES OF MUTUAL INFORMATION:

I(X,Y) = H(X) − H(X/Y)
I(Y,X) = H(Y) − H(Y/X)
I(X,Y) = I(Y,X)
I(X,Y) = I(Y,X) = H(X) + H(Y) − H(X,Y)

where

$$ H(X,Y) = \sum_{j=0}^{J-1}\sum_{k=0}^{K-1} P(x_j,y_k)\log_2\frac{1}{P(x_j,y_k)} $$

[Venn diagram: H(X,Y) is the union of H(X) and H(Y); I(X,Y) is their overlap; H(X/Y) and H(Y/X) are the parts of H(X) and H(Y) outside the overlap.]
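These identities can be verified numerically. A sketch for the binary channel (hypothetical values as before) that computes I(X,Y) three ways and checks they agree:

```python
import numpy as np

def entropy(p):
    """H = sum p log2(1/p) over the positive entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(np.sum(p * np.log2(1 / p)))

Pe = 0.1                                    # hypothetical crossover probability
prior = np.array([0.7, 0.3])                # hypothetical P(xj)
P = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])  # P[j, k] = P(yk/xj)

joint = prior[:, None] * P                  # P(xj, yk)
p_y = joint.sum(axis=0)                     # P(yk)

H_X, H_Y, H_XY = entropy(prior), entropy(p_y), entropy(joint)
H_X_given_Y = float(np.sum(joint * np.log2(p_y / joint)))  # sum P(xj,yk) log2(1/P(xj/yk))
H_Y_given_X = float(np.sum(joint * np.log2(1 / P)))        # sum P(xj,yk) log2(1/P(yk/xj))

# The three expressions for mutual information coincide:
print(H_X - H_X_given_Y)   # I(X,Y) = H(X) - H(X/Y)
print(H_Y - H_Y_given_X)   # I(Y,X) = H(Y) - H(Y/X)
print(H_X + H_Y - H_XY)    # I(X,Y) = H(X) + H(Y) - H(X,Y)
```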
CHANNEL CAPACITY:
The channel capacity of a discrete memoryless channel is defined as the maximum rate at which information can be transmitted through the channel. It is the maximum mutual information over all possible distributions of the input probabilities P(xj):

$$ C = \max_{\{P(x_j)\}} I(X,Y) \quad \text{(bits per channel use)} $$
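A minimal sketch of this maximization for a binary-input channel: scan P(x0) over a grid and keep the largest mutual information. The grid search stands in for the exact optimization, and the channel values are hypothetical; the same function works for any binary-input matrix, including the erasure channel.

```python
import numpy as np

def mutual_information(prior, P):
    """I(X,Y) = sum_jk P(xj,yk) log2( P(yk/xj) / P(yk) ), skipping zero terms."""
    joint = prior[:, None] * P
    p_y = joint.sum(axis=0)
    total = 0.0
    for j in range(P.shape[0]):
        for k in range(P.shape[1]):
            if joint[j, k] > 0:
                total += joint[j, k] * np.log2(P[j, k] / p_y[k])
    return total

def capacity(P, n=1001):
    """Grid search over P(x0) for a 2-input channel matrix P."""
    best = 0.0
    for p0 in np.linspace(0.0, 1.0, n):
        prior = np.array([p0, 1 - p0])
        best = max(best, mutual_information(prior, P))
    return best

Pe = 0.1  # hypothetical
bsc = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])
print(capacity(bsc))  # ~0.531 bits, attained at P(x0) = 0.5 (see next slides)
```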
CAPACITY OF BINARY SYMMETRIC CHANNEL:

Let P(x0) = P0 and P(x1) = P0″ = 1 − P0, and write Pe″ = 1 − Pe.

Using the symmetric form of the mutual information,

$$ I(X,Y) = \sum_{j}\sum_{k} P(x_j,y_k)\log_2\frac{P(y_k/x_j)}{P(y_k)} $$

with the marginals

$$ P(y_0) = P_0 P_e'' + P_0'' P_e, \qquad P(y_1) = P_0 P_e + P_0'' P_e'', $$

the four terms are:

k=0, j=0: $P_0 P_e'' \log_2\dfrac{P_e''}{P(y_0)}$
k=0, j=1: $P_0'' P_e \log_2\dfrac{P_e}{P(y_0)}$
k=1, j=0: $P_0 P_e \log_2\dfrac{P_e}{P(y_1)}$
k=1, j=1: $P_0'' P_e'' \log_2\dfrac{P_e''}{P(y_1)}$

Grouping the numerator terms (using $P_0 + P_0'' = 1$) and the denominator terms (using $P_0 P_e'' + P_0'' P_e = P(y_0)$ and $P_0 P_e + P_0'' P_e'' = P(y_1)$):

$$ I(X,Y) = \underbrace{P(y_0)\log_2\frac{1}{P(y_0)} + P(y_1)\log_2\frac{1}{P(y_1)}}_{H(Y)} \;-\; \underbrace{\left[P_e\log_2\frac{1}{P_e} + P_e''\log_2\frac{1}{P_e''}\right]}_{\text{fixed by the channel}} $$

The second bracket does not depend on P0, so

$$ C = \max_{P_0} I(X,Y) $$

I(X,Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x0) = P(x1) = 0.5, so that P0″ = 1 − P0 = 0.5 and P(y0) = P(y1) = 0.5, making H(Y) = 1.

AT P0 = 0.5:

$$ C = I(X,Y) = 1 - \left[P_e\log_2\frac{1}{P_e} + (1-P_e)\log_2\frac{1}{1-P_e}\right] $$
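As a quick numerical check (the value of Pe is chosen for illustration, not from the slides): for $P_e = 0.1$,

$$ C = 1 - \left[0.1\log_2\frac{1}{0.1} + 0.9\log_2\frac{1}{0.9}\right] \approx 1 - 0.469 = 0.531 \text{ bits per channel use.} $$

The endpoints behave as expected: $P_e = 0$ gives $C = 1$ (noiseless channel), and $P_e = 0.5$ gives $C = 0$, since then the output is independent of the input.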
CAPACITY OF ERASURE CHANNEL:

Let P(x0) = P0 and P(x1) = P0″ = 1 − P0, and write q″ = 1 − q. The outputs are y0 = 0, y1 = 1, y2 = e, with marginals

$$ P(y_0) = P_0 q'', \qquad P(y_1) = P_0'' q'', \qquad P(y_2) = P_0 q + P_0'' q = q. $$

Using $I(X,Y) = \sum_j \sum_k P(x_j,y_k)\log_2\frac{P(y_k/x_j)}{P(y_k)}$, the six terms are:

k=0, j=0: $P_0 q'' \log_2\dfrac{q''}{P_0 q''} = P_0 q'' \log_2\dfrac{1}{P_0}$
k=0, j=1: 0 (since P(y0/x1) = 0)
k=1, j=0: 0 (since P(y1/x0) = 0)
k=1, j=1: $P_0'' q'' \log_2\dfrac{q''}{P_0'' q''} = P_0'' q'' \log_2\dfrac{1}{P_0''}$
k=2, j=0: $P_0 q \log_2\dfrac{q}{q} = 0$
k=2, j=1: $P_0'' q \log_2\dfrac{q}{q} = 0$

Hence:

$$ I(X,Y) = q''\left[P_0\log_2\frac{1}{P_0} + P_0''\log_2\frac{1}{P_0''}\right] = (1-q)\,H(X) $$

$$ C = \max_{P_0} I(X,Y) $$

I(X,Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x0) = P(x1) = 0.5, so that P0″ = 1 − P0 = 0.5 and H(X) = 1.

AT P0 = 0.5:

$$ C = I(X,Y) = (1-q)\cdot 1 = 1 - q $$
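As a quick check (illustrative value): $q = 0.25$ gives $C = 1 - 0.25 = 0.75$ bits per channel use. This matches the intuition that a fraction $q$ of the transmitted symbols is erased, so only the remaining fraction $1 - q$ carries information; the grid-search sketch from the CHANNEL CAPACITY slide reproduces the same value when given the erasure channel matrix.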
ANOTHER METHOD TO OBTAIN CHANNEL CAPACITY:

Given J input symbols and K output symbols, the channel capacity of a symmetric discrete memoryless channel is given by:

$$ C = \log_2 K - \sum_{k=0}^{K-1} P(y_k/x_j)\log_2\frac{1}{P(y_k/x_j)} $$

where the sum runs over any one row of the channel matrix; by symmetry, every row gives the same value.
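As a consistency check, assuming this is the intended formula: for the binary symmetric channel, $K = 2$ and any row of the channel matrix is $(1-P_e,\ P_e)$, so

$$ C = \log_2 2 - \left[P_e\log_2\frac{1}{P_e} + (1-P_e)\log_2\frac{1}{1-P_e}\right] = 1 - \left[P_e\log_2\frac{1}{P_e} + (1-P_e)\log_2\frac{1}{1-P_e}\right], $$

in agreement with the result derived earlier. (The erasure channel is not symmetric in this sense, since its columns are not permutations of one another, so this shortcut applies to channels like the BSC but not to the erasure channel.)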