Capacity for Communication Channels
This material was presented as a workshop tutorial at the 43rd IEEE Conference on Decision and Control, Bahamas, December 2004.
Overview
Importance of Uncertainty in Communications
Shannon's Definition of Capacity
Capacity of Additive Gaussian Channels
- Random Variable Case
- Random Process Case
Capacity of MIMO Gaussian Channels
Review of Maximin Capacity
Saddle Point Solutions
Maximin Capacity Subject to Normed Uncertainties
- Uncertain Channel
- Uncertain Noise
- Uncertain Channel and Noise
Coding Theorem
Examples and Conclusions
References
Importance of Communication Subject to Uncertainties
Channel measurement errors
Network operating conditions
Channel modeling
Communication in presence of jamming
Sensor networks
Teleoperations
Channel measurement errors
- Problem of allocation of power and bandwidth for channel sounding
- Allocation depends on the benefit that can be obtained from more accurate measurement (e.g., when the channel changes too rapidly)
- Feedback (allocation of bandwidth to provide accurate channel state information, robustness of feedback to channel error)
- Effect on new wireless systems (higher carrier frequencies, ultra-wide bandwidth)
Network operating conditions
- Uncertainty can impair interference cancellation in multiple-access systems
- Interplay between the physical layer and higher layers (e.g., effect of adaptive coding on congestion)
Channel modeling
- Models used for developing communication schemes are simple, but the gap between the real channel and the model can be large
Communication in presence of jamming
- The useful transmitted signal is accompanied by an adversary's signal, making the communication channel uncertain
Sensor networks
Teleoperations
Shannon's Definition of Capacity
Model of communication system
(Figure: source w → encoder f → channel h → + → decoder g → sink. The message w ∈ {1,…,M} is encoded as x^n = f(w), the channel produces y^n = h(f(w)), and the decoder outputs g(h(f(w))).)

A code (M, n) is defined by an encoder f : {1,…,M} → X^n and a decoder g : Y^n → {1,…,M}, with

λ_i = Pr{ g(h(f(i))) ≠ i | i sent }, i = 1,…,M,
P_e^{(n)} = (1/M) Σ_{i=1}^{M} λ_i,
R = (log M)/n.
R is an achievable rate if there exists a sequence of codes (M = 2^{nR}, n) such that the probability of error tends to zero as n tends to infinity.
The operational capacity is the supremum of all achievable rates.
Discrete memoryless channel
The channel capacity depends on the channel transition matrix Q(y|x), which is assumed known.
C = max_{P ∈ P(X)} I(P, Q),
I(P, Q) = Σ_{x∈X} Σ_{y∈Y} P(x) Q(y|x) log [ Q(y|x) / Σ_{x'∈X} P(x') Q(y|x') ].
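As a numerical aside (not part of the original tutorial), the maximization over P(x) above can be carried out with the Blahut-Arimoto iteration; the BSC instance below is an illustrative assumption chosen so the result can be checked against the closed form 1 − h(θ).

```python
import numpy as np

def blahut_arimoto(Q, tol=1e-9, max_iter=1000):
    """Capacity (bits) of a DMC with transition matrix Q[x, y] = Q(y|x)."""
    nx = Q.shape[0]
    p = np.full(nx, 1.0 / nx)                      # input distribution P(x)
    for _ in range(max_iter):
        q = p @ Q                                  # output distribution
        # d[x] = D( Q(.|x) || q ), in nats
        d = np.sum(np.where(Q > 0, Q * np.log(Q / np.maximum(q, 1e-300)), 0.0), axis=1)
        p_new = p * np.exp(d)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    q = p @ Q
    d = np.sum(np.where(Q > 0, Q * np.log(Q / np.maximum(q, 1e-300)), 0.0), axis=1)
    return p, np.max(d) / np.log(2)

# BSC with crossover 0.1: capacity should be 1 - h(0.1), about 0.531 bits
Q = np.array([[0.9, 0.1], [0.1, 0.9]])
p_opt, C = blahut_arimoto(Q)
```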
What if Q(y|x) is unknown? Example: compound BSC.
What is the channel capacity?
X = Y = {0,1}, Θ ⊂ [0,1],
Q(y|x; θ) = 1 − θ if y = x, and θ if y ≠ x.

(Figure: BSC transition diagram: 0→0 and 1→1 with probability 1−θ; 0→1 and 1→0 with probability θ.)
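A small numerical sketch (an illustration, not from the tutorial): for the compound BSC the uniform input is optimal for every crossover θ, so the maximin capacity reduces to the worst case of 1 − h(θ) over the uncertainty set; the three-element Θ below is an assumed example.

```python
import numpy as np

def h2(t):
    """Binary entropy in bits."""
    t = np.clip(t, 1e-12, 1 - 1e-12)
    return -t * np.log2(t) - (1 - t) * np.log2(1 - t)

def compound_bsc_capacity(thetas):
    """max_P inf_theta I(P, Q_theta): the uniform input is optimal for each
    BSC, so the maximin equals the infimum over theta of 1 - h(theta)."""
    return min(1.0 - h2(t) for t in thetas)

# Crossover known only to lie in {0.05, 0.1, 0.2}: worst case is theta = 0.2
C = compound_bsc_capacity([0.05, 0.1, 0.2])
```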
Model of communication system with CSI
(Figure: source → encoder → channel with additive noise → decoder → sink; channel state information u is available at the encoder and v at the decoder.)
Examples: arbitrarily varying channels (AVC), arbitrarily varying finite-state channels (AVFSC), and others.
Discrete memoryless AVC, with S the state space:
Q(y^n | x^n; s^n) = ∏_t Q(y_t | x_t; s_t), s_t ∈ S.
What is the channel capacity?
Additive Gaussian Channels: Random Variable Case
y = x + n, n ~ N(0, σ²), Var(x) ≤ P,
C = max_{P(x): Var(x) ≤ P} I(x; y) = (1/2) log(1 + P/σ²).
Random variable case derivation
max_{P(x)} I(x; y) = max [ H(y) − H(y|x) ]
 = max H(y) − H(n)
 = (1/2) log 2πe(P + σ²) − (1/2) log 2πeσ²
 = (1/2) log(1 + P/σ²),
with the maximum attained by a Gaussian input with Var(x) = P.
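The closed form above is easy to evaluate; the short helper below (an illustration, not from the tutorial) computes it in bits per channel use for an assumed SNR.

```python
import math

def awgn_capacity(P, sigma2):
    """C = (1/2) log2(1 + P / sigma^2), in bits per channel use."""
    return 0.5 * math.log2(1.0 + P / sigma2)

# SNR of 15 (about 11.8 dB) gives exactly 2 bits per channel use
C = awgn_capacity(15.0, 1.0)
```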
What is the capacity if the noise is unknown? Gaussian AVC:
y_t = x_t + s_t + n_t, n_t ~ N(0, σ²),
where the statistics of s_t are unknown.
Additive Gaussian Channels: Random Process Case
(Figure: x → channel filter H(f) → + → y, with additive noise n shaped by the filter W(f).)
Random process case derivation
A = { S_x ; ∫ S_x(f) df ≤ P },
C = sup_{S_x ∈ A} (1/2) ∫ log( 1 + S_x|H|² / (S_n W²) ) df,
J(S_x) = (1/2) ∫ log( 1 + S_x|H|² / (S_n W²) ) df + λ ( ∫ S_x df − P ).
Necessary conditions
S_x* + S_n W²/|H|² = ν* where S_x* > 0,
S_x* = 0 where ν* − S_n W²/|H|² ≤ 0,
∫ ( ν* − S_n W²/|H|² )^+ df = P,
with ν* related to the Lagrange multiplier λ.
Capacity of continuous time additive Gaussian channel
C = (1/2) ∫ log( ν* |H|² / (S_n W²) ) df,
∫ ( ν* − S_n W²/|H|² )^+ df = P.
Range of integration
ν* − S_n W²/|H|² ≥ 0, ν* > 0.
What is the capacity if the frequency response H of the channel, or the noise PSD S_n, belongs to a given set?
Water-filling
(Figure: water-filling. The transmit PSD fills the region between the curve S_n W²/|H|² and the water level ν* over the band [−B_f, B_f].)
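On a discretized frequency grid, the water level ν* can be found by bisection on the power constraint; the sketch below (an illustration, not from the tutorial) uses an assumed first-order channel and white noise.

```python
import numpy as np

def water_fill(noise_over_gain, df, P):
    """Given N(f) = S_n W^2 / |H|^2 on a grid, find nu* such that
    sum((nu* - N)^+) * df = P, and return nu*, S_x = (nu* - N)^+, and C."""
    n = len(noise_over_gain)
    lo, hi = 0.0, noise_over_gain.max() + P / (df * n)   # hi is always feasible
    for _ in range(100):
        nu = 0.5 * (lo + hi)
        power = np.sum(np.maximum(nu - noise_over_gain, 0.0)) * df
        lo, hi = (nu, hi) if power < P else (lo, nu)
    nu = 0.5 * (lo + hi)
    Sx = np.maximum(nu - noise_over_gain, 0.0)
    C = 0.5 * np.sum(np.log2(1.0 + Sx / noise_over_gain)) * df
    return nu, Sx, C

# assumed first-order channel |H|^2 = 1/(1 + (f/100)^2), white noise 1e-3
f = np.linspace(-500.0, 500.0, 2001)
df = f[1] - f[0]
N = 1e-3 * (1.0 + (f / 100.0) ** 2)
nu, Sx, C = water_fill(N, df, P=1.0)
```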
MIMO Channels
Static MIMO Gaussian Channel
n is zero-mean complex Gaussian noise with independent real and imaginary parts of equal variance:
y = Hx + n, E[nn*] = I_r, trace(E[xx*]) ≤ P,
x ∈ C^t, y ∈ C^r, H ∈ C^{r×t}.
Capacity of static MIMO channel
C = Σ_i ( log(ν λ_i) )^+, Σ_i ( ν − λ_i^{−1} )^+ = P,
where {λ_i} are the eigenvalues of HH* and ν is the Lagrange multiplier (water level).
What is the capacity if the matrix H belongs to a certain set?
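The eigenvalue water-filling above can be sketched numerically as follows (an illustration, not from the tutorial; the identity channel is an assumed test case).

```python
import numpy as np

def mimo_capacity(H, P):
    """Water-fill over the eigenvalues lambda_i of H H*:
    C = sum (log2(nu * lambda_i))^+  with  sum (nu - 1/lambda_i)^+ = P."""
    lam = np.linalg.eigvalsh(H @ H.conj().T)
    lam = lam[lam > 1e-12]                    # keep nonzero eigenmodes
    inv = 1.0 / lam
    lo, hi = 0.0, inv.max() + P               # hi always over-fills
    for _ in range(100):
        nu = 0.5 * (lo + hi)
        if np.sum(np.maximum(nu - inv, 0.0)) < P:
            lo = nu
        else:
            hi = nu
    nu = 0.5 * (lo + hi)
    return float(np.sum(np.maximum(np.log2(nu * lam), 0.0)))

# identity 2x2 channel, total power 3: water level nu = 2.5 on both modes
C = mimo_capacity(np.eye(2), 3.0)
```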
Review of Maximin Capacity
Review of Minimax Capacity
Example: compound DMC
This result is due to Blackwell et al. [6]; see also Csiszar [8] and Wolfowitz [21]. Blachman [5] and Dobrushin [12] were the first to apply a game-theoretic approach to computing channel capacity, with mutual information as the pay-off function, for discrete channels.
C = max_{P ∈ P(X)} inf_{θ∈Θ} I( P, Q(·|·; θ) ).
Does a saddle point exist?
For further references see Lapidoth and Narayan [18].
C = max_{P∈P(X)} inf_{θ∈Θ} I(P, Q(·|·;θ)) = inf_{θ∈Θ} max_{P∈P(X)} I(P, Q(·|·;θ)),
I(P, Q(·|·;θ*)) ≤ I(P*, Q(·|·;θ*)) ≤ I(P*, Q(·|·;θ)).
Gaussian AVC (GAVC) Channels
- Hughes and Narayan [16] determined the λ-capacity of the discrete-time GAVC for average and peak power constraints on the channel and the transmitted sequence, for random codes
- Hughes and Narayan [17] determined the capacity of the vector discrete-time GAVC together with a water-filling equation, and proved the saddle point
- Csiszar and Narayan [9] determined the capacity of the GAVC for deterministic codes
- Ahlswede [1] computed the channel capacity of the GAVC when the noise variance varies but does not exceed a certain bound
Gaussian AVC (GAVC) Channels
- McEliece [22] was the first to apply a game-theoretic approach to continuous channels with mutual information as the pay-off function
- Basar and Wu [3] considered jamming of Gaussian channels with a mean-square-error pay-off function
- Diggavi and Cover [23] used the game-theoretic approach to find the worst-case noise under a covariance matrix constraint on the noise
- Baker [2] computed the channel capacity of M-dimensional continuous-time Gaussian channels with energy constraints on the transmitted signal and the jammer
- Root and Varaiya [20] considered the capacity of the white-noise Gaussian continuous-time compound channel
Gaussian AVC (GAVC) Channels
- Vishwanath et al. [24] computed the worst-case capacity of Gaussian MIMO channels by applying the duality principle
Maximin Capacity Subject to Normed Uncertainties
Motivation
We deal with one class of channels: continuous-time Gaussian compound channels.
The choice of Gaussian noise is not a limitation, because Gaussian noise is known to be the worst-case noise in terms of mutual information (see [15]).
As opposed to the GAVC, which models uncertainty through an additive (jamming) signal with unknown statistics (see the GAVC discussion above), we model uncertainty through the set of all possible frequency responses of the channel (see [13], [20]).
This approach is appealing because the uncertainty set is well defined in the normed linear spaces H∞ and L1.
It enables the treatment of both parametric and non-parametric uncertainties.
It is practical because the uncertainty set can be extracted from Nyquist or Bode plots.
The obtained channel capacity formula explicitly depends on the size of the uncertainty set.
It gives the optimal transmitted power in the form of water-filling that depends on the size of the uncertainty set.
Our computation does not require the saddle point, because it relies on the work of Root and Varaiya [20] and Gallager [15].
It enables us to deal simultaneously with two types of uncertainty (channel frequency response and noise uncertainty), which, to the best of our knowledge, has not been done before.
Our approach gives the solution to the jamming problem for continuous-time channels, together with optimal transmitter and jammer strategies in terms of optimal PSDs. We show that the optimal PSD of the signal is proportional to the optimal PSD of the noise.
Although we consider a problem similar to that of Baker [2], his constraints on the signal and noise are in terms of energy, while we deal with power constraints.
Communication system model
Most models in use are probabilistic (compound channels, and arbitrarily varying channels with and without memory) and finite-dimensional.
The present model employs unknown transfer functions; the transmitted signal is power-constrained with known PSD, and the noise is Gaussian with a PSD belonging to a certain set.
Model
(Figure: x → channel filter H(f) → + → y; noise n shaped by the filter W(f).)
Both H(f) and W(f) could be unknown. An unknown transfer function G(f) can be modeled using additive uncertainty
G(f) = G_nom(f) + Δ1(f)W1(f), Δ1 ∈ H∞, ||Δ1||∞ ≤ 1,
where ||G||∞ = sup_f |G(f)|. Other models are possible, e.g., multiplicative:
G(f) = G_nom(f)( 1 + Δ1(f)W1(f) ).
Uncertainty models: additive and multiplicative
(Figure: block diagrams of the additive model G_nom(f) + Δ1(f)W1(f) and the multiplicative model G_nom(f)(1 + Δ1(f)W1(f)).)
Example
(Figure: Nyquist plot of G; nominal DC gain ξ = α/β, with the uncertain gain lying between (α/β)(1−δ) and (α/β)(1+δ).)
G_nom(f) = α/(j2πf + β) = ξ/(j2πf/β + 1), ξ = α/β,
G(f) = ξ_p(f)/(j2πf/β + 1), ξ_p(f) = ξ(1 + δΔ(f)), 0 < δ < 1.
The uncertainty set is described by a ball in the frequency domain centered at G_nom(f) with radius |W1(f)|:
|G(f) − G_nom(f)| ≤ |W1(f)|, W1(f) = δξ/(j2πf/β + 1).
Channel capacity with uncertainty
Define four sets:
A1 = { S_x ; ∫ S_x(f) df ≤ P_x },
A2 = { W ∈ H∞ ; W = W_nom + Δ2W2, W_nom ∈ H∞, W2 ∈ H∞, Δ2 ∈ H∞, ||Δ2||∞ ≤ 1 },
A3 = { H ∈ H∞ ; H = H_nom + Δ1W1, H_nom ∈ H∞, W1 ∈ H∞, Δ1 ∈ H∞, ||Δ1||∞ ≤ 1 }.
The overall PSD of the noise is S_n(f)|W(f)|², with the uncertainty modeled through the filter
W(f) = W_nom(f) + Δ2(f)W2(f),
or through the set A4 = { S_n ; ∫ S_n(f) df ≤ P_n }.
The mutual information rate is the pay-off function:
I(x; y) = (1/2) ∫ log( 1 + S_x|H|² / (S_n W²) ) df.
Channel capacity with uncertainty I
Three problems can be defined.
Noise uncertainty:
C_NU = sup_{S_x∈A1} inf_{W∈A2} (1/2) ∫ log( 1 + S_x|H|² / (S_n|W_nom + Δ2W2|²) ) df.
Channel uncertainty:
C_CU = sup_{S_x∈A1} inf_{H∈A3} (1/2) ∫ log( 1 + S_x|H_nom + Δ1W1|² / (S_n W²) ) df.
Channel–noise uncertainty:
C_CNU = sup_{S_x∈A1} inf_{W∈A2} inf_{H∈A3} (1/2) ∫ log( 1 + S_x|H_nom + Δ1W1|² / (S_n|W_nom + Δ2W2|²) ) df.
Channel capacity with channel–noise uncertainty
Theorem 1. Assume that
( |H_nom| − |W1| )² / ( S_n ( |W_nom| + |W2| )² )
is bounded and integrable, and consider
C_CNU = sup_{S_x∈A1} inf_{W∈A2} inf_{H∈A3} (1/2) ∫ log( 1 + S_x|H_nom + Δ1W1|² / (S_n|W_nom + Δ2W2|²) ) df.
The channel capacity is given parametrically by
C_CNU = (1/2) ∫ log( ν* (|H_nom| − |W1|)² / (S_n(|W_nom| + |W2|)²) ) df,
∫ ( ν* − S_n(|W_nom| + |W2|)² / (|H_nom| − |W1|)² )^+ df = P_x.
The integration extends over the set where
ν* − S_n(|W_nom| + |W2|)² / (|H_nom| − |W1|)² ≥ 0, ν* > 0,
where ν* is related to the Lagrange multiplier and is obtained from the constraint equation.
The infimum over the noise uncertainty is achieved at
Δ2*(f) = exp( j arg W_nom(f) − j arg W2(f) ), ||Δ2*||∞ = 1.
The mutual information rate after this minimization is given by
inf_{||Δ2||∞ ≤ 1} (1/2) ∫ log( 1 + S_x|H|² / (S_n|W_nom + Δ2W2|²) ) df
= (1/2) ∫ log( 1 + S_x|H|² / (S_n(|W_nom| + |W2|)²) ) df.
The infimum over the channel uncertainty is achieved at
Δ1*(f) = exp( j arg H_nom(f) − j arg W1(f) + jπ ), ||Δ1*||∞ = 1.
The mutual information rate after the second minimization is given by
inf_{||Δ1||∞ ≤ 1} (1/2) ∫ log( 1 + S_x|H_nom + Δ1W1|² / (S_n(|W_nom| + |W2|)²) ) df
= (1/2) ∫ log( 1 + S_x(|H_nom| − |W1|)² / (S_n(|W_nom| + |W2|)²) ) df.
Maximization gives the water-filling equation
S_x* + S_n(|W_nom| + |W2|)² / (|H_nom| − |W1|)² = ν*.
Water-filling
(Figure: the transmit PSD fills the region between the curve S_n(|W_nom| + |W2|)²/(|H_nom| − |W1|)² and the water level ν* over the band [−B_f, B_f].)
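The robust water-filling differs from the nominal one only in the noise-to-gain ratio being replaced by its worst case; the sketch below (an illustration under assumed filters, not from the tutorial) computes that worst-case ratio and checks it dominates the nominal one, which is why uncertainty can only decrease capacity.

```python
import numpy as np

def robust_ratio(Hnom, W1, Wnom, W2, Sn):
    """Worst-case noise-to-gain ratio S_n(|W_nom|+|W2|)^2 / (|H_nom|-|W1|)^2.
    Requires |H_nom(f)| > |W1(f)| on the band of interest."""
    return Sn * (np.abs(Wnom) + np.abs(W2)) ** 2 / (np.abs(Hnom) - np.abs(W1)) ** 2

f = np.linspace(-100.0, 100.0, 401)
Hnom = 1.0 / (1j * f / 50.0 + 1.0)       # assumed first-order nominal channel
W1 = 0.2 * Hnom                          # 20% relative channel uncertainty
Wnom = np.ones_like(f, dtype=complex)    # assumed nominal noise-shaping filter
W2 = 0.1 * Wnom                          # 10% noise-filter uncertainty
Sn = 1e-3                                # flat noise PSD
worst = robust_ratio(Hnom, W1, Wnom, W2, Sn)
nominal = Sn * np.abs(Wnom) ** 2 / np.abs(Hnom) ** 2
```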
Channel capacity with channel uncertainty only (W2 = 0):
C_CU = (1/2) ∫ log( ν* (|H_nom| − |W1|)² / (S_n|W_nom|²) ) df,
∫ ( ν* − S_n|W_nom|² / (|H_nom| − |W1|)² )^+ df = P_x.
Channel capacity with noise uncertainty only (W1 = 0):
C_NU = (1/2) ∫ log( ν* |H_nom|² / (S_n(|W_nom| + |W2|)²) ) df,
∫ ( ν* − S_n(|W_nom| + |W2|)² / |H_nom|² )^+ df = P_x.
Channel capacity with uncertainty II
Second approach.
Noise uncertainty:
C_NU = sup_{S_x∈A1} inf_{S_n∈A4} (1/2) ∫ log( 1 + S_x|H|² / (S_n W²) ) df.
Channel–noise uncertainty:
C_CNU = sup_{S_x∈A1} inf_{S_n∈A4} inf_{H∈A3} (1/2) ∫ log( 1 + S_x|H_nom + Δ1W1|² / (S_n W²) ) df.
Channel capacity with channel–noise uncertainty
Theorem 2. Assume that S_x|H|²/(S_n W²) is bounded and integrable, define
R = (|H_nom| − |W1|)/|W|,
and consider
C_CNU = sup_{S_x∈A1} inf_{S_n∈A4} inf_{H∈A3} (1/2) ∫ log( 1 + S_x|H_nom + Δ1W1|² / (S_n W²) ) df.
The channel capacity is given by
C_CNU = (1/2) ∫ log( 1 + (λ1*/λ2*) R² ) df, R = (|H_nom| − |W1|)/|W|,
where λ1*, λ2* > 0 are the Lagrange multipliers, and the optimal PSDs satisfy
S_n* + S_x* R² = R² / (2λ2*),
S_n* ( S_n* + S_x* R² ) = S_x* R² / (2λ1*),
so that S_n* = (λ2*/λ1*) S_x*.
The integral constraints are satisfied with equality:
∫ S_x* df = P_x, ∫ S_n* df = P_n.
The power spectral densities satisfy the water-filling relation
S_x* + S_n* R^{−2} = 1/(2λ2*).
The capacity for the noise-uncertainty case is obtained by setting W1 = 0.
Theorem 3. Assume that S_x|H|²/(S_n W²) is bounded and integrable. Define the sets
A1 = { S_x ; ∫ S_x(f) df ≤ P_x }, A4 = { S_n ; ∫ S_n(f) df ≤ P_n }.
The lower value C− of the pay-off function (the mutual information rate) is defined as
C− = C_NU = sup_{S_x∈A1} inf_{S_n∈A4} (1/2) ∫ log( 1 + S_x|H|² / (S_n W²) ) df
and is given by Theorem 2. The upper value C+ is defined by
C+ = inf_{S_n∈A4} sup_{S_x∈A1} (1/2) ∫ log( 1 + S_x|H|² / (S_n W²) ) df.
The lower value C− equals the upper value C+, implying that a saddle point exists.
The optimal PSD of the noise is proportional to the optimal PSD of the transmitter, which can be interpreted as the players in the game trying to match each other.
Channel Coding Theorem
Define the frequency response of the equivalent channel
G(f) = ( S_x|H|² / (S_n W²) )^{1/2}
with impulse response g(t), and the ten sets
B1 = { G ; S_x ∈ A1, W ∈ A2, H ∈ A3 },
B2 = { G ; S_x ∈ A1, H ∈ A3, Δ2 = 0 },
B3 = { G ; S_x ∈ A1, W ∈ A2, Δ1 = 0 },
B4 = { G ; S_x ∈ A1, S_n ∈ A4, H ∈ A3, Δ2 = 0 },
B5 = { G ; S_x ∈ A1, S_n ∈ A4, Δ1 = 0, Δ2 = 0 },
K_i = { g(t) ; G(f) ∈ B_i, g(t) satisfies 1), 2), 3) }, i = 1,…,5.
1) g(t) has finite duration δ;
2) g(t) ∈ L²;
3) ∫_{−∞}^{−A} |G(f)|² df + ∫_{A}^{+∞} |G(f)|² df → 0 as A → +∞.
The sets K_i are conditionally compact.
A positive number R_i is called an attainable rate for the set of channels K_i if there exists a sequence of codes (T_n, e^{T_n R_i}, ε_n) such that ε_n → 0 uniformly over the set K_i as T_n → +∞.
Theorem 4. The operational capacities C_i (the suprema of all attainable rates R_i) for the sets of communication channels with uncertainties K_i (i = 1,…,5) are given by the corresponding formulas in Theorem 1 and Theorem 2.
Proof. Follows from [15] and [20] (see [11]).
Example 1
Uncertain channel, white noise. Transfer function:
H_nom(f) = (α/β)/(j2πf/β + 1) = ξ/(j2πf/β + 1), ξ = α/β,
H(f) = ξ_p(f)/(j2πf/β + 1), ξ_p(f) = ξ(1 + δΔ1(f)), 0 ≤ δ < 1,
|H(f) − H_nom(f)| = δξ|Δ1(f)| / |j2πf/β + 1| ≤ |W1(f)|, W1(f) = δξ/(j2πf/β + 1).
Channel capacity. With white noise of PSD N0/2 and W2 = 0, the worst-case noise-to-gain ratio is ((2πf)² + β²)/c with c = 2α²(1 − δ)²/N0, and water-filling over this ratio gives, with ω = 2πf,
S_x(f) = (ω_c² − ω²)/c for |ω| ≤ ω_c, and 0 otherwise,
C_CU = (1/π) ( ω_c − β tan⁻¹(ω_c/β) ) nats/s, with cutoff ω_c = (3πcP/2)^{1/3} set by the power constraint.
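A numerical sketch of this example (an illustration, assuming the water-filling form with c = 2α²(1 − δ)²/N0 noted on this slide; parameter values are those used in the capacity plot that follows). Capacity should decrease as the uncertainty size δ grows.

```python
import numpy as np

def example1_capacity(P, N0, alpha, beta, delta):
    """Capacity (nats/s), assuming c = 2 alpha^2 (1-delta)^2 / N0,
    cutoff w_c = (3 pi c P / 2)^(1/3), and
    C = (w_c - beta * arctan(w_c / beta)) / pi."""
    c = 2.0 * alpha ** 2 * (1.0 - delta) ** 2 / N0
    wc = (3.0 * np.pi * c * P / 2.0) ** (1.0 / 3.0)
    return (wc - beta * np.arctan(wc / beta)) / np.pi

# parameters from the plot: P = 1e-2 W, N0 = 1e-8 W/Hz, beta = 1000 rad/s
C_small = example1_capacity(1e-2, 1e-8, 500.0, 1000.0, delta=0.1)
C_large = example1_capacity(1e-2, 1e-8, 500.0, 1000.0, delta=0.5)
```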
(Figure: capacity versus the uncertainty size δ, for P = 10⁻² W, N0 = 10⁻⁸ W/Hz, β = 1000 rad/s, and α ∈ {250, 500, 1000}.)
Example 2
Uncertain noise. Transfer function:
H(f) = ω_n² / ( (j2πf)² + 2ξω_n(j2πf) + ω_n² ).
Noise uncertainty description:
W(f) = ξ_p(f)/(j2πf/β + 1), ξ_p(f) = ξ(1 + δΔ2(f)), 0 ≤ δ < 1, ξ = α/β,
W2(f) = δξ/(j2πf/β + 1).
(Figure: capacity versus δ for ω_n ∈ {700, 1000, 1300} rad/s, with P = 0.01 W, ξ = 0.2, α = 1, β = 1000 rad/s.)
(Figure: optimal PSDs for ω_n = 1300 rad/s, β = 1000 rad/s, P = 0.01 W, ξ = 0.2, α = 1, with curves for δ = 0, 0.1, 0.2.)
Example 3
Uncertain channel, uncertain noise.
H(f) = ω_n² / ( (j2πf)² + 2ξω_n(j2πf) + ω_n² ),
where the damping ratio ξ is uncertain.
The noise uncertainty is modeled as in Example 2.
(Figures: capacity results for Example 3.)
References
[1] Ahlswede, R., “The capacity of a channel with arbitrary varying Gaussian channel probability functions”, Trans. 6th Prague Conf. Information Theory, Statistical Decision Functions, and Random Processes, pp. 13-31, Sept. 1971.
[2] Baker, C. R., Chao, I.-F., “Information capacity of channels with partially unknown noise. I. Finite dimensional channels”, SIAM J. Appl. Math., vol. 56, no. 3, pp. 946-963, June 1996.
[3] Basar, T., Wu, Y.-W., “A complete characterization of minimax and maximin encoder-decoder policies for communication channels with incomplete statistical description”, IEEE Transactions on Information Theory, vol. 31, pp. 482-489, Jan. 1985.
[4] Biglieri, E., Proakis, J., Shamai, S., “Fading channels: information-theoretic and communications aspects,” IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2619-2692, October, 1998.
[5] Blachman, N. M., “Communication as a game”, IRE Wescon 1957 Conference Record, vol. 2, pp. 61-66, 1957.
[6] Blackwell, D., Breiman, L., Thomasian, A. J., “The capacity of a class of channels”, Ann. Math. Stat., vol. 30, pp. 1229-1241, 1959.
[7] Charalambous, C. D., Denic, S. Z., Djouadi, S. M., “Robust capacity of white Gaussian noise channels with uncertainty”, accepted for the 43rd IEEE Conference on Decision and Control.
[8] Csiszar, I., Korner, J., Information theory: Coding theorems for discrete memoryless systems. New York: Academic Press, 1981.
[9] Csiszar, I., Narayan P., “Capacity of the Gaussian arbitrary varying channels”, IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 18-26, Jan., 1991.
[10] Denic, S. Z., Charalambous, C. D., Djouadi, S.M., “Capacity of Gaussian channels with noise uncertainty”, Proceedings of IEEE CCECE 2004, Canada.
[11] Denic, S.Z., Charalambous, C.D., Djouadi, S.M., “Robust capacity for additive colored Gaussian uncertain channels,” preprint.
[12] Dobrushin, R. L., “Optimal information transmission through a channel with unknown parameters”, Radiotekhnika i Elektronika, vol. 4, pp. 1951-1956, 1959.
[13] Doyle, J.C., Francis, B.A., Tannenbaum, A.R., Feedback control theory, New York: McMillan Publishing Company, 1992.
[14] Forys, L.J., Varaiya, P.P., “The ε-capacity of classes of unknown channels,” Information and control, vol. 44, pp. 376-406, 1969.
[15] Gallager, R. G., Information theory and reliable communication. New York: Wiley, 1968.
[16] Hughes, B., Narayan P., “Gaussian arbitrary varying channels”, IEEE Transactions on Information Theory, vol. 33, no. 2, pp. 267-284, Mar., 1987.
[17] Hughes, B., Narayan P., “The capacity of vector Gaussian arbitrary varying channel”, IEEE Transactions on Information Theory, vol. 34, no. 5, pp. 995-1003, Sep., 1988.
[18] Lapidoth, A., Narayan, P., “Reliable communication under channel uncertainty,” IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2148-2177, October, 1998.
[19] Medard, M., “Channel uncertainty in communications,” IEEE Information Theory Society Newsletters, vol. 53, no. 2, p. 1, pp. 10-12, June, 2003.
[20] Root, W.L., Varaiya, P.P., “Capacity of classes of Gaussian channels,” SIAM J. Appl. Math., vol. 16, no. 6, pp. 1350-1353, November, 1968.
[21] Wolfowitz, J., Coding Theorems of Information Theory, Springer-Verlag, Berlin Heidelberg, 1978.
[22] McEliece, R. J., “Communications in the presence of jamming – An information theoretic approach”, in Secure Digital Communications, G. Longo, ed., Springer-Verlag, New York, 1983, pp. 127-166.
[23] Diggavi, S. N., Cover, T. M., “The worst additive noise under a covariance constraint”, IEEE Transactions on Information Theory, vol. 47, no. 7, pp. 3072-3081, November, 2001.
[24] Vishwanath, S., Boyd, S., Goldsmith, A., “Worst-case capacity of Gaussian vector channels”, Proceedings of 2003 Canadian Workshop on Information Theory.
[25] Shannon, C. E., “A mathematical theory of communication”, Bell Sys. Tech. J., vol. 27, pp. 379-423 and 623-656, July and October 1948.