Layered Error-Correction Codes for Real-Time Streaming over Erasure Channels
Ashish Khisti, University of Toronto
Joint work: Ahmed Badr (Toronto), Wai-Tian Tan (Cisco), John Apostolopoulos (Cisco)
Sequential and Adaptive Information Theory Workshop, McGill University
Nov 7, 2013
Real-Time Communication System
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 2/ 42
Motivating Example
[Figure: streaming code. Each channel packet x_i has length n and carries the k-symbol source packet s_i together with a parity p_i, for i = 0, ..., 7. A burst erases x_0, ..., x_3; the decoder must recover s_0, s_1, s_2, s_3.]
p_i = s_i · H_0 + s_{i−1} · H_1 + ... + s_{i−M} · H_M,   H_i ∈ F_q^{k×(n−k)}
Erasure Codes:
Random Linear Codes
Strongly-MDS Codes (Gabidulin '88, Gluesing-Luerssen '06)
Recover s_0, s_1, s_2, s_3: after subtracting the contributions of the known symbols s_4, ..., s_7, the surviving parities satisfy
[p_4; p_5; p_6; p_7] = [H_4 H_3 H_2 H_1; H_5 H_4 H_3 H_2; 0 H_5 H_4 H_3; 0 0 H_5 H_4] · [s_0; s_1; s_2; s_3],
and the block matrix is full rank, so s_0, ..., s_3 are recovered.
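The recovery step above can be checked numerically. Below is a minimal scalar toy over GF(2) (k = 1 source bit and one parity bit per packet, parity memory M = 5); the tap values H_0, ..., H_5 are an illustrative choice, not from the talk, picked so the 4×4 recovery matrix is full rank:

```python
import numpy as np

# Illustrative taps H_0..H_5 over GF(2); with these, the recovery matrix
# [H4 H3 H2 H1; H5 H4 H3 H2; 0 H5 H4 H3; 0 0 H5 H4] is upper triangular
# with ones on the diagonal, hence full rank.
H = np.array([1, 1, 1, 0, 1, 0])

def parity(s, i):
    """p_i = sum_j s_{i-j} * H_j (mod 2), treating s_t = 0 for t < 0."""
    return sum(H[j] * s[i - j] for j in range(len(H)) if i - j >= 0) % 2

s = np.array([1, 0, 1, 1, 0, 1, 1, 0])          # s_0 .. s_7
p = np.array([parity(s, i) for i in range(8)])  # p_0 .. p_7

# A burst erases packets 0..3, so s_0..s_3 are unknown while s_4..s_7 and
# p_4..p_7 survive.  Strip the known contributions of s_4..s_7 ...
A = np.array([[H[4], H[3], H[2], H[1]],
              [H[5], H[4], H[3], H[2]],
              [0,    H[5], H[4], H[3]],
              [0,    0,    H[5], H[4]]])
b = np.array([(p[i] - sum(H[j] * s[i - j] for j in range(len(H)) if i - j >= 4)) % 2
              for i in range(4, 8)])

# ... and solve A x = b over GF(2) by Gaussian elimination.
def gf2_solve(A, b):
    A, b = A.copy() % 2, b.copy() % 2
    n = len(b)
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col])
        A[[col, pivot]], b[[col, pivot]] = A[[pivot, col]], b[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:
                A[r] ^= A[col]
                b[r] ^= b[col]
    return b

recovered = gf2_solve(A, b)
assert np.array_equal(recovered, s[:4])   # s_0..s_3 recovered: [1 0 1 1]
```

The full-rank condition is exactly what makes the Gaussian elimination succeed; with vector symbols (k > 1) the same system is solved with block entries H_j.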
Motivating Example - II (B = 4, T = 8)
Rate 1/2 Baseline Erasure Code, T = 7
[Figure: packets x_i = (v_i, p_i) for i = 0, ..., 11; a burst erases x_0, ..., x_3, and v_0, v_1, v_2, v_3 are recovered from the later parities.]
Rate 1/2 Repetition Code, T = 8
[Figure: packets x_i = (u_i, u_{i−8}); each u_i is retransmitted T = 8 slots later, so u_0, u_1, u_2, u_3 are recovered at times 8, ..., 11.]
Motivating Example - II (B = 4, T = 8)
[Figure: layered code. Each packet is x_i = (u_i, v_i, p_i + u_{i−8}), shown for i = 0, ..., 11; a burst erases x_0, ..., x_3. The combined parities p_4 + u_{−4}, ..., p_7 + u_{−1} recover v_0, v_1, v_2, v_3, and the repeated symbols in p_8 + u_0, ..., p_11 + u_3 recover u_0, u_1, u_2, u_3.]
Sending the v-parities and the repeated u's separately (with equal-size u and v) costs rate R = (u+v)/(3u+v) = 1/2; combining them into a single field gives R = (u+v)/(2u+v) = 2/3.
Encoding:
1 Split each source symbol into 2 groups: s_i = (u_i, v_i)
2 Apply an erasure code to the v_i stream, generating the p_i parities
3 Repeat the u_i symbols with a shift of T
4 Combine the repeated u_i's with the p_i's
This construction yields the Maximally Short Codes (Martinian and Sundberg '04, Martinian and Trott '07).
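The four encoding steps can be sketched as a streaming encoder. This is a structural sketch only: `parity_fn` is a hypothetical stand-in for the erasure code applied to the v-stream (a strongly-MDS code in the talk), and symbols are modeled as plain integers combined by XOR:

```python
from collections import deque

def make_encoder(parity_fn, T):
    """Layered encoder emitting x_i = (u_i, v_i, p_i XOR u_{i-T})."""
    u_delay = deque([0] * T, maxlen=T)   # holds u_{i-T}..u_{i-1}; u_t = 0 for t < 0
    v_hist = []
    def encode(u_i, v_i):
        # Step 1 happened upstream: s_i was split into (u_i, v_i).
        v_hist.append(v_i)
        p_i = parity_fn(v_hist)          # step 2: erasure code on the v-stream
        u_shifted = u_delay[0]           # step 3: u_i repeated with shift T
        u_delay.append(u_i)
        return (u_i, v_i, p_i ^ u_shifted)   # step 4: combine repeat with parity
    return encode

# Toy parity (hypothetical): XOR of the last three v-symbols.
def toy_parity(h):
    return h[-1] ^ (h[-2] if len(h) > 1 else 0) ^ (h[-3] if len(h) > 2 else 0)

enc = make_encoder(toy_parity, T=8)
packets = [enc(u, v) for u, v in zip(range(1, 13), range(100, 112))]
# From time 8 onward the third field carries p_i XOR u_{i-8}, e.g. packet 8
# combines the parity of v_0..v_8 with u_0 = 1.
```

Combining the repeat with the parity in step 4 is what buys the rate increase from 1/2 to 2/3: the third field does double duty, serving as a parity for the v's during a burst and as a delayed copy of u_{i−T} once the v's are known.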
Key Questions
Can we construct streaming codes for realistic channels?
[Figure: Gilbert-Elliott model — a two-state Markov chain alternating between a good state and bad states.]
How much performance gain can we obtain?
What are the fundamental metrics for low-delay error-correction codes?
Problem Setup
System Model
Source Model: i.i.d. sequence s[t] ~ p_s(·) = Unif{(F_q)^k}
Streaming Encoder: x[t] = f_t(s[1], ..., s[t]), x[t] ∈ (F_q)^n
Erasure Channel
Delay-Constrained Decoder: ŝ[t] = g_t(y[1], ..., y[t + T])
Rate R = k/n
[Figure: example erasure pattern on time slots 0–15 for (N, B, W) = (2, 3, 6): within any sliding window of length W = 6, the channel introduces either at most N = 2 isolated erasures or a single erasure burst of length at most B = 3.]
Capacity: C(N, B, W, T)
Streaming Codes — Capacity (Badr-Khisti-Tan-Apostolopoulos 2013)
Theorem
Consider the C(N, B, W) channel with W ≥ B + 1, and let the delay be T.
Upper Bound: for any rate-R code,
(R/(1−R))·B + N ≤ min(W, T + 1).
Lower Bound: there exists a rate-R code that satisfies
(R/(1−R))·B + N ≥ min(W, T + 1) − 1.
The gap between the upper and lower bound is 1 unit of delay.
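Rearranging the upper bound for R gives R ≤ (T_eff − N)/(T_eff − N + B) with T_eff = min(W, T + 1). A quick check with exact arithmetic:

```python
from fractions import Fraction

def rate_upper_bound(N, B, W, T):
    """Converse rearranged: (R/(1-R))*B + N <= Teff, Teff = min(W, T+1),
    hence R <= (Teff - N) / (Teff - N + B)."""
    Teff = min(W, T + 1)
    return Fraction(Teff - N, Teff - N + B)

# N = 1 with a window larger than T recovers the burst bound T/(T+B):
assert rate_upper_bound(1, 4, 12, 8) == Fraction(8, 12) == Fraction(2, 3)
```

Note the N = 1, W ≥ T + 1 case reproduces the rate 2/3 achieved in the motivating example with B = 4, T = 8, so the construction there is capacity-achieving for bursts.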
Streaming Codes - Burst Erasure Channels
C(N = 1, B, W ≥ T + 1) = T/(T + B)
[Figure: packets x_i = (u_i, v_i, p_i + u_{i−T}); a burst of length B erases the first B packets; the erased v_i's are recovered from the next T − B combined parities, and each u_i is recovered from its repeated copy at time i + T.]
Assume s_i ∈ F_q^T. Let v_i ∈ F_q^{T−B} and u_i, p_i ∈ F_q^B.
Recovery of {v_i} by time T − 1:
Number of Unknowns: B symbols over F_q^{T−B}
Number of Equations: T − B symbols over F_q^B
Recovery of u_i at time i + T.
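The counting argument can be verified mechanically: the scalar totals of unknowns and equations agree, and the packet size gives the claimed rate.

```python
from fractions import Fraction

def burst_code_counts(T, B):
    """Scalar bookkeeping for the burst construction with s_i in F_q^T,
    v_i in F_q^(T-B), and u_i, p_i in F_q^B."""
    unknowns = B * (T - B)        # B erased v-symbols, each over F_q^(T-B)
    equations = (T - B) * B       # T-B clean combined parities, each over F_q^B
    packet = (T - B) + B + B      # fields v_i, u_i, and p_i + u_{i-T}
    rate = Fraction(T, packet)    # T source scalars per T+B channel scalars
    return unknowns, equations, rate

u, e, r = burst_code_counts(T=8, B=4)
assert u == e                     # the linear system is square
assert r == Fraction(8, 12) == Fraction(2, 3)   # matches C = T/(T+B)
```

The square system is necessary but not sufficient; the strongly-MDS property of the parities is what guarantees it is also invertible.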
Streaming Codes - Isolated Erasures: C(N ≥ 2, B, W)
T = 8
[Figure: packets x_i = (u_i, v_i, p_i + u_{i−8}) for i = 0, ..., 11.]
Erasures at time t = 0 and t = 8:
u_0 cannot be recovered, because the repetition code places its only copy of u_0 in packet x_8, which is also erased.
Proposed Approach: LayeringC(N ≥ 2, B, W)
[Figure: layered code — packets xi = (vi, ui, pi + ui−T) augmented with erasure-code parities q0..q11 of k symbols each; recovery of v0, v2 and u0, u2 illustrated]
Layered Code Design
Burst-Erasure Streaming Code C1: (ui, vi, pi + ui−T)
Erasure Code: qi = Σ_{t=1}^{M} ui−t · H_t^u, with qi ∈ F_q^k
C2: (ui, vi, pi + ui−T, qi)
R = T / (T + B + k), k = (N / (T − N + 1)) B
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 11/ 42
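The rate expression above can be evaluated directly; a small sketch in exact arithmetic (function name and the example parameters B, N are ours; T = 8 matches the figure):

```python
from fractions import Fraction

def midas_rate(T: int, B: int, N: int) -> Fraction:
    """Rate of the layered code C2: R = T / (T + B + k),
    where k = (N / (T - N + 1)) * B sizes the extra parities q_i."""
    k = Fraction(N, T - N + 1) * B
    return Fraction(T) / (T + B + k)

# Illustrative parameters:
print(midas_rate(8, 4, 2))  # 14/23
```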
Upper Bound
W ≥ T + 1
Periodic Erasure Channel with P = T + 1−N +B.
B erasures in each burst.
Guard Interval = T + 1−N .
Link pattern: · · · [B erasures] [T + 1 − N guard] [B erasures] [T + 1 − N guard] [B erasures] [T + 1 − N guard] · · ·
Show that any low-delay code recovers every symbol on this erasure channel.
R ≤ (T + 1 − N) / (T + 1 − N + B)
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 12/ 42
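The periodic-erasure converse above is just a ratio of guard slots to period length; a sketch (helper name ours):

```python
from fractions import Fraction

def rate_upper_bound(T: int, N: int, B: int) -> Fraction:
    """Converse from the periodic erasure channel: each period carries a
    burst of B erasures followed by a guard interval of T + 1 - N slots."""
    guard = T + 1 - N
    return Fraction(guard, guard + B)

# For N = 1 this recovers the classical burst-only bound T / (T + B):
assert rate_upper_bound(8, 1, 4) == Fraction(8, 8 + 4)
print(rate_upper_bound(12, 2, 9))  # 11/20
```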
Simulation Results
Gilbert-Elliott Channel (α, β) = (5 × 10−4, 0.5), T = 12 and R = 12/23
Gilbert-Elliott Channel
Good State: Pr(loss) = ε
Bad State: Pr(loss) = 1
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 13/ 42
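The two-state model just described is straightforward to simulate; a minimal sketch (parameter names ours; α = P(good→bad), β = P(bad→good), loss probability ε in the good state and 1 in the bad state):

```python
import random

def gilbert_elliott_losses(alpha: float, beta: float, eps: float,
                           n: int, seed: int = 0) -> list:
    """Simulate n channel uses of a two-state Gilbert-Elliott channel
    and return the 0/1 loss indicators."""
    rng = random.Random(seed)
    bad = False
    losses = []
    for _ in range(n):
        losses.append(1 if bad else int(rng.random() < eps))
        # State transition for the next channel use.
        bad = (rng.random() >= beta) if bad else (rng.random() < alpha)
    return losses

losses = gilbert_elliott_losses(5e-4, 0.5, 1e-3, 100_000)
print(sum(losses) / len(losses))  # near eps + alpha/(alpha+beta)
```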
[Plot: loss probability vs ε for the Gilbert-Elliott channel (α, β) = (5 × 10−4, 0.5), T = 12, Rate = 12/23 ≈ 0.5 — Uncoded, Maximally Short Code (N, B) = (1, 11), Strongly MDS Code (N, B) = (6, 6), MiDAS Code (N, B) = (2, 9)]

Code            N    B
Strongly MDS    6    6
MiDAS           2    9
Burst-Erasure   1   11
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 13/ 42
Simulation Results-II
Fritchman Channel (α, β) = (1e−5, 0.5), T = 40, R = 40/79, 9 states
[Figure: Fritchman chain with a "Good" state and error states E1, E2, …, EN−1, EN, connected by transition probabilities α and 1 − β]
α = 1e−5
β = 0.5
[Histogram: probability of occurrence vs burst length (5–40) for the 9-state Fritchman channel (α, β) = (1e−5, 0.5), analytical vs actual]
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 14/ 42
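Burst-length statistics like the histogram above are easy to reproduce by simulation; a sketch of one common Fritchman-style parameterization (the exact transition semantics here are our assumption, and the α used below is enlarged so the run is short):

```python
import random

def fritchman_bursts(alpha: float, beta: float, n_err: int,
                     steps: int, seed: int = 0) -> list:
    """Simulate burst lengths for a Fritchman-style chain (a sketch):
    from the good state, enter E1 w.p. alpha; from Ek, return to good
    w.p. beta, otherwise advance toward En (staying in En at the end).
    Every step spent in an error state is a lost packet."""
    rng = random.Random(seed)
    state, burst, bursts = 0, 0, []   # state 0 = good; 1..n_err = error chain
    for _ in range(steps):
        if state == 0:
            if burst:                 # a burst just ended
                bursts.append(burst)
                burst = 0
            if rng.random() < alpha:
                state = 1
        else:
            burst += 1
            state = 0 if rng.random() < beta else min(state + 1, n_err)
    return bursts

bursts = fritchman_bursts(alpha=0.01, beta=0.5, n_err=8, steps=200_000)
print(len(bursts), max(bursts))
```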
[Plot: loss probability vs ε for the 9-state Fritchman channel (α, β) = (1e−5, 0.5), T = 40 — Uncoded, RLC, SCo, E-RLC (shift = 32), E-RLC (shift = 36)]

Code            N    B
Strongly MDS    20   20
Burst Erasure   1    39
MiDAS-I         8    31
MiDAS-II        4    35
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 14/ 42
Simulation Results - III
Gilbert-Elliott Channel (α, β) = (5 × 10−5, 0.2), T = 50 and R ≈ 0.6
[Plot: loss probability vs ε, simulation length 10^8, T = 50, Rate = 50/88 ≈ 0.6 — MS (N, B) = (1, 33), S-BEC (N, B) = (20, 20), MiDAS (N, B) = (4, 30), PRC (N, B) = (4, 25)]

Code            N    B
Strongly MDS    20   20
Burst-Erasure   1    33
MiDAS           4    30
PRC             4    25
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 15/ 42
Streaming Codes — Burst plus Isolated Erasures
Badr-Khisti-Tan-Apostolopoulos (2013)
When do Earlier Constructions Fail?
[Figure: packets (vi, ui, pi + ui−T) with a burst at t ∈ [0, 2] and a later isolated erasure; the erased v0, u0..u2 are marked with ?]
Burst spans t ∈ [0, 2]
Isolated Erasure i ∈ [7, 10]
Interference from vi during the recovery of u0, …, u2
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 16/ 42
Streaming Codes — Burst plus Isolated Erasures
Badr-Khisti-Tan-Apostolopoulos (2013)
Layered Construction for Partial Recovery
[Figure: layered construction with T = 8, ∆ = 2, Teff = 6 — packets (vi, ui, pi + ui−(T−∆), ri) with extra parities r0..r11; recovery of v0, v1, v2 and v6 illustrated]
ri = Σ_{t=1}^{M} vi−t · H_t^v
Shift in Repetition Code: pi + ui−(T−∆)
Isolated Loss: v ≤ (∆ + 1) r
Burst Loss: v(B + 1) ≤ (T − ∆ − B)(u + r)
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 17/ 42
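The two counting conditions above can be packaged as a feasibility check on the per-packet symbol counts u, v, r (function name and the usage numbers are illustrative):

```python
def partial_recovery_feasible(u: int, v: int, r: int,
                              T: int, B: int, delta: int) -> bool:
    """Check the layered construction's counting conditions:
    isolated loss: v <= (delta + 1) * r
    burst loss:    v * (B + 1) <= (T - delta - B) * (u + r)"""
    return (v <= (delta + 1) * r and
            v * (B + 1) <= (T - delta - B) * (u + r))

print(partial_recovery_feasible(u=6, v=3, r=1, T=8, B=2, delta=2))  # True
```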
Streaming Codes — Burst plus Isolated Erasures
Badr-Khisti-Tan-Apostolopoulos (2013)
[Figure: layered construction with T = 8, ∆ = 2, Teff = 6 — packets (vi, ui, pi + ui−(T−∆), ri); recovery of v0, v1, v2 illustrated]
Code Parameters:
v = (T − ∆ + 1)(∆ − B − 1)
u = (B + 1)(T − ∆ + 1) − (∆ − B − 1)
r = ∆ − B − 1

Rate:
R = (u + v) / (2u + v + r) = [∆(T − ∆) + (B + 1)] / [∆(T − ∆) + (B + 1)(T − ∆ + 2)]

∆* = T + 1 − √(T − B)
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 18/ 42
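The closed-form parameters and the optimizing shift ∆* can be checked numerically; a sketch that rounds ∆* to its integer neighbours and keeps the better one (the rounding/clamping policy is our assumption):

```python
import math
from fractions import Fraction

def layered_rate(T: int, B: int, delta: int) -> Fraction:
    """Rate of the layered code using the slide's closed-form counts."""
    v = (T - delta + 1) * (delta - B - 1)
    u = (B + 1) * (T - delta + 1) - (delta - B - 1)
    r = delta - B - 1
    return Fraction(u + v, 2 * u + v + r)

def best_delta(T: int, B: int) -> int:
    """Round delta* = T + 1 - sqrt(T - B) and compare the two
    neighbouring integers, clamped to a valid range."""
    target = T + 1 - math.sqrt(T - B)
    candidates = {min(T, max(B + 2, d))
                  for d in (math.floor(target), math.ceil(target))}
    return max(candidates, key=lambda d: layered_rate(T, B, d))

print(best_delta(12, 4), layered_rate(12, 4, best_delta(12, 4)))  # 10 5/8
```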
Streaming Codes — Burst plus Isolated Erasures
Badr-Khisti-Tan-Apostolopoulos (2013)
Layering Principle
Start with a code for C(N, B, W).
Add additional parity-check symbols for more sophisticated patterns.
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 18/ 42
Unequal Source/Channel Rates
Source model: i.i.d. process with s[i] ∼ uniform over F_q^k
Streaming encoder: x[i, j] = f_{i,j}(s[0], s[1], …, s[i]) ∈ F_q^n
Macro-packet: X[i, :] = [x[i, 1] | … | x[i, M]]
Rate: R = H(s) / (n × M) = k / (n × M)
Packet erasure channel: erasure burst of at most B channel packets
Delay-constrained decoder: s[i] recovered by macro-packet i + T
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 19/ 42
Unequal Source/Channel Rates
Source-splitting based scheme
[Figure: source stream s1..s6 split into sub-symbols (si1, si2, si3), mapped to channel stream x1..x16 and received stream y1..y16, with delay T′ = 6]
Split s[i] into sub-symbols (s[i, 1], s[i, 2], …, s[i, M]).
Apply a streaming code for M = 1.
Decoding delay T′ = M · T.
R = MT / (MT + B)
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 20/ 42
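The source-splitting rate and the Strongly MDS rate are simple closed forms; a sketch evaluating them at the plot parameters M = 20, T = 5 (function names ours):

```python
from fractions import Fraction

def source_splitting_rate(M: int, T: int, B: int) -> Fraction:
    """Rate after splitting s[i] into M sub-symbols and applying the
    M = 1 streaming code with effective delay T' = M*T."""
    return Fraction(M * T, M * T + B)

def strongly_mds_rate(M: int, T: int, B: int) -> Fraction:
    """R = 1 - B / (M*(T + 1))."""
    return 1 - Fraction(B, M * (T + 1))

print(source_splitting_rate(20, 5, 40), strongly_mds_rate(20, 5, 40))  # 5/7 2/3
```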
Unequal Source/Channel Rates
Patil-Badr-Khisti-Tan 2013
[Plot: rate R vs burst length B (40–80) for M = 20, T = 5 — Optimal codes, SCo codes, MDP codes]

Source Splitting: R = MT / (MT + B)
Strongly MDS: R = 1 − B / (M(T + 1))
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 21/ 42
Capacity: Let B = bM + B′, where B′ ∈ {0, …, M − 1}.

C = T / (T + b)                            for 0 ≤ B′ ≤ (b / (T + b)) M
C = (M(T + b + 1) − B) / (M(T + b + 1))    for (b / (T + b)) M ≤ B′ ≤ M − 1
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 21/ 42
Optimal Code Construction I
Patil-Badr-Khisti-Tan 2013
Key Idea:
1 Apply the code for M = 1 on the complete source packet to generate the entire macro-packet.
2 Map/reshape the generated macro-packet into M individual channel-packets.
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 22/ 42
Optimal Code Construction II
Patil-Badr-Khisti-Tan 2013
1 Source splitting: split s[i] into two groups
s[i] = (s1[i], …, sk[i]) = (u1[i], …, uku[i], v1[i], …, vkv[i]), where uvec[i] = (u1[i], …, uku[i]) and vvec[i] = (v1[i], …, vkv[i])
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 23/ 42
Optimal Code Construction III
Patil-Badr-Khisti-Tan 2013
2 Parity generation:
Layer 1: a (kv + ku, kv, T) Strongly-MDS code applied to vvec[·], generating pvec[i]
Layer 2: repetition code on uvec with a shift of T
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 24/ 42
Optimal Code Construction IV
Patil-Badr-Khisti-Tan 2013
Overall combined parity: qvec[i] = pvec[i] + uvec[i − T]
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 25/ 42
Optimal Code Construction V
Patil-Badr-Khisti-Tan 2013
3 Mapping of macro-packet to individual channel-packets:
Map the generated macro-packet into M individual channel-packets.
Rate of the code = (ku + kv) / (2ku + kv)
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 26/ 42
Optimal Code Construction VI
Patil-Badr-Khisti-Tan 2013
Overall macro-packet structure:
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 27/ 42
Decoding Analysis
Patil-Badr-Khisti-Tan 2013
Key Fact: The worst-case burst pattern starts at the beginning of the macro-packet.
Let B = bM + B′. Two cases, depending on whether B′ ≶ (1 − R)M:

Case 1: burst only erases symbols from uvec[i + b]
Recovery of v symbols: (ku + kv) b = ku T
Optimal splitting ratio: ku / kv = b / (T − b)
R = T / (T + b)

Case 2: burst erases symbols from vvec[i + b]
Recovery of v symbols: (ku + kv) b + (B′n − ku) = ku T
Optimal splitting ratio: ku / kv = B / (M(T + b + 1) − 2B)
R = (M(T + b + 1) − B) / (M(T + b + 1))
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 28/ 42
Capacity Result

Theorem
For the streaming setup considered, with any M, T and B, the streaming capacity C is given by the following expression:

C = T / (T + b),                           B′ ≤ (b / (T + b)) M, T ≥ b
C = (M(T + b + 1) − B) / (M(T + b + 1)),   B′ > (b / (T + b)) M, T > b
C = (M − B′) / M,                          B′ > M/2, T = b
C = 0,                                     T < b

where the constants b and B′ are defined via B = bM + B′, B′ ∈ {0, 1, …, M − 1}, b ∈ N0.
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 29/ 42
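The piecewise capacity in the theorem translates directly into code; a sketch (function name ours; T ≥ 1 assumed to avoid the degenerate T = b = 0 case):

```python
from fractions import Fraction

def streaming_capacity(M: int, T: int, B: int) -> Fraction:
    """Piecewise streaming capacity from the theorem; B = b*M + B'."""
    b, Bp = divmod(B, M)
    if T < b:
        return Fraction(0)
    if T == b and Bp > Fraction(M, 2):
        return Fraction(M - Bp, M)
    if Bp <= Fraction(b, T + b) * M:
        return Fraction(T, T + b)
    return Fraction(M * (T + b + 1) - B, M * (T + b + 1))

print(streaming_capacity(20, 5, 40))  # 5/7
```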
Robust Extension for M = 1
Append an additional layer of parity checks containing a Strongly-MDS code on u
[Figure: packets xi = (vi, ui, pi + ui−T) augmented with Strongly-MDS parities q0..q11 of k symbols each]
R = (u + v) / (2u + v + k).
Very close to being optimal for k = (N / (T − N + 1)) B
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 30/ 42
Robust Extension for any M
Many possibilities for the placement of Strongly-MDS parities for u!

Approach I:
Repetition code replaced by a Strongly-MDS code for u
Rate of the code unchanged.
N = r is achievable.

Approach II:
Append an additional layer of Strongly-MDS code for u
R = (ku + kv) / (2ku + kv + ks).
ks chosen according to the given N.
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 31/ 42
Related Works
Burst Erasure Channel: Maximally Short Codes (Block Code + Interleaving), Martinian and Sundberg (IT-2004), Martinian and Trott (ISIT-2007)
Other Variations of Streaming Codes:
Burst and isolated loss patterns (Badr-Khisti-Tan-Apostolopoulos, ISIT 2013)
Unequal Source Channel Rates (Patil-Badr-Khisti-Tan, Asilomar 2013)
Multicast Extension (Khisti-Singh 2009, Badr-Lui-Khisti, Allerton 2010)
Parallel Channels (Lui-Badr-Khisti, CWIT 2011)
Multi-Source Streaming Codes (Lui Thesis, 2011)
Connections between Network Coding and Real-Time Streaming Codes (Tekin-Ho-Yao-Jaggi, ITA 2012; Leong and Ho, ISIT 2013)
Tree Codes: Schulman (IT 1996), Sahai (2001), Martinian and Wornell (Allerton 2004), Sukhavasi and Hassibi (2011)
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 32/ 42
Distance and Span Properties
Append an additional layer of parity checks containing a Strongly-MDS code on u
[Figure: packets xi = (vi, ui, pi + ui−T) augmented with Strongly-MDS parities q0..q11 of k symbols each]
R = (u + v) / (2u + v + k).
Very close to being optimal for k = (N / (T − N + 1)) B
MiDAS → (Near) Maximum Distance And Span tradeoff
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 33/ 42
Distance and Span Properties
Consider an (n, k, m) convolutional code: xi = Σ_{j=0}^{m} si−j Gj
[Figure: trellis diagrams over times 0–4 illustrating the free distance, the column distance in [0, 3], and the column span in [0, 3]]
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 34/ 42
Column Distance: dT

dT = min_{[s0,…,sT] : s0 ≠ 0} wt( [s0 … sT] ·
⎡ G0  G1  …  GT   ⎤
⎢ 0   G0  …  GT−1 ⎥
⎢ ⋮        ⋱  ⋮   ⎥
⎣ 0   …       G0  ⎦ )
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 34/ 42
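The definition above can be checked by exhaustive search for small codes; a brute-force sketch over GF(2) (exponential in T, purely illustrative, and the example generators are our choice, the standard (2, 1, 2) code with polynomials 1 + D² and 1 + D + D²):

```python
from itertools import product

def column_distance(G, T, k, n):
    """Brute-force column distance d_T over GF(2): minimum weight of the
    first T+1 output blocks over all inputs [s_0, ..., s_T] with s_0 != 0.
    G is the list of generator blocks [G_0, ..., G_m], each k x n over {0,1}."""
    m = len(G) - 1
    best = None
    for s in product(product((0, 1), repeat=k), repeat=T + 1):
        if not any(s[0]):
            continue  # the definition requires s_0 != 0
        weight = 0
        for i in range(T + 1):
            x = [0] * n  # output block x_i = sum_j s_{i-j} G_j (mod 2)
            for j in range(min(i, m) + 1):
                for a in range(k):
                    if s[i - j][a]:
                        for b in range(n):
                            x[b] ^= G[j][a][b]
            weight += sum(x)
        if best is None or weight < best:
            best = weight
    return best

G = [[[1, 1]], [[0, 1]], [[1, 1]]]  # generators 1 + D^2 and 1 + D + D^2
print([column_distance(G, T, 1, 2) for T in range(4)])  # [2, 3, 3, 4]
```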
Column Span: c_T

$c_T = \min_{\substack{[s_0,\ldots,s_T] \\ s_0 \neq 0}} \mathrm{span}\left([s_0 \;\cdots\; s_T]\begin{bmatrix} G_0 & G_1 & \cdots & G_T \\ 0 & G_0 & \cdots & G_{T-1} \\ \vdots & & \ddots & \vdots \\ 0 & \cdots & 0 & G_0 \end{bmatrix}\right)$
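The two definitions can be checked by brute force on a toy code. A minimal Python sketch, assuming a binary rate-1/2 code with G_0 = [1 1] and G_1 = [0 1] (an illustrative example, not a code from the talk); weight and span are counted in channel packets, matching the packet-erasure channel model:

```python
from itertools import product

def column_distance_and_span(G, T, k, n):
    """Brute-force d_T and c_T of a binary (n, k, m) convolutional code
    x_i = sum_{j=0}^{m} s_{i-j} G_j over GF(2), where G = [G_0, ..., G_m]
    and each G_j is a k x n binary matrix."""
    m = len(G) - 1
    best_d, best_c = None, None
    for bits in product([0, 1], repeat=k * (T + 1)):
        s = [bits[i * k:(i + 1) * k] for i in range(T + 1)]
        if not any(s[0]):                  # require s_0 != 0
            continue
        # Encode the channel packets x_0, ..., x_T.
        x = []
        for i in range(T + 1):
            pkt = [0] * n
            for j in range(m + 1):
                if i - j < 0:
                    continue
                for a in range(k):
                    if s[i - j][a]:
                        for b in range(n):
                            pkt[b] ^= G[j][a][b]
            x.append(pkt)
        nz = [i for i, pkt in enumerate(x) if any(pkt)]
        d = len(nz)                        # weight = number of nonzero packets
        c = nz[-1] - nz[0] + 1             # span of the nonzero packets
        best_d = d if best_d is None else min(best_d, d)
        best_c = c if best_c is None else min(best_c, c)
    return best_d, best_c

# Toy code (assumed example): G_0 = [[1, 1]], G_1 = [[0, 1]].
G = [[[1, 1]], [[0, 1]]]
print(column_distance_and_span(G, T=3, k=1, n=2))   # prints (2, 2)
```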
Column-Distance & Column-Span Tradeoff
Badr-Khisti-Tan-Apostolopoulos (2013)

Theorem
Consider a C(N, B, W) channel with delay T and W >= T + 1. A streaming code is feasible over this channel if and only if it satisfies d_T >= N + 1 and c_T >= B + 1.

Theorem
For any rate-R convolutional code and any T >= 0, the column distance d_T and column span c_T satisfy

( R / (1 - R) ) c_T + d_T <= T + 1 + 1 / (1 - R).

There exists a rate-R code (MiDAS code) over a sufficiently large field that satisfies

( R / (1 - R) ) c_T + d_T >= T + 1 / (1 - R).
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 35/ 42
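Setting c_T = B + 1 and d_T = N + 1 (the feasibility conditions from the first theorem) in the converse bound and solving for R yields a closed-form rate limit; the rearrangement below is straightforward algebra, not stated on the slide:

```python
from fractions import Fraction

def max_rate(N, B, T):
    # (R/(1-R))(B+1) + (N+1) <= T + 1 + 1/(1-R)
    # rearranges to R <= (T - N + 1) / (T - N + 1 + B).
    return Fraction(T - N + 1, T - N + 1 + B)

def satisfies_converse(R, dT, cT, T):
    # The converse bound from the second theorem.
    return (R / (1 - R)) * cT + dT <= T + 1 + 1 / (1 - R)

N, B, T = 2, 4, 7                                  # illustrative parameters
R = max_rate(N, B, T)
print(R, satisfies_converse(R, N + 1, B + 1, T))   # prints 3/5 True
```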
Multicast (Low-Delay) Codes

[Figure: a source-stream encoder produces X[i], sent over a burst-erasure broadcast channel with burst lengths B1 and B2 to Decoder 1 (delay T1) and Decoder 2 (delay T2).]

Motivation
B1 < B2
Receiver 1: good channel state
Receiver 2: weaker channel state
Delay adapts to channel state
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 36/ 42
Channel-Adaptive Delay

Example: B1 = 2, T1 = 4, B2 = 3 and T2 = 5.

[Figure: channel packets x_0, ..., x_14 with two erasure bursts of lengths B1 and B2; the output stream s_0, s_1, ... recovers the symbols behind the first burst with delay T1 and those behind the second burst with delay T2.]

A burst of length B1 results in a delay of T1.
A burst of length B2 results in a delay of T2.
Stretch factor: s = (T2 - B1) / (T1 - B1)
Tradeoff between s and Pr(loss).
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 37/ 42
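For the slide's example, the stretch factor evaluates as follows (a trivial sketch of the formula above):

```python
from fractions import Fraction

def stretch_factor(B1, T1, T2):
    # s = (T2 - B1) / (T1 - B1), as defined on the slide.
    return Fraction(T2 - B1, T1 - B1)

# Slide's example: B1 = 2, T1 = 4, T2 = 5.
print(stretch_factor(B1=2, T1=4, T2=5))   # prints 3/2
```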
Diversity Embedded Streaming Codes: DE-SCo
Badr-Khisti-Martinian 2011

[Figure: same multicast setup — source-stream encoder, burst-erasure broadcast channel with bursts B1 and B2, Decoder 1 with delay T1 and Decoder 2 with delay T2.]

Theorem
There exists a {(B1, T1), (B2, T2)} DE-SCo construction of rate R = T1 / (T1 + B1) provided

T2 >= T2* = (B2 / B1) (T1 + B1).

This code has polynomial-time encoding and decoding complexity.
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 38/ 42
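A small sketch of the theorem's quantities; note that the expression for T2* is reconstructed from a garbled slide and should be treated as an assumption:

```python
from fractions import Fraction

def desco_rate(B1, T1):
    # The construction preserves the single-user SCo rate T1/(T1 + B1).
    return Fraction(T1, T1 + B1)

def min_second_delay(B1, B2, T1):
    # T2* = (B2/B1)(T1 + B1) -- reconstructed, treat as an assumption.
    return Fraction(B2, B1) * (T1 + B1)

B1, T1, B2 = 2, 4, 3                                      # illustrative values
print(desco_rate(B1, T1), min_second_delay(B1, B2, T1))   # prints 2/3 9
```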
Multicast (Low Delay) Capacity

[Figure: same multicast setup as above.]

Capacity Function
Capacity function C(T1, T2, B1, B2)

Single-user upper bound: C <= min( T1 / (T1 + B1), T2 / (T2 + B2) )

Concatenation lower bound: C >= 1 / ( 1 + B1/T1 + B2/T2 )
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 39/ 42
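The two bounds can be evaluated with exact rational arithmetic; a sketch with parameter values borrowed from the earlier channel-adaptive-delay example (an illustrative choice):

```python
from fractions import Fraction

def upper_bound(B1, T1, B2, T2):
    # Single-user upper bound from the slide.
    return min(Fraction(T1, T1 + B1), Fraction(T2, T2 + B2))

def lower_bound(B1, T1, B2, T2):
    # Concatenation lower bound from the slide.
    return 1 / (1 + Fraction(B1, T1) + Fraction(B2, T2))

B1, T1, B2, T2 = 2, 4, 3, 5
print(lower_bound(B1, T1, B2, T2), upper_bound(B1, T1, B2, T2))   # prints 10/21 5/8
```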
Multicast (Low Delay) Capacity
Badr-Khisti-Lui 2011

Assume w.l.o.g. B2 >= B1.

[Figure: capacity region in the (T1, T2) plane, partitioned into regions A-E by the lines T2 = T1, T2 = T1 + B2, and the DE-SCo line T2 = (B2/B1)(T1 + B1); the SCo point is also marked. A table lists the capacity in each region; the recoverable entries include T1/(T1 + B1), T1/(T1 + B2) and T2/(T2 + B2), while the capacity in region E is open (??).]
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 40/ 42
Other Extensions
Multiple Erasure Bursts (Li-Khisti-Girod 2011) - Interleaved Low-Delay Codes
Multiple Links (Lui-Badr-Khisti 2011) - Layered coding for burst erasure channels
Multiple Source Streams with Different Decoding Delays (Lui 2011) - Embedded Codes
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 41/ 42
Conclusions
Error Correction Codes for Real-Time Streaming
Deterministic Channel Models C(N,B,W )
Tradeoff between achievable N and B
MiDAS Constructions
Column-Distance and Column-Span Tradeoff
Partial Recovery Codes for Burst + Isolated Erasures
Unequal Source-Channel Rates
Sequential and Adaptive Information Theory Workshop McGill University Nov 7, 2013 42/ 42