Asymptotically good binary code with efficient encoding & Justesen code


1

Asymptotically good binary code with efficient encoding & Justesen code

Tomer Levinboim

Error Correcting Codes Seminar (2008)

2

Outline

Intro: codes, Singleton Bound

Linear Codes

Bounds: Gilbert-Varshamov, Hamming

RS codes

Code Concatenation: examples, Wozencraft Ensemble, Justesen Codes

3

Hamming Distance

The Hamming distance between x, y ∈ Σ^n is

H(x, y) = |{ i : x_i ≠ y_i }|

The Hamming distance is a metric:
Non-negative: H(x, y) ≥ 0, and H(x, y) = 0 ⟺ x = y
Symmetric
Triangle inequality
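To make the definition concrete, here is a small Python sketch (mine, not part of the lecture) that computes the Hamming distance and brute-force checks the metric axioms on all binary strings of length 4:

from itertools import product

def hamming(x, y):
    # number of coordinates where x and y differ
    assert len(x) == len(y)
    return sum(1 for a, b in zip(x, y) if a != b)

words = list(product("01", repeat=4))
for x, y, z in product(words, repeat=3):
    assert hamming(x, y) >= 0                              # non-negative
    assert (hamming(x, y) == 0) == (x == y)                # zero iff equal
    assert hamming(x, y) == hamming(y, x)                  # symmetric
    assert hamming(x, z) <= hamming(x, y) + hamming(y, z)  # triangle inequality
print("metric axioms verified on {0,1}^4")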

4

Weight

The weight of x ∈ Σ^n is wt(x) = H(x, 0^n).

Example (on board)

5

Code

An (n,k,d)_q code C is a function C : Σ^k → Σ^n with |Σ| = q such that for every x, y ∈ Σ^k with x ≠ y:

H(C(x), C(y)) ≥ d

6

Code (parameters)

(n,k,d)_q

Parameters:
n – block length
k – information length
d – minimum distance (actually, a lower bound)
q – size of the alphabet

|C| = q^k, i.e. k = log_q |C|

7

Code (parameters div n)

Asymptotic view of the parameters as n → ∞:

The rate R = k/n
The relative minimum distance δ = d/n

Thus an (n,k,d)_q code can be written as a (1, R, δ)_q code.

Notation: (n,k,d)_q vs. [n,k,d]_q – the latter is reserved for linear codes (soon)

8

Trivial Code Example

FEC3 = write each bit three times. R = ? d = ?

How many errors can we detect? d − 1. Correct? t, where d = 2t + 1.
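A minimal Python sketch (my own, following the slide's definition) of FEC3: each bit is written three times, and decoding takes a majority vote inside every block of three, which corrects one error per block:

def fec3_encode(bits):
    # repeat every bit three times -> rate 1/3, distance 3
    return [b for b in bits for _ in range(3)]

def fec3_decode(received):
    # majority vote inside each block of three
    out = []
    for i in range(0, len(received), 3):
        block = received[i:i + 3]
        out.append(1 if sum(block) >= 2 else 0)
    return out

msg = [1, 0, 1, 1]
cw = fec3_encode(msg)
cw[4] ^= 1                      # flip one bit inside the second block
assert fec3_decode(cw) == msg   # a single error per block is corrected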

9

Goal

Would like to:
Maximize δ – correct more errors
Maximize R – send more information
Minimize q – for practical reasons
Maximize the number of codewords while minimizing n and keeping d large

* Conflicting goals – we would like to be able to construct an [n,k,d]_q code s.t. δ > 0, R > 0 and both are constant.

10

Singleton Bound

Let C be an [n,k,d]_q code. Then k ≤ n − d + 1

equivalently R ≤ 1 – δ + o(1)

Proof: project C to the first k−1 coordinates – since |C| = q^k > q^(k−1), two distinct codewords must agree there, so d ≤ n − (k−1). (On board)

11

Visual intuition

On board: Ball_q(x, r)

r := d and r := t (where d = 2t + 1)

Vol_q(n, r) = |Ball_q(x, r)|

12

Linear Codes

13

Linear Codes

An [n,k,d]_q code C : F_q^k → F_q^n is linear when:
F_q is a field
C is a linear function (e.g., given by a matrix)

Linearity implies:
C(ax + by) = aC(x) + bC(y)
0^n is a codeword of C

14

Linear Codes (example)

FEC3 is a [3,1,3]_2 code

Hadamard – longest linear code: [n, log n, n/2]_2
e.g., [8,3,4]_2 (matrix representation of H on board)

Dimensions

Asymptotic behavior
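A hedged Python sketch (mine, not the on-board matrix) of the Hadamard code: the codeword of a message m lists the inner products <m, x> over all x in F_2^k; k = 3 gives the [8,3,4]_2 code mentioned above:

from itertools import product

def hadamard_encode(m):
    k = len(m)
    # one output bit per x in F_2^k: the inner product <m, x> mod 2
    return [sum(a & b for a, b in zip(m, x)) % 2 for x in product((0, 1), repeat=k)]

codewords = [hadamard_encode(m) for m in product((0, 1), repeat=3)]
# for a linear code, minimum distance = minimum weight of a nonzero codeword
dmin = min(sum(c) for c in codewords if any(c))
print(len(codewords[0]), dmin)   # 8 4  ->  the [8,3,4]_2 code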

15

Linear Codes – minimum distance

Lemma: if C : F_q^k → F_q^n is linear, then

min_{x ≠ x', x,x' ∈ F_q^k} H(C_x, C_x') = min_{x ≠ 0} H(C_x, 0^n) = min_{x ≠ 0} wt(C_x)

Note: for clarity, C_x means C(x).

Proof:
≤ – trivial
≥ – follows from linearity (on board)

16

Reed-Solomon code

Idea: oversample a polynomial.

Let q be a prime power and F_q a finite field of size q. Let k < n ≤ q and fix n distinct elements x_1, x_2, ..., x_n of F_q.

Given a message m = (c_0, ..., c_{k−1}), interpret it as the coefficients of the polynomial

p(x) = Σ_{i=0}^{k−1} c_i x^i

17

RS Codes

Thus (c_0, ..., c_{k−1}) is mapped to (p(x_1), ..., p(x_n)).

This is a linear mapping (Vandermonde matrix).

Using linearity, one can show that for x ≠ 0:

wt(C_x) ≥ n − (k − 1)

(a polynomial of degree k − 1 has at most k − 1 roots)

RS meets the Singleton bound. Proof: on board.

Encoding time?
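A minimal Python sketch of RS encoding by polynomial evaluation (my own toy parameters: the prime field F_13, n = 10, k = 4); the final check illustrates that two distinct codewords differ in at least n − k + 1 positions:

q = 13                       # a prime, so the integers mod q form a field
n, k = 10, 4                 # k < n <= q
xs = list(range(1, n + 1))   # n fixed, distinct evaluation points in F_q

def rs_encode(msg):
    assert len(msg) == k
    def p(x):                # evaluate p(x) = sum_i c_i x^i mod q (Horner's rule)
        acc = 0
        for c in reversed(msg):
            acc = (acc * x + c) % q
        return acc
    return [p(x) for x in xs]

c1 = rs_encode([1, 2, 3, 4])
c2 = rs_encode([1, 2, 3, 5])
dist = sum(a != b for a, b in zip(c1, c2))
assert dist >= n - k + 1     # RS meets the Singleton bound: d = n - k + 1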

18

Bounds

19

Gilbert-Varshamov Bound Preliminaries

Binary entropy:

H_2(p) = p log(1/p) + (1 − p) log(1/(1 − p))

Stirling:

n! ≈ (n/e)^n (up to factors polynomial in n)

Implying that (again up to factors polynomial in n):

(n choose k) = n! / (k!(n−k)!) ≈ (n/k)^k · (n/(n−k))^(n−k) = 2^(n·H_2(k/n))

20

Gilbert-Varshamov Bound Preliminaries

Using the binary entropy we obtain

On board

log_2 (n choose d) = n·H_2(d/n) − O(log n)
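A quick numerical sanity check of this estimate (my own, in Python): the gap between log_2 (n choose k) and n·H_2(k/n) grows only logarithmically in n:

from math import comb, log2

def H2(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

for n in (50, 200, 1000):
    k = n // 3
    exact = log2(comb(n, k))
    approx = n * H2(k / n)
    print(n, round(exact, 1), round(approx, 1), round(approx - exact, 1))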

21

Gilbert-Varshamov Boundbound statement

For every n and d < n/2 there is an (n,k,d)_q (not necessarily linear) code such that:

k ≥ n·(1 − H_2(d/n)) − log_2 n

In terms of rate and relative minimum distance:

R ≥ 1 − H_2(δ) − o(1)

22

Gilbert-Varshamov Bound Proof

Sketch of proof (on board):

If C is maximal then

F_q^n ⊆ ∪_{c ∈ C} Ball_q(c, d − 1)

and

|Ball_q(c, d − 1)| = Σ_{j=0}^{d−1} (n choose j)·(q − 1)^j

Now use the union bound and the entropy estimate to obtain the result (we show it for q = 2, using the binary entropy).

23

GV-Bound

Gilbert proved this with a greedy construction

Varshamov proved it for linear codes using random generator matrices – most matrices give good error-correcting codes.
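A toy Python version of the greedy argument (my own sketch, exponential time, for illustration only): keep any word at distance ≥ d from everything kept so far; the resulting maximal code has at least 2^n / Vol_2(n, d − 1) codewords:

from itertools import product

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def gilbert_greedy(n, d):
    code = []
    for w in product((0, 1), repeat=n):
        if all(hamming(w, c) >= d for c in code):
            code.append(w)
    return code

C = gilbert_greedy(n=8, d=3)
# GV guarantee: |C| >= 2^8 / Vol_2(8, 2) = 256 / 37, i.e. at least 7 codewords
print(len(C))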

24

Singleton / GV Plot

[Plot of R vs. δ: Singleton bound (upper curve), Gilbert-Varshamov bound (lower curve).]

25

Hamming Bound (Upper)

With similar reasoning to the GV bound, but using the fact that the balls of radius (d−1)/2 around codewords are disjoint:

∪_{c ∈ C} Ball_q(c, (d−1)/2) ⊆ F_q^n

For q = 2 one can show that

R ≤ 1 − H_2(δ/2) + o(1)

26

Bounds plot

*Madhu Sudan (Lecture 5, 2001)

27

Code Concatenation

28

Code Concatenation - Motivation

RS codes imply we can construct good [n,k,d]_q codes for any prime power q = p^k.

Practically, we would like to work with a small q (e.g., 2 or 2^8).

Consider the "obvious" idea for a binary code generated from C: simply convert each symbol of a codeword in Σ^n into log_2 q bits.

What's the problem with this approach? (Write down the new code!)

29

Code Concatenation

Due to Forney (1966) Two codes:

Outer: C_out = [N,K,D]_Q

Inner: C_in = [n,k,d]_q

The inner code encodes each symbol of the outer code, so k = log_q Q.

30

Code Concatenation

How does it work ?

* Luca Trevisan (Lecture 2)

31

Code Concatenation

What is the new code ?

d_con = dD. Proof: on board.

[N,K,D]_Q ∘ [n,k,d]_q = [nN, kK, dD]_q
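A toy Python sketch of the mechanics (my own made-up inner and outer codes, not RS or Hadamard): every symbol of the outer codeword is fed through the inner encoder:

def inner_encode(sym):                          # toy inner code [6, 2, 3]_2:
    bits = [(sym >> 1) & 1, sym & 1]            # take the 2 bits of the symbol
    return [b for b in bits for _ in range(3)]  # and write each three times

def outer_encode(msg):                          # toy outer code [3K, K, 3]_4:
    return msg + msg + msg                      # repeat the whole message three times

def concat_encode(msg):
    return [b for sym in outer_encode(msg) for b in inner_encode(sym)]

cw = concat_encode([3, 1])
print(len(cw))    # nN = 6*6 = 36 bits; the distances multiply: d_con >= 3*3 = 9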

32

Code Concatenation (Examples)

RS[n, n/2, n/2]_n ∘ Hadamard[n, log n, n/2]_2 = [n², (n·log n)/2, n²/4]_2

Asymptotically δ = 1/4, but R = log n / (2n) → 0

33

Good Codes

Can we "explicitly" build asymptotically good (linear) codes?
Asymptotically good = constant R, δ > 0 as n → ∞
Explicit = polytime / logspace constructible

34

Asymptotically Good Codes

35

Asymptotically Good Codes

GV tells us that most linear functions of a certain size are good error-correcting codes, so a good code can be found by brute force.

Use brute force on the inner code, where the alphabet is exponentially smaller!

Do we really need to search?

36

Wozencraft Ensemble

Consider the following set of codes, each of rate R = 1/2:

{ C_α : F_{q^k} → F_{q^k}² | α ∈ F_{q^k}* },  C_α(x) = (x, α·x)

(viewed as codes of length n = 2k over F_q)

Notice that (on board):
1) If x ≠ y then C_α(x) ≠ C_α(y)
2) For α ≠ β: Im(C_α) ∩ Im(C_β) = {0}

37

Wozencraft Ensemble

Lemma: There exists an ensemble of codes C_1, ..., C_N of rate 1/2, where N = q^k − 1, such that for at least (1 − ε)N values of i the code C_i has distance d_i with

d_i ≥ H_q^{-1}(1/2 − ε) · n

Proof (on board), outline:
Different codes have only 0^n in common.
Let y = C_α(x); if wt(y) < d then y ∈ Ball(0^n, d), so there are at most Vol(n, d) "bad" codes.
For large enough n = 2k, we have Vol(n, d) ≤ εN.
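A hedged toy realization in Python (my own choices: q = 2, k = 4, GF(16) built from the irreducible polynomial x^4 + x + 1): build every C_alpha(x) = (x, alpha·x) and list its minimum distance. For such a small k the lemma's guarantee is weak, but the mechanics are the same:

K, MOD = 4, 0b10011               # GF(16) = GF(2)[x] / (x^4 + x + 1)

def gf_mul(a, b):                 # carry-less multiplication modulo MOD
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & (1 << K):
            a ^= MOD
        b >>= 1
    return r

def wt(v):                        # Hamming weight of a k-bit vector
    return bin(v).count("1")

dists = []
for alpha in range(1, 1 << K):    # alpha ranges over GF(16)*, so N = 2^k - 1 codes
    # minimum distance of the linear code C_alpha = min weight of (x, alpha*x), x != 0
    d = min(wt(x) + wt(gf_mul(alpha, x)) for x in range(1, 1 << K))
    dists.append(d)
print(sorted(dists))              # distances of the N = 15 rate-1/2 codes of length 8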

38

Wozencraft Ensemble

Implications:
Can construct the entire ensemble in time O(2^k) = O(2^n).
There are many such good codes, but which one do we use?

39

Justesen Code

Concatenation of:
C_out – an RS code over F_{q^k}
a set of inner codes {C_α}, α ∈ F_{q^k}* – the Wozencraft ensemble

Justesen code: C* = C_out(C_1, C_2, ..., C_N)
Each symbol of C_out is encoded using a different inner code C_j.

If RS has rate R, then C* has rate R/2.
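A hedged toy Python sketch of the whole construction (my own parameters: q = 2, k = 4, GF(16) via x^4 + x + 1, the outer RS code evaluated at all 15 nonzero field elements, K_out = 3); the j-th RS symbol is encoded with the Wozencraft code C_alpha_j, where alpha_j is the j-th evaluation point:

K, MOD = 4, 0b10011                     # GF(16) = GF(2)[x] / (x^4 + x + 1)

def gf_mul(a, b):                       # multiplication in GF(16)
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & (1 << K):
            a ^= MOD
        b >>= 1
    return r

def rs_encode(msg, points):             # evaluate p(x) = sum_i c_i x^i over GF(16)
    out = []
    for x in points:
        acc = 0
        for c in reversed(msg):         # Horner's rule; addition in GF(2^k) is XOR
            acc = gf_mul(acc, x) ^ c
        out.append(acc)
    return out

def to_bits(v):
    return [(v >> i) & 1 for i in range(K)]

def justesen_encode(msg):
    points = list(range(1, 1 << K))     # N = 15 outer positions / inner codes
    bits = []
    for alpha, sym in zip(points, rs_encode(msg, points)):
        bits += to_bits(sym) + to_bits(gf_mul(alpha, sym))   # C_alpha(sym) = (sym, alpha*sym)
    return bits

cw = justesen_encode([1, 5, 7])         # K_out = 3: overall rate 3/(2*15) = 0.1
print(len(cw))                          # binary length 2*k*N = 120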

40

Justesen Code - δ

Denote the outer RS code [N,K,D]_Q.

Claim: C* has relative distance

δ* ≥ (1 − R − ε) · H_q^{-1}(1/2 − ε)

41

Justesen Code Proof

Intuition: like regular concatenation, but with εN bad inner codes.

For x ≠ y, the outer code induces S = {j | x_j ≠ y_j} with |S| ≥ D.
There are at most εN indices j for which C_j is bad, so at least |S| − εN ≥ D − εN ≥ (1 − R − ε)N positions use good codes (since RS gives D = N − (K − 1)).
Each good code has distance ≥ d, hence d* ≥ (1 − R − ε)N·d.

42

Justesen Code

The concatenated code C* is an asymptotically good code and has a “super” explicit construction

Can take q=2 to get such a binary code