Reliability-Based SD Decoding

Post on 19-Jan-2018



1

Reliability-Based SD Decoding
• Not applicable only to graph-based codes
• May even benefit from some algebraic structure
• An SD alternative to trellis decoding and iterative decoding
• But ML soft decoding is hard to implement for general codes
• Performance: somewhere between HD and full ML SD

2

Correlation Discrepancy
• Send binary codeword (v_0, ..., v_{n-1})
• Modulate into bipolar (c_0, ..., c_{n-1}), with v_i = 1 → c_i = +1 and v_i = 0 → c_i = −1
• Receive real vector (r_0, ..., r_{n-1})
• P(r_i | v_i) = K e^{−(r_i − c_i)² / N_0}
• P(r_i | v_i = 1) / P(r_i | v_i = 0) = K e^{−(r_i − 1)² / N_0} / K e^{−(r_i + 1)² / N_0}
• log( P(r_i | v_i = 1) / P(r_i | v_i = 0) ) = 4 r_i / N_0 ∝ r_i
• Receive r. Decode to the codeword c that minimizes
  Σ_i (r_i − c_i)² = Σ_i r_i² + n − 2 Σ_i r_i·c_i
• Equivalently, maximize the correlation m(r, v) = Σ_i r_i·c_i
  = Σ_i |r_i| − 2 Σ_{i : r_i·c_i < 0} |r_i|
• Equivalently, minimize the correlation discrepancy λ(r, v) = Σ_{i : r_i·c_i < 0} |r_i|
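The equivalence between correlation and correlation discrepancy is easy to check numerically. A minimal Python sketch; the vectors below are made-up illustrative values, not from the slides:

```python
# Numerical check of the identity
#   m(r, v) = sum_i r_i*c_i = sum_i |r_i| - 2 * sum_{i: r_i*c_i < 0} |r_i|.

v = [1, 0, 1, 1, 0]                    # binary codeword bits (illustrative)
c = [1 if b == 1 else -1 for b in v]   # bipolar modulation: 1 -> +1, 0 -> -1
r = [0.9, 0.2, -1.1, 0.4, -0.7]        # noisy received vector (illustrative)

m = sum(ri * ci for ri, ci in zip(r, c))                    # correlation
lam = sum(abs(ri) for ri, ci in zip(r, c) if ri * ci < 0)   # discrepancy

# The two metrics differ only by a constant that does not depend on v:
assert abs(m - (sum(abs(ri) for ri in r) - 2 * lam)) < 1e-9
print(f"correlation m = {m:.2f}, discrepancy lambda = {lam:.2f}")
```

Maximizing m(r, v) and minimizing λ(r, v) therefore select the same codeword, since Σ_i |r_i| is fixed by r.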

3

Reliability measures and decoding
• Consider the received vector r
• For each received symbol r_i, form
  • the hard decision z_i
  • the reliability |log( P(r_i | v_i = 1) / P(r_i | v_i = 0) )| ∝ |r_i|
• As can be expected, z_i is more likely to be in error when |r_i| is small:

4

Probability of errors in z_i: LRP vs MRP

5

Reliability and decoding: LRP
• Decoding based on the set of Least Reliable Positions (LRPs):
• Assume that errors are more likely to occur in the LRPs
• Select a set E of error patterns e, confined to the LRPs
• For each e ∈ E, form the modified received vector z+e
• Decode z+e into a codeword c(e) ∈ C, using an efficient algebraic decoder
• The preceding steps give a list of candidate codewords. The final decoding step is to compare each of these codewords with r, and select the one closest in squared Euclidean distance
• Performance: depends on |E|
• Complexity: depends on |E| and on the algebraic decoder
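The steps above can be sketched at toy scale. The Python below uses the (7,4) Hamming code with single-error syndrome decoding as the "efficient algebraic decoder"; the code choice, the number of LRPs, and the received vector used in testing are illustrative assumptions, not part of the slides.

```python
import itertools

# Parity-check matrix of the (7,4) Hamming code: column i is the binary
# representation of i+1, so the syndrome directly names the error position.
H = [[(i + 1) >> b & 1 for i in range(7)] for b in range(3)]

def hamming_decode(word):
    """Errors-only algebraic decoder: corrects any single bit error."""
    syndrome = [sum(H[b][i] * word[i] for i in range(7)) % 2 for b in range(3)]
    pos = syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2]  # 0 means no error
    out = list(word)
    if pos:
        out[pos - 1] ^= 1
    return out

def lrp_decode(r, num_lrp=3):
    """LRP list decoding: try all error patterns confined to the LRPs."""
    z = [1 if ri >= 0 else 0 for ri in r]                  # hard decisions
    lrp = sorted(range(7), key=lambda i: abs(r[i]))[:num_lrp]  # least reliable
    best, best_dist = None, float("inf")
    for flips in itertools.chain.from_iterable(
            itertools.combinations(lrp, k) for k in range(num_lrp + 1)):
        test = list(z)
        for i in flips:                                    # apply pattern e
            test[i] ^= 1
        cand = hamming_decode(test)                        # candidate c(e)
        c = [1 if b else -1 for b in cand]
        dist = sum((ri - ci) ** 2 for ri, ci in zip(r, c))
        if dist < best_dist:
            best, best_dist = cand, dist
    return best
```

With e.g. r = [-0.9, 0.3, -1.2, 0.1, -0.8, -1.0, 0.2] (all-zero sent, three hard-decision errors, all in low-reliability positions), the list contains the all-zero codeword and it wins on squared Euclidean distance, even though three errors exceed the algebraic decoder's radius.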

6

Reliability and decoding: MRP
• Decoding based on the set of Most Reliable Positions (MRPs):
• Assume that errors are less likely to occur in the MRPs
• Select a set I of k independent MRPs (MRIPs)
• Select a set E of error patterns e, with 1s confined to the k MRIPs
• For each e ∈ E, form the modified information vector z_k + e and encode it into a codeword c(e) ∈ C
• The preceding steps give a list of candidate codewords. The final decoding step is to compare each of these codewords with r, and select the one closest in squared Euclidean distance
• Performance and complexity: depend on |E|
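The MRIP re-encoding step can be sketched on the same toy scale. The Python below again assumes the (7,4) Hamming code; because the code is tiny, "encoding from the k MRIPs" is done by lookup in the full codebook and independence is checked by brute force, both stand-ins for the GF(2) Gaussian elimination a real implementation would use. All concrete values are illustrative assumptions.

```python
import itertools

# Systematic generator matrix of a (7,4) Hamming code (illustrative choice).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
codebook = [tuple(sum(m[j] * G[j][i] for j in range(4)) % 2 for i in range(7))
            for m in itertools.product([0, 1], repeat=4)]

def independent(S):
    """Positions S are independent iff they take all 2^|S| value patterns."""
    return len({tuple(c[i] for i in S) for c in codebook}) == 2 ** len(S)

def mrp_decode(r, max_weight=1):
    """MRP list decoding: re-encode z (+ low-weight patterns) from 4 MRIPs."""
    z = [1 if ri >= 0 else 0 for ri in r]                  # hard decisions
    order = sorted(range(7), key=lambda i: -abs(r[i]))     # most reliable first
    S = []
    for i in order:                     # greedily collect k = 4 MRIPs
        if independent(S + [i]):
            S.append(i)
        if len(S) == 4:
            break
    best, best_dist = None, float("inf")
    for flips in itertools.chain.from_iterable(
            itertools.combinations(S, k) for k in range(max_weight + 1)):
        target = [z[i] ^ (i in flips) for i in S]          # z_k + e on MRIPs
        cand = next(c for c in codebook if [c[i] for i in S] == target)
        cvec = [1 if b else -1 for b in cand]
        dist = sum((ri - ci) ** 2 for ri, ci in zip(r, cvec))
        if dist < best_dist:
            best, best_dist = list(cand), dist
    return best
```

Note the greedy step may skip a highly reliable position whose column is linearly dependent on those already chosen; that is exactly why the slide asks for *independent* MRPs.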

7

Condition on optimality
• In both of the preceding algorithms, whenever we find a codeword that is good enough, we can terminate the process.
• What do we mean by good enough? We need an optimality condition.

8

Condition on optimality (cont.)

• D0(v) = {i : v_i = z_i, 0 ≤ i < n},  D1(v) = {i : v_i ≠ z_i, 0 ≤ i < n}
• n(v) = |D1(v)| = d_H(v, z)
• λ(r, v) = Σ_{i : r_i·c_i < 0} |r_i| = Σ_{i ∈ D1(v)} |r_i|
• We want to find the codeword v* with the lowest correlation discrepancy:
  λ(r, v*) = min_{v ∈ C} { λ(r, v) }
• For a candidate v, the minimum over the remaining codewords, min_{v' ∈ C, v' ≠ v} { λ(r, v') }, is hard to evaluate, but we can hope to find a lower bound on it
• Let D0^(j)(v) consist of the j elements of D0(v) with lowest reliability |r_i|
• Let w_i be the i-th nonzero weight in the code's weight distribution (so w_1 = d_min)

9

Condition on optimality (cont.)
• Thus D0^(w_j − n(v))(v) consists of the w_j − n(v) elements of D0(v) with lowest reliability |r_i|
• Theorem: If λ(r, v) ≤ Σ_{i ∈ D0^(w_j − n(v))(v)} |r_i|, then the ML codeword for r is at a Hamming distance less than w_j from v.
• Proof: Assume that v' is a codeword such that d_H(v, v') ≥ w_j. Then
  λ(r, v') = Σ_{i ∈ D1(v')} |r_i| ≥ Σ_{i ∈ D0(v) ∩ D1(v')} |r_i| ≥ Σ_{i ∈ D0^(w_j − n(v))(v)} |r_i| ≥ λ(r, v),
  where the last step uses the hypothesis of the theorem; so no such v' can beat v.
• The middle step holds because |D0(v) ∩ D1(v')| ≥ w_j − n(v):
  • |D0(v) ∩ D1(v')| + |D1(v) ∩ D0(v')| = d_H(v, v') ≥ w_j
  • |D1(v')| ≥ |D0(v) ∩ D1(v')| ≥ w_j − |D1(v) ∩ D0(v')| ≥ w_j − |D1(v)| = w_j − n(v)

10

Corollary:
• If λ(r, v) ≤ Σ_{i ∈ D0^(w_1 − n(v))(v)} |r_i|, then v is the ML codeword.
• If λ(r, v) ≤ Σ_{i ∈ D0^(w_2 − n(v))(v)} |r_i|, then either v is the ML codeword, or the ML codeword is one of the nearest neighbours of v.
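The corollary translates directly into a termination test for the list decoders. A minimal Python sketch, with w1 standing for d_min; the received vectors used in testing are made-up illustrative values:

```python
def is_ml_certified(r, v, w1):
    """Sufficient (not necessary) optimality test from the corollary.

    r: received real vector; v: candidate binary codeword; w1: d_min.
    Returns True only when the bound certifies that v is the ML codeword.
    """
    z = [1 if ri >= 0 else 0 for ri in r]             # hard decisions
    D1 = [i for i in range(len(r)) if v[i] != z[i]]
    D0 = [i for i in range(len(r)) if v[i] == z[i]]
    lam = sum(abs(r[i]) for i in D1)                  # correlation discrepancy
    j = w1 - len(D1)                                  # w1 - n(v)
    if j <= 0:
        return False                                  # bound gives no information
    bound = sum(sorted(abs(r[i]) for i in D0)[:j])    # j least reliable in D0(v)
    return lam <= bound
```

When the test returns True, the search can stop early; when it returns False, v may still be the ML codeword, only uncertified.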

11

Optimality condition based on two observed codewords
• A more sophisticated criterion can be applied if we know more than one codeword "close" to r
• We skip the details.

12

Generalized Minimum Distance Decoding

• Forney (1966)
• Based on erasure decoding:
• Consider two codewords c and c' such that d = d_H(c, c'). Assume that c is sent.
• Assume an errors-and-erasures channel producing t errors and e erasures
• Then an ML decoder will decode to c if 2t + e ≤ d − 1
• GMD decoding considers all possible patterns of up to d_min − 1 erasures in the d_min − 1 LRPs

13

GMD Decoding (on an AWGN channel)
• From r, derive the HD word z and the reliability word |r|.
• Produce a list of ⌊(d_min + 1)/2⌋ partly erased words by erasing
  • If d_min is even: the least reliable, the three least reliable, ..., the d_min − 1 least reliable positions
  • If d_min is odd: no bit; the two least reliable, the four least reliable, ..., the d_min − 1 least reliable positions
• For each partly erased word in the list, decode using an (algebraic) errors-and-erasures decoding algorithm
• Compute the SD metric for each decoded word w.r.t. r. Select the one closest to r.
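The erasure schedule described above is easy to generate programmatically. A Python sketch of the pattern generation only; the errors-and-erasures decoder itself is assumed to exist elsewhere, and the function name is illustrative:

```python
def gmd_erasure_sets(reliab, dmin):
    """Position sets to erase, one per GMD trial.

    reliab: the reliability word |r|.  For even dmin the trials erase
    1, 3, ..., dmin-1 positions; for odd dmin they erase 0, 2, ..., dmin-1
    positions, always taking the least reliable positions first.
    """
    order = sorted(range(len(reliab)), key=lambda i: reliab[i])  # LRPs first
    start = 1 if dmin % 2 == 0 else 0
    return [order[:e] for e in range(start, dmin, 2)]
```

Each returned set marks the erased coordinates of one partly erased word; both parities yield ⌊(d_min + 1)/2⌋ trials, matching the slide.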

14

The Chase algorithms: 1
• From r, derive the HD word z and the reliability word |r|
• Produce the set E of all error patterns of weight exactly ⌊d_min/2⌋
• For each e ∈ E, decode z+e using an (algebraic) errors-only decoding algorithm
• Compute the SD metric for each decoded word w.r.t. r. Select the one closest to r.
• Better performance, but very high complexity

15

The Chase algorithms: 2
• From r, derive the HD word z and the reliability word |r|
• Produce a set E of 2^⌊d_min/2⌋ test error patterns: all possible error patterns confined to the ⌊d_min/2⌋ LRPs
• For each e ∈ E, decode z+e using an (algebraic) errors-only decoding algorithm
• Compute the SD metric for each decoded word w.r.t. r. Select the one closest to r.
• Better performance, but higher complexity than Chase-3
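The Chase-2 test set can be sketched as follows (pattern generation only; the function name and the reliability vector in the test are illustrative assumptions):

```python
import itertools

def chase2_patterns(reliab, dmin):
    """All 2^(dmin//2) test error patterns confined to the dmin//2 LRPs."""
    t = dmin // 2
    lrp = sorted(range(len(reliab)), key=lambda i: reliab[i])[:t]  # LRPs
    patterns = []
    for bits in itertools.product([0, 1], repeat=t):  # every 0/1 assignment
        e = [0] * len(reliab)
        for pos, b in zip(lrp, bits):
            e[pos] = b
        patterns.append(e)
    return patterns
```

Each pattern e is then added to the hard-decision word z before the errors-only algebraic decoding step.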

16

The Chase algorithms: 3
• From r, derive the HD word z and the reliability word |r|
• Produce a list of ⌊(d_min + 1)/2⌋ modified words by complementing
  • If d_min is even: no bit; the least reliable, the three least reliable, ..., the d_min − 1 least reliable positions
  • If d_min is odd: no bit; the two least reliable, the four least reliable, ..., the d_min − 1 least reliable positions
• For each modified word in the list, decode using an (algebraic) errors-only decoding algorithm
• Compute the SD metric for each decoded word w.r.t. r. Select the one closest to r.

17

Generalized Chase and GMD decoding
• Generalizations differ in how the test error patterns are chosen
• Choose a number a ∈ {1, 2, ..., ⌊d_min/2⌋}
• Algorithm A(a) uses the error set E(a), consisting of the following 2^(a−1)·(⌊(d_min + 1)/2⌋ − a + 1) vectors (for even d_min):
  • All 2^(a−1) error patterns confined to the a − 1 LRPs
  • For each of the preceding error patterns, also complement the next i positions, for i = 0, 1, 3, ..., d_min − 2a + 1
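Following the bullet enumeration (with the i = 0 case included, which is what makes A(1) reproduce Chase-3 and A(⌊d_min/2⌋) reproduce Chase-2; the closed-form count on the slide appears to omit it), E(a) can be generated as below. The function name and reliability values are illustrative, and even d_min is assumed, as on the slide:

```python
import itertools

def gen_chase_patterns(reliab, dmin, a):
    """Error set E(a) for the generalized Chase algorithm A(a), even dmin.

    Base patterns: all 2^(a-1) patterns on the a-1 least reliable positions;
    each is extended by complementing the next i LRPs, i = 0,1,3,...,dmin-2a+1.
    """
    order = sorted(range(len(reliab)), key=lambda i: reliab[i])  # LRPs first
    i_values = [0] + list(range(1, dmin - 2 * a + 2, 2))
    patterns = set()
    for base in itertools.product([0, 1], repeat=a - 1):
        for i in i_values:
            e = [0] * len(reliab)
            for pos, b in zip(order[:a - 1], base):      # base pattern
                e[pos] = b
            for pos in order[a - 1:a - 1 + i]:           # complement next i LRPs
                e[pos] = 1
            patterns.add(tuple(e))
    return sorted(patterns)
```

For d_min = 8 this yields 5 patterns at a = 1 (the Chase-3 list) and 16 patterns at a = 4 (the Chase-2 set 2^{d_min/2}), consistent with the equivalences stated on the next slide.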

18

The Generalized Chase algorithm A(a)
• From r, derive the HD word z and the reliability word |r|
• Generate the error patterns in E(a) (in likelihood order?)
• For each modified word z+e, decode using an (algebraic) errors-only decoding algorithm
• Compute the SD metric for each decoded word w.r.t. r. Select the one closest to r.
• -----
• A(1): Chase-3
• A(⌊d_min/2⌋): Chase-2

19

The Generalized Chase algorithm Ae(a)

• Similar to A(a), but uses the error set Ee(a), formed by (for even d_min):
  • All 2^(a−1) error patterns confined to the a − 1 LRPs
  • For each of the preceding error patterns, also erase the next i positions, for i = 0, 1, 3, ..., d_min − 2a + 1
• Decode with an errors-and-erasures decoder
• ---
• Ae(1): the basic GMD decoder
• The generalized algorithms achieve bounded-distance decoding: if the received sequence is within Euclidean distance sqrt(d_min) of the transmitted word, then decoding is correct.
• The variants are similar to each other in basic properties (performance, complexity)

20

Performance curves: (64,42,8) RM code

21

Performance curves: (127,64,21) code

22

Suggested exercises

• 10.2, 10.3, 10.4, 10.5, 10.1