Turbo Principle and SISO (Soft Input Soft Output) Decoding


Introduction · Turbo Principle · SISO (Soft Input Soft Output) · Example of a product code

Achieving channel capacity ...

Shannon says this is possible ... but how?

Make use of the gap between the source rate and the channel capacity: a coding scheme.

Claude Shannon, 1953

A coding and decoding scheme can be found that corrects all transmission errors, provided the information rate is less than or equal to the channel capacity.

J.-M. Brossier Turbo codes.


Coding and decoding

Turbo coding

Serial encoding

[Diagram: the data u enters Code 1; an interleaved copy of u enters Code 2.]

Two short systematic codes are used to build a large code:

- Data: u
- Redundancy of the first coder: p
- Redundancy of the second coder: q
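As a concrete illustration, the serial structure above can be sketched in a few lines of Python. The component code (a toy parity check), the block length, and the fixed permutation are illustrative assumptions, not the codes used in the lecture:

```python
def interleave(u, perm):
    """Reorder the data bits with a fixed permutation (the interleaver)."""
    return [u[i] for i in perm]

def component_encode(bits):
    """Toy systematic component code: one parity bit per pair of data bits.
    A stand-in for a real convolutional or algebraic component code."""
    return [bits[i] ^ bits[i + 1] for i in range(0, len(bits) - 1, 2)]

def turbo_encode(u, perm):
    p = component_encode(u)                    # redundancy of the first coder
    q = component_encode(interleave(u, perm))  # redundancy of the second coder
    return u, p, q                             # transmitted: data + both redundancies

u = [1, 0, 1, 1, 0, 0]
perm = [3, 0, 5, 1, 4, 2]                      # illustrative interleaver
data, p, q = turbo_encode(u, perm)
```

Because both component codes are systematic, the data u is transmitted once, together with the two redundancy streams p and q; the interleaver ensures the two codes see the data in different orders.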


Turbo decoding

Iterative Decoding Scheme

[Diagram: Decoder 1 and Decoder 2 exchange extrinsic information E; each receives the data u and its own redundancy (p or q) and produces the decision D.]

First iteration

The two decoders provide a first estimation of the transmitted symbols.

Each decoder transmits its output to the input of the other one for the second iteration.


Second iteration

Using the outputs computed at the first iteration, the two decoders provide a second estimation of the transmitted symbols.

The same sequence of operations is applied for all iterations ...
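The alternating scheme above can be made concrete with a toy product code whose rows and columns are single-parity-check (SPC) codes: the row decoder and the column decoder each compute extrinsic LLRs that serve as a priori information for the other. The SPC component codes, the all-zeros codeword, and the hand-picked channel LLRs are simplifying assumptions for this sketch:

```python
import math

def spc_extrinsic(llrs):
    """SISO decoding of a single-parity-check code (even parity):
    the extrinsic LLR of bit i is 2*atanh(prod_{j != i} tanh(L_j / 2))."""
    out = []
    for i in range(len(llrs)):
        t = 1.0
        for j, l in enumerate(llrs):
            if j != i:
                t *= math.tanh(l / 2.0)
        t = max(min(t, 1.0 - 1e-12), -1.0 + 1e-12)  # keep atanh finite
        out.append(2.0 * math.atanh(t))
    return out

def iterate(ch, n_iter=4):
    """Rows play decoder 1, columns play decoder 2; each feeds the other."""
    n = len(ch)
    ext_row = [[0.0] * n for _ in range(n)]
    ext_col = [[0.0] * n for _ in range(n)]
    for _ in range(n_iter):
        for r in range(n):   # decoder 1: channel + extrinsic from decoder 2
            ext_row[r] = spc_extrinsic([ch[r][c] + ext_col[r][c] for c in range(n)])
        for c in range(n):   # decoder 2: channel + extrinsic from decoder 1
            col = spc_extrinsic([ch[r][c] + ext_row[r][c] for r in range(n)])
            for r in range(n):
                ext_col[r][c] = col[r]
    # final soft decision: channel value plus both extrinsic contributions
    return [[ch[r][c] + ext_row[r][c] + ext_col[r][c] for c in range(n)]
            for r in range(n)]

# All-zeros codeword sent (BPSK +1); the sample at (1,1) is badly corrupted.
ch = [[2.1, 1.8, 2.5],
      [1.9, -0.7, 2.2],
      [2.4, 2.0, 1.7]]
final = iterate(ch)
```

After a few iterations the corrupted bit's combined LLR turns positive: the row and column decoders have talked each other into the right decision, which is the essence of the turbo principle.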


Definition of soft information, how to use it? · Convolutional codes · Block codes

Soft information

What is soft information?

A log-likelihood ratio

Example: the Additive White Gaussian Noise Channel

Its output is given by r = x + b, with x = ±1 and b a zero-mean Gaussian random variable with variance σ².

LLR (Log-Likelihood Ratio):

\[
\log \frac{p(r \mid +1)}{p(r \mid -1)}
= \log \frac{\frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(r-1)^2}{2\sigma^2}\right)}
            {\frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(r+1)^2}{2\sigma^2}\right)}
= \frac{2r}{\sigma^2}
\]

Interpretation:

The sign of the LLR is a hard decision; its magnitude indicates the reliability of this decision.
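For the AWGN channel above, the LLR reduces to a simple scaling of the received sample, which a short computation makes explicit (σ and the received value r below are example numbers):

```python
def awgn_llr(r, sigma):
    """LLR of x = +/-1 observed as r = x + b, b ~ N(0, sigma^2): 2r / sigma^2."""
    return 2.0 * r / sigma ** 2

sigma = 0.8
r = 0.9                         # example received sample
llr = awgn_llr(r, sigma)        # 2 * 0.9 / 0.64, about 2.81
hard = +1 if llr > 0 else -1    # the sign is the hard decision
reliability = abs(llr)          # the magnitude is its reliability
```

Note that a small noise variance (a good channel) inflates the magnitude of the LLR: the same received sample is considered more reliable.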


Modeling the decoder input

Transmitted codeword: X = (X1, · · · , Xn)

Soft received sequence of values: R = (LLR1, · · · , LLRn), where the LLRi are log-likelihood ratios.

Iteration 1: the information is hard or soft.

Following iterations: only soft information.

Decomposition of the information given by the LLRs:

Hard decision: the sign of the LLR provides a hard decision: Yi = sgn [LLRi]

Reliability: the magnitude of the LLR provides its reliability: αi = |LLRi|
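This decomposition can be written directly; the numeric LLRs below are arbitrary examples:

```python
def decompose(llrs):
    """Split each LLR into a hard decision Y_i = sgn(LLR_i), written +/-1,
    and a reliability alpha_i = |LLR_i|."""
    Y = [1 if l >= 0 else -1 for l in llrs]
    alpha = [abs(l) for l in llrs]
    return Y, alpha

R = [2.3, -0.4, 1.1, -3.0]      # example soft received sequence
Y, alpha = decompose(R)
```

The pair (Y, alpha) is exactly what the decoders of the next slides consume: Y feeds a hard decoder, and alpha weights each position in the analog metrics.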


Several kinds of decoders

Notations

Vector of errors: \(Z^m = Y \oplus X^m\)

Weight of errors: \(W(Z^m) = \sum_{i=1}^{n} Z_i^m\)

Analog weight (soft): \(W_\alpha(Z^m) = \sum_{i=1}^{n} \alpha_i Z_i^m\)

Incomplete Decoder (Hard input)

It only uses the hard information.

It gives the word \(X^m = (X_1^m, \cdots, X_n^m)\) whose Hamming distance to \(Y = (Y_1, \cdots, Y_n)\) is minimum.

A word is found if \(W(Z^m) \le \lfloor (d_{\min}-1)/2 \rfloor\); otherwise no word is found.

The decision is correct if the number of errors is at most \(\lfloor (d_{\min}-1)/2 \rfloor\).
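A brute-force sketch of this bounded-distance (incomplete) decoding, on a toy two-word codebook with d_min = 4 (the codebook is an illustrative assumption):

```python
CODEBOOK = [(0, 0, 0, 0), (1, 1, 1, 1)]      # toy code, d_min = 4
T = (4 - 1) // 2                             # correction radius floor((d_min - 1)/2) = 1

def hamming(a, b):
    """Hamming distance between two words."""
    return sum(x != y for x, y in zip(a, b))

def incomplete_decode(Y):
    """Return the codeword within Hamming radius T of Y,
    else None ('no word found')."""
    best = min(CODEBOOK, key=lambda c: hamming(c, Y))
    return best if hamming(best, Y) <= T else None
```

For example, (0, 1, 0, 0) has a single error and is corrected, while (1, 1, 0, 0) lies at distance 2 from both codewords, beyond the radius, so decoding fails.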


Complete Decoder (Soft input)

It uses the whole information.

Complete decoder: \(\min_m W(Y \oplus X^m)\) can provide a codeword even if the number of errors is greater than \(\lfloor (d_{\min}-1)/2 \rfloor\).

Complete soft decoder: \(\min_m W_\alpha(Y \oplus X^m)\), with the analog weight \(W_\alpha(Z^m)\).
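The complete soft decoder is a one-line minimisation of the analog weight. On the same kind of toy codebook it can recover a word that bounded-distance decoding would reject; the codebook and the reliability values are illustrative:

```python
def soft_decode(codebook, Y, alpha):
    """Complete soft decoder: pick the codeword minimising
    W_alpha(Z) = sum_i alpha_i * Z_i, where Z = Y xor X^m."""
    def w_alpha(c):
        return sum(a for a, y, x in zip(alpha, Y, c) if y != x)
    return min(codebook, key=w_alpha)

codebook = [(0, 0, 0, 0), (1, 1, 1, 1)]   # d_min = 4, hard radius 1
Y = (1, 1, 0, 0)                          # two hard errors: beyond the radius
alpha = (0.2, 0.3, 2.5, 2.1)              # but the flipped bits are unreliable
best = soft_decode(codebook, Y, alpha)
```

The two erroneous positions carry small reliabilities, so the analog weight of the all-zeros word (0.5) beats that of the all-ones word (4.6), and the correct codeword is returned despite two hard errors.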


SISO Convolutional

Soft input

The Viterbi algorithm is able to use soft inputs: it only needs to use a Euclidean metric.

Soft output

The Viterbi algorithm must be modified:

Soft Output Viterbi Algorithm (SOVA)

Idea: keep more than a single path; the difference between the metrics of the two best paths is an indication of the reliability of the decision.
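The soft-input part is easy to demonstrate. Below is a small Viterbi decoder for a 2-state rate-1/2 systematic convolutional code (the generators (1, 1+D) are an illustrative choice) whose branch metric is the squared Euclidean distance to the received soft values. The SOVA modification, which keeps the metric difference of the two best merging paths as a reliability, is deliberately not implemented here:

```python
def encode(bits):
    """Rate-1/2 systematic convolutional code with generators (1, 1+D)."""
    s, out = 0, []
    for u in bits:
        out += [u, u ^ s]        # systematic bit, then parity u + previous u
        s = u
    return out

def bpsk(bits):
    """Map bit 0 -> +1.0, bit 1 -> -1.0."""
    return [1.0 - 2.0 * b for b in bits]

def viterbi_soft(rx):
    """Viterbi decoding with a Euclidean branch metric on soft inputs."""
    n = len(rx) // 2
    INF = float("inf")
    metric = {0: 0.0, 1: INF}                 # encoder starts in state 0
    paths = {0: [], 1: []}
    for t in range(n):
        r = rx[2 * t: 2 * t + 2]
        new_metric = {0: INF, 1: INF}
        new_paths = {0: [], 1: []}
        for s in (0, 1):
            if metric[s] == INF:
                continue
            for u in (0, 1):                  # branch from state s with input u
                label = bpsk([u, u ^ s])
                m = metric[s] + sum((a - b) ** 2 for a, b in zip(r, label))
                if m < new_metric[u]:         # next state is u (memory 1)
                    new_metric[u] = m
                    new_paths[u] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    best = min((0, 1), key=lambda s: metric[s])
    return paths[best]

msg = [1, 0, 1, 1, 0]
decoded = viterbi_soft(bpsk(encode(msg)))
```

Because the branch metric is Euclidean, noisy real-valued samples can be fed in directly, with no intermediate hard decision.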


Block turbo codes

Soft input - Chase Algorithm

If only hard decisions are known: vector R is received, and a hard version Y of R is usable by a standard algebraic decoder. An algebraic decoder provides a codeword XA.

For an incomplete decoder, the procedure stops here.

But, if reliabilities are known, it is possible to improve the estimation:

- Find the weak positions (weak LLRs).
- Modify Y at these positions and produce a small set of decoded codewords using this set of decisions about R.
- Select the codeword whose analog distance to R is minimum.
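These three steps can be sketched on a toy codebook; the two-word code, the radius-1 hard decoder standing in for the algebraic decoder, and the number of weak positions are illustrative assumptions:

```python
from itertools import combinations

CODEBOOK = [(0, 0, 0, 0), (1, 1, 1, 1)]       # toy code, d_min = 4

def hard_decode(Y):
    """Stand-in algebraic decoder: bounded-distance decoding, radius 1."""
    for c in CODEBOOK:
        if sum(a != b for a, b in zip(c, Y)) <= 1:
            return c
    return None

def chase_decode(R, n_weak=2):
    """Flip subsets of the least reliable positions of the hard decision Y,
    hard-decode every test pattern, keep the codeword closest to R."""
    Y = [0 if r >= 0 else 1 for r in R]       # BPSK: bit 0 sent as +1
    alpha = [abs(r) for r in R]
    weak = sorted(range(len(R)), key=lambda i: alpha[i])[:n_weak]
    candidates = set()
    for k in range(n_weak + 1):
        for flips in combinations(weak, k):
            test = list(Y)
            for i in flips:
                test[i] ^= 1
            c = hard_decode(test)
            if c is not None:
                candidates.add(c)
    def analog_dist(c):                       # Euclidean distance to R
        return sum((r - (1.0 - 2.0 * b)) ** 2 for r, b in zip(R, c))
    return min(candidates, key=analog_dist) if candidates else None
```

With R = [-0.1, -0.2, 0.8, 0.9] the plain hard decoder fails (two errors), but flipping the two least reliable positions produces a test pattern that decodes to the transmitted all-zeros codeword.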


Block turbo codes

Soft outputs - Pyndiah Algorithm

Aim: computation of the reliabilities

\[
\Lambda(d_j) = \log \frac{P(a_j = +1 \mid R)}{P(a_j = -1 \mid R)}
\]

Reliabilities

\(S_j^{\pm 1}\) is the set of codewords \(C^i\) with \(c_j^i = \pm 1\), and

\[
P(a_j = \pm 1 \mid R) = \sum_{C^i \in S_j^{\pm 1}} P\left(E = C^i \mid R\right)
\]


Assuming equiprobable codewords, Bayes' rule gives

\[
\Lambda(d_j) = \log \frac{\sum_{C^i \in S_j^{+1}} P\left(R \mid E = C^i\right)}
                         {\sum_{C^i \in S_j^{-1}} P\left(R \mid E = C^i\right)}
\qquad \text{with} \qquad
P\left(R \mid E = C^i\right) = \left(\frac{1}{\sqrt{2\pi}\,\sigma}\right)^{n}
\exp\!\left(-\frac{\left|R - C^i\right|^2}{2\sigma^2}\right)
\]


A good approximation of the reliabilities is given by:

\[
\Lambda(d_j) \approx \frac{1}{2\sigma^2}
\left( \left| R - C^{-1}(j) \right|^2 - \left| R - C^{+1}(j) \right|^2 \right)
\]

where \(C^{\pm 1}(j)\) is the word in \(S_j^{\pm 1}\) (the set of codewords with \(c_j^i = \pm 1\)) whose Euclidean distance to R is minimum.
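The gap between the exact reliabilities and this approximation can be checked by brute force on a small code. The (3, 2) single-parity-check code, the noise variance, and the received vector below are illustrative assumptions:

```python
import math

CODEWORDS = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # (3,2) parity check

def bpsk(cw):
    """Map bit 0 -> +1.0, bit 1 -> -1.0."""
    return tuple(1.0 - 2.0 * b for b in cw)

def d2(R, cw):
    """Squared Euclidean distance |R - C|^2."""
    return sum((r - c) ** 2 for r, c in zip(R, bpsk(cw)))

def exact_llr(R, j, sigma):
    """Lambda(d_j) from the full sums over S_j^{+1} and S_j^{-1}."""
    num = sum(math.exp(-d2(R, cw) / (2 * sigma ** 2))
              for cw in CODEWORDS if cw[j] == 0)   # bit 0 is sent as +1
    den = sum(math.exp(-d2(R, cw) / (2 * sigma ** 2))
              for cw in CODEWORDS if cw[j] == 1)
    return math.log(num / den)

def approx_llr(R, j, sigma):
    """Keep only the closest codeword on each side (the approximation above)."""
    plus = min(d2(R, cw) for cw in CODEWORDS if cw[j] == 0)
    minus = min(d2(R, cw) for cw in CODEWORDS if cw[j] == 1)
    return (minus - plus) / (2 * sigma ** 2)

R = (0.8, 0.6, 1.1)          # noisy observation of the all-zeros word
e = exact_llr(R, 0, 1.0)
a = approx_llr(R, 0, 1.0)
```

The approximation replaces each sum of exponentials by its dominant term, so it agrees in sign with the exact reliability and stays close to it whenever one codeword per side dominates.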
