
DCSP-6: Signal Transmission + information theory

Jianfeng Feng

Department of Computer Science Warwick Univ., UK

Jianfeng.feng@warwick.ac.uk

http://www.dcs.warwick.ac.uk/~feng/dcsp.html


How to deal with noise? How to transmit signals?


Errors in digital communication

We noted earlier that one of the most important advantages of digital communications is that it permits very high fidelity.

We shall consider in detail only BPSK systems, and comment on the alternative modulations.

In the absence of noise, the signal V from a BPSK system can take one of two values, +vb or -vb.

In the ideal case, if the signal is greater than 0, the value is assigned 1.


If the signal is less than 0, the value that is read is assigned 0.

When noise is present, this distinction between +vb and -vb (with the threshold at 0) becomes blurred.

There is a finite probability of the signal dropping below 0, and thus being read as 0, even though a 1 was transmitted. When this happens, we say that a bit error has occurred.

The probability that a bit error will occur in a given time is referred to as the bit-error rate (BER).


We suppose that the signal V, which has signal levels +vb or -vb, is received in the presence of noise N of variance σ².

The probability that an error will occur in the transmission of a 1 is the probability that the noise drives the received value below the decision threshold at 0.
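Assuming N is zero-mean Gaussian with variance σ² (an assumption; the noise distribution is not restated at this point), this probability evaluates to the standard result

P(error | 1 sent) = P(vb + N < 0) = P(N < -vb) = Q(vb / σ) = (1/2) erfc( vb / (σ √2) ),

where Q(·) denotes the tail probability of the standard normal distribution.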


[Figure: a transmitted bit stream 1 1 1 1 1 1 1 1 1 is corrupted by noise (1 + N); thresholding the noisy signal at 0 gives the received stream 0 0 1 1 1 1 1 1 1, i.e. the first two bits are in error.]


It is usual to write these expressions in terms of the ratio of (energy per bit) to (noise power per unit Hz), Eb/N0.

The power S in the signal is, on average, vb², and the total energy in the signalling period T is vb²T.
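A rough simulation sketch of the bit-error behaviour described above, written in Python; the vb = 1 normalisation, the Eb/N0 values, and the comparison against the textbook expression Q(sqrt(2 Eb/N0)) = (1/2) erfc(sqrt(Eb/N0)) are illustrative assumptions, not figures taken from the slides.

import numpy as np
from math import erfc, sqrt

def bpsk_ber_sim(eb_n0_db, n_bits=200_000, seed=0):
    # Estimate the BPSK bit-error rate over an AWGN channel by simulation.
    rng = np.random.default_rng(seed)
    eb_n0 = 10 ** (eb_n0_db / 10)           # dB -> linear ratio
    bits = rng.integers(0, 2, n_bits)       # random message bits
    symbols = 2 * bits - 1                  # 0 -> -vb, 1 -> +vb, with vb = 1
    sigma = sqrt(1 / (2 * eb_n0))           # with Eb = 1, the noise variance is N0/2
    received = symbols + rng.normal(0.0, sigma, n_bits)
    decisions = (received > 0).astype(int)  # threshold at 0
    return float(np.mean(decisions != bits))

def bpsk_ber_theory(eb_n0_db):
    # Theoretical BPSK bit-error rate: Q(sqrt(2 Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0)).
    return 0.5 * erfc(sqrt(10 ** (eb_n0_db / 10)))

for db in (0, 2, 4, 6, 8):
    print(f"{db} dB: simulated {bpsk_ber_sim(db):.4f}, theory {bpsk_ber_theory(db):.4f}")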


Timing control

In addition to providing the analogue modulation and demodulation functions, digital communication also requires timing control.

That it is necessary to introduce a clock in signal transmission is obvious if we look at the figure.


Timing control is required to identify the rate at which bits are transmitted and to identify the start and end of each bit.

This permits the receiver to correctly identify each bit in the transmitted message.

Bits are never sent individually.


They are grouped together in segments, called blocks.

A block is the minimum segment of data that can be sent with each transmission.

Usually, a message will contain many such blocks.


Each block is framed by binary characters identifying the start and end of the block.

The type of method used depends on the source of the timing information.

If the timing in the receiver is generated by the receiver, separately from the transmitter, the transmission is termed asynchronous.

If the timing is generated, directly or indirectly, from the transmitter clock, the transmission is termed synchronous.


Asynchronous transmission is used for low data-rate transmission and stand-alone equipment.

We will not discuss it in detail here.

Synchronous transmission is used for high data rate transmission.

The timing is generated by sending a separate clock signal, or embedding the timing information into the transmission.

This information is used to synchronize the receiver circuitry to the transmitter clock.


Synchronous receivers require a timing signal from the transmitter.

An additional channel may be used in the system to transmit the clock signal.

This is wasteful of bandwidth, and it is more customary to embed the timing signal within the transmitted data stream by use of suitable encoding (self-clocking encoding).


Bipolar Coding

A binary 0 is encoded as zero volts.

A binary 1 is encoded alternately as a positive voltage and a negative voltage.
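A minimal sketch of this encoding rule in Python (it is sometimes called alternate mark inversion, AMI); the function name and the ±1 volt levels are illustrative choices, not taken from the slides.

def bipolar_encode(bits, v=1.0):
    # Bipolar line coding: 0 -> 0 volts, 1 -> alternately +v and -v.
    levels = []
    polarity = +1                     # the first '1' is sent as +v
    for b in bits:
        if b == 0:
            levels.append(0.0)
        else:
            levels.append(polarity * v)
            polarity = -polarity      # alternate the sign for the next '1'
    return levels

print(bipolar_encode([1, 0, 1, 1, 0, 1]))   # [1.0, 0.0, -1.0, 1.0, 0.0, -1.0]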


Ethernet


Manchester code (also known as Phase Encoding, or PE) is a form of data encoding in which each bit of data is signified by at least one voltage-level transition.

Manchester encoding is therefore considered to be self-clocking, which means that accurate synchronization of a data stream is possible.

Each bit is transmitted over a predefined time period.


Summary:

• Data and clock signals are combined to form a single self-synchronizing data stream

• each encoded bit contains a transition at the midpoint of a bit period

• the direction of transition determines whether the bit is a "0" or a "1," and

• the first half is the true bit value and the second half is the complement of the true bit value (see the sketch below).
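A minimal Python sketch of the rule summarised above; the ±1 volt levels and the function name are illustrative assumptions.

def manchester_encode(bits, v=1.0):
    # Each bit occupies two half-periods: the first half carries the true bit
    # value and the second half carries its complement, so every bit period
    # contains a mid-bit transition that the receiver can use for clock recovery.
    half_periods = []
    for b in bits:
        first = v if b == 1 else -v
        half_periods.extend([first, -first])
    return half_periods

print(manchester_encode([1, 0, 1, 1]))
# [1.0, -1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0]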


Applications

• Data transmission is the conveyance of any kind of information from one place to another. Historically this could be done by courier, a chain of bonfires, and later by Morse code over copper wires.

• In recent computer terms, it means sending a stream of bits or bytes from one location to another using any number of technologies, such as copper wire, optical fiber, laser, radio, or infra-red light.

Practical examples include moving data from one data storage device to another, or accessing a website, which involves data transfer from web servers to a user's browser.

• A related concept to data transmission is the data transmission protocol used to make the data transfer legible.


Protocol

• A protocol is an agreed-upon format for transmitting data between two devices, e.g. a computer and a printer.

All communications between devices require that the devices agree on the format of the data. The set of rules defining a format is called a protocol.

• The protocol determines the following:

• the type of error checking to be used, if any, e.g. a check digit (and what type / formula is to be used)

• the data compression method, if any, e.g. zipped files if the file is large, as with transfers across the Internet, LANs and WANs


• how the sending device will indicate that it has finished sending a message, e.g. a spare wire may be used, or, for serial (USB) transfer, start and stop digits may be used (see the sketch after this list)

• how the receiving device will indicate that it has received a message

• rate of transmission (in baud or bit rate)

• whether transmission is to be synchronous or asynchronous

• In addition, protocols can include sophisticated techniques for detecting and recovering from transmission errors and for encoding and decoding data.
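As an illustration of how start and stop digits can delimit a unit of data in asynchronous transfer, the Python sketch below frames one byte in the common 8N1 format (one start bit, eight data bits, one stop bit); this specific format is an assumption for illustration, not something specified on the slides.

def frame_byte_8n1(byte):
    # One start bit (0), eight data bits sent least-significant-bit first,
    # and one stop bit (1).
    data_bits = [(byte >> i) & 1 for i in range(8)]
    return [0] + data_bits + [1]

print(frame_byte_8n1(0x41))   # ASCII 'A' -> [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]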


• Introduction

• Fourier Transform I

• Fourier Transform II

• ASK, FSK, and PSK

• Noise

• Signal Transmission

(Weeks 2-4)


• Data transmission: channel characteristics, signalling methods, interference and noise, synchronisation, data compression and encryption;


• Data transmission (covered above);

• Information Sources and Coding: information theory, coding of information for efficiency and error protection;


Information and coding theory

Information theory is concerned with the description of information sources, the representation of the information from a source, and the transmission of this information over a channel.


This might be the best example to demonstrate how a deep mathematical theory could be successfully applied to solving engineering problems.


Information theory is a discipline in applied mathematics involving the quantification of data with the goal of enabling as much data as possible to be reliably stored on a medium and/or communicated over a channel.


The measure of data, known as information entropy, is usually expressed by the average number of bits needed for storage or communication.


The field is at the crossroads of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering.


Its impact has been crucial to the success of

• the Voyager missions to deep space,

• the invention of the CD,

• the feasibility of mobile phones,

• the development of the Internet,

• the study of linguistics and of human perception,

• the understanding of black holes,

and numerous other fields.


Information theory is generally considered to have been founded in 1948 by Claude Shannon in his seminal work,

A Mathematical Theory of Communication


The central paradigm of classic information theory is the engineering problem of the transmission of information over a noisy channel.

An avid chess player, Professor Shannon built a chess-playing computer years before IBM's Deep Blue came along. While on a trip to Russia in 1965, he challenged world champion Mikhail Botvinnik to a match.

He lost in 42 moves, considered an excellent showing.


The most fundamental results of this theory are

1. Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy;


2. Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a certain threshold called the channel capacity.

The channel capacity can be approached by using appropriate encoding and decoding systems.
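As a standard concrete instance (not derived on these slides), for a band-limited channel with additive white Gaussian noise the channel capacity is given by the Shannon-Hartley formula

C = B log2 (1 + S/N)  bits per second,

where B is the channel bandwidth in Hz and S/N is the signal-to-noise power ratio.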


Consider predicting the activity of the Prime Minister tomorrow.

This prediction is an information source.

The information source has two outcomes:

• He will be in his office,

• he will be naked and run 10 miles in London.


Clearly, the outcome of 'in office' contains little information; it is a highly probable outcome.

The outcome 'naked run', however, contains considerable information; it is a highly improbable event.


In information theory, an information source is a probability distribution, i.e. a set of probabilities assigned to a set of outcomes.

"Nothing is certain, except death and taxes"


This reflects the fact that the information contained in an outcome is determined not only by the outcome, but by how uncertain it is.

An almost certain outcome contains little information.

A measure of the information contained in an outcome was introduced by Hartley in 1927.


He defined the information contained in an outcome x as

I(x) = - log2 P(x)

This measure satisfies our requirement that the information contained in an outcome is proportional to its uncertainty.

If P(x) = 1, then I(x) = 0, telling us that a certain event contains no information.


The definition above also satisfies the requirement that the total information in independent events should add.


Clearly, our prime minister prediction for two days contains twice as much information as for one day.


For two independent outcomes xi and xj,

I(xi and xj) = - log2 P(xi and xj) = - log2 [P(xi) P(xj)] = - log2 P(xi) - log2 P(xj) = I(xi) + I(xj).


Hartley's measure defines the information in a single outcome.


The measure entropy H(X) defines the information content of the source X as a whole.

It is the mean information provided by the source.

We have

H(X) = Σi P(xi) I(xi) = - Σi P(xi) log2 P(xi)

A binary symmetric source (BSS) is a source with two outputs whose probabilities are p and 1-p respectively.


The prime minister example discussed above is a BSS.

The entropy of the source is

H(X) = -p log2 p - (1-p) log2 (1-p)


The function takes the value zero when p=0.

When one outcome is certain, so is the other, and the entropy is zero.

As p increases, so too does the entropy, until it reaches a maximum when p = 1-p = 0.5.

When p is greater than 0.5, the curve declines symmetrically to zero, reached when p=1.
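A quick numerical check of this behaviour, sketched in Python; the probability values sampled are illustrative choices.

import numpy as np

def binary_entropy(p):
    # H(X) = -p log2 p - (1-p) log2 (1-p); a certain outcome carries no information.
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}   H = {binary_entropy(p):.3f} bits")
# H rises from 0 to a maximum of 1 bit at p = 0.5, then falls symmetrically back to 0.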


We conclude that the average information in the BSS is maximised when both outcomes are equally likely.

The entropy is measuring the average uncertainty of the source.

(The term entropy is borrowed from thermodynamics. There too it is a measure of the uncertainty, or disorder, of a system.)


In classical thermodynamics, the concept of entropy is defined phenomenologically by the second law of thermodynamics, which states that

the entropy of an isolated system always increases or remains constant


My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’