Transcript of Information Reading: Complexity: A Guided Tour, Chapter 3.

Page 1

Information

Reading: Complexity: A Guided Tour, Chapter 3

Page 2

Recap: Core disciplines of the science of complexity

• Dynamics: The study of continually changing structure and behavior of systems

• Information: The study of representation, symbols, and communication

• Computation: The study of how systems process information and act on the results

• Evolution: The study of how systems adapt to constantly changing environments

Page 3

Information

Motivating questions:

• What are “order” and “disorder”?

• What are the laws governing these quantities?

• How do we define “information”?

• What is the “ontological status” of information?

• How is information signaled between two entities?

• How is information processed to produce “meaning”?

Page 4

Energy, Work, and Entropy

• What is energy?

• What is entropy?

• What are the laws of thermodynamics?

• What is “the arrow of time”?

Page 5

Maxwell’s Demon

James Clerk Maxwell, 1831−1879

See NetLogo simulation

Page 6

Szilard’s solution

Leo Szilard, 1898−1964

Page 7

Bennett and Landauer’s solution

Charles Bennett, b. 1943 Rolf Landauer, 1927–1999

Page 8

Entropy/Information in Statistical Mechanics

• What is “statistical mechanics”?

• Describe the concepts of “macrostate” and “microstate”.

Ludwig Boltzmann, 1844−1906

Page 9

Entropy/Information in Statistical Mechanics

• What is “statistical mechanics”?

• Describe the concepts of “macrostate” and “microstate”.

• Combinatorics of a slot machine

Possible fruits: apple, orange, cherry, pear, lemon
– Microstates
– Macrostates

Macrostate: “Three identical fruits”. How many microstates?

Macrostate: “Exactly one lemon”. How many microstates?

Macrostate: “At least 2 cherries”. How many microstates? (See the counting sketch below.)
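To make the counting concrete, here is a small Python sketch (mine, not from the reading, and assuming a three-reel machine as in the book's slot-machine example) that enumerates every microstate and counts how many fall under each macrostate above:

```python
from itertools import product

# One reel shows one of five fruits; a microstate is the ordered triple shown by the three reels.
FRUITS = ["apple", "orange", "cherry", "pear", "lemon"]
microstates = list(product(FRUITS, repeat=3))      # 5**3 = 125 equally likely microstates

three_identical     = [m for m in microstates if len(set(m)) == 1]
exactly_one_lemon   = [m for m in microstates if m.count("lemon") == 1]
at_least_2_cherries = [m for m in microstates if m.count("cherry") >= 2]

print(len(microstates))           # 125
print(len(three_identical))       # 5   -> one microstate per fruit
print(len(exactly_one_lemon))     # 48  -> 3 lemon positions x 4 x 4 non-lemon choices
print(len(at_least_2_cherries))   # 13  -> 12 with exactly two cherries + 1 with three
```

Each macrostate's count is its W, the number of corresponding microstates, which is what Boltzmann's entropy is built from on the next slides.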

Page 10

Boltzmann’s entropy, S

Page 11

Boltzmann’s entropy, S

Boltzmann entropy of a macrostate: the natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).

Page 12

Boltzmann’s entropy, S

Boltzmann entropy of a macrostate: the natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).

Aside: What is a “natural logarithm”?

Page 13

Natural Log e

Page 14

Intuition about Logb x

“How many places (diversity) do I need (when working with b number of symbols) to represent number x?”

Log10 100 = 2

Log10 1000 = 3

Log2 16 = 4

Page 15

Intuition about Logb x

“How many places (diversity) do I need (when working with b number of symbols) to represent number x?”

Log10 100 = 2

Log10 1000 = 3

Log2 16 = 4

Log2 .5 = -1 ?
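These values are easy to verify numerically; a quick check (mine, not part of the slides) using Python's math module:

```python
import math

print(math.log10(100))   # 2.0  (100 = 10**2)
print(math.log10(1000))  # 3.0  (1000 = 10**3)
print(math.log2(16))     # 4.0  (16 = 2**4)
print(math.log2(0.5))    # -1.0 (0.5 = 2**-1, so fractions give negative logs)
```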

Page 16

Quick review of logarithms

• log10

• ln

• log2

• log_a b = log_10 b / log_10 a = log_n b / log_n a, for any base n
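As a sketch of that change-of-base rule in code (again mine, for illustration only):

```python
import math

# Change of base: log_a(b) can be computed with logarithms in any other base.
a, b = 2, 16
print(math.log10(b) / math.log10(a))  # 4.0
print(math.log(b) / math.log(a))      # 4.0 (natural logs work just as well)
print(math.log(b, a))                 # 4.0 (math.log also accepts an explicit base)
```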

Page 17

Boltzmann’s entropy, S

Boltzmann entropy of a macrostate: the natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).

Boltzmann entropy: the more microstates that give rise to a macrostate, the more probable that macrostate is. Thus high entropy = more probable macrostate.

Page 18

Boltzmann’s entropy, S

Boltzmann entropy of a macrostate: the natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).

Boltzmann entropy: the more microstates that give rise to a macrostate, the more probable that macrostate is. Thus high entropy = more probable macrostate.

Second Law of Thermodynamics (à la Boltzmann): Nature tends towards more probable macrostates.
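To connect this back to the slot machine (my own illustration, dropping k as the slides suggest): a macrostate's probability is its share of the 125 equally likely microstates, and S = ln W rises with that share:

```python
import math

TOTAL = 5 ** 3   # all microstates of the three-reel fruit machine
for name, W in [("three identical fruits", 5),
                ("at least 2 cherries", 13),
                ("exactly one lemon", 48)]:
    S = math.log(W)     # Boltzmann entropy with k ignored: S = ln W
    p = W / TOTAL       # probability of the macrostate, microstates assumed equally likely
    print(f"{name}: W = {W}, S = {S:.2f}, p = {p:.3f}")
```

The “exactly one lemon” macrostate has the most microstates, the highest entropy, and the highest probability, which is the sense in which nature tends toward high-entropy macrostates.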

Page 19

Boltzmann’s entropy, S

What does this have to do with the “arrow of time”?

Page 20

Lord Kelvin: Heat Death of the Universe

“The result would inevitably be a state of universal rest and death, if the universe were finite and left to obey existing laws. But it is impossible to conceive a limit to the extent of matter in the universe; and therefore science points rather to an endless progress, through an endless space, of action involving the transformation of potential energy into palpable motion and hence into heat, than to a single finite mechanism, running down like a clock, and stopping for ever.”

“On the Age of the Sun’s Heat” (1862)

Page 21

Shannon Information / Entropy

http://www.khanacademy.org/science/brit-cruise/information-theory

Page 22

Shannon Information / Entropy

Claude Shannon, 1916−2001

What were his motivations for defining/studying information?

What is a “message source”?

Page 23

Shannon Information Vocab

Page 24

Shannon Information Definitions

Page 25

Shannon Information

H(message source) = −∑_{i=1}^{N} p_i log_2 p_i

Measured in “bits”.

The message source has N “microstates” (or “messages”, e.g., words); p_i is the probability of message i.

Boltzmann entropy, for comparison:

S(state) = k ln W

Measured in units defined by k (often “Joules per Kelvin”).
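As a minimal sketch of the Shannon formula in code (my own; the function name and the example distributions are illustrative, not from the reading):

```python
import math

def shannon_entropy(probs):
    # H = -sum_i p_i * log2(p_i), in bits; zero-probability messages contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin as a message source: two equally likely messages.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
# A heavily biased source is more predictable, so each message carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits
```

A fair coin gives the maximum of 1 bit per message; the more biased (predictable) the source, the lower its entropy.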

Page 26

H(Nicky) = −∑_i p_i log_2 p_i

Messages: {Da}

Page 27

H(Jake) = −∑_i p_i log_2 p_i

Messages: {300 words}
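Working these two sources out (my own numbers, under the simplifying assumption that each of Jake's words is equally likely): Nicky's only message has probability 1, so H(Nicky) = 0 bits; for a uniform source H = log_2(N), so H(Jake) = log_2(300) ≈ 8.23 bits.

```python
import math

# Nicky: a single message with probability 1 carries no information.
print(-sum(p * math.log2(p) for p in [1.0]))   # 0.0 bits (may display as -0.0)
# Jake: 300 equally likely words; for a uniform source H = log2(N).
print(math.log2(300))                          # ~8.23 bits
```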

Page 28

Calculating Shannon Entropy

Page 29

Calculating Shannon Entropy
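The worked calculations on these slides are not captured in the transcript; as a stand-in, here is a hedged sketch of estimating the entropy of a message source from observed word counts (the helper name and the sample sentence are mine):

```python
from collections import Counter
import math

def entropy_from_counts(counts):
    # Estimate H in bits, treating observed frequencies as the message probabilities.
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

words = "the cat saw the dog and the dog saw the cat".split()
print(entropy_from_counts(Counter(words)))   # entropy of this toy word source, about 2.19 bits
```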