Markov chain intro


Transcript of Markov chain intro

Page 1: Markov chain intro

Introduction to Markov chains
Examples of Markov chains:
- Random walk on a line

Page 2: Markov chain intro

Introduction to Markov chains
Examples of Markov chains:
- Random walk on a graph

Page 3: Markov chain intro

Introduction to Markov chains
Examples of Markov chains:
- Random walk on a hypercube

Page 4: Markov chain intro

Introduction to Markov chains
Examples of Markov chains:
- Markov chain on graph colorings


Page 7: Markov chain intro

Introduction to Markov chains
Examples of Markov chains:
- Markov chain on matchings of a graph

Page 8: Markov chain intro

Definition of Markov chains
- State space Ω
- Transition matrix P, of size |Ω| × |Ω|
- P(x, y) is the probability of getting from state x to state y
- Σ_{y∈Ω} P(x, y) = 1

This course: state space is finite
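The row-stochastic property in the definition (each row of P sums to 1, and a distribution evolves by multiplication with P) can be sketched numerically. The 3-state matrix below is illustrative, not taken from the lecture:

```python
import numpy as np

# Hypothetical 3-state transition matrix P; P[x, y] is the probability
# of moving from state x to state y in one step.
P = np.array([
    [0.5, 0.25, 0.25],
    [0.2, 0.6,  0.2 ],
    [0.3, 0.3,  0.4 ],
])

# Every row of a transition matrix must sum to 1: sum_y P(x, y) = 1.
assert np.allclose(P.sum(axis=1), 1.0)

# One step of the chain: a distribution mu over states becomes mu @ P.
mu = np.array([1.0, 0.0, 0.0])   # start deterministically in state 0
mu_next = mu @ P                 # after one step: the first row of P
```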


Page 10: Markov chain intro

Definition of Markov chains
Example: frog on lily pads

Page 11: Markov chain intro

Definition of Markov chains
Stationary distribution π (on states in Ω):

Example: frog with unfair coins
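For a two-state chain like the frog example, πP = π can be solved in closed form. The jump probabilities below are illustrative values, not the lecture's:

```python
import numpy as np

# Two lily pads; the frog jumps east with probability p and west with
# probability q (p, q are illustrative values, not from the lecture).
p, q = 0.3, 0.6
P = np.array([[1 - p, p],
              [q, 1 - q]])

# Stationary distribution: solve pi P = pi with pi summing to 1.
# For a two-state chain this has the closed form pi = (q, p) / (p + q).
pi = np.array([q, p]) / (p + q)

assert np.allclose(pi @ P, pi)   # pi is invariant under one step
```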

Page 12: Markov chain intro

Definition of Markov chains
Other examples:

Page 13: Markov chain intro

Definition of Markov chains
- aperiodic

- irreducible

- ergodic: a MC that is both aperiodic and irreducible

Thm (3.6 in Jerrum): An ergodic MC has a unique stationary distribution π. Moreover, it converges to π as time goes to ∞.
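The convergence in the theorem can be sketched by repeatedly applying P to a point-mass starting distribution. The matrix below is an illustrative ergodic chain (all entries positive, hence aperiodic and irreducible), not one from the lecture:

```python
import numpy as np

# An ergodic (aperiodic + irreducible) chain on 3 states; values illustrative.
P = np.array([
    [0.5, 0.25, 0.25],
    [0.2, 0.6,  0.2 ],
    [0.3, 0.3,  0.4 ],
])

# By the theorem, mu P^t converges to the unique stationary pi as t -> infinity,
# regardless of the starting distribution mu.
mu = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    mu = mu @ P

# The limit is stationary: it is unchanged by a further step of the chain.
assert np.allclose(mu @ P, mu)
```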

Page 14: Markov chain intro

Gambler’s Ruin
A gambler has k dollars initially and bets on (fair) coin flips.
If she guesses the coin flip, she gets 1 dollar.
If not, she loses 1 dollar.
She stops when she has either 0 or n dollars.

Is it a Markov chain?

How many bets until she stops playing?
What are the probabilities of 0 vs n dollars?

Page 15: Markov chain intro

Gambler’s Ruin
Stationary distribution?

Page 16: Markov chain intro

Gambler’s Ruin
Starting with k dollars, what is the probability of reaching n?

What is the probability of reaching 0?
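One standard route to this answer is first-step analysis: for the fair game, the hitting probability h(k) = Pr(reach n before 0, starting from k) satisfies h(0) = 0, h(n) = 1, and h(k) = (h(k−1) + h(k+1))/2, which solves to h(k) = k/n. A sketch with an illustrative n, solving the resulting linear system:

```python
import numpy as np

# First-step analysis for the fair game: h(k) = Pr(reach n before 0 | start k)
# satisfies h(0) = 0, h(n) = 1, and h(k) = (h(k-1) + h(k+1)) / 2 for 0 < k < n.
n = 10  # illustrative stake target
A = np.zeros((n + 1, n + 1))
b = np.zeros(n + 1)
A[0, 0] = 1.0                  # boundary: h(0) = 0
A[n, n] = 1.0; b[n] = 1.0      # boundary: h(n) = 1
for k in range(1, n):
    A[k, k] = 1.0
    A[k, k - 1] = A[k, k + 1] = -0.5
h = np.linalg.solve(A, b)

# The solution matches the closed form h(k) = k / n.
assert np.allclose(h, np.arange(n + 1) / n)
```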

Page 17: Markov chain intro

Gambler’s Ruin
How many bets until 0 or n?
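The expected number of bets yields to the same first-step analysis: t(k) = 1 + (t(k−1) + t(k+1))/2 with t(0) = t(n) = 0, giving the closed form t(k) = k(n − k). A sketch with an illustrative n:

```python
import numpy as np

# Expected number of bets t(k) starting from k dollars satisfies
# t(0) = t(n) = 0 and t(k) = 1 + (t(k-1) + t(k+1)) / 2 for 0 < k < n.
n = 10  # illustrative stake target
A = np.zeros((n + 1, n + 1))
b = np.zeros(n + 1)
A[0, 0] = A[n, n] = 1.0        # absorbing boundaries: t(0) = t(n) = 0
for k in range(1, n):
    A[k, k] = 1.0
    A[k, k - 1] = A[k, k + 1] = -0.5
    b[k] = 1.0                 # the "+1" for the bet just made
t = np.linalg.solve(A, b)

# Matches the closed form t(k) = k * (n - k).
ks = np.arange(n + 1)
assert np.allclose(t, ks * (n - ks))
```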

Page 18: Markov chain intro

Coupon Collecting
- n coupons
- each time, get a coupon chosen uniformly at random from 1, …, n (i.e., some coupons will likely be received multiple times)
- how long to get all n coupons?

Is this a Markov chain?

Page 19: Markov chain intro

Coupon Collecting
How long to get all n coupons?


Page 21: Markov chain intro

Coupon Collecting
Proposition: Let τ be the random variable for the time needed to collect all n coupons. Then,

Pr(τ > ⌈n log n + cn⌉) ≤ e^{-c}.
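The exact expectation behind this tail bound is E[τ] = n·H_n = n(1 + 1/2 + … + 1/n) ≈ n log n. A quick simulation (with illustrative n and trial count) comparing the empirical mean to this value:

```python
import random

# Simulate coupon collecting: draw uniformly from n coupons until all appear.
def collect_all(n, rng=random.Random(0)):
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

n = 50          # illustrative number of coupons
trials = 2000
avg = sum(collect_all(n) for _ in range(trials)) / trials

# The exact expectation is n * H_n = n * (1 + 1/2 + ... + 1/n) ~ n log n.
expected = n * sum(1.0 / i for i in range(1, n + 1))
```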

Page 22: Markov chain intro

Random Walk on a Hypercube
How many expected steps until we get to a random state?
(I.e., what is the mixing time?)
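One common way to see the answer, for the lazy variant of the walk (each step refreshes a uniformly chosen coordinate with a fresh uniform bit, rather than flipping it), is a coupon-collector argument: once every coordinate has been refreshed, the state is exactly uniform, so roughly n log n steps suffice. A sketch, with illustrative n and trial count:

```python
import random

# Lazy random walk on the n-dimensional hypercube {0,1}^n: at each step, pick a
# coordinate uniformly at random and replace it with a fresh uniform bit.
# Once every coordinate has been refreshed at least once, the state is exactly
# uniform -- so this time is controlled by coupon collecting, ~ n log n.
def steps_until_uniform(n, rng=random.Random(0)):
    state = [0] * n
    refreshed, steps = set(), 0
    while len(refreshed) < n:
        i = rng.randrange(n)
        state[i] = rng.randrange(2)   # refresh coordinate i
        refreshed.add(i)
        steps += 1
    return steps

n = 20          # illustrative dimension
trials = 1000
avg = sum(steps_until_uniform(n) for _ in range(trials)) / trials
expected = n * sum(1.0 / i for i in range(1, n + 1))  # n * H_n ~ n log n
```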
