Dynamic Programming


Page 1: Dynamic Programming

Dynamic Programming

Heejin Park

College of Information and Communications

Hanyang University

Page 2: Dynamic Programming

2

Content

Introduction

Assembly-line scheduling

Matrix-chain multiplication

Elements of dynamic programming

Longest common subsequence

Page 3: Dynamic Programming

3

Matrix-chain multiplication

Matrix-chain multiplication: Given a chain of n matrices <A1, A2, ..., An>, compute the product A1A2…An.

The number of ways to compute the product A1A2…An equals the number of ways to fully parenthesize the product. For example, the number of ways to compute the product A1A2A3 is 2:

(A1 A2) A3

A1(A2 A3)

Page 4: Dynamic Programming

4

Matrix-chain multiplication

The order of multiplications

The order of multiplications does not change the value of the product because matrix multiplication is associative.

For example, whether the left multiplication is done first or the right multiplication is done first does not matter.

( A1· A2) · A3 = A1· (A2 · A3)

However, the order of multiplication affects the number of scalar multiplications needed to compute the product.

Page 5: Dynamic Programming

5

Matrix-chain multiplication

Multiplying two matrices A and B

We can multiply them if they are compatible: the number of columns of A must equal the number of rows of B.

If A is a p×q matrix and B is a q×r matrix,

the resulting matrix is a p×r matrix.

[Figure: A (2×3) × B (3×2) = C (2×2)]

Page 6: Dynamic Programming

6

Matrix-chain multiplication

The number of scalar multiplications to multiply A and B: it is pqr, because we compute pr elements and computing each element needs q scalar multiplications.
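
As a quick sanity check, here is a minimal Python sketch (mine, not from the slides) that performs the schoolbook multiplication of a p×q matrix A by a q×r matrix B and counts the scalar multiplications; the count comes out to exactly p·q·r.

```python
def multiply_and_count(A, B):
    """Schoolbook matrix product that also counts scalar multiplications."""
    p, q, r = len(A), len(B), len(B[0])
    assert len(A[0]) == q, "A's columns must equal B's rows (compatibility)"
    C = [[0] * r for _ in range(p)]
    count = 0
    for i in range(p):
        for j in range(r):
            for k in range(q):                 # q scalar multiplications per element
                C[i][j] += A[i][k] * B[k][j]
                count += 1
    return C, count

# A is 2x3 and B is 3x2, as in the figure above; the count is 2*3*2 = 12.
A = [[1, 2, 3], [4, 5, 6]]
B = [[1, 0], [0, 1], [1, 1]]
C, count = multiply_and_count(A, B)
print(C, count)   # C is 2x2, count == 12
```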

Page 7: Dynamic Programming

7

Matrix-chain multiplication

The order of multiplications affects the number of scalar multiplications.

Computing A1A2A3 where A1: 10×100, A2: 100×5, A3: 5×50

(A1 A2) A3: computing (A1 A2) takes 10·100·5 = 5000, and multiplying the 10×5 result by A3 takes 10·5·50 = 2500, so 5000 + 2500 = 7,500 in total.

A1 (A2 A3): computing (A2 A3) takes 100·5·50 = 25000, and multiplying A1 by the 100×50 result takes 10·100·50 = 50000, so 25000 + 50000 = 75,000 in total.

Computing (A1 A2) A3 is 10 times faster.
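
The same arithmetic can be reproduced in a couple of lines (a sketch using the dimensions from this slide):

```python
# Dimensions from the slide: A1 is 10x100, A2 is 100x5, A3 is 5x50.
cost_left  = 10 * 100 * 5 + 10 * 5 * 50    # (A1 A2) A3 ->  5000 + 2500 =  7500
cost_right = 100 * 5 * 50 + 10 * 100 * 50  # A1 (A2 A3) -> 25000 + 50000 = 75000
print(cost_left, cost_right, cost_right // cost_left)   # 7500 75000 10
```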

Page 8: Dynamic Programming

8

Matrix-chain multiplication

Matrix-chain multiplication problem

Given a chain A1, A2, ..., An of n matrices, where matrix Ai has dimension pi-1 × pi, find the order of matrix multiplications that minimizes the number of scalar multiplications needed to compute the product.

That is, fully parenthesize the product of the matrices so that the number of scalar multiplications is minimized. For example, for the product A1 A2 A3 A4, one full parenthesization is ((A1 A2) A3) A4.

Page 9: Dynamic Programming

9

Matrix-chain multiplication

Solutions of the matrix-chain multiplication problem

Brute-force approach

Enumerate all possible parenthesizations.

Compute the number of scalar multiplications of each parenthesization.

Select the parenthesization needing the least number of scalar multiplications.
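
A direct Python rendering of this brute-force idea might look as follows (a sketch; the function name and the dimension-list convention are mine). It recursively tries every split, which corresponds to enumerating every full parenthesization, so it takes exponential time.

```python
def best_cost_brute_force(p, i, j):
    """Minimum scalar multiplications for A_i ... A_j by trying every split.

    p is the dimension list: matrix A_k has dimensions p[k-1] x p[k].
    Every full parenthesization is explored, so the running time is exponential.
    """
    if i == j:                       # a single matrix needs no multiplication
        return 0
    best = float("inf")
    for k in range(i, j):            # split into (A_i..A_k)(A_k+1..A_j)
        cost = (best_cost_brute_force(p, i, k)
                + best_cost_brute_force(p, k + 1, j)
                + p[i - 1] * p[k] * p[j])
        best = min(best, cost)
    return best

# Page 7's example: A1 is 10x100, A2 is 100x5, A3 is 5x50.
print(best_cost_brute_force([10, 100, 5, 50], 1, 3))   # 7500
```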

Page 10: Dynamic Programming

10

Matrix-chain multiplication

The brute-force approach is inefficient.

The number of enumerated parenthesizations is Ω(4^n / n^(3/2)).

The number of parenthesizations of a product of n matrices, denoted by P(n), is as follows.

P(n) = 1                                               if n = 1
P(n) = sum over k = 1, ..., n−1 of P(k)·P(n−k)         if n ≥ 2

Page 11: Dynamic Programming

11

Matrix-chain multiplication

The product A1 A2 A3 A4 can be fully parenthesized in five distinct ways.

A1(A2(A3 A4)), A1((A2 A3) A4),

(A1 A2)(A3 A4),

(A1(A2 A3))A4, ((A1 A2) A3) A4.

Page 12: Dynamic Programming

12

Matrix-chain multiplication

P(n): the number of alternative parenthesizations of a sequence of n matrices. The split between the two subproducts may occur between the kth and (k+1)st matrices for any k = 1, 2, ..., n − 1.

P(n) = 1                                               if n = 1
P(n) = sum over k = 1, ..., n−1 of P(k)·P(n−k)         if n ≥ 2
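
The recurrence translates directly into a short recursive Python sketch; note that P(4) = 5 matches the five parenthesizations listed on the previous slide.

```python
def P(n):
    """Number of full parenthesizations of a chain of n matrices."""
    if n == 1:
        return 1
    # Sum over the position k of the outermost split: (A_1..A_k)(A_k+1..A_n).
    return sum(P(k) * P(n - k) for k in range(1, n))

print([P(n) for n in range(1, 7)])   # [1, 1, 2, 5, 14, 42] -- grows exponentially
```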

Page 13: Dynamic Programming

13

Matrix-chain multiplication

Dynamic programming

Optimal substructure

m[i, j]: the minimum number of scalar multiplications for computing Ai Ai+1 … Aj.

Matrix Ai has dimension pi-1 × pi, so computing Ai..k · Ak+1..j takes pi-1·pk·pj scalar multiplications.

s[i, j] stores the optimal split k for tracing the optimal solution.

m[i, j] = 0                                                              if i = j
m[i, j] = min over i ≤ k < j of ( m[i, k] + m[k+1, j] + pi-1·pk·pj )     if i < j
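
A bottom-up Python sketch of this recurrence (the function name matrix_chain_order and the dimension-list convention are mine; the slides give only the recurrence): m[i][j] holds the minimum number of scalar multiplications for Ai..Aj and s[i][j] the split k that achieves it.

```python
def matrix_chain_order(p):
    """Bottom-up dynamic programming for matrix-chain multiplication.

    p: dimension list of length n+1; matrix A_i is p[i-1] x p[i].
    Returns (m, s): m[i][j] is the minimum number of scalar multiplications
    for A_i..A_j, and s[i][j] is the split k achieving that minimum.
    """
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):            # length of the subchain
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = float("inf")
            for k in range(i, j):             # try every split A_i..A_k | A_k+1..A_j
                cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if cost < m[i][j]:
                    m[i][j] = cost
                    s[i][j] = k
    return m, s

# Example (page 7's matrices): A1 is 10x100, A2 is 100x5, A3 is 5x50.
m, s = matrix_chain_order([10, 100, 5, 50])
print(m[1][3], s[1][3])   # 7500 2, i.e. the optimal order is (A1 A2) A3
```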

Page 14: Dynamic Programming

14

Matrix-chain multiplication

Example: filling in the m and s tables for a chain of six matrices A1..A6 (m[i, i] = 0 on the diagonal).

Computing m[2, 5] (i = 2, j = 5, i ≤ k < j):

m[2, 5] = min ( m[2, 2] + m[3, 5] + p1·p2·p5,
                m[2, 3] + m[4, 5] + p1·p3·p5,
                m[2, 4] + m[5, 5] + p1·p4·p5 )
        = min ( 0 + 2500 + 35·15·20 = 13000,
                2625 + 1000 + 35·5·20 = 7125,
                4375 + 0 + 35·10·20 = 11375 )
        = 7125, achieved at the split k = 3, so s[2, 5] = 3.

The completed m table:

  j:      2      3      4      5      6
i=1   15750   7875   9375  11875  15125
i=2           2625   4375   7125  10500
i=3                   750   2500   5375
i=4                         1000   3500
i=5                                5000

The s table records, for each i < j, the split k at which m[i, j] is achieved.
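
Given the s table, the optimal parenthesization can be printed recursively. The sketch below reuses matrix_chain_order from the earlier sketch; the dimension list p is not stated on this slide but is the one consistent with the m values above (for example m[1,2] = 30·35·15 = 15750 and m[5,6] = 10·20·25 = 5000), so treat it as inferred.

```python
def optimal_parens(s, i, j):
    """Build the optimal parenthesization string for A_i..A_j from the s table."""
    if i == j:
        return f"A{i}"
    k = s[i][j]
    return "(" + optimal_parens(s, i, k) + " " + optimal_parens(s, k + 1, j) + ")"

# Dimension list inferred from the m table above: A1 is 30x35, A2 is 35x15,
# A3 is 15x5, A4 is 5x10, A5 is 10x20, A6 is 20x25.
p = [30, 35, 15, 5, 10, 20, 25]
m, s = matrix_chain_order(p)          # from the sketch after the recurrence
print(m[1][6])                        # 15125, matching the table
print(optimal_parens(s, 1, 6))        # ((A1 (A2 A3)) ((A4 A5) A6))
```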

Page 15: Dynamic Programming

15

Matrix-chain multiplication

Running time: O(n^3) time in total

Θ(n^2) subproblems, O(n) time for each subproblem

Space consumption: Θ(n^2) space to store the m and s tables.

Page 16: Dynamic Programming

16

Content

Introduction

Assembly-line scheduling

Matrix-chain multiplication

Elements of dynamic programming

Longest common subsequence

Page 17: Dynamic Programming

17

Elements of dynamic programming

Optimal substructure

Overlapping subproblems

Page 18: Dynamic Programming

18

Elements of dynamic programming

Subtleties: the unweighted longest simple path problem

Does it have optimal substructure?

[Figure: a path p from u to v through w, decomposed into subpath p1 from u to w and subpath p2 from w to v.]

Page 19: Dynamic Programming

19

Elements of dynamic programming

[Figure: a four-vertex counterexample graph on q, r, s, t. A subpath of a longest simple path need not itself be a longest simple path, so the unweighted longest simple path problem does not have optimal substructure.]

Page 20: Dynamic Programming

20

Elements of dynamic programming

Overlapping subproblems: when a recursive algorithm revisits the same subproblem over and over again, we say that the optimization problem has overlapping subproblems.

Page 21: Dynamic Programming

21

Elements of dynamic programming

Recursion tree for the subproblem 1..4:

1..4
  split (1..1)(2..4):
    2..4 → (2..2)(3..4), (2..3)(4..4)
      3..4 → (3..3)(4..4);  2..3 → (2..2)(3..3)
  split (1..2)(3..4):
    1..2 → (1..1)(2..2);  3..4 → (3..3)(4..4)
  split (1..3)(4..4):
    1..3 → (1..1)(2..3), (1..2)(3..3)
      2..3 → (2..2)(3..3);  1..2 → (1..1)(2..2)

Matrix-chain multiplication: top-down vs. bottom-up. In the top-down recursion tree above, subproblems such as 1..2, 2..3, and 3..4 appear more than once.
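
To see the overlap concretely, the following small Python sketch (mine, not from the slides) runs the plain top-down recursion on a chain of four matrices and counts how often each subproblem i..j is visited.

```python
from collections import Counter

visits = Counter()

def recurse(i, j):
    """Plain top-down recursion over subproblems (i..j), counting visits."""
    visits[(i, j)] += 1
    if i == j:
        return
    for k in range(i, j):       # every split (i..k)(k+1..j), no caching
        recurse(i, k)
        recurse(k + 1, j)

recurse(1, 4)
print(visits)
# Subproblems such as (1,2), (2,3), and (3,4) are each visited more than once;
# bottom-up dynamic programming solves each subproblem exactly once.
```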

Page 22: Dynamic Programming

22

Elements of dynamic programming

Memoization: a recursive solution that solves each subproblem only once. It fills the table in a top-down, recursive way. In most cases it is slower than bottom-up dynamic programming. It is useful when only some of the subproblems need to be solved.
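
A memoized top-down sketch of the matrix-chain computation (function names are mine): it keeps the recursive structure but caches each subproblem's answer so that no subproblem is solved twice.

```python
from functools import lru_cache

def memoized_matrix_chain(p):
    """Top-down matrix-chain multiplication with memoization.

    p: dimension list; matrix A_i is p[i-1] x p[i].
    Each subproblem (i, j) is computed once and then read from the cache.
    """
    @lru_cache(maxsize=None)
    def lookup(i, j):
        if i == j:
            return 0
        return min(lookup(i, k) + lookup(k + 1, j) + p[i - 1] * p[k] * p[j]
                   for k in range(i, j))

    return lookup(1, len(p) - 1)

print(memoized_matrix_chain((10, 100, 5, 50)))   # 7500, the example from page 7
```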

Page 23: Dynamic Programming

23

Elements of dynamic programming

The running time of a dynamic-programming algorithm depends on the product of two factors.

The number of subproblems overall, and how many choices there are for each subproblem.

Assembly-line scheduling: Θ(n) subproblems · 2 choices per subproblem = Θ(n)

Matrix-chain multiplication: Θ(n^2) subproblems · at most n − 1 choices per subproblem = O(n^3)