
Page 1: PARALLEL PROGRAMMING ABSTRACTIONS

6/16/2010

Page 2: Tasks vs Threads

• Similar, but not the same.

[Figure: the software stack, bottom to top — h/w processors; Operating System; Threads; Task Scheduler; Tasks. The operating system schedules threads onto hardware processors; the task scheduler schedules tasks onto threads.]

Page 3: Tasks vs Threads: Main Differences

• Tasks need not run concurrently
  • The task scheduler can schedule them on the same thread

Page 4: Tasks vs Threads: Main Differences

• Tasks need not run concurrently
  • The task scheduler can schedule them on the same thread
• Tasks do not have fairness guarantees

// Task 1: spin until t becomes 1, then print
while (t == 0) ;
Write("hello");

// Task 2: set the flag
t = 1;

With OS threads, preemptive scheduling guarantees that Task 2 eventually runs, so "hello" is printed. With tasks there is no such guarantee: the scheduler may run Task 1 to completion first on its only worker thread, and the spin loop then never terminates.

Page 5: Tasks vs Threads: Main Differences

• Tasks need not run concurrently
  • The task scheduler can schedule them on the same thread
• Tasks do not have fairness guarantees
• Tasks are cheaper than threads
  • Have smaller stacks
  • Do not pre-allocate OS resources

Page 6: Generating Tasks from Programs

• You don't want programmers to explicitly create tasks
  • That would be like assembly-level programming
• Instead:
  • Design programming-language constructs that capture the programmer's intent
  • The compiler converts these constructs to tasks
• Languages with the following features are very convenient for this purpose (see the sketch after this list)
  • Type inference
  • Generics
  • First-order anonymous functions (lambdas/delegates)
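A minimal C# sketch (my illustration, not from the slides) of how the three features work together:

// Generics: Filter works for any element type T.
// Type inference: var, and the lambda's parameter type, are inferred.
// Lambdas: the predicate is an anonymous function passed as data.
using System;
using System.Collections.Generic;

class FeatureDemo {
    static List<T> Filter<T>(List<T> items, Func<T, bool> pred) {
        var result = new List<T>();
        foreach (var x in items)
            if (pred(x)) result.Add(x);
        return result;
    }

    static void Main() {
        var nums = new List<int> { 1, 2, 3, 4 };
        var evens = Filter(nums, x => x % 2 == 0);
        Console.WriteLine(string.Join(",", evens)); // prints: 2,4
    }
}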


Page 7: First-order functions: C# Lambda Expressions

• Syntax: (input parameters) => expression

• Is similar to: &anon_foo   // where anon_foo is declared elsewhere
  anon_foo(input parameters) { return expression; }

• Examples:

  x => x
  (x, y) => x == y
  (int x, string s) => s.Length > x
  () => { return SomeMethod() + 1; }

  Func<int, bool> myFunc = x => x == 5;
  bool result = myFunc(4);   // false: 4 == 5 does not hold
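A short runnable sketch (my addition) of the conversion of lambdas to delegate types:

using System;

class LambdaDemo {
    static void Main() {
        // A lambda expression has no type of its own; it converts to a
        // compatible delegate type such as Func<...> or Action<...>.
        Func<int, bool> isFive = x => x == 5;          // parameter type inferred
        Action<string> greet = s => Console.WriteLine("hi " + s);

        Console.WriteLine(isFive(4));  // False
        greet("world");                // hi world
    }
}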


Page 8: Sequential Merge Sort With Lambdas

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...

    int mid = low + (hi - low) / 2;

    // wrap the recursive call in a lambda
    Action<int, int> f = (l, h) => { MergeSort(a, l, h); };

    f(low, mid - 1);
    f(mid, hi);
    Merge(a, low, mid, hi);
}

Page 9: Things to Know about C# Lambdas

• A lambda is an expression (with no type of its own)
• Conversion to a delegate type
• Type inference for parameters
• Capture of free variables
  • Locations referenced by free variables are converted to be on the heap ("boxed"), as the sketch below illustrates
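A small capture sketch (my illustration): the local count is hoisted to the heap so the lambda can still reference it after MakeCounter returns.

using System;

class CaptureDemo {
    static Func<int> MakeCounter() {
        int count = 0;          // captured: the compiler moves this to the heap
        return () => ++count;   // the lambda keeps the location alive
    }

    static void Main() {
        var next = MakeCounter();
        Console.WriteLine(next());  // 1
        Console.WriteLine(next());  // 2 -- same heap location each call
    }
}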


Page 10: The Task Abstraction

delegate void Action();

class Task {
    Task(Action a);
    void Wait();

    // called by the WSQ (work-stealing queue) scheduler
    void Execute();
}
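The slides don't show an implementation. Purely as an illustration, here is a minimal sketch assuming the task is backed by the .NET thread pool (the lecture's WSQ scheduler is not modeled, and exceptions are ignored):

using System.Threading;

delegate void Action();

class MiniTask {
    readonly Action action;
    readonly ManualResetEvent done = new ManualResetEvent(false);

    public MiniTask(Action a) {
        action = a;
        // Hand the task to a pool worker. A real WSQ scheduler would
        // instead push it onto the creating thread's local deque.
        ThreadPool.QueueUserWorkItem(_ => Execute());
    }

    // Block the caller until the task body has finished.
    public void Wait() { done.WaitOne(); }

    void Execute() { action(); done.Set(); }
}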

Page 11: Merge Sort With Tasks

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...

    int mid = low + (hi - low) / 2;

    Task left = new Task( delegate { MergeSort(a, low, mid - 1); } );
    Task right = new Task( delegate { MergeSort(a, mid, hi); } );

    left.Wait();
    right.Wait();
    Merge(a, low, mid, hi);
}

Page 12: Parallel.Invoke

• Invokes all input actions in parallel
• Waits for all of them to finish

static void Invoke(params Action[] actions);
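A short usage sketch (assuming .NET 4's System.Threading.Tasks.Parallel):

using System;
using System.Threading.Tasks;

class InvokeDemo {
    static void Main() {
        // Both actions run in parallel; Invoke returns only after both finish.
        Parallel.Invoke(
            () => Console.WriteLine("left"),
            () => Console.WriteLine("right"));
        Console.WriteLine("done");  // always printed last
    }
}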

Page 13: Merge Sort With Parallel.Invoke

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...

    int mid = low + (hi - low) / 2;

    Parallel.Invoke(
        () => { MergeSort(a, low, mid - 1); },
        () => { MergeSort(a, mid, hi); } );
    Merge(a, low, mid, hi);
}

Page 14: Compare with Sequential Version

MergeSort(int[] a, int low, int hi) {
    if (base_case) ...

    int mid = low + (hi - low) / 2;

    { MergeSort(a, low, mid - 1); }
    { MergeSort(a, mid, hi); }
    Merge(a, low, mid, hi);
}
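For completeness, a runnable version of the parallel variant. The slides elide the base case and Merge; the ones below are my assumptions, keeping the slides' inclusive-range convention:

using System;
using System.Threading.Tasks;

class MergeSortDemo {
    // Sorts a[low..hi], both bounds inclusive.
    static void MergeSort(int[] a, int low, int hi) {
        if (hi - low < 1) return;        // assumed base case: 0 or 1 element
        if (hi - low == 1) {             // 2 elements: swap if out of order
            if (a[low] > a[hi]) { int t = a[low]; a[low] = a[hi]; a[hi] = t; }
            return;
        }
        int mid = low + (hi - low) / 2;
        Parallel.Invoke(
            () => MergeSort(a, low, mid - 1),
            () => MergeSort(a, mid, hi));
        Merge(a, low, mid, hi);
    }

    // Merges the sorted runs a[low..mid-1] and a[mid..hi] (assumed implementation).
    static void Merge(int[] a, int low, int mid, int hi) {
        int[] tmp = new int[hi - low + 1];
        int i = low, j = mid, k = 0;
        while (i < mid && j <= hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[k++] = a[i++];
        while (j <= hi) tmp[k++] = a[j++];
        Array.Copy(tmp, 0, a, low, tmp.Length);
    }

    static void Main() {
        int[] a = { 5, 3, 8, 1, 9, 2 };
        MergeSort(a, 0, a.Length - 1);
        Console.WriteLine(string.Join(",", a)); // 1,2,3,5,8,9
    }
}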

Page 15: Data Parallelism

• Sometimes you want to perform the same computation on all elements of a collection
  • For every string in an array, check if it contains "foo"

Page 16: Parallel.For

• Iterates a variable i from lower (inclusive) to upper (exclusive)
• Calls the delegate with i as the parameter, in parallel

static void For(int lower, int upper, Action<int> body);

Page 17: Parallel.For

// sequential for
for (int i = 0; i < n; i++) {
    if (a[i].Contains("foo")) { DoSomething(a[i]); }
}

// parallel for
Parallel.For(0, n, (i) => {
    if (a[i].Contains("foo")) { DoSomething(a[i]); }
});
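A caveat the later slides return to (my addition): iterations of the body run concurrently, so unsynchronized updates to shared state are data races. A minimal sketch using Interlocked to accumulate safely:

using System;
using System.Threading;
using System.Threading.Tasks;

class SumDemo {
    static void Main() {
        int[] a = { 1, 2, 3, 4, 5 };
        int total = 0;
        // total is shared by all iterations; a plain "total += a[i]" would
        // be a data race. Interlocked.Add makes each update atomic.
        Parallel.For(0, a.Length, i => Interlocked.Add(ref total, a[i]));
        Console.WriteLine(total); // 15
    }
}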

Page 18: The DAG created by Parallel.For

A;
Parallel.For(0, N, m);   // where m is the body: i => { B; }
C;

[Figure: the DAG — node A fans out to the parallel nodes m(0), m(1), …, m(N-1), one per iteration, which all join at node C.]

Page 19: Parallel.ForEach

• Same as Parallel.For, but iterates over the elements of a collection in parallel

static void ForEach<T>(IEnumerable<T> source, Action<T> body);
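A short usage sketch mirroring the earlier Parallel.For example:

using System;
using System.Threading.Tasks;

class ForEachDemo {
    static void Main() {
        string[] a = { "foo", "bar", "foobar" };
        // The body runs once per element, potentially in parallel.
        Parallel.ForEach(a, s => {
            if (s.Contains("foo")) Console.WriteLine(s);
        });
    }
}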

Page 20: Advantage of High-Level Abstractions

• Makes it easy to add parallelism
  • Explore and experiment with different parallelization strategies
• The language compiler/runtime can perform performance optimizations
  • Efficiently create tasks
  • Efficiently distribute tasks to available cores
  • Tight integration with the scheduler

Page 21: Example Partitioning on Two Cores

[Figure: example partitioning of loop iterations across two cores.]

Page 22: Partitioning on Four Cores

[Figure: example partitioning of loop iterations across four cores.]

Page 23: Advantage of High-Level Abstractions

• Makes it easy to add parallelism
  • Explore and experiment with different parallelization strategies
• The language compiler/runtime can perform performance optimizations
  • Efficiently create tasks
  • Efficiently distribute tasks to available cores
  • Tight integration with the scheduler
• Provide programmatic features
  • Exceptions
  • Cancelling running tasks

Page 24: Semantics of Parallel Constructs

• What does this do:

Parallel.Invoke(
    () => { WriteLine("Hello"); },
    () => { WriteLine("World"); } );

Page 25: Semantics of Parallel Constructs

• What does this do:

Parallel.Invoke(
    () => { WriteLine("Hello"); },
    () => { WriteLine("World"); } );

• Compare with:

{ WriteLine("Hello"); }
{ WriteLine("World"); }

Page 26: Semantics of Parallel Constructs

• By writing this program:

Parallel.Invoke(
    () => { WriteLine("Hello"); },
    () => { WriteLine("World"); } );

• You are telling the compiler that both outcomes ("Hello" before "World", or "World" before "Hello") are acceptable
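A runnable version of this example (my addition; Console.WriteLine stands in for the slides' WriteLine):

using System;
using System.Threading.Tasks;

class HelloWorldDemo {
    static void Main() {
        // Either ordering of the two lines may be observed;
        // by using Parallel.Invoke we declare both acceptable.
        Parallel.Invoke(
            () => Console.WriteLine("Hello"),
            () => Console.WriteLine("World"));
    }
}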

Page 27: Correctness Criteria

• Given a DAG, any linearization (topological ordering) of the DAG is an acceptable execution
  • For the Parallel.For DAG on Page 18: A first, then the m(i) in any order, then C

Page 28: Correctness Criteria

• Simple criterion:
  • Every task operates on separate data
  • No dependencies other than the edges in the DAG (see the sketch after this list)
• We will look at more complex criteria later in the course.
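A minimal sketch (my illustration) of code meeting the simple criterion: each iteration writes only its own array element, so the only dependencies are the DAG edges.

using System;
using System.Threading.Tasks;

class IndependentDemo {
    static void Main() {
        int[] a = new int[10];                  // A: setup
        Parallel.For(0, a.Length, i => {
            a[i] = i * i;                       // each task touches only a[i]
        });
        Console.WriteLine(string.Join(",", a)); // C: runs after all iterations
    }
}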
