Page 1:

Tool Benchmarking: Where are we?

Justin E. Harlow III

Semiconductor Research Corporation

April 9, 2001

Page 2:

Metrics and Benchmarks: A Proposed Taxonomy

Methodology Benchmarking
  Assessment of productivity
  Prediction of design time
  Monitoring of throughput

Flow Calibration and Tuning
  Monitor active tool and flow performance
  Correlate performance with adjustable parameters
  Estimate settings for future runs

Tool Benchmarking
  Measure tool performance against a standard
  Compare performance of tools against each other
  Measure progress in algorithm development

Page 3:

How It’s Typically Done...

[Diagram labels: “My Tool,” “Your Tool,” “The Job”]

Page 4:

ICCAD 2000: Typical Results

Page 5:

Predictive Value? Kind of…

It takes more time to detect more faults

But sometimes it doesn’t...

Page 6:

Bigger Benchmarks Take Longer

Sometimes...

S526: 451 detects, 1740 sec
S641: 404 detects, 2 sec
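To make the lack of predictive value concrete, here is a quick back-of-the-envelope check (a minimal Python sketch using only the two data points quoted above) showing how weakly runtime tracks the number of detected faults:

```python
# Detections per second for the two ISCAS-89 runs quoted above.
runs = {"s526": (451, 1740.0), "s641": (404, 2.0)}  # name: (detects, seconds)

for name, (detects, seconds) in runs.items():
    print(f"{name}: {detects / seconds:.2f} detects/sec")

# s526 manages ~0.26 detects/sec while s641 reaches ~202 detects/sec --
# roughly three orders of magnitude apart for similar detect counts,
# so runtime on one benchmark predicts little about another.
```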

Page 7:

What’s Wrong with the Way We Do It Today?

Results are not predictive
Results are often not repeatable
Benchmark sets have unknown properties
Comparisons are inconclusive

Page 8:

A Better Way? Design of Experiments

Critical properties of the equivalence class:
  “sufficient” uniformity
  “sufficient” size to allow for a t-test or similar (see the sketch below)

[Flow diagram: an Equivalence Class of instances feeds Algorithm A and Algorithm B; each result is evaluated and the Cost Index compared]
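As a rough illustration of this flow (not from the talk; the instance generator and both heuristics below are hypothetical placeholders), a Python sketch might build an equivalence class of instances, run both algorithms on every member, and apply a paired t-test to the resulting cost indices:

```python
# Sketch of the Design-of-Experiments comparison above: one equivalence
# class of instances, two candidate algorithms, a paired t-test on the
# per-instance cost indices.  All names here are placeholders.
import random
from scipy import stats  # t-test; any equivalent statistics package works

def make_instance(size, seed):
    """Hypothetical generator for one member of the equivalence class
    (instances with controlled, 'sufficiently' uniform properties)."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(size)]

def algorithm_a(instance):
    """Placeholder heuristic A; returns a cost index (lower is better)."""
    return sum(instance)

def algorithm_b(instance):
    """Placeholder heuristic B; returns a cost index."""
    return 2.0 * sum(sorted(instance)[: len(instance) // 2])

# An equivalence class of "sufficient" size to support a t-test.
equivalence_class = [make_instance(size=100, seed=s) for s in range(30)]

cost_a = [algorithm_a(inst) for inst in equivalence_class]
cost_b = [algorithm_b(inst) for inst in equivalence_class]

# Paired t-test, since both algorithms are evaluated on the same instances.
t_stat, p_value = stats.ttest_rel(cost_a, cost_b)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
print("significant at 5%" if p_value < 0.05 else "not significant at 5%")
```

A paired test is used here because both algorithms see the same instances; with independently drawn samples, an unpaired (Welch) t-test would be the analogue.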

Page 9:

Example: Tool Comparison

Scalable circuits with known complexity properties
Observed differences are statistically significant
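One way to obtain “scalable circuits with known complexity properties” (a hypothetical illustration, not necessarily the circuits used in this comparison) is a parameterized family such as n-bit ripple-carry adders, whose gate count grows exactly linearly with n:

```python
# Sketch: an n-bit ripple-carry adder as a flat gate list.  Each bit
# position contributes exactly 5 gates (2 XOR, 2 AND, 1 OR), so the
# family scales with a known, linear complexity property: 5*n gates.
def ripple_carry_adder(n):
    gates = []
    carry = "c0"
    for i in range(n):
        a, b, p, g = f"a{i}", f"b{i}", f"p{i}", f"g{i}"
        gates.append(("XOR", (a, b), p))            # propagate
        gates.append(("AND", (a, b), g))            # generate
        gates.append(("XOR", (p, carry), f"s{i}"))  # sum bit
        gates.append(("AND", (p, carry), f"t{i}"))
        gates.append(("OR",  (g, f"t{i}"), f"c{i+1}"))
        carry = f"c{i+1}"
    return gates

for n in (8, 16, 32):
    print(f"{n}-bit adder: {len(ripple_carry_adder(n))} gates")  # 5*n
```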

Page 10:

Canonical Reference on DoE Tool Benchmark Methodology

D. Ghosh. Generation of Tightly Controlled Equivalence Classes for Experimental Design of Heuristics for Graph-Based NP-hard Problems. PhD thesis, Electrical and Computer Engineering, North Carolina State University, Raleigh, N.C., May 2000. Also available at http://www.cbl.ncsu.edu/publications/#2000-Thesis-PhD-Ghosh.

Page 11:

Tool Benchmark Sets

ISCAS 85, ISCAS 89, MCNC workshops, etc.

ISPD98 Circuit Partitioning Benchmarks
ITC Benchmarks
Texas Formal Verification Benchmarks
NCSU Collaborative Benchmarking Lab

Page 12:

“Large Design Examples”

CMU DSP Vertical Benchmark project
The Manchester STEED Project
The Hamburg VHDL Archive
Wolfgang Mueller's VHDL collection
Sun Microsystems Community Source program
OpenCores.org
Free Model Foundry
…

Page 13:

Summary

There are a lot of different activities that we loosely call “benchmarking”

At the tool level, we don’t do a very good job

Better methods are emerging, but

Good Experimental Design is a LOT of work

You have to deeply understand the properties that are important and design the experimental data accordingly

Most of the design examples out there are not of much use for tool benchmarking

Page 14:

To Find Out More...

Advanced Benchmark Web Site

http://www.eda.org/benchmrk

Nope… There’s no “a” in there

Talk to Steve “8.3” Grout