Tool Benchmarking: Where are we?
Justin E. Harlow III
Semiconductor Research Corporation
April 9, 2001
Metrics and Benchmarks: A Proposed Taxonomy
Methodology Benchmarking
- Assessment of productivity
- Prediction of design time
- Monitoring of throughput

Flow Calibration and Tuning
- Monitor active tool and flow performance
- Correlate performance with adjustable parameters
- Estimate settings for future runs

Tool Benchmarking
- Measure tool performance against a standard
- Compare performance of tools against each other
- Measure progress in algorithm development
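The "Flow Calibration and Tuning" idea can be sketched in a few lines: fit observed runtimes against an adjustable parameter, then extrapolate an estimate for a future run. All numbers here are hypothetical, invented for illustration; they are not from the talk.

```python
# Flow calibration sketch (hypothetical data): correlate past tool runtimes
# with a parameter such as design size, then estimate the next run.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Made-up calibration data: design size vs. observed runtime in hours.
sizes    = [100, 200, 300, 400]
runtimes = [1.2, 2.1, 3.3, 4.0]

slope, intercept = fit_line(sizes, runtimes)
predicted = slope * 500 + intercept  # estimate for the next, larger run
print(f"predicted runtime for size 500: {predicted:.2f} h")
```

The same fit, run in reverse, is how one would "estimate settings for future runs": pick the parameter value whose predicted cost meets a target.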
How It’s Typically Done...
My Tool
Your Tool
The Job
ICCAD 2000: Typical Results
Predictive Value? Kind of…
It takes more time to detect more faults
But sometimes it doesn’t...
Bigger Benchmarks Take Longer
Sometimes...
S526: 451 detects, 1740 sec
S641: 404 detects, 2 sec
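Working through the slide's own two data points makes the point explicit: normalizing to detects per second shows the relation between benchmark and runtime is anything but monotonic.

```python
# The two data points from the slide: detects found and runtime in seconds.
results = {
    "S526": {"detects": 451, "seconds": 1740},
    "S641": {"detects": 404, "seconds": 2},
}

# Detection throughput differs by roughly three orders of magnitude,
# even though the detect counts are similar.
for name, r in results.items():
    rate = r["detects"] / r["seconds"]
    print(f"{name}: {rate:.2f} detects/sec")
```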
What’s Wrong with the Way We Do It Today?
- Results are not predictive
- Results are often not repeatable
- Benchmark sets have unknown properties
- Comparisons are inconclusive
A Better Way? Design of Experiments
Critical properties of the equivalence class:
- "sufficient" uniformity
- "sufficient" size to allow for a t-test or similar

Flow: draw instances from the Equivalence Class, run Algorithm A and Algorithm B on each, Evaluate both runs, and compare the resulting Cost Index.
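The comparison step above can be sketched as a two-sample significance test. The cost indices below are made up for illustration; a real study would draw them from tightly controlled equivalence classes in the Ghosh sense, and would compute a proper p-value rather than the rough |t| > 2 cutoff used here.

```python
# DoE comparison sketch: a cost index per instance for each algorithm,
# tested for a statistically significant difference with Welch's t.
from statistics import mean, variance  # variance() is the sample variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

# Hypothetical cost indices (e.g., runtime) over one equivalence class.
cost_a = [10.2, 11.1, 9.8, 10.5, 10.9, 10.0, 10.7, 10.3]
cost_b = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7]

t = welch_t(cost_a, cost_b)
# |t| > ~2 is a rough 5% threshold at these sample sizes; a real analysis
# would look up the p-value from the t distribution.
print(f"t = {t:.2f}; significant: {abs(t) > 2.0}")
```

Because each instance in the class is "sufficiently" uniform, a significant t here can be attributed to the algorithms rather than to benchmark idiosyncrasies, which is exactly what the ad hoc comparisons on the earlier slides cannot claim.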
Example: Tool Comparison
- Scalable circuits with known complexity properties
- Observed differences are statistically significant
Canonical Reference on DoE Tool Benchmark Methodology
D. Ghosh. Generation of Tightly Controlled Equivalence Classes for Experimental Design of Heuristics for Graph-Based NP-hard Problems. PhD thesis, Electrical and Computer Engineering, North Carolina State University, Raleigh, N.C., May 2000. Also available at http://www.cbl.ncsu.edu/publications/#2000-Thesis-PhD-Ghosh.
Tool Benchmark Sets
- ISCAS 85, ISCAS 89, MCNC workshops, etc.
- ISPD98 Circuit Partitioning Benchmarks
- ITC Benchmarks
- Texas Formal Verification Benchmarks
- NCSU Collaborative Benchmarking Lab
“Large Design Examples”
- CMU DSP Vertical Benchmark project
- The Manchester STEED Project
- The Hamburg VHDL Archive
- Wolfgang Mueller's VHDL collection
- Sun Microsystems Community Source program
- OpenCores.org
- Free Model Foundry
- …
Summary

- There are a lot of different activities that we loosely call "benchmarking"
- At the tool level, we don't do a very good job
- Better methods are emerging, but:
  - Good experimental design is a LOT of work
  - You have to deeply understand the properties that are important and design the experimental data
- Most of the design examples out there are not of much use for tool benchmarking
To Find Out More...
Advanced Benchmark Web Site
http://www.eda.org/benchmrk
Nope… There’s no “a” in there
Talk to Steve “8.3” Grout