The Conceptual-Level Design Approach to Complex Systems
by
David B. Lidsky
B.S. (University of Massachusetts) 1989
M.S. (University of California, Berkeley) 1994
A dissertation submitted in partial satisfaction of the
requirements for the degree of
Doctor of Philosophy
in
Engineering - Electrical Engineering and Computer Sciences
in the
GRADUATE DIVISION
of the
UNIVERSITY OF CALIFORNIA, BERKELEY
Committee in charge:
Professor Jan M. Rabaey, Chair
Professor A. Richard Newton
Professor Paul Wright
Fall 1998
The dissertation of David B. Lidsky is approved:
Chair Date
Date
Date
University of California, Berkeley
Fall 1998
Abstract

The Conceptual-Level Design Approach to Complex Systems

by David B. Lidsky

Doctor of Philosophy in Engineering - Electrical Engineering and Computer Sciences

University of California, Berkeley

Professor Jan M. Rabaey, Chair

The complexity of electronic circuits has grown dramatically over the past several decades. Circuits with more than 1 million gates, combined with gigabit memories, clocked at 1000s of MHz are emerging. In addition, heterogeneity has become the norm, with analog, digital, and mixed-signal circuitry commonly included in the same design. Circuit design is further complicated by a number of new trends. For example, migration toward sub-micron line widths necessitates that aspects of the design (e.g., subthreshold noise, inductive effects) be considered much earlier in the design process than previously necessary.

System complexity is rapidly outgrowing current CAD tools' ability to aid in design. An important aspect of the design process is to be able to evaluate and modify designs during the conception of a system. Furthermore, throughout the design process, it is necessary to be able to abstract key properties in order to analyze large, highly heterogeneous systems. The work in this thesis addresses the evaluation of increasingly complex systems both 1) at the crucial early stages of the design process when code and netlists have yet to be specified and 2) throughout the design process when design
specifications and tools are distributed amongst people, computer systems and physical locations.

This thesis focuses on methodologies and tools for complex-system design. In particular, it formalizes the use of a higher level of abstraction, the conceptual level, in the context of electronic systems. A prototype design system focused on conceptual-level design, PowerPlay, was created to demonstrate the requirements of a CAD tool for complex systems. The most important features of PowerPlay are the use of object-oriented macromodels and the creation and use of hyperlinked spreadsheets. Addressing the increasing size of design teams, PowerPlay was created to leverage the capabilities of the WWW. Hence a number of techniques for networked design are presented. The work concludes with a number of systems created with PowerPlay as part of the design flow and a roadmap for future WWW-based, complex-system CAD environments.

Jan M. Rabaey, Chairman of Committee
Acknowledgments

I dedicate this thesis to all the girls I've loved before.

Okay not really, but I always wanted to say that.

I always thought acknowledgments in a thesis were overblown and overdone, but now that I'm writing one, I realize how much work it is to put together such a monstrosity. The research was (mostly) fun, but man, writing one of these things is a pain in the ***. I'm glad that part is over. I have just written hundreds of complete sentences so I'm going to free myself to write in fragments and mispel and, if I feel like it, end my sentences with prepositions. i before e be damned!

Cast in order of appearance:

What order to put people's names in? Do people get offended when their names are stuck floundering in the middle? Do I begin with my parents and end with my advisor, or is it the other way around? Trying not to offend, I shall present in a somewhat chronological order.

Dad Lidsky may be the smartest and most capable man I know. I believe I owe most of my logical thinking process to him.

Mom Lidsky - If my dad taught me how to think, my mom taught me all I needed to know about humanity.

Loren & Jane - I'm not sure if I can point at anything relevant to the thesis I can totally thank them for. But I love them so that's enough.

I will break my "order of appearance rule" already to talk about:

Jill and Mark - Did an amazing job of reminding me about what is important without even trying.

Archie Shepp and Miles Davis - taught me how creativity and intensity come together to create whatever needs to be created.
Mary Shaughnessy taught me an amazing amount of things. Most relevant to my work here - courage and strength. She really did teach me I could do whatever I truly set out to do. Drink more than you can!

Jan Rabaey - The only advisor that I wanted. At a time when there was no room in his research group, he took mercy on me and brought me in. He obviously had a profound impact on this work but more importantly, provided guidance as to how to do research and be a person at the same time.

Tony "Two transistors equals a career" Stratakos - I could thank him for being my best friend throughout my scholastic career and for a whole bunch of other things, but that would be boring. I'll just pass along some sentence fragments for him: lab at night = draggin' arm. Do I use a crowbar, a wrench or a ball-peen hammer for this circuit? You will never beat me in table tennis. I will never have more back hair than you. Is that enough?

The 550 clan - Fearing that I would be alone amongst number crunching, grade grubbing, culturally blind people when I re-entered the world of UCB, I was surprised that I was wrong on all counts. A multi-cultured mix of energetic and interesting people co-existed as denizens of the top of 550. Everyone helped everyone in 550, which made it an exciting and fun place to spend manymanymany late nights.

I shall just list the names that come to my head. That they all turned out to be so intellectually helpful goes w/out saying. In random order: Keith Onodera, Andy Abo, Srenik Mehta, Jeff Weldon, Sam Sheng, Marlene Wan, Lisa Guerra, Jeff Gilbert, Dennis Yee, John Davis, Tom Truman.

My cube mates who have provided endless excellent conversation: Arthur Abnous, Roy Sutton, Scarlett Wu, My Lee. The list continues: Tom Burd, Arnold Feldman, Erik Boskin, Jennie Chen, Renu Mehra, Anna Isson, Kevin Stone, Sekhar Narayanaswami, Adrian Isles.

Anantha Chandrakasan stole all of my work. Which is okay 'cause I will get even! It's even more okay because you spent a lot of time teaching me how to think
like an engineer. Thanks for all your early help and the football in 550. Man are you hard to tackle!

Lisa Guerra - An awesome person and great engineer. Had impact on a great deal of the work that I did here. Perhaps the only person who was consistently up later than me. Three people I can call late at night and not offend: My mother, my father, and Lisa. That alone is worth the price of admission.

Craig Teuscher taught me that people that come from Utah really can be cool!

Bob Brodersen - People already know he is smart and can raise money and all that. I liked looking at the way he looked at a problem and also the way he talked. Along with Jan he provided a relaxed yet inspirational research environment. I also appreciate both his and Jan's choice in beer.

And for Chris Rudell, a die hard meat eater - not that there is anything wrong with that. I tip my glass and eat a prime rib.

Another good friend emerged: Andy Burstein. You gotta love the guy even when he is tearing you up with his wit. Along with my father, Bob Meyer and Jan, Andy is among the great teachers I have had.

Certain people you meet and you just know they will be somehow connected to you the rest of your existence. Jeff Weldon is one of those people.

Eric Boskin: Finally a grad student older than me! Provided much insight into life issues. Great Rball too.

Ole Bentz had a great wit and was greatly influential in this work. NO INFOPAD!

Marlene Wan - you provided much help and substantiation later in the work. You also brought many smiles and much food to 550.

Richard Newton and Paul Wright, who not only graciously signed this pile of words, gave good perspective and insight. They also provided further proof that professors can be people. Paul Gray and Randy Katz, along with Paul, got me through Quals and provided helpful insight.
Kevin Z, Heather, Ruth, Elise, Tom B. and Carol got me through all of the administrative bs that I could never figure out and provided good conversation and perspective to boot.

Cornelia -next ta- Sylvester had the misfortune of getting to know me when I was working and writing this thesis. How she put up with it I do not know. She provided me with the strength, sanity and patience to keep on writing. I was going to thank her for reading this entire thesis, but then even she didn't! She read and edited a great deal though.

And my last words are: Thank you Eleta for never being quick enough to catch Satchmo. Yes, that was my dog in the building every night and weekend and summer day. He always knew the back stairway would be a safe point of entry.
Until next time....
Table Of Contents

Chapter 1: Introduction ..........1
1.1 Motivation ..........1
1.2 Research Contributions ..........2
1.3 Thesis Organization ..........4
Chapter 2: The Conceptual Level ..........7
2.1 The Conceptual Level Defined ..........8
2.2 Design Gains at Different Levels of Abstraction ..........12
2.3 Arguments For Integrated Design Flows ..........13
2.4 Related Work in Conceptual Level Design and CAD ..........15
  2.4.1 Equation Based Tools ..........15
    2.4.1.1 Spreadsheets ..........15
    2.4.1.2 Analytic Tools ..........16
    2.4.1.3 Design Sheet[7] ..........16
  2.4.2 Graphical Environments ..........18
    2.4.2.1 Ptolemy [24] ..........18
  2.4.3 Non-electrical Based Domains ..........19
2.5 Conceptual Design Methodology ..........20
2.6 Challenges at the Conceptual Level ..........23
  2.6.1 Problems With Current Methods ..........23
  2.6.2 Challenges of complex system design at the conceptual-level ..........24
2.7 Roles of a Conceptual-Level Environment ..........25
2.8 The Conceptual-Level Design Environment ..........26
  2.8.1 Macromodeling (Chapter 3) ..........27
  2.8.2 Hyperlinked Spreadsheet With Macromodels (Section 4.1) ..........29
  2.8.3 Macromodel Creation (Chapter 3) ..........33
  2.8.4 Distributed Infrastructure (Section 4.2) ..........34
  2.8.5 Generic Block Entry ..........35
  2.8.6 Documentation and Design Management ..........36
2.9 Scope of Thesis ..........37
Chapter 3: Macromodels ..........38
3.1 Macromodeling Overview ..........38
  3.1.1 Definition Revisited ..........39
  3.1.2 Previous work in macromodeling ..........40
    3.1.2.1 Roles of conceptual-level macromodels ..........42
  3.1.3 Macromodel trade-offs ..........43
  3.1.4 Types of Macromodels ..........45
    3.1.4.1 Primitive model overview ..........45
    3.1.4.2 Composite model overview ..........46
3.2 Primitive models ..........49
  3.2.1 Analytic Expressions ..........51
    3.2.1.1 Example analytical macromodels ..........53
  3.2.2 Tables ..........56
    3.2.2.1 Example Table Macromodels ..........57
  3.2.3 Dynamic Models ..........60
  3.2.4 Example trade-offs in primitive selection ..........61
3.3 Composite Models ..........68
  3.3.1 Partitioning ..........69
  3.3.2 Model Assignment ..........70
  3.3.3 Composition ..........71
3.4 Summary ..........80
Chapter 4: Technical Enablers of Conceptual-Level Design Environments ..........81
4.1 Hyperlinked Spreadsheets ..........82
4.2 System design on the World Wide Web (WWW) ..........85
  4.2.1 The Early History of the WWW ..........86
  4.2.2 Basic WWW architecture and protocols ..........87
  4.2.3 Related work in WWW-based CAD design ..........91
  4.2.4 Customizing for each user ..........92
  4.2.5 Generic HTML page customization ..........92
  4.2.6 Remote tool calls ..........94
4.3 Object-Oriented modeling and design ..........96
  4.3.1 Object-oriented overview ..........96
  4.3.2 Object-oriented macromodeling ..........98
4.4 Summary ..........99
Chapter 5: PowerPlay: A Conceptual-Level Design Environment ..........100
5.1 User interface ..........101
  5.1.1 Overview ..........101
  5.1.2 The Philosophy of the conceptual-level environment ..........101
  5.1.3 Essential Elements ..........102
  5.1.4 Primitives ..........103
  5.1.5 Composition: Hyperlinked Spreadsheets ..........106
  5.1.6 Composite model creation ..........109
    5.1.6.1 Enumeration ..........110
    5.1.6.2 Schematic entry ..........110
    5.1.6.3 Textual entry ..........112
  5.1.7 Design and model organization ..........113
    5.1.7.1 Customizable centralized server ..........113
    5.1.7.2 Multi-user model database ..........115
  5.1.8 Associative Model Assignment ..........117
  5.1.9 Sensitivity Analyses and Graphing ..........118
  5.1.10 Documentation Summary ..........120
  5.1.11 Security ..........121
5.2 PowerPlay's System Architecture ..........122
  5.2.1 Code overview ..........122
  5.2.2 File Structure ..........126
  5.2.3 CAD design on the WWW ..........130
  5.2.4 Utilities ..........130
  5.2.5 Composite model evaluation ..........131
  5.2.6 Remote tool calls/dynamic models ..........132
  5.2.7 User customization ..........133
    5.2.7.1 Basics ..........133
    5.2.7.2 Code ..........134
5.3 Summary ..........136
Chapter 6: Design at the Conceptual Level ..........137
6.1 Design Examples ..........137
  6.1.1 Trade-offs in algorithm choice ..........138
  6.1.2 Estimations involving various interacting cost functions ..........143
  6.1.3 Primitive characterization and inclusion ..........149
  6.1.4 Dynamic models and other tools leveraging off the WWW ..........150
  6.1.5 Textual Parsing ..........154
  6.1.6 Design Management ..........154
Chapter 7: Summary and Directions of Future Research ..........157
7.1 Summary ..........158
7.2 Road map for future design environments ..........159
  7.2.1 Integration ..........159
  7.2.2 Design entry and result visualization ..........160
  7.2.3 Documentation ..........161
  7.2.4 The simulation and verification paradox ..........161
Appendix A: PowerPlay: Tutorial and FAQ ..........163
A.1 FAQ ..........163
  A.1.1 What is PowerPlay? ..........164
  A.1.2 Where do I start/ What is the "personal design center"? ..........165
  A.1.3 What kind of Models can be created? ..........165
  A.1.4 How does one create new models? ..........166
  A.1.5 How do I add to my spreadsheet? ..........166
A.2 Tutorial: The Personal Design Center ..........168
  A.2.1 Your Personal Design Center ..........168
  A.2.2 Customization ..........169
  A.2.3 To do ..........171
A.3 Tutorial 2 ..........171
  A.3.1 Viewing a model ..........172
  A.3.2 Creating a primitive model ..........173
  A.3.3 Edit models ..........175
A.4 Tutorial 3: Advanced Primitive Modeling ..........175
  A.4.1 Tool Encapsulation ..........175
A.5 Tutorial 4: The Composite Model ..........178
  A.5.1 Basic functions ..........178
  A.5.2 Step by step breakdown of composite model functionality ..........179
Appendix B: Figures and Code Useful For Downloading ..........184
B.1 Figures and code for Section A.4 ..........184
  B.1.1 Code for performing transistor estimation ..........188
  B.1.2 Matlab/Table look up code ..........191
  B.1.3 Example table lookup data ..........197
  B.1.4 Example Mathematica deck ..........197
  B.1.5 Example MODELS.moddat file ..........198
List of Figures

Figure 2.1 A switching voltage regulator at the conceptual level ..........9
Figure 2.2 Conceptual-level sketch of a digitally controlled voltage regulator ..........12
Figure 2.3 Design Sheet's constraint network and top-level interface[7] ..........17
Figure 2.4 DUCADE: A mechanical and electrical view of the same system[23] ..........19
Figure 2.5 Spreadsheet analysis of power dissipation of Example 2 ..........32
Figure 2.6 Java block editor with parameter entry form[46] ..........36
Figure 3.1 Macromodels are interfaced in the same manner regardless of evaluation methods ..........40
Figure 3.2 Types of macromodels ..........45
Figure 3.3 Luminance decompression - three memories and a register ..........47
Figure 3.4 Each line is evaluated with a macromodel ..........48
Figure 3.5 Maxim data sheet for ADCs ..........59
Figure 3.6 Bit-wise characterization of switching activity ..........62
Figure 3.7 Folded cascode op-amp ..........64
Figure 3.8 SPICE model of folded cascode ..........66
Figure 3.9 Example error analyses ..........73
Figure 3.10 InfoPad: multimedia wireless terminal ..........75
Figure 3.11 Composition of wireless multimedia system: InfoPad ..........77
Figure 3.12 Conceptual view of the InfoPad terminal ..........78
Figure 3.13 InfoPad's radio sub-system ..........79
Figure 4.1 Block diagram of hyperlinked spreadsheet ..........83
Figure 4.2 Sente's hyperlinked spreadsheet derived from PowerPlay ..........85
Figure 4.3 World Wide Web transactions ..........89
Figure 4.4 Silva's SMTP based system[31] ..........95
Figure 5.1 Primitive macromodel ..........104
Figure 5.2 PowerPlay's hyperlinked spreadsheet interface ..........108
Figure 5.3 WebTop's Java based schematic entry interface ..........111
Figure 5.4 Centralized page for model access and commands: The Design Center (continued on Figure 5.5) ..........114
Figure 5.5 Design center (continued from Figure 5.4) ..........115
Figure 5.6 Hierarchical breakdown of composite showing model ownership ..........116
Figure 5.7 Using associative model assignment for HW/SW co-design ..........118
Figure 5.8 Hyperlinked pie chart used for bottleneck analyses ..........119
Figure 5.9 Sensitivity analysis of voltage regulator system ..........120
Figure 5.10 Simple form used for creating sensitivity plots ..........120
Figure 5.11 Functional breakdown of code ..........123
Figure 5.12 Breakdown of PowerPlay's directories. Most subdirectories are not expanded. All users have same directory structure ..........127
Figure 5.13 Example of composite storage: Luminance decompression ..........129
Figure 5.14 Dynamic model access ..........132
Figure 5.15 Functionality of encapsulation tools ..........133
Figure 6.1 Block diagram of a luminance decompression chip ..........140
Figure 6.2 PowerPlay's spreadsheet power analysis ..........141
Figure 6.3 Alternate implementation of the decompression chip ..........142
Figure 6.4 Switching voltage regulator power train ..........144
Figure 6.5 Simplified PFM operation ..........145
Figure 6.6 Voltage regulation composite model ..........146
Figure 6.7 Ripple versus capacitor size ..........147
Figure 6.8 Media Processor described using WebTop ..........151
Figure 6.9 Media processor composite model ..........152
Figure 6.10 Dynamic model utilizing table look-up ..........153
Figure 6.11 InfoPad: multimedia wireless terminal ..........155
Figure 6.12 Composite model of a portable multimedia system ..........156
Chapter 1

Introduction

1.1 Motivation

The complexity of electronic circuits has grown dramatically over the past several decades. Circuits with more than 1 million gates, combined with gigabit memories, clocked at 1000s of MHz are emerging. In addition, heterogeneity has become the norm, with analog, digital, and mixed-signal circuitry commonly included in the same design. Circuit design is further complicated by a number of new trends. For example, migration toward sub-micron line widths necessitates that aspects of the design (e.g., subthreshold noise, inductive effects) be considered much earlier in the design process than previously necessary.

To support these trends, system-level design aids will be critical in enabling comprehensive and timely exploration of system design alternatives. Current tools do not target the most critical early design stages. Additionally, little or no work has been done, even at lower abstraction levels, to support the analysis of the breadth of design styles found in a typical system.
The support of many designers working in disparate locations on different tools and systems is a further complication. To address this problem, a uniform, platform-independent medium is needed for repeated evaluation of a system's performance throughout the design process, from system conception to completion.

The chief problem that this research addresses is summarized in the following statement:

Problem Statement: System complexity is rapidly outgrowing current CAD tools' ability to aid in design. An important aspect of the design process is to be able to evaluate and modify designs during the conception of a system. Furthermore, throughout the design process, it is necessary to be able to abstract key properties in order to analyze large, highly heterogeneous systems. The work in this thesis addresses the evaluation of increasingly complex systems both 1) at the crucial early stages of the design process when code and netlists have yet to be specified and 2) throughout the design process when design specifications and tools are distributed amongst people, computer systems and physical locations.

1.2 Research Contributions

This research seeks to address issues related to the design of complex, heterogeneous systems. In particular, the qualities of an adaptable exploration methodology and design environment are presented.
Key contributions are summarized herein:

Conceptual-level exploration: Previous research has addressed methodologies and tools at various levels of abstraction (e.g., behavioral, structural, physical). This thesis formalizes the use of a higher level of abstraction, the conceptual level, in the context of electronic systems. Key aspects of this approach include:

• A system and methodology to attain as accurate as possible estimations, as quickly as possible, without requiring a compilable description of a design. It is now well understood that often the greatest design gains are attainable at the earliest levels of abstraction. A design which is described simply with flowcharts, simplistic code, or simple lists can be used to produce quick evaluations of design approaches.

• A macromodel-based estimation environment. Macromodels are used to describe and evaluate subcomponents in order to selectively (and dynamically) abstract a system or subsystem to its key aspects. User-defined libraries of macromodels enable the use of the proper model (be it an estimation tool or analytical expression) that is required to explore a specific issue. The use of macromodels is also key for facilitating hierarchical estimations and design reuse.

As will be shown in this thesis, the use of a macromodel approach, combined with abstracting key properties to the conceptual level, not only enables effective analyses of complex systems both early, and throughout, the design process, but provides a back-plane for a highly adaptable environment.
Network ed CAD Systems: Both the size of design teams, and the change in wo
habits require that CAD tools be available across networks. In addition, the crea
nature of conceptual-level design requires a comfortable and flexible environment.
World Wide Web’s (WWW) facility at information transfer, platform independence, a
well as its wide user base, makes it a clear choice for a system design platform.
As demonstrated in this thesis, the use of the WWW is similar to a clie
server configuration with a number of important differentiators. Included among th
is the WWW's lack of dedicated connections, limited graphical interaction, and the fairly primitive protocols for data transfer and display.

The CAD tool presented herein is the first design tool developed specifically to leverage the powers, and overcome the challenges, of the WWW. This thesis gives both an overview of, and details for, techniques in designing CAD tools for inter/intra-nets.

Many aspects of the proposed conceptual-level methodology have been encapsulated in a WWW-based design tool, PowerPlay. PowerPlay's availability on the WWW has enabled designers to provide continuous feedback, suggestions and design benchmarks. This feedback has, in turn, enabled PowerPlay to evolve into not just a demonstration of key concepts, but a tool actively used by a number of designers.

PowerPlay may be found at http://InfoPad.EECS.Berkeley.edu/PowerPlay.
1.3 Thesis Organization

In a general sense, the thesis can be broken into two sections. The first section, Chapters 1 and 2, introduces the conceptual-level design approach for complex systems. Motivations are laid out and previous work is described in detail. Important subcomponents of a conceptual-level environment are then presented. The second section of the thesis, Chapters 3-6, can be viewed as a "proof by implementation" of the conceptual-level approach, addressing the two most challenging aspects of a conceptual-level environment: high-level design evaluation, and implementation of an intricate CAD tool on a WWW environment. A more detailed outline of the thesis is presented below:
1.3 Thesis Organization 5
d on
tools
ects of
is
work
of
odel
of
this
ect-
logies
d in
eful
Play
on to
es of
n
ples
ering
Chapter 2 -The Conceptual Level: This chapter lays the ground work for the
rest of the thesis. Motivation for, and challenges of, designing a system focuse
higher abstraction levels is first presented. Other high level, and complex system,
and environments are described and compared. From this basis, the essential asp
conceptual-level methodologies and environments are introduced.
Chapter 3 - Macromodeling: Perhaps the most important new enabler
macromodels designed for use at the conceptual level. There has been significant
in modeling, the most relevant of which is described. Two super-classes
macromodels, the primitive and the composite, are introduced. Specifics for m
choice, in terms of design accuracy and speed, are given special attention.
Chapter 4 - Enablers of Conceptual-Level Design Environments:This
section outlines the three most important technical enablers for this type
environment. The first, hyperlinked spreadsheets, is a new technology introduced in
thesis. The final two are the emerging technologies of the World Wide Web and obj
oriented design. In this chapter, the elements and challenges of using these techno
will be highlighted.
Chapter 5 - PowerPlay, A Conceptual-Level Design Environment:
Implementation details of the conceptual-level design tool PowerPlay are presente
this chapter. Challenges of design on the WWW is described and a number of us
techniques introduced. This chapter provides a basic breakdown of the Power
implementation with an emphasis on the aspects which are believed to be comm
future environments. Tutorials and details on the code are presented in appendic
this chapter.
Chapter 6 -Design at the Conceptual-Level:PowerPlay has been available o
the WWW for a number of years. Therefore a number of designs and useful exam
have been developed. This chapter provides a series of examples focused on diff
1.3 Thesis Organization 6
s are
ls or
us on
tep
laced
s for
techniques available at the conceptual level. Throughout the chapter benchmark
presented comparing conceptual-level estimations versus either lower level too
measurement of fabricated systems.
Chapter 7 - Conclusions and Future Work: As systems will continue to
become more and more complex, and technical challenges continue to change, foc
higher levels of abstraction will continue to be essential. This work is just a small s
towards the development of tools which can answer the increasing demands p
upon system designers. The thesis concludes with suggestions for future direction
complex system design environments.
Chapter 2

The Conceptual Level

Traditionally, design is considered to occur at three abstractions: behavioral, where there exists a functional description; structural, where there is a compositional description of the blocks of the design; and physical, where there is an implementation description. There are considerable tools to help at each of these levels. The stage before there is a solid description, or full understanding, of a design has been virtually ignored.

The very first stage of design exploration typically involves a combination of incomplete or partially-defined application specifications, rough block diagram sketches of hardware, and algorithms conceived but not yet formally described in a compilable behavioral language. In current-day design practice, this level of design abstraction, henceforth called the conceptual level, is addressed in an ad-hoc fashion, without much support from tools and without utilizing existing data and models.

In this chapter a fourth abstraction, the conceptual, is defined and a methodology of design at this level is delineated. The following section provides a formal definition
for the conceptual level and uses a simple example to illustrate its application toward electronic systems. This is followed by an analysis of the potential gains of working at this high abstraction level and a study of applicable previous work. The remaining sections introduce the fundamental aspects of the proposed conceptual-level environments, which will be detailed in the remaining chapters of the thesis. In particular, conceptual-level methodologies and the important building blocks of a supporting CAD system will be outlined. Both of these provide the back-bone for designing at the conceptual level -- the primary focus of this work.

2.1 The Conceptual Level Defined

Definition: The conceptual level is an abstraction in which a system is incompletely specified. A conceptual-level description shows the essence of the functionality of a design object through parameters, global functionality, constraints and costs.

Though early in the design cycle, fundamental and important decisions are made at this level, often using a minimum amount of information. The conceptual stage of the design process is before the behavior, architecture and/or the technology has been concretely defined.

At this point in the design cycle, design ideas are captured in a number of different ways. The behavior can be described with a combination of equations, flowgraphs and/or pseudo-code. Structure is often captured with block diagrams and/or ordered lists of elements. Potential implementation details are reflected to higher levels only for elements and/or costs which are considered most critical. As the design is being formulated, choices about behavior, structure and implementation are all simultaneously being considered.

The conceptual level is also useful for heterogeneous systems throughout the design process. A complex system often requires different tools for different
subcomponents. With rare exceptions, there is no means to interface between these tools. Therefore, as designs get further defined, the only way to examine a full system is by abstracting away details. In a sense this brings the design back to the conceptual level to enable analysis.

The following example illustrates conceptual-level descriptions and design choices.
Example 1: Switching voltage regulator at the conceptual level
This simplified example of a switching voltage regulator design
illustrates a design described at the conceptual level. In addition to
providing examples of conceptual-level descriptions and decisions, it
demonstrates how even a somewhat small design can be a quite complex
system.
Conceptual-level Description: Figure 2.1 shows a basic block diagram
that a designer would sketch early in the design process. The purpose of
the circuitry is to maintain Vout within a certain error margin of a desired
voltage, Vref, while minimizing power dissipation. The sketch describes
the function of the system as a composition of modules with an
understood functionality (such as, for instance, a low-pass filter).
Figure 2.1 A switching voltage regulator at the conceptual level (blocks: Reference Generation, Σ computing Vref - Vout, Control Circuitry, Power Transistors, Low Pass Filter, output Vout)
Additional information a designer would define is the range of error
allowable on the output voltage.
The key attribute of this description that makes it conceptual-level is that
the detail given is not substantial enough to enable the use of
conventional estimators or simulators. As will be seen, even if very
specific details of blocks are understood, the design as a whole may still
benefit from abstraction to a conceptual description.
Though a relatively small system, each of the components is quite
complex, and the entire system itself is heterogeneous. The reference
generation requires analog design skills, while the control circuitry
requires knowledge of controls and can be created using a number of
different architectures. The power transistors and the filter also require
unique skills and can be implemented either with discrete components or
integrated onto a single integrated circuit.
The basic sketch may often be combined with a simple functional
description. In this example, the circuit functions as follows: The control
circuitry creates a square wave which is passed to the power transistors. A
low-pass filter presents the average voltage to the output. To keep Vout as
close to Vref as possible, the difference of the two signals is used to
control the pulse width of the square wave. If the output voltage is too
low, for example, the pulse width is increased, and the resulting Vout
increases toward Vref. Similarly, if the output voltage is too great, the
pulse width is decreased, forcing Vout to decrease toward Vref.
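The feedback behavior described above can be illustrated with a small discrete-time simulation. This is only a caricature under assumed values: the input voltage, loop gain, and filter constant below are invented for demonstration and do not come from the thesis.

```python
# Illustrative sketch of the pulse-width feedback loop described above.
# All component values and gains are assumed for demonstration only.

def simulate_regulator(v_ref=3.3, v_in=5.0, gain=0.05, alpha=0.02, steps=2000):
    """Discrete-time caricature of the switching regulator loop.

    duty  -- pulse width (duty cycle) of the square wave, 0..1
    v_out -- output after the low-pass filter, modeled as first-order
             smoothing of the average switched voltage
    The control law widens the pulse when v_out is low and narrows it
    when v_out is high, as in the functional description above.
    """
    duty, v_out = 0.5, 0.0
    for _ in range(steps):
        duty += gain * (v_ref - v_out)        # error (Vref - Vout) sets pulse width
        duty = min(max(duty, 0.0), 1.0)       # physical duty-cycle limits
        v_avg = duty * v_in                   # average value of the square wave
        v_out += alpha * (v_avg - v_out)      # low-pass filter action
    return v_out

print(round(simulate_regulator(), 2))         # converges to v_ref: 3.3
```

Running the loop shows the damped oscillation of Vout settling onto Vref, which is the qualitative behavior the sketch-level description promises.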
Conceptual-level design decisions: At this point in the design cycle
most of the circuitry has not been fully specified, let alone designed.
Significant information, such as the frequency of the square wave, is yet
to be determined. Therefore current-day estimators and simulators are not
applicable. There are, however, a great many design decisions that must
be made at this level which have significant impact on the circuit
performance. Examples of global decisions are the frequency of the
square wave and the fabrication technology. These, along with
constraints on power dissipation, output accuracy, and area, need to be
considered when choosing and designing the architecture of each
component. These types of decisions must be made before conventional
tools can be utilized.
As the design progresses, and architectures and technologies are being
specified, abstraction to the conceptual level is also powerful. As sub-
block architectures are being specified, their sub-blocks are also
abstracted.
Figure 2.2 shows a conceptual-level sketch of the voltage regulator after
certain components are selected. In this architecture, the reference signal
is created using analog circuitry. The error signal (Vout - Vref) is converted
to a digital word. The creation of the pulse width is then performed with
digital circuitry. A number of high-level choices need to be made, such as
the resolution of the Analog-to-Digital Converter (ADC) and the sizing
of the power transistors and discrete components. As will be shown in this
thesis, the conceptual-level approach will enable an effective means to
evaluate a system with the complex heterogeneity of the circuitry of
Figure 2.2.

Other conceptual-level decisions at this point include comparisons of this
architecture with other possible implementations. For example, the ADC
and digital control circuitry may be completely replaced with either
continuous-time analog circuitry or a switched-capacitor implementation.
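One such high-level choice, ADC resolution, follows directly from the allowed output error: the converter's quantization step must be finer than the error margin. The voltage range and error values below are assumed examples, not figures from the thesis.

```python
import math

# Hedged illustration: choosing ADC resolution from the allowed output
# error margin. The 5 V range and 10 mV error are assumed example values.

def adc_bits(v_range, v_error):
    """Smallest bit-width whose quantization step (v_range / 2^bits)
    is no larger than the allowed error."""
    return math.ceil(math.log2(v_range / v_error))

print(adc_bits(5.0, 0.01))   # 5 V range, 10 mV allowed error -> 9 bits
```

A back-of-the-envelope computation of this kind is exactly the sort of decision made at the conceptual level, long before the converter itself is designed.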
2.2 Design Gains at Different Levels of Abstraction

At the behavioral level, the algorithm has been clearly described in a
procedural language (e.g., C, VHDL), a data-flow language (e.g., Silage) or a graphical
program (e.g., Ptolemy). In the analog domain, an analogy to an algorithmic description
is descriptions of transfer functions and important cost functions (e.g., matching
impedances, noise figures). At the structural level, block diagrams of functional
elements and their interconnect are described, programmable architectures are chosen,
and analog architectures (e.g., amplifiers, capacitor ratios, etc.) are specified. At the
implementation level, final details are specified. Layout of blocks and interconnections
on the chip and board level are finalized.

As the design gets finalized, the degrees of freedom, and hence possible gains,
get smaller. At the conceptual level, modification of all lower levels is possible, and
hence the greatest potential gains are possible. This is reflected by Table 1, which shows
Figure 2.2 Conceptual-level sketch of a digitally controlled voltage regulator (blocks: analog Reference Generation, Σ computing Vref - Vout, ADC, Digital Control Circuitry, Power Transistors, Low Pass Filter, output Vout)
expected gains versus the level of abstraction, compiled and extrapolated from
[1][2][54] in the case of power reduction. This trend of higher gains available at higher
abstraction levels is consistent across cost metrics.

2.3 Arguments For Integrated Design Flows

In any one discipline, there are many tools utilized as a design evolves from
conception to completion. The majority of tools are in a sense islands of protocols of
both data storage and functionality. A major impediment to a smooth design flow, and
hence faster, more effective design, is the overhead of converting design information as
one switches to an optimal tool for a specific design task. In addition, this same
impediment will often force a design to be processed with a sub-optimal tool, producing
potentially sub-optimal solutions.

In the design of complex systems, this problem is further complicated by the
many disciplines (e.g., electrical-analog, electrical-digital, mechanical) that are all
integral, and interacting, components of a system. Currently, with few point-to-point
exceptions, there exists no significant means to interface design information across
disciplines.
Table 1 Expected Gains versus Level of Abstraction[1]
Level of abstraction Expected power improvements
Conceptual >100 times
Behavioral 10 -100 times
Structural 10 - 90%
Implementation 15%-20%
Device 20%[2]
At any point in a design cycle, sub-components of a design may be
simultaneously modified by multiple designers, at many different abstraction levels and
disciplines. A meaningful means to integrate (as well as document and track) an
evolving design of this nature poses a number of challenges.

There are two means by which disparate tools can be integrated. The first is to
establish a means, either through a host of translators/proxies or encapsulation, for
tools to interact. The second is to abstract key points away so that, for certain issues,
information from these disparate tools can be exchanged. Both of these methods have
their merits. In fact, the two methodologies are not mutually exclusive and often serve
different purposes.

Proxies: The development of proxies to serve as information servers for
designers is becoming a necessity and is being addressed in a number of different
forums. The works of [43][45] are good first steps in this direction. The use of proxies
will enable the transfer of design information from tool to tool, transparent to
designers, saving design time and allowing the appropriate tool to be focused toward a
design problem.

Abstraction: A methodology based on the conceptual level of abstraction
enables a number of complementary tasks to aid the integration of a design flow. Firstly,
and perhaps most important, it provides a means to integrate estimations from many
tools into a single environment. Secondly, supporting high levels of abstraction enables
a flexible system where, during the early stages of design, there are no restrictions on how
a design is described. There are trade-offs, however, in that there are no checks on
functionality, and there are time penalties of later translation into compilable
descriptions. Until (if ever) there is a single means to describe a system which can be
used by any tool, this type of abstraction will be necessary.
2.4 Related Work in Conceptual Level Design and CAD

Though there is innumerable work at later stages of the design process,
relatively little work focuses on the earliest stages of design. A large part of
conceptual design requires abstraction of knowledge from these lower levels (e.g.,
structural, technological); useful work from these levels is an integral part of the
proposed methodology and will be referred to throughout the thesis when applicable.

A limited number of tools do address (implicitly or explicitly) some of the
principles of conceptual-level design. This section outlines some of the most relevant of
these works to the environment proposed in Chapters 3 and 4.

2.4.1 Equation Based Tools

Perhaps the most standard early design method is to make sketches and simple
equations. Hence the first conceptual design tool may be the calculator. For more
complex systems, equations can still be used, but a more rigorous approach and tools
are required. As will be discussed in Chapter 3, equation-based tools will remain useful
at the high level. They provide significant freedom for design choices. However, their
scope is limited to areas where estimates can be performed using equations. The
integration of this type of tool into an environment is one aspect of the proposed
environment.

2.4.1.1 Spreadsheets

As will be shown, spreadsheets are an essential tool at the early stages of
design. Hence early tools, such as VisiCalc and Lotus 1-2-3 [4], can be considered the
first conceptual-level applicable tools. An intuitive interface is a necessity at the
conceptual level; hence Microsoft's Excel spreadsheet [5], which has become a recent
standard, is perhaps the most user friendly. Inclusion of OLE [6], which allows tools
to be embedded inside other tools in the Windows environment, may become a highly
useful addition to this system.

2.4.1.2 Analytic Tools

Tools that aid in the solving of simultaneous equations are useful at the early
stage of design. Matlab and Mathematica are perhaps the most common tools in this
area. Matlab's use of the "toolbox" allows customization for specific types of design. In
effect this provides a limited form of design re-use. Matlab and Mathematica, though
useful in the early stages, have most of their advantages in the simulation of a system
and do not provide aid in evaluating the costs of design choices.

2.4.1.3 Design Sheet [7]

Perhaps the first system focused toward electronic design to target the
conceptual level explicitly is Rockwell Corporation's Design Sheet. It is an
environment focused on solving a series of constraint-driven equations:

"Design Sheet is a conceptual design system that integrates constraint
management techniques, symbolic mathematics, and robust equation
solving capabilities with a flexible environment for developing models
and specifying trade-off studies. It encourages designers to explore a
wider range of the design space than might otherwise be feasible."[7]

This work accurately identifies the importance of the early stage of the design
process and also properly focuses on trade-off analyses. Though focusing solely on
equation-based estimation, it also provides a means of model creation supporting design
re-use.

Figure 2.3 illustrates the interface in Design Sheet for describing
constraints and interfacing with models. As can be seen, this system provides a means
for describing and analyzing very complex systems of equations. Therefore each design
element requires an equation-based description of constraints, described in the
same time domain, so that they may interact in a meaningful manner. This description
method enables rigorous analysis early in the design cycle; however, such a system is
not applicable to designs where these constraint-driven equations do not exist or are
hard to create.

For a complex mathematical system this approach can be very useful, but for a
more heterogeneous structure this requirement may be prohibitive. As Design Sheet
is developed it may prove to be a valuable early verification tool; however, it clearly
requires too much detailed description to support early design decisions.
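The constraint-driven style described above can be sketched in a few lines: each design quantity is an expression over the others, and the network is iterated to a fixed point. This is not Design Sheet's actual algorithm (which uses symbolic mathematics and robust equation solving); the equations and numbers below are invented placeholders.

```python
# Minimal sketch of constraint-driven equation solving in the spirit of
# Design Sheet: each quantity is a function of the others, iterated to a
# fixed point. The example equations are invented placeholders.

def solve(equations, guesses, tol=1e-9, max_iter=1000):
    vals = dict(guesses)
    for _ in range(max_iter):
        new = {name: eq(vals) for name, eq in equations.items()}
        if all(abs(new[n] - vals[n]) < tol for n in new):
            return new
        vals = new
    raise RuntimeError("constraint network did not converge")

# Coupled toy example: interconnect area scales with logic area, and
# total area feeds a (made-up) power estimate.
eqs = {
    "logic_area": lambda v: 4.0,                     # mm^2, fixed spec
    "wire_area":  lambda v: 0.25 * v["logic_area"],  # scales with logic
    "power":      lambda v: 10.0 * (v["logic_area"] + v["wire_area"]),
}
result = solve(eqs, {"logic_area": 1.0, "wire_area": 0.0, "power": 0.0})
print(round(result["power"], 1))   # 10 * (4 + 1) = 50.0
```

The fixed-point style also shows why such systems demand detailed, mutually consistent equations: every quantity must be expressible in terms of the others before anything can be solved, which is precisely the burden noted above.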
Figure 2.3 Design Sheet’s constraint network and top-level interface[7]
2.4.2 Graphical Environments

The ability to provide analysis directly from a sketch may be valuable at the
early stage of the design process. Though focused primarily on the behavioral level of
digital systems, the Ptolemy project demonstrates many aspects useful for conceptual
design.

2.4.2.1 Ptolemy [24]

Ptolemy is a research project and software environment focused on the design
of reactive systems. It provides high-level support for signal processing,
communications, and real-time control. The key underlying principle in the project is
the use of multiple models of computation in a hierarchical, heterogeneous design
environment.

The Ptolemy project, though focused solely on a DSP design space, exhibits an
intuitive interface toward building a heterogeneous design model. Functional blocks,
which can be described using a behavioral language such as C or Silage, are represented
by simple graphical blocks termed stars. Hierarchy is captured through the use of
blocks termed galaxies, which can be made up of other stars or galaxies. Heterogeneity
is addressed by allowing the designer to arbitrarily partition their design into any number
of stars or galaxies.

Edward Lee identifies that there are competing goals in a design system. A design
environment tries to support expressiveness and generality, yet must also support a
system description which is both verifiable and either compilable or synthesizable. The
choice of where a system tool should focus in this dichotomy most clearly defines the
difference between the work proposed in this thesis versus Ptolemy and most other
previous works. The conceptual-level system provides greater expressiveness at the
expense of compilability or synthesizability.
2.4.3 Non-electrical Based Domains

One of the aspects of conceptual design is that it spans different domains.
Electro-mechanical systems, for example, may often best be explored by abstracting
away most details. Therefore it is not surprising that other, non-electrical domains have
also explored some aspect of conceptual design. It is worth noting that in many fields
the term conceptual level is synonymous with the electrical engineering behavioral level.

A prime example of conceptual-level design is the electrical-mechanical work
of the Domain Unified CAD Environment (DUCADE) [23]. In this project, the authors
unify the mechanical and electrical domains. Currently, joint research on DUCADE is
being conducted between the Wireless Computing Consortium (WCC) and the
Integrated Manufacturing Lab (IML) at the University of California at Berkeley. This
design environment will serve as a platform for the inter-disciplinary design of wireless
computers and similar consumer products. Research focuses on tool integration and the
development of a uniform database.

Figure 2.4 DUCADE: A mechanical and electrical view of the same system [23]
Like this work, DUCADE is determining that the key to getting heterogeneous
structures to interact is the abstraction of all but the key components needed to
solve a specific design problem. Figure 2.4 illustrates two views of the same system.
On the left is a view of a mechanical box being designed to fit around an electrical
circuit board, which is shown on the right. Complex elements of the electrical board
tool, regarding size and strength for example, need to be communicated to a vastly
different mechanical tool. This mechanical tool must interpret this information and
use it to design a box which meets a number of non-electrical design constraints.

2.5 Conceptual Design Methodology

As Example 1 illustrates, at the conceptual level there is some concept of both
the behavior and the structure. The key challenge is to relate these types of incomplete
specifications to meaningful analyses of costs and constraints, thus enabling designers
to make important decisions and trade-offs. This section describes the conceptual
design methodology as it is usually done intuitively by designers. The intention is not
to change this methodology, but to create tools and an environment which increase the
speed and effectiveness with which this part of the design process is accomplished.

The traditional methodology utilized at the conceptual level is a highly
iterative process which can traverse the following trajectory in a number of ways:

1. Problem Formulation: The first step is to determine the desired properties
of a design. This is in effect defining constraints and goals.

2. Rough sketch: A rough sketch is made of a possible implementation(s).
This is usually a block diagram, pseudo-code, or a combination thereof. The design is
now represented as a collection of interconnected sub-blocks. The sub-blocks are
functional and/or behavioral. (Examples of sub-blocks are multipliers, FIR filters,
subroutines, comparators). The interconnections of the sub-blocks can be a
representation of data flow, or contain physical information about connectivity.

3. Verification: At this point the designer will usually verify, to the best of
their ability, the functionality of their design. With limited information, conventional
simulators often cannot be used. Therefore some form of verification is usually done
by hand. This often takes the form of a worst-case analysis. Tools to aid in this process
are beyond the scope of this thesis, but are a focus of future work.

4. Evaluation: Evaluating the design with respect to constraints and goals is
the next step. This stage is the primary focus of this work. The evaluation process can
be broken into the following steps:

a. Sub-block estimation: At this point there is some information about the
behavior and the structure of each sub-block. This must now be combined with
information about the implementation platform(s) being considered. Traditionally this
is done in a number of ad-hoc ways, varying from estimations based on intuition to
extensive simulations of critical sub-blocks. Each small change in design can
necessitate a long re-evaluation of some or all of the sub-blocks.

Examined more rigorously, for each of the blocks a model F(input parameters)
is evaluated, where the input parameters give information about the behavior and/or
structure and the function, F, produces estimates on cost.

b. Overhead estimation: Overhead (e.g., interconnect, controller, memory,
etc.) must be evaluated in the same manner. The output of one function may also be
input parameters of overhead. For example, the area of interconnect will be a function of
the area of the rest of the circuit.

Evaluating sub-blocks and overhead involves modeling the information known
about each block's implementation as a function of input parameters. Due to the
complexity of estimation, this step is often skipped in the traditional design flow. As
line-widths shrink and speeds increase, this overhead, be it capacitance in a digital
circuit or noise effects in an analog circuit, dominates performance. Therefore,
neglecting this overhead will not be an option.

c. Design estimation: The costs of each sub-block are then combined with the
overhead analysis to give estimates of cost for the entire design. In the case of power,
this involves simply adding the power of the sub-blocks. In other cases this analysis
may be more complex (e.g., timing, inductive effects).
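Evaluation steps 4a-4c above can be sketched concretely: each sub-block macromodel is a function F(parameters) returning a cost estimate, an overhead model takes other estimates as its inputs, and the design cost (for power) is the sum. The model forms and coefficients below are toy assumptions for illustration, not models from the thesis.

```python
# Sketch of evaluation steps 4a-4c. Each sub-block macromodel is a
# function F(input parameters) producing a cost estimate; overhead is a
# function of the other estimates; for power, the design estimate is the
# sum. All model forms and coefficients are illustrative assumptions.

def multiplier_power(bits, freq_mhz):
    return 0.02 * bits * bits * freq_mhz / 100.0   # mW, toy macromodel

def memory_power(kbytes, freq_mhz):
    return 0.5 * kbytes * freq_mhz / 100.0         # mW, toy macromodel

def interconnect_power(block_total):
    return 0.15 * block_total                      # overhead scales with blocks

def estimate_design(bits, kbytes, freq_mhz):
    # 4a. sub-block estimation
    blocks = multiplier_power(bits, freq_mhz) + memory_power(kbytes, freq_mhz)
    # 4b. overhead estimation (a function of the other estimates)
    overhead = interconnect_power(blocks)
    # 4c. design estimation: for power, simply add the contributions
    return blocks + overhead

# Changing a single parameter (e.g., bit-width) re-evaluates instantly:
print(round(estimate_design(bits=16, kbytes=8, freq_mhz=100), 2))  # 10.49
print(round(estimate_design(bits=8,  kbytes=8, freq_mhz=100), 2))  # 6.07
```

The point of the structure is that swapping a macromodel (a different multiplier type) or changing an input parameter re-runs the whole estimate immediately, which is what makes the iteration loop described below practical.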
5. Analysis and evaluation: The design costs are then compared with the
constraints and goals. The most common technique is identifying the sub-blocks of the
design which are the most costly, so design time can be focused appropriately. The
design environment should aid in this identification.

Iteration back to any of the previous steps is common: The structure may need
adjustment, or other structures may need to be considered (2). It may become necessary to
change constraints and/or goals (e.g., increasing a timing budget) (1). Different
implementation platforms may be tried by simply switching in different models (e.g.,
switching in a different type of multiplier) (4). Behavior and architecture are often
changed by simply changing input parameters (e.g., bit-widths, memory size) (4).

In effect, the goal of the methodology is to combine incomplete behavioral,
structural and implementation information to get the most information possible to aid in
making crucial conceptual-level decisions. Absolute accuracy is obviously not possible.
However, the important goal is to determine both rough order-of-magnitude estimates
and relative estimates. Questions which are answered at this stage are "Which design
choice is better?" or "Will this keep me within my cost budget(s)?". Experiments have
shown that both of these questions can often be answered at this level of abstraction
[55].
2.6 Challenges at the Conceptual Level

2.6.1 Problems With Current Methods

Conceptual-level design has traditionally been done by hand analysis in an ad-hoc manner. For every iteration through the analysis process, a designer makes a great number of time-consuming and often inaccurate estimations. These estimates are often performed through:

Time-consuming hand analyses: Sub-blocks are often modeled with an equation or groups of equations. Each time a designer changes even a simple parameter, each sub-block's equation must be recalculated.

Databooks: For off-the-shelf components, a designer may leaf through available databooks. Each change in a sub-block requires data to be rechecked. It also restricts estimation, and perhaps design, to parts for which data exists.

Tool calls: A designer may run lengthy estimation(s) to analyze a sub-block. This must be redone each time a change is made. Tool calls are restricted to tools the designer uses and has available. Many options may go untried if the time to run, and perhaps learn, a new tool is prohibitive.

Intuition and approximation: When the above methods prove too time-costly, a designer may rely on their intuition. For example, s/he may know that a multiplier takes approximately 50 nsec to execute. This method relies heavily on the experience of the designer, and is obviously only a rough estimate.
2.6.2 Challenges of complex system design at the conceptual level

The properties of modern-day systems further complicate the design process and must be considered in developing an environment. This section outlines several key properties of complex systems, and the associated challenges they create in conceptual-level design.

Heterogeneous systems are made up of many different types of entities. Heterogeneity resides at many different levels and can relate to types of functionality, implementation platforms, and ways of connecting components together.

Heterogeneity presents many challenges to both the designer and the design tools. Different types of entities are analyzed and designed using different methods and different tools. A conceptual-level environment must enable smooth transitions between tools, as well as provide a means to make comparisons of cost across implementation platforms.

Hierarchical systems are composed of smaller (heterogeneous) objects. For example, a portable device can be seen as composed of separate entities such as the radio, processor, display, keyboard, etc. Each of these entities is also compositional (e.g., the radio can be partitioned into a transceiver, transmitter, ADC, mixers, etc.). In complex systems, hierarchy transcends many layers, creating several challenges. The higher the hierarchy level, the further the designer is removed from the actual implementation, and the harder it becomes to get reasonable information about cost and performance.

Varying abstraction levels further complicate the design process. Any given object at any hierarchy level can be described at various abstraction levels. This impacts the information that can be obtained about the object (and its accuracy) and what operations or actions can be performed on it. For example, getting power estimations for a piece of behavioral VHDL code is different (and typically less
accurate) from getting an estimation from a structural VHDL description of the same object.

Throughout the design process, abstraction levels tend to change as a design is further defined. This change in abstraction level must be tracked. A means to compare cost functions across abstractions in a meaningful way is important in system tools.

Many designers: The more complicated the system, the more designers are required. Combined with the increasing size of design companies, the use of design consultants, and the rise in telecommuting, the challenge of managing design flow increases. The challenge of bringing together more designers, at greater distances, working on many different types of computers, will be a prime obstacle to present and future design teams.

2.7 Roles of a Conceptual-Level Environment

The goal of the conceptual design environment is to provide the means to make the early analysis process as quick and accurate as possible. By speeding up the estimation process, the designer may go through more iterations and therefore focus their time on making critical decisions. The environment must also provide means for analyzing, and cost budgeting, complex systems throughout the design process.

In addition to addressing the four challenges described in the previous section, a conceptual-level environment should aid in:

Design evaluation: Once an estimate is found, the tool should relieve the designer from the task of evaluation. Estimations should be performed so as to relieve the designer of the tedium of equation evaluation, tool execution, etc.
Design entry and manipulation: It should be easy for designers to enter their designs. It should also be easy for a designer to switch how they evaluate each sub-block to track design changes.

Data display: The data should be displayed in a way that lets designers identify the areas which need optimization. A number of means should be provided, ranging from ordered lists to various types of graphical representations.

Data management and information gathering: A designer needs to keep track of different options in order to compare optimizations. The ability to archive a view of a design in process, as well as to integrate documentation in an intuitive manner, is a necessity.

Estimation creation: An estimation for a specific block may not already be in the framework. It should be easy for a designer to characterize and encapsulate new sub-blocks.

2.8 The Conceptual-Level Design Environment

One goal of this thesis is to provide a framework for describing and evaluating designs at the conceptual level of abstraction. The following sub-sections describe the important subcomponents of such a system. Parts of this proposed environment are similar to current CAD environments (for example, data management is an important aspect of any CAD system). Designing complex systems requires a number of specialized CAD functions incorporated into a unified framework. It is a primary tenet of this work that the re-deriving of specialized tools is not only impractical but infeasible.
As technology changes, so do the requirements of a CAD system. Hence a CAD tool has to be flexible enough to incorporate not only existing but also future work. In this thesis, related and useful work has been incorporated, with modifications as necessary.

There are several key requirements for this level of design which have not received adequate attention; these have been the primary focus of this work. Macromodeling at the conceptual stage of design (Section 2.8.1) and the use of hyperlinked spreadsheets (Section 2.8.2) are the primary focus and are also explored in Chapters 3 and 5. The use of the WWW to facilitate design (Section 2.8.4) is also a key aspect of this work and will be described in Chapter 4.

2.8.1 Macromodeling (Chapter 3)

The concept of macromodeling is the key enabler for providing the flexibility and adaptability required of a conceptual-level environment. Macromodels are the primary components used in evaluating the costs of a complex system. In particular, analysis of heterogeneous systems and reuse are strongly supported with this approach. As will be seen, the macromodel approach provides the basis for an adaptive environment where changes in technologies and design goals will not necessitate a change in the design environment.

Before describing the different tools in the environment, macromodels at the conceptual level should be defined:

Definition:
A macromodel is a model which presents a quantified view of one or more cost metrics of a design as a function of a set of specification parameters.

A macromodel can take on many forms: it can be represented by a fixed number, a table, a set of equations, an invocation of a tool such as an estimator, or it can be composed of a set of other macromodels.
The main benefit of macromodels is that they may be used as a black box, so that a designer interacts with designs in the same manner regardless of the subcomponents. Heterogeneous subcomponents can be evaluated by using orthogonal macromodels, so that one can analyze a switch of implementation platforms simply and quickly. For example, using macromodels, the effects of changing a hardware library can be instantly evaluated without rewriting the behavioral or structural description.

A macromodel presents a uniform entry of inputs, and return of outputs, regardless of the method used to calculate the outputs.

Regardless of the type of estimation being evaluated, the user interacts with a macromodel in the same manner. Every analysis is just an entry of input parameters. It is up to the environment to interpret the input parameters, find the proper macromodel, evaluate the macromodel, and provide the estimate(s) to the user in a meaningful manner. The environment should also aid designers in incorporating new models into the environment.

Macromodels can take a number of different forms. The primary primitive forms are analytical expressions, tables, and estimators:

Analytical expressions are often used to model a cost function as a function of any number of input parameters. They provide a flexible, and potentially highly accurate, means to analyze a block quickly. For example, the power of a multiplier can be modeled as a function of the bit-widths of each multiplicand. Further accuracy can be attained by using details about input vectors if available. Constants, for example a constant current for a given amplifier architecture, are also classified as analytical expressions.

Tables are useful for both nonlinear functions and for off-the-shelf components. An analog-to-digital converter (ADC), for example, can often not be characterized by simple equations, but can be pre-characterized for different precisions.
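A table macromodel can be sketched as a simple look-up. The resolution/power entries below are invented for illustration, not characterized data:

```python
# Sketch of a table macromodel: an ADC's power is pre-characterized per
# resolution and looked up rather than computed from an equation.
# All numbers are illustrative placeholders.

ADC_POWER_MW = {8: 5.0, 10: 12.0, 12: 30.0, 14: 75.0}  # resolution -> mW

def adc_power(resolution_bits):
    """Table look-up; conservatively uses the next higher characterized point."""
    for bits in sorted(ADC_POWER_MW):
        if bits >= resolution_bits:
            return ADC_POWER_MW[bits]
    raise ValueError("resolution beyond characterized range")

print(adc_power(10))   # exact table entry
print(adc_power(9))    # falls back to the 10-bit entry
```

An entire databook of parts could be folded into such a table, which is why tables suit off-the-shelf components.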
Entire databooks can be encapsulated into tables to enable fast estimation of discrete components.

Dynamic models are estimators and predictors that can be used in a number of different situations. For the most critical components, a transistor-level model may be necessary. In addition, to determine important behavioral characteristics, mathematical tools (e.g., Matlab) can be encapsulated into a model.

Definition:
A composite macromodel is a combination of primitive and other composite models, and the description of how they interact.

Perhaps the most important aspect of macromodels is that they can also take the form of a (complex) combination of other macromodels. For example, a system may be represented by a single macromodel consisting of a model of each subcomponent. In turn, this same system's macromodel may be combined with other models to "produce" a macromodel of a complex system. Regardless of the complexity of any of the models, designers interact with each of them in the same manner. This supports intuitive interaction with the environment during the early creative stages.

Composite macromodels provide a conceptual-level representation of a system. All aspects of composite models are defined and explored in Section 3.3. Building and analyzing systems using composite macromodels is the focus of all of Chapter 6.
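A composite macromodel can be sketched as follows. The sub-models and coefficients are invented for illustration; they are not the voltage-regulator models used later in this thesis:

```python
# Sketch of a composite macromodel: a system model is a combination of
# other macromodels, invoked through exactly the same interface as a
# primitive model. Coefficients are illustrative only (mW per volt).

def regulator_ref(vdd):            # primitive: analytic expression
    return 0.5 * vdd

def regulator_comparator(vdd):     # primitive: analytic expression
    return 1.5 * vdd

def regulator_system(vdd):         # composite: combines the sub-models
    parts = [regulator_ref, regulator_comparator]
    return sum(model(vdd) for model in parts)

# The composite is called just like any primitive macromodel:
print(regulator_system(vdd=3.0))
```

Because the composite presents the same call signature as its parts, it can itself become a sub-model of a still larger system.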
2.8.2 Hyperlinked Spreadsheet With Macromodels (Section 4.1)

The spreadsheet is where the designer does most of their analysis. It is used to bring together and manipulate all the heterogeneous components in one framework. A list of all elements, with attributes enumerated, is a powerful means of tracking the status of a design, identifying bottlenecks, and evaluating the effectiveness of
optimizations. This thesis introduces a new, more powerful spreadsheet which utilizes both hyperlinks and macromodels. By enabling interconnections of the elements, a powerful composite model can be described.

As will be described in Section 4.1, a hyperlinked spreadsheet combined with a macromodeling technique will enable a designer to:

• Evaluate designs across platforms (addressing a system's heterogeneous nature)
By modeling heterogeneous elements with orthogonal macromodels, and linking their inputs and outputs, the proper means of evaluation for each of the subcomponents can be utilized.

• Evaluate designs throughout the design process (addressing a system's varying abstraction levels)
Macromodels which give a more accurate view of the final implementation may be utilized, so that as the design evolves, the tool required does not change, just the models utilized.

• Get hierarchical views of a design
Nested spreadsheets are used to mirror the hierarchy of a system and hence give a natural view of the design.

• Integrate the work of many designers in one framework.
Many designers can work on subcomponents of a system, in a sense owning different macromodels, which can in turn be encapsulated in a single spreadsheet (or collection of spreadsheets). In this way, parallel work can be done and collected in a single environment.
• Design flow management and cost budgeting (easing documentation and information transfer amongst designers).
Documentation can be linked into the spreadsheets, providing a single, yet expandable, documentation focal point. Design managers can use a central spreadsheet (or spreadsheets) to perform cost budgeting and to track the progress of a design.

The following example presents the basic properties of the hyperlinked spreadsheets and macromodels.
Example 2: Hyperlinked spreadsheet and macromodels
Figure 2.5 shows a spreadsheet analysis of the voltage regulation system of Example 1. The analysis is from the conceptual-level tool PowerPlay [55]. Each spreadsheet has the general format seen in this example.
Each row represents a sub-block of the example. The left column has the
name of each entry. The second column allows the user to enter the
parameters for each entry. The right hand columns are used to report
information about the analysis to the designer. In this case, power is the
cost function. Global parameters are defined in a box at the bottom of the
spreadsheet.
This spreadsheet is used as an interactive tool during the conceptual
design process. At any time, blocks can be added or subtracted, or
parameter values changed. To see the effect of a design change, the
designer simply clicks the play button to get immediate feedback.
The spreadsheet tool, in this case PowerPlay, evaluates the macromodels
associated with each of the line entries and then categorizes and sums all
the information for the designer. At any time the designer can choose new
models for a block entry and re-evaluate. There are no restrictions on the
type of models that can be included on any sheet. By partitioning the
analysis into macromodels, the problem of heterogeneity is solved.
The first two lines of this example (the reference voltage and comparator) are analog circuits modeled by equations. The following line is the analog-to-digital converter (ADC), which is not a linear function of its input parameter, resolution, and should be evaluated by a table look-up. The digital circuitry and power transistors are modeled by a tool call to an estimator. If parts of the circuitry were further along in the design process, a macromodel appropriate for that level of abstraction could be used.
Figure 2.5 Spreadsheet analysis of power dissipation of Example 2
To facilitate both reuse and hierarchy, any spreadsheet may be
encapsulated into a single macromodel. In other words this entire
spreadsheet may be encapsulated into a single line and incorporated into
another system. When the macromodel for a spreadsheet is a line-entry in
another spreadsheet, the name in the left column is a hyperlink to the
appropriate spreadsheet. When the play button is clicked, all levels of
hierarchy are evaluated. Parameters are passed down to lower levels.
There is no fundamental limit to the depth of hierarchy allowed. This utilization of hierarchy will be shown in Section 5.2.
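The hierarchical evaluation described above can be sketched in a few lines. This is a simplified illustration, not PowerPlay's actual implementation; the sheet contents and values are invented:

```python
# Sketch of hierarchical spreadsheet evaluation: a sheet is a list of line
# entries, and an entry may itself be a whole sub-spreadsheet. Parameters
# are passed down through every level when the hierarchy is evaluated.

def evaluate(entry, params):
    """Evaluate a leaf macromodel, or recurse into a nested sheet."""
    if isinstance(entry, list):                  # a nested spreadsheet
        return sum(evaluate(e, params) for e in entry)
    return entry(params)                         # a leaf macromodel

# Hypothetical sheets: power (mW) as a function of a global parameter
radio_sheet = [lambda p: 2.0 * p["vdd"], lambda p: 0.5]
top_sheet = [radio_sheet, lambda p: 1.0]         # radio sheet as one entry

print(evaluate(top_sheet, {"vdd": 3.0}))
```

Encapsulating `radio_sheet` as a single line entry of `top_sheet` is the one-line-per-subsystem reuse described in the example.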
For each sub-block in the spreadsheet, there are hyperlinks to
documentation about the block and information about the macromodel.
There is no restriction on the location of the model or the documentation
so designers and models (and their associated tools and databases) can be
located wherever is most appropriate.
Finally, documentation may easily be put onto a spreadsheet, and
spreadsheets may be archived at any time. This makes it a useful tool for
tracking design decisions and design flow.
The outputs from the tools can be forwarded to a graphing tool to aid in
analyzing the “problem” blocks. In addition, exploration can be done at
this level by varying a local or global parameter and graphing the effects
on the cost functions.
2.8.3 Macromodel Creation (Chapter 3)
A design system is only as good as its underlying models. The design system cannot rely on a fixed set of base models, as system requirements are unpredictable and unlimited. Instead, the design system should enable the creation and inclusion of
models as necessary. In addition, the system should encourage reuse and facilitate the sharing of models.

An important part of the design environment is to enable the inclusion of any type of model. Mechanisms for this are given in Chapter 4. Section 3.4 details guidance as to how to create new models and include error metrics.

2.8.4 Distributed Infrastructure (Section 4.2)

In order to bring together the different models and any number of designers, a distributed infrastructure is required. The Internet, and the World Wide Web (WWW) in particular, is the enabling technology to achieve this unification. To enable a designer to work where they want, using whatever computer they want, design tools should be made "web-compatible". Tools customized for the Internet can exploit the features of the WWW more fully than simply encapsulating existing tools.

Fully utilizing the WWW will enable:
• shared libraries and documentation across the network, promoting re-use
• tools accessible across the network
• many designers to interact regardless of location or computer platform.

Both the size of design teams and the change in work habits require that CAD tools be available across networks. In addition, the creative nature of conceptual-level design requires a comfortable and flexible environment. The WWW's facility at information transfer, its platform independence, and its wide user base make it a clear choice for a system design platform.

The use of the WWW is similar to a client-server configuration with a number of important differentiators. Included among them are the lack of dedicated connections, limited graphical interaction, and the fairly primitive protocols for data transfer and
display. All of the work in the proposed conceptual-level environment is designed to leverage the strengths, and overcome the limitations, of the World Wide Web. Details of the networked implementation are given in Chapter 4.

The next section details the use of generic block entry for design ideas. As an example of networked design, a generic entry tool, WebTop [36], which is located at M.I.T., is integrated into the Berkeley-based PowerPlay design flow. A designer can enter a design using a server in Massachusetts; then, to perform estimations, the relevant information is transferred to the California server, where the design can be further specified. The results of estimations can then be back-annotated onto the schematic entry tool.

2.8.5 Generic Block Entry

The first step in the design process is the sketching of an idea. To address the heterogeneity of complex designs, the interface for this entry has to be generic enough to support any type of description. In addition, this interface should be simple to use. If ideas do not flow into a tool as naturally as onto a piece of paper, important aspects of the design process may be sacrificed. The tool will either go unused or, if used, hamper the design process.

A first prototype of a Java-based block editor addresses many of these issues (Figure 2.6) [46]. Sub-blocks and connectors can be sketched and given any number of attributes or values with no restriction. The interpretation of these attributes will be handled by the macromodels. The generic nature of entry is essential so that a designer can use the same interface independent of the nature of the system being specified. This enables heterogeneous systems to be entered. Global parameters may also be defined in this environment.
The ideal input medium for a conceptual-level environment would be simple pen and paper. To truly enable the same creativity used today would require the integration of a pen interface.

2.8.6 Documentation and Design Management

With an increasing number of designs and macromodels spread throughout the Internet, management of this information becomes a priority. The current method is to encapsulate knowledge inside the spreadsheet tool. For each macromodel there is information about how to find and execute estimations. Equations can be stored locally, on either the server or the designer's machine. For a more complex tool call, a pointer to a script encapsulating the model is stored locally. The script is accessed and run in the background whenever analysis is requested. Tools may be executed either locally or across the network.
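The tool-call mechanism above can be sketched as follows. This is a hypothetical illustration; the "tool" here is a stand-in one-line script, not one of the real estimators referenced in this thesis:

```python
# Sketch of a tool-call macromodel: a script encapsulating an estimator is
# run in the background and its printed result is read back as the cost
# estimate. The command used below is a stand-in, not a real CAD tool.

import subprocess
import sys

def tool_call_model(script_body):
    """Run an external 'estimator' script and parse its numeric output."""
    out = subprocess.run([sys.executable, "-c", script_body],
                         capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

# Stand-in tool: a one-line script printing a power estimate in mW
estimate = tool_call_model("print(4.0 * 16)")
print(estimate)
```

Because only the returned number crosses the interface, the same wrapper could invoke a local script or one on a remote server without the caller changing.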
In time, this solution will become prohibitively restrictive. The next-generation conceptual-level design environment may thus utilize a separate design agent/proxy to handle model calls. For information on design agents, the reader is referred to [43], [44]. The main task of an agent is to quantify abstract requests for information. A design agent will aid in locating, encapsulating, and running tools across the network.

Figure 2.6 Java block editor with parameter entry form [46]
The use of a design agent will add further value to conceptual-level design. A design agent can perform searches. For example, a list of LCD displays which sell for less than $200 and dissipate less than 500 mW can be assembled. The user can then choose the most appropriate LCD and include its macromodel in their design.

2.9 Scope of Thesis

There are a number of elements of the conceptual-level environment which are already being pursued for other applications. In those cases, that work will not be repeated, but will be incorporated. For example, data management and design proxies are actively being addressed in other works and will not be addressed in detail herein. Data entry via schematic entry is also beyond the scope of this work.

In the following two chapters, the two most important elements of the environment will be addressed. Chapter 3 details macromodeling methodologies, techniques, and accuracy. Chapter 4 presents the elements necessary for a conceptual-level environment. A prototype of an environment which incorporates these ideas is presented in Chapter 5. In so doing, the challenges of designing an Internet (WWW) based system are demonstrated. Finally, Chapter 6 is used to show conceptual-level design of actual systems. Using complete examples, the methodology is fully demonstrated on a variety of complex systems. That chapter also benchmarks the accuracy that can be attained using this method.
Chapter 3
Macromodels
3.1 Macromodeling Overview

One of the main roles of the conceptual-level environment is to provide a means by which heterogeneous designs can be evaluated for a variety of cost functions across abstraction levels. As discussed in Sections 2.8.1 and 2.8.2, the utilization of macromodels with hyperlinked spreadsheets provides the basis for design evaluation at the critical conceptual level. Macromodels are used as the primary components for evaluating the costs of a complex system with heterogeneous components. Spreadsheets provide an intuitive view of the design as well as a means to produce meaningful graphs and sensitivity analyses. As will be shown in this chapter, the macromodel approach provides the basis for an adaptive environment where changes in technologies and design goals will not necessitate a change in the design environment. In other words, across design disciplines, abstraction levels, cost functions, and even time, the CAD system need not change, just the models.

In the remainder of this section, the basic characteristics of macromodels are presented. In Section 3.1.1 the formal definition of macromodels is revisited; Section 3.1.2 reviews previous work; and in Section 3.1.3 the trade-offs involved in creating and using macromodels are demonstrated. The
remaining part of the overview section discusses the breakdown of macromodels into two major types: primitive and composite. The two other sections of this chapter deal with all aspects of the creation and use of these two general forms of macromodels. Primitive models are addressed in detail in Section 3.2. Composite models are detailed in Section 3.3.

3.1.1 Definition Revisited

In Section 2.8.1, macromodeling in the context of conceptual-level design is defined:

Definition:
A macromodel is a model which presents a quantified view of one or more cost metrics of a design as a function of a set of specification parameters.

A macromodel presents a uniform entry of inputs, and return of outputs, regardless of the method used to calculate the outputs.

A high-level picture of a macromodel is presented in Figure 3.1. It illustrates the uniform manner in which macromodels can be evaluated. In this chapter, the many forms that macromodels can take (e.g., a fixed number, a table, a set of equations, an invocation of a tool, or a composition of sets of other macromodels) will be introduced and examined.

One of the major strengths of the conceptual-level system arises from the "black-box nature" of the macromodels. Macromodels enable a designer to evaluate systems in the same manner regardless of the system's subcomponents. The designer interfaces directly with the inside of the box only in the choosing and/or characterizing of the macromodel. The evaluation of the models themselves is, to the designer, always identical, thus enabling focus to be put on design decisions. As the designer
proceeds through the design process, macromodels enable the same design environment to be used from design specification to implementation.

3.1.2 Previous work in macromodeling

There have been volumes of work focused on modeling, ranging over fields from aerodynamics, mechanical engineering, and architecture to mathematics. Therefore, listing all works in modeling would be a thesis in itself. Focused specifically on electronics, a great deal of work has addressed a number of aspects of modeling:

Transistor-level simulation: A great deal of effort in CAD for integrated circuits has been directed toward fast transistor-level simulators, such as the widely used SPICE. Work on macromodeling utilizing SPICE covers a wide gamut of design. For example, Gharpurey [25] utilizes macromodels and SPICE to model the effects of substrate coupling, while Grimaila [26] utilizes a similar technique in examining cellular neural networks.

Commercial tools (e.g., EPIC's Power/Time-Mill [27] and Avanti's Star_spice [28]) have targeted macromodeling of transistors in such a way as to enable quick evaluation and higher levels of partitioning than possible with SPICE.
Figure 3.1 Macromodels are interfaced in the same manner regardless of evaluation methods. (The figure shows input parameters entering a macromodel, which may be an analytic expression, a data table, an estimator, or a combination of models, and cost functions being returned.)
Analog macromodeling: A great deal of modeling work has focused specifically on the analog domain. For example, Mammen [19] utilizes an object-oriented approach to creating models of analog devices. Similarly, JianFeng [20] utilized analog macromodeling to support hierarchical circuit design.

Digital macromodeling: A number of works in the digital domain served as strong inspiration for this work. In particular, the following two works utilize macromodeling at the architectural level to enable estimation before compilation to silicon:

Landman [10] introduced an architectural-level power estimator which utilized two concepts influential to this work. In order to perform accurate power estimations, Landman's method abstracted information away. In so doing, he used black-box models as a means to reduce the complexity seen by a designer.

Tiwari and Malik's work in instruction-level modeling [15] also showed that the power of black-box modeling extends to the microprocessor domain. Instead of modeling functional units, they empirically measured the current dissipated per instruction type.

Analytical macromodeling: By using strictly analytical modeling, a system can be analyzed at a number of different abstractions. In addition, intuition about a system is often generated in the formulation of these models.

Svensson, for example, illustrated methods to estimate performance early in the design process with equations, using electrical fundamentals and knowledge of the target architecture [14].

These examples touch on the seemingly unlimited work in modeling. There are a number of differentiators in the macromodeling work presented in this thesis. They
mostly revolve around the desire to enable conceptual-level design as defined in the previous chapter.

3.1.2.1 Roles of conceptual-level macromodels

This section lists some of the key focal points in the development of macromodeling for conceptual design:

Macromodels enable the combining of estimators at various abstraction levels: Conceptual estimation requires that each estimation of a system be performed with the most relevant information available. This means that estimations of sub-blocks at the behavioral, structural, and implementation abstractions are all possible in the same environment. As different parts of complex systems are at various stages of development simultaneously, it is important to be able to facilitate estimation across abstractions.

Enabling models to interact across heterogeneity: Complex systems are made of various heterogeneous elements. This form of macromodeling enables the partitioning and recomposition required to analyze these types of systems.

Supporting evolving systems: As a design evolves, the model of the system needs to be able to evolve without much overhead. The uniform interface of macromodels allows the modifying or switching of models to intuitively track the changes of a system.

Perhaps the most important element is to provide an environment which allows this type of modeling to be performed in an intuitive manner. The form of macromodeling presented in this chapter addresses all of these issues and can be used as a basis for complex system analysis.
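The "supporting evolving systems" point can be sketched concretely. This is an invented illustration (both models and their coefficients are placeholders), not a characterized model from this work:

```python
# Sketch of model switching across abstraction levels: as a design matures,
# a rough model is swapped for a more accurate one without changing how it
# is called. All coefficients are illustrative placeholders.

def rough_mult_power(bits):
    """Early, order-of-magnitude model (mW)."""
    return 0.1 * bits * bits

def characterized_mult_power(bits):
    """Later model, nominally fitted to measurements (mW)."""
    return 0.08 * bits * bits + 1.5

model = rough_mult_power
early = model(16)

model = characterized_mult_power   # switch models; the call site is unchanged
later = model(16)

print(early, later)
```

The uniform interface is what makes the swap a one-line change: the rest of the analysis, and anything built on top of it, is untouched.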
3.1 Macromodeling Overview 43
To
eling
el to
ion
This
at
ting
ose
n the
or a
del
ional
out
tem:
but
h
3.1.3 Macromodel trade-offs
There has been extensive work in modeling all types of cost functions.
estimate at the conceptual level does not necessitate the formulation of new mod
techniques. The challenge at the conceptual level is to choose or derive a mod
extract the essenceof each subcomponent for a particular cost function and abstract
level, at an acceptable accuracy, required at a given point in the design process.
combination of cost functions, abstraction and accuracy is termed aview of a design.
Definition:
A view on a design is the combination of cost metrics abstraction and accuracy this requested by a designer.
In developing and choosing macromodels, the first step is to explore exis
models and observe how they apply to the required view. If a model which is cl
enough exists, reuse is usually the most efficient means to perform an estimation. I
case that the required model does not exist, an older model should be modified,
new one created, in order to produce the necessary view.
A key trade-off in choosing and designing each macromodel is between mo
accuracy and design time. To get a more accurate estimation usually requires addit
design time. Focusing design efforts properly can reduce design time with
sacrificing accuracy.
There are three key components of design time using a model based sys
model creation, design specification and design evaluation.
• Model creation is the characterization of elements. It is the specification of how to
evaluate one or more cost metrics as a function of design parameters. It should,
does not always include, a means for evaluating the accuracy of an evaluation.
• Design specificationis the translation of design ideas into a description from whic
3.1 Macromodeling Overview 44
ca-
sign
del
f the
can
ime.
the
nput
uch
the
sign
of
el for
d is
estimations can be performed. Optimization of a design usually entails the modifi
tion and/or rewriting of a specification
• Design evaluation: The combination of the specification and underlying models to
create a desired view of a design.
Design specification and evaluation comprise the exploration part of the de
process and is traditionally where the majority of effort has been placed. Mo
creation, on the other hand, is usually done one time and not on the critical path o
design flow. Focusing more energy on choosing and creating the proper models
greatly increase the effectiveness of the exploration.
Models should be chosen which reduce both specification and evaluation t
Exploration time is determined not only by the evaluation time of the model, but in
time it takes to determine the input parameters for estimation. For example, if the i
specifications required to get a certain accuracy is too complex, it may require too m
specification time by the designer to get the required input vectors. In addition, if
estimation uses a complex tool invocation, estimation time itself, both incodecreation
and tool run-time, may be too cumbersome for the accuracy required to make de
decisions.
A primary goal of this work is to provide a system where any type
macromodel can be easily included so that designers can chose the proper mod
each task. In addition, guidelines for choosing the correct model is important an
addressed in Section 3.2.
3.1 Macromodeling Overview 45
e is
eans
thod
well
nents
ther
vide
ct.
hich
ther
ype
also
3.1.4 Types of Macromodels
Macromodels can be divided into two types,primitive and composite as
illustrated in Figure 3.2. Analysis is done in a tree structure in which each nod
either decomposed into sub-nodes (the branches), or evaluated by some other m
(the leaf nodes). Primitives, the leaf nodes of a tree, provide a clearly defined me
for evaluating cost metrics. Composite models, on the other hand, do not have a
defined method(s) per se and hence utilizes structural decomposition of subcompo
to evaluate cost metrics. Composite models consist of a combination of o
macromodels (either primitives or other composite models). Composite models pro
not only the assemblage of other models, but the description as to how they intera
3.1.4.1 Primitive model overview
All estimation systems requires a set of base models and a means in w
models can be modified and encapsulated. Creating theseprimitives is the major
contributor to model creation time discussed in Section 3.1.3. Primitives are ei
analytical (e.g., equations), or empirical (e.g., tool invocation, look-up-table). The t
of primitive model chosen effects estimation time and estimation accuracy but can
Composite macromodel
Figure 3.2Types of macromodels
Primitive macromodel
3.1 Macromodeling Overview 46
, and
ned
dels
tion
been
tem
ition
aluate
s to
tion
sign
he
del,
impact the understanding of the system being designed. The creation of primitives
the implication of model choice, will be discussed in Section 3.2.
3.1.4.2 Composite model overview
When a system can not be modeled with a primitive, it needs to be partitio
into parts and then recomposed in an effective matter. A key property of macromo
is they can also take the form of a combination of other macromodels. This combina
of macromodels, since they are in effect compositions of subcomponents, have
termed composite macromodels.
As will be discussed in Section 3.3, creating a composite model of a sys
entails three steps:
• the decomposition, or partitioning, of a system into subcomponents
• associating these subcomponents with models to evaluate relevant cost metrics
• the composition of these sub-components back into a single entity. This compos
entails not just the assemblage of the sub-parts, but the addition of models to ev
the overhead of composition.
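The three steps above can be sketched as follows. The sketch is illustrative: the function names are hypothetical, and the 10% interconnect overhead is an assumed figure rather than a characterized model.

```python
# Sketch of composite-model evaluation: decompose into subcomponents,
# attach a model to each, then recompose with an explicit overhead model
# that accounts for the cost of assembling the parts.
def composite_power(subcomponents, overhead_model):
    """subcomponents: dict of name -> zero-arg model returning power in mW."""
    parts = {name: model() for name, model in subcomponents.items()}
    total = sum(parts.values())
    total += overhead_model(parts)   # e.g., interconnect between the parts
    return total

subs = {
    "memory": lambda: 4.0,   # mW, illustrative subcomponent models
    "adder": lambda: 1.2,
}
# Assumed overhead model: interconnect costs ~10% of the summed block power.
interconnect = lambda parts: 0.1 * sum(parts.values())

print(composite_power(subs, interconnect))   # about 5.72 mW
```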
Lacking in previous work is an environment which enables these three steps to be effectively and intuitively performed. Hyperlinked spreadsheets, introduced in Section 3.3, are integral to this approach as they provide the smooth interface to design information which is required to make the traversing and iteration of the aforementioned three steps effective and intuitive.
The following example illustrates the basic means by which a system, described with a simple sketch, is associated with a composite model.
Example 1 Simple Composite Model
A simple sketch of a decompression system, drawn to describe the system early in the design process, is shown in Figure 3.3.1 A composite model can be made by partitioning the design by the four functional units, then associating each unit with a macromodel. The recomposition of the subcomponents into a composite model of the system is illustrated by the spreadsheet of Figure 3.4. In this view power is being evaluated. Each row of the main table represents a subcomponent of the system. The left-hand side of the main table allows entry of input parameters. The right-hand side presents results. The cost of each row is calculated by its own macromodel. These macromodels can either be evaluated as a primitive model (Section 3.2) or decomposed into composite models as will be shown in Section 3.3. The macromodels used to evaluate each subcomponent can be modified or replaced as the design evolves.
1. Please see Chapter 6 for a detailed description of this system.
[Figure 3.3: block diagram from luminance data input to pixel output, with one 4096 x 66 memory, two 2048 x 8 memories, clock and R/W signals, and clock rates f, f/16, and f/320.]
Figure 3.3 Luminance decompression - three memories and a register
Modeling the overhead of assembling a system is also an important
capability of composition. The power dissipated by interconnect can be
estimated by including a fifth model. To estimate the interconnect energy
to first order requires an estimate of interconnect length and switching
frequency of the interconnect, as well as information of the area, fringe
and inter-wire capacitance.
The length of interconnect can be modeled as a function of the square
root of the active area. Hence, the interconnect length model will require
an input parameter active area. This active area can be found by including
models of area of the subcomponents and summing their results. The
switching activity of the interconnect can be modeled, or assumed to be
worst case (uniform white noise). The technology information would be
included in the model.
Figure 3.4 Each line is evaluated with a macromodel
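The first-order interconnect estimate described above (length proportional to the square root of the active area, power from the switched capacitance) can be sketched as follows. All constants are assumed illustrative values, not characterized process data from the example system.

```python
import math

# First-order interconnect power estimate. Inputs: per-block areas from
# subcomponent area models; assumed per-length capacitance (area + fringe +
# inter-wire), switching activity, supply, and clock frequency.
def interconnect_power(block_areas_mm2, cap_per_mm_fF=200.0,
                       activity=0.25, vdd=3.3, freq_hz=25e6):
    active_area = sum(block_areas_mm2)         # sum of subcomponent areas
    length_mm = math.sqrt(active_area)         # length ~ sqrt(active area)
    cap_F = cap_per_mm_fF * length_mm * 1e-15  # total wire capacitance
    return activity * cap_F * vdd ** 2 * freq_hz   # P = a*C*Vdd^2*f, watts

print(interconnect_power([1.0, 0.5, 0.5, 2.0]))   # about 2.7e-05 W (27 uW)
```

The worst-case activity of 0.25 corresponds to the uniform-white-noise assumption mentioned in the example.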
Tables in the spreadsheet enable the defining of global parameters, such as supply voltage, operating frequency and technology line width. Composite models' use of global parameters such as these is essential in that fundamental changes of a system can be quickly evaluated.
3.2 Primitive models
The primitive is the fundamental building block of a system. In this same manner, primitive macromodels are the fundamental building blocks of the corresponding estimation. This section details the important trade-offs and aspects of both the creation and use of primitive macromodels, with primary focus as to their application to a conceptual environment.
As discussed in the previous section, model creation and evaluation must be performed in a timely manner to provide as accurate as practical results. This section will explore the formulation of analytical expressions, tables and choice of tools for various types of primitive macromodels. The formulation of composite models is more analogous to full system evaluation and will be addressed in Section 3.3 and Chapter 5.
Models should be created and/or chosen based on a number of different properties, each of which will be illustrated for the three major types of primitives.
Accuracy: The level of accuracy required for an estimation of a primitive varies dependent upon the system being evaluated. At the earliest stages of design, absolute accuracy is not always possible. However, at the earliest stages, a high level of accuracy is often not required. In addition, the importance of a primitive component in a system also impacts the accuracy required in the corresponding primitive model. A primitive which adds significantly to a total cost may require more accuracy than another subcomponent of the system. When choosing primitives the designer should trade off accuracy with estimation speed.
Speed of evaluation: Models are repeatedly evaluated during the exploration process. Hence, it is important to create models which evaluate quickly. Again, there needs to be a designer-driven trade-off between evaluation time and model accuracy. For example, a primitive modeled with SPICE may be more accurate than an equation; however, the time to execute a SPICE-based model may be prohibitive.
Creation time: The creation of primitive models requires initial effort seemingly orthogonal to the design flow. Though this time may be substantial, it is usually a one-time cost which can increase the effectiveness, both in time and quality, of the exploration process. The macromodel approach strongly supports design for reuse; hence, if a specific primitive is foreseen to have many uses, extra time may be warranted in its creation.
Flexibility: Optimization of a system requires the variation of input parameters. The ability to confidently vary input parameters of a model should also be considered in the choice of primitive. This also impacts the ability for a primitive to be re-used in many systems.
There are inherent trade-offs in primitive design with respect to the four properties described. The most fundamental of these are the trade-off between accuracy and evaluation time, and the trade-off between creation time and flexibility.
Accuracy versus Evaluation Time: This is perhaps the key trade-off in primitive design and selection. At the earliest stages of the design, absolute accuracy is not possible, so the designer should make the proper trade-off between the accuracy of their estimation versus the evaluation time. This will be explored more fully in Example 5.
Creation Time versus Flexibility: Being able to vary parameters over a wide range provides a lot of capability early in the design process. However, to create models
3.2 Primitive models 51
he
is
sign
ons,
the
els:
three
d in
d to
iled
y the
rom a
ging
ignal
or
hus
been
well
o be
which can provide this flexibility requires added time in the development of t
primitive model. Knowledge of the important bounds of the input parameters
valuable in creating models. For example, a model of a transistor for a digital de
may not require an accurate model of operation in the sub-threshold of ohmic regi
so it would be wise to not spend excessive time to create a model which has
flexibility to accurately model those regions.
The following three sections details the three superclasses of primitive mod
analytical expressions, tables, and active models. The means of converting these
seemingly disparate techniques to all have uniform interfaces will be discusse
Chapter 4. Understanding the differing qualities of these types of models is require
be able to use a conceptual-level environment effectively and will be deta
throughout the rest of this section.
3.2.1 Analytic Expressions
Once a block has been characterized, analytical expressions are usuall
quickest means to analyze the cost of a subcomponent. An expression can range f
simple constant with no input parameters to a complex expression with inputs ran
from bitwidths, to input impedances, to more obscure parameters such as s
correlations.
Derivation: Analytical expressions can be derived either analytically
empirically and can be used to model subcomponents at any level of hierarchy. T
they can be used throughout the design process. Before any low-level work has
done, equations can be used to predict final outcomes. This can be done by using
known relationships (e.g.,Power=Voltage*Current), or can be derived analytically
based on assumptions of an implementation [14]. Analytical expressions can als
3.2 Primitive models 52
of
del
rived
ation
o
t no
ly,
n to
here
uracy
sed
sing
put
ts of
with
L of
of
gener-
derived from empirical simulations or measurements and the utilization of some form
linear regression.
The speed to create equation models varies greatly with the type of mo
being derived. As the complexity goes up, the speed in which a model can be de
analytically goes down. The creation of models have the added value that their cre
often brings with them insight.
Speed of Evaluation: Equations are the fastest of all primitive models t
evaluate. Hence, a system modeled by equations gives the designer almos
restriction, in terms of time, to their exploration process.
Accuracy: Equations can provide a wide range of accuracy. Typical
however, as the complexity of a component increases, the ability for an equatio
accurately model it decreases. In addition, equations usually have ranges as to w
they are applicable. When input parameters are taken outside of this range, acc
can be reduced.
Primitive models should be able to provide accuracy metrics which can be u
by a system or the designer to provide either constraints or warnings. A risk of u
analytic expressions is that they are usually only accurate for a limited range of in
parameter values. For example a model of an amplifier may not incorporate effec
saturation.
There are a number of means in which accuracy can be addressed
analytical expressions:
• A separate expression can be used to estimate errors. An example of this is the IN
an analog to digital convertor where equations can be used to model inaccuracy
quantization based on the distance from the discrete quantization levels. The de
3.2 Primitive models 53
of the
Sec-
are
cal-
ges.
e of
or a
s of
and
ate case is when the inaccuracy is modeled as a constant value or a percentage
result. These errors can be propagated through the system as will be explored in
tion 3.3.3.
• Separate equations can be utilized to track when the bounds of input parameters
violated. The estimation system can utilize this information to either not allow the
culation or provide warnings when this occurs.
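The second mechanism, tracking when the bounds of input parameters are violated, might look like the following sketch. The wrapper, the inverter delay model, and its trusted range are illustrative assumptions, not part of the described environment.

```python
import warnings

# Sketch of a bounds-checked analytical primitive: a validity range is
# attached to the expression, and a warning is raised when an input leaves
# the range over which the model was calibrated.
def make_checked_model(fn, bounds):
    """bounds: dict of parameter name -> (lo, hi) trusted range."""
    def model(**params):
        for name, (lo, hi) in bounds.items():
            v = params[name]
            if not (lo <= v <= hi):
                warnings.warn(
                    f"{name}={v} outside calibrated range [{lo}, {hi}]")
        return fn(**params)
    return model

# Illustrative inverter delay equation, trusted only for Vdd in [1.0, 5.0] V.
inv_delay = make_checked_model(
    lambda vdd: vdd * 2.3e-9 / (vdd - 0.48) ** 2,
    {"vdd": (1.0, 5.0)})

print(inv_delay(vdd=3.3))   # within range: evaluates with no warning
```

An estimation system could equally refuse to evaluate instead of warning, as the bullet above suggests.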
In summary, analytical expressions, in general, have a number of advanta
They are quick to evaluate and can provide a great deal of flexibility. The downsid
equations is that thy may have limited bounds and may be impractical to derive f
complex system. The following subsection demonstrates some of the application
analytical expression models.
3.2.1.1 Example analytical macromodels
Table 3.1 gives a number of example equations used to model power, gain
delay of both digital and analog subcomponents.
Table 3.1 Example primitive equation models
#   Subcomponent                        Cost         Model                                                Unit
1   Adder w/ uncorrelated inputs        Capacitance  bitwidth*269                                         fF
2   Adder w/ highly correlated inputs   Capacitance  bitwidth*154                                         fF
3   Memory - read                       Capacitance  9707 + words*108 + bitwidth*1126 + bitwidth*words*6  fF
4   Memory - write                      Capacitance  7997 + words*117 + bitwidth*759 + bitwidth*words*9   fF
5   Memory - no-op                      Capacitance  489                                                  fF
6   Common source amplifier             Gain         gm*ro                                                -
3.2 Primitive models 54
o
same
and
tain
ides
rge
early
em
tes
uses
during
nts of
o be
The first five equations all calculate power of digital blocks. The first tw
models capacitance of the same ripple adder. Both equations are derived for the
adder with different correlations on the input.
Equations 3-5 calculate capacitance switched in a memory for read, write,
no-op activities respectively. These models, derived empirically, require a cer
amount of overhead to formulate. However, once they are derived, the model prov
high degree of accuracy at a fraction of the time it would require to simulate a la
memory. Designing systems which are dominated by memory accesses, this
overhead of model derivation will enable fairly accurate estimation of total syst
power at an early stage of the design process.
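As a concrete use of these models, the memory equations of Table 3.1 (capacitance switched per access, in fF) can be turned into a power estimate via P = C * Vdd^2 * f. The supply voltage and access rate below are assumed values for illustration.

```python
# Empirical memory capacitance models of Table 3.1 (fF switched per access).
def memory_cap_fF(op, words, bitwidth):
    if op == "read":
        return 9707 + words * 108 + bitwidth * 1126 + bitwidth * words * 6
    if op == "write":
        return 7997 + words * 117 + bitwidth * 759 + bitwidth * words * 9
    return 489.0   # no-op

# Power from switched capacitance: P = C * Vdd^2 * f_access (assumed Vdd, f).
def memory_power_mW(op, words, bitwidth, vdd=1.5, f_access_hz=1e6):
    cap_F = memory_cap_fF(op, words, bitwidth) * 1e-15
    return cap_F * vdd ** 2 * f_access_hz * 1e3   # milliwatts

# One of the 2048 x 8 memories from Example 1, read on every access.
print(memory_power_mW("read", words=2048, bitwidth=8))   # about 0.76 mW
```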
Equations 6 and 7 are for the same common source amplifier. They illustrate the different abstractions available when creating and evaluating equations. Equation 6 uses the abstract concepts of transconductance and output resistance most often used during the design process. Equation 7, on the other hand, derives gain from basic elements of transistor sizing and characteristics of the process, enabling different aspects to be considered during the design process.
Table 3.1 Example primitive equation models (continued)
#   Subcomponent                      Cost   Model                             Unit
7   Common source amplifier           Gain   sqrt(2*ID*u*Cox*W/L)*(1/(lambda*ID))   -
8   Inverter                          Delay  Vdd*2.3e-9/(Vdd-0.48)^2           ns
9   Inverter (slow process corner)    Delay  Vdd*3.0e-9/(Vdd-0.50)^2           ns
10  Inverter (fast process corner)    Delay  Vdd*1.55e-9/(Vdd-0.40)^2          ns
The delay entries (8, 9, 10) model the delay of a long channel inverter for three different process corners. Equation 8 may be used for nominal estimations, but by also having the corner models available, worst case analyses can be performed early in the design process.
Note the inverter example was chosen for simplicity of description. Any element can be modeled over process corners without a large amount of extra overhead.
The conceptual-level environment can be used to facilitate the process of accuracy analysis. Differences between theory and either measured or simulated results can often be captured with equations. The impact of the accuracy differences of the primitives can be measured at the system level by propagating these equations through the composite models. Two cases are presented to demonstrate how accuracy can be modeled in primitives: in the digital domain, the mismatch of signal statistic estimation is used, and for the cost function of delay, the effect of process variations is exhibited.
In entries 1 and 2 of Table 3.1, capacitance dissipated in a ripple adder is modeled with different equations. In equation 1, it is assumed that the adder's inputs are uncorrelated - in other words, to behave like uniform white noise. In equation 2, it is assumed that the inputs are highly correlated, giving a much lower power estimate. One can model an average of the two and an error bound:
Capacitance = 212 fF x bitwidth +/- 56 fF x bitwidth (Eq 3-1)
By enabling designers to use equations which cover ranges of solutions, average, best case and worst case estimations can be processed simultaneously and simply.
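Eq 3-1 amounts to carrying a nominal value and an error bound through the analysis together. A minimal sketch of such bound propagation follows; the class is a hypothetical illustration, not part of the described environment.

```python
# Sketch of interval-style propagation for Eq 3-1: each quantity carries a
# nominal value and an error bound, so average and worst case fall out of
# the same evaluation.
class Bounded:
    def __init__(self, nominal, err):
        self.nominal, self.err = nominal, err

    def __mul__(self, k):   # scale by a constant (e.g., a bitwidth)
        return Bounded(self.nominal * k, self.err * abs(k))

    def __add__(self, other):   # sum bounds pessimistically
        return Bounded(self.nominal + other.nominal, self.err + other.err)

    def __repr__(self):
        return f"{self.nominal} +/- {self.err}"

adder_cap_per_bit = Bounded(212.0, 56.0)   # fF per bit, from Eq 3-1
cap16 = adder_cap_per_bit * 16             # a 16-bit adder
print(cap16)                               # 3392.0 +/- 896.0 (fF)
```

Composing such values through a composite model yields average, best and worst case system estimates in one pass, as the text describes.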
Equations 8, 9 and 10 model delay through an inverter chain for a number of different process technologies. Unlike the previous example, a simple linear expression is not sufficient to process best and worst case analyses. As will be seen, in a
conceptual-level environment, these equations can be utilized to instantaneously evaluate the effects of process corners.
3.2.2 Tables
Tables are a reasonably quick primitive estimation method. They are useful for models which are not easily encapsulated as closed form expressions. In particular, non-linear functions and discrete components are prime candidates for modeling with a table. Table lookup may also be a viable means of handling more abstract intellectual property (e.g., DSP cores).
Derivation: Tables are created empirically from either measurements or simulations. A large table can be created relatively painlessly by running a tool in batch mode with varying input parameters. Choosing the proper number and type of entries is a challenge in that you have to predict requirements. Since model creation is usually not in the critical path of the design flow, large tables can be created. In addition, interpolation and extrapolation methods can be used during the estimation process if the needed values are not pre-calculated.
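The lookup-with-interpolation behavior just described can be sketched as follows; the delay-versus-Vdd sample points are invented for illustration.

```python
import bisect

# Sketch of a table primitive: exact values come from pre-characterized
# points, values in between are linearly interpolated, and values outside
# the table are clamped to the nearest endpoint.
def table_lookup(table, x):
    """table: list of (param, cost) pairs sorted by param."""
    xs = [p for p, _ in table]
    if x <= xs[0]:
        return table[0][1]
    if x >= xs[-1]:
        return table[-1][1]
    i = bisect.bisect_left(xs, x)
    (x0, y0), (x1, y1) = table[i - 1], table[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Hypothetical characterization data: delay (ns) versus supply voltage (V).
delay_vs_vdd = [(1.5, 4.2), (2.5, 1.9), (3.3, 1.1), (5.0, 0.7)]

print(table_lookup(delay_vs_vdd, 2.9))   # about 1.5 ns
```

Clamping at the endpoints is one policy; as noted below, extrapolating instead trades a wider range for more inaccuracy.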
Speed of Evaluation: Tables are very fast to evaluate. Though the estimation time may go up with the size of the table, it is almost never prohibitive to conceptual-level exploration.
Accuracy: Tables are as accurate as the tool, or the measurement, utilized to create the table. Interpolation and extrapolation can be used to find values not in the table, but this will increase the inaccuracy. Accuracy can be modeled in a number of different ways for table look-up. If it is a consistent error, a +/- of either a percentage or an absolute amount could be included. If the accuracy is dependent upon the entry in the table, a second value can be included in the table to indicate the model accuracy.
Uses: Tables are useful for non-linear functions and when the overhead to extract an analytic model from empirical data is excessive. Commercial products, either discrete components or intellectual property, can be encapsulated in tables. This will become increasingly useful as system design using companies' intellectual property increases. Section 3.2.3 illustrates the use of tool invocations as models, which are also used for many of these same reasons. As tool invocations can be prohibitively time consuming, tables can be used to cache information in order to reduce estimation time.
3.2.2.1 Example Table Macromodels
Example 2 Instruction-level modeling of microprocessor
Black box modeling of microprocessors was introduced as a method for power analysis of software power optimizations [15]. In Table 3.3, a breakdown of power by instruction for a 486DX2 processor is presented. The models were derived empirically on a test bench, but also could be derived from a more abstract description ranging from transistor level simulations to a high-level architectural description. Though too low level of an abstraction to be used directly at the conceptual level, Wan demonstrated the ability to, while sacrificing
Table 3.3 Example table look-up for a microprocessor [15]
Subcomponent   Cost    Model         Unit
MOV            Energy  302.4*Vcc/f   nJ
ADD DX,BX      Energy  428.3*Vcc/f   nJ
JMP label      Energy  1019*Vcc/f    nJ
CMP BX,DX      Energy  298.2*Vcc/f   nJ
NOP            Energy  257.7*Vcc/f   nJ
accuracy, take a higher level description and parse it down to base instructions [30]. As there is no expression to capture all of these costs, only a table lookup is applicable for this type of instruction-level macromodel.
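A hedged sketch of how such a table primitive might be exercised: summing the per-instruction models of Table 3.3 over a short instruction trace. The supply and clock values below are assumptions, and the units follow the table as given (Vcc in volts, f the clock frequency).

```python
# Per-instruction energy table from Table 3.3: E = k * Vcc / f (nJ).
ENERGY_COEFF = {
    "MOV": 302.4,
    "ADD": 428.3,
    "JMP": 1019.0,
    "CMP": 298.2,
    "NOP": 257.7,
}

# Score a straight-line instruction trace against the table; assumed
# operating point of 5 V at 66 MHz.
def trace_energy_nJ(trace, vcc=5.0, f_mhz=66.0):
    return sum(ENERGY_COEFF[op] * vcc / f_mhz for op in trace)

print(trace_energy_nJ(["MOV", "ADD", "CMP", "JMP"]))   # about 155 nJ
```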
Example 3 ADC and commercial parts
Analog-to-digital conversion is inherently a non-linear function. Table 3.2, a Maxim data sheet of their low-power analog-to-digital converters, illustrates the usefulness of using a table for macromodeling not just non-linear components, but off-the-shelf components. There are over 38 parts described with 9 different cost functions. This information is easily captured in a table look-up.
Tables do not need to sit local to the conceptual-level environment. Datasheets from various companies can be incorporated into an estimation system through remote access. Thus evaluating systems using companies' parts or IP is expedited.
Table 3.2 Maxim Data Sheet
Subcomponent   Cost   Model   Unit
TO BE ATTACHED
[Figure 3.5 reproduces a Maxim ADC selection table: part number (MAX114 through MAX199), resolution (8 to 12 bits), sample rate (kHz max), conversion time (us), input channels and reference voltage (external or internal), data-bus interface, supply voltage, input ranges, EV kit availability, features, and 1000-up price.]
Figure 3.5 Maxim Data sheet for ADCs
3.2.3 Dynamic Models
One of the key tenets of this work is to use the proper level of abstraction for the estimation required. When both the need and the means are there, a tool can be encapsulated as a model. This encapsulation is a dynamic model in the sense that the input specification to the tool is modified dynamically dependent upon the input parameters.
Derivation: The derivation of a dynamic model consists of:
1) the creation of a parameterizable input specification. An input deck to a given tool must be created such that specific parameters can be modified.
2) the encapsulation of the specification into a generic format. This enables the tool-call to have the uniform interface required of a macromodel.
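Step 1 can be sketched as a templated input deck. The SPICE-style netlist below is purely illustrative, not taken from the thesis; a real dynamic model would write such a deck to disk and invoke the tool on it.

```python
# Sketch of a parameterizable input specification: a deck template whose
# supply voltage and transistor width are filled in from the model's
# input parameters at evaluation time.
DECK_TEMPLATE = """* inverter delay deck (illustrative)
.param VDD={vdd} WN={wn}u
VDD vdd 0 {vdd}
M1 out in 0 0 NMOS W={wn}u L=0.6u
.tran 0.1n 20n
.end
"""

def make_deck(**params):
    """Instantiate the deck for one set of input parameters."""
    return DECK_TEMPLATE.format(**params)

deck = make_deck(vdd=3.3, wn=4)
print(deck.splitlines()[1])   # .param VDD=3.3 WN=4u
```

Step 2 would wrap `make_deck`, the tool invocation, and output parsing behind the same evaluate-style interface as any other macromodel.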
Chapter 5 illustrates the methodology for creating dynamic models in detail.
Evaluation time: There are two major components of the evaluation time when using a tool: specification and evaluation.
There is a significant time and design commitment required to do many lower-level simulations, so the designer must decide how to make the time vs. accuracy trade-off. A common mistake follows the proverbial 80-20 rule: 80% of the time is focused on 20% of the problem. By allowing the designer to only use tools when important, they are not forced to spend a lot of time working on a small part of a problem.
Accuracy: A tool invocation is often used to provide greater accuracy than possible with equation models. The accuracy of the estimate, however, is dependent on the accuracy of both the tool and the input specification. A tool invocation does have the advantage of giving an accurate representation of costs when input parameters are out of the normal operating bounds. For example, though an
analytical model may not capture the effects of an amplifier saturating, a tool invocation to SPICE would accurately track such behavior.
A tool should be used for a macromodel for a number of different reasons, such as:
Dominant subcomponents: A tool is usually the most accurate means to perform an estimate. When it is known that a specific sub-component is the dominant contributor to a cost, it may make sense to trade off increased estimation time for greater accuracy.
Non-linear functions: A tool can also be used to replace a table when there is not a good closed form expression for a cost. Though the estimation time may be longer, using a tool will not require the pre-characterization over a wide range of input parameters that creating a table entails.
Complex systems: When a system is not well understood, the quickest means to get an estimate may be a lower-level tool invocation. In other words, instead of trying to analyze a complex system in order to create an analytical model, a tool can be utilized.
3.2.4 Example trade-offs in primitive selection
The following examples illustrate trade-offs in primitives. The overhead required in model evaluation for deriving input specifications and tool invocation is shown at both the architectural and circuit abstraction. Each example utilizes tools which themselves can be encapsulated, as macromodels, in a conceptual-level environment.
Example 4 Comparing two dynamic models
In this example the time versus accuracy trade-off common in the
choosing of the proper abstraction for dynamic models is explored. The
architectural level tool SPA is compared with SPICE.
Landman [10] presented an architectural-level power estimation tool for digital integrated circuits. Landman's technique models each subcomponent of a digital library (e.g., adders, multipliers, etc.) as a function of the characteristics of the signals at the input and output of each block. With proper characterization of the subcomponents and with the use of proper input vectors, the power of a digital system can be estimated within 15% of SPICE in a fraction of the estimation time. Landman's technique divides a digital word into three parts: LSBs, MSBs, and transition bits. As Figure 3.6 illustrates, the MSBs, the sign bits, switch much less frequently than the lower-order bits. By defining the
Figure 3.6: Bit-wise characterization of switching activity. P(0→1) is plotted against bit position (0-14) for signals of varying correlation ρ, showing a uniform-white-noise (UWN) LSB region and a sign-bit (MSB) region separated by breakpoints BP0 and BP1.
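The characterization behind Figure 3.6 can be reproduced empirically from sampled data. A small Python sketch that measures the per-bit P(0→1) of a stream of integers; the ramp signal below is an invented stand-in for a real, correlated data stream:

```python
# Minimal sketch of bit-wise switching characterization: measure the
# 0->1 transition probability of each bit position over a sample stream.
# For a slowly varying (correlated) signal, the sign/MSB region toggles
# far less often than the LSB region, as in Figure 3.6.
def bitwise_p01(samples, width=14):
    counts = [0] * width
    for prev, cur in zip(samples, samples[1:]):
        for b in range(width):
            # Count only 0 -> 1 transitions of bit b.
            if not (prev >> b) & 1 and (cur >> b) & 1:
                counts[b] += 1
    n = len(samples) - 1
    return [c / n for c in counts]

# Example: a slowly ramping 8-bit signal; LSBs toggle, MSBs rarely do.
probs = bitwise_p01(list(range(0, 256)) * 4, width=8)
```

A characterization pass like this is what supplies the per-region statistics that an architectural power estimator consumes.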
three regions, and the stochastic nature of each of those regions, the digital estimate can be achieved orders of magnitude faster than a SPICE simulation. This type of estimate, though much quicker than SPICE, still requires significant time both in model creation and input specification. Deriving the input specifications (i.e., the signal statistics) requires that the design be specified completely in VHDL. In addition, input vectors must be derived. VHDL creation and debugging and input specification may take hours to days for a relatively small design. Each small change in the design may require significant time to rewrite and debug code.
A design modeled with SPA may enable a much quicker estimation than a design modeled in SPICE, at the expense of accuracy. For a design that is well defined and significantly designed, SPA can provide estimates much quicker than SPICE with a relatively high degree of accuracy. At the conceptual stage of the design process, however, this approach could still be prohibitively time consuming. The designer should only choose to use this tool as a primitive for parts of the circuitry which are well defined and specified.
Example 5: Dynamic model versus analytical expressions
In this example, the accuracy of a dynamic model is contrasted with the speed of an analytical model. There are secondary effects of this trade-off, flexibility and confidence, that will also be addressed with this example.

The accuracy that SPICE can provide makes it one of the most widely used tools by IC designers. SPICE, however, requires a detailed circuit description and, unlike SPA, can require significant time to evaluate. In this example of a folded cascode op-amp, gain, power, and bandwidth are the cost functions of interest.
For a specific function, an analog designer will often know what architecture is most suited, but needs to optimize it. For example, a folded cascode is a common building block in low-voltage CMOS designs (especially switched-capacitor circuits) in that it provides high gain-bandwidth product, with good headroom and power supply rejection, all in a single stage. Evaluating the feasibility of a specific architecture can often be done with a small number of simulations/estimations.
Figure 3.7 shows the schematic of the cascode op-amp². Figure 3.8 illustrates a corresponding SPICE description³. The specification of the
2. The bias voltages are set to provide dc bias currents.
3. Note, this SPICE description is not a macromodel. To make it into a macromodel would require writing the model in terms of the input parameters of interest, and using scripts to substitute in the proper values and re-running.
Figure 3.7: Folded cascode op-amp (inputs V+ and V-, output Vout, bias inputs Bias, Bias1, and Bias2, compensation capacitor CC, supplies VDD and VSS)
design is now a matter of choosing transistor sizes, bias currents, and the compensation capacitor, CC. Inputting that information into the specification is relatively trivial; however, estimation time can be prohibitive. Calculating both gain and bandwidth requires ~7 seconds on a Sparc 20.
The gain and bandwidth can also be estimated by simple approximations [21].
Folded Cascode
vdd vdd 0 dc 3
vss vss 0 dc -3
m1 3 posin 1 1 pmos w=240u l=10u
m2 2 negin 1 1 pmos w=240u l=10u
m3 9 5 3 vss nmos w=40u l=10u
m4 out 5 2 vss nmos w=40u l=10u
m5 7 6 vdd vdd pmos w=92u l=10u
m6 6 6 vdd vdd pmos w=92u l=10u
m7 out 9 7 7 pmos w=92u l=10u
m8 9 9 6 6 pmos w=92u l=10u
m9 2 4 vss vss nmos w=80u l=10u
m10 3 4 vss vss nmos w=80u l=10u
m11 1 8 vdd vdd pmos w=480u l=10u
*BIAS:
V4 4 0 -1.6
V5 5 0 0.1
V8 8 0 1.8
CL out 0 10p
*OPERATION
Vcommon com 0 dc 0
Vin posin com dc -19.7u ac 1
vneg negin 0 19.7u
.model nmos nmos kp=60u vto=0.7 lambda=0.02
.model pmos pmos kp=26u vto=-0.7 lambda=0.02
.op
.tf v(out) vin
.ac dec 10 100 10g
.meas ac LF_gain find vm(out) at=100
.options post brief
.end
Figure 3.8: SPICE model of folded cascode
gain = gmp · Rout    (Eq 3-1)
and using a dominant pole approximation:

bandwidth_3dB = 1 / (2π · CC · Rout)    (Eq 3-2)

where

Rout = rop(1 + gmp·rop) || ron(1 + gmn·ron)    (Eq 3-3)

In a simple spreadsheet, calculating the characteristics of all the MOSFETs, as well as the gain and bandwidth with this approximation, takes a fraction of a second. Table 3.3 illustrates bandwidth estimations using SPICE versus the approximation for various device and capacitor sizes. A maximum difference of 16% from SPICE is observed. The time to run the analysis in SPICE ranged from 10 to 20 seconds, as opposed to the analytical model, which took less than a second per evaluation. The time to specify and encapsulate the model, however, was on the order of tens of minutes. In this case, it was much quicker to specify the design with an analytical model, as it is a relatively simple circuit. For larger circuits, however, the specification time of the analytical model could go up dramatically. In addition, the cost functions, gain and bandwidth, are well understood and easily modeled. Finding more complex costs (e.g., third harmonic distortion) would require a dynamic model, or perhaps a table, for all but the most trivial of circuits.

Table 3.3 SPICE versus simple models

Parameter    SPICE      Equations   %error
gmp          3.26e-4    3.16e-4     3
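These approximations are trivial to evaluate programmatically, which is essentially what the spreadsheet described in the text does. A Python sketch of Eq 3-1 through Eq 3-3; the device values below are invented and do not correspond to the operating point behind Table 3.3:

```python
import math

# Sketch evaluating the folded-cascode approximations (Eq 3-1 .. 3-3).
# Device parameters are illustrative placeholders, not the thesis's.
def folded_cascode(gmp, gmn, rop, ron, cc):
    def par(a, b):
        # Parallel combination of two resistances.
        return a * b / (a + b)
    rout = par(rop * (1 + gmp * rop), ron * (1 + gmn * ron))   # Eq 3-3
    gain = gmp * rout                                          # Eq 3-1
    bw3db = 1.0 / (2 * math.pi * cc * rout)                    # Eq 3-2
    return gain, bw3db, rout

gain, bw, rout = folded_cascode(gmp=3.2e-4, gmn=4e-4,
                                rop=1e6, ron=1e6, cc=10e-12)
```

Once encapsulated this way, the analytical model can be re-evaluated for every candidate sizing in well under a second, which is the trade-off against SPICE that the example describes.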
As each design change would require a new simulation, it is the designer's choice as to whether the accuracy presented by SPICE is worth the added estimation time.

Also note that if the range of required input parameters is bounded and known, a table can be generated outside the design flow and can provide the accuracy of SPICE without the large increase in time during exploration.
This trade-off analysis between estimation time and accuracy demonstrated in the previous two examples is characteristic of all levels of estimation. At any abstraction level, this trade-off is performed. At lower levels of abstraction, it is possible to automate significant portions of this task. Examples of this are transistor-level tools like Avanti's Star-Sim or EPIC's PowerMill [28][27]. At higher levels of abstraction, however, it is more incumbent on the designer to perform this trade-off.
Table 3.3 SPICE versus simple models (continued)

Parameter    SPICE      Equations   %error
Rout         45e6       54e6        16
gain         14.5e3     17.3e3      16
3db point    330 kHz    290 kHz     12

3.3 Composite Models

As demonstrated in Chapter 2, analyzing complex systems at the conceptual level requires partitioning the system into manageable components. These components are then associated with appropriate models and then recomposed together for evaluation. This so-called divide-and-conquer approach has been used at lower levels
of abstraction. For example, [22] uses a divide-and-conquer approach at the behavioral level for both estimation and optimization. At both the architectural and transistor levels, commercial tools, such as Avanti's Star-Sim [28], utilize a divide-and-conquer approach to estimate power and timing information.

In the aforementioned tools, the methodology is automated. At the conceptual level, the three steps of compositional modeling (partitioning, model selection, and composition) must be designer driven.
3.3.1 Partitioning
The goal of partitioning is to divide a complex system into manageable subparts. The role of conceptual models is to provide a means by which a complex system can be so subdivided.

Partitioning at the conceptual level is almost fully designer driven. It is usually done by inspection, with each major subcomponent matched to a macromodel. As the design starts to get more specified, however, there are means to facilitate the process using either a graphical or textual description. The goal in the partitioning stage is to enumerate a list of the most important functions. The three basic means of enumeration are:

Inspection: This is how it has traditionally been done at the conceptual level. A basic architecture or pseudo-code is sketched. From that, the major components are extracted in list form. Any overhead that may be created may also be noted.

Schematic entry: A schematic entry tool can be used to capture a graphical description of a design. At the conceptual level, the schematic description does not have to be functional, but is used as a means to extract key subcomponents and parameters in a visual manner.
Text entry: Likewise, a textual description can be used to describe the basic behavior of the system. Unlike traditional software descriptions, at the conceptual level this description can be a combination of structural, behavioral, and physical, and need not be fully accurate or executable.

The last two means of enumeration have the advantage that as the design gets more specified, and hence its abstraction lower, the same description can be used. At this stage, the two descriptions need not be compilable into a form that can be simulated, but as the design is further specified, the same description can be modified and used with behavioral or architectural tools. In addition, it is often useful to take a more fully described system and abstract away details in order to evaluate a complex, heterogeneous system. By incorporating schematic entry and software tools into the conceptual-level environment, the ability to go higher in abstraction when needed helps keep a uniform design flow.
3.3.2 Model Assignment
The first step in the evaluation of the subcomponents of the system is to assign each subcomponent to either a primitive or composite macromodel. This can be done either on an individual basis, using a direct method, or in groups, using an associative method.

Direct: The direct method is simply the assignment of each component to a model. This is done by matching each component with a macromodel from a list. This can be done manually, or using schematic capture or software parsing.

Indirect: By using a library-based approach, more power is added at the conceptual level. Groups of subcomponents can be assigned to types of macromodels by
simply selecting different libraries. This is done through the use of generics and domains.

Generics are simply placeholders. When using the associative method of assignment, subcomponents are mapped to generics as opposed to directly to macromodels. Domains are hash tables that map generics to actual macromodels. Example domains are hardware libraries and instruction sets. By using this associative approach, the effects of changing implementation platform can be evaluated by simply changing the domain and re-evaluating the composite model.

In this way, a change in hardware library, for example, can be quickly evaluated. Different parts of a composite model can be mapped using different domains. Hence tasks such as partitioning between hardware and software can be quickly evaluated.
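The generic/domain mechanism can be sketched in a few lines. Here domains are ordinary hash tables from generic names to macromodels; the two libraries and the per-operation energy numbers are invented for illustration:

```python
# Sketch of associative model assignment: subcomponents map to generics,
# and a domain maps each generic to a concrete macromodel. Swapping the
# domain re-targets the whole composite. Numbers are illustrative.

# Two domains mapping the same generics to different macromodels
# (trivial per-operation energy models here, in joules per operation).
asic_domain = {"add": lambda n: n * 1e-12, "mul": lambda n: n * 5e-12}
dsp_domain  = {"add": lambda n: n * 4e-12, "mul": lambda n: n * 8e-12}

# Composite model: subcomponents expressed in terms of generics.
composite = [("add", 1_000_000), ("mul", 250_000)]

def evaluate(composite, domain):
    # Re-evaluating under a different domain re-targets the design.
    return sum(domain[generic](count) for generic, count in composite)

e_asic = evaluate(composite, asic_domain)
e_dsp = evaluate(composite, dsp_domain)
```

Because the composite never names a concrete library, comparing an ASIC mapping against a DSP mapping is a one-line change of domain rather than a rewrite of the model.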
3.3.3 Composition
The majority of the exploration is performed through the creation and manipulation of composite macromodels. The properties of conceptual design exploration, as defined in Chapter 2, must be supported in this composition. Design is an iterative process, and hence once the model is constructed, modifications by the designer should be facilitated so as not to impede the design process.

The following tasks must be supported with composite models:

Enumeration of subcomponents: Composite models are most often viewed and manipulated in a list form. Hence, subcomponents need to be presented in a useful list form. Fields for inputting parameters and printing output costs need to be provided. Outputs of models need to be able to be linked into other models through the input parameters to capture model interaction.
Overhead estimation: The assemblage of a system creates overhead which can add significantly to cost. Hence, the ability to model overhead needs to be included. This provides great challenges at the conceptual level, as often the connection between subblocks, most useful in determining overhead, is not described.

A number of techniques for overhead estimation can be utilized at the conceptual level:

1) The inclusion of overhead models. Simply taking the time to model overhead, often ignored, is significant. The environment must be able to facilitate the creation and utilization of these models. For example, interconnect size associated with a circuit can be estimated by using simple area-based models [13]. Like other macromodels, these overhead models can adapt as the design gets further specified.

2) Partitioning based upon grouping is often useful in estimation. Again using interconnect as a metric, by grouping together elements which will be laid out close to each other, estimation of interconnect area can be performed with greater accuracy than estimation without partitioning [13].
Accuracy analysis: As previously discussed, the conceptual level, by means of its distance from the final representation, provides the most limited accuracy of any level of abstraction. The goal in evaluating a conceptual model is to ensure that:

1) the accuracy of the estimation is within a certain level so as to be useful. For example, when choosing between two different implementations, the accuracy of the estimation must be less than the difference between the cost of each implementation.

2) the accuracy of the estimation is understood by the designer. There should be a means to evaluate whether an estimate is within 5% or 55%.

There are a number of steps which can be taken to provide this level of understanding of model accuracy. To start, when the primitives are first created, the
accuracy of each cost estimate should also be modeled (Section 3.2). These numbers can then be propagated through the composite model using standard means such as error calculus.

For example, a number of composites are found by totaling costs of the subcomponents. Simple equations can be used to identify confidence in the total estimates. Figure 3.9 illustrates the means by which a number of calculations can be automated in spreadsheet format. A series of entries and costs are assumed. The accuracy of these cost estimates is known as a percentage. A number of useful values can be automatically calculated. For example, confidence in the aggregate cost can be determined by calculating the total error as shown. In addition, the calculation of the impact of each cost can be automated as well to aid in the identification of bottlenecks.

Once a bottleneck or major cost contributor has been identified, extra time can be taken to model this component to increase the accuracy of the estimation of the full system.
entry     cost    error     percentage
Entry1    a       +/- w%    a/Σ(a..n)
Entry2    b       +/- x%    b/Σ(a..n)
(n total entries)

total cost = Σ(a..n)        total error = (w·a + x·b + ...)/Σ(a..n)

Figure 3.9: Example error analyses
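The calculations of Figure 3.9 reduce to a few lines of arithmetic. A Python sketch with two invented entries (costs of 4.0 at ±5% and 1.0 at ±50%):

```python
# Sketch of the error calculus in Figure 3.9: aggregate cost, each
# entry's percentage impact, and the weighted total error, given
# per-entry accuracies. Entries and accuracies are illustrative.
entries = [("Entry1", 4.0, 0.05),   # (name, cost, +/- fractional error)
           ("Entry2", 1.0, 0.50)]

total = sum(cost for _, cost, _ in entries)
impact = {name: cost / total for name, cost, _ in entries}
# Weighted total error, as in the figure: (w*a + x*b + ...) / sum(costs)
total_error = sum(err * cost for _, cost, err in entries) / total
```

Note how a large relative error on a small contributor (Entry2) is discounted by its weight: the aggregate estimate here is good to about 14% even though one entry is only known to 50%.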
Use of global parameters: Aspects which are common to many or all components in a system should be supported with global parameters. Example global parameters are supply voltage, frequency, and process information.

Global functions/views: Certain functions may be performed on all or most of a composite on an individual basis. For example, when power is the cost metric of interest, the calculation of C·V²·f will be performed on all the digital components. In addition, there may be functions which are used on all of the elements. For power estimation, for example, the summation of all the power of the subcomponents gives the power for the whole composite. The use of global functions produces the view of a composite model as defined in Section 3.1.3.

Plotting: Using plots for analysis is a key part of the design process. Two types of plots should be supported for conceptual models. Bottleneck analysis is the plotting of the contribution of each component to a certain cost (for example, which component contributes the most to the delay through a circuit). Sensitivity analysis allows the changing of one or more parameters and the measuring of the effect on the global cost functions.

In addition to these functions, a number of other functions should be supported by the user interface to the conceptual model:

• Hierarchy tracking: The ability to track the hierarchy inherent in a complex system is important in composite models.
• Documentation: Documentation should be easily attached to composite models.
• Model manipulation: Being able to change the association of macromodels needs to be supported. The approach to making system changes using generics and domains (Section 3.3.2) needs to be included.
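Both plot types can be driven by simple calculations over a composite model. A Python sketch, assuming an invented three-component delay model: bottleneck analysis picks the largest contributor, and sensitivity is approximated by a central finite difference:

```python
# Sketch of the two plot types described in the text. The delay model,
# its components, and its parameter dependence are all invented.
def delay_model(params):
    # Per-component contributions to a global cost (total path delay).
    return {"alu": 4.0 / params["vdd"],
            "mem": 9.0 / params["vdd"],
            "bus": 2.0}

params = {"vdd": 3.0}
contrib = delay_model(params)                 # bottleneck analysis data
bottleneck = max(contrib, key=contrib.get)    # largest contributor

def sensitivity(cost_fn, params, key, step=0.01):
    # Central finite difference of the total cost w.r.t. one parameter.
    lo, hi = dict(params), dict(params)
    lo[key] -= step
    hi[key] += step
    total = lambda p: sum(cost_fn(p).values())
    return (total(hi) - total(lo)) / (2 * step)

s_vdd = sensitivity(delay_model, params, "vdd")
```

The contribution dictionary is what a bottleneck bar chart would plot, and the finite-difference slope is one point on a sensitivity curve.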
As will be seen in Chapter 4, the use of hyperlinked spreadsheets is key in supporting all of these tasks.
Example 6: Composite model
To illustrate some of the basic aspects of composite models, a multimedia system is presented. This example demonstrates how conceptual-level analysis can be used on large systems.
The multimedia wireless InfoPad terminal [51] is a good example of a
highly heterogeneous, complex system (Figure 3.10). It includes custom
digital chips, a commercial microprocessor for signal processing, error
correction, a radio subsystem with analog and digital components, nine
mixed-signal voltage regulation systems, display drivers, and an I/O
interface.
This system was designed by many different designers and requires a
variety of both skills and tools. The composition of the topmost levels of
the InfoPad system is shown in Figure 3.11. The entire system breakdown
would be too large to be captured in a single-page description. The
Figure 3.10: InfoPad, a multimedia wireless terminal (radio, voltage regulation, video chips, µProc, display drivers, I/O, error correction)
system is partitioned functionally into seven subsystems. Each of these
sub-systems can also be represented as compositions of other systems.
Hence one representation of the InfoPad is a composite model of seven composite models. Each of those composite models is made of other composite or primitive models. This tree-like structure continues until all nodes are terminated with primitive models. In the case of the InfoPad, the primitives themselves vary from commercial parts, to fully custom ASICs, to sub-circuits of integrated circuits.
Figure 3.11: Composition of the wireless multimedia system InfoPad. The top level comprises the text/graphic display, the Arm system (Arm 60 chip, PAL chip, EEPROM, SRAM, 20 MHz clock), the video display (LCDs, back-light, SRAM, level converter), the digital ASICs (Rx and Tx chips, text chipset, A/D converter, Arm interface, buffers), crystals (3.68, 2.46, and 2.05 MHz), DC-DC converters (9→5 V, 5→−5 V, 5→1.5 V), and the radio (Plessy downlink, Proxim uplink).
As will be shown in the next chapter, one of the most powerful means of
interfacing with a composite model is through a hyperlinked spreadsheet
which is previewed in this example to give a feel for the use of composite
models.
Figure 3.12 shows a conceptual view of the InfoPad's composite model via a hyperlinked spreadsheet analysis. Each of the seven composites shown in Figure 3.11 is represented with a single-line entry in the table. In this case, the view of power is shown, though any relevant cost function(s) can be viewed.

Each of the seven composites is linked to the InfoPad composite. Figure 3.13 illustrates the composite model of the radio, which is linked to from the InfoPad composite. Each model in the composite for the radio has a line entry in the spreadsheet. In this way, designers can intuitively
Figure 3.12: Conceptual view of the InfoPad terminal
transcend levels of hierarchy. In addition, from any level in the system, costs can be evaluated by simply hitting the "PLAY" button.

Also worth noting is that many engineers work on systems like the InfoPad. This tool could be used for management purposes. The head of the project can control the upper levels of hierarchy, and group leaders can control lower spreadsheets. This enables the encapsulation of the work of an entire group into a single environment. This type of system also provides a good backbone for cost budgeting and documentation management.
Figure 3.13: InfoPad’s radio sub-system
3.4 Summary
This chapter presented one of the base requirements for conceptual-level exploration of systems: macromodeling. In the following chapter, the necessities for the infrastructure which utilizes this form of macromodeling will be presented.
Chapter 4

Technical Enablers of Conceptual-Level Design Environments
This chapter focuses on implementation-related issues of a conceptual-level environment. Creation of a supporting environment requires that a number of technical enablers be present. The first, conceptual macromodeling, was introduced and detailed in Chapter 3.

This chapter presents three other important enablers. The first, hyperlinked spreadsheets, is a new form of spreadsheet which provides a powerful interface to composite macromodels. The WWW provides a powerful distributed backbone for a conceptual-level environment. Relevant aspects of the WWW will be detailed, and techniques for utilizing the WWW will be presented in Section 4.2. Object-oriented design techniques share many similarities with the design of a complex system. Those
similarities will be illustrated, and aspects that the proposed environment utilizes will be detailed in Section 4.3.
4.1 Hyperlinked Spreadsheets
A hyperlinked spreadsheet is proposed as a new medium for interfacing with compositional models. Spreadsheets in general provide a powerful means to work with a design description, providing ordered lists of components and an intuitive means to interface with parameters and costs. As a basis for interfacing with compositional macromodels, this new type of spreadsheet adds these important functions:

• evaluation of any type of heterogeneous macromodel
• intuitive links to macromodels and design functions supporting hierarchy
• fully integrated documentation

Figure 4.1 illustrates the important components of a hyperlinked spreadsheet. In the main spreadsheet there are a number of important functions integrated to facilitate composition. Each row of the spreadsheet is dedicated to a structural, behavioral, or overhead-related function of the system description. The most important features, with respect to conceptual design, are detailed:

Macromodel evaluation: In traditional spreadsheets, cost functions are evaluated using explicit mathematical functions. In the hyperlinked spreadsheet, a linkage to an evaluation means (i.e., macromodels) is used. When the spreadsheet is evaluated, each row entry is evaluated using the appropriate macromodel. These
models, as discussed in Chapter 3, are not limited to equations, but can be any type of primitive or composite model.

Linkage to macromodels: The macromodels associated with each row entry are linked to via hyperlinks. This functionality enables the designer to effortlessly flow through the hierarchy of a design. The entire design tree structure is traversable through hyperlinks. In this implementation, the links are located in the first column.

Linkage to documentation: As well as including documentation on the spreadsheet itself (not shown), links to further documentation, and to important WWW locations, can also be included. This documentation can be created by the user or other designers and can be located anywhere on the Inter-/Intra-net.
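A row of such a spreadsheet can be sketched as ordinary HTML: the first column carries the hyperlink to the row's macromodel, and the cost column holds that macromodel's evaluated output. The URL, the row name, and the model below are invented for illustration:

```python
# Sketch of rendering one hyperlinked-spreadsheet row: a hyperlink to
# the row's macromodel page plus the macromodel's evaluated cost.
# The URL, parameters, and model are illustrative placeholders.
def render_row(name, model_url, macromodel, params):
    cost = macromodel(params)                  # evaluate on demand
    return ('<tr><td><a href="%s">%s</a></td>'
            '<td>%.2f</td></tr>' % (model_url, name, cost))

row = render_row("radio", "radio.html",
                 lambda p: p["tx_power"] * p["duty"],
                 {"tx_power": 0.5, "duty": 0.2})
```

Following the link would open the radio's own spreadsheet, which is how the flat table supports traversal of the design hierarchy.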
Figure 4.1: Block diagram of a hyperlinked spreadsheet. The main spreadsheet holds columns for names, identifiers and hyperlinks, input parameters, cost estimates, views, and values; global parameters, global views, and global costs appear below the main spreadsheet.
Various Views of a Design: The third and fourth columns on the spreadsheet are where various cost functions are presented. The cost estimates (third column) are costs that are analyzed using a method specific to a macromodel. The views column (fourth column) provides a means for taking outputs from macromodels and processing them in the same manner. Though the results of both are simply cost estimates of a system, there is an important differentiator as to how they are analyzed: costs in the cost-estimates column are output directly from a macromodel, specific to each row, while costs in the views column are calculated using the same method for each row in the spreadsheet.

A good example of the use of views is in the calculation of digital power dissipation. While the effective capacitance, Ceff, switched for each row may be calculated using different models, the translation of capacitance switched to power dissipated is the same: P = Ceff·V²·f. The capacitance switched would be calculated with row-specific macromodels, while power could be calculated using a global model and thus presented in the views column.

The bottom of the spreadsheet can be used for doing calculations on the entire spreadsheet. Examples of useful global cost functions are total, average, max, and min of specific columns. These values can be used to identify which information should be abstracted up to a composite model which may be referring to this model.
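A view can be sketched as a single global function applied to every row's model output. The row models, supply voltage, and frequency below are invented placeholders:

```python
# Sketch of a "view": each row estimates its switched capacitance with
# a row-specific macromodel, while the view applies the same global
# function, P = Ceff * V^2 * f, to every row. Values are illustrative.
rows = {
    "datapath": lambda: 120e-12,     # Ceff from, say, an architectural model
    "memory":   lambda: 300e-12,     # Ceff from, say, a table lookup
}
GLOBALS = {"vdd": 1.5, "freq": 50e6}  # global parameters

def power_view(ceff, g=GLOBALS):
    # One global function shared by every row: P = Ceff * Vdd^2 * f.
    return ceff * g["vdd"] ** 2 * g["freq"]

power = {name: power_view(model()) for name, model in rows.items()}
total_power = sum(power.values())     # a global cost at the sheet bottom
```

Changing a global parameter (say, the supply voltage) re-evaluates the view for every row at once, without touching any row-specific macromodel.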
Global Parameters: In hyperlinked spreadsheets, the separation of global parameters is useful for defining which parameters can be adjusted from higher levels of abstraction. By defining what views and global parameters are enabled, the designer in a sense is defining a specific perspective or view on a design.

The use of hyperlinked spreadsheets has been successfully transferred to industry. Sente Inc. [50], first introduced to hyperlinked spreadsheets in 1996, released a WWW interface to their power estimation tool (March 1998) which utilizes a
hyperlink spreadsheet interface [50]. Sente's software, though focusing on a lower level of abstraction, utilizes many of the functions described in this section.
4.2 System design on the World Wide Web (WWW)
As the size of electronic systems has increased, distributed databases and distributed tools have become a necessity. As will be described, the distributed infrastructure of the World Wide Web is the proper backbone for a complex-system design environment for a plethora of reasons. First, an interesting exploration is to look at the goals of the designers of the WWW and how they align with the goals of complex-system designers.
Figure 4.2Sente’s hyperlinked spreadsheet derived from PowerPlay
4.2.1 The Early History of the WWW
Tim Berners-Lee invented the World Wide Web in late 1990 while working at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. In defining the early system and protocols, he and his co-authors defined a number of goals that they were hoping to fulfil with the environment. The main goal, as stated in a proposal to CERN, was an organization of both their documentation and their design process for various aspects of their particle physics research [33]. As their original intention was to design a support system for an admittedly complex research project, it is not a far stretch that the WWW would turn out to provide the ideal infrastructure for complex system design.

In fact, designing complex systems on the WWW is one of the original goals of the WWW founders. In the original documentation of the WWW, the authors state as one of their primary goals to design a system where "collaborative design of something other than the hypertext itself" is supported [34].

Another early goal of the WWW was to design a working environment where "we have an intuitive space in which we can put down our thoughts and build our understanding of what we want to do and how and why we will do it" [34]. This is exactly the goal of a conceptual-level environment: to create an environment where intuitive design is completely supported.

In addition, the authors recognized the importance of keeping the WWW a "platform independent" environment. Large systems require a number of different types of tools, and hence designers are more likely to be using different computers and operating systems. A single complex project can require input from as many as tens, to hundreds, of designers and managers. The WWW was specifically designed to handle this sharing of the evolving information associated with a complex design. As
information management is a large part of complex-system design, using (arguably) the present-day standard for information transfer is a natural progression.
4.2.2 Basic WWW architecture and protocols
To understand how to design CAD systems to be integrated on the WWW, the basics of the architecture must be understood. The WWW is actually a series of protocols which enable a so-called web of unlimited clients and servers. The most common clients are the classical browsers (e.g., Netscape Communicator, Microsoft Explorer), and the most common servers are WWW servers.

The basic architecture is shown in Figure 4.3. This system is much like the classic client-server configuration, with some notable exceptions. The first is that there is an unlimited number of clients and servers. Disregarding security measures (e.g., firewalls), any server can be connected to any browser. In addition, the WWW sits on the TCP/IP-based infrastructure of the Internet. The specifics of this packet-based addressing protocol are not necessary for studying this text; the reader is referred to [38] for further details. The most important implication of the way this packet-based interconnect is used is that the connection between a client and a server is discontinuous.

These exceptions present a number of challenges:

• There cannot be a continuous program running holding state for a user as they move forward and backward throughout the design process. Each page on the client side must hold all the information needed to keep track of a design.
• Design management is complicated: In a multi-user project, a portion of a design may be "checked out", so only one user can update it at a time. In a "world-wide" system, this form of design management becomes more complex.
Identification and security issues are exacerbated in this somewhat open environment. When the first prototype for PowerPlay was written, there were no standards for security and identification. Security has since become much stronger; however, the issue of identity still poses some challenges. Though the basic security problems have been solved, Section 5.1.11 will present programs which make user customization and security more convenient.

Different tasks are performed on the Internet by using a number of different protocols. For example, the File Transfer Protocol (FTP), as its name implies, is used to transfer files between machines. The Simple Mail Transfer Protocol (SMTP) is a protocol used for handling mail. The introduction of the HyperText Transfer Protocol (HTTP), combined with Universal Resource Locators (URLs), was one of the chief enablers of the WWW. Their introduction enabled a uniform method to transfer any form of multimedia and the now common use of hyperlinks to link pages and multimedia.

The last major pieces of the puzzle were the HyperText Markup Language (HTML), which became a standard for creating web pages, and the Common Gateway Interface (CGI), which enabled programs to be called remotely across the network.
Figure 4.3 is a simplified view of the WWW drawn to describe the different transactions. For simplicity, only a single server and browser is shown. The server's file system includes two special directory structures accessible over the WWW. Files, with the proper permissions, in the public_html structure can be accessed over the internet. Programs, with the proper permissions, in the cgi_bin directories can be executed. Programs can perform operations on files anywhere on the file system if so configured. A single computer can be easily configured as both a browser and a server.
There are a few major forms of transaction on the WWW:

Linking to a page: The most common interaction is the hyperlink to a page. The address part of the URL is used to locate the server over the Internet. The file that is accessed must be in a location termed public_html. In this way only certain areas of the server are accessible over the WWW. The file located at the server is then transferred over the Internet in packets. The browser then interprets the HTML-encoded text and prints it to the screen.

Figure 4.3 World Wide Web Transactions

Linking to multi-media: A powerful aspect of the web is that linkage to other types of files (e.g., audio, video) occurs in the same manner as in the passing of a page. Identifiers after the “.” in the file name indicate the type of media. If the browser is configured for this type of file, it will interpret and present the multimedia information.

Remotely calling a program: The server also has special locations from which programs can be called over the WWW. These programs are stored in cgi-bins, short for common gateway interface bin. These programs can perform data manipulation on areas anywhere on the server system. The cgi-bin program returns information to the browser by printing to STDOUT, which gets routed, using the HTTP, to the browser. Though often simple file manipulations, there is no limit to the complexity of these programs.

Cgi-bin programs can be called either through the entry of forms or through hyperlinks. In the former method, the user can alter the information passed to the program; in the latter, the information is fixed when a page is printed to the browser.

Plug-ins/Java: At the browser's discretion, programs can be downloaded to the browser and run. The most common of these is Sun Microsystems' Java; however, specialized plug-ins for different media (e.g., streaming audio and video) are becoming more prevalent.
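The cgi-bin reply mechanism described above can be sketched as follows. This is an illustrative Python fragment, not PowerPlay's implementation (whose cgi-bin scripts are Perl); the function and page names are hypothetical. The essential point is that a CGI program answers the browser simply by printing an HTTP content header, a blank line, and the document body to STDOUT, which the web server routes back over HTTP.

```python
# Minimal sketch of a cgi-bin style program: the reply is simply
# printed to STDOUT and routed back to the browser over HTTP.
import sys

def respond(title, body):
    # The blank line after the header separates headers from the document.
    sys.stdout.write("Content-type: text/html\n\n")
    sys.stdout.write("<html><head><title>%s</title></head>\n" % title)
    sys.stdout.write("<body><h1>%s</h1>\n%s\n</body></html>\n" % (title, body))

if __name__ == "__main__":
    respond("Hello", "<p>Printed to STDOUT and routed to the browser.</p>")
```

Run under a web server's cgi-bin directory, the printed text would appear in the browser as a rendered page.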
A design environment on the WWW is made up of a number of discrete programs. Hence it is possible to partition the design of the environment into different types of transactions. A few general rules of thumb can be used in choosing the type(s) of transactions to use.

Functions which require a great deal of time-intensive computation should reside on the server, as computation duration can usually be minimized with the choice of a faster, perhaps compiled, language. In addition, functions which require a great deal of interaction with a centralized database should also be located on the server to minimize Internet traffic. Finally, until Java becomes truly platform independent, it is generally more robust to have a program running on a single operating system.

There are a number of characteristics of programs which may benefit from running on the browser using Java. Those programs generally tend to be ones which require low latency or are graphical in nature. Prime candidates for Java CAD functions are schematic entry tools and plotting software. The hyperlinked spreadsheet is a special case, as it is highly interactive with the user for some operations but may require intensive calculations and/or interaction with a central server for model information. Hence a combination of Java for local functions and cgi-bin scripting for evaluation may be the best configuration.

4.2.3 Related work in WWW-based CAD design

The WELD group at UC Berkeley is developing a system which could greatly aid in this type of WWW-based design [45]. Though not completed at this time, a “WELD-like” infrastructure could greatly reduce the transaction time of a conceptual-level environment, in that it could optimize the WWW transactions.

The WELD project aims to construct the first operational prototype of a national-scale CAD design environment enabling Internet-wide IC design for the U.S. electronics industry. On a small scale it has demonstrated the technology needed to provide uniform access to CAD environments and tools.

The WELD group is exploring how to develop practical distributed design technology via agent-based design, a new methodology involving the assembly of complex systems from simpler agents.
The VELA project (which includes the work of the WELD group, this research, and the work of six other universities) is also addressing the needs of WWW-based CAD design. Though early in its development stage, if these efforts are successful, the effort required for the encapsulation of distributed tools will be minimized [41].

4.2.4 Customizing for each user

A system designed on the WWW incorporates a number of separate programs and an evolving collection of text and data files. As opposed to the dedicated connection of a traditional CAD program, each user operates in a discontinuous mode with the server. Each time a program is called, a finite amount of information can be sent to it. The smaller the amount of information, the quicker the transaction time. Since users have the ability to jump indiscriminately from page to page on the browser side, the most current information has to be stored locally.

Customized pages can be created in two manners. A generic page can be used which gets customized for a user as it is linked to, or a fully custom page can be created. A generic page has the advantage that links can be made quickly; a fully custom page trades off added programming and access time for added customizing functionality. Both will be illustrated in this section.
4.2.5 Generic HTML page customization
The techniques for customizing a generic HTML page for each user are divided into three tasks:
• User identification: Passing user information from the browser’s web page to the
server
• The storing and retrieval of user specific information
• The customization and printing of HTML pages
User Identification:1 As discussed in Section 4.2.2, server-side programs are called either by submitting forms or through hyperlinks. The user identification information is encapsulated in the following manner:

Forms: In an input form, fields are used for passing information. A field for user identification is added to each form. A hidden field can be used for user identification:

<input type="hidden" name="user" value="username">

Thus, the environment variable @ARGV[user] will be set to username when the server program is called.

Hyperlinks: Information can be tagged to a hyperlink by adding a ? at the end of the URL followed by the data to be passed. In this manner, information can be passed to a program call. The information can be parsed by either identifiers or by the location in the string. For example, the design name and user name can be passed to a script which displays a compositional model, ShowDesign.pl, in the following manners:

<a href="http://infopad/Power-bin/ShowDesign.pl?design=luminence&user=lidsky">ShowLuminence</a>

and

<a href="http://infopad/Power-bin/ShowDesign.pl?luminence&lidsky">ShowLuminence</a>

The difference between the two cases is minimal. The first case is more easily parsed, while the second example sends slightly less information. In the first case, the string is parsed exactly the same way as the Forms are parsed, while in the second case a delimiter (arbitrarily chosen as &) is used to parse the string.
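The two parsing styles can be sketched as follows. This is an illustrative Python fragment (PowerPlay's cgi-bin scripts are Perl): the first style parses named identifiers exactly as form input is parsed, while the second relies purely on position, with & as the delimiter.

```python
# Sketch of the two styles of parsing the data tagged onto a hyperlink.
def parse_named(query):
    # "design=luminence&user=lidsky" -> {'design': 'luminence', 'user': 'lidsky'}
    record = {}
    for field in query.split("&"):
        if "=" in field:
            key, value = field.split("=", 1)
            record[key] = value
    return record

def parse_positional(query):
    # "luminence&lidsky" -> ['luminence', 'lidsky']; meaning is fixed by position
    return [field for field in query.split("&") if field]

named = parse_named("design=luminence&user=lidsky")
positional = parse_positional("luminence&lidsky")
```

The named form is self-describing and order-independent; the positional form sends slightly fewer characters over the connection.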
1. Note, with the advent of cookies, user identification can be incorporated without this special programming. However, this redundancy can be added for systems which don't support the cookie protocol.
2) Server side: After parsing the information from the page, the server must verify the user's identification and then recover all the user's information. Information that is required for most tasks needs to be accessible quickly to reduce access time. This can be done by using one or more central files. For a large number of users a search-string algorithm should be used. In the method utilized by PowerPlay, each user has a single line in the user file holding user-specific data. The protocol used can be the same as the protocol used for transferring information to server programs, simplifying the required code:

user=username&param1=value1&param2=value2&...&paramN=valueN&

Once this line has been located, the string can be parsed quickly into an associative array for easy utilization.

For user-specific information necessary for only specific tasks, data can be stored in other locations to reduce the size of the central data file. This can be parsed through in the same manner as the central file. In addition, the simple existence of files can be used for user-specific custom documents.
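The lookup-and-parse step can be sketched as below. This is an illustrative Python fragment assuming hypothetical field names; the actual PowerPlay code is Perl, where the natural target of the parse is a hash (associative array).

```python
# Sketch: locate a user's line in the central file and parse it into an
# associative array, reusing the transfer protocol's own line format:
#   user=username&param1=value1&...&paramN=valueN&
def parse_user_record(line):
    record = {}
    for field in line.strip().split("&"):
        if "=" in field:
            key, value = field.split("=", 1)
            record[key] = value
    return record

def find_user(lines, username):
    # Linear scan; a faster search algorithm would serve a large user base.
    for line in lines:
        record = parse_user_record(line)
        if record.get("user") == username:
            return record
    return None

users = ["user=lidsky&project=luminence&", "user=rabaey&project=infopad&"]
record = find_user(users, "lidsky")
```

Because the record format matches the form/hyperlink protocol, one parser serves both purposes.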
3) Once user-specific information has been located and parsed into an associative array, appropriate user information must be incorporated into the pages so that they can be re-used.
4.2.6 Remote tool calls
Section 3.2 illustrated the need to use tool invocation for macromodels. To be effective as macromodels, tools cannot all be located centrally on a single server. In addition, tools must be encapsulated into a uniform interface. The process of encapsulation should be as easy as possible for the designer to incorporate the tools.
In addition to the work done by the WELD and VELA projects (Section 4.2.3), a number of works could facilitate these goals of uniform access to tools, but none has been deployed to a great enough extent to be viable. Two examples of previous work are worth examining. Silva [31] presented a method of using the Simple Mail Transfer Protocol (SMTP) as the basis for hubs handling tool packets.

In Silva's system, each computer would have a local hub with which all communications would transpire using the mail protocol. This system, though good for smaller transactions, may not be sufficient for high-bandwidth interactions. This work was the first to identify the need for a uniform standard for this type of communication.

Bentz [44] demonstrated a method of using design agents to encapsulate design information into abstract tool calls. Incorporating this type of design agent with a hub-based system such as Silva's could greatly facilitate this type of web-based design. Both works, however, require an infrastructure that is yet to be widely utilized.
Figure 4.4 Silva's SMTP-based system [31] (hubs with local models at MIT, UCLA, and Berkeley, connected to a server)
Presented in this thesis (Section 5.2.6) is an interface utilizing the common gateway interface which eliminates the requirement of distributed proxies/hubs. Using simple encapsulation scripts, a tool can be located on any site with a simple web server.
4.3 Object-Oriented modeling and design
Object-oriented design is a methodology which focuses on the organization of discrete objects, characterized by their structure and behavior and how they interact. Though originally derived as a means for software development, there is increasing focus being placed on using an object-oriented design approach for integrated systems. As will be seen in this section, a number of techniques being promoted for object-oriented design apply to the design of complex systems. The file structure useful for conceptual design is, in effect, an object-oriented structure.
4.3.1 Object-oriented overview
A full treatise on object-oriented design is beyond the scope of this work; the reader is referred to [47][48] among other useful texts on object-oriented design. A basic understanding of this design methodology, however, is useful. There are a number of variations on this methodology. This section will either reflect similarities between most approaches or use some techniques specific to Rumbaugh-based design.

There are a number of themes common between object-oriented design and the conceptual-level approach to the design of systems. Reviewing these commonalities provides insights both into conceptual design and into the motivation for utilizing object-orientation in a conceptual-level environment.

Abstraction: Both methodologies rely on focusing on the essential aspects of a system for a given design goal. Extraneous details are not specified. This not only
enables a quicker specification, and evaluation, of a complex system, but preserves the freedom to make decisions from the most powerful early abstraction levels.

Encapsulation: Rumbaugh defines encapsulation as consisting of the separation of “the external aspects of an object, which are accessible to other objects, from the internal implementation details of the object, which are hidden from other objects” [47]. This is, in effect, what macromodels attempt to do.

Reuse: The object-oriented methodology promotes reuse much in the same way that the macromodeling approach does. In both cases, it requires an initial effort to be put into modeling, to provide savings in design time later in the design process.

In the general sense, object-oriented technology utilizes objects and classes to define a system. Rather than go into a detailed preamble of object-oriented design, the aspects which are most relevant to system design are described here2:

• Object/instance: An object is a combination of a state and a set of methods that explicitly embodies an abstraction of an element. An object is made by instantiating a class.
• Class: An implementation that can be used to create multiple objects with the same types of behavior.
• Attributes: The identifying characteristics of individual objects, for example bit-widths, bias currents, and signal characteristics.

The use of classes and objects is directly applicable to macromodeling. The general macromodel can be viewed as a class which holds the information necessary to perform an estimate once the attributes are specified. The instantiation of a class, by
2. For a full list of object-oriented definitions, the reader is referred to [www.clark.net/pub/howie/OO/ooterms.html]
applying attributes, creates an instance which is the estimate for that particular subblock.

• Aggregation is, in object-oriented terms, a special form of transitive association where a group of component objects forms a single semantic entity. Operations on an aggregate often propagate to the components.

Composite models are nothing more than an aggregation of other models. And, as noted above, changing global parameters (attributes) of the composite (aggregate) often propagates to the components.
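The propagation behavior of an aggregate can be sketched as below. This is an illustrative Python fragment, not from PowerPlay; the class and attribute names are hypothetical. It shows only the one property at issue: an operation applied to the aggregate (here, setting a global parameter) propagates to every component.

```python
# Sketch of aggregation: a composite holds component models, and setting a
# global parameter (attribute) on the composite propagates to all components.
class Component:
    def __init__(self, name):
        self.name = name
        self.attributes = {}

class Composite:
    def __init__(self, components):
        self.components = components

    def set_global(self, key, value):
        # Propagate the attribute change to every component.
        for component in self.components:
            component.attributes[key] = value

design = Composite([Component("memory"), Component("register")])
design.set_global("supply", 3.3)   # every component now sees supply = 3.3
```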
4.3.2 Object-oriented macromodeling
As shown in the previous section, there is a large overlap between macromodeling techniques and object-oriented design. Though not originally targeted for use within an object-oriented structure, macromodeling for conceptual-level design can utilize the same structure to further the understanding of a system.

Both primitive and conceptual macromodels can be viewed as being defined as a class, and then instantiated into a system to provide an estimate. The definition of the class is the definition of the means to provide the costs from a set of input parameters, as well as a list of parameters. A macromodel class is instantiated by providing a list of input parameters and evaluating. This is best seen with a simple example.
Example 7 Instantiation of a primitive macromodel:
Let’s look at a simple primitive model of power dissipation of a memory
cell (see Section 3.2.1). The model,
(9707 + words·108 + bitwidth·1126 + bitwidth·words·6) · Vdd² · Ceff · f
can be viewed as a class with attributes/properties of bitwidth, word-length, supply voltage, and frequency. By applying the specific attributes of a specific instance, the macromodel is instantiated into an object. A memory with this specific architecture, for example, can be instantiated as a 1024-word memory block, of 16-bit words, operating off of a 3.3 volt supply with a frequency of 1 MHz, giving an estimate of 250 µWatts.

Likewise, composite models, as aggregates of primitives and composites, are parameterized by a set of global parameters and desired views. A composite model is instantiated by selecting specific parameters and views and evaluating.
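Example 7 can be rendered directly as code. This is an illustrative Python sketch (not PowerPlay's Perl implementation): the macromodel is a class, and applying attributes — instantiation — yields the estimate. The value of the per-unit capacitance Ceff is a placeholder, since its value and units are not given in the text.

```python
# Sketch: the memory power macromodel as a class; instantiating it with
# attributes (words, bitwidth, Vdd, f) produces the estimate.
class MemoryPowerMacromodel:
    def __init__(self, ceff):
        self.ceff = ceff  # assumed per-unit effective capacitance (placeholder)

    def instantiate(self, words, bitwidth, vdd, freq):
        # (9707 + words*108 + bitwidth*1126 + bitwidth*words*6) * Vdd^2 * Ceff * f
        units = 9707 + words * 108 + bitwidth * 1126 + bitwidth * words * 6
        return units * vdd ** 2 * self.ceff * freq

model = MemoryPowerMacromodel(ceff=1e-16)   # placeholder Ceff value
estimate = model.instantiate(words=1024, bitwidth=16, vdd=3.3, freq=1e6)
```

With the (assumed) Ceff, the class plays the role of the general macromodel, and each call to instantiate plays the role of an object holding one specific estimate.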
4.4 Summary
A number of new and/or emerging technologies have been presented. Chapter 3 introduced the concepts of high-level macromodeling. In this chapter, a new technology, hyperlinked spreadsheets, was introduced. In addition, the application of two emerging technologies, the World Wide Web and object-oriented design, was demonstrated to be directly applicable to the design of complex systems and conceptual-level environments. In the following chapter, these enablers are assembled into a prototype system.
Chapter 5
PowerPlay: A Conceptual-Level Design Environment
A prototype conceptual-level design environment, PowerPlay, was built primarily for two reasons. Firstly, to give practical experience and guidelines in the design of conceptual-level environments; only in building a system can one understand the true challenges. Secondly, and perhaps more importantly, to provide a test bed for conceptual-level design and benchmarking (Chapter 6). This chapter provides details on the PowerPlay implementation and what was found to be the most essential elements for this type of design system. In addition, suggestions for improvements are included. To accentuate the elements of the conceptual-level environment which are important to the system designer, Section 5.1 overviews the system from the user's perspective, illustrating the essential parts in general as well as the specific implementation. Section 5.2 provides details of the code and file structure used in PowerPlay, highlighting the most important aspects.
5.1 User interface

5.1.1 Overview

This section presents the essential elements of the system from the view of the designer, providing both the concepts in general as well as the specific PowerPlay implementation. In PowerPlay, a system is specified by creating nested composite models using hyperlinked spreadsheets. The composite at the top of this hierarchical chain is a model of the entire system. In this way a composite model can also be considered a design specification. As this chapter is from the view of the designer, the word design will often be used to indicate a composite model which represents a system or a subsystem.
5.1.2 The Philosophy of the conceptual-level environment
In designing the user interface, a closer look at the philosophy behind conceptual-level design is important. The design environment should follow these key tenets:

Don't restrict the designer - The conceptual level is where the greatest fundamental changes occur. “What if?” type questions are common at this level and need to be supported. Warnings can be provided when certain limits are exceeded, but it should be the designer's choice as to what should, or should not, be allowed.

Tools should not dictate design decisions - Choices of tools or libraries can have a profound impact on a design. For example, if a tool's input language is a sequential language which does not allow multi-threading, issues of parallelism may be lost in design decisions. The simple incorporation of many tools into a single environment is important.
Minimize rework, support reuse - Supporting design reuse is key in minimizing the time to market. Making it easy to include models of completed design blocks and associated documentation is imperative in minimizing rework.

Relieve designers of “dirty work” - One of the key roles of the conceptual-level design environment is to quickly facilitate trade-off analyses. The designer should focus their time on optimizing their design. The tool should handle all evaluations, provide useful plots, and handle data management. The quicker the evaluation process, the more options that can be considered.

Support many designers - Facilitate the interaction between many designers and design ideas. This obviously includes a primary focus on documentation and design management.

Add simplicity, not complexity - One of the primary challenges in any design field is the increasing complexity of design objects. Design environments should counter this complexity trend by making their interfaces as simple and intuitive as possible.

5.1.3 Essential Elements

As stated in Chapter 2, at the conceptual level the most critical design decisions rely as heavily on intuition as on experience and expertise. A primary challenge of designing an environment is to provide all the functionality that is required in a manner that supports intuitive design. Creativity needs to be fostered at this stage, and cumbersome interfaces can hamper design time and final system performance.

There are a number of essential subcomponents for a conceptual-level environment. The following list is accompanied by the sections of the PowerPlay implementation which address these issues.
• Easy inclusion of primitive models (Section 5.1.4)
• Hyperlinked spreadsheets for composite models (Section 5.1.5)
• Flexible interface for describing designs/composite macromodels (Section 5.1.6)
• Support for multi-user system design with an emphasis on shared resources (Section 5.1.7)
• Links to graphing tools (Section 5.1.9)
• Documentation (Section 5.1.10)

5.1.4 Primitives

Primitives are the base elements from which a design specification is constructed. The environment's focus in terms of primitives includes:

• Primitive macromodels are easy to include in the system. Reducing the initial energy of model inclusion is essential in a model-based system.
• Designers can create and include any type of primitive model. This includes equations, tables, and dynamic models (encapsulated tools). As new models are required, they can be simply included using WWW form entry.
• Designers can modify their models as their designs become more defined. Thus the primitives evolve with the design.
• Documentation is easily included with the primitive model.

Systems should enable interaction with primitives in three ways: through interfaces to create and edit primitives, as single stand-alone entities of models and documentation, and as components of composite models. The rest of this section presents PowerPlay's interface to primitives as both stand-alone and editable entities.

Figure 5.1 shows a primitive model page for a carry-select adder. An important aspect is that primitives are presented in a predictable manner. PowerPlay's method of presentation is as follows:
The top of the page has the primitive name and documentation. This documentation is created by the owner of the model and can consist of text, graphics, and/or hyperlinks. This is followed by input fields for the input parameters, and buttons for each of the cost functions. In this manner any or all of the cost functions can be evaluated for any set of parameters. At the bottom of the page is a description of the model. In this case the models are equations, but models which utilize tool invocation or table look-up will also be identified here. This documentation is an essential item, as a designer should be able to examine how a model is evaluated so as to determine the applicability of the model to the designer's needs.
Figure 5.1 Primitive Macromodel
PowerPlay enables easy editing of primitives by providing an “Edit this model” link at the bottom of the page. By this means the designer of the model can edit all of the contents of the primitive macromodel page at any time. The means to evaluate the model can be changed, the default input parameters can be changed, and the documentation modified, all through a WWW interface.

Primitives are created interactively using WWW input forms. This is essential as no models should be hard-coded into the system. Thus, the environment can be used for any user-defined system without any adjustment to the code. Tutorials for the creation of new primitives, which enable the encapsulation of tools distributed on any web site, are detailed in Appendix A.

Dynamic model invocation: One of the key enablers of this technology is the ability to create models of any tool located anywhere on the WWW. The mechanics of this interaction are presented in Section 5.2. The remaining parts of this subsection explore the user's perspective of tool encapsulation.

In incorporating tool primitives, it is important to minimize the overhead for the designer. In addition, tools need to be locatable anywhere on the Internet so the designer can use the systems and tools that are most appropriate.

Regardless of the implementation, the tool invocation takes three steps. First, the input parameters are parsed and sent to the appropriate location on the Internet. Second, a program, where the tools are located, parses the information and runs the appropriate tool. Finally, the information is parsed from the output format of the tool back to the conceptual-level environment.

The ideal scenario would be to have a uniform standard for encapsulating data on the web. In lieu of this standard, a more point-to-point approach is taken, leveraging off of the HTTP protocol.
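The three steps above can be sketched from the caller's side. This is an illustrative Python fragment under assumed names (the URL, script name, and cost fields are hypothetical, and the actual environment uses Perl cgi-bin scripts): step 1 encodes the parameters onto the remote script's URL, step 2 would issue the HTTP request so the remote wrapper runs the tool, and step 3 parses the textual reply back into cost values.

```python
# Sketch of the point-to-point, HTTP-leveraged tool invocation.
from urllib.parse import urlencode

def build_tool_call(base_url, params):
    # Step 1: encode the input parameters onto the remote cgi-bin's URL.
    return base_url + "?" + urlencode(params)

def parse_tool_reply(reply):
    # Step 3: parse "name=value" lines printed by the remote tool wrapper.
    results = {}
    for line in reply.splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            results[key.strip()] = float(value)
    return results

url = build_tool_call("http://infopad/Power-bin/RunTool.pl",
                      {"tool": "spice", "bitwidth": 16})
# Step 2 would issue the request (e.g., with urllib.request) and the remote
# wrapper would run the tool; a canned reply stands in for it here.
costs = parse_tool_reply("power=2.5e-4\ndelay=1.2e-8\n")
```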
• By using Uniform Resource Locators (URLs) as addresses for dynamic models, a dynamic model can be located wherever there is a cgi-bin.
• At the location of the server, only two simple text fields (model location, and input parameters) need to be provided by the designer.
• At the location of the dynamic model, a simple program encapsulating the tool needs to be installed. This code can be downloaded from the server and can be installed with trivial modifications. The most common tools (e.g., SPICE, Matlab) can have scripts supplied at the server which can be customized by answering questions over the WWW and then downloading.

By following these simple guidelines, the details of which are provided in Appendix A and Section 5.2.3, complex dynamic models can be included by designers with minimal overhead.
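The encapsulation program installed beside the tool can be sketched as follows. This is an illustrative Python fragment rather than one of the downloadable scripts described above; the command and field names are hypothetical, and echo stands in for a real tool (e.g., SPICE or Matlab) so the sketch is self-contained.

```python
# Sketch of a tool-encapsulation script: receive parameters, invoke the
# tool, and print its output back (which HTTP routes to the caller).
import subprocess
import sys

def run_encapsulated_tool(command, params):
    # Substitute the parameters onto the tool's command line and run it.
    args = command + ["%s=%s" % (k, v) for k, v in params.items()]
    proc = subprocess.run(args, capture_output=True, text=True)
    return proc.stdout

if __name__ == "__main__":
    # "echo" stands in for the encapsulated tool in this sketch.
    out = run_encapsulated_tool(["echo"], {"bitwidth": 16})
    sys.stdout.write("Content-type: text/plain\n\n" + out)
```

A real wrapper would differ only in the command it launches and in how it reformats the tool's output into cost values.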
5.1.5 Composition: Hyperlinked Spreadsheets
The implementation of hyperlinked spreadsheets includes all the essential elements as introduced in Section 4.9.3. Figure 5.2 illustrates a PowerPlay spreadsheet with boxes around the essential elements1.

Documentation: Designers can include documentation at the top or bottom of the spreadsheet. As can be seen in this example, documentation can include text, graphics, and links. Not shown in the screen dump are links for editing documentation so it can be continuously, and easily, updated.

1. For legibility, parts of the spreadsheet are not shown. These include links to other parts of the environment such as model management and graphing tools.
Design Description: The design is described by specifying the models (each row refers to another model) and the parameters. In this case three separate memories and a single register have been identified as the major contributors to power dissipation.

Global Parameters: The global parameter table allows the modification of parameters passed either to the macromodels (in this example, bits) or for the calculation of views (supply and frequency). It is also in this box that domains can be defined2. When this composite model is encapsulated in another composite, it is these global parameters that can be modified at the level of the parent.

Design Evaluation: The power dissipated, as shown in Figure 5.2, is analyzed by first calculating the capacitance as modeled with row-specific macromodels. The values are displayed in the Costs column. The calculation of the view power, from the capacitance, is encapsulated into the entire spreadsheet.

Each of the rows has an associated macromodel for the calculation of the cost of capacitance. The macromodel for calculating the capacitance of the register is different than the model for the memory cells. The view power is calculated the same way for each of the rows.

Views can be defined by any designer. This entails the definition of additional columns in the spreadsheet. These columns can either be input rows (e.g., operation counts as in this example) or macromodels (e.g., P=C·V²·f). These macromodels can be primitive or composite.

In addition, a view defines what summations are to be made at the bottom of the spreadsheet. Again, flexibility is imperative: everything from common mathematical expressions (e.g., sum, average, error calculus) to complex equations should be supported. Another feature of the views of the spreadsheet is that they can be used to determine which costs get presented to parent composite models.

2. The interface for associative model assignment through the utilization of domains and generics is presented in Section 5.1.8.
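The evaluation described above — a per-row cost from a macromodel, a view computed from that cost, and a summation at the bottom — can be sketched as follows. This is an illustrative Python fragment; the row names and capacitance numbers are invented, and only the view P = C·V²·f from the text is used.

```python
# Sketch of evaluating a power view over spreadsheet rows: each row supplies
# its capacitance cost, the view computes power per row, and the view's
# summation produces the bottom-of-spreadsheet total.
def evaluate_power_view(rows, supply, freq):
    results = []
    for name, capacitance in rows:
        power = capacitance * supply ** 2 * freq   # view: P = C * V^2 * f
        results.append((name, power))
    total = sum(p for _, p in results)             # summation row
    return results, total

rows = [("sample_memory", 20e-12), ("register_file", 5e-12)]   # invented costs (F)
per_row, total_power = evaluate_power_view(rows, supply=3.3, freq=1e6)
```

In the environment, the same total (or any subset of costs) is what a parent composite would see when this spreadsheet is encapsulated as a model.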
Figure 5.2 PowerPlay's hyperlinked spreadsheet interface (labeled regions: Documentation, Global Parameters, View Selection, Names, Input Parameters, Costs, View on Power & Hyperlinks)
5.1.6 Composite model creation

As detailed in Section 3.3, composite models at the conceptual level are basically lists of primitives and other composite models. The system is evaluated by processing the composite model in a hierarchical manner. These models need to support the various ways designers specify design information. Tutorials on how each of these three methods is used are presented in Appendix A.

It is important to realize that the creation of a composite model is, in essence, the creation of a design specification. Therefore, it should be supported in an intuitive manner. As noted in Chapter 2, the key to conceptual design is to support and enhance current methodologies. Therefore, instead of focusing on the creation of new methods for design descriptions, current description methods should be supported and enhanced. The following key components of composite model creation are focused upon to enable designs to be described in as many ways as possible:

• utilize a WWW-based interface
• support standards whenever possible
• allow many different means of design description

The composite model description is just a simple list of elements. The role of all description methods is simply to parse their description into a list(s) of components with associated parameters and model assignments. The details of PowerPlay's structure of a composite model are presented in Section 5.2. The role of the three description methods presented herein is to provide an intuitive interface for the designer.

The following three design description methods are supported by PowerPlay:
5.1.6.1 Enumeration

Enumeration is simply the listing of the system's components. The method used in PowerPlay is analogous to the writing of a list of components in a design notebook. A composite is created and modified by using WWW buttons to add other primitives and composites. The key here is to make it as simple as possible. Tutorials on how designers enumerate via PowerPlay are presented in Appendix A.

When using direct model assignment, specific macromodels are added to the composite design. When using associative model assignment, generics are added in the same manner. Section 5.1.8 presents the interface for associative model assignment.

5.1.6.2 Schematic entry

Often the first step in the design process is to sketch potential implementations at the block level. This can be a block diagram for hardware or a flow-graph-type diagram of behavior. Through schematic entry, designs can be sketched and captured. In addition, as schematic entry is used from the block level to the transistor level, designs at various points of development can be integrated into the modeling environment.

The mapping of the schematic into the conceptual-level description should provide a hierarchical mapping from the elements in the schematic into composite entities. The advantages of using schematic entry mixed with a conceptual representation include:

• The ability to use a graphical description. This adheres to the philosophy of making design description as simple as possible for a designer.
• The ability to include information about interconnect to improve estimation accuracy.
• The ability to insert conceptual estimation into the design flow. The same schematic entry tool can be used to interface with the conceptual level as well as later levels of abstraction. Hence, a single tool can be used for design specification throughout the design process.
• A means to create a hierarchical estimate at the conceptual level which tracks a hierarchical visual description.

Schematic entry has been implemented in conjunction with MIT's WebTop environment [35][36]. Figure 5.3 shows a schematic entry of a simple multimedia processor.3 By using a Java-based schematic entry tool, the same benefits of ubiquitous access supported by the rest of the environment are upheld.

3. This particular example is detailed in Chapter 6 and was presented at the Design Automation Conference, June 1998.

Figure 5.3 WebTop's Java-based schematic entry interface
Another form of schematic entry into PowerPlay is the EDIF standard. All of the major commercial schematic entry tools can export EDIF. The EDIF-to-PowerPlay parser is accessed via a web page requiring only a pointer to the EDIF text file (anywhere on the web) and guidance as to the mapping of the design blocks to macromodels. PowerPlay retains the hierarchy created in the schematic entry tool.

5.1.6.3 Textual entry

Code is often used both early and throughout the design process to describe the behavior of a system. The functionality desired is the ability, given a software description with input vectors, to create a behavioral description as nested hyperlinked spreadsheets. Performing this with any level of accuracy requires making assumptions about the architectural platform, or compilation tool(s), to be used.

This function has been prototyped using a flow implemented by Marlene Wan [30]. The algorithm is described in a subset of the C++ language (pointers excluded). The C++ code, and associated input vectors, is run through a commercial profiler (in this case on a Sparc architecture) to give a breakdown of the system in terms of counts of subroutine accesses. Each subroutine is, in turn, described as a breakdown of other subroutines and primitive functions (e.g., multiply, memory access). Each subroutine is parsed into a separate composite model file.

The primitives that get utilized are assigned using domain libraries (Section 5.1.8) to enable quick analyses of the effects of different architectures. In addition, the designer has a means to quickly identify the bottlenecks of an algorithm either through the use of graphing or by traversing the hyperlinked spreadsheet view of the algorithm. An example of this approach is presented in Section 6.1.5.
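The rollup behind this flow - profiled call counts expanded into subroutine breakdowns and, ultimately, primitive operation costs - can be sketched as follows. This is an illustrative reconstruction, not the actual PowerPlay/Wan implementation; the subroutine names, counts, and per-operation costs are invented placeholders.

```python
# Illustrative sketch of a profiled-code rollup: each subroutine is a
# "composite" listing (callee, count) pairs; leaves are primitive
# operations with per-use costs taken from a domain-style library.

# Hypothetical per-operation energy costs (Joules per use) - placeholders.
PRIMITIVE_COST = {"multiply": 2.1e-9, "memory_access": 0.8e-9}

# Profiled breakdown: subroutine -> list of (callee, call count).
PROFILE = {
    "EncodeFrame": [("SearchCodebook", 4), ("QuantizeGains", 1)],
    "SearchCodebook": [("multiply", 1200), ("memory_access", 800)],
    "QuantizeGains": [("multiply", 150), ("memory_access", 90)],
}

def cost(name):
    """Recursively sum primitive costs weighted by profiled call counts."""
    if name in PRIMITIVE_COST:          # leaf: a primitive function
        return PRIMITIVE_COST[name]
    return sum(n * cost(child) for child, n in PROFILE[name])
```

Swapping the `PRIMITIVE_COST` table corresponds to re-targeting the same profiled breakdown to a different architecture via a domain library.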
5.1.7 Design and model organization

As a design is developed, an ever-increasing number of models (both primitive and composite) needs to be accessible in an intuitive manner. There needs to be a means for a designer to access models created by him/herself, other designers, or perhaps commercial companies. To provide this flexibility, the user interface must provide a simple means to access all models. Though simple, this interface must constantly adjust to each user's needs.

To reach these goals, a customizable interface, termed The Design Center, is utilized.

5.1.7.1 Customizable centralized server

PowerPlay provides this adaptive function through a central page termed The Personal Design Center. This is a virtual hub for all activity. Figure 5.4 shows a view of the Personal Design Center. Each user has their own personal design center with the same organization, but different content. In the first table at the top of the page, links to the most common functions are provided. This is the same for each designer. The rest of the page, however, is dynamically customized for each user.
Given permission, a designer can view and use any user's libraries of primitives or composite models. These are listed in tables on the page. Also available are generics and libraries, which will be detailed in Section 5.1.8.

Each time the user loads the page, the most up-to-date information is presented. In Figure 5.4, the designer's most recent designs, models, and libraries are shown. In addition, a number of other model directories have been selected for viewing.

Figure 5.4 Centralized page for model access and commands: The Design Center (continued in Figure 5.5)
A link at the top of the page provides links to pages for customizing the design center and other parts of the environment. Appendix A has details on the interface utilized for user customization.

5.1.7.2 Multi-user model database

Estimations are performed by creating and manipulating models. Models (both primitive and composite) need to be shared between groups of users. The goal is to handle model management with as little overhead as possible to the designer. PowerPlay utilizes a concept called "model ownership" to ensure that models can be easily shared between users without the risk of corruption. The protocol is very simple:

Figure 5.5 Design center (continued from Figure 5.4)
each user owns and maintains all the models that they create. Anybody, with proper permissions, can use these models to perform estimation, but only the owner can edit the model.

A composite model, though owned by one user, may contain primitives and composites created by other designer(s). To illustrate the usefulness of this type of model management, consider a generic system made of analog and digital blocks. A simplified diagram of the system's model is shown in Figure 5.6. The composite model representing the system, the top-most node, may consist of models made by a number of different designers. In this case a project leader/system designer owns the top-level composite model, which consists of primitives and composites created by a digital designer and an analog designer. The digital designer's composites are composed of

Figure 5.6 Hierarchical breakdown of a composite showing model ownership (owners shown: Project Leader, Digital Designer, Analog Designer, Standard Cell Library Designers)
standard cell library primitives which model basic digital functions (e.g., algebraic functions, registers). These models were created and maintained by the designer(s) of a standard cell library. The analog composites are made up of other analog composites, analog primitives, as well as some digital composites. This level of sharing enables users to keep their own models up to date as they make changes to their designs, and have the most recent modifications reflected up the levels of hierarchy. This also enables the project leader to perform budgeting and bottleneck analysis throughout the design process.

Note, an owner of a model may not be a single person. For example, an owner may be a group of people - for example, the creators of a standard cell library. Likewise, a single designer may have different user names. A digital designer, for example, may log in as him/herself as well as logging in as the group-user "standard cell library designers".

5.1.8 Associative Model Assignment

As detailed in Section 3.3.2, the concept of generics and domains is introduced to enable the associative assignment of models. Generics are added in the same way as a regular macromodel, but the actual assignment occurs in the spreadsheet. A model of a VSELP algorithm implemented on both hardware and software is shown in Figure 5.7. Each of the entries uses generic names to identify each functional routine and uses association to map them to specific models. The assignment is performed through the use of domain calls. The designer sets the global domain for the entire page by using the selection at the top of the page. Each macromodel has the ability of either inheriting the higher-level domain or overriding the global domain. Thus the designer has the option of being able to assign the entire design to one domain by changing the top level of a composite, but retains the ability to use many domains by overriding at important levels. This use of inheritance (akin to inheritance in the object-oriented world) is
essential to allow a designer to assign a hierarchical system to a single domain with one simple change at the top level.

Domains are simply the mapping of generics to specific primitives. Designers can create any number of domains through a WWW form. Users can also use domains created by other designers.
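The generic/domain mechanism can be sketched as follows. This is a minimal illustration, not PowerPlay's Perl implementation; the domain and model names are invented, and a `None` override stands in for "inherit the global domain".

```python
# Sketch of associative model assignment: a domain maps generic names to
# concrete primitive models, the page sets a global domain, and each
# spreadsheet entry may inherit it or override it. Names are invented.

DOMAINS = {
    "dsp_software": {"fir_filter": "sparc_fir_sw", "memory": "sram_model"},
    "asic_hardware": {"fir_filter": "fir_datapath", "memory": "sram_model"},
}

def assign_models(entries, global_domain):
    """Resolve each (generic, override_domain_or_None) to a primitive."""
    resolved = {}
    for generic, override in entries:
        # Inherit the global domain unless this entry overrides it.
        domain = override if override is not None else global_domain
        resolved[generic] = DOMAINS[domain][generic]
    return resolved
```

Changing `global_domain` in one place retargets every inheriting entry of the design, while individual overrides are preserved - the inheritance behavior described above.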
5.1.9 Sensitivity Analyses and Graphing

The ability to perform an analysis of the effects of parameter variation is important in such an environment. Throughout the design process, it is important to be able to identify bottlenecks and understand trends. A clean way to understand both of these items is

Figure 5.7 Using associative model assignment for HW/SW co-design
through plots. Therefore, a complex system design tool should include graphing capabilities.

Bottleneck analysis: Bottlenecks can be identified by breaking down a design's results into bar or pie charts. PowerPlay utilizes a hyperlinked pie chart implemented in Java to give added functionality. Figure 5.8 shows a pie chart breakdown of the power of a VSELP encoder which was profiled using the profiler introduced in Section 5.1.6.3. It can be easily seen that the SearchCodebook and the QuantizeGains functions are the dominant source of power consumption. Each slice of the pie provides a link to either the associated model or documentation, allowing the designer to immediately examine any element of the system. This is yet another example of using the power of the WWW to further facilitate the design process.

Tracking trends: To track the trends of a design, graphing costs as a function of one or more design parameters is invaluable. PowerPlay utilizes a Java-based graphing tool so that the graphs can be plotted inside the web browser [37]. Figure 5.9 illustrates the sensitivity of the MOSFETs in a buck converter to changes in battery voltage.

Figure 5.8 Hyperlinked pie chart used for bottleneck analyses
Figure 5.10 illustrates the simple form, located at the bottom of the hyperlinked spreadsheet, that is used to request the desired analyses. This analysis plotted the costs of MOSFET loss versus battery voltage in 200 mV steps. At each interval, the entire spreadsheet, and all the hierarchical children, are evaluated.
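The sweep loop behind such a plot can be sketched as follows; the loss model and parameter values here are invented stand-ins for a full hierarchical spreadsheet evaluation.

```python
# Sketch of the sensitivity-analysis loop: re-evaluate the whole model at
# each step of a swept parameter and collect (value, cost) pairs for
# plotting.

def sweep(cost_fn, start, stop, step):
    """Evaluate cost_fn at start, start+step, ... up to and including stop."""
    points, v = [], start
    while v <= stop + 1e-9:             # tolerance for float accumulation
        points.append((round(v, 6), cost_fn(v)))
        v += step
    return points

# Stand-in MOSFET-loss model: loss falls as battery voltage rises.
mosfet_loss = lambda vbat: 0.5 / vbat

# Sweep battery voltage from 2.0 V to 3.0 V in 200 mV steps.
curve = sweep(mosfet_loss, 2.0, 3.0, 0.2)
```

Each point of `curve` corresponds to one full evaluation of the model; in PowerPlay the analogous loop re-evaluates the spreadsheet and all its hierarchical children at every step.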
5.1.10 Documentation Summary

As shown, documentation interfaces are included throughout the design environment. This section summarizes the goals and techniques to support and encourage documentation. Documentation is an essential part of the design process. It is often the most neglected as well. Two tasks need to be supported:

Figure 5.9 Sensitivity analysis of voltage regulator system
Figure 5.10 Simple form used for creating sensitivity plots
• Easy incorporation of documentation by individual designers
• Easy location and organization of documentation of systems

To encourage individual documentation, entry points are ubiquitous. All of the documentation forms have the same format and are saved to pages in predictable locations. All documentation can be written in HTML, so besides just text, links to figures, spreadsheets, word processing files, and other documentation can be easily included. This is another example of the WWW giving added value to the design environment.

By linking the documentation to the models, it is easy to locate relevant documentation. In addition, a project leader can travel the hierarchical tree of a design (Figure 5.6) to check on design and documentation progress. Though not included in this environment, a more automated documentation check-in and assembly procedure can be implemented in this type of system.

5.1.11 Security

A major concern of shared design, especially on the internet, is security from malicious destruction and the protection of intellectual property. Though a valid concern, the design of secure systems on the WWW has become mature, with a number of security elements integrated with the servers. Though focused primarily on financial security, the techniques can be applied to the security of proprietary designs and should not be considered an impediment to future design environments on the WWW.

As an academic project, PowerPlay did not install a highly secure system; however, three simple security methods were utilized.

• Users log in using passwords.
• Cookies are used to make sure only certain machines can have access to specific accounts.
• By using dynamic models, all of a user's primitives can be stored at their own site, providing a second level of security.

In addition, in most systems, local intranets exist on which the environment can run behind firewalls.

5.2 PowerPlay's System Architecture

This section details the specifics of the PowerPlay system architecture. It utilizes techniques for the WWW as introduced in Section 4.2, and the object-oriented structure described in Section 4.3.

5.2.1 Code overview

Due to the client-server nature of the WWW presented in Section 4.2, a CAD tool on the WWW must consist of a number of disconnected programs. PowerPlay is made of over 25 separate pieces of code. In this section, the code will be broken down by functionality to give a feel for the types of functions needed in a conceptual-level environment.

As the majority of the duration of program execution is file interactions, Perl [39] is used for the majority of the code. Only for the graphing and schematic entry functions was Java chosen. Since the code is disjoint, different programming languages can be used for different tasks. Future generations of this tool may consider the use of Java to provide more interaction when using the spreadsheet, and a compiled language for composite model evaluation for faster speeds. For all of the designs utilizing PowerPlay, however, evaluation time has not been prohibitive, as the time is usually dominated by the interaction over the web rather than the actual evaluation of the models.
Figure 5.11 illustrates the breakdown of code for the major functions required by PowerPlay. As expected, the majority of the code is focused on macromodeling functions and design management. Code used for simple tasks, such as entering new user information, is not shown in this figure.

The fact that only two pieces of code are focused on documentation is somewhat misleading, as embedded in the process of macromodel creation and editing, as well as in some utility functions, are other documentation procedures.

Table 5.1 presents a breakdown of the code of Figure 5.11 by basic functionality. Sections 5.2.4-5.2.9 utilize the code to demonstrate general functions useful for both

Figure 5.11 Functional breakdown of code
Macromodeling - Evaluation: Play.pl, PostData.pl, ShowDesign.pl, Calc.pl; Creation: ModelEntry.pl, EditModel.pl, PostData.pl, NewPrim.pl, SubCct2Design.pl; Association: DisplayDomain.pl, GenericEntry.pl, ParseDomains.pl, SaveDomain.pl, SetDomain.pl
Design Management - Documentation: DocuDesign.pl, ProcDoc.pl; Design Center: Archive.pl, ChangeMenu.pl, CustomMenu.pl, DisplayDir.pl, Garbage.pl, EmptyGarbage.pl
Other - Utilities: link.pl, Utilities.inc, InputParse.pl; Graphing: Chart.pl, Pie.pl
WWW programming in general, and the design of conceptual-level environments in particular.

Table 5.1 PowerPlay's main code
Code Name - Functionality

Macromodel Evaluation
Play.pl - Evaluates composite models. Prints results to a file. Also creates data for charting.
ShowDesign.pl - Prints composites to hyperlinked spreadsheets.
Calc.pl - Calculates primitive macromodels. Prints results to screen.

Macromodel Creation
ModelEntry.pl - Provides a series of WWW pages for inputting new primitive models.
NewPrim.pl - Processes the form entry for a new primitive into a documentation file and a class for a new primitive.
EditModel.pl - Provides customized input forms for editing primitives.
PostData.pl - Adds a specified primitive to a composite model.
SubCct2Design.pl - Adds a composite model as a child of another composite.

Macromodel Association
DisplayDomain.pl - Displays the domain match-up as an editable WWW form. Used for editing and creating new domains.
GenericEntry.pl - Provides a series of WWW pages for inputting new generics.
ParseDomains.pl - Utility used by other functions.
SaveDomain.pl - Saves the user's newly specified domain.
SetDomain.pl - Provides the input form for creating domains.
Management of the Design Center
Archive.pl - Puts a dated archived version of the current design into the archive directory.
CustomMenu.pl - Creates a customized input form for changing the Personal Design Center (PDC).
ChangeMenu.pl - Saves the current version of the PDC.
menu.pl - Displays the Personal Design Center (PDC).
Customize.pl - Allows users to customize their environment, including the PDC.
DisplayDir.pl - Displays the directories Temp, Archive, and Garbage, giving options for viewing/deleting files.
GarbageDir.pl - Places the active design into the "garbage" directory.
EmptyGarbage.pl - Cleans out the garbage directory.

Documentation Management
DocuDesign.pl - Provides forms for the addition or editing of composite documentation.
ProcDoc.pl - Processes the composite documentation forms.

Graphing
Chart.pl - Produces line charts for sensitivity analyses.
Pie.pl - Creates a hyperlinked pie chart of cost functions.

Utilities
link.pl - Links to HTML pages and customizes them for the user. Used by a number of different programs.
InputParse.pl - Parses the HTTP-encoded information into an associative array.
Utilities.inc - Include file with a number of commonly used functions.
5.2.2 File Structure

PowerPlay uses a simplified object-oriented structure for storing both primitive and composite macromodels. This section illustrates the basic breakdown of the design directories in order to provide both guidance as to the proper means to organize and an outline of the breakdown of the different file structures.

The file structures used to store information all utilize the simple protocol used for passing information on the WWW. As defined by HTTP, sets of doublets are sent with this format:

identifier1=value1&identifier2=value2&.....&identifiern=valuen&

PowerPlay utilizes a simple modification of this protocol to allow the storage of multi-dimensional arrays. The # separator is used to separate any value into an array. By keeping an order to the storage of each value, associative arrays can also be stored. For example:

identifier1=value1a#value2a&...

where identifier1 indicates a simple array, or

identifier1=identifier1a#value1a#identifier1b#value1b&....

where the identifier can be parsed into a more complex associative array.
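A parser for this modified doublet format might look like the following sketch. Percent-decoding of HTTP-escaped characters is omitted, and the handling of trailing # separators is an assumption based on the examples in this chapter.

```python
# Sketch of a parser for the doublet storage protocol described above:
# '&'-separated identifier=value pairs, with '#' splitting a value into
# an array.

def parse_doublets(line):
    """Parse 'id=value&...' into a dict; '#' marks array values."""
    record = {}
    for pair in line.strip("&").split("&"):
        key, _, value = pair.partition("=")
        if "#" in value:
            # Split into an array; drop empties left by a trailing '#'.
            record[key] = [item for item in value.split("#") if item != ""]
        else:
            record[key] = value
    return record
```

Ordered array values can then be paired up to rebuild the more complex associative arrays (identifier/value alternation) described above.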
Figure 5.12 illustrates the breakdown of the file system used to store macromodels, documentation, and designer information. They are all found in subdirectories of public_html/PowerPlay. For simplicity, most of the directories are not expanded. Their function is evident from their names. Though only a single user's directory structure is expanded, all users' directories have the same structure.
As can be seen in Figure 5.12, each user has personalized directories. A breakdown of each user's directory will give insight into both macromodel storage and the issues involved with WWW documentation.

MODEL directory: This directory holds all the information of the user's primitives. Information about each specific primitive is divided into two locations. Each primitive has its own HTML template page which contains all documentation as well as links to common functions. The data for each primitive model is stored in a single file. This separation of tasks enables a clean organization of documentation and a means to quickly evaluate sets of models.

Figure 5.12 Breakdown of PowerPlay's directories. Most subdirectories are not expanded. All users have the same directory structure.
.../public_html/PowerPlay: welcome.html, DesignCenter.PP, password.html, ..., FIGURES/, USERS/, TUTORIAL/, HELP/, LOGS/
USERS/: user.dat, user1/, user2/, user3/, ...
each user directory: MODEL/, COMPOSITE/, DOC/, DOMAIN/, GENERIC/, TEMP/, ARCHIVE/, GARBAGE/
This file, MODELS.moddat, has all the information necessary to evaluate each primitive. Every primitive has a single line in the file which holds the information which can be considered the "class" of each primitive. The programs which evaluate primitives can utilize this information, along with the defined parameters, to instantiate the primitives. Each line of the file must capture all the aspects of the model. In PowerPlay a separate doublet, indicated here by its identifier, is used for each of the following items:

CostNames: a list of all of the costs.
Models: a separate entry indicating the class of each of the costs. The class is simply the means by which the model should evaluate. This either points to an equation, another composite, or a dynamic model.
ModelUnits: indicates the units of each of the costs.
Param: lists all of the parameters for each of the costs.
CostUnits: lists the units for each of the costs.

An example MODELS.moddat file is given in Section B.1.5.

COMPOSITE directory: All composites are stored here in single files. These files play the role of both an instance and a class for each composite. If the composite is a top-level model, that particular instance is stored. However, as the global parameters are located in the file, and are editable, they also serve as a class description of the composite when called by a parent composite.

Each composite has the same format, summarized in Table 5.2.

Table 5.2 Breakdown of a composite file
Line - Function
1 - View information
2 - Global parameters
next n lines - A separate line for each entry
n+3 - Placeholder used for parsing
n+4 through eof - Totals for the specific instance

A file example for the luminance chip is presented in Figure 5.13.
ShowCosts=CHECKED&user=lidsky&viewall=power&
Variables=3&frequency=1e6&bits=8&supply=4&
number=1&name=memlp&DelayUNITS=ns&bits=bits&cap=1.784e-10&capUNITS=F&allMOD=Delay#cap#Area#&Area=12136910&allparam=bits#words#blocks#&owner=berkeley_low_power&AreaUNITS=um^2&blocks=2&words=2048&Delay=4.5&count=.63&total_cap=1.124e10&power=1.799e-03&
number=2&name=memlp&DelayUNITS=ns&bits=bits&cap=1.784e-10&capUNITS=F&allMOD=Delay#cap#Area#&Area=12136910&allparam=bits#words#blocks#&owner=berkeley_low_power&AreaUNITS=um^2&blocks=2&words=2048&Delay=4.5&count=.1&total_cap=1.785e11&power=2.855e-04&
number=3&name=memlp&DelayUNITS=ns&bits=bits+2&cap=3.1e10&capUNITS=F&allMOD=Delay#cap#Area#&Area=20438418&allparam=bits#words#blocks#&owner=berkeley_low_power&AreaUNITS=um^2&blocks=2&words=4096&Delay=4.5&count=1&total_cap=3.114e-10&power=4.982e-03&
number=4&allMOD=cap#&name=reglp&allparam=bits#&bits=bits*2&owner=berkeley_low_power&cap=1.52e-12&capUNITS=F&count=1&total_cap=1.520e12&power=2.432e-05&
TOTALS####TOTALS
Figure 5.13 Example of composite storage: Luminance decompression
DOCUMENTATION directory: This directory is used for the storage of the documentation of an instantiated composite. Each composite has up to two files associated, one for top-of-the-spreadsheet documentation and one for bottom documentation.

DOMAIN directory: All domains that a user has defined are located in this directory. They use simple HTTP for storage. The information stored on each line is: name and owner of the generic, name and owner of the model assigned to each generic, and whether a specific model is primitive or composite.

GENERIC directory: The generics are stored in the same manner as the primitives, with the exception that there is no need for a MODELS.moddat file. The format used for storing the HTML generic is the same as for the primitives.

TEMP, GARBAGE and ARCHIVE directories: These directories are used for composite model management. They have the same format as the COMPOSITE directory.

5.2.3 CAD design on the WWW

Since the start of this project, WWW design has become much more understood. However, CAD for the WWW is less understood. There are a number of techniques utilized in the design of PowerPlay, focused toward system design, which can be generalized for CAD systems in general.

5.2.4 Utilities

There are a number of functions that are used by a number of different programs. Each of these can be downloaded from the PowerPlay web site.
InputParse.pl: All information comes in with the HTTP protocol described in Chapter 4. This routine parses all the info into a single associative array and translates the special encoded strings that HTTP utilizes.

Serverr: Stands for server error. This is a subroutine found in every single script. When writing code on the WWW, debugging can be challenging, as detailed errors are not printed to a browser. This subroutine is used to exit the program cleanly on an error, print a useful message to the browser, and record the error in a log file.

link.pl: This subroutine provides a customized link to an HTML page. As it is used for customization, it will be detailed in Section 5.2.7.

5.2.5 Composite model evaluation

The evaluation of composite models is broken into two programs. Play.pl receives the information from the WWW, parses it into associative arrays, evaluates the nested spreadsheet, and prints to a file. It also handles other operations, such as the moving of lines in the spreadsheet. Play.pl terminates by calling ShowDesign.pl, which displays the spreadsheet. The important aspects of Play.pl and ShowDesign.pl are:

• Two major subroutines are used for evaluating each line of the spreadsheet, one for the evaluation of the primitives and one for the composites. Note the composite subroutine is recursively called to enable unlimited nesting of spreadsheets.
• The only composite model whose file is altered is at the top level. All children composites are instantiated to produce results, but the composite file (class) remains unchanged; just their results are posted.
• Code gets passed from the spreadsheet as one line of text. To be able to keep track of each line in the spreadsheet, all identifiers are terminated with a number indicating the line number in the spreadsheet.
• Each line of the spreadsheet corresponds to one line in the corresponding composite file. Information stored in the line includes name, owner, an indicator of primitive or composite model, input parameters, corresponding values, output names, and their corresponding values.
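The two-subroutine scheme can be sketched as follows; the model structure and cost fields are invented placeholders, but the recursion mirrors the unlimited nesting of spreadsheets described above.

```python
# Sketch of the two-subroutine evaluation scheme: one routine evaluates a
# primitive line, the other recurses into child composites so that
# spreadsheets can nest to any depth.

def eval_primitive(line):
    """A primitive line carries a per-instance cost and an instance count."""
    return line["count"] * line["cost"]

def eval_composite(composite):
    """Sum all lines; recurse when a line is itself a composite model."""
    total = 0.0
    for line in composite["lines"]:
        if line["kind"] == "composite":
            total += line["count"] * eval_composite(line)
        else:
            total += eval_primitive(line)
    return total

# A toy two-level model: two primitives at the top, one nested composite.
chip = {"lines": [
    {"kind": "primitive", "count": 2, "cost": 1.5},
    {"kind": "composite", "count": 1, "lines": [
        {"kind": "primitive", "count": 4, "cost": 0.5},
    ]},
]}
```

In PowerPlay only the top-level composite's file would be rewritten after such an evaluation; the children are instantiated for their results and their class files remain unchanged.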
5.2.6 Remote tool calls/dynamic models

The requirements of calling remote tools are:

• Ubiquitous access - models are stored everywhere.
• As easy to use and encapsulate as possible.
• Make evaluation as quick as possible.

As discussed in Chapter 4, the most attractive choice would be to use design proxies, which have not been installed to a universal extent. Figure 5.14 illustrates the method used by PowerPlay to access dynamic models located at local or distant Internet locations. Dedicated code needs to be placed in the cgi-bins of the machine where the dynamic model is located.

Figure 5.14 Dynamic model access: the PowerPlay server (Berkeley) calls dedicated code at each remote site, which evaluates the dynamic models stored there.
Encapsulating a tool as a dynamic model requires the placement of an application-specific program in a cgi-bin local to the tool. PowerPlay utilizes the browser program, lynx, in batch mode to execute the tool. Another option would be to open up a socket link, which proved not to be necessary in this case. A flow graph of the functions required by the encapsulating tool is shown in Figure 5.15.

5.2.7 User customization

5.2.7.1 Basics

In PowerPlay, to achieve full user customization, all pages are stored as HTML templates and linked either through custom programs or created on the fly at the end of a program. The common user customizations are:

• People like to see things in the same manner. For example, if a designer would like their documentation attached at the bottom of a page for one model, it is likely they want it attached that way for all models. This type of customization allows the one-time setting of these types of preferences.
• The Personal Design Center is fully customizable to enable viewing of only the preferred work spaces.

Figure 5.15 Functionality of encapsulation tools:
• Parse input from STDIN (HTTP protocol)
• Check passwords
• Customize input deck of tool with specified input parameters
• Run tool with new input deck
• Parse output from tool
• Print output to STDOUT
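The steps of such an encapsulation can be sketched as a wrapper like the following. This is not PowerPlay's actual cgi-bin code: the request format, password scheme, and deck syntax are invented, and `subprocess` stands in for the CGI/lynx invocation.

```python
# Sketch of an encapsulation wrapper: parse the doublet-style request,
# check the password, build an input deck from the remaining parameters,
# run the tool, and re-encode its output for the HTTP response.
import subprocess

def encapsulate(request, password_db, tool_cmd):
    # Parse input (doublet-encoded, as in the rest of this chapter).
    params = dict(pair.split("=", 1) for pair in request.strip("&").split("&"))
    # Check password before running anything.
    if password_db.get(params.pop("user", "")) != params.pop("password", ""):
        return "error=bad_password&"
    # Customize the tool's input deck with the specified parameters.
    deck = "\n".join(f"{key} {value}" for key, value in params.items())
    # Run the tool with the new deck and capture its output.
    result = subprocess.run(tool_cmd, input=deck, capture_output=True, text=True)
    # Parse the tool's output back into doublet form for the response.
    return "".join(f"{line.replace(' ', '=')}&"
                   for line in result.stdout.splitlines())
```

A real deployment would percent-decode the request, use the actual tool's deck syntax, and log errors, but the control flow matches the figure.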
5.2 PowerPlay’s System Architecture 134
• As discussed in Section 4.2.4, each web-page must hold identification for the current user. In the case of forms, all pages have to have the user name attached as a hidden input field. In addition, all links must have the user's name attached in a parseable location. With the advent of cookies, this user identification may become unnecessary.
User's custom information is stored in the central user.dat file (Figure 5.12). In addition, the most up-to-date information is located by scanning all model directories at each time of update.
5.2.7.2 Code
Two examples of the code utilized by PowerPlay are described herein. The first is a generic algorithm that can be usable in almost any environment. The second is more specific to PowerPlay; however, generic techniques will be highlighted.
Link.pl - All HTML files are stored as templates and are not linked to in the traditional way. Link.pl performs a link to a desired page with the substitution of user-specific data. The functionality is broken into three steps:
1) The user's custom information is parsed out of a central file and placed into an associative array.
2) The desired file is opened and read line by line. All files are saved with indicators for the user custom information.
3) Line by line, link.pl substitutes in the user's custom information and prints to the browser.
Link.pl's line-by-line substitution is fast enough to be virtually transparent to the user. This program can be downloaded from the PowerPlay web site.
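The three steps above can be sketched as follows. This is a hedged Python rendering of link.pl's logic (the original is a Perl CGI script); the `key: value` preference format and the `%%NAME%%` indicator syntax are assumptions for illustration only.

```python
import re

def load_prefs(path):
    """Step 1: parse the user's custom information into an associative array."""
    prefs = {}
    with open(path) as f:
        for line in f:
            if ":" in line:
                key, value = line.split(":", 1)
                prefs[key.strip()] = value.strip()
    return prefs

def link(template_path, prefs, out):
    """Steps 2 and 3: read the template line by line and substitute
    %%NAME%% indicators with the user's custom information."""
    with open(template_path) as f:
        for line in f:
            out.write(re.sub(r"%%(\w+)%%",
                             lambda m: prefs.get(m.group(1), ""),
                             line))
```

Streaming the template a line at a time, as link.pl does, keeps the substitution fast enough to be invisible to the user.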
Menu.pl - The Personal Design Center is customized through interactive forms. The key is that the actual design center is not stored, but the user's preferences. The program, menu.pl, creates a menu of the most current models from the designers the user chooses (hence the name menu.pl). This is combined with links to useful functions to create the PDC. Creating a static PDC is not an option since it would not capture information created by another designer after the creation of the PDC. The basic functionality of menu.pl follows:
1) The user's custom information is parsed out of a central file and placed into an associative array.
2) The header of the PDC is printed at the top of the page. This is common for all users, except for information needed to track the user, so a subroutine identical to link.pl is utilized.
Then, to print out the tables of models and domains that the user desires, the following steps are looped through for each user that the designer has elected to see:
3) Whether the designer has been given permission to view the files is checked. If permission has not been granted, the table will not be displayed.
4) For one or more of the following (composites, primitives, generics, domains), the directory of the desired user is searched and up-to-date tables printed. From the PDC the designer can choose which menus he would like shown.
There are a number of properties in these examples which can be reused in other WWW-based applications:
• Pages get stored in a generic form which can be customized quickly on the fly.
• A central user file(s) can be used for storing customization information.
• User preferences can be attained explicitly, from forms, or implicitly, by assuming that a certain behavior will often be repeated.
• Pages can be printed fully custom on the fly, as in the table created in the PDC.
5.3 Summary
In this section, the conceptual-level environment PowerPlay was presented, and a number of techniques for the design of conceptual-level environments were presented. A number of these are generic to a number of other WWW applications. Code which may be useful as examples to other environment designers can be downloaded from the WWW at the InfoPad site. The appendix to this chapter presents a FAQ and tutorial for using PowerPlay. This may be used purely for reference, but it also supplies information as to how the design of PowerPlay was made model-focused, not tool-focused.
Chapter 6
Design at the Conceptual Level
6.1 Design Examples
Versions of PowerPlay have been publicly available on the WWW since 1994. Hence a number of real designs have been implemented utilizing the web-based, conceptual-level design environment. This chapter presents a number of these designs, in conjunction with other design drivers, to illustrate the effectiveness of this type of environment. Each of these examples can be used as guidelines for using said environment for a variety of tasks. A summary of design examples, techniques demonstrated, and level of benchmarking is presented in Table 2.
6.1.1 Trade-offs in algorithm choice
As discussed in Section 2.2, the greatest design wins are most often attained at the highest level of abstraction. Once a system is broadly defined, one or more algorithms are selected, and then trade-off analyses are performed in order to select the best solution. Except for the most basic of algorithms, conducting these analyses is time consuming, and often inaccurate. A conceptual-level design environment can aid in this work in a number of different ways.
After algorithm specification, the process of algorithmic analysis and choice follows this trajectory:
• Specification of key components: This entails the sketching of the most important components of an algorithm.
Table 2 Example summary

 Example                 | Techniques highlighted                            | Benchmarking
 ------------------------+---------------------------------------------------+---------------------------------
 Luminance Decompression | Trade-offs of algorithms; digital modeling of     | versus fabbed Si
                         | power                                             |
 Switching regulation    | Heterogeneous systems; variety of cost functions  | versus SPICE
 ARM Processor           | Creation of dynamic models                        | versus various lower level tools
 RF baseband             | Complex dynamic models                            | n/a
 VSELP encoders          | Associative assignment of models                  | n/a
 InfoPad                 | Complex system with many designers; cost          | n/a
                         | budgeting/documentation                           |
 Media Processor         | Schematic entry; complex WWW interaction          | versus Verilog and fabbed Si
• Specification of key parameters and cost functions: A number of parameters can often be traded off versus various cost functions. Identifying both the dominant parameters and the most important, or "most highly stressed," costs is essential.
The identification of the key components, parameters and cost functions enables the designer to focus in on the most important aspects of the design, greatly reducing this stage of the design cycle.
• Matching components with models: Once the key components and cost functions are identified, models of the required accuracy can be used (created if needed) to perform the required estimations.
• Optimization and selection: This entails the iteration of the second and third steps of this process.
The following example demonstrates the use of PowerPlay in aiding in the trade-off analysis of algorithm choice.
Example 8: Luminance Decompression [54]
This example illustrates the methodology for conceptual-level algorithmic trade-off analyses. Two alternative designs of the luminance sub-component of a custom real-time video decompression chip [54],[56] are explored and compared.
The vector-quantization decompression scheme (Figure 6.1) decodes an 8-bit input into 16 6-bit words, each of which represents the luminance of
a video pixel. The decompression is accomplished using a memory look-
up table (LUT), where the 8-bit input specifies the address of a 16-word
block of luminance values. Incoming data is buffered using a ping-pong
memory access scheme. Figure 6.1 shows that the current frame’s data is
being stored in memory Bank 0, while the previous frame’s data is being
read from Bank 1. With each new video frame, the read/write roles of the
memory banks are reversed.
The system has a 256 x 128 pixel video screen which requires updates at a minimum rate of 60 frames/second. Since the incoming video arrives at 30 frames/second, each read buffer is decompressed and displayed twice. In other words, a buffer is read twice as often as it is written. These requirements set the minimum frequency, f, at which pixels are sent to the screen to 2 MHz, and the read and write buffer access rates to f/16 and f/32, respectively [54].
Without conceptual-level tools, performing this analysis requires either tedious hand analysis or composing compilable structural descriptions. Trying out different options would also be time consuming.
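As a rough illustration of the decode path just described, the following Python sketch models the LUT expansion and the ping-pong bank swap. The class interface and LUT contents are illustrative assumptions, not the chip's implementation.

```python
class LuminanceDecoder:
    """Ping-pong buffered vector-quantization decode (illustrative sketch)."""
    def __init__(self, lut):
        self.lut = lut            # 8-bit code -> block of 16 six-bit words
        self.banks = [[], []]     # two memory banks with alternating roles
        self.write_bank = 0

    def store_frame(self, codes):
        """Buffer the incoming frame's codes, then swap bank roles."""
        self.banks[self.write_bank] = list(codes)
        self.write_bank ^= 1

    def read_frame(self):
        """Decompress the most recently stored frame from the read bank."""
        read_bank = self.banks[self.write_bank ^ 1]
        return [pixel for code in read_bank for pixel in self.lut[code]]
```

Because each read bank is displayed twice per incoming frame, `read_frame` would be called at twice the rate of `store_frame` in the real system.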
Figure 6.1: Block diagram of a luminance decompression chip (4096 x 6 LUT; ping-pong banks of 2048 x 8)
At the conceptual level, estimates of the power consumption of this proposed architecture were produced as follows: Hardware modules were
selected from a library of pre-characterized components, and were
customized by defining the model parameters, such as bit-width,
memory-block organization, and signal-correlation characteristics.
PowerPlay multiplied the resulting energy/operation by the estimated
number of accesses of each resource (activity). Based on this
information, it generated a spreadsheet that contains the power estimates,
per module, as well as total power consumption (Figure 6.2). The
spreadsheet includes appropriate input parameters, such as bit-widths and
supply voltages, which can be varied dynamically. The whole process,
including the selection of the library elements and the composition of the
architecture, was executed through a standard WWW browser, Netscape,
in less than three minutes. No other tool interfaces are needed.
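The estimation arithmetic described above (energy/operation multiplied by activity, summed per module) can be sketched as follows. The module names and energy numbers are illustrative placeholders, not values from the characterized library.

```python
def power_estimate(modules, frame_rate):
    """Per-module and total power: energy/operation x accesses x frame rate."""
    per_module = {name: e_op * accesses * frame_rate
                  for name, (e_op, accesses) in modules.items()}
    return per_module, sum(per_module.values())

# Illustrative entries: (energy per access [J], accesses per frame).
modules = {
    "lut_sram": (50e-12, 2048),
    "bank_0":   (30e-12, 1024),
    "decode":   (5e-12,  2048),
}
per_module, total = power_estimate(modules, frame_rate=60)
```

In PowerPlay these products live in a spreadsheet, so changing a parameter such as bit-width or supply voltage re-evaluates all the entries dynamically.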
Figure 6.2: PowerPlay’s spreadsheet power analysis
Note that the clock capacitance is included in the model of each block. In
this example, signal correlations are neglected, yielding a conservatively
high power estimate. Note however, that the lack of interconnect analysis
in this example neglects some switching capacitance. The important point
is that one should be able to decide which components are important and
provide estimates as accurately as possible. In this way, the estimate can
take seconds to perform and account for everything (or more than) that
would normally be taken into consideration in manual analysis.
This estimation strategy enables a quick comparison of alternative design
choices. Figure 6.3 shows another implementation of the same
algorithm, which exploits the locality-of-reference of vector quantization
by addressing groups of four words. In this implementation, each
memory access yields four times as many bits as in the previous
implementation. The question is whether the overhead of the larger
memory accesses and extra multiplexors outweighs savings earned by
reducing the total number of accesses. In this implementation, only one
multiplexor and register are switching at the full 2 MHz. The memories
Figure 6.3: Alternate implementation of the decompression chip (1024 x 24 LUT; banks of 512 x 32)
switch more capacitance per access, but at a fraction of the frequency of the initial implementation.
At this level of abstraction, accuracy should be within an octave of the actual value. This enables power budgeting at an early stage and gives a good basis for making architectural and algorithmic decisions. PowerPlay estimated the power dissipation of the second implementation (Figure 6.3) to be ~150µW, or 1/5 that of the original design (Figure 6.1). The final implementation of the chip used this second architecture and had a measured average power dissipation of 100µW [54].
6.1.2 Estimations involving various interacting cost functions
One of the key enablers that a macromodel-based system provides is the ability to perform analyses of a variety of cost functions simultaneously. More importantly, it also provides the ability to bring these examples together into a common framework to perform very complex analyses throughout the design process.
In the following example, a highly heterogeneous system is used to demonstrate the methodology for this type of analysis. The analysis in this example is benchmarked versus SPICE.
Example 9: Voltage regulator operating in Pulse Frequency Modulation (PFM)
The basic functionality of a switching regulator is to supply a substantially constant voltage to another electronic circuit. Like most systems, the regulator has a number of other conflicting cost functions. The energy should be supplied as efficiently as possible, and the size of external components needs to be kept as small as possible, minimizing both system cost and board area.
The cost functions tracked in this example are regulation accuracy (in
terms of voltage ripple) and efficiency. External component size is a
design parameter. In addition, for a battery supplied system, the input
voltage to the regulator varies with time, hence battery voltage will also
be a design parameter.
The system under design is a buck regulator - the input voltage is higher
than the output voltage. The basic power train for the buck regulator is
shown in Figure 6.4.
The main job of a converter is to regulate an output voltage, Vo, substantially constant by controlling the power switches MP and MN. In Pulse Frequency Modulation (PFM), the goal is to only turn on the switches when the output voltage drops below a specified trigger level. Simplified operation is shown in Figure 6.5. PFM is usually used when the output has a relatively low current draw. In the state when Vo is above the trigger level, Vt, only circuitry monitoring the output voltage is enabled (usually a voltage reference and a comparator); the switches can either be controlled by PWM circuitry or other methods. In this
Figure 6.4: Switching voltage regulator power train (switches Mp/Mn; output filter Lf/Cf)
example, when Vo drops below a specific level, the PMOS device is turned on, and then the NMOS device, thus applying a single burst of charge (Figure 6.5). Switching does not occur again until Vo goes below the threshold. The voltage levels Vout and ∆Vp (the allowable ripple) are defined by the application specifications.
While meeting the regulation specifications, Vo and Ve, the regulator also needs to minimize energy dissipation, hence maximizing efficiency.
Primitive models for regulator accuracy can be derived in a general sense
without the development of the details of the design. Efficiency, however,
requires some assumptions of the underlying architectures.
A composite model for this view of the regulator is shown in Figure 6.6.
The model is composed of a composite for PFM losses and a number of
primitives. It illustrates the variety of cost functions which can be
simultaneously monitored with this type of system. In the first row, losses
are calculated using a composite model. By linking to the PFM losses,
breakdown of the losses can be explored. The sensitivity analysis
presented in Figure 5.9 was calculated using the composite of losses.
Energy dissipated in the load, and efficiency are both calculated using
Figure 6.5: Simplified PFM operation (Vout +/- 2∆Vp envelope; slow dissipation between bursts)
primitive models. In the calculation of efficiency, the usefulness of
spreadsheet analyses is demonstrated.
The last two entries in the composite contain seven different primitive
model equations calculating complex aspects of the regulator. The main
purpose of line 4 is to calculate the size of each charge burst. The model
for the charge, Q, is reliant on the time that each of the MOS devices is on, which is, in turn, reliant on the characteristics of the input and output parameters of the circuit.
Figure 6.6: Voltage regulation composite model
For a specific regulator, a certain efficiency is required. In this regulator the ripple must be less than +/-5% of the nominal output. By allowing the voltage to sag 20mV below the nominal output, the allowable ripple is calculated as 0.05*Vout+20e-3. Note, this is put in as an equation so that the spreadsheet remains valid as the output voltage specification changes. The primitive model, Ripple, calculates not only the ripple for this particular specification; it also calculates the allowable ripple and the minimum capacitor size required to meet this specification.
As can be seen from this example, the specification is not met. The minimum capacitor size and the size of the ripple are calculated, which allows the designer to evaluate ways to make the necessary changes. As the design is changed to meet the ripple specification, the designer can simultaneously monitor the effects that this has on regulator efficiency and component size.
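A hedged sketch of the arithmetic behind the Ripple primitive follows, assuming the ±5% tolerance and 20mV sag stated above and the single-burst relation ripple ≈ Q/C for the filter capacitor. The function names and the charge-per-burst input are illustrative, not the model's actual equations.

```python
def ripple_spec(vout, sag=20e-3, tolerance=0.05):
    """Allowable ripple: tolerance*Vout plus the permitted sag below nominal."""
    return tolerance * vout + sag

def min_capacitor(q_burst, vout):
    """Smallest filter capacitor meeting the spec, from ripple = Q/C."""
    return q_burst / ripple_spec(vout)
```

Keeping the specification as an equation, rather than a number, is what lets the spreadsheet stay valid as the output voltage specification changes.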
Figure 6.7: Ripple versus capacitor size
Figure 6.7 illustrates a plot that can be created through PowerPlay to
show the trade-off between capacitor size and the voltage ripple. Through
plots such as these, the designer can identify crossover points when their
design goals are met.
A powerful enabler of PowerPlay is the ability to see the effects that design changes have on a variety of cost functions. Using the same sensitivity analysis used for Figure 6.7, the effects that changing the inductor size has on efficiency and ripple are shown in Table 3. It illustrates that as inductor size goes up, the amount of charge per burst also rises. Though this translates to a larger voltage ripple, the efficiency increases. The efficiency wins arise from the lower frequency of pulses required.
Table 3 Ripple and Efficiency versus Inductor size

 L (uH) | Ripple (mV) | Efficiency (%)
 -------+-------------+---------------
    1   |    14.7     |     92.2
    2   |    29.4     |     85.9
    3   |    44.1     |     87.2
    4   |    58.8     |     87.9
    5   |    73.5     |     88.3
    6   |    88.2     |     88.6
    7   |   103       |     88.8
    8   |   118       |     88.9
    9   |   132       |     89.0
   10   |   147       |     89.2
   11   |   161       |     89.2
   12   |   176       |     89.3
   13   |   191       |     89.3
   14   |   205       |     89.4
Time of Analysis: The time to create and input all the models used in this example was approximately an hour and a half. However, each single analysis took less than a second to perform. In addition, even the most complex of sensitivity analyses took on the order of 10 to 20 seconds.
In addition, most of the models used are general to PFM architectures, so as future designs are created, very little modeling needs to occur. This is an example of design for reuse.
Accuracy: These models were used to help size the peak current in a PFM architecture which was fabricated in a 0.6um CMOS process. The ripple estimates were all within a few percentage points of the actual values across battery voltage and output voltage ranges. The accuracy of the ripple estimates went down slightly as the load current went up, as that was not accounted for in the model.
The efficiency was measured to be between 89-93% depending on load. Maximum error from estimated efficiency was 4.5%.
6.1.3 Primitive characterization and inclusion
The ability to quickly and accurately characterize and include primitives is an essential property of the conceptual-level approach. Digital standard cell libraries are one of the key building blocks for a number of systems. In the following example, circuits and methodologies used for characterizing digital cells will be demonstrated. Both a 1.2µm and a 0.6µm library were characterized.
6.1.4 Dynamic models and other tools leveraging off the WWW
The inclusion of dynamic models is fundamental to the estimation of complex systems. A number of examples are presented in Appendix A. Two different dynamic
models are presented in this section.1 In Example 11, the use of complex tool interaction is also demonstrated by utilizing a schematic entry tool located on another server on a separate file system.
Example 10: ARM Processor [32]
This example illustrates an architecture-specific dynamic model. The ARM8 was modeled for energy. In this model, an algorithm (in C) is compiled, simulated and profiled. This produces information on operation counts and timing. The reader is referred to [32] for more details on this process. The profiled output is then parsed into PowerPlay to produce a nest of composites representing the entire algorithm, or a single number for energy consumption. Thus the user can choose the level of detail needed. The model takes less than 10 seconds to evaluate and was put into PowerPlay using the WWW interface described in Appendix A.
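The post-processing step of this dynamic model (folding profiled operation counts against per-operation energies) might look like the sketch below. The operation classes and energy values are assumptions for illustration, not the ARM8 characterization data of [32].

```python
# Illustrative energy-per-operation table (joules); placeholder values.
ENERGY_PER_OP = {"alu": 0.4e-9, "mul": 1.1e-9, "load": 1.6e-9, "store": 1.5e-9}

def arm_energy(profile, detailed=False):
    """profile maps operation class -> executed count (from the profiler).
    Returns either a per-operation breakdown (a 'nest of composites')
    or a single energy number, matching the two levels of detail above."""
    breakdown = {op: count * ENERGY_PER_OP[op] for op, count in profile.items()}
    return breakdown if detailed else sum(breakdown.values())
```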
The VELA project aims toward providing the capability for tools to interact over the WWW. One of the keys to this is to form standards for communication between tools. The following example, though not using standards, illustrates the ability for a schematic entry tool to interact with an estimation tool across the WWW.
Example 11: Media Processor
The schematic entry tool WebTop [38] was used to describe a multimedia processor. The Java entry tool used is located at MIT and can be accessed from any browser. This processor consists of an IDCT block, an ARM microprocessor, and a voltage regulator for each of these units. Input parameters for each of these elements can be set. The design is parsed
1. A number of other examples of dynamic model usage are presented in Appendix A.
into a textual format at MIT. This format is passed, at the request of the browser, to PowerPlay to produce an estimate.
The screen shown directly after the linkage to PowerPlay is illustrated in Figure 6.9. This particular model illustrates a number of PowerPlay's capabilities:
• The ability to describe a system in a schematic and view it as a composite virtually instantaneously
• The use of associative modeling, allowing the user to select different models in PowerPlay without changing anything in the schematic
Figure 6.8: Media Processor described using WebTop
• Use of views. In this case the view scalePower is used. The IDCT was designed in 0.8µm technology; however, the characterized library is a 1.2µm library. The design rules and cells used were very simple, so only a simple scaling factor was used. This factor was incorporated into the view of the spreadsheet.
• The use of various types of models. The IDCT is a composite model, the ARM model is the dynamic model of the previous example, and the regulators are primitives.
Time of Analysis: The transfer of the description from MIT to PowerPlay varied in time from 5 to 20 seconds. Evaluation time of the model varied from 10-25 seconds depending on the length of the code being profiled.
Accuracy: Most of the design was not fabricated. However, the IDCT section was both simulated using Verilog and fabricated in silicon. PowerPlay estimated the power at 5.8mW. The simulator, Pythia [19], a Verilog estimator, estimated the power dissipation to be 5.57mW. The fabricated part dissipated 4.65mW [20].
Figure 6.9: Media processor composite model
Example 12: Radio baseband noise analysis
Figure 6.11 illustrates the primitive utilized for estimating the effect that ADC bitwidth has on noise in the data of an RF receiver. The plot shown was created via Matlab and included as documentation.
As noted in the documentation at the bottom of the figure, the model is a dynamic call to a table look-up algorithm. The data in the table was created utilizing a MATLAB script for a CDMA radio path. By utilizing a table, however, the estimate requires less than a second as opposed to the ~ten seconds required to access an encapsulated MATLAB script.
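The table look-up strategy just described can be sketched as follows: the expensive simulation is run once offline over a grid of bit-widths, and the model answers quickly by interpolating the stored table. The table values are illustrative placeholders, not the CDMA noise data generated by the MATLAB script.

```python
import bisect

# Offline-generated (ADC bits, noise dB) pairs; placeholder values.
NOISE_TABLE = [(4, -24.0), (6, -36.1), (8, -48.2), (10, -60.3)]

def noise_estimate(bits):
    """Interpolate receiver noise for an ADC bit-width from the stored table."""
    xs = [b for b, _ in NOISE_TABLE]
    ys = [n for _, n in NOISE_TABLE]
    if bits <= xs[0]:
        return ys[0]
    if bits >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_left(xs, bits)
    if xs[i] == bits:
        return ys[i]
    frac = (bits - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + frac * (ys[i] - ys[i - 1])
```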
Figure 6.10: IDCT implementation[20]
Figure 6.11: Dynamic model utilizing table look-up
6.1.5 Textual Parsing
The ability to produce a composite description of a design from an algorithmic description is useful for functions like bottleneck identification, while staying within a design flow. The following example was performed using a tool created by Marlene Wan and integrated into the PowerPlay system.
Example 13: VSELP encoder
A textual parser was developed to analyze the complexity of an algorithm. The algorithm (in a restricted C++ format) is compiled and simulated to get the execution frequency of each of the basic subroutines.
This is independent of architecture, so certain inaccuracies are expected. However, major components should be identified. A compiler front end is
used together with this functional breakdown to derive the number of
basic operations that are in each procedure. Finally, this information is
parsed into the PowerPlay format, with associative model mapping, so
different architectures can be used.
The flow described was used to guide both hardware and software co-
design, and to identify bottlenecks in an algorithm being prototyped on
the domain-specific hardware, Pleiades.
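The final parsing step, combining per-procedure execution frequencies with basic-operation counts and associatively mapping operations to models, might look like the sketch below. All procedure, operation, and model names are illustrative assumptions.

```python
def to_powerplay(procedures, model_map):
    """procedures: name -> (times executed, {operation: count per execution}).
    model_map associatively maps each operation class to a model name, so
    targeting a different architecture is just supplying a different map."""
    composites = {}
    for proc, (execs, ops) in procedures.items():
        composites[proc] = {model_map[op]: execs * count
                            for op, count in ops.items()}
    return composites
```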
6.1.6 Design Management
Example 14: InfoPad wireless multimedia terminal
The multimedia wireless InfoPad terminal [53] is a good example of a highly heterogeneous, complex system (Figure 6.12). It includes custom digital chips, a commercial microprocessor for signal processing and error correction, a radio subsystem with analog and digital components, nine mixed-signal voltage regulation systems, display drivers, and an I/O interface. This example was provided in detail in Chapter 3. PowerPlay was used as a means for documenting the project after completion. In other words, the implementation details were abstracted up to the conceptual level. Further details of this example can be found in [53] and Example 6.
Figure 6.12: InfoPad: multimedia wireless terminal
Figure 6.13: Composite model of a portable multimedia system
Chapter 7
Summary and Directions of Future Research
The growth in complexity of electronic systems is not a new phenomenon. In fact it has been a continuum as constant as Moore's law.1 Over time, designers have responded to increased complexity in the same manner: when designs get too complex to understand at one level of abstraction, they create tools and methodologies to enable important design decisions to be made from higher levels of abstraction. The recent growth in complexity, combined with the drive toward system integration, is, using another Intel founder's terminology, creating a strategic inflection point [53]: a change so profound that new approaches and methodologies are needed. We believe this thesis to be one of the first steps in identifying and addressing methods for responding to this dramatic change in the way systems need to be both examined and designed.
1. Gordon Moore, in preparing a speech in 1965, made the observation that chip complexity had been doubling every 18-24 months. The extrapolation of this observation beyond 1965 has proven to be astoundingly accurate and has become known as Moore's Law.
The work presented in this thesis is a small step in what we believe is the right direction; however, there is obviously much more development required. In the next section, the main lessons learned will be reviewed. The thesis will conclude with a potential road map for future directions in complex system design.
7.1 Summary
This work has identified and developed some key aspects in the design of increasingly complex circuits. These included:
The conceptual level of abstraction: A higher level of abstraction has been defined with regards to complex system design. In this level, only the required details of the lower levels of abstraction are utilized. A conceptual-level description presents only the essence of the system required to answer a specific design question. Hence, the conceptual-level description is a dynamic description, changing not only as a design is modified, but as different design objectives are targeted.
A prototype environment was developed. In a number of design examples it was shown how the conceptual level is of primary importance in the early stages of design to maximize design wins and minimize design time. In addition, these examples demonstrated how, throughout the design process, abstracting away all but the essential elements is sometimes the only way to deal with a complex system.
Macromodel-based design environment: Macromodeling was identified as a key building block for complex-system design. In effect, a macromodeling environment is object focused, as opposed to tool focused, enabling designers to use the proper tool for a particular part of a heterogeneous system.
The important aspects of a conceptual-level design environment were identified in detail in Chapter 2. Chapter 3 introduced two forms of macromodels, primitive and composite.
World Wide Web-based CAD design: This work introduced one of the first CAD systems designed to leverage off the capabilities of the Internet. A number of design techniques were identified and highlighted in Chapters 4 and 5. Of primary significance, two aspects of the environment seem to have had the most impact. The first, hyperlinked spreadsheets, has already made the transfer to industry and may be the most recognizable aspect of the design environment. The implementation of dynamic macromodels illustrated how tools can be invoked transparently over the WWW, aiding in system analysis.
7.2 Road map for future design environments
The development work has focused primarily on the estimation portion of design exploration; however, elements important to the entire design environment have been identified. This thesis concludes with a discussion of the important developments needed to support the increased complexity in systems.
7.2.1 Integration
A design environment should not dictate what tools get utilized. The best tool should always be used. As much as possible, the integration of tools and previous designs/estimations should be transparent to a designer. The following three elements were introduced in a specific sense for estimation, but need to be extrapolated to all aspects of design, and be used widely to be of full value.
• Standards: It will not be practical to try to force all tools to resolve to one description standard. What will be important is to define a series of standards as to how designs get stored and information transferred. Some great steps in this direction are ongoing, including the Vela project [41] and the VSI standards committee. In this work it was shown that by making a simple standard for both the request and return of information, a variety of tools can be integrated using simple wrappers.
• Tools and data encapsulation: A virtual catalogue of design tools, design elements and estimations should be available to a designer. This will require an initial effort to encapsulate a number of tools and data sheets. In addition, a systematic way to easily incorporate tools will be important.
• Proxies/Design Servers: The distributed nature of tools, as well as the sheer tomes of information, will necessitate continued research in the areas of proxies and design agents. These agents will facilitate the searching, management and organization of an ever-growing amount of tools and intellectual property. As discussed in Chapter 3, this may prove to be the final step in full integration of a complex-system design environment.
7.2.2 Design entry and result visualization
Present day design entry tools are adequate in bottom-up design, but fall very short for the needs of top-down design. Simple tasks, such as drawing a simple block diagram of a system without needing to spend any time defining aspects of the lower levels, are cumbersome in all the most widely used commercial tools. Focus on a schematic entry tool which can be used for both bottom up and top down is essential.
The ability to link the high-level design descriptions with estimation and simulation data is also a need moving forward. Being able to cross-probe between different levels of the design process (e.g., estimates back annotated onto schematics/
code; linkage between floor-planning and behavioral analyses) will be a powerful way to abstract essential elements of design information to the useful conceptual level.
As the complexity of designs goes up, perhaps new methods of design visualization will need to be developed. The interactive verification work of the WELD project is an example of looking at new paradigms which may be required to support complex system visualization [45].
7.2.3 Documentation
This work has illustrated how to fully integrate documentation into a design environment. The chief means by which this was accomplished was simply by making all documentation as easy to include as possible and ubiquitous throughout the environment. The use of hyperlinks and HTML is an obvious major technological innovation to aid in this process.
Tools still have a long way to go in supporting documentation. For example, it is non-trivial to integrate a schematic of a design, or a view on a waveform editor, into a document. It should be required, for example, for all tools to have the ability to capture any portion of the design description or analysis on the screen into an editable format with little effort. In addition, little "hooks" to documentation should be ubiquitous.
7.2.4 The simulation and verification paradox
As most theses start with a problem statement, perhaps the best place to
this thesis is to posit another question.
This work has focused primarily on conceptual-level estimation a
exploration with and emphasis on design description and the evaluation of
functions. The idea of conceptual-level simulation or verification seems to prese
7.2 Road map for future design environments 162
ription
or
ars to
the
re a
paradox. The abstracting away to the essence of a system produces a design desc
which, as stated in chapter 2, is not completely specified. Performing simulation
verification when a number of the key elements have been abstracted away appe
require adding enough detail to no longer be at the conceptual-level. Hence,
paradox: can you simulate functionality of an incompletely described system. Is the
practical means of performing conceptual-level simulation.
Appendix A
PowerPlay: Tutorial and FAQ
This chapter includes both the on-line FAQ and tutorial for PowerPlay. It provides valuable guidance for the user interface design of such systems, as well as providing guidance to the system's users.
A.1 FAQ
This section includes PowerPlay's on-line Frequently Asked Questions (FAQ) file. It assumes little knowledge on the part of the user. Refer to Section 5.2 for more detailed information.
Table 1.1 PowerPlay's on-line FAQ
5.1.1 What is PowerPlay?
5.1.2 What is new with this release?
5.1.3 Where do I start?
5.1.4 What types of models can be created?
5.1.5 How do I create new models?
5.1.6 How do I add to my design?
5.1.7 What about documentation?
A.1.1 What is PowerPlay?
PowerPlay is a CAD tool aimed at aiding in the design of complex systems with any number of conflicting and interacting cost functions and design goals. PowerPlay is a networked model creation and manipulation environment. It is primarily focused on the early stages of the design process, herein called the conceptual level; however, it can also be gainfully used throughout the design process. The environment is usable not just for electronic systems, but the integrated models are focused toward integrated circuits and larger systems.
Tasks that PowerPlay can aid:
• Conceptual-level estimation - Earliest-stage estimation. This environment enables ASAP (As Accurate As Practical) estimations before there is any traditional behavioral or architectural description.
• Integrating many tools in one estimation environment - PowerPlay enables a user to integrate any means of performing an estimation, from using generalized equations to integrating behavioral estimations to transistor-level estimators.
• Breakdown and identification of design bottlenecks - A hyperlinked, Java-based spreadsheet aids in breaking down and isolating "expensive" elements of a design.
• Design management - Many designers can work on different parts of a complex system in parallel. The results of their work can be viewed in one place. Documentation is integrated with the estimations. Dated archives can easily be produced.
• IP support and reuse facilitation
• Dynamic and static code analysis - Code can be parsed into a hierarchical breakdown of functionality and analyzed via PowerPlay.
A.1.2 Where do I start? / What is the "personal design center"?
The heart of one's interaction with PowerPlay revolves around your personal design center. In essence, it is a customized central menu page. From that page you may link to any designs that you have created or that anybody has given you access to. Your design center will be created for you when you first log into PowerPlay. It will keep track of the designs and primitive models that you create. At the top of this page are a number of links to places that aid you in tasks such as primitive creation, customizing your environment, setting up a domain, and much more.
A.1.3 What kind of models can be created?
Primitive models
A primitive model is the lowest level of model. It cannot be broken down into smaller models. There are two basic kinds:
Equations: A model can be any simple or complex equation.
Tool-calls: Any tool can be encapsulated inside a primitive.
Spreadsheet models - Designs
A model can also be made up of any combination of other models. These models are usually viewed in a spreadsheet form. There is no limit to the level of hierarchy. A spreadsheet model may contain any combination of primitives and other spreadsheets. This is how most designs are viewed.
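The primitive/composite distinction above can be sketched in a few lines. This is an illustrative Python sketch, not PowerPlay code; the class names and the `cap` cost are chosen for the example, and the register capacitance equation (bits × 95 fF) is borrowed from the model file in Appendix B:

```python
class PrimitiveModel:
    """Lowest-level model: each cost is computed directly by an equation."""
    def __init__(self, name, equations):
        self.name = name
        self.equations = equations  # cost name -> function(params)

    def evaluate(self, cost, **params):
        return self.equations[cost](params)

class CompositeModel:
    """A model made up of other models (primitives or other composites)."""
    def __init__(self, name, parts):
        self.name = name
        self.parts = parts  # list of (model, params) line entries

    def evaluate(self, cost, **_):
        # A composite's cost is composed from the costs of its parts.
        return sum(m.evaluate(cost, **p) for m, p in self.parts)

# A primitive modeling register capacitance as bits * 95 fF (Appendix B):
reg = PrimitiveModel("register", {"cap": lambda p: p["bits"] * 95e-15})
# A composite containing two 8-bit registers:
datapath = CompositeModel("datapath", [(reg, {"bits": 8}), (reg, {"bits": 8})])
print(datapath.evaluate("cap"))  # 2 * 8 * 95 fF = 1.52e-12
```

Because a composite may itself appear as a part of another composite, hierarchy of any depth falls out of the same two classes.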
A.1.4 How does one create new models?
Primitive models:
There is a link at the top of your design center. Follow this link and you will be given a series of questions about your model. When you finish answering the questions, your primitive will automatically be included in your design center.
The first page will ask you to name your primitive. The next page(s) allow you to enter your models. Here you can define your model as an equation or a tool. For an equation, simply type in the equation (just about any mathematical function will be parsed). Parameters (such as bitwidth) will be automatically parsed out as well. You may put in any number of models. The last page enables you to put any documentation on the page. Please (PLEASE) document. It will help everybody, including yourself.
Spreadsheet/composite models:
1) Copy an existing one and edit it.
This is done by finding a spreadsheet that you want to start with and hitting the "play and save" button at the top of the page. This will automatically save it to your design center.
2) Start a new one by filling in the proper form on the customization page.
Go to the "Customizing your Design Center" link at the top of your Design Center page. This page will have a simple form for starting a new design.
A.1.5 How do I add to my spreadsheet?
You always have one spreadsheet "active". This spreadsheet is noted on your personal design center. If you would like to make another spreadsheet active, go to that spreadsheet and hit the appropriate button at the top of that page. You can perform estimates on any spreadsheet; however, you may only add to your active spreadsheet. A spreadsheet is made up of line entries of either base primitives or other primitives. Once a spreadsheet has been established, new lines may be added to your active spreadsheet by:
Primitive: To add a primitive, go to the primitive page by linking off of your design center. Put in the input parameters on that page and hit a button to make an estimation. On the return page, there will be a button which will let you include this primitive in your "active design".
Spreadsheet/composite: To add a spreadsheet entry to your active spreadsheet, go to the spreadsheet that you wish to enter. There is a button at the top of that page to enable you to add this spreadsheet to your active spreadsheet. There is no limit to the amount of hierarchy. Be careful of circular references, as these are not yet prohibited.
Documentation: Documentation can be added to any model or design by answering a simple Q&A form. This documentation can be written in HTML, enabling the integration of pictures, graphs, and hyperlinks to more complete documentation.
Domains: Domains are used in PowerPlay in order to line up a series of operations with a series of models. The models may be either primitives or spreadsheets. This is used most frequently for the following two applications:
Dynamic profiling: When code is profiled, estimates are performed by associating the operators with the models.
Schematic entry: When schematic entry is used, estimates are performed by associating the operators with the models.
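Since circular references are allowed but dangerous, a tool could guard against them with a standard reachability check before an entry is added. A hypothetical sketch (the map of which spreadsheet contains which is invented for the example):

```python
# Hypothetical cycle guard for composite (spreadsheet) models.
# `contains` maps each spreadsheet name to the spreadsheets it includes.

def would_create_cycle(contains, parent, child):
    """True if adding `child` to `parent` would make a circular reference."""
    # A cycle appears exactly when `parent` is already reachable from `child`.
    stack, seen = [child], set()
    while stack:
        node = stack.pop()
        if node == parent:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(contains.get(node, ()))
    return False

contains = {"top": ["alu"], "alu": ["adder"], "adder": []}
print(would_create_cycle(contains, "adder", "top"))  # True: top reaches adder
print(would_create_cycle(contains, "top", "adder"))  # False: safe to add
```

The check is linear in the number of spreadsheet entries, so it could run on every "add to active spreadsheet" operation without noticeable cost.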
A.2 Tutorial: The Personal Design Center
In this tutorial, you will learn:
• About your Personal Design Center
• How to customize your working environment
• How to disable your password for fast access
In this tutorial, little previous knowledge is assumed. It is suggested that the user check out the FAQ for background information.
A.2.1 Your Personal Design Center
After logging in, you will be at your Personal Design Center (PDC). The PDC is the central page for your work in PowerPlay ( Figure A.1 ). The first table provides links to the most useful information and functions. The tables below the first provide links to the designs and models that you have created, followed by anybody's information you choose to see.
In Figure A.1 , "newguy" has a single design created, Luminance, and has chosen to be able to view designs and models in Berkeley's Low Power Library.
A.2.2 Customization
Click on the first hyperlink in the upper table (Customize your working environment). You can now customize what you see in the PDC. The GO button at the top of the page is hit to apply any changes.
Figure A.1 The Personal Design Center
The first check box and input field, used for naming new designs, will be addressed in the second tutorial. The large table allows you to choose whose designs you view. The second column is how you disable others' ability to see your work. (Note: in general, it is suggested that you leave the button at the bottom of the table, enabling all access, checked. This promotes re-use and shared design.)
In the following screen capture of the top of newguy's customization page, it can be seen that newguy is allowing everybody to see his designs, and everybody but anthony and rabaey have allowed newguy to have access. Newguy has only chosen to see berkeley_low_power on his PDC. At no time will you be able to have write access to somebody else's designs.
Figure A.2 User customization
A.2.3 To do
Click on different users in the "view on PDC" column, click on "GO", and observe the changes on your PDC. For extra credit, you can start to explore others' designs.
The check box at the bottom of the page (Enable Password) allows you to disable your password for those logging on from your (and only your) machine. Five different machines are automatically stored in your profile. Before going to tutorial #2, check the berkeley_low_power library for viewing on your PDC.
A.3 Tutorial 2
In this tutorial, you will learn how to:
• View other people's models
• Create your own equation-based model
• Create a new model with many cost functions calculated
• Edit a model already created
To do this tutorial, the user should be starting to get familiar with the Personal Design Center. To learn about the goals of PowerPlay, check out either the FAQ or the paper.
To do this tutorial, you should have the Berkeley low-power library viewable from your PDC, and understand the concept of primitive models.
A.3.1 Viewing a model
After tutorial 1, you should have the Berkeley Low-Power Library (BLPL) primitives visible on your PDC. Since the BLPL has allowed all access to the library, you can view any of their primitive models. Later you will learn how to incorporate them in your own designs (or composite models). To view any of the models, simply click on their hyperlink. Clicking on the sram link provides Figure A.3 .
At the top of the page is documentation which the designer adds when they create the model. This is written in HTML, so pictures, like the brain above, and hyperlinks are incorporated.
This model uses equations to calculate the cost functions. With the parameters above selected, and the "all available" button pressed, Figure A.4 results.
Figure A.3 Primitive model page of a memory cell
The calculated result is displayed. Also shown is a means to include it in a composite model (Tutorial 3). In this case, the active design is video_framebuffer.
To do: Explore different models from different users.
A.3.2 Creating a primitive model
To create a primitive model, hit the "Add a new primitive model" button on your PDC. You will be presented with a number of successive screens asking for relevant information.
Figure A.4 Results summary
The first screen, Figure A.5 , is for naming the primitive. You will then be presented with a series of screens, as shown in Figure A.6 , for defining the different models for your cost functions. The screen itself has instructions as to syntax.
You will be prompted for more costs until you are finished and select no more models. At this time you'll be prompted for documentation. Put in as much documentation as you think may be helpful. This will be interpreted as HTML, so feel free to use any HTML tags.
Figure A.5 Primitive entry form
Figure A.6 Primitive entry form
A.3.3 Edit models
You can edit any created model by hitting the link at the bottom of the model page. This is often useful in creating new models, as you can use an existing model as a base. Just change the name at the top of the page to your new model name. Note, you can use somebody else's model as a base. It will automatically be saved to your directory and won't modify the existing model.
To do: Go to Berkeley's SRAM, hit the "edit the models" link at the bottom of the page, and use it to create a new model. Observe that your PDC is automatically updated.
A.4 Tutorial 3: Advanced Primitive Modeling
In this tutorial, dynamic modeling, including tool encapsulation, is taught. Example scripts for tool encapsulation are included in Appendix B.
In order to use tool encapsulation, you should have familiarity with the WWW, the concepts behind cgi-bin, and access to your own personal cgi-bin. Good background in cgi-bin can be found in [42].
A.4.1 Tool Encapsulation
One of the features of PowerPlay is that any tool can be encapsulated inside PowerPlay. Understanding how encapsulation works requires some understanding of cgi-bins. Please see the above link.
In this configuration, the tool that you wish to encapsulate must be on your site. In addition, you must have a cgi-bin which you can edit. The code that you put in your cgi-bin can be in any language; for this example, the scripting language Perl is used.
Here is a step-by-step description of what happens when a tool is invoked:
• First, PowerPlay sends a call to the encapsulation tool code with the names and values of all the input parameters.
• The code then parses the parameter values and either calls a tool local to the site, performs the estimate within the code itself, or performs a table look-up.
• For the case of tool encapsulation and/or table lookup, the tool runs, or the table look-up is performed, producing a result.
• The code then interprets the results and sends the values back to PowerPlay.
• PowerPlay interprets the values in the context of the particular estimation and continues estimating with the next model.
NOTE: instead of encapsulating a tool, the whole job can often be done with a script, as will be shown in the next example. That is the basic feel of what goes on. Let's look at the roles of the encapsulation code again, discuss the syntax, and then proceed with an example:
1) Code interprets parameters from STDIN.
2) Code prepares input deck for tool.
3) Code invokes tool.
4) Code interprets output from tool.
5) Code returns values to PowerPlay through STDOUT.
Here is an example of tool encapsulation using NMOS device characteristics and MATLAB. Figures B.1-B.3 illustrate the pages used to input the model information.
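The five numbered steps can be sketched as a minimal script. This is an illustrative Python sketch rather than the Perl of Appendix B; it assumes the `name=value&...` wire format used for results, and the "tool" here is just an inline formula (the adder capacitance model from the data file in Appendix B) instead of an external program:

```python
def read_params(line):
    """Step 1: parse `name=value&name2=value2&` pairs from the request."""
    pairs = [p for p in line.strip().split("&") if p]
    return {k: v for k, v in (p.split("=", 1) for p in pairs)}

def run_tool(params):
    """Steps 2-4: a real encapsulation would build an input deck, invoke
    the tool, and parse its output; here the 'tool' is an inline formula."""
    bits = float(params["bits"])
    return {"cap": bits * 268e-15}   # adder capacitance model from B.1.5

def respond(cost, results):
    """Step 5: return `cost=value&` on STDOUT, as PowerPlay expects."""
    return f"{cost}={results[cost]}&"

params = read_params("cost=cap&bits=8&")
cost = params.pop("cost")
print(respond(cost, run_tool(params)))   # cap=2.144e-12&
```

A real CGI version would read the request from STDIN (or QUERY_STRING) and print the response, but the parse/compute/respond structure is the same.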
As you can see, the tool call is the HTTP location of the encapsulated script, and the parameters for each estimation are incorporated on the following line. The actual page created is shown in Figure B4.
The script which performed the estimate is in Figure B5. In this script, it can be seen how the inputs are parsed and the output is sent back. The syntax is simply:
name=value&name2=value2&name3.....&
all on one line.
The following example files may be useful. The example takes Vtn, Vtp, and Vdd and calculates the vih of an inverter.
Figure B8 is the script which encapsulates MATLAB. In addition, it keeps a table of previously run estimates so that no simulation needs to be run twice, to save estimation time. The script includes extensive documentation and can be used as a template for other tool encapsulations.
Table B1 is the table of data that is referred to in the code for the table lookup. Figure B9 is the Mathematica input deck.
All scripts can be downloaded from the on-line version of this tutorial, found on the PowerPlay site.
To do: If you have your own cgi-bin, encapsulate a simple tool call and try it out yourself.
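The caching idea described above (a bounded table of previously run estimates, so the tool never runs twice for the same inputs) is independent of CGI. A hypothetical Python re-creation of the scheme; the thesis version is the flat-file Perl script in Appendix B:

```python
from collections import OrderedDict

TABLE_LIMIT = 40   # same bound as the Perl script's $Table_Limit

class EstimateCache:
    """Bounded FIFO table of previously run estimates."""
    def __init__(self, tool, limit=TABLE_LIMIT):
        self.tool = tool            # expensive estimator, e.g. a simulator run
        self.table = OrderedDict()  # insertion order gives FIFO eviction
        self.limit = limit

    def estimate(self, **params):
        key = tuple(sorted(params.items()))
        if key in self.table:
            return self.table[key]          # hit: skip the tool run
        value = self.tool(**params)         # miss: run the tool once
        self.table[key] = value
        if len(self.table) > self.limit:
            self.table.popitem(last=False)  # evict the oldest entry (FIFO)
        return value

calls = []
def fake_tool(**p):
    calls.append(p)
    return p["Vdd"] / 2   # stand-in for a real vih estimate

cache = EstimateCache(fake_tool)
cache.estimate(Vtn=0.7, Vtp=0.7, Vdd=3.0)
cache.estimate(Vtn=0.7, Vtp=0.7, Vdd=3.0)   # served from the table
print(len(calls))   # 1 -- the tool ran only once
```

FIFO (rather than LRU) eviction matches the Perl script's behavior of dropping the first line of the data file once the table limit is reached.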
A.5 Tutorial 4: The Composite Model
A.5.1 Basic functions
When a system cannot be modeled with a primitive, it needs to be partitioned into parts and then recomposed in an effective manner. Thus, a key property of macromodels is that they can also take the form of a combination of other macromodels. For example, a system may be represented by a single macromodel consisting of a model of each subcomponent. In turn, this same system's macromodel may be combined with other models to "produce" a macromodel of a larger, more complex, system. These combinations of macromodels, since they are in effect compositions of subcomponents, have been termed composite macromodels.
Active design/composite: At any time, one composite model can be deemed "active". An active model is the only model whose composition can be added to. When evaluating other composites or primitives, it is the active model that can be added to. Non-active models can still be edited and evaluated. In other words, the only special function that an active model has is that it can be added to. A composite model can be made active via a link at the top of its page. This will be shown later in the tutorial.
Creating a new composite: Composites can be created by either copying an existing composite or by starting a new composite. A new composite is created by:
• 1) Go to the customize environment page (1st link on your PDC)
• 2) Click on the first radio button
• 3) Fill in the name of your new model and hit go.
At this point, you have created a new model and it is your active design/composite.
Adding a primitive to a composite: Composites are composed of both primitives and other composites. Later in the tutorial, the means to add composites will be illustrated. Primitives are added by:
• 1) Go to your PDC
• 2) Link to the primitive you wish to add
• 3) Select input parameters and hit the evaluate button
• 4) You will see a link for adding the primitive to your active model
• 5) You will be at your composite model. Hit play to update.
A.5.2 Step-by-step breakdown of composite model functionality
The goal of this section is to give an overview of a composite model and how to create one. To view a composite model, go to your PDC. All new users have an example model, Luminance, in their PDC. Click on the link and you should see the Luminance composite model. The composite model has been divided into pieces so as to describe all the functionality and options.
Figure A.7 Composite model header
Figure A.7 , the top of the composite model, provides a number of relevant links. From this table you can create and edit documentation as well as link back to your PDC. The links above also allow you to delete or archive your design. Important comment: it is in this table where you can:
• Make the composite your active design.
• Include this composite as a part of an active design.
• In this case, the active design is PFM_Losses.
Designer-included documentation is not shown here for brevity, but it can appear at this point in the spreadsheet or at the bottom. The buttons in Figure A.8 signify the beginning of the analysis portion of the design. Once changes are made, hit play, or play and save, to re-evaluate. The design will be saved with the name in the box next to the buttons. If you are not the owner of this design, the design will be copied into your directories. Hence, two ways of creating new designs are demonstrated:
• 1) If you want to make a new design out of one of your own, simply change the name in the box.
• 2) If you want to start with somebody else's design, simply hit the play button, which will automatically copy it over into your directories.
Note: If you hit play, instead of play and save, a temporary version titled temp_ will be created. This way, you can always keep a backup version.
Figure A.8 Composite evaluation buttons
Figure A.9 shows the table where global parameters are defined. By typing the name in the input parameter fields below, they are automatically passed. This is also where domains can be defined.
Here is the main table. Some interesting facets of the table:
• The names in the table also serve as hyperlinks to the corresponding models.
• In the input parameter forms, a variety of options are available:
• A number
• A global parameter (as defined in the input parameter table)
• A cost calculated in an earlier line. This is done by using the format of a pound sign, followed by a row number, followed by the cost. For example, to include the power of the first memory element in the bitwidth form of a lower row, type: #1power. It is case sensitive!
• A mathematical combination of all of the above is also usable.
Figure A.9 Composite input parameter table
Figure A.10 Main Spreadsheet table
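The `#<row><cost>` reference format described above can be resolved with a small substitution pass. An illustrative sketch, assuming rows are numbered from 1 and each row's evaluated costs are already available (the row data is invented for the example):

```python
import re

# Row data: each entry holds the costs already computed for that row.
rows = [
    {"power": 2.0e-3, "area": 150.0},   # row 1: e.g. a memory element
    {"power": 0.5e-3, "area": 40.0},    # row 2
]

def resolve(expr):
    """Replace each #<row><cost> token with that row's computed value."""
    def lookup(match):
        row, cost = int(match.group(1)), match.group(2)
        return repr(rows[row - 1][cost])   # rows are numbered from 1
    return re.sub(r"#(\d+)([A-Za-z]\w*)", lookup, expr)

print(resolve("#1power"))               # 0.002
expr = resolve("#1power + #2power")     # mathematical combinations also work
print(expr)                             # 0.002 + 0.0005
print(eval(expr))                       # evaluates the combined expression
```

Substituting numeric literals first and evaluating afterwards is what lets a reference appear inside an arbitrary mathematical combination, as the bullet list notes.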
• Rows can be moved around or deleted by using the links on the right.
• The columns that are viewed are defined in a table at the bottom of the spreadsheet.
Figure A.11 illustrates the table, found under the spreadsheet, used to define the view in the main spreadsheet model of Figure A.10 .
The two input forms shown in Figure A.12 are very useful. The first allows the definition of new cost functions. The second enables the definition of global costs. For example, you can define MinPower as:
MinPower=#1power+#4power
This will calculate the sum of the power of the 1st and 4th rows and present it at the bottom of the form. Note: you will only see it when the Show cost function box is checked. Common functions are also supported:
SUM(cost) will sum a cost across all rows.
Figure A.11 Composite model view selection
Figure A.12 Input forms for global parameters and global costs
AVG(cost) will present an average. MAX(cost) and MIN(cost) are also supported.
Remember to put it in the form of an equation. All calculations must be assigned a name so they can be passed to a parent composite model.
The bottom of the composite model page ( Figure A.13 ) provides forms for the different types of plotting available. In the first case, 1-3 costs can be calculated. The same syntax used in the input parameters defined above can be used in the graphing. In the second case, a hyperlinked pie-chart will be created of any cost. Chapter 5 provides examples of this functionality.
Figure A.13 Graphing options
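The global cost forms behave like tiny spreadsheet formulas: a named expression over row references, or a column function such as SUM(power) applied across all rows. A hypothetical sketch of both cases (row data invented for the example):

```python
# Hypothetical evaluation of global cost definitions such as
# "MinPower=#1power+#4power" and column functions like "Total=SUM(power)".
rows = [
    {"power": 2.0e-3}, {"power": 1.0e-3},
    {"power": 4.0e-3}, {"power": 0.5e-3},
]

def column(cost):
    return [r[cost] for r in rows]

FUNCS = {
    "SUM": lambda c: sum(column(c)),
    "AVG": lambda c: sum(column(c)) / len(rows),
    "MAX": lambda c: max(column(c)),
    "MIN": lambda c: min(column(c)),
}

def global_cost(defn):
    """Evaluate 'Name=expression': either FUNC(cost) or a sum of
    #<row><cost> references (single-digit rows assumed, for brevity)."""
    name, expr = defn.split("=", 1)
    for func, f in FUNCS.items():
        if expr.startswith(func + "(") and expr.endswith(")"):
            return name, f(expr[len(func) + 1:-1])
    total = 0.0
    for term in expr.split("+"):        # e.g. "#1power+#4power"
        row, cost = int(term[1]), term[2:]
        total += rows[row - 1][cost]
    return name, total

print(global_cost("MinPower=#1power+#4power"))  # sum of rows 1 and 4
print(global_cost("Total=SUM(power)"))          # sum across all rows
```

Requiring every calculation to carry a name, as the text notes, is what allows a parent composite to pick the result up as an ordinary cost.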
Appendix B
Figures and Code Useful for Downloading
B.1 Figures and code for Section A.4
Figure B.1 Editing NMOS primitive
Figure B.2 Editing NMOS primitive
Figure B.3 Editing NMOS primitive
Figure B.4 Editing NMOS primitive
B.1.1 Code for performing transistor estimation
#!/usr/sww/bin/perl
#
# Computes the model of an NMOS transistor in a 1.2 um technology.
#
# $_ receives the request from PowerPlay and parses it into the
# associative array %input:
$_ = $ENV{'QUERY_STRING'} || ($_ = <STDIN>);
%input = split (/=|!/, $_);

# Desired value is in $input{cost}
# Model parameters - these should become globals in future versions
$VT0 = 0.74;
$KP = 19.6;
$LAMBDA = 0.06;
$GAMMA = 0.54;
$PHI = 0.6;
$DELTA = 0.3;

# Input parameters
$cost = $input{cost};
delete $input{cost};
$W = $input{"W"};
$L = $input{"L"};
$Vgs = $input{"Vgs"};
$Vds = $input{"Vds"};
$Vsb = $input{"Vsb"};

# Compute model
$VT = $VT0 + $GAMMA * (($PHI + $Vsb)**0.5 - ($PHI)**0.5);
if ($cost eq "VT") {
    OutputValue($VT);
    exit;
}
$K = $KP * $W / ($L - $DELTA);
if ($Vgs <= $VT) {
    # transistor in cutoff
    $Id = 0;
    $gm = 0;
    $rout = "Inf";
} elsif ($Vds <= $Vgs - $VT) {
    # transistor in linear operation mode
    $Id = $K * (($Vgs - $VT) * $Vds - ($Vds)**2/2.);
    $gm = $K * 1.E-6 * $Vds;
    $rout = 1.E6 / ($K * ($Vgs - $Vds - $VT));
} else {
    # transistor in saturation
    $Id = $K * ($Vgs - $VT)**2 * (1 + $LAMBDA * $Vds) / 2.;
    $gm = $K * 1.E-6 * ($Vgs - $VT) * (1 + $LAMBDA * $Vds);
    $rout = 1.E6 / ($K * $LAMBDA * ($Vgs - $VT)**2 / 2.);
}

# Return the result
if ($cost eq "gm") {
    OutputValue($gm);
} elsif ($cost eq "rout") {
    OutputValue($rout);
} elsif ($cost eq "Id") {
    OutputValue($Id);
} else {
    print("Unknown cost\n\n");
}

sub OutputValue {
    print("Content-type: text/html\n\n");
    print("$cost=$_[0]&");
}
B.1.2 Mathematica/table-lookup code
#!/usr/sww/bin/perl
#
# Searches through a lookup table for a previously run estimate.
# If the estimate has been performed, then return the value.
# If not, perform a new estimate and store the value in the table.
# The table has a limit of $Table_Limit and works as a FIFO.
#
# Define constants:
$Table_Limit = 40;   # Max 40 values in the table
# Next line is where all files are stored. Not an optimal place:
$Dir = "/path/path/path/MATH/";

# $_ receives the request from PowerPlay and parses it into the
# associative array %input:
$_ = $ENV{'QUERY_STRING'} || ($_ = <STDIN>);
%input = split (/=|!/, $_);

# Desired value is in $input{cost}:
$cost = $input{cost};
delete $input{cost};

# File names:
# $file holds the last unique $Table_Limit entries:
$temp = $Dir.$cost.".tmp";
$file = $Dir.$cost.".dat";

# Sleep for 5 if temp exists:
# sleep 5 if (-e $temp);

# Get the length of the file. Set a flag if it exceeds the table limit:
$xxx = `wc -l $file`;
$xxx =~ / (.*) (.*)/;
$count = $1;
$FLAG = 1 if ($count > $Table_Limit);

# Open the files.
open (TEMP, ">$temp") || die "Can't create tempfile $temp";
open (FILE, "$file") || die "Can't open the data file $file";

#####
# Here we search for a previous match:
$MATCH = "";
$i = 0;
while (<FILE>) {
    # First line doesn't get printed to temp if the table size is met
    print(TEMP) unless ($FLAG);
    $FLAG = "";
    %data = split (/=|&/, $_);
    $i = 0;
    $j = 0;
    foreach $key (keys %input) {
        $j++ if ($input{$key} == $data{$key});
        $i++;
    }
    if ($i == $j) {   # Match made
        close (TEMP);
        `rm $temp`;
        $MATCH = "1";
        last;
    }
}
close (FILE);
# Match found if $MATCH is set
########
unless ($MATCH) {
    &ExecuteTool;   # makes the tool call and puts the result in $data{$cost}
    # Print out the parameters
    foreach $key (keys %input) {
        print(TEMP "$key=$input{$key}&");
        delete $data{$key};   # may be redundant if %data only gets costs
    }
    # Print out all costs
    foreach $key (keys %data) {
        print(TEMP "$key=$data{$key}&");
    }
    print (TEMP "\n");
    close (TEMP);
    rename ($temp, $file);
    chmod 0666, $file;
}

# Return the final value:
print("Content-type: text/html\n\n");
print("$cost=$data{$cost}&");

sub ExecuteTool {
    $ToolFile = $Dir.$cost.".math";
    $TempFile = $Dir.$cost."tmp.math";
    $OutFile = $Dir.$cost.".out";
    # Substitute all of the parameters into the math file and place
    # them into the temporary file:
    &subandprint($ToolFile, $TempFile);
    # Run Mathematica using the temporary file:
    system ("math-3.0 < $TempFile > $OutFile");
    # Parse the out file for the costs. Also do any post-processing:
    $data{$cost} = &ParseCost($cost, $OutFile);
}

sub subandprint {
    # Parse the filenames to local variables:
    local($in, $out) = @_;
    open(IN, "$in") || die "Could not open $in";
    open(OUT, ">$out") || die "Could not open $out for writing";
    # The input file has all the parameters which need substituting
    # written as <param>xx. The substitution is done here:
    while (<IN>) {
        s/"/'/g;                 # make " into ' for later printing
        s/(\w*)xx/$input{$1}/g;  # substitution done here
        print(OUT);              # printed to OUT
    }
    close (IN);
    close (OUT);
}

sub ParseCost {
    local($cost, $file) = @_;
    local(@Cost);
    open (FILE, $file) || die "Could not open $file";
    while (<FILE>) {
        # Wayne's code follows. Functional but needs tightening:
        if (/->/) {   # find the line that contains '->'
            ($f1, $f2, $f3, $f4, $f5, $f6, $f7) = split (/\s+/);
            $f4 =~ s/,//;
            push (@Cost, $f4);
        }
    }
    foreach $key (@Cost) {
        $answer = $key if ($key > 0);
    }
    close(FILE);
    $answer;
}
B.1.3 Example table lookup data
Vtn=0.7&Vtp=0.74&Vdd=3&vih=1.40288&
Vtn=0.7&Vtp=0.7&Vdd=3&vih=1.4209&
Vtn=0.5&Vtp=0.7&Vdd=3&vih=1.31102&
Vtn=.7&Vtp=.74&Vdd=5&vih=1.31102&
Vtn=.7&Vtp=.8&Vdd=3&vih=1.31102&
Vtn=0.7&Vtp=0.7&Vdd=5&vih=1.31102&
Vtn=0.75&Vtp=0.75&Vdd=5&vih=1.31102&
Vtn=0.4&Vtp=0.7&Vdd=5&vih=1.31102&
Vtn=0.3&Vtp=0.2&Vdd=5&vih=1.31102&
Vtn=0&Vtp=0&Vdd=5&vih=1.31102&
Vtn=0.2&Vtp=0.5&Vdd=3&vih=1.31102&
Vtn=1&Vtp=1&Vdd=2&vih=1.31102&
Vtn=0&Vtp=0&Vdd=0&vih=1.31102&
B.1.4 Example Mathematica deck
vdd = Vddxx
vtn = Vtnxx
vtp = Abs[-Vtpxx]
kn = 0.00008
kp = 0.000027
Solve[{ kn*vout + kp*(vdd - vih - vtp) == kn*(vih - vout -
vtn),
kn*((vih - vtn)*vout - (vout^2)/2) == (kp/2)*(vdd - vih -
vtp)^2 }]
B.1.5 Example MODELS.moddat file
name=adder&DelayUNITS=ns&PARAMbits=8&cap=#bits*268e-15&
capUNITS=F&allMOD=cap#Delay#Area#&Area=(#bits*64+11)*207&allParam=
bits#&AreaUNITS=um^2&Delay=7.7+#bits*0.75&
name=csadder&Area=(#bits*64+40)*232&allMOD=cap#Area#&allPar
am=bits#&PARAMbits=8&AreaUNITS=um^2&cap=#bits*233e-15&capUNITS=F&
name=counter&DelayUNITS=ns&PARAMbits=8&cap=#bits*154e-
15&capUNITS=F&allMOD=cap#Delay#Area#&Area=(#bits*64+75)*160*31*(#b
its+1)/15&allParam=bits#&AreaUNITS=um^2&Delay=#bits*0.75+5&
name=register&allMOD=cap#&allParam=bits#&PARAMbits=8&cap=#b
its*95e-15&capUNITS=F&
name=reg&allMOD=cap#&allParam=bits#&PARAMbits=8&cap=#bits*9
5e-15&capUNITS=F&
name=reglp&allMOD=cap#&allParam=bits#&PARAMbits=8&cap=#bits
*95e15&capUNITS=F&PARAMwords=1024&
name=sram&DelayUNITS=ns&PARAMbits=8&cap=(#bits*1126+#words/
#blocks*108+#words/#blocks*#bits*6+9707)*1e-15&capUNITS=F&allMOD=
Delay#cap#Area#&Area=(17*#words/#blocks+362)*(51*#bits+275)&
allParam=bits#words#blocks#&AreaUNITS=um^2&PARAMblocks=4&Delay=4.5
&PARAMwords=1024&
name=memlp&DelayUNITS=ns&PARAMbits=8&cap=(#bits*1126+#words
/#blocks*108+#words/#blocks*#bits*6+9707)*1e-15&capUNITS=F&allMOD=
Delay#cap#Area#&Area=(17*#words/#blocks+362)*(51*#bits+275)&
allParam=bits#words#blocks#&AreaUNITS=um^2&PARAMblocks=4&Delay=4.5
&
name=mult&PARAMbits=16&cap=(#bits*#bits2*253)*1e-
15&PARAMbits2=8&capUNITS=F&allMOD=cap#Area#&Area=(129*(#bits-
1)+410)*135*(#bits2-1)+3&allParam=bits#bits2#&AreaUNITS=um^2&
name=mux2to1&DelayUNITS=ns&PARAMbits=16&cap=#bits*40e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=56*((64*#bits)+25*(#bits
+1)/8)&allParam=bits#&AreaUNITS=um^2&Delay=2.5&
name=mux3to1&DelayUNITS=ns&PARAMbits=16&cap=#bits*64e-15&
capUNITS=F&allMOD=Delay#cap#Area#&Area=93*((64*#bits)+20*(#bits+1)
/8)&allParam=bits#&AreaUNITS=um^2&Delay=8/3&
name=mux4to1&DelayUNITS=ns&PARAMbits=16&cap=#bits*78e-15&
capUNITS=F&allMOD=Delay#cap#Area#&Area=113*((64*#bits)+16*((#bits+
1)/16)-1)+79&allParam=bits#&AreaUNITS=um^2&Delay=8/3&PARAMwords
=1024&
name=RegFile&DelayUNITS=ns&PARAMbits=8&cap=(93+(40*#words)+
(50*#bits)+(9*#words*#bits)*1e-15&capUNITS=F&allMOD=Delay#cap#
&allParam=words#bits#&Delay=4.5&
name=sub&DelayUNITS=ns&PARAMbits=8&cap=#bits*264e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=207*(64*#bits+20)&allPar
am=bits#&AreaUNITS=um^2&Delay=#bits*0.75&
name=xor&DelayUNITS=ns&PARAMbits=8&cap=#bits*45e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=#bits*56*64&allParam=bit
s#&AreaUNITS=um^2&Delay=2.5&
name=and2&DelayUNITS=ns&PARAMbits=8&cap=#bits*36e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=#bits*51*64&allParam=bit
s#&AreaUNITS=um^2&Delay=2.8&
name=and3&DelayUNITS=ns&PARAMbits=8&cap=#bits*50e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=#bits*59*64&allParam=bit
s#&AreaUNITS=um^2&Delay=2.9&
name=or2&DelayUNITS=ns&PARAMbits=8&cap=#bits*40e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=#bits*53*64&allParam=bit
s#&AreaUNITS=um^2&Delay=2.9&
name=or3&DelayUNITS=ns&PARAMbits=8&cap=#bits*50e-
15&capUNITS=F&allMOD=Delay#cap#Area#&Area=#bits*70*64&allParam=bit
s#&AreaUNITS=um^2&Delay=3.1&
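The MODELS.moddat records above are flat `key=value&` strings in which cost expressions reference parameters with a `#` prefix, and the `allMOD`/`allParam` fields list available costs and parameters separated by `#`. A Python stand-in sketch of decoding one record and evaluating a cost (the adder record is quoted from the file above):

```python
record = ("name=adder&DelayUNITS=ns&PARAMbits=8&cap=#bits*268e-15&"
          "capUNITS=F&allMOD=cap#Delay#Area#&Area=(#bits*64+11)*207&"
          "allParam=bits#&AreaUNITS=um^2&Delay=7.7+#bits*0.75&")

# Split the record into its key=value fields:
fields = dict(p.split("=", 1) for p in record.split("&") if p)

def evaluate(cost, **params):
    """Substitute #param references, then evaluate the arithmetic."""
    expr = fields[cost]
    # Substitute longer names first so e.g. #bits2 is not clobbered by #bits:
    for name in sorted(params, key=len, reverse=True):
        expr = expr.replace("#" + name, str(params[name]))
    return eval(expr)   # expressions here are plain arithmetic

print(fields["name"])              # adder
print(evaluate("Delay", bits=8))   # 7.7 + 8*0.75
print(evaluate("cap", bits=8))     # 8 * 268e-15
```

The `PARAMbits=8` field supplies the default parameter value, so a fuller decoder would fall back to it when a parameter is not passed explicitly.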
References
[1] Kurt Keutzer, “The Impact of CAD on the Design of Low-Power Digital Circuits,” IEEE Symposium on Low Power Electronics - Digest of Technical Papers, pp. 342-346, Oct. 1994.
[2] Chan, T.Y.; Ko, P.K.; Hu, C., “Dependence of channel electric field on device scaling,” IEEE Electron Device Letters, vol. EDL-6, no. 10, pp. 551-3, Oct. 1985.
[3] Beer, M., “Programmable worksheet (VISICALC),” Computer Age, no. 9, pp. 35-8, Aug. 1980.
[4] Roberts, D., “More than just a spreadsheet,” PC User, pp. 34-6, May 1983. (Lotus 1-2-3)
[5] Loggins, R., “Excel (spreadsheet software),” Business Computer Systems, vol. 4, no. 11, pp. 85-91, Nov. 1985.
[6] OLE Automation Programmer’s Reference: Creating Programmable 32-bit Applications, Microsoft Press, Redmond, Wash., c1996.
One of the early papers in conceptual design was focused toward the Aerospace industry. The Web-site reference is most useful, as it has a number of publications on line:
[7] M.J. Buckley et al., “Design Sheet: An Environment for Facilitating Flexible Trade Studies During Conceptual Design,” 1992 Aerospace Design Conference, Irvine, California, February 3-6, 1992.
[8] “http://www.rpal.rockwell.com/design-sheet/”, Rockwell Palo Alto Laboratory, Palo Alto, CA.
[9] D. Serrano, “Constraint Management in Conceptual Design,” Ph.D. Dissertation, Massachusetts Institute of Technology, Dept. of Mechanical Engineering, Cambridge, MA, 1987.
The idea of using a macromodeling approach was somewhat inspired by the work of Paul Landman and Jan Rabaey. The following 4 references all relate to Landman’s work in black-box modeling of digital systems. All are well written; the thesis provides the best overview of the entire body of work:
[10] Paul Landman, “Low-Power Architectural Design Methodologies,” Ph.D. Dissertation, UC Berkeley, August 1994.
[11] Paul Landman and Jan Rabaey, “Activity-Sensitive Architectural Power Analysis for the Control Path,” 1995 International Symposium on Low Power Design, Dana Point, CA, pp. 93-98, April 23-26, 1995.
[12] Paul Landman and Jan Rabaey, “Black-Box Capacitance Models for Architectural Power Analysis,” 1994 International Workshop on Low Power, pp. 165-170, April 1994.
[13] Paul Landman and Jan Rabaey, “Architectural Power Analysis: The Dual Bit Type Method,” IEEE Transactions on VLSI Systems, June 1995.
[14] C. Svensson et al., “Low Power Circuit Techniques,” in Low Power Design Methodologies, pp. 37-64, Kluwer Academic Publishers, Boston, MA, 1996.
[15] Vivek Tiwari et al., “Power Analysis of Embedded Software,” IEEE Transactions on VLSI Systems, pp. 437-445, December 1994.
[16] A. Salz and M. Horowitz, “IRSIM: An Incremental MOS Switch-Level Simulator,” Proceedings of the 26th Design Automation Conference, pp. 173-178, 1989.
[17] Farid Najm, “A Survey of Power Estimation Techniques in VLSI Circuits,” IEEE Transactions on VLSI Systems, vol. 2, no. 4, pp. 446-455, Dec. 1994.
[18] Farid Najm, “Towards a High-Level Power Estimation Capability,” 1995 International Symposium on Low Power Design, Dana Point, CA, pp. 87-92, April 23-26, 1995. --- High level in this sense refers to RTL, but there is an interesting study of using entropy analysis.
[19] T. Xanthopoulos, Y. Yaoi, A. Chandrakasan, “Architectural Exploration Using Verilog-Based Power Estimation: A Case Study of the IDCT,” IEEE/ACM Design Automation Conference, pp. 415-420, June 1997.
[20] T. Xanthopoulos, A. P. Chandrakasan, “A Low-Power IDCT Macrocell for MPEG2 MP@ML Exploiting Data Distribution Properties for Minimal Activity,” IEEE Symposium on VLSI Circuits, pp. 38-39, Honolulu, June 1998.
[21] H. Mammen and W. Thronicke, “Object-Oriented Macromodelling of Analog Devices,” Proceedings of the 2nd International Conference on Concurrent Engineering and Electronic Design Automation, pp. 331-6, San Diego, CA, 1994.
[22] S. Jianfeng and R. Harjani, “Macromodeling of Analog Circuits for Hierarchical Circuit Design,” 1994 IEEE/ACM International Conference on Computer-Aided Design, Digest of Technical Papers, pp. 656-663, New York, NY, 1994.
[23] Paul Gray and Robert Meyer, Analysis and Design of Analog Integrated Circuits, Third Edition, John Wiley & Sons, Inc., New York, NY, 1993.
[24] L. Guerra, M. Potkonjak, J. Rabaey, “Divide-and-Conquer Techniques for Global Throughput Optimization,” VLSI Signal Processing, IX, 1996.
[25] F.-C. Wang, B. Richards and P.K. Wright, “A Generic Framework for the Integration of Mechanical and Electrical CAD Tools for Concurrent Design of Consumer Electronic Products,” Proceedings of the ASME 21st Design Automation Conference, vol. 1, pp. 17-24, June 1995.
[26] J. T. Buck, S. Ha, E. A. Lee, D.G. Messerschmitt, “Ptolemy: A Mixed Paradigm Simulation/Prototyping Platform,” Proc. of Speech Tech 1991, New York, NY, April 23-25, 1991.
[27] Ranjit Gharpurey and Robert Meyer, “Modeling and Analysis of Substrate Coupling in Integrated Circuits,” Proceedings of the IEEE 1995 Custom Integrated Circuits Conference, pp. 125-128, May 1995.
[28] M. R. Grimaila and J.P. de Gyvez, “A Macromodel Fault Generator for Cellular Neural Networks,” Proceedings of the Third IEEE International Workshop on Cellular Neural Networks and their Applications, pp. 369-374, Rome, Italy, Dec. 1994.
[29] C.X. Huang et al., “The Design and Implementation of PowerMill,” 1995 International Symposium on Low Power Design, pp. 105-108, Dana Point, CA, April 1995.
[30] Andrew Yang and I.L. Wemple, “Timing and Power Simulation for Deep Sub-micron ICs,” 1995 International Symposium on VLSI Technology, Systems and Applications, pp. 79-86, Taipei, Taiwan, June 1995.
[31] Paul Gray and Robert Meyer, “Future Directions in Silicon ICs for RF Personal Communications,” Proceedings, 1995 Custom Integrated Circuits Conference, pp. 83-90, May 1995.
[32] Marlene Wan, Yuji Ichikawa, David Lidsky, Jan Rabaey, “An Energy Conscious Methodology for Early Design Exploration of Heterogeneous DSPs,” Proceedings of the Custom Integrated Circuits Conference, Santa Clara, CA, USA, May 1998.
Papers and sites dealing with WWW and System Design Environments:
[33] Mario Silva and Randy Katz, “The Case for Design Using the World Wide Web,” 32nd Design Automation Conference, pp. 579-585, July 1995. --- One of the early works espousing CAD on the WWW. Proposes a nice method for transferring files using active messages and the SMTP protocol.
[34] Tim Berners-Lee et al., “The World Wide Web,” Communications of the ACM, 37(8):76-82, 1994.
[35] Tim Berners-Lee et al., http://www.w3.org/DesignIssues/Uses.html
[36] Tim Berners-Lee et al., http://www.w3.org/1998/Potential.html
[37] http://apsara.mit.edu/WebTop/
[38] D. Saha, A. P. Chandrakasan, “Web-based Distributed VLSI Design,” VLSI Design ’98 Conference, pp. 449-454, Madras, India, January 1998.
[39] Downloaded and customized from a WWW site which has since been removed.
[40] Craig Hunt, “TCP/IP Network Administration,” O’Reilly & Associates, Inc., Sebastopol, CA, August 1992.
[41] Larry Wall and Randal Schwartz, “Programming Perl,” O’Reilly & Associates, Inc., Sebastopol, CA, January 1991.
[42] NCSA, http://hoohoo.ncsa.uiuc.edu/cgi/, National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, IL, USA.
[43] The VELA Project, http://www.cbl.ncsu.edu/vela/, North Carolina State University, NC, USA.
[44] Donald Norman, “The Design of Everyday Things,” Doubleday Publishing Group, Inc., New York, New York, 1988.
Papers on Proxies and Design Agents:
[45] Ole Bentz et al., “Information-based Design Environment,” IEEE VLSI Signal Processing VIII, pp. 237-246, Nov. 1995.
[46] Ole Bentz, J. Rabaey and David Lidsky, “A Dynamic Design Estimation and Exploration Environment,” Proceedings of the Design Automation Conference, IEEE Press, June 1997.
[47] Francis Chan et al., “WELD - An Environment for Web-Based Electronic Design,” ACM Design Automation Conference, San Francisco, CA, 1998.
[48] O. Olateju, “High-Level Block Editor,” SUPERB Technical Reports, UC Berkeley, CA, 1996.
Object-oriented design:
[49] James Rumbaugh et al., “Object-Oriented Modeling and Design,” Prentice Hall, Englewood Cliffs, New Jersey, 1991.
[50] Grady Booch, “Object-Oriented Analysis and Design,” The Benjamin/Cummings Publishing Company, Inc., Redwood City, California, 1994.
[51] Bing Wang et al., “MOSFET Thermal Noise Modeling for Analog Integrated Circuits,” IEEE Journal of Solid-State Circuits, vol. 29, no. 7, pp. 833-835, July 1994.
[52] “IC Design on the WWW,” IEEE Spectrum, no. 6, pp. 45-49, NY, NY, June 1998.
Complex Systems used as design drivers and examples:
[53] Sam Sheng et al., “A Portable Multimedia Terminal,” IEEE Communications Magazine, pp. 64-75, December 1992.
[54] A. Chandrakasan et al., “A Low-Power Chipset for Portable Multimedia Applications,” ISSCC, March 1994.
[55] Grove, Andy, “Only the Paranoid Survive,” Bantam Doubleday Books, New York, New York, 1996.
[56] D. Lidsky, J. Rabaey, “The Conceptual-Level Design Approach to Complex Systems,” The Journal of VLSI Signal Processing, no. 18, pp. 11-24, 1998.
[57] D. Lidsky, J. Rabaey, “Early Power Exploration - a World Wide Web Application,” Proceedings of the Design Automation Conference, pp. 27-32, IEEE Press, June 1996.
[58] D. Lidsky, J. Rabaey, “Low-Power Design of Memory Intensive Functions,” Proceedings from the IEEE Symposium on Low-Power Electronics, pp. 16-17, October 1994.