Integrating Case-Based, Analogy-Based, and Parameter-Based Estimation via Agile COCOMO II
The COCOMO II Suite of Software Cost Estimation Models
Transcript of The COCOMO II Suite of Software Cost Estimation Models
University of Southern California, Center for Software Engineering (CSE), USC
©USC-CSSE, 11/8/06
The COCOMO II Suite of Software Cost Estimation Models
Barry Boehm, USC
COCOMO/SSCM Forum 21 Tutorial
November 8, 2006
[email protected], http://csse.usc.edu/research/cocomosuite
Thanks to USC-CSSE Affiliates (33)
• Commercial Industry (10)
– Cost Xpert Group, Galorath, Group Systems, IBM, Intelligent Systems, Microsoft, Motorola, Price Systems, Softstar Systems, Sun
• Aerospace Industry (8)
– BAE Systems, Boeing, General Dynamics, Lockheed Martin, Northrop Grumman (2), Raytheon, SAIC
• Government (6)
– FAA, NASA-Ames, NSF, US Army Research Labs, US Army TACOM, USAF Cost Center
• FFRDCs and Consortia (6)
– Aerospace, FC-MD, IDA, JPL, SEI, SPC
• International (3)
– Institute of Software (Chinese Academy of Sciences), EASE (Japan), Samsung
USC-CSSE Affiliates Program
• Provides priorities for, and access to, USC-CSSE research
– Scalable spiral processes, cost/schedule/quality models, requirements groupware, architecting and re-engineering tools, value-based software engineering methods
– Experience in application in DoD, NASA, industry
– Affiliate community events
• 14th Annual Research Review and Executive Workshop, USC Campus, February 12-15, 2007
• 11th Annual Ground Systems Architecture Workshop (with Aerospace Corp.), Manhattan Beach, CA, March 26-29, 2007
• 22nd International COCOMO/Systems and Software Cost Estimation Forum, USC Campus, October 23-26, 2007
• Synergetic with USC distance education programs
– MS, Systems Architecting and Engineering
– MS / Computer Science, Software Engineering
Outline
• COCOMO II Overview
– Motivation and Context
– Model Form and Parameters
– Calibration and Accuracy
• Overview of Emerging Extensions
– COTS Integration (COCOTS)
– Quality: Delivered Defect Density (COQUALMO)
– Phase Distributions (COPSEMO)
– Rapid Application Development Schedule (CORADMO)
– Productivity Improvement (COPROMO)
– Product Line Investment (COPLIMO)
– Systems Engineering (COSYSMO)
– System of Systems Integration (COSOSIMO)
– COCOMO II Security Extensions (COSECMO)
– Network Information Protection (CONIPMO)
COCOMO Baseline Overview
[Diagram] Inputs to COCOMO: software product size estimate; software product, process, computer, and personnel attributes; software reuse, maintenance, and increment parameters; and the software organization's project data, used to recalibrate COCOMO to the organization's data. Outputs: software development and maintenance cost and schedule estimates, with cost and schedule distributed by phase, activity, and increment.
COCOMO II Book Table of Contents
Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000
1. Introduction; 2. Model Definition; 3. Application Examples; 4. Calibration; 5. Emerging Extensions; 6. Future Trends
Appendices: Assumptions, Data Forms, User's Manual, CD Content
CD: video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms
Need to ReEngineer COCOMO 81
• New software processes
• New sizing phenomena
• New reuse phenomena
• Need to make decisions based on incomplete information
COCOMO II Model Stages
[Figure: the "cone of uncertainty": relative software size range versus phases and milestones]
• Phases and milestones: Feasibility; Concept of Operation; Plans and Rqts.; Rqts. Spec.; Product Design; Product Design Spec.; Detail Design; Detail Design Spec.; Devel. and Test; Accepted Software
• The relative size range narrows from 0.25x-4x at Feasibility to 0.5x-2x for Applications Composition (3 parameters), 0.67x-1.5x for Early Design (13 parameters), and 0.8x-1.25x for Post-Architecture (23 parameters)
Outline
• COCOMO II Overview
– Motivation and Context
– Model Form and Parameters
– Calibration and Accuracy
• Overview of Emerging Extensions
– COTS Integration (COCOTS)
– Quality: Delivered Defect Density (COQUALMO)
– Phase Distributions (COPSEMO)
– Rapid Application Development Schedule (CORADMO)
– Productivity Improvement (COPROMO)
– Product Line Investment (COPLIMO)
– Systems Engineering (COSYSMO)
– System of Systems Integration (COSOSIMO)
– COCOMO II Security Extensions (COSECMO)
– Network Information Protection (CONIPMO)
Early Design and Post-Architecture Model
[Diagram] Size, Environment, and Process feed the model: scale factors and effort multipliers yield effort, and effort yields schedule.
• Environment: Product, Platform, People, Project factors (effort multipliers)
• Size: nonlinear reuse and volatility effects
• Process: Constraint, Risk/Architecture, Team, Maturity factors (scale factors)
New Scaling Exponent Approach
• Nominal person-months = A × (Size)^B
• B = 0.91 + 0.01 × Σ(scale factor ratings)
– B ranges from 0.91 to 1.23
– 5 scale factors; 6 rating levels each
• Scale factors:
– Precedentedness (PREC)
– Development flexibility (FLEX)
– Architecture/risk resolution (RESL)
– Team cohesion (TEAM)
– Process maturity (PMAT, derived from SEI CMM)
Project Scale Factors
PM_estimated = 2.94 × (Size)^B × Π(EM_i), where B = 0.91 + 0.01 × Σ(SF_i)

Scale factor ratings (W_i), from Very Low to Extra High:
• PREC: thoroughly unprecedented / largely unprecedented / somewhat unprecedented / generally familiar / largely familiar / thoroughly familiar
• FLEX: rigorous / occasional relaxation / some relaxation / general conformity / some conformity / general goals
• RESL: little (20%) / some (40%) / often (60%) / generally (75%) / mostly (90%) / full (100%)
• TEAM: very difficult interactions / some difficult interactions / basically cooperative interactions / largely cooperative / highly cooperative / seamless interactions
• PMAT: weighted sum of 18 KPA achievement levels
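The effort equation above can be sketched in a few lines. This is a hypothetical illustration, not a calibrated estimate: the scale factor ratings and effort multiplier values below are made-up inputs, and `effort_pm` is a name chosen here for illustration.

```python
from math import prod

# COCOMO II.2000 effort equation: PM = A * Size^B * product(EM_i),
# with A = 2.94 and B = 0.91 + 0.01 * (sum of the five scale factor ratings).
A = 2.94

def effort_pm(size_ksloc, scale_factor_ratings, effort_multipliers=(1.0,)):
    """Estimated person-months for a project of size_ksloc KSLOC."""
    b = 0.91 + 0.01 * sum(scale_factor_ratings)
    return A * size_ksloc ** b * prod(effort_multipliers)

# Hypothetical project: scale factor ratings summing to 9.0 give B = 1.0,
# so a 10-KSLOC project with all-nominal multipliers costs 2.94 * 10 PM.
pm = effort_pm(10, [2.0, 2.0, 2.0, 2.0, 1.0])
```

Higher scale factor sums raise B above 1.0, capturing the diseconomy of scale the model form implies.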
Nonlinear Reuse Effects
[Figure: relative cost versus amount modified, from data on 2954 NASA modules (Selby, 1988)]
Even unmodified reuse carries a startup cost (relative cost about 0.046 at 0% modified); observed relative cost rises to roughly 0.55 at 25% modified and 0.70 at 50% modified, reaching 1.0 well before full modification, far above the usual linear assumption.
Reuse and Reengineering Effects
• Add Assessment & Assimilation increment (AA)
– Similar to conversion planning increment
• Add Software Understanding increment (SU)
– To cover nonlinear software understanding effects
– Coupled with software unfamiliarity level (UNFM)
– Apply only if reused software is modified
• Results in revised Equivalent Source Lines of Code (ESLOC)
– AAF = 0.4(DM) + 0.3(CM) + 0.3(IM), where DM, CM, IM are the percentages of design, code, and integration modified
– ESLOC = ASLOC × [AA + AAF(1 + 0.02 × SU × UNFM)] / 100, for AAF ≤ 50
– ESLOC = ASLOC × [AA + AAF + SU × UNFM] / 100, for AAF > 50
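The ESLOC adjustment follows directly from the formulas above (percentage form, as in the COCOMO II definitions). The sample component at the bottom is hypothetical, and `esloc` is a name chosen here for illustration.

```python
def esloc(asloc, dm, cm, im, aa=0.0, su=0.0, unfm=0.0):
    """Equivalent SLOC for reused software (COCOMO II reuse model).

    dm, cm, im: % of design, code, and integration modified (0-100)
    aa:   assessment & assimilation increment (0-8)
    su:   software understanding increment (10-50, from the rating table)
    unfm: unfamiliarity level (0.0 = completely familiar, 1.0 = unfamiliar)
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im   # adaptation adjustment factor, in %
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return asloc * aam

# Hypothetical component: 10,000 ASLOC with 10% modified throughout,
# AA = 2, SU = 30, half-unfamiliar staff -> 1,500 equivalent SLOC.
equiv = esloc(10000, 10, 10, 10, aa=2, su=30, unfm=0.5)
```

Note how SU and UNFM only penalize modified code below the AAF = 50 breakpoint; beyond it the understanding increment is added in full.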
Software Understanding Rating / Increment
• Structure: Very Low: very low cohesion, high coupling, spaghetti code; Low: moderately low cohesion, high coupling; Nominal: reasonably well-structured, some weak areas; High: high cohesion, low coupling; Very High: strong modularity, information hiding in data/control structures
• Application Clarity: Very Low: no match between program and application world views; Low: some correlation between program and application; Nominal: moderate correlation between program and application; High: good correlation between program and application; Very High: clear match between program and application world views
• Self-Descriptiveness: Very Low: obscure code; documentation missing, obscure, or obsolete; Low: some code commentary and headers, some useful documentation; Nominal: moderate level of code commentary, headers, documentation; High: good code commentary and headers, useful documentation, some weak areas; Very High: self-descriptive code; documentation up-to-date, well-organized, with design rationale
• SU increment to ESLOC: Very Low: 50; Low: 40; Nominal: 30; High: 20; Very High: 10
Other Major COCOMO II Changes
• Range versus point estimates
• Requirements volatility (evolution) included in size
• Multiplicative cost driver changes
– Product CDs
– Platform CDs
– Personnel CDs
– Project CDs
• Maintenance model includes SU, UNFM factors from reuse model
– Applied to the subset of legacy code undergoing change
COCOMO II Estimation Accuracy
Percentage of sample projects within 30% of actuals, without and with calibration to the data source:

Model:       COCOMO 81   COCOMO II.1997   COCOMO II.2000
# Projects:  63          83               161
Effort:      81%         52% / 64%        75% / 80%
Schedule:    61%         62% / 72%        65% / 81%
COCOMO II.2000 Productivity Ranges
[Bar chart: productivity range (highest-rating/lowest-rating effort ratio, roughly 1.0 to 2.4) for each driver, in decreasing order:]
Product Complexity (CPLX), Analyst Capability (ACAP), Programmer Capability (PCAP), Time Constraint (TIME), Personnel Continuity (PCON), Required Software Reliability (RELY), Documentation Match to Life Cycle Needs (DOCU), Multi-Site Development (SITE), Applications Experience (AEXP), Platform Volatility (PVOL), Use of Software Tools (TOOL), Storage Constraint (STOR), Process Maturity (PMAT), Language and Tools Experience (LTEX), Required Development Schedule (SCED), Data Base Size (DATA), Platform Experience (PEXP), Architecture and Risk Resolution (RESL), Precedentedness (PREC), Develop for Reuse (RUSE), Team Cohesion (TEAM), Development Flexibility (FLEX).
Scale factor ranges computed at 10, 100, and 1000 KSLOC.
Outline
• COCOMO II Overview
– Motivation and Context
– Model Form and Parameters
– Calibration and Accuracy
• Overview of Emerging Extensions
– COTS Integration (COCOTS)
– Quality: Delivered Defect Density (COQUALMO)
– Phase Distributions (COPSEMO)
– Rapid Application Development Schedule (CORADMO)
– Productivity Improvement (COPROMO)
– Product Line Investment (COPLIMO)
– Systems Engineering (COSYSMO)
– System of Systems Integration (COSOSIMO)
– COCOMO II Security Extensions (COSECMO)
– Network Information Protection (CONIPMO)
Status of Models
[Table: progress of each model through the modeling methodology steps (literature review, behavioral analysis, significant variables, Delphi, and data/Bayesian calibration), with calibration data points:]
• COCOMO II: >200 data points
• COCOTS: 20 data points
• COQUALMO (defects in / defects out): 6 / 6 data points
• CORADMO: 16 data points
• COSYSMO: 60 data points
COCOMO vs. COCOTS Cost Sources
[Figure: staffing level versus time, contrasting COCOMO development effort with COCOTS cost sources]
Current COQUALMO System
[Diagram] Inputs: software size estimate; software platform, project, product, and personnel attributes; defect removal profile levels (automation, reviews, testing). COQUALMO couples a Defect Introduction Model and a Defect Removal Model to COCOMO II. Outputs: software development effort, cost, and schedule estimate; number of residual defects and defect density per unit of size.
Defect Removal Rating Scales (COCOMO II p. 263)
• Automated Analysis: Very Low: simple compiler syntax checking; Low: basic compiler capabilities; Nominal: compiler extension, basic req. and design consistency; High: intermediate-level module checking, simple req./design; Very High: more elaborate req./design checking, basic distributed processing; Extra High: formalized specification and verification, advanced distributed processing
• Peer Reviews: Very Low: no peer review; Low: ad-hoc informal walk-through; Nominal: well-defined preparation, review, minimal follow-up; High: formal review roles, well-trained people, and basic checklist; Very High: root cause analysis, formal follow-up, using historical data; Extra High: extensive review checklist, statistical control
• Execution Testing and Tools: Very Low: no testing; Low: ad-hoc test and debug; Nominal: basic test, test criteria based on checklist; High: well-defined test sequence and basic test coverage tool system; Very High: more advanced test tools, test preparation, distributed monitoring; Extra High: highly advanced tools, model-based test
Defect Removal Estimates: Nominal Defect Introduction Rates
[Bar chart: delivered defects/KSLOC versus composite defect removal rating]
VL: 60; Low: 28.5; Nominal: 14.3; High: 7.5; VH: 3.5; XH: 1.6
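Reading the chart's figures into a lookup gives a rough residual-defect estimate. This is a sketch: the per-KSLOC densities are the nominal values quoted above, not a project-specific calibration, and `residual_defects` is a name chosen here for illustration.

```python
# Nominal delivered defect densities (defects/KSLOC) by composite
# defect removal rating, from the COQUALMO chart above.
DELIVERED_DEFECT_DENSITY = {
    "VL": 60.0, "Low": 28.5, "Nom": 14.3, "High": 7.5, "VH": 3.5, "XH": 1.6,
}

def residual_defects(size_ksloc, removal_rating):
    """Estimated number of defects remaining at delivery."""
    return size_ksloc * DELIVERED_DEFECT_DENSITY[removal_rating]

# A 10-KSLOC project at Nominal removal: ~143 delivered defects;
# the same project at Extra High removal: ~16.
```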
Outline
• COCOMO II Overview
– Motivation and Context
– Model Form and Parameters
– Calibration and Accuracy
• Overview of Emerging Extensions
– COTS Integration (COCOTS)
– Quality: Delivered Defect Density (COQUALMO)
– Phase Distributions (COPSEMO)
– Rapid Application Development Schedule (CORADMO)
– Productivity Improvement (COPROMO)
– Product Line Investment (COPLIMO)
– Systems Engineering (COSYSMO)
– System of Systems Integration (COSOSIMO)
– COCOMO II Security Extensions (COSECMO)
– Network Information Protection (CONIPMO)
COCOMO II RAD Extension (CORADMO)
[Diagram] COCOMO II cost drivers (except SCED) feed COCOMO II, producing baseline effort and schedule; Phase Distributions (COPSEMO) turns these into effort and schedule by stage; the RAD Extension, driven by RVHL, DPRS, CLAB, RESL, PPOS, and RCAP (plus language level, experience, ...), produces RAD effort and schedule by phase.
Effect of RCAP on Cost, Schedule
[Chart: schedule in months (M) versus effort in person-months (PM), 0-50 PM; curves for M = 3.7 × (cube root of PM), M = 3 × (cube root of PM), and M = square root of PM, contrasting RCAP = XL with RCAP = XH]
COPROMO (Productivity) Model
• Uses COCOMO II model and extensions as assessment framework
– Well-calibrated to 161 projects for effort, schedule
– Subset of 106 1990's projects for current-practice baseline
– Extensions for Rapid Application Development formulated
• Determines impact of technology investments on model parameter settings
• Uses these in models to assess impact of technology investments on cost and schedule
– Effort used as a proxy for cost
The COPLIMO Model: Constructive Product Line Investment Model
• Based on COCOMO II software cost model
– Statistically calibrated to 161 projects, representing 18 diverse organizations
• Based on standard software reuse economic terms
– RCR: relative cost of reuse
– RCWR: relative cost of writing for reuse
• Avoids overestimation
– Avoids RCWR for non-reused components
– Adds life cycle cost savings
• Provides experience-based default parameter values
• Simple Excel spreadsheet model
– Easy to modify, extend, interoperate
COPLIMO Estimation Summary

Part I: Product Line Development Cost Estimation Summary
# of Products:               0      1      2      3      4      5
Effort (PM), No Reuse:       0    294    588    882   1176   1470
Effort (PM), Product Line:   0    444    589    735    881   1026
Product Line Savings:        0   -150     -1    147    295    444
ROI:                         0  -1.00  -0.01   0.98   1.97   2.96

Part II: Product Line Annualized Life Cycle Cost Estimation Summary
# of Products:               0      1      2      3      4      5
AMSIZE-P:                    0    8.1   16.2   24.2   32.3   40.4
AMSIZE-R:                    0    6.1    6.1    6.1    6.1    6.1
AMSIZE-A:                    0    6.1    7.7    9.3   11.0   12.6
Total Equiv. KSLOC:          0   20.2   29.9   39.6   49.3   59.1
Effort (AM) (×2.94):         0   59.4   88.0  116.5  145.1  173.7
5-year Life Cycle PM:        0  296.9  439.8  582.6  725.4  868.3
PM(N, 5)-R (+444):           0  740.9  883.7 1026.5 1169.4 1312.2
PM(N, 5)-NR:                 0  590.9 1181.9 1772.8 2363.8 2954.7
Product Line Savings (PM):   0 -149.9  298.2  746.3 1194.4 1642.5
ROI:                         0  -1.00   1.99   4.98   7.97  10.96
Devel. ROI:                  0  -1.00  -0.01   0.98   1.97   2.96
3-year Life Cycle:           0 -142.0  120.0  480.0

AMSIZE: Annually Maintained Software Size
[Charts: net development effort savings versus number of products in the product line, and net product line effort savings versus number of products for the Development, 3-year Life Cycle, and 5-year Life Cycle cases]
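The Part I arithmetic can be reproduced with a few lines. This is a sketch using that table's own per-product figures (294 PM per product without reuse, 444 PM for the first product-line product, roughly 145.5 PM for each subsequent one); those are illustrative COPLIMO outputs, not general constants, and `product_line_roi` is a name chosen here for illustration.

```python
def product_line_roi(n_products, pm_no_reuse=294.0, pm_first=444.0, pm_next=145.5):
    """Development-phase ROI, as in Part I of the summary above.

    pm_no_reuse: effort per product without reuse
    pm_first:    first product, including the product-line (RCWR) investment
    pm_next:     each subsequent product, built on the reused core
    """
    investment = pm_first - pm_no_reuse                     # 150 PM up front
    pl_effort = pm_first + pm_next * max(0, n_products - 1)
    savings = pm_no_reuse * n_products - pl_effort
    return savings / investment

# One product: ROI = -1.00 (pure investment); three products: ROI = 0.98,
# matching the Part I row.
```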
Outline
• COCOMO II Overview
– Motivation and Context
– Model Form and Parameters
– Calibration and Accuracy
• Overview of Emerging Extensions
– COTS Integration (COCOTS)
– Quality: Delivered Defect Density (COQUALMO)
– Phase Distributions (COPSEMO)
– Rapid Application Development Schedule (CORADMO)
– Productivity Improvement (COPROMO)
– Product Line Investment (COPLIMO)
– Systems Engineering (COSYSMO)
– System of Systems Integration (COSOSIMO)
– COCOMO II Security Extensions (COSECMO)
– Network Information Protection (CONIPMO)
Model Differences
• COCOMO II: software; development phases; 20+ years old; 200+ calibration points; 23 drivers; variable granularity; 3 anchor points; size is driven by SLOC
• COSYSMO: systems engineering; entire life cycle; 3 years old; 60 calibration points; 18 drivers; fixed granularity; no anchor points; size is driven by requirements, interfaces, etc.
COSYSMO Operational Concept
[Diagram] Size drivers (# requirements, # interfaces, # scenarios, # algorithms, plus a volatility factor) and effort multipliers (application factors: 8; team factors: 6; schedule driver) feed COSYSMO, which produces an effort estimate; calibration closes the loop. WBS guided by ISO/IEC 15288.
COSOSIMO Operational Concept
[Diagram] Size drivers (interface-related eKSLOC; number of logical interfaces at SoS level; number of operational scenarios; number of components) and exponential scale factors (integration simplicity, integration risk resolution, integration stability, component readiness, integration capability, integration processes) feed COSOSIMO, which produces the SoS definition and integration effort; calibration closes the loop.
Security Impact on Engineering Effort
• For software developers (being addressed by COSECMO):
– Source lines of code increases
– Effort to generate software increases
• Security functional requirements
• Security assurance requirements
– Effort to transition also increases
• More documentation
• Additional certification and accreditation costs
• For systems engineers (being addressed by CONIPMO):
– Effort to develop system increases
• Network defense requirements
• Network defense operational concepts
• Program protection requirements
• Anti-tamper implementation
– Effort to transition also increases
• DITSCAP and red teaming
Backup Charts
Purpose of COCOMO II
To help people reason about the cost and schedule implications of their software decisions.
Major Decision Situations Helped by COCOMO II
• Software investment decisions
– When to develop, reuse, or purchase
– What legacy software to modify or phase out
• Setting project budgets and schedules
• Negotiating cost/schedule/performance tradeoffs
• Making software risk management decisions
• Making software improvement decisions
– Reuse, tools, process maturity, outsourcing
Relations to MBASE*/Rational Anchor Point Milestones
[Diagram] The MBASE/Rational phases Inception (ending at LCO), Elaboration (LCA), Construction (IOC), and Transition are aligned against the waterfall phases (Rqts. at SRR, Product Design at PDR, System Development, Transition at SAT) and the COCOMO II stages (Application Composition, Early Design, Post-Architecture).
*MBASE: Model-Based (System) Architecting and Software Engineering
Post-Architecture EMs: Product
• Required Reliability (RELY): Very Low: slight inconvenience (0.82); Low: low, easily recoverable losses (0.92); Nominal: moderate, easily recoverable losses (1.00); High: high financial loss (1.10); Very High: risk to human life (1.26)
• Database Size (DATA): Low: DB bytes/Pgm SLOC < 10; Nominal: 10 ≤ D/P < 100; High: 100 ≤ D/P < 1000; Very High: D/P ≥ 1000
• Complexity (CPLX): see Complexity table
• Required Reuse (RUSE): Low: none; Nominal: across project; High: across program; Very High: across product line; Extra High: across multiple product lines
• Documentation Match to Lifecycle (DOCU): Very Low: many lifecycle needs uncovered; Low: some lifecycle needs uncovered; Nominal: right-sized to lifecycle needs; High: excessive for lifecycle needs; Very High: very excessive for lifecycle needs
Post-Architecture Complexity
[Table: complexity rating (Very Low to Extra High) by operation type; Nominal row shown here:]
• Control operations: mostly simple nesting; some intermodule control; decision tables; simple callbacks or message passing, including middleware-supported distributed processing
• Computational operations: use of standard math and statistical routines; basic matrix/vector operations
• Device-dependent operations: I/O processing includes device selection, status checking, and error processing
• Data management operations: multi-file input and single-file output; simple structural changes, simple edits; complex COTS-DB queries, updates
• User interface management operations: simple use of widget set
Post-Architecture EMs: Platform
• Execution Time Constraint (TIME): Nominal: <50% use of available execution time; High: 70%; Very High: 85%; Extra High: 95%
• Main Storage Constraint (STOR): Nominal: <50% use of available storage; High: 70%; Very High: 85%; Extra High: 95%
• Platform Volatility (PVOL): Low: major change every 12 mo., minor every 1 mo.; Nominal: major 6 mo., minor 2 wk.; High: major 2 mo., minor 1 wk.; Very High: major 2 wk., minor 2 days
Post-Architecture EMs: Personnel (Very Low to Very High)
• Analyst Capability (ACAP): 15th / 35th / 55th / 75th / 90th percentile
• Programmer Capability (PCAP): 15th / 35th / 55th / 75th / 90th percentile
• Personnel Continuity (PCON): 48% / 24% / 12% / 6% / 3% per year
• Application Experience (AEXP): <2 months / 6 months / 1 year / 3 years / 6 years
• Platform Experience (PEXP): <2 months / 6 months / 1 year / 3 years / 6 years
• Language and Tool Experience (LTEX): <2 months / 6 months / 1 year / 3 years / 6 years
Post-Architecture EMs: Project
• Use of Software Tools (TOOL), Very Low to Very High: edit, code, debug / simple front-end, back-end CASE, little integration / basic lifecycle tools, moderately integrated / strong, mature lifecycle tools, moderately integrated / strong, mature, proactive lifecycle tools, well integrated with processes, methods, reuse
• Multisite Development: Collocation (SITE), Very Low to Extra High: international / multi-city and multi-company / multi-city or multi-company / same city or metro area / same building or complex / fully collocated
• Multisite Development: Communications (SITE), Very Low to Extra High: some phone, mail / individual phone, FAX / narrowband email / wideband electronic communication / wideband electronic communication, occasional video conference / interactive multimedia
• Required Development Schedule (SCED), Very Low to Very High: 75% of nominal / 85% / 100% / 130% / 160%
Other Model Refinements
• Initial schedule estimation:
TDEV = [3.67 × (PM_NS)^(0.28 + 0.2 × (B - 0.91))] × (SCED% / 100)
where PM_NS is the estimated person-months excluding SCED multiplier effects
• Output ranges
– 80% confidence limits: 10% of the time each below Optimistic, above Pessimistic
– Reflect sources of uncertainty in model inputs
• Optimistic / Pessimistic estimates by stage (E: the point estimate):
– Application Composition: 0.50 E / 2.0 E
– Early Design: 0.67 E / 1.5 E
– Post-Architecture: 0.80 E / 1.25 E
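The schedule equation can be sketched as follows. The example inputs are hypothetical, and `tdev_months` is a name chosen here for illustration; as noted above, the effort fed in must exclude the SCED multiplier.

```python
def tdev_months(pm_ns, b, sced_percent=100.0):
    """COCOMO II initial schedule estimate (calendar months).

    pm_ns:        estimated person-months, excluding the SCED multiplier effect
    b:            scale exponent (0.91 to 1.23)
    sced_percent: required schedule as a percentage of nominal
    """
    exponent = 0.28 + 0.2 * (b - 0.91)
    return 3.67 * pm_ns ** exponent * sced_percent / 100.0

# Example: 100 PM at B = 1.01 gives 3.67 * 100^0.30, about 14.6 months
# nominal; an 85%-of-nominal SCED compresses that proportionally.
months = tdev_months(100, 1.01)
```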
Early Design vs. Post-Arch EMs
Each Early Design cost driver combines several Post-Architecture cost drivers:
• Product Reliability and Complexity (RCPX): RELY, DATA, CPLX, DOCU
• Required Reuse (RUSE): RUSE
• Platform Difficulty (PDIF): TIME, STOR, PVOL
• Personnel Capability (PERS): ACAP, PCAP, PCON
• Personnel Experience (PREX): AEXP, PEXP, LTEX
• Facilities (FCIL): TOOL, SITE
• Schedule (SCED): SCED
Outline
• COCOMO II Overview
– Motivation and Context
– Model Form and Parameters
– Calibration and Accuracy
• Overview of Emerging Extensions
– COTS Integration (COCOTS)
– Quality: Delivered Defect Density (COQUALMO)
– Phase Distributions (COPSEMO)
– Rapid Application Development Schedule (CORADMO)
– Productivity Improvement (COPROMO)
– Product Line Investment (COPLIMO)
– Systems Engineering (COSYSMO)
– System of Systems Integration (COSOSIMO)
– Dependability ROI (iDAVE)
USC-CSE Modeling Methodology
Step 1: Analyze existing literature
Step 2: Perform behavioral analyses
Step 3: Identify relative significance
Step 4: Perform expert-judgment Delphi assessment; formulate a-priori model
Step 5: Gather project data
Step 6: Determine Bayesian a-posteriori model
Step 7: Gather more data; refine model
(Concurrency and feedback among steps implied.)
Results of Bayesian Update: Using Prior and Sampling Information (Step 6)
[Chart: productivity range (highest rating / lowest rating) for Language and Tool Experience (LTEX) at each step:]
• Literature, behavioral analysis: 1.45
• A-priori experts' Delphi: 1.51
• Noisy data analysis: 1.06
• A-posteriori Bayesian update: 1.41
COCOMO Model Comparisons
• Size: COCOMO: Delivered Source Instructions (DSI) or Source Lines of Code (SLOC); Ada COCOMO: DSI or SLOC; II/Application Composition: Application Points; II/Early Design: Function Points (FP) and language, or SLOC; II/Post-Architecture: FP and language, or SLOC
• Reuse: COCOMO: equivalent SLOC = linear f(DM, CM, IM); Ada COCOMO: same; II/Application Composition: implicit in model; II/Early Design and Post-Architecture: equivalent SLOC = nonlinear f(AA, SU, UNFM, DM, CM, IM)
• Rqts. change: COCOMO: Requirements Volatility rating (RVOL); Ada COCOMO: RVOL rating; II/Application Composition: implicit in model; II/Early Design and Post-Architecture: change %: RQEV
• Maintenance: COCOMO: Annual Change Traffic (ACT) = % added + % modified; Ada COCOMO: ACT; II/Application Composition: Object Point ACT; II/Early Design and Post-Architecture: (ACT, SU, UNFM)
• Scale (b) in MM_NOM = a(Size)^b: COCOMO: Organic 1.05, Semidetached 1.12, Embedded 1.20; Ada COCOMO: Embedded 1.04-1.24, depending on degree of early risk elimination, solid architecture, stable requirements, Ada process maturity; II/Application Composition: 1.0; II/Early Design and Post-Architecture: 0.91-1.23, depending on degree of precedentedness, conformity, early architecture and risk resolution, team cohesion, process maturity (SEI)
• Product cost drivers: COCOMO: RELY, DATA, CPLX; Ada COCOMO: RELY*, DATA, CPLX*, RUSE; II/Application Composition: none; II/Early Design: RCPX*, RUSE*; II/Post-Architecture: RELY, DATA, DOCU*, CPLX, RUSE*
• Platform cost drivers: COCOMO: TIME, STOR, VIRT, TURN; Ada COCOMO: TIME, STOR, VMVH, VMVT, TURN; II/Application Composition: none; II/Early Design: platform difficulty PDIF*; II/Post-Architecture: TIME, STOR, PVOL (=VIRT)
• Personnel cost drivers: COCOMO: ACAP, AEXP, PCAP, VEXP, LEXP; Ada COCOMO: ACAP*, AEXP, PCAP*, VEXP, LEXP*; II/Application Composition: none; II/Early Design: personnel capability and experience PERS*, PREX*; II/Post-Architecture: ACAP*, AEXP, PCAP*, PEXP*, LTEX*, PCON*
• Project cost drivers: COCOMO: MODP, TOOL, SCED; Ada COCOMO: MODP*, TOOL*, SCED, SECU; II/Application Composition: none; II/Early Design: SCED, FCIL*; II/Post-Architecture: TOOL*, SCED, SITE*
* Different multipliers or different rating scale
COCOMO II Experience Factory: I
[Diagram] System objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0, which produces cost, schedule, and risk estimates; if these are not OK, rescope and iterate.
COCOMO II Experience Factory: II
[Diagram] Adds project execution to the loop: once cost, schedule, and risks are OK, execute the project to the next milestone against milestone plans and resources; compare milestone results with milestone expectations; if not OK, revise milestones, plans, and resources (yielding revised expectations) and iterate until done.
COCOMO II Experience Factory: III
[Diagram] Adds calibration to the previous loop: accumulate COCOMO 2.0 calibration data from milestone results and recalibrate COCOMO 2.0.
COCOMO II Experience Factory: IV
[Diagram] Adds improvement to the previous loop: evaluate corporate software improvement strategies against cost, schedule, and quality drivers, yielding improved corporate parameters that feed back into COCOMO 2.0.
New Glue Code Submodel Results
• Current calibration looking reasonably good
– Excluding projects with very large or very small amounts of glue code (effort prediction):
• [0.5-100 KLOC]: Pred(.30) = 9/17 = 53%
• [2-100 KLOC]: Pred(.30) = 8/13 = 62%
– For comparison, calibration results shown at ARR 2000:
• [0.1-390 KLOC]: Pred(.30) = 4/13 = 31%
• Propose to revisit large, small, anomalous projects
– A few follow-up questions on categories of code and effort
• Glue code vs. application code
• Glue code effort vs. other sources
Current Insights into Maintenance Phase Issues
Priority of activities by effort involved and/or criticality (S: spikes around refresh cycle anchor points; C: continuous)
• Higher: training (S, C); configuration management (C); operations support (C); integration analysis (S); requirements management (S, C)
• Medium: certification (S); market watch (C); distribution (S); vendor management (C); business case evaluation (S)
• Lower: administering COTS licenses (C)
RAD Context
• RAD a critical competitive strategy
– Market window; pace of change
• Non-RAD COCOMO II overestimates RAD schedules
– Need opportunity-tree cost-schedule adjustment
– Cube root model inappropriate for small RAD projects
• COCOMO II: Months = 3.7 × (cube root of PM)
RAD Opportunity Tree
• Eliminating Tasks
– Development process reengineering - DPRS
– Reusing assets - RVHL
– Applications generation - RVHL
– Schedule as Independent Variable process
• Reducing Time Per Task
– Tools and automation - O
– Work streamlining (80-20) - O
– Increasing parallelism - RESL
• Reducing Risks of Single-Point Failures
– Reducing failures - RESL
– Reducing their effects - RESL
• Reducing Backtracking
– Early error elimination - RESL
– Process anchor points - RESL
– Improving process maturity - O
– Collaboration technology - CLAB
• Activity Network Streamlining
– Minimizing task dependencies - DPRS
– Avoiding high fan-in, fan-out - DPRS
– Reducing task variance - DPRS
– Removing tasks from critical path - DPRS
• Increasing Effective Workweek
– 24x7 development - PPOS
– Nightly builds, testing - PPOS
– Weekend warriors - PPOS
• Better People and Incentives
– RAD capability and experience - RCAP
O: covered by RAD
59
RCAP: RAD Capability of Personnel

FACTOR    | XL    | VL    | L     | N     | H     | VH    | XH
PERS-R    | 10%   | 25%   | 40%   | 55%   | 70%   | 85%   | 95%
PREX-R    | 2 mo  | 4 mo  | 6 mo  | 1 yr  | 3 yrs | 6 yrs | 10 yrs
I, E, C multipliers:
PM        | 1.20  | 1.13  | 1.06  | 1.00  | 0.93  | 0.86  | 0.80
M         | 1.40  | 1.25  | 1.12  | 1.00  | 0.82  | 0.68  | 0.56
P = PM/M  | 0.86  | 0.90  | 0.95  | 1.00  | 1.13  | 1.26  | 1.43
PERS-R is the Early Design Capability rating, adjusted to reflect the performers’ capability to rapidly assimilate new concepts and material, and to rapidly adapt to change.
PREX-R is the Early Design Personnel Experience rating, adjusted to reflect the performers’ experience with RAD languages, tools, components, and COTS integration.
60
RCAP Example
• RCAP = Nominal: PM = 25, M = 5, P = 5
  – The square root law: 5 people for 5 months: 25 PM
• RCAP = XH: PM = 20, M = 2.8, P = 7.1
  – A very good team can put on 7 people and finish in 2.8 months: 20 PM
• RCAP = XL: PM = 30, M = 7, P = 4.3
  – Trying to do RAD with an unqualified team makes them less efficient (30 PM) and pushes the schedule closer to the cube root law (but not quite: 9.3 months > 7 months)
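The example follows directly from the multiplier table on the preceding slide. A short script reproducing it, with the staff level P derived as effort divided by schedule:

```python
# RCAP effort (PM) and schedule (M) multipliers, taken from the
# RCAP rating table on the previous slide.
RCAP_MULTIPLIERS = {          # rating: (PM multiplier, M multiplier)
    "XL": (1.20, 1.40), "VL": (1.13, 1.25), "L": (1.06, 1.12),
    "N":  (1.00, 1.00), "H":  (0.93, 0.82), "VH": (0.86, 0.68),
    "XH": (0.80, 0.56),
}

def rcap_adjust(pm_nominal, months_nominal, rating):
    """Apply RCAP multipliers to a nominal effort/schedule estimate;
    returns (effort in PM, schedule in months, average staff P)."""
    pm_mult, m_mult = RCAP_MULTIPLIERS[rating]
    pm = pm_nominal * pm_mult
    months = months_nominal * m_mult
    return pm, months, pm / months

# The slide's example: nominal 25 PM over 5 months (square root law)
for rating in ("N", "XH", "XL"):
    pm, months, people = rcap_adjust(25, 5, rating)
    # Matches the slide: XH -> 20 PM, 2.8 months, 7.1 people
    print(f"{rating}: {pm:.0f} PM, {months:.1f} months, {people:.1f} people")
```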
61
COPLIMO Inputs and Outputs
Inputs, for current set of similar products:
• Average product size, COCOMO II cost drivers
• Percent mission-unique, reused-with-mods, black-box reuse
• RCR, RCWR factors, annual change traffic
Outputs, as functions of # products, # years in life cycle:
• Non-product line effort
• Product line investment, effort
• Product line savings, ROI
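The core product-line tradeoff behind these outputs can be sketched as follows. This is a simplified illustration, not the calibrated COPLIMO equations: the fractions, the RCWR (relative cost of writing for reuse) default of 1.7, and the RCR (relative cost of reuse) default of 0.1 are invented here, and the adapted and black-box-reused categories are treated identically, which the published model does not do.

```python
def product_line_effort(e_single, n_products,
                        f_unique, f_adapted, f_reused,
                        rcwr=1.7, rcr=0.1):
    """Simplified COPLIMO-style comparison (illustrative only).
    f_unique / f_adapted / f_reused: portions of an average product
    that are mission-unique, reused-with-mods, and black-box reused.
    Returns (product-line effort, savings vs. no product line, ROI)."""
    assert abs(f_unique + f_adapted + f_reused - 1.0) < 1e-9
    # Without a product line: every product is built from scratch
    non_pl = n_products * e_single
    # First PL product: reusable portions cost extra (RCWR > 1)
    first = e_single * (f_unique + (f_adapted + f_reused) * rcwr)
    # Later products: unique part at full cost, reused parts at RCR
    later = e_single * (f_unique + (f_adapted + f_reused) * rcr)
    pl = first + (n_products - 1) * later
    savings = non_pl - pl
    investment = first - e_single      # up-front product line investment
    return pl, savings, savings / investment

# 5 similar products of 100 PM each, 40% mission-unique code
pl, savings, roi = product_line_effort(100, 5, 0.4, 0.3, 0.3)
```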
62
4 Size Drivers
1. Number of System Requirements
2. Number of Major Interfaces
3. Number of Operational Scenarios
4. Number of Critical Algorithms
• Each weighted by complexity, volatility, and degree of reuse
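The weighting idea can be sketched as a simple aggregation: classify each counted item by difficulty and sum the weighted counts. The weights and counts below are invented for illustration; COSYSMO publishes calibrated weights for each driver and difficulty level.

```python
# Illustrative difficulty weights, NOT COSYSMO's calibrated values
WEIGHTS = {"easy": 0.5, "nominal": 1.0, "difficult": 5.0}

def weighted_size(counts):
    """Aggregate a weighted size from per-driver counts.
    counts: {driver: {"easy": n, "nominal": n, "difficult": n}}"""
    return sum(
        WEIGHTS[level] * n
        for by_level in counts.values()
        for level, n in by_level.items()
    )

# Hypothetical system with the four size drivers above
size = weighted_size({
    "requirements": {"easy": 20, "nominal": 50, "difficult": 5},
    "interfaces":   {"easy": 4,  "nominal": 6,  "difficult": 1},
    "scenarios":    {"easy": 2,  "nominal": 5,  "difficult": 2},
    "algorithms":   {"easy": 3,  "nominal": 4,  "difficult": 1},
})
```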
63
Number of System Requirements
This driver represents the number of requirements for the system-of-interest at a specific level of design. Requirements may be functional, performance, feature, or service-oriented in nature depending on the methodology used for specification. They may also be defined by the customer or contractor. System requirements can typically be quantified by counting the number of applicable “shall’s” or “will’s” in the system or marketing specification. Do not include a requirements expansion ratio – only provide a count for the requirements of the system-of-interest as defined by the system or marketing specification.
| Easy                        | Nominal                                  | Difficult                           |
| Well specified              | Loosely specified                        | Poorly specified                    |
| Traceable to source         | Can be traced to source with some effort | Hard to trace to source             |
| Simple to understand        | Takes some effort to understand          | Hard to understand                  |
| Little requirements overlap | Some overlap                             | High degree of requirements overlap |
| Familiar                    | Generally familiar                       | Unfamiliar                          |
| Good understanding of what’s needed to satisfy and verify requirements | General understanding of what’s needed to satisfy and verify requirements | Poor understanding of what’s needed to satisfy and verify requirements |
64
14 Cost Drivers

Application Factors (8)
1. Requirements understanding
2. Architecture complexity
3. Level of service requirements
4. Migration complexity
5. Technology maturity
6. Documentation match to life cycle needs
7. # and diversity of installations/platforms
8. # of recursive levels in the design
65
Level of service (KPP) requirements
This cost driver rates the difficulty and criticality of satisfying the ensemble of Key Performance Parameters (KPPs), such as security, safety, response time, interoperability, maintainability, and the other “ilities”.

| Viewpoint   | Very Low             | Low                       | Nominal                     | High                    | Very High                     |
| Difficulty  | Simple               | Low difficulty, coupling  | Moderately complex, coupled | Difficult, coupled KPPs | Very complex, tightly coupled |
| Criticality | Slight inconvenience | Easily recoverable losses | Some loss                   | High financial loss     | Risk to human life            |
66
14 Cost Drivers (cont.)

Team Factors (6)
1. Stakeholder team cohesion
2. Personnel/team capability
3. Personnel experience/continuity
4. Process maturity
5. Multisite coordination
6. Tool support
67
4. Rate Cost Drivers - Application
68
9. *Time Phase the Estimate – Overall Staffing
69
Proposed COSOSIMO Size Drivers
• Subsystem size of interface software, measured in effective KSLOC (eKSLOC)
• Number of components
• Number of major interfaces
• Number of operational scenarios
Each weighted by:
• Complexity
• Volatility
• Degree of COTS/reuse
70
Proposed COSOSIMO Scale Factors
• Integration risk resolution: risk identification and mitigation efforts
• Integration simplicity: architecture and performance issues
• Integration stability: how much change is expected
• Component readiness: how much prior testing has been conducted on the components
• Integration capability: people factor
• Integration processes: maturity level of processes and integration lab
71
Reasoning about the Value of Dependability – iDAVE
• iDAVE: Information Dependability Attribute Value Estimator
• Use the iDAVE model to estimate and track software dependability ROI
  – Help determine how much dependability is enough
  – Help analyze and select the most cost-effective combination of software dependability techniques
  – Use estimates as a basis for tracking performance
72
iDAVE Model Framework
Inputs:
• Time-phased information processing capabilities
• Project attributes
• Time-phased dependability investments
Estimating relationships:
• Cost estimating relationships (CERs): Cost = f(IP capabilities (size), project attributes)
• Dependability attribute estimating relationships (DERs): Di = gi(dependability investments, project attributes)
• Value estimating relationships (VERs): Vj = hj(IP capabilities, dependability levels Di)
Outputs:
• Time-phased cost
• Dependability attribute levels Di
• Value components Vj
• Return on Investment
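The framework's bottom line is a straightforward computation once the CERs and VERs have produced their estimates. A sketch, with hypothetical dollar figures:

```python
def idave_roi(cost, value_components):
    """ROI of a dependability investment: (total value - cost) / cost.
    `cost` comes from the CERs; `value_components` (the Vj) come from
    the VERs applied to the dependability levels Di.  Illustrative only."""
    total_value = sum(value_components)
    return (total_value - cost) / cost

# Hypothetical case: a $200K dependability investment that avoids
# $150K of downtime losses and preserves $120K of sales (in $K)
print(idave_roi(200, [150, 120]))  # -> 0.35
```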
73
Typical Value Estimating Relationships
[Chart: Value ($) vs. availability; production function shapes shown are High-Returns, Linear, and Diminishing Returns, each reaching full value at availability 1.0]
Revenue loss per hour of system downtime:
• Intel: $275K
• Cisco: $167K
• Dell: $83K
• Amazon.com: $27K
• E*Trade: $8K
• eBay: $3K
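A simple linear value estimating relationship can be built directly from these per-hour figures: expected annual loss is the expected downtime multiplied by the loss rate. The availability levels chosen below are illustrative; the $275K/hour figure is the slide's Intel value.

```python
def expected_downtime_loss(availability, loss_per_hour,
                           hours_per_year=24 * 365):
    """Expected annual revenue loss from downtime at a given
    availability level (simple linear VER, as in the slide's
    revenue-loss-per-hour figures)."""
    downtime_hours = (1.0 - availability) * hours_per_year
    return downtime_hours * loss_per_hour

# Illustrative comparison at $275K/hour of downtime (Intel's figure):
loss_999  = expected_downtime_loss(0.999,  275_000)   # ~$2.4M/year
loss_9999 = expected_downtime_loss(0.9999, 275_000)   # ~$241K/year
```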
74
ROI Analysis Results Comparison