Next-Generation Systems and Software Cost Estimation
Integrity - Service - Excellence
Headquarters U.S. Air Force
Next-Generation Systems and Software
Cost Estimation
Wilson Rosa, Technical Advisor
Air Force Cost Analysis Agency (AFCAA)
October 28, 2008
UNCLASSIFIED
Problem Statement
Emerging technologies such as:
• Systems of Systems (SoS)
• Model Driven Architecture (MDA)
• Enterprise Resource Planning (ERP)
• Service-Oriented Architecture (SOA)
• Commercial Off-the-Shelf (COTS)
• Design for Reuse (RUSE)
are complicating AFCAA's job of producing accurate software cost estimates.
Next-Generation Systems Challenges
1. Lines of Code are not appropriate for Model Driven Architecture or COTS-based systems (SOA, ERP, etc.)
2. No guidelines for estimating beyond software design:
   • Infrastructure (servers, LAN, routers, etc.)
   • Concurrent users
   • Enterprise services (collaboration, discovery, etc.)
   • Data migration, external interfaces
   • Interoperability and interdependency
3. Unfamiliarity with total system size and cost drivers
4. Lack of empirical research on SOA, ERP, SoS, MDA
Data Challenges
AFCAA has multiple software datasets but is unable to combine them because of inconsistencies and lack of standardization:
• Schedule seems to be reported at the program level, not the CSCI level -- all CSCIs have the same schedule
• No reporting of % re-design, % re-coding, % re-test
• No common counting method -- logical, physical, etc.
• No standard application type definitions
• No common code counting tool
• Product size only reported in lines of code
• No reporting of COCOMO, SEER, PRICE parameters
• No reporting of quality measures -- defects, MTBF, etc.
Parametric Model Challenges
• Most DoD Program Offices rely on software parametric models which have not been calibrated with recent DoD data
• Parametric models only cover software design, not the total system -- infrastructure, users, etc.
• Calibration will help reduce the program office estimating error rate

Center | Model
Electronic Systems Center (Hanscom AFB) | SEER-SEM
Aeronautical Systems Center (WPAFB) | True-S
Software Technology Center (Hill AFB) | Sage
Space and Missile Systems Center | SEER-SEM
Consequence: Significant Cost Growth (%)
Statistics | Total System* | Software Only**
Minimum | -64% | -80%
Mean | 45% | 37%
Median | 27% | 8%
High | 471% | 623%
Standard Deviation | 71% | 107%
Milestone Phase | Development | Development
Sample Size | 137 | 111
Year of Data | 1993-2003 | 2002-2008

Source: *John McCrillis, 36th DoD Cost Analysis Symposium (2003); **Defense Automated Cost Information System
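The growth figures above are percentage deviations of actual from estimated development cost. A minimal sketch of how such summary statistics are derived, using hypothetical (estimate, actual) pairs rather than the real program data behind the table:

```python
import statistics

# Hypothetical (estimated, actual) development costs in $M --
# illustrative numbers only, not the programs summarized above.
programs = [(100, 145), (80, 76), (60, 120), (200, 254)]

# Cost growth (%) = (actual - estimate) / estimate * 100
growth = [(actual - est) / est * 100 for est, actual in programs]

print(statistics.mean(growth))    # -> 41.75
print(statistics.median(growth))  # -> 36.0
```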
Software Cost Metrics Manual
OVERVIEW
Scope
Cost agencies, in conjunction with the University of Southern California, will publish a manual to help analysts develop quick software estimates using reliable metrics from recent programs.
Data Sources
Commodity | Source | Format | Year | Projects | CSCIs
Space, Ground, Air | Software Resource Data Reports | DD-Form 2630 | 2002-2008 | 32 | 103
Space | AEHF | SEER | 2008 | 6 | 53
Air | F-22 EMD and Increment II | Boeing | 2004 | 13 | 112
Space | MILSTAR | SEER | 1990s | 4 | 28
Space | FAB-T | DD-Form 2630 | 2008 | 7 | 21
Space | NPOESS | SEER | 2008 | 3 | 67
Space | TSAT | DD-Form 2630 | 2007 | 3 | 3
Air, Ground | Northrop Grumman | COCOMO, SEER | 1997-2008 | 15 | 32
Space, Ground | Raytheon | COCOMO | 1997-2008 | 33 | 49
Air, Ship, Ground | Naval Center for Cost Analysis | TECHNOMICS | 1992-2001 | 21 | 68
Air | Lockheed Martin | COCOMO | 1996-2004 | 2 | 2
Air | Army Cost and Economics Analysis Center | TECHNOMICS | 2001-2004 | 16 | 16
Ground | Future Combat System | DD-Form 2630 | 2003-2008 | 13 | 42
Space | NRO | SEER | TBD | TBD | TBD
Space, Ground | Aerospace | Unknown | Unknown | TBD | TBD
Space, Ground | SMC | Unknown | Unknown | TBD | TBD
Space | NASA JPL | Unknown | Unknown | TBD | TBD

Totals: >168 projects, >598 CSCIs. Note: Expecting over 1600 CSCIs by 2010.
Data Normalization
USC will interview program offices and developers to obtain additional information:
1. COCOMO II parameters
2. Reuse type -- auto generated, re-hosted, translated, modified
3. Reuse source -- in-house, third party
4. Degree of modification -- %DM, %CM, %IM
5. Method -- Model Driven Architecture, Object-Oriented, Traditional

Available data:
1. DoDAF -- System Views, Operational Views, etc.
2. Software Resource Data Report -- software size, effort, schedule
3. Cost Analysis Requirements Description (CARD) -- system description, users, infrastructure, locations, interfaces, etc.
Software Cost Manual Content
Chapter 1: Basic Software Cost Estimation
Chapter 2: Product Size Metrics
Chapter 3: Historical Growth
Chapter 4: Default Effective Size (ESLOC) Parameters
Chapter 5: Historical Productivity Dataset
Chapter 6: Default COCOMO Parameters
Chapter 7: SLIM-ESTIMATE Calibration
Chapter 8: Risk and Uncertainty Parameters
Chapter 9: Data Cleansing
Chapter 10: Space Software Cost Estimation
Chapter 11: Software Maintenance
Chapter 4: ESLOC Parameters
Reuse Type | Reuse Source | Design Modified | Code Modified | Integration Modified | ESLOC
Auto Generated | In-House | 0% | 0% | 50% | 15%
Auto Generated | Third Party | 0% | 0% | 100% | 30%
Re-Host | In-House | 0% | 0% | 100% | 30%
Re-Host | Third Party | 0% | 24% | 100% | 37%
Translated | In-House | 0% | 100% | 100% | 60%
Translated | Third Party | 15% | 100% | 100% | 66%
Modified | In-House | 0% | 100% | 100% | 60%
Modified | Third Party | 100% | 100% | 100% | 100%
Unmodified | In-House | 0% | 0% | 32% | 10%
Unmodified | Third Party | 0% | 0% | 100% | 30%

Default values from recent programs, based on reuse type and reuse source.
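The ESLOC percentages in the table are consistent with the standard COCOMO adaptation adjustment factor, AAF = 0.4·DM + 0.3·CM + 0.3·IM, where DM, CM, and IM are the fractions of design, code, and integration modified. A minimal sketch, assuming that convention:

```python
def esloc_fraction(dm, cm, im):
    """Fraction of a reused component's size counted as effective new
    code (COCOMO adaptation adjustment factor), given the fractions of
    design (dm), code (cm), and integration (im) that are modified."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

# Re-Host / Third Party row: 0% design, 24% code, 100% integration modified
print(round(esloc_fraction(0.0, 0.24, 1.0) * 100))  # -> 37 (% ESLOC)

# So 10,000 reused SLOC under that row count as roughly 3,700 effective SLOC
print(round(10_000 * esloc_fraction(0.0, 0.24, 1.0)))  # -> 3720
```

Each row of the table above reproduces under this formula (e.g. Translated / Third Party: 0.4·0.15 + 0.3·1.0 + 0.3·1.0 = 66%).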
Chapter 5: Historical Productivity
Overview and guidelines:
• Historical productivity dataset by application
• Default productivity ranges by application

IOC | CSCI | Application | Productivity (ESLOC/MM) | Raw (KSLOC) | ESLOC | Effort (MM) | Peak Effort (FTE) | Schedule (Months)
1999 | Signal Processing | Avionics | 60 | 90000 | 90000 | 1000 | 69 | 71
2008 | Spot Antenna Control | Payload | 39 | 78000 | 5000 | 2000 | 95 | 69
2008 | Bootstrap | Bus | 44 | 35000 | 31000 | 800 | 70 | 60
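Productivity here is effective size delivered per person-month, so a quick effort estimate divides expected ESLOC by a historical productivity figure for the same application type. A minimal sketch; the size and productivity values are illustrative, not taken from the manual:

```python
def effort_mm(esloc, productivity_esloc_per_mm):
    """Person-months of development effort from effective size (ESLOC)
    and a historical productivity figure (ESLOC per person-month)."""
    return esloc / productivity_esloc_per_mm

# e.g. a 30,000-ESLOC CSCI at an assumed 50 ESLOC/MM
print(effort_mm(30_000, 50))  # -> 600.0
```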
Significance of Software Cost Metrics Manual
Collected data can be used for Systems of Systems cost research and COCOMO improvement initiatives. Understanding the relationships between next-generation processes and COCOMO cost drivers can encourage researchers to explore new strategies to improve available cost models.
Way Ahead
Short Term (2009-2010)
• Send Software Data Call to program offices, developers, and USC Affiliates
• Write Chapters 4 & 5 (2009)
• Publish Software Cost Metrics Manual (2010)

Long Term (2010-2012)
• ERP Cost Guide (2010)
• Impact of MDA on Software Productivity (2010)
• SOA Cost Study (2012)
Note: Any data you provide will not be attributed to your company or program, but will be combined with like data from other sources and genericized.
Backup Slides