Post on 03-Jan-2016
Software Estimation Methods
Fundamental Questions
Survey of Methods
Fidelity of Estimates
Thoughts to Ponder
Fundamental Estimation Questions
1. How much effort is required to complete an activity?
2. How much calendar time is needed to complete an activity?
3. What is the total cost of an activity?
Costing and Pricing
Estimates are made to discover the cost, to the developer, of producing a software system. There is not a simple relationship between the development cost and the price charged to the customer: broader organisational, economic, political and business considerations influence the price charged. Be aware of the difference between an estimate and a price.
Estimation Methods
1. Decomposition methods
2. Algorithmic/Parametric: COCOMO and others
3. Estimate by analogy/experience – based on what you have done before
4. Expert: Wideband Delphi – consensus building
5. Agile methods:
   1. Planning Poker – Agile version of consensus building
   2. XP Planning Game – relative estimation and guidance
6. Parkinson’s Law - estimate to your resources
7. Pricing to Win - estimate what will win
Decomposition-based Estimating
Base the estimate on solution “parts” and size
• Decompose the problem during analysis & design
• Estimate each part
• Sum the results (and add an integration fudge factor)
• Perhaps the most straightforward/commonsense method
• But you still need quality estimates for each part
Examples:
• LOC or FP sizing (remember metrics in 316?)
• Change sizing – look at an existing project and the delta that needs to be applied for reuse
• RUP’s Use Case Point Analysis – sizes based on problem-space decomposition, not solution space
• Many branch off into algorithmic/analogy methods
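The sum-plus-fudge-factor step can be sketched in a few lines; the part names, figures, and the 15% integration overhead are illustrative assumptions, not values from any standard:

```python
# Hypothetical part estimates (person-days) from analysis & design decomposition.
part_estimates = {"parser": 12, "storage": 8, "ui": 15, "reporting": 6}

INTEGRATION_FUDGE = 0.15  # assumed 15% overhead for integrating the parts

total = sum(part_estimates.values())          # 41 person-days of raw parts
estimate = total * (1 + INTEGRATION_FUDGE)    # parts plus integration effort
print(f"Parts: {total} person-days; with integration: {estimate:.1f}")
```

The quality of the result still hinges entirely on the quality of each part estimate, as the slide warns.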
Example: WBS and Estimation
• “Top-down” decomposition-based estimation: a work breakdown structure allows estimation of costs/time required by “rolling up” such quantities:
Σ(children resources) = parent resource
A WBS that is too coarse makes it difficult to assign tasks and utilize resources.
• Stellman and Greene suggest “10 to 20” tasks as a rule of thumb
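The rolling-up rule, Σ(children resources) = parent resource, amounts to a recursive sum over the WBS tree; the task names and figures below are invented for illustration:

```python
# A WBS node is either a leaf with an effort estimate (person-days), or a
# parent whose estimate is the sum ("roll-up") of its children's estimates.
wbs = {
    "Build system": {
        "Requirements": 5,
        "Design": 8,
        "Implementation": {"Backend": 20, "Frontend": 15},
        "Testing": 10,
    }
}

def roll_up(node):
    """Sum(children resources) = parent resource, applied recursively."""
    if isinstance(node, dict):
        return sum(roll_up(child) for child in node.values())
    return node

print(roll_up(wbs))  # 5 + 8 + (20 + 15) + 10 = 58
```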
Algorithmic/Parametric Models
“COCOMO style”; conversion tables (McConnell). These rely on a large historical base of project data.
Cost is estimated as a mathematical function of product, project and process attributes whose values are estimated by project managers:
Effort = A × Size^B × M
where A is an organisation-dependent constant, B reflects the disproportionate effort for large projects, and M is a multiplier reflecting product, process & people attributes.
The most commonly used product attribute for estimation is code size (Lines of Code [LOC] or Function Points [FP]).
Most models are similar but use different values for A, B & M.
Algorithmic Case Study: COCOMO
The COnstructive COst MOdel: a well-established model by Barry Boehm (1981). “Basic COCOMO” uses the formula from the previous slide:
• Simple (“organic”) projects: 2.4 × (KDSI)^1.05 × M
• Moderate (“semi-detached”): 3.0 × (KDSI)^1.12 × M
• Embedded (mission-critical): 3.6 × (KDSI)^1.20 × M
Evolved today into COCOMO II, a 3-level model:
• Early prototyping – prototyping projects with much reuse
• Early design – estimates after requirements are agreed to
• Post-architecture – after analysis, a tuned version of Early Design
Formulas were constructed and tuned against 161 data points from large projects, as collected by Boehm.
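A minimal Basic COCOMO calculator, using Boehm's 1981 constants from the slide and assuming a nominal multiplier M = 1 (in the full model, M is the product of cost-driver ratings):

```python
# Basic COCOMO effort (person-months): Effort = A * (KDSI ** B) * M,
# where KDSI is thousands of delivered source instructions.
COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),  # simple projects
    "semi-detached": (3.0, 1.12),  # moderate projects
    "embedded":      (3.6, 1.20),  # mission-critical projects
}

def basic_cocomo(kdsi, mode="organic", m=1.0):
    a, b = COCOMO_CONSTANTS[mode]
    return a * (kdsi ** b) * m

# A 32-KDSI organic project, nominal multipliers:
print(f"{basic_cocomo(32):.1f} person-months")
```

Note how the exponent B > 1 makes effort grow faster than size, which is the model's way of charging large projects a disproportionate overhead.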
Estimating by analogy/experience
What you are building next most likely resembles something you have built before:
• Especially if you work in a specialized domain
• Another technique relying on historical data
Example: PROBE (“Proxy-Based Estimating”)
• By Watts Humphrey – remember the PSP guy?
• Based on the idea that if an engineer is building a component similar to one s/he already built, then it will take about the same effort as it did in the past.
• Individual engineers use a database to maintain a history of the effort they have put into their past projects.
• A formula based on linear regression is used to calculate the estimate for each task from this history.
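The regression step can be sketched with ordinary least squares over an engineer's history of (estimated proxy size, actual effort) pairs; the history below is invented, and real PROBE prescribes specific regression parameters and prediction intervals beyond this sketch:

```python
# Hypothetical personal history: estimated proxy size vs. actual effort.
history_size = [80, 120, 200, 150, 90]   # proxy size (e.g., estimated LOC)
history_hours = [10, 16, 25, 18, 12]     # actual effort (hours)

def linear_regression(xs, ys):
    """Least-squares fit y = beta0 + beta1 * x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    beta1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
    beta0 = mean_y - beta1 * mean_x
    return beta0, beta1

b0, b1 = linear_regression(history_size, history_hours)
new_size = 130  # proxy size of the next component
print(f"Predicted effort: {b0 + b1 * new_size:.1f} hours")
```

Because the history is the individual engineer's own, the fitted line absorbs that engineer's personal productivity, which is the core of Humphrey's idea.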
Expert Consensus: Wideband Delphi
“Secret” estimates by experts, averaged & presented, revised & reviewed until consensus
• Boehm’s 1981 book Software Engineering Economics gives 6 steps:
1. Coordinator presents each expert with a specification and an estimation form.
2. Coordinator calls a group meeting in which the experts discuss estimation issues with the coordinator and each other.
3. Experts fill out forms anonymously.
4. Coordinator prepares and distributes a summary of the estimates.
5. Coordinator calls a group meeting, specifically focusing on having the experts discuss points where their estimates vary widely.
6. Experts fill out forms, again anonymously, and steps 4 to 6 are iterated for as many rounds as appropriate.
• Advantages: easy, inexpensive, utilizes the expertise of several people; does not require historical data
• Disadvantages: difficult to repeat; may fail to reach consensus, reach the wrong one, or all experts may share the same bias
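A coordinator preparing the summary in step 4 might use a small helper like this; the spread threshold and field names are assumptions for illustration, not part of Boehm's procedure:

```python
from statistics import mean, median

def summarize_round(estimates, spread_threshold=0.25):
    """Summarize one round of anonymous estimates and flag wide disagreement
    (range relative to the mean) for discussion in the next group meeting."""
    lo, hi = min(estimates), max(estimates)
    spread = (hi - lo) / mean(estimates)
    return {
        "low": lo,
        "high": hi,
        "median": median(estimates),
        "needs_discussion": spread > spread_threshold,
    }

round1 = [30, 45, 60, 35, 90]  # person-days from five experts
print(summarize_round(round1))
```

A round whose spread falls under the threshold would be a natural stopping point for the iteration of steps 4 to 6.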
Agile Methods: Planning Poker
• Agile version of Wideband Delphi
• Every developer participates, & maybe other team members
• Product Owner conducts the session
• Script:
1. Each estimator is given some cards with estimates on them
2. Product Owner selects a story card and reads aloud
3. Each estimator selects an estimate card and puts it face down
4. When ready, all estimators turn over their cards at the same time
5. High and low estimators then conduct a short discussion of their varying estimates
6. Return to step 3 and continue until consensus is reached
• Issues: having the “right amount of discussion”; getting assumptions out in the open
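Planning Poker decks commonly use a modified Fibonacci sequence, and in step 3 each estimator snaps their gut-feel number to the nearest card; the deck values below are one common deck, not something the method mandates:

```python
# A common Planning Poker deck (modified Fibonacci, in story points).
DECK = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def nearest_card(raw_estimate):
    """Snap a raw estimate to the closest card value in the deck."""
    return min(DECK, key=lambda card: abs(card - raw_estimate))

print(nearest_card(6))   # 5
print(nearest_card(11))  # 13
```

The widening gaps between cards are deliberate: they stop the team from arguing over false precision on large, uncertain stories.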
Agile Methods: The Planning Game
XP Requirements Methodology – 3 phases:
• Exploration – find out what the system should do
• Commitment – decide what subset of all possible requirements to do next
• Steering – guide development as reality molds the plan
The XP Approach
Exploration Phase
1. Business people write a story.
2. Development estimates, or asks for a clarification or split of the story:
   1. If a clarification or split is required, return to step 1.
   2. Otherwise, provide an estimate in “Ideal Engineering Time” (IET).
The XP Approach: Commitment Phase
Business chooses scope and release date.
1. Sort by value: put cards into 3 piles
1. Those that the system cannot function without
2. Less essential but provide significant business value
3. Nice to have
2. Sort by risk: put cards into 3 piles
1. Those Development can estimate precisely
2. Those Development can estimate reasonably well
3. Those Development cannot estimate at all
3. Set velocity: Development tells Business how much (in IET) it can implement per calendar month
4. Choose scope: Business chooses the set of cards for a release
– Set the date first, then choose cards – OR –
– Choose functionality, and then set the date
XP: More on Velocity
Estimating Using Project Velocity
Velocity is a measure of your team’s capacity, not of the “speed” of the project. Developers contribute velocity points per iteration.
• Example:
– Joe is an expert guru programmer: 5 velocity pts/iteration
– Sue is an average programmer: 3 velocity pts/iteration
– Jimmy is a junior programmer: 1 point per iteration
User Stories are then estimated via “points”
• Includes all effort, including “spikes”
These measures are relative!
XP does not allow velocity to be increased midstream, or estimates to be revised downward
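Using the example figures above (Joe 5, Sue 3, Jimmy 1), iteration capacity and a simple first-fit packing of stories into iterations can be sketched as follows; the stories and their point values are invented:

```python
# Team capacity per iteration is the sum of individual velocities.
team_velocity = {"Joe": 5, "Sue": 3, "Jimmy": 1}  # points per iteration
capacity = sum(team_velocity.values())            # 9 points per iteration

# Hypothetical stories with relative point estimates (including spikes).
stories = [("Login", 3), ("Search", 5), ("Reports", 8), ("Export", 2)]

iterations = []
current, remaining = [], capacity
for name, points in stories:
    if points > remaining:          # story doesn't fit: start a new iteration
        iterations.append(current)
        current, remaining = [], capacity
    current.append(name)
    remaining -= points
iterations.append(current)

print(iterations)  # [['Login', 'Search'], ['Reports'], ['Export']]
```

Since the points are relative, only the ratio between story sizes and velocity matters; per XP's rule, neither capacity nor the point estimates should be quietly revised mid-project to make the plan look better.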
Two Other Ways to Estimate
Parkinson’s Law: the project costs whatever resources are available.
• Advantages: no overspend
• Disadvantages: the system is usually unfinished
Pricing to Win: the project costs whatever the customer has to spend on it!
• Advantages: you get the contract
• Disadvantages:
• The probability that the customer gets the system he or she wants is small.
• Costs do not accurately reflect the work required.
Sometimes, this is the only method available!
Estimating Problems
Estimates are complex tasks requiring a significant amount of effort – revisit/revise estimates at various stages of the project.
Many people have little experience doing them (garbage in, garbage out) – try to provide training and mentoring.
Sponsors want a number for a bid, not a real estimate – PMs must negotiate with sponsors to create real cost estimates.
A bias toward underestimation in developers (a developer “fudge factor” to avoid poor reviews) – review and ask questions to ensure estimates are not biased.
A bias toward overestimation in PMs (rewarded for early delivery; one successful “rush job” can lead to many).
Course Technology, 1999
Fidelity of Estimates
DeMarco (1982): “An estimate is the most optimistic prediction that has a non-zero probability of coming true… or… ‘what’s-the-earliest-date-by-which-you-can’t-prove-you-won’t-be-finished’.”
Cone Of Uncertainty (Boehm 1995)
Moral: Your estimates get better as you get closer to the deadline
Estimating Thoughts to Ponder
• “…any disagreement is generally about what is required to perform the task itself, not about the effort required to complete it.” (p. 35)
• “The most accurate estimates are those that rely on prior experience.” (p. 34)
• “For the estimates to be most effective, the assumptions must be written down” (p.35)
• “The assumption allows the team to reach a temporary decision, knowing the final decision is still open.” (p. 36)
• “Estimates are most accurate when everyone on the project team feels that he was actively part of the estimation process.” (p. 37)
From Ch. 3, Applied Software Project Management, Stellman & Greene
Programmers vs. PMs
• “It is common for nontechnical people to assume that programmers pad their estimates.” (p. 37)
• “Software engineers are often overoptimistic by nature.” (p.50)
• “Some project managers respond to an unrealistic situation by creating ‘estimates’ that are too low but meet it.” (p. 51)
• “…if her estimates come up short, she will be penalized at her next review… Her manager will eventually catch on and start cutting down any estimate she provides.” (p. 50)
• “There is a basic tug-of-war going on here. Engineers prefer higher estimates, giving them as much time and as little pressure as possible to do their work. Managers prefer to deliver things more quickly, in order to please stakeholders.” (p. 50)
From Ch. 3, Applied Software Project Management, Stellman & Greene
Final Thoughts
Summary: 4 general approaches
1. Based on size (decomposition, XP)
2. Use of expert opinion (Delphi, analogy/experience)
3. Based on historical data (algorithmic, analogy/experience)
4. Estimate to what it needs to be (Parkinson’s Law, Pricing to Win)
Estimating Tips:
• Estimating is hard.
• Estimating can be personal and political.
• Productivity is not proportional to the number of people working on a task.
• Adding people to a late project makes it later.
• The unexpected always happens. Anticipate change.
• Get group buy-in via participation and transparency.
• Revisit and revise your estimates often.
• Set stakeholder expectations.
“Predictions are always hard, especially about the future”
-- Yogi Berra