CMPUT 401 - Software Engineering: Software Measurement - 1 © Paul Sorenson
METRICS
WHY DO WE NEED METRICS?
• to measure our progress in satisfying our system development/evolution goals
HOW DO WE COLLECT (GET, GATHER) METRICS?
• based on documented evidence of what has transpired or is transpiring in system development/evolution
HOW DO WE USE METRICS?
• to improve our processes from prescriptive (predictive), descriptive ("experience-base" building), and proscriptive (defect avoidance) perspectives.
IMPORTANT ASPECTS OF METRICS IN MANAGING QUALITY
• In the SEI Maturity Model, the primary difference between Level 5 (optimized processes) and lower levels is that tests are conducted during the process, not only at the end.
• Both the V-System Development Model and the Spiral Model support planned testing and continuous verification and validation. So do certain well-defined, well-controlled approaches to prototyping.
• While project management tools (e.g., MS Project, Autoplan) are useful in assisting the project manager, they cannot be viewed as the only tools. Why?
• For system testing, experience indicates that the best indicators focus on defect discovery and tracking.
Deciding on Software Metrics
BASED HEAVILY ON ROBERT GRADY'S BOOK “Practical Software Metrics for Project Management and Process Improvement”
• What are the goals in developing successful software systems?
• We wish to develop systems that ....
meet customer needs
are reliable (perform well)
are cost effective to build and evolve
meet a customer's desired schedule.
A GOAL ELABORATION PROCESS
DUE TO BASILI ET AL. AT UNIV. OF MARYLAND (SEL PROJECT)
GOAL/QUESTION/METRIC PARADIGM
[Diagram: each goal (Goal1 ... Goaln) is refined into subgoals (e.g., SG1.1, Subgoal1.2, SG1.3); each subgoal into questions (e.g., Q1.2.1, Question1.2.2, Q1.2.3); and each question into metrics (e.g., M1.2.2.1, Metric1.2.2.2, M1.2.2.3).]
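The goal/question/metric hierarchy can be represented directly in code. The sketch below is a minimal illustration only; the goal, question, and metric names are invented for the example, not taken from the slides. It records one subgoal's elaboration and flattens it into the list of metrics to collect.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    text: str
    metrics: List[str]  # names of metrics that help answer this question

@dataclass
class Goal:
    name: str
    questions: List[Question] = field(default_factory=list)

# Hypothetical elaboration of one subgoal; names are illustrative only.
sg1 = Goal("Maximize customer satisfaction", [
    Question("How many problems are affecting customers?",
             ["incoming defect rate", "open critical/serious defects"]),
    Question("How long does it take to fix a problem?",
             ["mean time to fix", "fix backlog age"]),
])

def list_metrics(goal: Goal) -> List[str]:
    """Flatten the goal -> question -> metric tree into metric names."""
    return [m for q in goal.questions for m in q.metrics]

print(list_metrics(sg1))
```

The point of the structure is traceability: every metric collected can be traced back up through a question to a goal, which is the discipline the GQM paradigm enforces.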
GOAL DEVELOPMENT
GOAL: Develop Successful Software Systems
SG1: Maximize Customer Satisfaction
SG2: Minimize Engineering Effort and Schedule
SG3: Minimize Defects
SG4: Minimize Schedule
QUESTION GENERATION
SG1: Maximize Customer Satisfaction
Q1: What are the attributes of customer satisfaction?
Q2: What are the key indicators of customer satisfaction?
Q3: What aspects result in customer satisfaction?
Q4: How satisfied are the customers?
Q5: How do we compare with the competition?
Q6: How many problems are affecting customers?
Q7: How long does it take to fix a problem? (compared to customer expectation)
Q8: How does installing a fix affect the customer?
Q9: How many customers are affected by the problem? (by how much?)
Q10: Where are the bottlenecks?
Q1: What are the attributes of customer satisfaction? (FURPS)
• Functionality: feature set, capabilities, generality, security
• Usability: human factors, aesthetics, consistency, documentation
• Reliability: failure frequency, failure severity, recoverability, predictability, accuracy, mean time to fail
• Performance: speed, efficiency, throughput, resource consumption, response time
• Supportability: testability, extensibility, adaptability, maintainability, compatibility, configurability, serviceability, installability, localizability
Q2-Q5: Sources of Customer Needs
• Surveys
  - Define survey goals => questions (at least one per FURPS category) => how data will be analyzed and results presented. State or graph sample conclusions.
  - Test the survey and your data analysis before sending it out.
  - Ask questions requiring simple answers, preferably quantitative or yes/no.
  - Keep it short (one to two pages).
  - Make surveys very easy to return.
• Interviews
  - Generally more accurate and informative, but time-consuming and possibly subject to bias.
MEASUREMENT PROGRAM PLANNING
FURPS+ planned versus actual tracking (from Fig. 4-3, Grady, Practical Software Metrics)
[Chart: planned versus actual counts (0-40) tracked by week for each FURPS category.]
Q6: How many problems are affecting customers?
EXAMPLE METRICS:
• Incoming defect rate
• Open critical and serious defects
• Break/fix ratio
• Post release defect density
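Once the raw counts are tracked, these example metrics reduce to simple ratios. The sketch below uses entirely hypothetical counts for one month of a released product (KNCSS = thousands of non-comment source statements):

```python
# All counts hypothetical, for one month of a released product.
incoming_defects = 42       # new defect reports received
fixes_shipped = 30          # defect fixes shipped
fixes_that_broke = 6        # shipped fixes that introduced new defects
post_release_defects = 18   # defects confirmed after release
product_size_kncss = 250.0  # thousands of non-comment source statements

break_fix_ratio = fixes_that_broke / fixes_shipped          # lower is better
defect_density = post_release_defects / product_size_kncss  # defects/KNCSS

print(f"incoming defect rate: {incoming_defects}/month")
print(f"break/fix ratio: {break_fix_ratio:.2f}")
print(f"post-release defect density: {defect_density:.3f} defects/KNCSS")
```

Normalizing by size (defects/KNCSS) is what makes defect density comparable across products of different sizes; the raw incoming rate is not.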
CRITICAL/SERIOUS PROBLEMS TRACKING
[Chart: Hot Index, Alert Index, and Total Index (0-120) tracked by quarter, 1Q85 through 3Q85.]
DEFECT CLOSURE:
REPORTING & ANALYSIS
[Chart: incoming requests, closed requests, and progress (closed minus incoming, -100 to 150) tracked over weeks 1-14.]
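The progress curve in this kind of chart is simply closed minus incoming requests per period, accumulated over time. A sketch with hypothetical weekly counts:

```python
from itertools import accumulate

# Hypothetical weekly counts of service requests.
incoming = [30, 25, 28, 22, 20, 18, 15, 12, 10, 8, 6, 5, 4, 3]
closed   = [10, 12, 20, 25, 30, 28, 25, 20, 18, 15, 12, 10, 8, 6]

# Weekly progress: closed minus incoming (negative while requests pile up).
progress = [c - i for i, c in zip(incoming, closed)]

# Cumulative net progress; crossing zero means closures have caught up.
net_progress = list(accumulate(progress))

print(progress[:4])      # early weeks run a deficit
print(net_progress[-1])  # net position at week 14
```

Plotting both series makes the crossover week visible, which is the moment the team starts gaining on the backlog rather than falling behind.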
DEFECT IDENTIFICATION & TRACKING
(based on Fig. 7-5 Defects identified/remaining open)
[Chart: total defects identified, ideal defects identified, and defects remaining open (0-400) over 12 weeks, with the release date marked.]
PREDICTIVE MODELS
Kohoutek's model
[Chart: cumulative % of defects found versus hours/defect. By intervals (2, 5, 10, 20, 50 hours/defect) the percentages of defects found are 25%, 50%, 20%, 4%, and 1%.]
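Taking the interval percentages read off the figure at face value (an assumption for illustration, not exact model parameters), the cumulative curve and the cost of finding the last few percent of defects can be computed directly:

```python
# Interval percentages as read off the figure (assumed values): the fraction
# of defects found while the marginal cost sits in each hours/defect band.
hours_per_defect = [2, 5, 10, 20, 50]
pct_in_interval = [25, 50, 20, 4, 1]

cumulative = []
running = 0
for p in pct_in_interval:
    running += p
    cumulative.append(running)
print(cumulative)

# Cost of finding the last 5% of an assumed 400 total defects: the defects
# in the 20 and 50 hours/defect bands, priced at those marginal rates.
total_defects = 400
hours_last_two_bands = 0.04 * total_defects * 20 + 0.01 * total_defects * 50
print(hours_last_two_bands)
```

The steeply rising marginal cost is the practical message of such models: the final few percent of defects consume a disproportionate share of test hours, which is why release criteria are set as thresholds rather than "zero defects".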
RELEASE CRITERIA
TESTING STANDARDS SHOULD INCLUDE AGREED-TO GOALS BASED ON THE FOLLOWING CRITERIA
• breadth - testing coverage of user-accessible and internal functions
• depth - branch coverage testing
• reliability - continuous hours of operation under stress; measured ability to recover gracefully
• defect density - (actual and/or predicted) at release
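One way such agreed-to goals can be checked mechanically at release time is sketched below; the criterion names and threshold values are hypothetical, chosen only to illustrate the comparison:

```python
# Hypothetical agreed-to release goals versus measured actuals.
criteria = {
    "breadth": (0.95, 0.97),        # (goal, actual) fraction of functions exercised
    "depth": (0.85, 0.82),          # branch coverage
    "reliability_hours": (72, 80),  # continuous hours under stress
    "defect_density": (0.05, 0.04), # defects/KNCSS at release
}

def release_ready(criteria):
    """Return (criterion, met?) pairs; defect density is 'lower is better'."""
    results = []
    for name, (goal, actual) in criteria.items():
        met = actual <= goal if name == "defect_density" else actual >= goal
        results.append((name, met))
    return results

for name, met in release_ready(criteria):
    print(f"{name}: {'met' if met else 'NOT met'}")
```

Encoding the criteria this way keeps the goals explicit and auditable, rather than leaving "ready to release" as a judgment call made under schedule pressure.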
RELEASE DATES (BASED ON DEFECTS)
(based on Fig. 7-8 Critical/serious defects)
[Chart: monthly counts (0-10) of critical/serious defects and critical defects, May through November, against a defect target; a second axis shows defects/KNCSS (.00-.06), with the projected and actual release dates marked.]
POST-RELEASE DEFECTS &
CUSTOMER SATISFACTION
(based on Fig. 7-9, Postrelease incoming SRs - service requests)
[Chart: service requests per KNCSS (0-6) over 12 months, as a 3-month moving average, comparing the worst certified product, the average of certified products, and products not certified or not meeting certification.]
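The 3-month moving average used in this chart can be computed with a simple trailing window. A sketch with hypothetical monthly service-request densities:

```python
def moving_average(values, window=3):
    """Trailing moving average; the first window-1 points have no value yet."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(values[i + 1 - window : i + 1]) / window)
    return out

# Hypothetical post-release service requests per KNCSS, month by month.
sr_per_kncss = [4.0, 3.2, 2.5, 2.1, 1.8, 1.5, 1.3]
print(moving_average(sr_per_kncss))
```

The moving average smooths out month-to-month noise so the long-term trend toward (or away from) the certified-product average is visible.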
METRICS SELECTION
• GOAL ELABORATION PROCESS
[Diagram: Goal1 refined into subgoals (SG1.1, Subgoal1.2, SG1.3), questions (Q1.2.1, Question1.2.2, Q1.2.3), and metrics (M1.2.2.1, Metric1.2.2.2, M1.2.2.3).]
• In reality, we seldom develop a new metric from scratch; we usually adopt an existing one from a known "metrics base":
  - examine the "Bang" and "function point" approaches, to be applied at the requirements analysis phase
  - we have examined fan-in/fan-out and Henry & Kafura's INF (Information Flow Measure) at the design stage
  - examine McCabe's cyclomatic complexity measure for intra-module complexity (applies at both the implementation and testing phases)
McCABE'S CYCLOMATIC COMPLEXITY
• Programs are viewed as directed graphs that show control flow.
CC = number of regions in the planar flow graph
CC = number of predicate (decision) nodes + 1
CC = number of independent basis paths
[Figure: two example control-flow graphs with numbered nodes (1-10 and 1-9), distinguishing decision nodes from code blocks.]
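For a concrete flow graph the standard counting formulas agree. The sketch below uses a hypothetical single-entry/single-exit graph with two decisions and checks the graph formula CC = E - N + 2 against the predicate count CC = P + 1:

```python
from collections import Counter

# Hypothetical control-flow graph: single entry (1), single exit (8),
# two if-statements (decision nodes 2 and 5).
edges = [
    (1, 2),
    (2, 3), (2, 4),   # decision at node 2
    (3, 5), (4, 5),
    (5, 6), (5, 7),   # decision at node 5
    (6, 8), (7, 8),
]
nodes = {n for e in edges for n in e}

# Graph formula: CC = E - N + 2 for a single-entry/single-exit graph.
cc_edges_nodes = len(edges) - len(nodes) + 2

# Equivalent count: predicates (nodes with out-degree > 1) plus one.
out_degree = Counter(src for src, _ in edges)
cc_predicates = sum(1 for d in out_degree.values() if d > 1) + 1

print(cc_edges_nodes, cc_predicates)
```

Both formulas yield the same value here, and that value is also the number of independent basis paths a tester must cover to exercise every branch, which is why CC is a natural test-planning measure.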
PROBLEMS WITH McCABE CC
• It was primarily designed for FORTRAN -- it does not apply as well to newer languages that are less procedural and/or more dynamic in nature (e.g., how is exception handling dealt with?).
• The case of CC(G) = 1 remains true for any size of linear code.
• Insensitivity of CC to software structure. Several researchers have demonstrated that CC can increase when applying generally accepted techniques to improve program structure. Evangelist ['83] shows that only 2 out of 26 of Kernighan & Plauger's rules result in decreases to CC.
• All decisions have a uniform weight regardless of the depth of nesting or relationship with other decisions.
• Empirical studies show a mixture of success with using McCabe's CC.
• Best suited for test coverage analysis and intra-module complexity