Post on 09-Jan-2017
What's Measured Improves: Metrics That Matter

Raj Indugula (raj.indugula@lithespeed.com)
Rob Brown (robert.david.brown@gmail.com)
Ganesh Murugan (murugan.ganesh@gmail.com)
Agenda
• Why and What of Metrics
• Principles of Measurement
• Key Drivers & Metrics
• Key Takeaways
• Q&A
Finally, Pin Your Metric!
What does … mean in today's Lean-Agile-DevOps world?
What is Measurement?

Measures:
• I have 5 apples
• # incidents

Metrics:
• I have 5 more apples than yesterday
• % of Sev1 incidents since feature rollout

An observation that reduces uncertainty…
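The distinction can be made concrete in a few lines: a measure is a raw observation, while a metric places it in a comparative context. A minimal sketch, using made-up incident data (the dates, severities, and rollout date below are illustrative, not from the talk):

```python
from datetime import date

# Hypothetical incident records: (date, severity).
incidents = [
    (date(2016, 9, 12), "Sev3"),
    (date(2016, 9, 14), "Sev3"),
    (date(2016, 9, 15), "Sev1"),
    (date(2016, 9, 16), "Sev2"),
]
feature_rollout = date(2016, 9, 13)

# Measure: a raw count of incidents since the rollout.
total = sum(1 for d, _ in incidents if d >= feature_rollout)

# Metric: the same observation in context -- what fraction of
# post-rollout incidents were Sev1?
sev1 = sum(1 for d, s in incidents if d >= feature_rollout and s == "Sev1")
pct_sev1 = 100.0 * sev1 / total if total else 0.0

print(f"{total} incidents since rollout, {pct_sev1:.0f}% Sev1")
```

The count alone says little; the percentage tied to the rollout is what reduces uncertainty about the feature's impact.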
Are Metrics Evil?
What Makes A Good Metric?
• Understandable
• Comparative
• Behavior changing
Where are you going?

Measure movement towards your business goals & outcomes.
KEY METRICS
1. Business Success
• Starts with Measurable Goals
• Indicators for success:
  • Market share
  • New business service enablement
2. Customer/User Success
• Frequency of key transactions
• Amount of time spent in the application
• User satisfaction survey results
• A/B test results
• Customer ticket volume
3. Operations Success
• Uptime (availability)
• Performance (response time)
• Resource utilization
• Database query times
• Mean time to detection
• Support
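Two of these, availability and mean time to detection (MTTD), fall out directly from outage records. A minimal sketch with hypothetical data (the outage durations and the one-month window are assumptions, not the talk's numbers):

```python
# Hypothetical outage records: (minutes_down, minutes_until_detected).
outages = [
    (30, 5),
    (12, 2),
    (45, 14),
]
period_minutes = 30 * 24 * 60  # one 30-day observation window

# Availability: fraction of the window the service was up.
downtime = sum(d for d, _ in outages)
availability = 100.0 * (period_minutes - downtime) / period_minutes

# MTTD: average time from outage start to detection.
mttd = sum(t for _, t in outages) / len(outages)

print(f"availability={availability:.3f}%  MTTD={mttd:.1f} min")
```

Tracked over time, a falling availability or rising MTTD is the trend to investigate, not any single data point.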
4. Development Success
• Lead time for changes (from development to deployment)
• Deployment frequency
• % Failed deployments
• Incident severity
• Outstanding defects
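The first three of these can be derived from a simple deployment log. A hedged sketch, with illustrative timestamps and a made-up `failed` flag (not the talk's data):

```python
from datetime import datetime, timedelta

# Hypothetical deployment log: (committed, deployed, failed).
deploys = [
    (datetime(2016, 9, 12, 9), datetime(2016, 9, 13, 17), False),
    (datetime(2016, 9, 13, 10), datetime(2016, 9, 14, 12), True),
    (datetime(2016, 9, 15, 8), datetime(2016, 9, 16, 9), False),
]

# Lead time for changes: commit -> running in production, averaged.
lead_times = [dep - com for com, dep, _ in deploys]
avg_lead = sum(lead_times, timedelta()) / len(lead_times)

# Deployment frequency over the observed window.
window_days = (deploys[-1][1] - deploys[0][1]).days or 1
per_day = len(deploys) / window_days

# % failed deployments (True counts as 1).
pct_failed = 100.0 * sum(f for _, _, f in deploys) / len(deploys)

print(f"avg lead={avg_lead}, {per_day} deploys/day, {pct_failed:.0f}% failed")
```

In practice these fields would come from the CI/CD system rather than a hand-built list; the arithmetic is the same.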
Now, All Together

Metrics mapped across the delivery pipeline:

Development: Feature Metrics • Development Lead Time • Idle Time • Cycle Time • Work in Progress • Technical Debt • Rework
QA: Idle Time • MTTD • Defects
Release: Release Frequency • Time/Cost per Release • Predictability
Deploy: Deployment Lead Time • MTTR • % of Failed Deployments • Deployment frequency, duration
Operate: MTTR • Performance (Response Time) • Availability • Security Pass Rate
Customer Success: Customer Ticket Volume • Net Promoter Score • Net Value Score
Business Success: Average Revenue per User • Customer Lifetime Value • Daily/Monthly Active Users • Average Session Time • % Change in Customer Volume

Accelerate value delivery: balance speed, risk, quality & cost; reduce time to obtain/respond to customer feedback.
Monitoring Framework
• Aggregate metrics from all success contexts
• See the entire system
• Enables visualization, anomaly detection, trending, alerting…

Business Success, Customer Success, Development Success, and Operations Success metrics all feed into a central event aggregator.

Source adaptation: Turnbull, The Art of Monitoring, Kindle edition, chap. 2.
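As a sketch of the idea (not Turnbull's implementation), here is a toy event aggregator that collects metric events from every success context in one place and exposes a simple trend check; all names are illustrative:

```python
from collections import defaultdict

class EventAggregator:
    """Collects metric events from all success contexts."""

    def __init__(self):
        self.series = defaultdict(list)  # "context.metric" -> values

    def push(self, context, metric, value):
        self.series[f"{context}.{metric}"].append(value)

    def trend(self, key):
        """Difference between the last two observations, if any."""
        v = self.series[key]
        return v[-1] - v[-2] if len(v) >= 2 else None

agg = EventAggregator()
agg.push("operations", "availability", 99.95)
agg.push("operations", "availability", 99.80)
agg.push("business", "market_share", 12.4)

print(agg.trend("operations.availability"))  # negative: worth a look
```

A real framework would add timestamps, persistence, and alerting thresholds, but the core is the same: one sink, many contexts, so the whole system is visible at once.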
FROM THE TRENCHES
[Burn-up chart: "# of Stories" vs. Sprints 78 (6/8-6/21) through 89 (11/9-11/22). Series: Stories Completed Per Sprint (16, 11, 19, 20, 3, 8, 4, 19, 6, 2), Scope / Total Stories Planned (growing from 88 to 138), Total Completed (Cumulative, reaching 108), and Projected (Optimistic, Pessimistic, Median).]
Empirical Data Drives Enquiry and Adjustment
Q: When are all planned stories estimated to be completed?
A: End of Sprint 90 (12/06/16).

Q: What is the total number of stories estimated to be completed by 11/08/2016?
A: 119 stories, based on average velocity.

Q: Can we deliver all currently planned stories by 10/16/2016?
A: Not likely, given the current trend.
Fixed Scope · Fixed Date · Fixed Scope & Date
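The answers in the Q&A above follow from average velocity. A sketch using the per-sprint counts read off the burn-up chart (the optimistic/pessimistic bands are omitted for brevity):

```python
import math

# From the burn-up chart: stories completed in sprints 78-87,
# and the final planned scope.
completed = [16, 11, 19, 20, 3, 8, 4, 19, 6, 2]
scope = 138

done = sum(completed)              # 108 stories through sprint 87
velocity = done / len(completed)   # average: 10.8 stories/sprint

# Total expected one sprint later (end of sprint 88, i.e. 11/08).
next_sprint_total = round(done + velocity)

# Sprints needed to finish the full scope at the average rate.
sprints_left = math.ceil((scope - done) / velocity)
finish_sprint = 87 + sprints_left

print(f"by sprint 88: ~{next_sprint_total} stories; done in sprint {finish_sprint}")
```

At 10.8 stories per sprint, the 30 remaining stories take about three more sprints, which is how the team arrived at "end of Sprint 90" and "~119 stories by 11/08".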
[Chart: Product 1, pending vs. resolved defects over successive periods; the pending count falls from 90 toward zero as resolution catches up.]

[Chart: Product 2, pending vs. resolved defects over successive periods; the pending count falls from 229 toward zero.]
"You Build It, You Own It": Improving Responsiveness to Issues

Improving Deployment Success Rate, Not Just Frequency
Build Daily Success Rate

Test Suite    9/12/2016  9/13/2016  9/14/2016  9/15/2016  9/16/2016
Test Suite 1  100%       100%       100%       100%       100%
Test Suite 2  100%       100%       100%       100%       100%
Test Suite 3  100%       100%       100%       100%       100%
Test Suite 4  100%       100%       100%       100%       100%
Test Suite 5  100%       100%       100%       85%        100%
Test Suite 6  100%       100%       94%        80%        100%
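One way to roll the table up into a single daily success rate is the mean pass rate across suites per day; a minimal sketch using the table's numbers (the averaging choice here is ours, not necessarily the talk's):

```python
# Suite pass rates per day, from the table above.
suites = {
    "Test Suite 1": [100, 100, 100, 100, 100],
    "Test Suite 2": [100, 100, 100, 100, 100],
    "Test Suite 3": [100, 100, 100, 100, 100],
    "Test Suite 4": [100, 100, 100, 100, 100],
    "Test Suite 5": [100, 100, 100, 85, 100],
    "Test Suite 6": [100, 100, 94, 80, 100],
}
dates = ["9/12", "9/13", "9/14", "9/15", "9/16"]

# Daily build success rate: mean pass rate across all suites.
daily = {
    d: sum(rates[i] for rates in suites.values()) / len(suites)
    for i, d in enumerate(dates)
}
```

The dip on 9/15 (suites 5 and 6 at 85% and 80%) shows up clearly in the daily number, which is the point: a per-day rollup surfaces regressions that per-suite tables hide in the noise.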
[Stacked bar chart: % Sprint Work Breakdown and Time to Fix per sprint, across categories: Technical Debt, Production Incidents, Product Enhancements, Major Interface, New Capability, Arch. Improvement, Other.]

Insight into Team Capacity and Work Items
Balancing New Feature Work with Maintenance and Improvement
KEY TAKEAWAYS
• Keep inventory of metrics small
• Measure outcomes, not individuals
• Monitor a balanced set of metrics
• Monitor trends, not data points
• Share widely: interaction encourages exploration

"If you measure me in an illogical way… don't complain about illogical behavior." (Goldratt)
Next Steps
Clear Goal → Well-formed Metrics → Measure & Improve

"Metrics illuminate, not indicate." (George Dinwiddie)

Pin Your Metric: Useful ↔ Useless