
Slide 1: CMMI and Metrics

BCS SPIN SG, 19 February 2008

Clifford Shelley
OXFORD SOFTWARE ENGINEERING Ltd
9 Spinners Court, 53 West End, Witney, Oxfordshire OX28 1NH
www.osel.co.uk
[email protected]
Tel. +44 (0) 1993 700878

© OSEL 2008. OXFORD SOFTWARE ENGINEERING, Software Engineering Services & Consultancy.

Slide 2: Objective

1. Present the background and history of measurement within CMMI

2. Place measurement within the context of CMMI (or CMMI within the context of measurement?)

3. Identify common issues and concerns with measurement within the context of CMMI, and their resolution

4. Look at some measurement approaches that can be really useful

(it is assumed here that ‘measurement’ is equivalent to ‘metrics’)

Slide 3: Background…

• Original SW CMM (a tool to measure software capability)

– Included measurement and analysis as a ‘common feature’ – briefly described

– Expected product size estimates in planning

– Measurement of processes (problematic at L2, expected at L3)

– SPC-like KPA at L4

Slide 4: …Background…

Carried forward and developed into CMMI

• Staged Representation
  – ML2
    • Measurement and Analysis is a PA in its own right (NB: includes GQM as SP 1.1–1.4)
    • PP requires product sizing
  – ML3
    • OPD SP 1.4: process data defined and collected (using MA)
    • IPM SP 1.2: use historical data for estimating
  – ML4
    • SPC expectations (OPP)
    • Control common causes of variation?
    • Exploited by QPM
  – ML5
    • Quantitative Process Improvement: pervasive measurement
    • CAR uses measurement as an analysis tool
    • OID uses measurement as an analysis tool

Slide 5: …Background

Continuous Representation

– CL2
  • All PAs expected to be monitored and controlled (GP 2.8: measurement is expected)
– CL3
  • Standard process measures [defined and] stored (GP 3.2)
– CL4
  • Quantitative objectives for processes established (GP 4.1)
  • SPC required: stabilize sub-process performance (GP 4.2) – sketched below
    – Control special causes of variation, part of process capability as understood by production engineering
– CL5
  • Establish quantitative process improvement (GP 5.1 sub-practice)
  • Manage common causes too, to enable process capability
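Stabilizing sub-process performance (GP 4.2) is conventionally read as control charting. As a minimal sketch only, here is an XmR (individuals and moving range) limit calculation in Python: the effort data is invented, and 2.66 and 3.268 are the standard XmR chart constants.

# Minimal XmR (individuals / moving-range) limits, the usual starting
# point for stabilizing a sub-process (GP 4.2). Data below is invented.

def xmr_limits(values):
    """Natural process limits for an individuals (X) chart."""
    mean_x = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return {
        "centre": mean_x,
        "lower": mean_x - 2.66 * mr_bar,  # lower natural process limit
        "upper": mean_x + 2.66 * mr_bar,  # upper natural process limit
        "mr_upper": 3.268 * mr_bar,       # upper limit for the mR chart
    }

# Hours of review effort per inspection (invented)
effort = [4.2, 3.8, 5.1, 4.4, 4.9, 3.5, 4.1, 9.8, 4.6, 4.3]
limits = xmr_limits(effort)
signals = [x for x in effort if not limits["lower"] <= x <= limits["upper"]]
print(limits)
print("possible special causes:", signals)  # the 9.8 falls outside the limits

Points outside the natural process limits suggest special causes to investigate; only a process showing no such signals is stable enough for the ML4/CL4 practices to mean anything.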

Slide 6: Measurement within CMMI…

• MA PA is the enabler

• “…is to develop and sustain a measurement capability that is used to support management information needs”

• Interpretation?

Slide 7: …Measurement within CMMI…

• MA Scope
  – SG1 Align Measurement and Analysis Activities (a sketch of such a specification record follows)
    • SP 1.1 Establish Measurement Objectives
    • SP 1.2 Specify Measures
    • SP 1.3 Specify Data Collection and Storage Procedures
    • SP 1.4 Specify Analysis Procedures
  – SG2 Provide Measurement Results
    • SP 2.1 Collect Measurement Data (includes verification)
    • SP 2.2 Analyze Measurement Data
    • SP 2.3 Store Data and Results
    • SP 2.4 Communicate Results (to aid decision making)
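In practice SG1 amounts to specifying each measure before any data is collected. A minimal sketch of what such a specification record might hold, one field per specific practice; the field names and example content are illustrative, not CMMI terminology.

# One record per measure, mirroring MA SG1 (SP 1.1-1.4).
# Field names and example content are illustrative, not CMMI's.
from dataclasses import dataclass

@dataclass
class MeasureSpec:
    objective: str   # SP 1.1: the information need this measure serves
    measure: str     # SP 1.2: precise definition, including the unit
    collection: str  # SP 1.3: who collects it, when, and where it is stored
    analysis: str    # SP 1.4: how the data will be analysed and by whom

inspection_yield = MeasureSpec(
    objective="Decide whether code inspections are worth their cost",
    measure="Major defects found per inspected page (defects/page)",
    collection="Moderator logs the count per meeting; stored in the project workbook",
    analysis="Run chart per project, reviewed quarterly against inspection effort",
)

Keeping the objective on the record is what later allows orphaned data to be traced back to a rationale, or discarded for lack of one.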

Slide 8: …Measurement within CMMI

• MA PA – applicability for ML2
  – “…support management information needs…”
  – Project management (initially)
  – “…at multiple levels in the organization…”
  – and process management (GP 2.8)
  – and products (product components provided by suppliers)

• Not considered explicitly
  – testing? Testers tend to be a measurement ‘centre of excellence’
  – development? Developers don’t (design not amenable to measurement?)

Slide 9: Practical concerns 1

• Fear
  – ‘I don’t want to be measured’

• It is abstract and can be difficult (Pfleeger)
  – Distinguish between metrics designers and metrics users
  – and train accordingly

• Where does it fit?
  – Tactical capability or organizational infrastructure, or a mix?

Slide 10: Practical concerns 2

• What should we measure?
  – RTFM
  – SG1
  – Ask: what do you* need to know? Why? (a GQM sketch follows)

• Who monitors and controls (measures) processes, at ML2 and at ML3?

• Who monitors and controls the MA process?

• Metrics repository: central/organization, or local/project?

* ‘you’, perhaps ‘we’, but not ‘they’
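‘What do you need to know, and why?’ is the Goal–Question–Metric chain in miniature. A small sketch in Python with an invented goal; the content is illustrative, not a recommended measurement set.

# Goal-Question-Metric derivation written out as data (content invented).
gqm = {
    "goal": "Improve the predictability of release dates (project manager view)",
    "questions": {
        "How large is the slip on recent releases?": [
            "planned vs actual release date (calendar days)",
        ],
        "Where does the slip arise?": [
            "planned vs actual duration per phase (days)",
            "unplanned rework tasks per release (count)",
        ],
    },
}

for question, metrics in gqm["questions"].items():
    print(question)
    for metric in metrics:
        print("  ->", metric)

Every metric hangs off a question, and every question off a goal: if a goal is retired, the data collection that served it can be retired too.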

Slide 11: Common Circumstances 1

• Product sizing rarely done, or done well
  – Difficulty with identifying entities and their quantitative attributes
  – Fix: analysts extract identities, counts and attributes of system elements from the developers/estimators – start with the task-based estimation spreadsheet and work upstream: ‘What were they thinking?’

• Organizational measurement infrastructure in place
  – Collects lots of data (‘if it moves, measure it’)
  – No traceability back to objectives (SG1 missing): data is orphaned from its rationale and cannot be reused
  – Fix: discard collected data that has no definitions or rationale, then reverse-engineer objectives (SG1) for the existing data collection systems – this usually exposes an opportunity to shed data collection activity and reduce costs, although it is rarely taken up

• Organizational measurement data is unverified – of unknown accuracy
  – Used for admin/billing
  – Not credible, known to be invalid (timesheets)
  – Not used by its collectors
  – Data is for reporting, not for using
  – Fix: verify the data (presumes SG1) – a minimal screening sketch follows
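The ‘verify data’ fix can start very simply: screen incoming records against their definitions before anyone analyses them. A sketch with invented checks for timesheet-style effort data.

# Simple verification screen for effort records (the checks are invented
# examples; real ones come from the measure definitions, SP 1.2/1.3).
def verify_effort_record(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    hours = record.get("hours")
    if hours is None:
        problems.append("missing hours")
    elif not 0 < hours <= 16:
        problems.append(f"implausible daily effort: {hours}")
    if not record.get("task_id"):
        problems.append("no task id: effort cannot be traced to an objective")
    return problems

records = [
    {"task_id": "T-101", "hours": 7.5},
    {"task_id": "", "hours": 24.0},  # fails both checks
]
for r in records:
    print(r, "->", verify_effort_record(r) or "ok")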

Slide 12: Common Circumstances 2

• Good measurement
  – Developed, owned and used locally, within teams
  – Can be undervalued (seems obvious)
  – MA SG1 implicit

• There is a limited ‘information horizon’
  – Visibility is limited, and may be better that way
  – Measurement data doesn’t travel well
  – ‘Drill down’ is limited – even if it looks like it isn’t

Slide 13: Good Measurement 1

1. Purpose clear and understood by collectors, analysts and decision makers

2. Measures are defined (not just described)

3. Data collectors are users (short feedback loops)

4. Accuracy and validity known (as a minimal requirement)

5. Can stop collecting when no longer needed

Slide 14: Good Measurement 2

1. KISS

   – Minimal arithmetic, especially multiplication and division (this includes percentages)
   – Arithmetic obscures much, reveals little

2. Non-parametric approaches
   – Robust, widely applicable to ‘messy’ software engineering data
   – Not the usual statistical approach

3. Consider ‘Exploratory Data Analysis’ (EDA) – Tukey (a sketch of 2 and 3 follows)

4. Use graphics – but not pie charts
   – Let data show its information content: patterns, trends, outliers – Tufte

5. GQMGtutu
   – Goal, Question, Metric, Graphics – guided by Tufte and Tukey
   – SEI GQ(I)M

6. SPC
   – Later, much later
   – SPC Rule #1: know what you’re doing
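To make points 2 and 3 concrete: a short Python sketch of Tukey’s five-number summary, a resistant, non-parametric first look at messy data. The effort figures are invented, and the quartile convention (medians of the halves excluding the overall median) is one of several in use.

# Tukey-style five-number summary. All data below is invented.
from statistics import median

def five_number_summary(data):
    xs = sorted(data)
    mid = len(xs) // 2
    lower = xs[:mid]                                    # below the median
    upper = xs[mid + 1:] if len(xs) % 2 else xs[mid:]   # above the median
    return min(xs), median(lower), median(xs), median(upper), max(xs)

effort = [4.2, 3.8, 5.1, 4.4, 4.9, 3.5, 4.1, 9.8, 4.6]  # one wild value
lo, q1, med, q3, hi = five_number_summary(effort)
print(f"min={lo}  Q1={q1}  median={med}  Q3={q3}  max={hi}")
# Change the 9.8 to 98 and the median and quartiles do not move at all:
# resistant summaries survive the outliers that distort means and sigmas.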


Slide 16

OXFORD SOFTWARE ENGINEERING LIMITED

9 Spinners Court, 53 West End,
Witney,
Oxfordshire
OX28 1NH
www.osel.co.uk
[email protected]
Tel. +44 (0) 1993 700878


Slide 18: Supplementary Material…

Slides 19–24: Empirical relational system (from Pfleeger 1998)

[A diagram built up across six slides: measurement maps the empirical relational system (the real world) into a formal relational system (the mathematical world). Mathematics and statistics turn the formal system into results; interpretation turns the results into relevant empirical information about the real world, which drives decisions and actions. The final slide closes the loop: refined measurement and improved interpretation yield better decisions and actions.]
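Behind the diagram sits the representation condition: relations among the numbers must mirror the empirical relations they stand for. A toy check in Python; the module names, judgements and counts are all invented.

# Representation condition, toy version: 'a is judged harder to review
# than b' must hold exactly when the measure says loc[a] > loc[b].
# All data invented.
judged_harder = {("parser", "api"), ("parser", "ui"), ("api", "ui")}
loc = {"parser": 480, "api": 250, "ui": 120}  # candidate measure: size in LOC

valid = all(
    (loc[a] > loc[b]) == ((a, b) in judged_harder)
    for a in loc for b in loc if a != b
)
print("LOC represents 'harder to review':", valid)  # True for this toy data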


Slide 26

          I              II             III            IV
     X       Y      X       Y      X       Y      X       Y
  10.00    8.04  10.00    9.14  10.00    7.46   8.00    6.58
   8.00    6.95   8.00    8.14   8.00    6.77   8.00    5.76
  13.00    7.58  13.00    8.74  13.00   12.74   8.00    7.71
   9.00    8.81   9.00    8.77   9.00    7.11   8.00    8.84
  11.00    8.33  11.00    9.26  11.00    7.81   8.00    8.47
  14.00    9.96  14.00    8.10  14.00    8.84   8.00    7.04
   6.00    7.24   6.00    6.13   6.00    6.08   8.00    5.25
   4.00    4.26   4.00    3.10   4.00    5.39  19.00   12.50
  12.00   10.84  12.00    9.13  12.00    8.15   8.00    5.56
   7.00    4.82   7.00    7.26   7.00    6.42   8.00    7.91
   5.00    5.68   5.00    4.74   5.00    5.73   8.00    6.89

‘Anscombe’s Quartet’ – American Statistician 1973

Slide 27

Identical summary statistics for all four datasets (reproduced in the sketch below):

N = 11
Mean of X’s = 9.0
Mean of Y’s = 7.5
Regression line: Y = 3 + 0.5X
Standard error of estimate of slope = 0.118
t = 4.24
Sum of squares of (X − X̄) = 110.0
Regression sum of squares = 27.50
Residual sum of squares of Y = 13.75
Correlation coefficient = 0.82
R² = 0.67
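These figures are easy to reproduce. A short Python check against dataset I from the previous slide; datasets II–IV give the same answers to the quoted precision.

# Recompute the shared statistics for Anscombe's dataset I (swap in the
# x, y columns of datasets II-IV from the table to get the same output).
x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

slope = sxy / sxx                 # 0.50
intercept = my - slope * mx       # 3.00
r = sxy / (sxx * syy) ** 0.5      # 0.82

print(f"mean x = {mx:.1f}, mean y = {my:.2f}")
print(f"regression: y = {intercept:.2f} + {slope:.2f}x")
print(f"r = {r:.2f}, R^2 = {r * r:.2f}")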

Slide 28

[Four scatter plots, one per dataset: I is roughly linear with scatter; II follows a smooth curve; III is tightly linear apart from one outlier; IV has every X at 8.00 except a single point at X = 19.00. Identical statistics, four very different pictures.]


Slide 30: Process Capability

Indices for measuring process goodness (sketched in code below)

• Cp = (USL − LSL) / 6σ, or 2T / 6σ
  – Cp < 1: process is incapable
  – Cp > 1: process is capable (6σ processes have a Cp of 2)
  – does not account for process drift, so...

• Cpk = the lesser of (USL − X̄) / 3σ and (X̄ − LSL) / 3σ
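The two indices transcribe directly into code. A sketch in Python; the specification limits, mean and sigma are invented example values.

# Process capability indices as defined above. Example values invented.
def cp(usl, lsl, sigma):
    """Potential capability: spec width over natural (6-sigma) process width."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Actual capability: penalizes a mean that has drifted off-centre."""
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

usl, lsl = 16.0, 4.0     # specification limits
mean, sigma = 11.5, 1.5  # observed process mean and standard deviation

print(f"Cp  = {cp(usl, lsl, sigma):.2f}")          # 1.33: potentially capable
print(f"Cpk = {cpk(usl, lsl, mean, sigma):.2f}")   # 1.00: drift eats the margin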
