
Notes_010 SE 3730 / CS 5730 – Software Quality

1 Standards and Certifications

SEI CMMI: http://www.sei.cmu.edu/cmm/
Reference: CMMI, 2nd Edition, Chrissis, Konrad, and Shrum, 2007, Addison-Wesley.

1.1 Prior to CMMI there was CMM.

CMM is the Capability Maturity Model; CMMI is the Capability Maturity Model Integration. The first release of CMMI (v1.02) came in 2000. It took a few years after the standard was published for the first CMMI Level 5 company to appear.

1.2 CMMI Certification

Why is it important? It is a technical qualification for government preferred-vendor lists, and it hopefully results in better quality software.


CMU-SEI: improvement data from 30 different organizations (chart not reproduced here).

CMU-SEI: Motorola Global Systems Group Russia, a Level 5 Maturity organization (chart not reproduced here).


CMU-SEI: drop in defects (chart not reproduced here).

CMU-SEI: Warner Robins Logistics Center schedule performance (chart not reproduced here).


1.3 CMMI Process Categories and Areas

Humphrey observed during his research that successful organizations share a common set of capabilities.

He classified these capabilities into 25 process areas, which can be arranged into four Process Categories:

o Project Management,
o Process Management,
o Engineering, and
o Support.

Each Process Area has Specific Goals (SGs) – these are characteristics that must be present to satisfy the Process Area.

Each Specific Goal has Specific Practices (SPs) – these are activities that are expected to result in achievement of the SGs; in other words, they are the evidence that the SGs have been met.

There can also be Generic Goals (GGs) and their Generic Practices (GPs) – these are like SGs and SPs except they are not unique to a single Process Area.

Each Specific Practice can have Typical Work Products – these describe typical outputs of the Specific Practice.

Each Specific Practice can also have Subpractices, which provide guidance to help interpret and implement the goals and practices. A small sketch of this hierarchy follows.
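The Process Area / Goal / Practice hierarchy is easier to see as a data structure. Below is a minimal Python sketch (not part of CMMI itself; the class and field names are ours) that models a fragment of Project Planning:

    # Minimal sketch: the Process Area -> Specific Goal -> Specific Practice
    # hierarchy modeled as plain Python data classes. Names are illustrative.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SpecificPractice:
        id: str                      # e.g., "SP 1.1"
        name: str
        work_products: List[str] = field(default_factory=list)  # Typical Work Products
        subpractices: List[str] = field(default_factory=list)

    @dataclass
    class SpecificGoal:
        id: str                      # e.g., "SG 1"
        name: str
        practices: List[SpecificPractice] = field(default_factory=list)

    @dataclass
    class ProcessArea:
        abbrev: str                  # e.g., "PP"
        name: str
        category: str                # one of the four Process Categories
        goals: List[SpecificGoal] = field(default_factory=list)

    # Example: a fragment of Project Planning (PP)
    pp = ProcessArea("PP", "Project Planning", "Project Management", [
        SpecificGoal("SG 1", "Establish Estimates", [
            SpecificPractice("SP 1.1", "Estimate the Scope of the Project"),
            SpecificPractice("SP 1.4", "Determine Estimates of Effort and Cost"),
        ]),
    ])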

Eventually the 25 process areas were narrowed to 23, and then to the 22 Process Areas of V1.2 shown below, along with the Process Category to which each belongs.


Maturity Level 1: Initial
(no process areas are associated with Level 1)

Maturity Level 2: Managed
Project Management: Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM)
Engineering: Requirements Management (REQM)
Support: Configuration Management (CM), Process and Product Quality Assurance (PPQA), Measurement and Analysis (MA)

Maturity Level 3: Defined
Process Management: Organizational Process Focus (OPF), Organizational Process Definition (OPD) + Integrated Product and Process Development (IPPD), Organizational Training (OT)
Project Management: Integrated Project Management (IPM) + Integrated Product and Process Development (IPPD), Risk Management (RSKM)
Engineering: Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), Validation (VAL)
Support: Decision Analysis and Resolution (DAR)

Maturity Level 4: Quantitatively Managed
Process Management: Organizational Process Performance (OPP)
Project Management: Quantitative Project Management (QPM)

Maturity Level 5: Optimizing
Process Management: Organizational Innovation and Deployment (OID)
Support: Causal Analysis and Resolution (CAR)

1.4 Maturity and Capability Levels

Maturity starts at level one and culminates in level five. CMMI has the following Maturity Levels:

Level 5 – Optimizing

Level 4 – Quantitatively Managed

Level 3 – Defined

Level 2 – Managed

Level 1 – Initial

With each increasing level, the risk of building the wrong product decreases, and quality and productivity improve. All maturity levels consist of specific and generic goals and practices for the process areas.


There are six Capability levels:

Level 5 – Optimizing

Level 4 – Quantitatively Managed

Level 3 – Defined

Level 2 – Managed

Level 1 – Performed

Level 0 – Incomplete

The capability levels are applied to each process area. Each process area runs through the capability levels from zero to five.

In contrast to the maturity levels, the capability levels for a process area can progress through levels independently of other process areas.

o Ex: A company can be at different capability levels for Project Management, Process Management, Engineering, and Support. The maturity level is based on the lowest capability level of the organization.

At level zero, an organization does not possess or apply any part of a process area; such an incomplete process area is at capability level zero.

Two Approaches to using CMMI, Continuous and Staged:

o Continuous: An organization can progress on each Process Category and Process Area in a way that optimizes the impact to their organization. An organization is described at a level for each Process Category. For instance, they might be Level 2 for Project Management and Level 3 for Engineering.

o Staged: An organization is rated on all Process Categories and Areas; the rating is based on the lowest level it has obtained across the Process Areas (a minimal sketch of this follows).
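A minimal sketch of the staged idea, using made-up capability levels for the four Process Categories:

    # Sketch: the staged rating is driven by the weakest area.
    # The capability levels below are invented example data.
    capability = {
        "Project Management": 3,
        "Process Management": 2,
        "Engineering": 3,
        "Support": 2,
    }

    staged_rating = min(capability.values())   # lowest level caps the staged rating
    print(f"Continuous view: {capability}")
    print(f"Staged rating limited to Level {staged_rating}")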

2 Maturity Level Descriptions

The below material is from a chart published by SEI, CMMI-SE/SW/IPPD/SS V1.2 2007

2.1 Level 1: Initial

There is no formal process, and success can be attributed to the heroics of a few engineers. (For $10 I’ll certify anyone at Level 1.)


The first level, the initial level, does not really involve improving the quality of software. Usually, at this level, conditions are chaotic and few, if any, processes are defined.

There is no organized management for the project.

Success of a project depends on individual heroes, and the project cannot be repeated in the same way again.

Few general tools are used for projects on this level.

At this level no measures are taken to control and plan projects and their success.

At this level projects often run over time and budget.

2.2 Level 2: Managed

At Level 2, Managed, there is a minimal process, and the status of projects is visible to management at major milestones. The process varies from project to project.

Repeatable: basic project management processes are established to track cost, schedule, and functionality.

The key process areas at Level 2 focus on the software project's concerns related to establishing basic project management controls. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.

The managed level focuses on organizing project management and project processes. The scope of the managed level is restricted to individual projects – not all projects within an organization.

The processes which need to be used for a project are determined and applied.

The success of these processes is measured and controlled during the entire project.

These projects work with milestones to make the project’s results and success visible to management at given times (there is no real-time visibility of performance).

Work results are regularly reviewed.

Products as well as services conform to established standards and procedures.


Level 2: Managed
Process Management: (none – see note below)
Project Management: Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM)
Engineering: Requirements Management (REQM)
Support: Configuration Management (CM), Process and Product Quality Assurance (PPQA), Measurement and Analysis (MA)

Note that there is NO Process Management needed here. This is because each project does its own thing.

2.2.1 Level 2 Project Management

Project Planning (PP)

SG 1: Establish Estimates
o SP 1.1 Estimate the Scope of the Project
o SP 1.2 Establish Estimates of Work Product and Task Attributes
o SP 1.3 Define Project Lifecycle
o SP 1.4 Determine Estimates of Effort and Cost

SG 2: Develop a Project Plan
o SP 2.1 Establish the Budget and Schedule
o SP 2.2 Identify Project Risks
o SP 2.3 Plan for Data Management
o SP 2.4 Plan for Project Resources
o SP 2.5 Plan for Needed Knowledge and Skills
o SP 2.6 Plan Stakeholder Involvement
o SP 2.7 Establish the Project Plan

SG 3: Obtain Commitment to the Plan
o SP 3.1 Review Plans That Affect the Project
o SP 3.2 Reconcile Work and Resource Levels
o SP 3.3 Obtain Plan Commitment

Estimates are documented for planning and tracking. Project activities are planned and documented. (One common estimation technique consistent with SP 1.4 is sketched below.)
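CMMI does not mandate any particular estimation technique for SP 1.4. One common choice is a three-point (PERT) estimate; the task names and hours below are invented for illustration:

    # Hypothetical effort estimate using the three-point (PERT) formula:
    # expected = (optimistic + 4*most_likely + pessimistic) / 6
    tasks = {                     # hours: (optimistic, most likely, pessimistic)
        "requirements": (40, 60, 100),
        "design":       (80, 120, 200),
        "coding":       (120, 180, 300),
        "testing":      (100, 160, 280),
    }

    total = 0.0
    for name, (o, m, p) in tasks.items():
        expected = (o + 4 * m + p) / 6
        total += expected
        print(f"{name:13s} expected effort: {expected:6.1f} h")
    print(f"total estimated effort: {total:.1f} h")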

Project Monitoring and Control (PMC)

SG 1: Monitor Project Against Plan
o SP 1.1 Monitor Project Planning Parameters
o SP 1.2 Monitor Commitments
o SP 1.3 Monitor Project Risks
o SP 1.4 Monitor Data Management
o SP 1.5 Monitor Stakeholder Involvement
o SP 1.6 Conduct Progress Reviews


o SP 1.7 Conduct Milestone Reviews

SG 2: Manage Corrective Actions to Closure
o SP 2.1 Analyze Issues
o SP 2.2 Take Corrective Actions
o SP 2.3 Manage Corrective Actions

Results and performance are tracked against the plan. Changes to commitments are agreed to by affected groups. Corrective actions are applied when the project deviates from the plan (one common tracking technique, earned value, is sketched below).
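One common way to track results and performance against the plan is earned value (the AVISTA appraisal later in these notes mentions broad use of Earned Value). A minimal sketch with invented numbers:

    # Sketch of earned-value tracking, one way to monitor a project against
    # plan. All values below are invented.
    BCWS = 500.0   # budgeted cost of work scheduled (planned value), hours
    BCWP = 430.0   # budgeted cost of work performed (earned value), hours
    ACWP = 470.0   # actual cost of work performed, hours

    cpi = BCWP / ACWP   # cost performance index; < 1.0 means over budget
    spi = BCWP / BCWS   # schedule performance index; < 1.0 means behind schedule
    print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
    if cpi < 1.0 or spi < 1.0:
        print("Deviation from plan: corrective action needed (PMC SG 2)")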

Supplier Agreement Management (SAM)

SG 1: Establish Supplier Agreements
o SP 1.1 Determine Acquisition Type
o SP 1.2 Select Suppliers
o SP 1.3 Establish Supplier Agreements

SG 2: Satisfy Supplier Agreements
o SP 2.1 Execute the Supplier Agreement
o SP 2.2 Monitor Selected Supplier Processes
o SP 2.3 Evaluate Selected Supplier Work Products
o SP 2.4 Accept the Acquired Product
o SP 2.5 Transition Products

Manage the acquisition and quality of components from outside the company.

2.2.2 Level 2 Engineering

Requirements Management (REQM)

SG 1: Manage Requirements
o SP 1.1 Obtain an Understanding of Requirements
o SP 1.2 Obtain Commitment to Requirements
o SP 1.3 Manage Requirements Changes
o SP 1.4 Maintain Bidirectional Traceability of Requirements (see the sketch below)
o SP 1.5 Identify Inconsistencies Between Project Work and Requirements
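A minimal sketch of bidirectional traceability (SP 1.4): keep forward links from each requirement to its work products and tests, derive the backward links, and flag untraced requirements (an inconsistency per SP 1.5). All identifiers are invented:

    # Sketch of bidirectional requirements traceability. Forward links map a
    # requirement to the artifacts that implement and verify it.
    forward = {
        "REQ-1": ["design.md#3.1", "module_a.py", "TC-101"],
        "REQ-2": ["design.md#3.2", "module_b.py", "TC-102", "TC-103"],
        "REQ-3": [],                      # no links yet -> inconsistency (SP 1.5)
    }

    # Derive the backward direction: artifact -> requirements it satisfies.
    backward = {}
    for req, artifacts in forward.items():
        for art in artifacts:
            backward.setdefault(art, []).append(req)

    untraced = [r for r, arts in forward.items() if not arts]
    print("backward links:", backward)
    print("requirements with no traceability:", untraced)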

2.2.3 Level 2 Support

Configuration Management (CM)

SG 1: Establish Baselines
o SP 1.1 Identify Configuration Items
o SP 1.2 Establish a Configuration Management System
o SP 1.3 Create or Release Baselines

SG 2: Track and Control Changes


o SP 2.1 Track Change Requests
o SP 2.2 Control Configuration Items

SG 3: Establish Integrity
o SP 3.1 Establish Configuration Management Records
o SP 3.2 Perform Configuration Audits

Process and Product Quality Assurance (PPQA)

SG 1: Objectively Evaluate Processes and Work Products
o SP 1.1 Objectively Evaluate Processes
o SP 1.2 Objectively Evaluate Work Products and Services

SG 2: Provide Objective Insight
o SP 2.1 Communicate and Ensure Resolution of Noncompliance Issues
o SP 2.2 Establish Records

Measurement and Analysis (MA)

SG 1: Align Measurement and Analysis Activities
o SP 1.1 Establish Measurement Objectives
o SP 1.2 Specify Measures
o SP 1.3 Specify Data Collection and Storage Procedures
o SP 1.4 Specify Analysis Procedures

SG 2: Provide Measurement Results
o SP 2.1 Collect Measurement Data
o SP 2.2 Analyze Measurement Data
o SP 2.3 Store Data and Results
o SP 2.4 Communicate Results

Specify measures tied to objectives, then collect, analyze, store, and communicate the data (a small sketch follows).
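A minimal sketch of the MA flow with an invented measure and invented data:

    # Sketch of the MA flow: specify a measure, collect data, analyze,
    # communicate. The measure definition and samples are illustrative only.
    measure = {
        "name": "defect density",
        "unit": "defects/KSLOC",
        "objective": "track product quality release to release",
    }

    samples = [3.8, 4.4, 2.9, 5.1, 3.5]    # collected per release (SP 2.1)

    mean = sum(samples) / len(samples)     # analyze (SP 2.2)
    worst = max(samples)
    print(f"{measure['name']}: mean {mean:.2f} {measure['unit']}, worst {worst:.1f}")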

2.3 Level 3: Defined

At Level 3, Defined, there are organization-wide standards, procedures, tools, and methods.

Level three is known as the defined level and requires maturity level two as a precondition. It works on coordination across groups in the organization.

The software process for both management and engineering activities is
o documented,
o standardized, and
o integrated.

The processes are defined and applied organization-wide and can be tailored from the standard process to a project’s special requirements. This is an important distinction from level two.

The project is not just managed, but the processes are exactly defined and described in detail.

The process is supported by tools and suitable methods.


Processes are consistent across the entire organization and only tailoring allows some differences between processes used in different projects.

Goals of processes are derived from the organization’s standard processes, and their success is monitored, as is the process itself and its product.

Level 3: Defined
Process Management: Organizational Process Focus (OPF), Organizational Process Definition (OPD) + Integrated Product and Process Development (IPPD), Organizational Training (OT)
Project Management: Integrated Project Management (IPM) + Integrated Product and Process Development (IPPD), Risk Management (RSKM)
Engineering: Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), Validation (VAL)
Support: Decision Analysis and Resolution (DAR)

2.3.1 Level 3 Process Management

Organizational Process Focus (OPF)

SG 1: Determine Process Improvement Opportunities
o SP 1.1 Establish Organizational Process Needs
o SP 1.2 Appraise the Organization’s Processes
o SP 1.3 Identify the Organization’s Process Improvements

SG 2: Plan and Implement Process Improvement
o SP 2.1 Establish Process Action Plans
o SP 2.2 Implement Process Action Plans

SG 3: Deploy Organizational Process Assets and Incorporate Lessons Learned
o SP 3.1 Deploy Organizational Process Assets
o SP 3.2 Deploy Standard Processes
o SP 3.3 Monitor Implementation
o SP 3.4 Incorporate Process-Related Experiences into the Organizational Process Assets

Plan and implement organizational process improvements based on understood strengths and weaknesses of the organization and process assets.

Organizational Process Definition (OPD) + Integrated Product and Process Development (IPPD)

SG 1: Establish Organizational Process Assets
o SP 1.1 Establish Standard Processes
o SP 1.2 Establish Lifecycle Model Descriptions
o SP 1.3 Establish Tailoring Criteria and Guidelines
o SP 1.4 Establish the Organization’s Measurement Repository


o SP 1.5 Establish the Organization’s Process Asset Library
o SP 1.6 Establish Work Environment Standards

SG 2: Enable IPPD Management
o SP 2.1 Establish Empowerment Mechanisms
o SP 2.2 Establish Rules and Guidelines for Integrated Teams
o SP 2.3 Balance Team and Home Organization Responsibilities

Develop and maintain a software process standard for the organization – these are the process assets.

Create an environment that enables integrated teams to efficiently meet the project’s requirements and produce quality work.

Organizational Training (OT)

SG 1: Establish an Organizational Training Capability
o SP 1.1 Establish the Strategic Training Needs
o SP 1.2 Determine Which Training Needs Are the Responsibility of the Organization
o SP 1.3 Establish an Organizational Training Tactical Plan
o SP 1.4 Establish Training Capability

SG 2: Provide Necessary Training
o SP 2.1 Deliver Training
o SP 2.2 Establish Training Records
o SP 2.3 Assess Training Effectiveness

People are trained in the process and the needed technologies so that they can use them effectively.

2.3.2 Level 3 Project Management

Integrated Project Management (IPM) + Integrated Product and Process Development (IPPD)

SG 1: Use the Project’s Defined Process
o SP 1.1 Establish the Project’s Defined Process
o SP 1.2 Use Organizational Process Assets for Planning Project Activities
o SP 1.3 Establish the Project’s Work Environment
o SP 1.4 Integrate Plans
o SP 1.5 Manage the Project Using the Integrated Plans
o SP 1.6 Contribute to the Organizational Process Assets

SG 2: Coordinate and Collaborate with Relevant Stakeholders
o SP 2.1 Manage Stakeholder Involvement
o SP 2.2 Manage Dependencies
o SP 2.3 Resolve Coordination Issues

SG 3: Apply IPPD Principles (Integrated Product and Process Development)
o SP 3.1 Establish the Project’s Shared Vision
o SP 3.2 Establish the Integrated Team Structure


o SP 3.3 Allocate Requirements to Integrated Teams
o SP 3.4 Establish Integrated Teams
o SP 3.5 Ensure Collaboration among Interfacing Teams

Integrate and define a process that involves relevant stakeholders. Software process activities are coordinated – every group has a responsibility in the project.

Risk Management (RSKM)

SG 1: Prepare for Risk Management
o SP 1.1 Determine Risk Sources and Categories
o SP 1.2 Define Risk Parameters
o SP 1.3 Establish a Risk Management Strategy

SG 2: Identify and Analyze Risks
o SP 2.1 Identify Risks
o SP 2.2 Evaluate, Categorize, and Prioritize Risks

SG 3: Mitigate Risks
o SP 3.1 Develop Risk Mitigation Plans
o SP 3.2 Implement Risk Mitigation Plans

Identify potential problems before they occur. Risk-handling activities are planned throughout the product’s lifecycle to address potential risks (a small prioritization sketch follows).
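CMMI does not prescribe a particular risk model; one common approach to SG 2 is to rank risks by exposure = probability × impact. The risks and numbers below are invented:

    # Sketch of risk prioritization: exposure = probability x impact.
    risks = [
        ("key supplier slips delivery", 0.3, 400),  # (risk, probability, impact hours)
        ("requirements churn on UI",    0.6, 150),
        ("test rig unavailable",        0.2, 80),
    ]

    # Highest exposure first: these get mitigation plans first (SG 3).
    for name, prob, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
        print(f"exposure {prob * impact:6.1f} h  {name}")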

2.3.3 Level 3 Engineering

Requirements Development (RD)

SG 1: Develop Customer Requirements
o SP 1.1 Elicit Needs
o SP 1.2 Develop Customer Requirements

SG 2: Develop Product Requirements
o SP 2.1 Establish Product and Product Component Requirements
o SP 2.2 Allocate Product Component Requirements
o SP 2.3 Identify Interface Requirements

SG 3: Analyze and Validate Requirements
o SP 3.1 Establish Operational Concepts and Scenarios
o SP 3.2 Establish a Definition of Required Functionality
o SP 3.3 Analyze Requirements
o SP 3.4 Analyze Requirements to Achieve Balance
o SP 3.5 Validate Requirements

Technical Solution (TS)

SG 1: Select Product Component Solutions
o SP 1.1 Develop Alternative Solutions and Selection Criteria
o SP 1.2 Select Product Component Solutions

©2011 Mike Rowe Page 13 5/12/23

Notes_010 SE 3730 / CS 5730 – Software Quality

SG 2: Develop the Design
o SP 2.1 Design the Product or Product Component
o SP 2.2 Establish a Technical Data Package
o SP 2.3 Design Interfaces Using Criteria
o SP 2.4 Perform Make, Buy, or Reuse Analyses

SG 3: Implement the Product Design
o SP 3.1 Implement the Design
o SP 3.2 Develop Product Support Documentation

Develop, design and implement solutions to requirements.

Product Integration (PI)

SG 1: Prepare for Product Integration
o SP 1.1 Determine Integration Sequence
o SP 1.2 Establish the Product Integration Environment
o SP 1.3 Establish Product Integration Procedures and Criteria

SG 2: Ensure Interface Compatibility
o SP 2.1 Review Interface Descriptions for Completeness
o SP 2.2 Manage Interfaces

SG 3: Assemble Product Components and Deliver the Product
o SP 3.1 Confirm Readiness of Product Components for Integration
o SP 3.2 Assemble Product Components
o SP 3.3 Evaluate Assembled Product Components
o SP 3.4 Package and Deliver the Product or Product Component

Product components are assembled, the assembled product is checked to ensure it functions properly, and the product is delivered to customers.

Verification (VER)

SG 1: Prepare for Verification
o SP 1.1 Select Work Products for Verification
o SP 1.2 Establish the Verification Environment
o SP 1.3 Establish Verification Procedures and Criteria

SG 2: Perform Peer Reviews
o SP 2.1 Prepare for Peer Reviews
o SP 2.2 Conduct Peer Reviews
o SP 2.3 Analyze Peer Review Data

SG 3: Verify Selected Work Products
o SP 3.1 Perform Verification
o SP 3.2 Analyze Verification Results

Validation (VAL)

SG 1: Prepare for Validation
o SP 1.1 Select Products for Validation


o SP 1.2 Establish the Validation Environment
o SP 1.3 Establish Validation Procedures and Criteria

SG 2: Validate Product or Product Component
o SP 2.1 Perform Validation
o SP 2.2 Analyze Validation Results

2.3.4 Level 3 Support

Decision Analysis and Resolution (DAR)

SG 1: Evaluate Alternatives
o SP 1.1 Establish Guidelines for Decision Analysis
o SP 1.2 Establish Evaluation Criteria
o SP 1.3 Identify Alternative Solutions
o SP 1.4 Select Evaluation Methods
o SP 1.5 Evaluate Alternatives
o SP 1.6 Select Solutions

Analyze possible decisions using a formal evaluation process that takes into account the impact of alternative solutions (a small sketch follows).
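A minimal sketch of such a formal evaluation: a weighted-criteria decision matrix. The criteria, weights, alternatives, and scores are all invented:

    # Sketch of a DAR-style evaluation: weighted scoring of alternatives.
    criteria = {"cost": 0.4, "performance": 0.35, "maintainability": 0.25}
    alternatives = {
        "build in-house": {"cost": 2, "performance": 5, "maintainability": 4},
        "buy COTS":       {"cost": 4, "performance": 3, "maintainability": 3},
        "reuse legacy":   {"cost": 5, "performance": 2, "maintainability": 2},
    }

    def weighted_score(scores):
        # Sum of weight * score over all evaluation criteria.
        return sum(criteria[c] * scores[c] for c in criteria)

    best = max(alternatives, key=lambda a: weighted_score(alternatives[a]))
    for alt, scores in alternatives.items():
        print(f"{alt:15s} score {weighted_score(scores):.2f}")
    print("selected:", best)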

2.4 Level 4: Quantitatively Managed

At the quantitatively managed level, level four, all objectives of the lower levels have already been achieved. Additionally, subprocesses are introduced.

The performance and quality of processes are controlled using quantitative and statistical techniques during the entire lifecycle of the processes.

Processes at this level are constantly measured and statistically analyzed.

The results of this control are used to better manage the processes and projects. This makes the process quantitatively predictable. In comparison, at level three process quality can only be predicted, not quantitatively known, during the project.

Customer and user needs are the foundation for quantitative goals.

At this level it is important to detect differences between the defined process and the process actually applied during the project, and to correct the reasons for those differences.

All this has to happen at the organization level, not just the project level, to avoid variations in future projects and to support further decisions.


Level 4: Quantitatively Managed
Process Management: Organizational Process Performance (OPP)
Project Management: Quantitative Project Management (QPM)
(no new Engineering or Support process areas are added at this level)

2.4.1 Level 4 Process Management

Organizational Process Performance (OPP)

SG 1: Establish Performance Baselines and Models
o SP 1.1 Select Processes
o SP 1.2 Establish Process Performance Measures
o SP 1.3 Establish Quality and Process Performance Objectives
o SP 1.4 Establish Process Performance Baselines
o SP 1.5 Establish Process Performance Models

Establish and maintain a quantitative understanding of performance of the organization’s standard process in support of quality and process performance objectives.

Provide performance data, baselines, and models to quantitatively manage people and projects (a baseline sketch follows).
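A minimal sketch of one kind of process performance baseline: a center line and 3-sigma control limits computed from historical data. The measure and the data are invented:

    # Sketch of a process performance baseline (OPP): center line and
    # 3-sigma control limits from historical peer-review data.
    from statistics import mean, stdev

    review_speed = [210, 190, 240, 205, 230, 185, 220, 215]   # LOC/hour in reviews

    center = mean(review_speed)
    sigma = stdev(review_speed)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma   # upper/lower control limits
    print(f"baseline: {center:.1f} LOC/h, limits [{lcl:.1f}, {ucl:.1f}]")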

2.4.2 Level 4 Project Management

Quantitative Project Management (QPM)

SG 1: Quantitatively Manage the Project
o SP 1.1 Establish the Project’s Objectives
o SP 1.2 Compose the Defined Process
o SP 1.3 Select the Subprocesses that Will Be Statistically Managed
o SP 1.4 Manage Project Performance

SG 2: Statistically Manage Subprocess Performance
o SP 2.1 Select Measures and Analytic Techniques
o SP 2.2 Apply Statistical Methods to Understand Variation
o SP 2.3 Monitor Performance of the Selected Subprocesses
o SP 2.4 Record Statistical Management Data

Quantitatively manage the project’s defined process to achieve the project’s quality and process objectives (a sketch of flagging subprocess variation follows).
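Continuing the sketch above: a project compares new subprocess observations against organizational baseline limits like those computed in the OPP sketch, and flags out-of-control points. All values are assumed:

    # Sketch of statistically managing a subprocess (QPM SG 2): flag
    # observations outside the baseline control limits.
    baseline = {"center": 212.0, "lcl": 150.0, "ucl": 274.0}   # assumed values

    observations = [205, 260, 320, 190]   # new project data; 320 is out of control
    for i, x in enumerate(observations, 1):
        status = "in control" if baseline["lcl"] <= x <= baseline["ucl"] else "OUT OF CONTROL"
        print(f"sample {i}: {x} -> {status}")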


2.5 Level 5: Optimizing

At level five, the optimizing level, all processes are already defined and managed; the goals of levels one through four have all been achieved.

The purpose of this level is to continually improve the performance of all practices of all processes, based on quantitative improvement. Improvements can be incremental as well as based on innovative technologies.

The organization establishes goals for improving processes. These goals are continually reviewed and changed if necessary; this supports changes to the business goals used for improving the processes.

After those improvements are introduced, they are continuously measured, and their quantitative contribution to process improvement is assessed.

In level four and level five, there are still variations in processes. However, the important difference between level four and level five is that at level five the reasons for the variations are measured and analyzed in order to optimize the processes. Results from those measurements and analyses are used to change the process so that the given quantitative process goals can still be achieved.

Level 5: Optimizing
Process Management: Organizational Innovation and Deployment (OID)
Support: Causal Analysis and Resolution (CAR)
(no new Project Management or Engineering process areas are added at this level)

2.5.1 Level 5 Process Management

Organizational Innovation and Deployment (OID)

SG 1: Select Improvements
o SP 1.1 Collect and Analyze Improvement Proposals
o SP 1.2 Identify and Analyze Innovations
o SP 1.3 Pilot Improvements
o SP 1.4 Select Improvements for Deployment

SG 2: Deploy Improvements
o SP 2.1 Plan the Deployment
o SP 2.2 Manage the Deployment
o SP 2.3 Measure Improvement Effects

Select and deploy incremental and innovative improvements that measurably improve the organization’s processes.


2.5.2 Level 5 Support

Causal Analysis and Resolution (CAR)

SG 1: Determine Causes of Defects
o SP 1.1 Select Defect Data for Analysis
o SP 1.2 Analyze Causes

SG 2: Address Causes of Defects
o SP 2.1 Implement the Action Proposals
o SP 2.2 Evaluate the Effect of Changes
o SP 2.3 Record Data

Identify defects and their causes. Take measures to prevent the causes of the defects from recurring in the future (a Pareto-style sketch follows).
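A minimal sketch of SP 1.1/1.2: a Pareto view of defect causes, used to select which causes to address first. The cause names and counts are invented:

    # Sketch of causal analysis (CAR SG 1): rank defect causes and show the
    # cumulative share, Pareto style, to pick the biggest causes first.
    from collections import Counter

    causes = Counter({
        "ambiguous requirement": 34,
        "missed boundary case": 21,
        "interface mismatch": 12,
        "config error": 5,
    })

    total = sum(causes.values())
    cumulative = 0
    for cause, n in causes.most_common():
        cumulative += n
        print(f"{cause:22s} {n:3d}  cumulative {100 * cumulative / total:5.1f}%")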

2.6 Example of a Level 5

Lockheed Martin Federal Systems, Owego has attained the highest rating of a company's software development capability, Level 5, from the Carnegie Mellon Software Engineering Institute (SEI).

In 2003, only three companies world-wide were rated CMM Level 5 (CMM is the predecessor to CMMI).

The following link provides queries by level and year for CMMI-certified companies: http://sas.sei.cmu.edu/pars/pars.aspx

2.6.1 Esterline Control Systems – AVISTA

In July 2007, AVISTA became one of 22 companies in the US to achieve CMMI Level 5, under the v1.1 standard.

Every three years a company must be re-evaluated. AVISTA was re-evaluated in May 2010 and was recertified CMMI Level 5 under the v1.2 standard.

If you are Level 5, the re-evaluation must indicate that you are continuing to optimize your process and performance.

o If you only hold your ground, you are no longer considered Level 5.

Below is the 2007 SCAMPI Evaluation for AVISTA from the above link.


Organization Name: AVISTA, Incorporated

Appraisal Sponsor Name: Thomas Bragg, Tim Budden

Lead Appraiser Name: Johnny Childs

SEI Partner Name: cLear Improvement & Associates, LLC

Organizational Unit Description – Projects/Support Groups: five entries, each listed as "**Sensitive: Platteville, WI United States"

Organizational Sample Size
% of people included: 33

% of projects included: 7

Org Scope Description:

For the purpose of the appraisal 3 full lifecycle projects were selected as focus projects and 2 verification projects were selected as non-focus projects (VER, VAL, and CM were mapped).

There are presently 137 employees and 80 active projects at AVISTA totalling 220,477 hours. The sample projects comprise 33.8% of the engineering staff and 26.4% (58,049) of the hours.

Appraisal Description
Appraisal End Date: Jul 27, 2007

Appraisal Expiration Date: Jul 27, 2010

Appraisal Method Used: SEI SCAMPI v1.2 A

CMMI Model: CMMI v1.1


Appraised Functional Areas Included:

Model Scope and Appraisal Ratings:

Level 2: REQM, PP, PMC, SAM, MA, PPQA, CM – all Satisfied

Level 3: RD, TS, PI, VER, VAL, OPF, OPD, OT, IPM, RSKM, DAR – Satisfied; IT, ISM, OEI – Not Rated

Level 4: OPP, QPM – Satisfied

Level 5: OID, CAR – Satisfied

Organizational Unit Maturity Level Rating: 5

Additional Information for Appraisals Resulting in Capability or Maturity Level 4 or 5 Ratings:

Process Performance Baselines

1. Number of defects injected into test cases developed internally – data collected at test reviews. This deals with the Enterprise quality objectives and an organizational objective dealing with defects injected per unit. It is a measure of defect removal efficiency as well as the quality of the product.

2. The number of hours spent developing a test case for an average or difficult complexity requirement. This deals with the business objective of performance to budget.

3. The number of hours spent developing a test case for a simple complexity requirement. This deals with the business objective of performance to budget.


4. The number of defects detected in the testing of requirements (from all sources). This supports their business quality objectives and is a measure of defect removal efficiency.

5. Number of defects injected per SLOC developed in-house. This supports their business quality objectives and is a measure of defect removal efficiency as well as the quality of the product.

6. Number of defects found per SLOC, regardless of the development source. This supports their business quality objectives and is a measure of review efficiency.

7. Hours spent in the test review for an average to difficult requirement covered in a requirement-based test. This supports the performance to budget objective.

8. Hours spent in the test review for a simple requirement covered in a requirement-based test. This supports the performance to budget objective.

9. Number of defects detected in structural verification (product integration phase). This supports their business quality objectives and is a measure of defect removal efficiency.

10. Number of trivial action items resulting from internally injected defects per requirement covered in requirements-based test. This supports their business quality objectives and is a measure of defect removal efficiency and review efficiency.

Models

The models were developed by an AVISTA engineer who is also a professor at the University of Wisconsin, Platteville (UWP), with advanced degrees in statistics and computer science. His statistical analysis with the baselines in developing the models is well documented and exhaustive.

1. Based on the number of defects injected - Baseline 1. above - (or the Organizational Mean if the project running mean is unknown), the total Cost of Quality (COQ) or remaining COQ can be projected. COQ is the number of hours and cost associated with "clean up" activities (rework, etc.). Using the measures available for defects/rqmt the COQ Hrs/Rqmt can be projected. This predictor is associated with schedule and cost.

2. Based on the means of development hours/sw requirement tested and review hours/sw requirements tested in a review (or the Organizational Mean if the project running mean is unknown) - baselines 2 & 3 and 7 & 8 above, the total COQ or remaining COQ can be projected. Using the measures listed for development hours/SW rqmt tested and review hours/SW rqmt tested in review the COQ Hrs/Rqmt can be projected. This predictor is associated with schedule and cost.

3. Based on the number of defects detected – baselines 4, 6, and 9 above (or the Organizational Mean if the project running mean is unknown) – the CPI for the SW RBT Activity can be projected. This predictor is associated with schedule and cost. The organization makes broad use of Earned Value as a tracking mechanism. This model is an enabler for the prediction of EV. (A hypothetical sketch of the kind of projection these models make follows.)
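The actual AVISTA models are not public; below is a hypothetical sketch of the kind of COQ projection described in model 1, with all parameters invented:

    # Hypothetical COQ projection (all numbers invented): project remaining
    # Cost of Quality hours from defects injected per requirement.
    defects_per_rqmt = 0.45        # project running mean (or Organizational Mean)
    rework_hours_per_defect = 3.2  # historical average clean-up cost per defect
    remaining_rqmts = 120          # requirements still to be tested

    projected_coq_hours = defects_per_rqmt * rework_hours_per_defect * remaining_rqmts
    print(f"projected remaining COQ: {projected_coq_hours:.0f} hours")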


2.7 Rationale For Government Requirement

Below is a rationale for requiring government contractors to attain at least Level 3. Think back to the first lab that we did in the class – perhaps the State of Wisconsin should require a bit more of its contractors.

"Feds must weigh worth of vendors' CMM claims," by Susan M. Menke, GCN Staff, 08/12/02, Vol. 21 No. 23. http://www.gcn.com/21_23/outsourcing/19576-1.html

Agencies often rely on Capability Maturity Model ratings touted by vendors when deciding whether to hire them for software development jobs. But are the rating claims made by many vendors legit? Maybe, maybe not.

Since the Software Engineering Institute at Carnegie Mellon University began using the CMM to take the pulse of software development 15 years ago, the number of participating software shops has soared more than tenfold.

SEI's 1993 software maturity profile cited 156 organizations, 65 percent of them government agencies or their contractors. The most recent profile--issued in March by the institute--cited 1,638 organizations, and only 31 percent were government agencies or their contractors.

Despite the last 15 years' tremendous growth in commercial software, which now far overshadows government development, the Defense Department has always been the financial force behind SEI's model.

The CMM for software ranks projects and organizations at levels 1 through 5. The higher levels are supposed to indicate fewer software defects, more repeatable development processes and better project management.

DOD acquisition policy for major systems, such as weapons systems, requires contractors to have what is called "CMM Level 3 equivalence." Bidders that lack such credentials must submit risk-mitigation plans with their bids.

Civilian acquisition officials are far less strict about equivalence ratings, even for projects costing hundreds of millions of dollars.

"Typical DOD IT projects in the $100 million range, which account for most of the problems and failures, are not covered" by the Level 3 equivalence requirement, said Lloyd K. Mosemann II, a former deputy assistant secretary of the Air Force and now a senior vice president with Science Applications International Corp. "At the very least, government [should] specify that the performing organization be Level 3" on the software CMM, Mosemann said in his keynote speech at the recent Software Technology Conference in Salt Lake City.


"Virtually every large DOD contractor can boast at least one organization at Level 4 or above, and several at Level 3," Mosemann said. "On the other hand, most DOD software is still being developed in less mature organizations, mainly because the program executive office or program manager doesn't demand that the part of the company that will actually build the software be Level 3."

Reaching Level 5

The economic pressure to obtain a prestigious Level 3, 4 or 5 rating has led to a proliferation of SEI and non-SEI models--not only for software but for acquisition, personnel, product development, systems integration and other areas.

Only one software shop in 1993 had a strong enough grip on its development practices to reach the rarefied Level 5. In contrast, the latest list shows that 86 organizations say they have Level 5 certification--but SEI does not guarantee the accuracy of these claims.

An SEI disclaimer, at www.sei.cmu.edu/sema/pub_ml.html, says, "This list of published maturity levels is by no means exhaustive."

Why is that?

Software quality assessment, like real estate appraisal, is partly a science and partly an art. SEI maintains a list of hundreds of appraisers, assessors and consultants who will undertake to rate software strengths and weaknesses according to the SEI model.

That wide dispersion of authority, coupled with the enormous growth of the software industry, leaves SEI in the position of neither confirming nor denying the claims that are made using its model.

"As a federally funded research and development center, SEI must avoid any statement that might be perceived to validate or certify the assessment results that an organization chooses to make public," SEI spokesman Bill Pollak said. "The most we can do is to validate the conduct of an assessment--for example, 'An SEI-authorized lead assessor and trained team performed the assessment.' We do receive results from SEI-authorized lead assessors, but we keep those results confidential."

SEI senior technical staff member Mary Beth Chrissis said there are "many different flavors of appraisals. Many other organizations have developed their own appraisals" based on SEI's public-domain model. A number of such organizations are offshore.

Where does that leave agencies that want to make sure they hire competent contractors whose CMM certifications are current?

A starting point is SEI's CMM for software acquisition (SA-CMM), developed in the mid-1990s by a government-industry team. Many agencies, including the General Accounting Office and the IRS, have used its methods to evaluate contracting or outsourcing practices.


But there is no recommended list of SA-CMM vendors, and SEI does not qualify them. In choosing such vendors, SEI says, the key is experience: "The experience should be demonstrated and not just claimed."

Mosemann, one of the instigators of the SA-CMM, said it was not meant to apply to contractors but rather to government program and acquisition offices.

"The problem that I perceived--and it clearly exists today--is that a gross mismatch occurs when a DOD program office that can barely spell the word 'software' oversees a Level 3 or 4 contractor organization," he said.

"The government program manager has no appreciation for the tools, techniques and methods--and their cost--that are necessary to develop software on a predictable schedule at a predictable cost with predictable performance results," Mosemann said. "That is why there is no list of SA-CMM contractors and why SEI has no plan to qualify them."

Meanwhile, Defense is negotiating with its service acquisition executives to use SEI's newer CMM for integration, said Joe Jarzombek, deputy director for software-intensive systems in the Office of the Undersecretary of Defense for Acquisition, Technology and Logistics.

Managers of major software programs are required to choose contractors that have succeeded at comparable systems and that have mature software development processes in place, Jarzombek said. DOD's present Software Development Capability Evaluation Core is equivalent to the software CMM Level 3 criteria, he said.

©2011 Mike Rowe