Surveying Systems Engineering Effectiveness

Sponsored by the U.S. Department of Defense. © 2005 by Carnegie Mellon University. Version 1.0, Oct-05. NDIA SE Conference, Pittsburgh, PA 15213-3890. Surveying Systems Engineering Effectiveness, Joseph P. Elm

Transcript of Surveying Systems Engineering Effectiveness

Page 1: Surveying Systems Engineering Effectiveness

Sponsored by the U.S. Department of Defense
© 2005 by Carnegie Mellon University, Version 1.0, Oct-05, NDIA SE Conference - page 1
Pittsburgh, PA 15213-3890

Surveying Systems Engineering Effectiveness

Joseph P. Elm

Page 2: Background


Background

Case studies have shown that properly implemented systems engineering can result in commensurate benefits.

Broadly applicable quantification of these costs and benefits remains elusive:
• Complicated by the lack of a broadly accepted definition of Systems Engineering
• Insufficient identification and tracking of Systems Engineering costs and efforts
• Exacerbated by increasing complexity and size of systems and Systems of Systems

Page 3: The Task


The Task

The Office of the Under Secretary of Defense (AT&L) has tasked the NDIA Systems Engineering Division to research and report on the costs and benefits associated with Systems Engineering practices in the acquisition and/or development of military systems.

The Systems Engineering Effectiveness Committee (SEEC) is addressing this task via a survey of program and project managers across the defense industry.

Page 4: Survey Objective


Survey Objective

Identify the degree of correlation between the use of specific systems engineering practices and activities on projects, and quantitative measures of project / program performance.

Survey Method

Use the resources of the NDIA SE Division to reach a broad constituency.

The initial survey will focus on industry members of NDIA that are prime contractors and subcontractors.

Collect feedback from project / program managers

Page 5: Survey Development Plan


Survey Development Plan

1. Define the goal

2. Choose the population

3. Define the means to assess usage of SE practices

4. Define the measured benefits to be studied

5. Develop the survey instrument

6. Execute the survey

7. Analyze the results

8. Report

9. Plan future studies

Page 6: Steps 1 and 2


Step 1: Define the Goal
Identify correlations between SE practices and program performance.

Step 2: Choose the population
The chosen population consists of contractors and subcontractors providing products to the DoD.

Page 7: Step 3


Step 3: Define assessment of SE practices

Starting point, CMMI-SW/SE v1.1:
• 22 Process Areas, 157 Goals, 539 Practices, 402 Work Products

After the Systems Engineering filter:
• 13 Process Areas, 27 Goals, 75 Practices, 185 Work Products

After the Size Constraint filter:
• 10 Process Areas, 19 Goals, 34 Practices, 63 Work Products
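The two-stage down-selection above can be sketched in code. This is an illustrative sketch only, not the SEEC's actual filtering procedure: the practice names, the `se_related` tag, and the `priority` field are invented stand-ins for whatever relevance and size criteria the committee applied.

```python
# Hypothetical sketch of the Step 3 down-selection: a broad model element
# set is passed through a relevance filter, then a size-constraint filter.
# All names and tags below are illustrative, not taken from the survey.

from dataclasses import dataclass

@dataclass
class Practice:
    name: str
    se_related: bool   # does the practice touch systems engineering?
    priority: int      # 1 = must keep; larger = more expendable

catalog = [
    Practice("Develop customer requirements", se_related=True, priority=1),
    Practice("Manage supplier agreements", se_related=True, priority=3),
    Practice("Tune compiler settings", se_related=False, priority=2),
]

# Filter 1: keep only systems-engineering-related practices.
se_practices = [p for p in catalog if p.se_related]

# Filter 2: keep only high-priority practices so the survey stays short.
surveyed = [p for p in se_practices if p.priority <= 2]

print([p.name for p in surveyed])  # → ['Develop customer requirements']
```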

Page 8: Step 4


Step 4: Define performance measures

Utilize measures common to many organizations:
• Earned Value
• Award Fees
• Technical Requirements Satisfaction
• Milestone Satisfaction
• Problem Reports

Page 9: Step 5


Step 5: Develop the survey instrument

Self-administration
• Formatted for web-based deployment

Confidentiality
• No elicitation of identifying data
• Anonymous response collection
• Responses accessible only to authorized SEI staff

Integrity
• Data used only for stated purpose
• No attempt to extract identification data

Self-checking

Instrument structure:
• Section 1: Project Characterization
• Section 2: Systems Engineering Evidence
• Section 3: Project / Program Performance Metrics

Page 10: Section 1 - Characterization


Section 1 - Characterization

Characterization of the project / program under consideration:
• Project / program: Size, Stability, Lifecycle phase, Subcontracting, Application domain, Customer / User, etc.
• Organization: Size, Organizational capability, Related experience, etc.

Section 1: Characterization

The objective of this section is to gather information to characterize the project under consideration. This information will assist the survey analysts in categorizing the project and the executing organization, to better understand your responses.

1.1 Project – information to characterize the specific project under discussion. Size, stability, lifecycle phase, subcontracting, and application domain are among the parameters used for program characterization.

1.1.1 What phases of the integrated product lifecycle comprise this project (check all that apply), and what phase are you presently executing (check 1)?

Included in project (check all that apply) | Current phase (check 1)

□ □ Concept Refinement
□ □ Technology Development and Demonstration
□ □ Development
□ □ Manufacturing
□ □ Verification
□ □ Training
□ □ Deployment
□ □ Operation
□ □ Support
□ □ Disposal

1.1.2 What is the current total contract value (US$) of your project?

$ __________________

1.1.3 What was the initial contract value (US$) of your project?

$ __________________

1.1.4 How many contract change orders have been received?

__________________
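Questions 1.1.2 and 1.1.3 together support a simple derived measure, contract cost growth. The survey collects only the raw values; the calculation below is an assumed downstream use, with fabricated dollar amounts.

```python
# The raw values from 1.1.2 and 1.1.3 support a simple derived measure,
# contract cost growth. Dollar amounts here are fabricated examples.

initial_value = 10_000_000   # answer to 1.1.3: initial contract value (US$)
current_value = 12_500_000   # answer to 1.1.2: current total contract value (US$)

# Fractional growth of the contract relative to its initial value.
growth = (current_value - initial_value) / initial_value
print(f"{growth:.0%}")  # → 25%
```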

Page 11: Section 2 - Systems Engineering Evidence


Section 2: Systems Engineering Evidence

Rate your agreement with the following statements.
(Response scale: Strongly Disagree / Disagree / Agree / Strongly Agree)

2.1 Process Definition

2.1.1 This project utilizes a documented set of systems engineering processes for the planning and execution of the project.  □ □ □ □

2.2 Project Planning

2.2.1 This project has an accurate and up-to-date Work Breakdown Structure (WBS) that …
a. … includes task descriptions and work package descriptions  □ □ □ □
b. … is based upon the product structure  □ □ □ □
c. … is developed with the active participation of those who perform the systems engineering activities  □ □ □ □

Section 2: SE Evidence
• Process definition
• Project / program planning
• Risk management
• Requirements development
• Requirements management
• Trade studies
• Interfaces
• Product structure
• Product integration
• Test and verification
• Project / program reviews
• Validation
• Configuration management
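Likert-style responses such as those in Section 2 are typically coded numerically before analysis. The 1-4 coding and the per-area averaging below are assumptions for illustration; the survey materials shown here do not specify a scoring scheme.

```python
# Illustrative scoring of the four-point agreement scale used in Section 2.
# The 1-4 numeric coding and per-process-area averaging are assumptions;
# the survey itself does not specify how responses are scored.

LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Agree": 3, "Strongly Agree": 4}

# Hypothetical responses for two of the thirteen evidence areas.
responses = {
    "Process Definition": ["Agree", "Strongly Agree"],
    "Project Planning": ["Disagree", "Agree", "Agree"],
}

# Average the coded responses within each area, rounded for readability.
scores = {area: round(sum(LIKERT[r] for r in answers) / len(answers), 2)
          for area, answers in responses.items()}

print(scores)  # → {'Process Definition': 3.5, 'Project Planning': 2.67}
```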

Page 12: Section 3 - Project Performance Metrics


Section 3: Project Performance Metrics

3.1 Earned Value Management System (EVMS)

Rate your agreement with the following statements.
(Response scale: Strongly Disagree / Disagree / Agree / Strongly Agree)

3.1.1 Your customer requires that you supply EVMS data.  □ □ □ □

3.1.2 EVMS data is available to decision makers in a timely manner (i.e. current within 2 weeks).  □ □ □ □

3.1.3 The requirement to track and report EVMS data is levied upon the project's suppliers.  □ □ □ □

3.1.4 Variance thresholds for CPI and SPI variance are defined, documented, and used to determine when  □ □ □ □
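Item 3.1.4 refers to CPI and SPI variance thresholds. Below is a minimal sketch of such a check, using the standard earned-value definitions (CPI = EV/AC, SPI = EV/PV); the 10% threshold and the dollar figures are illustrative assumptions, not values from the survey.

```python
# A minimal sketch of the CPI/SPI threshold check implied by item 3.1.4.
# The formulas are the standard earned-value definitions; the 10% threshold
# and the input values are illustrative assumptions.

def evms_indices(ev, ac, pv):
    """Return (CPI, SPI) from earned value, actual cost, planned value."""
    return ev / ac, ev / pv

def needs_attention(cpi, spi, threshold=0.10):
    """Flag the project if either index deviates from 1.0 by more than threshold."""
    return abs(cpi - 1.0) > threshold or abs(spi - 1.0) > threshold

cpi, spi = evms_indices(ev=900_000, ac=1_000_000, pv=950_000)
print(round(cpi, 3), round(spi, 3))  # → 0.9 0.947
print(needs_attention(cpi, spi))     # → False
```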

Section 3: Performance Metrics
• Earned Value
• Award fees
• Technical requirements satisfaction
• Milestone satisfaction
• Problem reports

Page 13: Step 6


Step 6: Execute the survey

Swimlane roles: SEEC, Industry focal, Respondent, SEI.

1. SEEC: identify industry member focals, using the NDIA SED active roster and NDIA management input.
2. SEEC: contact focals, brief the survey process, and solicit support.
3. Industry focal: identify respondents and report the number to the SEI.
4. SEI: provide web access data to focals.
5. Industry focal: solicit respondents and provide web site access info.
6. Respondent: complete the questionnaire, submit it to the SEI, and report completion to the focal.
7. SEI: collect responses and response rate data; focals expedite responses as needed.
8. SEI: analyze data and report to the SEEC.
9. SEEC: report* findings to NDIA and OSD.

* Report to include suggested recommendations and actions

Page 14: Steps 7, 8, and 9


Step 7: Analyze the results
Partition responses based on project characterizations.

Analyze survey responses to look for correlations between the SE practices and the chosen metrics.
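One plausible way to implement this correlation analysis is Pearson's r between per-project SE usage scores and a performance metric. The data values below are fabricated for illustration, and Pearson is an assumed choice; the presentation does not state the SEEC's statistical method.

```python
# One plausible implementation of the Step 7 correlation analysis:
# Pearson's r between a per-project SE-practice usage score and a
# performance metric. All data values are fabricated for illustration.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

se_usage = [2.1, 2.8, 3.0, 3.4, 3.9]   # mean Section 2 score per project
cpi = [0.85, 0.92, 0.95, 0.97, 1.02]   # performance metric per project

r = pearson(se_usage, cpi)
print(round(r, 3))  # → 0.994
```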

Step 8: Report
Summarize survey results and analysis in a report.

Step 9: Plan future studies
Based upon the findings from the survey, the need for additional studies may be defined.

Page 15: Status


Status

Survey instrument development complete

Web deployment complete

Respondent identification in progress

Response collection through Nov.

Analysis through Dec. and Jan.

Report in Feb.

Page 16: SE Effectiveness Committee


SE Effectiveness Committee

Brenda Zettervall, Ruth Wuenschel, Mike Ucchino*, Jason Stripinis, Jack Stockdale, Sarah Sheard, Jay R. Schrand, Rex Sallade, Garry Roedler, Paul Robitaille, Rusty Rentsch, Bob Rassa, Michael Persson*, Brooks Nolan, Rick Neupert, Brad Nelson*, Gordon F. Neary*, John Miller, Jeff Loren, Ed Kunay, George Kailiwai, James Holton, Ellis Hitte, Dennis E. Hecht, Dennis Goldenson, Donald J. Gantzer, John P. Gaddie, Joseph Elm, Terry Doran, Brian Donahue, Jim Dietz, Greg DiBennedetto, John Colombi, Jack Crowley, Thomas Christian, Al Bruns, Al Brown*, David P. Ball, Ben Badami, Marvin Anthony, Dennis Ahearn

* co-chair

Page 17: Conclusion


Conclusion

Questions?

Contact information
• Joseph P. Elm, [email protected]

Page 18: Back Up


BACK UP

Page 19: Target Audience


Target Audience

• Vitech Corp.
• SAIC
• General Dynamics
• Virtual Technology Corp.
• Rockwell Collins
• GE
• United Technologies
• Raytheon
• Foster-Miller Inc.
• United Defense LP
• Orbital Sciences Corp.
• DRS Technologies
• TRW Inc.
• Northrop Grumman
• DCS Corp.
• Trident Systems, Inc.
• Motorola
• Concurrent Technologies Corp.
• Titan Systems Co. (AverStar Group)
• Lockheed Martin
• Computer Sciences Corp.
• TERADYNE, Inc.
• L-3 Communications
• Boeing
• Systems & Electronics, Inc.
• Jacobs Sverdrup
• BBN Technologies
• Support Systems Associates Inc.
• ITT Industries
• BAE Systems
• SRA International
• Impact Technologies LLC
• AT&T
• Southwest Research Institute
• Hughes Space & Communications
• Anteon Corp
• Simulation Strategies Inc.
• Honeywell
• Allied-Signal
• SI International
• Harris Corp.
• Alion Science & Technology
• Scientific Solutions, Inc.
• Gestalt, LLC
• AAI Corp.

Selection criteria:
• Active in NDIA SED
• Contractors delivering products to the government

Need a Point-of-Contact (Focal) from each company to expedite survey deployment.