Calhoun: The NPS Institutional Archive
Reports and Technical Reports All Technical Reports Collection
2014-11-15
Development of a System Engineering Competency Career Development Model: an analytical approach using Bloom’s Taxonomy
Whitcomb, Clifford
Monterey, California. Naval Postgraduate School
http://hdl.handle.net/10945/44705
Systems Engineering Competency Career Model (SECCM)
(FY2014) Technical Report
Development of a System Engineering Competency Career Model: An Analytical Approach Using Bloom’s Taxonomy
Clifford Whitcomb, PhD; Rabia Khan; Corina White
Naval Postgraduate School
November 15, 2014
Table of Contents

I. INTRODUCTION
   A. Background
      1. Problem Background
      2. What Is Competency
      3. Competency Modeling
   B. Problem Statement
   C. Objectives
   D. Purpose/Benefit
   E. Scope
   F. Methodology
   G. Chapter Summary
II. COMPETENCY MODEL DEVELOPMENT
   A. Successful Competency Modeling Approaches
   B. Existing SE Competencies Serve as a Foundation
   C. Validation of Model Using Uniform Guidelines
   D. Competency Gap Assessment and Career Development
   E. POA&M
   F. Chapter Summary
III. DATA AND ANALYSIS
   A. Primary Research by Cognitive and Affective Learning Domain
   B. Chapter Summary
IV. FINDINGS/RESULTS
   A. Primary Findings on Experience Levels
   B. Use Case Scenario by SPAWAR SSC ATLANTIC: Implications and Findings
   C. Chapter Summary
V. SUMMARY, CONCLUSIONS, RECOMMENDATIONS AND FURTHER RESEARCH
   A. Summary, Conclusions and Recommendations
Appendix A. Examples of Career Levels for the v0.78 SECCM Model
Appendix B. SE Competency Objectives
Appendix C. Krathwohl Cognitive and Affective Domains
Appendix D. Kendall’s Memo
List of References
LIST OF FIGURES
Figure 1. Bloom’s Taxonomy Domain (from Wrightsuffmusic 2014)
Figure 2. SECCM Background
Figure 3. Competency Model Development Steps (from U.S. Department of Labor Employment and Training Administration 2013)
Figure 4. Competency Model Development Steps (from U.S. Department of Labor Employment and Training Administration 2013)
Figure 5. Competency Sources Used in SECCM (from Whitcomb, Khan and White 2013)
Figure 6. KSA Sources Used in SECCM (from Whitcomb, Khan and White 2013)
Figure 7. SECCM Project Scope
Figure 8. Validation of SE Competency Model with OPM Guidelines
Figure 9. Competency Gap Assessment and Career Development
Figure 10. High Level SECCM Validation Phase Timeline
Figure 11. Percent of Cognitive and Affective KSAs in SECCM
Figure 12. Percent of SECCM KSAs within Each Cognitive Learning Construct
Figure 13. Count of SECCM KSAs Categorized as Cognitive Learning Constructs Using Bloom’s Taxonomy
Figure 14. Percent of SECCM KSAs within Each Affective Learning Construct
Figure 15. Count of SECCM KSAs Categorized as Affective Learning Constructs Using Bloom’s Taxonomy
Figure 16. Bloom’s Cognitive Levels within the SECCM SE-01 Career Level
Figure 17. Bloom’s Cognitive Levels within the SECCM SE-02 Career Level
Figure 18. Bloom’s Cognitive Levels within the SECCM SE-03 Career Level
Figure 19. SECCM Career Level Trend to Have Majority of KSAs in the Cognitive Remember and Apply Learning Domains (note: key is the same as Figures 16-18)
Figure 20. Bloom’s Affective Levels within the SECCM SE-01 Career Level
Figure 21. Bloom’s Affective Levels within the SECCM SE-02 Career Level
Figure 22. Bloom’s Affective Levels within the SECCM SE-03 Career Level
Figure 23. SECCM Career Level Trend to Have Majority of KSAs in the Respond Learning Domain (note: key is the same as Figures 20-22)
Figure 24. Percent and Count of SECCM KSAs Adapted by the SPAWAR Competency Model
Figure 25. SPAWAR vs. SECCM: Percent of KSAs by Cognitive Domain
LIST OF TABLES
Table 1. SECCM Team
Table 2. SE Technical and Technical Management Competencies
Table 3. Professional Competencies
LIST OF ACRONYMS AND ABBREVIATIONS
AWF      acquisition workforce
BKCASE   Body of Knowledge and Curriculum to Advance Systems Engineering
CDIO     Conceive, Design, Implement, Operate
CDM      competency development model
DACM     Director, Acquisition Career Management
DAG      Defense Acquisition Guidebook
DASN     Deputy Assistant Secretary of the Navy
DAU      Defense Acquisition University
DoD      Department of Defense
DoN      Department of the Navy
ELO      Enabling Learning Objective
ENG      engineering
GRCSE    Graduate Reference Curriculum for Systems Engineering
INCOSE   International Council on Systems Engineering
KSAs     knowledge, skills and abilities
LWDA     leadership and workforce development assessment
NASA     National Aeronautics and Space Administration
NAVAIR   Naval Air Systems Command
NPS      Naval Postgraduate School
NUWC     Naval Undersea Warfare Center
OPM      Office of Personnel Management
OSD      Office of the Secretary of Defense
PCD      Position Category Description
PSE      Program Systems Engineering
RDT&E    Research, Development, Test and Evaluation
SE       systems engineering
SECCM    Systems Engineering Competency Career Model
SME      Subject Matter Expert
SPAWAR   Space and Naval Warfare Systems Command
SPRDE    Systems Planning, Research, Development and Engineering
SSC      SPAWAR Systems Center
EXECUTIVE SUMMARY
The Systems Engineering Competency Career Model (SECCM) was designed to
assist with career development modeling and in the creation of position descriptions for the
Department of Defense (DoD) (Whitcomb, Khan, & White, 2013). To achieve this,
thousands of knowledge, skills and abilities (KSAs) were mapped and analyzed using
Bloom’s taxonomic classification schema. Doing so allowed the NPS SECCM researchers to
highlight the cognitive and affective learning domains required of competent systems
engineers (SEs). The competency modeling approach resulted in an interactive model, which
identifies the KSAs required for DoD systems engineers to be considered competent at
various career experience levels. As such, training levels and competency sources are also
identified within the model.
The NPS SECCM researchers worked with an NDIA SE WG over the past several
years to help create the initial OSD/DAU SE Competency model used as the basis of the
SECCM. The SECCM then added KSA details from several SE Competency models, many
provided to the original NDIA SE WG, from a variety of organizations as a foundation.
Redundancy was eliminated and KSAs were harmonized throughout the SECCM for
consistency. The SECCM is a Systems Engineering (SE) competency model that is based on
the OSD/DAU competency model currently used for the ENG community. The SECCM has
enhanced the current model through the addition of extensive sets of KSAs mapped to each
of the SE competencies defined over a series of typical career development points. The
model encompasses eight different documented systems engineering competency models
from a variety of organizations. These other competency models include the International
Council on Systems Engineering (INCOSE) United Kingdom (UK), Boeing, the National
Aeronautics and Space Administration (NASA), the Office of the Secretary of Defense (OSD)
SPRDE Competency Model June 2013 Refresh (also known as the Defense Acquisition
University (DAU) Systems Planning, Research, Development and Engineering (SPRDE)
Competency Model), Naval Air Systems Command (NAVAIR), MITRE, Space and Naval
Warfare Systems Command (SPAWAR) and the Naval Undersea Warfare Center (NUWC)
Newport. Yet, only four of these models (NUWC, INCOSE UK,
OSD SPRDE Competency Model June 2013 Refresh, and MITRE) were used to derive the
KSAs for the SECCM. The OSD SPRDE Competency Model June 2013 Refresh was used
as the basis for the highest-level categorization for the SECCM competencies into which the
KSAs were mapped. Also, the DAU SPRDE Level I, II, and III course learning objectives
were transformed into KSAs and added into the SECCM. The iterative competency modeling
approach included analyzing and re-organizing the KSAs based on the similarity of their
competency definition. The KSAs were then re-aligned within the SECCM by following a
bottom-up process of first eliminating duplicate KSAs, then eliminating items that did not
seem to be explicitly defined as a relevant SE competency, and lastly by re-organizing each
KSA into an appropriate competency. As a result of this process, the FY14 SECCM has
2,848 KSAs and 41 competencies.
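The bottom-up elimination and re-alignment process described above can be sketched as a small pipeline. This is an illustrative sketch only: the data structures and the two decision callables below stand in for the SME judgments described in the report and are not part of the SECCM itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KSA:
    text: str          # the KSA statement, e.g. "Applies risk analysis methods"
    competency: str    # the competency it is currently aligned to

def harmonize(ksas, is_relevant, best_competency):
    """Bottom-up pass: drop duplicates, drop non-SE items, re-align the rest.

    `is_relevant` and `best_competency` stand in for SME judgment; here
    they are plain callables supplied by the caller.
    """
    seen = set()
    result = []
    for ksa in ksas:
        key = ksa.text.strip().lower()
        if key in seen:               # step 1: eliminate duplicate KSAs
            continue
        seen.add(key)
        if not is_relevant(ksa):      # step 2: eliminate items not relevant to SE
            continue
        # step 3: re-organize the KSA into the most appropriate competency
        result.append(KSA(ksa.text, best_competency(ksa)))
    return result

# Toy usage with stand-in SME judgments (all names hypothetical)
ksas = [
    KSA("Applies risk analysis methods", "Technical Planning"),
    KSA("applies risk analysis methods", "Risk Management"),   # duplicate
    KSA("Operates a forklift", "Implementation"),              # not SE-relevant
]
clean = harmonize(ksas,
                  is_relevant=lambda k: "forklift" not in k.text,
                  best_competency=lambda k: "Risk Management")
print([(k.text, k.competency) for k in clean])
```

Run on the toy list, the pipeline keeps a single KSA, aligned to the competency the stand-in judgment selected.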
The KSAs within the SECCM have also been categorized to align with either a
technical/technical management or a professional competency. Research analysis results
indicate that when it comes to technical/technical management competency within systems
engineering at entry-level positions (SE-01), lower level KSAs from the cognitive domain
are required. As the career level increases, so does the complexity of the KSAs within the
cognitive domain. The opposite is true for professional competency within SE domains. For
instance, at entry-level positions, a high number of KSAs within the affective learning
domain need to be learned.
Overall, the majority of the KSAs were determined to fall within the cognitive
learning domains of knowledge and comprehension. The NPS SECCM research suggests that
this finding is noteworthy as these are lower level cognitive domains that can be learned
through training and education. As an SE’s career progresses through the journey-level (SE-
02) and expert level (SE-03) career phases, the focus shifts to application. At this stage in the
career development, the individual is required to apply what was learned to do his/her job.
As such, this finding suggests that expert level SE position descriptions should substantially
highlight the cognitive learning domain of application.
I. INTRODUCTION
This chapter will define what a competency is and explain why it is relevant to the
field of systems engineering (SE). The chapter will also describe the attributes of a
“good” competency model, and describe how the Systems Engineering Competency Career
Model (SECCM) has evolved.
A. BACKGROUND
In fiscal year 2013 (FY13), the Department of the Navy sponsored work by the Naval
Postgraduate School (NPS) to develop a Systems Engineering Competency Career Model
(SECCM). This model identifies a collection of knowledge, skills, and abilities (KSAs) that
define the basis for developing effective systems engineers. Progress on the SECCM has
positioned the Navy as a Department of Defense (DoD) leader in the human resources
management of this systems engineering competency. The SECCM can also assist graduate
academic programs to specify student outcomes and learning objectives within systems
engineering (SE) programs that will ensure the students have the entry-level KSAs required
to perform successfully in their job. The implications of the research can also be used to
develop structured curriculum content, assessment, and continuous process improvement
techniques related to the development of SE learning, and to develop more valid and reliable
instruments for assessing what systems engineers need to learn, need to know, and need to do
(Khan, 2013). Proficiency levels and competency sources are identified within the model.
Each KSA was defined in terms of Bloom’s Taxonomy in both the cognitive and affective
domains. The model is implemented in an Excel spreadsheet. This approach provides an
interactive model that allows for tailoring the KSAs required for DoD systems engineers to
be considered competent at various career experience levels (White, 2014).
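Because the model lives in a spreadsheet, tailoring it to a career experience level amounts to a row filter. The following is a minimal sketch under assumed column names ("ksa", "career_level", "bloom_level"); it does not reflect the actual SECCM worksheet layout.

```python
# Toy stand-in rows for the SECCM spreadsheet; field names are hypothetical.
SECCM_ROWS = [
    {"ksa": "Defines stakeholder needs",       "career_level": "SE-01", "bloom_level": "Knowledge"},
    {"ksa": "Applies trade-off analysis",      "career_level": "SE-02", "bloom_level": "Application"},
    {"ksa": "Evaluates system architectures",  "career_level": "SE-03", "bloom_level": "Evaluation"},
]

def ksas_for_level(rows, level):
    """Tailor the model: keep only the KSAs expected at one career level."""
    return [row["ksa"] for row in rows if row["career_level"] == level]

print(ksas_for_level(SECCM_ROWS, "SE-01"))  # the entry-level subset
```

The same filter generalizes to any column (e.g. selecting all KSAs in one Bloom construct), which is what makes the spreadsheet form of the model interactive.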
The SECCM used eight different documented systems engineering competency models
from a variety of organizations as a foundation. These other competency models include The
International Council on Systems Engineering (INCOSE) United Kingdom (UK), Boeing,
The National Aeronautics and Space Administration (NASA), Defense Acquisition
University (DAU) Systems Planning, Research, Development and Engineering (SPRDE),
Naval Air Systems Command (NAVAIR), MITRE, Space and Naval Warfare
Systems Command (SPAWAR) and the Naval Undersea Warfare Center (NUWC)
Newport. The KSAs were harmonized with Bloom’s Taxonomy based on affinity in the
KSAs.
1. Problem Background
The Systems Planning, Research, Development and Engineering (SPRDE) career field
has approximately 38,000 employees (Lasley-Hunter and Alan, 2011). Given the sheer
number of engineering-related personnel, one would think that there would be guidance
on occupational codes or position descriptions for what systems engineers need to know
and need to do. Yet, there is currently no professional engineering occupation code or
position description for systems engineers within the Department of Defense (DoD).
Similarly, there is no official systems engineering competency model to form the basis for
employee selection and career development.
To understand the relationship competency modeling has with the identification of the
competencies required of system engineers, the NPS SECCM WG developed a systems
engineering career development competency model by researching the pertinent KSAs for
systems engineers. Proper training of this highly specialized workforce is imperative to
assure successful acquisition programs. The DAU SPRDE curriculum provides DAWIA
certification and some foundation, but more must be done to provide comprehensive training
and education for the complete development of systems engineers (Alexander, 2013). Systems
engineers require many of the same KSAs as other members of the engineering workforce,
but also require unique KSAs focused on customer mission/capability areas, technology
areas, SE processes/activities and leadership skills. Developmental methods for systems
engineers to obtain these KSAs range from informal on-the-job training to professional
certifications and degrees (Walter, 2013). The scope of the research project was to develop a
competency model that could be used to meet the mission need for a competency tool for
the development of position descriptions, career development plans and employee selection.
Referred to as the Systems Engineering Competency Career Model (SECCM), the model
consists of the core technical and general KSAs. The KSAs were researched from the current
models from various naval and engineering enterprises. Two key objectives of the SECCM
were: to develop an approach and methodology to obtain baseline information needed for
systems engineering competency model development (for example the related KSAs,
relevant education, training, and experience required) and to define career paths for systems
engineers (jobs, assignments, and timing). Other objectives included designing and
administering a survey to validate the competencies associated with systems engineering.
2. What Is Competency
Competency modeling can be defined as the activity of determining the specific
competencies that are characteristic of high performance and success in a given job.
According to Joshi, Datta and Han (2010), competency is defined as the ability to use the
appropriate KSAs to successfully complete a specific job-related task. When combining
competency with competence, the idea of competency assessment becomes apparent.
Competency assessment is a tool that organizations find useful for allocating human
resources to achieve a successful employer-employee match. Competency assessment is also
beneficial in creating job-specific professional development and accurate training
requirements for employees, yielding a good match for a position.
3. Competency Modeling
Important to this research was the role of competency modeling, which is defined as
the activity of determining the specific competencies that are characteristic of high
performance and success in a given job (LaRocca, n.d.). Current research suggests that a
“good, competency model has the following attributes: it has gone through
much iteration; it focuses on a specific aspect of competency, it is simple and
easy to understand, and it maps competencies across levels”. (Holt and Perry,
2011)
A “good” competency model should also map career levels in a way that is easy to
understand. For example, if a given organization’s standards require that an individual attain
the “practitioner” level for a competency, then it is assumed that the individual must also
hold that competency at the “supervised practitioner” level. In addition, a “good” competency
model serves as a platform by which individuals can assess their skill set (Whitcomb, Khan
and White 2013).
B. PROBLEM STATEMENT
Currently, there is no Position Category Description (PCD) for a systems engineer
within the Naval Systems Commands. PCDs list the KSAs necessary to complete a job.
Professional engineering occupational codes are used to classify the characteristics desired
for various engineering communities. Occupational codes also provide a government
resource that can assist in determining the number of employees in a specific field or
occupation. Occupational codes can also assist with manpower forecasting efforts. Position
descriptions highlight the KSAs required to be qualified for a specific job. The position
descriptions are helpful with finding the most competent candidate for the position. Thus,
there is a need for a competency model that identifies a set of KSAs that can be used to
create position descriptions and related career development plans designed specifically for
systems engineers within the DoN. This effort resulted in a foundation to build an SE career
development plan that can be truly beneficial to both the naval engineering professional and the
DoD.
Figure 1. Bloom’s Taxonomy Domain (from Wrightsuffmusic 2014)
C. OBJECTIVES
Two key objectives of the SECCM were to: develop an approach and methodology to
obtain baseline information needed for SE competency model development (for example the
related KSAs, relevant education, training, and experience required) and to define career
paths for systems engineers (jobs, assignments, and timing). The objectives of the research
project were achieved by harmonizing the SECCM across various engineering competency
models using Bloom’s taxonomic classification schema. Doing so resulted in the
baseline information needed for the SECCM, while also defining career paths for systems
engineers.
D. PURPOSE/BENEFIT
The benefit of the research is that it will provide a model, i.e., the SECCM, that can
be used by several organizations to identify KSAs pertinent to the development of systems
engineers. The model will also allow the DoN to formulate competency development plans
for the professional development of systems engineers. Furthermore, the model can
contribute to the guidance on the development of graduate and undergraduate curricula in
systems engineering.
E. SCOPE
The Deputy Assistant Secretary of the Navy Research, Development, Test &
Evaluation (RDT&E) Chief Systems Engineer office continued its FY13 sponsorship of
the development of an overall approach, including data gathering and survey methods and
tools, to be used in support of the development of a naval systems engineering competency
model. In an effort to cover the scope of the project, the project was broken down into three
phases. The first phase was the development of the competency model, followed by the
second phase, which focused on competency assessment. The third phase will include a
Competency Gap Analysis and Career Path Modeling. The SE Competency model
development team members are listed in Table 1. Mrs. Jessica Delgado is the team lead.
Name | Organization | Email
Samuel Winograd | NUWC | [email protected]
Mark Reinig | SPAWAR HQ | [email protected]
Lori Zipes | NSWC PC | [email protected]
Mark Jones | USMC | [email protected]
Paul Walter | SSC ATLANTIC | [email protected]
Anthony Desantis | NUWC | [email protected]
Eric Johnsen | NAVAIR | [email protected]
Dr. Cliff Whitcomb | NPS | [email protected]
Corina White | NPS | [email protected]
Rabia Khan | NPS | [email protected]
Carl Flores | SPAWAR HQ | [email protected]
Alan Dean | NSWC HQ | [email protected]
Michael Persson | NAVAIR | [email protected]
Benito Espe | SPAWAR HQ | [email protected]
Rick Kelly | NAVAIR | [email protected]
John Quartuccio | NAVAIR | [email protected]
David Heckman | DASD SE | [email protected]
Sara Hoffberg | DASD SE | [email protected]
Dr. Pamela Knight | MDA | [email protected]
Dr. Art Pyster | SERC | [email protected]
Richard Newton | ASA ALT | [email protected]
Capt. Jim Melvin | SSP | [email protected]
Aileen Sedmak | DASD SE | [email protected]
Alix Autrey | OPM | [email protected]
Dana Grambow | OPM | [email protected]
Jessica Delgado | NSWC DD | [email protected]
Leo Smith | ASA ALT | [email protected]
Table 1. SECCM Team
F. METHODOLOGY
Several competency models were used to construct the SECCM, including: NUWC,
INCOSE UK, NAVAIR, SPAWAR, Boeing, NASA, MITRE and OSD SPRDE Competency
Model June 2013 Refresh (also referred to as the DAU ENG Competency Model). The OSD
SPRDE Competency Model June 2013 Refresh was used as the basis for the highest-level
categorization for the SECCM competencies into which the KSAs were mapped. However,
only four of these models - NUWC, INCOSE UK, OSD SPRDE Competency Model
Refresh, and MITRE - were used to derive the KSAs for the SECCM. Additionally, a total
of 654 DAU ENG Level I, II, and III course learning objectives were transformed into
KSAs and added to the model. The competencies were also categorized as to whether they
would be developed by Education and Training, On the Job Experience, or Professional
Development.
In the SECCM, skills are mapped to proficiencies within the competencies based on the
key words from Bloom’s taxonomy, and correlate to a specific cognitive or affective
category. In other words, cognitive and affective terminology is used to define the level of
proficiency within the competency. For example, an individual’s competencies at entry-level
stages would include the cognitive and affective learning domains of: Knowledge,
Comprehension, Receiving Phenomena and Responding to Phenomena. At intermediate
stages of proficiency, the individual would develop competencies within the cognitive and
affective learning domains of: Application, Analysis, Valuing and Internalizing Values.
Finally, at advanced career development stages, an individual would be proficient within
competencies which fall under the cognitive and affective learning domains of: Synthesis,
Evaluation and Organization.

A SECCM working group (WG) consisting of subject matter experts (SMEs) was formed
in FY14 with participants representing various organizations, including the following: Naval
Air Systems Command (NAVAIR), Naval Undersea Warfare Center (NUWC), Naval Surface
Warfare Center Dahlgren Division (NSWCDD), Naval Surface Warfare Center Panama City
(NSWCPC), Naval Postgraduate School (NPS), Space and Naval Warfare Systems
Command (SPAWAR) Atlantic and Pacific, US Army (USA), US Marine Corps (USMC)
and the Missile Defense Agency (MDA) to conduct a baseline review to verify the model.
The baseline review conducted by the SECCM WG verified that the KSAs were aligned
to the correct competency. If a KSA was determined to not be aligned to an appropriate
competency, the KSA was re-assigned to one deemed more appropriate by the SMEs. The
baseline review also identified some KSAs that did not belong in the model. If the SMEs and
stakeholders felt that a KSA did not apply to an SE application, the KSA was eliminated
from the model. In some instances, SMEs added KSAs to the model based on their
experience. Following the same sort of iterative process used in FY13, redundant KSAs
were also deleted and vague KSAs were re-written. In an effort to enforce consistency in the
model (while also properly using Bloom’s taxonomic classification schema), each KSA was
updated to have an action verb at the very beginning of the sentence. The verbs were all
converted to present tense.
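With every KSA rewritten to begin with a present-tense action verb, the Bloom classification step reduces to a keyword lookup on the leading verb. The sketch below uses a small illustrative subset of Bloom verb keywords; the mapping actually applied by the SECCM team is far more extensive.

```python
# Illustrative subset of Bloom's cognitive-level keyword verbs (assumed,
# not the SECCM team's full schema).
BLOOM_VERBS = {
    "define": "Knowledge",     "list": "Knowledge",
    "explain": "Comprehension", "describe": "Comprehension",
    "apply": "Application",    "use": "Application",
    "analyze": "Analysis",     "compare": "Analysis",
    "design": "Synthesis",     "develop": "Synthesis",
    "evaluate": "Evaluation",  "judge": "Evaluation",
}

def bloom_level(ksa: str) -> str:
    """Classify a KSA by its leading action verb (present tense assumed).

    Note: rstrip("s") is a crude third-person-singular normalization
    ("Defines" -> "define"); a real implementation would lemmatize.
    """
    verb = ksa.split()[0].lower().rstrip("s")
    return BLOOM_VERBS.get(verb, "Unclassified")

print(bloom_level("Defines stakeholder requirements"))
print(bloom_level("Analyzes verification results"))
```

Verbs outside the lookup table fall through to "Unclassified", which is exactly the kind of KSA the SME baseline review would flag for rewording or re-assignment.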
Research conducted thus far has included routine updates and changes to the original
competency model following SME and stakeholder inputs. These processes have played a
significant role in the evolutionary approach and development of the model. The SECCM
WG matched these elements to three notional career experience levels: Entry, Journey, and
Expert. A two part division of skills addresses both core technical systems engineering and
program management, as well as professional skills competencies (Delgado, 2014).
Now that the model has been harmonized into a single, coherent whole, it will be
analyzed from various perspectives to study its characteristics. Doing so will allow the
SECCM WG to understand how it would be useful in the validation process.
During the validation process the model must be addressed in accordance with the Uniform Guidelines for
Employee Selection. To assist with this process, the Office of Personnel Management (OPM)
joined the SECCM team in July 2014. Figure 2 illustrates the life cycle of the validation
process and provides background information for the SECCM’s development.
Figure 2. SECCM Background
The specifications of 5 CFR 300A, Employment Practices, require (1) a job analysis
for selection and competitive promotions in Federal employment, (2) compliance with the
job-relatedness requirements of the Uniform Guidelines on Employee Selection Procedures
(43FR38290), and (3) that resulting assessments target competencies required for the
occupational position. Therefore, OPM recommends a job or occupational analysis to ensure
the most rigorous policies and standards governing human resources practices are met to be
able to fully use the systems engineering competency model for all human resources
functions.
G. CHAPTER SUMMARY
This chapter reviewed the evolution of the SECCM by identifying the eight original
SE competency models used to create the model. The problem statement and objectives were
established, each of which highlighted the importance of this project in developing a
competency model that can be used to create position descriptions within the DoD.
Bloom’s taxonomic classification schema was used to categorize each KSA into either a
cognitive or affective learning domain.

[Figure 2 text: used existing SE models as a framework (2,914 KSAs); developed a team of
SE SMEs to perform baseline revisions of the model (2,848 KSAs); offer the model to other
organizations in an effort to get feedback; update the model to include updates released for
any of the original references; validate the model in accordance with the Uniform Guidelines
(reducing it to a smaller, more manageable set of KSAs).]

The methodology for validating the model in accordance with the Uniform Guidelines was
also discussed in this chapter. (Note: A detailed literature review of each of the original
competency models used is available in the FY13 report.)
II. COMPETENCY MODEL DEVELOPMENT
This portion of the report will highlight changes made to the SECCM in FY14. FY14
research by the NPS SECCM research team identified whether each KSA was derived from
Education, Training, or On-the-Job Experience. It should be noted that the model was
designed to maintain its level of detail through this highly iterative process. Routine
updates and changes to the original reference competency models, together with SME and
stakeholder inputs, played a significant role in the evolutionary approach.
The OSD SPRDE Competency Model June 2013 Refresh was used as the basis for
the highest-level categorization for the SECCM competencies into which the KSAs were
mapped. The SECCM has 2,848 KSAs and 41 competencies. These are categorized into
Technical/Technical Management competencies (Table 2) and Professional competencies (Table 3).
Number Competency
1.0 Mission-Level Assessment
2.0 Stakeholder Requirements Definition
3.0 Requirements Analysis
4.0 Architecture Design
5.0 Implementation
6.0 Integration
7.0 Verification
8.0 Validation
9.0 Transition
10.0 Design Considerations
11.0 Tools and Techniques
12.0 Decision Analysis
13.0 Technical Planning
14.0 Technical Assessment
15.0 Configuration Management
16.0 Requirements Management
17.0 Risk Management
18.0 Data Management
19.0 Interface Management
20.0 Software Engineering Management
21.0 Acquisition
22.0 Problem Solving
34.0 Cost, Pricing and Rates
35.0 Cost Estimating
36.0 Financial Reporting and Metrics
38.0 Capture Planning and Proposal Process
39.0 Supplier Management
Table 2. SE Technical and Technical Management Competencies
It should be noted that although the SECCM was envisioned to focus on the specific
competencies that define systems engineers on a primarily technical and program
management basis, the set of competencies that reflects more generic engineering professional
skills, or "soft skills," is also included within the core model. The NPS SECCM research
team contends that systems engineers need these generic professional competencies at higher
proficiency levels, and at earlier career stages, than disciplinary engineers do. These
professional competencies are listed in Table 3.
Number Competency
23.0 Strategic Thinking
24.0 Professional Ethics
25.0 Leading High-Performance Teams
26.0 Communication
27.0 Coaching and Mentoring
28.0 Managing Stakeholders
29.0 Mission and Results Focus
30.0 Personal Effectiveness/Peer Interaction
31.0 Sound Judgment
32.0 Industry Landscape
33.0 Organization
37.0 Business Strategy
40.0 Industry Motivation, Incentives, Rewards
41.0 Negotiations
Table 3. Professional Competencies
The following updates were made in FY14 to the naval SE career levels based on these
experience-level definitions:
SE-01 Entry Level (0–3 years of work experience)
• Understands the key issues and their implications and is able to ask relevant and constructive questions on the subject. This level requires an understanding of the systems engineering role within the enterprise.
• Example: New hires enrolled in an engineering career development program, typically completed in 3 years.
SE-02 Journey Level (3–10 years of work experience)
• Displays an understanding of the subject but may require minimal guidance; with proper training and opportunity, will be able to provide guidance and advice to others.
• Example: GS-12 engineers who are working in systems engineering.
SE-03 Expert Level (10–12+ years of work experience)
• Possesses extensive and substantial practical experience and applied knowledge of the subject.
• Example: Senior systems engineers who lead systems engineering teams and may act as a chief systems engineer.
A. SUCCESSFUL COMPETENCY MODELING APPROACHES
Prior to developing the SECCM, several competency modeling approaches were explored:
the Pragmatic Guide to Competency report by Holt and Perry, the Career and
Competency Pathing report by LaRocca, the Graduate Reference Curriculum for Systems
Engineering (GRCSE) report by INCOSE, and the U.S. Department of Labor Employment
and Training Administration's (ETA) User Guide to Competency Models. A detailed list of
the existing SE competency models used, with a literature review, is available in the FY13
SECCM Interim Report; it should be noted that the SECCM was developed with all of
these aspects in mind.
The Holt and Perry guide focuses on what defines a good competency model. As
discussed in Chapter I, this guide concludes that a good competency model goes through
many iterations, focuses on a specific aspect of competency but is easy to understand, maps
competencies across levels, keeps a small number of levels, maps levels clearly and
emphasizes technical skills (Holt and Perry, 2011).
The Career and Competency Pathing Competency Modeling Approach by LaRocca
concentrates on how organizations can identify core competencies and how they can apply
the competency data to improve the performance of workers. Additionally, LaRocca’s model
illustrates some emerging trends in competency modeling. According to LaRocca, it is
imperative that organizations understand what knowledge, skills and abilities are required for
people in key roles to deliver business goals. LaRocca also stresses that there are six stages in
defining a competency model for a given job role: (1) defining the criteria for superior
performance in the role; (2) choosing a sample of people performing the role for data
collection; (3) collecting sample data about behaviors that lead to success; (4) developing
hypotheses about the competencies of outstanding performers and how the competencies work
together to produce desired results; (5) validating the results of data collection and
analysis; and (6) applying the competency models in real case scenarios (LaRocca, n.d.).
Research on competency modeling approaches has shown that competency data can be applied
to real-life applications (Schoonover et al., 2012; Arthur Anderson, 2002). For instance,
competencies can be used (in order of their effectiveness) in hiring practices, job
descriptions, training, performance management, and career pathing (Schoonover, Schoonover,
Nemerov and Ehly, 2012).
The Graduate Reference Curriculum for Systems Engineering (GRCSE) is part of
the Body of Knowledge and Curriculum to Advance Systems Engineering (BKCASE)
(INCOSE, 2013). This competency modeling approach focuses on how to use Bloom's
taxonomy to set the level of attainment of educational or learning outcomes required for
students engaged in an educational unit, course or program. In GRCSE, major attention is
given to the cognitive domain, which is concerned with knowledge and how it is learned
(Huitt, 2014). In contrast, the affective domain is a minor focus, as it is concerned with
feelings, appreciation and valuation of the content that is learned. With this in mind, the NPS
SECCM research team contends that within some educational settings (for example military
and theological), the affective domain is an explicit focus of competencies because of the
high standard of morals and values emphasized within the social settings. It should be noted
that this is one reason why the NPS study focuses on both the cognitive and affective
domains.
Lastly, NPS studied the U.S. Department of Labor Employment and Training
Administration (ETA) User Guide to Competency Models, which recommends five steps for
developing a competency model, as shown in Figure 3:
STEP 1: Gather background information. Existing frameworks and models are analyzed, organized, and evaluated to determine affinities.
STEP 2: Develop a draft competency model framework. Themes and patterns in the existing information are identified and reorganized, and a draft model is developed.
STEP 3: Gather feedback from SMEs. Review is requested to determine whether the framework reflects appropriate competencies, whether any competencies are missing, and whether any terminology changes need to be made.
STEP 4: Refine the competency framework. Development of a competency model is an iterative process; revisions, additions, deletions, and reorganization occur at this step.
STEP 5: Validate. Ensure acceptance by a target community of users.
Figure 3. Competency Model Development Steps (from U.S. Department of Labor
Employment and Training Administration 2013)
Note that steps 3 and 4 are repeated until the subject matter experts and the
development team agree that the model is an all-inclusive representation of the required KSAs.
As part of step 4, NPS (in its implementation of refining the SECCM) updated the
current version of the SECCM based on Bloom's taxonomic classification schema as adapted
by Krathwohl. Krathwohl converts the learning category from a noun to a verb. For
example, within the cognitive domain, Bloom had the skill of “Evaluation,” which became
"Evaluate" in Krathwohl's taxonomic schema. Figure 4 shows Krathwohl's update to Bloom's
taxonomy.
Figure 4. Krathwohl's Update to Bloom's Taxonomy
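Krathwohl's noun-to-verb conversion makes a KSA classifiable by its leading action verb. The short sketch below illustrates this; the verb lists and sample KSA statements are hypothetical examples, not drawn from the SECCM itself.

```python
# Illustrative sketch only: the verb lists and sample KSAs below are
# hypothetical, not taken from the SECCM. Krathwohl renames Bloom's noun
# categories as verbs, so a KSA can be binned by its leading action verb.
NOUN_TO_VERB = {
    "Knowledge": "Remember",
    "Comprehension": "Understand",
    "Application": "Apply",
    "Analysis": "Analyze",
    "Evaluation": "Evaluate",
    "Synthesis": "Create",
}

# A few example action verbs per revised category (far from exhaustive).
VERB_TO_CATEGORY = {
    "define": "Remember", "list": "Remember",
    "explain": "Understand", "describe": "Understand",
    "apply": "Apply", "use": "Apply",
    "analyze": "Analyze", "compare": "Analyze",
    "assess": "Evaluate", "evaluate": "Evaluate",
    "design": "Create", "develop": "Create",
}

def classify_ksa(ksa: str) -> str:
    """Classify a KSA statement by its leading action verb."""
    leading_verb = ksa.split()[0].lower()
    return VERB_TO_CATEGORY.get(leading_verb, "Unclassified")

print(classify_ksa("Evaluate alternative system architectures"))    # Evaluate
print(classify_ksa("Describe the stakeholder requirements process"))  # Understand
```

Writing every KSA with a present-tense leading verb (as described in the chapter summary) is what makes this kind of mechanical classification possible.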
B. EXISTING SE COMPETENCIES SERVE AS A FOUNDATION
The last thing that the world needs is yet more frameworks, so, again, the idea of cherry-picking different parts from different frameworks is a very attractive one.
John Holt and Simon A. Perry
A Pragmatic Guide to Competency: Tools, Frameworks and Assessment
In an effort to meet the needs of the Naval System Commands to identify the KSAs
required to be a competent systems engineer at various career experience levels, the NPS
SECCM research team identified the KSAs across a series of systems engineering domains.
Several competency models were used to verify the SECCM in an effort to combine skills
from different sources to generate a complete scope of SE KSAs. The team identified eight
SE competency models to determine the potential SE competencies for the Navy and
organized the elements based on their similarities. The eight SE competency models include:
NUWC, INCOSE UK, NAVAIR, SPAWAR, Boeing, NASA, MITRE and SPRDE. In the
end, the SECCM team decided to use the 41 competencies of the FY13 OSD Competency Model
Refresh; as a result, the SECCM has 41 competencies. Figure 5 is an
illustration of how existing competency models were used to create the SECCM.
Figure 5. Competency Sources Used in SECCM ( from Whitcomb, Khan and White 2013)
The next step in developing the SECCM was to map the KSAs associated with the four
KSA source models into the 41 SECCM competencies. The entries from the four competency
models were analyzed and re-organized based on the similarity of their competency
definitions.
The baseline review with SMEs verified that the KSAs were aligned to the correct
competency. If a KSA was determined to not be aligned to an appropriate competency, the
KSA was re-assigned to one deemed more appropriate by the SMEs. The baseline review
also identified some KSAs that did not belong in the model. If the SMEs and stakeholders
felt that a KSA did not apply to an SE application, the KSA was eliminated from the model.
In some instances, SMEs added KSAs to the model based on their experience. Following the
iterative process of FY13, redundant KSAs were also deleted and vague KSAs were re-written.
Figure 6 shows the models used to derive the KSAs. At the end of FY14, the SECCM had
2,848 KSAs.
Figure 6. KSA sources used in the SECCM (from Whitcomb, Khan and White 2013): the NUWC Newport SE Workforce Development Model, the INCOSE UK SE Competency Model, the MITRE SE Competency Model, and the DAU SPRDE Level I, II, and III Course Learning Objectives.
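The baseline-review clean-up described above (dropping duplicate KSAs and re-assigning misaligned ones) can be sketched as a small normalization-and-deduplication pass. The KSA records, competency names, and re-assignments below are hypothetical examples, not SECCM data.

```python
# Sketch (hypothetical KSA records): deduplicate KSAs after normalizing
# their text, and apply SME re-assignments of KSAs to more appropriate
# competencies, mirroring the baseline review described above.
ksas = [
    {"text": "Define system boundaries", "competency": "Requirements Analysis"},
    {"text": "define system boundaries", "competency": "Requirements Analysis"},  # duplicate
    {"text": "Track action items",       "competency": "Risk Management"},        # misaligned
]

# SME-directed re-assignments: normalized KSA text -> correct competency.
reassignments = {"track action items": "Technical Planning"}

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so near-identical KSAs match."""
    return " ".join(text.lower().split())

seen, cleaned = set(), []
for ksa in ksas:
    key = normalize(ksa["text"])
    if key in seen:
        continue  # drop the redundant KSA
    seen.add(key)
    competency = reassignments.get(key, ksa["competency"])
    cleaned.append({"text": ksa["text"], "competency": competency})

print(len(cleaned))  # 2 unique KSAs remain
```

In the actual effort these judgments were made by SME panels rather than string matching, but the bookkeeping follows the same pattern.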
C. VALIDATION OF MODEL USING UNIFORM GUIDELINES
Once the SECCM had been harmonized into a single, coherent model, it was ready to
be analyzed by the SECCM WG to study the feasibility of having it accepted by various
naval and engineering command communities. Doing so allowed the SECCM WG to
understand how it would be useful in the validation process. To assist with this process, the
Office of Personnel Management’s (OPM’s) Leadership and Workforce Development
Assessment (LWDA) team joined the SECCM WG to assist in the refinement, confirmation,
and strategic planning required to ensure the systems engineering competency model is a
legally defensible, relevant, and sound tool that may be used for a variety of human resources
purposes, such as career path modeling, skills gap assessment, and selection tool
development. The validation process includes the following: a job analysis (based on
incumbent and supervisor panels), an occupational analysis survey, proficiency level
identification, a competency gap analysis, and career path modeling.
There is currently no professional engineering occupational code (08XX) for systems
engineers within the DoD. The Defense Competency Assessment Tool (DCAT) is being used to
identify competency gaps based on occupational series. Since there is no validated SE
competency model aligned with an occupational series, a model and related systems
engineering tasks have to be created in order to accomplish a DCAT survey. Unlike OPM, the
Defense Civilian Personnel Advisory Service (DCPAS) does not have a personnel research
psychologist on staff to assist in validating a competency model; therefore, OPM was
tasked by DASN (RDT&E) to accomplish model validation.
In an effort to fulfill the Navy's mission, the SECCM WG identified systems engineering
as an area requiring further research to ensure employees performing systems engineering
tasks are proficient in the competencies required for success. Competency modeling is a key
tool for ensuring a capable staff to accomplish the Navy’s current and future mission. The
systems engineering competency model will be intended for use both in and outside of the
SECCM WG, and will leverage the research that has previously been conducted by NPS.
Figure 7 illustrates the SECCM Project Scope.
Figure 7. SECCM Project Scope. The scope comprises five areas: (1) project management, (2) model development, (3) proficiency level development, (4) competency gap analysis, and (5) career path modeling.
The following steps cover the entire occupational analysis methodology
recommended by OPM. The typical occupational analysis methodology entails four steps:
review of occupational information, facilitation of subject matter expert panels,
administration of surveys, and documentation. The occupational analysis methodology will
focus on the critical competencies needed to be successful in performing systems engineering
tasks. Specifically, information collected during the occupational analysis phase will provide
the basis for determining which competencies should be assessed for what purpose, and how
these competencies should be assessed. The NPS SECCM WG has already begun the review
of occupational information; therefore, the memorandum of agreement will begin with the
facilitation of SME panels. In addition, documentation of the work will help to ensure the
model meets legal and professional selection and assessment standards and guidelines for
maintaining current occupational analysis documentation.
Step 1: Conduct SME panels. LWDA psychologists will facilitate SME panels to
gather feedback and further refine the systems engineering competency model. LWDA
psychologists will use the listing of competencies previously identified by NPS as the
starting point for the panel. In addition, LWDA will develop an
initial listing of tasks from existing occupational information provided by NPS, such as
available position descriptions, classification standards, previous job analysis data, etc.
LWDA psychologists will then facilitate SME panels (one with incumbents and one with
supervisors) to refine the task and competency lists and finalize occupational analysis survey
content. According to OPM and for the purposes of this project, incumbents are defined as
employees who are currently performing systems engineering work and have a minimum of
6 months of experience in SE, though they may have more than the minimum. The
incumbents form a panel facilitated by OPM to review the tasks that systems engineers
typically perform, as aligned with the SECCM. Supervisors are defined as first-line
supervisors of the employees who perform systems engineering work and should also have a
minimum of 6 months of experience. The supervisors form a second panel, also facilitated
by OPM, to review the same tasks as aligned with the SECCM.
Step 2: Administer occupational analysis survey. The SE population must be identified to
determine which systems engineers to include in the survey pool. The sample should be
representative of the population in order to serve as a basis for model validation; only a
sample of the population will be surveyed. The population information is also needed to
complete the Cost Estimate document that is required to obtain DoD survey approval, which
must be obtained prior to deploying the survey. Identifying the population of systems
engineers in any organization is currently a challenge faced by the DoN and all other
defense organizations. There is no single best way to identify systems engineers, so each
organization must attempt to identify its own population by identifying engineers who
perform tasks related to SE. Once all organizations have identified their populations, the
results will be reviewed with OPM and the SECCM WG. There are several organizations
participating in the SECCM WG, including NPS, DASN RDT&E, NAVAIR, NUWC,
NSWCDD, SPAWAR, USMC, US Army, MDA, USAF, and the SE-UARC SERC. Only the
Missile Defense Agency (MDA) is being asked to provide participation for the 4th estate. The
Naval System Commands follow the tasker process and submit the responses directly into the
tasker system. Non-naval participants will send their inputs directly using a spreadsheet
template provided.
The occupational survey will be administered to a statistically significant sample of
the population, which may include over-sampling to capture the breadth of the possible
population, and the results will be analyzed. In some cases, the results may indicate that
portions of the sampled population should be excluded from the results. By oversampling,
there should be enough survey responses to maintain a statistically significant sample, so
that the results can be used to represent the SE population.
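For context, one common way to size a survey sample against a finite population is Cochran's formula with a finite-population correction. The OPM plan does not specify its sampling method, so the sketch below is purely illustrative, and the population figure of 5,000 systems engineers is a hypothetical example.

```python
import math

def sample_size(population: int, confidence_z: float = 1.96,
                margin: float = 0.05, p: float = 0.5) -> int:
    """Cochran's sample size with finite-population correction.

    confidence_z: z-score (1.96 corresponds to 95% confidence)
    margin: acceptable margin of error (0.05 = +/-5 points)
    p: assumed population proportion (0.5 is the most conservative)
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                    # finite-population correction
    return math.ceil(n)

# Hypothetical SE population of 5,000 incumbents:
print(sample_size(5000))  # 357 responses needed at 95% confidence, +/-5%
```

Oversampling beyond this minimum, as the text recommends, hedges against non-response and against later exclusion of respondents who turn out not to perform SE work.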
Occupational analysis survey content will be tailored to meet the needs of RDT&E
and NPS. Occupational analysis surveys typically assess, at a minimum, the importance of
competencies to successful job performance and the frequency of performance and
importance of tasks relevant to the job. Surveys may also measure employee need for
training or determine which competencies are required at entry to the job. The work
described in step one above would need to be completed prior to beginning this step. The
survey will be developed by OPM after interactions with the incumbents and SMEs. Once it
is drafted, it will be available for review by the members of the SECCM working group. The
survey draft is expected to be ready by early to mid-December 2014. Survey development
and administration includes the following steps.
a. Occupational Analysis Survey. LWDA will compile task and competency
information into a survey format. The survey population will be identified and targeted to
receive the survey through sampling relevant engineering series and survey branching
methodology (i.e., survey respondents will be asked a series of questions to identify
themselves as someone who performs systems engineering tasks, or functions as a systems
engineer, or supervises individuals who do these tasks). Survey respondents will be asked to
evaluate the task and competency items based on criteria tailored to meet the needs of
RDT&E and NPS SECCM WG, such as: 1) frequency; 2) importance; 3) required at entry;
and 4) need for training. Both employees who perform systems engineering work and their
supervisors would be surveyed. If the project remains on schedule, the survey will be
deployed in March 2015. Unfortunately, until a draft of the actual survey from OPM is
ready, there is no way to estimate how long the survey will take to complete.
b. Analysis of survey data. LWDA psychologists will analyze the data collected in
the survey and use this information to develop lists of critical tasks and competencies for
professionals performing systems engineering tasks. Data may be grouped for each
occupational series and grade level, by organizational group, or by additional criteria (e.g.,
roles) determined at a later date. Tasks and competencies must meet certain criteria in order
to be identified as critical, required at entry, or a training need.
c. Occupational Analysis Documentation. A key piece of ensuring a competency
model conforms to legal and professional standards and guidelines is the documentation of
the results of an occupational analysis. LWDA will document the methodology and results
for steps one and two in a technical report for use by RDT&E and the NPS SECCM WG.
This report will be designed to meet professional and legal guidelines.
Step 3: Proficiency Level Development. Proficiency levels define the range of proficiency
an individual can possess in a given competency. Proficiency levels are a key tool in many
human resources functions, including selection, development, and performance management.
This memorandum of agreement focuses on the use of proficiency levels for Competency
Gap Analysis. There are two types of proficiency level descriptions: (1) a generic
proficiency level scale, which uses a single set of definitions to describe proficiency for all
competencies and (2) custom proficiency level scales, which are customized to more
specifically describe proficiency on each competency.
a. Generic proficiency level scale. LWDA psychologists will review the competency
model and definitions developed by the SME panels. LWDA will hold a one-day focus
group with supervisors from the systems engineering profession. In this focus group,
participants will review the current competency models, and come to consensus on the
minimum required proficiency level for each of the competencies based on the generic
proficiency level scale. LWDA will work with SMEs to set proficiency levels that cut across
individuals performing systems engineering work and will not set proficiency levels for any
subgroups (e.g., by occupational series, organization).
b. Custom proficiency levels. Custom proficiency levels are useful for career path
modeling and employee development initiatives. LWDA recommends developing custom
proficiency levels for a subset of competencies identified by SMEs as essential
developmental competencies. For the purposes of this memorandum of agreement, LWDA
provides pricing based on the development of custom proficiency levels for eight
competencies. LWDA will conduct a four-day SME panel to develop the custom proficiency
levels and set required proficiencies for each of the competencies. Figure 8 illustrates the
details in the process of validating the SE Competency Model with OPM Guidelines.
Figure 8. Validation of SE Competency Model with OPM Guidelines
D. COMPETENCY GAP ASSESSMENT AND CAREER DEVELOPMENT
Step 1: Competency Gap Analysis. A gap analysis includes establishing required
proficiency levels as discussed above and determining employees’ current proficiency levels.
A gap exists when an employee's current proficiency level is below the required proficiency
level. LWDA will conduct competency gap analyses for the systems engineering
professionals identified by RDT&E and NPS through the survey methodology.
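The gap rule just stated (a gap exists when an employee's current proficiency level is below the required level) reduces to a per-competency comparison. The proficiency ratings below are hypothetical examples for illustration only.

```python
# Sketch (hypothetical ratings): a gap exists when an employee's current
# proficiency level is below the required level for a competency.
required = {"Requirements Analysis": 3, "Risk Management": 4, "Communication": 3}
current  = {"Requirements Analysis": 3, "Risk Management": 2, "Communication": 4}

# Report only competencies where current proficiency falls short, with the
# size of the shortfall.
gaps = {comp: required[comp] - current[comp]
        for comp in required
        if current.get(comp, 0) < required[comp]}

print(gaps)  # {'Risk Management': 2}
```

In the actual process, the required levels come from the SME focus groups and the current levels from the supervisor and incumbent questionnaires described below.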
a. Competency gap analysis questionnaires. LWDA will design and administer an
online questionnaire to gather current proficiency data. Two separate questionnaires will be
developed, one for supervisors and one for incumbents. Supervisors will provide ratings of
employees’ current levels of proficiency for each competency and employees will provide
ratings of their own proficiency. Respondents will use generic benchmarks that have been
used in competency gap analyses with other federal agencies, except for the identified
developmental competencies targeted for custom proficiency level development.
b. Analysis of questionnaire data. LWDA will analyze the data collected in the
questionnaire and use this information to identify proficiency gaps for each competency.
LWDA will report gaps across the workforce and can provide additional analyses for
subgroups of employees upon request. The current cost estimate includes analysis across the
workforce, without separate subgroup reporting.
c. Technical report. LWDA will document the methodology and results in a
technical report for use by RDT&E and NPS SECCM WG. This report will be designed to
meet professional and legal guidelines.
Step 2: Career Path Modeling. Career paths are established to guide employees,
their supervisors, and the organization as a whole for employee development purposes.
Career path models serve as a resource to employees seeking to further develop their
professional skills, and in the case of systems engineering professionals in the Navy, it may
introduce employees to opportunities they might otherwise not know exist. As systems
engineers do not have a unique general schedule (GS) level classification, employees
currently performing systems engineering tasks may be less aware of the career path
opportunities than their counterparts with a GS classification.
a. SME panels. A series of SME panels will be conducted to create the career paths
and their supporting documentation.
SME Panel 1: OPM will facilitate a one-day focus group with SMEs to define career
paths and grade levels for Navy employees. The paths will inform employees of career
progression options outlining a career lattice. After the first panels, NPS will continue to fill
in any gaps in the draft plans and also solicit input from other employees who did not
participate in the focus group.
SME Panel 2: OPM will facilitate a second one-day focus group with SMEs. In the
second panels, OPM will work with SMEs to confirm career paths and revise where needed,
and to identify the criteria that distinguish the progression of work at each grade level.
SME Panel 3: OPM will facilitate a one-day focus group with SMEs to identify
available enrichment activities, such as developmental assignments and training that are
appropriate for employees at each level of the career path. NPS SECCM WG has already
identified employee training opportunities for employees at three levels of seniority. OPM
would leverage this information and incorporate it into the career path model appropriately.
This information will help employees and their supervisors articulate the development that
will prepare employees for the next level of work. After the panels, NPS will continue to fill
in any gaps in the lists of enrichment activities and also solicit input from other employees
who did not participate in the focus group.
Step 3: Career path progression documentation. The project will be summarized in final
documents for the career path project. These will include a document outlining the purpose
of the plans, how they are to be used, and the responsibilities of the employee, supervisor,
and the organization. Career paths, grade levels, grade criteria, and enrichment activities will
be outlined. In addition, OPM will be available to assist NPS SECCM WG in conducting up
to two sessions to roll out the career plans to employees. Figure 9 illustrates the details of the
Competency Gap Assessment and Career Development.
Figure 9. Competency Gap Assessment and Career Development
E. POA&M
Figure 10. High Level SECCM Validation Phase Timeline
* Indicates tasks that are currently Navy specific, however DASN (RDT&E) is exploring the possibility of including other DoD organizations.
Figure 10 charts five task areas from FY14 quarter 4 (July–September 2014) through FY15 quarter 4 (July–September 2015): project management, model development,* proficiency level development,* competency gap analysis,* and career path modeling.
F. CHAPTER SUMMARY
Continuing research started in FY13, the KSAs defined in the SECCM were
mapped and analyzed using Bloom's taxonomic classification schema of cognitive
and affective learning domains. This approach resulted in an interactive model that
highlights the core KSAs required of DoD systems engineers. In FY14, following
recommendations made by SMEs and stakeholders on the SECCM, the KSAs
were again realigned to maintain consistency. This was achieved by the elimination of
duplicate KSAs. Items that were deemed to not fit within an SE domain were re-
categorized based on how each KSA was written. Based on the realignment and re-
categorization processes, the SECCM evolved from 2,601 KSAs and 32 competencies in
FY13 to 2,848 KSAs and 41 competencies by the end of FY14.
The SECCM WG baseline review verified that the KSAs were aligned to the
correct competency. If a KSA was determined to not be aligned to an appropriate
competency, the KSA was re-assigned to one deemed more appropriate by the SMEs.
The baseline review also identified some KSAs that did not belong in the model. If the
SMEs and stakeholders felt that a KSA did not apply to an SE application, the KSA was
eliminated from the model. In some instances, SMEs added KSAs to the model based on
their experience. Following an iterative process, redundant KSAs were also deleted and
vague KSAs were re-written. In an effort to enforce consistency in the model (while also
properly using Bloom’s taxonomic classification schema), each KSA was updated to
have an action verb at the very beginning of the sentence. The verbs were all converted to
present tense.
Most importantly, the chapter summarized how the model has been harmonized
into a single, coherent form that is consistent enough to be used for the OPM
validation process. The OPM LWDA will assist the NPS SECCM WG in achieving this
objective by performing a job analysis for systems engineering, facilitating
incumbent- and supervisor-level SE panels to identify the tasks performed by systems
engineers. Next, an occupational analysis survey will be developed and reviewed based
on the work of the panels. The survey will then be administered to the SE population
identified from the input of all participating organizations. The results will be
reviewed and shared with the SECCM WG in FY15. After the model is validated, it will
be available for use, sometime in FY16.
III. DATA AND ANALYSIS
This chapter discusses trends and findings identified when analyzing both
the technical and professional portions of the SECCM, as well as the differences
and similarities between the cognitive and affective learning domains.
A. PRIMARY RESEARCH BY COGNITIVE AND AFFECTIVE LEARNING DOMAIN
As the NPS research reflects, the majority (67%) of the SECCM's KSAs map to
the cognitive domain (reference Figure 11). Particularly noteworthy is how the
research illustrates the apportionment of specific SE-related competencies that are
characteristic of high performance and success in SE-related jobs. For example, the
fact that 33% of the KSAs are grouped in the affective domain reflects the importance
of that domain to the development of an SE (reference Figure 11). In addition, the
NPS SECCM research lends weight to current research findings that suggest that
affective learning constructs, such as Value and Respond (Figure 14), are "important
constructs today's students must possess" and are essentially a deciding factor in the
successful career achievement of lead performers (Joshi et al., 2012; Moore and Rudd,
2005).
Figure 11. Percent of Cognitive and Affective KSAs in SECCM
[Pie chart: Cognitive 67% (1,908); Affective 33% (940).]
Within the cognitive domain, about 24% of the KSAs align with the
learning construct Remember, 11% with Understand, and 45% with Apply (reference
Figure 12).
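The domain and construct breakouts reported above amount to a simple tally over each KSA's Bloom mapping. The sketch below is illustrative only: the handful of KSA mappings is invented (the real SECCM holds 2,848 KSAs), and it merely shows the kind of counting that yields the Figure 11 and Figure 12 percentages.

```python
# Minimal sketch: count KSAs per Bloom learning construct and compute the
# cognitive/affective split. The six-item KSA list is an invented placeholder.
from collections import Counter

COGNITIVE = {"Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"}
AFFECTIVE = {"Receive", "Respond", "Value", "Organize", "Characterize"}

def breakout(ksa_levels):
    """Return (% cognitive, % affective, per-construct counts)."""
    counts = Counter(ksa_levels)
    total = sum(counts.values())
    cog = sum(n for lvl, n in counts.items() if lvl in COGNITIVE)
    aff = sum(n for lvl, n in counts.items() if lvl in AFFECTIVE)
    return 100 * cog / total, 100 * aff / total, counts

levels = ["Apply", "Apply", "Remember", "Respond", "Understand", "Value"]
cog_pct, aff_pct, counts = breakout(levels)
print(f"cognitive {cog_pct:.0f}%, affective {aff_pct:.0f}%")
# prints: cognitive 67%, affective 33%
```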
Figure 12. Percent of SECCM KSAs within each Cognitive Learning Construct
[Pie chart: Remember 24%, Understand 11%, Apply 45%, Analyze 9%, Evaluate 9%, Create 2%.]

Figure 13. Count of SECCM KSAs Categorized as Cognitive Learning Constructs
Using Bloom's Taxonomy
Within the affective learning domain, about 76% of the KSAs were mapped to the
learning constructs of Receiving and Responding (71% to Responding alone).
Seventeen percent of the KSAs were mapped to the learning construct of Valuing
(reference Figure 14).
Figure 14. Percent of SECCM KSAs within each Affective Learning Construct
[Pie chart: Receive 5%, Respond 71%, Value 17%, Organize 3%, Characterize 4%.]

Figure 15. Count of SECCM KSAs Categorized as Affective Learning Constructs
Using Bloom's Taxonomy
[Bar chart: Receive 45, Respond 668, Value 161, Organize 25, Characterize 41.]
B. CHAPTER SUMMARY
The SECCM is primarily comprised of KSAs associated with the cognitive
domain. Within the cognitive domain, a total of 35% of the KSAs align with the
learning constructs of Remembering and Understanding, and 45% align with Applying.
A major implication of these findings is that a large portion of the SECCM relies
on applying previously learned knowledge.
Although the SECCM consists mostly of KSAs in the cognitive domain, it is also
evident that to be a competent SE it is important to have affective skills and to be
able to demonstrate the appropriate level of interpersonal competencies related to
emotions, feelings, and attitudes. For example, within the affective learning domain,
a total of 76% of the KSAs were aligned with Receiving and Responding.
IV. FINDINGS/RESULTS
This technical report addressed the lack of a current SE competency model that can
be used in the development of SE position descriptions for the DoD. The SECCM is
unique in that it pinpoints which KSAs are required for the development of naval
systems engineering competencies at various levels. The model has 2,848 KSAs mapped
across 41 competencies.
The technical report discussed how the use of Bloom's taxonomic
classification schema for categorizing KSAs relates to competency development. To
address the application of these competencies across proficiency levels, each KSA was
mapped to one of three notional career levels, designated SE-01 (Entry Level), SE-02
(Journey Level), or SE-03 (Expert Level). Each KSA was also categorized by the source
through which it is developed: Education and Training, On-the-Job Experience, or
Professional Development.
After analyzing the SECCM, it was found that within the SE-01 career level, the
KSAs were mostly associated with the lower-level cognitive and affective learning
constructs. Knowledge, Comprehension, Receiving, and Responding are all learning
constructs that can be attained through Education and Training. As an SE progresses
from the SE-02 to the SE-03 level, the focus of the skill set shifts toward
Application; that is, at this stage of career development, the individual is required
to apply what was learned in order to do his/her job.
A. PRIMARY FINDINGS ON EXPERIENCE LEVELS
In the SE-01 career level for the Technical competencies, the majority of the
KSAs fell within the cognitive learning constructs of Remember and Understand. For
instance, 43% of the KSAs in the SE-01 Entry level fell within Remembering, and 18%
fell under Understanding. This finding implies that these lower-level cognitive
constructs can be attained through Education and Training.
Interestingly, when looking at the affective domain within the Professional
skills competency model, it is clear that the three career levels all look very
similar. For instance, throughout each of the career levels, Responding and Valuing
formed the majority of the affective domain levels. Figure 19 illustrates the trend
of Bloom's cognitive levels within the SECCM.
Figure 16. Bloom's Cognitive Levels within the SECCM SE-01 Career Level
[Pie chart: Remember 43%, Understand 18%, Apply 22%, Analyze 9%, Evaluate 3%, Create 5%.]
Figure 17. Bloom's Cognitive Levels within the SECCM SE-02 Career Level
[Pie chart: Remember 16%, Understand 9%, Apply 49%, Analyze 8%, Evaluate 9%, Create 9%.]

Figure 18. Bloom's Cognitive Levels within the SECCM SE-03 Career Level
[Pie chart: Remember 12%, Understand 5%, Apply 54%, Analyze 8%, Evaluate 13%, Create 8%.]
Figure 19. SECCM Career Level Trend to Have the Majority of the Cognitive KSAs in the Remember and Apply Learning Domains
(Note: key is the same as Figures 16-18)
[Three pie charts. SE-01: Remember 43%, Understand 18%, Apply 21%, Analyze 9%, Evaluate 4%, Create 5%. SE-02: Remember 16%, Understand 9%, Apply 49%, Analyze 8%, Evaluate 9%, Create 9%. SE-03: Remember 12%, Understand 5%, Apply 54%, Analyze 8%, Evaluate 13%, Create 8%.]
The evolution of KSAs across career levels within the affective domain for the
SECCM is represented in Figures 20-22. In the SE-01 and SE-02 career levels, a
majority of the KSAs deal with the affective learning domains of Receiving and
Responding (91% and 77%, respectively). This finding may be associated with the idea
that these are lower-level affective traits that are classified as part of an
individual's personality. The SE-03 level is dominated by Responding, Valuing, and
Characterization. Figure 23 illustrates the trend of Bloom's affective learning
domains within the SECCM.
Figure 20. Bloom's Affective Levels within the SECCM SE-01 Career Level
[Pie chart: Receive 9%, Respond 82%, Value 6%, Organize 3%, Characterize <1%.]
Figure 21. Bloom's Affective Levels within the SECCM SE-02 Career Level
[Pie chart: Receive 2%, Respond 75%, Value 17%, Organize 4%, Characterize 2%.]

Figure 22. Bloom's Affective Levels within the SECCM SE-03 Career Level
[Pie chart: Receive 3%, Respond 52%, Value 31%, Organize 2%, Characterize 12%.]
Figure 23. SECCM Career Level Trend to Have the Majority of Affective KSAs in the Respond Learning Domain
(Note: the key is the same as Figures 20-22)
[Three pie charts. SE-01: Receive 9%, Respond 82%, Value 6%, Organize 3%. SE-02: Receive 2%, Respond 75%, Value 17%, Organize 4%, Characterize 2%. SE-03: Receive 3%, Respond 52%, Value 31%, Organize 2%, Characterize 12%.]
B. USE CASE SCENARIO BY SPAWAR SSC ATLANTIC: IMPLICATIONS AND FINDINGS
In FY14, SPAWAR SSC Atlantic adopted the SECCM v0.5 as a use case scenario
to create its SE competency model. As illustrated in Figure 24, about 14% of the
SECCM v0.5 was used directly. The SPAWAR SE competency model is still in the early
stages of development, but a fair amount of it was tailored from the SECCM v0.5
specifically to meet SPAWAR's needs. This use case implies that the SECCM can serve
as a foundation for an organization developing a tailored SE competency model. With
this in mind, it is expected that the SECCM will be shared with other organizations
in the future, with the additional capability to track how many of the SECCM's KSAs
are embedded in each derived model.
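Tracking how much of the SECCM is embedded in a derived model, as described above, can be sketched as a set intersection over KSA statements. This is a hypothetical illustration: the KSA strings are invented placeholders, and matching real models would likely require normalization or fuzzy matching rather than exact string equality.

```python
# Illustrative sketch of the Figure 24 style computation: how many SECCM
# KSAs appear verbatim in a derived (e.g., SPAWAR-tailored) model, and what
# share of the SECCM that represents. All KSA text below is invented.

def reuse_share(seccm_ksas, derived_ksas):
    """Return (count, percent) of SECCM KSAs reused verbatim in the derived model."""
    reused = set(seccm_ksas) & set(derived_ksas)
    return len(reused), 100 * len(reused) / len(set(seccm_ksas))

seccm = {
    "Understand the need for system representations",
    "Support systems studies and analyses",
    "Describe the VV&A process",
    "Define the strategy for modeling and simulation",
}
derived = {
    "Describe the VV&A process",
    "Support systems studies and analyses",
    "Tailored KSA specific to the adopting organization",
}
count, pct = reuse_share(seccm, derived)
print(count, f"{pct:.0f}%")
# prints: 2 50%
```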
Figure 24. Percent and Count of SECCM KSAs adapted by the SPAWAR
Competency Model
[Chart: SECCM v0.5 model KSAs (2,257); KSAs used directly in the SPAWAR model (308, or 14%).]

Other findings related to SPAWAR's adaptation of the SECCM v0.5 as a
foundational tool are that, when comparing the percentage of cognitive KSAs in the
two models, the majority of the KSAs in both fall under the learning domain of
Application. In contrast, the learning domain of Knowledge was found to be very
important (38%) in the SPAWAR model, while Comprehension has a proportionally higher
percentage in the SECCM. As an illustration of these comparisons, Figure 25 compares
the cognitive KSAs mapped to SPAWAR's model against the SECCM. SPAWAR's competencies
25.0 System of Systems, 5.0 Requirements Analysis, 16.0 Technical Assessment, and
6.0 Architecture Design each represented more than 10% of the KSAs taken from the
SECCM. (Note that the assignment of competency numbers in the SECCM is arbitrary.
For an example of a competency table from the SECCM, refer to Appendix A.)

Figure 25. SPAWAR vs. SECCM: Percent of KSAs by Cognitive Domain
Figure 25. SPAWAR vs. SECCM: Percent of KSAs by Cognitive Domain
The KSAs taken directly from the SECCM v0.5 and used in the SPAWAR model
are shown in Figure 26. A major FY14 iteration produced the SECCM v0.5, which better
aligned the model to the Bloom's taxonomic classification schema used by the Graduate
Reference Curriculum for Systems Engineering; the SPAWAR analysis was therefore based
on an earlier iteration of the SECCM. The model has since been iterated to v0.78,
whose major change was alignment to Bloom's taxonomic classification schema as
adapted by Krathwohl. (Note: in the remainder of the report and in all previous
sections, unless otherwise noted, updates to the SECCM refer to the most current
version of the model, v0.78.)
Figure 26. Number of KSAs from the SECCM used by SPAWAR
[Bar chart (number of SECCM KSAs SPAWAR used, by competency; scale 0-50): 6.0 Architecture Design, 16.0 Technical Assessment, 5.0 Requirements, 25.0 System of Systems, 8.0 Integration, 7.0 Implementation, 13.0 RAM, 3.0 Safety Assurance, 22.0 Software, 23.0 Acquisition, 11.0 Transition, 14.0 Decision Analysis.]
C. CHAPTER SUMMARY
This chapter discussed trends and findings from analyzing the SECCM, including
how the SECCM was adapted as a foundation for SPAWAR's SE competency model.
Similarities between SPAWAR's model and the SECCM were illustrated using pie charts
and bar graphs.
V. SUMMARY, CONCLUSIONS, RECOMMENDATIONS AND FURTHER RESEARCH
This chapter presents a brief summary of the research conducted in order to
address the need for a SE position description and career development plan. The
development of the SECCM is summarized, conclusions from the analysis of the data are
presented, and recommendations for future research are provided.
A. SUMMARY, CONCLUSIONS AND RECOMMENDATIONS
The goal is for the SECCM to be used across the DoN and, as the model evolves,
across the DoD. To meet this goal, the model must be validated in accordance with
(IAW) the Uniform Guidelines. In FY14 the Office of Personnel Management agreed to
join the team and review the SECCM for validation under the Uniform Guidelines on
Employee Selection Procedures. Validation of this model will ensure that rigorous
policies and standards are available for a DoN systems engineering competency model
for human resource management. Although many organizations within the DoN and DoD,
both the services and the fourth estate, have competency models that are locally
validated for their own use, there is currently no SE competency model validated IAW
the Uniform Guidelines on Employee Selection. Only a model validated strictly IAW the
Uniform Guidelines can be used with confidence for all HR functions, especially for
"high stakes" functions like hiring, writing position descriptions, and creating job
announcements. While the SECCM has been used as the basis for the INCOSE competency
model update and as a foundation for SPAWAR's competency model, it will not hold up
legally in court unless it is validated in accordance with the Uniform Guidelines.
Validation of the SECCM will involve practicing DoN systems engineers and their
supervisors.
There are several organizations participating in the SECCM WG, including NPS,
DASN (RDT&E), NAVAIR, NUWC, NSWCDD, SPAWAR, USMC, US Army, MDA, USAF, and the
SE UARC SERC. The Office of the Assistant Secretary of the Navy for Research,
Development, Test & Evaluation believes that the SECCM can provide a valuable
resource for all the services: a model validated IAW the OPM Uniform Guidelines on
Employee Selection. Due to the importance of having a model validated for "high
stakes" HR functions, DASN (RDT&E) is willing to fund OPM to validate the model for
all services represented in the SECCM WG, making a model that is useful for all of
OSD.
In an effort to fulfill the Navy's mission, the SECCM WG has identified systems
engineering as an area requiring further research to ensure that employees performing
systems engineering tasks are proficient in the competencies required for success.
Competency modeling is a key tool for ensuring a capable staff to accomplish the
Navy's current and future missions. To this end, the systems engineering competency
model is intended for use both inside and outside of the SECCM WG, and will leverage
the research previously conducted by the NPS SECCM research team in FY13.
Continuing research begun in FY13, the KSAs defined in the SECCM were
mapped and analyzed using Bloom's taxonomic classification schema of cognitive
and affective learning domains. This approach resulted in an interactive model that
highlights the core KSAs required of DoD systems engineers. In FY14, following
recommendations from SMEs and stakeholders, the KSAs were realigned to maintain
consistency: duplicate KSAs were eliminated, and items deemed not to fit within an
SE domain were re-categorized based on how each KSA was written. Through these
realignment and re-categorization processes, the SECCM evolved from 2,601 KSAs and
32 competencies in FY13 to 2,848 KSAs and 41 competencies by the end of FY14.
To summarize, the NPS SECCM research team contends that the model can be
used to assist graduate academic programs in specifying objectives within systems
engineering programs, ensuring students have the entry-level knowledge, skills, and
abilities required to perform successfully in their future jobs. This will be further
investigated in FY15; because the KSAs are already defined in terms of Bloom's
taxonomy, they lend themselves to the direct development of learning objectives for
university degree programs. Furthermore, the NPS SECCM research team believes that
the SECCM can be used to assist with career development modeling and with the
development of position descriptions for the DoD. As previously discussed, FY14
research has begun to focus on achieving this aspect of the research with guidance
provided by OPM.
APPENDIX A. EXAMPLES OF CAREER LEVELS FOR THE V0.78 SECCM MODEL
Entry Level: 11.0 Tools and Techniques
Knowledge, Skills & Abilities (KSAs), grouped by Cognitive & Affective Skill Level. (In the source table, each KSA is marked as developed through Education, Training, or On-the-Job Experience.)

Remember, Understand, Receive Phenomena & Respond to Phenomena
• Understand the scope and limitations of models and simulations, including definition, implementation and analysis
• Understand the need for system representations
• Understand the different types of modeling and simulation
• Support the M&S specialist in creating and validating models
• Support systems studies and analyses
• Participate in networked and federated M&S developments (e.g., training exercise support or simulation war games), with assistance from the specialist
• Know which models, simulations, or decision support tools can be used to support your analysis, evaluation, instruction, training or experiment
• Know various models & simulations and when it is beneficial to integrate them
• Know the right model or simulation to meet your specific needs
• Know decision support tools, models, or simulations that are applicable to your job
• Collaborate with the specialist to run M&S scenarios, based on current and future operational capabilities
• Assist the specialist with collecting data and formulating assumptions to create and validate simple simulation models (e.g., operational capabilities, networking, computing resources, and processes)

Apply, Analyze, & Value
• Demonstrate candidate modeling and/or simulation approach (e.g., constructive, virtual, and live synthetic environments) while working with the specialist
• Build prototypes or design experiments to test system concepts and their feasibility

Evaluate, Create, Organize & Characterize by a Value
• Survey existing data and previous modeling efforts to incorporate previous M&S capabilities into the current effort
• Identify potential integration and interoperability links within and between modeling and simulation tools and synthetic environments
Journey Level: 11.0 Tools and Techniques
Knowledge, Skills & Abilities (KSAs), grouped by Cognitive & Affective Skill Level. (In the source table, each KSA is marked as developed through Education, Training, or On-the-Job Experience.)

Remember, Understand, Receive Phenomena & Respond to Phenomena
• Understand the risks of using models and simulations which are outside the validated limits
• Model systems of varying complexities in their environments
• Know how to run and interpret the output of modeling and simulation tools, to provide insight or training applicable to real-world situations
• Know how to initialize the modeling and simulation tools
• Collaborate with the specialist to interpret the results of various M&S scenarios, based on current and future operational capabilities
• Select appropriate tools and techniques for functional analysis
• Define a process that includes requirements for appropriate tools and techniques for architectural design
• Use scenarios to determine robustness
• Contribute to definition of design and product constraints for a subsystem or simple project

Apply, Analyze, & Value
• Apply models, simulations and/or decision support tools
• Evaluate models, simulations and/or decision support tools
• Evolve models and/or simulations
• Employ models and/or simulations
• Integrate models and/or simulations
• Manage models and/or simulations
• Apply models and/or simulations
• Develop models and/or simulations
• Determine requirements for the application of models and/or simulations
• Define an appropriate representation of a system or system element
• Collaborate with the specialist to develop assumptions and scenarios, and create and validate complex simulation models (e.g., operational capabilities, networking, computing resources, and processes)
• Collaborate with the M&S specialist to identify approaches, create and validate models, interpret results, and participate in cooperative modeling arrangements
• Apply real-world data in models or simulations for computer generated forces, mathematical modeling, physical modeling, scientific research, and statistical analysis
• Analyze models, simulations and/or decision support tools
• Perform Value Engineering, an organized, systematic technique to analyze the functions of systems, equipment, facilities, services, and supplies to ensure they achieve their essential functions at the lowest life-cycle cost consistent with required performance, reliability, quality, and safety
• Perform sustainability analyses to reduce system total ownership cost by uncovering previously hidden or ignored life-cycle costs, leading to more informed decisions earlier in the acquisition life cycle
• Evolve the authoritative model of the system under development
• Employ the authoritative model of the system under development
• Integrate the authoritative model of the system under development
• Manage the authoritative model of the system under development
• Develop the authoritative model of the system under development
• Apply the authoritative model of the system under development
• Interpret modeling or simulation results to more fully explore concepts, refine system characteristics/designs, assess overall system performance, and better inform acquisition program decisions
• Provide the technical basis for program budgets that reflect program phase requirements and best practices, using knowledge of Earned Value Management, cost drivers, risk factors, and historical documentation (e.g., hardware, operational software, lab/support software)
• Provide the technical basis for comprehensive cost estimates that reflect program phase requirements and best practices, using knowledge of Earned Value Management, cost drivers, risk factors, and historical documentation (e.g., hardware, operational software, lab/support software)
Journey Level Cont.: 11.0 Tools and Techniques
Knowledge, Skills & Abilities (KSAs), grouped by Cognitive & Affective Skill Level. (In the source table, each KSA is marked as developed through Education, Training, or On-the-Job Experience.)

Evaluate, Create, Organize & Characterize by a Value
• Suggest collaboration with other organizations to establish integration and interoperability within and between modeling and simulation tools and synthetic environments
• Plan models or simulations to drive exercises
• Identify approaches, tools, and techniques to describe, analyze, and synthesize complicated and complex systems
• Execute models or simulations to drive exercises
• Compare the strengths and limitations of modeling and simulation approaches, identify approaches that fit the scope, and define the visualization approach while working with the specialist
• Ensure credibility of models or simulations by adhering to and applying sound verification, validation and accreditation (VV&A) practices
• Create a comprehensive, integrated model that describes systems of varying complexities in their environments, including systems dynamics (e.g., human, organizational and technical dynamics)
• Develop innovative solutions that include S&T developmental prototyping, experimentation and visualization techniques
Expert Level: 11.0 Tools and Techniques
Knowledge, Skills & Abilities (KSAs), grouped by Cognitive & Affective Skill Level. (In the source table, each KSA is marked as developed through Education, Training, or On-the-Job Experience.)

Remember, Understand, Receive Phenomena & Respond to Phenomena
• Present to the sponsor/customer and key stakeholders a comprehensive, integrated model that describes systems of varying complexities in their environments, including systems dynamics (e.g., human, organizational and technical dynamics)
• Describe the VV&A process
• Describe the M&S planning process as a support tool for systems engineering
• Describe the M&S hierarchy
• Define the strategy and approach to be adopted for the modeling and simulation of a system or system element

Apply, Analyze, & Value
• Recommend the scope of modeling, simulation and analysis activities within and across projects/programs, with input from the modeling and simulation specialist
• Recommend modeling and simulation approaches within and across projects, programs, and enterprises, including the visualization approach
• Recommend M&S scope, approaches, and changes to operational capabilities, and facilitate cooperative modeling arrangements
• Recommend changes to current and future operational capabilities based on modeling and simulation results
• Provide expert technical advice on the verification, validation and accreditation of models or simulations
• Provide expert technical advice on model or simulation architectures
• Manage/supervise the development and application of models and/or simulations for a program
• Lead networked and federated M&S developments (e.g., major training exercises or simulation war games), with assistance from the specialist
• Guide the formulation of assumptions and scenarios developed by the specialist to create complex simulation models (e.g., operational capabilities, networking, computing resources, and processes)
• Explain the difference between fidelity and resolution
• Explain the difference between a model and a simulation
• Explain the application of modeling and simulation to systems engineering
• Update modeling and simulation standards, policy, and guidance for an organization
• Review modeling and simulation standards, policy, and guidance
• Develop modeling and simulation standards, policy, and guidance
• Demonstrate a full understanding of complex simulations for a system or system element
• Apply doctrinal and operational knowledge during simulation exercise execution
• Advise on the suitability and limitations of models and simulations
APPENDIX B. SE COMPETENCY OBJECTIVES
SE Competency: Cognitive & Affective Components
Objectives
• Develop a competency model and identify and evaluate the cognitive and affective domains associated with SE KSAs.
• Define KSAs using Bloom's Taxonomy: Cognitive and Affective Domains.
• Harmonize the SE Competency Career Development Model with GRCSE.
• Develop an approach and methodology to obtain the baseline information needed for Naval SE Competency Model life cycle development (knowledge, skills, abilities and behaviors, and related education, training, and experience needed).
• Define career paths for systems engineers (jobs, assignments, and timing).
• Document results.

Approach
• Develop an Excel spreadsheet-based SE Competency Career Development Model.
• Utilize Bloom's Taxonomy to define KSAs in cognitive and affective domains.
• Analyze the cognitive/affective skills needed to develop as a proficient SE.
• Document results in a thesis.

Faculty: Cliff Whitcomb
Sponsor: DASN
Partners: Naval SYSCOMS
Student: Corina White
Grad: June 2014
APPENDIX C. KRATHWOHL COGNITIVE AND AFFECTIVE DOMAINS
LEVEL | EXAMPLE COMPETENCIES

Remember (R)
• The student is able to recite the definitions of "system" and "emergence" and state the connection between them.
• The student is able to describe the notion of product system architecture and state the impact architecture may have on system development success.

Understand (U)
• The student is able to explain, in a very general way, the conditions under which a system development team might choose to use a waterfall (or iterative, incremental, or spiral) life cycle model.
• The student is able to explain the range of cases for which a particular systems modeling approach is applicable.

Apply (AP)
• Given the operational concept and requirements of a simple system along with a specified budget and required completion time, the student is able to choose (and to provide a rudimentary justification for the choice) a particular life cycle model to address the project; e.g., waterfall, iterative, incremental, or spiral.
• The student is able to construct a simple model of a defined system that would demonstrate understanding of the relationship of the primary factors included in the model.

Analyze (AN)
• Given a simple requirements document and a domain model for an application, the student is able to critique the domain model.
• Given the operational concept of a system along with a requirements document, a budget, a schedule, a choice of a development process, and a justification of the use of that process for the project, the student is able to find and explain errors in the justification and/or in the choice of the process.
• The student can analyze the effectiveness of a simple model of a system to describe the behavior of that system and identify errors or weaknesses in the model arising from the assumptions about the system embedded in the model.

Evaluate (EV)
• Given a detailed requirements document and a well-constructed domain model for a system, the student is able to design at least one basic architecture for the system.
• Given an operational concept, requirements, architecture, and detailed design documents for a system, the student is able to construct a complete implementation plan and provide a cogent argument that if the implementation of the architecture or detailed design is performed according to the plan, then the result will be a system that satisfies the requirements, fulfills the operational concept, and will be completed within the budget and schedule.
• The student can develop and use a model of a simple system, where the system is described by an architecture, to determine the capability of the system represented by the model and to explore desirable parameters of model elements.

Create (C)
• Given an operational concept, requirements, architecture, a detailed design, and an implementation plan, including budget and schedule, for a system, as well as a feasibility argument for the implementation plan, the student is able to assess the plan and to either explain why the feasibility argument is valid or why and where it is flawed with regard to any of the claims regarding implementation of the requirements, fulfillment of the operational concept, or the ability to be completed within budget and schedule.
• Given a simple system, the student is able to plan a test and evaluation method to perform a verification and validation process of that system against the requirements of the system and the need description associated with the system.
• Given a simple system and a test and evaluation plan of the system, the student is able to determine that the results that would be produced through use of the test and evaluation plan will yield a useful verification and validation of the system.
LEVEL / EXAMPLE COMPETENCIES

Receive (RC)
• The student accepts that customer or user perception of the quality of a system is the fundamental determinant of system quality.
• The student accepts that customers do not always fully describe what they want or need, and that there is a difference between what customers say they want and what they actually need.
• The student is able to describe the value of the SE approach to design.

Respond (RS)
• The student learns how to ask questions to elicit the unstated desires of a stakeholder who is seeking a system development.
• The student is willing to try the SE approach on a small project.

Value (V)
• The student believes it is important to provide system solutions that satisfy the range of stakeholder concerns in a manner that the stakeholders judge to be good.
• The student believes it is important to elicit a nuanced description of what stakeholders desire of a system in order to provide rich knowledge that can be used in the system solution development.
• The student believes in the value of the application of SE principles in a project, even in the face of advocates for other methods.
• The student recognizes the value of advancing in proficiency in SE competencies.

Organize (OR)
• The student is able to organize a coherent framework of beliefs and understandings to support use of an SE method in a project.
• The student has a coherent framework for how to discuss system development with stakeholders and to incorporate the views of a variety of stakeholders in a balanced manner.

Characterize (CH)
• The student will routinely approach system development projects with an SE framework.
• The student will routinely evaluate the appropriate tailoring of SE processes to address the specific characteristics of each project.
• The student will appropriately weigh the views of all stakeholders and seek to overcome conflicts between stakeholders using methods that are technically and socially appropriate.
LIST OF REFERENCES
Alexander, Juli. 2013. "Analysis of Training Requirements and Competencies for the Naval Acquisition Systems Engineering Workforce." Master's thesis, Naval Postgraduate School. http://hdl.handle.net/10945/34618

Bloom, Benjamin S., Max D. Engelhart, Edward J. Furst, Walker H. Hill, and David R. Krathwohl. 1956. Taxonomy of Educational Objectives. New York: David McKay.

Boulter, Nick, Murray M. Dalziel, and Jackie Hill. 1998. Achieving the Perfect Fit. Houston: Gulf Publishing.

Chi, Lei, Avimanyu Datta, K. D. Joshi, and Shu Han. 2010. "Changing the Competitive Landscape: Continuous Innovation through IT-Enabled Knowledge Capabilities." Information Systems Research 21 (3): 472-495.

Delgado, Jessica. 2014. Director, Acquisition Career Management (DACM) Newsletter. http://www.secnav.navy.mil/rda/workforce/Documents/DACM%20Corner%20Newsletter%20-%20October%202014.pdf

Florida International University. "Bloom's Taxonomy Versions." Accessed May 9, 2014. http://apa.fiu.edu/documents_new/FIU%ASSESSMENT%20HANDBOOK.pdf.

GRCSE. 2011. Graduate Reference Curriculum for Systems Engineering (GRCSE) v0.5. Stevens Institute of Technology.

Holt, Jon, and Simon A. Perry. 2011. A Pragmatic Guide to Competency. Swindon: British Informatics Society.

Hudson. 2013. IT Staffing: Do Technical or Interpersonal Skills Matter More? Accessed May 20, 2014. http://us.hudson.com/it/blog/postid/405/it-staffing-do-technical-or-interpersonal-skills-matter-more.

Huitt, William. Educational Psychology Interactive. Accessed May 2014. http://chiron.valdosta.edu/whuitt/col/soccog/soccog.html.

INCOSE. BKCASE. Accessed November 2013. http://www.bkcase.org.

INCOSE. 2012. Systems Engineering Competency Framework. San Diego, CA: International Council on Systems Engineering (INCOSE) UK, INCOSE-TP-2010-003.

Kendall, Frank. September 6, 2013. "Sunsetting of the Systems Planning, Research, Development, and Engineering-Program Systems Engineer Acquisition Workforce Career Path and the Renaming of the Systems Planning, Research, Development, and Engineering Acquisition Workforce Career Field." Memo, Department of Defense.
Khan, Rabia. 2014. "Assessment of Self-Efficacy in Systems Engineering as an Indicator of Competency Level Achievement." Master's thesis, Naval Postgraduate School. http://hdl.handle.net/10945/42660

Krathwohl, David. 2002. "A Revision of Bloom's Taxonomy: An Overview." Accessed May 2014. http://www.unco.edu/cetl/sir/stating_outcome/documents/Krathwohl.pdf.

LaRocca, Maggie. The Career and Competency Pathing: The Competency Modeling Approach. Accessed May 2014. http://edweb.sdsu.edu/people/arossett/pie/interventions/career_1.htm.

Lasley-Hunter, Brooke, and Alan Preston. 2011. Systems Planning, Research, Development and Engineering (SPRDE) Workforce Competency Assessment Report. CNA Analysis and Solutions.

Layton, Evelyn. 2007. Training Professionals for the Acquisition Workforce. Fort Belvoir, VA: Defense Acquisition University.

Wilson, Leslie O. 2013. "Beyond Bloom's - Cognitive Taxonomy Revised." The Second Principle. http://thesecondprinciple.com/teaching-essentials/beyond-bloom-cognitive-taxonomy-revised/.

MMI. Taxonomies of Learning: Visual Comparisons of the Taxonomies. Accessed May 9, 2014. http://www.mmiweb.org.uk/downloads/bloom.html.

Schoonover, Steven, Helen Schoonover, Donald Nemerov, and Christine Ehly. 2012. "Competency-Based HR Applications: Results of a Comprehensive Survey." Accessed May 20. http://www.humanasset.net/resources/htm.

Thesecondprinciple. 2014. "Beyond Bloom's - Cognitive Taxonomy Revised." Accessed March 7. http://thesecondprinciple.com/teaching-essentials/beyond-bloom-cognitive-taxonomy-revised/.

U.S. Department of Labor Employment and Training Administration. 2013. "Department of Labor Employment and Training Administration." Accessed March 9. http://www.careeronestop.org/CompetencyModel/userguide_competency.aspx.

Walter, Paul G. 2013. "A Model for Effective Systems Engineering Workforce Development at Space and Naval Warfare Systems Center (SSC) Atlantic." Master's thesis, Naval Postgraduate School. http://hdl.handle.net/10945/37738

Whitcomb, Clifford, Rabia Khan, and Corina White. 2013. "Systems Engineering Competency Report." Paper presented to the Systems Engineering Stakeholders Group, Washington, D.C. https://calhoun.nps.edu/handle/10945/43424
Wrightstuffmusic. 2014. "Bloom's Taxonomy from Pirates of the Caribbean." Accessed May 9. http://wrightstuffmusic.com/wp-content/uploads/2012/08/Blooms_taxonomy_poster.jpg.

White, Corina. 2014. "Development of Systems Engineering Competency Career Development Model: An Analytical Approach Using Bloom's Taxonomy." Master's thesis, Naval Postgraduate School. http://hdl.handle.net/10945/42752