Preliminary Design Review (PDR) — transcript from web.mst.edu/~cornss/se368/Blanchard/SysEng368_Team5...
Team 5 PDR report Page
Preliminary Design Review (PDR) Wireless Immersive Training Vest Monitoring System
Prepared by SysEng 368 Group 5
Chris Blanchard – [email protected]
Gareth Caunt – [email protected]
Michael Donnerstein – [email protected]
Chris Neuman – [email protected]
Varun Ramachandran – [email protected]
Submitted: November 14, 2012
Table of Contents
List of Figures ............................................................................................................................................ ii
List of Tables ............................................................................................................................................ iii
Background ................................................................................................................................................... 1
Introduction – Statement of Need ................................................................................................................ 1
Operational View (OV-1) ............................................................................................................................... 2
System Level Requirements (Tier 0) ............................................................................................................. 3
Assumptions .................................................................................................................................................. 4
Feasibility Analysis Overview ........................................................................................................................ 4
Feasibility Analysis and Trade Study ............................................................................................................. 6
Relaying System ........................................................................................................................................ 6
Recording System...................................................................................................................................... 8
Data Analysis ............................................................................................................................................. 9
Feedback Relay to Soldiers ..................................................................................................................... 10
Updated Kiviat Chart ................................................................................................................................... 11
System Level Measures of Effectiveness .................................................................................................... 11
Technical Performance Measurements ...................................................................................................... 13
Functional Analysis and Decomposition ..................................................................................................... 15
Receive Data ........................................................................................................................................... 15
Process Data ............................................................................................................................................ 16
Store Data ............................................................................................................................................... 17
Relay Data ............................................................................................................................................... 18
Receive Data ........................................................................................................................................... 18
Alert Soldier ............................................................................................................................................ 19
Physical Architecture .................................................................................................................................. 20
Functional to Physical Mapping .............................................................................................................. 22
Technical Management Plan ....................................................................................................................... 23
Risk Analysis ................................................................................................................................................ 23
Updated Risk Matrix ............................................................................................................................... 25
Work Breakdown Structure ........................................................................................................................ 29
Cost Estimate .............................................................................................................................................. 30
Schedule ...................................................................................................................................................... 31
Support Concept ......................................................................................................................................... 32
Program Management Best Practices (PMBP) ........................................................................................... 36
Appendix A – Technical Management Plan ................................................................................................ 38
Appendix B – Software Development ......................................................................................................... 46
Pseudo Code ........................................................................................................................................... 46
Graphical User Interface (GUI) ................................................................................................................ 47
Appendix C – Operational Feasibility .......................................................................................................... 48
Safety ...................................................................................................................................................... 48
Reliability ................................................................................................................................................. 51
Maintainability ........................................................................................................................................ 56
Availability ............................................................................................................................................... 60
Affordability ............................................................................................................................................ 62
Supportability .......................................................................................................................................... 63
Disposability ............................................................................................................................................ 65
Usability .................................................................................................................................................. 67
Appendix D – Hardware Technical Specifications ....................................................................................... 69
Meshlium Router .................................................................................................................................... 69
Laptop ..................................................................................................................................................... 73
Appendix E: Vendor Communications ....................................................................................................... 74
List of Figures
Figure 1: Customer Supplied OV-1 ................................................................................................................ 2
Figure 2: Relay Data Kiviat Chart ................................................................................................................... 7
Figure 3: Data Storage Kiviat Chart ............................................................................................................... 8
Figure 4: Data Analysis Kiviat Chart .............................................................................................................. 9
Figure 5: Feedback Relay Kiviat Chart ......................................................................................................... 10
Figure 6: Measure of Architecture Kiviat Chart .......................................................................................... 11
Figure 7: Measures of effectiveness ........................................................................................................... 12
Figure 8: PDR Functional Decomposition ................................................................................................... 15
Figure 9: Receive Data Function ................................................................................................................. 15
Figure 10: Process Data functions ............................................................................................................... 16
Figure 11: Store Data functions .................................................................................................................. 17
Figure 12: Relay Data functions .................................................................................................................. 18
Figure 13: Receive Data from Trainer functions ......................................................................................... 18
Figure 14: Alert Soldier functions ............................................................................................................... 19
Figure 15: System Physical Architecture ..................................................................................................... 20
Figure 16: Risks........................................................................................................................................... 27
Figure 17: Risk consequence assessment table ......................................................................................... 28
Figure 18: Risk likelihood assessment table ............................................................................................... 28
Figure 19: Combined Risk Assessment Table ............................................................................................. 29
Figure 20: Schedule for current phase of project ....................................................................................... 31
Figure 21: Schedule for subsequent phases of project ............................................................................... 32
Figure 22: Preventive Maintenance Flowchart ........................................................................................... 34
Figure 23: In-House Developed GUI ............................................................................................................ 47
Figure 24: Component Interfaces ............................................................................................................... 51
Figure 25: Weibull Probability ..................................................................................................................... 54
Figure 26: Corrective Maintenance Cycle ................................................................................................... 56
Figure 27: Repair Time Distribution ............................................................................................................ 57
Figure 28: Preventative Maintenance Cycle ............................................................................................... 58
Figure 29: Top Level Cost Breakdown Structure ......................................................................................... 62
List of Tables
Table 1: Tier 0 Requirements ........................................................................................................................ 3
Table 2: Relay Data ....................................................................................................................................... 7
Table 3: Data Storage .................................................................................................................................... 8
Table 4: Data Analysis ................................................................................................................................... 9
Table 5: Feedback Relay .............................................................................................................................. 10
Table 6: Technical Performance Measures ................................................................................................. 13
Table 7: Replacement Relay Data Element ................................................................................................. 21
Table 8: Tier 0 Mapping .............................................................................................................................. 22
Table 9: System Level Work Breakdown Structure ...................................................................................... 29
Table 10: High Level Cost Breakdown ......................................................................................................... 30
Table 11: Detailed Hardware Costs ............................................................................................................. 30
Table 12: Severity Categories ...................................................................................................................... 49
Table 13: Probability Levels ........................................................................................................................ 49
Table 14: Risk Assessment Matrix ............................................................................................................... 50
Table 15: Life Cycles .................................................................................................................................... 52
Table 16: Estimated MTBF .......................................................................................................................... 52
Table 17: FMECA Example .......................................................................................................................... 54
Table 18: Standard FMECA Rankings .......................................................................................................... 55
Table 19: Typical Maintenance Tasks ......................................................................................................... 59
Table 20: Rolled-Up Sub-Element Costs ..................................................................................................... 63
Table 21: Production Cost Breakdown ....................................................................................................... 63
Background
A current reality for the US military is that its soldiers are deployed into numerous operational arenas around the world. In each operation, soldiers must interact with non-combatant foreign nationals as well as work with their squads in near-combat or full combat situations. One certainty for the military is that it will constantly be training new troops for deployment into the field. The problem facing commanding officers is that “new soldiers make errors in-country that cost
lives and Intel opportunities.”
The solution to minimize these mistakes is more intensive training for new troops. A system needs to be
designed that will allow a skilled trainer to monitor up to eight soldiers in training missions and to
provide timely feedback to the soldiers should a combat situation or social faux pas occur. This ‘control
center’ for the trainer must also be able to receive and monitor the vital health statistics for each soldier
so that immediate care can be dispatched if needed.
The system to be developed will receive data from the legacy ITV vest, biotelemetry, and gesture
recognition systems through the Mote system. Data received will be processed and displayed to the
trainer for near-live monitoring and feedback as well as recorded for future debriefing of training
sessions. A legacy developed haptic feedback system will also be integrated for soldier ‘live’ feedback.
Introduction – Statement of Need
This paper is provided as supplementary justification to the Group 5 PDR presentation that was given
and recorded for the customer on November 8th, 2012. The material in the PDR presentation and this
report was assembled by SYSENG 368 group 5 and is intended to satisfy the need statement given
below.
This project is to design a means to record/relay to a trainer the movements and reactions of
soldiers in a given training environment, allowing for the evaluation of their ability to interact
culturally with non-combatant foreign nationals. The scenario this will be used in will be an
Afghanistan village, although the system must be flexible enough to be applied to other training
scenarios. The information provided to the system will be through a set of legacy equipment as
specified by the Integrated Training Vest (ITV) system. This information is relayed to a trainer in
a control room monitoring a group of up to eight soldiers using the ITV system so that the trainer
can evaluate whether a social faux pas has been committed. The system must be capable of
monitoring, recording, and conveying sufficient information to evaluate the soldiers’
performance within the simulation as well as the health of the soldiers during training. The
overall budget for the development of the system is not to exceed $5,000.¹ The system design
must be available by December 11, 2012, and a prototype must be available for integration
into the Missouri Mote system by May 5, 2013.

¹ The budgetary constraint was originally set at $5,000, as stated above. The customer authorized the
expansion of the budget to $10,000 during the conceptual design phase of the project.
The remainder of this paper deals with specific elements of the system as presented in the PDR.
Operational View (OV-1)
Figure 1: Customer Supplied OV-1
An OV-1 for the Immersive Training System (ITS) is depicted in Figure 1 above. The ITS is envisioned as a
system of systems. Data is relayed to a central control center where trainers can monitor numerous
soldiers engaged in combat and cultural situations. The system shall monitor the biometrics of all
soldiers to alert the trainer to any health concerns. Motion trackers will track each soldier's positioning
and motions in the field, in relation to both his squad members and the non-combatants or hostiles in
the training arena. Data relayed to the central control center will be filtered, and the trainer will be
alerted when a cultural or combat faux pas occurs. Vital signs shall also be constantly monitored by the
trainer. The system shall allow the trainer to send feedback to the soldiers based on cultural faux pas,
combat faux pas, or health conditions that the system detects. Finally, the system shall store data in a
way that streamlines debriefing and enables immediate post-training feedback to the soldier.
System Level Requirements (Tier 0)
After reviewing the OV-1 and the statement of need, Team 5 developed a list of requirements. These
requirements were confirmed with our customer, Colonel Pape, who agreed with the prioritization
below. The requirements list is prioritized from top (highest) to bottom (lowest).
Table 1: Tier 0 Requirements

1. The system shall alert the trainer to a medical emergency within 15 seconds.
2. The system shall provide sufficient information for a suitably skilled trainer to monitor the health of the soldiers during training.
3. The system shall gauge the criticality of the data being received.
4. The system shall prioritize according to the criticality of the data.
5. The system shall receive the prioritized data.
6. The system budget shall be $10,000 or less.
7. The system shall interface through the existing legacy equipment specified by the ITV.
8. The system shall relay data from the ITV to a control room with a maximum latency of 500 ms from sensor input to display.
9. The system shall monitor data in real time from the equipment in the ITV system available on the project start date.
10. Any additions to the ITV must mimic real-life mass distributions.
11. The system shall record data from the ITV.
12. The system shall be designed so a trainer can monitor eight (8) soldiers.
13. The system shall relay sufficient data for the trainer to evaluate whether a social faux pas has occurred.
14. The system shall relay sufficient data for the trainer to evaluate whether a ‘Patrol Tactic/Combat’ faux pas has occurred.
15. The system shall alert a suitably skilled trainer to a faux pas within 15 seconds.
16. The system reliability over 8 hours shall be 99%.
17. The batteries used by the system shall be recycled.
18. The system design must be available by 12-11-2012.
19. The system shall have a prototype available by 05-05-2013.
20. The system shall be operable in an area of 120 sq ft to ¼ sq mile.
21. The system design shall be compatible with the Missouri Mote system available on the project start date.
22. The system shall be adaptable to multiple scenarios.
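Requirements 3–5 describe criticality-gated handling: incoming ITV data is ranked by criticality and processed in that order. A minimal sketch of this behavior is shown below in Python (the team's stated development tool is MATLAB, so this is illustrative only, and the message types and criticality ordering are our assumptions, not customer-specified values):

```python
import heapq

# Hypothetical criticality ranking (lower number = more critical); the
# actual ordering would be agreed with the customer during detailed design.
CRITICALITY = {"medical_emergency": 0, "combat_faux_pas": 1,
               "cultural_faux_pas": 2, "routine_telemetry": 3}

def prioritize(messages):
    """Yield incoming ITV messages most-critical-first
    (Tier 0 requirements 3-5: gauge, prioritize, receive)."""
    heap = []
    for seq, msg in enumerate(messages):
        # seq breaks ties so equally critical messages keep arrival order
        heapq.heappush(heap, (CRITICALITY[msg["type"]], seq, msg))
    while heap:
        _, _, msg = heapq.heappop(heap)
        yield msg

incoming = [
    {"type": "routine_telemetry", "soldier": 3},
    {"type": "cultural_faux_pas", "soldier": 1},
    {"type": "medical_emergency", "soldier": 7},
]
ordered = list(prioritize(incoming))
# The medical emergency is handled first regardless of arrival order.
```

A real implementation would also have to bound queue latency so that prioritization never delays a medical alert past the 15-second requirement.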
These requirements were analyzed to determine whether a solution was feasible within the technical,
schedule, and cost constraints. As a result of that analysis, Team 5 firmly believes that a solution is
feasible and that options are available to either increase capability or reduce cost. Some of these options
were examined during the development of the Conceptual Design Review. The preferred solution is a
combination of the existing items available for the ITV, commercially available hardware and software,
and MATLAB-developed code to cover any deficiencies in the existing code.
Assumptions
The following is a list of the assumptions on which our solution is predicated:
1. Existing mote is integrated with the biotelemetry system
2. Existing mote is integrated with the gesture tracking system
3. Existing mote, biotelemetry system and the gesture tracking system do not impact soldier
safety.
4. Existing mote, biotelemetry system and the gesture tracking system do not alter the weight or
weight distribution for the soldier more than is acceptable to the customer. This facilitates
realistic loading on the soldier during the training exercise.
5. Existing mote, biotelemetry system and the gesture tracking system have their own support and
disposal mechanisms that will not adversely impact our design.
6. Matlab software development is at no cost to Team 5.
7. MST MATLAB licenses are suitable for any development that is required.
8. Virtual Cultural Monitoring System (VCMS) is available for integration into our system.
9. Existing API or example code is available for the mote, biotelemetry system and the VCMS and is
suitable for integration with Matlab.
10. Suitable electrical power is available at the training facility.
Feasibility Analysis Overview
After reviewing the customer’s Statement of Need, a group of requirements was developed, listed
above in Table 1. The scope of the project was further subdivided into four Tier 1 sub-systems: a means
of relaying data from the motes to the control room, a system to record separate data streams for each
monitored soldier in the control room, an interface to display the received data in a meaningful way for
the trainer in the control room, and a system to allow the trainer to send feedback to the soldiers using
the existing mote network. Criteria that were considered equal among the different options were not
included in the initial trade study; as further analysis is performed, they may be reconsidered as required.
By choosing COTS products, the producibility and disposability aspects of the design are delegated to the
COTS vendors. Electronic waste recycling programs which accept electronics are in place and will
continue to be for the foreseeable future. COTS also address the majority of the supportability aspects
of the design should the solution be installed within the United States. Transportation back to base may
be required should the system be installed in a foreign location. The level of user servicing would be
limited to replacement of failed components and configuring the replacement.
The usability of the system will be heavily influenced by the development of the analysis software.
Usability for maintenance staff of the other components will be dependent on their installation in the
training environment. COTS vendors are responsible for the intrinsic safety of their supplies. Installed
locations of components will impact overall safety. Suitable documentation would need to be provided
to ensure that safe use and maintenance can take place.
To determine a more accurate life-cycle cost of the system, the final selection of elements will be
required. Various logistical models can then be used to predict spares requirements. This study does not
offer a complete life-cycle costing.
The key performance parameters considered were:

- Number of trainees the system can accommodate. The customer requested at least 8 trainees in the scenario, with a preference for more. No maximum number of trainees was stated; however, the size of the training area would limit the number of trainees based on combat faux pas rules. No analysis has been done to determine the maximum number of trainees in a 120 sq ft area.
- Time to alert. This parameter was flexible, with the customer providing guidance for the various alert scenarios. Medical emergencies required a response in less than 20 seconds and training errors in less than 60 seconds. Further refinement led to a common timing of 15 seconds.
- Size of operational area. The customer requested an area of 120 sq ft to ¼ sq mile.
- Trainee equipment weight. The customer's need was for the trainee's equipment weight not to vary noticeably from a normal combat patrol weight. The ITV will be used in place of the soldier's body armour.
- System cost. The total cost of the system is limited to $10,000 for procurement. Life-cycle costing is to be based on 8 hours of training per day, 5 days a week, 45 weeks a year. The annual cost of operations, including replacements, spare parts, and disposal of expended equipment, is not to exceed $50,000.
- System reliability. The reliability of the system should be 99% over an 8-hour training mission. The customer also indicated that the system should be designed for at least one year's life.
- Faux pas detection. The customer requested that the system detect at least 2 but no more than 30 faux pas, including both cultural and combat faux pas. Further analysis is required to determine a suitable list of the faux pas to be detected; this would need to be worked out with the customer and subject matter experts.
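The 99% reliability requirement over an 8-hour mission implies a minimum mean time between failures. Assuming a constant failure rate (an exponential reliability model, which is our modeling assumption, not a customer requirement), R(t) = exp(−t/MTBF), so MTBF = −t / ln R. A quick check:

```python
import math

def required_mtbf(reliability, mission_hours):
    """Minimum MTBF (hours) to achieve a given mission reliability,
    assuming a constant failure rate: R(t) = exp(-t / MTBF)."""
    return -mission_hours / math.log(reliability)

# 99% reliability over an 8-hour training mission
mtbf = required_mtbf(0.99, 8)   # roughly 796 hours

# Annual operating hours from the life-cycle costing basis:
# 8 h/day x 5 days/week x 45 weeks/year
annual_hours = 8 * 5 * 45       # 1800 hours of training per year
```

At 1,800 operating hours per year, an MTBF near 800 hours suggests on the order of two failures per system per year, a figure that would feed the spares-requirement predictions from the logistical models.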
Our solution does not directly limit the number of soldiers in the training scenario. We have identified a
risk that the bandwidth available to communicate trainee actions to the control room is the limiting
factor. As stated above, further analysis would need to be done to determine the maximum number of
trainees the training area can accommodate while allowing them to move around without committing a
combat faux pas.

Our solution does not add any equipment to the soldier at this stage, which entirely addresses the need
that trainee equipment weight not be significantly increased.
The first pass of the feasibility analysis showed that several options met the customer’s needs. Our
preferred solution was selected by further analysis of the options that scored closely against the key
performance parameters; the sensitivity of these close options to changes in scoring was also assessed.
The preferred solution is highlighted below in green. No analysis has been completed on synergies
between the different components of the solution; further research is required to obtain relevant
information to assess the value of using hardware and software from related vendors.
Feasibility Analysis and Trade Study

Relaying System (Motes to the Control Room)
- Cisco 1552S outdoor access point: http://www.cisco.com/en/US/prod/collateral/wireless/ps5679/ps11451/ps12440/data_sheet_aironet_1552s.pdf
- Meshlium ZigBee router: http://www.libelium.com/index.php
- National Instruments Wireless Sensor Networks: www.ni.com/wsn
- Microstrain gateways and wireless sensors: http://www.microstrain.com/wireless/systems
- Missouri Mote system
Table 2: Relay Data
Each cell shows the raw score, with the weighted score (raw score x weight / 100) in parentheses.

Criteria             Weight  Alt 1:        Alt 2:      Alt 3:                 Alt 4:        Alt 5:
                             Cisco 1552S   Meshlium    National Instruments   Microstrain   Missouri Mote
Cost                   50    1 (0.50)      9 (4.50)    1 (0.50)               3 (1.50)      9 (4.50)
Power Consumption      15    9 (1.35)      3 (0.45)    9 (1.35)               3 (0.45)      3 (0.45)
Range                  15    9 (1.35)      3 (0.45)    3 (0.45)               3 (0.45)      3 (0.45)
Data Transfer Rate     20    9 (1.80)      3 (0.60)    9 (1.80)               3 (0.60)      3 (0.60)
Weighted Total               5.00          6.00        4.10                   3.00          6.00
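The weighted totals in Table 2 can be reproduced with a short script (weighted score = raw score x criterion weight / 100); this is simply a check on the arithmetic, not part of the original analysis tooling:

```python
# Recompute the Table 2 weighted scores and totals.
weights = {"Cost": 50, "Power Consumption": 15, "Range": 15, "Data Transfer Rate": 20}

scores = {
    "Cisco 1552S":          {"Cost": 1, "Power Consumption": 9, "Range": 9, "Data Transfer Rate": 9},
    "Meshlium":             {"Cost": 9, "Power Consumption": 3, "Range": 3, "Data Transfer Rate": 3},
    "National Instruments": {"Cost": 1, "Power Consumption": 9, "Range": 3, "Data Transfer Rate": 9},
    "Microstrain":          {"Cost": 3, "Power Consumption": 3, "Range": 3, "Data Transfer Rate": 3},
    "Missouri Mote":        {"Cost": 9, "Power Consumption": 3, "Range": 3, "Data Transfer Rate": 3},
}

def weighted_total(alt_scores, weights):
    """Sum of raw score x (criterion weight / 100) over all criteria."""
    return sum(alt_scores[c] * weights[c] / 100 for c in weights)

totals = {alt: round(weighted_total(s, weights), 2) for alt, s in scores.items()}
# totals reproduces the Weighted Total row of Table 2.
```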
Figure 2: Relay Data Kiviat Chart
Recording System
Military Grade Storage Drives
Flash Storage
Cloud Storage
Consumer Hard Drives
Table 3: Data Storage
Criteria             Weight  Alt 1:            Alt 2:          Alt 3:          Alt 4:
                             Military Drives   Flash Storage   Cloud Storage   Consumer Hard Drive
Cost                   25    1 (0.25)          3 (0.75)        3 (0.75)        9 (2.25)
Power Consumption      10    1 (0.10)          3 (0.30)        9 (0.90)        3 (0.30)
Security               15    9 (1.35)          9 (1.35)        1 (0.15)        9 (1.35)
Portability            10    3 (0.30)          9 (0.90)        9 (0.90)        3 (0.30)
Component Life         10    9 (0.90)          1 (0.10)        3 (0.30)        3 (0.30)
Storage Capacity       20    9 (1.80)          1 (0.20)        3 (0.60)        9 (1.80)
Data Write Rate        10    9 (0.90)          3 (0.30)        1 (0.10)        9 (0.90)
Weighted Total               5.60              3.90            3.70            7.20
Figure 3: Data Storage Kiviat Chart
Data Analysis
LABView
MatLAB / Simulink
Excel/Access
Google Charts/Fusion
Bespoke code
Table 4: Data Analysis
Criteria            Weight  Alt 1:     Alt 2:            Alt 3:         Alt 4:                 Alt 5:
                            LABView    MatLAB/Simulink   Excel/Access   Google Charts/Fusion   Bespoke Code
Cost                  20    9 (1.80)   9 (1.80)          9 (1.80)       9 (1.80)               9 (1.80)
Complexity (code)     20    3 (0.60)   9 (1.80)          9 (1.80)       3 (0.60)               1 (0.20)
Maintainability       20    9 (1.80)   9 (1.80)          9 (1.80)       3 (0.60)               3 (0.60)
Training              10    9 (0.90)   3 (0.30)          9 (0.90)       3 (0.30)               1 (0.10)
Portability           10    3 (0.30)   3 (0.30)          3 (0.30)       9 (0.90)               9 (0.90)
Flexibility           10    9 (0.90)   9 (0.90)          3 (0.30)       3 (0.30)               9 (0.90)
Supportability        10    9 (0.90)   9 (0.90)          9 (0.90)       3 (0.30)               9 (0.90)
Weighted Total              7.20       7.80              7.80           4.80                   5.40
Figure 4: Data Analysis Kiviat Chart
Feedback Relay to Soldiers
Verbal Feedback through Radio System
Tactile Feedback through existing Tactor system
Table 5: Feedback Relay
Criteria         Weight  Alt 1: Verbal   Alt 2: Tactile
Response Time      30    3 (0.90)        9 (2.70)
Accuracy           20    9 (1.80)        3 (0.60)
Two-way            15    9 (1.35)        1 (0.15)
Recordability      35    3 (1.05)        9 (3.15)
Weighted Total           5.10            6.60
Figure 5: Feedback Relay Kiviat Chart
Updated Kiviat Chart
[Two panels: Before PDR, For PDR]
Figure 6: Measure of Architecture Kiviat Chart
System Level Measures of Effectiveness
System value is measured by the cost and effectiveness of the system. Having defined our key system
attributes, we can quantify the technical factors of the system based on inputs from the customer and
our feasibility analysis. The economic factors of system value pertain to material, operation and
support, and R&D costs. Figure 7 shows a design objective tree in which system value at the top level
breaks into economic factors and technical factors; each of these is then broken down into our design
criteria and the MOEs defined at the conceptual design review stage.
Figure 7: Measures of Effectiveness

System Measure
- Economic Factors (System Life-cycle Cost): R&D Cost; Material Cost; Operations Cost; Support Cost;
  Hardware Cost; Software Cost; Labor/Programming Cost; Product Testing Cost; Training Cost;
  Maintenance Cost; Disposal Cost; Miscellaneous Peripherals and Contingency Cost
- Technical Factors: Survivability; Reliability; Flexibility; Affordability; Adaptability; Robustness
- Quantified MOEs: number of simultaneous trainees = 8; number of cultural and combat faux pas
  detected = 8; time to alert of a faux pas or medical emergency = 15 seconds; size of operational area
  of the system = 120 sq. ft. to ¼ mile; reliability of system = >= 99% over 8 hours; ITV trainee
  equipment weight = no change; system acquisition cost
Technical Performance Measurements
Technical Performance Measures (TPMs) show how well a system is satisfying its requirements and
meeting its goals. The TPMs are derived from the functional, maintenance and support requirements of
the system; maintainability and support have been rolled up into a single TPM for availability. These
requirements are critical to accomplishing the objectives and satisfying the customer, and they directly
affect system usefulness.
Table 6 below displays the system's selected TPMs. The TPMs are expected to be measured throughout
the lifecycle of the system and tracked using charts. Additional TPMs may be added to the list upon
selection of specific products.
The number of identifiable faux pas is currently less than expected and is software dependent: as the
software algorithms for recognizing faux pas mature, this number is expected to improve.
Table 6: Technical Performance Measures

TPM Description | Expected Value | Current Value | Stage Identified | Last Update | Requirement | Function(s) | Function Allocation
1. Number of soldiers relayed and recorded simultaneously | 8 soldiers | Up to 15 soldiers [2] | CDR | CDR | Tier 0.12 | Relay Data, Store Data | 1.1, 1.2, 1.3, 2.1, 3.1
2. Size of training area monitored | 120 sq ft (dependent on MS&T network) | WiFi link to laptop: up to 1600 ft [3]; 802.15.4 to router: up to 4.3 mi [4] | CDR | HW2 | Tier 0.20 | Relay Data | 4.3
3. Mission data storage time | 8 hours | 8000 hours [5] | HW3 | HW3 | Tier 0.11 | Store Data, Receive Data | 2.1.2
4. Number of identifiable faux pas | 12 separate actions | 5 separate actions | PDR | PDR | Tier 0.15 | Process Data | 3.2
5. Alert trainer to faux pas time | 15 seconds | 15 seconds | CDR | CDR | Tier 0.15 | Relay Data | 4.2
6. Alert soldier to faux pas time via control center | 15 seconds | 15 seconds | CDR | CDR | Tier 0.15 | Relay Data | 4.2
7. Medical emergency alert time | 15 seconds | 15 seconds | CDR | CDR | Tier 0.1 | Relay Data | 4.1
8. Heart rate data measurements received, recorded | 1 / second / soldier | 1 / second / soldier | CDR | CDR | Tier 0.1, 0.2 | Relay Data, Store Data, Process Data | 3.1.2.1
9. Body temperature data measurements received, recorded | 1 / second / soldier | 1 / second / soldier | CDR | CDR | Tier 0.1, 0.2 | Relay Data, Store Data, Process Data | 3.1.2.2
10. Respiration data measurements received, recorded | 1 / second / soldier | 1 / second / soldier | CDR | CDR | Tier 0.1, 0.2 | Relay Data, Store Data, Process Data | 3.1.2.3
11. Body position data measurements received, recorded | 1 / second / soldier | 1 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.1
12. Body acceleration data measurements received, recorded | 2 / second / soldier | 2 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.2
13. Head acceleration data measurements received, recorded | 1 / second / soldier | 2 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.3
14. Arm position data measurements received, recorded | 1 / second / soldier | 1 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.4
15. Arm acceleration data measurements received, recorded | 2 / second / soldier | 2 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.5
16. Hand position data measurements received, recorded | 2 / second / soldier | 2 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.6
17. Hand acceleration data measurements received, recorded | 5 / second / soldier | 5 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.7
18. Hand gesture measurements received, recorded | 10 / second / soldier | 10 / second / soldier | CDR | CDR | Tier 0.12, 0.13, 0.14 | Relay Data, Store Data, Process Data | 3.1.3.8
19. Average data latency from vest to trainer display | 500 ms from sensor input to display | 500 ms from sensor input to display | CDR | CDR | Tier 0.8 | Relay Data | 4.4
20. Number of soldiers actively monitored per trainer screen | 8 soldiers | 8 soldiers | CDR | CDR | Tier 0.12 | Process Data | 3
21. Availability of system | 99% uptime | 99% uptime | CDR | CDR | Tier 0.16 | N/A |

[2] MST had 15 ITVs available when queried.
[3] Vendor data; line-of-sight and antenna dependent. Achieved performance in the training environment will differ.
[4] Vendor data; line-of-sight and antenna dependent. Achieved performance in the training environment will differ.
[5] Based on 90% of the laptop hard disk (1 TB) being available for storage and data recorded at the theoretical maximum 802.15.4 data rate of 250 kbps. Achieved storage time is likely to be far higher, as the achieved data rate will be far lower than 250 kbps and not all data received by the system will need to be stored.
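The 8000-hour storage figure quoted in footnote 5 follows directly from the stated assumptions and can be checked with a few lines of arithmetic:

```python
# Footnote 5 of Table 6: time to fill 90% of a 1 TB laptop disk at the
# 802.15.4 theoretical maximum data rate of 250 kbps.
disk_bytes = 1e12                       # 1 TB laptop hard disk
usable_bits = 0.9 * disk_bytes * 8      # 90% of the disk, expressed in bits
data_rate_bps = 250_000                 # 802.15.4 theoretical maximum, bits/s

hours = usable_bits / data_rate_bps / 3600
# hours == 8000.0, matching the current value quoted for TPM 3.
```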
Functional Analysis and Decomposition
A functional decomposition was generated based on satisfying the Tier 0 requirements. From these
requirements, it was identified that the system must: receive data from the ITV; process and prioritize
the received data; store the data; relay the data to the trainer; receive inputs from the trainer; and
activate the ITV to alert soldiers of identified events.
The functional breakdown shown in Figure 8 below gives the design as presented to and approved by
the customer. This design depicts a system in which data is first processed and prioritized before being
relayed to the trainer and stored to the recording device.
Receive Data
The six functions depicted in Figure 8 represent a top-level decomposition of the system. The first of
these, 1: Receive Data, is the entry point into the system; this is where data is received by the
Meshlium router from the soldier's ITV. The Receive Data function comprises three second-level
functions: 1.1 Receive Data from ITV, 1.2 Utilize existing ITV equipment, and 1.3 Utilize existing ITV
data formats. These functions cover the requirements for the system to utilize an 802.15.4 interface
through the Missouri MOTE and to receive data from up to eight soldiers simultaneously. Figure 9
below details the decomposition of this function.
Figure 8: PDR Functional Decomposition
1.0 Receive Data from ITV
2.0 Process Data
3.0 Store Data
4.0 Relay Data
5.0 Receive Data from Trainer
6.0 Alert Soldier

Figure 9: Receive Data Function
1: Receive Data
  1.1: Receive Data from ITV
  1.2: Utilize existing ITV equipment
    1.2.1: Interface with Missouri MOTE
      1.2.1.1: Support 802.15.4 interface
  1.3: Utilize existing ITV data formats
1.0 Receive Data
T1:1.1 The system shall support multiple soldiers simultaneously
T1:1.2 The receiving hardware shall have 99% uptime
1.1 Receive Data from ITV
T2:1.1.1 The system shall support existing Missouri MOTE interfaces
1.2 Utilize existing ITV equipment
T2:1.2.1 The system shall interface with the existing ITV equipment
1.3 Utilize existing ITV data formats
T2:1.3.1 The system shall support existing ITV data formats
Process Data
The second first-level function, 2: Process Data, handles the requirement to process received data and
algorithmically determine how it will be handled. This function comprises five second-level functions:
2.1 Prioritize data algorithmically, 2.2 Identify social faux pas, 2.3 Identify medical emergency, 2.4
Support new data types, and 2.5 Buffer received data. These functions cover the requirements for
received data to be prioritized and buffered, for medical emergencies and social faux pas to be
identified, and for new data formats to be supported. Figure 10 below details the decomposition of this
function.
2: Process Data
  2.1: Prioritize data algorithmically
    2.1.1: Prioritize geospatial data
    2.1.2: Prioritize health data (2.1.2.1: Heart Rate; 2.1.2.2: Temperature; 2.1.2.3: Respiration)
    2.1.3: Prioritize movement data (2.1.3.1: Body Position; 2.1.3.2: Body Movements; 2.1.3.3: Head
           Movements; 2.1.3.4: Arm Position; 2.1.3.5: Arm Movements; 2.1.3.6: Hand Position;
           2.1.3.7: Hand Movements)
  2.2: Identify social faux pas
  2.3: Identify medical emergency
  2.4: Support new data types
  2.5: Buffer received data

Figure 10: Process Data functions

2.0 Process Data
T1:1.1 The system shall actively monitor 8 soldiers on the trainer screen
T1:1.2 The system shall be able to process streamed data
2.1 Prioritize data algorithmically
T2:3.1.1 The system shall prioritize geospatial data
T2:3.1.2 The system shall prioritize health data
T2:3.1.3 The system shall prioritize movement data
2.2 Identify social faux pas
T2:3.2.1 The system shall identify a social faux pas within 15 seconds
T2:3.2.2 The system shall identify at minimum 12 separate faux pas
2.3 Identify medical emergency
T2:3.3.1 The system shall identify a medical emergency within 15 seconds
2.4 Support new data types
T2:3.4.1 The system shall be easily modifiable to support new data types
T2:3.4.2 The system shall be manageable by a single person
2.5 Buffer received data
T2:3.5.1 The system shall treat all received data as unclassified
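As a sketch of how functions 2.1 (prioritize data algorithmically) and 2.5 (buffer received data) might be realized, a standard priority queue suffices. The specific priority ordering below (medical emergencies above faux pas, above routine telemetry) is an illustrative assumption, not a design decision from this report:

```python
import heapq

# Illustrative priority ordering (lower number = handled first); the ranking
# of data classes here is an assumption for demonstration only.
PRIORITY = {"medical_emergency": 0, "faux_pas": 1, "health": 2, "movement": 3, "geospatial": 4}

class DataPrioritizer:
    """Buffers received ITV messages (2.5) and releases them in priority order (2.1)."""
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker: preserves arrival order within a class

    def push(self, data_class, payload):
        heapq.heappush(self._heap, (PRIORITY[data_class], self._seq, payload))
        self._seq += 1

    def pop(self):
        return heapq.heappop(self._heap)[2]

q = DataPrioritizer()
q.push("movement", "arm position sample")
q.push("medical_emergency", "heart rate alarm, soldier 3")
q.push("health", "respiration sample")
# Popping now yields the medical alarm first, regardless of arrival order.
```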
Store Data
The third first-level function, 3: Store Data, handles the requirement to store received data that may be
used as part of a post-exercise debrief. This function comprises two second-level functions: 3.1 Record
Data from ITV and 3.2 Data is unclassified (U). These functions cover the requirements for received
data to be stored on COTS hardware and to be treated as unclassified. Figure 11 below details the
decomposition of this function.
3.0 Store Data
T1:2.1 The storage equipment shall have 99% uptime
T1:2.2 The storage equipment shall have sufficient storage to support an 8 hour exercise
3.1 Record Data from ITV
T2:2.1.1 The system shall store all received data from the MOTE
3.2 Data is unclassified (U)
T2:2.2.1 The system shall treat all received data as unclassified
3: Store Data
  3.1: Record Data from ITV
    3.1.1: Use COTS Hard Drive
    3.1.2: Support eight hour exercise
      3.1.2.1: Utilize FIFO recording
  3.2: Data is unclassified (U)

Figure 11: Store Data functions
Relay Data
The fourth first-level function, 4: Relay Data, handles the requirement to relay received data to a
trainer. This function comprises four second-level functions: 4.1 Alert trainer to medical emergency,
4.2 Alert trainer to faux pas, 4.3 Operable in area of 120 sq. ft. to ¼ mile, and 4.4 Average latency less
than 500 ms. Figure 12 below details the decomposition of this function.
4.0 Relay Data
T1:4.1 The relaying hardware shall have 99% uptime
4.1 Alert trainer to medical emergency
T2:4.1.1 The system shall alert the trainer of a medical emergency within 15 seconds
4.2 Alert trainer to faux pas
T2:4.2.1 The system shall alert the trainer of a faux pas within 15 seconds
4.3 Operable in area of 120 sq. ft. to ¼ mile
T2:4.3.1 The system shall support an area with a diameter of up to ¼ mile.
4.4 Average latency less than 500 ms
T2:4.4.1 The system shall have an average latency of less than 500 ms between sensor input and display.
Receive Data from Trainer
The fifth first-level function, 5: Receive Data from Trainer, handles the requirement to relay data sent by the trainer back to the soldier. This function comprises a single second-level function, 5.1: Interface with ITV feedback systems. Figure 13 below details the decomposition of this function.
4: Relay Data
  4.1: Alert trainer to medical emergency
  4.2: Alert trainer to faux pas
  4.3: Operable in area of 120 sq. ft. to ¼ mile
  4.4: Average latency less than 500 ms

Figure 12: Relay Data functions

5: Receive Data from Trainer
  5.1: Interface with ITV feedback systems

Figure 13: Receive Data from Trainer functions
5.0 Receive Data from Trainer
T1:5.1 The receiving hardware shall have 99% uptime
5.1 Interface with ITV feedback systems
T2:5.1.1 The system shall interface with the existing ITV feedback systems
Alert Soldier
The sixth first-level function, 6: Alert Soldier, handles the requirement to alert the soldier when commanded by the trainer. This function comprises a single second-level function, 6.1: Utilize Tactor interfaces. Figure 14 below details the decomposition of this function.
6.0 Alert Soldier
T1:6.1 The system shall alert the soldier when commanded by the trainer
6.1 Utilize Tactor interfaces
T2:6.1.1 The system shall utilize existing ITV tactor interfaces
6: Alert Soldier
  6.1: Utilize Tactor interfaces

Figure 14: Alert Soldier functions
Physical Architecture
Figure 15: System Physical Architecture
This section describes the physical components of the preferred system architecture, their
characteristics, and their operation. Figure 15 above depicts the system physical architecture, with
arrowed lines indicating data flows within the training system: the Integrated Training Vest (biometric
sensors; wrist, elbow, shoulder and hand/finger gesture tracking; haptic feedback) communicates over
802.15.4/ZigBee with the Data Relay Router, which connects over 802.11 to the Data Storage and
Processing Laptop (combat and social faux pas detection, data storage, and biometric monitoring and
alerting).
The system monitors biometric data from each soldier to assess their physical condition. The data is
sourced from biometric sensors worn by the soldier and is sent to the Mote in the ITV. The MOTE mesh
network receives that data from each soldier and relays it via the Data Relay Router to the Data Storage
and Processing laptop. Monitoring software on the laptop provides real-time health alerts and logging
for post-training analysis. The soldier's health will be assessed using the biometric sensor vendor's
software for monitoring the various inputs from the biometric monitoring elements. The display shows
a simple-to-understand red/orange/green indicator for each soldier, enabling the trainer to assess each
soldier's health status quickly. The trainer will select either voice or non-voice feedback through ITV
tactors.
The data relay router was changed from CDR because the selected router was incompatible with the
current Missouri Mote, and a Missouri Mote directly compatible with 802.11 was no longer available.
The existing options were revisited, with the lowest scoring option and the Missouri Mote eliminated.
Making the Trendnet option compatible would require 802.15.4 USB interfaces with USB Ethernet
extenders; the DIGI and Cisco routers had vendor-supported options available to perform the
conversion. Another option was discovered in the Meshlium router. Table 7 below shows the results
of the trade study used to decide on the preferred option for the relay data element. The Meshlium
was chosen; it provides the added benefit of being able to store and forward the 802.15.4 data rather
than simply relay it. This may provide increased system availability at lower cost than the Digi or
Cisco options, and lower complexity than the modified Trendnet option. The modified Trendnet option
is also likely to have a higher through-life cost due to its increased complexity.
Table 7: Replacement Relay Data Element
Criteria             Weight  Alt 1:        Alt 2:          Alt 3:                 Alt 4:
                             Cisco 1552S   Trendnet + USB  National Instruments   Meshlium
Cost                   30    1 (0.30)      3 (0.90)        1 (0.30)               3 (0.90)
Power Consumption      15    9 (1.35)      3 (0.45)        9 (1.35)               9 (1.35)
Range                  15    9 (1.35)      3 (0.45)        3 (0.45)               9 (1.35)
Complexity             20    9 (1.80)      3 (0.60)        9 (1.80)               9 (1.80)
Data Transfer Rate     20    9 (1.80)      3 (0.60)        9 (1.80)               9 (1.80)
Weighted Total               6.60          3.00            5.70                   7.20
The system monitors soldier location to assess combat faux pas. The soldier's location data is sourced
from the Missouri Mote network and is compared to the locations of other soldiers and actors in the
environment. Combat faux pas criteria are used to determine whether the current soldier location
breaches the combat rules. The system displays a simple-to-understand red/orange/green indicator for
each soldier in a representation of the training environment, enabling the trainer to quickly assess
combat faux pas. Social faux pas criteria are used to determine whether the soldier's current movement
relative to an actor breaches social rules; the system displays a similar indicator for each soldier so the
trainer can quickly assess a social faux pas. The trainer will select either voice or non-voice feedback
through ITV tactors for combat and social faux pas. Voice feedback will be via soldier radios, which are
outside the scope of this system.
The system monitors the soldier's hand gestures to assess social faux pas. Data from the hand gesture
monitoring element is sent to the Mote in the ITV. The MOTE mesh network receives that data from
each soldier and relays it via the Data Relay Router to the Data Storage and Processing laptop. Social
faux pas criteria are used to determine whether the soldier's current movement relative to an actor
breaches social rules, and the system displays a simple-to-understand indicator for each soldier in a
representation of the training environment to enable the trainer to quickly assess a social faux pas. The
trainer will select either voice or non-voice feedback through ITV tactors for social faux pas. The voice
radio network used for voice feedback in alerting a soldier is external to this system and is not shown.
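The red/orange/green indicator logic described above can be sketched as a simple threshold classifier. The 2 m and 5 m thresholds below are hypothetical placeholders, since the actual combat and social rule sets are not defined in this report:

```python
def proximity_indicator(distance_m, breach_m=2.0, warn_m=5.0):
    """Map a soldier-to-actor distance to a red/orange/green indicator.

    The breach/warn thresholds are illustrative placeholders; real faux pas
    criteria would come from the combat and social rule sets.
    """
    if distance_m < breach_m:
        return "red"     # rule breached: faux pas
    if distance_m < warn_m:
        return "orange"  # approaching a breach
    return "green"       # within the rules
```

In practice the trainer display would evaluate this per soldier/actor pair each time a position update arrives from the Mote network.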
The system recognizes which set of equipment is allocated to a soldier. The equipment allocated to a
soldier includes the ITV, the biometric harness, and the gesture monitoring equipment. The system will
record biometric data based on the recording capabilities within the COTS biometric software. The
custom analysis software will track the ITV allocated to the soldier and provide for updating that
information during a training exercise. Data recording through the custom software will include any
unique equipment identifiers to aid in offline analysis of any faults.
The dotted line shown in Figure 15 from the data relay router to data storage indicates that the data
relay router can store and forward the information contained in the 802.15.4/ZigBee frames. This
flexibility allows the mission to continue with no data loss should the laptop be temporarily
inaccessible. Data is stored to a local hard drive on the laptop; an external hard drive or flash memory
device can also be used. A 1 TB external hard drive has been included in the costing of the system but is
not necessary for the system to operate.
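The store-and-forward behavior described above can be sketched as follows. This is illustrative only; the Meshlium implements this buffering internally, and the class and method names here are inventions for demonstration:

```python
from collections import deque

class StoreAndForwardRelay:
    """Sketch of store-and-forward: frames are buffered while the laptop
    link is down and flushed in order once it returns, so no data is lost."""
    def __init__(self):
        self._buffer = deque()
        self.link_up = True

    def relay(self, frame, deliver):
        if self.link_up:
            while self._buffer:               # flush the backlog first, in order
                deliver(self._buffer.popleft())
            deliver(frame)
        else:
            self._buffer.append(frame)        # store for later forwarding

delivered = []
relay = StoreAndForwardRelay()
relay.relay("f1", delivered.append)           # link up: delivered immediately
relay.link_up = False
relay.relay("f2", delivered.append)           # link down: buffered
relay.link_up = True
relay.relay("f3", delivered.append)           # backlog flushed, then f3
# delivered now holds f1, f2, f3 in their original order.
```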
Functional to Physical Mapping
This section shows the relationship between the functions obtained in the functional decomposition
and the elements in the physical architecture. For the purposes of this section, software components
are treated as part of the physical architecture; software architectural analysis has not been
undertaken. Table 8 shows the relationship between the high-level functions and the physical
architecture elements.
Table 9 shows a further breakdown of the necessary functions within the system. Soldier and mission
housekeeping functions have been added.
Table 8: Tier 0 Mapping
Physical architecture elements (columns): Integrated Training Vest, Data Relay Router, Data Storage
and Processing Laptop, Analysis Software, MS&T Mote Network.

Functional architecture (rows):
Receive Data                  X X X X
Process Data                  X X
Relay Data                    X X X X
Receive Trainer Instructions  X X X X
Alert Soldier                 X X X X X
Table 9: Lower tier mapping
Physical architecture elements (columns): Integrated Training Vest, Data Relay Router, Data Storage
and Processing Laptop, COTS Biometric Software, Custom Analysis Software, MS&T Mote Network.

Functional architecture (rows):
Receive ITV Data                X X X X
Process Data                    X X X
Relay Data                      X X
Trainer Instructions            X X X X
Register Bioharness to Soldier  X X
Register ITV Mote to Soldier    X X X
Manage Soldier Data             X X X
Store Data                      X X X X
Identify Medical Emergency      X X
Identify Social Faux Pas        X X
Identify Combat Faux Pas        X X
Alert Soldier                   X X X X X
Technical Management Plan
The Technical Management Plan (TMP) has been further developed as the program has progressed
from the conceptual to the preliminary design stage. The plan covers some of the technical and
management requirements for the implementation of the proposed system and has been presented to
the customer.
The TMP is incorporated in its current entirety as Appendix A. Where sections of the TMP are also
included in the main body of the PDR report, they have not been duplicated in the appendix, for brevity.
Risk Analysis
The initial set of risks was developed for the system as the program was entering the Conceptual Design
Review (CDR). The risks were evaluated on their likelihood of occurrence and the severity of their
consequences, and each was further categorized as a risk to the program schedule, cost, or of a
technical nature; risks could cross multiple categories. Initial mitigation plans were formulated and the
initial risk matrix, consisting of nine discrete risks, was created.
Once the program solution was accepted by the customer following the CDR, the risks were revisited
and updated. The updated risk matrix is attached below. One risk was eliminated after further
discussions with the customer about his specific requirements. A second identified risk was also
eliminated after confirming with the Missouri University of Science and Technology (MST) that software
licenses were available for the team's use. These two risks remain in the updated risk matrix for
traceability but have been denoted as eliminated.
Three additional risks have been identified as the program approaches the Preliminary Design Review
(PDR) stage; these are incorporated at the end of the updated risk matrix. Contingency plans have been
developed for two of these risks. The third new risk is considered of low enough likelihood that it is
acknowledged and accepted without developed mitigation. This risk involves the delivery of third-party
code with defects present; it is unlikely that this would happen, but if it did, the code would have to be
corrected to eliminate the consequences of this risk to the system.
The remaining seven initial risks have been reevaluated, and further contingency plans and mitigation
have been put in place for three of them. This mitigation has reduced the associated risk levels.
All high-level risks have now been eliminated. All of the remaining moderate-level risk items involve
integrating the systems together. These risks range from potential hardware compatibility problems,
for example the Missouri Mote and the wireless routers not communicating, to integration of the
various legacy systems and the code that will interpret all the data. However, all risk levels are now
considered manageable, and early field testing will either eliminate these risks or allow the team to
develop further contingency plans as the program moves forward.
The risks identified in the system continue to be fluid, and mitigation efforts continue to reduce overall
project risk. Risk will be reevaluated at each stage of the program, and changes will continue to be
denoted in updated matrices with all changes notated.
Updated Risk Matrix

Risk 1: Team member unable to participate (illness, localized adverse events, ...)
  Likelihood 2, Consequence 4. CDR risk level: Moderate. Modified risk level: Low.
  Mitigation actions: Other team member(s) will work actions assigned to the unavailable team member.
  Additional contingency/mitigation identified: Contingency plans developed; team members have
  sufficient inter-disciplinary skills to complete all tasks.

Risk 2: Mentors unavailable for review
  Likelihood 1, Consequence 4. CDR risk level: Low. Modified risk level: Low.
  Mitigation actions: Two mentors available for comments; MST to provide an alternate if either is
  unavailable for a lengthy period.

Risk 3: Mote and router don't integrate
  Likelihood 2, Consequence 5. CDR risk level: Moderate. Modified risk level: Moderate/Low.
  Mitigation actions: Establish contact with the Mote team to work through alternate solutions.
  Prototype as early as possible.
  Additional contingency/mitigation identified: Initial CDR-level router abandoned. Contact has been
  made with an alternate supplier who has confirmed compatibility with the MOTE system.

Risk 4: Team communications issues
  Likelihood 1, Consequence 4. CDR risk level: Low. Modified risk level: Low.
  Mitigation actions: Alternate communications channels have been distributed.

Risk 5: Availability and effectiveness of development tools
  Likelihood 1, Consequence 3. CDR risk level: Low. Modified risk level: Risk Eliminated.
  Mitigation actions: MST holds several licenses for the development software chosen; it is available in
  several computer laboratories.
  Additional contingency/mitigation identified: MST has confirmed that the team may use MST licenses.

Risk 6: Costs of development tools
  Likelihood 3, Consequence 5. CDR risk level: High. Modified risk level: Low.
  Mitigation actions: Each additional development license represents >10% of the overall budget.
  Additional contingency/mitigation identified: MST has confirmed that additional development licenses
  may be used as part of this project.

Risk 7: Data rate from multiple motes exceeds available bandwidth
  Likelihood 3, Consequence 5. CDR risk level: High. Modified risk level: Moderate.
  Mitigation actions: Possible addition of multiple Meshlium components will alleviate any bandwidth
  issues.
  Additional contingency/mitigation identified: Field testing to determine bandwidth shall be performed
  early in detailed design.

Risk 8: Use of existing communications and power networks
  Likelihood 3, Consequence 4. CDR risk level: Moderate. Modified risk level: Moderate.
  Mitigation actions: Installation site survey required to locate networks. Use batteries and wireless
  communications as alternates where suitable.

Risk 9: End user security requirements
  Likelihood 2, Consequence 4. CDR risk level: Moderate. Modified risk level: Risk Eliminated.
  Mitigation actions: Solution uses commercial-grade encryption. If the customer requires a higher
  grade, additional design effort will be required.
  Additional contingency/mitigation identified: Customer has confirmed that this program is
  Unclassified-level security.

Risk 10: Virtual Cultural Monitoring System is not available for integration
  Likelihood 3, Consequence 4. CDR risk level: N/A. Modified risk level: Moderate.
  Mitigation actions: Availability of the VCMS must be confirmed through appropriate contacts. If the
  system is not available, a similar system will be developed.

Risk 11: System code complexity is higher than anticipated to integrate all systems
  Likelihood 2, Consequence 4. Categories: technical, schedule and cost. CDR risk level: N/A.
  Modified risk level: Moderate.
  Mitigation actions: Preliminary code development will begin earlier in the PDR process.

Risk 12: System code from a third party is delivered with defects
  Likelihood 1, Consequence 4. Categories: technical, schedule and cost. CDR risk level: N/A.
  Modified risk level: Low.
  Mitigation actions: Acceptable risk involved in item; acknowledged and accepted.

Figure 16: Risks
Level 5 (High):
  Generic: Major crisis that could result in program termination if not mitigated.
  Technical: Major crisis; no alternatives exist.
  Schedule: Cannot achieve a key project milestone/event.
  Cost: Requires an overall budget increase of greater than 10%.
Level 4 (Significant):
  Generic: Significant damage to program viability if not mitigated.
  Technical: Major crisis, but workarounds available.
  Schedule: Project critical path affected.
  Cost: Overall budget increase of between 5% and 10% required.
Level 3 (Moderate):
  Generic: Major problems that could be tolerated.
  Technical: Major performance shortfall, but workarounds available.
  Schedule: Minor schedule slip; may miss a need date.
  Cost: Overall budget increase of between 1% and 5% required.
Level 2 (Minor):
  Generic: Minor problems that can easily be handled.
  Technical: Minor performance shortfall, same approach retained.
  Schedule: Additional activities required; able to meet key dates/events.
  Cost: Overall budget increase of less than 1% required.
Level 1 (Low):
  Little or no impact in all categories.
Figure 17: Risk consequence assessment table
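The Cost column of Figure 17 is a simple banding of the required overall budget increase, which can be sketched as follows (the function name and the handling of the exact 1%, 5%, and 10% boundaries are our assumptions, not part of the report):

```python
def cost_consequence(budget_increase_pct: float) -> int:
    """Map a required overall budget increase (in percent) to the
    1-5 cost-consequence level defined in Figure 17."""
    if budget_increase_pct > 10:
        return 5   # High
    if budget_increase_pct > 5:
        return 4   # Significant
    if budget_increase_pct > 1:
        return 3   # Moderate
    if budget_increase_pct > 0:
        return 2   # Minor
    return 1       # Low: little or no impact

print(cost_consequence(7))  # a 7% increase is level 4 (Significant)
```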
| Level | Description | Detail |
|-------|-------------|--------|
| 5 | High | Speculative with no identified mitigation plan |
| 4 | Significant | Analytically demonstrated with possible mitigation plan identified |
| 3 | Moderate | Partially demonstrated or somewhat mitigated by approved plan |
| 2 | Minor | Demonstrated or well mitigated by approved plan |
| 1 | Low | Proven or completely mitigated by an approved plan |

Figure 18: Risk likelihood assessment table
| Likelihood \ Consequence | 1 | 2 | 3 | 4 | 5 |
|--------------------------|---|---|---|---|---|
| 5 | Low | Moderate | High | High | High |
| 4 | Low | Moderate | Moderate | High | High |
| 3 | Low | Low | Moderate | Moderate | High |
| 2 | Low | Low | Low | Moderate | Moderate |
| 1 | Low | Low | Low | Low | Moderate |

Figure 19: Combined Risk Assessment Table
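The Figure 19 mapping amounts to a direct table lookup. A minimal sketch (the names are ours, not part of the report); note, for example, that risk item 9 (likelihood 2, consequence 4) scores Moderate, consistent with Figure 16:

```python
# Combined risk level per Figure 19: rows are likelihood 5..1,
# columns are consequence 1..5.
RISK_MATRIX = {
    5: ["Low", "Moderate", "High", "High", "High"],
    4: ["Low", "Moderate", "Moderate", "High", "High"],
    3: ["Low", "Low", "Moderate", "Moderate", "High"],
    2: ["Low", "Low", "Low", "Moderate", "Moderate"],
    1: ["Low", "Low", "Low", "Low", "Moderate"],
}

def combined_risk(likelihood: int, consequence: int) -> str:
    """Return the combined risk level for 1-5 likelihood/consequence scores."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("likelihood and consequence must be in 1..5")
    return RISK_MATRIX[likelihood][consequence - 1]

print(combined_risk(2, 4))  # Moderate
```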
Work Breakdown Structure

A high-level work breakdown structure (WBS) that captures tasks from kickoff to deployment for the system is given below in Table 9. This structure is identical to that presented at CDR.
Table 9 System Level Work Breakdown Structure
1.0 Needs/Requirements Analysis
    1.1 Procure Needs Statement
    1.2 Perform Requirements Analysis
    1.3 Feasibility Analysis
    1.4 Trade Studies
    1.5 System Planning
2.0 Conceptual Design Phase
    2.1 Relay System
        2.1.1 Decide TPMs
        2.1.2 Evaluate Alternatives
        2.1.3 Perform Trade Studies
        2.1.4 Select Alternative
    2.2 Recording System
        2.2.1 Decide TPMs
        2.2.2 Evaluate Alternatives
        2.2.3 Perform Trade Studies
        2.2.4 Select Alternative
    2.3 Data Analysis
        2.3.1 Decide TPMs
        2.3.2 Evaluate Alternatives
        2.3.3 Perform Trade Studies
        2.3.4 Select Alternative
    2.4 Feedback Relay to Soldiers
        2.4.1 Decide TPMs
        2.4.2 Evaluate Alternatives
        2.4.3 Perform Trade Studies
        2.4.4 Select Alternative
3.0 Detail Design
    3.1 Evaluate Design Functionality
        3.1.1 Component Interface
        3.1.2 Software Strategy Evaluation
        3.1.3 Maintenance Evaluation
        3.1.4 Design Mock Up Duty Cycle
        3.1.5 Model System Prototype
        3.1.6 Evaluate Prototype System to Design Requirements
4.0 Verify Components
    4.1 Test Components
    4.2 Evaluate Test Results to Customer Requirements
5.0 Verification of Subsystems
    5.1 Test Subsystem
    5.2 Evaluate Test Results to Customer Requirements
6.0 Full System Operation and Verification
    6.1 System Installation / Realization
    6.2 System Testing
    6.3 Evaluate Test Results to Customer Requirements
    6.4 System Dispatch to Customer
    6.5 Customer Training
    6.6 System Support / Maintenance
Cost Estimate

The overall system costs changed slightly after they were presented at the CDR. In the initial design, it was assumed that the system would incorporate three Trendnet routers for receiving data from the soldier ITV. Further analysis determined that a Meshlium router would be better suited for the system. This device carries better performance specs, which are reflected in the updated Kiviat chart. A drawback of the Meshlium router versus the Trendnet router is its price: the Meshlium router costs roughly three times as much as the Trendnet, which brings the cost of the overall system over the $10,000 threshold. To offset this difference, it was determined that the backup battery packs from the initial design were unnecessary, as they were not being directly used within our system. The net increase in cost of the design presented at the PDR versus what was presented at the CDR was $641. The resulting final high-level cost breakdown for the system is shown in Table 10 below.
Table 10: High Level Cost Breakdown

| Line Item | Spending | Fraction of Total Spending |
|-----------|----------|----------------------------|
| Hardware | $4,458 | 48.11% |
| Software Suite | $200 | 2.16% |
| Labor / Programming | $1,000 | 10.79% |
| Product Testing | $500 | 5.40% |
| Training | $500 | 5.40% |
| Maintenance | $425 | 4.59% |
| Disposal | $150 | 1.62% |
| Miscellaneous Peripherals | $1,250 | 13.49% |
| Contingency | $1,517 | 15.17% |
| Total | $10,000 | |
The largest single line-item, hardware, is further analyzed below in Table 11.
Table 11: Detailed Hardware Costs

| Line Item | Spending | Fraction of Hardware Spending |
|-----------|----------|-------------------------------|
| Laptop | $1,499 | 33.6% |
| Meshlium Routers (3) | $2,814 | 63.1% |
| 1 TB Ruggedized External Hard-Drive (for Raw Data Storage) | $145 | 3.3% |
| Hardware Subtotal | $4,458 | |
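As a quick consistency check, the hardware fractions can be recomputed from the line-item costs in the table above (a sketch; the dictionary layout is ours):

```python
# Recompute the detailed hardware cost fractions from the line items.
hardware = {
    "Laptop": 1499,
    "Meshlium Routers (3)": 2814,
    "1 TB Ruggedized External Hard-Drive": 145,
}
subtotal = sum(hardware.values())  # $4,458 hardware subtotal

for item, cost in hardware.items():
    print(f"{item}: ${cost} ({100 * cost / subtotal:.1f}%)")
# Laptop 33.6%, routers 63.1%, hard-drive 3.3%, matching the table.
```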
Detailed specification sheets for the router and laptop follow the report in Appendix B. These two items account for nearly 45% of the total project spending anticipated at this stage.

The routers will be powered using Power over Ethernet (PoE) cable. To eliminate signal loss over long runs, PoE injectors are installed approximately every 300 ft; these act, in effect, as signal boosters. By powering the routers over hardwired Ethernet cables, we ensure the fastest possible data transfer speeds to the control room and the least data loss.
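Given the 300 ft injector spacing above, the injector count for a cable run can be estimated as follows (a sketch; the run lengths shown are hypothetical, and we assume the source at the head of the run powers the first segment):

```python
import math

def injectors_needed(run_ft: float, spacing_ft: float = 300.0) -> int:
    """In-line PoE injectors for a cable run: one per full 300 ft
    segment beyond the first."""
    if run_ft <= 0:
        raise ValueError("run length must be positive")
    return max(0, math.ceil(run_ft / spacing_ft) - 1)

# Hypothetical runs from the routers to the control room:
for run in (250, 600, 1000):
    print(run, "ft ->", injectors_needed(run), "injector(s)")
```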
Schedule

The overall schedule for this phase of the project is shown in Figure 20; this phase ends just after the Preliminary Design Review. The completion of the project, from just after the Preliminary Design Review through to disposal of the system, is shown in Figure 21. The schedule shows the team members working in parallel, taking advantage of the system breakdown. Each team member will tackle a component and confer with the other team members regularly. Should one component finish early, that team member will assist others in completing their assigned activities.
Figure 20: Schedule for current phase of project
Figure 21: Schedule for subsequent phases of project
Support Concept

The product support plan is the application of the support functions and logistics elements necessary to sustain the readiness and operational capability of the system. The system will be deployed with both maintenance and troubleshooting instructions. The support concept developed must satisfy the user-specified requirements for sustaining system performance at the lowest possible cost. The features included in the plan are:

- Availability of support to meet system performance
- Logistics support and life-cycle cost management
- Maintenance to integrate the elements and optimize readiness
- Hardware and associated software
- Operator, trainer, and maintainer training to support the system
- Data management and configuration management throughout the system life cycle

Maintaining the system through its operational life cycle must start with the requirements analysis, so that the system can be designed for maintainability. The following outlines the planned support for our system:

- The system shall be provided with logistics facilities for the initial provisioning and procurement of items.
- Maintenance personnel will be provided to cover both scheduled and unscheduled maintenance.
- The operator will be trained in system activation, usage, security, and checkout.
- The trainer and maintenance personnel will be specially trained in operating the system.
- All physical components are off the shelf and readily available, except the code, which is a combination of in-house developed code and modified COTS vendor code.
- The system will be shipped with the following manuals: instruction, training, user, maintenance, and troubleshooting manuals.
- Technical data, including system design (system drawings, specifications), system modification, logistics provisioning and procurement data, supplier data, interfacing, system operational and maintenance data, and other supporting databases, will be documented.
- The system shall be provided with additional spare parts only if they are identified as required during the operational phase of the system, per the requirements.

Figure 22 provides the basic availability operations concept for our training system, including spares considerations.
Figure 22: Preventive Maintenance Flowchart

Allocated repair timeline: Encounter Error (dT = 0 min) → Begin Repair (dT = 30 min) → Repair Completed (dT = 90 min) → Resume Training (dT = 120 min).

[Flowchart summary: The system falters during training and the fault location is identified. If spares are available, the correct spares are obtained to repair the fault (at most 1 hour to repair) and training resumes after repair; where training can continue in the meantime, it resumes within 30 minutes. If spares are not available, replacement spares are ordered; if training can continue, fault detection continues, otherwise the process ends.]
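The 2-hour allocated repair window in Figure 22 bounds the downtime per failure, so an inherent availability can be estimated from it. A sketch; the MTBF values below are hypothetical, not program data:

```python
# Inherent availability A = MTBF / (MTBF + MTTR), taking MTTR as the
# 2-hour allocated repair window from Figure 22.
MTTR_HOURS = 2.0

def availability(mtbf_hours: float, mttr_hours: float = MTTR_HOURS) -> float:
    """Fraction of time the system is mission capable."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

for mtbf in (50, 200, 1000):  # hypothetical mean times between failures
    print(f"MTBF {mtbf} h -> availability {availability(mtbf):.3f}")
```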
Training events will have enough systems in "standby" mode. In the event that a system in training becomes NMC (non-mission capable), an additional system will be used. The supportability equation will determine the appropriate number of systems using the anticipated failure rate. Systems with major maintenance issues will be shipped back to the manufacturer for repairs under warranty.
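One common form of the supportability calculation sizes the standby pool from a Poisson failure model; a sketch under that assumption (the failure rates, event lengths, and 95% confidence level are hypothetical, not program data):

```python
import math

def standby_systems(failure_rate_per_hr: float, event_hours: float,
                    confidence: float = 0.95) -> int:
    """Smallest number of standby systems s such that the probability of
    covering all failures during a training event (Poisson, mean =
    rate * hours) is at least `confidence`."""
    mean = failure_rate_per_hr * event_hours
    s, cumulative = 0, math.exp(-mean)
    while cumulative < confidence:
        s += 1
        cumulative += math.exp(-mean) * mean ** s / math.factorial(s)
    return s

# Hypothetical: 0.01 failures/hour over an 8-hour training event.
print(standby_systems(0.01, 8))  # 1 standby system suffices at 95%
```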
The manufacturer will provide an easy-to-read and easy-to-understand "User's Manual" to ensure that all parties, Soldiers and trainers alike, know how to properly use the equipment. At a minimum, the manual will contain information on regular use, simple maintenance procedures, a list of components for accountability purposes, and a troubleshooting guide. The system will be fully repairable; all minor repairs can be made at the user level by following the "User's Manual".

The customer will be responsible for initiating repairs. The manufacturer will be responsible for supplying the customer with the necessary repair parts when a repair is simple enough, or for making the repairs itself on significant maintenance issues.
Program Management Best Practices (PMBP)

Meeting cost and schedule goals is extremely important in these days of commercial investment. The following are the PMBP used to ensure the success of this and all programs in this corporation.
Have a strong organizational structure

Our structure is based upon our core competencies, which allow us to assign standard company resources such as Engineering, Accounting, Management oversight, and Systems Engineering to each program on an as-needed basis. This ensures that programs don't carry the added expense of a full-time person charging to the program when only part-time support is needed, allowing us to utilize our staff more efficiently while keeping costs down.
Manage to a set of requirements

Each project comes with its own set of requirements. While we strive to keep costs down by reusing existing designs, this is only done after a review of the existing technology to ensure it will properly meet the customer's requirements, and after a customer review of our plan for how we will meet each requirement and of the method for validating and verifying that the requirement has been met.
Identify risks early and develop plans to mitigate or avoid

With each new requirement there is an inherent risk that parts, products, and services may fail to live up to the expectations in the design. Before making a final decision on a plan for a project, we strive to have alternate plans, methods, or vendors in place so that, should an issue arise, we can mitigate any foreseeable system failures.
Integrated Master Planning (IMP) and Scheduling (IMS)

Making use of IMPs and IMSs allows us to track and manage each program efficiently: knowing our plan and schedule milestones, we can adjust staffing or call on our Engineers and Tech Fellows to help solve issues before they become big problems.
Manage to the baseline budget and schedule; keep track of changes

Our managers report weekly on the progress of each program they are responsible for, using Earned Value Management (EVM), which tracks both our cost and schedule variances.
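The EVM variances reported weekly reduce to standard formulas; a minimal sketch with illustrative numbers (not program data):

```python
def evm_metrics(pv: float, ev: float, ac: float) -> dict:
    """Standard Earned Value Management indicators from planned value (PV),
    earned value (EV), and actual cost (AC)."""
    return {
        "cost_variance": ev - ac,      # CV < 0 means over budget
        "schedule_variance": ev - pv,  # SV < 0 means behind schedule
        "cpi": ev / ac,                # cost performance index
        "spi": ev / pv,                # schedule performance index
    }

# Illustrative: $10,000 of work planned, $9,000 earned, $9,500 spent.
print(evm_metrics(pv=10_000, ev=9_000, ac=9_500))
```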
Plan for affordability; design for cost

Our Systems Engineering approach allows us to review projects at each step of the way to ensure we are using the right parts and products to deliver a robust product, while keeping costs down by not over-engineering the system's requirements.
Have frequent communication with the customer

At each step of the way, we review where we are, along with any problems encountered or milestones achieved. Both the good and the bad are communicated to the customer to ensure that issues are dealt with and that adjustments can be made that best suit both us and the customer.
Promote a culture of asking for help when needed
We believe that the only bad question is the one that is not asked. We encourage our senior discipline employees to develop mentoring relationships with those who are just starting out, or who are learning new technologies to be used in developing new products or in updating older designs to prolong the usable life of existing technologies.
Maintain one computer system where the team can find all needed information

We have many legacy programs that have required data conversion to newer systems so that we can continue to maintain, and provide services for, products bought from us in the past. We are committed to providing serviceability, parts, and resources for the maintainability of our products to meet our customers' needs.
Manage your suppliers well
Keep metrics on how you are meeting your technical requirements
Appendix A – Technical Management Plan
1.0 Program Scope
1.1 Program Description
1.1.1 Classification
1.2 Statement of Need
1.3 Program Objectives
1.4 Program Issues and Concerns
1.5 Program Communications
1.6 Program Functional Management Elements
2.0 Program Tasks
2.1 Statement of Work Summary
2.2 Work Breakdown Structure
2.3 Responsibility Assignment Matrix
2.4 Project Deliverables
3.0 Program Management
3.1 Program Organization
3.2 Program Personnel
3.2.1 Program Manager
3.2.2 Faculty Mentor
3.2.3 Boeing Mentor
3.2.4 Program Team Members
3.3 Schedule Management
3.4 Customer Communications and Contact Protocol
3.5 Cost Management and Affordability
3.6 Risks and Opportunities
3.7 Vendor