M&S Based System Development and Testing in a Joint Net-Centric Environment
Simulation-based Testing of Emerging Defense Information Systems
Bernard P. Zeigler
Professor, Electrical and Computer Engineering, University of Arizona
Co-Director, Arizona Center for Integrative Modeling and Simulation (ACIMS)
and Joint Interoperability Test Command (JITC), Fort Huachuca, Arizona
www.acims.arizona.edu
Overview
• Modeling and simulation methodology is attaining core-technology status for standards conformance testing of information technology-based defense systems
• We discuss the development of automated test case generators for testing new defense systems for conformance to military tactical data link standards
• The DEVS discrete event simulation formalism has proved capable of capturing the information-processing complexities underlying the MIL-STD-6016C standard for message exchange and collaboration among diverse radar sensors
• DEVS is being used in distributed simulation to evaluate the performance of an emerging approach to the formation of single integrated air pictures (SIAP)
Arizona Center for Integrative Modeling and Simulation (ACIMS)
• Mandated by the Arizona Board of Regents in 2001
• Mission – Advance Modeling and Simulation through
– Research
– Education
– Technology Transfer
• Spans University of Arizona and Arizona State University
• Maintains links with graduates through research and technology collaborations
Unique ACIMS/NGIT Relationship
• A long-term contractual relationship between Northrop Grumman IT (NGIT) and ACIMS
• Employs faculty, graduate/undergraduate students, and others
• Performs M&S tasks for/at Fort Huachuca
• Challenges:
  – Rationalize different ways of doing business
  – Deliver quality services on time and on budget
• Benefits:
  – NGIT gains access to well-trained, creative talent
  – Students gain experience with real-world technical requirements and work environments
  – Transfer of ACIMS technology
Genesis
• Joint Interoperability Test Command (JITC) has mission of standards compliance and interoperability certification
• Traditional test methods require modernization to address:
  – increasing complexity,
  – rapid change and development, and
  – agility of modern C4ISR systems
• Simulation-based acquisition and net-centricity
  – pose challenges to the JITC and the Department of Defense
  – require redefining the scope, thoroughness, and process of conformance testing
Response – ATC-Gen
• JITC has taken the initiative to integrate modeling and simulation into the automation of the testing process
• Funded the development of Automated Test Case Generator (ATC-Gen) led by ACIMS
• Two years of R&D proved the feasibility of the approach and established its general direction
• With help from conventional testing personnel, the requirements have evolved to a practical implementation level
• ATC-Gen was deployed at the JITC in 2005 for testing SIAP systems and evaluated for its advantages over conventional methods
The ATC-Gen development team (NGIT & ACIMS) was selected as the winner in the Cross-Function category of the 2004/2005 M&S Awards presented by the National Training Systems Association (NTSA).
ATC-Gen Goals and Approach
Goals:
• Increase the productivity and effectiveness of standards conformance testing (SCT) at the Joint Interoperability Test Command (JITC)
• Apply systems theory, modeling and simulation concepts, and current software technology to (semi-)automate portions of conformance testing
Objective: Automate Testing
• Capture the specification as If-Then rules in XML
• Analyze the rules to extract I/O behavior
• Synthesize DEVS test models
• The Test Driver executes the models to induce testable behavior in the System Under Test (SUT)
• Interact with the SUT over middleware
(figure: the DEVS Simulator and Test Driver connect to the SUT over an HLA network)
Link-16: The Nut to Crack
• The Joint Single Link Implementation Requirements Specification (JSLIRS) is an evolving standard (MIL-STD-6016C) for tactical data link information exchange and networked command/control of radar systems
• It presents significant challenges to automated conformance testing:
  – The specification states requirements in natural language, open to ambiguous interpretations
  – The document is voluminous – many interdependent chapters and appendixes – making its use labor intensive and prone to error
  – It is potentially incomplete and inconsistent
• Problem: how to ensure that a certification test procedure
  – is traceable back to the specification
  – completely covers the requirements
  – can be consistently replicated across numerous contexts – military services, international partners, and commercial companies
Original:

4.11.13.12 Execution of the Correlation. The following rules apply to the disposition of the Dropped TN and the retention of data from the Dropped TN upon origination or receipt of a J7.0 Track Management message, ACT = 0 (Drop Track Report), for the Dropped TN. The correlation shall be deemed to have failed if no J7.0 Track Management message, ACT = 0 (Drop Track Report), is received for the dropped TN after a period of 60 seconds from the transmission of the correlation request and all associated processing for the correlation shall be cancelled.

a. Disposition of the Dropped Track Number: (2) If own unit has R2 for the Dropped TN, a J7.0 Track Management message, ACT = 0 (Drop Track Report), shall be transmitted for the Dropped TN. If the Dropped TN is reported by another IU after transmission of the J7.0 Track Management message, own unit shall retain the dropped TN as a remote track and shall not reattempt to correlate the Retained TN and the Dropped TN for a period of 60 seconds after transmission of the J7.0 Track Management message.
XML Translation:

<rule trans="4.11.13" stimulus="4.11.13.12" reference="4.11.13.12.a.2" ruleName="R2 Unit transmits J7.0">
  <condition txt="Check for R2 own unit"
    expression="AutoCor==True and (CRair.TNcor.CORtest==3 and J32.TNref.CORtest==3) and CRair.R2held==1 AND J72.MsgTx==True">
  </condition>
  <action txt="Prepare J7.0 Drop Air Track message"
    expression="J70.TNsrc=TNown; J70.TNref=TNdrop; J70.INDex=0; J70.INDcu=0; J70.ACTVair=0; J70.SZ=0; J70.PLAT=0; J70.ASchg=0; J70.ACTtrk=0; J70.ENV=0; MsgTx(J70)">
  </action>
  <output txt="Transmit J7.0" outType="Message" outVal="J70"></output>
</rule>
<QA>
  <revisit name="DHF" date="10/16/04" status="Open">need to add timer for a period of 60 seconds in which correlation is not reattempted</revisit>
</QA>
MIL-STD-6016C Excerpt:
Discrete Event Nature of Link-16 Specification
(figure: transaction-level processing flow; example P.1.2 = Drop Track Transmit)
• Validity checking and constraint (exception) rules may stop processing, do nothing, raise alerts, or jump to other transactions
• Rule processing modifies the C2 record for the TN
• Other consequent processing includes message transmission, track display, and operator decisions
• Jumps (stimuli) lead to other transactions of the specification
• Preparation processing, timeouts, and periodic messages also drive transactions
System Theory Provides Levels of Structure/Behavior
(figure: a DEVS system transforming input trajectories into output trajectories over times t1…t4)
Level 3 – Coupled System
Level 2 – I/O System
Level 1 – I/O Function
Level 0 – I/O Frame
ATC-Gen Top-Level Architecture
(figure) Components: Rule Capture Component, Rule Formalization Component, Rule Set Property Analyzer, Test Generation Component
Roles: Analyst, Subject Matter Expert, Logic Engineer, Test Engineer
Runtime participants: SUT, Sensor, Other Federates
ATC-Gen Overview
• Standard-to-XML Translation
  – The Analyst interprets the requirements text to extract state variables and rules, where rules are written in the form:
    If P is true now (Condition)
    Then do action A later (Consequence)
    Unless Q occurs in the interim (Exception)
• Dependency Analysis & Test Generation
  – The Dependency Analyzer (DA) determines the relationships between rules by identifying shared state variables
  – The Test Model Generator converts analyst-defined test sequences to executable simulation models
• Test Driver
  – The Test Driver connects to and interacts with the SUT via HLA or Simple J interfaces to perform conformance testing
  – Validated against legacy test tools
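The If/Then/Unless rule form above can be sketched in code. This is an illustrative C++ sketch only, not ATC-Gen source; the type names (TimedRule, RuleState) and the boolean-flag state are assumptions made for the example.

```cpp
#include <functional>

// Hypothetical state observed by a rule; real rules read Link-16 state variables.
struct RuleState {
    bool conditionP = false;   // "If P is true now"
    bool exceptionQ = false;   // "Unless Q occurs in the interim"
    bool actionDone = false;   // records that action A fired
};

// "If P is true now, then do action A later, unless Q occurs in the interim."
struct TimedRule {
    double delay;                                  // "later": delay before A fires
    std::function<bool(const RuleState&)> cond;    // condition P
    std::function<void(RuleState&)> action;        // consequence A
    std::function<bool(const RuleState&)> except;  // exception Q

    // Advance the rule over an interval of length dt: A fires only if P held,
    // the delay elapsed, and Q never occurred in the interim.
    void elapse(RuleState& s, double dt) const {
        if (cond(s) && dt >= delay && !except(s)) {
            action(s);
        }
    }
};

TimedRule makeDropTrackRule() {
    return TimedRule{
        60.0,  // e.g., the 60-second window in rule 4.11.13.12
        [](const RuleState& s) { return s.conditionP; },
        [](RuleState& s) { s.actionDone = true; },
        [](const RuleState& s) { return s.exceptionQ; }
    };
}
```

The exception clause is what makes these rules naturally discrete-event: a timer is scheduled for the consequence, and an interim event can cancel it.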
ATC Gen Tool Process
Rule Interpretation Example
Constraints in Appendix E.1.1.2.1:
If Reference TN equals 0, 77, 176, 177, 777 or 77777
Then the Host System:
  – Performs no further processing
  – Alerts operator (CAT 4)

Translation to XML (variables standardized):
Condition: E1.Init == True and VOI.TNref == 0 or 63 or 126 or 127 or 4095 or 118783
Action: E1.Init = False
Output: CAT 4 Alert
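The interpreted constraint amounts to a simple validity check. A minimal C++ sketch, assuming the decimal reserved values from the XML translation above (the function name and enum are illustrative, not ATC-Gen code):

```cpp
#include <set>

// Outcome of processing the Appendix E.1.1.2.1 constraint on Reference TN.
enum class RuleOutcome { Continue, Cat4Alert };

RuleOutcome checkReferenceTN(long tnRef) {
    // Reserved TN values as given in the XML translation of the rule.
    static const std::set<long> reserved = {0, 63, 126, 127, 4095, 118783};
    if (reserved.count(tnRef)) {
        // "Performs no further processing" and "Alerts operator (CAT 4)"
        return RuleOutcome::Cat4Alert;
    }
    return RuleOutcome::Continue;
}
```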
ATC Gen Tool Process
(figure) Input variables, output variables, and shared state variables
Output: • CAT Alert • Message Transmission • Operator Display

Dependency Analysis – Automated Visual Display of Rule Relations
Test Sequences – Manually Derive Paths
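The Dependency Analyzer's core idea – relating rules through shared state variables – can be sketched as follows. This is an illustrative C++ sketch under assumed types (RuleVars, findDependencies are not the actual DA implementation):

```cpp
#include <set>
#include <string>
#include <utility>
#include <vector>

// A rule together with the state variables it reads or writes.
struct RuleVars {
    std::string name;
    std::set<std::string> vars;
};

// Two rules are related when they touch at least one common state variable.
// Returns each related pair once.
std::vector<std::pair<std::string, std::string>>
findDependencies(const std::vector<RuleVars>& rules) {
    std::vector<std::pair<std::string, std::string>> deps;
    for (size_t i = 0; i < rules.size(); ++i) {
        for (size_t j = i + 1; j < rules.size(); ++j) {
            for (const auto& v : rules[i].vars) {
                if (rules[j].vars.count(v)) {
                    deps.push_back({rules[i].name, rules[j].name});
                    break;  // one shared variable is enough to relate the pair
                }
            }
        }
    }
    return deps;
}
```

The resulting relation is what the GUI renders as the rule-relation graph, and paths through that graph become candidate test sequences.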
Example: Correlation Test Rules
Correlation Test Sequence Examples:
• "Correlation was Successful"
• 1st Prohibition Failed – "Correlation Failed"
• 2nd Prohibition Failed – "Correlation Failed"
• 3rd Prohibition Failed – "Correlation Failed"
Test Sequences – Create Paths Through the ATC-Gen GUI
Determining the initial state and message field values required to drive the SUT through a sequence
Analyst:
• Determines the data needed to execute a test sequence
• Sets state variables and field values accordingly
Test Case – Data Input Through the ATC-Gen GUI
• I/O initial state values
• Reception of J3.2 messages
• Transmission of J7.2 message
• Transmission of J7.0 message
Test Case – Generated XML
ATC Gen Tool Process
Test Driver for Controlled Testing
(figure: the Test Driver couples Component Test Models 1–3 into a Coupled Test Model that exchanges messages Jx1…Jx4 with the SUT over middleware)
Test Model Generation for Controlled Testing
Mirroring (flipping) the transactions of a SUT model (the system-model behavior selected as a test case) allows automated creation of a test model:

SUT Model:
  receiveAndProcess(Jx1, data1)
  receiveAndProcess(Jx2, data2)
  receiveAndProcess(Jx3, data3)
  transmit(Jx4, data4)

Test Model:
  holdSend(Jx1, data1, t1)
  holdSend(Jx2, data2, t2)
  holdSend(Jx3, data3, t3)
  waitReceive(Jx4, data4)

(messages Jx1…Jx4 are exchanged at times t1…t4)
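The mirroring transformation itself is mechanical, which is what makes automated test-model creation possible. A minimal C++ sketch, with hypothetical step and enum names (the real generator emits DEVS models, not step lists):

```cpp
#include <string>
#include <vector>

// Primitive behaviors on either side of the mirror.
enum class Kind { ReceiveAndProcess, Transmit, HoldSend, WaitReceive };

struct Step {
    Kind kind;
    std::string msg;   // e.g., "Jx1"
    double time = 0;   // scheduled send time for holdSend
};

// Flip a SUT-model transaction into a test-model transaction:
// receiveAndProcess -> holdSend, transmit -> waitReceive.
std::vector<Step> mirror(const std::vector<Step>& sutModel) {
    std::vector<Step> testModel;
    for (const Step& s : sutModel) {
        if (s.kind == Kind::ReceiveAndProcess)
            testModel.push_back({Kind::HoldSend, s.msg, s.time});
        else if (s.kind == Kind::Transmit)
            testModel.push_back({Kind::WaitReceive, s.msg, 0});
    }
    return testModel;
}
```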
Traceability View Through the ATC-Gen GUI
• Selected rule in XML
• Associated test sequences in XML

TEST MODEL GENERATOR

#include "hierSequence.h"
#include "PPLI.h"
#include "RemoteTNdrop.h"

const port_t hierSeqDigraph::start = 0;
const port_t hierSeqDigraph::inJmsg = 1;
const port_t hierSeqDigraph::pass = 2;
const port_t hierSeqDigraph::outJmsg = 3;

hierSeqDigraph::hierSeqDigraph() : staticDigraph() {
    PPLI *pp = new PPLI();
    add(pp);
    couple(this, this->start, pp, pp->start);
    couple(pp, pp->outJmsg, this, this->outJmsg);
    RemoteTNdrop *p1 = new RemoteTNdrop();
    add(p1);
    couple(this, this->start, p1, p1->start);
    couple(this, this->inJmsg, p1, p1->inJmsg);
    couple(p1, p1->outJmsg, this, this->outJmsg);
}
Test Case
Test Model
GENERATED TEST CASE
MIRROR (XML)
Test Model Validation & Generation
TEST DRIVER
SYSTEM UNDER TEST
Test Model (C++)
(same generated hierSeqDigraph test model code as shown above)
Test Model Execution
SIAP/IABM – Successor to Link-16
• SIAP (Single Integrated Air Picture) objective:
  – Improve the adequacy and fidelity of information to form a shared understanding of the tactical situation
  – Aegis ships would be able to target their missiles using information obtained by an Air Force Airborne Warning and Control System (AWACS) plane
  – All users in the battlespace will be able to trust that data and make decisions faster
• The Integrated Architecture Behavior Model (IABM) requires that all sensors use a standard reference frame for conveying information about the location of targets
• The Navy is the lead service – the IABM will be integrated into its and other services' major weapons systems
• Developed by the Joint SIAP System Engineering Organization (JSSEO), Arlington, Va., a sub-office of the Assistant Secretary of the Army for Acquisition, Logistics and Technology
• JITC is the test agency mandated to perform test and evaluation – the initial test was Config05 at the end of last year
ATC-Gen Test Cases: MIL-STD-6016C
• December objective: complete 17 Link-16 test cases for IABM Config05 – produced 28 test cases and reported results ahead of schedule
• Currently: 29 passed, 4 failed (the failures prompted corrections in new time-box releases by JSSEO); 8 untested or in progress
MIL-STD-6016C Function      Completed  Tested
Correlation                     20        19
Decorrelation                    2         1
Track Management                 6         2
Reporting Responsibility         6         6
Message Handling                 6         4
Track Quality                    1         1
Totals:                         41        33

Remaining functions: Network Management, TDL Alignment, Combat ID, Filters, Sensor Registration, Other System (C2) Level, Forwarding
Test Manager for Opportunistic Testing
• Replace Test Models by Test Detectors
• Deploy Test Detectors in parallel, fed by the Observer
• A Test Detector activates a test when its conditions are met
• Test results are sent to a Collector for further processing
(figure: the Test Manager holds Test Detectors 1–3, fed messages Jx1…Jx4 by the SUO Observer among Other Federates; results go to the Results Collector)
The Test Detector watches for the arrival of the given subsequence of messages to the SUO and then watches for the corresponding system output.
• Define a new primitive, processDetect, that replaces holdSend
• Test Detector:
  – Tries to match the initial subsequence of messages received by the SUO
  – When the initial subsequence is successfully matched, it enables waitReceive (or waitNotReceive) to complete the test

SUO:
  receiveAndProcess(Jx1, data1)
  receiveAndProcess(Jx2, data2)
  receiveAndProcess(Jx3, data3)
  transmit(Jx4, data4)

Test Detector:
  processDetect(Jx1, data1, t1)
  processDetect(Jx2, data2, t2)
  processDetect(Jx3, data3, t3)
  waitReceive(Jx4, data4)

(messages Jx1…Jx4 observed at times t1…t4)
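The processDetect/waitReceive behavior can be sketched as a small sequence matcher. This is an illustrative C++ sketch (the class name SequenceMatcher and the string-based message encoding are assumptions; the real detectors are DEVS models):

```cpp
#include <string>
#include <vector>

// Matches an ordered input subsequence observed at the SUO, then arms a
// waitReceive check on the expected system output.
class SequenceMatcher {
public:
    SequenceMatcher(std::vector<std::string> inputs, std::string expectedOut)
        : inputs_(std::move(inputs)), expectedOut_(std::move(expectedOut)) {}

    // processDetect phase: feed an observed SUO input message.
    void observeInput(const std::string& msg) {
        if (next_ < inputs_.size() && msg == inputs_[next_]) ++next_;
    }

    // True once the whole input subsequence matched (waitReceive enabled).
    bool armed() const { return next_ == inputs_.size(); }

    // waitReceive phase: feed an observed SUO output; true = test passes.
    bool observeOutput(const std::string& msg) {
        return armed() && msg == expectedOut_;
    }

private:
    std::vector<std::string> inputs_;
    std::string expectedOut_;
    size_t next_ = 0;
};
```

Unlike holdSend, nothing is injected into the system: the detector passively waits for the scenario to occur on its own, which is what makes the testing opportunistic.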
Test Detector Generation for Opportunistic Testing
Observer & System Under Observation (SUO)
• The Observer taps into the inputs and outputs of the SUO
• It gathers input/output data and forwards it for testing
(figure: a System, e.g. DEVS, with inports/outports mirrored by an ObserverForSystem whose Observer outports expose the traffic)
Example: Joint Close Air Support (JCAS) Scenario
Natural Language Specification:
• JTAC works with ODA
• JTAC is supported by a Predator
• JTAC requests ImmediateCAS to AWACS
• AWACS passes requestImmediateCAS to CAOC
• CAOC assigns USMCAircraft to JTAC
• CAOC sends readyOrder to USMCAircraft
• USMCAircraft sends sitBriefRequest to AWACS
• AWACS sends sitBrief to USMCAircraft
• USMCAircraft sends requestForTAC to JTAC
• JTAC sends TACCommand to USMCAircraft
• USMCAircraft sends deconflictRequest to UAV
• USMCAircraft gets targetLocation from UAV
Observer of AWACS with JCAS
• Data gathered by the Observer
• addObserver(USMCAircraft, JCASNUM1);
• The Observer is connected to the SUO and monitors its I/O traffic
Test Detector Prototype: Sequence Matcher
  processDetect(J2.2, data1, t1)
  processDetect(J3.2, data2, t2)
  waitReceive(J7.0, data3, t3)
Sequential triggering, same as test models
Example of Effect of State: AWACS
Rules:
R1: if phase = "passive" & receive = "ImmediateCASIn" then output = "CASResourcesSpec" & state = "doSurveillance"
R2: if state = "doSurveillance" & receive = "sitBriefRequestIn" then output = "sitBriefOut" & phase = "passive"

matchsequence 1 (initial state = passive):
  processDetect(ImmediateCASIn, "", 1)
  waitReceive(CASResourcesSpec, "")

matchsequence 2 (initial state = doSurveillance):
  processDetect(sitBriefRequestIn, "", 1)
  waitReceive(sitBriefOut, "")

We need to know the state to enable matchsequence 2.
Solution: make activation of matchsequence 2 conditional on matchsequence 1 – matchsequence 2 can only start when matchsequence 1 has successfully completed.
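The conditional chaining can be sketched directly: the second sequence is only fed observations after the first has passed, which guarantees the inferred state. An illustrative C++ sketch (struct names and the input/output pairing are assumptions for the example):

```cpp
#include <string>

// One match sequence: detect a trigger input, then check the expected output.
struct MatchSeq {
    std::string trigger;   // input to detect (processDetect)
    std::string expected;  // output completing the sequence (waitReceive)
    bool passed = false;

    void step(const std::string& in, const std::string& out) {
        if (in == trigger && out == expected) passed = true;
    }
};

// matchsequence2 is enabled only after matchsequence1 succeeds, which
// guarantees the AWACS model has reached state doSurveillance (rule R1).
struct ChainedDetector {
    MatchSeq seq1{"ImmediateCASIn", "CASResourcesSpec"};
    MatchSeq seq2{"sitBriefRequestIn", "sitBriefOut"};

    void observe(const std::string& in, const std::string& out) {
        if (!seq1.passed) { seq1.step(in, out); return; }
        seq2.step(in, out);  // only reachable after seq1 passed
    }

    bool testPassed() const { return seq1.passed && seq2.passed; }
};
```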
Observation Test of AWACS
(figure: the Observer of AWACS feeds the AWACS Test Manager)
Problem with a Fixed Set of Test Detectors
• After a test detector has started up, a message may arrive that requires it to be re-initialized
• The parallel search and processing required by the fixed presence of multiple test detectors under the test manager may limit the processing and/or number of monitor points
• It does not allow changing from one test focus to another in real time, e.g., going from format testing to correlation testing once the first has been satisfied
Solution
• Include test detector instances on demand
• Remove a detector when it is known to be "finished"
• Employ DEVS variable-structure capabilities
• Requires intelligence to decide inclusion and removal
Dynamic Test Suite: Features
• Test Detectors are inserted into the Test Suite by Test Control
• Test Control uses a table to select Detectors based on the incoming message
• Test Control passes on the just-received message and starts up the Test Detector
• Each Detector stage starts up the next stage and removes itself from the Test Suite as soon as the result of its test is known
  – If the outcome is a pass (the test is successful), the next stage is started up
Dynamic Inclusion/Removal of Test Detectors
• A message arrives
• The induced test detectors are added into the test set
• A test detector subcomponent removes its enclosing test detector when the test case result is known (either pass or fail)

Test Manager – Active Test Suite – Test Control

removeAncestorBrotherOf("TestControl");
addModel("test detector");
addCoupling2("Test Manager", "Jmessage", "test detector", "Jmessage");
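Test Control's table-driven, on-demand management of detectors can be sketched as follows. This is an illustrative C++ sketch of the variable-structure idea only; in ATC-Gen the additions and removals go through the simulator's model-tree API (addModel, addCoupling2, removeAncestorBrotherOf above), and the names TestControl/Detector here are assumptions:

```cpp
#include <algorithm>
#include <map>
#include <memory>
#include <string>
#include <vector>

// A running detector; 'finished' is set once its test result (pass or fail)
// is known, at which point it leaves the active test suite.
struct Detector {
    std::string name;
    bool finished = false;
};

class TestControl {
public:
    // Table mapping an incoming message type to the detector it induces.
    void registerDetector(const std::string& msgType, const std::string& name) {
        table_[msgType] = name;
    }

    void onMessage(const std::string& msgType) {
        // On-demand inclusion: add the induced detector, if the table names one.
        auto it = table_.find(msgType);
        if (it != table_.end())
            active_.push_back(std::make_unique<Detector>(Detector{it->second}));
        // Removal: detectors whose result is known leave the active suite.
        active_.erase(std::remove_if(active_.begin(), active_.end(),
                                     [](const std::unique_ptr<Detector>& d) {
                                         return d->finished;
                                     }),
                      active_.end());
    }

    size_t activeCount() const { return active_.size(); }

private:
    std::map<std::string, std::string> table_;
    std::vector<std::unique_ptr<Detector>> active_;
};
```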
AWACS Opportunistic Testing in JCAS
• Test Control with the CAS model under AWACS observation
• Initially empty Test Suite
AWACS Opportunistic Testing in JCAS (cont'd)
• Test Control observes a CAS request message to AWACS
• Test Control adds the appropriate Test Detector and connects it in to the interface
• Test Control passes on the start signal and the request message
• The first-stage detector verifies receipt of the request message and prepares to start up the second stage
• The second stage waits for the expected response from AWACS to the request; the first-stage detector removes itself from the test suite
• The second stage observes the correct AWACS response, removes itself, and starts up the second part
• At some later time, the second part of the Test Detector observes a situation brief request message to AWACS; its first stage removes itself and starts up its second stage
• The second stage observes the situation brief output from AWACS, thus passing the test; it removes itself and the enclosing Test Detector
Expanding ATC-Gen as a JITC Core Technology
(roadmap figure)
Dimensions in which ATC-Gen is/can be extended:
• Additional requirements (e.g., MIL-STD-6016C and other MIL-STDs, system-level requirements)
• Additional test capabilities, with example applications:
  – One-on-one stimulus/response (IABM Config05)
  – HLA time-managed federation S/R (ADSI, IABM)
  – Multiplatform observation (IABM, ADSI v12)
  – Opportunistic testing (JXF, RAIDER, JIT 06-03, IABM Config07)
  – Platform real-time 1-on-1 S/R (IABM Config05)
Numbers in the roadmap indicate a possible order of extension – dependent on funding and customer priorities
Future directions: dynamic, operational, and performance testing; JROC interoperability requirements; Service Oriented Architecture (SOA) and the Global Information Grid (GIG); transition to Web Services
Summary
• Systems theory and DEVS were critical to the successful development of the automated test case generator
• ATC-Gen is being integrated into JITC testing as a core technology
• The ATC-Gen methodology will be extended to web services and Service Oriented Architecture over the GIG
• Model continuity and mixed virtual/real environments support an integrated development and testing process