Online Testing of Real-Time Systems using Uppaal-TRON

Brian Nielsen, Kim Larsen, Marius Mikucionis, Arne Skou
{bnielsen|kgl|marius|ask}@cs.aau.dk
MB-T&V of RTS using Uppaal

Lecture 1: Timed Automata, Uppaal, Model Checking
Lecture 2:
1. Optimal Scheduling: time- and cost-optimal test generation (offline); deterministic, output-urgent TA
2. Controller Synthesis: timed games; deterministic, timing uncertainty
Lecture 3: Online real-time testing: full non-determinism, full (input-enabled) Uppaal TA
The tool TRON was introduced and demoed in Monday's exercise session.
Agenda
•Introduction
•Correctness criteria: IOCO, real-time, relativized IOCO
•Online testing algorithm: non-determinism, state-set computation
•Testing, monitoring, simulation, environment emulation
•Danfoss case study
•Conclusions & research problems
Automated Model Based Conformance Testing

[figure: Model → Test Generator tool (selection & optimization) → Test suite → Test execution tool (event mapping, driver) → Implementation Under Test; verdicts: pass / fail. Example model of a double-click: click? resets x; a second click? with x<2 yields DBLclick!; with x>=2 it counts as a new first click.]

Implementation Relation: does the behavior of the (black-box) implementation comply with that of the specification?
Online Testing

[figure: same setting, but the Test Generator and Test execution tools are coupled directly to the Implementation Under Test, exchanging inputs and outputs event by event]

•Test generated and executed event-by-event (randomly)
•A.k.a. on-the-fly testing
Tron Framework

•Uppaal-TRON: Testing Real-Time Systems Online
•Spec = Uppaal Timed Automata Network: Env || IUT
•"Relativized real-time i/o conformance" relation
•Timed trace: i1 . 2.5 . o1 . 3 . o2 . 19 . i2 . 5 . i3
Correctness Criteria

I conforms-to S ?

[figures: four vending-machine examples comparing an implementation i against a specification s; the machines exchange coin?/token? inputs and coffee!/tea! outputs, illustrating cases where i ioco s holds and where it fails] [Jan Tretmans]
Tretmans' ioco

i ioco s =def ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)

•Quiescence: p -δ→ p iff ∀ o! ∈ LU ∪ {τ} : p has no o!-transition
•Straces(s) = { σ ∈ (L ∪ {δ})* | s =σ⇒ }
•p after σ = { p' | p =σ⇒ p' }
•out(P) = { o! ∈ LU | p -o!→, p ∈ P } ∪ { δ | p -δ→ p, p ∈ P }

[Jan Tretmans]

"The" conformance relation used for black-box testing of (untimed) reactive systems.
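As a concrete illustration, the definitions above can be sketched in Python for a finite labelled transition system without internal τ-steps (so `after` is plain step-by-step reachability). The transition encoding and all names here are invented for the sketch, not part of any testing tool:

```python
def states_after(trans, states, action):
    """One 'action' step from any state in 'states' (finite LTS sketch)."""
    nxt = set()
    for s in states:
        nxt |= trans.get((s, action), set())
    return nxt

def after(trans, init, sigma):
    """States reachable from 'init' along the trace 'sigma'."""
    states = {init}
    for a in sigma:
        states = states_after(trans, states, a)
    return states

def out(trans, states, outputs):
    """Enabled outputs of each state; a quiescent state contributes 'delta'."""
    res = set()
    for s in states:
        enabled = {o for o in outputs if trans.get((s, o))}
        res |= enabled if enabled else {'delta'}
    return res

# The machine i from the slides: coin? leads to coffee!, token? to tea!.
trans_i = {('i0', 'coin?'): {'i1'}, ('i1', 'coffee!'): {'i0'},
           ('i0', 'token?'): {'i2'}, ('i2', 'tea!'): {'i0'}}
```

Checking `i ioco s` then amounts to comparing `out(i after σ)` against `out(s after σ)` for every suspension trace σ of s, e.g. `out(trans_i, after(trans_i, 'i0', ['coin?']), {'coffee!', 'tea!'})` gives `{'coffee!'}`.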
Ioco example: i ioco s

i ioco s =def ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)

[figure: i accepts coin? and token?, giving coffee! and tea!; s accepts only coin?, giving coffee!]

out(i after coin?) = { coffee! }    out(s after coin?) = { coffee! }
out(i after token?) = { tea! }      out(s after token?) = ∅

But token? ∉ Straces(s), so the quantification does not cover it: i ioco s. [Jan Tretmans]
Ioco example: i ioco s

[figure: both i and s accept coin? and token?, giving coffee! and tea! respectively]

out(i after coin?) = { coffee! }    out(s after coin?) = { coffee! }
out(i after token?) = { tea! }      out(s after token?) = { tea! }
[Jan Tretmans]
Ioco example: i does not ioco s

[figure: i accepts coin? and token? but only ever outputs coffee!; s outputs tea! after token?]

out(s after token?) = { tea! }    out(i after token?) = { δ }

i is quiescent where s requires tea!, so i ioco s fails. [Jan Tretmans]
Ioco example: i does not ioco s

[figure: s always outputs coffee! after coin?; i may also remain quiescent after coin?]

out(s after coin?) = { coffee! }    out(i after coin?) = { δ, coffee! }

{δ, coffee!} ⊄ {coffee!}, so i ioco s fails. [Jan Tretmans]
Timed Conformance?

Example traces:
•I1: c?.2.r?.2.weakC!  and  c?.5.r?.4.strongC!
•I2: c?.2.r?.2.weakC!  and  c?.5.r?.7 (no output)

I1 rt-ioco S, but I2 does not rt-ioco S.

Real-time conformance:
•TTr(s): the set of timed traces from s, e.g. σ = coin?.5.req?.2.weakCoffee!.9.coin?
•Out(s after σ) = possible outputs and delays after σ, e.g. Out({l_2, x=1}) = {weakCoffee!} ∪ [0..2]
•i rt-ioco s =def ∀σ ∈ TTr(s) : Out(i after σ) ⊆ Out(s after σ)
•Equivalently: TTr(i) ⊆ TTr(s) (timed trace inclusion)
•Intuition: no illegal output is produced, and required output is produced (at the right time)
Sample Cooling Controller

[figure: IUT model and Env model; the IUT switches cooling On!/Off! in response to Low?/Med?/High? temperature inputs, with reaction bound r]

•When T is high (low), switch cooling on (off) within r seconds.
•When T is medium, cooling may be either on or off (implementation freedom).
Environment Modeling

[figure: temperature-over-time curves for environment models EL, E2, E1, EM, emitting High!, Med!, Low!]

•EM: any action possible at any time
•E1: only realistic temperature variations
•E2: temperature never increases when cooling
•EL: no inputs (completely passive)
Implementation relation: relativized real-time io-conformance

•I rt-ioco_E S =def ∀σ ∈ TTr(E) : Out((E,I) after σ) ⊆ Out((E,S) after σ)
•Equivalently: I rt-ioco_E S iff TTr(I) ∩ TTr(E) ⊆ TTr(S) ∩ TTr(E)
•Intuition: for all relevant environment behaviors, the implementation never produces illegal output, and always produces required output in time
•~ (relativized) timed trace inclusion
•Notation: for a set of states P, TTr(P) is the set of timed traces from states in P; P after σ is the set of states reachable after timed trace σ; Out(P) is the set of possible outputs and delays in P
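For finite sets of timed traces, the trace-inclusion characterization above is just a set comparison. A minimal sketch, using the example traces from the previous slide (trace encoding and all names invented for illustration):

```python
def rt_ioco_e(ttr_i, ttr_s, ttr_e):
    """Relativized conformance as relative timed-trace inclusion (finite sketch):
    TTr(I) ∩ TTr(E) ⊆ TTr(S) ∩ TTr(E)."""
    return (ttr_i & ttr_e) <= (ttr_s & ttr_e)

# Hypothetical finite trace sets; tuples alternate actions and delays.
ttr_s = {('c?', 2, 'r?', 2, 'weakC!'), ('c?', 5, 'r?', 4, 'strongC!')}
ttr_i1 = {('c?', 2, 'r?', 2, 'weakC!')}
ttr_i2 = {('c?', 2, 'r?', 2, 'weakC!'), ('c?', 5, 'r?', 7, 'strongC!')}
ttr_e = ttr_s | ttr_i2  # an environment able to drive all of these traces
```

Under this environment, `rt_ioco_e(ttr_i1, ttr_s, ttr_e)` holds, while I2's late `strongC!` (after delay 7 rather than 4) is relevant in E but absent from S, so `rt_ioco_e(ttr_i2, ttr_s, ttr_e)` fails.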
[figure: environment E, constrained by the environment assumptions, feeds inputs ε0,i0,ε1,i1,… to the IUT, which responds ε0',o0,ε1',o1,…; the system model S specifies I]
Re-use of Testing Effort

Given I, E, S, assume I rt-ioco_E S.
1. Given a new (weaker) system specification S': if TTr(S) ⊆ TTr(S'), then I rt-ioco_E S'.
2. Given a new (stronger) environment specification E': if TTr(E') ⊆ TTr(E), then I rt-ioco_E' S.
An Algorithm

IDEA: state-set tracking. Dynamically compute all potential states that the model M can reach after the timed trace ε0,i0,ε1,o1,ε2,i2,o2,…:

Z = M after (ε0,i0,ε1,o1,ε2,i2,o2)

If Z = ∅, the IUT has made a computation not in the model: FAIL.
i is a relevant input in Env iff i ∈ EnvOutput(Z).
(Abstract) Online Algorithm

Algorithm TestGenExe(S, E, IUT, T) returns {pass, fail}
Z := {(s0, e0)}
while Z ≠ ∅ ∧ #iterations ≤ T do, chosen randomly among:
  1. // offer an input
     if EnvOutput(Z) ≠ ∅:
        randomly choose i ∈ EnvOutput(Z); send i to IUT
        Z := Z After i
  2. // wait d for an output
     randomly choose d ∈ Delays(Z)
     wait (for d time units or an output o at d' ≤ d)
     if o occurred then
        Z := Z After d'
        Z := Z After o   // may become ∅ (⇒ fail)
     else
        Z := Z After d   // no output within delay d
  3. // restart
     Z := {(s0, e0)}; reset IUT
if Z = ∅ then return fail else return pass
Online Algorithm

Algorithm TestGenExe(S, E, IUT, T) returns {pass, fail}
Z := {(s0, e0)}
while Z ≠ ∅ ∧ #iterations ≤ T do, chosen randomly among:
  1. // offer an input
     if EnvOutput(Z) ≠ ∅:
        randomly choose i ∈ EnvOutput(Z); send i to IUT
        Z := Z After i
  2. // wait for an output
     randomly choose d ∈ Delays(Z)
     wait (for d time units or an output o at d' ≤ d)
     if o occurred then
        Z := Z After d'
        if o ∉ ImpOutput(Z) then return fail
        else Z := Z After o
     else
        Z := Z After d   // no output within delay d
  3. // restart
     Z := {(s0, e0)}; reset IUT
if Z = ∅ then return fail else return pass
•Sound
•Complete (as T → ∞), under some technical assumptions
State-set Computation

Compute all potential states the model can occupy after the timed trace ε0,i0,ε1,o1,ε2,i2,o2,… Let Z be a set of states.

Z after ε: possible states after τ* steps and delays εi totaling a delay of ε.

[figure: locations l0, l1 with edge l0 -τ, x:=0→ l1]
{⟨l0, x=0⟩} after 4 = { ⟨l0, x=4⟩, ⟨l1, 0 ≤ x ≤ 4⟩ }
e.g. ⟨l0,x=0⟩ -1→ ⟨l0,x=1⟩ -τ→ ⟨l1,x=0⟩ -3→ ⟨l1,x=3⟩

Z after a: possible states after action a (and τ*).

[figure: locations l0..l4 with a-edges (one guarded by x≥7, one resetting x:=0) and a τ edge]
{⟨l0, x=3⟩} after a = { ⟨l2, x=3⟩, ⟨l4, x=3⟩, ⟨l3, x=0⟩ }
State-set Operations

Can be computed efficiently using the symbolic data structures and algorithms in Uppaal (symbolic interpretation over zones).

[figure: Z after a — a-transitions interleaved with τ-closures; Z after ε — delay steps (e.g. ε = 5 split as 2+1+…) interleaved with τ-closures]
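Uppaal performs these operations symbolically on zones (DBMs). Purely as an illustration, the two operators can be sketched for a discrete-time, single-clock automaton, interleaving unit delays with τ-closure; the edge encoding and all names are invented for this sketch:

```python
def tau_closure(states, tau_edges):
    """Close a set of (loc, x) states under τ edges (src, dst, guard, reset)."""
    closed = set(states)
    frontier = set(states)
    while frontier:
        new = set()
        for l, x in frontier:
            for src, dst, guard, reset in tau_edges:
                if src == l and guard(x):
                    s2 = (dst, 0 if reset else x)
                    if s2 not in closed:
                        new.add(s2)
        closed |= new
        frontier = new
    return closed

def after_delay(states, tau_edges, d):
    """Z after d: advance d integer time units, τ-closing after each unit."""
    Z = tau_closure(states, tau_edges)
    for _ in range(d):
        Z = tau_closure({(l, x + 1) for l, x in Z}, tau_edges)
    return Z

def after_action(states, edges, tau_edges, a):
    """Z after a: τ-close, take all a-edges (src, dst, act, guard, reset), τ-close."""
    Z = tau_closure(states, tau_edges)
    moved = set()
    for l, x in Z:
        for src, dst, act, guard, reset in edges:
            if src == l and act == a and guard(x):
                moved.add((dst, 0 if reset else x))
    return tau_closure(moved, tau_edges)

# The slide's delay example: a single τ edge l0 -> l1 resetting x.
TAU = [('l0', 'l1', lambda x: True, True)]
```

With `TAU`, `after_delay({('l0', 0)}, TAU, 4)` reproduces the slide's result: ⟨l0, x=4⟩ together with ⟨l1, x⟩ for every x in 0..4.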
Tron: Implementation

[figure: architecture — Graphical User Interface (Java) with editor, simulator and verifier; Uppaal Engine Server (C++) handling parsing, communication and control, with zones & reachability and a state-set explorer; online test generation connected through an adapter API, driver/adapter and physical I/O to the System Under Test; simulator API]
Online Testing Example
Testing, Monitoring, Simulation, Environment Emulation

Real system: [figure: System Under Test connected through physical I/O to a continuous plant]

Monitoring: [figure: TRON-Monitor, equipped with the system model, passively observes the inputs i and outputs o exchanged between the System Under Test and the continuous plant, via an adapter]
Passively listen and check the observed trace.
Environment Emulation

[figure: TRON-Env Emulator, equipped with the environment model, drives the System Under Test through an adapter API and physical I/O; the continuous plant is replaced by a simulator / prototype. Dually, TRON with ENV = system model can exercise the plant side.]
Testing = Environment Emulation + Monitoring

[figure: the TRON-Env Emulator (environment model) generates inputs i to the System Under Test via the adapter API and physical I/O, while the TRON-Monitor (system model) checks the outputs o]
Offline Verdict Evaluation

[figure: as above, but the observed inputs i and outputs o are logged and checked by the monitor after the run]

Only environment emulation is required in real time.
Danfoss EKC Case: Electronic Cooling Controller

Sensor input:
•air temperature sensor
•defrost temperature sensor
•(door open sensor)
Keypad input:
•2 buttons (~40 user-settable parameters)
Output relays:
•compressor relay
•defrost relay
•alarm relay
•(fan relay)
Display output:
•alarm / error indication
•mode indication
•current calculated temperature
•Optional real-time clock or LON network module
Industrial Cooling Plants
The Collaboration

Goals:
•Can we model significant aspects and time constraints?
•Can we test in real time? Is the tool fast enough?
•How do we control and observe the target?
•Can we detect errors?

Means:
•Existing product
•Documentation: requirements specification, user's manuals
•Equipment and software for real test execution
•Meetings and e-mail with Danfoss engineers

Continued collaboration:
•Test of the new generation of controllers being developed
•Improved test interface
•Test-case language & automatic execution system
Basic Refrigeration Control

[figure: temperature over time; the compressor starts when the temperature crosses setpoint + differential and stops at the setpoint (unless the min restart time or min cooling time has not elapsed); an alarm starts when the temperature stays beyond the high/low alarm limits (setpoint ± alarm deviation) for longer than the alarm delay]
Refrigeration Control (2)

•Sensor errors (open/short circuit): safety mode with a duty cycle based on historical compressor on/off times, or fixed (uninterrupted #cutins > 72)
•Several defrost modes: periodic (e.g. 8 hrs), manual, RT-clock, temperature-difference based; electrical or hot gas
•Alarms after a defrost are further postponed (90 mins)
•Fan control: cooling mode / defrost mode
•~40 user-settable parameters
EKC Adaptation 1

EKC software layering: [figure: control software over a parameter DB (shared variables), device drivers + kernel, and hardware + physical I/O; the test interface goes via a LON gateway and RS232 to a win32+OLE+VB adaptor]

•AK-Online (PC software): configuration, supervision, logging
•Read and write the parameter "database": 47 parameters
EKC Adaptation 2

[figure: the TRON engine on Solaris/Linux (C++) connects over TCP/IP to the win32+OLE+VB adapter, which talks LON + RS232 to the EKC; parameters are polled into old/new copies at ~2 readouts/s ("continuous" readout), e.g. setTemp(20) becomes "par#4=20.0", and outputs such as compressorOn are inferred from parameter changes]

Need a better test interface!
•Read-only parameters
•Delay and synchronization
Main Model Components

Significant subset of functionality: 18 concurrent timed automata, 14 clocks, 14 integers.

[figure: IUT-model components — tempMeasurement, compressor, defrost, autoDefrost, highTempAlarm — driving the outputs compressorRelay, defrostRelay, alarmRelay and alarmDisplay (on/off); environment components — TemperatureGenerator (newTemp) and defrostEventGen — providing the inputs]
Temperature Tracking

"Periodic" weighted average: T_n = (4·T_(n−1) + T_sampled) / 5

[figure: EKC calculated temperature vs. model calculated temperature over time, surrounded by an error/uncertainty envelope covering the tolerance in sampling time and the tolerance in value computation; compressorOn! is expected when the calculated temperature crosses the threshold]
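The weighted-average filter and its uncertainty envelope are straightforward to sketch; the function names and the tolerance handling are illustrative only (the actual EKC model expresses this in timed automata):

```python
def ekc_filter(samples, t0):
    """Iterate the slide's weighted average T_n = (4*T_(n-1) + T_sampled) / 5."""
    t = t0
    trace = []
    for s in samples:
        t = (4 * t + s) / 5
        trace.append(t)
    return trace

def envelope(trace, eps):
    """Widen each filtered value by a value-computation tolerance of ±eps."""
    return [(t - eps, t + eps) for t in trace]
```

For example, starting at 20 degrees with two samples at 10 degrees, `ekc_filter([10, 10], 20)` gives `[18.0, 16.4]`: the filter tracks the real temperature with a lag, which is exactly why the model needs the envelope when comparing against the EKC's readout.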
Reverse Engineering

Unclear and incomplete specifications. Method of working:
1. Formulate a hypothesis model
2. Test
3. FAIL verdict ⇒ refine the model
4. PASS ⇒ confirm with Danfoss

Detects differences between actual and modeled behavior; indicates promising error-detection capability. Four examples follow.
Ex1: Control Period

Control actions are issued when "calculatedTemp" crosses thresholds, using the "periodic" weighted average T_n = (4·T_(n−1) + T_sampled) / 5. No requirement on the period is given; it was tested to be 1.2 seconds.
Ex2: High Alarm Monitor

v1: Clearing the alarm does not switch off the alarm state, only the alarm relay.
v2:
•Add a HighAlarmDisplay action
•Add a location for "no sound, but alarm displaying"
•(Postpone alarms after defrosting)
Ex3: Defrosting and Alarms

When defrosting, the temperature rises; therefore high-temperature alarms are postponed during defrost. System parameter: alarmDelayAfterDefrost. Several interpretations:
1. Postpone alarmDelayAfterDefrost + alarmDelay after defrost?
2. Postpone alarmDelayAfterDefrost + alarmDelay after highTemp is detected?
3. Postpone alarmDelayAfterDefrost until the temperature becomes low; then use alarmDelay.
Option 3 applies!
Ex4: Defrost Time Tolerance

Defrost relays engaged earlier and disengaged later than expected. We assumed a 2-second tolerance; defrosting takes a long time. The implementation uses a low-resolution timer (10 seconds).
Example Test Run

[figure: plot of setTemp, modelTemp and ekcTemp together with relay/alarm events (CON, COFF, AON, AOFF, alarmRst, HADOn, HADOff, DON, DOFF, manDefrostOn, manDefrostOff) over a ~900-second run]

Sample events: defrostOff? · alarmOn! alarmDisplayOn! · resetAlarm? AOFF! · HighAlarmDisplayOff! · manualDefrostOn? COFF! DON! · compressorOn! · (defrost complete) DOFF! CON!
State-set Evolution (EKC)

[figure: number of symbolic states (0–1200) over time (0–1000 s), plotted together with the temperature (0–25 degrees), the high temp limit and the alarm limit; the state-set size correlates with model behavior]

[figure: cost of a state-set update — microseconds vs. number of symbolic states]
Conclusions

Real-time online testing from timed automata is feasible, but both theoretically and technically very challenging. Many open research issues remain.
Research Problems

•Testing theory
•Efficient data structures and algorithms for state-set computation
•Diagnosis & debugging
•Guiding and coverage measurement
•Probabilistic & hybrid extensions
•Real-time execution of TRON
•Controller synthesis (observability / implementability)
•Scheduling
•Adapter abstraction
Related Work

•Formal testing frameworks [Brinksma, Tretmans]
•Real-time implementation relations [Khoumsi'03, Briones'04]
•Symbolic reachability analysis of timed automata [Dill'89, Larsen'97, …]
•Online state-set computation [Tripakis'02]
•Online testing [Tretmans'99, Peleska'02, Krichen'04]
END
Coverage Evaluation

•Convert the observed concrete timed trace into a timed automaton
•Replace the env-model by the trace automaton
•Coverage criteria: decorated IUT-model + reachability analysis
•Possibly covered:  E<> ENV_trace.end and e[1]==1
•Definitely covered: A[] ENV_trace.end imply e[1]==1
•Not covered:        A[] ENV_trace.end imply e[1]==0