Yves Le Traon 2003
OO System Testing
Automatic test synthesis from UML models
Outline
System testing
Behavioral test patterns
Generating behavioral test patterns
General context
Textual Requirements
Enhanced use cases model
Test objectives generation
Test cases synthesis
Simulation model
Context & Objectives
Aim: automatically generating test sequences from the use cases model and the dependencies between the use cases
Idea:
Define the input and output of a use case
Express the dependencies between the use cases (declarative approach)
Define criteria to generate relevant tests
System testing and UML: distributed meeting
(Use case diagram: use cases Plan (Planifier), Open (Ouvrir), Close (Clôturer), Consult (Consulter), Enter (Entrer), Ask for the floor (Demander la Parole), Give the floor (Donner la Parole), Speak (Parler), Leave (Sortir); actors Participant, Moderator (Animateur), Organizer (Organisateur).)
The use case scenarios
High level, simple, incomplete
Wildcards for genericity
Example: Enter use case scenario (x is a scenario parameter)
(Sequence diagrams: x:user sends enter(*, x) to :Server; (a) nominal case: reply ok; (b) exceptional case: reply nok.)
System testing and UML

Use case            | Nominal scenarios | Rare exc. scenarios | Failure exc. scenarios
A Plan              | NA1, NA2          |                     | EA1, EA2
B Open              | NB1               |                     | EB1, EB2
I Close             | NI1               | RI1                 |
C Consult           | NC1               |                     | EC1
D Enter             | ND1               | RD1                 | ED1, ED2
E Ask for the floor | NE1               |                     | EE1
G Speak             | NG1, NG2          | RG1                 | EG1, EG2
H Leave             | NH1               |                     | EH1
F Give the floor    | NF1               |                     | EF1, EF2
System testing and UML
Minimum criterion: cover each scenario with one test datum; here, 27 test cases
Use-case combination coverage criterion; prerequisite: an activity diagram of the use cases
Activity diagrams
A swimlane per actor
Visually significant
Within UML notations
Suitable to apply algorithms
(Figure: activity diagram with swimlanes Actor_1, Actor_2, Actor_3 and use cases UC_1, UC_2, UC_3, UC_4.)
Difficult to build
Hard or impossible to express certain behaviors
Not suitable for use cases shared by actors
System testing and UML
(Activity diagram of the use cases: A Plan, B Open, <<*>> C Consult, D Enter, E Ask for the floor, G Speak, H Leave, I Close, F Give the floor; swimlanes Organizer, Participant, Moderator.)
System testing and UML
Criterion: test each scenario of each use case in each elementary nominal sequence (one pass per loop)
System testing and UML
Test data to generate for the sequence A.B.I.H: 4 + 2x3 + 2x1x2 + 2x1x2x2 = 22 test cases must be generated via this "pattern"
(Table: for each test target A, B, I, H of the sequence, the combinations of nominal (N), rare (R), and exceptional (E) scenarios selected for the target and its prefix.)
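The arithmetic 4 + 2x3 + 2x1x2 + 2x1x2x2 = 22 can be reproduced mechanically. A small sketch, assuming (from the scenario table) that a prefix may use any non-failing scenario (nominal or rare) of the preceding use cases, while the test target exercises all of its scenarios, including the failing ones:

```python
from math import prod

scenarios = {  # use case: (non-failing scenarios, total scenarios)
    "A": (2, 4),  # NA1, NA2 + EA1, EA2
    "B": (1, 3),  # NB1 + EB1, EB2
    "I": (2, 2),  # NI1, RI1
    "H": (1, 2),  # NH1 + EH1
}

def count_tests(sequence):
    """For each target, multiply the non-failing prefix choices by the
    target's total scenario count, then sum over the targets."""
    total = 0
    for i, target in enumerate(sequence):
        prefix = prod(scenarios[uc][0] for uc in sequence[:i])
        total += prefix * scenarios[target][1]
    return total
```

For "ABIH" this yields 4 + 6 + 4 + 8 = 22, matching the slide.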
System testing and UML
Avoiding redundancies by ordering the "test targets": 10 test cases must be generated via this "pattern"
(Table: test targets ordered H, I, B, A, with the corresponding combinations of test cases reusing the NI1 and RI1 prefixes.)
Benefits from test patterns
Generation of product-specific test cases from product-independent test patterns
But tedious to build test patterns, especially for "basis tests"
Idea: being able to build significant sets of test patterns automatically
How to exploit use case ordering?
Generate pertinent paths of use cases in order to reach a test criterion
Issues:
An algorithm to assemble the use cases taking into account the pre- and postconditions
Defining pertinent test criteria
Conclusion
From early modeling to test cases: from reusable and generic test patterns to concrete test cases, specific to each product
Two ways of selecting test patterns:
manually (qualitative approach)
driven by use case sequential dependencies (quantitative approach)
Requirements by Contracts allow Automated System Testing
Clémentine Nebut, Franck Fleurey, Yves Le Traon, Jean-Marc Jézéquel
IRISA-INRIA, Université de Rennes 1
FRANCE
Overview of the Method
1. Enhance the use cases with parameters and contracts
2. Deduce a model of the correct orderings of use cases
3. Cover this model with an adequate criterion
4. Result: a set of test objectives satisfying a certain criterion
Example test objectives: {UC1(p1,p2), UC3(p2), UC4(p1)}, {UC3(p1), UC1(p2,p2)}, …
(Figure: use cases UC1, UC2 annotated with Param, Pre, and Post.)
Example: a virtual meeting server
(Use case diagram VirtualMtg: use cases plan, enter, open, close, consult, leave, hand over, speak, connect, ask; actors moderator, manager, user.)
Outline
A contract language for the use cases
Test generation from the enhanced use cases
Building a model of the valid orderings of use cases
Definition of criteria
Results on a case study
Conclusion
Contracts: logical expressions on predicates
Precondition, postcondition
Predicate = a name, an arity, and a semantic, plus a set of typed parameters
Ex: created(m:meeting), manager(u:participant, m:meeting)
First-order logic: classical boolean operators (and, or, implies, not) plus quantifiers (forall, exists)
Identifying the use case parameters
Parameters = actors or business concepts
Used to represent the inputs of the use case
May be reified during the design process
In the virtual meeting example:
Types: participant, meeting
Examples:
• UC Enter(p: participant, m: meeting)
• UC HandOver(p1: participant, p2: participant, m: meeting)
Contracts: example
#use case OPEN
UC open(u:participant; m:mtg)
pre created(m) and moderator(u,m) and not closed(m) and not opened(m) and connected(u)
post opened(m)

#use case CLOSE
UC close(u:participant; m:mtg)
pre opened(m) and moderator(u,m)
post not opened(m) and closed(m) and forall(v:participant) { not entered(v,m) and not asked(v,m) and not speaker(v,m) }

OPEN(u1,m1);CLOSE(u1,m1) is a correct sequence
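A minimal sketch of how such contracts can be checked, assuming a state is a set of instantiated predicates (the forall clause of CLOSE's postcondition is omitted for brevity; helper names are illustrative):

```python
# A system state is a set of instantiated predicates; a use case is
# applicable when its precondition holds, and its postcondition updates
# the state.

def open_pre(state, u, m):
    return (("created", m) in state and ("moderator", u, m) in state
            and ("closed", m) not in state and ("opened", m) not in state
            and ("connected", u) in state)

def open_post(state, u, m):
    return state | {("opened", m)}

def close_pre(state, u, m):
    return ("opened", m) in state and ("moderator", u, m) in state

def close_post(state, u, m):
    return (state - {("opened", m)}) | {("closed", m)}

def run(state, sequence):
    """Apply (pre, post, args) triples in order; return the final state,
    or None as soon as a precondition fails (incorrect sequence)."""
    for pre, post, args in sequence:
        if not pre(state, *args):
            return None
        state = post(state, *args)
    return state

init = {("created", "m1"), ("moderator", "u1", "m1"), ("connected", "u1")}
final = run(init, [(open_pre, open_post, ("u1", "m1")),
                   (close_pre, close_post, ("u1", "m1"))])
# final is a state, not None: OPEN(u1,m1);CLOSE(u1,m1) is a correct sequence
```

Running CLOSE before OPEN from the same initial state would return None, since close's precondition opened(m) fails.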
Outline
A contract language for the use cases
Test generation from the enhanced use cases
Building a model of the valid orderings of use cases
Definition of criteria
Results on a case study
Conclusion
Use Case Transition System (UCTS): definition
Quadruple M = (Q, q0, A, δ)
Q = non-empty set of states
• State = set of instantiated predicates
q0 = initial state
A = alphabet of actions
• action = instantiated use case
δ ⊆ Q x A x Q = transition function
Instantiated use case (resp. predicate) = use case (resp. predicate) with its operational parameters
UCTS: example
(Figure: four states, each containing connected(p1), created(m1), manager(p1,m1), moderator(p1,m1), plus respectively nothing, opened(m1), opened(m1) and entered(p1,m1), or closed(m1); transitions open(p1,m1), enter(p1,m1), and close(p1,m1) link them.)
Building algorithm: principles
Needed:
the enumeration of each actor and business concept (for example, 3 participants and 2 meetings)
an initial state given in terms of true predicates
The algorithm tries to apply successively all the instantiated use cases from all the unvisited nodes until all the nodes are visited
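The exploration described above can be sketched as a breadth-first search; the data layout (tuples for use cases, frozensets for states) is illustrative:

```python
from collections import deque
from itertools import product

def build_ucts(init_state, use_cases, domains):
    """use_cases: (name, parameter types, pre, post); domains: type -> values.
    From every unvisited state, try to apply every instantiated use case;
    each success adds a transition (and possibly a new state)."""
    states, transitions = {init_state}, []
    todo = deque([init_state])
    while todo:
        state = todo.popleft()
        for name, params, pre, post in use_cases:
            for args in product(*(domains[t] for t in params)):
                if pre(state, *args):
                    nxt = post(state, *args)
                    transitions.append((state, (name,) + args, nxt))
                    if nxt not in states:
                        states.add(nxt)
                        todo.append(nxt)
    return states, transitions

# Tiny instance: open/close with one participant and one meeting
domains = {"participant": ["p1"], "mtg": ["m1"]}
use_cases = [
    ("open", ("participant", "mtg"),
     lambda s, u, m: ("created", m) in s and ("moderator", u, m) in s
                     and ("opened", m) not in s and ("closed", m) not in s,
     lambda s, u, m: s | {("opened", m)}),
    ("close", ("participant", "mtg"),
     lambda s, u, m: ("opened", m) in s and ("moderator", u, m) in s,
     lambda s, u, m: (s - {("opened", m)}) | {("closed", m)}),
]
init = frozenset({("created", "m1"), ("moderator", "p1", "m1")})
states, transitions = build_ucts(init, use_cases, domains)
# Three states are reached, via open(p1,m1) then close(p1,m1)
```

With larger domains (e.g. 3 participants and 2 meetings, as on the slide) the same loop enumerates every instantiation of every use case.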
UCTS in the testing context
Path of the UCTS = correct sequence of instantiated use cases = test objective
Aim: finding a set of paths in the UCTS covering the requirements
Necessity to find test criteria
Test criteria
General objective: generate few but efficient test objectives, with short test objectives
Structural criteria:
All edges
All vertices
All instantiated use cases
All instantiated use cases and all vertices
Semantic criterion:
All precondition terms
All precondition terms criterion
Find all the combinations of terms that make the precondition true
Example:
UC over(u:participant; m:mtg)
pre speaker(u,m) or moderator(u,m)
Try to apply Over with:
• speaker(u,m) and not moderator(u,m)
• not speaker(u,m) and moderator(u,m)
• speaker(u,m) and moderator(u,m)
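The combinations above can be enumerated by checking the precondition against every truth assignment of its atomic terms. A sketch, with illustrative helper names:

```python
from itertools import product

def satisfying_assignments(terms, precondition):
    """terms: names of the atomic terms; precondition: function of a dict
    {term: bool}. Returns the assignments making the precondition true."""
    result = []
    for values in product([True, False], repeat=len(terms)):
        env = dict(zip(terms, values))
        if precondition(env):
            result.append(env)
    return result

# pre speaker(u,m) or moderator(u,m)
combos = satisfying_assignments(
    ["speaker", "moderator"], lambda e: e["speaker"] or e["moderator"])
# Three combinations: speaker only, moderator only, both
```

The same enumeration with `not precondition(env)` yields the failing combinations used by the robustness criterion on the next slide.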
Robustness tests
Generate paths leading to an invalid application of the use case
Exercise the system correctly, then make a non-specified action
Criterion: similar to all precondition terms (all the combinations of terms that make the precondition fail)
(Figure: a correct path followed by a non-specified action.)
Outline
A contract language for the use cases
Test generation from the enhanced use cases
Building a model of the valid orderings of use cases
Definition of criteria
Results on a case study
Conclusion
Results: virtual meeting case study
(Figure: the VirtualMtg use case diagram, with the code repartition: dead code 9%, nominal code 65%, robustness code w.r.t. spec 8%, robustness code w.r.t. env 18%.)
Experimental protocol
(Figure: test objectives are turned into test cases by test synthesis; the test cases are run on the system under test; a code coverage tool produces code coverage measures.)
Comparison between criteria efficiency
(Bar chart: % of covered code, from 50 to 100, for the criteria All edges, All vertices, All IUC, All IUC and all vertices, and All precond terms; each bar splits functional and robustness test cases.)
Results
Criterion              | # generated test objectives | Average size of the tests
All edges              | 13841                       | 11
All vertices           | 769                         | 10
All instantiated UC    | 50                          | 5
AV-AIUC                | 819                         | 10
All precondition terms | 15                          | 5
Relative efficiency of the test cases
(Chart: # covered statements against # test cases, from 0 to 20, for All edges, All vertices, All IUC, All IUC and all vertices, and All precond terms.)
Code repartition and covered code
(Pie chart: dead code 9%, nominal code 65%, robustness code w.r.t. spec 8%, robustness code w.r.t. env 18%; the covered code portions, 63.5% and 4%, are highlighted.)
Outline
A contract language for the use cases
Test generation from the enhanced use cases
Building a model of the valid orderings of use cases
Definition of criteria
Results on a case study
Conclusion
Conclusion
Test objectives automatically generated from use cases enhanced with contracts, with test adequacy criteria
The tests generated are efficient in terms of code coverage
Prototype tool supporting the approach
Future work
Use scenarios to transform test objectives into test cases (use of a test synthesis tool)
Give test priorities to the use cases (adapt the criteria to the priorities)
On-the-fly computation of the test objectives
From system-level test patterns to specific test cases: application to product-line architectures
Product Line architectures
A product line: a set of systems which share a common software architecture and a set of reusable components. Building a product line aims at developing the common core of a set of products once, and reusing it for all the products.
Defining a product family: variants and commonalities, reuse of assets
For our purpose: specify behavioural test patterns, which become reusable "test assets" of the product line
Product Line architectures: a key challenge
Use case scenarios cannot be used directly for testing:
Generic and incomplete.
Parameters are not known, nor object instances (scenarios concern roles).
They specify the general system functionality without knowing, at that stage, the exact sequence of calls/answers.
Generating test cases from such test patterns for a given UML specification is thus one of the key challenges in software testing today.
PL variants
optional, when a component can be present or not
alternative, when, at a variation point, one and only one component can be chosen among a set of components
multiple, when, at a variation point, several components can be chosen among a set of components
All the variants must appear in the product line architecture, but not all the possible combinations of variants
Extracting a product from the global product line architecture: product instantiation
Testing product lines
Benefiting from the PL specificities:
Testing commonalities
Deriving tests according to the variants
Specific tests
Reusing tests:
Building test assets
Defining tests independently from the products
Using generic scenarios
Deriving product-specific test cases from those generic scenarios
Product line architectures: example
The Virtual Meeting Server PL offers simplified web conference services: it aims at permitting several kinds of work meetings on a distributed platform (a general case of 'chat' software).
When connected to the server, a client can enter or exit a meeting, speak, or plan new meetings.
Three types of meetings:
standard meetings, where the client who has the floor is designated by a moderator (nominated by the organizer of the meeting)
democratic meetings, which are standard meetings where the moderator is a FIFO robot (the first client to ask for permission to speak is the first to speak)
private meetings, which are standard meetings with access limited to a defined set of clients.
The Virtual Meeting Example
Connection to the server
Planning of meetings
Participation in meetings
Moderation of meetings
(Virtual meeting use case diagram: VirtualMtg with use cases enter, plan, open, close, consult, leave, hand over, speak, connect; actors moderator, manager, user.)
Product Line architectures: example
Due to marketing constraints, the Virtual Meeting PL is derivable into three products:
a demonstration edition: standard meetings only, and limited
a personal edition: any type of meeting, but limited
an enterprise edition: any type of meeting, no limitations
Two variants: meeting type (multiple) and participant limitation (optional) (also OS, languages, interfaces, etc.)
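A sketch of product instantiation for these two variants (the table layout and function names are illustrative): the 'multiple' variant picks a non-empty subset of meeting types, and the 'optional' variant enables or disables the participant limit.

```python
MEETING_TYPES = frozenset({"standard", "private", "democratic"})

# One configuration per product of the PL
PRODUCTS = {
    "demonstration": {"meetings": frozenset({"standard"}), "limited": True},
    "personal": {"meetings": MEETING_TYPES, "limited": True},
    "enterprise": {"meetings": MEETING_TYPES, "limited": False},
}

def instantiate(name):
    """Check the variant choices are valid, then return the configuration."""
    cfg = PRODUCTS[name]
    if not cfg["meetings"] or not cfg["meetings"] <= MEETING_TYPES:
        raise ValueError("invalid choice for the 'multiple' variant")
    return cfg
```

Test assets can then be filtered per product: a test pattern exercising democratic meetings applies to the personal and enterprise editions only.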
The Virtual Meeting Example
Two main variants:
the kinds of meetings available
the limitation of the number of participants
Three products:
Demonstration edition
Personal edition
Enterprise edition

Product               | Variant 1 {multiple}: available meetings | Variant 2 {optional}: meetings limitation
Demonstration edition | Standard                                 | true
Personal edition      | Standard, private, democratic            | true
Enterprise edition    | Standard, private, democratic            | false
Testing product lines
Benefiting from the PL specificities:
Testing commonalities
Deriving tests according to the variants
Specific tests
Reusing tests:
Building test assets
Defining tests independently from the products
Using generic scenarios
Deriving product-specific test cases from those generic scenarios
A contradiction
Test scenarios must be expressed at a very high level:
to be reusable
to be independent from the variants and the products
Generic scenarios are too vague and incomplete: they cannot be used directly on a specific product
Impossible to reuse generic test scenarios?
Behavioral Test Patterns
Based on the use case scenarios: high level, generic, product independent, nominal or exceptional
A selection from among the scenarios: an accept scenario, reject scenarios, prefix scenarios
Behavioral Test Patterns
Based on the use case scenarios: high level, generic (use of wildcards), incomplete, nominal or exceptional
A selection from among the scenarios:
an accept scenario (test objective)
reject scenarios (optional)
prefix scenarios (initialisation, optional)
Testing a PL
Behavioral test patterns (or test objectives):
an accept scenario: it expresses the behavior that has to be tested, e.g. the successful exit ("leave" a meeting use case) of a participant from a meeting
one or several (optional) reject scenarios: they express the behaviors that are not significant for the tester, e.g. the consult function of a meeting state does not interact with the entering into a meeting
one or several (optional) preamble (or prefix) scenarios that must precede the accept scenario; for example, a meeting must be opened before any participant can enter the virtual meeting
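The structure of a behavioral test pattern can be captured in a few lines; the class and field names below are illustrative, and the scenario strings stand in for the real sequence diagrams:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BehavioralTestPattern:
    accept: str                        # the behavior to test (test objective)
    rejects: List[str] = field(default_factory=list)   # behaviors to exclude
    prefixes: List[str] = field(default_factory=list)  # required initialisation

# The "leave a meeting" example from the slide
leave_btp = BehavioralTestPattern(
    accept="leave(m, p)/ok",
    rejects=["consult(m, p)"],   # consult does not interact with entering
    prefixes=["connect(p); plan(m, p); open(m, p)"])
```

A pattern with an empty rejects and prefixes list degenerates to a bare test objective.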
An Example
(Figure: prefix scenario: x:user sends connect(x), plan(*, x), and open(*, x) to :Server, each answered ok; accept scenario S+: y:user sends leave(*, y); reject scenarios S-: enter(*, x) answered nok, and close(*, y).)
The reject scenarios
Optional
Reduce the "noise":
Avoid calls irrelevant for the test
Exclude interfering calls
The prefix
Describes the preamble part of the test case
Guides the synthesis
A composition of use-case scenarios
Scenarios versus object diagram?
(Figure: prefix scenario: x:user sends connect(x), plan(*, x), and open(*, x) to :Server, each answered ok. Object diagram: server:demoServer with user1:user, user2:user, user3:user, user4:user.)
Typical reject scenarios
Some scenarios can be added automatically
Use of a boolean dependency matrix
(Boolean dependency matrix over the use cases Plan, Open, Close, Consult, Enter, Speak, Leave; an X marks an interaction between two use cases; e.g. the Enter row carries four X's.)
Scenarios independent from the enter use case: added as reject scenarios
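This automatic selection is a one-line filter over the matrix. A sketch, assuming (consistently with the consult example earlier) that the Enter row marks dependencies on open, close, speak, and leave:

```python
USE_CASES = ["plan", "open", "close", "consult", "enter", "speak", "leave"]

# depends[a] = use cases whose scenarios interact with a (an X in the matrix);
# only the "enter" row is filled in here, as an assumed reading of the slide.
depends = {
    "enter": {"open", "close", "speak", "leave"},
}

def reject_scenarios(target):
    """Use cases independent from the target are irrelevant noise for the
    test: their scenarios are added as reject scenarios."""
    return [uc for uc in USE_CASES
            if uc != target and uc not in depends[target]]
```

For the enter use case this selects plan and consult, matching the slide's remark that scenarios independent from enter become reject scenarios.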
Typical reject / prefix scenarios
Use of the activity diagram:
Accept scenario = the targeted scenario in a use case
Prefix = the previous scenarios in the path
Reject = all the scenarios of the use cases that are not involved in the path
Generating test patterns
(Figure: the use cases UC1 and UC2, with nominal and exceptional scenarios, feed the test pattern specification (test objective): an accept scenario, optional reject scenarios, and optional prefix scenarios, selected manually or automatically. The general design (main classes, interfaces, …) is instantiated into products P1, P2, P3 via detailed design; test cases TP1 and TP2 are synthesized for the products, and the chain supports evolution.)
(Tool-chain figure: a UML specification (class diagram, object diagram, statecharts, …) is produced in your favourite CASE tool and exchanged via XMI.
1. Formal specification derivation: UMLAUT derives an LTS specification model via a simulator API.
2. Formal objective derivation: the test pattern becomes an LTS test objective.
3. Test synthesis (on the fly): TGV, given the visible actions (.hide) and the inputs/outputs (.io), produces an IOLTS test case.
4. UML test case derivation: the test case is brought back, via XMI, into your favourite CASE tool for display.)
Compiling the Test Pattern
Inputs coming from UML:
the detailed class diagram with, as far as possible, a statechart per active class of the system
an initial object diagram
the test pattern
The dynamic aspects are provided to TGV in the form of an API
Output: a detailed UML scenario describing all the precise calls to perform on the system and the expected verdicts, in order to observe the behavior specified in the pattern
Compiling the Test Pattern
accept+ = sequential composition of the prefix and the accept scenario
Scenarios making up the test case:
accepted by accept+
rejected by none of the reject scenarios
accept+ gives an LTS S+; the reject scenarios {seqj-, j in J} give LTSs {Sj-, j in J}
Test pattern: the LTS S+ combined with the Sj-, j in J (the behaviors of S+ matched by no Sj-)
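The selection rule (accepted by accept+, rejected by no reject scenario) can be sketched as a filter. Here scenarios are plain tuples of actions, and accept+ and the rejects are modelled as membership predicates rather than the LTSs used by the method:

```python
def select(candidates, accept_plus, rejects):
    """Keep the candidates accepted by accept+ and rejected by none of the
    reject predicates."""
    return [s for s in candidates
            if accept_plus(s) and not any(r(s) for r in rejects)]

candidates = [("open", "enter", "leave"),
              ("open", "consult", "enter", "leave"),
              ("open", "enter")]
# Illustrative accept+: starts with open (the prefix) and ends with leave
accept_plus = lambda s: s[:1] == ("open",) and s[-1:] == ("leave",)
# Illustrative reject: any scenario involving consult
rejects = [lambda s: "consult" in s]

kept = select(candidates, accept_plus, rejects)
# Only ("open", "enter", "leave") survives both filters
```

TGV performs the same selection symbolically, on the fly, over the product of the specification LTS and the test pattern.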
Synthesis of the test case
Inputs of TGV:
the simulation API
the LTS representing the test pattern
which actions are internal? which are inputs? outputs?
Output of TGV: an IOLTS representing a test case
UML test case derivation
Product Line architectures: example
(Statecharts: (a) non-limited meetings, with states planned, opened, closed, transitions create, open, close, and close/nok(not yet opened); (b) limited meetings add a saturated state, entered via enter[usersEntered.card>=max], left via leave, with enter/nok(meeting saturated).)
An Example
(Figure: prefix scenario: x:user sends connect(x), plan(*, x), and open(*, x) to :Server, each answered ok; accept scenario S+: y:user sends leave(*, y); reject scenarios S-: enter(*, x) answered nok, and close(*, y).)
Test patterns and test cases
(Figure: the synthesized test case on server:demoServer with user1:user to user4:user. Preamble: connect(user1)/ok, plan(aMtg, user1)/ok, open(aMtg, user1)/ok. Test objective: enter(aMtg, user1)/ok, enter(aMtg, user2)/ok, enter(aMtg, user3)/ok, enter(aMtg, user4)/nok.)
Conclusion
From early modeling to test cases: from reusable and generic test patterns to concrete test cases, specific to each product
Methodology "not fully automated" …