Verification and Validation


Transcript of Verification and Validation

Page 1: Verification and Validation

Nov. 18, 2004 1

Verification and Validation

Verification: checks that the program conforms to its specification.
– Are we building the product right?

Validation: checks that the program as implemented meets the expectations of the user.
– Are we building the right product?

Page 2: Verification and Validation

Static Verification

Program inspection
Formal methods

Page 3: Verification and Validation


Verification and Proofs of Correctness

Formally specify the desired functionality, then verify that the program is a correct implementation of the specification.

Page 4: Verification and Validation

Hoare's Rules

Program fragments and assertions are composed into triples {P} S {Q}
– where P is the precondition assertion, Q is the postcondition assertion, and S is the program statement(s).
– Interpretation: if P is true before S is executed, then when S terminates, Q is satisfied.

Page 5: Verification and Validation

Proofs of Correctness

Partial correctness: if the precondition is true and the program terminates, then the postcondition is satisfied.

Total correctness: partial correctness plus a proof of termination.

Page 6: Verification and Validation

The Assignment Rule

{P} x := f {Q}
where P is the same as Q, except that every occurrence of x in Q has been replaced by f.

Backward substitution:
{x = 5} x := x + 1 {x = 6}
{z > y + 50} x := z – 43 {x > y + 7}

Page 7: Verification and Validation

Rule for Sequencing Statements

{F1} S1 {F2},  {F2} S2 {F3}
{F1} S1; S2 {F3}

Page 8: Verification and Validation


Rule for Conditions and Loops

{P & C} S1 {Q}, {P & ~C} S2 {Q}

{P} if C then S1 else S2 endif {Q}

{I & C} S { I }

{I} while C do S od {I & ~C}

Page 9: Verification and Validation

Software Testing

Software Requirements Specification
– Describes the expected runtime behaviors of the software.
A Test Plan
– Describes how to test each behavior.
The software (source code or executable)

Page 10: Verification and Validation

Testing

Failure:
– The departure of program operation from user requirements.
Fault:
– A defect in a program that may cause a failure.
Error:
– Human action that results in software containing a fault.

Page 11: Verification and Validation

The Test Plan

A ``living document''. It is born with the system and evolves as the system evolves. It is what the key decision makers use to evaluate the system.

User objectives.
System description and traceability matrices.
Special risk elements.
Required characteristics -- operational and technical.
Critical test issues -- operational and technical.

Page 12: Verification and Validation

Management Plan

Integrated Schedule
Roles and Responsibilities
Resources and Sharing

Page 13: Verification and Validation

Verification Outline

Verification to Date
Previous Results
Testing Planned
– Unresolved Issues
– Issues arising during this phase
– Scope of Planned Tests
– Test Objectives
Special Resources
Test Articles

Page 14: Verification and Validation

Validation Outline

Validation to Date
Previous Results
Testing Planned
– Unresolved Issues
– Issues arising during this phase
– Scope of Planned Tests
– Test Objectives
Special Resources
Test Articles

Page 15: Verification and Validation

Test Results and Traceability

Test Procedures
Test Reporting
Development Folders

Page 16: Verification and Validation

Types of Faults

algorithmic faults
computation and precision faults
documentation faults
stress or overload faults
capacity or boundary faults
timing or coordination faults
throughput or performance faults

Page 17: Verification and Validation

IBM Orthogonal Defect Classification

Function: fault that affects capability, end-user interfaces, product interface with hardware architecture, or global data structures.
Interface: fault in interfacing with other components or drivers via calls, macros, control blocks, or parameter lists.
Checking: fault in program logic that fails to validate data and values properly before they are used.
Assignment: fault in data structure or code block initialization.
Timing/serialization: fault that involves timing of shared and real-time resources.
Build/package/merge: fault that occurs because of problems in repositories, change management, or version control.
Documentation: fault that affects publications and maintenance notes.
Algorithm: fault involving efficiency or correctness of an algorithm or data structure, but not its design.

Page 18: Verification and Validation

The Testing Process

Unit testing
Component testing
Integration testing
System testing
Acceptance testing

Page 19: Verification and Validation

Testing Strategies

Top-down testing
Bottom-up testing
Thread testing
Stress testing
Back-to-back testing

Page 20: Verification and Validation

Traditional Software Testing Techniques

Black-box testing
– program specifications: functional testing
– operational profile: random testing, partition testing
White-box testing
– statement coverage
– branch coverage
– data flow coverage
– path coverage
Others
– stress testing
– back-to-back testing

Page 21: Verification and Validation

Defect Testing

Black-box testing
Interface testing
Structural testing

Page 22: Verification and Validation

Black-Box Testing

Graph-based testing methods
Equivalence partitioning
Boundary value analysis

Page 23: Verification and Validation

Graph-Based Testing

Transaction flow modeling
Finite state modeling
Data flow modeling
Timing modeling

Page 24: Verification and Validation

Partition Testing

If an input condition specifies a range, one valid and two invalid equivalence classes are defined.

If an input condition requires a specific value, one valid and two invalid equivalence classes are defined.

If an input condition specifies a member of a set, one valid and one invalid class are defined.

If an input condition is boolean, one valid and one invalid class are defined.

Page 25: Verification and Validation

Boundary Value Analysis

If an input condition specifies a range bounded by values a and b, test cases should be designed with values a and b, and with values just above and just below a and b.

If an input condition specifies a number of values, test cases should be developed that exercise the minimum and maximum numbers. Values just above and just below the minimum and maximum are also tested.

Apply guidelines 1 and 2 to output conditions.

If internal program data structures have prescribed boundaries, be certain to design a test case to exercise the data structure at its boundary.

Page 26: Verification and Validation

Interface Testing

Parameter interfaces
Shared memory interfaces
Procedural interfaces
Message passing interfaces

Page 27: Verification and Validation

Integration Testing

Top-down integration
Bottom-up integration
Incremental testing

Page 28: Verification and Validation

White-Box Testing

Statement coverage criterion
Branch coverage criterion
Data flow coverage criterion
Path coverage criterion

Page 29: Verification and Validation

Statement Coverage

while (...) {
    ...
    while (...) {
        ...
        break;
    }
    ...
    break;
}

Page 30: Verification and Validation

Branch Coverage

if (StdRec != null)
    StdRec.name = arg[1];
...
write(StdRec.name);

Page 31: Verification and Validation

Data Flow

Data-flow graph: defined on the control-flow graph by defining sets of variables DEF(n), C-USE(n), and P-USE(n) for each node n.

Variable x is in DEF(n) if
– n is the start node and x is a global, parameter, or static local variable.
– x is declared in basic block n with an initializer.
– x is assigned in basic block n with the =, op=, ++, or -- operator.

Page 32: Verification and Validation

Data Flow Testing

read(x, y);
if (x > 0)
    z = 1;
else
    z = 0;
if (y < 0)
    write(z);
else
    write(y / z);

Path: 2^n
Data flow: n^4
(n is the number of conditional statements)

Branch: cyclomatic complexity (McCabe)
CC(G) = #E - #N + 2P

Page 33: Verification and Validation

C-Use

Variable x is in C-USE(n) if x occurs as a C-USE expression in basic block n as
– a procedure argument,
– an initializer in a declaration,
– a return value in a return statement,
– the second operand of "=",
– either operand of "op=",
– the operand of ++, --, or *,
– the first operand of "." or "->".

Page 34: Verification and Validation

P-Use

Variable x is in P-USE(n) if x occurs as a P-USE expression in basic block n
– as the conditional expression in an if, for, while, do, or switch statement.
– as the first operand of the conditional expression operator (?:), the logical and operator (&&), or the logical or operator (||).

Page 35: Verification and Validation

C-Use Coverage

A C-Use is a variable x and the set of all paths in the data-flow graph from node na to node nb such that
– x is in DEF(na), and
– x is not in DEF(ni) for any other node ni on the paths (definition-clear path), and
– x is in C-USE(nb).

A C-Use is covered by a set of tests if at least one of the paths in the C-Use is executed when the test is run.

Page 36: Verification and Validation

P-Use Coverage

A P-Use is a variable x and the set of all paths in the data-flow graph from node na to node nb such that
– x is in DEF(na), and
– x is not in DEF(ni) for any other node ni on the paths (definition-clear path), and
– x is in P-USE(nb).

A P-Use is covered by a set of tests if at least one of the paths in the P-Use is executed when the test is run.

Page 37: Verification and Validation

Path Coverage

read(x, y);
if (x > 0)
    z = 1;
else
    z = 0;
if (y < 0)
    write(z);
else
    write(y / z);

x = 1,  y = 1  -> 1
x = -1, y = -1 -> 0
x = 0,  y = 0  -> error

[Flow graph decision nodes: x > 0 and y < 0]

Page 38: Verification and Validation

[Subsumption hierarchy of coverage criteria; A → B means A subsumes B]

all-paths → all-du-paths → all-uses
all-uses → all-c-uses/some-p-uses, all-p-uses/some-c-uses
all-c-uses/some-p-uses → all-c-uses, all-defs
all-p-uses/some-c-uses → all-defs, all-p-uses
all-p-uses → branch → statement

Page 39: Verification and Validation

Complexity of White-Box Testing

Branch coverage:
– McCabe's cyclomatic complexity: CC(G) = #E - #V + 2P (G = (V, E))
All-defs: M + I * V
All-p-uses, all-c-uses/some-p-uses, all-p-uses/some-c-uses, all-uses: N^2
All-du-paths, all-paths: 2^N (N is the number of conditional statements)

Page 40: Verification and Validation

Mutation Testing

A program under test is seeded with a single error to produce a ``mutant'' program.

A test covers (kills) a mutant if the output of the mutant and the program under test differ for that test input.

The mutation coverage measure for a test set is the ratio of mutants covered to total mutants.

Page 41: Verification and Validation

Debugging: Program Slicing Approach

Static slicing: decomposes a program by statically analyzing the data flow and control flow of the program.
– A static program slice for a given variable at a given statement contains all the executable statements that could influence the value of that variable at the given statement.
– The exact execution path for a given input is a subset of the static program slice with respect to the output variables at the given checkpoint.
– Focus is an automatic debugging tool based on static program slicing to locate bugs.

Page 42: Verification and Validation

Dynamic Slicing

Dynamic data slice: a dynamic data slice with respect to a given expression, location, and test case is the set of all assignments whose computations have propagated into the current value of the given expression at the given location.

Dynamic control slice: a dynamic control slice with respect to a given location and test case is the set of all predicates that enclose the given location.

Page 43: Verification and Validation

Object-Oriented Testing Process

Unit testing – class
Integration testing – cluster
System testing – program

Page 44: Verification and Validation

Class Testing

Inheritance
Polymorphism
Sequence

Page 45: Verification and Validation

Class Testing Strategies

Testing inheritance
Testing polymorphism
State-oriented testing
Data flow testing
Function dependence class testing

Page 46: Verification and Validation

Inheritance

A subclass may re-define its inherited functions, and other functions may be affected by the re-defined functions. When this subclass is tested, which functions need to be re-tested?

class foo {
    int local_var;
    ...
    int f1() { return 1; }
    int f2() { return 1 / f1(); }
};

class foo_child : public foo {  // child class of foo
    int f1() { return 0; }
};

Page 47: Verification and Validation

Testing Inheritance

``Incremental testing of object-oriented class structures.'' Harrold et al. (1992)

New methods: complete testing
Recursive methods: limited testing
Redefined methods: reuse test scripts

Page 48: Verification and Validation

Polymorphism

An object may be bound to different classes at run time. Is it necessary to test all the possible bindings?

// beginning of function foo
{
    ...
    P1 p;
    P2 c;
    ...
    return (c.f1() / p.f1());
}
// end of function foo

Page 49: Verification and Validation


Testing Polymorphism

``Testing the Polymorphic Interactions between Classes.''

McDaniel and McGregor (1994) Clemson University

Page 50: Verification and Validation


State-Oriented Testing

``The state-based testing of object-oriented programs,'' 1992, C. D. Turner and D. J. Robson

``On Object State Testing,'' 1993, Kung et al.

``The testgraph methodology: Automated testing of collection classes,'' 1995, Hoffman and Strooper

The FREE approach: Binder http://www.rbsc.com/pages/Free.html

Page 51: Verification and Validation

Data Flow Testing

``Performing Data Flow Testing on Classes,'' 1994, Harrold and Rothermel.

``Object-oriented data flow testing,'' 1995, Kung et al.

Page 52: Verification and Validation

Function Dependence Relationship

A function uses a variable if the value of the variable is referenced in a computation expression or used to decide a predicate.

A function defines a variable if the value of the variable is assigned when the function is invoked.

A variable x uses a variable y if the value of x is obtained from the value of y (and possibly others); x is affected when the value of y is changed.

Page 53: Verification and Validation

Function Dependence Relationship

f1 depends on f2 if
– f1 uses a variable x that is defined in f2,
– f1 calls f2 and uses the return value of f2,
– f1 is called by f2 and uses a parameter p that is defined in f2, or
– f1 uses a variable x, and x uses a variable y which is defined in f2.

Page 54: Verification and Validation

Object-Oriented Pitfalls

Type I faults – inheritance/polymorphism faults
Type II faults – object management faults
Type III faults – traditional faults

Page 55: Verification and Validation

Object-Oriented Pitfalls - Type I

class foo {
public:
    int local_var;
    ...
    virtual int f1() { return 1; }
    int f2() { return 1 / f1(); }
};

class foo_derived : public foo {
public:
    int f1() { return 0; }
};

main() {
    int x, y;
    foo *Obj;
    cin >> x >> y;
    if (x > 0)
        Obj = new foo();
    else
        Obj = new foo_derived();
    if (y > 0)
        cout << Obj->f2();
    else
        cout << Obj->f1();
}

Page 56: Verification and Validation

Object-Oriented Pitfalls - Type II

class foo {
    int *m_data;
public:
    foo() {
        m_data = new int;
        *m_data = 0;
    }
    ~foo() { delete m_data; }
    void print() { cout << *m_data; }
    void inc() { (*m_data)++; }
};

main() {
L1: foo *obj1 = new foo();
L2: obj1->inc();
L3: foo *obj2 = new foo();
L4: *obj2 = *obj1;
    if (P1)
L5:     obj2->inc();
L6: obj2->print();
    if (P2)
L7:     obj2->~foo();
    if (P3)
L8:     obj1->print();
L9: obj1->~foo();
}

Page 57: Verification and Validation

Empirical Study - Applications

System A: GUI
System B: Data logging system
System C: Network communication program

Page 58: Verification and Validation

Fault Summary

System     System A   System B   System C
LOC        5.6k       21.3k      16.0k
NOF        35         80         85
Type I     5          15         10
Type II    6          13         7
Type III   24         52         68
OOF (%)    31%        35%        20%

Page 59: Verification and Validation

Functional Testing

System     System A   System B   System C
NOT        100        383        326
Type I     1(4)       4(11)      5(5)
Type II    1(5)       6(7)       3(4)
Type III   15(9)      32(20)     44(24)
OOF (%)    2(18%)     10(35%)    8(47%)
NOF        17(48%)    42(52%)    52(61%)

Page 60: Verification and Validation

Statement Testing

System     System A   System B   System C
NOT        46         55         52
Type I     0(4)       1(10)      0(5)
Type II    1(4)       1(6)       1(3)
Type III   5(4)       6(14)      4(20)
OOF (%)    1(11%)     2(11%)     1(11%)
NOF        6(33%)     8(21%)     5(15%)

Page 61: Verification and Validation

Branch Testing

System     System A   System B   System C
NOT        23         41         33
Type I     0(4)       1(9)       1(4)
Type II    1(3)       1(5)       0(3)
Type III   1(3)       6(8)       5(15)
OOF (%)    1(11%)     2(11%)     1(11%)
NOF        2(11%)     8(21%)     6(18%)

Page 62: Verification and Validation

Code-Based Testing

System     System A   System B   System C
NOT        69         96         85
OO faults  2(22%)     2(22%)     2(22%)
NOF        8(44%)     12(42%)    13(33%)

Page 63: Verification and Validation

All-States

System     System A   System B   System C
NOT        21         37         38
Type I     1(3)       3(7)       2(3)
Type II    0(5)       2(4)       1(2)
Type III   4(5)       8(6)       10(10)
OOF (%)    1(11%)     5(28%)     3(33%)
NOF        5(28%)     13(34%)    13(40%)

Page 64: Verification and Validation

All-Transitions

System     System A   System B   System C
NOT        43         79         75
Type I     1(2)       2(5)       0(3)
Type II    2(3)       1(3)       1(1)
Type III   4(1)       6(0)       8(2)
OOF (%)    3(33%)     3(17%)     1(11%)
NOF        7(39%)     9(24%)     9(27%)

Page 65: Verification and Validation

State-Based Testing

System     System A   System B   System C
NOT        64         116        113
OOF (%)    4(44%)     8(44%)     4(44%)
NOF        12(66%)    22(58%)    22(67%)

Page 66: Verification and Validation

Object-Flow Based Testing

Object
– an object is an instance of a class
Define
– an object is defined if its state is initiated or changed
Use
– an object is used if one of its data members is referenced

Page 67: Verification and Validation

Object-Flow Coverage Criteria

All-du-pairs
– at least one definition-clear path from every definition of every object to every use of that definition must be exercised under some test.
All-bindings
– every possible binding of every object must be exercised at least once when the object is defined or used.

Page 68: Verification and Validation

Weak Object-Flow Testing

An object is defined if
– the constructor of the object is invoked;
– a data member is defined;
– a method that initiates/modifies the data member(s) of the object is invoked.

Page 69: Verification and Validation

Object-Oriented Pitfalls - Type I

class foo {
public:
    int local_var;
    ...
    virtual int f1() { return 1; }
    int f2() { return 1 / f1(); }
};

class foo_derived : public foo {
public:
    int f1() { return 0; }
};

main() {
    int x, y;
    foo *Obj;
    cin >> x >> y;
    if (x > 0)
        Obj = new foo();
    else
        Obj = new foo_derived();
    if (y > 0)
        cout << Obj->f2();
    else
        cout << Obj->f1();
}

Page 70: Verification and Validation

Object-Oriented Pitfalls - Type II

class foo {
    int *m_data;
public:
    foo() {
        m_data = new int;
        *m_data = 0;
    }
    ~foo() { delete m_data; }
    void print() { cout << *m_data; }
    void inc() { (*m_data)++; }
};

main() {
L1: foo *obj1 = new foo();
L2: obj1->inc();
L3: foo *obj2 = new foo();
L4: *obj2 = *obj1;
    if (P1)
L5:     obj2->inc();
    if (P2)
L6:     obj2->~foo();
    if (P3)
L7:     obj1->print();
    obj1->~foo();
}

Page 71: Verification and Validation

All-DU-Pairs

System     System A   System B   System C
NOT        54         89         65
Type I     2(2)       6(5)       1(4)
Type II    4(1)       6(1)       4(0)
Type III   3(6)       7(13)      3(21)
OOF (%)    6(67%)     12(67%)    5(56%)
NOF        9(50%)     19(50%)    8(24%)

Page 72: Verification and Validation

All-Bindings

System     System A   System B   System C
NOT        21         37         32
Type I     1(1)       3(2)       3(1)
Type II    1(0)       1(0)       0(0)
Type III   2(4)       3(10)      4(17)
OOF (%)    2(22%)     4(22%)     3(33%)
NOF        4(22%)     7(18%)     7(21%)

Page 73: Verification and Validation

Object-Flow Based Testing I

System     System A   System B   System C
NOT        75         126        97
OOF        8(1)       16(2)      8(1)
NOF        13(5)      26(12)     15(18)

Page 74: Verification and Validation

Object-Flow Based Testing I’

System     System A   System B   System C
NOT        115        155        145
OOF        8(1)       16(2)      8(1)
NOF        16(2)      31(7)      21(12)

Page 75: Verification and Validation

Object-Flow Based Testing II

System     System A   System B   System C
NOT        187        295        247
OOF (%)    9(0)       18(0)      9(0)
NOF        17(1)      37(1)      28(5)

Page 76: Verification and Validation

Object-Flow Based Testing II’

System     System A   System B   System C
NOT        264        391        332
OOF        9(0)       18(0)      9(0)
NOF        17(1)      37(1)      29(4)

Page 77: Verification and Validation

Integrated Testing

Functional testing
Code-based testing
Object-flow based testing I
State-based testing

Page 78: Verification and Validation

Integrated Approach

System     System A   System B   System C
NOT        126        131        167
OOF        9(0)       18(0)      9(0)
NOF        17(1)      36(2)      29(4)