Chapter 15: Testing - 2
Object-Oriented Software Construction
Armin B. Cremers, Tobias Rho, Daniel Speicher, Holger Mügge (based on Bruegge & Dutoit)
Software Lifecycle Activities ...and their models

[Diagram: The activities Requirements Elicitation, Analysis, System Design, Object Design, Implementation, and Testing produce, in turn, the Use Case Model, the Application Domain Objects, the Subsystems, the Solution Domain Objects, the Source Code, and the Test Cases. Each model builds on the previous one: the use case model is expressed in terms of application domain objects, which are structured by subsystems, realized by solution domain objects, implemented by source code, and verified by test cases.]

Armin B. Cremers, Tobias Rho, Daniel Speicher, Holger Mügge (based on Bruegge & Dutoit) Object-Oriented Software Construction 2
Testing Activities

[Diagram: Subsystem code is unit-tested against the System Design Document, yielding tested subsystems. The Integration Test combines tested subsystems into integrated subsystems; the Functional Test, against the Requirements Analysis Document and User Manual, turns them into a functioning system. All tests by developer.]
Testing Activities continued
GlobalClient’s
UnderstandingGlobalRequirements
Understandingof Requirements
Performance AcceptanceT t
FunctioningSystem
T tInstallation
T t
ValidatedSystem
AcceptedSystem
Test Test Test
Usable
U ’ d t diTests by developer
UsableSystemTests by client
User’s understanding
System inUse
Armin B. Cremers, Tobias Rho, Daniel Speicher, Holger Mügge (based on Bruegge & Dutoit) Object-Oriented Software Construction 4
UseTests (?) by user
The 4 Testing Steps
♦ 1. Select what has to be measured
  Analysis: Completeness of requirements
  Design: Cohesion
  Implementation: Code tests
♦ 2. Decide how the testing is done
  Code inspection
  Proofs (Design by Contract)
  Black-box, white-box testing
  Select an integration testing strategy (big bang, bottom up, top down, sandwich)
♦ 3. Develop test cases
  A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or about the attribute being measured
♦ 4. Create the test oracle
  An oracle contains the predicted results for a set of test cases
  The test oracle has to be specified before the actual testing takes place
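Steps 3 and 4 can be sketched in a few lines. The function under test (a leap-year check) and its expected values are hypothetical examples, not part of the lecture:

```python
# A minimal sketch of steps 3 and 4: test cases plus a test oracle.
# The unit under test (leap_year) and its oracle are hypothetical examples.

def leap_year(year: int) -> bool:
    """Unit under test: Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Step 3: test cases, i.e. input data chosen to exercise the unit.
# Step 4: test oracle, i.e. predicted results, specified BEFORE the test runs.
oracle = {
    2000: True,   # divisible by 400
    1900: False,  # divisible by 100 but not by 400
    2024: True,   # divisible by 4
    2023: False,  # not divisible by 4
}

def run_tests():
    """Return the inputs whose actual result disagrees with the oracle."""
    return [y for y, expected in oracle.items() if leap_year(y) != expected]

print(run_tests())  # an empty list means every result matched the oracle
```

The important discipline is that `oracle` is written down before `run_tests()` is executed, so the predictions cannot be adjusted to fit the observed output.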
Guidance for Test Case Selection
♦ Use analysis knowledge about functional requirements (black-box testing):
  Use cases
  Expected input data
  Invalid input data
♦ Use design knowledge about system structure, algorithms, data structures (white-box testing):
  Control structures: test branches, loops, ...
  Data structures: test record fields, arrays, ...
♦ Use implementation knowledge about algorithms. Examples:
  Force division by zero
  Use a sequence of test cases for an interrupt handler
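The black-box guidance above (expected vs. invalid input) and the implementation-level guidance (force the division the algorithm performs) can be illustrated together. The function under test (`average`) is a hypothetical example:

```python
# Sketch: selecting test cases from different knowledge sources.
# The unit under test (average) is a hypothetical example.

def average(values):
    """Unit under test: arithmetic mean of a non-empty sequence."""
    if not values:
        raise ValueError("average() requires at least one value")
    return sum(values) / len(values)

# Black-box (analysis knowledge): expected and invalid input data.
assert average([2, 4, 6]) == 4          # expected input
try:
    average([])                          # invalid input
    raise AssertionError("expected ValueError")
except ValueError:
    pass

# Implementation knowledge: the algorithm divides by len(values), so the
# empty-sequence case is exactly the one that would force division by zero
# if the guard were missing. That is why it belongs in the test set.
```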
Unit-Testing Heuristics
♦ 1. Create unit tests as soon as object design is completed:
  Black-box test: test the use cases & functional model
  White-box test: test the dynamic model
♦ 2. Develop the test cases
  Goal: find the minimal number of test cases to cover as many paths as possible
♦ 3. Create a test harness
  Test drivers and test stubs are needed for integration testing
♦ 4. Specify the test oracle
  Often the result of the first successfully executed test
♦ 5. Execute the test cases
  Don't forget regression testing: re-execute test cases whenever a change is made
♦ 6. Compare the results of the test with the test oracle
  Automate as much as possible
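A minimal sketch of these heuristics, using Python's `unittest` as the test harness. The class under test (`Account`) is a hypothetical example:

```python
# Sketch of heuristics 2-6 with Python's unittest as the test harness.
# The class under test (Account) is a hypothetical example.
import unittest

class Account:
    """Unit under test."""
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

class AccountTest(unittest.TestCase):
    # A minimal set of cases covering both paths through deposit().
    def test_deposit_increases_balance(self):
        a = Account()
        a.deposit(100)
        self.assertEqual(a.balance, 100)   # comparison against the oracle
    def test_negative_deposit_rejected(self):
        with self.assertRaises(ValueError):
            Account().deposit(-1)

# Running the suite programmatically; re-running it after every change
# to Account is the regression test of step 5.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(AccountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the suite is automated, the comparison with the oracle (step 6) costs nothing to repeat, which is what makes regression testing practical.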
Integration Testing Strategy
♦ The entire system is viewed as a collection of subsystems (sets of classes) determined during the system and object design
♦ The order in which the subsystems are selected for testing and integration determines the testing strategy:
  Big bang integration (non-incremental)
  Bottom up integration
  Top down integration
  Sandwich testing
  Variations of the above
♦ For the selection, use the system decomposition from the system design
Steps in Integration-Testing
♦ 1. Based on the integration strategy, select a component to be tested. Unit test all the classes in the component.
♦ 2. Put the selected component together; do any preliminary fix-up necessary to make the integration test operational (drivers, stubs)
♦ 3. Do functional testing: define test cases that exercise all use cases with the selected component
♦ 4. Do structural testing: define test cases that exercise the selected component
♦ 5. Execute performance tests
♦ 6. Keep records of the test cases and testing activities
♦ 7. Repeat steps 1 to 7 until the full system is tested
♦ The primary goal of integration testing is to identify errors in the (current) component configuration.
Which integration strategy should you use?
♦ Factors to consider:
  Amount of test harness (stubs & drivers)
  Location of critical parts in the system
  Availability of hardware
  Availability of components
  Scheduling concerns
♦ Bottom up approach:
  Good for object oriented design methodologies
  Test driver interfaces must match component interfaces
  Top-level components are usually important and cannot be neglected up to the end of testing
  Detection of design errors is postponed until the end of testing
♦ Top down approach:
  Test cases can be defined in terms of the functions examined
  Writing stubs can be difficult
  Need to maintain the correctness of the test stubs
System Testing
♦ Functional Testing
♦ Structure Testing
♦ Performance Testing
♦ Acceptance Testing
♦ Installation Testing
♦ Impact of requirements on system testing:
  The more explicit the requirements, the easier they are to test
  Quality of use cases determines the ease of functional testing
  Quality of subsystem decomposition determines the ease of structure testing
  Quality of non-functional requirements and constraints determines the ease of performance tests
Structure Testing
♦ Essentially the same as white-box testing
♦ Goal: Cover all paths in the system design
  Exercise all input and output parameters of each component
  Exercise all components and all calls (each component is called at least once and every component is called by all possible callers)
  Use conditional and iteration testing as in unit testing
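The "cover all paths" goal, applied at the unit level, means choosing one test case per branch. The function under test (`classify`) is a hypothetical example:

```python
# Sketch: white-box test cases chosen so every branch of the unit executes.
# The function (classify) is a hypothetical example.

def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

# One test case per branch gives full branch coverage of classify():
branch_cases = {-5: "negative", 0: "zero", 7: "positive"}
assert all(classify(n) == expected for n, expected in branch_cases.items())
```

At the structure-testing level the same idea is applied to components and calls instead of branches: every component called at least once, by every possible caller.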
Functional Testing
♦ Essentially the same as black-box testing
♦ Goal: Test the functionality of the system
♦ Test cases are designed based on the requirements specification (better: user manual) and centered around requirements and key functions (use cases)
♦ The system is treated as a black box
♦ Unit test cases can be reused, but in the end new, user-oriented test cases have to be developed as well
Performance Testing
♦ Stress testing: stress the limits of the system (peak demands, maximum number of users, long-term operation)
♦ Volume testing: test what happens if large amounts of data are handled
♦ Configuration testing: test the various software and hardware configurations
♦ Compatibility tests: test backward compatibility with existing systems
♦ Security testing: try to violate security requirements
♦ Timing tests: evaluate response times and the time to perform a function
♦ Environmental tests: test tolerances for heat, humidity, motion, portability
♦ Quality testing: test reliability, maintainability and availability of the system
♦ Recovery testing: test the system's response to the presence of errors or loss of data
♦ Human factors testing: test the user interface
Test Cases for Performance Testing
♦ Push the (integrated) system to its limits
♦ Goal: Try to break the subsystem
♦ Test how the system behaves when overloaded
  Can bottlenecks be identified? (Candidates for redesign in the next iteration.)
♦ Try unusual orders of execution
  Call a receive() before send()
♦ Check the system's response to large volumes of data
  If the system is supposed to handle 1000 items, try it with 1001, 2000, 5000, 10000 items
♦ What is the amount of time spent in different use cases?
  Are typical cases executed in a timely fashion?
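A volume test case of this kind can be sketched as follows. The component under test (`ItemStore`) and the nominal 1000-item limit are hypothetical stand-ins:

```python
# Sketch of a volume/timing test case: if the system is supposed to handle
# 1000 items, try it just above and well above that limit. The component
# under test (ItemStore) and the limits are hypothetical.
import time

class ItemStore:
    """Hypothetical component under test."""
    def __init__(self):
        self.items = []
    def add(self, item):
        self.items.append(item)

def volume_test(n):
    """Load n items and report (items stored, elapsed seconds)."""
    store = ItemStore()
    start = time.perf_counter()
    for i in range(n):
        store.add(i)
    elapsed = time.perf_counter() - start
    return len(store.items), elapsed

for n in (1001, 2000, 5000, 10000):     # push past the nominal 1000-item limit
    count, elapsed = volume_test(n)
    assert count == n                    # no items lost under load
```

The elapsed times across the increasing volumes are what answer the timing question: does the cost grow as expected, or does a bottleneck appear past the limit?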
Acceptance Testing
♦ Goal: Demonstrate that the system is ready for operational use
  Choice of tests is made by the client/sponsor
  Many tests can be taken from integration testing
  The acceptance test is performed by the client, not by the developer
♦ The majority of all bugs in software is typically found by the client after the system is in use, not by the developers or testers. Therefore, two kinds of additional tests:
♦ Alpha tests:
  Sponsor uses the software at the developer's site
  Software is used in a controlled setting, with the developer always ready to fix bugs
♦ Beta tests:
  Conducted at the sponsor's site (developer is not present)
  Software gets a realistic workout in the target environment
  Potential customer might get discouraged
Test Team: Organizational issues

[Diagram: The test team draws on a professional tester, an analyst, a system designer, a user, and a configuration management specialist. The programmer is excluded: too familiar with the code.]
Types of Testing
♦ Unit Testing (last lecture):
  Individual subsystem
  Carried out by developers (of components)
  Goal: confirm that the subsystem is correctly coded and carries out the intended functionality
♦ Integration Testing (mainly this lecture):
  Groups of subsystems (collections of classes) and eventually the entire system
  Carried out by developers
  Goal: test the interfaces and the interplay among the subsystems
Types of Testing
♦ System Testing:
  The entire system
  Carried out by developers (testers!)
  Goal: determine if the system meets the requirements (functional and global)
  Functional Testing: test of functional requirements
  Performance Testing: test of non-functional requirements
♦ Acceptance and Installation Testing:
  Evaluates the system delivered by developers
  Carried out by the client
  Goal: demonstrate that the system meets customer requirements and is ready to use
Integration Testing Strategy
♦ Test (sub-)systems (clusters) for problems that arise from subsystem interactions
♦ Assumption:
  The entire system is viewed as a collection of subsystems determined during the system and object design
  The system decomposition is hierarchical
♦ The order in which the subsystems are selected for testing and integration determines the testing strategy:
  Big bang integration (non-incremental)
  Bottom up integration
  Top down integration
  Sandwich testing
  Variations of the above
♦ For the selection, use the system decomposition from the System Design
Example: Three Layer Call Hierarchy
[Diagram: A three-layer call hierarchy. Layer III (top): User Interface. Layer II: Billing System, Event Service, Learning. Layer I (bottom): System Service, Neural Network, Network Access, Database.]
Integration Testing: Big-Bang Approach
[Diagram: Unit Test A, Unit Test B, Unit Test C, and Unit Test D each feed directly into a single System Test. Don't try this!]

♦ All components (units) are first tested individually and then together as a single, entire system
♦ Pros:
  No test stubs (mocks) and drivers are needed
♦ Cons:
  Difficult to pinpoint the specific component responsible for a failure
  Results in strategies that integrate only a few components at a time
Bottom-up Testing Strategy
♦ The subsystems in the lowest layer of the call hierarchy are tested individually
  Infrastructure is tested first
♦ Then the subsystems from the next layer up that call the previously tested subsystems are integrated and tested
  Increment one subsystem at a time
  Order of integration depends on the importance of the subsystem, etc.
♦ This is done repeatedly until all subsystems are included in the testing
  Regression tests: rerun previous tests
♦ Only test drivers are used to simulate the components of higher layers
♦ No test stubs!
Bottom-up Integration

[Diagram: A (Layer III) calls B, C, D (Layer II); B calls E and F, and D calls G (Layer I). Test E and Test F feed into Test B, E, F; Test G feeds into Test D, G; Test C stands alone; all feed into the final Test A, B, C, D, E, F, G.]
Pros and Cons of bottom-up integration testing

♦ Pros:
  Interface faults can be more easily found (the use of test drivers forces a clear definition of the interfaces of the lower layer)
  No stubs are necessary
♦ Cons:
  Components of the user interface are tested last
  Test cases are often hard to derive
  Faults found in the top layer may lead to changes in the subsystems of lower layers, invalidating previous tests
Top-down Testing Strategy
♦ Test the top layer of the controlling subsystem first
  The skeleton of the program is tested
♦ Then combine all the subsystems that are called by the tested subsystems and test the resulting collection of subsystems
  Increment one subsystem at a time
♦ Do this until all subsystems are incorporated into the test
♦ Test stubs are used to simulate the components of lower layers that have not yet been integrated
♦ No drivers are needed
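The stub idea can be sketched in a few lines. The classes (`ReportUI` as the top-layer component, `BillingStub` as the stand-in for a lower layer) are hypothetical examples:

```python
# Sketch of a top-down step: the top-layer component is tested first,
# with a stub simulating the lower layer that is not yet integrated.
# The classes (ReportUI, BillingStub) are hypothetical examples.

class BillingStub:
    """Test stub: returns canned answers instead of real billing logic."""
    def total_due(self, customer):
        return 42.0                     # fixed value, enough to drive the UI

class ReportUI:
    """Top-layer component under test."""
    def __init__(self, billing):
        self.billing = billing          # lower layer injected: real or stub
    def render(self, customer):
        return f"{customer} owes {self.billing.total_due(customer):.2f}"

assert ReportUI(BillingStub()).render("alice") == "alice owes 42.00"
```

The difficulty the next slide mentions shows up here: a realistic stub must be able to return every condition the real layer can produce (errors, empty results, limits), not just one canned value.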
Top-down Integration Testing
[Diagram: A (Layer III) calls B, C, D (Layer II), which call E, F, G (Layer I). Test A covers Layer III; Test A, B, C, D covers Layers III + II; Test A, B, C, D, E, F, G covers all layers.]
Pros and Cons of top-down integration testing

♦ Pros:
  Test cases can be defined in terms of the functionality of the system (functional requirements)
  More effective for finding faults that are visible to the user
♦ Cons:
  Writing stubs can be difficult: stubs must allow all possible conditions to be tested
  Possibly a very large number of stubs may be required
Sandwich Testing Strategy
♦ Combines the top-down strategy with the bottom-up strategy (parallel testing is possible)
♦ The system is viewed as having three layers:
  A target layer in the middle
  A layer above the target (top layer)
  A layer below the target (bottom layer)
  Testing converges towards the target layer
♦ No test stubs and drivers are necessary for the bottom and top layers
Sandwich Testing Strategy

[Diagram: A (Layer III) calls B, C, D (Layer II), which call E, F, G (Layer I). Bottom layer tests: Test E, Test F, Test G feed into Test B, E, F and Test D, G. Top layer test: Test A feeds into Test A, B, C, D. Both converge into the final Test A, B, C, D, E, F, G.]
Pros and Cons of Sandwich Testing
♦ Pros:
  Top and bottom layer tests can be done in parallel
  No stubs and drivers (saves development time)
♦ Cons:
  System implementation needs to be finished
  Does not test the individual subsystems on the target layer thoroughly before integration (C in the example)
♦ Solution: Modified sandwich testing strategy
Modified Sandwich Testing Strategy
♦ Tests the three layers individually before combining them in incremental tests with one another
♦ The individual layer tests consist of three tests:
  Target layer test with drivers and stubs
  Top layer test with stubs
  Bottom layer test with drivers
♦ The combined layer tests consist of two tests:
  Top layer accessing the target layer (top layer replaces drivers)
  Bottom layer accessed by the target layer (bottom layer replaces stubs)
Modified Sandwich Testing Strategy

[Diagram: A (Layer III) calls B, C, D (Layer II), which call E, F, G (Layer I). Individual layer tests: Test A (top layer, with stubs); Test B, Test C, Test D (target layer, with drivers and stubs); Test E, Test F, Test G (bottom layer, with drivers). Combined layer tests: Test A, B; Test A, C; Test A, D; Test B, E, F; Test D, G. All converge into the final Test A, B, C, D, E, F, G.]
Using the Bridge Pattern to enable early Integration Testing

♦ Usage of design patterns supports testing
♦ Use the bridge pattern to provide multiple implementations under the same interface

[Diagram: Client → DBInterface → DBImplementation, with Stub Code, MySQL, and Oracle as interchangeable implementations.]
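The diagram's idea can be sketched as follows. The class names `Client`-side `DBInterface`, `DBImplementation`, and the stub mirror the diagram; the methods (`query`, `fetch_all`) are hypothetical:

```python
# Sketch of the bridge idea from the slide: the client talks only to
# DBInterface, whose implementation (stub now, MySQL/Oracle later) can be
# swapped without touching client code. Method names are hypothetical.
from abc import ABC, abstractmethod

class DBImplementation(ABC):
    """Implementor side of the bridge."""
    @abstractmethod
    def query(self, sql): ...

class StubImplementation(DBImplementation):
    """Stub code: lets integration testing start before a real DB exists."""
    def query(self, sql):
        return [("stub-row",)]          # canned result, independent of sql

class DBInterface:
    """Abstraction side of the bridge; the client only sees this."""
    def __init__(self, impl: DBImplementation):
        self.impl = impl                # swap in a MySQL/Oracle adapter later
    def fetch_all(self, table):
        return self.impl.query(f"SELECT * FROM {table}")

# Client code integrates against DBInterface early, with the stub plugged in:
db = DBInterface(StubImplementation())
assert db.fetch_all("users") == [("stub-row",)]
```

Because the client is bound only to `DBInterface`, replacing the stub with a real database adapter later re-runs the same integration tests unchanged.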