Holistic Test Analysis & Design (2007)
Transcript of Holistic Test Analysis & Design (2007)
© T17 2007 Neil Thompson & Mike Smith
Holistic Test Analysis & Design
Neil Thompson, Thompson information Systems Consulting Ltd
& Mike Smith, Testing Solutions Group Ltd
STARWest 2007 Track presentation T17 v1.4a
23 Oast House Crescent, Farnham, Surrey, England, UK, GU9 0NP, www.TiSCL.com
St Mary’s Court, 20 St Mary at Hill, London, England, UK, EC3R 8EE, www.testing-solutions.com
2
Do you control your testing, or does your testing control you?
• Test Cases thought of
• Scripts / Procedures written
• Expectation that “those are the tests”
→ THE REMAINDER OF YOUR LIFE (ON THAT PROJECT)
or…
• What you really want to cover
• Governance / management needs
• Product risks
→ FLEXIBLE, RISK-MANAGED TEST EXECUTION
3
Agenda
• Contents:
– Standards’ & textbooks’ guidance on test coverage; Test Cases
– Test specification process: what then how
– Physical & Logical test coverage: Test Conditions
– Holistic Test Analysis & Design method: a spreadsheet!
– Example
– Scripted & exploratory testing
– Formal & informal test techniques; mixing techniques
– Full process, and route-mappable shortcuts
– Fixed “test entities model”?
– Business Performance Management; Scorecards
– Information traceability; Measurement
– Tools
– Conclusions
• Learning objectives for audience:
– understand why deriving test cases is not as simple as many believe
– appreciate the distinction between logical & physical test coverage – just like development!
– take away a flexible table-driven template for analysing Test Conditions & designing Test Cases
– think about the test entities model: is it fixed?
– be ready to mix multiple techniques, and scripted & exploratory approaches
4
IEEE 829 (1998!) still wags many
• The standard’s diagram is titled “Relationship of test documents to testing process”, but you need to search the text for more process detail.
TEST PLAN includes:
• Test Items: software items (source/object/job control code, control data, or a collection of these) which are objects of testing (stated with reference to their specifications)
• Features to be tested: distinguishing characteristics of software items, eg performance, portability, functionality (stated individually and in combinations to be tested)
TEST DESIGN SPEC includes:
• Features to be tested (including nominated Test Items)
• Approach refinements: including techniques & rationale, result checking method, inter-case relationships
TEST CASE SPEC includes:
• Test Items (including Features, and optionally specification references)
TEST PROCEDURE SPEC: …to execute a set of TCSs or to analyse a software item to evaluate Features… includes:
• purpose (including specification references)
• At first sight a one-to-many hierarchy of Plan-Design-Case, but:
– see those Λ symbols! and…
– (Intro & p7) “a TCS may be referenced by several TDSs”
5
Test Items, Features, Conditions, Cases… what do standards & textbooks tell us?
(Columns of the original table: ITEM, FEATURE, BASIS, RISK, CONDITION, OBJECTIVE, CASE. This is not a complete summary, just highlights noticed.)
• 1973 Hetzel: First
• 1979 Myers: Yes
• 1982 Beizer (2nd ed. 1990): First “Tests”
• 1983 ANSI/IEEE: First, Yes (in Example), (kind-of), Yes
• 1984 Hetzel: Yes, Yes, First, Yes
• 1992 Quentin: (kind-of), (kind-of), First, (kind-of), Yes
• 1993 Kaner-Falk-Nguyen: (kind-of), Yes, Yes, Yes
• 1994 Marick: “Clues→Req’ts”, (kind-of)
• 1995 Perry (2nd ed, 2000): (kind-of), (kind-of), (kind-of), (kind-of)
• 1995 Kit: Yes, Yes, First, Yes, (kind-of), Yes, Yes
• 1998 BS 7925-1 (working draft v6.3): (not really), Yes
• 1998 IEEE 829: Yes, Yes (in Example), (kind-of), Yes
• 1999 Black: (kind-of), (kind-of), (kind-of), Yes, First, Yes
• 2000 Binder
• 2002 Craig & Jaskiel: Yes, Yes, Yes, “Inventory”, Yes, Yes
• 2003 Hutcheson (“dictionary” pref’d): (not really), Yes, “Inventory” of units, Yes
• 2007 ISTQB Foundation: Yes, Yes, Yes, Yes, Yes, Yes, Yes
…a run of these states compliance with IEEE 829…
(Yes, there was a “book” before Myers 1976 & 1979)
6
But what is a Test Case?
• ISTQB definition (Glossary v1.3, 31 May 2007): “Test Case: A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement. [After IEEE 610]” (highlighting & line spacing added by Neil Thompson)
• Examples in textbooks tend to be for simple on-line functions
• But consider a batch job – is a test case:
– a field?
– a group of fields?
– a whole record / row?
– the entire file / table?
• Amazingly, no-one seems to care!
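The ISTQB definition above has four parts, which can be sketched as a data structure. This is an illustrative sketch only, not part of the presented method; the example data (limits, file names, field names) is invented, and the batch example simply restates the granularity question rather than answering it.

```python
from dataclasses import dataclass, field

# The four parts of the ISTQB Test Case definition as a plain structure.
@dataclass
class TestCase:
    objective: str                                    # particular objective or test condition
    input_values: dict                                # named input values
    preconditions: list = field(default_factory=list)
    expected_results: dict = field(default_factory=dict)
    postconditions: list = field(default_factory=list)

# For a simple on-line function the granularity is obvious:
tc_online = TestCase(
    objective="Reject withdrawal above daily limit",   # hypothetical example
    input_values={"amount": 500.01},
    preconditions=["daily limit is 500.00"],
    expected_results={"outcome": "rejected"},
)

# For a batch job it is not: is the "test case" one field, one record,
# or the whole input file? The structure alone does not decide this.
tc_batch = TestCase(
    objective="Billing run handles zero-usage customers",
    input_values={"input_file": "calls_batch.dat"},    # hypothetical file name
    expected_results={"zero_usage_rows": "standing charge only"},
)
```

The point the slide makes survives the formalisation: both objects are equally valid instances of the definition, even though they cover wildly different amounts of behaviour.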
7
We want to know test coverage
[Cartoon: “Now, let’s start with a classification tree…” / “1479 test cases, so it must be good, right?”]
Test specification process → documentation to agree coverage
8
What’s needed in the test specification process?
• To get concise coverage documentation, Test Design is important (test case design techniques, eg decision tables)
• But test design should be preceded by Test Analysis: “what” to test
• Test scripts / procedures are optional (eg some testers do exploratory testing)…
• But we cannot do without documenting test coverage in a way which:
– is reviewable with stakeholders
– allows use of informal techniques in addition to formal
– gives a holistic view (risk-based but including “requirements-based”)
9
Logical & Physical coverage
[Diagram: a double V-model. Development side: Requirements → Functional Specification → Technical Design → Module Specs (LOGICAL → PHYSICAL), matched on the right-hand side by Component, Integration, System and Acceptance Testing. Test side: Test Policy → Test Strategy → Master Test Plan → CT / IT / ST / AT Plans; each level’s Analysis & Design (LOGICAL) produces that level’s Scripts / Procedures (PHYSICAL).]
10
But test coverage is just decomposing the system requirements, isn’t it? No!
[Diagram: the REAL WORLD is simplified into Requirements. The DEVELOPMENT MODEL refines Requirements → Functional Specification → Technical Design → Module Spec (refinement with risk of distortion), then programming with risk of mistakes produces the SOFTWARE. The TEST MODEL mirrors this: Acceptance, System, Integration and Component Test Analysis & Design feed AT / ST / IT / CT Execution – verification testing against the development model, validation testing towards the real world. Coverage needs to be multi-dimensional.]
(after “Organisation Before Automation”, Neil Thompson, EuroSTAR 1993, and Software Testing: A Craftsman’s Approach, Paul Jorgensen: SOFTWARE (observed) ↔ DEV MODEL (expected) ↔ TEST MODEL (ver’d / val’d) ↔ REAL WORLD (desired))
11
So what are we proposing for logical & physical coverage?
LOGICAL COVERAGE OF TEST CONDITIONS (hierarchical, but also multi-dimensional):
• 1. Test Items: Whole system; Service incl System; …
• 2. Test Features: Functional areas; Hardware interfaces; Delivered modules; … (useful sub-divisions?)
• 3. Test bases: Requirements sections; On-line transactions; Batch runs; Start of day; Events affecting entities; …
• Functionality and Non-Functionality (Performance, Stress, Security), with Risk as a 4th dimension
• 5. Test Conditions
PHYSICAL COVERAGE OF TEST CASES (hierarchical):
• SCRIPTED: Test Level (eg System Testing) → Test Suites / Packs → Tests → Test Cases
• EXPLORATORY: eg Sessions → Tests
12
Holistic method: how we show logical & physical coverage (spreadsheet representation)
HEADER INFORMATION: Test Level; Test Level Objectives
WHAT IS TO BE TESTED:
• 1. Test Items & Sub-items
• 2. Test Features & Sub-features (+ whether Behavioural or Structural)
• 3. Test Basis References
• 4. Product Risks
• 5. Test Conditions
HOW TEST CASES ARE DESIGNED: Ver / Val Method; Test Data Indications; Technique Names; Test Objectives; Test Cases (see later slide)
TEST SCRIPT REFERENCE (OR EXPLORATORY TEST EXECUTION RECORD): Test Script or Exploratory Regime
MODIFICATIONS FOR CURRENT RELEASE
Sheets: Overview, A, B, C, D, …
Multiple dimensions are handled by:
• allowing flexible many-many relationships (yes, it’s just a table!)
• allowing hierarchy in Test Conditions
• high-level Test Conditions in Overview sheet
• low-level Test Conditions in detail sheets
13
Logical left→right flow: but flexible (1 of 2)
[Diagram: example spreadsheet rows for each test level – COMPONENT, INTEGRATION, SYSTEM, ACCEPTANCE. Columns run TEST ITEMS → TEST FEATURES → BEHAVIOURAL / STRUCTURAL → TEST BASIS REFERENCES → PRODUCT RISKS ADDRESSED → TEST CONDITIONS. Test Items include Modules, Pairs / clusters of modules, Streams, Threads, Whole system, Service to stakeholders. Test Features include Functional (Val, Nav, …), Non-Functional (Perf, Sec, …), Online and Batch, each marked Behav or Struc. Test Bases include Module Specs, Programming Standards, Tech Design, I’face Spec, Func Spec, Func & Non-Func Req’ts, Service Levels and Workshops. Test Conditions are labelled eg F1–F12, C1–C6, S1–S2, ES1–ES3, Data A–Data D, “Public op”, “CAB”.]
(continued on next slide)
14
Logical left→right flow: but flexible (2 of 2)
[Diagram: the columns continue for the same test levels (COMPONENT, INTEGRATION, SYSTEM, ACCEPTANCE): TEST CONDITIONS → MANUAL / AUTOMATED VERIFIC’N / VALID’N & RESULT CHECKING METHOD → TEST DATA INDICATIONS & CONSTRAINTS → TEST CASE DESIGN TECHNIQUES → TEST / TEST CASE OBJECTIVES → TEST SUITE / SCRIPT / PROCEDURE & TEST CASE IDENTIFIERS.]
Example Ver / Val & result checking methods:
• Manual: screen images viewed, test log hand-written
• Manual for changes: examine interface log prints, view screens in each sys
• Auto regression test: in-house test harness
• Manual for changes: database spot-checks, view screens, audit print; Auto regression test: threads, approved tool
• Manual + Auto: varies, under team control
• Manual for changes: varies, under individual control; Auto regression test: per component, approved tool
Example test data indications & constraints:
• update with care, documentation out of date
• copy of live data, timing important, users all have access
• ad-hoc data, unpredictable content, check early with system contacts
• tailored to each component
• contains sanitised live data extracts
• arrange data separation between teams
Example techniques, objectives & identifiers:
• Boundary Value Analysis: 1.0 over / 0.1 over / on / 0.1 under / 1.0 under → CT-2.4.1, CT-2.4.2, CT-2.4.3, CT-2.4.4, CT-2.4.5
• Use Cases: Main success scenario / Extension 2a / Extension 4a / Extension 4b / Extension 6a → AT-8.5.1, AT-8.5.2, AT-8.5.3, AT-8.5.4, etc
• State Transitions (all transitions, Chow 0-switch): MPTUMPTCMPCMCMN → ST-9.7.1, ST-9.7.2, ST-9.7.3, etc
• If you use an informal technique, state so here. You may even invent new techniques!
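The Boundary Value Analysis row above (1.0 over, 0.1 over, on, 0.1 under, 1.0 under, mapped to CT-2.4.1..CT-2.4.5) follows a mechanical pattern. The sketch below is an illustration of that pattern only, not tooling from the presented method; the boundary value 500.0 and the identifier prefix are invented.

```python
# Generate the five classic BVA inputs around a boundary, pairing each
# with a test case identifier in the style shown on the slide.
def bva_cases(boundary: float, step: float = 0.1, prefix: str = "CT-2.4"):
    offsets = [1.0, step, 0.0, -step, -1.0]
    labels = ["1.0 over", f"{step} over", "on", f"{step} under", "1.0 under"]
    return [
        (f"{prefix}.{i + 1}", label, round(boundary + off, 10))
        for i, (label, off) in enumerate(zip(labels, offsets))
    ]

for ident, label, value in bva_cases(500.0):   # hypothetical boundary
    print(ident, label, value)
```

Running this lists CT-2.4.1 (501.0) down to CT-2.4.5 (499.0), with CT-2.4.3 sitting exactly on the boundary, mirroring the slide’s five-row block.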
15
So… what is a Test Condition?
CURRENT ISTQB GLOSSARY (v1.3):
• An item or event of a component or system that could be verified by one or more Test Cases, eg a function, transaction, feature, quality attribute or structural element. (Italic bold parts are questioned here)
OUR CURRENT WORKING DEFINITION:
• A part / aspect of behavioural, non-functional or structural test coverage in a system (or the service in which it is used) which is proposed by stakeholders and could be tested by one or more Test Cases (or a part of a Test Case).
• May be qualified by specific data limitations / combinations.
• Test Conditions are derived by analysis of Items Under Test, Testable Features, Test Bases and Product Risks.
• Test Conditions may be specified in a hierarchical way.
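“Specified in a hierarchical way” can be pictured as a nested structure whose leaves are the low-level conditions a Test Case (or part of one) can cover. This is a minimal sketch under that reading, not part of the method itself; the condition names are invented for illustration.

```python
# Test Conditions as a hierarchy: dicts for the levels, lists for the
# low-level (leaf) conditions.
conditions = {
    "Customer selection": {
        "ages": ["pre-teenager", "teenager", "grumpy old person"],
        "domicile": ["Pacific", "Mid-West", "Mid-East", "Atlantic"],
    },
    "Disconnection": {
        "about to be disconnected": ["voluntary", "forced"],
    },
}

def leaf_conditions(tree, path=()):
    """Flatten the hierarchy into (path, leaf) pairs for coverage counting."""
    if isinstance(tree, dict):
        for key, sub in tree.items():
            yield from leaf_conditions(sub, path + (key,))
    else:
        for leaf in tree:
            yield path, leaf

print(sum(1 for _ in leaf_conditions(conditions)))  # 9 low-level conditions
```

Keeping the hierarchy in data rather than in document headings is what lets high-level conditions live in an Overview sheet while the leaves live in detail sheets.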
16
Relationships between Test Conditions & Test Cases (example)
(Spreadsheet columns as before: 1. Test Items & Sub-items; 2. Test Features & Sub-features (+ Behav/Struct); 3. Test Basis References; 4. Product Risks; 5. Test Conditions; Ver / Val Mechanism; Test Data Indications; Technique Names; Test Objectives; Test Cases; Test Script Reference; Modifications for Current Release. Sheets: Overview, A, A1, …)
Adapted from Foundations Of Software Testing – ISTQB Certification, Graham, Van Veenendaal, Evans & Black
High (1st)-level Test Conditions, eg by “drawing a matrix” (informal):
• customer ages: pre-teenager; teenager; grumpy old person
• customer domicile: Pacific; Mid-West; Mid-East (do you have this?); Atlantic
2nd level: teenager in the Mid-East
3rd level, eg by “drawing another matrix” (informal) over tariffs, credit, gender:
• male teenager in the Mid-East on pay-as-you-go tariff with <$10 credit
• female grumpy old person in Pacific on sunset tariff
• male pre-teenager in Atlantic on pay-as-you-go tariff with >=$10 credit
• customer who has a criminal record
• customer with record of threatening behaviour to call centre staff
• …
Test Cases: Jim Yellow; Prudence Brown; Foxy Red; Boxy Black; Jim Yellow; Boxy Black; …
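The matrix-drawing above multiplies quickly: combining just the named partitions (3 ages, 4 domiciles, 2 tariffs, 2 credit bands, 2 genders) already gives 96 full combinations, which is why the 3rd-level conditions are a selected subset rather than the whole product. A sketch of that arithmetic, using only values named on this slide:

```python
from itertools import product

ages = ["pre-teenager", "teenager", "grumpy old person"]
domiciles = ["Pacific", "Mid-West", "Mid-East", "Atlantic"]
tariffs = ["pay-as-you-go", "sunset"]     # the two tariffs the slide names
credit = ["<$10", ">=$10"]
gender = ["male", "female"]

# Full Cartesian product of the 1st-level partitions.
all_combos = list(product(ages, domiciles, tariffs, credit, gender))
print(len(all_combos))  # 3 * 4 * 2 * 2 * 2 = 96

# One of the selected 3rd-level conditions from the slide is in there:
assert ("teenager", "Mid-East", "pay-as-you-go", "<$10", "male") in all_combos
```

Techniques such as pairwise testing (mentioned later in the deck) exist precisely to pick a defensible subset of such a product.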
17
Relationships between Test Conditions & Test Cases (example continued)
(Columns: 1. Test Items & Sub-items; 2. Test Features & Sub-features; 3. Test Basis References; 4. Product Risks; 5. Test Conditions; … (some columns omitted for clarity); Test Cases; Suites of Test Scripts)
Item: Whole system (changes + regression)
• Sub-Item: the changes
– New transactions: Mailshot
– Changed transactions: Billing; Call centre enquiries
• Sub-Item: system after the changes (regression thread tests)
Features:
• Functionality: Field validation; Navigation between windows; Business logic; Help text
• Non-Func: Usability
• Non-Func: Performance; Contention
• Functionality: (overall)
Test Basis References:
• Mailshot Requirements Document: 1 Overview; 2.1 Select customer; 2.5 Customer navig; 3.1 Examine credit hist; 3.5 Credit navig; 4.1 Examine call hist; 4.5 Calls navig
• Mailshot Design Meeting
Product Risks:
• System may not warn…
• System may not warn in time…
Test Conditions (examples):
• Already agreed to join promotion
• About to be disconnected: Voluntary; Forced
• Two customers; Three customers; Three cust’s, one different
• Selection of paths for one cust (Chow); Select → Credit history for three cust’s; Credit history → Call history same three cust’s; Call history → Offer for four cust’s
Test Cases (examples):
• Male teenager, Mid-East, pay-as-you-go, <$10 credit, criminal
• Female old, Pacific, sunset tariff
• Male pre-teen, Atlantic, pay-as-you-go, >=$10 credit, threatening
• Male pre-teen, Atlantic, pay-as-you-go, <=$10 credit, criminal + threatening
• Brown; Red; Black
• Yellow, Brown; Yellow, Brown, Red; Yellow, Red, Black
• Red; Yellow, Brown, Black; Yellow, Brown, Black; Brown, Yellow, Black, Red
Suites of Test Scripts:
• Jim Yellow (Pre-cond’s … Post-cond’s); Prudence Brown (Pre-cond’s … Post-cond’s); Multi-customer (Pre-cond’s … Post-cond’s)
• Suites: Yellow; Brown; Red; Black; Multi-customer
18
Holistic method: potential use in exploratory testing (can mix with scripted)
(Spreadsheet columns as before: 1. Test Items & Sub-items; 2. Test Features & Sub-features (+ whether Behavioural or Structural); 3. Test Basis References; 4. Product Risks; 5. Test Conditions; Ver / Val Mechanism; Test Data Indications; Technique Names; Test Objectives; Test Script or Exploratory Regime – here an EXPLORATORY TEST EXECUTION RECORD (and/or TEST SCRIPT REF). Sheets: Overview, A, B, C, …)
Elements from “Heuristic Test Strategy Model”, “Universal Testing Method v2.0” & “Improving By Doing”, quoted from Rapid Software Testing v2.1.2, training from James Bach & Michael Bolton, www.satisfice.com, www.developsense.com, cross-referred here by Neil Thompson:
• TESTING MISSION; Project Environment; Quality Criteria; Product Elements; Tests; Perceived Quality
• PROBLEM DOMAIN / PRODUCT DOMAIN / TEST LAB: Model test space → Determine coverage → Determine oracles → Determine test procedures → Configure test system → Operate test system → Observe test system → Evaluate test results → Report test results
19
Now let’s look at techniques
Test Analysis: WHAT IS TO BE TESTED → HOW TEST CASES ARE DESIGNED?
[Diagram examples: a decision table (Transaction, Customer type, Account type against rules A2, C1, C2, D1, P4, …); a calling module with called modules as the component under Integration Test.]
FORMAL: surprisingly little-used! (or used “implicitly”)
BEHAVIOURAL (“black-box”):
• Equivalence Partitioning
• Boundary Value Analysis
• Cause/Effect Graphing – Decision Table Testing
• State Transition Testing
• Classification Tree Method
• Random Testing
• Syntax Testing
• Process Cycle Testing
• Thread Testing
• Elementary Comparison Testing
• One of Each
• Pairwise Testing
• Inductive Testing
STRUCTURAL (“white- or clear-box”):
• Linear Code Sequence and Jump (LCSAJ)
• Branch Condition Combination Testing
• Modified Condition Decision Testing
• Branch Condition Testing
• Branch/Decision Testing
• Statement Testing
• Dataflow Testing
INFORMAL: “the freedom to just analyse stuff”
(Lists compiled by Chris Comey, Testing Solutions Group)
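The decision table sketched at the top of the slide (Transaction, Customer type, Account type against rules A2, C1, …) can be illustrated as a lookup. This is a hedged sketch of the Decision Table Testing idea only: the rule labels echo the slide, but the conditions and actions are invented, not taken from the presentation.

```python
# Decision Table Testing sketch: each rule fixes one combination of
# conditions and the action expected for it. Rule labels follow the
# slide; condition/action content is hypothetical.
decision_table = {
    # rule: (transaction, customer_type, account_type) -> expected action
    "A2": (("withdraw", "personal", "current"), "allow"),
    "C1": (("withdraw", "personal", "savings"), "warn"),
    "C2": (("withdraw", "business", "savings"), "refer"),
    "D1": (("deposit",  "personal", "current"), "allow"),
    "P4": (("deposit",  "business", "current"), "allow"),
}

def expected_action(transaction, customer_type, account_type):
    for rule, (conds, action) in decision_table.items():
        if conds == (transaction, customer_type, account_type):
            return rule, action
    return None, "unspecified"   # a gap in the table is itself a test finding

print(expected_action("withdraw", "personal", "savings"))
```

One Test Case per rule gives the classic “each column once” coverage; combinations that hit no rule expose gaps in the specification, which is much of the technique’s value.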
20
Formal techniques are not carved in stone tablets
• Published techniques have evolved over time; names and representations come and go
• Some major publications on techniques:
– Myers 1979
– Beizer 1982 (2nd ed 1990)
– Beizer 1995 (black-box)
– British Standard 7925-2
– Binder (object-oriented) 2000
– Copeland 2003
• But why does object orientation seem a different world? And now SOA!
• What are patterns, actually?
• What about Hetzel; Kaner-Falk-Nguyen; Marick; Jorgensen – their books don’t seem to talk about “techniques” as such, but contain great value
• In exploratory testing, we hear more about heuristics
• What about “techniques” for non-functional testing?
21
So where have “Techniques” been, where are they going?
Hierarchy (inverted*): Heuristics → Patterns → Techniques
(* Modified from Neil Thompson’s minutes of Software Testing Retreat 2, 2003, from which also this hierarchy was inverted!)
Heuristics – definition (based on Chambers 1981):
• art of discovery in logic
• education method in which student discovers for self
• principles used in making decisions when all possibilities cannot be fully explored!
Patterns – definition*: each must contain all of…
• catchy title
• description of the problem which the pattern addresses
• solution to the problem
• context in which the pattern applies
• one or more (pref >3) examples
(not applied to software testing until object orientation? but…)
Techniques – definition (based on Chambers 1981):
• methods of performance
• manipulation
• mechanical part of an artistic performance!
PROCEDURAL / SCRIPTED PARADIGM:
• black-box: Equivalence Partitioning; Boundary Value Analysis; Cause/Effect Graphing & Decision Tables; State Transitions; Classification Trees; Random; Syntax; Process Cycles; Threads; Elementary Comparisons; One of Each; Pairwise; Inductive
• “white-” or clear-box: Linear Code Sequence and Jump (LCSAJ); Branch Condition Combinations; Modified Condition Decisions; Branch Conditions; Branch/Decisions; Statements; Dataflows
• “these techniques may also be applied in an exploratory way”
EXPLORATORY PARADIGM – heuristics, eg POLARITY-SWITCHING: if after a while you don’t succeed, try something different. We tend to find different bugs when we do this:
• Warming up v. cruising v. cooling down
• Doing v. describing v. thinking
• Careful v. quick
• Data gathering v. data analysis
• Solo work v. team effort
• Your ideas v. other peoples’ ideas
• Current version v. old versions
• Testing v. touring
• Individual tests v. general
(selected from Testing Outside the Bachs, STAREast 2006)
“COLOURED” BOXES: this seems a useful way to divide the system’s structure & behaviour into manageable parts, but INHERITED FEATURES CANNOT BE TRUSTED IN ALL CIRCUMSTANCES.
“Test Design Patterns” from Binder 2002, Testing OO Systems: testing object-oriented systems is similar at the higher test levels, but we need a new approach for lower levels (classes are like trad “modules”, methods are like trad “units”):
• Invariant boundaries; Non-modal class; Quasi-modal class; Modal class; Polymorphic server; Modal hierarchy; Category-partition; Combinational function; Recursive function; Polymorphic message
Guidance on non-functional testing seems fragmented and not “techniqued”.
22
The holistic method: process flow? Route options!
(Spreadsheet columns: 1. Test Items & Sub-items; 2. Test Features & Sub-features; 3. Test Basis References; 4. Product Risks; 5. Test Conditions; Ver / Val Method; Test Data Indic’s / Constraints; Technique Names; Test Objectives; Test Cases; Test Script Ref. (or Exploratory Test Execution Record); Modifications for Current Release. Refine from Stage Test Plan; Sponsor/Owner measurement framework and Tester measurement framework sit above the flow.)
[Diagram: six numbered routes through the columns. Route 1 is the full process flow: derive Test Conditions (using Test Condition Analysis Techniques) → Ver / Val Method, Data Constraints / Indications → Test Case Design Techniques → Test Objectives → Test Cases (in procedure / Script), with tests added / modified / obsolete and regression for the current release. Routes 2–5 are shortcuts that variously omit Risks, formal Techniques, Data Constraints / Indications, Objectives or even Conditions (eg Cases direct from Conditions, or straight to Scripts). Route 6 is a Session: use in Exploratory Charter as appropriate, Conditions → Execution.]
23
So: is a fixed entity model possible / desirable?
• The Software Testing Retreat (UK) has discussed this point
• The initial reaction was that this was already clear, trivial!
• Fierce debate showed otherwise
• We sought both an entity model and a process
• This entity model is only a workshop draft
• The conversations continue…
The main issues seem to be:
• how many intermediate entities are needed (eg “test coverage item”)
• how dependent this all is on a fairly rigorous set of test basis documents
WE SAY FOR NOW:
• BUILD THE ENTITIES AROUND IEEE 829 PLUS “TEST CONDITION”
• DO NOT BE BOUND BY TEST BASIS DOCUMENTS, ALLOW EXPLORATORY
24
Test Model described by ‘methods’ group at large European financial institution
• Requirements (WHY) → Test Requirements (WHAT) → Test Design Specifications (WHAT → HOW) → Test Case Specifications (HOW)
• “From Requirements to Test Case Specification” in test management tool
25
Role of test coverage in business performance measurement & management
• Some people are talking about testing becoming a profession…
• Professional managers want a clear idea what is well-covered, what is less well-covered, and why (counts of Test Cases and bugs mean little in themselves)
• In particular, they do not want:
– important tests omitted
– large numbers of low-value tests
– higher levels of testing merely repeating Component Testing
– insufficient attention to non-functional tests
– unstructured piles of detailed scripts
– difficult-to-maintain testware…
• Business managers are increasingly governed by structured objectives, eg Balanced Scorecards…
26
Scorecards
(www.balancedscorecard.org © Paul Arveson 1998, version after Kaplan & Norton; Software Quality version published by Isabel Evans, www.testing-solutions.com, adapted here by Neil Thompson)
• Based on feedback loops (Deming)
• Not only output feedback but “outcome” (more dimensions)
• Goal → Question → Metric: MEASUREMENT (data) → METRIC (information)
Perspectives (software quality version):
• Product: VERIFICATION; Risks; Test coverage – Faults, Failures
• Financial: Efficiency; Productivity; On-time, in budget – Cost of quality
• Customer: VALIDATION; Risks; Benefits; Acceptance; Satisfaction – Complaints
• Improvement: eg TPI/TMM…; Predictability; Learning; Innovation
• Process: Compliance eg ISO9000; Repeatability – Mistakes
VERIFICATION & VALIDATION: Test coverage incl Risks v Benefits; Test Conditions & residual failures; Test Cases & fault-fixing
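The Goal → Question → Metric chain above can be made concrete for the Product row. A hedged sketch, with invented data: goal – know what is well-covered; question – what fraction of Test Conditions has at least one executed, passed Test Case?; metric – the computation below (not a metric defined by the presentation itself).

```python
# Condition coverage as a GQM-style metric: measurement data in, one
# metric (information) out. All identifiers here are hypothetical.
executions = {
    # test condition -> list of (test case id, passed?)
    "Already agreed to join promotion":      [("TC-1", True)],
    "About to be disconnected - voluntary":  [("TC-2", False), ("TC-3", True)],
    "About to be disconnected - forced":     [],   # no case has run yet
}

def condition_coverage(execs):
    """Fraction of conditions with at least one passed execution."""
    covered = sum(1 for runs in execs.values() if any(ok for _, ok in runs))
    return covered / len(execs)

print(f"{condition_coverage(executions):.0%}")  # 2 of 3 conditions -> 67%
```

Unlike a raw Test Case count, this says *which* conditions remain unexercised (“forced disconnection” here), which is the kind of answer the managers on the previous slide are asking for.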
27
Principles of Business Performance Measurement & Management
• ‘Translating Strategy into Action’ – Kaplan & Norton
• Drives behaviour
• Measures outcomes
• Links actions to strategy
• Leads to predictable outcomes
• Comes with a ‘Health Warning’! – powerful but dangerous if misused; can drive the wrong behaviour if measures are badly constructed
28
Five key principles of Business Performance Measurement & Management
• Generic application
• Objectives, Measures & Targets, Initiatives
• What & How
• Cascading Scorecards
– one person’s ‘How’ is another person’s ‘What’
– Measures & Targets become objectives for the next person
• Lead & Lag Indicators
– Goal Indicators (reactive)
– Performance Indicators (predictive)
29
If Test Cases are not a good “measure”, why are Test Conditions better?
[Diagram: the logical & physical coverage picture from earlier (1. Test Items: Whole system, Service incl System; 2. Test Features: Functional areas, Hardware interfaces, Delivered modules, useful sub-divisions?; 3. Test bases: Requirements sections, On-line transactions, Batch runs, Start of day, Events affecting entities; 4. Product Risks; 5. Test Conditions; Functionality and Non-Functionality – Performance, Stress, Security; SCRIPTED Test Cases and EXPLORATORY Sessions / Tests) overlaid with scorecard elements: Objectives → Measures → Targets → Initiatives; VERIFICATION & VALIDATION; Test coverage incl Risks v Benefits; Mitigation of Product Risks; Delivered benefits.]
30
Test Entity Model needs flexible many-many relationships and not fixed hierarchy
WHY → WHAT → HOW: Requirements → Test Conditions → Test Cases → Test Scripts / Procedures → Test Execution Schedule
• Does this work? No, because of these hierarchies (fixed one-to-many trees at each step)
• Need this instead? The same entities, linked by many-many relationships
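The many-many alternative can be sketched with plain relation tables instead of trees: sets of pairs, so one Test Case can serve several Conditions and one Condition several Cases. This is an illustration of the entity-model point only; the identifiers are invented.

```python
# Relations as sets of (parent, child) pairs - the many-many shape the
# slide argues for, rather than fixed one-to-many hierarchies.
req_to_cond = {("R1", "C1"), ("R1", "C2"), ("R2", "C2")}
cond_to_case = {("C1", "TC1"), ("C2", "TC1"), ("C2", "TC2")}

def cases_for_requirement(req):
    """Traceability query: requirement -> conditions -> test cases."""
    conds = {c for r, c in req_to_cond if r == req}
    return {tc for c, tc in cond_to_case if c in conds}

print(sorted(cases_for_requirement("R1")))  # ['TC1', 'TC2']
```

Note that TC1 covers both C1 and C2, and C2 serves both R1 and R2 – exactly the relationships a strict hierarchy cannot represent without duplication, and what an RDBMS link table (or the method’s spreadsheet rows) captures naturally.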
31
Analysis, Measurement and Information Traceability
• Generic Model
• Separates ‘What’ from ‘How’
• Links Hierarchies and relates Logical to Physical
• Horizontal & Vertical traceability within Dev & Test Models
• Testing Integrated into Development
• ‘Pure’ Test Analysis:
– is difficult, requires expertise
– a driver for further test activities
– a driver for further development activities
– a driver for predictable outcomes
• Programme & Project Measurement Framework for sponsors AND owners of measures!
• We can learn from the world of Business Performance Measurement & Management, and they can learn from us!
32
‘What’ and ‘How’ give us a Treble-V Model! Proactive ‘Testing’ influence on SDLC
[Diagram: for each development level – PROJECT REQ’TS SPEC, LOGICAL DESIGN, PHYSICAL DESIGN, COMPONENT DESIGN, then BUILD – there is STATIC TESTING of the deliverable, then DYNAMIC TEST ANALYSIS (Test Conditions), DYNAMIC TEST DESIGN (Test Cases? explicit / implicit Test Cases?), DYNAMIC TEST EXECUTION, and FIX, RETEST, REGR TEST. At the higher levels, ‘What’ and ‘How’ are further apart.]
33
Tools to support this method
• Spreadsheet (Microsoft Excel) has been used so far; it is flexible and good for graphs – but labour-intensive, and not scaleable
• A tailored RDBMS would be difficult if we do not accept a fixed ERD (limiting route mapping)
• Leading proprietary tools handle the relationships, but in different ways:
– Most are ‘Tester Measurement & Management Frameworks’ (eg Quality Center / Test Director) – Test Case focus
– T-Plan (Sponsor/Stakeholder Measurement & Management Framework) – Test Condition focus
34
Conclusions
• Summary:
– There is more to software test specification than Test Cases & standard techniques (and this surprises most of us)
– Like development, tests benefit from logical specification (Analysis → Test Conditions) before physical Design
– Our Holistic Method allows hierarchies of Test Conditions, and integrates multiple techniques & scripted-exploratory mixtures
– Supports Test-Driven Design & Development, trad. & exploratory testing
– A measured and risk-based view of test coverage is increasingly important for business performance measurement & management
• Lessons learned:
– The original IEEE 829 is better than may be thought; but you need to read it carefully, and textbooks have tended merely to quote it without adding much value in terms of how to apply it
– No “one size fits all” for an entity model
– IEEE 829 is Test Case focussed – supports testers’ measurement framework
• Take away:
– The Holistic Method spreadsheet (it’s in use already at a major client)
35
Way forward
• A new ISO working group is creating a set of international software testing standards in collaboration with the IEEE
• Mindset change required
• Explicit v implicit Test Cases
• Need flexibility to cater for sponsors’ / owners’ measurement framework in addition to testers’
• Enhanced tool support for testing-business linkage
Holistic Test Analysis & Design – Thanks for listening!
For further information…
Neil Thompson, Thompson information Systems Consulting Ltd
& Mike Smith, Testing Solutions Group Ltd
STARWest 2007 Track presentation T17 v1.4a
23 Oast House Crescent, Farnham, Surrey, England, UK, GU9 0NP, [email protected]
St Mary’s Court, 20 St Mary at Hill, London, England, UK, EC3R 8EE, [email protected]