Exploratory Testing Course
© 2008-17 Gerrard Consulting Version 3.0 Page 1
Introduction to
Exploratory Testing
Paul Gerrard
[email protected]
Twitter: @paul_gerrard
Web: gerrardconsulting.com
Slide 1
Agenda
• A New Model for Testing
• Exploring a System as a Source of Knowledge
• Heuristics, Cheat Sheets and Check Lists
• Freestyle Exploratory Testing
• Exploratory Testing Sessions
• Exploring More Complex Systems
• Plus some research I am doing…
Slide 2
Introduce Yourself
Name Your Team
A New Model for Testing
Our sources of knowledge are fallible
and incomplete
• Testers depend on sources of knowledge to test
– Written documentation, specs, standards
– Verbally communicated knowledge and advice
– Experience of using similar systems
– Experience of using the current business process or system (that is to be replaced)
– Intelligence, intuition, beliefs, biases, prejudices etc
• All of our sources of knowledge are fallible and incomplete.
Slide 5
Evolution
• Our brains evolved to interpret fuzzy data for
the purpose of catching mammoths for dinner
• But we are not good at:
– Communicating in a formal, precise way
– Interpreting what we perceive without error
– Eliciting information, capturing, interpreting,
communicating and acting on it
• These errors are the cause of many of the
defects testers seek to prevent or detect.
Slide 6
How testers gather knowledge
• We question people, documents, and the system
under test to understand what is required
• We challenge these sources to check whether
we understand and trust that knowledge
• We EXPLORE our sources to acquire the
knowledge required to test
• In this respect, ALL testing is exploratory.
Slide 7
Let’s explore MindMup.com
• Have you ever used “mind maps”?
• MindMup is a free tool
– Create MindMaps
– Save in the cloud
– Share them
• Go to mindmup.com
• Create a new map, edit it, share it etc.
• Learn how it works for a few minutes.
Slide 8
Test “ideas” - what you might test
• Test ideas are: “things you might test”
– When you see a date field, you might think:
• Yesterday, today, tomorrow, long ago, far in future
• These might be boundary values
– A count of something real (or a loop)
• 0, 1, N, N+1 iterations
– Boolean: True/False, Yes/No etc.
– String: null, 1 char, non alphas, numbers, N chars,
N+1 chars, many many chars.
Slide 9
Test design techniques generate values
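These test-idea patterns can be sketched as small helper functions. This is only an illustrative sketch; the function names are my own, not part of the course materials:

```python
from datetime import date, timedelta

def date_test_ideas(today):
    """Boundary-style ideas for a date field: yesterday, today, tomorrow,
    long ago and far in the future."""
    return {
        "yesterday": today - timedelta(days=1),
        "today": today,
        "tomorrow": today + timedelta(days=1),
        "long_ago": today - timedelta(days=365 * 100),
        "far_future": today + timedelta(days=365 * 100),
    }

def count_test_ideas(n):
    """Ideas for a count of something real (or a loop): 0, 1, N, N+1 iterations."""
    return [0, 1, n, n + 1]

def string_test_ideas(max_len):
    """Ideas for a string field: null, 1 char, non-alphas, numbers,
    N chars and N+1 chars."""
    return {
        "null": "",
        "one_char": "a",
        "non_alpha": "!@#$",
        "numeric": "12345",
        "n_chars": "a" * max_len,
        "n_plus_1_chars": "a" * (max_len + 1),
    }
```

Each helper returns candidate boundary values you might probe, rather than pass/fail verdicts: the test ideas, not the tests.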
Test ideas – more exploration?
• When you explore, you encounter new
features all the time
• Think of this activity like “surveying”
• Some features, you understand and then test
• Other features need more exploration
– There are too many to explore right now
– Perhaps they are rather complicated and need a
dedicated session.
Slide 10
Your task
• Identify a feature of MindMup to explore
• For example, the ‘Edit’ feature
• Create a root node called ‘Edit’
– Add child nodes for 3-4 features in the Edit menu
• For each feature:
– Add a test child node and some ideas for testing (some interesting test values/cases)
– Add a questions node for things you don’t understand
• Don’t be pedantic – yet
• Just capture your test ideas quickly
– (try some and see what happens, perhaps?)
Slide 11
An exploratory ‘survey’
• It’s easy to use the MindMup menu to identify features:
– To group them as a family/hierarchy
– To think of tests for each feature (isn’t it?)
• Your mindmap is a ‘model of the territory’
• For each test idea, you could create:
– Details for each test as child nodes
– Test runs
– Questions, clarifications and bug reports even.
Slide 12
Feature (mindmap) hierarchy?
• Place
– The highest level: a place could represent a whole app in a system, a sub-system or a collection of features
• Feature
– A feature is a collection of related forms, for example a home page or the checkout process
• Form
– A piece of functionality that is testable, i.e. you could create a test case for it – could be a form or even a web service
– A menu option, hyperlink or button on a screen
• Field
– An element inside a form. It could be an input field or button on a form, a link or a download
Slide 13
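The Place > Feature > Form > Field hierarchy above can be captured as a simple data structure. A minimal sketch; the class and example names are invented for illustration, not part of MindMup or the course:

```python
from dataclasses import dataclass, field

@dataclass
class Field:
    name: str                               # input field, button, link or download

@dataclass
class Form:
    name: str                               # a testable piece of functionality
    fields: list = field(default_factory=list)

@dataclass
class Feature:
    name: str                               # a collection of related forms
    forms: list = field(default_factory=list)

@dataclass
class Place:
    name: str                               # a whole app, sub-system or feature set
    features: list = field(default_factory=list)

# Hypothetical example: a web shop's checkout as a mini model of the territory
checkout = Feature("Checkout", forms=[Form("Payment", fields=[Field("Card number")])])
shop = Place("Web shop", features=[checkout])
```

A mindmap with these four node levels encodes exactly this nesting, which is why it works well as a survey model.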
Choose one feature and explore it
• Consider one feature that is interesting to you and test it
• Use MindMup or handwritten notes to record what you did
• Discuss with your team:
– What did you discover?
– How does your feature work?
– Have you questions you want answered?
– Did you find a bug?
Slide 14
Questions, bugs need answers, or more exploration
Models Quiz
Models are innate, essential, human
Slide 16
An instinctive test process?
Slide 17
Hmm, I wonder what this does?
What happens if I try to …?
Hmm. That’s interesting…
It should do this… let’s see…
It’s instinctive so you don’t think of
it as a ‘process’
Natural curiosity seems to be the
driver
Introducing
The New Model for Testing
Forget Logistics
(for the time being)
Document or not?
Automated or manual?
Agile v waterfall?
This business or that business?
This technology v that technology?
ALL Testing is Exploratory
We explore sources of knowledge ...
... to build test models ...
... that inform our testing.
Judgement, exploring and testing
• Exploring (sources) creates test models
• Judgement: are our model(s) adequate or not adequate?
• Testing (the system) uses test models
We explore sources of knowledge to build test models that inform our testing
BTW – Do developers explore the same way? I think so.
Slide 21
Exploration process
ExplorationDefinitions
specs/stories
People(& you)
Sources
Require-ments
TestModels
Enquiring
Challenging
Sources:People, documents,experience, system under test
Modelling
Test Models:Can be documented
or mental models
Predicting
System under test
Slide 22
Testing process
• Test models inform the testing of the system under test
• Activities: Applying and Interpreting tests; Refining the models
• Outcomes: revise the system, more exploring, reporting
Slide 23
New Model Testing
My BBC talk: http://www.bbc.co.uk/academy/technology/article/art2015052211302939829
Paper: http://dev.sp.qa/download/newModel
Exploring a System as a Source
of Knowledge
When did YOU start testing?
Before your first ‘real’ job?
Discuss
How old were you when you
started testing?
Playfulness, Curiosity and Testing
• Kids like to play
• Driven by their ‘natural curiosity’
• What can this toy (object) do?
• Are there rules to be learnt?
• I can hypothesise what those rules are
• I can test my hypotheses
• I learn by exploration, experimentation.
Slide 27
Exploration and rules
• Kids play, adults explore?
• We play with toys, but exploration better describes
the activity
• The “need to know” drives the activity
• Need to understand the “rules of the game”
• When we know the rules
– we can predict the future, based on the current situation
– we can play the game better and win.
Slide 28
Do our education systems suppress curiosity and play?
Exploration and rules 2
• Play and the scientific method are similar
• Applies to games, physics, economics,
psychology… almost any discipline
• We hypothesise, and make conjectures
• Then, we stage tests of those conjectures
• As we test, we refine our conjectures,
recognise exceptions or start again.
Slide 29
Conjectures and heuristic(s)
• Conjecture:
– “the formation of conclusions based on
incomplete evidence”
– “a guess”
• Heuristic (adjective):
– “A learning approach based on trial and error”.
Slide 30
Heuristic (noun) – a definition
“A rule of thumb, simplification, or educated
guess that reduces or limits the search for
solutions in domains that are difficult and poorly
understood.
Unlike algorithms, heuristics do not guarantee
optimal, or even feasible, solutions and are often
used with no theoretical guarantee”
Slide 31
Example heuristics
• Right wing governments cut taxes
• Ice on the road makes driving dangerous
• Smaller class sizes improve education
• Older people respect authority more
• Men with beards can’t be trusted
• Some heuristics are personal beliefs or prejudices,
but could reflect an underlying principle
• Are heuristics true? Not usually.
Slide 32
Heuristics in software development
1. Complex code is more likely to fail
2. Good requirements lead to better systems
3.
4.
5.
Suggest three system design heuristics of your own derived from your own experience.
Slide 33
Heuristics, Cheat Sheets and
Check Lists
Use of heuristics (cheat sheets)
http://testobsessed.com/wp-content/uploads/2011/04/testheuristicscheatsheetv1.pdf
Slide 35
Heuristics are…
• Known to have exceptions
• They are not ‘provable’ or guaranteed
• Used when there isn’t:
– A Test Oracle to consult (i.e. no requirements)
– Enough time to do full analysis (i.e. much of the time)
• Useful, when the problem to be solved is open ended
or extremely complex
• Approximations or specific cases of the true ‘law’.
Slide 36
Heuristics in action
• Question the product to see what it does in
‘interesting situations’
– Invent a heuristic – model the behaviour
– Does it make sense as a model?
– Can you make predictions?
– Use challenges to test and refine the model
• It’s similar to ‘twenty questions’…
Slide 37
20 Questions
1. Is it living? – No
2. Is it a machine? – No
3. Is it useful? – Yes
4. Is it a tool? – Yes
5. Could I lift it? – Yes
6. Can I cut with it? – No
What is your latest guess for the object?
What is your next question?
Slide 38
Twenty questions 2
• Next question please......
• Was your first guess correct?
• Can you refine/change your view?
• Did you try to specify the object or object
type too early?
• Was your heuristic/guess too general?
• We make guesses that reduce the number of
possibilities by 50% (ideally).
Slide 39
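The ‘halving’ idea is just binary search over the space of possible objects, which is why twenty well-chosen questions go such a long way. A quick sketch of the arithmetic:

```python
import math

# Each well-chosen yes/no question ideally halves the remaining candidates,
# so k questions can distinguish up to 2**k distinct objects.
def questions_needed(candidates):
    """Minimum number of ideal yes/no questions to isolate one candidate."""
    return math.ceil(math.log2(candidates))
```

For example, `questions_needed(1_000_000)` is 20: twenty ideal questions are enough to pick one object out of a million.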
Learning by exploration
• We make a guess/invent a heuristic, make a
speculative model
• Our model allows us to make a prediction
• We test/challenge to validate our model; to
reveal some new knowledge:
– To confirm our belief in our model or…
– …to refute our belief in our model.
Slide 40
Evolution of your understanding
• Choose heuristic
• Validate heuristic
• Add to model
• Generalise model
• (or Abandon heuristic and start again)
Exploration is characterised by refinement and occasional setbacks (dead ends)
Slide 41
Chains of exploration
• A bit like island hopping
• Current knowledge means our model ‘works’
• To refine the model is a leap of faith
• Need tests of confirmation for every leap
• May need to “go back one space” sometimes.
Slide 42
‘Growing’ your understanding
• Sometimes it feels like your understanding
grows organically
Slide 43
Exploration process
ExplorationDefinitions
specs/stories
People(& you)
Sources
Require-ments
TestModels
Enquiring
Challenging
Sources:People, documents,experience, system under test
Modelling
Test Models:Can be documented
or mental models
Predicting
System under test
Slide 44
Predicting and Challenging
• When we think we understand a requirement or the system itself
– We can create example scenarios and predict the behaviour of the system using our model
• If we think the requirement (and our model) is wrong, we can use selected examples to challenge our sources
– “If this happens, and that happens, then the system will behave this way…”
– “Is that correct?”
Slide 45
The first test application
• A tiny test application to get you in the mood…
• http://dev.sp.qa/workshop
• There are two test applications on the menu
– Example 1
– Example 2
• Let’s look at “Example 1”
Slide 46
Exercise
• DON’T CLICK SUBMIT YET!
• Explore the form that has just the input field and submit button and work out the following:
– What does the form do?
– Does “it work”?
• As you proceed, jot down the heuristic you are trying to test or refine
• Watch it evolve
• You can only click “Submit” twenty times
• What is your first guess?
Slide 47
Exercise 1 review
• What do you think the form does?
• How many tests to reach a conclusion?
• Did you use ‘spare’ tests to confirm your beliefs?
• Did you hit ‘dead ends’?
• Did you evolve your guess after every test?
• Did it take several tests to refute your guess?
• Did you feel you were getting closer all the time or
that you weren’t making progress?
Slide 48
Exploration process revisited
• Enquiring, Challenging, Modelling and Predicting against the system under test:
1. We ask questions of the system to see what happens. (We don't know yet)
2. We build a mental model of the system from our findings
3. We use our model to make predictions (The system should do…)
4. We compare the behaviour of the system with our predictions. (Is our model correct? Is our source correct?)
Slide 49
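The four steps can be sketched as a loop. Here the system under test is stood in for by a hidden rule (an invented example, not the workshop application), and each candidate model is challenged against the observations:

```python
# A toy enquire-model-predict-compare loop. The "system under test" is a
# hidden rule we pretend not to know; each model is a tester's conjecture.
def system_under_test(x):
    return x * 2 if x >= 0 else 0   # hidden rule (hypothetical stand-in)

def explore(probes, candidate_models):
    observations = {x: system_under_test(x) for x in probes}   # 1. enquire
    for name, model in candidate_models.items():               # 2. model
        predictions = {x: model(x) for x in probes}            # 3. predict
        if predictions == observations:                        # 4. compare
            return name        # model survives the challenge
    return None                # all models refuted: explore further

models = {
    "doubles": lambda x: x * 2,
    "doubles, floors negatives at 0": lambda x: x * 2 if x >= 0 else 0,
}
```

Probing only positive values (`explore([1, 2, 3], models)`) cannot tell the two conjectures apart; only a probe like `-1` refutes the simpler model, which is exactly the point of challenging, not just confirming.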
You have explored and perhaps
you know how it works
But have you tested it?
Now, the testing begins, doesn't it?
Explore this and describe it?
You explore, and know what it is
"But I wanted Chicken!"
Slide 51
Exploring a system as the only source
of knowledge
• Firstly – it's impossible to ignore:
– You have your own knowledge and experience
– You know of conventions, standards
– You were asked to test for a reason, weren't you?
– In your projects is there NOTHING WRITTEN? (No!)
• If you ignore these sources…
– The system does what you have seen it do
– Your model (a reverse-engineered spec/requirements) is based on this evidence
– So how could you ever find a bug?
Slide 52
What if our source of knowledge is
mostly the system?
• We use "Exploratory testing" when there are
few or no trusted sources
• The system under test becomes our main
source of knowledge
• In the model, our source of knowledge and
system under test merge into one...
Slide 53
"Exploratory" testing
• The Exploration activities (Enquiring, Challenging, Modelling, Predicting) and the Testing activities (Informing, Applying, Interpreting, Refining) now work directly on the System Under Test
• Test models still drive the testing; the outcomes are still to revise the system and to report
Slide 54
But... is this a purely theoretical
situation?
Explore this?
Slide 55
Exploration of (complex) systems
• We normally have multiple sources of knowledge
– Stakeholders, users, customers, advisors
– Designers, architects, developers
– Our own experience and knowledge, as testers
– and we *may* have the system too
• We use all our sources, select, and use critical thinking to derive:
– What is required
– Oracles (how it is expected to behave)
• The need and oracles come from 'any of the above' except the system itself.
Slide 56
The second test application
• http://dev.sp.qa/workshop
• Let’s look at “Example 2”
• Try working in PAIRs?
– Talk to each other
1. Both think, have ideas and share
2. One types, the other observes (and vice versa)
3. “Are we finished?” (Or go to 1).
Slide 57
“Freestyle” Exploratory Testing
This is the ‘popular’ style of
exploratory testing
What is exploratory testing (ET)?
• ET is characterised by:
– Interactivity – very much a hands-on activity
– Concurrency – think, test, learn, test, think…
– Creativity – you use your own imagination
– A drive towards fast results
– Reduced or no dependency on written test
scripts.
Slide 59
Definition of ET
• Cem Kaner (who coined the term ET):
“Exploratory software testing is a style of software
testing that emphasizes the personal freedom and
responsibility of the individual tester to continually
optimize the value of her work by treating test-related
learning, test design, test execution, and test result
interpretation as mutually supportive activities that run
in parallel throughout the project.”
Slide 60
Scripted testing
What’s good?
•
•
•
•
•
What’s bad?
•
•
•
•
•
Slide 61
Exploratory (unscripted)
What’s good?
•
•
•
•
•
What’s bad?
•
•
•
•
•
Slide 62
ET Overview
• Most testers do exploratory testing at one time or another – it’s just not visible to outsiders
• Has a bad reputation in some circles
– Sloppy, unprofessional, ineffective, unsystematic
– But is gaining credibility as an agile practice
• “Test design and test execution” at the same time
– Some people call it the opposite of scripted testing
– I think it’s the same except things happen faster; less gets written down
• For a fuller description of the pros/cons of ‘freestyle’ exploratory testing, see Cem Kaner:
http://testingeducation.org/BBST/exploratory/BBSTExploring.pdf
Slide 63
Do the Test
https://www.youtube.com/watch?v=Ahg6qcgoay4
Another? https://www.youtube.com/watch?v=8-hapS2SPz4
Slide 64
Testing Styles, Approaches
“Structured/Waterfall/Staged” Testing
• Systematic
• Transparent, documented
• Reviewable
• Auditable
• Repeatable, measurable
• Automatable
• Inflexible, not responsive
• Obsolescent/inaccurate documentation
• Prone to biases, inattention
• Outdated process
• Expensive, inefficient
• Unimaginative, boring
“Exploratory” Testing
• Agile (with a small ‘a’)
• Improvised, imaginative
• Flexible and responsive to change
• Faster, cheaper
• More effective
• Personally enjoyable
• Not repeatable
• Not easily automated
• Little or no documentation
• Hard to manage
• Hard to scale
• Opaque
• Not auditable, measurable.
Slide 65
Some of these are perceptions – which vary
Why do exploratory testing?
• Scripts document tester’s ideas and there is a
lot of value in that way of testing
• But scripts can disrupt the thinking required
to find important problems quickly
• If testing is intellectually rich and fluid, we will
select better tests
• The richness of ET is only limited by our
imagination and emerging insights into the SUT.
Slide 66
Questioning strategies
• A key aspect of exploration is the way we
learn by asking questions
• Every test asks a question
– What happens if…?
– Can the product do this?
– Will this make it fail?
• Every answer increases our understanding of
the product’s capabilities.
Slide 67
Questioning strategies 2
• Act like a newspaper reporter
– The five questions a good story answers
– Who? What? Why? When? How?
• Edward de Bono – lateral thinking and other
thinking strategies
• Tony Buzan – Mindmaps etc.
• Lots of thinking and enquiry tools – none are
designed for ET, but might still be useful.
Slide 68
5 questions
• Who is the user of this feature?
• What are they trying to do?
• Why are they doing this (what goal)?
• When do they do this?
• How are they actually using this feature?
Slide 69
Use of heuristics (cheat sheets)
http://testobsessed.com/wp-content/uploads/2011/04/testheuristicscheatsheetv1.pdf
Slide 70
http://karennicolejohnson.com/wp-content/uploads/2012/11/KNJohnson-2012-heuristics-
mnemonics.pdf
What skills does an ETester need?
• You need creativity, memory of past events and intuition
• (Of all things!) Compare with improvisational acting
• When it’s your turn to ‘act’…
– You need a memory of the story to date
– Need to “go for it” and use your intuition
– You need to be able to express your creativity
• But don’t forget – asking the user/dev might save you a LOT of wasted exploration time.
Slide 71
ET is unpredictable – of course
• With improvisation, you can’t predict the future
direction of the ‘scene’
• We use “charters” to define the scope of testing
– A test plan for 1-2 hours testing (no more)
– Sets out the scope (e.g. the ‘screen validation’, ‘error
handling’, ‘ease of use’)
– Describes coverage objective, budget, techniques,
particular risks.
Slide 72
Testing (not just exploring) in pairs
• It is not:
– One person thinking, one person typing
– One dominating, one subservient
– Being joined at the hip
• It is:
– Continuous communication about the task in hand
– Diffusion of knowledge throughout a team as teams merge, break, reshuffle
– A team effort – and two heads are better than one.
Slide 73
Benefits of working in pairs
• One person takes the ‘strategic’ view
• One continuously reviews the other’s work
• Less likely to skip a step (through fatigue, laziness)
• To achieve common understanding, both must reach same high level of understanding
• Senior partner coaches junior
• Junior partner keeps the ideas coming
• Experience evens up quickly
• One partner keeps the other on track.
• But…. It’s not for everyone.
Slide 74
Exploratory Testing Sessions
https://qastuff.wordpress.com/2012/08/17/what-is-exploratory-testing-and-how-is-it-managed/
Useful MindMaps on Session-Based Testing
Slide 76
http://www.slideshare.net/codecentric/exploratory-testing-inagileoverviewmeettheexpertselisabethhendrickson
Slide 77
http://www.slideshare.net/codecentric/exploratory-testing-inagileoverviewmeettheexpertselisabethhendrickson
Introducing the Test Session
• Charter
• Time Box
• Coverage
• Reviewable Result
• Debriefing
• We write a Charter rather than test scripts
• Like a meeting agenda – a guide, not a script.
Slide 78
Google “exploratory test charter template” – there are lots
Charter: A clear mission for the
(timeboxed) session
• Charter may suggest what should be tested, how it should be tested, and what problems to look for
• Not meant to be a detailed plan – think of it as a ‘strategy for the next hour or so’
• Early charters are quite general:
– “Analyse the Generate TOC Feature”
• Specific charters provide better focus:
– “Create a document with >100 headings of level 1-5. Focus on the TOC options: level of detail, style, page numbering etc. We are concerned that these options behave inconsistently in the generated TOC.”
Slide 79
Time Box: Focused test effort of fixed
duration
• Between 45 and 120 minutes
– Brief enough for accurate reporting
– Brief enough to allow flexible scheduling
– Brief enough to allow course correction
– Long enough to get some testing done
– Long enough for efficient debriefings
– Beware of overly precise timing
• There is no ‘law’ but two hours is a long time to concentrate.
Slide 80
Coverage: specifying coverage areas
• Coverage areas can include anything
– Functional areas of the product
– Test configuration (platforms, browsers etc.)
– Structural (programs, modules, lines of code etc.)
– Test strategies (boundaries, attack types, random etc.)
– System configuration parameters
– Specific failure modes (the risks of most concern)
– Specific tasks to be supported by the product
• Coverage items can just be a list
• Lists are models too, don’t forget.
Slide 81
Charter outline
• Charter mission:
• Mission Time:
• Areas to explore/coverage model:
• Specific areas of concern:
• Approach to use:
– Bug types to be investigated?
– Extreme values?
– Sneaky situations?
– Anything else?
– Etc. etc…
Slide 82
Create a charter (as a pair)
Word customised number list dialog: Testing Charter
Explore area/feature
[ with resources, conditions or constraints ]
to discover information
Session Report provides this kind of output
• What was your session goal? (Charter)
(Charter)
• What was in scope:
– Features, versions
– O/S/version
– Browser/version/environment
• Start time and duration
• Where did you spend your time?
– Asking questions
– Exploring system
– Testing
– Bug investigation/reporting
– Setup and other hassles
– Opportunities/other activity
• What did you discover?
– What did you test?
– What works?
– Risks investigated
– Bugs reported
• Wrap up
– Outstanding questions?
– Risks to check/tests to run?
– Other observations?
– Ideas for new tests
– Do you need another session?
– Recommended action (close, repeat, extend…)
Slide 85
Session work breakdown
• All work splits into:
– Non-session work (interruptions?)
– Session work
• Session work splits into:
– Opportunity work
– On-charter work
• On-charter work includes: exploring, testing, bug investigation, setup, and questions/conversations
Slide 86
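A session report often quantifies this breakdown as percentages of the timebox. A hypothetical sketch of the arithmetic; the activity names follow the diagram, and the minutes are invented:

```python
# Session-breakdown arithmetic: share of recorded time per activity,
# in the spirit of session-based test management reporting.
def session_breakdown(minutes):
    """Convert per-activity minutes into whole-number percentages."""
    total = sum(minutes.values())
    return {activity: round(100 * m / total) for activity, m in minutes.items()}

report = session_breakdown({
    "test": 40, "explore": 20, "bug": 15, "setup": 10,
    "questions": 5, "opportunity": 10, "non-session": 20,
})
```

A debrief can then ask, for instance, why only a third of the time was spent actually testing, and whether setup hassles or interruptions ate the rest.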
Testing Triangles
Triangle Testing
Slide 88
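The triangle app itself isn't reproduced here, but a classifier like the sketch below is the kind of rule you would be reverse-engineering during the session (an assumption for illustration, not the actual workshop code):

```python
def classify_triangle(a, b, c):
    """Classify three side lengths: a classic exploratory-testing target."""
    if min(a, b, c) <= 0:
        return "not a triangle"        # sides must be positive
    x, y, z = sorted((a, b, c))
    if x + y <= z:
        return "not a triangle"        # fails the triangle inequality
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Charter-worthy test ideas: the boundary of the inequality (x + y == z),
# permutations of the same sides, zero, negative and very large values.
```

Degenerate ("flat") triangles where two sides exactly sum to the third are the classic boundary that implementations get wrong.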
Test Charter
Charter: Testing Triangles
Time Box: ____ minutes
Coverage:
•
•
•
•
Approach:
Environment: http://dev.sp.qa/workshop/triangle
Tester:
Start/end: Start __:__:__ End __:__:__
Notes:
Slide 89
Introduction to
Exploratory Testing
Paul Gerrard
[email protected]
Twitter: @paul_gerrard
Web: gerrardconsulting.com
Slide 90