Exploratory testing and the mobile tester : A presentation by Jon Hagar
Attack-based Exploratory Testing and The Mobile Tester
(with bonus round on ISO 29119)
Jon D. Hagar, Consultant, Grand Software Testing
Author: Software Test Attacks to Break Mobile and Embedded Devices
Copyright 2015, Jon D. Hagar, Grand Software Testing, LLC – “Software Test Attacks to Break Mobile and Embedded Devices”
18th June 2015 – http://www.gallop.net/meetups
Gaming Testing Story
It only takes a few minutes using an App before users like or hate it
Worse than that. . . Many users will post a social media review of the app
You don’t want to be a BAD
The Mobile Opportunity
Depth
Passion
Speed
What Does it Take to be a Great Mobile App Tester?
As the names imply, these are devices—small, held in the hand, connected to communication networks, including
Cell and smart phones – apps
Tablets
Medical devices
Typically have:
Many of the problems of classic embedded systems
The power of PCs/IT
More user interface (UI) than classic embedded systems
Fast and frequent updates
However, mobile devices are “evolving” with more power, resources, apps, etc.
Mobile is the “hot” area of computers/software
Testing rules and concepts are still evolving
Now starting to include IoT
You know what they are, right? Mobile and Handheld?
Definitions
Industry Error Trends Taxonomy
Developer Attacks
Basic Attacks for the Tester
The Big “Scary” Security Attacks
ISO 29119
Summary
Agenda
Test – the act of conducting experiments on something to determine the quality and to provide information to stakeholders
Many methods, techniques, approaches, levels, context
Considerations: input, environment, output, instrumentation
Quality(ies) – value to someone (that they will pay for)
Functions
Non-functional
It “works”
Does no harm
Are there (critical) bugs?
Basic Definitions
From Wikipedia:
Taxonomy is the practice and science of classification. The word finds its roots in the Greek τάξις, taxis (meaning 'order', 'arrangement') and νόμος, nomos ('law' or 'science'). Taxonomy uses taxonomic units, known as taxa (singular: taxon). In addition, the word is also used as a count noun: a taxonomy, or taxonomic scheme, is a particular classification ("the taxonomy of ..."), arranged in a hierarchical structure.
Helping to “understand and know”
Seeing the Eyes of the Enemy
Taxonomy (researched)
Super Category: Aero-Space / Med sys / Mobile / General
Time: 3 / 2 / 3
Interrupted - Saturation (over time): 5.5
Time Boundary – failure resulting from incompatible system time formats or values: 0.5 / 1
Time - Race Conditions: 3 / 1
Time - Long run usages: 4 / 1 / 20
Interrupt - timing or priority inversions: 0.7 / 3
Date(s) wrong/cause problem: 0.5 / 1
Clocks: 4 / 2
Computation - Flow: 6 / 23 / 19
Computation - on data: 4 / 1 / 3 / 1
Taxonomy part 2
Super Category: Aero-Space / Med sys / Mobile / General
Data (wrong data loaded or used): 4 / 5.00 / 2
Initialization: 6 / 2.00 / 3 / 5
Pointers: 8 / 2.00 / 18 / 10
Logic and/or control law ordering: 8 / 43 / 3 / 30
Loop control – Recursion: 1
Decision point (if test structure): 0.5 / 1 / 1
Logically Impossible & dead code: 0.7
Operating system – (lack of fault tolerance, interface to OS, other): 1.5 / 2 / 6
Software - Hardware interfaces: 16 / 13
Software - Software Interface: 5 / 2.00 / 3
Software - Bad command – problem on server: 3 / 5
UI - User/operator interface: 4 / 5.00 / 20 / 10
UI - Bad Alarm: 0.5 / 3
UI - Training – system fault resulting from improper training: 3
Other: 10.6 / 9.00 / 5 / 5
Note: one report on C/C++ indicated 70% of errors found involved pointers
A pattern (of testing) based on a common mode of failure seen over and over
Part of Exploratory Testing
May be seen as a negative, when it really is a positive
Goes after the “bugs” that may be in the software
May include or use classic test techniques and test concepts
Lee Copeland’s book on test design; many other good books
A pattern (more than a process) which must be modified for the context at hand to do the testing
Testers learn mental attack patterns working over the years in a specific domain
Attack-based Testing – What is an attack?
Attacks (from Software Test Attacks to Break Mobile and Embedded Devices)
Attack 1: Static Code Analysis
Attack 2: Finding White–Box Data Computation Bugs
Attack 3: White–Box Structural Logic Flow Coverage
Attack 4: Finding Hardware–System Unhandled Uses in Software
Attack 5: Hw-Sw and Sw-Hw signal Interface Bugs
Attack 6: Long Duration Control Attack Runs
Attack 7: Breaking Software Logic and/or Control Laws
Attack 8: Forcing the Unusual Bug Cases
Attack 9: Breaking Software with Hardware and System Operations
9.1 Sub–Attack: Breaking Battery Power
Attack 10: Finding Bugs in Hardware–Software Communications
Attack 11: Breaking Software Error Recovery
Attack 12: Interface and Integration Testing
12.1 Sub–Attack: Configuration Integration Evaluation
Attack 13: Finding Problems in Software–System Fault Tolerance
Attack 14: Breaking Digital Software Communications
Attack 15: Finding Bugs in the Data
Attack 16: Bugs in System–Software Computation
Attack 17: Using Simulation and Stimulation to Drive Software Attacks
Attack 18: Bugs in Timing Interrupts and Priority Inversion
Attack 19: Finding Time Related Bugs
Attack 20: Time Related Scenarios, Stories and Tours
Attack 21: Performance Testing Introduction
Attack 22: Finding Supporting (User) Documentation Problems
Sub–Attack 22.1: Confirming Install–ability
Attack 23: Finding Missing or Wrong Alarms
Attack 24: Finding Bugs in Help Files
Attack 25: Finding Bugs in Apps
Attack 26: Testing Mobile and Embedded Games
Attack 27: Attacking App–Cloud Dependencies
Attack 28: Penetration Attack Test
Attack 28.1: Penetration Sub–Attacks: Authentication — Password Attack
Attack 28.2: Sub–Attack Fuzz Test
Attack 29: Information Theft—Stealing Device Data
Attack 29.1: Sub–Attack – Identity Social Engineering
Attack 30: Spoofing Attacks
Attack 30.1: Location and/or User Profile Spoof Sub–Attack
Attack 30.2: GPS Spoof Sub–Attack
Attack 31: Attacking Viruses on the Run in Factories or PLCs
Attack 32: Using Combinatorial Tests
Attack 33: Attacking Functional Bugs
1: Developer Attacks for Mobile and IoT
Three of many
Attack 1: Static Code Analysis (testing)
When to apply this attack? After/during coding
What faults make this attack successful? Many Example: Issues with pointers
Who conducts this attack? Developer, tester, independent party
Where is this attack conducted? Tool/test lab
How to determine if the attack exposes failures? Review warning messages and find true bugs
How to conduct this attack
Obtain and run tool
Find and eliminate false positives
Identify and address real bugs
Repeat as code evolves
Single unit/object
Class/Group
Component
Full system
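The attack loop above (obtain a tool, run it, triage warnings, fix true bugs) can be sketched with a toy static checker. This is not a production analyzer; it is a minimal illustration using Python's stdlib `ast` module to flag one classic defect, mutable default arguments, and the sample code it scans is invented for the example:

```python
import ast

def find_mutable_defaults(source: str) -> list:
    """Flag function definitions that use mutable default arguments."""
    warnings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            for default in node.args.defaults:
                # A mutable literal default is shared across calls -- a classic bug.
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    warnings.append(node.lineno)
    return warnings

code = """
def ok(x, y=0):
    return x + y

def risky(item, bucket=[]):
    bucket.append(item)
    return bucket
"""
print(find_mutable_defaults(code))  # [5]
```

A real static analysis attack would use an industrial tool, but the triage step is the same: each reported line must be classified as a false positive or a true bug before repeating as the code evolves.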
Attack 2: Finding White–Box Data Computation Bugs
When to apply this attack? After/during coding
What faults make this attack successful? Mistakes associated with data Example: Wrong value of Pi
Who conducts this attack? Developer, tester, independent party
Where is this attack conducted? Development Tool/test lab
How to determine if the attack exposes failures? Structural-data test success criteria not met
How to conduct this attack
Obtain tool
Determine criteria and coverage
Create test automation with specific values (really a programming problem)
NOT NICE NUMBERS
Run automated test cases
Resolve failures
Peer check test cases
Repeat as code evolves
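A hedged illustration of the “NOT NICE NUMBERS” advice: the function below and its truncated-pi defect are invented for this sketch. A friendly input slips past a loose tolerance check, while a large magnitude exposes the data computation bug:

```python
import math

def circle_area(radius, pi=3.1416):  # truncated constant -- the seeded bug
    return pi * radius * radius

# A "nice" input hides the bug behind a loose tolerance...
assert math.isclose(circle_area(1.0), math.pi, rel_tol=1e-4)

# ...while a "not nice" magnitude exposes it: the absolute error grows
# with radius squared, so the truncated pi is off by millions of units.
r = 1.0e6
error = abs(circle_area(r) - math.pi * r * r)
assert error > 1.0
print(f"absolute error at r={r:g}: {error:.0f}")
```

The design point is the one on the slide: pick test values that stress precision and magnitude, not round numbers that make wrong constants and lossy conversions look correct.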
2: Tester Basic Attacks – What is Missing, Usability, Alarms
Sampling of where to start Exploratory Testing
Attack 4: Finding Hardware–System Unhandled User Cases
When to apply this attack? Starting at system-software analysis
What faults make this attack successful? Lack of understanding of the world Example: Car braking on ice
Who conducts this attack? Developer, tester, analyst
Where is this attack conducted? Environments, simulations, field
How to determine if the attack exposes failures? An unhandled condition exists (Note: data explosion problem)
How to conduct this attack
Knowledge
Out-of-box thinking
Operation Concepts
Analysis
Modeling
Lab testing
Field testing
Feedback
Repeat
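The “car braking on ice” example can be sketched as random exploration of environment states against a toy model. The model, values, and sampling scheme are hypothetical, not from the book; the point is that sampling the environment space turns up the condition the software never handled:

```python
import random

def stopping_distance(speed_mps, friction):
    # Simplified model: assumes friction > 0 and never checks it.
    return speed_mps ** 2 / (2 * friction * 9.81)

def explore(trials=1000, seed=1):
    """Randomly sample environment states; record unhandled crashes."""
    rng = random.Random(seed)
    unhandled = []
    for _ in range(trials):
        speed = rng.uniform(0.0, 60.0)               # m/s
        friction = rng.choice([0.0, 0.1, 0.7, 0.9])  # 0.0 ~ glare ice
        try:
            stopping_distance(speed, friction)
        except ZeroDivisionError:
            unhandled.append((round(speed, 1), friction))
    return unhandled

cases = explore()
print(f"{len(cases)} unhandled environment states (all on ice)")
```

Note the data explosion problem from the slide: real environment spaces have far too many dimensions to sample exhaustively, which is why knowledge, operational concepts, and out-of-box thinking guide where to sample.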
When to apply this attack? …when your app/device has a user
What faults make this attack successful? …devices are increasingly complex
Who conducts this attack? …see chart on Roles
Where is this attack conducted? …throughout lifecycle and in user’s environments
How to determine if the attack exposes failures? Unhappy “users”
Bugs found
See sample checklist
Jean Ann Harrison Copyright 2013
Attack: Testing Usability – Mobile/IoT Usability Tends to be “Poor”
Refine checklist to context scope
Define a role
Watch what is happening with this role
Define a usage (many different user roles)
Guided explorations or ad hoc
Stress, unusual cases, explore options
Capture understanding, risk, observations, etc.
Checklist (watch for confusion of the tester)
Run Exploratory Attack(s)
Learn
Re-plan-design
Watch for bias
Switch testers
Repeat
Usability Attack Pattern
3: IoT and Mobile Security Attacks
And Now for Something Completely Different
Well, At Least A Very Scary (Not Silly) Walk
Fraud – Identity
Worms, virus, etc.
Fault injection
Processing on the run
Hacks impact
Power
Memory
CPU usage
Mobile Security Concerns
• Eavesdropping – yes, everyone can hear you
• Hijacking
• Click-jacking
• Voice/Screen
• Physical Hacks
• File snooping
• Lost phone
Mobile systems are highly integrated hardware–software–system solutions which:
Must be highly trustworthy, since they handle sensitive data
Often perform critical tasks
Security holes and problems abound
Coverity Scan 2010 Open Source Integrity Report – Android
Static analysis test attack found 0.47 defects per 1,000 SLOC
359 defects in total, 88 of which were considered “high risk” in the security domain
Android OS hole exploited via Angry Birds – researchers Jon Oberheide and Zach Lanier
Robots and Drones rumored to be attacked
Cars and medical devices being hacked
Stuxnet Virus and its family
The Current Security Situation
Apply when the device is mobile and has:
Account numbers
User-ids and passwords
Location tags
Restricted data
Current authentication approaches in use on mobile devices:
Server-based
Registry (user/password)
Location or device-based
Profile-based
Security Attacks
Attack 28: Penetration Attack Test
Attack 28.1 Penetration Sub–Attacks: Authentication — Password
Attack 28.2 Sub–Attack Fuzz Test
Attack 29: Information Theft—Stealing Device Data
Attack 29.1 Sub Attack –Identity Social Engineering
Attack 30: Spoofing Attacks
Attack 30.1 Location and/or User Profile Spoof Sub–Attack
Attack 30.2 GPS Spoof Sub–Attack
Security Attacks (only a starting point checklist of things to do)
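As a sandbox-only illustration of Attack 28.2 (fuzz testing), the toy parser below is invented for this sketch. The fuzzer feeds random byte strings to a local function and records every unhandled exception, which is the lab-safe way to practice the technique before it is ever pointed, with permission, at a real target:

```python
import random

def parse_record(raw: bytes):
    """Toy parser under test: expects ASCII 'name:age'."""
    name, age = raw.decode("ascii").split(":")
    return name, int(age)

def fuzz(target, runs=2000, seed=42):
    """Feed random byte strings to the target; collect crash types."""
    rng = random.Random(seed)
    crashes = {}
    for _ in range(runs):
        raw = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 12)))
        try:
            target(raw)
        except Exception as exc:
            # Any unhandled exception is a finding; keep one sample input each.
            crashes.setdefault(type(exc).__name__, raw)
    return crashes

found = fuzz(parse_record)
print(sorted(found))
```

Each crash type plus its sample input becomes a bug report; production fuzzers add coverage feedback and input mutation, but the triage loop is the same.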
Security attacks must be done with the knowledge and approval of the owners of the system and software
Severe legal implications exist in this area
Many of these attacks must be done in a lab (sandbox)
In these attacks, I tell you conceptually how to “drive a car very fast (150 miles an hour), but there are places to do this with a car legally (a race track) and places where you will get a ticket (most public streets)”
Be forewarned – do not attack your favorite app on your phone or any connected server without the right permissions, due to legal implications
Warnings when Conducting Security Attacks
Bonus Round: ISO 29119
(and how it may impact mobile and testing)
Motivation for ISO 29119
Conflicts in definitions, processes and procedures
“One ring to rule them all” — many standards to be replaced by one, e.g., IEEE 829, IEEE 1008, BS7925-1/-2, IEEE 1028
Users do not know which standard to follow
Lacking in current standards, or incomplete:
Organizational areas, e.g., Test Policy and Organizational Test Strategy
Project Test Management
BS7925 only covers unit testing
General processes
Common functional techniques missing
Coverage of non-functional testing
ISO/IEC 29119 – Structure and Flow
Part 1: Concepts & Vocabulary (draws on BS7925-1)
Part 2: Processes (draws on IEEE 829, IEEE 1008)
Part 3: Documentation (draws on IEEE 829)
Part 4: Testing Techniques (draws on BS7925-2)
Part 5: Keyword-Driven Testing
Process assessment: ISO/IEC 33063
Related standards: ISO 12207, ISO 15288, ISO/IEC Directives
Thanks to Stuart Reid
Part 1: Concepts & Vocabulary
SOFTWARE TESTING CONCEPTS
Scope, Conformance, Normative References
TESTING IN DIFFERENT LIFE CYCLE MODELS
ROLES AND RESPONSIBILITIES IN TESTING
ANNEXES – Metrics, Examples, Bibliography
DEFINITIONS
Test: Approach, Basis, Methods – Risk-Based Testing
Part 2: Testing Processes
TEST MANAGEMENT PROCESSES
ORGANIZATIONAL TEST PROCESS
DYNAMIC TEST PROCESSES
Instantiating Testing Processes
Ref: S. Reid
Understand Context (Scope)
Organize Test Plan Development
Identify & Estimate Risks (Analyzed Risks)
Identify Risk Treatment Approaches
Design Test Strategy
Determine Staffing and Scheduling (Schedule, Staffing Profile)
Document Test Plan (Draft Test Plan)
Gain Consensus on Test Plan
Publish Test Plan (Approved Test Plan)
Test Planning Processes
Part 3 – Test Documentation
TEST DOCUMENTATION
ANNEXES - EXAMPLES
Scope, Conformance, Normative References
Select a subset of docs
Part 3: Test Documentation
Organizational test documentation
Test policy
Test strategy
Project test documentation
Project test plan
Test project completion report
Test Level documentation
Test plan
Test specification
Test results
Anomaly reports
Level test status report
Test environment report
Test level completion report
Appendices
Examples of documents at each level of testing
Part 4 – Test Techniques
TEST COVERAGE MEASUREMENT
Scope, Conformance, Normative References
ANNEX – TESTING OF QUALITY CHARACTERISTICS
ANNEX – SELECTION OF TECHNIQUES
ANNEX – TEST TECHNIQUE EFFECTIVENESS
TEST DESIGN TECHNIQUES
Functional / Structural
Part 5- Keyword-Driven Testing
Part 5 will address:
Concept
Applicability
Interfaces
Approach
The Part 5 working draft was sent out in May, with the next draft due in Nov 2013
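The keyword-driven approach that Part 5 addresses can be illustrated with a minimal sketch. The keywords, functions, and step table below are invented for this example; the idea is only that a plain table of (keyword, argument) rows is executed against a keyword-to-function map, so non-programmers can write tests in the vocabulary while programmers maintain the implementations:

```python
def launch_app(state, name):
    state["log"].append(f"launched {name}")

def tap(state, widget):
    state["log"].append(f"tapped {widget}")

def verify_screen(state, expected):
    # A real framework would query the UI; this sketch checks the log.
    assert state["log"], "nothing has happened yet"
    state["log"].append(f"verified {expected}")

# The keyword vocabulary: plain names mapped to implementations.
KEYWORDS = {"launch": launch_app, "tap": tap, "verify": verify_screen}

def run_table(table):
    """Execute a keyword table: each row is (keyword, argument)."""
    state = {"log": []}
    for keyword, arg in table:
        KEYWORDS[keyword](state, arg)
    return state["log"]

steps = [("launch", "camera"), ("tap", "shutter"), ("verify", "photo saved")]
print(run_table(steps))  # ['launched camera', 'tapped shutter', 'verified photo saved']
```

Frameworks in this style separate the three concerns Part 5 names: the concept (the vocabulary), the interfaces (the map), and the approach (executing tables).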
Impact of 29119 on Mobile/Smart
Maybe not much EXCEPT
Some domains are more regulated, such as: safety-related
telecoms
financial – banks, stock markets, etc.
Impact to business:
International contracting
Assessment
Legal
Process improvement
Large company to company (trading language)
Do Testers Need Standards? – Yes, maybe, but…
Standards support common communication within the topic
Common reference points
Starting point for research, usage (pro & con), critique
Maturity is an issue, but a baseline serves as a sounding board and common reference point for “scientific” methods
An international benchmark
Thinkers and researchers can prove/disprove benchmark(s)
Part of being in a profession (but only part)
These attacks are presented at a summary level only
Much more detail and effort are needed
Understanding your local context and error patterns is important
(one size does NOT fit all)
Attacks are patterns…you still must THINK and tailor
ISO may help, but only in a few contexts
Wrap Up of this Session
James Whittaker (attacks)
Elisabeth Hendrickson (simulations)
Lee Copeland (techniques)
Brian Merrick (testing)
James Bach (exploratory and tours)
Cem Kaner (test thinking)
Jean Ann Harrison (her thinking and help)
ISO 29119 standards and working group 26
Many teachers, generations past and future
Books, references, and so on
Notes: Thank You (ideas used from)
“Software Test Attacks to Break Mobile and Embedded Devices”
– Jon Hagar
“How to Break Software” – James Whittaker, 2003 (and his other “How To Break…” books)
“A Practitioner’s Guide to Software Test Design” – Copeland, 2004
“A Practitioner’s Handbook for Real-Time Analysis” – Klein et al., 1993
“Computer Related Risks” – Neumann, 1995
“Safeware: System Safety and Computers” – Leveson, 1995
Honorable mentions:
“Systems Testing with an Attitude” – Petschenik, 2005
“Software System Testing and Quality Assurance” – Beizer, 1987
“Testing Computer Software” – Kaner et al., 1988
“Systematic Software Testing” – Craig & Jaskiel, 2001
“Managing the Testing Process” – Black, 2002
Book Notes List (my favorites)
• www.stickyminds.com – Collection of test info
• www.embedded.com – info on attacks
• www.sqaforums.com – Mobile Devices, Mobile Apps, Embedded Systems Testing forum
• Association for Software Testing – BBST Classes http://www.testingeducation.org/BBST/
• Your favorite search engine
• Our web sites and blogs (listed on front page)
More Resources