Testing Imperatives in the World of Agile Development and Continuous Delivery

Huw Price, VP, CA Technologies, Inc.
Jonathon Wright, Director, Hitachi Consulting

28/07/2015

Featuring

Huw Price, VP, CA Technologies, Inc.
Jonathon Wright, Director, Hitachi Consulting

Pre-webinar survey – question 1 (112 responses)

What are the main software challenges you are facing? Select all that apply.

• Time/resources spent on test data compliance (PII)
• Defects stemming from ambiguous requirements
• Testing inefficiencies leading to higher cost
• Lack of test coverage creating defects/rework
• Difficulty finding the right data for a particular test
• Manual testing leading to project delays

(Bar chart, 0–70% scale.)

Pre-webinar survey – question 2 (97 responses)

In what ways are you using, or considering using, Test Data Management and/or Test Case Design technology? Select all that apply.

• Test Data Warehouse and Test Data Allocation
• Data Masking and Subsetting
• Synthetic Data Generation
• Requirements Definition
• Test Case Optimization
• Automated Test Case Design
• Test Automation

(Bar chart, 0–70% scale.)

Pre-webinar survey – question 3 (99 responses)

Expected/realized benefits of TDM and TCD:

• Meet data compliance requirements
• Automated creation and execution
• Optimized Test Data Coverage
• Reduced testing cost and complexity
• Accelerated Time to Market
• Improved software quality

(Bar chart, 0–80% scale.)

Jonathon Wright
Director of Digital Engineering, Digital Engineering Practices

Jonathon has over 15 years of international automation experience with a number of global organizations. He contributed to Dorothy Graham and Mark Fewster's best-selling book Experiences of Test Automation: Case Studies of Software Test Automation, and to a number of books on Testing as a Service (TaaS) models, the Advanced UFT 12 for Test Engineers Cookbook, and API testing in the cloud (service and network virtualization).

He has also presented at various international testing conferences, including Gartner (London), STARWest (California), STAREast (Orlando), Fusion (Sydney), ANZTB (Melbourne), EuroSTAR (Gothenburg and Dublin), Unicom (London), and BCS SIGIST (London).

Addressing the ‘Digital Enterprise’ imperative 1

1 Hitachi Consulting, Becoming a Digital Enterprise: Evolution over Revolution, http://www.hitachiconsulting.com/digitalenterprise/

Hitachi helps ‘Metropolize’ the Digital Enterprise

• Smart Cities – smart eco cities, Big Data infrastructure, energy consumption
• Information & Telecommunication Solutions – X as a Service (XaaS) opportunities; IT services, data centre solutions, consulting
• Enterprise IT Solutions – large-scale, highly reliable systems; finger vein authentication
• Future Platforms – hardware and software; the world’s fastest elevator
• Railway Solutions – rolling stock and maintenance, electrical components, signalling/train control systems
• Distribution Solutions
• Nuclear Power Business
• Mining Equipment
• Water & Natural Resources

Journey to the ‘Digital Transformation Age’

(Diagram: disruptors driving digital transformation.)

BiModal ‘Digital Enterprise’ Delivery

Core IT
• Reliable, compliant, secure
• Think price / performance
• Plan and approvals driven
• Long life-cycles

Fluid IT
• Agile and fluid
• Innovation, brand, profit measures
• Minimum viable experience / product
• Think continuous
• Think days, weeks

Digital Transformation bridges the two modes.


Digital Enterprise is not just about “disruptors”

Shift towards digital engineering?

“Technology is no longer the enabler; everything is continuously evolving, and the tools and techniques that worked yesterday may no longer be the correct approach for tomorrow… continuous open innovation, combined with strategic partnerships, enables digital transformation through modern digital engineering practices.” 1

Innovation Projects (Fluid IT) · Differentiation Project (Fluid IT) · Evolution of Core Systems (Core IT)

1 Hitachi Consulting, Becoming a Digital Enterprise: Evolution over Revolution, http://www.hitachiconsulting.com/digitalenterprise/

Continuous Delivery: fail-fast experiments, testing ideas and hypotheses, resulting in Continuous Learning

Ideas & hypotheses – The experiment team assesses the initial proposal and brainstorms ideas on how to realise the vision. Each idea is defined by hypotheses which can then be tested. (Experimentation is prioritised based on the identification of a minimum viable experience.)

Design – An experiment is defined that will test one or more ideas based on their hypotheses.

Build – The assets required to perform the experiment are created.

Measure – The experiment is performed against the criteria defined in the hypothesis.

Learn – The outcomes of the experiment are assessed and the insights gained are used to iterate on the initial ideas and hypotheses and improve the product/service. If required, a decision is made on whether the proposal needs reworking or the initial vision needs to be changed/rejected.

The Journey to Smart / Predictive Testing

(Maturity ladder of “Smart / Predictive” capabilities: Support, Testing, Delivery, Quality, Assessment, Insight, Intelligence, Learning, Improvement, Innovation.)

Traditional testing methods are too slow, manual, and linear for DevOps, compromising speed and quality

(Pipeline: Requirements → Code commit → SCM → CI/Build → Functional testing → Integration testing → UAT → Deploy to production.)

• Users/BAs create ambiguous, incomplete requirements – “garbage in, garbage out”: defects are written into the code
• Test case design and script generation are manual, and there is too much manual execution
• Manual and automated tests pile up in an ad hoc manner and provide poor coverage
• Available test data can only execute 10–20% of the tests that need to be run
• A lack of available data and environments prevents parallel testing and environment provisioning
• Manual and automated testing frameworks cannot react to change: tests have to be checked and updated by hand – “If I make a change here, I break the entire system”
• Defects make it to production, where they create time-consuming, costly late rework

70% of testing is still manual, and testing takes up to half the resources in the SDLC. 1 Meanwhile, 74% of projects run over time, 59% go over budget, and only 69% of desired functionality is actually delivered 2 – software is buggy, does not reflect customer needs, and the customer experience suffers.

1 Bloor, Automated Test Case Generation Report, 2014
2 Standish Group, Chaos Manifesto 2013

Poor requirements

A plethora of requirements techniques exist. They tend to produce ambiguous, incomplete requirements, which in turn create defects: 56% of defects stem from ambiguity in requirements. 1

1 Bender RBT, Requirements Based Testing Process Overview, 2009

Requirements and change requests are usually a “wall of words” or static diagrams

The requirements are “static”: they offer no way to derive tests directly from them, and no way to update tests when the requirements change – this has to be done manually.

Manual Test Case Design

• Currently manual – a time-consuming, error-prone process
• Unsystematic and ad hoc, with no real notion of “coverage”
• Over-testing and under-testing: 10–20% coverage, with 4 times over-testing
• Poor requirements lead to poor overall testing, with testers having to fill in the gaps
• No linkage to test data – the process is manual, painstaking and very time-consuming
• No flexibility for change requests, a critical weakness in an agile or Continuous Delivery environment: changes take longer than the original requirement!

Automated Testing: Manual Script Generation

Automated testing frameworks are heavily scripted, and script generation – as well as maintenance – is usually done manually. Alternative solutions use record/playback or script-less (keyword-driven) automation frameworks, but then you are back to manual test case design. The result is poor testing.

Model Based Testing

Most testing is, frankly: random, unstructured, repetitive, not thorough enough, unmeasurable, unable to keep up, and too slow.

Model based testing lets you define what is supposed to happen, and then test that. Model based testing is:

• Accurate
• Structured
• Thorough
• Measurable
• Able to keep up with change
• Able to measure risk and resources
• Quick to automate

Active Automation

• Test cases and scripts are created automatically from “Active” requirements
• They are executed automatically
• Testing is Model Based
• A Component Library of common and optimized tests is used
• Test Data and Virtual End Points are created or found as part of the automation
• Automating the automation

1) Model requirements as an “Active” Flowchart

A formal model that is accessible to the business, who already use Visio, BPM, etc., and which is also a mathematically precise model of a system, eliminating ambiguity and incompleteness. It can be used by testers and developers, bringing the user, the business and IT into close alignment.

The “Active” Flowchart

Testers can overlay the flowchart with all the functional logic and data involved in a system. Tests can therefore be automatically derived from it.
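To make the idea concrete, here is a minimal sketch, in Python, of what an “active” flowchart amounts to as a data structure: a directed graph whose decision nodes carry the functional logic and whose edges carry conditions, test data attributes and expected results. All class, field and node names below are illustrative assumptions, not CA's implementation.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str = "process"          # "process" or "decision" (logic gate)

@dataclass
class Edge:
    src: str
    dst: str
    condition: str = ""            # e.g. "age >= 18" on a decision branch
    test_data: dict = field(default_factory=dict)
    expected_result: str = ""

@dataclass
class Flowchart:
    nodes: dict                    # name -> Node
    edges: list                    # list of Edge
    start: str
    end: str

    def outgoing(self, name: str) -> list:
        return [e for e in self.edges if e.src == name]

# Example: a fragment of a hypothetical loan-approval requirement.
nodes = {n.name: n for n in [
    Node("start"), Node("check_age", "decision"),
    Node("approve"), Node("reject"), Node("end"),
]}
edges = [
    Edge("start", "check_age"),
    Edge("check_age", "approve", "age >= 18", {"age": 21}, "loan offered"),
    Edge("check_age", "reject", "age < 18", {"age": 16}, "application declined"),
    Edge("approve", "end"), Edge("reject", "end"),
]
model = Flowchart(nodes, edges, "start", "end")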

2) Auto-generate test cases directly from requirements

Test cases can be created automatically, in minutes – not days or weeks. They are optimized, covering 100% of the functionality in the smallest possible number of tests.
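Deriving an optimized suite from such a model can be sketched as path enumeration plus a covering heuristic. Continuing the flowchart sketch above – and again purely illustrative, since the deck does not describe the product's optimization algorithm – this greedily keeps the fewest start-to-end paths that still exercise every edge:

# Assumes an acyclic flowchart in which every edge lies on some
# start-to-end path; each surviving path is one candidate test case.

def all_paths(model, node=None, path=None):
    node = node or model.start
    path = path or []
    if node == model.end:
        yield path
        return
    for edge in model.outgoing(node):
        yield from all_paths(model, edge.dst, path + [edge])

def optimized_tests(model):
    candidates = list(all_paths(model))
    uncovered = {(e.src, e.dst, e.condition) for e in model.edges}
    chosen = []
    while uncovered:
        # pick the path covering the most still-uncovered edges
        best = max(candidates, key=lambda p: len(
            uncovered & {(e.src, e.dst, e.condition) for e in p}))
        chosen.append(best)
        uncovered -= {(e.src, e.dst, e.condition) for e in best}
    return chosen

for i, test in enumerate(optimized_tests(model), 1):
    steps = " -> ".join(e.dst for e in test)
    print(f"Test {i}: start -> {steps}")

Greedy set cover is not guaranteed to be minimal, but it is fast and near-minimal in practice, and it turns coverage and complexity into measurable quantities (edges covered, paths counted) rather than guesses.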

Auto-generate test cases directly from requirements

Complexity and coverage can be measured, and tests can be executed as either manual or automated tests.

3) “Match” Tests Directly to the Right Data and Expected Results (including virtual end-points)

“Match Jobs” can be run automatically, finding data from multiple back-end systems or the Test Data Warehouse, or creating it from scratch when none exists. Data is created from default values, based on the output names, attributes and values defined in the flowchart. Expected results are linked to the logic gates in the flowchart, and are exported along with the test cases.
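As a rough illustration of what a match job does – the function name, signature and row format here are invented for the sketch, not the product's API – the logic is: search every available source for a row satisfying the test's data criteria, and synthesize one from the model's default values when nothing matches.

def run_match_job(test_criteria, data_sources, defaults):
    """test_criteria: dict of column -> predicate, e.g. {"age": lambda v: v >= 18}
    data_sources: iterable of row dicts (back ends + test data warehouse)
    defaults: dict of column -> default value taken from the flowchart"""
    for row in data_sources:
        if all(col in row and pred(row[col]) for col, pred in test_criteria.items()):
            return row                      # found fit-for-purpose data
    # no match anywhere: create it from scratch using model defaults
    synthetic = dict(defaults)
    synthetic["__synthetic__"] = True       # flag so the row can be versioned/tracked
    return synthetic

warehouse = [{"age": 16, "country": "UK"}, {"age": 44, "country": "DE"}]
row = run_match_job({"age": lambda v: v >= 18, "country": lambda v: v == "UK"},
                    warehouse, defaults={"age": 21, "country": "UK"})
print(row)   # no UK adult exists, so a synthetic row is created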

Continuous Delivery driven by Active Automation

A fully automated pipeline, capable of taking an idea from design to deployment at pace, without compromising quality. It is reactive, and can respond to the high velocity and volume of change requests. It places user experience at the start, not the end: CD starts with an idea!

(Pipeline: User/BA design a system → Requirements → Developers write code → Code commit → SCM → CI/Build → Functional testing → Integration testing → UAT → Deploy to production, with change requests feeding in and defects detected along the way.)

The testers validate that the system works as intended by the requirements. The result is high-quality software and a good customer experience.

Journey to Value – teleportation (shift X)

• Shift up ‘minimum viable product’ to prove value to the business
• Shift down ‘minimum viable experience’ for the target customer
• Shift right from ‘ideas & hypotheses’ to ‘fail-fast experiments’
• Shift left from ‘support’, mapping to ‘stories and/or requirements’

(Diagram: a Design → Dev → QA/Test → Pre-Prod → Production pipeline – release plan, design spec, requirements, code commit, SCM, CI/build, functional testing, integration testing, UAT, performance testing, deploy to pre-prod – overlaid with Smart/Predictive and Lean practices; business value and customer experience tracked at every stage; experiments and scenarios, lean vs. waste; test data, test cases and test stubs alongside real users, real data and user cases; maturity ladder: Support, Testing, Delivery, Operations, Maintenance, Learning, Insight, Assessment, Intelligence, Innovation, Improvement.)


Contact Us

Huw Price
VP, Test Data Management, CA Technologies
[email protected]
@datainventor
slideshare.net/CAinc
linkedin.com/company/ca-technologies
ca.com

Jonathon Wright
Director, Hitachi Consulting
[email protected]
@Jonathon_Wright
slideshare.net/Jonathon_Wright
linkedin.com/company/hitachi-consulting
hitachiconsulting.com

Every Company is now a Digital Enterprise

Today, customer demands drive markets, and there is a greater demand for speed and quality than ever before. 1 Organizations need to be able to react to constantly changing customer needs… or be forced out of the market by “digital disruptors”.

1 Diego Lo Giudice, VP, Principal Analyst, Forrester Research, The Forrester Wave™: Functional Test Automation (FTA), Q2 2015: Test Fast To Deliver Fast! (June 25, 2015)

We are now in the “Age of Unicorns” – the $1 billion start-up

Uber – founded in 2009, now valued at $41.2 billion: the world’s biggest taxi company… owns no cars.

Airbnb – founded in 2008, valued at $20 billion: one of the biggest lodgings companies… owns no lodgings.

“Digital disruptors” shake up markets and force companies with traditional models out.

Fortune’s List of Unicorn start-ups (2015), http://fortune.com/unicorns/

What do these companies have in common? Speed and quality.

They deliver constant innovation to customers, providing improved software and services over their competitors, and the better customer experience associated with it. They are fast-moving companies who use Continuous Delivery to accelerate time to market. 1

1 Google, Accelerating your Development Process with Continuous Delivery

Journey to DevOps – parallel data and environments

Eliminate dependencies and constraints to enable parallel work streams:

• A test data warehouse provides instant access to data and removes environmental constraints with realistic virtual data
• Service and message virtualization
• Shift left and work from “active” requirements
• Test Data Manager (profile, model, mask, generate, clone, subset) and Test Case Designer supply test data and test cases to every stage

(Diagram: Design → Dev → QA/Test → Pre-Prod → Production, with dev, test, pre-prod and prod environments set up in parallel; the pipeline runs Release Plan → Design Spec → Requirements → Code commit → SCM → CI/Build → Functional testing → Integration testing → UAT → Performance testing → Deploy to pre-prod → Deploy to production, with User/BA and Product Manager input at the start and customer experience at the end.)

A requirements driven approach to CD – fully automated testing and no bottlenecks

Shift left: remove bottlenecks and find defects in Dev and QA/Test, instead of in Pre-Prod and Production, and automate testing across every phase of the SDLC.

• Data and test cases are linked directly to requirements
• Auto-generate the perfect set of tests directly from requirements (Test Case Designer)
• Create, store and manage test data to execute 100% of tests (Test Data Manager – profile, model, mask, generate, clone, subset – feeding a test data warehouse from production data sources)
• Automatically execute and update tests after change (Javelin / existing automation engines)

The result: quality acceleration.

A requirements driven approach to CD – the shortened feedback loop

“Shift left” the effort of testing into the requirements gathering stage – everything after that “falls out” automatically.

• Model “Active” requirements (User/BA, Product Manager, Test Case Designer)
• Auto-generate tests directly from existing requirements and test components
• Reflect changes made to requirements in tests automatically
• Integration with test and requirement design tools, and automation engines
• Defects detected earlier

(Diagram: Plan Release → Design Spec → Code commit → SCM → CI/Build → Functional testing → Integration testing → UAT → Performance testing → Deploy to pre-prod → Deploy to production; end-to-end automation across Dev, QA/Test and Pre-Prod – repository manager, SCM, CI, test, container, config and cloud provisioning, plus APM – with change requests feeding back in.)

“Shift left” the effort of testing into the requirements gathering stage, so that test/dev assets automatically “fall out” from it.

By taking a requirements driven, model based approach, data, expected results, and manual and automated tests are all traceable to user requirements. They can then be auto-updated when the requirements change – testing is fully automated, and can keep up!

Continuous Delivery must start with an idea, not after testing has already begun.

The Contrast: Optimized Testing vs. Manual Test Design

Client 1 – Slow, Manual Design and Poor Test Coverage

                      Baseline    Agile Designer
  Test Cases Created  3           12
  Time Taken          4 hours     30 mins
  Test Coverage       5%          100%

Client 2 – Rampant Over-Testing

                      Baseline    Agile Designer
  Test Cases Created  150         19
  Time Taken          5 hours     40 mins
  Over-testing        18x         0

The Contrast: Smart Testing vs. Manual Test Design

Client 1 – Implementing Change

                            Baseline    Agile Designer
  Tests Checked/Updated     67          3
  Time Taken                7.5 hours   2 minutes
  Test Coverage Maintained  Unknown     100%

1. “Gold Copy” data is stored in a central warehouse
2. It is delivered to teams in parallel, “matched” to tests on the basis of specific attributes
3. The data is “cloned” and is fully versioned

Data is “matched” to specific tests and can be delivered to multiple teams in parallel – a find-and-reserve sketch follows.

(Diagram: production data is masked, subsetted/sliced or synthetically created, profiled and measured for coverage, then stored in a Test Mart – request/response pools, regression packs, data-driven test packs, templates and expected results, all under version control. Agile Designer handles test design and data design; Agile Data and HP ALM/QC consume test data on demand via find-and-reserve; Javelin orchestrates data delivery to web/SOA and UI/load test harnesses, automation, data visualization and virtualization.)
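A minimal sketch of the find-and-reserve idea, assuming nothing about the product's internals: parallel teams query the gold copy by attribute, and each matching clone is reserved to exactly one test so that teams never collide. All names are hypothetical.

import threading

class TestDataWarehouse:
    def __init__(self, gold_copy_rows):
        self._rows = [dict(r, version=1) for r in gold_copy_rows]  # cloned, versioned
        self._reserved = set()
        self._lock = threading.Lock()

    def find_and_reserve(self, test_id, **attributes):
        """Reserve the first unreserved row matching all attributes."""
        with self._lock:
            for i, row in enumerate(self._rows):
                if i in self._reserved:
                    continue
                if all(row.get(k) == v for k, v in attributes.items()):
                    self._reserved.add(i)
                    return dict(row, reserved_by=test_id)   # hand back a clone
        return None   # nothing left: a match job would synthesize data here

warehouse = TestDataWarehouse([
    {"account_type": "savings", "balance": 0},
    {"account_type": "savings", "balance": 0},
])
print(warehouse.find_and_reserve("TC-01", account_type="savings"))
print(warehouse.find_and_reserve("TC-02", account_type="savings"))
print(warehouse.find_and_reserve("TC-03", account_type="savings"))  # None: pool exhausted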

Generate all the test data you need

1. Automatically profile data, model it, and accurately measure its coverage
2. Generate rich synthetic data which provides 100% coverage
3. Cover every outlier, unexpected result, boundary condition and negative path (see the sketch below)
4. Create thousands of rows of complex, inter-related data in minutes

(Diagram: an “empty” schema plus the required data characteristics go into Datamaker; data ready for testing comes out.)

Provision fit-for-purpose data anytime and every time, with or without access to production systems!
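As an illustration of boundary-driven synthetic generation – the schema, column names and rules are invented for this example – each column spec yields its boundaries, just-out-of-range negatives, a representative valid value and a missing-value outlier, and rows are produced as combinations:

import itertools

def candidate_values(spec):
    lo, hi = spec["min"], spec["max"]
    return [
        lo, hi,               # boundary conditions
        lo - 1, hi + 1,       # just outside the valid range (negative path)
        (lo + hi) // 2,       # a representative valid value
        None,                 # missing-data outlier
    ]

def generate_rows(schema):
    columns = list(schema)
    pools = [candidate_values(schema[c]) for c in columns]
    for combo in itertools.product(*pools):   # full cross product of candidates
        yield dict(zip(columns, combo))

schema = {"age": {"min": 18, "max": 65}, "term_months": {"min": 6, "max": 360}}
rows = list(generate_rows(schema))
print(len(rows), "rows, e.g.", rows[0])   # 36 rows covering every boundary pair

The full cross product explodes as columns are added; in practice, pairwise or coverage-directed combination keeps the row count manageable while preserving the boundary and negative cases.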

4) Full traceability with requirements – auto-update tests when the user requirements change

Know what needs to be re-tested, and when the integrity of a system is at risk: “If I change this, what will I break?” The impact of a change made to an individual component is identified system-wide, and the impact on test cases and user stories up and down a system can also be identified automatically.

46 © 2015 CA. ALL RIGHTS RESERVED.

Update manual or automated tests in minutes

Remove any broken or redundant tests automatically – no more checking test cases by hand

Restore functional coverage to 100%, creating any new test cases required

Execute only the tests needed to validate a change

Make sure a change has successfully “rippled up”
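The impact analysis itself can be pictured very simply: if each generated test remembers the model components its path visits, the tests to re-run after a change are exactly those whose paths cross the changed component. A hedged sketch, with invented structures and test IDs:

def impacted_tests(test_paths, changed_component):
    """test_paths: list of (test_id, [node names visited by that test])"""
    return [test_id for test_id, nodes in test_paths
            if changed_component in nodes]

suite = [
    ("TC-01", ["start", "check_age", "approve", "end"]),
    ("TC-02", ["start", "check_age", "reject", "end"]),
    ("TC-03", ["start", "check_country", "approve", "end"]),
]
print(impacted_tests(suite, "check_age"))   # ['TC-01', 'TC-02'] must be re-run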

The Integrated Development Chain

(Diagram: the User and BA feed Agile Designer, giving clarity of vision and static testing; Requirement ↔ Use Cases ↔ Test Case ↔ Defect are linked by traceability, spanning Project Manager ↔ Test Manager ↔ Programmer ↔ Testers.)

• Agile Designer – use cases, requirements, complexity, test cases, test data, virtual data, automation, expected results
• DataMaker – find, allocate, synthesize, data on demand; test cases, virtual data, automation
• HP QC / HP ALM / Selenium – requirements, test cases, test data, defects, automation
• Agile Boards – story boards, work allocation, work tracking, story points
• CA SV – service virtualization

Continuous Development runs from the backlog and user requirements, through change requests and development, into functional, integration, system, performance and acceptance testing, connecting developers and testers.