Transcript of MTAT.03.159: Software Testing (University of Tartu) – Lecture 06 (106 slides)

Page 1:

MTAT.03.159: Software Testing
Lecture 06: Test Tools, Metrics, Documentation, Organisation and Process Improvement
(Textbook Ch. 14, 9, 7, 8, 16)
Dietmar Pfahl, email: [email protected]
Spring 2016

Page 2:

Structure of Lecture 06

• Test Tools

• Test Measurement

• Test Planning & Documentation

• Test Organisation

• Test Process Improvement (TMMi)

• Lab 6

Page 3:

Tools – the Workbench

• Good for

– repeating tasks

– organising data

• Requires training

• Must be introduced incrementally

• No “silver bullet”

Evaluation criteria

• Ease of use

• Power

• Robustness

• Functionality

• Ease of introduction

• Quality of support

• Cost

• Company policies and goals

Page 4:

Test Tools – in the Process

[Diagram: V-model of development phases (requirement specification, architectural design, detailed design, code) and test levels (unit test, integration test, system test, acceptance test), with tool categories mapped onto it: test management tools, test design tools, static analysis tools, coverage tools, dynamic analysis tools, debugging tools, test execution and comparison tools, and performance simulator tools.]

Page 5:

Tool categories by purpose

1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support

Page 6:

Tool categories by purpose

1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support

Tools for reviews and inspections:
• Complexity analysis
  – Identify problem areas
• Code comprehension
  – Show different views of the artefact
• Syntax and semantics analysis
  – Check and warn

Page 7:

Tool categories by purpose

1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support

Tools for test planning:
• Templates for test plan documentation
• Test schedule and staffing estimates
• Complexity analyser

To a large extent: general project management tools

Page 8:

Tool categories by purpose

1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support

Tools for test design and development:
• Test data generator
• Requirements-based test design tool
• Capture/replay
• Coverage analysis

Often integrated with test execution tools

Page 9:

Tool categories by purpose

1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support

Tools for test execution and evaluation:
• Test case management
• Capture/replay
• Coverage analysis
• Memory testing (leaks)
• Simulators and performance analysis
  – HW emulators
  – SW simulators (mocking)
  – Load generators

Page 10:

Tool categories by purpose

1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support

Tools for test support:
• Issue reporting
  – Report, dispatch, follow-up
• Configuration management
  – Manage, control and coordinate changes

Page 11:

There is no shortage of Test Tools

• Defect Tracking (98)
• GUI Test Drivers (71)
• Load and Performance (52)
• Static Analysis (38)
• Test Coverage (22)
• Test Design Tools (24)
• Test Drivers (17)
• Test Implementation (35) – assist with testing at runtime: memory leak checkers, comparators, and a wide variety of others
• Test Case Management (24)
• Unit Test Tools (63)
• 3 different categories of others

From http://www.testingfaqs.org/

Other links to test tool overviews:
• http://www.aptest.com/resources.html
• http://www.softwareqatest.com/qatweb1.html

Page 12:

Test data generator – Generatedata.com, an online service for generating test data

Page 13:

Test data generated from the spec on the previous slide

Advanced tools allow for more complex specification of data generation rules.
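
To make the idea concrete, here is a small illustrative sketch (not from the slides) of rule-based test data generation in Java: each column has a simple rule (a value pool or a numeric range) and the generator emits CSV rows. The name values reuse the example names from these slides; everything else is made up.

import java.util.Random;

public class TestDataGenerator {

    public static void main(String[] args) {
        String[] firstNames = {"Pekka", "Teemu", "Mari", "Jaan"};   // value-pool rule for column 1
        String[] lastNames  = {"Pukaro", "Tekno", "Kask", "Tamm"};  // value-pool rule for column 2
        Random rnd = new Random();

        System.out.println("First,Last,Data");                      // CSV header
        for (int i = 0; i < 5; i++) {
            String first = firstNames[rnd.nextInt(firstNames.length)];
            String last  = lastNames[rnd.nextInt(lastNames.length)];
            int data = 100000 + rnd.nextInt(9000000);               // numeric-range rule for column 3
            System.out.println(first + "," + last + "," + data);
        }
    }
}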

Page 14:

LinkScan tool to detect broken links

http://www.elsop.com/linkscan/docs/links00.html

Page 15:

LinkScan tool to detect broken links

Plan: Select the web page to be scanned

Page 16:

LinkScan tool to detect broken links

Scan: Recursively crawl over the web page and automatically try every link

Page 17:

LinkScan tool to detect broken links

Exam: Show a summary report of broken links and the locations of broken links

Page 18:

LinkScan tool to detect broken links

Exam: Show a summary report of broken links and the locations of broken links

Page 19:

Functional (Web-)Testing Approaches

1. Recorded Scripts
2. Engineered Scripts
3. Data-driven Testing
4. Keyword-driven Testing
5. Model-based Testing

Example data table:
First | Last   | Data
Pekka | Pukaro | 1244515
Teemu | Tekno  | 587245

Page 20:

Recorded Scripts

• Unstructured
• Scripts generated using capture and replay tools
• Relatively quick to set up
• Mostly used for regression testing
• Scripts are non-maintainable in practice
  – If the system changes, they need to be captured again
• Capture/replay tools record the user's actions (keyboard, mouse) to a script
• Tool-specific scripting language
  – Scripts access the (user) interface of the software
    • Input fields, buttons and other widgets
  – Simple checks can be created in the scripts
    • Existence of texts and objects in the UI
    • Data of GUI objects

Page 21:

Recorded Scripts – Example with Selenium IDE

Some web application to be tested: http://opensource.demo.orangehrm.com
Record button switched on ...

Page 22:

Recorded Scripts – Example with Selenium IDE

Make sure the Record button is ON!
• Open the base URL (http://opensource.demo.orangehrm.com) in the browser
• Log in using the values: Login Name: demo, Password: demo
• Click the 'Login' button

Page 23:

Recorded Scripts – Example with Selenium IDE

... next actions (Record button still on) ...
• Highlight the 'Welcome demo' text
• Verify that the text is present – command: VerifyTextPresent
• Click 'Logout'
... then stop recording ...

Page 24:

Recorded Scripts – Example with Selenium IDE

Record button switched off ...

A Test Case (= Test Scenario) consists of several Test Steps; a Test Suite consists of several Test Cases (TC1, TC2, ...), each made up of steps (Step1, Step2, ...).

Recorded steps:
• Open the base URL (http://opensource.demo.orangehrm.com) in the browser
• Log in using the values: Login Name: demo, Password: demo
• Click the Login button
• Highlight the 'Welcome demo' text
• Verify that the text is present
• Click 'Logout'

Each step consists of a Selenium command, the data (values), and the location on the web page (target), which may use xpath, css, id, field name, etc. Test cases can be saved and exported into several programming languages (Java, Python, C#, etc.). Tests can be run (replayed) from the IDE.
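
For illustration (not part of the original slides): when a recorded scenario like the one above is exported to Java, the result is roughly a linear WebDriver script such as the following sketch. The element locators are assumptions; only the URL, the demo/demo credentials and the checked text come from the slides.

import static org.junit.Assert.assertTrue;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginLogoutTest {

    private WebDriver driver;

    @Before
    public void setUp() {
        driver = new FirefoxDriver();                        // browser under test
        driver.get("http://opensource.demo.orangehrm.com");  // open base URL
    }

    @Test
    public void loginAndLogout() {
        // log in with the demo credentials used in the recorded scenario
        driver.findElement(By.id("txtUsername")).sendKeys("demo");   // locator id is an assumption
        driver.findElement(By.id("txtPassword")).sendKeys("demo");   // locator id is an assumption
        driver.findElement(By.id("btnLogin")).click();               // locator id is an assumption

        // verify that the welcome text is present (VerifyTextPresent in the IDE)
        assertTrue(driver.getPageSource().contains("Welcome"));

        // log out again
        driver.findElement(By.linkText("Logout")).click();           // locator is an assumption
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}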

Page 25:

Engineered Scripts

• Scripts are well-designed (following a systematic approach), modular, robust, documented, and maintainable
• Separation of common tasks
  – E.g. setup, cleanup/teardown, and defect detection
• Test data is still embedded into the scripts
  – One driver script per test case
• Test code is mostly written manually
• Implementation and maintenance require programming skills which testers (test engineers) might not have
• “Just like any other software development project”

Page 26:

Engineered Scripts – Example with Selenium / TestNG

Page 27:

Engineered Scripts – Example with Selenium / TestNG

XPath notation: Click on 'Math Calculator'

The locator selects, at the current node ( . ), all elements ( //* ) with id='menu', then takes the 3rd 'div' element ( div[3] ), then the element 'a', and then clicks on the link (math-calculator.html).
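
Read as a single expression, the locator described above corresponds roughly to the XPath .//*[@id='menu']/div[3]/a (a reconstruction from the description; the exact locator in the original screenshot may differ).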

Page 28:

Engineered Scripts – Example with Selenium / TestNG

Click on 'Percent Calculator'

Page 29:

Engineered Scripts – Example with Selenium / TestNG

• Enter '10' – first number
• Enter '50' – second number
• Click on 'Calculate'
• Get the result with getText() – '5'

Page 30:

Engineered Scripts – Example with Selenium / TestNG

Checks whether the result equals '5' (10% of 50)
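
For illustration (not from the original slides), the percent-calculator check shown on the last few slides could be written as an engineered TestNG test roughly like this sketch. The URL and element ids are assumptions; the values 10 and 50 and the expected result '5' come from the slides.

import static org.testng.Assert.assertEquals;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class PercentCalculatorTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        driver = new FirefoxDriver();
        driver.get("http://www.example-calculator.com/percent-calculator.html"); // URL is an assumption
    }

    @Test
    public void tenPercentOfFifty() {
        driver.findElement(By.id("percent")).sendKeys("10");   // first number; locator is an assumption
        driver.findElement(By.id("number")).sendKeys("50");    // second number; locator is an assumption
        driver.findElement(By.id("calculate")).click();        // 'Calculate' button; locator is an assumption

        String result = driver.findElement(By.id("result")).getText(); // read result with getText()
        assertEquals(result, "5");                              // 10% of 50 is 5
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}

Compared with the recorded script, the checks, setup and teardown are explicit and the test can be maintained like any other piece of code.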

Page 31:

Data-Driven Testing

• Data-Driven Testing = executing the same set of test steps with multiple (different) data
• Test inputs and expected outcomes are stored as data
  – Typically in tabular format
• Test data are read from an external data source
• One driver script can execute all of the designed test cases

Note that in the previous example the test data ('10' and '50') was embedded in the test case definition.

Example data (new): imdb film database, employees
First | Last   | Data
Pekka | Pukaro | 1244515
Teemu | Tekno  | 587245

Page 32:

Data-Driven Testing

The DataProvider defines where to find the data (uses the Java Excel API):

// TestNG annotations (@BeforeClass, @DataProvider, @Test, @AfterClass) and the
// Selenium RC base class SeleneseTestCase are assumed to be imported.
public class DataProviderExample extends SeleneseTestCase {

    @BeforeClass
    public void setUp() throws Exception {
    }

    @DataProvider(name = "DP1")
    public Object[][] createData1() throws Exception {
        Object[][] retObjArr = getTableArray("test\\Resources\\Data\\data1.xls",
                "DataPool", "imdbTestData1");
        return (retObjArr);
    }

    @Test(dataProvider = "DP1")
    public void testDataProviderExample(String movieTitle, String directorName,
            String moviePlot, String actorName) throws Exception {
        // enter the movie title
        selenium.type("q", movieTitle);
        // they keep switching the go button to keep the bots away
        if (selenium.isElementPresent("nb15go_image"))
            selenium.click("nb15go_image");
        else
            selenium.click("xpath=/descendant::button[@type='submit']");
        selenium.waitForPageToLoad("30000");
        // click on the movie title in the search result page
        selenium.click("xpath=/descendant::a[text()='" + movieTitle + "']");
        selenium.waitForPageToLoad("30000");
        // verify director name is present in the movie details page
        verifyTrue(selenium.isTextPresent(directorName));
        // verify movie plot is present in the movie details page
        verifyTrue(selenium.isTextPresent(moviePlot));
        // verify movie actor name is present in the movie details page
        verifyTrue(selenium.isTextPresent(actorName));
    }

    @AfterClass ...

Page 33:

Data-Driven Testing

Reads the data from the data table (method getTableArray(...)):

// Uses the Java Excel API (jxl): jxl.Workbook, jxl.Sheet, jxl.Cell, plus java.io.File.
public String[][] getTableArray(String xlFilePath, String sheetName, String tableName) {
    String[][] tabArray = null;
    try {
        Workbook workbook = Workbook.getWorkbook(new File(xlFilePath));
        Sheet sheet = workbook.getSheet(sheetName);
        int startRow, startCol, endRow, endCol, ci, cj;
        Cell tableStart = sheet.findCell(tableName);
        startRow = tableStart.getRow();
        startCol = tableStart.getColumn();
        Cell tableEnd = sheet.findCell(tableName, startCol + 1, startRow + 1, 100, 64000, false);
        endRow = tableEnd.getRow();
        endCol = tableEnd.getColumn();
        System.out.println("startRow=" + startRow + ", endRow=" + endRow + ", " +
                "startCol=" + startCol + ", endCol=" + endCol);
        tabArray = new String[endRow - startRow - 1][endCol - startCol - 1];
        ci = 0;
        for (int i = startRow + 1; i < endRow; i++, ci++) {
            cj = 0;
            for (int j = startCol + 1; j < endCol; j++, cj++) {
                tabArray[ci][cj] = sheet.getCell(j, i).getContents();
            }
        }
    } catch (Exception e) {
        System.out.println("error in getTableArray()");
    }
    return (tabArray);
}

Page 34:

Data-Driven Testing

• External test data can be edited without programming skills
• Test design and framework implementation are now separate tasks:
  – the design task can be given to someone with the domain knowledge (business people, customers), and
  – the framework implementation to someone with programming skills
• Avoids the problems of embedded test data
  – Data are hard to understand in the middle of all the scripting details
  – Updating tests or creating similar tests with slightly different test data types/structures always requires programming (-> copy-paste scripting)
• Follow this link for a fully worked example of Data-Driven Testing with Selenium:
  http://functionaltestautomation.blogspot.com/2009/10/dataprovider-data-driven-testing-with.html

First | Last   | Data
Pekka | Pukaro | 1244515
Teemu | Tekno  | 587245

Page 35:

Data-Driven Testing

Page 36:

Keyword-Driven Testing

• Keywords are also known as action words
• Keyword-driven testing improves on data-driven testing:
  – Keywords abstract the navigation and actions out of the script
  – Keywords and test data are read from an external data source
• When test cases are executed, keywords are interpreted by a test library (= set of test scripts), which is called by a test automation framework
• Example 1:
  Login: admin, t5t56y             // 2 args
  AddCustomers: newCustomers.txt   // 1 arg
  RemoveCustomer: Pekka Pukaro     // 1 arg
• Example 2: click, checkbox, undo, …
• More keywords (= action words) can be defined based on existing keywords
• Keyword-driven testing ~= domain-specific languages (DSL)
• Details: http://doc.froglogic.com/squish/4.1/all/how.to.do.keyword.driven.testing.html
• Another tool: http://code.google.com/p/robotframework/
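
To illustrate the mechanism (a sketch, not from the slides): a minimal keyword interpreter maps each keyword (action word) to a handler in the test library and feeds it the arguments from the test data file. The keywords reuse Example 1 above; the handler bodies are placeholders.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

public class KeywordRunner {

    // test library: maps keywords (action words) to handler functions
    private final Map<String, Consumer<String[]>> handlers = new HashMap<>();

    public KeywordRunner() {
        handlers.put("Login", args -> System.out.println("login as " + args[0]));          // 2 args: user, password
        handlers.put("AddCustomers", args -> System.out.println("add from " + args[0]));   // 1 arg: file
        handlers.put("RemoveCustomer", args -> System.out.println("remove " + args[0]));   // 1 arg: name
    }

    // interpret one line of the keyword-driven test data file, e.g. "Login: admin, t5t56y"
    public void execute(String line) {
        String[] parts = line.split(":", 2);
        String keyword = parts[0].trim();
        String[] args = parts.length > 1 ? parts[1].trim().split(",\\s*") : new String[0];
        handlers.getOrDefault(keyword,
                a -> { throw new IllegalArgumentException("unknown keyword: " + keyword); })
                .accept(args);
    }

    public static void main(String[] args) {
        KeywordRunner runner = new KeywordRunner();
        // the keyword rows from Example 1 above
        List.of("Login: admin, t5t56y",
                "AddCustomers: newCustomers.txt",
                "RemoveCustomer: Pekka Pukaro").forEach(runner::execute);
    }
}

New, higher-level keywords can then be registered as handlers that call existing ones, which is the layering discussed on the next slides.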

Page 37:

Keyword-Driven Testing

• The keyword-driven test data file adds one level of abstraction
• Similar to data-driven testing, but instead of calling the test functions directly, a handler looks for keywords and, based on them, derives the required test data
• Purpose: increase flexibility and reusability

Page 38:

Keyword-Driven Testing

Several layers of keywords possible

Benefit: can define new keywords using existing ones

Page 39:

Architecture of a Keyword-Driven Framework

Pekka Laukkanen. Data-Driven and Keyword-Driven Test Automation Frameworks. Master's Thesis. Helsinki University of Technology. 2006.

Page 40:

Architecture of a Keyword-Driven Framework

For a detailed description of the concepts, see: Pekka Laukkanen. Data-Driven and Keyword-Driven Test Automation Frameworks. Master's Thesis. Helsinki University of Technology. 2006.

Page 41:

Model-based Testing

• The system under test is modelled
  – UML state machines, domain-specific languages (DSL)
• Test cases are automatically generated from the model
  – The model can also provide the expected results for the generated test cases
  – More accurate model -> better test cases
• Generate a large number of tests that cover the model
  – Many different criteria for covering the model
  – Execution time of test cases might be a factor
• Challenges:
  – Personnel competencies
  – Data-intensive systems (cannot be modelled as a state machine)
• Simple MBT tool: http://graphwalker.org/
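
As a toy illustration of the idea (a sketch, not from the slides, and far simpler than GraphWalker): represent the model as a transition table and let a random walk emit test steps until every edge has been covered, which mirrors the random(edge_coverage(100)) stop condition used in the GraphWalker example on the following slides. Vertex and edge names follow that example; everything else is made up.

import java.util.*;

public class ModelWalker {

    // a tiny login model: state -> list of (edge, nextState)
    static final Map<String, List<String[]>> MODEL = Map.of(
            "v_ClientNotRunning", List.of(new String[]{"e_StartClient", "v_LoginPrompted"}),
            "v_LoginPrompted", List.of(
                    new String[]{"e_InvalidCredentials", "v_LoginPrompted"},
                    new String[]{"e_ValidCredentials", "v_Browse"}),
            "v_Browse", List.of(
                    new String[]{"e_Logout", "v_LoginPrompted"},
                    new String[]{"e_Close", "v_ClientNotRunning"}));

    public static void main(String[] args) {
        Random rnd = new Random();
        Set<String> coveredEdges = new HashSet<>();
        int totalEdges = MODEL.values().stream().mapToInt(List::size).sum();
        String state = "v_ClientNotRunning";

        // random walk until 100% edge coverage
        while (coveredEdges.size() < totalEdges) {
            List<String[]> out = MODEL.get(state);
            String[] edge = out.get(rnd.nextInt(out.size()));
            System.out.println(state + " --" + edge[0] + "--> " + edge[1]); // emit one test step
            coveredEdges.add(edge[0]);
            state = edge[1];
        }
    }
}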

Page 42:

GraphWalker (http://graphwalker.org/)

Example: A regression test for the login function of the Spotify Desktop Client

The feature is supposed to work like this:
• In a freshly installed client, when the client is started, the Login dialog is expected to be displayed.
• The user enters valid credentials and the client is expected to start.
• If the user quits, or logs out, the Login dialog is displayed once again.
• If the user checks the Remember Me checkbox and logs in (using valid credentials), the client starts, and, next time the user starts the client, it will start without asking the user for credentials.

Page 43:

GraphWalker (http://graphwalker.org/)

Example: A regression test for the login function of the Spotify Desktop Client – basic flow

For testing the first 2 steps, a model would look something like this:
1. The Start vertex is where the test starts.
2. In e_Init, we remove all cache and kill any previous client processes.
3. v_ClientNotRunning will assert that there is no Spotify client process running.
4. e_Start starts the client.
5. v_LoginPrompted asserts that the login dialog is displayed and correctly rendered.
6. e_ValidCredentials enters a valid username and password and clicks the Sign In button.
7. v_Browse asserts that the Browse view is correctly displayed.

Page 44:

GraphWalker (http://graphwalker.org/)

Example: A regression test for the login function of the Spotify Desktop Client

Complete model (first 2 steps)

Page 45:

GraphWalker (http://graphwalker.org/)

Example: A regression test for the login function of the Spotify Desktop Client

%> java -jar graphwalker.jar offline -m Login.graphml "random(edge_coverage(100))"

e_Init
v_ClientNotRunning
e_StartClient
v_LoginPrompted
e_InvalidCredentials
v_LoginPrompted
e_ValidPremiumCredentials
v_Browse
e_Logout
v_LoginPrompted
e_Close
v_ClientNotRunning
...

Page 46:

Functional Testing Approaches

1. Recorded Scripts – cheap to set up, quick & dirty
2. Engineered Scripts – structured
3. Data-driven Testing – data separation
4. Keyword-driven Testing – action separation, DSL
5. Model-based Testing – modeling & automatic test case generation

First | Last   | Data
Pekka | Pukaro | 1244515
Teemu | Tekno  | 587245

Page 47:

What can be automated?

• Test generation

– Test case (steps with data & oracle)

– Test data (input)

– Test oracle (expected output)

– Test verdict (PASS/FAIL decision)

• Test execution

– E.g., regression testing

• Test reporting

• Debugging

– Fault localisation (using failure/error information)

Page 48:

What can be automated?

• Test generation

– Test case

– Test data (input)

– Test oracle (expected output)

• Test execution

– E.g., regression testing

• Test selection

• Test reporting

• Debugging

– Fault localisation

Often, people mean automated test execution when they talk about test automation.

Page 49:

When to automate?

• Test automation should be considered for:
  – Large and critical projects
  – Projects that require testing the same areas frequently
  – Requirements that do not change frequently
  – Accessing the application for load and performance testing with many virtual users
  – Stable software
  – Availability of time/effort (for set-up, execution, maintenance, etc.)

Page 50:

Test automation promises

1. Efficient regression test

2. Run tests more often

3. Perform difficult tests (e.g. load, outcome check)

4. Better use of resources

5. Consistency and repeatability

6. Reuse of tests

7. Earlier time to market

8. Increased confidence

Page 51:

Common problems

1. Unrealistic expectations

2. Poor testing practice

”Automatic chaos just gives faster chaos”

3. Expected effectiveness

4. False sense of security

5. Maintenance of automatic tests

6. Technical problems (e.g. Interoperability)

7. Organizational issues

Page 52:

Limits of automated testing

• Does not replace manual testing
• Manual tests find more defects than automated tests
  – Does not improve effectiveness
• Greater reliance on the quality of tests
  – Oracle problem
• Test automation may limit the software development
  – Costs of maintaining automated tests

Page 53:

Structure of Lecture 06

• Test Tools

• Test Measurement

• Test Planning & Documentation

• Test Organisation

• Test Process Improvement (TMMi)

• Lab 6

Page 54:

How good are we at testing?

[Diagram contrasting test quality and product quality: when testing reveals few faults, it may be because the product really has few faults ("Are we here?") or because the testing is too weak to reveal the many faults that exist ("Or are we here?").]

Page 55:

Purpose of Measurement

• Test monitoring – check the status

• Test controlling – corrective actions

• Plan new testing

• Measure and analyze results

– The benefit/profit of testing

– The cost of testing

– The quality of testing

– The quality of the product

– Basis of improvement, not only for the test process

Page 56:

Test Monitoring

• Status
  – Coverage metrics
  – Test case metrics: development and execution
  – Test harness development
• Efficiency / Cost metrics
  – How much time have we spent?
  – How much money/effort have we spent?
• Failure / Fault metrics
  – How much have we accomplished?
  – What is the quality status of the software?
• Effectiveness metrics
  – How effective are the testing techniques in detecting defects?

[Diagram labels: Metrics, Estimation, Cost, Stop?]

Page 57:

Test metrics: Development status

• Test case development status
  – Planned
  – Available
  – Unplanned (not planned for, but needed)
• Test harness development status
  – Planned
  – Available
  – Unplanned

Page 58:

Test metrics: Coverage

What?
• % statements covered
• % branches covered
• % data flow
• % requirements
• % equivalence classes

Why?
• Track completeness of testing
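
For example (illustrative numbers): if the test suite executes 180 of the 200 statements in a module, statement coverage is 180/200 = 90%.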

Page 59:

Test metrics: Test execution status

What?
• # failures/hour
• # executed tests
• Requirements coverage

Why?
• Track progress of the test project
• Decide stopping criteria

Page 60:

Test metrics: Efficiency

What?
• # faults/hour
• # faults/test case

Why?
• Evaluate efficiency of V&V activities

Page 61:

Test metrics: Faults/Failures (trouble reports)

What?
• # faults/size
• repair time
• root cause

Why?
• Monitor quality
• Monitor efficiency
• Improve

Page 62:

Test metrics: Effectiveness

What?
• % found faults per phase
• % missed faults

Why?
• Evaluate effectiveness of V&V activities
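
For example (illustrative numbers): if testing in one phase finds 40 faults in a component and 10 further faults of that component surface in later phases or in the field, that phase's defect detection effectiveness is 40/(40+10) = 80%.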

Page 63:

When to stop testing?

• All planned tests are executed and passed
• All coverage goals are met (requirements, code, ...)
• Detection of a specific number of failures
• The rate of failure detection has fallen below a specified level
• Fault seeding ratios are favourable
• Reliability is above a certain value
• Cost has reached the limit
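
To illustrate the fault-seeding criterion (illustrative numbers): if 10 faults are seeded into the code and testing finds 8 of them together with 40 unseeded faults, the estimated total number of unseeded faults is roughly 40 * 10/8 = 50, i.e. about 10 faults are estimated to remain.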

Page 64:

Structure of Lecture 06

• Test Tools

• Test Measurement

• Test Planning & Documentation

• Test Organisation

• Test Process Improvement (TMMi)

• Lab 6

Page 65:

Test Planning

• Objectives

• What to test

• Who will test

• When to test

• How to test

• When to stop

Elective course (Fall’16): Hands-on SW Testing – MTAT.03.294

Page 66:

IEEE 829-2008: Standard for Software and System Test Documentation

Page 67:

Software Project Management Plan (SPMP)

Goals
• Business
• Technical
• Business/technical
• Political
(quantitative/qualitative)

Policies
• High-level statements of principle or courses of action
• Govern the activities implemented to achieve stated goals

Page 68:

Hierarchy of Test Plans

Page 69:

Test plan according to IEEE Std 829-2008 (Appendix II)

a) Test plan identifier
b) Introduction
c) Test items
d) Features to be tested
e) Features not to be tested
f) Approach
g) Item pass/fail criteria
h) Suspension criteria and resumption requirements
i) Test deliverables
j) Testing tasks
k) Environmental needs
l) Responsibilities
m) Staffing and training needs
n) Schedule
o) Risks and contingencies
p) Approvals

Page 70:

Test Plan (1) (slide not shown in lecture; only illustrative background info)

a) Test plan identifier
b) Introduction
  – Product to be tested, objectives, scope of the test plan
  – Software items and features to be tested
  – References to project authorization, project plan, QA plan, CM plan, relevant policies & standards
c) Test items
  – Test items including version/revision level
  – Items include end-user documentation
  – Defect fixes
  – How transmitted to testing
  – References to software documentation

Page 71:

Test Plan (2) (slide not shown in lecture; only illustrative background info)

d) Features to be tested
  – Identify test design / specification techniques
  – Reference requirements or other specs
e) Features not to be tested
  – Deferred features, environment combinations, …
  – Reasons for exclusion
f) Approach
  – How you are going to test this system
    • Activities, techniques and tools
  – Detailed enough to estimate
  – Completion criteria (e.g. coverage, reliability)
  – Identify constraints (environment, staff, deadlines)

Page 72:

Test Plan (3) (slide not shown in lecture; only illustrative background info)

g) Item pass/fail criteria
  – What constitutes success of the testing
  – Coverage, failure count, failure rate, number of executed tests, …
  – Is NOT the product release criteria
h) Suspension and resumption criteria
  – For all or parts of testing activities
  – Which activities must be repeated on resumption
i) Test deliverables
  – Test plan
  – Test design specification, test case specification
  – Test procedure specification, test item transmittal report
  – Test logs, test incident reports, test summary reports

Page 73:

Test Plan (4) (slide not shown in lecture; only illustrative background info)

j) Testing tasks
  – Including inter-task dependencies & special skills
  – Estimates
k) Environment
  – Physical, hardware, software, tools
  – Mode of usage, security, office space
  – Test environment set-up
l) Responsibilities
  – To manage, design, prepare, execute, witness, check, resolve issues, provide the environment, provide the software to test
m) Staffing and training needs

Page 74:

Test Plan (5) (slide not shown in lecture; only illustrative background info)

n) Schedule
  – Test milestones in the project schedule
  – Item transmittal milestones
  – Additional test milestones (environment ready)
  – What resources are needed when
o) Risks and contingencies
  – Testing project risks
  – Contingency and mitigation plan for each identified risk
p) Approvals
  – Names and when approved

Page 75:

Test Case Specification (TCS) – Why?

• Organization
  – All testers and other project team members can review and use test cases effectively
• Repeatability
  – Know what test cases were last run and how, so that you could repeat the same tests
• Tracking
  – What requirements or features are tested?
  – The tracking information's value depends on the quality of the test cases
• Evidence of testing
  – Confidence (quality)
  – Detect failures

Page 76:

Defect Report (Test Incident Report) – cf. Lab 1

• Summary
• Incident Description
• Impact

Page 77:

Defect Report (Test Incident Report) – Summary

• This is a summation/description of the actual incident
  – Provides enough details to enable others to understand how the incident was discovered, and any relevant supporting information
• References to:
  – The test procedure used to discover the incident
  – The test case specifications that will provide the information to repeat the incident
  – Test logs showing the actual execution of the test cases and procedures
  – Any other supporting materials, trace logs, memory dumps/maps, etc.

Page 78:

Defect Report (Test Incident Report) – Incident Description

• Provides as much detail on the incident as possible
  – Especially if there are no other references to describe the incident
• Includes all relevant information that has not already been included in the incident summary or any additional supporting information
• Information:
  – Inputs
  – Expected results
  – Actual results
  – Anomalies
  – Date and time
  – Procedure step
  – Attempts to repeat
  – Testers
  – Observers

Page 79:

Defect Report (Test Incident Report) – Impact

• Describe the actual/potential damage caused by the incident
  – Severity
  – Priority
• Severity and priority need to be defined so as to ensure consistent use and interpretation, for example:
• Severity – the potential impact on the system
  – Mission critical – application will not function or system fails
  – Major – severe problems, but possible to work around
  – Minor – does not impact the functionality or usability of the process, but is not according to requirements/design specifications
• Priority – the order in which the incidents are to be addressed
  – Immediate – must be fixed as soon as possible
  – Delayed – system is usable, but the incident must be fixed prior to the next level of test or shipment
  – Deferred – the defect can be left in, if necessary, due to time or cost constraints

Page 80:

Test results report

• Test cases executed

• Versions tested

• Defects found and reported

Page 81:

Standards

• IEEE 829-2008 – Standard for Software and System Test Documentation
• IEEE 1008-1993 – Standard for Software Unit Testing
• IEEE 1012-2012 – Standard for System and Software Verification and Validation
->
• ISO/IEC/IEEE 29119 Software Testing (5 parts)
  – replaces most of the older standards

Page 82:

Page 83:

Structure of Lecture 06

• Test Tools

• Test Measurement

• Test Planning & Documentation

• Test Organisation

• Test Process Improvement (TMMi)

• Lab 6

Page 84:

7 approaches to test organisation

1. Each person’s responsibility

2. Each unit’s responsibility

3. Dedicated resource

4. Test organisation in QA

5. Test organisation in development

6. Centralized test organisation

7. Test technology centre

[Kit, Software Testing in the Real World Ch 13, 1995]

Page 85:

7 approaches to test organisation

1. Each person’s responsibility

2. Each unit’s responsibility

3. Dedicated resource

4. Test organisation in QA

5. Test organisation in development

6. Centralized test organisation

7. Test technology centre

[Kit, Software Testing in the Real World Ch 13, 1995]

Page 86:

Which organization should we choose?

• Depending on
  – size
  – maturity
  – focus
• The solution is often a mix of different approaches

Page 87:

Watch James Bach’s open lecture video (course wiki)!

Page 88:

Structure of Lecture 06

• Test Tools

• Test Measurement

• Test Planning & Documentation

• Test Organisation

• Test Process Improvement (TMMi)

• Lab 6

Page 89:

Process quality and product quality

• Quality in the process -> quality in the product
• Project:
  – an instantiated process
• ISO 25000:
  – Process quality contributes to improving product quality, which in turn contributes to improving quality in use

[Diagram: Process -> Project -> Product]

Page 90:

Process improvement models vs test process improvement models

• (Integrated) Capability Maturity Model (CMM, CMMI)
• Software Process Improvement and Capability Determination (SPICE)
• ISO 9001, Bootstrap, …

Test process improvement models:
• Test Maturity Model (TMM, TMMi)
• Test Process Improvement model (TPI)
• Test Improvement Model (TIM)
• Minimal Test Practice Framework (MTPF)
• …

Page 91:

Test Maturity Model (TMM)

• Levels
• Maturity goals and sub-goals
  – Scope, boundaries, accomplishments
  – Activities, tasks, responsibilities
• Assessment model
  – Maturity goals
  – Assessment guidelines
  – Assessment procedure

Page 92:

Level 2: Phase Definition

• Institutionalize basic testing techniques and methods
• Initiate a test planning process
• Develop testing & debugging tools

Page 93:

Level 3: Integration

• Control and monitor the testing process
• Integrate testing into the software life-cycle
• Establish a technical training program
• Establish a software test organization

Page 94:

Level 4: Management and Measurement

• Software quality evaluation
• Establish a test management program
• Establish an organization-wide review program

Page 95:

Level 5: Optimizing, Defect Prevention, and Quality Control

• Test process optimization
• Quality control
• Application of process data for defect prevention

Page 96:

Test Tools – by Test Maturity

TMM level 1: Debuggers; Configuration builders; LOC counters
TMM level 2: Test/project planners; Run-time error checkers; Test preparation tools; Coverage analyzers; Cross-reference tools
TMM level 3: Configuration management; Requirements tools; Capture-replay tools; Comparator; Defect tracker; Complexity measurer; Load generators
TMM level 4: Code checkers; Code comprehension tools; Test harness generators; Perform./network analyzers; Simulators/emulators; Test management tools
TMM level 5: Test library tools; Advanced…

Page 97:

Test Improvement Model – TIM

• Key areas
  – Organization
  – Planning and monitoring
  – Test cases
  – Testware
  – Review / Inspection
• Maturity levels
  – Initial
  – Basic
  – Cost-effective
  – Risk thinking
  – Optimizing

Page 98:

Maturity profile (example)

Page 99:

Minimal Test Practice Framework

Category:     | Problem & Experience Reporting | Roles & Organisation  | Verification & Validation | Test Administration                       | Test Planning
Phase 1 (10)  | Define Syntax                  | Define Responsibility | Use Checklists            | Basic Administration of Test Environment  | Test Plan
Phase 2 (20)  | Create System                  | Define Roles          | Perform Walkthroughs      | Test Cases                                |
Phase 3 (30+) | Maintain & Evolve System       | Define Teams          | Perform Inspections       | Risk Management                           | Coordinate Software Quality Assurance

Page 100:

Recommended Textbook Exercises

• Chapter 14 – 2, 4, 5, 6, 9
• Chapter 9 – 2, 3, 4, 5, 8, 12
• Chapter 7 – 2, 3, 6, 8, 9, 11
• Chapter 8 – 2, 3, 6, 7, 9
• Chapter 16 – no exercises

Page 101:

Structure of Lecture 06

• Test Tools

• Test Measurement

• Test Planning & Documentation

• Test Organisation

• Test Process Improvement (TMMi)

• Lab 6

Page 102:

Lab 6 – Automated GUI Testing

Lab 6 (week 33: Apr 25 – Apr 28) – Automated Web-Page Testing (Regression Testing) (10%)

Submission deadlines:
• Monday Lab: Sunday, 01 May, 23:59
• Tuesday Labs: Monday, 02 May, 23:59
• Wednesday Labs: Tuesday, 03 May, 23:59
• Thursday Labs: Wednesday, 04 May, 23:59
• Penalties apply for late delivery: 50% penalty if submitted up to 24 hours late; 100% penalty if submitted more than 24 hours late

Lab 6 Instructions & Tools: Instructions, Sikuli Tool Suite, SUT Spec. V1 & V2, Installation Guide, SUT V1 & V2

Page 103:

Lab 6 – Automated GUI Testing

Materials: Instructions, Installation Guide, Sikuli Tool Suite, SUT Spec. V1, SUT Spec. V2, SUT V1, SUT V2

Page 104:

Lab 6 – Automated GUI Testing

Make a ZIP/RAR file which contains the following:
• Report1.pdf – generated when the autotest ran against version 1 (must contain 8 results)
• Report2.pdf – generated when the autotest ran against version 2 (must contain 8 results)
• Report3.pdf – generated when the autotest ran against version 2, where defects have been fixed in the autotest framework (must contain 9 results)
• Autotest1.java – file with the autotest code (applied to Software1 and Software2 – 1st and 2nd iteration)
• Autotest2.java – file with the autotest code (applied to Software2 – 3rd iteration)
• Resources folder with the pictures
• Info&Feedback.pdf – team members' names (even if you are working alone); constructive feedback about the Lab experience

Use the 'submit' button for Lab 6 on the course wiki.
Note: Please do not submit the whole autotest project (it's too large!)

Page 105:

Lab 6 – Automated GUI Testing

• Preparation – ideally before the Lab starts!
  – Read the Lab 6 instructions
  – Install the required software:
    • Eclipse with Maven integration
    • Java 8 (JRE or JDK)
    • ...
  Details can be found in the installation guide on the course wiki
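
For orientation (a sketch, not part of the lab instructions): Sikuli drives the GUI by matching stored screenshots on screen, so an automated test is essentially a sequence of image-based actions and checks. The image file names and the pass/fail handling below are made up.

import org.sikuli.script.FindFailed;
import org.sikuli.script.Screen;

public class SutSmokeTest {

    public static void main(String[] args) throws FindFailed {
        Screen screen = new Screen();

        // click the control that looks like the stored screenshot (file name is an assumption)
        screen.click("resources/open-file-button.png");

        // type into whatever field currently has focus
        screen.type("test-input.txt");

        // check that the expected result appears on screen within 10 seconds
        if (screen.exists("resources/expected-result.png", 10) == null) {
            System.out.println("FAIL: expected result not visible");
        } else {
            System.out.println("PASS");
        }
    }
}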

Page 106:

Next 2 Weeks

• Lab 6:
  – Automated GUI Testing (Regression Testing)
  – Make sure to install the basic software (Eclipse / Java 8) before the lab starts
• Lecture 7:
  – Industry Guest Lecture (Playtech) – 60 min
    • Kerli Rungi, QA Manager: Do Software Quality Standards matter in Playtech?
  – Exam Preparation – 30 min