MTAT.03.159 / Lecture 06 / © Dietmar Pfahl 2015
MTAT.03.159: Software Testing
Lecture 06: Tools, Metrics and Test Process Improvement / TMMi
(Textbook Ch. 14, 9, 16)
Dietmar Pfahl, email: [email protected]
Spring 2015
Structure of Lecture 06
• Test Tools
• Test Measurement
• Test Process Improvement
• Lab 6
Tools – the Workbench
• Good for
  – repeating tasks
  – organising data
• Requires training
• Must be introduced incrementally
• No “silver bullet”

Evaluation criteria:
• Ease of use
• Power
• Robustness
• Functionality
• Ease of introduction
• Quality of support
• Cost
• Company policies and goals
Test Tools – in the Process
[Diagram: tool categories mapped onto the development process – test management tools; test design tools; static analysis tools; test execution and comparison tools; dynamic analysis tools; debugging tools; coverage tools; performance simulator tools – placed against requirement specification, architectural design, detailed design, code, and unit, integration, system and acceptance test]
Tools categories by purpose
1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support
Tools categories by purpose – 1. Reviews and inspections
• Complexity analysis
  – Identify problem areas
• Code comprehension
  – Show different views of the artefact
• Syntax and semantics analysis
  – Check and warn
Tools categories by purpose – 2. Test planning
• Templates for test plan documentation
• Test schedule and staffing estimates
• Complexity analyser
To a large extent: general project management tools.
Tools categories by purpose – 3. Test design and development
• Test data generator
• Requirements-based test design tool
• Capture/replay
• Coverage analysis
Often integrated with test execution tools.
Tools categories by purpose – 4. Test execution and evaluation
• Test case management
• Capture/replay
• Coverage analysis
• Memory testing (leaks)
• Simulators and performance analysis
  – HW emulators
  – SW simulators (mocking)
  – Load generators
Tools categories by purpose – 5. Test support
• Issue reporting
  – Report, dispatch, follow up
• Configuration management
  – Manage, control and coordinate changes
There is no shortage of Test Tools
From http://www.testingfaqs.org/:
• Defect Tracking (98)
• GUI Test Drivers (71)
• Load and Performance (52)
• Static Analysis (38)
• Test Coverage (22)
• Test Design Tools (24)
• Test Drivers (17)
• Test Implementation (35) – tools that assist with testing at runtime: memory leak checkers, comparators, and a wide variety of others
• Test Case Management (24)
• Unit Test Tools (63)
• 3 further categories of others

Other links to test tool overviews:
• http://www.aptest.com/resources.html
• http://www.softwareqatest.com/qatweb1.html
Test data generator – Generatedata.com
An online service for generating test data.
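The principle behind such a generator can be sketched in a few lines. This is a hypothetical illustration only (fixed value pools and a seeded random generator are made up for the example), not Generatedata.com's implementation:

```java
import java.util.Random;

// Toy test-data generator: produce rows of plausible values from simple rules.
public class DataGen {
    private static final String[] FIRST = {"Pekka", "Teemu", "Anna", "Mari"};
    private static final String[] LAST  = {"Pukaro", "Tekno", "Tamm", "Kask"};

    // Generate one CSV row: first name, last name, a 6-7 digit number.
    static String row(Random rnd) {
        return FIRST[rnd.nextInt(FIRST.length)] + ","
             + LAST[rnd.nextInt(LAST.length)] + ","
             + (100000 + rnd.nextInt(9900000));
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);   // fixed seed makes the data reproducible
        for (int i = 0; i < 3; i++) System.out.println(row(rnd));
    }
}
```

Real generators let you pick column types and constraints; the fixed seed here shows one useful trick — reproducible "random" data for regression runs.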
Output of the previous input field. Advanced tools allow for more complex specification of data generation rules.
LinkScan tool to detect broken links
http://www.elsop.com/linkscan/docs/links00.html
LinkScan tool to detect broken links
Plan: select the web page to be scanned.
LinkScan tool to detect broken links
Scan: recursively crawl the web page and automatically try every link.
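The scan step amounts to extracting every link from a page and then probing each one. A minimal sketch of the extraction half (an assumed regex-based approach for illustration, not LinkScan's code — a crawler would then issue an HTTP request per link and report the failures):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Pull every href value out of an HTML page.
public class LinkExtractor {
    private static final Pattern HREF =
        Pattern.compile("href=[\"']([^\"']+)[\"']", Pattern.CASE_INSENSITIVE);

    static List<String> links(String html) {
        List<String> out = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) out.add(m.group(1));   // group(1) = the URL itself
        return out;
    }

    public static void main(String[] args) {
        String page = "<a href='a.html'>A</a> <a href=\"http://example.com\">B</a>";
        System.out.println(links(page)); // [a.html, http://example.com]
    }
}
```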
LinkScan tool to detect broken links
Exam: show a summary report of broken links and the locations of the broken links.
Functional (Web-)Testing Approaches
1. Recorded Scripts
2. Engineered Scripts
3. Data-driven Testing
4. Keyword-driven Testing
5. Model-based Testing

| First | Last   | Data    |
|-------|--------|---------|
| Pekka | Pukaro | 1244515 |
| Teemu | Tekno  | 587245  |
Recorded Scripts
• Unstructured
• Scripts generated using capture-and-replay tools
• Relatively quick to set up
• Mostly used for regression testing
• Scripts non-maintainable in practice
  – If the system changes, they need to be captured again
• Capture/replay tools
  – Record the user’s actions (keyboard, mouse) to a script
• Tool-specific scripting language
  – Scripts access the (user) interface of the software
    • Input fields, buttons and other widgets
  – Simple checks can be created in the scripts
    • Existence of texts and objects in the UI
    • Data of GUI objects
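A recording is essentially a linear list of (command, target, value) steps replayed verbatim. The sketch below illustrates that structure with a toy "UI" (just a map of field values); the step format is assumed for illustration and is not any tool's actual file format:

```java
import java.util.HashMap;
import java.util.Map;

// Replay a recorded script: each step is {command, target, value}.
public class Replayer {
    static boolean replay(String[][] steps, Map<String, String> page) {
        for (String[] s : steps) {
            switch (s[0]) {
                case "type":              page.put(s[1], s[2]); break;  // fill a field
                case "click":             /* no state change in this toy UI */ break;
                case "verifyTextPresent": if (!page.containsValue(s[1])) return false; break;
            }
        }
        return true;   // every verification passed
    }

    public static void main(String[] args) {
        String[][] recording = {
            {"type", "username", "demo"},
            {"type", "password", "demo"},
            {"click", "login", ""},
            {"verifyTextPresent", "demo", ""},   // e.g. checking 'Welcome demo'
        };
        System.out.println(replay(recording, new HashMap<>())); // true
    }
}
```

Note why such scripts are brittle: the steps name concrete targets, so any UI change invalidates the recording.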
Recorded Scripts – Example with Selenium IDE
Some web application to be tested (http://opensource.demo.orangehrm.com), with the Record button switched on.
Recorded Scripts – Example with Selenium IDE (cont.)
Make sure the Record button is ON, then:
• Open the base URL (http://opensource.demo.orangehrm.com) in the browser
• Log in using the values Login Name: demo, Password: demo
• Click the ‘Login’ button
Recorded Scripts – Example with Selenium IDE (cont.)
Next actions:
• Highlight the ‘Welcome demo’ text
• Verify that the text is present (command: verifyTextPresent)
• Click ‘Logout’
• … then stop recording
Recorded Scripts – Example with Selenium IDE (cont.)
With the Record button switched off, the recorded steps form a test case:
• Open the base URL in the browser
• Log in using the values Login Name: demo, Password: demo
• Click the ‘Login’ button
• Highlight the ‘Welcome demo’ text and verify that the text is present
• Click ‘Logout’

A Test Suite consists of test cases (TC1, TC2, …); a Test Case (= test scenario) consists of several test steps (Step1, Step2, …). Each step combines a Selenium command, a target location on the web page (which may use xpath, css, id, field, etc.) and data (values). Test cases can be saved and exported into several programming languages (Java, Python, C#, etc.), and the tests can be run (replayed).
Engineered Scripts
• Scripts are well-designed (following a systematic approach), modular, robust, documented, and maintainable
• Separation of common tasks
  – E.g. setup, cleanup/teardown, and defect detection
• Test data is still embedded in the scripts
  – One driver script per test case
• Test code is mostly written manually
• Implementation and maintenance require programming skills which testers (test engineers) might not have
• “Just like any other software development project”
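The shape of an engineered script — setup, the check itself, and teardown cleanly separated — can be sketched offline. The class and method names below are made up, and the percent computation stands in for a live browser session against the percent-calculator page used in the following example:

```java
// Offline sketch of an engineered script's structure.
public class PercentCalculatorTest {
    static String log = "";

    static void setUp()    { log += "open;"; }    // e.g. start browser, open URL
    static void tearDown() { log += "close;"; }   // e.g. log out, quit browser

    // The behaviour the Selenium/TestNG example verifies: 10% of 50 is 5.
    static double percent(double pct, double of) { return of * pct / 100.0; }

    static boolean testTenPercentOfFifty() {
        setUp();
        try {
            return percent(10, 50) == 5.0;
        } finally {
            tearDown();   // cleanup runs whether the check passes or not
        }
    }

    public static void main(String[] args) {
        System.out.println(testTenPercentOfFifty() ? "PASS" : "FAIL"); // PASS
    }
}
```

The try/finally split is the point: common tasks (setup, teardown) are written once and shared, which is what makes these scripts maintainable where raw recordings are not.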
Engineered Scripts – Example with Selenium / TestNG
Engineered Scripts – Example with Selenium / TestNG (cont.)
The element locators use xpath notation.
Engineered Scripts – Example with Selenium / TestNG (cont.)
Click on ‘Percent Calculator’.
Engineered Scripts – Example with Selenium / TestNG (cont.)
Enter ‘10’ as the first number and ‘50’ as the second number, click on ‘Calculate’, and read the result with getText() – ‘5’.
Engineered Scripts – Example with Selenium / TestNG (cont.)
Check whether the result equals ‘5’ (10% of 50).
Data-Driven Testing
• Data-driven testing = executing the same set of test steps with multiple (different) data
• Test inputs and expected outcomes are stored as data
  – Typically in tabular format
• Test data are read from an external data source
• One driver script can execute all of the designed test cases
Note that in the previous example the test data (‘10’ and ‘50’) was embedded in the test case definition.

Example data (new): imdb film database, employees.

| First | Last   | Data    |
|-------|--------|---------|
| Pekka | Pukaro | 1244515 |
| Teemu | Tekno  | 587245  |
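The "one driver script, many data rows" idea can be shown in miniature. This is an illustrative sketch (the percent check and the inline table are assumptions; a real driver would read the table from an external file, as the next slides do with Excel):

```java
// Minimal data-driven driver: one parameterised script body, many data rows.
public class DataDrivenDriver {
    // The reusable test steps, parameterised by a data row.
    static boolean runCase(double pct, double of, double expected) {
        return of * pct / 100.0 == expected;
    }

    public static void main(String[] args) {
        double[][] table = {        // pct, of, expected outcome
            {10, 50, 5},
            {25, 200, 50},
            {50, 8, 4},
        };
        for (double[] row : table)
            System.out.println(runCase(row[0], row[1], row[2]) ? "PASS" : "FAIL");
    }
}
```

Each row carries both the inputs and the expected outcome, so adding a test case means adding a row, not writing code.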
Data-Driven Testing
The test class below defines where to find the test data (using the Java Excel API) and runs the same test steps for every data row:

```java
public class DataProviderExample extends SeleneseTestCase {

    @BeforeClass
    public void setUp() throws Exception {
        …
    }

    @DataProvider(name = "DP1")
    public Object[][] createData1() throws Exception {
        Object[][] retObjArr = getTableArray("test\\Resources\\Data\\data1.xls",
                                             "DataPool", "imdbTestData1");
        return (retObjArr);
    }

    @Test(dataProvider = "DP1")
    public void testDataProviderExample(String movieTitle, String directorName,
            String moviePlot, String actorName) throws Exception {
        // enter the movie title
        selenium.type("q", movieTitle);
        // they keep switching the go button to keep the bots away
        if (selenium.isElementPresent("nb15go_image"))
            selenium.click("nb15go_image");
        else
            selenium.click("xpath=/descendant::button[@type='submit']");
        selenium.waitForPageToLoad("30000");
        // click on the movie title in the search result page
        selenium.click("xpath=/descendant::a[text()='" + movieTitle + "']");
        selenium.waitForPageToLoad("30000");
        // verify director name is present in the movie details page
        verifyTrue(selenium.isTextPresent(directorName));
        // verify movie plot is present in the movie details page
        verifyTrue(selenium.isTextPresent(moviePlot));
        // verify movie actor name is present in the movie details page
        verifyTrue(selenium.isTextPresent(actorName));
    }

    @AfterClass ...
```
Data-Driven Testing
The helper method getTableArray(...) reads the data from the data table:

```java
public String[][] getTableArray(String xlFilePath, String sheetName, String tableName) {
    String[][] tabArray = null;
    try {
        Workbook workbook = Workbook.getWorkbook(new File(xlFilePath));
        Sheet sheet = workbook.getSheet(sheetName);
        int startRow, startCol, endRow, endCol, ci, cj;
        Cell tableStart = sheet.findCell(tableName);
        startRow = tableStart.getRow();
        startCol = tableStart.getColumn();
        Cell tableEnd = sheet.findCell(tableName, startCol + 1, startRow + 1, 100, 64000, false);
        endRow = tableEnd.getRow();
        endCol = tableEnd.getColumn();
        System.out.println("startRow=" + startRow + ", endRow=" + endRow + ", " +
                           "startCol=" + startCol + ", endCol=" + endCol);
        tabArray = new String[endRow - startRow - 1][endCol - startCol - 1];
        ci = 0;
        for (int i = startRow + 1; i < endRow; i++, ci++) {
            cj = 0;
            for (int j = startCol + 1; j < endCol; j++, cj++) {
                tabArray[ci][cj] = sheet.getCell(j, i).getContents();
            }
        }
    }
    catch (Exception e) {
        System.out.println("error in getTableArray()");
    }
    return (tabArray);
}
```
Data-Driven Testing
• External test data can be edited without programming skills
• Test design and framework implementation are now separate tasks:
  – the design task can be given to someone with domain knowledge (business people, customers), and
  – the framework implementation to someone with programming skills
• Avoids the problems of embedded test data
  – Data are hard to understand in the middle of all the scripting details
  – Updating tests or creating similar tests with slightly different test data types/structures always requires programming (→ copy-paste scripting)
• Follow this link for a fully worked example of data-driven testing with Selenium:
http://functionaltestautomation.blogspot.com/2009/10/dataprovider-data-driven-testing-with.html
Keyword-Driven Testing
• Keywords are also known as action words
• Keyword-driven testing improves on data-driven testing:
  – Keywords abstract the navigation and actions out of the script
  – Keywords and test data are read from an external data source
• When test cases are executed, the keywords are interpreted by a test library (= a set of test scripts), which is called by a test automation framework
• Example 1:
  Login: admin, t5t56y // 2 args
  AddCustomers: newCustomers.txt // 1 arg
  RemoveCustomer: Pekka Pukaro // 1 arg
• Example 2: click, checkbox, undo, …
• More keywords (= action words) can be defined based on existing keywords
• Keyword-driven testing ~= domain-specific languages (DSL)
• Details: http://doc.froglogic.com/squish/4.1/all/how.to.do.keyword.driven.testing.html
• Another tool: http://code.google.com/p/robotframework/
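A keyword handler like the one behind Example 1 can be sketched as a small dispatcher. The keywords and the trace-building "test library" below are invented for illustration; a real framework would call actual test scripts instead of appending to a string:

```java
import java.util.List;

// Toy keyword interpreter: each line is "Keyword: arg1, arg2, ...".
public class KeywordRunner {
    static String trace = "";   // records which library functions were invoked

    static void execute(List<String> lines) {
        for (String line : lines) {
            String[] kv = line.split(":", 2);
            String keyword = kv[0].trim();
            String[] args = kv[1].trim().split("\\s*,\\s*");
            switch (keyword) {   // dispatch to the test library
                case "Login":          trace += "login(" + args[0] + ");"; break;
                case "RemoveCustomer": trace += "remove(" + args[0] + ");"; break;
                default: throw new IllegalArgumentException("unknown keyword " + keyword);
            }
        }
    }

    public static void main(String[] args) {
        execute(List.of("Login: admin, t5t56y", "RemoveCustomer: Pekka Pukaro"));
        System.out.println(trace); // login(admin);remove(Pekka Pukaro);
    }
}
```

The test file speaks the domain's vocabulary (Login, RemoveCustomer) while the dispatch table hides the navigation details — which is why keyword-driven testing resembles a small DSL.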
Keyword-Driven Testing
A keyword-driven test data file adds one level of abstraction. The approach is similar to data-driven testing, but instead of calling the testing functions directly, a handler looks for keywords and from these derives the required test data. Purpose: increase flexibility and reusability.
Keyword-Driven Testing
Several layers of keywords are possible. Benefit: new keywords can be defined using existing ones.
Architecture of a Keyword-Driven Framework
Pekka Laukkanen. Data-Driven and Keyword-Driven Test Automation Frameworks. Master’s Thesis, Helsinki University of Technology, 2006.
Model-based Testing
[Figure: UML model of the system under test]
Model-based Testing
• The system under test is modelled
  – UML state machines, domain-specific languages (DSL)
• Test cases are automatically generated from the model
  – The model can also provide the expected results for the generated test cases
  – A more accurate model -> better test cases
• Generate a large number of tests that cover the model
  – There are many different criteria for covering the model
  – Execution time of the test cases might be a factor
• Challenges:
  – Personnel competencies
  – Data-intensive systems (cannot easily be modelled as a state machine)
• A simple MBT tool: http://graphwalker.org/
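Generating tests from a model can be shown in miniature with the simplest coverage criterion: one abstract test per transition ("all transitions" coverage). The login state machine below is an assumed example model, not taken from any tool:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model-based test generation over a state machine.
public class MbtSketch {
    // Each transition: {fromState, action, expectedToState}.
    static List<String> generateTests(String[][] model) {
        List<String> tests = new ArrayList<>();
        for (String[] t : model)   // one abstract test step per transition
            tests.add("in " + t[0] + ": do " + t[1] + ", expect " + t[2]);
        return tests;
    }

    public static void main(String[] args) {
        String[][] login = {
            {"LoggedOut", "login", "LoggedIn"},
            {"LoggedIn", "logout", "LoggedOut"},
        };
        for (String test : generateTests(login)) System.out.println(test);
    }
}
```

Note that the model supplies the oracle for free: the expected target state of each transition is the expected result of the generated test.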
Functional Testing Approaches
1. Recorded Scripts – cheap to set up, quick & dirty
2. Engineered Scripts – structured
3. Data-driven Testing – data separation
4. Keyword-driven Testing – action separation, DSL
5. Model-based Testing – modelling & automatic test case generation
What can be automated?
• Test generation
– Test case (steps with data & oracle)
– Test data (input)
– Test oracle (expected output)
– Test verdict (PASS/FAIL decision)
• Test execution
– E.g., regression testing
• Test reporting
• Debugging
– Fault localisation (using failure/error information)
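The separation between test data, oracle, and verdict listed above can be made concrete in a few lines. A deliberately tiny sketch — the squaring function and its "specification" are made up for illustration:

```java
// Data, oracle, and verdict as three separate automatable pieces.
public class Verdict {
    static int sut(int x) { return x * x; }       // stand-in system under test
    static int oracle(int x) { return x * x; }    // expected output (a spec we trust)

    // Verdict: compare actual output against the oracle for one input.
    static String verdict(int x) {
        return sut(x) == oracle(x) ? "PASS" : "FAIL";
    }

    public static void main(String[] args) {
        System.out.println(verdict(7)); // PASS
    }
}
```

The hard part in practice is the oracle: generating inputs is easy, but knowing the expected output for each generated input (the "oracle problem" mentioned later) is what limits full automation.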
What can be automated? (cont.)
In addition to the items above: test selection. Often, people mean automated test execution when they talk about test automation.
What to automate first?
• The most important tests
• A set of breadth tests (sample each system area overall)
• Tests for the most important functions
• Tests that are easiest to automate
• Tests that will give the quickest payback
• Tests that are run the most often
When to automate?
Test automation should be used after considering the following:
• Large and critical projects
• Projects that require testing the same areas frequently
• Requirements that are not changing frequently
• Exercising the application for load and performance with many virtual users
• Stable software
• Availability of time/effort (for set-up, execution, maintenance, etc.)
Test automation promises
1. Efficient regression test
2. Run tests more often
3. Perform difficult tests (e.g. load, outcome check)
4. Better use of resources
5. Consistency and repeatability
6. Reuse of tests
7. Earlier time to market
8. Increased confidence
Common problems
1. Unrealistic expectations
2. Poor testing practice
”Automatic chaos just gives faster chaos”
3. Expected effectiveness
4. False sense of security
5. Maintenance of automatic tests
6. Technical problems (e.g. Interoperability)
7. Organizational issues
What can be automated?
Test activities range from intellectual and performed once to clerical and repeated:
1. Identify
2. Design
3. Build
4. Execute
5. Check
The later, clerical and repeated activities (execute, check) are the best candidates for automation.
Limits of automated testing
• Does not replace manual testing
• Manual tests find more (new) defects than automated tests
– Automation improves efficiency, not effectiveness
• Greater reliance on the quality of the tests
– Oracle problem: expected outcomes must be specified precisely
• Test automation may constrain software development
– Costs of maintaining automated tests when the software changes
Structure of Lecture 06
• Test Tools
• Test Measurement
• Test Process Improvement
• Lab 6
Test Management
• Monitoring (or tracking)
– Check status
– Reports
– Metrics
• Controlling
– Corrective actions
How good are we at testing?

Test quality   Product quality   Faults found
good           poor              many faults
good           good              few faults   – Are we here?
poor           good              few faults
poor           poor              few faults   – Or are we here?

Finding few faults is ambiguous: it may mean a good product well tested, or a poor product poorly tested.
Purpose of Measurement
• Test monitoring – check the status
• Test controlling – corrective actions
• Plan new testing
• Measure and analyze results
– The benefit/profit of testing
– The cost of testing
– The quality of testing
– The quality of the product
– Basis for improvement, not only of the test process
Test Monitoring
• Status
– Coverage metrics
– Test case metrics: development and execution
– Test harness development
• Efficiency / Cost metrics
– How much time have we spent?
– How much money/effort have we spent?
• Failure / Fault metrics
– How much have we accomplished?
– What is the quality status of the software?
• Effectiveness metrics
– How effective are the testing techniques at detecting defects?

(Side diagram: metrics feed estimation, cost tracking, and the decision whether to stop testing.)
Selecting the right metrics
• What is the purpose of the collected data?
– What kinds of questions can they answer?
– Who will use the data?
– How is the data used?
• Who needs the data, and when?
– Which forms and tools are used to collect the data?
– Who will collect them?
– Who will analyse the data?
– Who has access to the data?
Measurement Basics
Basic data:
• Time and Effort (calendar- and staff-hours)
• Failures / Faults
• Size / Functionality
Basic rule:
• Feedback to origin
• Use the data or don't measure
Test metrics: Development status
• Test case development status
– Planned
– Available
– Unplanned (not planned for, but
needed)
• Test harness development status
– Planned
– Available
– Unplanned
Test metrics: Coverage
What?
• % statements covered
• % branches covered
• % data flow covered
• % requirements covered
• % equivalence classes covered
Why?
• Track completeness of testing
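A coverage percentage is simply the number of exercised items over the total; a minimal sketch of the computation (the class name and counts are illustrative, not taken from any particular coverage tool):

```java
// Minimal sketch: turning raw coverage counts into the percentages above.
public class CoverageMetrics {

    // % of items (statements, branches, requirements, ...) exercised by the suite
    public static double percentCovered(int covered, int total) {
        if (total <= 0 || covered < 0 || covered > total) {
            throw new IllegalArgumentException("invalid counts");
        }
        return 100.0 * covered / total;
    }

    public static void main(String[] args) {
        // e.g. 120 of 150 statements executed by at least one test case
        System.out.println(percentCovered(120, 150) + "% statements covered"); // 80.0
    }
}
```

The same helper applies to any of the coverage metrics above; only the counted items change.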
Test metrics: Test execution status
What?
• # failures/hour
• # executed tests
• Requirements
coverage
Why?
• Track progress of the test project
• Decide when stopping criteria are met
Test metrics: Size/complexity/length
What?
• Size/Length – LOC
• Functionality – Function Points
• Complexity – McCabe
• Difficulty – Halstead
• Cohesion, Coupling, ...
Why?
• Estimate test effort
Test metrics: Efficiency
What?
• # faults/hour
• # faults/test case
Why?
• Evaluate efficiency of V&V activities
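Both efficiency figures are simple ratios; a hedged sketch with made-up numbers (names are ours, not from the textbook):

```java
// Illustrative efficiency metrics from the slide above.
public class TestEfficiency {

    public static double faultsPerHour(int faults, double effortHours) {
        if (effortHours <= 0) throw new IllegalArgumentException("effortHours must be > 0");
        return faults / effortHours;
    }

    public static double faultsPerTestCase(int faults, int testCases) {
        if (testCases <= 0) throw new IllegalArgumentException("testCases must be > 0");
        return (double) faults / testCases;
    }

    public static void main(String[] args) {
        // 18 faults found in 120 staff-hours across 240 executed test cases
        System.out.println(faultsPerHour(18, 120.0));   // 0.15 faults/hour
        System.out.println(faultsPerTestCase(18, 240)); // 0.075 faults/test case
    }
}
```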
Test metrics: Faults/Failures
(Trouble reports)
What?
• # faults/size
• repair time
• root cause
Why?
• Monitor quality
• Monitor efficiency
• Improve
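The "# faults/size" figure is usually normalised to defect density per KLOC; an illustrative sketch (class name and counts are hypothetical):

```java
// Sketch of defect density (# faults / size), a common trouble-report metric.
public class DefectDensity {

    // faults per thousand lines of code (KLOC)
    public static double faultsPerKloc(int faults, int linesOfCode) {
        if (linesOfCode <= 0) throw new IllegalArgumentException("linesOfCode must be > 0");
        return faults / (linesOfCode / 1000.0);
    }

    public static void main(String[] args) {
        // 45 faults reported against a 30,000-LOC component
        System.out.println(faultsPerKloc(45, 30000) + " faults/KLOC"); // 1.5
    }
}
```

Tracking this per component over releases supports both the quality-monitoring and root-cause goals above.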
Test metrics: Effectiveness
What?
• % found faults per phase
• % missed faults
Why?
• Evaluate effectiveness of V&V activities
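One common way to operationalise these two percentages is the defect detection percentage (DDP); the sketch below assumes "missed" faults are those that surface in later phases or in the field, and uses hypothetical counts:

```java
public class Effectiveness {

    // DDP: share of faults present during a phase that the phase actually caught
    public static double detectionPercentage(int foundInPhase, int missedInPhase) {
        int present = foundInPhase + missedInPhase;
        if (present <= 0) throw new IllegalArgumentException("no faults to account for");
        return 100.0 * foundInPhase / present;
    }

    public static void main(String[] args) {
        // system test found 40 faults; 10 more slipped through to the field
        System.out.println(detectionPercentage(40, 10) + "% DDP"); // 80.0
    }
}
```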
When to stop testing?
• All planned tests are executed and passed
• All coverage goals met (requirements, code, ...)
• A specified number of failures has been detected
• The failure-detection rate has fallen below a specified level
• Fault seeding ratios are favourable
• Reliability above a certain value
• Cost has reached the limit
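The fault-seeding criterion can be made concrete with a capture-recapture style estimate; a minimal sketch, assuming seeded faults are as likely to be found as real ones (names and numbers are illustrative):

```java
public class FaultSeeding {

    // If testing rediscovers seededFound of totalSeeded injected faults while
    // finding realFound genuine faults, the estimated total number of real
    // faults is realFound * totalSeeded / seededFound.
    public static double estimatedRealFaults(int totalSeeded, int seededFound, int realFound) {
        if (seededFound <= 0 || seededFound > totalSeeded) {
            throw new IllegalArgumentException("invalid seeding counts");
        }
        return (double) realFound * totalSeeded / seededFound;
    }

    public static void main(String[] args) {
        // 25 faults seeded; testing rediscovered 20 of them and found 40 real faults
        double estimate = estimatedRealFaults(25, 20, 40); // 50.0 real faults estimated
        System.out.println("estimated real faults: " + estimate
                + ", estimated remaining: " + (estimate - 40)); // ~10 remaining
    }
}
```

A seeding ratio is "favourable" when most seeded faults have been rediscovered, i.e. the estimated number of remaining real faults is small.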
Structure of Lecture 06
• Test Tools
• Test Measurement
• Test Process Improvement
• Lab 6
Process quality and product quality
• Quality in the process → quality in the product
• Project:
– an instantiated process
• ISO 25000:
– Process quality contributes to improving product quality, which in turn contributes to improving quality in use
(Diagram: Process → Project → Product)
Process improvement models vs
Test Process improvement models
• (Integrated) Capability maturity model (CMM, CMMI)
• Software process improvement and capability determination (SPICE)
• ISO 9001, Bootstrap, …
Test Process Improvement Models:
• Test maturity model (TMM)
• Test process improvement model (TPI)
• Test improvement model (TIM)
• Minimal Test Practice Framework (MTPF)
• …
Test Maturity Model (TMM)
• Levels
• Maturity goals and sub-goals
– Scope, boundaries, accomplishments
– Activities, tasks, responsibilities
• Assessment model
– Maturity goals
– Assessment guidelines
– Assessment procedure
Level 2: Phase Definition
• Institutionalize basic testing techniques and
methods
• Initiate a test planning process
• Develop testing and debugging goals
Level 3: Integration
• Control and monitor the testing process
• Integrate testing into the software life cycle
• Establish a technical training program
• Establish a software test organization
Level 4: Management and Measurement
• Software quality evaluation
• Establish a test management program
• Establish an organization-wide review program
Level 5: Optimizing, Defect Prevention, and Quality Control
• Test process optimization
• Quality control
• Application of process data for defect prevention
Test Tools – by Test Maturity

TMM level 1: Debuggers; Configuration builders; LOC counters
TMM level 2: Test/project planners; Run-time error checkers; Test preparation tools; Coverage analyzers; Cross-reference tools
TMM level 3: Configuration management; Requirements tools; Capture-replay tools; Comparators; Defect trackers; Complexity measurers; Load generators
TMM level 4: Code checkers; Code comprehension tools; Test harness generators; Performance/network analyzers; Simulators/emulators; Test management tools
TMM level 5: Test library tools; Advanced ...
Test Improvement Model – TIM
• Key areas
– Organization
– Planning and monitoring
– Test cases
– Testware
– Review / Inspection
• Maturity levels
– Initial
– Basic
– Cost-effective
– Risk thinking
– Optimizing
Maturity profile (example)
Minimal Test Practice Framework

Practices by category, introduced in three phases (phase number ≈ organisation size: 10, 20, 30+ people):

• Problem & Experience Reporting: Define Syntax (1) → Create System (2) → Maintain & Evolve System (3)
• Roles & Organisation: Define Responsibility (1) → Define Roles (2) → Define Teams (3)
• Verification & Validation: Use Checklists (1) → Perform Walkthroughs (2) → Perform Inspections (3)
• Test Administration: Basic Administration of Test Environment (1) → Test Cases (2) → Risk Management (3)
• Test Planning: Test Plan (1) → Coordinate Software Quality Assurance (3)
Recommended
Textbook Exercises
• Chapter 14
– 2, 4, 5, 6, 9
• Chapter 9
– 2, 3, 4, 5, 8, 12
• Chapter 16
– No exercises
Structure of Lecture 06
• Test Tools
• Test Measurement
• Test Process Improvement
• Lab 6
Lab 6 – Automated GUI Testing

Lab 6 (week 33: Apr 27 – Apr 30) – Automated Web-Page Testing (Regression Testing) (10%)

Submission deadlines:
• Monday lab: Sunday, 03 May, 23:59
• Tuesday labs: Monday, 04 May, 23:59
• Wednesday labs: Tuesday, 05 May, 23:59
• Thursday labs: Wednesday, 06 May, 23:59
Penalties apply for late delivery: 50% penalty if submitted up to 24 hours late; 100% penalty if submitted more than 24 hours late.

Materials on the course wiki: Lab 6 instructions, Sikuli Tool Suite, SUT Spec. V1 & V2, installation guide, SUT V1 & V2
Lab 6 – Automated GUI Testing
• What to submit?
– Info.pdf – team members' names and student IDs (even if you are working alone)
– Report1.pdf – generated when the autotest ran against SUT version 1 (must contain 8 results)
– Report2.pdf – generated when the autotest ran against SUT version 2 (must contain 8 results)
– Report3.pdf – generated when the autotest ran against SUT version 2, after the defects have been fixed in the autotest framework (must contain 9 results)
– Autotest1.java – file with autotest code 1
– Autotest2.java – file with autotest code 2
– Resources folder with the pictures
• How to submit?
– Create an archive and submit using the 'submit' button for Lab 6 on the course wiki
– Note: please do not submit the whole autotest project (it's too large!)
Lab 6 – Automated GUI Testing
• Preparation – ideally before the lab starts!
– Read the Lab 6 instructions
– Install the required software:
• Eclipse with Maven integration
• Java 8 (JRE or JDK)
• ...
Details can be found in the installation guide on the course wiki
Lab 6 – Automated GUI Testing
• Feedback (your Lab 6 experience)
– This lab package is part of a BSc thesis project
– You can get one bonus mark if you answer all questions in the questionnaire provided here (SurveyMonkey): https://www.surveymonkey.com/s/PPFCF9P
– Submission deadline is the same as for the lab reports
– The questionnaire must be filled in individually
– Make sure to identify yourself (name & student ID)
Next 2 Weeks
• Lab 6:
– Automated GUI Testing (Regression Testing)
– Make sure to install the basic software (Eclipse / Java 8) before the lab starts
• Lecture 7:
– Industry Guest Lecture (Playtech) – 60 min
• Kerli Rungi, QA Manager: Testing Practices in Playtech Estonia
– Exam Preparation – 30 min