Automated Testing Plan


Generic Project

Prepared for 

Project Name

Prepared by

Company Name

Date

July 17, 2001

© 2001, COMPANY NAME. All rights reserved. This documentation is the confidential and proprietary intellectual property of COMPANY NAME. Any unauthorized use, reproduction, preparation of derivative works, performance, or display of this document, or software represented by this document, without the express written permission of COMPANY NAME, is strictly prohibited. COMPANY NAME and the COMPANY NAME logo design are trademarks and/or service marks of an affiliate of COMPANY NAME. All other trademarks, service marks, and trade names are owned by their respective companies.



Document Revision Information

The following information is to be included with all versions of the document.

Project Name:                         Project Number:

Prepared by:                          Date Prepared:

Revised by:                           Date Revised:
Revision Reason:                      Revision Control No.:

Revised by:                           Date Revised:
Revision Reason:                      Revision Control No.:

Revised by:                           Date Revised:
Revision Reason:                      Revision Control No.:


Document Approval

This signature page is to indicate approval from the COMPANY NAME sponsor and the Client sponsor for the attached Abacus Detail Test Plan for the PROJECT NAME. All parties have reviewed the attached document and agree with its contents.

COMPANY NAME Project Manager:

Name, Title: Project Manager, PROJECT NAME

Date

CUSTOMER Project Manager:

Name, Title:

Date

COMPANY NAME/DEPARTMENT Sponsor:

Name, Title:

Date

COMPANY NAME Sponsor:

Name, Title:

Date

CUSTOMER NAME Sponsor:

Name, Title:

Date

COMPANY NAME Manager:

Name, Title:

Date


Table of Contents

1 Introduction .................................................................. vi
    1.1 Automated Testing DTP Overview .......................................... vi
2 Test Description .............................................................. vii
    2.1 Test Identification ..................................................... vii
    2.2 Test Purpose and Objectives ............................................. vii
    2.3 Assumptions, Constraints, and Exclusions ................................ vii
    2.4 Entry Criteria .......................................................... viii
    2.5 Exit Criteria ........................................................... viii
    2.6 Pass/Fail Criteria ...................................................... viii
3 Test Scope .................................................................... x
    3.1 Items to be Tested by Automation ........................................ x
    3.2 Items Not to be Tested by Automation .................................... x
4 Test Approach ................................................................. xi
    4.1 Description of Approach ................................................. xi
5 Test Definition ............................................................... xii
    5.1 Test Functionality Definition (Requirements Testing) .................... xii
    5.2 Test Case Definition (Test Design) ...................................... xii
    5.3 Test Data Requirements .................................................. xii
    5.4 Automation Recording Standards .......................................... xii
    5.5 WinRunner Menu Settings ................................................. xiii
    5.6 WinRunner Script Naming Conventions ..................................... xiii
    5.7 WinRunner GUI Map Naming Conventions .................................... xiii
    5.8 WinRunner Result Naming Conventions ..................................... xiv
    5.9 WinRunner Report Naming Conventions ..................................... xiv
    5.10 WinRunner Script, Result, and Report Repository ........................ xiv
    5.11 Test Environment ....................................................... xv
6 Test Preparation Specifications ............................................... xv
    6.1 Test Team Roles and Responsibilities .................................... xvii
    6.2 Test Team Training Requirements ......................................... xviii
    6.3 Automation Test Preparation ............................................. xviii
    6.4 Issues .................................................................. xix
7 Test Issues and Risks ......................................................... xix
    7.1 Risks ................................................................... xix
8 Appendices .................................................................... xx
    8.1 Traceability Matrix ..................................................... xx
    8.2 Definitions for Use in Testing .......................................... 22
        8.2.1 Test Requirement .................................................. 22
        8.2.2 Test Case ......................................................... 22
        8.2.3 Test Procedure .................................................... 22
    8.3 Automated Test Cases .................................................... 23
        8.3.1 NAME OF FUNCTION Test Case ........................................ 23
    8.4 Glossary Reference ...................................................... 25
    8.5 Sample Addresses for Testing ............................................ 26
    8.6 Test Credit Card Numbers ................................................ 27
9 Project Glossary .............................................................. 25

1 Introduction

1.1 Automated Testing DTP Overview

This Automated Testing Detail Test Plan (ADTP) will identify the specific tests that are to be performed to ensure the quality of the delivered product. System/Integration Test ensures the product functions as designed and that all parts work together. This ADTP will cover information for Automated testing during the System/Integration Phase of the project and will map to the specification or requirements documentation for the project. This mapping is done in conjunction with the Traceability Matrix document, which should be completed along with the ADTP and is referenced in this document.

This ADTP refers to the specific portion of the product known as PRODUCT NAME. It provides clear entry and exit criteria, and the roles and responsibilities of the Automated Test Team are identified such that the team can execute the test.


The objectives of this ADTP are:

• Describe the test to be executed.

• Identify and assign a unique number for each specific test.

• Describe the scope of the testing.

• List what is and is not to be tested.

• Describe the test approach detailing methods, techniques, and tools.

• Outline the Test Design, including:
– Functionality to be tested.
– Test Case Definition.
– Test Data Requirements.

• Identify all specifications for preparation.

• Identify issues and risks.

• Identify actual test cases.

• Document the design point or requirement tested for each test case as it is developed.

2 Test Description

2.1 Test Identification

This ADTP is intended to provide information for System/Integration Testing for the PRODUCT NAME module of the PROJECT NAME. The test effort may be referred to by its PROJECT REQUEST (PR) number and its project title for tracking and monitoring of the testing progress.

2.2 Test Purpose and Objectives

Automated testing during the System/Integration Phase as referenced in this document is intended to ensure that the product functions as designed, directly from customer requirements. The testing goal is to identify the quality of the structure, content, accuracy and consistency, some response times and latency, and performance of the application as defined in the project documentation.

2.3 Assumptions, Constraints, and Exclusions

Factors that may affect the automated testing effort and may increase the risk associated with the success of the test include:

• Completion of development of front-end processes


• Completion of design and construction of new processes

• Completion of modifications to the local database

• Movement or implementation of the solution to the appropriate testing or production environment

• Stability of the testing or production environment

• Load Discipline

• Maintaining recording standards and automated processes for the project

• Completion of manual testing through all applicable paths to ensure that reusable automated scripts are valid

2.4 Entry Criteria

The ADTP is complete, excluding actual test results. The ADTP has been signed off by appropriate sponsor representatives, indicating consent to the plan for testing.

The Problem Tracking and Reporting tool is ready for use. The Change Management and Configuration Management rules are in place.

The environment for testing, including databases, application programs, and connectivity, has been defined, constructed, and verified.

2.5 Exit Criteria

In establishing the exit/acceptance criteria for Automated Testing during the System/Integration Phase of the test, the Project Completion Criteria defined in the Project Definition Document (PDD) should provide a starting point. All automated test cases have been executed as documented. The percentage of successfully executed test cases meets the defined criteria. Recommended criteria: no Critical or High severity problem logs remain open, and all Medium problem logs have agreed-upon action plans; successful execution of the application to validate accuracy of data, interfaces, and connectivity.

2.6 Pass/Fail Criteria

The results for each test must be compared to the pre-defined expected test results, as documented in the ADTP (and DTP where applicable). The actual results are logged in the Test Case detail within the Detail Test Plan if those results differ from the expected results. If the actual results match the expected results, the Test Case can be marked as a passed item without logging the duplicated results.


A test case passes if it produces the expected results as documented in the ADTP or Detail Test Plan (manual test plan). A test case fails if the actual results produced by its execution do not match the expected results. The source of failure may be the application under test, the test case, the expected results, or the data in the test environment. Test case failures must be logged regardless of the source of the failure.
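
As an illustration of how a script can compare actual results to expected results and log the outcome, the hedged WinRunner TSL sketch below uses a GUI checkpoint; the window, checklist, and expected-results names are hypothetical stand-ins for the project's recorded checkpoints.

    # Hedged TSL sketch: compare actual vs. expected results and log the outcome.
    # Window, checklist, and expected-results names are hypothetical examples.
    static rc;

    set_window("Pond", 10);                               # wait up to 10 seconds
    rc = win_check_gui("Pond", "list1.ckl", "gui1", 10);  # expected results captured earlier
    if (rc == E_OK)
        tl_step("pond_gui_check", 0, "Actual results match expected results");      # 0 = pass
    else
        tl_step("pond_gui_check", 1, "Mismatch; log in the DEFECT TRACKING TOOL");  # non-zero = fail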

Any bugs or problems will be logged in the DEFECT TRACKING TOOL.

The responsible application resource corrects the problem and tests the repair. Once this is complete, the tester who generated the problem log is notified, and the item is re-tested. If the retest is successful, the status is updated and the problem log is closed.

If the retest is unsuccessful, or if another problem has been identified, the problem log status is updated and the problem description is updated with the new findings. It is then returned to the responsible application personnel for correction and test.

Severity Codes are used to prioritize work in the test phase. They are assigned by the test group and are not modifiable by any other group. The standard Severity Codes used to identify defects are:

Table 1 Severity Codes

Severity Code Number | Severity Code Name | Description
1 | Critical | Automated tests cannot proceed further within the applicable test case (no workaround).
2 | High | The test case or procedure can be completed, but produces incorrect output when valid information is input.
3 | Medium | The test case or procedure can be completed and produces correct output when valid information is input, but produces incorrect output when invalid information is input. (E.g., no special characters are allowed per the specifications, but when a special character is part of the test and the system allows the user to continue, this is a medium severity.)
4 | Low | All test cases and procedures passed as written, but there could be minor revisions, cosmetic changes, etc. These defects do not impact functional execution of the system.

The use of the standard Severity Codes produces four major benefits:

• Standard Severity Codes are objective and can be easily and accurately assigned by those executing the test. Time spent in discussion about the appropriate priority of a problem is minimized.


• Standard Severity Code definitions allow an independent assessment of the risk to the on-schedule delivery of a product that functions as documented in the requirements and design documents.

• Use of the standard Severity Codes works to ensure consistency in the requirements, design, and test documentation, with an appropriate level of detail throughout.

• Use of the standard Severity Codes promotes effective escalation procedures.

3 Test Scope

The scope of testing identifies the items which will be tested and the items which will not be tested within the System/Integration Phase of testing.

3.1 Items to be Tested by Automation

1. PRODUCT NAME

2. PRODUCT NAME

3. PRODUCT NAME

4. PRODUCT NAME

5. PRODUCT NAME

3.2 Items Not to be Tested by Automation

1. PRODUCT NAME

2. PRODUCT NAME


4 Test Approach

4.1 Description of Approach

The mission of Automated Testing is to identify recordable test cases through all appropriate paths of a website, create repeatable scripts, interpret test results, and report to project management. For the Generic Project, the automation test team will focus on positive testing and will complement the manual testing performed on the system. Automated test results will be generated, formatted into reports, and provided on a consistent basis to Generic project management.

System testing is the process of testing an integrated hardware and software system to verify that the system meets its specified requirements. It verifies proper execution of the entire set of application components, including interfaces to other applications. Project teams of developers and test analysts are responsible for ensuring that this level of testing is performed.

Integration testing is conducted to determine whether all components of the system are working together properly. This testing focuses on how well all parts of the web site hold together, whether connections inside and outside the website are working, and whether all parts of the website are connected. Project teams of developers and test analysts are responsible for ensuring that this level of testing is performed.

For this project, the System and Integration ADTP and Detail Test Plan complement each other. Since the goal of the System and Integration phase testing is to identify the quality of the structure, content, accuracy and consistency, response time and latency, and performance of the application, test cases are included which focus on determining how well this quality goal is accomplished.

Content testing focuses on whether the content of the pages matches what is supposed to be there, whether key phrases exist continually in changeable pages, and whether the pages maintain quality content from version to version.

Accuracy and consistency testing focuses on whether today's copies of the pages download the same as yesterday's, and whether the data presented to the user is accurate enough.

Response time and latency testing focuses on whether the web site server responds to a browser request within certain performance parameters, whether response time after a SUBMIT is acceptable, and whether parts of a site are so slow that the user discontinues working. Although LoadRunner provides the full measure of this test, there will be various ad hoc time measurements within certain WinRunner scripts as needed.
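
A minimal TSL sketch of such an ad hoc time measurement is shown below; the window and button names are hypothetical, and the 10-second threshold is an assumed figure, not one taken from the project documentation.

    # Hedged TSL sketch: ad hoc timing of a SUBMIT round trip within a script.
    # Window/button names and the threshold are illustrative assumptions.
    static start_t;
    static elapsed;

    set_window("Pond", 10);
    start_t = get_time();             # current time in seconds
    button_press("SUBMIT");
    set_window("Pond Results", 30);   # wait for the response page
    elapsed = get_time() - start_t;
    report_msg("SUBMIT round trip: " & elapsed & " seconds");
    if (elapsed > 10)                 # assumed acceptability threshold
        tl_step("submit_response_time", 1, "Response exceeded 10 seconds");
    else
        tl_step("submit_response_time", 0, "Response within 10 seconds");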

Performance testing (LoadRunner) focuses on whether performance varies by time of day or by load and usage, and whether performance is adequate for the application.

Completion of automated test cases is denoted in the test cases with an indication of pass/fail and follow-up action.


5 Test Definition

This section addresses the development of the components required for the specific test, including identification of the functionality to be tested by automation and the associated automated test cases and scenarios. The development of the test components parallels, with a slight lag, the development of the associated product components.

5.1 Test Functionality Definition (Requirements Testing)

The functionality to be tested by automation is listed in the Traceability Matrix, attached as an appendix. For each function to undergo testing by automation, the Test Case is identified. Automated Test Cases are given unique identifiers to enable cross-referencing between related test documentation and to facilitate tracking and monitoring of the test progress.

As much information as is available is entered into the Traceability Matrix in order to complete the scope of automation during the System/Integration Phase of the test.

5.2 Test Case Definition (Test Design)

Each Automated Test Case is designed to validate the associated functionality of a stated requirement. Automated Test Cases include unambiguous input and output specifications. This information is documented within the Automated Test Cases in Appendix 8.3 of this ADTP.

5.3 Test Data Requirements

The automated test data required for the test is described below. The test data will be used to populate the databases and/or files used by the application/system during the System/Integration Phase of the test. In most cases, the automated test data will be built by the OTS Database Analyst or OTS Automation Test Analyst.

5.4 Automation Recording Standards

Initial Automation Testing Rules for the Generic Project:

1. Ability to move through all paths within the applicable system

2. Ability to identify and record the GUI Maps for all associated test items in each path

3. Specific times for loading into automation test environment


4. Code frozen between loads into automation test environment

5. Minimum acceptable system stability

5.5 WinRunner Menu Settings

1. Default recording mode is CONTEXT SENSITIVE.

2. Record owner-drawn buttons as OBJECT.

3. Maximum length of list item to record is 253 characters.

4. Delay for Window Synchronization is 1000 milliseconds (unless LoadRunner is operating in the same environment, in which case it must be increased appropriately).

5. Timeout for checkpoints and CS statements is 1000 milliseconds.

6. Timeout for Text Recognition is 500 milliseconds.

7. All scripts will stop and start on the main menu page.

8. All recorded scripts will remain short, as debugging is easier. However, entire scripts, or portions of scripts, can be added together for long runs once the environment has greater stability.
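
Where a script run should pin these options rather than rely on menu state, TSL's setvar/getvar can assert them at startup. The sketch below is an assumption-laden example: the option names "delay" and "timeout" should be verified against the WinRunner TSL reference before use.

    # Hedged TSL sketch: assert menu settings at the start of a run.
    # Option names are assumptions; verify against the WinRunner TSL reference.
    setvar("delay", "1000");    # Delay for Window Synchronization, in milliseconds
    setvar("timeout", "1000");  # Timeout for checkpoints and CS statements, in milliseconds
    report_msg("delay=" & getvar("delay") & ", timeout=" & getvar("timeout"));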

5.6 WinRunner Script Naming Conventions

1. All automated scripts will begin with the GE abbreviation, representing the Generic Project, and be filed under the Winrunner on LAB11 W Drive/Generic/Scripts folder.

2. GE will be followed by the Product Path name in lower case: air, htl, car.

3. After the automated scripts have been debugged, a date for the script will be attached: 0710 for July 10. When significant improvements have been made to the same script, the date will be changed.

4. As incremental improvements are made to an automated script, version numbers will be attached signifying the script with the latest improvements, e.g., GEsea0710.1, GEsea0710.2. The .2 version is the most up-to-date.
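
As a worked example of this convention, the small TSL helper below assembles a script name from its parts; the function and its arguments are illustrative and not part of the plan itself.

    # Illustrative TSL helper: build a script name per the GE convention.
    # build_script_name("sea", "0710", 2) returns "GEsea0710.2".
    function build_script_name(in path, in mmdd, in version)
    {
        return "GE" & path & mmdd & "." & version;
    }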

5.7 WinRunner GUI Map Naming Conventions


1. All Generic GUI Maps will begin with GE followed by the area of test, e.g., GEsea. The GEpond GUI Map represents all pond paths. The GEmemmainmenu GUI Map represents all membership and main menu concerns. The GElogin GUI Map represents all GE login concerns.

2. As there can be only one GUI Map for each object, etc., on the site, the maps are under constant revision when the site is undergoing frequent program loads.

5.8 WinRunner Result Naming Conventions

1. When beginning a script, allow the default res## name to be filed.

2. After a successful run of a script where the results will be used toward a report, move the file to results and rename it: GE for the project name, res for Test Results, 0718 for the date the script was run, your initials, and the original default number for the script, e.g., GEres0718jr.1.

5.9 WinRunner Report Naming Conventions

1. When the accumulated test result files for the day are formulated and the statistics are confirmed, a report will be filed that is accessible by upper management. The daily Report file will be named as follows: GEdaily0718, where GE is the project name, daily denotes the daily report, and 0718 is the date the report was issued.

2. When the accumulated test result files for the week are formulated and the statistics are confirmed, a report will be filed that is accessible by upper management. The weekly Report file will be named as follows: GEweek0718, where GE is the project name, week denotes the weekly report, and 0718 is the date the report was issued.

5.10 WinRunner Script, Result, and Report Repository

1. LAB11, located within the GE Test Lab, will house the original WinRunner Script, Results, and Report Repository for automated testing within the Generic Project. WRITE access is granted to WinRunner Technicians, and READ ONLY access is granted to those who are authorized to run scripts but not to make any improvements. This is meant to maintain the purity of each script version.


2. Winrunner on LAB11 W Drive houses all WinRunner-related documents for GE automated testing.

3. Project file folders for the Generic Project represent the initial structure of project folders utilizing automated testing. As our automation becomes more advanced, the structure will spread to other appropriate areas.

4. Under each Project file folder, a folder for SCRIPT, RESULT, and REPORT can be found.

5. All automated scripts generated for each project will be filed under the Winrunner on LAB11 W Drive/Generic/Scripts folder and moved to the ARCHIVE SCRIPTS folder as necessary.

6. All GUI Maps generated will be filed under the Winrunner on LAB11 W Drive/Generic/Scripts/gui_files folder.

7. All automated test results are filed under the individual Script folder after each script run. Results will be referred to, and reports generated, utilizing applicable statistics. Automated Test Results referenced by reports sent to management will be kept under the Winrunner on LAB11 W Drive/Generic/Results folder. Before work on evaluating a new set of test results begins, all prior results are placed into the Winrunner on LAB11 W Drive/Generic/Results/Archived Results folder. This ensures all reported statistics are available for closer scrutiny when required.

8. All reports generated from automated scripts and sent to upper management will be filed under the Winrunner on LAB11 W Drive/Generic/Reports folder.

5.11 Test Environment

Table 2 Environment for Automated Test

The Automated Test environment is indicated below. Existing dependencies are entered in the Comments column.

Environment | Test System | Comments
Test – System/Integration Test (SIT) | Cert | Access via http://xxxxx/xxxxx
Production | Production | Access via http://www.xxxxxx.xxx
Other (specify) | Development | Individual Test Environments


Table 3 Hardware for Automated Test

The following is a list of the hardware needed to create a production-like environment:

Manufacturer | Device Type
Various | Personal Computer (486 or higher) with monitor and required peripherals; with connectivity to internet test/production environments. Must be enabled to ADDITIONAL REQUIREMENTS.

Table 4 Software

The following is a list of the software needed to create a production-like environment:

Software | Version (if applicable) | Programmer Support
Netscape Navigator | ZZZ or higher |
Internet Explorer | ZZZ or higher |


6 Test Preparation Specifications

6.1 Test Team Roles and Responsibilities

Table 5 Test Team Roles and Responsibilities

Role | Responsibilities | Name
COMPANY NAME Sponsor | Approve project development, handle major issues related to project development, and approve development resources | Name, Phone
Abacus Sponsor | Signature approval of the project, handle major issues | Name, Phone
Abacus Project Manager | Ensures all aspects of the project are being addressed from the CUSTOMERS' point of view | Name, Phone
COMPANY NAME Development Manager | Manage the overall development of the project, including obtaining resources, handling major issues, approving technical design and overall timeline, and delivering the overall product according to the Partner Requirements | Name, Phone
COMPANY NAME Project Manager | Provide the PDD (Project Definition Document), project plan, and status reports; track project development status; manage changes and issues | Name, Phone
COMPANY NAME Technical Lead | Provide technical guidance to the Development Team and ensure that overall development is proceeding in the best technical direction | Name, Phone
COMPANY NAME Back End Services Manager | Develop and deliver the necessary Business Services to support the PROJECT NAME | Name, Phone
COMPANY NAME Infrastructure Manager | Provide PROJECT NAME development certification, production infrastructure, service level agreement, and testing resources | Name, Phone
COMPANY NAME Test Coordinator | Develops the ADTP and Detail Test Plans, tests changes, logs incidents identified during testing, coordinates the testing effort of the test team for the project | Name, Phone
COMPANY NAME Tracker Coordinator/Tester | Tracks SCRs in the DEFECT TRACKING TOOL. Reviews new SCRs for duplicates and completeness and assigns them to Module Tech Leads for fix. Produces status documents as needed. Tests changes, logs incidents identified during testing. | Name, Phone
COMPANY NAME Automation Engineer | Tests changes, logs incidents identified during testing. | Name, Phone


6.2 Test Team Training Requirements

Table 6 Automation Training Requirements

Training Requirement | Training Approach | Target Date for Completion | Roles/Resources to be Trained

6.3 Automation Test Preparation

1. Write and receive approval of the ADTP from Generic Project management.

2. Manually test the cases in the plan to make sure they actually work before recording repeatable scripts.

3. Record appropriate scripts and file them according to the naming conventions described within this document.

4. The initial order of automated script runs will be to load GUI Maps through a STARTUP script (a sketch follows this list). After the successful run of this script, scripts testing all paths will be kicked off. Once an appropriate number of PNRs are generated, GenericCancel scripts will be used to automatically take the inventory out of the test profile and system environment. During the automation test period, requests for testing of certain functions can be accommodated as necessary, as long as those functions have the ability to be tested by automation.

5. The ability to use Generic Automation will be READ ONLY for anyone outside of the test group. This is required to maintain the pristine condition of master scripts on our data repository.

6. The Generic Test Group will conduct automated tests under the rules specified in our agreement for use of the WinRunner tool marketed by Mercury Interactive.

7. Results filed for each run will be analyzed as necessary, and reports will be generated and provided to upper management.
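
A minimal sketch of what the STARTUP script in step 4 could look like in TSL, assuming the GUI map names from section 5.7 and the gui_files folder from section 5.10; the drive path and the example path script are illustrative.

    # Hedged TSL sketch of the STARTUP script: load project GUI maps
    # before any path scripts run. Paths and names are illustrative.
    static maps_dir = "W:\\Generic\\Scripts\\gui_files\\";

    GUI_close_all();                           # start from a clean GUI map state
    GUI_load(maps_dir & "GElogin.gui");        # login concerns
    GUI_load(maps_dir & "GEmemmainmenu.gui");  # membership and main menu
    GUI_load(maps_dir & "GEpond.gui");         # pond paths
    report_msg("STARTUP complete: GUI maps loaded");

    # Path scripts are then kicked off, for example:
    # call "W:\\Generic\\Scripts\\GEpond0718.1"();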


6.4 Issues

The table below lists known project testing issues to date. Upon sign-off of the ADTP and Detail Test Plan, this table will not be maintained; these issues and all new issues will be tracked through the Issue Management System, as indicated in the project's approved Issue Management Process.

Table 7 Issues

Issue | Impact | Target Date for Resolution | Owner
COMPANY NAME test team is not in possession of market data regarding what browsers are most in use in the CUSTOMER target market. | Testing may not cover some browsers used by CLIENT customers. | Beginning of Automated Testing during System and Integration Test Phase | CUSTOMER TO PROVIDE
OTHER | | |

7 Test Issues and Risks

7.1 Risks

The table below identifies any high-impact or highly probable risks that may impact the success of the Automated testing process.

Table 8 Risk Assessment Matrix

Risk Area | Potential Impact | Likelihood of Occurrence | Difficulty of Timely Detection | Overall Threat (H, M, L)
1. Unstable Environment | Delayed Start | HISTORY OF PROJECT | Immediately | DEPENDENT ON LIKELIHOOD
2. Quality of Unit Testing | Greater delays taken by automated scripts | Dependent upon quality standards of development group | Immediately |
3. Browser Issues | Intermittent Delays | Dependent upon browser version | Immediately |


Table 9 Risk Management Plan

Risk Area | Preventative Action | Contingency Plan Action | Trigger | Owner
1. Unstable Environment | Meet with Environment Group | | |
2. Quality of Unit Testing | Meet with Development Group | | |
3. Browser Issues | | | |

8 Appendices

8.1 Traceability Matrix

The purpose of the Traceability Matrix is to identify all business requirements and to trace each requirement through the project's completion.

Each business requirement must have an established priority, as outlined in the Business Requirements Document. They are:

Essential - Must satisfy the requirement to be accepted by the customer.
Useful - Value-added requirement influencing the customer's decision.
Nice-to-have - Cosmetic, non-essential condition; makes the product more appealing.

The Traceability Matrix will change and evolve throughout the entire project life cycle. The requirement definitions, priority, functional requirements, and automated test cases are subject to change, and new requirements can be added. However, if new requirements are added or existing requirements are modified after the Business Requirements document and this document have been approved, the changes will be subject to the change management process.

The Traceability Matrix for this project will be developed and maintained by the test coordinator. At the completion of the matrix definition and the project, a copy will be added to the project notebook.


Functional Areas of Traceability Matrix

# | Functional Area | Priority
B1 | Pond | E
B2 | River | E
B3 | Lake | U
B4 | Sea | E
B5 | Ocean | E
B6 | Misc | U
B7 | Modify | E
L1 | Language | E
EE1 | End-to-End Testing | EE

Legend:
B = Order Engine
L = Language
EE = End-to-End
E = Essential
U = Useful
N = Nice to have


8.2 Definitions for Use in Testing

8.2.1 Test Requirement

A scenario is a prose statement of requirements for the test. Just as there are high-level and detailed requirements in application development, there is a need to provide detailed requirements in the test development area.

8.2.2 Test Case

A test case is a transaction or list of transactions that will satisfy the requirements statement in a test scenario. The test case must contain the actual entries to be executed as well as the expected results, i.e., what a user entering the commands would see as a system response.

8.2.3 Test Procedure

Test procedures define the activities necessary to execute a test case or set of cases. Test procedures may contain information regarding the loading of data and executables into the test system, directions regarding sign-in procedures, instructions regarding the handling of test results, and anything else required to successfully conduct the test.


8.3 Automated Test Cases

8.3.1 NAME OF FUNCTION Test Case

Project Name/Number: Generic Project / Project Request #        Date:
Test Case Description: Check that all drop-down boxes, fill-in boxes, and pop-up windows operate according to requirements on the main “Pond” web page.
Build #:
Run #:
Function/Module Under Test: B1.1        Execution Retry #:
Test Requirement # / Case #: AB1.1.1 (A for Automated)
Written by:
Goals: Verify that the Pond module functions as required.
Setup for Test: Access browser, go to …..
Pre-conditions: Login with name and password. When arrive at Generic Main Menu…..

Step | Action | Expected Results | Pass/Fail | Actual Results if Step Fails
1. | Go to Pond and ….. | From the Generic Main Menu, click on the Pond gif and go to the Pond web page. Once on the Pond web page, check all drop-down boxes for appropriate information (e.g., Time: 7a, 8a, in 1-hour increments), fill-in boxes (Remarks allows alpha and numeric characters but no other special characters), and pop-up windows (e.g., Privacy: ensure it is retrieved, has the correct verbiage, and closes). | |
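
A hedged TSL fragment for part of this test case is sketched below; the object names (the “Pond” window, “Time” list, “Remarks” edit field, and the Pond gif) are hypothetical stand-ins for whatever the GEpond GUI map actually records.

    # Hedged TSL fragment for test case AB1.1.1; object names are hypothetical
    # and would come from the GEpond GUI map.
    set_window("Generic Main Menu", 10);
    obj_mouse_click("Pond_gif", 10, 10, LEFT);  # click the Pond gif

    set_window("Pond", 30);
    list_select_item("Time", "7a");             # drop-down: expect 1-hour increments
    edit_set("Remarks", "Test123");             # fill-in box: alphanumerics accepted

    edit_set("Remarks", "Test@#");              # negative probe: special characters
    # ...verify here that the page rejects the entry (checkpoint or text check).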


8.4 Glossary Reference

9 Project Glossary

Term | Definition | Pertains to
CAT | Customer Acceptance Testing |
CR | Change Request |
DOC | Document Only Change | Problem Log
ADTP | Automated Detail Test Plan |
DTP | Detail Test Plan |
DUP | Duplicate | Problem Log
FIX | Fix incident resolved/repaired | Problem Log
MTP | Master Test Plan |
PDD | Project Definition Document |
PMO | Project Management Office |
PR | Project Request |
SIT | System/Integration Test |
SME | Subject Matter Expert |
UTR | Developer unable to recreate | Problem Log
WAD | Working As Designed | Problem Log


8.5 Sample Addresses for Testing

United States

International


8.6 Test Credit Card Numbers

American Express/Optima (AX)

Visa (BA)

Diners Club (DC)

Discover Card/Novus (DS)

MasterCard (IK)