Fig Leaf Software FUNCTIONAL/REGRESSION TEST PLAN TEMPLATE

DRAFT v.1.3 August 21, 2004


Table of Contents

Application Overview
   QA Description of the Application

Testing Objectives
   High Level
   Spectra Objectives (if applicable)
   Database Verification

References/Testing Inputs
   Requirements/Specifications/System Design Document(s)
   Client Documents
   Project Plan/Build Schedule
   Software Configuration Management Scheme
   Books, Web Pages, etc.
   Interface/Application Standards and Conventions
   Style Manual (FLS/Client Specific)
   Printable Characters Text File
   Accessibility Guidelines (if applicable)
   Automated Testing (if applicable)
   Performance and Scalability Testing
   Prior Test Plans and Related Data (Maintenance Releases Only)

Testing Outputs
   Defect Tracking
   Defect Metrics and Reporting
      Client Test Reports
      Internal Reports

Personnel and Roles
   Functional Organization

Platform/Asset Specification Info
   Client
   Middleware
   Server
   Other Supporting Technologies or Considerations

Test Environment Requirements

Test Approach

Test Methods
   Functional Test Case Design and Construction
   Programming Team Testing
   Database Inspections
   System/Application Setup Testing
   Regression Testing
   Acceptance Testing

Test Scope
   Functional Testing
   FLS Recurring Functionality


   GUI Testing
   Content/Editorial QA Testing
   Usability Testing
   Documentation/Help Testing
   Installation/Configuration/Integration/Security Testing
   Backup/Recovery Testing
   Accessibility Testing
   Performance/Scalability Testing

Test Cases


Functional/Regression Test Plan for [Insert Client/Project Name here]


Blue text is boilerplate information used for all applications; it can be left as is for each application-specific test plan. Blue text within the document should be changed to black when writing a test plan for a specific application; this text block should be removed.

Application Overview

Note the name of the client, that Fig Leaf is the developer of the application, and the approximate time frame during which development is taking (or took) place. If applicable, such as for a maintenance release, discuss project history. If prior testing efforts took place, briefly summarize the nature and outcomes of those tests.

QA Description of the Application

Briefly summarize the high-level application functionality and the overall scope of the tests to be performed. This discussion should also include information on the following:

• Key test areas
• Assumptions for testing
• Identify high risk/high usage areas
• Area(s) that will not be tested
   o If applicable, note reason(s) why area(s) will not be tested
• Tools to employ
• Special requirements (e.g. accessibility compliance)

A chart/diagram of overall application flow/relationships with a brief explanation of main modules and functionality would be helpful here.

Insert overview list of application’s functional requirements (the “what,” not the “how”).


Testing Objectives

High Level

The guiding criterion is that it "works like it's supposed to" as described in the Functional/Design Specification document(s); this section should discuss the details of what that means for the given project.

Spectra Objectives (if applicable)

Discuss specific testing needs when Spectra is used with/by the application. For example, based on our experience to date we know that editing containers is an area that requires a lot of testing attention.

Database Verification

Discuss plans to verify that changes that should have been made to the database were in fact registered.
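
A minimal Python sketch of such a check is shown below. The table and column names (user_account, email), the user ID, and the database file are hypothetical placeholders; the real queries would come from the application's System Design document.

    # Illustrative only: confirm that a UI action actually registered a database change.
    # Table, column, and file names are hypothetical placeholders.
    import sqlite3

    def fetch_email(conn, user_id):
        """Return the stored e-mail address for one user record."""
        row = conn.execute(
            "SELECT email FROM user_account WHERE id = ?", (user_id,)
        ).fetchone()
        return row[0] if row else None

    def verify_email_update(conn, user_id, expected_email):
        """Compare the value recorded in the database with the value entered in the UI."""
        actual = fetch_email(conn, user_id)
        assert actual == expected_email, (
            f"Database not updated: expected {expected_email!r}, found {actual!r}"
        )

    if __name__ == "__main__":
        conn = sqlite3.connect("test_copy.db")  # a test copy of the application database
        # ...perform the UI action under test (e.g. edit the user's e-mail), then:
        verify_email_update(conn, user_id=42, expected_email="new.name@example.com")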


References/Testing Inputs

For all of the following items that are relevant to the application, note the locations of the inputs (e.g. \\Pdcfile01\clientdocs\projectx\specs\funcspec.doc) and how they are to be used or referenced for the testing effort.

Requirements/Specifications/System Design Document(s)

Client Documents

Project Plan/Build Schedule

Make note of testing schedule dates: when each module will be tested, when the complete app will be tested, and important milestones for completing tasks.

Software Configuration Management Scheme

Books, Web Pages, etc.

Interface/Application Standards and Conventions

Information on this should be found in the Functional/Design Specification document(s) and includes things like keyboard equivalents for button functionality.

Style Manual (FLS/Client Specific)

Printable Characters Text File

The "printablechars.txt" file is located in the \\Pdcfile01\clientdocs\TestDocs directory. This document lists all printable characters, organized by ASCII code sets. All characters in this file should be tested in all data entry fields in the application.
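
The sketch below generates the printable ASCII range as a stand-in for printablechars.txt and breaks it into field-sized strings for data entry tests; the chunk size is an arbitrary assumption, not a standard.

    # A minimal sketch for driving data entry fields with printable characters.
    # It generates printable ASCII codes 32-126 rather than reading printablechars.txt;
    # treat the generated set as an illustrative stand-in for that file.

    def printable_ascii():
        """Return all printable ASCII characters (codes 32-126)."""
        return [chr(code) for code in range(32, 127)]

    def chunks(chars, size=20):
        """Break the character list into field-sized strings for data entry tests."""
        for start in range(0, len(chars), size):
            yield "".join(chars[start:start + size])

    if __name__ == "__main__":
        for test_string in chunks(printable_ascii()):
            print(repr(test_string))  # enter each string into the field under test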

Accessibility Guidelines (if applicable)

Automated Testing (if applicable)

Performance and Scalability Testing

Make reference to separate documentation that covers this aspect of testing.

Prior Test Plans and Related Data (Maintenance Releases Only)


Testing Outputs

Defect Tracking

All "defects" found by FLS QA staff are logged into a tracking database each day that testing takes place. The tracking database is posted daily to a network location accessible by all project team members. All records compiled by QA staff must have a final resolution ("Closed," "Verified," or "Deferred") before the application is delivered to the client.

As development staff fixes issues, the appropriate records are marked as "Fixed." Records can be marked as "Fixed" by development or QA staff. QA staff verifies all fixes and marks records as "Verified" after confirming that a correct fix has been applied. Only QA staff can mark a record as "Verified."

Standard FLS guidelines for issue reporting are in effect for this application’s test effort:

General Guidelines for QA Staff:

• Fill out all areas of the issue report as completely as possible.
• Be as detailed as possible when writing the description of the problem:
   o Where did it happen?
   o What were you doing just before it happened (i.e. what steps triggered the problem)?
   o Can you consistently replicate the problem?
   o Did you notice anything that might be an additional symptom or otherwise related in some way to the cause of the problem?
   o Were you using a user ID that might in some way be related to the problem you are reporting?
• On the other hand, brevity is important. Respect the time of the developer who will be reading these reports by explaining, in the most direct manner possible, what you did to create the error. If there are tangents to the main error, write them up as separate records and in those descriptions note the number of the original record that they relate back to.
• Stick to the facts. A report that is complete and easily understood by the developer will make it much easier to isolate and fix the issue. Reasoned opinions are also good; don't just write, "This doesn't make sense" and leave it at that. Explain why the problem you are writing up is awkward, confusing, misleading, etc.
• Please resist the urge to play "Bug Jeopardy" by phrasing your problem report in the form of a question, unless the question is targeted at solving a specific problem such as an inconsistency in labeling, e.g. "Should it be class or course?" Questions in a format like "Why does it work this way?", especially if there is a spec that explains why it works that way, are just distractions from the work at hand.


• If you are one of multiple people working in the same issue database, take a few moments to review the comments that have been entered by others before starting your review of the application. This will give you an idea of what the program is (or isn’t!) doing and will prevent the entry of duplicate records into the system.

General Guidelines for Development Staff:

• Descriptive subject lines in e-mail notifications are very helpful. "CareerNotes: Issues 1234, 1236, 1237" is more useful as a notification than a successive series of e-mails that all have a subject line of "Issue," "Issue Fixed," etc. QA staff are sometimes working across several projects at once and this helps to clarify what's what.

Issue Classification Information:

All issue reports include the following pieces of information:

• ID (auto-generated)
• Date Reported (auto-generated)
• Reported By (drop-down list populated with names of appropriate project team members)
• Screen/Location (drop-down list populated with names of major application modules/functions)
• Nature of Problem (drop-down list populated with common issue categories, e.g. "Code," "Enhancement Request," or "Text"; will vary somewhat from application to application)
• Fix/Change (drop-down list allowing classification of issues as items that do not work as specified – "fix" – versus issues that are really requests to change specified functionality – "change")
• Severity (drop-down list populated with levels of severity; details below)
• Status (drop-down list populated with status choices, e.g. "Open," "Fixed," "Verified," "Closed")
• Assigned To (drop-down list populated by names of appropriate project team members)
• Fixed By (drop-down list populated by names of appropriate project team members)
• Date Fixed (date field into which is entered the date of the issue fix by the developer)
• Verified By (drop-down list populated by names of appropriate project team members)
• Date Verified (date field into which is entered the date of issue fix verification by test staff)


• Priority (drop-down list populated with priority levels; details below)
• Description of Problem (text entry field for description of the issue)
• Suggested Solution (text entry field for notes on possible resolutions)
• Attachment (container field for attaching files, such as screen shots or docs, related to the issue)
• Version (drop-down list populated by version/build numbers)

Severity Levels for issues are classified as follows:

• Low – The issue reports an instance that does not cause a failure, does not impair usability, and/or there is an easy workaround for the issue.

• Medium – The issue reports an instance in which the application produces incorrect, incomplete, or inconsistent results, or the issue reports an instance in which the application’s usability is impaired.

• High – The issue reports the failure of the entire application, a portion of the application, or a module within the application, but there is an acceptable workaround for the problem.

• Critical – The issue reports the failure of the entire application, a portion of the application, or a module within the application (there are no workarounds).

Priority Levels for issues are classified as follows:

• Low – The issue should be resolved after more serious issues have been fixed.
• Medium – The issue should be resolved in the normal course of development activities.
• High – The issue must be resolved as soon as possible because application use is severely affected and this is hindering development and testing activities.
• Immediate – Further development and/or testing cannot occur until the issue has been fixed. The application (or major module/function) cannot be used.
• Defer – The issue will be revisited/resolved at a later time.

Defect Metrics and Reporting

Client Test Reports

The client should receive a report at the end of the test effort indicating the platform(s) against which the application has been tested and "certified." This report should also indicate what functions were tested and "passed" by the test staff.


Internal Reports

A summary report giving the overall status of tests run against the application to date.


Personnel and Roles

This section discusses who is on the project team and the role each person fulfills on the project.

Functional Organization

• Programming Team
   o List Names
• Graphics
   o List Names
• Software Test Team
   o List Names
• Project/Senior Management
   o List Names


Platform/Asset Specification Info

Create a matrix for required hardware/software combinations, if applicable. Note all files needed to successfully run the application.

Information for the items in this section should be available in the Functional/System Design Specification document(s).

Client

Browser types/versions, plug-ins and versions, screen resolution, bit depth, OS/hardware details (processor speed, required network protocols, etc.)

Middleware

CF, Spectra, database (SQL, Oracle, Sybase, Access)

Server

OS and version, web server and version (Apache, IIS, Netscape)

Other Supporting Technologies or Considerations

Generator would be an example of something to discuss here.


Test Environment Requirements

Reference section 6 and add information on any additional needs or items to procure. One example would be training time for test staff on the software to be tested or supporting software/technologies that must be understood in order to effectively test the application.


Test Approach

Start with a brief summary of what activities will be carried out by the test staff, the means by which findings will be communicated, to whom the findings are distributed, etc.

One thing to note here is whether the application is content-driven or functionality-driven. The application type will determine the specific activities listed in the "Test Scope" section.

Also note test completion criteria: “We are done testing when…”


Test Methods

The overall approach is to test each discrete module as it is completed, then run system-wide tests at the end when the complete, integrated application is available for testing. In particular, note business scenarios to be tested at that stage.

Note whether production or test data is to be used. If applicable, note how much data and/or how frequently the data will be updated or changed for each round or type of testing.

Functional Test Case Design and Construction

Discuss what manual test scripts/cases will be executed and what automated test scripts/cases will be executed.

Discuss how all test documentation is registered and tracked by the FLS Configuration Management system (when the system has been implemented).

Note any known limitations or constraints on any tests (example: testing against the NT dev box, but client implementation will be on Solaris).

Programming Team Testing

In addition to the activities carried out by the software testing group, the following functions performed by the Programming Team should be noted and detailed when applicable:

• Unit testing
• Code reviews
• Esoteric functionality tests

Database Inspections

Describe how the application's modifications to the database (additions, deletions, modifications) will be verified. It can also be helpful to note the size changes to the database from time to time. This helps in calculating minimum requirements for the database server and in catching instances where the database has become over-inflated or has grown so large that performance is affected.
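
A minimal sketch of such a size check, assuming a SQLite test copy of the database, is shown below; for SQL Server, Oracle, or Sybase the equivalent catalog queries would be used instead, and the file name is a placeholder.

    # Illustrative sketch: record table row counts between test rounds so that
    # unexpected growth (an "over-inflated" database) can be spotted early.
    import sqlite3

    def row_counts(db_path):
        """Return {table_name: row_count} for every table in a SQLite test copy."""
        conn = sqlite3.connect(db_path)
        tables = [r[0] for r in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0] for t in tables}

    def compare(before, after):
        """Print the growth in each table between two snapshots."""
        for table, count in after.items():
            delta = count - before.get(table, 0)
            print(f"{table}: {count} rows ({delta:+d} since last snapshot)")

    if __name__ == "__main__":
        baseline = row_counts("test_copy.db")   # snapshot before a test round
        # ... run a round of functional tests against the application ...
        compare(baseline, row_counts("test_copy.db"))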

System/Application Setup Testing

The application's System Design document contains a checklist of server and application settings that must be in place for the application to work correctly. The steps outlined in these checklists are put into place on the test server and the application is run against those settings to validate their suitability.


Regression Testing

The purpose of regression testing is to verify that features that worked in a previous build still work in successive builds, as well as to verify any new functionality or fixes in the latest build. The first goal is achieved by using previous test cases to test the application; the second goal is achieved by replicating the steps previously used to generate the issue and confirming that the issue no longer appears. A minimal automation sketch follows the lists below.

Regression Testing Procedure:

1. Specify test cases to run against the application.
2. If applicable, discuss test scripts/cases to be reused for this purpose.
3. Re-test the application and establish a baseline of working stability and correctness using the specified test cases.
4. Create new tests for the new portion of the application (if applicable).
5. Update the existing test cases with the new tests.

Expected Results:

1. The identical results are returned, which indicates that there has been no unforeseen adverse impact on the application.

2. The results are expected to be different because changes have been made to the application (or to the part of the application being tested) that also change the relevant test case.

3. The results will be different because the change in the application has broken something in another area of the system.

Defect Verification:

1. Follow the precise steps in the issue report to reproduce the problem.
2. The expected results should be different because the broken portion of the application should now be fixed.
3. Document results in the defect tracking system.
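
The sketch below illustrates the baseline idea from the procedure above: run a fixed set of cases against each build, compare against the stored baseline, and flag differences. The URLs, file name, and expected status codes are hypothetical placeholders for real test cases.

    # A minimal regression sketch: re-run a fixed set of test cases against each new
    # build and compare against the previously established baseline.
    import json
    import urllib.error
    import urllib.request

    TEST_CASES = {
        "home page loads": ("http://testserver/index.cfm", 200),
        "search page loads": ("http://testserver/search.cfm", 200),
    }

    def run_case(url):
        """Return the HTTP status code for one test case."""
        try:
            with urllib.request.urlopen(url) as response:
                return response.status
        except urllib.error.HTTPError as err:
            return err.code

    def run_regression(baseline_file="baseline.json"):
        results = {name: run_case(url) for name, (url, _) in TEST_CASES.items()}
        try:
            with open(baseline_file) as fh:
                baseline = json.load(fh)
        except FileNotFoundError:
            baseline = {}
        for name, status in results.items():
            expected = baseline.get(name, TEST_CASES[name][1])
            flag = "OK" if status == expected else "REGRESSION"
            print(f"{flag}: {name} (expected {expected}, got {status})")
        with open(baseline_file, "w") as fh:
            json.dump(results, fh, indent=2)  # new baseline for the next build

    if __name__ == "__main__":
        run_regression()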

Acceptance Testing

Acceptance Testing can also be thought of as Deployment Testing and/or Validation of the Production Environment. In other words, this is the testing required to ensure that the application's functions will work correctly and that the application will run in an optimal manner when implemented in the client's hardware and software environment.

Discuss what constitutes acceptance, the basis for which should have been set by the Requirements and Functional Specification documents, and generally includes the following:


• Test the checklist of key business scenarios/functions that have been designated as critical for the successful implementation of the application

• Confirm that the checklist of server and application setup information has been set correctly on the appropriate systems. Examples of application setup information to check include CF Template Cache, Trusted Cache, Session Timeout, and Application Timeout.

Note that while initial acceptance tests can be run in Fig Leaf’s environment, the application has not fully passed acceptance testing until all elements on the acceptance checklist have been successfully passed in the client’s development or production (as applicable) environment.

In some cases, acceptance testing may require work at the client site by QA staff.

NOTE: Before any changes are made to a client’s production environment, the Production Server Permission form must be completed and signed by a representative from Fig Leaf and the client.


Test Scope

This section would be modified as needed for each project. What follows is a general listing of things that should be taken into consideration when formulating the information for each project.

Functional Testing

Start by discussing the objectives of the functional testing. This would include, but not necessarily be limited to:

• Verifying Client Application Logic
• Verifying Server-Based Application Logic
• Verifying On-Line Transaction Processing Logic
• Verifying Batch Processing Logic
• Verifying Middleware Processes (such as CF and Spectra underpinnings)
• Verifying Application Process-Database Interactions
• Verifying Database Structure/State Changes
• Verifying Stored Procedures/Triggers

Then move on to discuss the types of functionality to be tested, for example:

• Cross-browser/cross-platform functionality (when required)
• Minimum Access Speed/Page Download Times

List specific application functions to be tested. These can include:

• Every visible function and subfunction
• Every command
• Every menu and every menu choice
• Data entry in all fields on data entry forms
• Using all printable characters in data entry fields (use the printablechars.txt file as a reference for the complete list of characters to test)
• Tab order on data entry forms
• Field validation for all required fields on data entry forms
• Images all appear in the correct locations
• Button states
• Functionality in all dialog and message boxes
• Each entry into each part of the application
• Each exit from each part of the application


• Error handling
• Error messages that the program will generate
• Exception handling, i.e. what happens if the database isn't there?
• Link/Navigation Testing (a minimal link-checking sketch follows this list)
• Data in = data out
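
For the link/navigation item above, a minimal link-checking sketch is shown below. It uses only the Python standard library; the start URL is a placeholder for the application under test.

    # Fetch one page, extract anchor targets, and report HTTP 404s and other failures.
    from html.parser import HTMLParser
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    def check_links(page_url):
        """Report the HTTP status of every link found on one page."""
        parser = LinkCollector()
        parser.feed(urlopen(page_url).read().decode("utf-8", errors="replace"))
        for href in parser.links:
            target = urljoin(page_url, href)
            if not target.startswith("http"):
                continue  # skip mailto:, javascript:, etc.
            try:
                status = urlopen(target).status
            except HTTPError as err:
                status = err.code          # e.g. 404
            except URLError as err:
                status = f"unreachable ({err.reason})"
            print(f"{status}  {target}")

    if __name__ == "__main__":
        check_links("http://testserver/index.cfm")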

FLS Recurring Functionality

There are a number of standard types of functionality that appear across multiple Fig Leaf applications. Discuss which of the following items are applicable to the application (more details regarding test cases are listed in the "Test Cases" section of this document).

• Logons
• User accounts/permissions
• Administrative interface → front-end interactions
• Calendar Controls/Scheduling
• Credit Card Verification
• Spectra Workflows
• Input formats
• Field validations
• Documentation
• E-mail (formatting and sending)

GUI Testing

The objectives of GUI testing are as follows [add or remove items from the list as needed for each application]:

• Verify the presence of all specified GUI controls.
• Verify the initial states of all specified GUI controls.
• Verify all specified alternate states (enabled, disabled, not visible, etc.) of GUI controls.
• Verify mouse access.
• Verify keyboard access/mouse action equivalents (as specified for the application).
• Verify adherence to GUI standards described in Specification document(s).
• Verify accessibility requirements (if applicable), such as ALT tags for images (a minimal ALT-tag check sketch follows this list).
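
For the accessibility item above, a minimal sketch that flags img tags with missing or empty ALT attributes is shown below; the URL is a placeholder for the page under test.

    # Flag <img> tags with no ALT attribute in a page's HTML.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class AltChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.missing = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                if not attrs.get("alt"):
                    self.missing.append(attrs.get("src", "<no src>"))

    def check_alt_tags(url):
        checker = AltChecker()
        checker.feed(urlopen(url).read().decode("utf-8", errors="replace"))
        for src in checker.missing:
            print(f"Missing or empty ALT attribute: {src}")

    if __name__ == "__main__":
        check_alt_tags("http://testserver/index.cfm")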


Content/Editorial QA Testing

This consists primarily of proofreading and validation that content is accurate. Proofreading needs to take place in all main interfaces as well as in all dialog and message boxes. Content vocabulary should be aimed at its target audience and use terms commonly associated with actions (e.g. "click mouse button" instead of "push and release mouse button," or "click and drag" instead of "grab and move"). If applicable, verify that all foreign characters used for the specified language appear properly. Clients will need to be the ultimate approvers of content accuracy.

Usability Testing

We have a "Web Application Usability Defect Log" from Constantine & Lockwood as an example of things to look for.

Documentation/Help Testing

This includes testing the validity of user manuals, admin guides, installation instructions, etc., as well as any other public documents relating to the application, such as the demo path document created for CareerNotes.

Installation/Configuration/Integration/Security Testing

This includes an installation test of the application in its base state (i.e. whatever state the application files and database will be in when handed to the client), any initial configuration that must take place for the application to work, integration or connections to other applications that are required for the application to work (such as setting up databases), and testing that network access/security is correctly implemented.

Backup/Recovery Testing

If needed, this would be a test to confirm that all necessary files can be backed up and restored in the event of a problem with the hardware or software environment in which the application runs. Ideally, this test is based on documentation that outlines the application's backup and recovery procedures.

Accessibility Testing

When required by the client, this section discusses what tasks will be carried out to ensure compliance with the requested level of accessibility. Fig Leaf's standard reference for Web Content Accessibility is the W3C's "Checklist of Checkpoints for Web Content Accessibility Guidelines," available online at http://www.w3.org/TR/WAI-WEBCONTENT/full-checklist.html.

Performance/Scalability Testing

Note where documentation will exist for any planned or completed performance testing of the application. Verify that firewalls on both ends (client and server) do not cause any problems.


Test Cases

List the order in which tests will be executed.

List steps to be performed for each test and relate them to the list of application functions.

Include discussion of or reference to sample input and output data for the testing activities described in the previous section.

Include or reference test matrices/worksheets/checklists/decision tables/diagram paths specific to the testing activities described in the previous section. The worksheets/checklists should record the inputs, expected outcomes, and actual outcomes of the specific tests.

General functionality items that require test cases [add or remove items as needed for each specific application]:

• Presence/absence of supporting technologies (example: Flash plug-in)
   o Auto-detection
   o Alternate presentation/navigation (if possible)
   o If JavaScript is used, turn off JavaScript in the browser and see if there is any error trapping for this circumstance.
   o Java apps – turn off Java in the browser and see if there is any error trapping for this circumstance.
   o Check that any Java/ActiveX controls do not generate false positives when scanned by the leading anti-virus packages.

• Testing to trigger errors
   o Apostrophes, commas, non-standard characters, double quotes
   o Blank or NULL fields
   o Submitting data without a valid selection

• Static v. Dynamic pages
   o Static
      § Links – HTTP 404 errors
      § Typos
      § HTML version compliance using tools such as Bobby or WebLINT
   o Dynamic – Functionality

• Boundary Analysis
   o Maximum input in fields for truncated or overflow effects
   o Numerical input in fields for computational, rounding, or display errors
   o Record or character limit in database
• Equivalence Partitioning (a boundary/partition test-data sketch follows this list)


• Browser Compatibility
   o Supported display resolutions
   o Supported color depths
   o Version inconsistencies
   o Required vs. default settings
   o Detection of supported browsers, with an appropriate error message for non-supported browsers
   o Are there any ill effects from using a non-default Text Size?

• Browser settings (cache settings, print settings)
   o The default cache setting for Internet Explorer is "Automatically" and for Netscape is "Once per session". Application functionality should be tested rigorously under these settings.
   o It will be necessary to enable the "print background colors and images" setting ("print backgrounds" in Netscape) in order to output a browser page similar to the display on the screen. Note that Microsoft's Internet Explorer version 5.5 comes with a page rendering feature that allows printing of web pages exactly as they appear on the screen.

• Cookies (if used)
   o Start testing by clearing memory and disk cache and deleting any permanent cookies from the test site.
   o Disable cookies and see if the site still works (possibly more slowly) or if an appropriate error/warning message appears.
   o Disable cookies in the middle of a transaction; see if disabling is detected and/or handled appropriately.
   o Delete the cookie in the middle of a transaction.
   o Clear memory and/or disk cache in the middle of a transaction.
   o Edit the cookie (using Notepad/WordPad) and change some data:
      § Add, delete, swap parameters
      § Set parameters to null
      § Set parameters to values above or below the accepted range
      § Set parameters to invalid values (ex: numbers instead of letters)
      § Add control characters (ex: carriage returns)
      § Add multiple entries for a site in Netscape's cookies.txt file
      § Change the user ID in the midst of a transaction (try admin, test, guest, master, any other common user IDs you can think of)
   o Repeat basic cookie tests, using two instances of the same browser brand and version.
   o Repeat basic cookie tests, using two or more instances of the same brand but different versions of the browser.
   o Check that cookies are not storing sensitive information, e.g. passwords or authorization levels.

• Session Maintenance
   o Broken/abandoned transmissions.


   o Multiple PCs accessing the same account.
   o Whose machine clock is right (for time-sensitive transactions)?
   o Time delay to complete a transaction/how long a transaction can be idle before being dropped.
   o Users clearing disk and/or memory cache.
   o Browsers that do not allow cookies, Java applets, or ActiveX controls (as applicable).
   o Database locking strategies.
   o Are all resources (server memory, network connections, disk space) being freed?
   o Can other transactions affect a transaction in a wait state (ex: what if trading on a stock is suspended in the midst of a buy or sell order)?
   o What happens if a step or event doesn't occur (ex: final confirmation to place an order does not go through)?
   o What if events occur out of order (ex: user jumps into the middle of a process via a bookmark)?
   o Are there any wait states without an "exit"?
   o What if incorrect data is added to the system; is there any way to detect/correct this?
   o Save a page accessed via a secure logon to the hard drive. Close the browser, then re-open the page from the saved file on the hard drive. Can you get in without providing logon credentials?

• Permissions
   o Proper access privileges for each logon account
   o Verification of private and public data accessible based on permission level

• Printing
   o Test print pages to as many different printers as possible.
   o Test print pages from all supported browsers.
   o Test print to color and B&W printers, if possible.
   o If frames are used, test that frame printing options work correctly.

• Bookmarking – pages should have an effective description for bookmarking.
   o Look for concise, meaningful descriptions.
   o Try to avoid articles such as "the" at the beginning of the description; if a search engine sorts found pages in alphabetical order, this can push the page into a position where it is less likely to be seen.
   o HTTPS (secure) pages can be bookmarked. Make sure that the bookmark cannot be re-accessed without providing the logon credentials required to see the page in the first place (pre-bookmarking).

• Legal Considerations:
   o Copyrights and trademarks
   o Terms, disclosures, disclaimers


   o Shipping overseas (e-commerce)
   o Reverse engineering protection

• International Considerations – field entries that may be made by non-US users:
   o Date formats
   o Address formats (zip code formats are different everywhere)
   o Currency formats
   o Tax rates/rules
   o Varying shipping costs
   o Import duties/rules
   o The database where information is stored may need to store non-standard alphabets (incl. double-byte characters); if not supported, "garbage" characters may be recorded instead.
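
The sketch below generates boundary-value and equivalence-partition test data for a single text field, as referenced under "Boundary Analysis" and "Equivalence Partitioning" above; the 50-character limit and the partition values are illustrative assumptions, not requirements.

    # Boundary-value and equivalence-partition test data for one text field.
    FIELD_MAX = 50  # hypothetical field length; replace with the specified limit

    def boundary_values(max_len=FIELD_MAX):
        """Strings at, just under, and just over the field limit (watch for truncation/overflow)."""
        return {
            "empty": "",
            "one_char": "A",
            "max_minus_one": "A" * (max_len - 1),
            "at_max": "A" * max_len,
            "max_plus_one": "A" * (max_len + 1),
            "far_over_max": "A" * (max_len * 10),
        }

    def equivalence_partitions():
        """One representative value per input class, rather than every possible value."""
        return {
            "plain_text": "Jane Smith",
            "numeric": "1234567890",
            "special_chars": "O'Brien, \"quoted\"; 50% <done>",
            "whitespace_only": "   ",
            "non_ascii": "Café Zürich",
        }

    if __name__ == "__main__":
        for label, value in {**boundary_values(), **equivalence_partitions()}.items():
            print(f"{label}: {value!r}")  # enter each value into the field under test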

FLS Recurring Functionality items that require test cases [add or remove items as needed for each specific application]:

• Logons
   o Make sure that you are required to enter a username and/or a password.
   o If you enter an incorrect username or password, you should receive an alert message prompting you to enter a valid username or password. Once you have acknowledged the alert box, the username and password fields are reset.
      § You should not be permitted to gain access without entering the correct username and password.
   o The logon screen should always display "Logon," not "Login," if that text is displayed.
   o Password input should always be masked or hidden.
   o Logon information should be encrypted and validated via a secure connection to the server.
   o Pressing ENTER or RETURN while the password field has focus should act the same as clicking the "Logon" button.

• User accounts/permissions
   o Check for different permission levels for access to certain functionality.
      § Try to access using all different permission levels.

• Administrative interface → front-end interactions
   o Adjustments made in the admin section should be reflected on the front end, and vice versa.

• Calendar Controls/Scheduling


o Check to make sure that the date is populated into the date field and that it is in the correct format.

o Standard options at the bottom of the calendar control are “OK” and “Cancel.”

o By default, date displayed should be current date until populated. Note that this can be pulled from the server and may not reflect the local system’s date.

o Test that the dates entered make logical sense, e.g. if you are scheduling a course you should not be able to set the course to start in the past.

• Search Functions
   o Search on common words such as "a," "the," or the company name.
   o Search for words that don't exist ("kalfjsl," "skjfls").
   o Search for nothing (null/blank request).
   o Search for three spaces.
   o Search for multiple words, with and without Boolean operators ("and and or," "and or or").
   o Search for the same term or phrase using all upper case, all lower case, and mixed case (should bring back consistent results).
   o Search for words that you know should be found, such as products and services.
   o Search for a misspelled word – see if the search engine can make a match.
   o Search for HTML tags/JavaScript commands (IFRAME, </SCRIPT>): if these are found, it probably means that there are tags that are not properly opened or closed and, as a result, there is some stray code displaying on the screen.
   o Search for a quoted quote.

• Shopping Carts
   o If the cart is designed to hold some maximum number of items, add one item beyond the maximum.
   o If a checkout is attempted with an empty cart, an appropriate message should be displayed.
   o If an item is removed from the cart, is the action registered correctly?
   o If a purchase is completed, then the Back, Go, or History button is used to return to a pre-checkout page, the contents cannot be altered.
   o If a user abandons the cart but returns to the site within the "time-out" period for retaining the state of the cart, the correct selections are in effect, the time-out period is reset, and continued shopping is allowed.
   o If a user abandons the cart and does not return to the site within the "time-out" period, the cart is "cleared."
   o If two instances of a browser are in use and items are added to the cart using both instances, are the cart(s) updated correctly?

• Credit Card Verification


   o Numbering schemes for commonly used credit cards:
      § MasterCard – prefix 51-55, length 16
      § Visa – prefix 4, length 13 or 16
      § AMEX – prefix 34 or 37, length 15
      § Diners Club/Carte Blanche – prefix 300-305, 36, or 38, length 14
      § Discover – prefix 6011, length 16
   o Check that valid numbers are handled correctly by the front end and back end, all the way through fulfillment.
   o Algorithmically correct but invalid numbers should pass through the front end and be caught by the authorization functions in the middleware or back end (see the Luhn-check sketch following this list).
   o Algorithmically incorrect numbers should be caught at the front end.
   o If credit card submission procedures include error message generation (example: if the card is algorithmically correct but is rejected during authorization), the system may do something like send an e-mail to the client. Test any procedural communications that have been specified.
   o Verify form handling for various credit card number input formats: "nnnnnn" and "nn nn nn" and "nn-nn-nn" etc. are all possibilities that end users will try. If the form requires a certain format (or formats) for input, make sure it accepts the correct format(s) and gracefully rejects the incorrect format(s).

   o Check for trailing and leading spaces.
   o Check one type of credit card number against another type of credit card database. For example, take an American Express Optima and run it against a Discover.
   o Watch the back end and front end for abbreviations of American Express. It should be consistent, whether it's "Amex", "AE", or "American Express". Keep it the same throughout; if it's not, there could be problems.
   o Entry of an invalid credit card number or expiration date displays an appropriate error message.
   o Entry of invalid contact information displays an appropriate error message.
   o Credit cards that are cancelled or past their expiration date are rejected.
   o If the current credit available on the card is not sufficient for the purchase, an appropriate message is displayed.
   o Credit card numbers (and any other confidential information) are stored in an encrypted format.
   o Credit card information is stored in a separate database/file on a different physical server than other aspects of a customer's profile (increases security).
   o Payments are in fact received for completed transactions.


o Tax collection processing is in place to calculate and collect all appropriate taxes (Fed, State, County, other municipalities).

o Fraud detection and credit card address verification services are functioning as specified.

• Spectra Workflows
   o Not documented yet (9/6/00).

• Input formats
   o Dates
      § MM/DD/YY, MM/DD/YYYY, MONTH/DAY/YEAR
   o Time
      § Hour:Min, Hour:Min:Sec, Hour:Min (24-hour clock)
   o E-mail addresses
      § name@domain.com
   o Zip codes
      § Zip + 4 support
   o File names
      § The use of the forward slash (/), asterisk (*), percent (%), question mark (?), greater than and less than (<>), and "pipe" (|) symbols should produce error messages when used in paths and filenames.
   o Currency
      § $2000/$2,000/$2000.00/$2,000.00
   o URLs
      § http://www.figleaf.com or www.figleaf.com
      § If the user is entering a URL, make sure the required input format is indicated.
   o Username/Password
      § If case-sensitive, make sure this is indicated to the user.
   o Telephone/fax numbers
      § (202) 797-5410 or 202-797-5410

• Field validations
   o Check for the presence of data in the field.
   o Validate any special formatting requirements (date, time, etc.).
   o The apostrophe ('), double quote ("), backslash (\), semicolon (;), and comma (,) characters are known to cause unexpected results and should be checked thoroughly within an application.
   o Alert boxes are used for validation.


• Documentation
   o Make sure it follows Fig Leaf Software standards as defined in \\Pdcfile01\clientdocs\Documentation\Style Guide\style Guide.doc (e.g. "e-mail," "logon").
   o Ensure that the documentation is an accurate representation of the application functionality.

• E-mail (formatting and sending)
   o When receiving an application-generated e-mail, check the format (font, font size, text) to make sure that it is consistent.
   o If an application sends multiple e-mail messages, check to make sure that there is consistent formatting across all of the messages.
   o Make sure that the e-mail is sent from the correct address, using the correct alias name, such as "CareerNotes Administrator," and is sent to the correct recipient address.
   o The subject line should be descriptive of the e-mail's content.
   o E-mails sent out as newsletters or "subscription content" should have an "unsubscribe" disclaimer at the bottom. The reply e-mail should also go to a valid e-mail address for processing.

GUI items that require test cases [add or remove items as needed for each specific application]:

• Home/Main Menu (DHTML)
   o Make sure that you are able to access the site that the links from the menu are directing you to.
   o Make sure that only one link is highlighted at a time.

• Home/Main Menu (Flash)

• Forms
   o Text
   o Text area
   o Password
   o Checkbox
   o Radio
   o Select boxes
   o Sort order
      § Able to select by pressing the key that corresponds with the first letter in the name.
   o Multiple select boxes
      § Sort order


   o Submit button
   o Reset button
   o Check to make sure that all required fields have alert boxes associated with them.
   o Check the spelling/grammar of the alert boxes.
   o Check to make sure that the message in the alert box is associated with the correct field on the form.
   o TAB key switches focus in logical order (including activation of drop-down lists, radio buttons, check boxes, etc.).
   o Drop-down lists should be sorted in a logical order.
   o Drop-down list fields should be wide enough to display the longest choice.
   o Use the browser Back and Forward buttons in the middle of a series of forms to confirm that data is retained.
   o Select the Reload/Refresh button in the middle of a series of forms to confirm that data is retained.
   o Use Go and/or History buttons to revisit previous forms or screens within a series of forms.
   o Maximize, minimize, and resize the browser by hand while filling out a form or series of forms.
   o Check the database after doing each of the previous tests to see if any data "anomalies" have resulted.

• Tables
   o Text size is OK using default settings in each browser.
   o White space is OK when viewing in each browser.
   o Cell size holds up in different browsers.
   o All attributes hold up when viewing at different screen resolutions (use the standard sizes and resize by hand to non-standard sizes).

• Framesets
   o Confirm each frame has sufficient screen area at the minimum supported screen resolution.
   o Confirm each frame has an adequate description and meta tag for bookmarking and search engine classification.
   o Check borders and margins in all supported browser/platform combinations.

• Menus
   o Must not be too complex or too simple.
   o Minimize the number of paths to the same link.
   o Are there keyboard equivalents?

• Pop-up Menus
   o Does right-clicking the mouse invoke a pop-up menu?
      § Should right-clicking be disabled?


• Keyboard functionality/equivalents
   o Do any keys not work or cause an error?

• Tool Tips
   o Check to make sure that tool tips are present when they are supposed to be.
   o Check relevance, consistency in phrasing, and grammar.

• Status bars
   o Check what is in the status bar in the lower left corner.

• Missing information
   o Cursors, hourglass, instructions

• Help
   o Is Help present?
   o Check the screen shots to make sure that they are current and match the look of the application.
   o Make sure that the help files are an accurate representation of the application functionality.
   o Check to make sure that the "Table of Contents" and "Index" functionality work.

• Error Messages
   o Are error messages consistent?
   o Check grammar.
   o Check to ensure that messages are appropriate and not offensive or too complicated.

• Layout Errors
   o Wrong size/format?
   o Make sure that layout is as specified.

• Display layout
   o Color
   o Fonts
   o Poor aesthetics
   o Obstruction
   o Turn graphics OFF in the browser and see if the site/application handles this gracefully (e.g. displays ALT tags for the missing images).

• Modal
   o Is the window really modal? Is it set up properly?


• Sizable
   o If there are frames, are they resizable?
   o Are windows resizable?

• Canceling (dialog/alert boxes)
   o What method is used?
   o Do any methods cause errors?

• Default buttons (dialog/alert boxes)
   o Are there any default buttons? What are they?

• Make sure that items that should be labeled are properly labeled.