Test Management
Transcript of users.du.se/~hjo/cs/gmi23n/lectures/MT_ISTQB_05_test_management.pdf
See also: https://www.guru99.com/ > Test Management

Page 2:

Independent Testing 1
• The effectiveness of finding defects can be improved by using independent testers
• Testing may be performed by many people at different levels
• A degree of independence brings different perspectives and uncovers different defects
• Degrees of independence in testing include the following (low → high)
  - Developers testing their own code (low)
  - Independent developers or testers within the development team
  - An independent test team within the organization
  - Independent testers from the business organization or user community
  - Testers outsourced or external to the organization (high)
• For critical systems or complex projects
  - Testing should include multiple phases (levels)
  - Some levels should be performed by independent testers
• The independence of testing depends on a variety of factors
  - SDLC, criticality of the system, contractual requirements, external or internal customer, etc.

independence of testing: separation of responsibilities, which encourages the accomplishment of objective testing

Page 3:

Independent Testing 2
• Benefits of independence
  - Different defects are found
  - Unbiased perspective
  - Verification of any assumptions made during specification
• In Agile development, testers may be part of the development team
  - The product owner usually performs the acceptance testing
  - Drawbacks and risks of testers being part of the development team
    ● Defects may be documented later
    ● Defects may be forgotten, especially if they were not fixed in the sprint
    ● The team may not have the tools in place to support Agile practices, and so on
• Drawbacks of testers not being part of the development team
  - Isolation from the development team
  - Developers may lose a sense of ownership for quality
  - Testers may miss important information
  - Testers may be perceived as a bottleneck (blamed for delays in release)

Page 4:

Tasks of a Test Manager and Tester

• Test manager
  - Responsible for the test process and the successful leadership of test activities
  - The person responsible for project management of testing activities and resources, and for evaluation of a test object
  - The individual who directs, controls, administers, plans, and regulates the evaluation of a test object
  - The role may be performed by a
    ● Project manager
    ● Development manager
    ● Quality assurance manager
• ISTQB recognizes two distinct testing roles
  - Each has some typical activities, although these depend on the context and the organization
  - The roles may be performed by people in specific testing roles or as part of other roles
  - The way in which the roles are carried out varies depending on the SDLC
• Tester
  - Responsible for creating and executing the test cases

Page 5:

Tasks of a Test Manager
1. Write or review the test policy and strategy
2. Plan the test activities
3. Write and update the test plan(s)
4. Coordinate the test strategy and plan with the project manager and others
5. Share testing perspectives with other project activities
6. Initiate the analysis, design, implementation, and execution of tests
7. Monitor test progress and results and verify exit criteria
8. Gather information and prepare testing progress and summary reports
9. Adapt planning based on test results and progress
10. Support defect and configuration management of testware for traceability
11. Implement metrics for measuring test progress and evaluating quality
12. Support the selection and implementation of tools for the test environment, configuration management, and automation, and recommend budgets
13. Promote and advocate for the testers, the test team, and the test profession within the organization
14. Develop the skills and careers of testers

Page 6:

Tasks of a Tester
1. Review and contribute to the plans
2. Review requirements and other artifacts from which test cases can be derived
3. Create test conditions; link traceability between the test cases and the associated documentation
4. Set up the test environment
5. Design and implement test cases and test procedures
6. Prepare and acquire test data
7. Create the detailed test execution schedule
8. Perform test execution; document results and any defects
9. Use appropriate tools
10. Automate tests as required
11. Participate in test case reviews
12. Evaluate non-functional characteristics

Page 7:

Purpose and Content of a Test Plan

Example Test Plan

1. Context of the testing

a) Project/Test sub process

b) Test item(s)

c) Test scope

d) Assumptions and constraints

e) Stakeholders

2. Testing communication

3. Risk register

a) Product risks

b) Project risks

4. Testing strategy

a) Test subprocesses

b) Test deliverables

c) Test design techniques

d) Test completion criteria

e) Metrics

f) Test data requirements

g) Test environment requirements

h) Retesting and regression testing

i) Suspension and resumption criteria

j) Deviations from Organizational Test Strategy

5. Testing activities and estimates

6. Staffing

a) Roles, activities and responsibilities

b) Hiring and training needs

7. Schedule

• Planning is influenced by
  - Test policy, strategy, scope, and objectives
  - Lifecycles, methods, risks, and constraints
  - Resources, testability, and criticality
• Planning may be documented using a master test plan or a separate plan for each "test level"
• An outline of a test plan may include
  - Scope, objectives, and approach
  - Integration of the test activities into the SDLC
  - Resources, test schedule, and activities
  - Features to be tested
  - Test metrics and budget
  - Test documentation and structure
• The content of test plans varies between projects

Page 8:

ISO/IEC/IEEE 29119-3:2013, Part 3: Test documentation
https://en.wikipedia.org/wiki/ISO/IEC_29119

Sample ISO standard test plan templates

1. Organizational Test Process Documentation:
- Test Policy
- Organizational Test Strategy

2. Test Management Process Documentation:
- Test Plan (including a Test Strategy)
- Test Status
- Test Completion

3. Dynamic Test Process Documentation:
- Test Design Specification
- Test Case Specification
- Test Procedure Specification
- Test Data Requirements
- Test Data Readiness Report
- Test Environment Requirements
- Test Environment Readiness Report
- Actual Results
- Test Result
- Test Execution Log
- Test Incident Report

Page 9:

ISO/IEC/IEEE 29119-3 standard

https://slideplayer.com/slide/2784167/

Page 10:

Test Strategy and Test Approach 1
• Test strategy
  - A generalized description of the test process
  - Defined and refined in the test plans and test designs
  - Integral in selecting the test design techniques and test types
• Test approach
  - Tailors the test strategy for a particular project or release
  - The starting point for selecting test techniques, levels, and types, and for defining entry/exit criteria
• When tailoring the test strategy, consider
  - Complexity, goals, and objectives of the project
  - Product type, resources, and skills
  - Safety and regulations
  - Technology and product risk analysis
  - System type (COTS, in-house, or custom built)

Page 11:

Test Strategy and Test Approach 2

Test Strategy Examples
- Analytical: based on an analysis of some factor. Testing is directed toward the area of greatest risk
- Model-based: based on some model of a required aspect of the product. Examples of such models include business process models, state models, and reliability growth models
- Methodical: relies on making systematic use of some predefined set of tests or test conditions. Examples: likely failure types, a list of important quality characteristics, or company-wide look-and-feel standards
- Process- or standard-compliant: follows industry-specific standards or the various Agile methodologies
- Directed or consultative: driven primarily by the advice of business domain and technology experts
- Regression-averse: reuse of existing test material, extensive automation, and standard test suites
- Reactive: tests are designed and implemented, and may immediately be executed, in response to knowledge gained from prior test results. Exploratory testing is a common technique employed in reactive strategies

Page 12:

Entry Criteria and Exit Criteria 1
• Entry criteria define when to start testing; the "definition of ready" in Agile development
  - At the beginning of a test level
  - When a group of tests is ready for execution
  - Define them for each test level and test type, based on the test objectives
  - Typically, entry criteria may cover the following
    ● Availability of testable requirements and user stories, and completion of code
    ● Availability of the test environment, data, and tools
    ● Availability of test items that have met the exit criteria for any prior test levels
• Exit criteria define when to stop testing; the "definition of done" in Agile development
  - At the end of a test level or phase
  - When a set of tests has achieved a specified goal
  - Define them for each test level and test type, based on the test objectives

Page 13:

Entry Criteria and Exit Criteria 2
• Exit criteria (cont.)
  - Typically, exit criteria may cover the following
    ● Test completion
    ● Traceability information
    ● Defect information (density, severity, and priority)
    ● Evaluated level of non-functional and quality characteristics
    ● Risks associated with open defects
• Common reasons for test activities to be shortened
  - Budget, schedule, market pressures, etc.
  - Stakeholders and business owners must accept the risks associated with a go-live decision without further testing
• Establish the exit criteria and completion strategy
  - Early in the project
  - Map the test cases to the source documents
  - Get agreement on the acceptance criteria (the basis for delivery to the customer)
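The idea of checking entry and exit criteria can be sketched as a simple evaluation of named checks. This is an illustrative sketch only; the criterion names and the helper function are invented, not part of any standard.

```python
# Hypothetical sketch: represent entry (or exit) criteria as named
# boolean checks and report whether a test level may start (or stop).
def evaluate_criteria(criteria: dict) -> tuple:
    """Return (all_met, list of unmet criterion names)."""
    unmet = [name for name, met in criteria.items() if not met]
    return (len(unmet) == 0, unmet)

# Invented example data mirroring the entry criteria listed above.
entry_criteria = {
    "testable requirements available": True,
    "test environment ready": True,
    "prior level met its exit criteria": False,
}

ready, blockers = evaluate_criteria(entry_criteria)
print(ready)     # False
print(blockers)  # ['prior level met its exit criteria']
```

The same function would serve for exit criteria; only the check names change.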

Page 14:

Test Execution Schedule (example) 1
• Example project and activities
  - The company you work for has purchased an application that provides home security and automation. There are some concerns that only limited automation should be provided because of the delivery schedule. The team has decided to automate the security system to work with other third-party devices and needs you to create a test schedule (cont. next slide)
• Goal: define the order in which tests should be executed
• Consider the following
  - Determine the critical pieces of the system and prioritize which tests or groups of tests should be scheduled
  - Identify any test case dependencies
  - List all valid scenarios and perform
    ● Positive tests (confirmation)
    ● Regression tests
  - The most efficient sequence for executing the tests
• Evaluate the trade-offs to consider
  - Efficiency of test execution versus adherence to prioritization

Page 15:

Test Execution Schedule (example) 2
• Additional project information: the home security system provides the following features
  - Arm Away – use this feature to arm the system when you are not at home. The system will detect an alarm condition if the following happens
    ● Motion inside the home
    ● A window opens – but you can set it to bypass via a program
    ● A door opens – with a delay of 30, 60, or 90 seconds, as you decide. This gives you time to turn off the alarm system when you open the door
  - Arm Stay – use this feature when you are at home
    ● Motion (not applicable)
    ● A window opens – but you can set it to bypass via a program
    ● A door opens – with a delay of 30, 60, or 90 seconds, as you decide. This gives you time to turn off the alarm system when you open the door

Page 16:

Test Execution Schedule (example) 3
• Additional project information (cont.)
  - Arm Instant – use this feature when you are not expecting any visitors
    ● Motion (can be defined as no delay)
    ● Window (no delay), but you can set it to bypass
    ● Door (no delay) – instantly alarms and sends a signal
  - Home Automation
    ● Users have the ability to speak into smart devices to arm/disarm their system remotely
    ● Users are also sold smart plugs and door locks. These devices allow you to control turning on/off lights and small appliances in your home
    ● In addition, you can set up programs to perform many of these tasks on a daily or weekly basis
• Test cases should be derived from these alarm attribute features

Page 17:

Suggested Test Execution Schedule (example) 4

A) Test Schedule (Priority)
1. Arm Away
2. Arm Instant
3. Arm Stay
4. Disarming (Away, Stay, and Instant)
5. Repeat 1 thru 4 with the door lock
6. Repeat 1 thru 4 with the motion detector
7. Etc.

B) Test Schedule (Dependency)
1. All arming features
2. All disarm features
3. All bypass features
4. All smart plug features
5. All door lock features
6. All window features
7. Etc.

C) Test Schedule (Positive tests)
1. Arm Away, no bypassing
2. Arm Stay, no bypassing
3. Arm Instant, no bypassing
4. Arm Stay, bypass 1 door
5. Arm Stay, bypass 1 window
6. Etc.

D) Test Schedule (Regression)
1. Alarm (arming/disarming)
2. Alarm conditions (active alarm)
3. Door lock features
4. Window sensor features
5. Smart plug features

E) Test Schedule (Sequence)
1. Alarm features
2. Door lock features
3. Window sensor features
4. Smart plug features
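Combining the priority and dependency views above can be sketched as a small scheduling routine: run each test after its prerequisites, and among the tests that are ready, run the highest-priority one first. The test names, priorities, and dependency data below are invented for illustration.

```python
# Illustrative sketch: Kahn-style topological sort with a priority heap.
# tests: name -> priority (lower number = more urgent)
# deps:  name -> list of prerequisite test names
import heapq

def schedule(tests: dict, deps: dict) -> list:
    pending = {t: set(deps.get(t, [])) for t in tests}
    ready = [(p, t) for t, p in tests.items() if not pending[t]]
    heapq.heapify(ready)
    order = []
    while ready:
        _, t = heapq.heappop(ready)        # highest-priority ready test
        order.append(t)
        for other, prereqs in pending.items():
            if t in prereqs:               # unblock dependents
                prereqs.remove(t)
                if not prereqs and other not in order:
                    heapq.heappush(ready, (tests[other], other))
    return order

tests = {"arm_away": 1, "arm_instant": 2, "arm_stay": 3, "disarm": 4}
deps = {"disarm": ["arm_away", "arm_instant", "arm_stay"]}
print(schedule(tests, deps))
# ['arm_away', 'arm_instant', 'arm_stay', 'disarm']
```

This makes the trade-off on slide 14 concrete: strict priority order may conflict with dependencies, and the dependency constraint wins.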

Page 18:

Factors Influencing the Test Effort

• Goal: estimate the amount of effort needed so that a schedule can be drawn up and resources allocated
• Factors influencing the test effort
1. Product characteristics
   1. Product risks
   2. Quality of the test basis
   3. Product size
   4. Complexity
   5. Requirements for quality characteristics
   6. Test documentation
   7. Legal and regulatory requirements
2. Process characteristics
   1. Organization (stability and maturity)
   2. SDLC
   3. Test approach
   4. Tools
   5. Test process
   6. Time pressure
3. People characteristics
   1. Skills and experience of personnel
   2. Team cohesion
   3. Leadership
4. Test results
   1. Defects (severity and count)
   2. Rework effort

Page 19:

Test Estimation Techniques
• Two approaches for estimating testing effort
  - Metrics-based techniques (based on historical data from previous projects)
  - Expert-based techniques (based on SMEs)
• The testing effort may depend on
  - Characteristics of the product (complexity, criticality, and size)
  - Characteristics of the development process (maturity, skills, tools)
  - The outcome of testing (defects and rework)
• Agile
  1. Burndown charts (metrics)
  2. Team velocity (metrics)
  3. Planning poker (expert)
• Sequential
  1. Defect removal and correction (metrics)
  2. Wideband Delphi (expert)

SME: subject-matter expert

For example, in Agile development, burndown charts are an example of the metrics-based approach: effort is captured and reported, and then feeds into the team's velocity to determine the amount of work the team can do in the next iteration. Planning poker, by contrast, is an example of the expert-based approach, as team members estimate the effort to deliver a feature based on their experience.
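The metrics-based velocity idea can be sketched in a few lines: average the completed story points of past iterations and divide the remaining work by that average. The figures below are invented sample data.

```python
# Hedged sketch of velocity-based estimation (metrics-based approach).
import math

def iterations_needed(remaining_points: int, past_velocities: list) -> int:
    """Estimate remaining iterations from historical velocity."""
    velocity = sum(past_velocities) / len(past_velocities)  # average points/iteration
    return math.ceil(remaining_points / velocity)

# 55 story points of testing work remain; the last three iterations
# completed 18, 22, and 20 points (average velocity 20).
print(iterations_needed(55, [18, 22, 20]))  # 3
```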

Page 20:

Test Monitoring and Control
• Test monitoring
  - A test management activity that involves checking the status of testing activities, identifying any variances from the planned or expected status, and reporting status to stakeholders
  - Provides visibility of testing activities (planned versus actual)
  - Ongoing throughout the testing process to ensure there is feedback
  - Reports are prepared measuring test exit criteria
  - Information to be monitored may be collected manually or automatically
• Test control
  - A test management task that deals with developing and applying a set of corrective actions to get a test project on track when monitoring shows a deviation from what was planned
• Examples of test control actions
  - Re-prioritizing testing activities when a risk is identified
  - Modifying the test schedule because resources are not available – these resources may include people, environment, tools, etc.
  - Re-evaluating whether an item meets an entry/exit criterion because of rework
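The monitoring-then-control loop can be sketched as a planned-versus-actual comparison that flags a deviation. The tolerance threshold and all names are invented for illustration.

```python
# Illustrative sketch: flag a deviation from the plan at a checkpoint.
def check_progress(planned: int, executed: int, tolerance: float = 0.1) -> str:
    """planned/executed test-case counts; tolerance is an allowed shortfall fraction."""
    if executed >= planned * (1 - tolerance):
        return "on track"
    return "deviation: apply test control actions"

print(check_progress(planned=100, executed=95))  # on track
print(check_progress(planned=100, executed=70))  # deviation: apply test control actions
```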

Page 21:

Metrics Used in Testing
• Metrics are used to assess the progress of testing
  - Collected throughout the testing activities, manually or automatically
  - Help in the measurement of exit criteria
  - Assess the quality of the product
• Common test metrics
  1. Test completion
  2. Test success rate
  3. Test coverage
  4. Plan versus actual (schedule and cost)
  5. Defect information
     - Defect density: the number of defects per unit size of a work product
     - Failure rate: the ratio of the number of failures of a given category to a given unit of measure
     - Defects found and fixed
     - Retest results

Page 22:

Purposes, Contents, and Audiences for Test Reports

• Summarize information about testing
  - What happened during and at the end of testing?
• Typical items that may be addressed in a test progress report
  - Status of test activities and progress against the test plan
  - Obstacles and blocking events
  - Testing planned for the next cycle or iteration
  - Quality attributes pertaining to the product
• Typical items that may be addressed in a test progress or summary report
  - Summary of test activities and information on what occurred during a test period
  - Any deviations from the plan, timeline, and test activities, e.g., blocking events
  - Test completion and test success rate
  - Defect quality information
  - Residual risks
  - Reusable test work products produced

Page 23:

Example: Test Summary Report (TSR) Template
• Purpose of the IEEE TSR or ISO Test Completion Report
  1. Summarize the results of testing activities
  2. Provide an evaluation based on the results

test summary report (synonym: test report): a test report that provides an evaluation of the corresponding test items against exit criteria

https://www.softwaretestinghelp.com/test-summary-report-template-download-sample/

Page 24:

Configuration Management (CM) Terminology
• Configuration management (CM)
  - A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements
• Version control (configuration control)
  - An element of configuration management, consisting of the evaluation, coordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification
• Configuration item
  - An aggregation of work products that is designated for configuration management and treated as a single entity in the configuration management process
• Configuration Control Board (CCB)
  - Represents the stakeholders who may be impacted
  - Approves changes after assessing their impact

Page 25:

Configuration Management (CM)

• A secure repository of intellectual property
• Maintains the integrity of the products throughout the life cycle
• For testing, CM activities may help ensure that
  - Testware items are identified, version controlled, and tracked for changes
  - Traceability is maintained throughout the process
  - Identified documents are unambiguously referenced in the test documentation
  - There is full insight into artifacts, so that they are uniquely identified and reproducible
• During test planning
  - CM procedures and infrastructure tools should be selected, documented, and implemented

Page 26:

Tools Should Support Traceability Throughout the Process

• All work products, including test documentation, are covered by CM
  - Identified, version controlled, tracked for changes, traceable, and reproducible

Page 27:

Example – Versioning
• A circle represents a version number
• In addition to a Change Request (CR), a tool is typically used to track and answer the following
  a) What was changed?
  b) Who made the change?
  c) When did it change?
  d) Why were the changes made?
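The four questions above (what, who, when, why) map naturally onto the fields of a change record, which is roughly what versioning tools store per revision. All class and field names below are invented for illustration.

```python
# Illustrative sketch: one change-history entry answering what/who/when/why.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeRecord:
    item: str        # what was changed
    author: str      # who made the change
    reason: str      # why it was made (e.g., an invented CR number)
    when: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

history = []
history.append(ChangeRecord("login.test_plan", "a.tester", "CR-0042: new password rules"))
print(history[0].item, history[0].reason)
```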

Page 28:

Example – Change Request
• Had this world been perfect, a system would be created and there would be no future changes. Unfortunately, it is not a perfect world, and after a system is deployed many changes are needed, thereby giving birth to change requests (CRs)
• A CR is a documented request to modify the current software system, usually supplied by the user. It is typically different from a defect report, which reports an anomaly in the system
• A CR is generated as part of the configuration control activity under the process of SCM (Software Configuration Management)
• Some of the reasons for change are
  - The requirements change
  - The design changes
• The CR form (right) serves as the vehicle to record and disseminate the actions of change control
• The actual structure of the change information may differ according to the requirements of the implementation

Every change should be supported by a documented request!

Page 29:

Example: Traceability From Requirements to Test Cases

• Source: customer, business, or user requirements
• Test cases
• An intelligent numbering scheme achieves two outcomes
  - Provides traceability to and from the source
  - Captures the extent of test coverage
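The numbering idea can be sketched as a mapping from structured IDs such as "Away.TC.002" (test case) back to "Alarm.SR.001" (requirement); inverting that mapping shows coverage per requirement at a glance. The ID formats follow the slide's examples, but the mapping data itself is invented.

```python
# Illustrative sketch: derive requirement coverage from a traceability map.
def coverage(trace: dict) -> dict:
    """trace: test-case ID -> requirement ID it verifies."""
    by_req = {}
    for tc, req in trace.items():
        by_req.setdefault(req, []).append(tc)
    return by_req

trace = {
    "Away.TC.001": "Alarm.SR.001",
    "Away.TC.002": "Alarm.SR.001",
    "Stay.TC.001": "Alarm.SR.002",
}
print(coverage(trace))
# {'Alarm.SR.001': ['Away.TC.001', 'Away.TC.002'], 'Alarm.SR.002': ['Stay.TC.001']}
```

A requirement missing from the result has no traced test cases, which is exactly the coverage gap the numbering scheme is meant to expose.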

Page 30:

Example: Traceability From Requirements to Test Cases

• A common method of documenting test cases
• Question – why did Away.TC.002 fail? (spec. slide 15)
• Come up with one additional test case that you would suggest for the requirement Alarm.SR.001
  - Remember that your alarm system has many states (Arm Away, Arm Stay, and Arm Instant)
  - There are multiple test cases traced to SR.001. This indicates that many test cases would be required in order to sufficiently test the requirement

Page 31:

Risks and Testing
• Risk is usually assessed in terms of likelihood and impact
• In risk management, one must understand the following
  - Risks must be recognized; they don't disappear (we can't ignore them)
  - Risks have a cost and must be planned for
  - Stakeholders must be informed of risks
  - Decisions are made regardless of the risks expected
  - Risks that actually happen are called "issues"
• A key part of a risk scoring system is the risk assessment matrix
  - The matrix provides a visual illustration to communicate risk levels
• The example (right) indicates the likelihood of certain events happening and the associated impact

risk: a factor that could result in future negative consequences
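A risk assessment matrix like the one described can be sketched as likelihood × impact scoring with bucketed levels. The 1-5 scales are common practice, but the thresholds below are invented, not from any standard.

```python
# Illustrative sketch: score = likelihood x impact, bucketed into levels.
def risk_level(likelihood: int, impact: int) -> str:
    """likelihood and impact on a 1-5 scale; thresholds are invented."""
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

print(risk_level(4, 5))  # high   (score 20)
print(risk_level(2, 4))  # medium (score 8)
print(risk_level(1, 3))  # low    (score 3)
```

In risk-based testing (slides 37-38), such levels would then drive the extent and order of testing.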

Page 32:

Example: Risk Management Process

• The overall risk management process is documented in a Risk Management Plan (RMP)
  - Established early during the initiative
  - Updated periodically to ensure it remains relevant
• Initiatives must be managed from start to finish
• Typical steps in the risk management process include
  1. Publish and distribute the RMP
  2. Identify risks and risk triggers
  3. Analyze and assess risks
  4. Develop and implement risk responses (including budget and timelines)
  5. Conduct regular risk reviews
  6. Collect and apply lessons learned to improve risk management

Page 33:

Example: Risk Triggers and Using Risk Triggers for Improvement

1. Identify potential triggers of risks
2. Set up a system to monitor whether risk triggers have occurred
3. Use the reality of a trigger in order to
   - Improve your prediction of risk events
   - Improve response time to a risk event
   - Refine your ability to respond in an efficient manner

Note: additional benefits of identifying and monitoring risk triggers
- Boost confidence in risk management processes
- Diminish judgement from stakeholders

Page 34:

Product and Project Risks 1
• Product risk
  - The possibility that a work product may fail to satisfy the legitimate needs of its users
  - A risk directly related to the test object
  - Referred to as a quality risk when it impacts quality characteristics
    ● Examples: functional suitability, reliability, performance efficiency, usability, security, compatibility, maintainability, and portability
• Project risk
  - A risk related to the management and control of the (test) project
  - A risk that the project may not be able to deliver
    ● Examples: lack of staffing, strict or unrealistic deadlines, changing requirements
  - Project risks are different from, but may lead to, product risks

Page 35:

Product and Project Risks 2
• A project risk involves situations that, should they occur, may have a negative effect on a project's ability to achieve its objectives
• Project issues
  - Delays – in delivery, task completion, or satisfaction of exit criteria
  - Inaccurate estimates, reallocation of funds, or general cost cutting
  - Changes – may result in substantial rework
• Organizational factors
  - Staffing – lack of resources, skill-set issues, and training
  - Personnel – conflicts and communication between groups
  - Availability of key resources, priorities
• Supplier issues
  - Vendor (fails to deliver, out of business, bankruptcy)
  - Contractual issues (late delivery, noncompliance)

Examples of product risks
• Software that does not meet specifications
• Software that does not meet user needs
• Software that is potentially harmful to an individual or company
• Response times that may be inadequate
• Customers cannot use it (usability)
• Software that does not satisfy non-functional requirements
  - Reliability
  - Maintainability
  - Conformance to standards

Page 36:

Product and Project Risks 3
• Technical issues
  - Requirements – poorly written or not satisfied
  - Test environment – not ready
  - Data conversion, migration planning, and tool support – not ready
  - An immature development process may impact the consistency or quality of project work products
  - Quality – in the defect management process, design documentation, code, and tests
• A project risk may affect both development activities and test activities
• The project manager is responsible for handling project risks
• It is not unusual for test managers to have responsibility for test-related project risks

Page 37:

Risk-Based Testing and Product Quality 1
• Testing is used as a risk mitigation activity to
  - Provide feedback about identified risks and information about unresolved risks
  - Decrease uncertainty about risks
  - Support the identification of new risks
  - Determine which risks should be reduced
• Risk-based testing: testing in which the management, selection, prioritization, and use of testing activities and resources are based on corresponding risk types and risk levels
  - Involves the identification of product risks and the use of risk levels to guide the test process
• Provides proactive opportunities to help reduce the level of product risks
• The following key steps should be taken when using this approach
  - Identify the product risks and their likelihood and impact
  - Use the product risk information to guide the test process activities
    ● Test planning, test case development, and execution
    ● Test monitoring and control

Page 38:

Risk-Based Testing and Product Quality 2
• In a risk-based approach, the results of product risk analysis are used to
  - Determine the test techniques to be used
  - Determine the particular levels and types of testing to be performed
  - Determine the extent of testing to be carried out
  - Prioritize testing in an attempt to find the critical defects as early as possible
  - Determine whether any activities in addition to testing could be employed to reduce risk
• Risk-based testing draws on the collective knowledge and insight of the project stakeholders to carry out product risk analysis
• To ensure that the likelihood of a product failure is minimized, risk management activities provide a disciplined approach to
  - Analyze (and re-evaluate on a regular basis) what can go wrong (risks)
  - Determine which risks are important
  - Implement actions to alleviate the risks
  - Make contingency plans to deal with the risks should they become actual events

Page 39:

Defect Management 1
• Defect management: the process of recognizing and recording defects (bugs, faults), classifying them, investigating them, taking action to resolve them, and disposing of them when resolved
• When managing defects to resolution, an organization should
  - Establish a defect management process
  - Document a workflow and rules for classification
  - Gain consensus on the process among the impacted parties
• Minimize the number of false positives reported as defects
  - Failures due to network communication, etc.
  - Failures due to environment, hardware, etc.
• Defects can and should be reported against the following
  - Source code
  - Requirements, user stories, and acceptance criteria
  - Documents (development and test documents, user manuals, or install guides)
• The organization should define a process for reporting defects
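A documented defect workflow, as called for above, is essentially a small state machine: only certain transitions between defect states are legal. The states and transitions below are an invented example of such a workflow, not an ISTQB-prescribed one.

```python
# Illustrative sketch: a defect-management workflow as a state machine.
TRANSITIONS = {
    "new": {"assigned", "rejected"},   # triage: real defect or false positive
    "assigned": {"fixed"},
    "fixed": {"retest"},
    "retest": {"closed", "assigned"},  # retest passes, or the defect is reopened
}

def advance(state: str, target: str) -> str:
    """Move a defect to a new state, rejecting illegal transitions."""
    if target not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

s = "new"
for step in ("assigned", "fixed", "retest", "closed"):
    s = advance(s, step)
print(s)  # closed
```

Encoding the workflow this explicitly is one way a defect tracking tool enforces the documented process.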

Page 40:

Defect Management 2
• Objectives of a defect report
  - Summary information about the defect(s)
  - Steps for reproducing it
  - Logs and screenshots
  - Track the quality of the work product
  - Note impacts on testing (e.g., too many defects leaving less time for test execution)
  - Provide ideas for development and test process improvements
• A defect report filed during dynamic testing typically includes →
• Details may be automatically included and/or managed when using defect management tools
• Defects found during static testing will normally be recorded in review meeting notes