Market Operator Services Ltd ©
Master Test Strategy
MOSL Central Market Operating System (CMOS)
Version 1: published 30/11/2015
Version 2.2_Final: published 08/01/2016 (this document)
Version 3: updates to Data & Environment sections due January
Version 2.2_Final
Version Date: 08/01/16
Status: Final
Date Last updated: 08/01/16
Last updated by: Jason Taylor
Author: Jason Taylor
Table of Contents

1 Version Control
2 Executive Summary
3 Project & Current Technical Landscape Overview
   3.1 Technical Landscape Overview
   3.2 High Level Plan
   3.3 CGI Project Team Structure
   3.4 MOSL Project Team Structure
   3.5 Current Market Participation Landscape
4 Test Approach
5 Test Phases
   5.1 To Verify the Solution Delivery
      5.1.1 Static Testing (Complete)
      5.1.2 Unit Testing (supports development phases 4th Dec ’15 – 26th Feb ’16)
      5.1.3 Settlements Testing (tested in Unit, SIT, UAT 4th Dec ’15 – 3rd Aug ’16)
      5.1.4 System & Integration Testing (SIT) (P1 Iteration 1 – P2 Iteration 4 12th Oct ’15 – 19th May ’16)
      5.1.5 User Acceptance Testing (UAT) (Phase 1 to Phase 2 2nd Feb ’16 – 3rd Aug ’16)
      5.1.6 Failover & Recovery Testing (FRT) (8th Sept ’16 – 28th Sept ’16)
      5.1.7 Performance Testing (25th Apr ’16 – 7th Sept ’16)
      5.1.8 Security & Penetration Testing (24th May ’16 – 28th Jul ’16)
   5.2 To Verify the Market Operation
      5.2.1 Service Readiness Testing (29th July ’16 – 25th Aug ’16)
   5.3 To Verify the Market Participation
      5.3.1 Connection Test (16th & 17th Dec ’15)
      5.3.2 Sector Test System (STS1 & 2 for B2B High Volume Interface) Test (STS1 Jan ’16 – STS2 Feb ’16 to the end of March ’16)
      5.3.3 Market & Company Readiness (Market Entry Assurance Certification) (4th Aug ’16 – 30th Sept ’16)
      5.3.4 Interface and Data Transaction Testing (August 2016)
         5.3.4.1 Objectives
         5.3.4.2 Approach
         5.3.4.3 Outcome
         5.3.4.4 Timing and resource needs
         5.3.4.5 Supporting materials
         5.3.4.6 Preparing for success
      5.3.5 Market Scenario Testing (August to September 2016)
         5.3.5.1 Objective
         5.3.5.2 Approach
         5.3.5.3 Outcome
         5.3.5.4 Timing and resource requirements
         5.3.5.5 Supporting materials
         5.3.5.6 Preparing for success
      5.3.6 Market & Company Readiness (Shadow Market Operation) (4th Oct ’16 – 5th Apr ’17)
6 Testing Scope
   6.1 Risk Based Testing
   6.2 Functional Areas to be Tested (To Be Confirmed)
   6.3 Features Not to be Tested
   6.4 Non Functional Requirements to be Tested
7 Test Environments
   7.1 Dev & Test Environment - TBC
      7.1.1 Bridgeall Test Environment for Tariffs & Settlements
      7.1.2 Unit Test Environment
      7.1.3 Systems & Integration Test Environment
      7.1.4 Data Migration Test Environment
   7.2 Pre-Production Test Environment - TBC
      7.2.1 User Acceptance Test Environment
      7.2.2 Performance Test Environment
      7.2.3 Security & Penetration Test Environment
      7.2.4 Service Readiness Test Environment
      7.2.5 Market & Company Readiness Test Environment
   7.3 Configuration Management
   7.4 Release Management
8 Test Data Management - TBC
   8.1 UAT Test Data
9 Communication Strategy
   9.1 Communications Approach
10 Defect Management
   10.1 General Principles
   10.2 Defect Priority & Severity
   10.3 Defect Turnaround SLAs
   10.4 Triage Process
   10.5 Open Defect Review at End of Test Phases
   10.6 Defect Management Process Flows
11 Test Governance
   11.1 Entry & Exit Criteria Process (Quality Gates)
      11.1.1 Agree quality gate acceptance criteria
      11.1.2 Compile evidence
      11.1.3 Prepare for gate review
      11.1.4 Gate review
      11.1.5 Test phase complete
   11.2 Suspension & Resumption Criteria
12 Roles & Responsibilities
   12.1 Testing RACI Matrix
13 Appendix A – References
14 Appendix B – General Principles of Testing
   14.1 Stages of the Test Approach
   14.2 Solution Provider Approach (CGI, CMOS)
15 Appendix C – Test Case Design
   15.1 Design & Document Test Assets
   15.2 Benefits of Identifying Test Scenarios and Use Cases Early
   15.3 Test Design Specification (from IEEE 829)
   15.4 Test Case Design Techniques Being Employed
      15.4.1 Equivalence Partitioning
      15.4.2 Boundary Value Analysis
      15.4.3 State Transition
      15.4.4 Decision Tables
   15.5 Create Manual Test Scenarios
   15.6 Create Manual Test Cases
   15.7 Create Manual Test Scripts
   15.8 Develop Test Schedules
   15.9 Test Data Requirements & Preparation
      15.9.1 Data Security
      15.9.2 Base Data
   15.10 Practicality of the Database Requirements
   15.11 Newly Created Base Data
   15.12 Synchronisation with Other Databases
   15.13 Amendments and Cancellations
   15.14 Transactional Data
   15.15 Maintenance Routines
   15.16 User Set Up
   15.17 Audit Trails
   15.18 Best Practice
   15.19 Resources
   15.20 Escalation
   15.21 Process Improvement
   15.22 Quality
16 Appendix D – Test Reports
   16.1 Test Metrics
   16.2 Planning for Test Metrics
   16.3 Baseline Metrics
      16.3.1 Time to Test Estimation
      16.3.2 Test Execution Report
      16.3.3 Test Summary Report
   16.4 KPI Metrics
   16.5 Defect Management Report
   16.6 End of Test Report (EOTR)
   16.7 Weekly Reporting
17 Appendix E – Test Environment Management Deliverables
18 Appendix F – Test Data Management
   18.1 MDM Testing Levels and Objectives
      18.1.1 Integration Testing
      18.1.2 System Testing
      18.1.3 Data Validation
      18.1.4 Acceptance Testing
      18.1.5 Completion Criteria
19 Appendix G – Test Deliverables
20 Appendix H – Test Tools
   20.1 HP ALM
   20.2 JMeter – for Load and Performance Testing
   20.3 Bridgeall SmartTest for Automation
21 Appendix I – Test Assumptions, Dependencies & Risks
   21.1 Assumptions
   21.2 Dependencies
   21.3 Risks and Contingencies
1 Version Control

Version History

Version No. | Date | Reason for Amendment
----------- | ---- | --------------------
0.1 | 14.03.11 | Initial Draft
0.2 | 20.11.15 | Updated following review feedback from Ben Jeffs
0.3 | 23.11.15 | Review and update by Jason Taylor
0.4 | 24.11.15 | Various updates
0.5 | 25.11.15 | Restructured following feedback from Ben Jeffs
0.6 | 25.11.15 | Restructured following additional feedback from Ben Jeffs
0.7 | 26.11.15 | Market Participant replaced by Trading Party
1.0 | 30.11.15 | Final updates based on Test SIG feedback and font sizes
2.0 | 22.12.15 | Updated to contain Market & Company Readiness for both Market Entry Assurance and Shadow Operations; updates based on CGI feedback
2.1 | 30.12.15 | Further updates based on CGI feedback
2.2_Final | 08.01.16 | Final updates from CGI, ‘Trading Party’ terminology updated to ‘Market Participant’, and publish
Document Owner

Authorised for Release By: Mark Salisbury
Owner (signature): .......................
Reviewer (signature): .......................

Original Author: Jason Taylor
Owner: Jason Taylor
Last Updated by: Jason Taylor
Distribution List

Name | Role | Purpose | Signature | Date
---- | ---- | ------- | --------- | ----
Ben Jeffs | CEO | Signature | |
Mark Salisbury | Delivery Director | Signature | |
Martin Silcock | Director of MOSL Market Services | Signature | |
2 Executive Summary
Market Operator Services Limited (MOSL) is working with DEFRA, Ofwat and the water companies
as part of the Open Water Programme. Open Water will offer non-household customers a choice of
supplier and access to tailored packages that best suit their needs.
MOSL's role is to deliver the operational capability needed to support the efficient operation of the
new market when it opens in April 2017 via the Central Market Operating System (CMOS),
developed by CGI, the solution provider.
It will deliver the IT systems that enable registration, customer switching and settlement between
wholesalers and retailers.
MOSL CMOS will be tested at the following levels:
Solution Delivery – validates that CMOS has been delivered as stated in the solution provider’s contract, with the required technical attributes, including performance, security, maintainability and support.
Market Operation – validates CMOS and the MOSL operating model against the Open Water Codes.
Market Participation – validates against the market and company readiness requirements.
By completing these phases of testing MOSL will have proven the market architecture.
The purpose of this document is to define one common Master Test Strategy detailing the test
approach that will be used to verify all aspects of the delivered solution to ensure that it meets the
legislative and business requirements prior to being promoted to “live production” in April 2017.
The owner of this document will be the MOSL Test Lead assigned to this programme of work.
Prospective members will be described in this document as ‘Market Participants’, as a ‘Market Participant’ is not a ‘Trading Party’ until it has met all of the trading criteria.
This MTS is a working document where dates may be amended subject to project change
controls over the duration of the project delivery.
3 Project & Current Technical Landscape Overview
3.1 Technical Landscape Overview
The information contained in this section 3.1 of the document is confidential to CGI and/or CGI group companies. The content of this section shall not be reproduced in any form or by any mechanical or electronic means, including electronic archival systems, without the prior written approval of CGI.
MOSL is a newly formed company, funded by its members (Market Participants), the English water utility companies, to provide Market Operator (MO) services for the newly forming English non-household water market. These services predominantly allow MOSL, as MO, to act as the settlement house, delivering support to both retailers and wholesalers in the potable (drinking water) and non-potable water market. MOSL will also provide customer switching, tariffs, reporting and market performance measurement services.
As the audience for this document is predominantly MOSL and its recognised associates and third parties, a full breakdown of MOSL as a company and its operating model is not provided here; this can be supplied upon request.
The driver behind the MOSL CMOS Project is to deliver the application that MOSL will use to act
as MO to the English non-household water market.
The CMOS architecture uses a loosely coupled SOA approach containing multiple systems to
provide the full CMOS application detailed here in Figure 1:
Figure 1 - CMOS system
The Central Market Operating System (CMOS) contains:
MarketPortal (MP)
o Provides the single point of access for external users and systems, as well as access for MOSL users to all functions other than those provided via the System Monitor. It does this via multiple delivery channels, such as a web browser UI (WebGUI) and a system-to-system IDEX (B2B Hub).
o In addition, the MarketPortal contains all the generic functional components required by a market operator system, for example access control, audit logging, validation services, and process definition and management.
Central Connection Register (CCR)
o The Central Connection Register contains the central register for SPIDs (Service Point Identifiers) and associated data, and delivers all services required to maintain and retrieve this data.
Market Meter Data (MMD) (for settlement & tariff purposes)
o Market Meter Data contains all meter and reading data relevant to the market and delivers
services for validation and storage of this data.
Grid Fee Billing (GFB)
o Grid Fee Billing provides all the services to allow a financial settlement process to derive billable usage (including MO estimation for missing reads), calculate the charges and then store the results at the most granular level (for example, for each variation of input into the charge calculation). Aggregation of this data to provide settlement reports is done within the reporting service of the MarketPortal.
Sector Test System (STS)
o The Sector Test System provides early testing facilities for Market Participants (MPs) to test and verify their systems against the central system. MOSL can define the test scenarios and test data to be used by each party, and each party can run these test scenarios under their own control. STS is not dependent on the implementation of the market processes, instead relying on simulation of the processes and transactions.
o STS makes use of MarketPortal functions such as Validation, Configuration and Participants. It receives messages from the B2B channel to manage STS, as well as from Market Messaging to support the internal CIM message model for the market messages.
o To be clear, STS is used to simulate the High Volume Interface (HVI) for testing; a sketch of driving STS over the B2B channel follows.
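Illustrative only: a minimal Python sketch of how a Market Participant might drive a simulated HVI transaction against STS over the SOAP/XML B2B channel. The endpoint URL, message wrapper and field names are assumptions made for this sketch; the real schemas and endpoints are defined in the interface specifications, not here.

    # Hypothetical STS call over the SOAP/XML B2B channel -- the endpoint,
    # message wrapper and fields below are invented for illustration.
    import requests

    STS_ENDPOINT = "https://sts.example.test/b2b"  # hypothetical URL

    soap_envelope = """<?xml version="1.0" encoding="UTF-8"?>
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
      <soapenv:Body>
        <MarketMessage>
          <TransactionType>ChangeOfSupplier</TransactionType>
          <SPID>3000000000001</SPID>
          <Retailer>RET-01</Retailer>
        </MarketMessage>
      </soapenv:Body>
    </soapenv:Envelope>"""

    response = requests.post(
        STS_ENDPOINT,
        data=soap_envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
        cert=("mp-client.crt", "mp-client.key"),  # per-MP client certificate
        timeout=30,
    )
    response.raise_for_status()
    print(response.text)  # simulated acknowledgement from STS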
Data Migration (DM)
o Data Migration is an application that enables migration of market data into the system while at the same time subjecting it to the same format and validation rules applied to messages received via the B2B Hub and the WebGUI.
System Monitor
o The System Monitor provides real-time services to monitor the solution’s health, accessible by both the application services provider and authorised MO staff. It looks at the hardware and software of all system components, providing on-screen alerts and/or emails when predefined thresholds are exceeded or conditions are not met. These services are near real time and hence minimise the time taken to detect occurring or upcoming problems. This component is delivered using the TIBCO HAWK software product – see http://www.tibco.co.uk/products/automation/monitoring-management/enterprise-monitoring/hawk for more details.
Figure 2 below provides the detailed functional view.
Figure 2 – Detailed functional view – SOA-based loosely coupled capabilities
The MOSL CMOS Project aims to deliver CMOS over the next 18 months in order to be market ready on 1st April 2017. CMOS will be in place to support MOSL as MO, and its members, in the newly opened market, with full integration to Market Participant systems, processes and procedures. The high level components are described below.
The Market Portal high level component is the core of the CMS product. It contains all the common functions used in every implementation. At this level it breaks down into the following sub-components:
WebGUI Framework – this is a web based UI to provide access into CMS for MP and MOSL users. It provides a low cost means of market messaging, delivery of reporting services and access into system management functions (such as granting access to users, uploading tariffs and maintenance of standing data)
B2B Hub - a web service (SOAP and XML) based delivery channel for data exchange with Retailer, Wholesaler and Other Authorised Party systems
FTP – a bulk data delivery channel used by the Data Migration component when loading in MPs’ data as part of the data migration activities. In order to secure this channel a server certificate is used, such that a secure FTP (FTPS protocol, as utilised on Windows) channel is created. User accounts are set up manually (an application support team activity) in Active Directory per MP. Each MP has a folder with a set of sub-folders for inbound/outbound data etc.; this set-up is usually scripted for the different (test) environments. Access to other MPs’ folders is not permitted (an illustrative provisioning sketch follows).
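As an illustration of the scripted set-up mentioned above, the following Python sketch provisions the per-MP folder structure. The root path, sub-folder names and MP identifiers are assumptions; the Active Directory account creation and access control steps are out of scope here.

    # Illustrative provisioning of the per-Market-Participant FTPS folders.
    from pathlib import Path

    FTP_ROOT = Path("/srv/ftps")                     # hypothetical FTPS root
    SUBFOLDERS = ["inbound", "outbound", "archive"]  # assumed sub-folder set

    def provision_mp_folders(mp_id: str) -> None:
        """Create the isolated folder set for one Market Participant."""
        for name in SUBFOLDERS:
            (FTP_ROOT / mp_id / name).mkdir(parents=True, exist_ok=True)
        # Per-MP accounts and "no access to other MP folders" would be
        # enforced via Active Directory / ACLs, outside this sketch.

    for mp in ["MP-RET-01", "MP-WHS-01"]:            # example MP identifiers
        provision_mp_folders(mp)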
Mediation – this layer is responsible for authorisation of web services. The Mediation layer handles all web service requests from the presentation layer that integrate with any of the underlying parts of the system. All requests (web services) from the presentation layer get policies applied by the Mediation layer. The policies (configuration) that can be applied are:
Authorisation: Verify that the (system) user is allowed to request the service
Audit logging: Both request and response messages can be sent to the Audit Log
Routing: Based on the application channel, the message is routed to the corresponding service implementation
The Mediation layer abstracts the service implementation (business logic layer) from the provided service interface for the consumers (presentation layer).
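The following Python sketch illustrates the policy-chain idea described above (authorisation, audit logging, routing). The authorisation matrix, service names and wiring are invented for illustration; this is not CMOS code.

    # Toy mediation layer: apply policies, then route to an implementation.
    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class Request:
        user: str
        service: str      # requested service operation
        channel: str      # e.g. "WebGUI" or "B2B"
        payload: dict = field(default_factory=dict)

    AUTH_MATRIX = {"alice": {"ChangeOfSupplier"}}  # invented matrix

    def authorise(req: Request) -> None:
        # Policy 1: verify the (system) user may request the service.
        if req.service not in AUTH_MATRIX.get(req.user, set()):
            raise PermissionError(f"{req.user} may not call {req.service}")

    def audit(direction: str, req: Request) -> None:
        # Policy 2: request and response messages can go to the Audit Log.
        print(f"AUDIT {direction}: {req.user} -> {req.service} ({req.channel})")

    IMPLEMENTATIONS: dict[str, Callable[[Request], dict]] = {
        "ChangeOfSupplier": lambda r: {"status": "accepted"},
    }

    def mediate(req: Request) -> dict:
        authorise(req)
        audit("request", req)
        result = IMPLEMENTATIONS[req.service](req)  # Policy 3: routing
        audit("response", req)
        return result

    print(mediate(Request("alice", "ChangeOfSupplier", "WebGUI")))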
Security Framework – the component managing access and authorisation into the platform, providing a service to Mediation for authorising users at the service operation level, and to the front ends (both Web GUI and B2B Hub) for authentication of users and systems. It also implements Single Sign On between the Web GUI and the Reporting Framework. It is based on Microsoft Active Directory for authentication of users, passwords and client-side certificates
Audit Log – this tracks all changes to data within the CMS system. The audited information consists of the exchanged data relating to market processes, and is used to report on the completeness of processing the exchanged information. The audit trail forms the basis to guarantee the processing completeness of the CMS. All exchanged messages leading to a change of Market Information, independent of the communication channel used (GUI, XML), are stored in the audit log. The audit log can be viewed from the maintenance module; MOSL can view all messages exchanged, while Market Participants can only view their own messages. The audit log, storing the incoming and outgoing messages as the first and last step in a market process, is the basis for performance reporting
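A minimal sketch of the completeness idea above, assuming a simplified log structure: every market process recorded in the audit log should have both an incoming (first step) and an outgoing (last step) message.

    # Invented log structure; a process with no outgoing entry is incomplete.
    audit_log = [
        {"process": "BTD-001", "direction": "in"},
        {"process": "BTD-001", "direction": "out"},
        {"process": "BTD-002", "direction": "in"},   # no outgoing message yet
    ]

    def incomplete_processes(log: list[dict]) -> set[str]:
        inbound = {e["process"] for e in log if e["direction"] == "in"}
        outbound = {e["process"] for e in log if e["direction"] == "out"}
        return inbound - outbound

    print(incomplete_processes(audit_log))  # -> {'BTD-002'}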
Participant Repository – this manages all of the MPs, market roles, users and user roles. Registered MPs can be updated and new MPs can be added. Each registered MP will be able to register (via a system administrator) its own employees as users of the CMS on behalf of their company. A user bound to an MP is restricted in its access depending on the market role of the MP. For example, a user from an MP with the market role of Wholesaler cannot initiate the execution of a ‘Change of Supplier’. The available access to system functionality, and its relationship to each market role, is defined in the authorisation matrix of the CMS. The authorisation matrix is a set of market roles and the system functionality available per market role, and is maintained by the MOSL system administrator
Configuration Repository - this holds the configurable market information and the configurable system information such as error messages and help texts. The configuration information is independent of the underlying applications; the information is shared between CCR, MMD, GFB and DM. The information is viewed and maintained through the MarketPortal web GUI
System Log – this service is used by all other services in the system to log any technical or functional errors, warnings or information messages. It is used to ensure that all system incidents are logged and can be traced and resolved
Notification Management – this service ensures messages and emails are sent or made available based on the state of the Business Transaction Dossier (BTD) steps. Notifications may require a delay in sending, and the Notification Management service will pick up these messages as and when required
Market Messaging – this provides the central entry point for both B2B and GUI market messages and provides the queuing mechanism for Market Participants to retrieve market messages
Validation Services – this component provides validation functionality based on validation rules. The Validation Services component works at three different levels, illustrated in the sketch after this list:
Message Validation - this checks the semantics of an incoming message. This message can either be a B2B market message or a message which resulted from a GUI action. The input for a message validation is the message itself and the corresponding Document Type. Message validation is only executed for incoming messages
Process Validation - this applies business process specific rules. Inputs for this service are the current Metering Point State, the (incoming) message and the process type
State Validation - this checks the full (new) state of a Metering Point before it is stored in the database by the Metering Point Service. Inputs for this service are one or more Metering Point states
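The Python sketch below illustrates the three levels with invented placeholder rules; the real rules derive from the CSD Codes and the design documents.

    # Level 1 -- message validation: semantics of one incoming message.
    def validate_message(message: dict, document_type: str) -> list[str]:
        errors = []
        if document_type == "MeterReading" and "reading_value" not in message:
            errors.append("MeterReading must carry a reading_value")
        return errors

    # Level 2 -- process validation: business-process specific rules.
    def validate_process(mp_state: str, message: dict, process_type: str) -> list[str]:
        errors = []
        if process_type == "ChangeOfSupplier" and mp_state == "Disconnected":
            errors.append("Cannot switch supplier on a disconnected Metering Point")
        return errors

    # Level 3 -- state validation: full new Metering Point state before storage.
    def validate_state(new_states: list[dict]) -> list[str]:
        errors = []
        for state in new_states:
            if state.get("retailer") is None:
                errors.append(f"SPID {state.get('spid')} has no registered retailer")
        return errors

    print(validate_message({"spid": "3000000000001"}, "MeterReading"))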
Transaction Contention – this module identifies and then manages the contention between two different market processes taking place on the same Metering Point
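A small illustration of the contention idea, under the assumption that a second process on an in-flight Metering Point is rejected (a real implementation may queue it instead):

    # Only one market process may run against a Metering Point at a time.
    class ContentionError(Exception):
        pass

    class TransactionContention:
        def __init__(self) -> None:
            self._in_flight: dict[str, str] = {}  # SPID -> active process

        def start(self, spid: str, process_type: str) -> None:
            if spid in self._in_flight:
                raise ContentionError(
                    f"{self._in_flight[spid]} already in flight on {spid}; "
                    f"cannot start {process_type}")
            self._in_flight[spid] = process_type

        def complete(self, spid: str) -> None:
            self._in_flight.pop(spid, None)

    tc = TransactionContention()
    tc.start("3000000000001", "ChangeOfSupplier")
    try:
        tc.start("3000000000001", "Disconnection")  # contends with the switch
    except ContentionError as err:
        print(err)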
Business Transaction Dossier – this is the component that creates and monitors the progress of invocations of the market processes. The Business Transaction Dossier is always created for mutations on the Metering Point, and in some cases for consulting actions on the Metering Point. The Business Transaction Dossiers together are used as a basis for reporting to MOSL. The reporting is used for tracking the number of transactions, the number of erroneous transactions, and various differentiations by transaction type and market party
The Reporting Framework is a web based reporting module delivered via the TIBCO JasperSoft reporting suite. Its UI is integrated into the WebGUI Framework and utilises SSO. The web based reporting allows running and downloading a set of pre-defined reports and is capable of creating ad-hoc reports based on the available data sets. Access is restricted so a user can only see the data belonging to their MP. JasperServer allows a maximum query execution time to be configured to prevent users from having to wait too long on queries demanding too many resources. The JasperSoft web based reporting has its own auditing capabilities. For more details regarding the capabilities of JasperSoft please see http://www.JasperSoft.com.
The Central Connection Register, Market Meter Data, Grid Fee Billing, Sector Test System and Data Migration high level components all have Business Services and Entity Services:
Business Services – these are services to execute business processes; Business Services may call other Business Services or Entity Services
Entity Services – these are CRUD services used by Business Services to read data from, or write data to, the database.
In addition, some of these high level components use other generic sub-components, such as:
Calculation Services – services written to perform specific calculation routines such as charge calculation or usage derivation
Data Interfaces – these interfaces are used for high volume integration with the services’ data. Standard entity services do not provide sufficient performance when large volumes of data need to be accessed, and their transactional integrity may not support such interfaces. By implementing these specific high volume data interfaces on the basis of double Oracle views, data can be accessed while maintaining integrity and performance. These interfaces are used for the Settlement component
Test Services – STS implements a number of specific test facilities that have been encapsulated in these test services. These include services for preparing test data, preparing test runs, managing and executing test runs and managing the test result data
Data Migration Services – the Data Migration function provides facilities for migrating data into the platform. The data migration services implement the actual migration operations.
Data flow diagrams can be found in the Schedule 13 Method Statement Part 1.
3.2 High Level Plan
The high level plan to deliver all of the test phases and therefore prove the market architecture is
as follows:
Please note that, at this point of publication of the MTS V2.2_Final, some of the dates are subject to change. Any updates will be provided as soon as they are understood. The dates relating to each phase are detailed in section 5, Test Phases.
3.5 Current Market Participation Landscape
MOSL’s footprint covers the Market Participants listed in the following table, who will be expected to interface with the CMOS application. These companies are not all fully confirmed as members at the time of writing the current version of this document; however, the list gives this MTS context and an indication of the possible size of the English non-household water market.
Companies are categorised as:
- Water & Sewerage Companies (WASC)
- Water Only Companies (WOC)
- Other Appointed Businesses
- Unassociated New Entrants
- Other Businesses

Type | Company | Market Participant
---- | ------- | ------------------
WASC | Anglian Water |
WASC | Northumbrian Water |
WASC | Severn Trent Water |
WASC | Southern Water |
WASC | South West Water |
WASC | Thames Water |
WASC | United Utilities |
WASC | Wessex Water |
WASC | Yorkshire Water |
WOC | Affinity Water |
WOC | Bournemouth Water |
WOC | Bristol Water |
WOC | Portsmouth Water |
WOC | South East Water |
WOC | South Staffordshire Water |
WOC | Sutton & East Surrey Water |
Other Appointed Business | Albion Water |
Other Appointed Business | Peel Water Networks |
Other Appointed Business | Veolia Water Projects |
Unassociated New Entrant | Business Stream |
Unassociated New Entrant | Castle Water |
Unassociated New Entrant | Cobalt Water |
Unassociated New Entrant | Clear Business Water |
Unassociated New Entrant | Blue Business Water |
Unassociated New Entrant | Ondeo |
Other Business | Welsh Water |
4 Test Approach
Testing will be executed by the solution provider (CGI), MOSL (supported by their selected Test Partner), and Market Participants to mitigate risks to the market launch; for example, the risk that business and legislative requirements of this programme of work may not be correctly delivered.
The testing approach used on this programme of work is aligned to the MOSL Test Policy v1.0 and will be delivered in a structured and phased way, as shown in the following diagram.
[Diagram: structured and phased test approach. Three delivery streams – Applications and Interfaces, Non-Functional / Technical, and Data Migration (Migrated Data) – run across the Solution Delivery and Market Operation stages. Test phases shown: Static Test, Unit Test, Systems & Integration Test, Settlements Test, UAT, HVI B2B Test (STS1 & 2), Failover & Recovery, Performance Test, Security & Penetration Test, Market & Company Readiness (MCR), and MOSL Service Readiness Test.]
The following table provides a high level view of the risks mitigated by each test phase.

Risk | Test phase in which the risk will be mitigated
---- | ----------------------------------------------
The requirements (codes) may not be clear enough to design, develop and test against | Static Testing
Functionality may not be developed as designed | Unit Testing
Solution may not function as designed | Systems & Integration Testing
The financial outcome of Settlements may not be as expected | Settlements Testing
Solution may not meet MOSL / legislative requirements | User Acceptance Testing
MOSL may not be ready to operate the service | Service Readiness Testing
The solution architecture may not be resilient to failure | Failover & Recovery Testing
The solution may not be able to support the required number of users or transactions | Performance Testing
The solution architecture may not be secure | Security & Penetration Testing
The Market Participants may not be able to use the solution | HVI B2B Test (STS 1 & 2); Market & Company Readiness
This test approach represents best practice as it:
Has been developed and executed across many projects
Complies with ISEB, ISO and TickIT best practice
Is delivered in test stages with common delivery phases; using entry and exit criteria at each stage enables quality gates to assure the delivery of the software under test.
It also enables the implementation of best practice relationships with solution providers and Market
Participants based on a clear communication strategy and a commitment to collaborative working.
The testing approach is based on the following logical delivery stages for each test phase:
Strategy: Document the overall test approach that all test phases must align to.
Plan: Determine the scope, approach and requirements for each test phase
Prepare: Design and build all the required test resources necessary to mitigate the
specific risks linked to the test phase
Execute: Carry out the testing activities against the plan
Evaluate: Assess actual against expected results to determine whether tests passed or failed.
Complete: Archive all test assets for future reuse.
All the activities involved in the test programme are defined by these stages, and the approach supports the following principles that will be promoted during this programme of work:

1. Test Strategically: Develop an overarching Master Test Strategy, and ensure that all testing in the programme conforms to this strategy (this document).

2. Test to Mitigate Business Risk: Analyse the business risks associated with each component of the system under test, and use this analysis to prioritise the test effort, testing the highest risk areas first. Please note that CGI are using a Requirements Based Testing approach, where all requirements are covered by functional test cases and executed. MOSL advocates a risk based approach to UAT; the volume of business processes and test types that may be required to provide full coverage will be risk assessed to ensure those with the highest risk assessment are delivered fully (detailed in section 6).

3. Test Early and Continuously: Start testing as soon as possible, test throughout the development life cycle, and continue testing after deployment.

4. Test Visibly: Determine the success criteria, metrics and result data to be provided by testing at the start of the process; deliver these results constantly and review frequently.

5. Automate for Efficiency: Improve the efficiency of the test effort by automating appropriate tests.

6. Test Independently: The MOSL Test Lead is accountable for quality assurance across the CMOS delivery programme. All Test Teams executing each of the test phases will act as one team, providing visibility of all test assets and transparency of test results and defects found.
The benefits of implementing this approach are:
Industry recognised structure and sequence.
Documented tests (COMMITMENT).
o Having shared documented test scripts means that there is cross-team knowledge
of what will be done, and a commitment to support it.
Tests assigned to and owned by people (RESPONSIBILITY).
o Having tests assigned to individuals means that everyone will know who is responsible for executing a test, therefore avoiding duplication of effort.
Agreed measurable success criteria at every stage that need to be proved (TARGETS).
o Having targets means that individuals will have a mechanism to prove that they
have met their responsibilities. Where there are no measurable success criteria,
there is no achievable test.
Communication between teams (TRUST).
o If there is no trust between teams there will always be duplicated effort, which
results in low morale, and longer timescales to deliver the product to market.
Documented evidence that tests have been executed (CONFIDENCE).
o Providing the evidence to support the measurable success criteria, means that the
depth and quality of the tests executed can be confidently accepted by everyone
associated with the testing.
Independent authority (QUALITY ASSURANCE)
o Having an independent authority made up of representatives of each group,
means that standards are set and enforced across teams, departments and
projects.
5 Test Phases
5.1 To verify the Solution Delivery
The CMOS solution will be delivered by the Solution Provider in two phases.
Phase 1 allows MOSL to operate the market and contains all Market Participant related functionality.
Phase 2 allows MOSL to administer the market and contains all MOSL related administrative and reporting functionality.
5.1.1 Static Testing (Complete)
Verifies that the requirements are fit for purpose by validating them against specific acceptance criteria.
Accountable – Solution Providers
Responsible – Development Stream Lead for CGI & Bridgeall
Entry Criteria
Business requirements have been documented and approved by MOSL stakeholders.
An agreed set of requirement tests have been defined and agreed with the MOSL Test Lead.
Functional Designs are in a reviewable state (determined by the design team). (FDs and NFDs are complete)
Controls
This is the responsibility of the solution provider.
Exit Criteria:
Entry criteria fully satisfied.
The required level of testing has been completed to verify each requirement, unit and unit integration has been developed as required.
FDs are approved by all reviewing parties mentioned in the design review plan.
Written acceptance will be issued by MOSL provided the review results are within the ranges determined by MOSL in Schedule 3 (MOSL will review the Requirements Traceability Matrix (RTM), phases 1 & 2).
Key Test Principles
The key test principles of static testing are:
Every requirement must be tested.
The requirements should be tested for:
o Correctness
o Ambiguity
o Completeness
o Consistency
o Verifiability
o Modifiability
o Traceability
5.1.2 Unit Testing (supports development Phase 1 12th Oct ‘15 – 24th Mar ‘16 & Phase 2 8th Jan’16 – 17th May ‘16)
This tests the individual units (program or module) of the application. This test phase ensures that
the unit behaves as per the program specification or design.
Accountable – Solution Providers, internal or external
Responsible – Development Stream Lead, CGI & Bridgeall
Entry Criteria
Business requirements have been documented and approved by MOSL stakeholders.
Relevant design documents are complete and approved before the start of the actual build.
Test approach and plans, detailed unit test scripts are complete before the start of the test and approved by the solution provider development lead and assured by the MOSL Test Lead.
A stable development environment for each application. Depending on the development stream, the environment may be within the solution providers’ architecture.
A stable controlled unit test environment for each application.
Defect management procedures have been created and agreed between MOSL and CGI.
Controls
This is the responsibility of the solution provider.
Exit Criteria:
Entry criteria fully satisfied.
The required level of testing has been completed to verify each requirement, unit and unit integration has been developed as required.
The unit End of Test report has been provided to MOSL.
There are no “Critical” defects outstanding.
There are no more than [2] “Major” defects outstanding.
There are no more than [5] “Minor” defects outstanding.
Key Test Principles
The key test principles of Unit testing are:
Every component & line of code/statement should be exercised in test execution.
The unit test scripts should include an appropriate set of negative tests (to attempt to break the application) and validate warning / error messages (a minimal sketch follows this list).
Any interfaces should be unit integration tested to target test systems to ensure they are readable and can be processed correctly.
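A minimal illustration of the negative-test principle above, using an invented toy validator rather than CMOS code:

    import unittest

    def validate_reading(value: float) -> float:
        """Toy meter-reading validator: readings must be non-negative."""
        if value < 0:
            raise ValueError("Meter reading cannot be negative")
        return value

    class TestValidateReading(unittest.TestCase):
        def test_positive_path(self):
            self.assertEqual(validate_reading(123.4), 123.4)

        def test_negative_path_rejects_bad_input(self):
            # Negative test: deliberately attempt to break the unit and
            # check that the error is raised as designed.
            with self.assertRaises(ValueError):
                validate_reading(-1.0)

    if __name__ == "__main__":
        unittest.main()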
5.1.3 Settlements Testing (tested in Unit Phase 1, SIT Phase 1, UAT Phase 1 4th Dec ’15 – 5th Aug ’16)
This phase of testing mitigates the risk that the calculated financial outcome is not as expected. It will be validated at several levels at various stages in the project (a toy example of the financial check follows the numbered list):
1. Bridgeall will validate under Unit Test that Tariffs & Settlements function as expected and
generate the expected financial outcomes as defined in the system designs
2. CGI will validate during SIT, that Tariffs & Settlements function as expected and generate
the expected financial outcomes as defined both at a functional level and in specific market
scenarios
3. MOSL will validate during UAT and the Service Readiness Test phases that Tariffs &
Settlements function as expected and generate the expected financial outcomes as
defined both at a functional level and in specific market scenarios (supported by CGI)
4. Market Participants will execute financial equivalence testing as part of their internal
delivery programmes to assure their financial revenues
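A toy example of the kind of financial check performed at each of these levels: derive a charge and compare it against the expected outcome defined up front in the settlement scenario. The tariff structure and figures below are invented for illustration.

    def charge(usage_m3: float, standing_charge: float, volumetric_rate: float) -> float:
        """Simple two-part charge: standing charge + rate * usage."""
        return round(standing_charge + volumetric_rate * usage_m3, 2)

    expected = 131.50  # outcome defined in the settlement scenario
    actual = charge(usage_m3=450.0, standing_charge=41.50, volumetric_rate=0.20)

    assert actual == expected, f"Settlement mismatch: {actual} != {expected}"
    print(f"Charge {actual} matches the expected outcome")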
Accountable – Solution Providers, internal or external for Bullet 1 & 2
MOSL for Bullet 3
Market Participants for Bullet 4
Responsible – Development Stream Lead, Bridgeall, MOSL and Market Participants
Entry Criteria
Business requirements have been documented and approved by MOSL stakeholders (Bullets 1 & 2)
Relevant design documents are complete and approved before the start of the actual build (Bullets 1 & 2)
MOSL Market Services have supported the MOSL testing service to define a set of settlement scenarios with financial expectations; these will be executed as part of UAT, and by Market Participants as part of Market Entry Assurance activities (Bullets 3 & 4)
Test approach and plans, detailed unit test scripts are complete before the start of the test and approved by the solution provider development lead and assured by the MOSL Test Lead (Bullets 1 & 2)
A stable development environment for each application. Depending on the development stream, the environment may be within the solution providers’ architecture (Bullet 1)
Stable controlled test environment for each application (Bullets 2,3 & 4)
Defect management procedures have been created and agreed between MOSL and CGI (Bullets 1,2,3 & 4)
Controls
For Bullets 1 & 2 this is the responsibility of the solution provider.
For Bullet 3 this is the responsibility of MOSL
For Bullet 4 this is the responsibility of the respective Market Participants
Exit Criteria:
Entry criteria fully satisfied (Bullets 1,2,3)
The required level of testing has been completed to verify each requirement, unit and unit integration has been developed as required (Bullet 1 & 2)
The unit End of Test report has been provided to MOSL (Bullet 1)
There are no “Critical” defects outstanding (Bullets 1,2,3)
There are no more than [2] “Major” defects outstanding (Bullets 1,2,3)
There are no more than [5] “Minor” defects outstanding (Bullets 1,2,3)
Key Test Principles
The key test principles of settlement are covered by the principles governing the phases under
which testing is being executed:
Bullet 1 Unit Test Principles
Bullet 2 SIT Principles
Bullet 3 UAT & System Ready Test Principles
Bullet 4 The test principles employed by each respective Market Participant as part of their Test programmes
5.1.4 System & Integration Testing (SIT) (Phase 1 Iterations ‘1-3’ 1 Oct ‘15 – 6th April ‘16 & Phase 2 Iteration ‘4’ 29th Mar ‘16 – 13th June ‘16) dates include test case analysis & preparation activity
The purpose of the System & Integration testing is to ensure that the deliverables from the solution
provider behave as per the required specification / design in isolation and when interfacing with
other systems in the solution.
This level of testing must validate that each delivered solution, including interfaces, conforms to the
following:
Functional and usability requirements.
Performance requirements, for example volume, response and batch completion times (this refers to functional performance tests and should not be confused with Non-functional performance tests described in section 5.1.7).
Integrity & Security requirements, for example user roles, access rights, data validation and error handling (this refers to functional security tests and should not be confused with Non-functional security & penetration tests described in section 5.1.8).
The solution providers are expected to have performed adequate testing before releasing their
solutions to MOSL, and it is in MOSL’s best interests to verify this before accepting the solution.
SIT will be witnessed by MOSL, to determine the test approach, coverage and depth of the testing
carried out by the solution provider and to assure quality at the end of each SIT Phase.
Please note that as part of SIT, Bridgeall will execute a ‘factory’ test to ensure the ‘Settlements’ and ‘Tariffs’ modules have been fully tested and integrated into the CMOS solution. Bridgeall will execute their testing using an automated approach, employing their automation framework tool called SmartTest. Bridgeall currently hold approx. 1,000 automated test cases used in the implementation of the Scottish water market; they will review these test cases for reusability, and amend or add new test cases to support the implementation of the English water market supported by CMOS. As part of this process MOSL will independently assure this phase.
Accountable – MOSL Test Lead
Responsibility – Solution provider’s test lead and team
Entry Criteria
Requirements approved by MOSL and uploaded to the HP ALM test tool.
Functional and technical design documents, including developed code, are complete and approved by MOSL stakeholders.
<Phase> Test Plan has been created by the CGI Test Manager and assured by the MOSL Test Lead.
Detailed test scripts are cross referenced to the requirements in HP ALM to ensure availability of requirement and evidence of test coverage.
Test environment acceptance criteria have been met (detailed in the Environment & Release Strategy)
Requirements and test assets are stored in HP ALM and available for independent assessment if required.
All requirements are covered by sufficient tests to verify the requirements have been delivered.
Defect tracking system in place to support test execution.
Test Cases have been created by the solution provider and assured by MOSL
There are no “Critical” defects outstanding from component/unit test cycles.
There are no more than 3 “Major” defects outstanding.
There are no more than 20 “Minor” defects outstanding and workarounds acceptable to MOSL exist in respect of each of these.
At least two (2) Wholesalers and two (2) Retailers have interfaces in place. This entry criterion only applies to SIT Phase 1 Cycle 3 (the final SIT Phase 1 cycle).
Controls
The following controls will be applied at different points in this test phase for the solution providers:
A test approach and plan have been prepared, reviewed and approved before the start of SIT by the solution provider and the MOSL Test Lead.
Detailed test scripts prepared, cross referenced to requirements, reviewed and approved before the start of the test by the solution provider’s Test Lead and the MOSL Test Lead.
Weekly/Daily reports showing:
o Current status of the test coverage against requirements.
o Current status of the test execution: total tests to be executed, total tests executed, total tests passed.
o Defect metrics showing current open and closed totals, which should indicate a declining rate of defect discovery, with open totals approaching zero over time (a small illustration of these report numbers follows this list).
Evidence to support quality gate reviews to be carried out at the end of this test phase.
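A small illustration of how the report numbers above might be computed; the counts are invented example data.

    executed = {"passed": 180, "failed": 12}
    planned_total = 250

    print(f"Tests to execute: {planned_total}")
    print(f"Tests executed:   {sum(executed.values())}")
    print(f"Tests passed:     {executed['passed']}")

    # Open-defect counts per reporting period should trend towards zero.
    open_defects = [42, 35, 28, 20, 13, 7]
    declining = all(b <= a for a, b in zip(open_defects, open_defects[1:]))
    print(f"Open defects declining: {declining}")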
Exit Criteria
Entry criteria fully satisfied.
All requirements are mapped to solution provider test scripts, or exclusions have been agreed.
All tests required to verify that a requirement has been delivered have been executed and evidence of actual test results is captured on HP ALM for review.
Reports providing a clear indication of test execution completion and coverage.
Physical evidence of passed test conditions is available for review on demand.
All required defects closed based on an agreement with the MOSL Test Lead and solution provider Test Lead.
Mitigation action agreed for all outstanding defects.
SIT Test closure Report agreed and assured by the MOSL Test Lead and solution provider Test Lead.
There are no “Critical” defects outstanding.
There are no more than 5 “Major” defects outstanding, where resolution plans are in place and the outstanding defects will be resolved during UAT.
There are no more than 20 “Minor” defects outstanding and workarounds acceptable to MOSL exist in respect of each of these.
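The defect thresholds in the entry and exit criteria above amount to a simple quality gate. The following is a minimal sketch of such a check, assuming open defects can be exported from HP ALM as a list of severity labels; the threshold values mirror this strategy, while the input format and function names are illustrative assumptions.

```python
# Illustrative sketch only: evaluates the SIT exit-criteria defect thresholds
# described above. Severity names and the input format are assumptions.

from collections import Counter

# Maximum open defects permitted at SIT exit, per this strategy.
EXIT_THRESHOLDS = {"Critical": 0, "Major": 5, "Minor": 20}

def sit_exit_gate(open_defects):
    """Return (passed, reasons) for a list of open defect severities."""
    counts = Counter(open_defects)
    reasons = []
    for severity, limit in EXIT_THRESHOLDS.items():
        if counts.get(severity, 0) > limit:
            reasons.append(
                f"{counts[severity]} open {severity} defects exceeds limit of {limit}"
            )
    return (not reasons, reasons)

if __name__ == "__main__":
    # Example: one Critical defect is still open, so the gate fails.
    passed, reasons = sit_exit_gate(["Critical", "Minor", "Minor"])
    print("Gate passed" if passed else "Gate failed: " + "; ".join(reasons))
```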
Key Test Principles
The key test principles of SIT are:
All requirements are exercised by sufficient test scripts, verifying both positive and negative paths.
The test scope should exercise Functional requirements.
Tests are executed in a configuration and release controlled environment.
Each release of the application is regression tested.
The scope of SIT should include testing of batch processes where in scope, and time taken to complete them.
The test and defect metrics will be used to monitor the progress and assess product quality.
Test approach, plan, scripts and execution will be subjected to quality governance.
CGI request that Market Participants support their System & Integration test phase; however, this
request is not aimed at all Market Participants. CGI will select 2 retailers and 2
wholesalers to support this testing phase from those Market Participants that volunteer.
5.1.5 User Acceptance Testing (UAT) (Phase 1 2nd Feb – 10th June ‘16 Phase 2 10th June – 5th Aug ‘16) dates include test case analysis & preparation activity
User Acceptance Testing is focussed on executing the business processes as defined in the CSD
Codes and other identified MOSL documentation provided by MOSL Market Services to ascertain if
the solution is suitable for live operation by MOSL users. The User Acceptance Test (UAT) is
executed by the MOSL Test Team who will develop test scripts from the requirements stated in
CSD Codes and those requirements documented outside of the codes which include CMOS admin
and reporting functions. The UAT scripts are mapped to the requirements in HP ALM. This is to
ensure that any requirements not covered by either SIT or UAT are identified as early as possible
so that remedial activity can be initiated.
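Because both SIT and UAT scripts are mapped to requirements in HP ALM, coverage gaps can be identified with a simple set difference. The sketch below is illustrative only, assuming the requirement-to-script mappings have been exported from HP ALM; the identifiers are dummy values.

```python
# Illustrative sketch only: identifies requirements not covered by any SIT or
# UAT script, mirroring the HP ALM mapping described above.

def uncovered_requirements(requirements, sit_coverage, uat_coverage):
    """Return requirement IDs with no mapped SIT or UAT test script."""
    covered = set(sit_coverage) | set(uat_coverage)
    return sorted(set(requirements) - covered)

if __name__ == "__main__":
    requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}
    sit_scripts = {"REQ-001": ["SIT-01"], "REQ-002": ["SIT-02"]}  # req -> scripts
    uat_scripts = {"REQ-002": ["UAT-01"], "REQ-003": ["UAT-02"]}
    gaps = uncovered_requirements(requirements, sit_scripts, uat_scripts)
    print("Requirements needing remedial coverage:", gaps)  # ['REQ-004']
```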
Accountable – MOSL Test Lead
Responsibility – MOSL UAT Test Manager
Entry Criteria:
Requirement coverage in HP ALM, CSD Codes and functional design documents from the design phase, from which the test plan and scripts are prepared, are complete and approved before the UAT test planning activity.
Test plans, detailed scripts and test data are complete and approved before the start of the UAT execution
UAT test scenarios are defined, reviewed, prioritised and agreed before the start of the UAT test execution phase
A stable test environment is available for testing
All previous test phases have been successfully completed and there are no unacceptable outstanding issues
SIT, for the first release of the System, has been successfully completed.
The UAT environment is configured and made available, including a sanity check and smoke test by CGI.
The selected MOSL Test Partner and selected Market Participant volunteers have been trained by CGI and can login to the UAT environment.
There are no more than 5 “Major” defects outstanding, each with a resolution plan in place confirming it will be resolved during UAT.
There are no more than 20 “Minor” defects outstanding and workarounds acceptable to MOSL exist in respect of each of these.
Controls
The following controls will be applied at different points in this test phase.
User Acceptance Test Scripts need to be reviewed and accepted by MOSL Director of Market Services before execution.
A test approach and plan have been prepared, reviewed and approved.
Detailed test scripts prepared, reviewed and assured.
Completeness of the test coverage
Completeness of the test execution
Defect metrics showing a declining rate of defect discovery, approaching zero
The defect resolution rate is monitored to ensure the test plan timescales are met
All defects closed at the phase exit point
Successful review of the test execution results
Exit Criteria:
• Entry criteria fully satisfied.
• The required level of testing has been completed to verify each requirement:
High priority – all tests executed.
Medium priority – all high and medium priority tests executed.
Low priority – all high priority tests executed.
• Reports providing a clear indication of test execution completion and coverage.
• All required defects closed based on an agreement with the MOSL Test Lead and solution provider.
• Mitigation action agreed for all outstanding defects.
• Test closure agreed and assured by the MOSL Test Lead and solution provider.
• There are no “Critical” defects outstanding.
• There are no “Major” defects outstanding.
• There are no more than [5] “Minor” defects outstanding and workarounds acceptable to MOSL are in place.
Key Test Principles
The key test principles of UAT are:
• The user acceptance test verifies the functionality of all components of the solution to support the business operation.
• The user acceptance test simulates a series of actual business cases performed by end-users. It includes real business operations to ensure they are run without fault.
• It will be performed by the MOSL Test Team with support from the project and Market Services team.
• The Director of Market Services will be required to sign off UAT.
• User acceptance test closure must be approved by the appropriate stakeholders.
• MOSL request that Market Participants support their User Acceptance test phase; however, this request is not aimed at all Market Participants. MOSL will select 2 retailers and 2 wholesalers to support this testing phase from those Market Participants that volunteer.
5.1.6 Failover & Recovery Testing (FRT) (Dates to be reconfirmed)
This testing requires the forced failure of the System in a variety of ways to verify that recovery can be properly performed. It will monitor how well an activity is able to recover from crashes, hardware failure and other similar problems, and verify that any unexpected errors are detected and that application processing continues as expected. It will also verify that, should the primary hosting site become unusable, the entire system can be brought up in the disaster recovery site without data loss, as detailed in CSD0007 & Schedule 10.
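One measurable aspect of recovery is how long the system takes to become available again after a forced failure. The sketch below is an illustrative example, assuming a simple HTTP health endpoint can be polled; the URL and time limits are placeholders rather than the real CMOS endpoints or the recovery objectives in Schedule 10.

```python
# Illustrative sketch only: polls a health endpoint after a forced failover and
# reports how long the system took to come back, for comparison against the
# agreed recovery objectives. All values here are placeholders.

import time
import urllib.error
import urllib.request

def measure_recovery(health_url, timeout_s=1800, poll_interval_s=15):
    """Poll health_url until it responds OK; return elapsed seconds or None."""
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        try:
            with urllib.request.urlopen(health_url, timeout=10) as resp:
                if resp.status == 200:
                    return time.monotonic() - start
        except (urllib.error.URLError, OSError):
            pass  # service still down; keep polling
        time.sleep(poll_interval_s)
    return None  # did not recover within the allowed window

if __name__ == "__main__":
    elapsed = measure_recovery("https://dr.example.invalid/health")  # placeholder URL
    if elapsed is None:
        print("System did not recover within the test window")
    else:
        print(f"System recovered in {elapsed:.0f} seconds")
```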
Accountable – MOSL Test Lead
Responsibility – Solution provider’s test lead and team
Entry Criteria:
Test plans, detailed scripts and test data are complete and approved before the start of the FRT execution
FRT test scenarios are defined, reviewed, prioritised and agreed before the start of the FRT test execution phase
The pre-production and disaster recovery environments are available for testing
All previous test phases have been successfully completed and there are no unacceptable outstanding issues.
Controls
The following controls will be applied at different points in this test phase.
A test approach and plan have been prepared, reviewed and approved.
Detailed test scripts prepared, reviewed and assured.
Completeness of the test coverage
Completeness of the test execution
Defect metrics showing a declining rate of defect discovery, approaching zero
The defect resolution rate is monitored to ensure the test plan timescales are met
All defects closed at the phase exit point
Successful review of the test execution results
Exit Criteria:
• Entry criteria fully satisfied.
• The required level of testing has been completed to verify each requirement.
• Reports providing a clear indication of test execution completion and coverage.
• All required defects closed based on an agreement with the MOSL Test Lead and solution provider.
• Mitigation action agreed for all outstanding defects.
• Test closure agreed and assured by the MOSL Test Lead and solution provider.
• The System fails over to the Disaster Recovery environment in accordance with the requirements of Schedule 10 (Business Continuity and Disaster Recovery).
Key Test Principles:
The key test principles of FRT are:
• The Failover & Recovery test verifies that CGI can provide a robust solution in terms of system and architecture, which provides confidence that MOSL can operate the business operation in the event of a system or architectural failure.
• The FRT simulates a series of actual failure cases created by the solution provider team. It verifies that the solution is resilient to failure and that adequate error monitoring is in place to detect and report on the error conditions generated.
• A full disaster recovery scenario should be run to ensure that the processes and procedures are in place and work to bring the CMOS system up in the required time should the primary site become unavailable.
• It will be performed by the Solution Provider Team.
• Failover & Recovery test closure must be approved by the appropriate stakeholders.
5.1.7 Performance Testing (Phase 1 19th May – 9th June ‘16 Phase 2 30th June – 10th Aug ‘16)
This level of testing verifies that the solution and system architecture can support the required
numbers of concurrent users and transaction load from the Low Volume Interface (LVI) and High
Volume Interfaces (HVI). It should confirm that:
The expected busiest hour of the busiest day of the year can be supported (Load & Volume Test)
Response times for client server interaction and interface file processing are acceptable (Load & Volume Test)
The solution can operate over 24 hours at a percentage (to be determined) of peak load volume without loss of performance or server resource utilisation becoming unacceptable (Soak Test)
The solution architecture has headroom to support the predicted growth in the market; the currently agreed growth that CGI will support is a percentage (to be confirmed) above and below 100% of expected volumes (Stress Test)
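As a simple illustration of the load and response-time checks described above, the sketch below issues concurrent requests against a placeholder endpoint and summarises the timings. It is a minimal example only; a real performance test would use a dedicated tool driven by the agreed peak-load profile, and the URL and thread counts here are assumptions.

```python
# Illustrative sketch only: issues concurrent GET requests and summarises
# response times, in the spirit of the load and soak checks above.

import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def timed_request(url):
    """Return the response time in seconds for one GET, or None on error."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=30):
            return time.monotonic() - start
    except OSError:
        return None

def run_load(url, concurrent_users=10, requests_per_user=5):
    """Fire concurrent_users * requests_per_user requests and summarise."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(timed_request, url)
                   for _ in range(concurrent_users * requests_per_user)]
        timings = [f.result() for f in futures]
    ok = [t for t in timings if t is not None]
    return {
        "sent": len(timings),
        "succeeded": len(ok),
        "mean_s": statistics.mean(ok) if ok else None,
        # 95th percentile cut point; needs a reasonable sample size.
        "p95_s": statistics.quantiles(ok, n=20)[-1] if len(ok) >= 20 else None,
    }

if __name__ == "__main__":
    print(run_load("https://lvi.example.invalid/login"))  # placeholder URL
```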
Accountable – MOSL Test Lead
Responsibility – Solution provider’s test lead and team
Entry Criteria:
Load profile of user numbers and the business transactions to be simulated, including interface files and data volumes, has been agreed for peak load, as defined in the detailed performance test plan
Response time acceptance criteria have been agreed for client server interactions and the time to process interface files, as defined in the detailed performance test plan
Server resource maximum values have been agreed (detailed in the approved schedule 3 Testing & Acceptance document).
Test plans, automated scripts, monitoring solution and test data are complete and assured before the start of the test execution
Performance test scenarios are defined, reviewed, prioritised and agreed before the start of the test execution phase
A production sized test environment is available for testing
An agreed approach to test monitoring, forensic results analysis and error handling has been defined in the detailed Performance Test Plan
All previous test phases have been successfully completed and there are no unacceptable outstanding issues.
The Performance Test Environment has been successfully built and deployed.
The performance test plan has been reviewed and approved by MOSL Test Lead.
Controls
The following controls will be applied at different points in this test phase.
A test approach, plan and all test assets have been prepared, reviewed and assured.
Completeness of the test coverage
Completeness of the test execution
Defect metrics showing a declining rate of defect discovery, approaching zero
The defect resolution rate is monitored to ensure the test plan timescales are met
All defects closed at the phase exit point
Successful review of the test execution results
Exit Criteria:
• Entry criteria fully satisfied.
• All required tests have been executed and acceptance criteria have been met.
• Reports providing a clear indication of test execution completion and coverage.
• All required defects closed based on an agreement with the MOSL Test Lead and solution provider.
• Mitigation action agreed for all outstanding defects.
• Test closure agreed and assured by MOSL Test Lead and solution provider.
• A performance test report with recommendations has been provided to MOSL.
• The Service Levels as described in the performance test plan have been met, or there is agreement with MOSL on a plan to resolve outstanding issues at a later time.
Key Test Principles:
The key test principles of Performance testing are:
• The Performance test verifies that CGI have provided a performant solution in terms of supporting concurrent user loads and transaction volumes with acceptable response times, which provides confidence that MOSL can operate the business operation at peak volumes.
• It will be performed by the Solution Provider Team.
• Performance test closure must be approved by the appropriate stakeholders.
5.1.8 Security & Penetration Testing (26th April ’16 – 29th Jun ’16)
Security & Penetration testing verifies that the solution and system architecture is secure and will
prevent misuse by external agencies or people.
The objective of Security testing is to verify that the security model in place meets the requirements
as stated in the approved Security schedules and detailed security plan.
Penetration tests should verify that the solution
Is secure from particular sets of attack vectors
Has no vulnerabilities that could be exploited in any way
Has the ability to detect and respond to attacks
Accountable – MOSL Test Lead
Responsibility – Solution provider’s test lead and a CGI-selected independent Security &
Penetration Testing company approved by MOSL
Entry Criteria:
All previous Test stages have been successfully completed.
The security & penetration test plan has been reviewed and agreed by MOSL Test Lead.
The Production and Disaster Recovery environments are available and at the requisite build level.
Controls
The following controls will be applied at different points in this test phase.
A test approach and plan have been prepared, reviewed and approved.
All test assets have been prepared, reviewed and assured.
Completeness of the test coverage
Completeness of the test execution
Any vulnerabilities are to be captured as defects in HP ALM
All defects closed at the phase exit point
Successful review of the test execution results
Exit Criteria:
• All required tests have been executed and acceptance criteria have been met.
• Reports providing a clear indication of test execution completion and coverage.
• All required defects closed based on an agreement with the MOSL Test Lead and solution provider.
• Mitigation action agreed for all outstanding defects.
• Test closure agreed and assured by MOSL Test Lead and solution provider.
• An adequate (as determined by MOSL) Security and Penetration Test outcome report has been provided to MOSL by the CGI-selected Security & Penetration testing company.
• An adequate (as determined by MOSL) action plan has been provided to MOSL by CGI.
Key Test Principles:
The key test principles covering this phase of testing are to be identified by the solution provider.
5.2 To verify the Market Operation
5.2.1 Service Readiness Testing (30th Aug ’16 – 30th Sept ‘16)
This level of testing is focussed on running a subset of the UAT test cases to ascertain that MOSL
users can operate and support CMOS. The tests are executed by the MOSL Market Services
Team with support from the MOSL Test Team who will develop test scripts from the requirements
stated in the CSD Codes.
Accountable – MOSL Test Lead
Responsibility – Director of Market Services
Entry Criteria:
Test plans, detailed scripts and test data are complete and assured before the start of the SRT execution
SRT test scenarios are defined, reviewed, prioritised and agreed before the start of the SRT test execution phase
A stable test environment is available for testing
All previous test phases have been successfully completed and there are no unacceptable outstanding issues.
All elements of the Services are available to the MOSL Testing Service.
Controls
The following controls will be applied at different points in this test phase.
A test approach and plan have been prepared, reviewed and approved.
Detailed test scripts prepared, reviewed and assured.
Completeness of the test coverage
Completeness of the test execution
Defect metrics showing a declining rate of defect discovery, approaching zero
The defect resolution rate is monitored to ensure the test plan timescales are met
All defects closed at the phase exit point
Successful review of the test execution results
Exit Criteria:
• Entry criteria fully satisfied.
• The required level of testing has been completed to verify each requirement:
High priority – all tests executed.
Medium priority – all high and medium priority tests executed.
Low priority – all high priority tests executed.
• Reports providing a clear indication of test execution completion and coverage.
• All required defects closed based on an agreement with the MOSL Test Lead and solution provider.
• Mitigation action agreed for all outstanding defects.
• Test closure agreed and approved by MOSL Test Lead and solution provider.
• There are no “critical” or “major” defects.
• There is an agreed plan to resolve minor defects.
Key Test Principles:
The key test principles of Service Readiness testing are:
• The service readiness test verifies a sufficient level of functionality to confirm that MOSL can operate the business operation.
• The service readiness test simulates a series of actual business cases performed by end-users. It includes real business operations to ensure they run without fault.
• It will be executed by MOSL Test Team with support from the project and Market Services team.
• The Director of Market Services will be required to sign off the SRT.
• SRT test closure must be approved by the appropriate stakeholders.
5.3 To verify the Market Participation
5.3.1 Connection Test (16th & 17th Dec ’15)
This test is offered to those Market Participants that register an interest, to assure connectivity
between their own infrastructure and CGI’s. Put simply, this is a test that will ensure
messaging can flow between the two entities and through their respective security firewalls. This
test will need to be executed as a pre-requisite to Market Participants’ use of environments supplied
by CGI, for example Sector Test System (STS) 1 & 2.
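A Market Participant might run a basic reachability check of the following kind ahead of the connection test. This is a minimal sketch only; the host name and port are placeholders, since the real endpoint details are supplied by CGI.

```python
# Illustrative sketch only: a basic TCP/TLS reachability check of the kind a
# Market Participant might run ahead of the connection test. The host and
# port are placeholders, not the real CGI endpoint.

import socket
import ssl

def check_connectivity(host, port=443, timeout_s=10):
    """Attempt a TLS handshake with the remote host; return True on success."""
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout_s) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                print(f"Connected to {host}:{port} using {tls.version()}")
                return True
    except (OSError, ssl.SSLError) as exc:
        print(f"Connection to {host}:{port} failed: {exc}")
        return False

if __name__ == "__main__":
    check_connectivity("sts.example.invalid")  # placeholder host
```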
Please note this test has now been completed; another test is scheduled in January, with the
date to be confirmed by the Solution Provider.
Accountable – Solution provider
Responsibility – Market Participants who have registered an interest in executing this test ahead of
using the STS environments, with support from CGI.
Entry Criteria:
Market Participant has registered their interest to execute
Connectivity details are available
Technical resources at both CGI & Market Participants are ready to execute and assure
the test as successful
Controls:
CGI’s environment manager has defined and provided full details of the test to be executed
CGI’s environment manager has provided a schedule for testing including time of
execution
Success criteria have been defined between CGI and the Market Participants executing the
test
Exit Criteria:
The test has completed and connectivity has been proved between CGI & Market
Participants that registered their interest
Key Test Principles:
The key test principles of the connection test are:
The connection test proves the infrastructure and the whitelisting of Market Participants’ IP
addresses, and therefore connectivity at this level
It will be executed using technical resources in both organisations that understand and are
able to verify connectivity
Both the CGI and the Market Participant are able to verify and approve connectivity.
5.3.2 Sector Test System (STS1 & 2 for B2B High Volume Interface) Test (STS1 Jan ‘16 – STS2 Feb ‘16 to the end of March ‘16)
This phase of testing is offered to those Market Participants that register their requirement to
execute early B2B High Volume Interface (HVI) testing. Two Sector Test Systems will be made
available to Market Participants to execute STS testing, STS1 will be made available throughout
January and STS2 will be made available throughout February and to the end of March. Each
Market Participant that registers their interest to use STS 1 & 2 will have completed the connection
test due on the 16th & 17th December.
STS runs in the same application architecture as the CMOS system; it is a functional component of this system and uses the same users and user management, web portal and B2B interfaces. Test scenarios are scripted and run independently of any central data; responses are defined as part of the test scripts. STS can therefore be made available before go-live. A test scenario consists of one or more Transactions being sent and/or received by the Market Participant. STS supports two main interfaces:
B2B interface for sending and receiving transactions
GUI interface for preparing and executing STS test scenarios
The B2B interface uses the exact same functional interface as the CMOS published interface, as defined in the external Interface Specification XSDs. The WebService endpoint is different, to differentiate between STS and normal Transaction streams. The GUI interface uses the same CMOS portal as normal operation and requires Organisation and user set-up and implementation of the Organisation Certificate as defined in CSD0006.
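As an illustration of how a participant might exercise this interface, the sketch below validates an outbound transaction against an Interface Specification XSD and posts it over mutual TLS using the Organisation Certificate. The endpoint URL, file names and payload are placeholders; the real XSDs, endpoint and certificate arrangements are those defined by CGI and CSD0006.

```python
# Illustrative sketch only: validate a transaction against a published XSD,
# then post it to the STS endpoint with a client certificate. All paths, the
# URL and the payload are placeholders.

import requests                 # third-party: pip install requests
from lxml import etree          # third-party: pip install lxml

STS_ENDPOINT = "https://sts.example.invalid/b2b"      # placeholder, not the real URL
CLIENT_CERT = ("org_certificate.pem", "org_key.pem")  # Organisation Certificate files

def send_transaction(xml_path, xsd_path):
    """Validate a transaction file against the XSD, then post it over the HVI."""
    schema = etree.XMLSchema(etree.parse(xsd_path))
    doc = etree.parse(xml_path)
    if not schema.validate(doc):
        raise ValueError(f"Transaction invalid: {schema.error_log}")
    response = requests.post(
        STS_ENDPOINT,
        data=etree.tostring(doc),
        headers={"Content-Type": "application/xml"},
        cert=CLIENT_CERT,       # mutual TLS with the Organisation Certificate
        timeout=60,
    )
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    print(send_transaction("meter_install.xml", "interface_spec.xsd"))
```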
CGI will supply the test cases that the Market Participants will be expected to execute in order to
prove the HVI and the required messaging flows. Currently the test scenarios for STS 1 are
understood by CGI and will include meter and transfer related test cases; the scope of STS2 is
currently unknown and CGI expect to work with the registered Market Participants to decide what
tests should be executed.
Accountable – Registered Market Participants
Responsibility – CGI Environment and Test Managers to provide support and the STS
environments and test cases that will be executed
Entry Criteria:
CGI & registered Market Participants have completed the connection test on the 16th /17th December
Environment and test details are made available to registered Market Participants
STS 1 & 2 environments are available
CGI provided test cases are made available to the registered Market Participants
A defect process is made available
Controls
The following controls will be applied at different points in this test phase.
CGI environment and test manager support is in place for both STS 1 & 2
A detailed test plan for both STS 1 & 2 is available
A communications plan including escalation and defect procedures is shared between CGI & the registered Market Participants
Exit Criteria:
• Market Participants are able to prove and exercise the HVI between themselves and CGI using the test cases provided
• Any defect or issues found are raised with CGI as part of the agreed defect resolution process
Key Test Principles:
The key test principles of the HVI B2B STS1 & 2 tests are:
The test verifies the messaging flows between CGI and Market Participants for the HVI,
and that messages are able to move through their respective firewalls
Technical test resources at the Market Participants will execute using Test cases supplied
by CGI
CGI environment & test manager will be available to support Market Participants and
provide a defect and issue escalation point
Both the CGI Test manager and their respective peer at the Market Participant are able to
verify and approve HVI Testing is successful and complete.
5.3.3 Market & Company Readiness (Market Entry Assurance Certification) (8th Aug ‘16 – 30th Sept ‘16)
The principles of Market Entry Assurance Certification (MEAC) tests are stated below. For this
document (MTS) both Interface and Data Transaction Testing and Market Scenario Testing are
detailed, but there are a number of manual documented processes that must be executed prior to
these phases to complete certification; these can be found in the Market and Company Readiness
Plan document (section 6, page 17), distributed on 17/12/2015 by MOSL Market Services. This
section of the MTS must be read in conjunction with the MCR Plan.
Accountable – Director of Market Services
Responsibility – Market Participants with support from Market Services and MOSL Test Team
5.3.4 Interface and Data Transaction Testing (August 2016)
5.3.4.1 Objectives
Interface and Data Transaction Testing (IDTT) covers the activities that an applicant is required to
undertake to become an entity on CMOS.
The objective is that by the end of this activity the applicant will be ready to undertake Market
Scenario Testing and will have proved that it is technically able to interact with CMOS.
In particular it will have:
Established the required administrator and user accounts within the relevant test
environment,
Set up the supply points and all of the associated non-transactional (standing) data entities
(tariffs, meters, etc.) required to undertake Market Scenario Testing, and
Uploaded all required test data.
5.3.4.2 Approach
The approach set out here should be read in conjunction with MOSL’s Master Test Strategy
version 2.2_Final (this version).
During transaction testing, using the administrator account details provided by MOSL, the
applicant’s administrator will create the required user accounts and enter non-transactional data
within the full CMOS market entry assurance test environment to enable Market Scenario Testing.
MOSL will confirm details of the test cases to be run and the approach to both IDTT and Market
Scenario Testing by end of February 2016 in its Detailed Market Entry Assurance Test Plan.
The test cases will be based on those being prepared for MOSL’s own User Acceptance Testing.
Prior to entry into IDTT:
A detailed Market Entry Assurance Test Plan will have been agreed between MOSL and
the participant,
The Business Solution Assessment must have been completed, so that there is sufficient
confidence that the testing will be successful,
Market and interface training must have been completed,
Digital certificates must have been created to enable the applicant to communicate with
CMOS,
MOSL has set up the applicant as a Market Participant in the test environment,
MOSL has created an administrator account for the applicant and issued the test URL to
be used.
IDTT will be run on a small dataset. We expect that the number of supply points needed to be
established will be, at the most, in the hundreds.
MOSL will create a test script to assist participants in undertaking this testing. It will also define in
detail the characteristics of the static test data that it expects participants to upload.
Participants will provide the static data to be uploaded. These may be based on their actual supply
points or use dummy data. The upload will be achieved by means of standard market transactions
(i.e. creation of a SPID). Further details of this process will be provided by the end of February
2016.
Once the participant has uploaded the static data, MOSL will verify that sufficient data has been
created to support the Market Scenario Testing.
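That sufficiency check might amount to comparing the uploaded entity counts against the minimums the Market Scenario Test cases need. The sketch below is illustrative only; the entity names and required counts are invented, as the real data characteristics will be set out in the Detailed MEA Test Plan.

```python
# Illustrative sketch only: checks that the static data a participant has
# uploaded is sufficient for Market Scenario Testing. Entity names and
# required counts are placeholders.

REQUIRED = {"supply_points": 100, "meters": 50, "tariffs": 5}  # invented minimums

def verify_static_data(uploaded_counts):
    """Return a list of shortfalls; an empty list means the dataset is sufficient."""
    shortfalls = []
    for entity, minimum in REQUIRED.items():
        actual = uploaded_counts.get(entity, 0)
        if actual < minimum:
            shortfalls.append(f"{entity}: {actual} uploaded, {minimum} required")
    return shortfalls

if __name__ == "__main__":
    issues = verify_static_data({"supply_points": 120, "meters": 30, "tariffs": 6})
    print("Sufficient" if not issues else "Shortfalls: " + "; ".join(issues))
```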
MOSL is exploring methods for automating the reporting of testing. It aims to confirm
successful completion of IDTT soon after each participant finishes, to enable
immediate commencement of Market Scenario Testing. A short report will be issued to confirm this
outcome.
5.3.4.3 Outcome
The expected outcome of IDTT is that the applicant is confirmed as ready to commence Market
Scenario Testing. This will be confirmed by MOSL.
5.3.4.4 Timing and resource needs
IDTT will commence for all applicants at the beginning of August 2016. All applicants will be able to
undertake IDTT simultaneously. We would expect that IDTT would take no longer than a few days.
Participants will need to provide resources to identify or create test data, execute the tests and deal
with any issues and defects arising.
5.3.4.5 Supporting materials
As well as this Master Test Strategy, the following document will be published and maintained
by MOSL to support this process step.
Document: Detailed MEA Test Plan
Purpose: Details of the approach to IDTT and MST
Publication Date: End February 2016
5.3.4.6 Preparing for success
MOSL will validate the approach to IDTT during user acceptance testing (UAT) by ‘walking
through’ the process as if it were a market participant.
Publication of details of the test cases to be run and the non-transactional data requirements
should enable participants to start preparing test data and resources.
MOSL intends to make available a testing environment from May 2016 to enable applicants to run
their own tests. Further details of this are available in the test strategy.
The testing special interest group will continue to meet regularly to discuss progress, resolve any
issues and unblock barriers to progress.
5.3.5 Market Scenario Testing (August to September 2016)
5.3.5.1 Objective
The approach set out here needs to be read in conjunction with MOSL’s Master Test Strategy
V2.2_Final.
Market Scenario Testing (MST) provides the substantive technical test of an applicant’s ability to
transact with CMOS using its own business processes and systems. It aims to provide assurance
that new business systems and processes will function in the new market. It is important that MST
exercises the systems that the applicant intends to use to transact in the shadow and live market.
Successful completion of these tests will be a pre-requisite to the granting of a MEAC (Market
Entry Assurance Certification) which in turn is required to participate in the shadow market and,
where relevant, for Ofwat to grant a WSSL.
The market scenario testing aims to prove an applicant’s technical capability by running a set of
end-to-end test cases covering key market transactions against the limited dataset uploaded during
interface and data transaction testing. It is not therefore a performance test, nor can it be regarded
as a substitute for an applicant’s testing of its own systems.
5.3.5.2 Approach
MST will run a set of defined test cases covering the principal market transactions. The test cases
will run against the non-transactional data set which has been set up on the testing environment by
each applicant.
The test cases that will be run will be set out in detail by MOSL at the end of February 2016. MOSL
expects that the test cases will cover the following market transactions in line with CSD0001:
Set up a new supply point,
Transfer registration,
Cancellation of transfer registration,
Add a meter,
Remove a meter,
Disconnection,
Meter exchange,
Submission of meter reads, and
Data correction.
These will be end-to-end test cases and so settlement calculation and reporting will be exercised.
MOSL will generate detailed test scripts and test data covering the elements of CMOS processing.
These will be made available to participants at the beginning of UAT in May 2016. However, they
will be validated during user acceptance testing itself.
Please note that although detailed test scripts will not be available until May 2016, a full CMOS
environment will be made available to Market Participants from April 2016.
MOSL cannot provide details of the test steps that will be needed to exercise the participants’ own
business systems and processes. Participants will be expected to provide this detail, working in
conjunction with MOSL to ensure each test script constitutes a coherent end-to-end test across
both the market operator systems and the participants’ systems.
Clearly most market transactions involve more than just the market operator and one other
participant. MOSL will not make successful completion of MEAC depend on the actions of another
participant. For any particular set of tests, MOSL will provide required responses from other types
of market participant using a ’dummy’ retailer or wholesaler entity to do so. The precise method by
which this will be achieved is being explored and will be clarified in the Detailed MEA Test Plan.
MOSL expects each applicant will develop a detailed test plan, working with MOSL, following the
provision of the Application Information Return. These plans will require detailed information to be
provided by both MOSL and the participant and should cover matters such as:
The exact business processes and systems that will be tested and the entry and exit points
for this testing,
How testing will be managed,
How test data is to be introduced into the systems,
How test execution is to be evidenced for each test step, e.g. via prints of data transactions
sent and received, relevant screenshots, reports, files and database prints,
The generation of expected results at various stages of the transaction flow. It is
particularly important that expected results are defined at the inbound and outbound
interface between participants and CMOS in order that the location of any discrepancy can
be identified (see the sketch after this list),
Defect management and re-testing,
Test result reporting.
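To illustrate the expected-results point above, the following minimal sketch compares expected and actual records at a named interface step so that the location of any discrepancy can be reported. The field names and values are dummy examples, not real market transactions.

```python
# Illustrative sketch only: compares expected and actual results at the
# participant/CMOS interface so the location of a discrepancy can be
# identified. Records here are dummy examples.

def compare_results(step, expected, actual):
    """Return human-readable discrepancies between two result records."""
    discrepancies = []
    for field in sorted(set(expected) | set(actual)):
        exp, act = expected.get(field), actual.get(field)
        if exp != act:
            discrepancies.append(f"{step}/{field}: expected {exp!r}, got {act!r}")
    return discrepancies

if __name__ == "__main__":
    expected = {"spid": "100000000001", "status": "ACCEPTED"}
    actual = {"spid": "100000000001", "status": "REJECTED", "reason": "T101"}
    for line in compare_results("outbound-CMOS-response", expected, actual):
        print(line)
```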
Further details of the exact content and timing of such plans need to be finalised, and MOSL
expects to develop and publish guidance for this. MOSL will generate expected results from CMOS
processing as part of its test scripts and will provide these to applicants for execution.
MOSL will ‘witness’ test execution by examining the documentary evidence provided. It will also
examine test management products such as defect logs.
5.3.5.3 Outcome
The desired outcome will be the successful completion of all required Market Scenario Test cases.
MOSL will adopt a risk-based approach to setting standards for Market Scenario Testing.
Successful completion for test cases is likely to be defined in terms of the numbers of outstanding
defects in different categories.
The details of this approach will be confirmed in the Detailed Market Entry Assurance Test Plan.
However, the over-riding principle is that there should be no unacceptable errors, so the defect
tolerances will likely be relatively low.
MOSL will evaluate and report on the completion of Market Scenario Testing as an input to its
certification decisions.
5.3.5.4 Timing and resource requirements
Market Scenario Testing can begin once MOSL has confirmed that an applicant has successfully
completed IDTT. This is expected to be complete for all applicants by the end of the first week of
August 2016. However, MOSL will need to manage its resources to support execution over the
period of Market Scenario Testing. It therefore expects to set out a series of start dates for
commencement of MST to manage this workload.
Applicants will be able to book a starting date on a ‘first come first served’ basis once they have
supplied a complete trading application. Applicants will need to devote resources to the planning,
preparation, execution and management of testing.
5.3.5.5 Supporting materials
As well as this the Master Test Strategy the following documents will be published and maintained
by MOSL to support this process step.
Document: MOSL detailed MEA test plan
Purpose: Details of MOSL’s overall approach to IDTT and MST
Publication Date: End February 2016

Document: Pro-forma detailed participant MEA test plan
Purpose: Provides guidance as to the structure and content of the detailed test plan that will be agreed between MOSL and each participant.
Publication Date: End February 2016
5.3.5.6 Preparing for success
MOSL will validate the approach to market scenario testing during user acceptance testing as it will
be running similar, if not the same, test cases.
Publication of details of the market scenario test cases to be run and the non-transactional data
requirements in May 2016 should enable participants to start preparing test data and resources.
MOSL intends to make available a testing environment from May 2016 to enable applicants to run
their own tests. Further details of this are clarified in the Master Test Strategy.
The Testing SIG will continue to meet regularly to discuss progress, resolve any issues and
unblock barriers to progress.
Consequences of failure
MOSL is committed to working with market participants to support them in achieving MEAC by 30
September. There will be a short period after this date for remedial MEA to resolve any outstanding
issues.
However, if a market participant does not successfully obtain market entry assurance certification
after this, the consequences for the company could be significant:
A retailer is unlikely to be granted a WSSL in time to be able to participate in the new
market from market opening or acquire customers via the exit process,
An incumbent undertaker will not be allowed to exit the new market at market opening if the
retailer acquiring its customers does not have a WSSL, and
A market participant will not be able to participate in the shadow market and will be unable
to carry out CMOS transactions to identify and resolve issues with data and processes.
Ofwat intends to issue a transitional licence condition which will apply to incumbent water and
sewerage undertakers and water supply licensees through Instruments of Appointment (IoA) and
water supply licences (WSL) respectively.
If an incumbent undertaker or licensee does not make reasonable endeavours to achieve MEAC,
Ofwat may view this as a relevant factor in considering whether to take any action under this
transitional licence condition.
Remedial market entry assurance (October 2016)
The remedial MEA process is timed to enable any incumbent undertakers seeking to exit and any
retailers who are seeking to be granted a WSSL by Ofwat and acquire customers at market
opening (through exit) to achieve market entry assurance certification in time to submit an exit
application to Defra for approval.
MOSL will not offer remedial market entry assurance to any market participants who have not
already sought, but subsequently failed, to achieve certification.
During remedial MEA, market participants will again carry out Market Scenario Testing in order to
gain certification. The prerequisites to Market Scenario Testing as defined within this section will
not be re-tested because in order to have sought certification in September 2016 these would have
been completed.
If a market participant has a different business solution or interface with CMOS than those on
which certification was previously sought, this will require market entry re-assurance and not
remedial MEA.
Business-as-usual (BAU) market entry assurance (December 2016 to March 2017)
MOSL will carry out market entry assurance as part of ‘business-as-usual’ from December 2016 to
mid-March 2017 to enable any currently unknown new entrant retailers or NAVs to enter the market
at market opening. The market entry assurance process will follow the same process steps as
those defined above.
The timing of the business-as-usual market entry assurance process means that those applicants
seeking to enter the market will not benefit from being able to understand the capabilities of their
processes and systems during the shadow market. MOSL will not be issuing certifications
during March 2017, in order to ensure that there is stability in the Market Participant population for
market opening.
Following market opening, the market entry assurance process will continue as defined within
CSD0001, which will, by this time, be a statutory document.
Market entry re-assurance
During pre-market opening if the scale of a market participant’s operations changes significantly
from that on which it achieved or is seeking MEAC, it will have to go through complete market entry
re-assurance including all steps outlined within the market entry assurance process above.
The MEA activities which the market participant will need to re-assure (MIT, BSA, IDTT and/or
MST, as set out in CSD0001 and adapted for pre-market opening) are triggered by the following
material changes:
1. Market participant has completed self-assessment MEAC but would now like to access the HVI.
2. Market participant wishes to use the HVI for additional data transactions and is familiar with the HVI.
3. Market participant hardware platform change where it is already familiar with the HVI.
4. Market participant communications change (for example changes to network or ISP).
5. Market participant data handling software platform change.
6. Market participant data handling software platform upgrade.
7. Significant changes to operational staff which require induction over and above that normally taken.
8. Changes to business process(es) which are highly relevant to compliance with Market Codes.
Market entry re-assurance will take place alongside business-as-usual market entry assurance
from December 2016 onwards.
5.3.6 Market & Company Readiness (Shadow Market Operation) (4th Oct ’16 – 5th Apr ’17)
The principles of Shadow Market Operation are stated below. This document (MTS) should be read
in conjunction with the Market and Company Readiness Plan document section 8 page 43.
Accountable – Director of Market Services
Responsibility – Market Participants with support from Market Services and MOSL Test Team
The duration and scope of the shadow market
The proposed period for the shadow market to run is from 3 October 2016 until market opening.
This period represents an opportunity for wholesalers, retailers and MOSL to trial new systems and
processes to identify and resolve issues with data in a safe environment before ‘go live’. This
period also provides all market participants, and MOSL, with an opportunity to assess their own
readiness.
The shadow market will:
Run on market production systems from 3 October 2016,
Exercise all market processes possible (registration, settlement etc.),
Require all updates to market data during the shadow market to be via market
transactions,
Put arrangements in place for one-off data uploads in limited circumstances,
Link entry criteria to participation including completion of Market Entry Assurance
Certification and, for wholesalers and incumbent retailers, successful initial data upload
Include trialling of operational processes, organised by market participants,
Involve as many participants as possible, and
Facilitate the switch-over to the live market.
The benefits of participation
The benefits of running the shadow market period are that it will:
Allow all parties, including MOSL, a chance to rehearse new business processes with
realistic transaction volumes and datasets in expected live timescales,
Allow participants to understand the outputs of settlement calculations for themselves on
the basis of their own full datasets to provide confidence that settlement calculations are
working, are consistent with current customer billing and will result in expected levels of
revenues or payments,
Enable problems to be identified and rectified in a safe environment with no commercial
consequences, in particular by allowing for the identification of data issues relating to
supply points that affect settlement values and resolving customer or address
discrepancies,
Increase the confidence of all parties that MOSL and its systems are functioning
effectively,
Allow all parties to gain assurance on their readiness activities through full live simulation.
Therefore during the shadow market, we expect companies to:
Verify that the processes to update data are robust in a realistic simulated environment,
Verify datasets to ensure that they do not get unexpected outcomes when the market is
functioning normally,
Trial bilateral transactions to ensure that operational updates for secondary processes lead
to data updates where required,
Ensure tariffs, tariff codes and cost-sensitive data fields have been applied appropriately
such that the amounts chargeable to retailers and the amounts charged on to customers
reflect current billing using existing processes. Retailers and wholesalers are expected to
work together to verify and resolve any discrepancies,
Resolve identified data issues in back-end systems as well as in market systems, with
retailers and wholesalers working together to verify and resolve any discrepancies,
Resolve issues and work with MOSL in relation to progress on implementing performance
rectification plans, where the need for a plan has been identified (e.g. as a result of data
upload),
Work with MOSL in any market audit
The shadow market may include a test of a market wide incident to allow MOSL and all market
participants to verify, in a realistic scenario, that their business continuity plans are robust and
effective, or to resolve any issues arising in a safe environment.
Overall approach to the shadow market: parallel running in production mode
The concept of the shadow market is for market participants to operate market processes in
parallel with their legacy systems in a pre-production CMOS environment that will seamlessly
evolve into the production environment on market opening. Participants will be required to maintain
SPIDs within this environment using market transactions and processes in accordance with the
CSDs.
Running the shadow market in a pre-production environment means processing only data which
reflects what is being processed by participants in their existing systems. This means that the BAU
production systems and the systems running the shadow market are the same.
The live market will not be given legal effect until 1 April 2017 and customers cannot apply to be
switched until the market opens.
Participants must maintain the data within CMOS to reflect their legacy production system until the
market opens.
It is expected that participants will operate the market processes during the shadow market as they
would in production mode.
6 Testing Scope
6.1 Risk Based Testing
Ideally, every Functional and Non-Functional requirement, test script and scenario would be tested,
but this may not be possible or practical given the time challenges we have to be market ready;
we therefore propose to use a risk-based approach to testing.
By prioritising testing it is possible to gain sufficient confidence of market readiness by targeting
test effort on areas of High or Medium risk and mitigating those areas first.
Test coverage (i.e. the extent of executed test cases and level of effort) will be proportional to the
level of risk associated with each Functional or Non-Functional requirement, or each logical
function of the solution.
Requirements will be mapped to the business processes they are delivered by and this will
determine the priority associated with them.
By assessing each business process we can identify the minimal quality requirement for it.
High priority business processes will require that all high, medium and low priority test cases be
executed at least once.
Medium priority business processes will require all high and medium priority test cases be
executed at least once.
Low priority business processes will require all high priority test cases be executed at least once.
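The mapping above lends itself to a simple selection rule. The following minimal sketch applies it, using invented process priorities and test-case names:

```python
# Illustrative sketch only: applies the risk-based priority mapping above to
# decide which test cases must run for a given business process.

# Which test-case priorities must execute for each business-process priority.
REQUIRED_TEST_PRIORITIES = {
    "High": {"High", "Medium", "Low"},
    "Medium": {"High", "Medium"},
    "Low": {"High"},
}

def select_tests(process_priority, test_cases):
    """Return the test cases that must be executed at least once."""
    wanted = REQUIRED_TEST_PRIORITIES[process_priority]
    return [name for name, priority in test_cases if priority in wanted]

if __name__ == "__main__":
    cases = [("TC-01", "High"), ("TC-02", "Medium"), ("TC-03", "Low")]
    print(select_tests("Medium", cases))  # ['TC-01', 'TC-02']; TC-03 is skipped
```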
Please note CGI are using a requirements-based testing approach; a risk-based approach
is being used by MOSL for UAT.
6.2 Functional Areas to be Tested (To Be Confirmed)
RBAC Portal
Portal Screens
Standing Data & MDM
Settlement Calculations
Tariff Data
Notifications
Standard Reports
Ad-hoc Reports
Market Performance
Business Service Desk
Data Migration
6.3 Features Not To Be Tested
This section will be completed with the assistance of the Market Services Analysts as part of the
testing review process; however, no features may be out of scope in this project instance. This
section will be updated if any out of scope features are identified.
6.4 Non Functional Requirements To Be Tested
A full and detailed Non-Functional Test Plan will be defined by CGI and referenced here. Non-
Functional testing is referenced in the approved CGI Test Strategy (SA_T_Test Strategy v1 0 4 0).
7 Test Environments
As solution provider, CGI are expected to deliver the following four logical environments, which will
be partitioned to support testing:
Dev & Test:
Bridgeall Test Environment (Tariffs & Settlements)
Unit Test Environment
Systems & Integration Test Environment
Data Migration Test Environment

Pre-Production:
User Acceptance Test Environment
Performance Test Environment
Service Readiness Test Environment
Market Readiness Test Environment

Production:
Production environment

Disaster Recovery:
Disaster Recovery Site
The physical details of these environments have not yet been supplied by the CGI Environments
Manager.
7.1 Dev & Test Environment - TBC
7.1.1 Bridgeall Test Environment for Tariffs & Settlements
7.1.2 Unit Test Environment
7.1.3 Systems & Integration Test Environment
7.1.4 Data Migration Test Environment
7.2 Pre-Production Test Environment - TBC
7.2.1 User Acceptance Test Environment
7.2.2 Market Scenario Test Environment
7.2.3 Performance Test Environment
7.2.4 Security & Penetration Test Environment
7.2.5 Service Readiness Test Environment
7.2.6 Market & Company Readiness Test Environment
At points in the delivery of the MOSL CMOS Program, test environments will be made available to
Market Participants to complete their testing. Although some, but not all, of these testing phases are
optional, MOSL would encourage all Market Participants to use the following environments as they
are made available.
Sector Test Systems (STS 1 Jan 2016 & STS 2 Feb 2016 to End of March ‘16)
(Optional);
The STS Environments will be delivered on a request basis individually to those Market
Participants that require them. Both STS 1 & 2 will be made available to test the High
Volume Interface (HVI) between Market Participants and CGI. The test cases that the
Market Participants will execute will be provided to them, along with the environment, by CGI;
there will not be an option for Market Participants to execute their own test cases in the
STS environments. STS 1 will be delivered by CGI in January 2016 and STS 2 will be
delivered by CGI in February 2016. For those that wish to request these
environments, a technical connectivity test between CGI and Market Participants is due on
16th/17th December 2015, to prove messaging can be delivered through their and CGI’s
respective firewalls.
Market Scenario Test Environment (User Testing April 2016) (Optional); In April 2016
a full CMOS environment including HVI and the Low Volume Interface (LVI) will be made
available to those Market Participants that request access. This will be known as the
‘Market Scenario Testing Environment’, which at this stage in the MOSL CMOS program
delivery will not have completed UAT and so will be caveated as such. This environment can
be used by all Market Participants and will give them an opportunity to execute the end to
end tests in their testing programs. MOSL & CGI will not be providing test cases for use in
this environment.
Market Entry Environment (Market Entry Assurance test August & September 2016)
(Mandatory); MOSL UAT will be completed and assured by the 3rd August 2016. The
Market Entry Environment will then contain the final assured CMOS version for live
operations. In this environment in the months of August and September 2016 MOSL
Market Services will work directly with all Market Participants to complete Market Entry
Assurance Testing in order to certify their entry into the Shadow Market in September
2016. The test scenarios that Market Participants will be expected to execute and pass will
be provided in this phase by MOSL.
Please note: MOSL expect to share the Market Entry Assurance test cases with
Market Participants from May onwards in order that they can be executed and
practised before the final Market Entry Assurance Testing Phase is due.
Other Environments (volunteers only); Those Market Participants that are selected as
volunteers to support the SIT & UAT phases will access SIT environments between
January and April 2016, and UAT environments between April and July 2016. In the SIT
phase test cases will be provided by CGI; in the UAT phase MOSL expect that the
selected Market Participants will support the preparation of the UAT cases that will be
executed in UAT. Both SIT & UAT will be executed in environments that are created
specifically for these phases and will not be accessible to Market Participants other
than the selected volunteers.
7.3 Configuration Management
It will be necessary to have an agreed and approved base configuration for all system settings and parameters for all applications contained in the test and baseline solution. The baseline configuration will be the standard to which all test and production environments must be aligned. The maintenance of this information, in its initial state and going forward during the MOSL CMOS Project, will be the responsibility of the CGI Environment Manager. On-going maintenance, following handover, will be the responsibility of CGI and their appointed teams; the upkeep of the delivered environments on an on-going basis is part of their contract-agreed Roles and Responsibilities.

In addition, it will be necessary for CGI to have a process owner who is responsible for this baseline configuration. This owner will not only collate and publish the baseline view for review and agreement, but will also have the relevant access and permissions to amend the baseline settings in all environments.

It is absolutely critical that, when testing commences, the baseline configuration is as close as possible to the settings that will be the production cut. It is recognised that this is not always possible; however, this does not negate the processes contained within this section, and any adverse effect on testing progress of any kind may be subject to the test suspension criteria. Any changes to any of the settings or parameters must be communicated to all key stakeholders of the project as defined, and specifically to all Test Managers, so that any changes required to test execution, in respect of how scenarios will operate, can be made accordingly and validated with the new settings. It is recommended that any changes to parameters follow the Change Control process adopted by the project for tracking and visibility.
Configuration Management (CM) is a four-part discipline applying technical and administrative
direction, control, and surveillance for:
Configuration Identification
o Conventions for identifying baseline and revision levels for all program files (source, object listing) and revision-specific documents
o Derivation records identify “build” participants (including source and object files, tools and revision levels used, data files)
Configuration and Change Control
o Safeguard defined and agreed baselines
o Prevent unnecessary or marginal changes
o Expedite worthwhile changes
Configuration Status Accounting
o Recording and tracking problem reports, change requests, change orders, etc.
Configuration Audit
o Unscheduled audits for standards compliance
o Schedule functional and physical audits near the end of the project
Testing’s interest in Configuration Management consists of the following:
To manage its own validation tests and their revision levels efficiently
To associate a given version of a test with the appropriate version of the build to be tested
To ensure that problem reports can identify software and hardware configurations accurately
To ensure that the right thing is built
To ensure that the right thing is tested
To ensure that the right thing is shipped
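One practical expression of these interests is recording which build version every test run executed against. The sketch below is illustrative, assuming a simple append-only JSON-lines log; the storage format and field names are assumptions.

```python
# Illustrative sketch only: records which build version each test run executed
# against, supporting the configuration-management interests listed above.

import datetime
import json

def record_test_run(log_path, build_id, test_pass, result):
    """Append a build/test association record so results stay traceable."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "build_id": build_id,      # e.g. the Release Name / Version Number
        "test_pass": test_pass,    # which complete run of the test phases this was
        "result": result,
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    # Hypothetical build identifier for illustration only.
    record_test_run("test_runs.jsonl", build_id="CMOS-1.4.2", test_pass=3,
                    result="passed")
```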
The following Configuration Management process, or an amended version, must be agreed between MOSL and CGI:
Definitions
Build, Release, or Test Release: An installable software item transmitted to the test group
for testing
Repository, Library, Source Code Control System: The system in which software items and their revision history are stored
Build Manager or Release Manager: The person responsible for creating the installable
item, software or hardware
Build ID, Release Name, Version Number, or Version: Sequence of numbers and/or
characters that uniquely identifies the release type
Test Cycle: A set of tests run against a given build
Test Pass: A complete run of all test phases
Release Timing: The schedule on which software is released to Testers or Users
The Test Environment Release Process

• Ensure that the Change Control has been agreed and approved, usually through the CAB
• Define who will be involved in the release process
• Define roles and responsibilities
• Define the release type. Plan in advance the type of Test Release to be implemented: New, Database Refresh, Bug Fix, Emergency, Standard, Configuration, Production Alignment, etc.
• Ensure that the Change Advisory Board (CAB) has approved the release. Where special dispensation has been granted instead, it must be backed up by an official statement from the authorising body, not a general comment and/or statement
• Define the architecture receiving the release, including any multi-platform timing releases
• Define the release timing schedule and frequency
• Ensure support agreements are defined and communication lines are in place for the release
• Ensure agreement has been reached on the bug fixes and any new Configuration and Change Requests included in the build
• Ensure that ALM contains the new release details in preparation for any further defect logging
• Ensure that all communications have been completed
• Ensure that any other project or programme which is using, planning to use, or dependent on this release is factored into all communication plans and that any risks have been mitigated; do not be tempted to ignore these, be they large or small
• Ensure that a back-out plan is compiled, that all stakeholders have been walked through it in the CAB, and that it is communicated to affected and interested stakeholders. Do not overlook the Test Managers
• Ensure that the Test Environment is locked down to anyone who does not have the correct release or access privileges
• Ensure that release notes have been completed and distributed
• Ensure that all naming conventions, version control and Configuration Management items are included
• Ensure that any configuration items that are required (login, password, account details not in the Data Definition, firewall, IP & port configurations, system-dependent configuration, etc.) are clearly defined and baselined
• Select the content (bug fixes, new features, configuration and documentation) for a particular release
• Define the test success criteria of the new build
• Implement the changes required for the bug fixes and new features
• Compile, integrate, and otherwise assemble the build, and mark the build (in the build and in the repository) with a version number. The Configuration Manager is accountable in this process
• Smoke test the build prior to delivery to the Test Lab. If the tests pass, continue with the next step of implementation. This is vital and must not be overlooked
• If the smoke tests fail, invoke the back-out plan, establish what went wrong, back out the build, fix the problem and restart the process
  o Do not be tempted to fix on the fly: a change in the build must be tested again before it is returned to the build cycle and released to the Test Manager / Lead
• IMPORTANT: compile a Test Certificate for the receiving Test Manager and Environment Manager
• The Test Manager and Environment Manager must plan their receipt schedule for the new build. The CGI Environment Manager must define the steps taken to receive and install the new build; in addition they must plan their own smoke tests prior to handing over to the Test Manager(s)
• The Test Manager must plan, in alignment with their Test Strategy, what initial tests will be performed as part of the new release. This may consist of defect re-tests or Regression Testing prior to commencing testing proper, where those are the next steps
• Deliver the smoke test results to the Test Manager and key stakeholders
• Compile a Lessons Learned document from smoke testing

End of Process
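By way of illustration, a minimal sketch of the smoke-test gate described in the process above: deploy the build, run the smoke tests, and invoke the back-out plan on failure. The script names and commands are hypothetical; the real deployment and back-out steps are environment-specific.

```python
import subprocess
import sys

# Illustrative smoke test names; the actual checks are defined per release.
SMOKE_TESTS = ["login_check", "hvi_file_upload", "report_render"]

def run(cmd: list) -> bool:
    """Run an external command and report success or failure."""
    return subprocess.run(cmd).returncode == 0

def release(build_id: str) -> bool:
    """Deploy a build, smoke test it, and back it out if any smoke test fails."""
    if not run(["./deploy_build.sh", build_id]):        # hypothetical deploy script
        return False
    for test in SMOKE_TESTS:
        if not run(["./run_smoke_test.sh", test]):      # hypothetical test runner
            run(["./back_out_build.sh", build_id])      # invoke the back-out plan
            print(f"Build {build_id} rejected: smoke test '{test}' failed")
            return False
    print(f"Build {build_id} accepted; compile Test Certificate for the Test Manager")
    return True

if __name__ == "__main__":
    sys.exit(0 if release(sys.argv[1]) else 1)
```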
7.4 Release Management
It will be necessary to have strict levels of control and process around the discipline of Release Management. For the purposes of the Project, it is critical for testing to have clear visibility of what levels of code and/or changes are being applied to any of the test environments. It is also imperative that the release of code is controlled and follows strict deployment guidelines as it moves through the environments.

A release is defined as a drop of code to be applied to an agreed environment ready for testing prior to implementation. These releases of code must be prioritised and their content agreed. Their make-up will differ from release to release; the main or common reasons for inclusion in a release will be:
• Code now ready to be added to an environment for the first time
• Change Control forcing change(s) to a code version(s) for one or more applications
• Defect fixes ready to apply
• Hardware fixes
• Software patches
It will be necessary to issue a release document to all key stakeholders for review and agreement prior to the deployment of each release. In addition, it may be necessary to hold meetings to agree the content of a release based on changing priorities; all key stakeholders and interested parties should be involved in agreeing the content of each release. The CGI Release Manager will have close links to the following areas:
• All Technology Development Units
• All Infrastructure Units
• Configuration Management
• Defect Management
• Test Management
• Production Support
8 Test Data Management - TBC
Test data to be used through all test types should be delivered via the Data Migration work stream, apart from IDTT (detailed in Section 5.3.4), which will be created using transactions. This will ensure that all test phases are using test data in a format that all components of the solution must be able to process.

Data Migration is tested under the Unit and Unit Integration test types, and where necessary is Non-Functionally Tested for performance and resilience. Any changes to reference or configuration data should move through Change, Configuration and Release Management to ensure consistency across test types and releases.

Although not all data will require obfuscation, it is good practice to prevent any possibility of a test transaction leaving the test environment and executing actual business transactions, by using Market Participant test data (for example dummy retailers and wholesalers) rather than real data.

The process to supply this data will be provided by the MOSL Data Lead, Kaleem Sayed - TBC.

MOSL & CGI will not provide test data to Market Participants, just as test cases will not be provided. There are some exceptions:
• Data will be provided in the STS 1 & 2 environments by CGI
• Data for SIT will be requested from selected volunteer Market Participants by CGI; the data expectations will be defined by CGI and communicated as part of their detailed SIT test plan
• Data for UAT will be requested from the selected volunteer Market Participants by MOSL; the data expectations will be defined by MOSL and communicated as part of the detailed UAT test plan. Please note that the Market Participants selected for UAT will also provide data pre-UAT for the system data test; this data will then be used in the UAT test phase
• Market Participant data expectations will be described in the Data Strategy document (due January '16)
8.1 UAT Test Data
During test case analysis, high-level test data requirements will be captured in the MOSL Test Data Requirements spreadsheet, as:

• Category – categorises the data requirement, making it easier to assign preparation later; contains values such as security, reference, and master data
• Data Type – the name given to the data in the CSDs; will contain values such as Digital Signature and Market Participant Unique ID
• Characteristics – the description of the data as defined in the CSD
• Source of Info – the CSD reference ID and the unique section reference from the document, to enable further reading if required
At the end of Test Analysis we aim to have a complete list of high level data requirements.
During Test Design, how each test case will be tested will be documented as one or more test scripts, which will also provide more detailed test requirements.
The MOSL Test Data Requirements spreadsheet will be updated as follows:

• Used in which scripts – will contain the unique ID of the test script as provided by HP ALM. Where a row already exists for a data item, the field will be updated with the new script ID
• Source / Generation method – will contain the identified method for sourcing or creating the test data
• No. Avail Already – as data is created or sourced, this number will be incremented
• Total No Required – as test scripts are written, this value will be incremented by one. If a row already exists for a data item, the count will be incremented by one
• Owner – will contain the name of the person responsible for creating the data
• Status – will be either "Open" or "Closed", depending on the state of test data creation
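A minimal sketch of how a row of the spreadsheet could be modelled programmatically during Test Design; the field names mirror the columns above, while the class and its update methods are illustrative assumptions rather than a defined project artefact.

```python
from dataclasses import dataclass, field

@dataclass
class TestDataRequirement:
    """One row of the MOSL Test Data Requirements spreadsheet (field names illustrative)."""
    category: str                  # e.g. security, reference, master data
    data_type: str                 # name given in the CSDs, e.g. "Digital Signature"
    characteristics: str           # description of the data as defined in the CSD
    source_of_info: str            # CSD reference ID and section reference
    used_in_scripts: list = field(default_factory=list)  # HP ALM script IDs
    source_method: str = ""        # identified sourcing / generation method
    no_available: int = 0          # incremented as data is created or sourced
    total_required: int = 0        # incremented by one per test script written
    owner: str = ""                # person responsible for creating the data
    status: str = "Open"           # "Open" or "Closed"

    def add_script(self, script_id: str) -> None:
        """Record another test script that uses this data item."""
        self.used_in_scripts.append(script_id)
        self.total_required += 1

    def mark_created(self, count: int = 1) -> None:
        """Record created/sourced data and close the row once the need is met."""
        self.no_available += count
        if self.no_available >= self.total_required:
            self.status = "Closed"
```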
It is assumed that test data creation will be the responsibility of the Test Data Manager, who will work closely with the MOSL Data Lead either to source test data from the Bulk Data Loads expected from Market Participants, or to create it where required.
Where test data is required to be created, there are various options available that will be explored to ensure the most cost-effective solution is used:

• The LVI interface and system screens can be used to enter the required test data
• It is assumed that MOSL will have access to the CGI Test Data Tool to fabricate the required data
• The Test Data Manager will be able to fabricate HVI interface files
As the required test data is created, the Status of each Test Data Requirement and Task row will be set to Closed and the Identifier field updated for each test script, so that HP ALM can be updated with the requested test data values.
Prior to test execution it is assumed that a backup of all created test data will be taken, to enable a
database restore back to a known starting point.
9 Communication Strategy
It is essential that all test activity is visible and transparent to the project stakeholders and interested parties, to provide adequate confidence that the risks that testing is designed to mitigate have been addressed.
There will undoubtedly be a large number of different quality test metric types required throughout the MOSL CMOS Test Project. The diagram below offers an overview of the four quadrants that the Test Leads will focus on when communicating progress for this Project; detailed in the outer boxes are the categories that will be available to the necessary stakeholders.
9.1 Communications Approach
The communications approach described here will ensure that key questions are answered throughout the testing phases, using MOSL Test Templates wherever possible.
Each entry below gives the Question being answered, the Action to communicate the answers, the Benefit of this approach, and the Output Reference.

Question: In each phase of testing, what is being tested? Where is it being tested? When will it be tested? Who will be testing it?
Action: Test Leads will produce a Test Approach for their test phase.
Benefit: A formal <phase> Test Plan is produced detailing what is being tested, what isn't being tested, who is testing what, where the testing will occur, and when it will happen.
Output: <Phase> Test Approach.
Action: Test Leads will produce an MS Project Plan with tasks, estimated durations, resources and dependencies.
Benefit: The impact of delays on the Test Phase Plan, from upstream or downstream activity, can be clearly seen and reported.
Output: MS Project Plan of the Test Phase activity.

Question: Are we testing enough during each phase?
Action: Produce a coverage matrix of test requirements to test scripts.
Benefit: Requirements without tests can be clearly identified and managed.
Output: HP ALM dashboard. This will be a Quality Gate criterion that must have evidence that it has been met.

Question: Are our test scripts correct and complete?
Action: Get agreement from the project stakeholders that the test scripts cover the requirements to the required depth.
Benefit: Confidence in the depth, breadth, accuracy and completeness of the tests planned to be executed.
Action: Work on the principle of Plan, Do, Check, Act. Identify what needs to be tested. Document how to verify each requirement. Get agreement from the Design, Build and Business Teams that the test steps and expected results are correct, and email confirmation of agreement to the project.

Question: How do we handle requirements that we cannot verify?
Action: Requirements that have no apparent measurable success criteria, i.e. no method of verifying whether they operate correctly or not, are given a status of 'Test Exclusion'.
Benefit: A complete list of requirements that will not be tested is available for the MOSL Test Lead and project stakeholders to see.
Action: The 'Test Exclusions' are reviewed by the Test Leads and project stakeholders, and either a measurable success criteria method and expectation is identified, or the status of 'Test Exclusion' is agreed, with a Risk value assigned to it.
Benefit: An agreed list of requirements that will not be tested is visible to everyone on the project.
Output: Test Exclusions are agreed and documented in the <phase> Test Approach.

Question: Are we ready to start testing?
Action: List the entry criteria for the test phase and report progress on the delivery of each in the Weekly <phase> Test Progress Report.
Benefit: Delivery of each entry criterion can be allocated to a person responsible for delivery, and progress can be reported on that delivery.
Output: Weekly <phase> Test Progress Report.

Question: What is in the release received from the solution provider?
Action: Receive a Release Note from the solution provider with the delivered component, containing: new content, defect fixes, tests and results, build instructions, and back-out procedures.
Benefit: The <Phase> Test Team will know what the contents of a release are and can plan their activity around its contents.
Output: Release Note.

Question: How do we know the release was installed correctly?
Action: The solution provider should provide a list of new or amended content that is visible to the user, what to look for and how to navigate to it.
Benefit: The Test Team have documented expectations and the means to verify the release has been correctly installed.
Action: As part of the Sanity Tests, the new or amended content should be verified as present and working by the Test Lead.
Benefit: The Test Team knows exactly what has been received and has confidence that it is worth detailed testing. Poor quality releases are prevented from being deployed into the test environment, wasting everyone's time. The results of the Sanity Tests are published to the test team.
Output: Sanity Test Results will be published to the <phase> Test Team.

Question: What if the release doesn't meet the agreed acceptance criteria?
Action: The release will be backed out and an email sent to the solution provider, Test Manager, MOSL Test Lead and business PMs, rejecting the release and providing details of which acceptance criteria have not been met.
Action: The solution provider will correct the situation by re-releasing the package with the missing acceptance criteria met.
Benefit: The quality of the releases from solution providers will be kept at the agreed levels.
Output: Email sent to the solution provider, Test Teams, MOSL Test Lead and PMs, rejecting the release with the reasons why, and the action required to obtain acceptance.

Question: What is the progress of activity going on in the test phase?
Action: The <Phase> Test Manager will produce a weekly progress report and issue it to the MOSL Test Lead.
Benefit: The current status of all testing is known by stakeholders.
Output: Weekly <phase> Test Progress Report.
Action: The <Phase> Test Manager will update the MS Project Plan with progress to date.
Benefit: The impact of delays on the Test Phase Plan, from upstream or downstream activity, can be clearly seen and reported.
Output: MS Project Plan of the Test Phase activity.
Action: The MOSL Test Lead will compile a summary test phase plan from the Test Leads' detailed Test Activity Plans.
Benefit: The impact of delays on the Test Phase Plan, from upstream or downstream activity, can be clearly seen, reported and managed.
Output: MS Project Summary Plan of the Test Phase activity, for Project Managers' meetings.

Question: What needs to be done today?
Action: Daily get-together of the MOSL Test Lead and Test Managers to explain what was delivered yesterday and what is planned to be delivered today. This is the time to highlight any perceived risks to deliverables.
Benefit: A short meeting to exchange information, enabling the MOSL Test Lead to prioritise strategic requirements, and the Test Managers to understand what needs to be delivered and to escalate risks and issues.
Output: Daily Meeting.
Action: The Test Manager issues an email to the Test Team summarising what was delivered yesterday, what needs to be delivered today, identified risks and issues, and what is being done to manage them.
Benefit: The Test Team share success and have clearly defined goals, with a method of knowing when they have met their goals, plus sight of the issues and risks that have been escalated and what is being done to resolve them.
Output: Email from Test Manager to Test Team.

Question: What is the current state of test execution?
Action: Produce a spreadsheet showing a graph of the following total counts by date: planned tests to be executed, actual tests executed, passed tests, total defects opened, total defects closed, total open defects by type, total open defects by priority, total open defects by owner, and total open defects by age.
Benefit: A summary page answering the following additional questions: Are we on target to finish on time? What is the quality of the system under test? Are we finding many defects? Are we correcting defects fast enough? What kind of defects are we finding most of?
Output: Excel spreadsheet distributed to project stakeholders and solution providers.

Question: What is the current state of the Project activities?
Action: Weekly get-together between the IS Programme and Project Managers to discuss progress against plan, risks that will impact progress, issues that are impacting progress, project issues, and AOB.
Benefit: Possible delays, issues and risks are communicated quickly. The impact of any delay is communicated and understood. Success is broadcast amongst the different work silos, so that as a team we can celebrate success.
Output: Weekly Project Management Meeting.

Question: What defects are open, who has them and what is the priority for fixing them?
Action: Produce a Defect Summary Report containing the outstanding defects, by Allocated To, Priority and Severity.
Benefit: Regular discussions on defects ensure that none are left open for too long and that all are always visible.
Output: HP ALM - Defect Summary Report.
Action: Hold a meeting between the solution providers and the phase Test Lead to discuss the quality of the defects raised, any additional information required, the order in which defects are required to be fixed and the delivery plan, and the quality of responses from solution providers (code & resolution details).
Benefit: The solution provider and the test teams are working to the same schedules and priorities.

Question: How do we handle defects that don't specify what the problem is, why and how to recreate it, or how it was fixed?
Action: Solution providers will reject defects and issue them to the Test Manager with a status of 'Rejected'.
Action: The Test Manager will reject 'fixed' defects that do not have understandable resolution details added by the solution provider.
Action: The MOSL Test Lead will discuss rejected defects with solution providers and ensure additional data is provided and the quality of the defects is improved.
Action: The MOSL Test Lead will monitor the numbers and types of 'Rejected' defects and discuss how to reduce these with the Test Managers and the solution providers.
Benefit: The quality of defects will be high, enabling defect investigation and resolution to be achieved more quickly by the solution provider.
Output: HP ALM - Web Defects.

Question: How do we escalate issues and risks?
Action: The MOSL Test Lead and Test Managers will meet each morning for 10 minutes maximum to agree the priorities for the day and communicate progress, issues or risks that need to be managed by the MOSL Test Lead.
Action: The MOSL Test Lead will escalate via the Weekly Project Management Meetings.
Benefit: Risks and issues can be communicated and managed effectively.
Output: Project Risk and Issue Register.

Question: Is the test strategy working?
Action: The MOSL Test Lead and <Phase> Test Managers will meet monthly to discuss implementation of the Test Strategy, test activities, progress to date, the project plan, lessons learnt, tips and techniques, and issues and risks.
Benefit: Any problems with the test strategy can be identified and communicated quickly, so that improvements can be implemented before problems occur.
Output: No reporting output is necessary.

Question: Have we finished testing?
Action: <Phase> Test Managers will produce a Test Phase Results report, detailing what was tested, what wasn't, where and how it was tested, the results of every test, and the status of all defects found.
Benefit: A documented summary of what has happened during a test phase, which is an indication that the test phase is ready for the Quality Gate Management process to start.
Output: <Phase> Test Report template.
Action: Execute a Quality Gate Management process that reviews the following: evidence of what has been tested, by whom and when; that the phase entry criteria have been met; that the phase exit criteria have been met; and that the outstanding defects have been reviewed and agreement obtained that we can proceed to the next phase of the project.
Benefit: The project will have confidence that the required testing has been completed satisfactorily. Resources can be released. There is a central list of outstanding defects, the Known Issues List (KIL), and activity to clear the KIL can be planned.
Output: Quality Gate Checklist, with objectives, actions, acceptance criteria to be met, and the evidence required as proof; Quality Gate Checklist approved by the Gate Keepers.
Action: Meetings held prior to a gate delivery, to ensure that the actual Quality Gate Meeting results in approval to proceed.
Benefit: Everyone knows what they need to do to achieve a successful gate, and everyone can help in delivering the gate. There are no surprises that can impact the current and downstream test teams.

Question: What did we learn during this phase of testing?
Action: Hold a Post Test Phase Execution Review Meeting and produce a report of the findings.
Benefit: A document is available that details the activities that worked well, the things that need to be looked at and improved, and the problems encountered and how we dealt with them.
Output: Lessons Learnt template and review meeting.
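As a simple illustration of how the daily execution and defect counts listed under "What is the current state of test execution?" above could be derived from exported records, a sketch follows. The record layout is hypothetical; in practice these figures would come from HP ALM.

```python
from collections import Counter

# Hypothetical export records; real data would come from HP ALM.
tests = [
    {"date": "2016-02-02", "status": "Passed"},
    {"date": "2016-02-02", "status": "Failed"},
]
defects = [
    {"status": "Open",   "type": "Functional", "priority": "2 Major", "owner": "CGI"},
    {"status": "Closed", "type": "Data",       "priority": "3 Minor", "owner": "MOSL"},
]

executed = len(tests)
passed = sum(t["status"] == "Passed" for t in tests)
open_defects = [d for d in defects if d["status"] == "Open"]

print(f"Tests executed: {executed}, passed: {passed}")
print(f"Defects opened: {len(defects)}, closed: {len(defects) - len(open_defects)}")
for dimension in ("type", "priority", "owner"):
    print(f"Open defects by {dimension}:", dict(Counter(d[dimension] for d in open_defects)))
```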
10 Defect Management
The main purpose of testing is to find defects before the system users do. A defect is defined as a discrepancy between what was expected to happen and what actually happened.

The solution provider has its own defect tracking system; to enable efficient tracking, HP ALM will be used as the repository for all testing.

This section is a supplementary guide to specify the use of Priority and Severity values and to document SLAs around defect turnaround. The <Phase> Test Managers will be expected to take on the role and responsibility of Defect Manager for their defects within this programme of work.
10.1 General Principles
A summary of the main points around defect handling is as follows:

• Each defect will have an owner (Assigned To) and a Status.
• HP ALM will enable amendments to each defect to be traced and audited, and permissible activities to be controlled.
• The solution provider will nominate a single point of contact to receive defects, and supply contact details.
• The solution provider and MOSL Test Lead will agree a triage team to review defects, agree priority and severity, and confirm that the discrepancy requires a fix.
10.2 Defect Priority & Severity
To avoid confusion between solution providers and test teams, only the Priority value will be used on defects.

Priority 1 (Critical): A problem has occurred that has rendered further testing impossible, either at a functional or test script level, because it:
• Has no workaround
• Affects all users
• Affects all system usage activities
• Could cause significant loss of revenue
• Will cause interruption of a major process
• Is perceivable by the business, resulting in almost certain adverse reaction

Priority 2 (Major): Unable to complete testing of a test script, but testing can continue on other tests; the test script will need to be completely re-tested. A problem that is not easily recoverable without significant manual effort:
• A workaround would involve a high level of additional user effort
• Creates significant operational risk
• Affects most users and most system usage activities
• Data corruption (recoverable)
• Is perceivable by the business, which may result in some reaction

Priority 3 (Minor): Able to complete the test script but with significant non-compliance with expected results; may not require a complete re-test of the test script. A problem with a business impact:
• A workaround would involve a moderate level of additional user effort
• Creates a moderate increase in the level of operational risk
• Affects a significant minority of users and a significant minority of system usage activities
• Can be recovered from at a later stage without impacting operational efficiency
• May be required to be fixed prior to exiting the current test phase

Priority 4 (Minor): Able to complete the test script but with minor non-compliance with expected results; will not require a complete re-test of the test thread. Minor non-compliance is a defect which does not impact the functionality and which an explanation would temporarily resolve. A minor problem which:
• Has a workaround involving little additional user effort
• Carries little operational risk
• Affects a small number of users and a small number of system usage activities
• Does not impact functionality of the system or cause serious confusion to the user
• Is not required for delivery prior to exiting the current test phase
10.3 Defect turnaround SLAs
The following SLAs are in place with the solution provider:

• Emergency ('Immediate'): Service Provider to respond within a day; work to start immediately; resolution to be reached as soon as possible.
• High ('Urgent'): Service Provider to respond within 2 days; work to be scheduled for as soon as practicable; resolution most likely to be required before the end of the test cycle within which the defect was identified.
• Medium/Low ('Routine'): Service Provider to respond within 5 days; resolution to be within an agreed timetable.
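As an illustration of how these SLAs could be encoded for automated reporting, a minimal sketch follows. The priority names and day counts come from the table above; the types and function are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DefectSLA:
    respond_within_days: int
    resolution: str

# Response targets from the SLA table above; field names are illustrative.
SLAS = {
    "Emergency":  DefectSLA(1, "Work starts immediately; resolve as soon as possible"),
    "High":       DefectSLA(2, "Schedule as soon as practicable; resolve within the test cycle"),
    "Medium/Low": DefectSLA(5, "Resolution within an agreed timetable"),
}

def response_due_days(priority: str) -> int:
    """Days within which the Service Provider must respond to a defect."""
    return SLAS[priority].respond_within_days

print(response_due_days("High"))  # -> 2
```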
10.4 Triage process
Defects are raised using a combination of Priority and Severity values, which when viewed in combination define the SLAs for the service provider's initial response and fix expectations. The purpose of the triage process is to identify the 'owner' of the defect, thereby reducing the time to resolution.

To ensure this process is conducted smoothly, the Defect Triage Team will review the list of all defects with a Status of "New" for the purpose of:

• Ensuring that the correct Priority and Severity levels are set
• Ensuring that there is sufficient and correct information for the incident to progress with investigation and fix
• Ensuring the defect is assigned to the correct team for investigation
• Identifying changes as opposed to defects
Given the geographical distance between the test teams and solution providers, defect triage will occur daily via a conference call. Items identified as changes rather than defects will be closed in the defect log and recorded on the project Change Request log.
10.5 Open defect review at end of test phases
As part of the quality gate process that enables test phases to close, a review of open defects should occur involving the project stakeholders, the project and Market Services, and the solution provider, to agree which defects can be postponed. This postponement may involve downgrading Priority and Severity to Low & Cosmetic.
10.6 Defect Management Process Flows
The following diagrams represent the Defect Management processes for both internal and external defects, as agreed with CGI.

Internal defect management process flow

External defect management process flow
11 Test Governance
The provision of test governance is a requirement to support effective testing within the Project and
the delivery lifecycle. Test governance is an essential aspect of testing and ensures a complete,
visible and transparent testing process is maintained.
Test governance covers the following aspects:

• Quality Gates
• Audit Trails
• Best Practice
• Resources
• Escalation
• Process Improvement
• Reporting
• Quality
11.1 Entry & Exit Criteria Process (Quality Gates)
To ensure that all phases of testing are completed satisfactorily before moving on to the next, a quality gate review process will be implemented. The process will consist of the following steps:

• Agree Quality Gate Acceptance Criteria
• Compile Evidence
• Prepare for Gate Review
• Gate Review
• Test Phase Complete (with Remedial Activity where the gate criteria are not met)

At the end of each test phase a test completion report (End of Test Report, EoTR) is mandatory to support the quality gate review process.
11.1.1 Agree Entry & Exit (quality gate) acceptance criteria
The MOSL Test Lead will agree the quality gate acceptance criteria with each phase test manager,
to measure the successful completion of that test phase.
For example:
Unit Testing
  Verification Method:
  • Entry and exit criteria met
  • No defects outstanding that require fixing before the next test phase
  Evidence:
  • Report from a test coverage tool indicating all code branches have been passed through during unit & unit integration testing
  • Report from the test coverage tool, or email from the Development Manager of the solution provider, confirming that the code has been satisfactorily tested
  • Known Issues List circulated to stakeholders showing the current open defect list, with supporting email evidence of why no fix is currently required

Systems & Integration Testing
  Verification Method:
  • Entry and exit criteria met
  • All required tests executed successfully at least once, to the level required by the risk priority
  • All requirements mapped to test scripts
  • Physical evidence of successful tests audited to confirm that the reported result is the same as the expected result
  • No defects outstanding that require fixing before the next test phase
  Evidence:
  • Email from the phase Test Manager stating that the test phase has completed successfully
  • Test coverage matrix showing requirements mapped to test scripts with execution status
  • Defect report showing open defects, with a comment stating why a fix is not required

Non-Functional Testing (Failover & Recovery, Performance Test, Security & Penetration Test)
  Verification Method:
  • Entry and exit criteria met
  • All non-functional requirements in scope mapped to test scripts
  • All required tests executed successfully at least once, to the level required by the risk priority
  • Physical evidence of successful tests audited to confirm that the reported result is the same as the expected result
  • No defects outstanding that require fixing before the next test phase
  Evidence:
  • Email from the phase Test Manager stating that the test phase has completed successfully
  • Test coverage matrix showing requirements mapped to test scripts with execution status
  • Defect report showing open defects, with a comment stating why a fix is not required

User Acceptance Testing
  Verification Method:
  • All business requirements in scope mapped to test scripts
  • All required tests executed successfully at least once, to the level required by the risk priority
  • Physical evidence of successful tests audited to confirm that the reported result is the same as the expected result
  • No defects outstanding that require fixing before the next test phase
  Evidence:
  • Email from the phase Test Manager stating that the test phase has completed successfully
  • Test coverage matrix showing requirements mapped to test scripts with execution status
  • Defect report showing open defects, with a comment stating why a fix is not required

Service Readiness Testing
  Verification Method:
  • All acceptance requirements in scope mapped to test scripts
  • All required tests executed successfully at least once
  • Physical evidence of successful tests audited to confirm that the reported result is the same as the expected result
  • No defects outstanding that require fixing before the next test phase
  Evidence:
  • Email from the phase Test Manager stating that the test phase has completed successfully
  • Test coverage matrix showing requirements mapped to test scripts with execution status
  • Defect report showing open defects, with a comment stating why a fix is not required

Market & Company Readiness Testing
  Verification Method:
  • All business processes tested to ensure that the Market Participant is market ready
  • Agreement from all stakeholders that the process of market & company readiness has been verified
  • Successful interfacing to a test environment
  Evidence:
  • Emails from all stakeholders stating successful tests
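A minimal sketch of the evidence check that the criteria above imply: a gate passes only when every criterion for the phase carries evidence. The criterion records and field names are illustrative; the real checklist lives in the Quality Gate Checklist document, not in code.

```python
# Illustrative criterion records for one phase; not the project's actual checklist.
criteria = [
    {"phase": "SIT", "criterion": "All requirements mapped to test scripts",
     "evidence": "HP ALM coverage matrix, 19/05/16"},
    {"phase": "SIT", "criterion": "No defects outstanding that require fixing",
     "evidence": ""},  # no evidence yet -> remedial activity required
]

def gate_passes(phase: str) -> bool:
    """A gate passes only when every criterion for the phase carries evidence."""
    missing = [c["criterion"] for c in criteria
               if c["phase"] == phase and not c["evidence"]]
    for criterion in missing:
        print(f"{phase}: remedial activity required - no evidence for '{criterion}'")
    return not missing

print("Gate passed" if gate_passes("SIT") else "Gate not passed")
```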
11.1.2 Compile evidence
Evidence to prove that a test phase has been successfully completed will be required to ensure
quality gate approval. It is essential that evidence is collected throughout the test phase, and not
just during test execution.
11.1.3 Prepare for gate review
Before any test phase can be considered complete, a quality gate review will be held. The review preparation is the responsibility of the phase Test Manager and should consist of, but not be restricted to, the following tasks:

• Complete the test phase completion form and publish it to the stakeholders who will be required to attend the gate review
• Book a meeting room and time with stakeholders for the review
• Ensure that all required evidence is available for review
11.1.4 Gate review
Each gate review will require a "governance body" to review the evidence provided by the phase Test Manager. That group may differ from one test phase to another, but should have a core team made up of:

• The design authority, systems architect and/or Market Services analysts
• The Project Manager (IS, business, or both)
• The phase Test Manager, to present evidence that all acceptance criteria have been met, and
• The MOSL Test Lead, to act as a subject matter expert on testing and to confirm that the acceptance criteria have been met.

At each Quality Gate review, the phase Test Manager should present a status report against the criteria to the appropriate "governance body". It is as important to discuss the criteria which have not been fully met, and the associated risks to the subsequent test phases, as it is to agree on what has been delivered. A Quality Gate Exception Report template can be used to highlight the exceptions.

The "governance body" must be satisfied that the evidence provided is appropriate to confirm that the test phase has met its purpose.

Should any acceptance criteria lack evidence of successful completion, for example open defects, tests with a status of "FAILED", or insufficient physical evidence of completion, remedial activity may be required to resolve the issue.
11.1.5 Test phase complete
Where the "governance body" agrees that a test phase is complete, the <Phase> Test Manager will issue an email stating that the test phase is complete, attaching the approved Test Completion Report. All evidence and test assets should be archived to enable review at a future date should the need arise.

The Test Managers will define and own the entry and exit criteria for each phase in the testing lifecycle as the basis for the Quality Gates. At the beginning of each phase of testing, the Test Managers will be responsible for ensuring that the Quality Gate process is followed in accordance with MOSL Testing Services Policies and Procedures.

The process is defined as establishing a meeting that has representatives from the present phase, the next phase, MOSL Market Services, technology and testing. The Quality Gate meeting will assess the exit criteria from the present phase and the entry criteria for the next phase. Based on these criteria, a judgement will be made as to whether to proceed to the next phase.

If approval to move to the next phase is not received, the Test Managers will ascertain where the main risks and issues exist and facilitate the steps required to resolve the main issues before the next approval meeting.

In the majority of circumstances, the decision to proceed or not will be reached as a consensus view by the meeting. However, the Test Managers can be used as a point of escalation for any unresolved issues, including any meeting participants or senior management attempting to override objective data found during testing.

In all circumstances, the artefacts that were used as evidence in the meeting, including the Quality Gate report (e.g. defect reports, End of Test Reports, project documents), will be made available by the Test Managers to the relevant stakeholders.

At this stage the Quality Gates are defined by the Entry and Exit criteria detailed in the Schedule 3 Testing & Acceptance document and the CGI contract. Entry & Exit criteria for each phase are detailed in the Test Phases section of this document.
11.2 Suspension & Resumption Criteria
Abnormal Test Termination: testing may be stopped or suspended if at least one of the following criteria is fulfilled:

• During the tests, requirements change, resulting in a need to re-plan testing from the beginning and reorient the testing
• An excessive number of Change Requests impact testing schedules and force changes to test cases
• The installation of the software is not successful
• Severity 1 defects exist, making testing impossible
• A high volume of Severity 2 and 3 defects exists, such that meaningful testing is not possible
• The Project goes into exception
• Test resources (hardware or human) are not available at the planned times
• Additional scope is agreed, increasing the number of scripts to be executed
• The test environment is not available; all testing stops
• Application down-time prevents progress through a group of scripts
• Global and/or local infrastructure issues prevent progress through a group of scripts
• Data correctness issues prevent progress through a group of scripts
• Test environment and/or application configuration issues prevent progress through a group of scripts
• Lack of training and/or knowledge to plan, prepare and execute testing
• Testing tools and licences are unavailable
• The occurrence of a series of major deficiencies indicating that the build has been unsuccessful or incomplete, contains incorrect or broken versions of software, or that insufficient unit, system or integration testing has taken place
• Lack of availability of experienced personnel to execute or support the tests
• Lack of availability of experienced personnel to correct system defects
• Presence of Critical (show-stopper) defects (environmental or application), or unexpected data and/or database corruptions, that by default render further testing impossible
• Loss of support for the testing environment or for significant features of the environment
• Unstable environment
• A high occurrence of defects being raised, the volumes being such that the time allocated for daily re-testing is insufficient or the ability to progress with testing at the planned execution rate is impeded
• Lack of third-party and/or associate support and interaction throughout their delivery process, period and obligations
13 Appendix A – References
[This section lists all project-specific documents that support testing for the CMOS Project. TBC]
Ref – Description – Reference Number – Version – Date

1. Master Test Strategy (this document) – MTS – 2.2 – 6/1/2016
2. CGI Test Strategy – SA_T_Test Strategy – v1 0 4 0 – 17/11/2015
3. Detailed UAT Test Plan – MOSL Phase Test Plan – 0.3 – 8/1/2016
4. Detailed SIT Test Plan – SA_T_Detailed Test Plan - Phase 1 – v1 0 1 0 – 23/11/2015
5. Bridgeall Master Test Plan – Bridgeall Master Test Plan – V0.3 – 8/1/2016
6. Market & Company Readiness Plan – Market & Company Readiness – Final – 17/12/2015
7. Test & Acceptance – Schedule 03 – Final – 25/9/2015
8. Method Statements Part 1 – Final – 24/9/2015
9. Method Statements Part 2 – Final – 24/9/2015
10. Data Strategy – TBC – TBC – TBC
11. Environment & Release Strategy – TBC – TBC – TBC
12. Various codes documents (found on the OpenWater website) – CSD0001, CSD0007, CSD0301, CSD0401, CSD0402, CSD0403, CSD0404, CSD0405, CSD0406
14 Appendix B – General Principles of Testing
14.1 Stages of The Test Approach
The different stages of the testing approach within the MOSL CMOS Project are as follows:

• Understand the Project drivers
• Understand the Project landscape and footprint
• Define the Master Test Strategy framework (this document)
  o The main objective of this stage is to define, build, organise, manage and schedule the Test Project. The Master Test Strategy is the basis on which the detailed Level Test Plans are shaped and derived.
  o The detailed Level Test Plans identify the tasks; the transactions, Market scenarios and Market rules to be automated or manually executed; the contingency plans for the identified risks; the schedule; roles and responsibilities; and the deliverables from the MOSL CMOS testing team. The infrastructure required to carry out the tests is identified, the detailed analysis is carried out, the test artefacts are created, the teams and relationships are built, the test data and interfaces are defined, a detailed schedule of events is planned, and specifics of the what, where, when, who and how are derived and documented. All of this is broken down into a detailed set of documents that are walked through with key stakeholders before agreement and approval is achieved, in accordance with MOSL Testing Services policies and procedures.
  o The detailed Level Test Plans will make up the overall framework for the Test Project and feed into this document.
• Understand MOSL's Operating Model
• Carry out analysis workshops with the various MOSL Team Leads and Market Participant representatives to:
  o Understand the 'make-up' of the Market Operation, and how the various Market Participants will interact
  o Understand the Market Operation processes & procedures
  o Understand and analyse the Functional and Technical Requirements documentation
  o Understand and analyse the Market Operation Rules documentation for all impacted areas of the CMOS solution
  o Understand and analyse the Detailed Design documentation
• Start Static Analysis Testing
• Start to build the required test teams and engage with Market Participant volunteers who wish to support the SIT & UAT test phases
• Build the detailed Level Test Plans to feed into the Master Test Strategy
• Further develop the fully integrated and detailed Level Test Plans
• Understand the Integrated Project Implementation Plan and derive the necessary inputs and outputs through those process documents and procedures
• Conduct a risk assessment on the new CMOS solution to feed into test case design, definition and prioritisation
• Identify test data requirements, both static and transactional
• Start the Static Analysis Testing process (details below)
• Conduct analysis on test artefacts from previous projects (this relates to Bridgeall's existing test pack for the 'Settlements' & 'Tariffs' modules) and how they may or may not be re-used. This feeds into Test Scenario, Script & Case design and requirements
• Identify the external applications with which the applications under test communicate. Work with a limited number of Market Participants (2 x retailers / 2 x wholesalers) to connect and test integration with real-world Market Participant applications
• Define the high-level Test Scenarios that need to be tested in each module
  o At this stage the high-level scenarios for each module are identified and a methodology to test the scenarios is planned. All real-world Market Participant scenarios will be understood and agreed at this point.
• Analyse and list the transactions that need to be automated (may not be relevant) and those to be executed manually
• Define the tools required for testing
• Identify the tool that will be used for automation. In this case it is anticipated to be Bridgeall's SmartTest framework for the 'Settlements' & 'Tariff' modules. CGI have not indicated whether or how they will automate their SIT cases for functional and regression testing purposes; however, CGI will NOT be using Bridgeall's SmartTest tool
• Identify risks, and ensure dependencies are understood and communicated to the relevant stakeholders
• Identify the correct Market Participant data requirements to support testing, both transactional and static
• Estimate the effort
• Estimate the resource requirements
• Consolidate test artefacts in HP ALM
• Plan and prepare test execution
• Agree and plan defect management & resolution processes
14.2 Solution Provider Approach (CGI, CMOS)
It is understood that CGI, as Solution Provider, has been chosen to implement the Central Market Operator System (CMOS) in the newly forming English non-household water market.

CGI's testing approach can be found in the CGI Test Strategy (SA_T_Test Strategy v1 0 4 0).
15 Appendix C - Test Case Design
15.1 Design & Document Test Assets
The objective of this phase is to create and design the test assets for the MOSL CMOS Project with the requisite level of detail for the Execution Phase, in respect of Scenarios, Cases, Scripts and detailed plans.

The diagram below is an aid to discussion around the validation of requirements and design documentation. It should be noted that Load and Performance Testing can span any of the standard test phases; however, it is recommended to start it as early as possible, during System Testing. It must also be noted that Test Case Design is driven by automation, due to its complexities, and follows a bottom-up approach.

The information below starts by detailing the manual process and concludes with the Automation and Load & Performance test case design and structure.
15.2 Benefits of Identifying Test Scenarios and Use Cases Early
Engaging early on the MOSL CMOS Project and identifying Test Scenarios and Use Cases early has a number of significant benefits:

• Coverage: by identifying the number of Test Scenarios we can start to determine coverage
• Prioritisation: having identified the Test Scenarios we can start prioritising them
• Granularity of steps: having identified and prioritised the Test Scenarios we can decide on the level of detail required in the steps for each Test Case
• Metrics: having determined the number of Test Scenarios needed we can use this figure to generate metrics – coverage measurements, and the number of Test Scenarios completed vs. the number still to be done
• Planning / estimation: by identifying the number of Test Scenarios needed we are able to feed this information back into the project estimation

The number of Test Scenarios identified is a best-endeavours estimate at the start of the process. As the Project develops, and as more information is understood relating to the system under test (CMOS), the number of Test Scenarios required steadily becomes less of a guess and more an accurate figure.

A Test Scenario describes interactions between actors, including users and systems, which produce a result of value to a system user or SME.
Each Test Scenario has preconditions, which need to be met for the scenario to work successfully. Each Test Scenario terminates with post-conditions, which are the observable results and final state of the system after the Test Scenario has been completed. A Test Scenario usually has a most likely path, and sometimes alternative branches.

Test Scenarios describe the "process flows" through a system based on its actual likely use, so test cases derived from Test Scenarios are most useful in uncovering defects in the process flows during 'real-world' use of the system.

Test Scenarios are very useful for designing User Acceptance Test (UAT) cases with user participation, which will be the case for the MOSL CMOS Project. They also help uncover integration defects caused by the interaction and interference of different components, which individual component testing would not always detect, thus reducing the cost of defect resolution and increasing early error detection in the design of test artefacts.
15.3 Test design specification (from IEEE 829)

During Test Case Design, the MOSL CMOS Project will conform to IEEE 829; the excerpt below is for information only:

• ID: unique ID.
• Features to be tested: identify the test items and describe the features and combinations of features that are the object of this test specification. For each feature or feature combination, a reference to its associated requirements in the item requirement specification or design specification should be included.
• Approach refinements: specify the refinements to the approach described in the test plan, including the test techniques to be used; the method of analysing results should be identified. Specify the results of any analysis that provides a rationale for test case selection; for example, one might specify conditions that permit a determination of error tolerance (e.g. those conditions that distinguish valid inputs from invalid inputs). Summarise the common attributes of any test cases: this may include input constraints that must be true for every input in the set of associated test cases, any shared environment needs, any special procedural requirements and any shared case dependencies.
• Test identification: list the identifier and a brief description of each test case associated with this design. A particular test case may be identified in more than one test design specification. List the identifier and a brief description of each procedure associated with this test design specification.
• Pass / fail criteria: specify the criteria to be used to determine whether the feature or feature combination has passed or failed.
15.4 Test Case Design Techniques Being Employed
Information for the reader: the test case design techniques below are also testing techniques. They appear in this section because they are crucial during Test Case Design, before being taken forward into Test Preparation and Test Execution during the System and System Integration test phases.
15.4.1 Equivalence Partitioning
Equivalence Partitioning is a method of reducing significant volumes of tests, particularly at Test Scenario level, where either the input or the output is treated in essentially the same way and there is no need to execute thousands of tests only to obtain the same results.

For example, if you are testing a drop-down box which accepts a hard-coded number range from 1 to 1000, there is no value in writing a thousand test cases for all manner of combinations of valid input numbers, plus other Test Scenarios for invalid data, if the outcome is exactly the same. This is where common sense prevails and the '80/20' coverage rule will apply.

Equivalence Partitioning splits inputs and outputs into sets, where each value within a set will be treated the same. Therefore, one value from each set is tested, that value being representative of all values within the set.

In this method the input domain data is divided into different equivalence data classes which provide sufficient coverage. The method is typically used to reduce the total number of Test Scenarios to a finite and manageable set of testable Test Scenarios, while still covering maximum requirements. In short, it is the process of taking all possible Test Scenarios and subsequent artefacts and placing them into classes; one test value is picked from each class while testing.
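A minimal sketch of equivalence partitioning applied to the 1-to-1000 drop-down example above; the partition names and the stubbed accepts() function are illustrative, not part of the CMOS solution.

```python
# Equivalence partitions for an input accepting integers 1..1000, as in the
# drop-down example above: one representative value per class is enough.
partitions = {
    "valid (1..1000)":      (500, True),
    "invalid, below range": (0, False),
    "invalid, above range": (1001, False),
    "invalid, non-numeric": ("abc", False),
}

def accepts(value) -> bool:
    """Stub for the system under test: accepts integers 1 to 1000 inclusive."""
    return isinstance(value, int) and 1 <= value <= 1000

for name, (value, expected) in partitions.items():
    assert accepts(value) == expected, f"partition '{name}' failed for {value!r}"
print("one representative tested per partition - all behaved as expected")
```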
15.4.2 Boundary Value Analysis
Boundary Value Analysis tests the boundaries of an input or output. The MOSL CMOS Project Test Team will test the boundary itself, one unit above the boundary and one unit below the boundary.

It is widely recognised that input values at the extreme ends of the input domain cause more errors, and that more application errors occur at the boundaries of the input domain. The Boundary Value Analysis technique will therefore be used to identify errors at the boundaries rather than finding those that exist in the centre of the input domain. As Boundary Value Analysis is the natural follow-on from Equivalence Partitioning, the MOSL CMOS Project Team will use this technique when designing at Test Case and Test Script level.
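A minimal sketch of boundary value analysis for the same 1-to-1000 range: the boundary itself, one unit below and one unit above, with a stubbed system under test.

```python
# Boundary value analysis for the 1..1000 range: for each boundary,
# test the boundary itself, one unit below and one unit above.
boundaries = [1, 1000]
cases = sorted({b + delta for b in boundaries for delta in (-1, 0, 1)})
# cases -> [0, 1, 2, 999, 1000, 1001]

def accepts(value: int) -> bool:
    """Stub for the system under test."""
    return 1 <= value <= 1000

for value in cases:
    print(value, "accepted" if accepts(value) else "rejected")
```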
15.4.3 State Transition
In State Transition test design, the Test Analyst will create a model of the system, identifying all the various states of the system or of an item within the system (such as a variable). As there can be hundreds or thousands of these, the Test Analyst and their Lead will have decided on the level of coverage, and the modules where this technique will be applied, before designing the test artefacts.

Test cases will be created to test the transitions between states. The model defines, for each state transition:

• The start state
• The input
• The output
• The finish state

Process flowcharts will be used to create Test Scenarios that follow the Market flow and test scenarios through the flowchart. The Test Analyst will create Test Scenarios that exercise all parts of the agreed flowchart and/or each outcome of a decision.
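A minimal sketch of deriving test cases from a state-transition model; the states and events are illustrative and not taken from the CMOS design.

```python
# A small state-transition model: each (start state, input event) pair maps to
# an expected finish state, and each transition yields one test case.
transitions = {
    ("Draft",     "submit"):  "Submitted",
    ("Submitted", "approve"): "Accepted",
    ("Submitted", "reject"):  "Rejected",
    ("Rejected",  "amend"):   "Draft",
}

for case_no, ((start, event), finish) in enumerate(transitions.items(), start=1):
    print(f"Test case {case_no}: given state '{start}', "
          f"on input '{event}', expect finish state '{finish}'")
```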
15.4.4 Decision Tables
Decision table testing is a method for representing, in a decision table, the logical conditions that define a system's behaviour. Typically, decision table testing is used in situations where there are complex Market rules; it is used to record those rules and to assist with identifying test cases (for both negative and positive combinations of rules).

Decision tables provide a clear and coherent analysis of complex logical combinations and relationships. The method uses simple two-dimensional tables to concisely describe logical relationships between program variables.

Because of the large amount of settlement and tariff testing through CMOS interaction and interfaces with Market Participant applications, it is expected that the CGI Test Manager, in partnership with Bridgeall, will define the Decision Tables, along with the key areas of the MOSL Market Operation, for the crucial role of settlement in delivery of the non-household water market.
It should be noted that this is one of the most time-consuming test case design methods that will be employed on the MOSL CMOS Project, but it is likely to be one of the most critical, and it must not be underestimated in terms of the time required to plan and prepare, the complexity of the design, and the execution. The execution of decision tables from a testing perspective is, from experience, best automated rather than executed manually, due to time and complexity.

The benefit of this technique is that it is effective for both data-rich and function-rich applications, and it is also a last-resort technique that can be used when all else fails. Decision table testing is also relatively easy to understand, relatively easy to explain and, being a systematic technique with a deliverable, gives something that can be checked and reviewed.
A challenge of decision table testing is that decision tables can quickly become very large, and the main problem this causes is that they then become difficult to maintain, check and review; this is a challenge that will have to be closely managed.

Finally, it is without question the MOSL Market Services team, in collaboration with the CGI and Bridgeall delivery and test teams, who are responsible for defining and providing these rules, their conditions and the definition of their outputs.
Decision tables consist of four quadrants:

• Condition stub
• Condition entry
• Action stub
• Action entry
Condition Stub

The condition stub is where the conditions are listed. For limited-entry decision tables, conditions will be worded as questions to which the answer is either yes or no (generally preferred), or true or false. For a limited-entry decision table with complex rules, two or more rules will be combined and a yes/no (or true/false) test will be designed. For extended-entry decision tables, conditions are worded such that there can be more than a yes or no, or a true or false: there can be any number of answers beyond just two.
Condition Entry

The condition entry is where all the combinations of conditions are recorded as rules. These are the Market rules as defined by MOSL's Market Services. As mentioned before, the number of rules can be reduced by turning two simple rules into one complex rule and by using Equivalence Partitioning techniques; this is explained below.
Action Stub
The action stub contains the list of possible actions that are a result of the combinations of
conditions; there can be any number of these.
Action Entry
The action entry defines the combination of actions that result from each rule. Usually, when an action is taken as a result of a rule, an “X” is marked in the table. Marking an “X” in the action entry gives no indication of sequence; sometimes, therefore, actions are marked with numbers to indicate the sequence in which they must be taken. This will be decided once the rules and results have been defined and those being tested agreed.
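To illustrate the four quadrants, the following limited entry decision table uses a deliberately simplified, hypothetical meter read validation rule; it is not taken from the Market codes, and the real tables will be defined as described above. The conditions form the condition stub, the Y/N entries under each rule column R1–R4 are the condition entries, the actions form the action stub, and the “X” marks are the action entries:

Condition | R1 | R2 | R3 | R4
Read within expected tolerance? | Y | Y | N | N
Supply point registered to submitting Retailer? | Y | N | Y | N
Action | | | |
Accept the read | X | | |
Reject the read and notify the Retailer | | X | | X
Request a re-read | | | X |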
Combining Simple Rules to Form Complex Rules
If two rules have the same actions and there is only one difference in the conditions that form them, the two rules can be combined to form a complex rule. This can be applied recursively, so two complex rules, or even a complex rule and a simple rule, can be further combined to form another complex rule.
Decision Table Testing
Each rule on the decision table must correspond to a Test Case. Therefore, if there are five rules identified in the decision table, there will be five Test Cases.
Coverage for Decision Table Testing
Like the classification tree method, it is possible to define more than one level of coverage. The simplest level of coverage is to test the decision table with simple rules reduced to as many complex rules as possible or as required; this MUST be agreed with MOSL Market Services. It may be decided that every combination of conditions will be tested; if that is the case, complex rules should not be used, or should be expanded back out to simple rules when Test Cases are identified, in order to make the process of definition and test more understandable and easier to plan, prepare and execute.
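To make the link between rules, complex rules and coverage concrete, the sketch below (Python, purely illustrative, reusing the hypothetical table above rather than any actual Market rule) shows a complex rule with a "don't care" entry being expanded back out to simple rules so that every combination of conditions yields a Test Case:

```python
from itertools import product

# Rules from the illustrative decision table above. "-" is a "don't care"
# entry: R2 and R4 share the same action and differ in only one
# condition, so they have been combined into one complex rule.
CONDITIONS = ["within_tolerance", "retailer_registered"]

RULES = [
    ({"within_tolerance": "Y", "retailer_registered": "Y"}, "accept read"),
    ({"within_tolerance": "N", "retailer_registered": "Y"}, "request re-read"),
    ({"within_tolerance": "-", "retailer_registered": "N"}, "reject and notify"),
]

def expand(conditions, action):
    """Expand a complex rule back out into its simple rules."""
    choices = [("Y", "N") if conditions[c] == "-" else (conditions[c],)
               for c in CONDITIONS]
    for combo in product(*choices):
        yield dict(zip(CONDITIONS, combo)), action

# Simplest agreed coverage: one Test Case per rule as written (3 here).
# Full coverage: expand complex rules into simple ones (4 here).
for i, (conds, action) in enumerate(
        (case for rule in RULES for case in expand(*rule)), start=1):
    print(f"TC{i}: {conds} -> expected: {action}")
```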
15.5 Create Manual Test Scenarios
The process is to define Test Scenarios for existing and proposed Functional and Non-Functional
requirements.
Creation of Test Scenarios is a testing activity that uses scenario tests, or simply scenarios, based on a hypothetical story (or Use Case) to help a tester think through a complex problem or system when building a test execution schedule.
A combination of the approved Requirement Acceptance Documents (RAD’s) and Functional
Design (FD) and Non-Functional Design (NFD) documents defines the CMOS Test Scenarios.
The ideal Test Scenario has four key characteristics: motivating, credible, complex, and easy to
evaluate.
These tests are different from Test Cases, in that Test Cases are variable tests to prove a Test
Scenario, whereas Test Scenarios cover a number of actions within a Market Process.
Test Scenarios are high level descriptions of an expected Test Set or Market Process based on
Market Process, Systemic Process and Manual flows derived from the RAD, FD & NFD.
In order to create the Test Scenarios, the Test Manager and their Analyst teams will work closely with the MOSL Market Services Analysts (BA's) and Market Participants where appropriate, using HP's Application Lifecycle Manager (ALM).
HP ALM will be the predominant tool for Manual Test Scenario and subsequent artefact design; it may also be used for Automation, which is covered in a later section. As previously mentioned, ALM will take its initial requirements feed from the approved RAD, FD and NFD documents. From here, ALM will be used to define the Test Scenarios, Cases and Test Scripts. ALM will be used to indicate Data set up requirements and validation, as well as Defect Management, Reporting and Metrics. Please refer to the MOSL ALM standards document for full information on the use of ALM, its policies and procedures.
15.6 Create Manual Test Cases
A Test Case is a set of conditions or variables under which a tester will determine whether an application or software system meets specifications. The mechanism for determining whether a software program or system has passed or failed such a test is known as a test oracle; the results themselves are held in a test repository, and in the case of the MOSL CMOS Project, ALM will be that repository.
It may take many Test Cases to prove a single Test Scenario in order to determine that a software program or system is functioning correctly. Test Cases must not be confused with Test Scripts.
At this level, Negative testing will be considered as per the MOSL Test Policy, as follows:
Negative test cases are designed to test software behaviour in ways it was not intended to be executed, and also verify exception handling routines in the application.
The following tests are considered for each project (a short sketch generating the numeric partition and boundary values follows this list):
o Defects are mapped back to the requirement indirectly
o Embedded single quote / special characters ( “ &$£ )
o Required data entry (including null values)
o Field type
o Field size
o Alpha / Numeric boundaries and limits:
- Middle of valid partition
- Middle of lower invalid partition
- Middle of upper invalid partition
- Lower boundary
- Lower boundary -1
- Lower boundary +1
- Upper boundary
- Upper boundary -1
- Upper boundary +1
- Nothing / 0
- Negative
- Non-digits
- Lower boundary of number of digits
o Date bounds
o Date validity
o Web sessions
o Navigation
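As an illustration of how the numeric partition and boundary entries above translate into concrete input values, the following sketch (Python; the field and its valid range are hypothetical) generates one candidate per checklist entry:

```python
def boundary_values(lower, upper):
    """Candidate inputs for a numeric field valid in [lower, upper]."""
    spread = max((upper - lower) // 2, 1)
    return {
        "middle_of_valid_partition": (lower + upper) // 2,
        "middle_of_lower_invalid_partition": lower - spread,
        "middle_of_upper_invalid_partition": upper + spread,
        "lower_boundary": lower,
        "lower_boundary_minus_1": lower - 1,
        "lower_boundary_plus_1": lower + 1,
        "upper_boundary": upper,
        "upper_boundary_minus_1": upper - 1,
        "upper_boundary_plus_1": upper + 1,
    }

# Non-numeric negative inputs from the checklist: nothing/null, zero,
# negative values, non-digits and special characters.
NEGATIVE_INPUTS = [None, "", 0, -1, "abc", "'", '"', "&$£"]

# Example: a hypothetical meter read field valid between 1 and 99999.
for name, value in boundary_values(1, 99999).items():
    print(f"{name}: {value}")
```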
15.7 Create Manual Test Scripts
A Test Script is a set of instructions that will be performed on the system under test (SUT) in order
to test that the solution functions as expected and prove the Test Case. These detailed step-by-
step accounts can be executed manually or automatically. Test Scripts are in essence, the very
detailed steps that must be performed in order to run the Test Case.
There are various means of executing Test Scripts, both manual and automated. To be clear, therefore, a Test Script is the step-by-step detailed process that proves whether the Test Case from which it is derived passes or fails.
15.8 Develop Test Schedules
The Test Execution schedule for each release will be produced by phase, broken down by module and system, followed by Scenario, Case and number of Scripts, and will be located within HP ALM. This is not to be confused with the Test Plan Schedule to be created in MS Project.
The Test Schedule will be designed in such a way that it follows logical Market Process Flows
followed by Transactional Flows which will inherently define the inputs and outputs along with the
static and transactional data requirements.
It must be noted at this point that Market Rules Testing, Boundary Testing and Equivalence Partitioning testing detail will be embedded at Test Script and Test Case level and NOT at Test Scenario level.
15.9 Test Data Requirements & Preparation
All test data requirements will be defined at a detailed level before, during and after the completion of all Test Scenarios and Scripts. It may be necessary to condition certain data requirements as part of the preparation exercise; this is very normal in solutions that require external application integration (as is the case with the Market Participants' applications and CMOS) and where spurious data may exist. It may not be possible to obtain all of the test data in the requisite format, therefore certain test data may need to be conditioned prior to use in execution.
IT MUST BE STATED ABSOLUTELY CLEARLY THAT THE MOSL ENGAGED DATA PARTNER ARE FULLY RESPONSIBLE FOR THE MANAGEMENT AND PROVISION OF THE DATA; HOWEVER, IT IS THE RESPONSIBILITY OF THE TESTING TEAMS TO DEFINE THIS DATA WITH SUPPORT FROM THE DATA PARTNER. THE TEST TEAMS WILL BE RESPONSIBLE FOR THE IDENTIFICATION OF BOTH TRANSACTIONAL AND STATIC TEST DATA.
Prior to the start of each cycle of testing, and post completion of the previous cycle, it may be necessary to perform a data refresh of the test data supplied to return it to its original state. It is imperative that the Data Partner own, supply, condition and load the data required for testing into the test environment(s), as it is the Data Partner, supported by the MOSL Data Lead and MOSL Market Services, that understands the data; this relates to both static and transactional data.
It is advised that for a project of this importance there MUST be a person or persons responsible for data management throughout the test lifecycle, as it cannot be the responsibility of testing to own, supply, condition and load the data required for testing. Testing's responsibility is to use the data as an enabler to prove or disprove the end-to-end solution through test execution.
Test data selection is an important part of Test Environment set up and ensures that the requester has made the correct analysis when determining the scenarios required for fully testing a system, solution or module. The variety of test data usually generates different expected results and should always be taken into consideration during set up, as not all requests will be standard. Data flows through a system, solution or module, as well as between systems, and so all must be considered for the MOSL CMOS solution.
The Data Lead, supported by the MOSL Testing Service and the CGI Environment Manager, will be responsible for loading the data into the Test Environments and will ensure that an environment Smoke Test is completed before each environment is handed over to the Test Manager for each testing phase. Smoke testing of the environment is key for the CMOS project, as timescales will be negatively impacted if environments are unavailable on the dates required.
Test Data set up requirements for testing any part of the solution must be considered at the start of any Test Environment set up procedure, and also where a specific Change Request for test data has been received. The information below is broken down into two main sections:
Data Security – ensuring customer sensitivity and data protection
Base Data – the state of the database(s) at the start of testing
15.9.1 Data Security
Ensure that the guidelines from MOSL Security documentation in respect of the use of live data during testing are conformed to.
As testing will be performed by third party associates on the MOSL CMOS Project, careful
consideration must be given to the data used and where necessary, client confidential data must
be changed.
This is especially important as offsite locations will be used for testing, and it is therefore imperative that data sanitisation is performed. Advice from the MOSL Security Department must be sought in order to clarify the process.
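As a purely illustrative sketch of the kind of sanitisation referred to above (Python; the field names are hypothetical, and the actual fields and approach must be agreed with the MOSL Security Department):

```python
import hashlib

# Hypothetical client-confidential fields; the real list must be agreed
# with the MOSL Security Department.
SENSITIVE_FIELDS = ["customer_name", "contact_number", "email_address"]

def sanitise(record: dict) -> dict:
    """Replace confidential values with stable, non-reversible tokens so
    that relationships between records are preserved for testing."""
    clean = dict(record)
    for field in SENSITIVE_FIELDS:
        if clean.get(field):
            digest = hashlib.sha256(str(clean[field]).encode()).hexdigest()[:10]
            clean[field] = f"TEST_{field.upper()}_{digest}"
    return clean

print(sanitise({"supply_point_id": "3XYZ123456",
                "customer_name": "A Customer Ltd",
                "email_address": "someone@example.com"}))
```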
15.9.2 Base Data
To be completed as an output of the MOSL Data Strategy due January ‘16
15.10 Practicality of the Database Requirements
The following questions should be considered before making a final decision on the test
base data requirement:
Has the MOSL CMOS Project defined and presented a test data model that flows through the solution
How long will this take to set up
Has a specific Test Environment resource request been submitted, evaluated and approved
Who will do this piece of work and do they have time allocated within the timeframes required
Do the resources doing this piece of work have the required skill set and knowledge
Is there enough space or will disc space need to be ordered and installed - is this even possible
Has a full cost justification been carried out prior to change request submission
15.11 Newly Created Base Data
If creating all data from scratch consider the following points:
Can all fictional test data be set up manually
Are all relationships with other data and any cross reference table entries known
Does the fictional data need to link to realistic or live data feeds / extracts
Does the use of simulators need to be considered
Does the use of stubs or drivers need to be considered
15.12 Synchronisation with other Databases
Is there more than one database involved in the test flow, or used for reference purposes, that will need to be synchronised with in terms of base data requirements
They may both need to be copied from live at the same point in time
Ensure one is not pre- and the other post- any overnight or month-end processing / housekeeping
The data could be populated by existing extract or push jobs with greater selection dates / criteria; this needs to be considered
Are the feeds real time or batched
Are any shell scripts required in order to get data
15.13 Amendments and Cancellations
Consider the following:
Can amendments be made and how are they either kept in synchronisation with or extracted to other databases or systems
Are they represented as a cancellation if cancelled
Are they represented as an amendment if amended
Where an amendment or cancellation is made, consider a test at each point of the transaction lifecycle, not just at the beginning or end
15.14 Transactional Data
Ensure that the requestor has properly scoped the end-to-end cycle of the transaction in order to ensure that more than just the entry and exit data points have been mapped. This safeguards against relational gaps in test data
Suggest to the requester that the overall solution is broken down into smaller scenarios or test flows in order to ensure there are no gaps in the transactions which could cause data gaps or errors
Ensure that all database versions are correct for handling test data format
Ensure that all data transactional rules have been defined along with data structure where transfer is happening; especially in Revenue Assurance and Finance type streams
15.15 Maintenance Routines
Consider all database maintenance routines such as history purging, patch upgrades or power up/power down routines
Consider if result or transactional data needs to be moved to a safe holding area to safeguard test results
Ensure all baseline data has been identified, agreed and included in any new build or refresh process
15.16 User Set Up
Ensure that all user logins have been defined:
o User Name
o Directorate
o Password
o E-Mail Address
o Contact Number
o Access Level
o Date Required From
o Date Required To
o Domain Name/s where access is required
o Test Machine location details
o Justification
o Cost analysis impact
15.17 Audit Trails
The Test Managers will provide centralised management of information to allow the Test process to
be audited. This is done to verify that best practice is adhered to and also to assist in analysis of
the present process to allow continuous testing improvement.
Wherever possible, the change history of artefacts within HP ALM will be maintained by the Test
Managers and their teams. In addition, all management reports and Quality Gate artefacts will be
archived. The Test Managers and Test Leads will own this archive and will ensure the MOSL
CMOS Project PMO is kept fully informed.
The Test Managers will carry out planned reviews of their own team’s performance against pre-set
and pre-defined criteria in accordance with MOSL Policies and Procedures, and if a review shows
that the defined processes have not been followed, then deviations will be highlighted and an issue
raised that will be managed according to the issues management process of the Project lifecycle.
15.18 Best Practice
The Test Managers are required to follow testing best practice, however, the challenges facing
testing will continuously change due to developments in technology, application lifecycles, test
tools and test approaches. It is imperative that current developments in testing best practice are
reflected in the processes and documentation owned by the Test Managers and disseminated to all
testing teams.
The Test Managers will act as a central repository for testing knowledge and will both collect and
provide that information.
15.19 Resources
There are 3 key resources that are required for effective testing:
Test Data
Test Environments
Testing Personnel
It is the responsibility of the Test Managers to ensure that a high-level view of these resources is
maintained. This will be done to facilitate efficient use of these resources and to provide
management information to assist in the arbitration of issues when there are shortages of
resources or scheduling conflicts for specific resources.
The information that will be provided can also be used in resource pool planning to ensure that all
of the risks associated with testing resources are understood and documented.
15.20 Escalation
The Test Managers have been engaged with the high-level sponsorship of the MOSL CMOS Programme Delivery Team (PDT), which provides a separate escalation route for Test Managers that is independent of the development teams, MOSL Market Services and Market Participants.
This independent escalation route is important because testing can produce quantitative data that
has commercial implications for either the development teams, MOSL Market Services or Market
Participants. It is important that the objective data provided by testing cannot be altered to suit the
message that would favour one stakeholder above another.
If agreement cannot be reached within the Quality Gate process then an issue must be raised by
the MOSL Test Lead and managed according to the issues management process endorsed and
managed by the PMO.
15.21 Process Improvement
The Test Managers are responsible and accountable for providing a process that enables effective testing for the MOSL CMOS Project; it will therefore be possible to use audits and lessons learnt to implement improvements in that process.
Once sufficient data has been collected for a phase of the CMOS project, it will be possible to carry out analysis to determine where the process would benefit from improvement, particularly pertaining to future testing within MOSL BAU. This will be the responsibility of the Test Managers through continuous process improvement and documentation thereof.
Any identified improvements will be incorporated into the existing testing collateral of MOSL.
15.22 Quality
Quality is a key enabler for change. Change is enabled because the effects of defective code are reduced. This means that reusing and combining code components in new configurations results in a more predictable outcome.
Quality is an often neglected aspect of the Project Management triangle of cost, time and quality. This is driven by the high pressure applied to meet deadlines within specific cost constraints. However, as a large proportion of overall project costs comes from testing, maintenance and rework, it is imperative that the importance of quality is emphasised in all project decisions.
It will be the responsibility of the Programme and Project Managers to provide advice and practical expertise to all areas of MOSL and the development teams to aid them in improving the overall quality of delivered software.
Likewise it will be the responsibility of the MOSL Test Lead to do the same throughout the Test
Teams.
16 Appendix D - Test Reports
16.1 Test Metrics
When referring to Test Metrics, we are referring to different measurements of testing progress,
quality, risk and efficiency data generated from both manual and automated test methods. There
are different types of metrics required from this data which must be pitched at different layers in an
organisation.
One of the key aspects of testing within the MOSL CMOS Project is to make it clear where and
when defects are being found. This has to be supplemented by quantifying the amount of testing
effort that has been expended on testing in each phase.
The primary source of data will be HP ALM. Information relating to the progress of test
development and execution will be presented along with the number of defects found. The
information will be presented in a pre-defined and agreed format that is updated on a regular basis.
The following is a minimum set of data that will be presented from ALM on a project basis:
Number of tests in development
Number of tests ready to execute
Number of tests executed
Number of tests passed
Number of tests failed
Number of defects raised
Number of defects closed
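As a small illustration of how this minimum data set supports derived progress measures such as pass rates and open defect counts (Python; the figures below are invented for the example):

```python
from dataclasses import dataclass

@dataclass
class TestProgress:
    """The minimum ALM data set listed above."""
    in_development: int
    ready_to_execute: int
    executed: int
    passed: int
    failed: int
    defects_raised: int
    defects_closed: int

    @property
    def pass_rate(self) -> float:
        # Percentage of executed tests that passed.
        return 100.0 * self.passed / self.executed if self.executed else 0.0

    @property
    def open_defects(self) -> int:
        return self.defects_raised - self.defects_closed

snapshot = TestProgress(120, 450, 300, 270, 30, 55, 40)
print(f"Pass rate: {snapshot.pass_rate:.1f}%  Open defects: {snapshot.open_defects}")
```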
An important point of note is that the Unit testing carried out by CGI, supported in Settlements & Tariffs by Bridgeall, is unlikely to be included in HP ALM due to the large number of tests expected. However, it is recommended that code coverage statistics are produced to show how much of the code has been exercised by testing at this stage. Where code coverage is less than 90%, it is recommended that the reasons are explained in detail by the solution provider.
This section will refer to the following Project Test Metrics:
Planning testing metrics
Illustrating progress against plan
Presenting metrics in a dashboard fashion
Calculating the risk of progress against plans
Planning and presentation
Defect tracking and efficiency
Qualification and adaptation of metrics
Measuring KPI’s
There are a wide variety of metrics that can be tracked; these will be selected depending upon the situation at hand within the various phases and upon requests for information.
By understanding the metrics, we will not only be able to understand what is happening within the MOSL CMOS Test Project, specifically from a testing perspective, but will also be able to make informed decisions based on the information provided in order to achieve the required levels of Quality, to Time and Budget.
Metrics will be classified into four categories:
1. Product Metrics
2. Process Metrics
3. Project Metrics
4. Quality Metrics
Product Metrics - describe the characteristics of the product such as size, complexity, design features, performance, and quality level. These metrics will not be defined by the Test Project; rather by Market Services Analysts and Design Leads. However, it is vital that these metrics are as accurate as possible as they will be required to feed into the Test Project for such Test Phases as Non-Functional, System and System Integration Testing; and vitally for UAT, where the Product Metrics will be defined in order to prove that the solution is ‘fit-for-purpose’ for the various MOSL divisions in the respective Entities. It will be upon this that UAT Success Criteria and the final test Quality Gate will be measured.
Process Metrics - are used to improve software development and maintenance. Examples
include the effectiveness of defect removal during Static Analysis, the pattern of testing defect
arrival, and the response time of the fix process. These types of metrics will be produced by the
Test Managers in the form of Static Testing output, Unit Test output (the exit criteria into
System Integration Testing), Release Defect Analysis into testing (first time fix analysis) and
Defect Management metrics.
Project Metrics - describe the Project characteristics and execution. Examples include the number of software testers, the staffing pattern over the life cycle of the Project, and productivity. Some or all of these types of metrics will be produced by the Test Managers, but only upon prior agreement with the MOSL Test Lead; ad-hoc requests will be discussed and decided upon based on need, requirement, necessity and usefulness.
Quality Metrics - focus on the quality aspects of the solution, product, process, and testing.
They can be grouped into four categories; these metrics will be produced by the Test
Managers with an agreed format:
1. Solution quality metrics
2. End-product quality metrics
3. In-process quality metrics
4. Maintenance quality metrics
It must be noted that these test metrics must not be confused with such things as website hits, click-through metrics or banner-type campaigns; these are produced from a production environment and will remain separate from test metrics.
Reporting is essential to ensure progress and status is effectively reported and visible, as well as
providing a project progress history. It also ensures that senior management are able to raise any
concerns on progress as early as possible. It is hoped that daily Test Execution metrics will be
designed in such a way that they can be obtained ‘Live’ from ALM.
16.2 Planning For Test Metrics
The Test Managers will agree the method of presenting their Metrics with the MOSL Test Lead
(with the approval of the MOSL Delivery Director), Project Manager, and relevant MOSL Market
Services Stakeholders. This will be agreed using templates established as part of the CMOS
delivery project.
These templates must be established early in the planning stage to allow the Test Managers to
start populating and presenting progress; it’s also important that care is taken with templates in
order to ensure they deliver exactly what’s needed; communication with stakeholders and
recipients of the information is very important.
In order to present metrics on such things as tests planned, estimated time to completion, budget and resource information, the Test Managers can start to gather this information at a very early stage during the planning process; it is very important at this stage that the basis for coverage is understood, along with the criteria being used to achieve the level of coverage required.
In summary; know your audience, have your metrics templates ready, decide on your test-to-
requirement ratio, set expectations and start to populate your metrics.
The information below defines the baseline Metrics that are to be produced as part of the Project;
please note that these will be expanded upon as this process matures:
16.3 Baseline Metrics
16.3.1 Time to Test Estimation
Requestor: Project Manager
Mechanism: Informal meetings as required if test pack changes
Requirement: Maintain plan detailing length of execution of test pack by Functional, Non-
Functional and Business area, including at differing risk levels where available.
This will be based on a stable environment and development state.
Output: Spreadsheet
Responsible: Project Test Manager
16.3.2 Test Execution Report
Requestor: Delivery Director, MOSL Test Lead, Project Manager, Test Managers
Mechanism: Daily email during the full Test Execution cycles and post Implementation
Requirement: A report detailing progress, status and issues of the Test Execution cycles/phases,
to include:
Automated, Manual, Regression Functional and Non-Functional testing
What has been planned to date
What has been completed to date
Where are we against the plan
Are we on track to finish at the expected time
What issues have we encountered and what Risks & Issues have been identified
What defects have been raised as well as statuses of all defects as defined within ALM
Pass & Fail percentages
Executive summary, ALM graphs pass/fail/not run/incomplete
Number of actual artefacts completed versus number forecast by Scenario, Case and Script
Number of outstanding test artefacts that cannot be completed as a result of outstanding defects or issues
Total number of defects found to date split by outstanding and resolved; by priority
Total number of unresolved defects found to date split by priority, division, entity and owner
System downtime and the associated impact to the testing schedule
System downtime due to data correctness issues and/or application configuration issues and the associated impact to the testing schedule
Dashboard reports and presentations as agreed with the Delivery Director
Wherever possible graphs will be used and statistics broken down by Test Phase.
The testing organisation will produce a daily test summary for the project, which will be derived either directly from ALM, created manually, or created by interrogating the data within the ALM database
Output: Report and/or dashboard in an agreed media; likely to be MS Excel, MS PowerPoint or
MS Word
Responsible: Test Manager, Creation and Distribution
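Where the daily summary is derived by interrogating ALM directly, a REST-based extract is one option. The sketch below is illustrative only: the server URL, credentials, domain and project are placeholders, and the endpoints shown follow the ALM 11.x-style REST interface, which must be confirmed against the ALM version in use and the MOSL ALM standards document.

```python
import requests

# Placeholder connection details; actual values come from the MOSL ALM
# standards document. This sketch assumes the ALM 11.x-style REST API.
BASE = "https://alm.example.com/qcbin"
DOMAIN, PROJECT = "MOSL", "CMOS"

session = requests.Session()
# Authenticate and open a site session (endpoint names to be confirmed
# against the deployed ALM version).
session.get(f"{BASE}/authentication-point/authenticate",
            auth=("alm_user", "alm_password"))
session.post(f"{BASE}/rest/site-session")

# Count open defects for the daily Test Execution report.
resp = session.get(
    f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/defects",
    params={"query": "{status[Open]}", "page-size": 1},
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
print("Open defects:", resp.json().get("TotalResults"))
```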
16.3.3 Test Summary Report
Test Dashboards are there to provide all of the information required to assess progress against plans, cost versus budgeted forecast, current Risks and Issues, productivity, and the state and stage of the MOSL CMOS Test Project overall. This will be provided in a Test Dashboard format at an agreed level and time to suit the audience.
As this type of reporting takes a lot of maintaining, time must be planned in to getting it correct, and it will therefore NOT be produced on an ad-hoc request basis. As this information will be at the request of the Project Board and Senior Management, it is vitally important that the MOSL Test Lead keeps the information accurate and up to date, and is therefore reliant upon timely and accurate data from all requested parties, including 3rd party suppliers.
The following headers and detail will be considered when compiling this Test Dashboard:
Test Project Summary
o Items For Management Attention
o Items For Management Action
o Current action plan for Items for Management Attention
Test Project Highlights/Updates since Previous Report
Key Activities Planned for Next Period
Strategic Developments
Top Test Project Risks/Issues
o Risk/Issue Description
o Project Test Phase
o Risk/Issue Owner
o Category
o Risk Rating
o Probability
o Impact
o Mitigation
o Target Resolution Date
Resource Profile
Test Project Updates - On-time analysis
Schedule Variance (On-Time)
o Planned
o Actual
Effort Variance (On-Budget)
o Planned
o Actual
Downtime Incident Categories
o Build availability
o Environments
o Connectivity
o Test Data Availability
Project Updates - Defect Analysis
o Number of test cases executed
o Number of defects detected in (Phase)
o Defects By Severity (list the severity categories)
o Defects By Categories (Requirements, Design, Build, etc)
This information can of course be expanded upon or reduced; however, it must be stated at this point that the more that is built in, the more there is to maintain.
16.4 KPI Metrics
There are different ways in which KPI's can be measured through testing. KPI's are often confused with metrics and, whilst they are very similar, KPI's must be set by MOSL as a marker by which to measure testing success. Metrics are developed to measure Project success and feed into the KPI measurements. Typical metric types are product metrics, process metrics and project metrics.
In all cases, providing there is clear detail and the quality metrics are agile enough to cope with change, these can be tracked and reported upon to show progress and, where possible, cost benefits through either simple or detailed metrics. Failure by MOSL to know or clearly define what its measurement and tolerance points are will detract from the accuracy of information provided through quality metrics.
16.5 Defect Management Report
A defect is defined as a discrepancy between expected and actual results. Whether the defect is a
system error, a software defect, a defect with a document, a process issue, a problem with the
environment or a problem with the test model, it needs to be formally captured, analysed,
prioritised and managed through to resolution. In addition to this, in order to manage and
communicate defect data to the relevant stakeholders, it’s important that the relevant data is
presented in a manageable and meaningful format.
Therefore defect data will be extracted from ALM and will be presented in a Dashboard format.
It’s important to note that the Defect Management process is detailed as part of the approved CGI Test Strategy (SA_T_Test Strategy v1 0 4 0); Defect Management reporting is necessary to ensure:
Defect resolution is formally managed
Defects are resolved appropriately, fully and in good time
Any emerging issues that could impact a delivery schedule are immediately recognised
Delivery of a fully tested, high quality product that meets specifications
Progress across all areas of testing is clearly visible and under scrutiny
Defect resolution is appropriately prioritised
Actions for unresolved defects are agreed with the necessary resolution owner
A constant flow of timely project critical information
It is accepted that the HP ALM toolset will be used for defect management and act as a repository
for defect logging, tracking and reporting. Defects from all phases of testing will be recorded, and
each defect will be allocated a unique reference that will be reported upon.
Requestor: MOSL Test Lead, PMO, Delivery Director, SLT
Mechanism: Email
Requirement: Documents the activities undertaken over the test execution cycles end to end,
including maintenance tasks.
Includes: defect data extracted from ALM, presented in the agreed Dashboard format
Output: Report, MS Excel and Dashboard documentation reports
Responsible: Test Managers
16.6 End of Test Report (EoTR)
Requestor: MOSL Test Lead, PMO, Delivery Director, SLT
Mechanism: Email
Requirement: Documents the activities undertaken over the test execution cycles end to end,
including maintenance tasks.
Includes:
An executive summary
Full test execution report
Defects found and their status
Issues and their status
Risks outstanding to release and comments on release “go/no-go” decision
Recommendations
Output: Report, MS Word document report
Responsible: Test Managers
16.7 Weekly Reporting
Requestor: MOSL Test Lead, PMO, Delivery Director
Mechanism: Email (weekly)
Requirement: To communicate the weekly progress of and issues affecting the test team and
execution
Report includes:
Points for Management Attention which cover the top 10 Risks and Issues affecting the team.
A high-level test plan timeline update of active tasks
Achievements over the week for each phase and entity including deliverables completed
Planned activities for the following week
Output: Report Dashboard
Responsible: Test Managers
17 Appendix E - Test Environment Management Deliverables
The management of the test environments is a crucial function for the project, as it covers the coordination of architecture builds internal and external to the project.
The Test Environment Management Deliverables are to be confirmed as part of CGI’s Environment
& Release strategy (Document due)
18 Appendix F - Test Data Management
Driven by and referenced by the Data Strategy Document due January ‘16
18.1 MDM Testing Levels and Objectives
18.1.1 Integration Testing
18.1.2 System Testing
18.1.3 Data Validation
18.1.4 Acceptance Testing
18.1.5 Completion Criteria
19 Appendix G - Test Deliverables
Deliverable | Description | Format
Master Test Strategy | Master Test Strategy detailing the process and method to be undertaken, derived by structured workshop – this document | Word Standard Template
Test Plan | No lower level test plans will be produced | –
Milestone Plan (Gantt) | Milestones, tasks and activities for the test effort, to be embedded into the overall project plan | MPP
Test Requirements | 1:1 cross reference test requirement for each business requirement | ALM
Test Conditions | Many:1 cross reference of test requirements and detailed test conditions for each test requirement | ALM
Test Procedure | Step by step instructions as to how to execute the manual test or automated script | ALM
Test Script | The test executable | ALM
Test Data Specification | The test data to be input to the system | Excel Standard Template attached to ALM
Test Environment Specification | Specification and configuration of the test environment on which the tests are expected to be executed | Word Standard Template
Expected Results | The expected outcome | ALM
Test Schedule | The route map for which test will be run on each logical day | ALM
Test Results | The actual outcome | ALM
A Reusable Test Pack | For each test cycle for each defined date, comprising control language, data and comparison reports as a manual test | ALM
Test Completion Report | Summary of the testing, measured against the Master Test Strategy and logging any deviation from defined success criteria, in plain English and available for review and auditing | Word Standard Template
20 Appendix H - Test Tools
Throughout the testing lifecycle, tools are utilised to improve the quality, quantity and speed of testing:
To monitor all aspects of test development, from planning and analysis to preparation and execution
To standardise and automate test cases, saving time and effort
To increase transparency, making project control more efficient and providing a view of current status and progress
To analyse testing statistics that measure productivity, perform root cause analysis to identify the cause of defects, and assess code quality through defect detection practices
To detect defects earlier and remove them quickly and economically
To reduce the start-up time and training requirements for testers, as all historical test assets are stored
20.1 HP ALM
HP ALM will be used to address and manage quality assurance for the MOSL CMOS Project during the testing phases. MOSL’s and CGI’s agreed Defect Management process is tightly integrated into HP ALM, so using ALM on the MOSL CMOS Project will bring a consistent, repeatable and standardised software quality process that is tightly integrated with the MOSL ways of working. To aid this process further, and to make ALM more accessible to the selected Market Participants, Microsoft Excel will be used to create templates in order to ‘auto-upload’ test scripts into ALM (a sketch of such a template follows the benefits list below).
HP ALM will provide the following benefits to the MOSL CMOS Project:
Gain real-time visibility into requirements coverage and associated defects to paint a clear
picture of business risk
Manage the release process and make more informed release decisions with real-time
KPIs and reports
Measure progress and effectiveness of quality activities
Collaborate in the CMOS Project delivery lifecycle with a single global platform
Manage manual and automated testing assets centrally
Facilitate standardised testing and quality processes that boost productivity through
workflows and alerts
Testing tool to capture and manage defects
Provide a Defect Management module in order to manage defects as per the MOSL Defect
Management processes and procedures
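As an illustration of the Excel ‘auto-upload’ templates mentioned above, the sketch below (Python, using the openpyxl library) writes a worksheet of test script steps. The column layout and values are hypothetical; the authoritative template and field mapping will be defined in the MOSL ALM standards document.

```python
from openpyxl import Workbook

# Hypothetical column layout for an ALM test script upload template.
HEADERS = ["Subject", "Test Name", "Step Name", "Description", "Expected Result"]

wb = Workbook()
ws = wb.active
ws.title = "TestScripts"
ws.append(HEADERS)
# One row per test step; values are purely illustrative.
ws.append(["CMOS\\Settlements", "Tariff calculation - happy path",
           "Step 1", "Submit a valid meter read", "Read is accepted"])
ws.append(["CMOS\\Settlements", "Tariff calculation - happy path",
           "Step 2", "Run the settlement calculation", "Charges match the tariff"])
wb.save("alm_upload_template.xlsx")
```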
20.2 JMeter - For Load and Performance Testing
CGI have selected JMeter as their preferred Load & Performance testing tool. JMeter will provide CGI with the functionality to replicate users and transactions in order to exercise CMOS.
20.3 Bridgeall SmartTest for Automation
Bridgeall are using an automation framework, created and used internally at Bridgeall, that they have called SmartTest. Bridgeall will use automation where appropriate to deliver value and benefits to the Settlements & Tariffs modules being delivered for the CMOS solution.
These benefits include:
Increased test coverage
Reduction of risk due to increased coverage
Quicker time to market
Reduction of human error
21 Appendix I – Test Assumptions, Dependencies & Risks
21.1 Assumptions
RAD’s, FD’s, NFD’s and the Requirements Traceability Matrix will be complete and assured for the phase under test
An integrated Test and Project Plan will be developed and remain work in progress
Full Project Governance documented and approved
All identified and required Market Services, MOSL technical and 3rd party resource will be available to the Test Managers
All necessary funding will be agreed, sanctioned and approved
Testing will define a full test governance and strategy pack complete with deliverables such as Test Approach & Master Test Strategy, Test Plans, Reporting, Test Assets, Deliverables, Terms of Reference and Roles and Responsibilities
The MOSL Data Lead and Data Partner will be in place and available to the Test Managers
Environment Managers will be in place and available to the Test Managers
Market Services Analysts will be in place and available to the Test Managers
All required environments are fully designed and built for the phase under test.
Environments are locked down and fully configuration & Change Managed to prevent, or limit the damage of, co-existence with other stakeholders
A project PMO is in place
Market Services project resource made available, to the respective Test Managers during planning, preparation and execution
Selected Market Participant volunteers made available to the respective Test Managers during planning, preparation and execution
Testing will be fully protected from Change Control in the event of scope creep and ad-hoc demands to complete additional testing to that agreed and approved
Test Teams will be fully competent in, and trained on, the technology
Where requirements are ambiguous, testing will aim to validate through Static Analysis Testing. Where this is not possible, testing will identify and be protected from lack of information
Testing will follow project Risk, Issue and Change Management as defined by the Project Manager
Selected Market Participant volunteers (testers during Acceptance) will be fully trained by MOSL & CGI
Policies and procedures are created and approved
Market Entry Assurance Certification will be completed and passed by all Market Participants entering the Shadow Market
Testing KPIs are defined and relevant reports created (to be confirmed)
The necessary support will be supplied by MOSL (includes MOSL’s selected Test Partner) and CGI (includes Bridgeall), subject to clear articulation and mobilisation
CGI (includes Bridgeall) have the expertise and resource to aid with providing the required service throughout the Test Phases
There will be unknown changes; realistic contingency for these must be built into the MOSL CMOS Test Project, and a process review will need to take place
21.2 Dependencies
All requirements will be stored in a common directory
Key requirements must have been reviewed, agreed and placed under configuration control
Test resources will be identified and made available
Test environment(s) will be prepared in advance of test execution
o Test environment(s) will contain test data
o Test environment(s) will be available, supported & maintained throughout each relevant test phase
A Release note must accompany each delivery of the agreed Test environments
Changes to the RAD’s, FD’s and NFD’s will be made through a change control mechanism. These may result in rework of some of the test cases
Market Entry Assurance Certification can only follow a completed UAT that has been assured by MOSL that CMOS is fit for purpose to operate the ‘Shadow Market’
21.3 Risks and contingencies
Risks | Mitigation / Contingency
Test resources unavailable in test phase | Planning fully completed and resources assigned throughout the CMOS programme test phases
HP ALM is not available | Ensure ALM can be used offline if required; ensure the defect process allows for defects to be processed without ALM; agree workarounds if ALM is unavailable
Test Environment(s) not working as expected / unavailable | Ensure the environment strategy details processes in the event test environment(s) do not work as expected or are unavailable
Market Participants are not ready for relevant test phases | Ensure all Market Participants are engaged regularly through Test SIG(s); ensure early indications of a lack of readiness are addressed and mitigation is agreed and put in place
No Market Participants volunteer for SIT or UAT | Possibly work with Solution & Service providers rather than individual Market Participants
Entry & Exit criteria are not met in Test Phase(s) | Entry & exit criteria detailed in this strategy will be discussed as each test phase is completed in order to move to the next test phase. Where exit criteria are not met, a decision will be taken to assess and move to the next phase if a pragmatic approach can be facilitated with no risk to the quality of the programme delivery
CMOS defects are too numerous for release into Shadow Market / Production | Ensure the defect process is well understood and any significant issues that may impact delivery into Shadow Market / Production are prioritised and fixed first. A Go / No-go meeting will decide on the final approval to move into Shadow Market / Production