Wholesale Market Reform

Wholesale Market Reform: CMOS 2.3 Exit Criteria
Version 0.1, 31/10/2016
10th January 2017

Executive Summary

The purpose of this document is to set out for review a set of Exit Criteria to be applied to the MOSL CMOS 2.3 release. These criteria cover MOSL only and exclude Market Participant criteria. The Test Exit Criteria will be used as the gateway for the transition of the CMOS 2.3 release from the Market Participant Sandpit (MPS) to the Production Environment. The Test Exit Criteria are designed to capture the current risk position and will be used to determine when testing is complete; this may include the acceptance of concessions.

Note: although these criteria cover defects uncovered by both Market Participant and MOSL testing, they are not intended to replace the criteria that MPs or MOSL set out within their own test strategies. Specifically, MOSL retains responsibility for ensuring CMOS is fit for purpose, and each MP is responsible for ensuring CMOS integrates successfully with its own internal systems and processes, through its own internal quality assurance processes.

Exit Criteria (MOSL)

In the context of the following criteria, "tests" refers to the testing scope covered by both MPs (their internal systems and processes) and MOSL (the CMOS product), including but not limited to SIT and UAT. The criteria are as follows:

- A confirmed, baselined version of the application and any associated data and configuration items has been delivered into the Market Participant Sandpit;
- 100% of all in-scope tests have been attempted, and all defects encountered have been logged and triaged with MOSL, complete with the agreed priority and severity;
- All defects delivered as part of the CMOS 2.3 Release Note have been retested;
- All defect fixes delivered during testing as hotfixes have been retested;
- The impact of de-scoped tests (including de-scoped defect fixes) is understood and captured;
- No defects exist with a priority of P1 or a business severity of S1 (see Appendix A for definitions);
- The impact of any P2 or S2 defects is understood and accepted;
- Outstanding P2 or S2 defects have an agreed (CIO Forum) release for delivery;
- The impact of P3 and P4 defects is understood and accepted;
- If test cases or defects are outstanding, an acceptable technical or procedural workaround has been identified, with an agreed and acceptable residual risk.
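The defect-related criteria above amount to a gate over the triaged defect log. The sketch below is illustrative only: the record fields (priority, severity, status, agreed_release) and function names are assumptions for the example, not part of any MOSL tooling.

```python
# Illustrative sketch of the defect-related exit criteria as an automated check.
# All field and function names here are hypothetical, not from a MOSL system.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Defect:
    ref: str
    priority: str                    # "P1".."P4"
    severity: str                    # "S1".."S4"
    status: str                      # e.g. "Open", "Fixed", "Retested", "Descoped"
    agreed_release: Optional[str] = None  # CIO Forum release for outstanding P2/S2


def defect_gate(defects: list[Defect], tests_attempted: int, tests_in_scope: int) -> list[str]:
    """Return the reasons the exit criteria are not yet met (empty list = pass)."""
    failures = []
    if tests_in_scope and tests_attempted < tests_in_scope:
        failures.append("Not all in-scope tests have been attempted")
    # Fixed-but-not-retested defects still count as outstanding.
    outstanding = [d for d in defects if d.status not in ("Retested", "Descoped")]
    if any(d.priority == "P1" or d.severity == "S1" for d in outstanding):
        failures.append("Outstanding P1/S1 defects remain")
    for d in outstanding:
        if (d.priority == "P2" or d.severity == "S2") and not d.agreed_release:
            failures.append(f"{d.ref}: P2/S2 defect has no agreed (CIO Forum) release")
    return failures
```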

Exit Criteria (MOSL), continued

- The Test Completion Report is updated with the results of the testing and is signed off within MOSL;
- A tested back-out plan is available and in place;
- Release Notes, deployment guides and back-out guides have been updated for use in live;
- Any test constraints (e.g. missing end-points requiring the use of stubs and/or "test only" configuration changes) are captured and shared;
- It is agreed that any P1 or S1 defects occurring in production after the release date will be resolved within 10 days of go-live.
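The remaining criteria are sign-off artefacts rather than defect thresholds, so a release-readiness check could track them as a simple checklist. The item names below are paraphrased from the criteria for illustration; they do not come from any existing MOSL tool.

```python
# Hypothetical readiness checklist for the documentation and sign-off criteria;
# every item must be True before exiting the Market Participant Sandpit.
READINESS_CHECKLIST = {
    "test_completion_report_signed_off": False,
    "back_out_plan_tested_and_in_place": False,
    "release_note_and_deployment_guides_updated": False,
    "test_constraints_captured_and_shared": False,
    "p1_s1_production_fix_sla_agreed": False,  # resolve within 10 days of go-live
}


def ready_for_production(checklist: dict[str, bool]) -> bool:
    """True only when every sign-off item has been completed."""
    return all(checklist.values())
```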

Appendix A - Definitions

Note: the Market Participant User Forum will need to agree how to convert observations to defects.

Defect priorities:
P1 - Urgent: Blocking defects (i.e. technical showstoppers) experienced by more than one Market Participant. No testing is possible.
P2 - High: Highest priority for non-blocking defects. Areas of the testing cannot be performed, or the functional area is significantly degraded; however, progress can continue in other areas.
P3 - Medium: A defect which should be fixed.
P4 - Low: Little urgency is needed for the defect fix.

Defect severities:
S1 - Business Critical: Blocking defects (i.e. business showstoppers) experienced by more than one Market Participant. The business cannot operate, can operate only at considerable risk (e.g. reputational, financial) or cost, or requires impractical effort.
S2 - Serious: Areas of the business cannot operate, or can operate only for days at considerable risk or cost, with significant effort required.
S3 - Degraded Service: Business workarounds are required, at risk and cost. Note that increased volumes of minor defects may collectively become a serious issue.
S4 - No Service Impact: A business workaround is available at acceptable risk and cost.
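For reference, the priority and severity scales above could be captured as enumerations in a triage or reporting tool. The sketch below simply mirrors the definitions; it is an assumption for illustration, not an existing MOSL data model.

```python
# Illustrative encoding of the Appendix A scales; descriptions abridged from the slide.
from enum import Enum


class Priority(Enum):
    P1 = "Urgent: blocking (technical showstopper) for more than one MP; no testing possible"
    P2 = "High: area of testing blocked or significantly degraded; other areas can progress"
    P3 = "Medium: defect should be fixed"
    P4 = "Low: little urgency for a fix"


class Severity(Enum):
    S1 = "Business Critical: business cannot operate, or only at considerable risk/cost or impractical effort"
    S2 = "Serious: areas of the business cannot operate, or only for days at considerable risk/cost"
    S3 = "Degraded Service: business workarounds required, at risk and cost"
    S4 = "No Service Impact: business workaround available at acceptable risk and cost"
```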