1
Project Name - Testing Iteration 1 UAT Kick-off
2
Contents
- Iteration 1 Overview
- Testing Plan
- Defect Management Process
- SAP Functionality Overview
- Iteration 1 UAT Logistics
3
Iteration 1 Overview
- Iteration 1 consists of 43 user stories
- Of the 43, 17 are functional user stories and 23 are data user stories
- The campaign type "Auto Pay" will be run as part of the campaign execution process
4
CCM Test Metrics / Communications Weekly View
For the foundational release, there are two proposed types of testing communications:
- Defect Management Meetings (daily, time TBD): internal Deloitte team meeting
- Defect Management Reports (daily, time TBD): published externally to our clients
5
Defect Management Meetings
Purpose
- Review new defects
- Identify the appropriate developer for triage
- Receive status updates on defects in development and defects that failed re-test
- Identify next steps to achieve a fix
- Determine the estimated fix date based on priority / severity
- Identify items that are being handed back and forth, requiring escalation to testing leadership
- Determine if escalation is needed for high-severity or high-aging defects, and subsequently work with the Test Governance Lead to address them

Meeting Format
- Facilitator walks through the open defect export from the CCM Test project in Quality Center: Defect ID, Severity, Developer Assigned, Status, Summary, Comments
- Developers make real-time updates to defects in Quality Center

Required Fields When Opening a Defect

Proposed Facilitator
- Defect Managers

Proposed Attendees
- Developers
- Defect Manager
- Test Managers
- Test Data Manager
- Test Governance Lead (optional)

Proposed Timing
- Daily, 1 hour
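The walkthrough columns above map naturally onto a simple defect record. Below is a minimal sketch, assuming illustrative field names rather than the real Quality Center export schema, of how the facilitator's daily list could be ordered by severity and aging:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical defect record; the fields mirror the walkthrough columns on
# this slide, not the actual Quality Center export schema.
@dataclass
class Defect:
    defect_id: str
    severity: int           # 1 (Low) .. 5 (Urgent), per the severity guidelines
    developer_assigned: str
    status: str             # e.g. "New", "Open", "Fixed", "Failed Re-test"
    summary: str
    comments: str
    opened: date

def walkthrough_order(defects: list[Defect], today: date) -> list[Defect]:
    """Order open defects for the daily meeting: highest severity first,
    then highest aging (oldest) first."""
    open_defects = [d for d in defects if d.status != "Closed"]
    return sorted(open_defects,
                  key=lambda d: (-d.severity, -(today - d.opened).days))
```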
6
Test Execution Process
Steps and actors involved:
1. Approved test cases and steps are uploaded to Quality Center through the upload template (Test Managers, Test Governance Lead)
2. Test cases are assigned a planned execution date and responsible tester in the test execution schedule
3. Testers determine which test cases are assigned to them based on the execution schedule (Testers)
4. Testers work with the Test Manager to get any necessary data, pre-conditions, etc. prior to running the tests
5. Testers execute tests
6. The Test Manager pulls test execution reports at the end of each day
7. The Test Governance Lead compiles and publishes reports (Test Governance Leads)
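As a rough illustration of steps 6 and 7 (the end-of-day report pull and publication), the sketch below tallies run statuses per workstream. The record layout and workstream names are assumptions for illustration, not the Quality Center report format.

```python
from collections import Counter

# Hypothetical per-run records pulled at the end of the day (step 6);
# the layout is assumed for illustration, not the Quality Center schema.
runs = [
    {"workstream": "SAP CRM", "test_case": "TC-001", "status": "Passed"},
    {"workstream": "SAP CRM", "test_case": "TC-002", "status": "Failed"},
    {"workstream": "RTOM",    "test_case": "TC-101", "status": "Blocked"},
]

def daily_summary(runs):
    """Compile counts by (workstream, status) for the published report (step 7)."""
    return Counter((r["workstream"], r["status"]) for r in runs)

for (workstream, status), count in sorted(daily_summary(runs).items()):
    print(f"{workstream}: {status} = {count}")
```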
7
Roles and Responsibilities
Role Type | Name
Governance |
Test Managers |
Test Authors |
Testers |
Defect Manager |
Environment Manager |
Test Data Manager |
8
Defect Management Process
The defect management process is as follows (actors: Tester, Defect Manager, Developer):
1. The tester executes the test case.
2. If the test case passes, the tester indicates the step as passed.
3. If it fails, the tester logs a defect with supporting details.
4. The Defect Manager receives the defect and assigns it to the appropriate person.
5. The developer fixes the code and assigns the defect back to the Defect Manager.
6. The Defect Manager assigns the defect back to the tester for retest.
7. If the retest passes, the Defect Manager closes the defect; otherwise the defect is assigned back to the developer.
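One way to make the hand-offs above concrete is to model the defect lifecycle as a small state machine. The sketch below follows the flow described on this slide; the state and event names are illustrative assumptions, not Quality Center's built-in statuses.

```python
# Minimal sketch of the defect lifecycle described above.
# State and event names are illustrative, not Quality Center's built-in statuses.
TRANSITIONS = {
    "New":                   {"assign": "Assigned to Developer"},  # Defect Manager assigns to the appropriate person
    "Assigned to Developer": {"fix": "Fixed"},                     # Developer fixes the code, hands back to the Defect Manager
    "Fixed":                 {"send_retest": "Retest"},            # Defect Manager assigns back to the tester
    "Retest":                {"pass": "Closed",                    # Retest passes: close the defect
                              "fail": "Assigned to Developer"},    # Retest fails: back to the developer
}

def next_state(state: str, event: str) -> str:
    """Return the next lifecycle state, raising on an invalid hand-off."""
    try:
        return TRANSITIONS[state][event]
    except KeyError:
        raise ValueError(f"No transition '{event}' from state '{state}'")

# Example: a defect that fails its first retest and passes the second.
state = "New"
for event in ("assign", "fix", "send_retest", "fail", "fix", "send_retest", "pass"):
    state = next_state(state, event)
print(state)  # Closed
```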
9
Iteration 1 UAT Logistics
- All test cases should be executed directly in Quality Center.
- Test cases should not be executed in offline tools and later "mass updated" in Quality Center, as all metrics / reporting will be produced in real time from Quality Center.
- If you have issues accessing Quality Center, please reach out to Name / to determine an alternate testing approach.
- Defects should be created directly from failed test steps within test cases and linked accordingly; this will show the correlation between failed test cases and defects (see the sketch after this list).
- If a test step is Blocked, it should also be linked back to a defect; this will show the correlation between blocked test cases and defects.
- If there is an environment outage preventing testing, please notify Name / immediately.
- Testing resources should follow the guidelines in this presentation for Defect Severity, Test Case Status, processes, etc.
- For any issues with accessing and setting up Quality Center, please contact Name /.
- For questions about specific processes within Quality Center (what a mandatory / optional field means and how / when it should be used, etc.), testing resources should contact the Testing Lead, Name /.
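To illustrate the linking rule above (every Failed or Blocked step should trace back to a defect), here is a minimal sketch that flags step results with missing defect links; the record layout is an assumption for illustration, not Quality Center's data model.

```python
# Hypothetical step results; "linked_defect" stands in for the link that
# Quality Center keeps between a failed/blocked step and a defect.
steps = [
    {"test_case": "TC-010", "step": 1, "status": "Passed",  "linked_defect": None},
    {"test_case": "TC-010", "step": 2, "status": "Failed",  "linked_defect": "D-123"},
    {"test_case": "TC-011", "step": 1, "status": "Blocked", "linked_defect": None},  # missing link
]

def missing_links(steps):
    """Return Failed/Blocked steps that are not linked to a defect."""
    return [s for s in steps
            if s["status"] in ("Failed", "Blocked") and not s["linked_defect"]]

for s in missing_links(steps):
    print(f"{s['test_case']} step {s['step']} is {s['status']} but has no linked defect")
```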
10
Defect Severity Guidelines
Defects will be categorized as follows:

5. Urgent
- Very severe: the entire application, component, or function will not work
- Client, system, or environment is unavailable; no work-around available
- Severe data loss or corruption
- Data integrity issue related to security, confidentiality, legal, or regulatory non-compliance
- Client X definition: major functionality not working, or a blocking defect
- Intermittent defects that result in any of the above are also classified as Urgent

4. Very High
- Significant: the entire application, component, or function will not work, but a work-around is available
- Corruption of a critical component
- Loss of a non-critical component
- Client X definition: major functionality not working but there is a workaround, or normal functionality not working and the user is unable to proceed with this feature
- Intermittent defects that result in any of the above are also classified as Very High

3. High
- Result is not as expected
- Corruption of a non-critical component; a work-around is available
- Low impact to the end user or application
- Client X definition: non-essential functionality not working, but with correct inputs the user is still able to proceed
- Intermittent defects that result in any of the above are also classified as High

2. Medium
- Minor defect; some of the application operations are unexpected
- Client X definition: minor functionality not working, usually cosmetic or a screen display issue
- Intermittent defects with low impact to business operations or end users

1. Low
- Cosmetic issues, or some functionality unavailable but with a simple workaround
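For reference, the five severity levels can be expressed as a simple enumeration. The sketch below condenses the descriptions from this slide; the enum and the escalation helper are illustrative, not a Quality Center field definition.

```python
from enum import IntEnum

class Severity(IntEnum):
    """Defect severity levels condensed from the guidelines above (illustrative)."""
    LOW = 1         # Cosmetic, or functionality unavailable with a simple workaround
    MEDIUM = 2      # Minor / cosmetic defect, some unexpected behaviour
    HIGH = 3        # Result not as expected, work-around available, low impact
    VERY_HIGH = 4   # Component will not work, but a work-around exists
    URGENT = 5      # Blocking: no work-around, severe data loss or compliance impact

def is_escalation_candidate(severity: Severity) -> bool:
    """High-severity defects are flagged for escalation in the daily meeting."""
    return severity >= Severity.VERY_HIGH
```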
11
Defect Priority Guidelines
Defects will be categorized as follows:

5. Urgent (acknowledgement time: 30 minutes)
- Must be fixed immediately
- Requires notification of the responsible executive
- Requires customer notification and daily follow-up
- Does not meet the Definition of Done if any Critical defect is open

4. Very High

3. High (acknowledgement time: 1 hour)
- Must be included in the next Iteration
- Requires notification of manager
- Requires customer notification and weekly follow-up
- Does not meet the Definition of Done if any Serious defect is open

2. Medium (acknowledgement time: 2 hours)
- Must be scheduled in the next 2-3 Iterations

1. Low (acknowledgement time: 4 hours)
- Schedule when time is available
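A sketch of how the acknowledgement times above could be checked automatically. The mapping mirrors the table, except that priority 4 has no explicit time on the slide, so sharing priority 3's one-hour window is an assumption.

```python
from datetime import datetime, timedelta

# Acknowledgement windows from the table above. Priority 4 has no explicit
# time on the slide; sharing priority 3's one-hour window is an assumption.
ACK_WINDOW = {
    5: timedelta(minutes=30),
    4: timedelta(hours=1),
    3: timedelta(hours=1),
    2: timedelta(hours=2),
    1: timedelta(hours=4),
}

def acknowledgement_overdue(priority: int, logged_at: datetime, now: datetime) -> bool:
    """True if a defect of the given priority has not been acknowledged in time."""
    return now - logged_at > ACK_WINDOW[priority]

# Example: an Urgent defect logged 45 minutes ago is past its 30-minute window.
print(acknowledgement_overdue(5, datetime(2024, 1, 8, 9, 0), datetime(2024, 1, 8, 9, 45)))  # True
```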
12
Defect and Re-test Assignment
Work stream | Testers | Primary Developer(s)
SAP CRM | TBD | ADT
13
Test Case Status Definitions
- No Run: the test has not undergone its first run and is not blocked
- Passed: actual results match expected results for each step in the test case
- Failed: actual results deviate from expected results for at least one step in the test case
- Blocked: at least one step is unable to be run due to an existing open defect(s)
- Not Complete: the test is in progress but not all steps have been executed
- N/A: the test has been descoped
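The status definitions above imply a precedence when rolling step results up to a test-case status. The sketch below encodes one plausible reading (Blocked over Failed over Not Complete), which is an assumption rather than a rule stated on this slide.

```python
def test_case_status(step_statuses: list[str]) -> str:
    """Roll step results up to a test-case status.

    Precedence (an assumption, not stated on the slide):
    Blocked > Failed > Not Complete > Passed; no executed steps means No Run.
    """
    if not step_statuses or all(s == "No Run" for s in step_statuses):
        return "No Run"
    if "Blocked" in step_statuses:
        return "Blocked"
    if "Failed" in step_statuses:
        return "Failed"
    if "No Run" in step_statuses:   # some steps executed, some not
        return "Not Complete"
    return "Passed"

print(test_case_status(["Passed", "Failed", "No Run"]))  # Failed
print(test_case_status(["Passed", "Passed"]))            # Passed
```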
14
Test Ground Rules
- All test cases should be executed directly in Quality Center.
- Test cases should not be executed in offline tools and later "mass updated" in Quality Center, as all metrics / reporting will be produced in real time from Quality Center.
- If you have issues accessing Quality Center, please reach out to Name / to determine an alternate testing approach.
- Defects should be created directly from failed test steps within test cases and linked accordingly; this will show the correlation between failed test cases and defects.
- If a test step is Blocked, it should also be linked back to a defect; this will show the correlation between blocked test cases and defects.
- If there is an environment outage preventing testing, please notify Name / immediately.
- Testing resources should follow the guidelines in this presentation for Defect Severity, Test Case Status, processes, etc.
- For any issues with accessing and setting up Quality Center, please contact Name /.
- For questions about CCM-specific processes within Quality Center (what a mandatory / optional field means and how / when it should be used, etc.), testing resources should contact the Testing Lead, Name /.
15
Test Progress Example
Overall Testing Progress Summary Statement
- Test Planning - Summary
- Test Planning - Progress
- Test Execution - Summary
- Test Execution - Progress
16
Defects Summary Example
Defects Summary Statement
- Defects - Summary
- Defects - Progress: defects pending by workstream (example figures: SAP CRM 164, RTOM 16), broken down by Sev 1 - Sev 4
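As a small illustration of how the "Defects - Progress" figures above could be tallied, the sketch below groups hypothetical open-defect records by workstream and severity; the records themselves are assumptions, not project data.

```python
from collections import defaultdict

# Hypothetical open-defect records; only the fields needed for the tally.
open_defects = [
    {"workstream": "SAP CRM", "severity": 3},
    {"workstream": "SAP CRM", "severity": 1},
    {"workstream": "RTOM",    "severity": 2},
]

def pending_by_workstream_and_severity(defects):
    """Tally pending defects per workstream, broken down by severity."""
    counts = defaultdict(lambda: defaultdict(int))
    for d in defects:
        counts[d["workstream"]][d["severity"]] += 1
    return counts

for workstream, by_sev in pending_by_workstream_and_severity(open_defects).items():
    total = sum(by_sev.values())
    print(f"{workstream}: {total} pending, by severity {dict(by_sev)}")
```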
17
Priority 1 Outstanding Defects
Priority 1 Defects Summary Statement
Table columns: Defect ID | Defect Summary | Defect Severity | Team Assigned | Rationale | Days Open
18
Use Cases Execution Example
Use Cases (Requirements) Summary Statement
- Use Cases - Progress
- Use Cases - Summary
- Considerations and Issues