CV-SHARP Brief 1 January 2018.


THIS IS AN INFORMATIONAL BRIEF

Agenda - CV-SHARP Review
- Overview: what the program is and why it's here
- Program Foundations: program growth and development
- Data Flow: program and policy interaction
- Methodology: the nitty-gritty of how it looks and feels
- Q & A

Overview - What is CV-SHARP?
- Afloat SIPRNET web application
- Captures completions of Sub-Events dictated in the CVN TRAMAN (no Reactor requirements; no equipment)
- Displays level of training: 'Experience' and 'Performance' vs. the TRAMAN
- Reports to CV-SHARP Ashore and the DRRS-N T-Pillar
- The TRAMAN T&R Matrix is policy; CV-SHARP captures the execution of that policy

Program Foundations: T&R Matrix - Sub-Events per phase standards
- Each Sub-Event has phase targets and periodicities (Learn / Maintain / Degrade)
- Team-based with individual crediting
- Experience (E) consists of 'reps and sets'; it degrades over time and with crew turnover (individual impacts)
- Performance (P) is externally assessed with a periodicity (unit-based), enabled through TACs (TYCOM-controlled)
- FXP format abandoned (it lacked NMETL conditions and standards); the matrix is a compendium of CNAF best standards and practices
- "Crawl, walk, run" approach at the ship's discretion, with CVN leadership oversight of the crew's training readiness and 'depth on the bench'

The T&R Matrix is similar to a squadron T&R Matrix on steroids: Experience is how often; Performance is how well, as determined by an external agency (similar to a squadron unit evaluation). It started as a points-based matrix where Sailors earned points while standing watches and has evolved into a Sub-Event completion-based matrix where Sailors earn credit for participating as team members completing a Sub-Event. It is PRMAR-based: the PRMAR Experience score is the accumulation of individual Experience credit rolling up to that PRMAR per the system logic/relationships, with standard curves derived from TRAMAN-based standards. Because all personnel are accounted for, leadership has a tool that indicates not only that the crew can perform a capability/mission, but also how long that capability can be sustained.

SUSTAINMENT & Periodicity (CVN TRAMAN, Appendix 1)

EXPERIENCE: Experience builds from phase to phase (and event to event). To grow from an Experience level of 1 to 9 (as required during TSTA), the event must be scheduled eight times within the Learn interval (14 days) for each watch, and then maintained by repeating every 60 days (the Maintain interval).

FRTP requirements for Experience Reporting (Exp) and Performance Assessment (Perf), example row from Appendix 1:
- Sub-Event CCC 2013, Direct and Manage Comms - Information Assurance (Dept: CBS)
- Exp target level: 9, assessed across the Maintenance, Basic (In Port, Crew Prep, FDC, CART, TSTA, FEP), and Integrated (C2X, JTFX) phases
- Periodicity: Learn 14 days / Maintain 60 days / Degrade 360 days

CV-SHARP documents Experience and Performance.
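The slide does not spell out the exact accrual mechanics, so the sketch below is a hedged Python model of the Learn/Maintain/Degrade periodicities: one E level gained per repetition inside the Learn interval, E held by repeating within the Maintain interval, and a level lost once the windows lapse. The interval values come from the CCC 2013 example; the function name and the exact decay rule are illustrative assumptions, not the fielded CV-SHARP logic.

```python
from datetime import date, timedelta

LEARN = timedelta(days=14)      # Learn interval (CCC 2013 example)
MAINTAIN = timedelta(days=60)   # Maintain interval
DEGRADE = timedelta(days=360)   # Degrade interval
MAX_E = 9                       # TSTA phase target from the slide

def experience_level(completions: list[date], today: date) -> int:
    """Estimate an E level from dated sub-event repetitions (assumed model)."""
    e, last = 0, None
    for day in sorted(completions):
        gap = None if last is None else day - last
        if gap is None or gap <= LEARN:
            e = min(e + 1, MAX_E)                   # reps inside Learn build E
        elif gap <= MAINTAIN:
            pass                                     # repeated in time: E maintained
        else:
            e = max(e - max(1, gap // DEGRADE), 0)   # assumed decay rule
            e = min(e + 1, MAX_E)                    # then credit the new rep
        last = day
    if last is not None and today - last > MAINTAIN:
        e = max(e - max(1, (today - last) // DEGRADE), 0)
    return e

# Eight reps spaced 10 days apart stay inside the Learn interval:
reps = [date(2018, 1, 1) + timedelta(days=10 * i) for i in range(8)]
print(experience_level(reps, today=date(2018, 4, 1)))  # -> 8 (a ninth rep reaches 9)
```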

Sub-Event Logging - Bridge-Officers, MOB-N 1301

CV-SHARP Functions - FOCUS: TRAINING AT THE TEAM LEVEL

Internal - comprehensive T&R management tool:
- Captures training completed at the Team level
- Individuals receive 'Experience' credit as part of the Watch Team
- Team 'Experience' rolls up to Sub-Event/Department and can be viewed in Afloat via the Department Experience Report
- Internal management and decision-making tool: "who needs to do what" IAW TRAMAN requirements

External - DRRS-N Training Pillar:
- Populates the DRRS-N "T Pillar"
- Team Experience x Sub-Event Performance (Tfom = Ef x Pf) rolls up to NTA/PRMAR
- Enables fleet-wide readiness comparison

Internally, CV-SHARP captures training completed IAW the T&R Matrix in the TRAMAN; the TRAMAN is policy, and CV-SHARP captures the execution of that policy. It was developed as a tool that captures training readiness down to the individual Sailor (similar to SHARP): an individual receives Experience credit when he or she is part of the Watch Team completing the training Sub-Event. This allows better decisions about the ship's operational readiness and compliance with TRAMAN requirements; because all watch teams are involved, it records the accomplishments of the entire crew and captures depth and sustainability.

Externally, the ship's readiness is based on the training of all teams and therefore translates to unit-level training readiness. The current method is a statistical sampling: the E level reported to DRRS-N is based on the E level of the Primary Team Type, and each Sub-Event has only one Primary Team Type. The future method will use all data: individuals with experience in a particular Sub-Event are pooled to make notional teams (the notional team calculator will be implemented in the future, spring/summer 2014), notional team Experience will be used in the DRRS-N calculation, and an externally assessed Pf is used for each Sub-Event.

CVN Training Data Flow

CV-SHARP (Afloat) - internal T&R management:
- Captures teams' completion of Sub-Events with visibility down to the individual Sailor
- Sends "raw" Sub-Event completion data for ALL teams, plus assessment scores, to CV-SHARP Ashore

CV-SHARP Ashore (hosted in the Aviation Data Warehouse):
- Captures and stores completion of all Sub-Events; T&R Matrix residency; CV-SHARP data storage
- Houses the calculation engine: calculates Ef and Pf, maps NTAs to the METL (no normalizations applied), and sends the results to NTIMS

NTIMS:
- Receives the pre-calculated Ef and Pf and combines them (Ef x Pf = Tfom) for display in the proper DRRS-N format; MET/PRMAR population by Tfom

CV-SHARP Afloat is the unit's T&R management tool, still evolving as we learn fleet needs and requirements. The legacy T&R Matrix started as a system to log what the watch teams were doing on a daily basis, capturing daily watch duties (i.e., conning the ship, conducting flight ops) rather than true capabilities (i.e., maneuvering to avoid a mine, or precision anchorage). Now we capture the completion of Sub-Events, not watches, concentrating on the core information that goes to DRRS-N and correlates with it.

Experience Hierarchy

- Individual Experience: earned when the Team logs Sub-Events; individuals receive Experience as part of the Team
- Team Experience: Teams complete and log Sub-Events
- Sub-Event Experience: cumulative Team Experience
- NTA/PRMAR Experience: cumulative Sub-Event Experience
- DRRS-N T-Pillar (Ef x Pf): notional team Experience combined with Sub-Event Performance, then mapped to NTAs (METL) for the unit-level readiness depiction

The system hierarchy goes from the individual, to Teams, to Sub-Event, and then through METL mapping to NTA and PRMAR for DRRS-N reporting. A watch team completing a Sub-Event for Experience is the foundation of the methodology: individuals cannot complete training Sub-Events alone (it takes the team), and Experience credit is given to the individuals that make up the team. Cumulative Team Type Experience rolls up to either the Department (Department Experience Report) or to NTA and PRMAR (DRRS-N) IAW the system relationships.

The current method is a statistical sampling: the E level reported to DRRS-N is based on the E level of the Primary Team Type, and each Sub-Event has only one Primary Team Type. There is no current tie between individual readiness and the DRRS-N T-Pillar; that connection will be established in the future method, which will use all data: individuals with experience in a particular Sub-Event are pooled to make notional teams (the notional team calculator will be implemented in the future, summer/fall 2014), notional team Experience will be used in the DRRS-N calculation, and an externally assessed Pf is used for each Sub-Event.
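To make the roll-up concrete, here is a minimal, hedged sketch of the hierarchy as data structures; the type and field names are illustrative, not CV-SHARP's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Individual:
    name: str
    e_level: int = 0                # earned only when the team logs a Sub-Event

@dataclass
class Team:
    members: list[Individual] = field(default_factory=list)

    def log_sub_event(self) -> None:
        for m in self.members:      # credit flows to every member of the team
            m.e_level += 1

@dataclass
class SubEvent:
    code: str                                        # e.g. "MOB-N 1301"
    teams: list[Team] = field(default_factory=list)  # cumulative Team Experience
    ntas: list[str] = field(default_factory=list)    # METL mapping, e.g. "NTA 1.2.11"
```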

Readiness Calculation - Current Methodology

Statistical sampling concept:
- Each Sub-Event has a Primary Team Type. Example: AAW 1001's Primary Team Type is Mission Planning.
- CV-SHARP assumes that if the Primary Team Type did the Sub-Event, all the other Team Types also completed it.
- CV-SHARP then uses that Sub-Event completion date, ICW the Learn/Maintain/Degrade periodicities, to calculate the Sub-Event E.
- It does not account for the required minimum number of teams. Example: Damage Control teams usually require 10 teams for Experience.

Performance: all P scores are defaulted to 100% in DRRS-N and CV-SHARP.

Because the minimum number of teams is not accounted for, any time a Primary Team completes the Sub-Event, that date is used ICW the Learn/Maintain/Degrade matrix to update the ship's E. For example, a ship is required to have 16 DCRS/BDS Stretcher Bearer teams; if only one of the 16 logs the Sub-Event (according to the Learn/Maintain/Degrade periodicities), CV-SHARP will show the ship with the required E level (see the sketch below). Upon switching to the notional team method (the future methodology), all 16 DCRS/BDS Stretcher Bearer teams will need to log the Sub-Event within the Learn interval in order to increase the ship's Experience level by 1.
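A hedged sketch of this sampling rule, reusing the periodicity model sketched earlier; the team-type names and helper signature are illustrative assumptions:

```python
from datetime import date

def current_sub_event_e(completions: dict[str, list[date]],
                        primary_team_type: str, e_from_dates) -> int:
    """Only the Primary Team Type's completions drive the Sub-Event E;
    every other team type, and the minimum team count, is ignored."""
    return e_from_dates(completions.get(primary_team_type, []))

# One of 16 DCRS/BDS Stretcher Bearer teams logging is enough to show the
# ship at the required E level under the current method.
ship_log = {"Mission Planning": [date(2018, 1, 1)], "TAO Watch Team": []}
e = current_sub_event_e(ship_log, "Mission Planning",
                        lambda d: experience_level(d, date(2018, 1, 15)))
CURRENT_PF = 100  # all Performance scores currently defaulted to 100%
```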

Future Notional Team Methodology - Delayed Implementation

Similar to the SHARP methodology (notional teams assembled): the highest-experienced individuals are pooled to determine the most experienced Essential Teams, the most probable teams to be used in a wartime scenario. This is the building block for calculating the Sub-Event E level: it selects the highest E from the whole crew, not just current watch team members, and optimizes the lowest notional team score (see the sketch after this list).

All Designated Watch Standers (Experience pool):
- OOD: 100%, 20%, 70%, 45%
- JOOD: 80%, 95%, 60%, 40%
- JOOW: 90%, 75%, 50%, 15%

Actual Watch Bill (3 teams required; Teams 1-4 manned):
- OOD: 100%, 20%, 75%, 70%
- JOOD: 80%, 60%
- JOOW: 90%, 95%, 72%, 92%, 77%

Notional Teams (Teams 1-3):
- OOD: 100%, 75%, 70%
- JOOD: 80%
- JOOW: 95%, 92%, 90%

NOTE: When looking at individual E in the Department Experience Report, it displays as "3/6" rather than "50%"; the numerator is the individual E level and the denominator is the phase-target E level. Percentages are used here to simplify the slide.

PROs:
- Reports against the requirements of directives, i.e., the NAVDORM
- Displays the true capability the ship could produce at that point in time, if needed
- Reduces fluctuation from personnel changes: changing the actual watch bill would cause the numbers to change, but using notional teams eliminates/reduces that impact

CONs:
- Data streams are difficult to follow (low traceability of an actual team's performance; CV-SHARP will not equal DRRS-N) because the system generates the team. This is mitigated by using the CV-SHARP Department Experience Report to show actual team training.
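The slide's tables are partially garbled in this transcript, so the sketch below shows one plausible reading of the notional team builder: rank every qualified watch stander's E score per station and deal the top scorers across the required number of teams. The dealing rule and pool values are assumptions for illustration, not the fielded algorithm.

```python
def build_notional_teams(pool: dict[str, list[float]], teams_required: int):
    """pool maps a watch station (e.g. 'OOD') to the E scores of every
    qualified Sailor on the crew, not just those on the actual watch bill."""
    ranked = {role: sorted(scores, reverse=True) for role, scores in pool.items()}
    return [{role: scores[i] for role, scores in ranked.items()}
            for i in range(teams_required)]

# Pool loosely following the slide's OOD/JOOD/JOOW example:
pool = {"OOD": [100, 20, 70, 45], "JOOD": [80, 95, 60, 40], "JOOW": [90, 75, 50, 15]}
for n, team in enumerate(build_notional_teams(pool, teams_required=3), start=1):
    print(f"Notional team {n}: {team}")
# Team 3, built from each station's third-best Sailor, is the "lowest
# notional team" whose score feeds the Sub-Event E calculation.
```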

Future Notional Team Methodology - MOB-N 1301 Plan Navigation

Conducted by Team Types (required Essential teams in parentheses):
- Bridge-Navigation (3)
- Bridge-Officers (3)
- NAV Brief (1)
- Piloting (1)
- Sea & Anchor-Nav (1)

CV-SHARP will calculate notional teams for each of the 5 Team Types conducting the Sub-Event. It will take the lowest notional team score for each of the 5 Team Types and average those 5 scores to come up with the Sub-Event E level.

Future Notional Team Methodology - Notional Team Scores

Lowest notional team score per Team Type:
- Bridge-Navigation: 78
- Bridge-Officers: 90
- NAV Brief: 82
- Piloting: 95
- Sea & Anchor-Navigation: 91

Sub-Event E for MOB-N 1301: the average of the worst notional Team Type scores = 87.2, sent to DRRS-N as the Sub-Event E level.
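A one-liner reproducing the arithmetic above:

```python
# Average the worst notional team score from each participating Team Type.
worst_notional = {"Bridge-Navigation": 78, "Bridge-Officers": 90,
                  "NAV Brief": 82, "Piloting": 95, "Sea & Anchor-Navigation": 91}
sub_event_e = sum(worst_notional.values()) / len(worst_notional)
print(sub_event_e)  # 87.2 -> MOB-N 1301's E level sent to DRRS-N
```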

Future Notional Team Methodology - Average of Lowest Notional Team Type Scores

[Slide graphic: the MOB-N 1301 Team Types and their watch stations (Bridge Officers: OOD, JOOD, JOOW; Bridge-Nav and Piloting stations such as Nav Plotter, QMOW, JL Phone, Nav Radar, Piloting Officer, Radar Log, Shipping Officer, SPA-25G, SSDS Console, TOP Supervisor; NAV Brief stations such as 1st LT, Air Boss, CHENG, Conn, Fathometer, Gun Boss, Gator, OPS O, RO; Sea & Anchor-Nav stations such as Aft Master Helms, ANAV, Bearing Takers 1-4, Lee Helm, Master Helm, Nav Evaluator, Phone Talkers 1-4) flowing through CV-SHARP Ashore into the DRRS-N MET/PRMAR display.]

The example uses MOB-N 1301:
- There are 3 required (Essential) teams for Team Types Bridge-Officers and Bridge-Navigation, and 1 required (Essential) team for Team Types NAV Brief, Piloting, and Sea & Anchor-Navigation.
- CV-SHARP calculates only the required number of notional teams: if the ship has 4 Bridge-Officers teams on the watch bill and constructed in Team Builder, only 3 notional Bridge-Officers teams will be built.
- CV-SHARP uses the notional teams for each Team Type to calculate a Sub-Event E level by averaging the worst notional team E level from each participating Team Type (here, the worst notional Bridge-Nav, Bridge-Officers, NAV Brief, Piloting, and Sea & Anchor-Nav teams). This is the Sub-Event E level sent to DRRS-N.

CV-SHARP Ashore:
- The Sub-Event E is averaged with all the other lowest-notional-team Sub-Event E scores mapped to the same NTA in the NMETL. NTA 1.2.11 Conduct Navigation gets 84% E after averaging the Sub-Events mapped to it:

  Sub-Event  | Exp | Perf
  MOB-N 1301 |  87 | 100
  MOB-N 1311 |  75 |  90
  MOB-N 1313 | 100 | 100
  MOB-N 1314 | 100 |  90
  MOB-N 1315 | 100 |  90
  MOB-N 1325 |  75 |  80
  MOB-N 1326 | 100 |  65
  MOB-N 1327 | 100 | 100
  MOB-N 1328 |   0 | 100
  MOB-N 1341 | 100 | 100

- Unit Perf scores from the Performance entry page (the 100% ATG assessment in the example) are sent to Ashore and averaged with the other Sub-Event Perf scores mapped to that NTA: NTA 1.2.11 gets 92% P after averaging the Sub-Events mapped to it.

DRRS-N:
- The NTA scores, in the form of an Ef and a Pf (84% and 92% in the example), are sent to DRRS-N and multiplied to determine the Tfom for that NTA: NTA 1.2.11 Conduct Navigation = 77% Tfom (E x P).
- The NTA Tfoms mapped to a capability area are then averaged to determine that capability's Tfom (85% for MOB-N in the example): NTA 1.1.1.7 = 93%, NTA 1.1.2.3.1 = 85%, NTA 1.1.2.3.2 = 87%, NTA 1.2.6 = 83%, NTA 1.2.11 = 77%, NTA 1.3 = 100%, NTA 6.1.1.12 = 70%.

CV-SHARP Afloat:
- Exp (in Afloat): individual credit received in that role as a member of a team completing the event.
- Team Exp (in Afloat): visible in the Department Experience Report (at the individual level).
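A short sketch tying the numbers together, reproducing the NTA 1.2.11 figures from the table above:

```python
# Ef/Pf are the averages of the Sub-Event E and P scores mapped to NTA 1.2.11,
# and Tfom = Ef x Pf; values come from the table above.
sub_events = {  # code: (E, P)
    "MOB-N 1301": (87, 100),  "MOB-N 1311": (75, 90),  "MOB-N 1313": (100, 100),
    "MOB-N 1314": (100, 90),  "MOB-N 1315": (100, 90), "MOB-N 1325": (75, 80),
    "MOB-N 1326": (100, 65),  "MOB-N 1327": (100, 100), "MOB-N 1328": (0, 100),
    "MOB-N 1341": (100, 100),
}
ef = round(sum(e for e, _ in sub_events.values()) / len(sub_events))  # 84
pf = round(sum(p for _, p in sub_events.values()) / len(sub_events))  # 92
tfom = round(ef * pf / 100)                                           # 77

# The capability Tfom is the average of its NTAs' Tfoms (85% for MOB-N):
mob_n = sum([93, 85, 87, 83, 77, 100, 70]) / 7
print(ef, pf, tfom, round(mob_n))  # 84 92 77 85
```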

T Score Summary Page
- This slide shows the overall concept; the actual report is broken into two parts: the first drills down from the Capability to the Sub-Event level, the second from the Sub-Event to the individual level.
- Only available for ships where replication is working correctly.
- Provides data on the ship's progress vs. the maximum possible T score for that phase.
- Provides a preview of the ship's DRRS-N scores under the notional team methodology.

Experience Analysis - Internal: Department Experience Report

The Department Experience Report shows the Sub-Events supported by each department, drilling from Sub-Event Experience to Team Type Experience to Team Experience. The Experience numbers roll up so that the numerators and denominators are additive (a sketch follows below). For example:
- If ADTT Team A has 15 members, the phase target for the Sub-Event is 1, and each member has completed Sub-Event FSO 1092 one time, ADTT Team A's Experience displays as 15/15.
- If ADTT Team A has 15 members, the phase target is 1, and 14 members have completed the Sub-Event one time, Experience displays as 14/15.
- If ADTT Team A has 15 members, the phase target is 2, and all 15 have completed the Sub-Event one time, Experience displays as 15/30.

The numerators and denominators continue to add as the Team rolls up to Team Type, Sub-Event, and Department, so a large department will have very large numerators and denominators. The numbers are also affected by how many Teams the ship has built: if a ship builds extra teams, the denominators will be correspondingly bigger.
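A minimal sketch of the additive roll-up, reproducing the three examples above; the helper name is illustrative:

```python
# Numerator: each member's completions, capped at the phase target.
# Denominator: members x phase target. The fractions stay additive as
# teams roll up to Team Type, Sub-Event, and Department.
def team_experience(member_completions: list[int], phase_target: int):
    num = sum(min(c, phase_target) for c in member_completions)
    den = len(member_completions) * phase_target
    return num, den

print(team_experience([1] * 15, phase_target=1))        # (15, 15)
print(team_experience([1] * 14 + [0], phase_target=1))  # (14, 15)
print(team_experience([1] * 15, phase_target=2))        # (15, 30)
```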

Experience Analysis - Internal: Individual Experience

Individual Experience is displayed in a separate window when you click on the team name. The numerator is the individual's current E level and the denominator is the phase target from the CV TRAMAN.

Future Unit Performance (Pf) - Business Rules

- Pf score ranges: 90-100% reports as 100; 80-89% reports as 90; below 80% reports as the actual score (see the sketch below).
- Actual scores are used for Battle 'E' computations; the range score is used in the DRRS-N calculation for the T-Pillar.
- Currently all Pf are defaulted to 100% in DRRS-N.
- The broader performance targets provide grading buffers and reduce 'bad-day' impact.

Way ahead: the banded scores will go to DRRS-N in place of today's 100% default. This will not be implemented for several months, after the switch to the notional team methodology for E.
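The banding rule as stated above, in a small sketch (the function name is illustrative):

```python
def drrs_n_pf(actual_score: float) -> float:
    """Band a unit's assessed Performance score for DRRS-N reporting.
    Battle 'E' computations keep the raw score."""
    if actual_score >= 90:
        return 100
    if actual_score >= 80:
        return 90
    return actual_score  # below 80%: report the actual score

assert drrs_n_pf(93) == 100 and drrs_n_pf(84) == 90 and drrs_n_pf(72) == 72
```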

Performance Assessment

- An external assessor (ATG) enters unit Sub-Event assessment scores (%) into CV-SHARP, which feeds DRRS-N.
- The 'unit' Performance score is at the unit level, not tied to a particular individual or team in CV-SHARP: the unit is assessed on a team's performance in the completion of a Sub-Event, and it is at the unit's discretion which team(s) is used for the assessment.
- The entry page is designed to let the assessor enter assessment scores per Sub-Event; the assessor will have specific log-on credentials with limited administrative access/rights.
- CV-SHARP maintains a historical record of the score, the user entering the data, and any comments.

Performance Assessment - Summary Page
- Allows users to view the assessment scores for Sub-Events.
- Filterable depending on user needs.

CV-SHARP – DRRS-N Alignment: PESTO Pillars – Training Drill Down

CV-SHARP – DRRS-N Alignment: PESTO Pillars – Training Drill Down Tabs
- Performance Factor tab
- Experience Factor tab

Takeaway: Accurate Logging
- Ensure all Team Types required for each Sub-Event are logging the training.
- The previous method of using only the Primary Team Type is being phased out; accurate logging now will provide better E levels when the new calculation method is implemented.

Customer Support

CNAF T-Pillar Leads:
- CV-SHARP Program Manager: LCDR Sean "Butterbean" Blackman, sean.blackman@navy.mil, 619-545-0791
- LCDR Sean Lennon, Sean.lennon@navy.mil, 619-545-1547

Contractor Support Leads:
- CNAL Customer Support: Maxalex Fequiere, MFequiere@innovasi.com, (757) 444-0419
- CNAP Customer Support: Ed Joiner, Edward.i.joiner.ctr@navy.mil (.smil.mil), (619) 545-1549

Other customer support email:
- CNAF_CVSHARPSUPPORT@navy.mil
- CVSHARPsupport@innovasi.com
- CNAF_CVSHARPSUPPORT@navy.smil.mil