CV-SHARP Brief 1 January 2018.


1 CV-SHARP Brief 1 January 2018

2 Agenda - CV-SHARP Review

THIS IS AN INFORMATIONAL BRIEF

- Overview: an overview of the program and why it's here
- Program Foundations: program growth and development
- Data Flow: program and policy interaction
- Methodology: the nitty-gritty of how it looks and feels
- Q & A

3 Overview - What is CV-SHARP?
CV-SHARP Afloat:
- SIPRNET web application
- Captures completions of Sub-Events dictated in the CVN-TRAMAN
- No Reactor requirements; no equipment
- Displays level of training, 'Experience' and 'Performance', vs. the TRAMAN
- Reports to CV-SHARP Ashore and the DRRS-N T-Pillar

The TRAMAN T&R Matrix is policy; CV-SHARP captures the execution of that policy.

4 Program Foundations

T&R Matrix - Sub-Events per phase standards:
- Each Sub-Event has phase targets and periodicities (Learn / Maintain / Degrade)
- Team-based with individual crediting
- E (Experience) consists of 'reps and sets'; it degrades over time and with crew turnover (individual impacts)
- P (Performance) is externally assessed with a periodicity (unit-based), enabled through TACs (TYCOM-controlled)
- FXP format abandoned (it lacked NMETL conditions and standards)
- Compendium of CNAF best standards and practices
- "Crawl, walk, run" approach at the ship's discretion
- CVN leadership oversight of the crew's training readiness and 'depth on the bench'

Notes:
- The T&R Matrix is similar to a Squadron T&R Matrix on steroids: Exp is how often; Perf is how well, as determined by an external agency (similar to a squadron Unit Eval).
- The program started as a points-based matrix, where Sailors earned points while standing watches, and has evolved into a Sub-Event-completion-based matrix, where Sailors earn credit for participating as team members completing a Sub-Event.
- It is PRMAR-based: the PRMAR score (Experience) is based on the accumulation of individual Exp credit rolling up to that PRMAR per the system logic/relationships, with standard curves derived from TRAMAN-based standards.
- Since all personnel are accounted for, leadership has a tool that indicates not only that the crew can perform a capability/mission, but also how long it can be sustained.

5 CVN TRAMAN - Appendix 1: Sustainment & Periodicity

EXPERIENCE: Experience builds from phase to phase (and event to event). To grow from an Experience level of 1 to 9 (as required during TSTA), the event must be scheduled eight times within the Learning interval (14 days) for each watch, and then maintained by repeating every 60 days (the Maintenance interval).

[Slide table: "FRTP Requirements for Experience Reporting (Exp) and Performance Assessment (Perf)" - columns for Sub-Event, Title, E/P requirements by phase (Maint Phase: In Port, Crew Prep; Basic Phase: FDC, CART, TSTA, FEP; Integrated: C2X, JTFX), Exp Level, Exp Period (Learn / Maint / Deg), Perf Expiration (days), Wt, and Dept. Example row: CCC 2013, "Direct and Manage Comms - Information Assurance", Exp levels building from 1 to 9 and then 11 / 13 / 15 across phases, with periodicity values of 14 / 60 / 360 days, Dept CBS.]

CV-SHARP documents Experience and Performance.
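The build-and-maintain cycle above can be sketched in code. This is a minimal illustration assuming a simple build/reset model: the 60-day Maintenance interval and the TSTA target of level 9 come from the slide, but the exact CV-SHARP build and degrade curves are not reproduced here.

```python
# Hypothetical sketch of the Learn/Maintain periodicity logic; the real
# CV-SHARP calculation engine may differ.
from datetime import date, timedelta

MAINT_DAYS = 60   # repeat within this interval or the level lapses (per slide)
TARGET_E = 9      # phase target during TSTA, per the example row

def experience_level(completions, today):
    """Illustrative E level from a chronological list of completion dates."""
    level = 0
    prev = None
    for curr in completions:
        if prev is not None and (curr - prev).days > MAINT_DAYS:
            level = 0                     # assumed: a lapse resets the level
        level = min(level + 1, TARGET_E)  # each repetition builds toward target
        prev = curr
    if prev is not None and (today - prev).days > MAINT_DAYS:
        level = 0                         # stale since the last completion
    return level
```

Under this sketch, repeating the event on a cadence inside the 60-day window builds the level to 9 and holds it there, while letting the event go stale drops it back, matching the narrative above in spirit.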

6 Sub-Event Logging - Bridge-Officers, MOB-N 1301

7 CV-SHARP Functions - FOCUS: TRAINING AT TEAM LEVEL

Internal - comprehensive T&R management tool:
- Captures training completed at the Team level, IAW the T&R Matrix in the TRAMAN
- Individuals receive 'Experience' credit as part of the Watch Team
- Team 'Experience' rolls up to Sub-Event/Department
- Viewable in Afloat via the Department Experience Report
- Internal management and decision-making tool: "who needs to do what" IAW TRAMAN requirements

External - DRRS-N Training Pillar:
- Populates the DRRS-N "T Pillar"
- Team Experience x Sub-Event Performance (Tfom = Ef x Pf) rolls up to NTA/PRMAR
- Fleet-wide readiness comparison

Notes:
- Internal: TRAMAN is policy; CV-SHARP captures execution of that policy. CV-SHARP was developed as a tool that captures training readiness down to the individual Sailor (similar to SHARP); an individual receives Exp when part of the Watch Team completing the training Sub-Event. This allows better decisions about the ship's operational readiness and compliance with TRAMAN requirements: all watch teams are involved, the accomplishments of the entire crew are recorded, and depth and sustainability are captured.
- External: the ship's readiness is based on the training of all teams and therefore translates to unit-level training readiness. The current method is a statistical sampling: the E level reported to DRRS-N is based on the E level of the Primary Team Type, and each Sub-Event has only one Primary Team Type. The future method will use all data: the individuals with experience in a particular Sub-Event are pooled to make notional teams (the notional team calculator will be implemented in the future, spring/summer 2014). Notional team Experience will be used in the DRRS-N calculation, and the externally assessed Pf is used for each Sub-Event.

8 CVN Training Data Flow

CV-SHARP (Afloat) - internal T&R management:
- Captures teams' completion of Sub-Events with visibility down to the individual Sailor
- Sends "raw" Sub-Event completion data for ALL teams, plus assessment scores, to CV-SHARP Ashore

CV-SHARP Ashore (hosted in the Aviation Data Warehouse):
- Captures and stores completion of all Sub-Events; T&R Matrix residency; CV-SHARP data storage
- Houses the calculation engine that generates Ef and Pf, performs NMETL/NTA mapping, and sends results to NTIMS
- No normalizations applied

NTIMS:
- Receives the pre-calculated Ef and Pf and combines them (Ef x Pf = Tfom) for display in the proper DRRS-N format
- MET/PRMAR population by Tfom

Notes:
- CV-SHARP Afloat is the unit's T&R management tool; it is still evolving as we learn fleet needs and requirements.
- The legacy T&R Matrix started as a system to log what the watch teams were doing on a daily basis: capturing daily watch duties (e.g., conning the ship, conducting flight ops) rather than true capabilities (e.g., maneuvering to avoid a mine, or a precision anchorage).
- We now capture the completion of Sub-Events, not watches, concentrating on the CORE information that goes to DRRS-N so the data correlates with DRRS-N.

9 Experience Hierarchy

- Individuals receive Experience as part of a Team; individual Experience is earned when the Team logs Sub-Events.
- Teams complete and log Sub-Events; cumulative Team Experience rolls up to the Sub-Event.
- Cumulative Sub-Event Experience rolls up to NTA/PRMAR Experience.
- DRRS-N T-Pillar: notional team Experience is combined with Sub-Event Performance (Ef x Pf), then mapped to NTAs (METL) for the unit-level readiness depiction.

Notes:
- The system hierarchy goes from the individual, to teams, to Sub-Events, and then through METL mapping to NTA and PRMAR for DRRS-N reporting.
- A watch team completing a Sub-Event for Experience is the foundation of the methodology: individuals cannot complete training Sub-Events alone; it takes the team. Exp credit is given to the individuals that make up the team.
- Cumulative Team Type Exp rolls up to either the Department (Department Experience Report) or to NTA and PRMAR (DRRS-N), IAW system relationships.
- The current method is a statistical sampling: the E level reported to DRRS-N is based on the E level of the Primary Team Type (each Sub-Event has only one Primary Team Type). There is no current tie between individual readiness and the DRRS-N T-Pillar; that connection will be established in the future method described below.
- The future method will use all data: the individuals with experience in a particular Sub-Event are pooled to make notional teams (the notional team calculator will be implemented in the future, summer/fall 2014). Notional team Experience will be used in the DRRS-N calculation, and the externally assessed Pf is used for each Sub-Event.

10 Readiness Calculation - Current Methodology
Statistical sampling concept:
- Each Sub-Event has a Primary Team Type. Example: AAW 1001 - Primary Team Type: Mission Planning.
- CV-SHARP assumes that if the Primary Team Type did the Sub-Event, all the other Team Types also completed it.
- CV-SHARP then uses that Sub-Event completion date, ICW the Learn/Maintain/Degrade periodicities, to calculate the Sub-Event E.
- Does not account for the required minimum number of teams. Example: Damage Control usually requires 10 teams for Experience.

Performance:
- All P scores are defaulted to 100% in DRRS-N and CV-SHARP.

Notes on not accounting for the required minimum number of teams:
- Any time a Primary Team completes the Sub-Event, that date is used ICW the Learn/Maintain/Degrade matrix to update the ship's E. For example, a ship is required to have 16 DCRS/BDS Stretcher Bearer teams; if only one of the 16 logs the Sub-Event (according to the Learn/Maintain/Degrade periodicities), CV-SHARP will show the ship with the required E level.
- Upon switching to the notional team method (the future methodology), all 16 DCRS/BDS Stretcher Bearer teams will need to log the Sub-Event within the Learn interval in order to increase the ship's Experience level by 1.
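The contrast between the two crediting rules can be shown in a few lines. This is an illustrative sketch only: the current rule credits the ship on any Primary Team completion, while the future rule requires every required team to log the event. The team count of 16 comes from the Stretcher Bearer example above; the function names are hypothetical.

```python
# Hypothetical comparison of the current and future crediting rules.
REQUIRED_TEAMS = 16  # e.g., DCRS/BDS Stretcher Bearer teams, per the slide

def ship_credit_current(teams_logged):
    """Current method: one Primary Team completion updates the ship's E."""
    return len(teams_logged) >= 1

def ship_credit_future(teams_logged):
    """Future notional-team method: all required teams must log the event."""
    return len(teams_logged) >= REQUIRED_TEAMS
```

With a single team logged, the current rule credits the whole ship while the future rule does not, which is exactly the gap the takeaway slide warns about.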

11 Future Notional Team Methodology - Delayed Implementation

[Slide graphic: an "Actual Watch Bill" table (3 teams required, 4 teams manned, with OOD/JOOD/JOOW Experience percentages per team) next to the resulting three notional teams, built from the pooled Experience percentages of all designated watch standers.]

NOTE: In the Department Experience report, individual E displays as "3/6" rather than "50%". The numerator is the individual's E level and the denominator is the phase-target E level; percentages are used here only to simplify the slide.

PROs:
- Reports against the requirements of directives, e.g., the NAVDORM.
- Displays the true capability the ship could produce at that point in time, if needed.
- Reduces fluctuation from personnel changes: edits to the actual watch bill would cause the numbers to change, but using notional teams eliminates or reduces that impact.

CONs:
- Data streams are harder to follow (low traceability to an actual team's performance; CV-SHARP will not match DRRS-N) because the system generates the teams. Mitigated by using the CV-SHARP Department Experience Report to show actual team training.

Methodology (similar to the SHARP methodology of assembling notional teams):
- The highest-experienced individuals are pooled to determine the most experienced Essential Teams, the teams most likely to be used in a wartime scenario.
- This is the building block for calculating the Sub-Event E level.
- Selects the highest E from the whole crew, not just current watch-team members.
- Optimizes the lowest notional team score.
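The pooling idea above can be sketched as follows. This is a hedged illustration, assuming the system sorts the whole pool of qualified watch standers for each role by Experience and deals the best into the required number of teams; the role names and E values in the usage note are hypothetical, and a team's score is assumed here to be its weakest member's E.

```python
# Hypothetical sketch of notional team assembly; the real CV-SHARP
# pooling rules live in the Ashore calculation engine.
def build_notional_teams(pools, n_teams):
    """pools: {role: [E scores]} -> n_teams notional teams of {role: E}."""
    teams = [{} for _ in range(n_teams)]
    for role, scores in pools.items():
        best = sorted(scores, reverse=True)[:n_teams]  # highest E first
        for team, score in zip(teams, best):
            team[role] = score
    return teams

def team_score(team):
    """Assumed: a notional team is only as strong as its weakest member."""
    return min(team.values())
```

For example, with hypothetical pools OOD [100, 20, 75, 70], JOOD [80, 60, 50], and JOOW [90, 95, 72] and 3 required teams, the best individuals are dealt into teams (100/80/95, 75/60/90, 70/50/72) and the lowest notional team score is 50.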

12 Future Notional Team Methodology

MOB-N 1301 "Plan Navigation" is conducted by 5 Team Types (required teams in parentheses):
- Bridge-Navigation (3)
- Bridge-Officers (3)
- NAV Brief (1)
- Piloting (1)
- Sea & Anchor-Nav (1)

CV-SHARP will calculate notional teams for each of the 5 Team Types conducting the Sub-Event. It will take the lowest notional team score for each of the 5 Team Types and average those 5 scores to come up with the Sub-Event E level.

13 Future Notional Team Methodology

Notional Team Scores (lowest notional team per Team Type):
- Bridge-Navigation: 78
- Bridge-Officers: 90
- NAV Brief: 82
- Piloting: 95
- Sea & Anchor-Navigation: 91

Averaging the worst notional Team Type scores gives a Sub-Event E of 87.2 for MOB-N 1301, which is sent to DRRS-N as the Sub-Event E level.
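The averaging step above, verified numerically: the lowest notional team score from each Team Type conducting the Sub-Event is averaged to produce the Sub-Event E level. The five scores are those shown on the slide.

```python
# The MOB-N 1301 example from the slide: one worst-team score per Team Type.
worst_team_type_scores = {
    "Bridge-Navigation": 78,
    "Bridge-Officers": 90,
    "NAV Brief": 82,
    "Piloting": 95,
    "Sea & Anchor-Navigation": 91,
}
# Unweighted average across the participating Team Types.
sub_event_e = sum(worst_team_type_scores.values()) / len(worst_team_type_scores)
# sub_event_e is 87.2, the value sent to DRRS-N for MOB-N 1301
```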

14 Future Notional Team Methodology - Average of Lowest Notional Team Type Scores

[Slide graphic: MOB-N 1301 "Plan Navigation" data flowing from the CV-SHARP Afloat watch teams (Bridge-Navigation, Bridge-Officers, Piloting, NAV Brief, and Sea & Anchor-Navigation rosters with individual E scores) through CV-SHARP Ashore to the DRRS-N MET/PRMAR display, where NTA "Conduct Navigation" shows a 77% Tfom and the MOB-N capability shows 85%.]

The example uses MOB-N 1301:
- 3 required (Essential) teams for Team Types Bridge-Officers and Bridge-Navigation; 1 required (Essential) team each for Team Types NAV Brief, Piloting, and Sea & Anchor-Navigation.
- CV-SHARP calculates only the required number of notional teams. If the ship has 4 Bridge-Officers teams on the watch bill and constructed in Team Builder, only 3 notional Bridge-Officers teams will be built.
- CV-SHARP uses the notional teams for each Team Type to calculate a Sub-Event E level: it averages the worst notional team E level from each participating Team Type. Here, the worst notional Bridge-Navigation, Bridge-Officers, NAV Brief, Piloting, and Sea & Anchor-Navigation teams are averaged. This is the Sub-Event E level sent to DRRS-N.

CV-SHARP Ashore:
- The Sub-Event E is averaged with the lowest notional team E scores of all other Sub-Events mapped to the same NTA in the NMETL; after averaging all the Sub-Events mapped to it, the NTA gets 84% E in the example.
- Unit Perf scores (from the Performance entry page) are sent Ashore (100% in the example) and averaged with the other Sub-Event Perf scores mapped to that NTA; after averaging, the NTA gets 92% P.

DRRS-N:
- The NTA scores, in the form of an Ef and a Pf (84% and 92% in the example), are sent to DRRS-N and multiplied to determine the Tfom for that NTA (77% in the example).
- That Tfom is then averaged with the other NTAs mapped to the capability area to determine the capability's Tfom (85% for MOB-N in the example).

CV-SHARP Afloat:
- Exp: individual credit received in that role as a member of a team completing the event.
- Team Exp: visible in the Department Experience Report, down to the individual level.
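The DRRS-N roll-up above reduces to two small formulas, sketched here with equal-weight averaging assumed at each level. The 84% Ef and 92% Pf for NTA "Conduct Navigation" come from the slide; the sibling NTA value used in the test is hypothetical.

```python
# Hypothetical sketch of the NTA-level roll-up described on the slide.
def tfom(ef, pf):
    """Training figure of merit for one NTA: Ef x Pf."""
    return ef * pf

def capability_tfom(nta_tfoms):
    """Assumed: an unweighted average of the NTAs mapped to a capability."""
    return sum(nta_tfoms) / len(nta_tfoms)

nta = tfom(0.84, 0.92)  # 0.7728, displayed as the 77% Tfom on the slide
```

Averaging that Tfom with the other NTAs mapped to the capability area then yields the capability-level score (85% for MOB-N in the slide's example).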

15 T Score Summary Page
- This slide shows the overall concept. The actual report is broken into two parts: the first drills down from Capability to Sub-Event level; the second drills down from Sub-Event to individual level.
- Only available for ships where replication is working correctly.
- Provides data on the ship's progress vs. the maximum possible T score for that phase.
- Provides a preview of the ship's DRRS-N scores under the notional team methodology.

16 Experience Analysis - Internal
Department Experience:
The Department Experience Report shows the Sub-Events supported by each department. The experience numbers roll up so that the numerator and denominator are additive. For example, for Sub-Event FSO 1092:
- If ADTT Team A has 15 members, the phase target is 1, and each member has completed the Sub-Event once, Team A's Experience displays as 15/15.
- If 14 of the 15 members have completed it once, Team A's Experience displays as 14/15.
- If the phase target is 2 and all 15 members have completed it once, Team A's Experience displays as 15/30.
The numerators and denominators continue to add as the Team rolls up to Team Type, Sub-Event, and Department, so a large department will have very large numerators and denominators. The numbers are also affected by the number of teams the ship has built: if a ship builds extra teams, the denominators will be correspondingly larger.
(Report levels shown: Team Experience, Team Type Experience, Sub-Event Experience.)
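The additive roll-up in the examples above can be written as a minimal sketch: each member contributes completions (capped at the phase target) to the numerator and the phase target itself to the denominator, and fractions add member-by-member as teams roll up. The function names are hypothetical.

```python
# Hypothetical sketch of the additive numerator/denominator roll-up.
def team_experience(member_completions, phase_target):
    """Return (numerator, denominator) for one team."""
    num = sum(min(done, phase_target) for done in member_completions)
    den = phase_target * len(member_completions)
    return num, den

def roll_up(fractions):
    """Add numerators and denominators across teams or team types."""
    return (sum(n for n, _ in fractions), sum(d for _, d in fractions))
```

This reproduces the slide's three cases (15/15, 14/15, and 15/30) and shows why extra teams inflate the denominator: every member added to a built team adds the phase target to the denominator whether or not they have trained.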

17 Experience Analysis - Internal
Individual Experience is displayed in a separate window when you click on a team name. The numerator is the individual's current E level; the denominator is the phase target from the CV TRAMAN.

18 Future Unit Performance (Pf) - Business Rules
Pf score ranges: 90-100% = 100 Pf; 80-89% = 90; <80% = actual score.
- Actual scores are used for Battle 'E' computations.
- The range score is used in the DRRS-N calculation for the T-Pillar.
- Currently all Pf are defaulted to 100% in DRRS-N.

Broader performance targets:
- Provide grading buffers.
- Reduce 'bad-day' impact.

Reporting to DRRS-N:
- T scores are currently defaulted to 100% in DRRS-N reporting.
- Way ahead: 90-100% will go to DRRS-N as 100, 80-89% as 90, and <80% as the actual score.
- Will not be implemented for several months, after the switch to the notional team methodology for E.
- Actual P scores are used for Battle 'E'.
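The banding rules above map directly to a small function. This follows the rules as stated on the slide (90-100% reports as 100, 80-89% as 90, below 80% as the actual score); the function name is hypothetical.

```python
# The future Pf banding rules from the slide. Unbanded (actual) scores
# are still used for Battle 'E' computations.
def drrs_n_pf(actual_score):
    """Map an ATG assessment score (percent) to the banded DRRS-N Pf."""
    if actual_score >= 90:
        return 100
    if actual_score >= 80:
        return 90
    return actual_score
```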

19 Performance Assessment
An external assessor (ATG) enters unit Sub-Event assessment scores (%) into CV-SHARP, which feeds DRRS-N.
- The 'Unit' Performance score is at the unit level, not tied to a particular individual or team in CV-SHARP. The unit is assessed on a team's performance in the completion of a Sub-Event; it is at the unit's discretion which team(s) are used for the assessment.
- The page is designed to allow the assessor to enter assessment scores per Sub-Event. Assessors will have specific log-on credentials with limited administrative access/rights.
- CV-SHARP maintains a historical record of the scores, the user entering the data, and any comments.

Reporting to DRRS-N:
- T scores are currently defaulted to 100% in DRRS-N reporting.
- Way ahead: 90-100% will go to DRRS-N as 100, 80-89% as 90, and <80% as the actual score. This will not be implemented until summer 2014, after the switch to the notional team methodology for E.
- Actual P scores are used for Battle 'E'.

20 Performance Assessment - Summary Page
Allows users to view the assessment scores for Sub-Events; filterable depending on user needs.

21 CV-SHARP – DRRS-N Alignment
PESTO Pillars – Training Drill Down

22 CV-SHARP – DRRS-N Alignment
PESTO Pillars – Training Drill Down Tabs Performance Factor Tab Experience Factor Tab

23 Takeaway - Accurate Logging

Ensure all Team Types required for each Sub-Event are logging the training. The previous method of using only the Primary Team Type is being phased out; accurate logging now will provide better E levels when the new calculation method is implemented.

24 Other Customer Support

CNAF T Pillar Leads / CV-SHARP Program Managers:
- LCDR Sean "Butterbean" Blackman
- LCDR Sean Lennon

Contractor Support Leads:
- CNAL Customer Support: Maxalex Fequiere (757)
- CNAP Customer Support: Ed Joiner (.smil.mil) (619)

