1
Defense Readiness Reporting System – Army Senior Leader Presentation
Readiness is the Army’s ability to win the nation’s wars and execute the National Military Strategy. -- Our path to the future -- As of 12 May 2006
2
Purpose
To inform the Army senior leadership on the implementation of the Defense Readiness Reporting System (DRRS).

Good morning. In this briefing I will provide the "5 Ws" of OSD's Defense Readiness Reporting System and the Army's plan to implement it.
3
What is DRRS? The Defense Readiness Reporting System:
Focuses on unit readiness primarily by assessing unit capability against Mission Essential Tasks (Y/Q/N) in addition to standard resource metrics (P/S/R/T).
Requires Services to provide OSD access to authoritative databases (personnel, equipment, training, medical, real property, etc.).
Allows planners at all levels to bridge the "knowledge gap" between readiness status, war plans and COAs.
OSD, the COCOMs, the JS and the Services can access current unit-level information to facilitate sourcing, deployment and operational employment decisions.
Answers the "ready for what?" questions posed by OSD, the COCOMs and the JS.

DRRS is the new OSD readiness reporting system that will replace the current SORTS system and change our current USR system in two ways. First, we will use new metrics to assess the ability to execute designated missions. Army commanders will address Mission Essential Task assessments in addition to the traditional USR measurements of personnel and equipment. DRRS measures unit capabilities with three metrics: "Y," my unit is fully qualified to execute the task; "Q," my unit is a qualified yes to execute the task; and "N," my unit is not qualified to execute the task, similar to our current T/P/U task standards. Second, unit readiness reports will link directly to authoritative databases for training, equipment and personnel. This will provide an unprecedented level of data visibility to DRRS users throughout DoD, including OSD, the Joint Staff and the COCOMs. The aim of the new system is to assess the capability of our units to execute their METLs and answer the "ready for what?" questions.

NOTE: Status of Resources and Training System (SORTS): the Chairman's readiness reporting guidance to all the Services. It dictates most of the readiness measurements in the USR today.
4
OSD Main Objectives of DRRS
What is DRRS? OSD's main objectives for DRRS:

Expand the way we measure force status
- Focus on forces and missions by assessing unit capability against Mission Essential Tasks (Y/Q/N)
- Continue the standard resource metrics (P/S/R/T)
- Include Joint and operational units and CSAs

Refocus how we think of readiness
- Beyond narrow resource accounting
- Synonymous with capability: what can forces do?
- Assessed primarily in the context of assigned missions

Improve our ability to assess risk
- Uses collaborative tools linked to near-real-time data
- Operational context: what can't forces do?
- Links deficiencies to operational impact
- Builds in the identification of mitigation strategies
- Treats readiness reporting, risk assessment and adaptive planning as one large, iterative process
- Answers the "ready for what?" question

DRRS provides:
- Rapid scenario-based readiness assessments
- Planning and course-of-action options
- Higher confidence in the quality of data and assessment results

DRRS supports: Adaptive Planning; Joint Quarterly Readiness Assessments; the Global Force Management process; reports to Congress; DRRS tools to manage Joint forces; identifying and managing force usability.
5
Why Are We Implementing DRRS?
We have been directed to implement DRRS. OSD has mandated that all Services move toward a single readiness reporting system, the Defense Readiness Reporting System (DRRS):

- June 2002 DoD Directive establishes DRRS: measure and report the readiness of military forces and supporting infrastructure to meet missions and goals assigned by the Secretary of Defense, using a MET construct.
- December 2004 Strategic Planning Guidance: "The Service Secretaries, the Directors of Combat Support Agencies, and the Joint Community will report mission readiness and force management data in DRRS according to Mission Essential Tasks (METs)..."

What will DRRS do for the Army? OSD directed DRRS implementation in the June 2002 DoD Directive establishing DRRS and in the December 2004 Strategic Planning Guidance. Dr. Chu, the Under Secretary of Defense for Personnel and Readiness, is overseeing DRRS implementation across the entire Department of Defense. He has already distributed three serial guidance memoranda with specific instructions on DRRS implementation, and we anticipate more guidance in the future.

The key point is that DRRS will improve the Army's readiness reporting by retiring our legacy system, simplifying reporting procedures, and improving the accuracy of unit readiness reporting, allowing more detailed and more precise assessments. By moving away from a legacy system that uses 1970s technology, we will improve the accuracy of the data we use to manage the Army. And, unlike the sluggish tools we have now, the new system can be rapidly adjusted to meet new or emerging readiness requirements. It also allows us to break the "SORTS anchor" by retiring our legacy system and simplifying reporting procedures.
6
How the Army uses the Readiness System Today
ASORTS is the authoritative database of record for operational Army organizations:
- Units report through ASORTS to meet Congressional and CJCS GSORTS requirements.
- ASORTS supports the TUCHA process to develop unit deployment data.
- Units must be verified from ASORTS to document force structure.
- ASORTS is used to conduct deployment analysis.
- Units must be verified from ASORTS to get a DODAAC.
- Units must be verified from ASORTS to get soldiers.
- An Army system within ASORTS is used for readiness analysis.

Connected systems: GSORTS, MDIS/DAMPS, LOGSA, HRC, TAADS, JOPES, ARMS.

Over 60% of ASORTS consists of Army-unique data needed for Army management systems (LOGSA, ITAPDB, SAMAS, BIDE/ABIDE, JOPES, MOBODEE, USR), with users including DAMO-FM/MACOMs, AMC/G4, HRC, FORSCOM, the Joint community and the field.
7
DRRS-A Implementation
We are currently executing a four-phase plan to bring the Army in line with OSD DRRS requirements. The plan gives us the roadmap to replace our legacy ASORTS system by October 2006 with a program called DRRS-Army. Our current focus is to complete three major tasks by October 2006:

1. Complete DRRS-A software development and field it to the Army.
2. Complete all individual DRRS-A training and conduct a Proof of Principle test.
3. When fielding and testing are complete, retire the current USR system.

We will continue to refine readiness reporting requirements until October 2007, when we expect to field the next major version of DRRS-A.
8
DRRS-A Implementation Is a Four-Phase Plan
Phases 1 and 2 (COMPLETE): In Phase I, selected Army Commands and Corps began reporting METs directly into OSD DRRS. In Phase II, Army units began reporting METs via our legacy ASORTS/GSORTS (Feb 06).

Current effort (complete NLT Oct 06): Army units will report readiness using DRRS-A. All Army unit readiness reporters will train on the new system and have access to the new web-based DRRS-A input tool. Select installations will begin reporting into DRRS-A.

Future effort (complete NLT Oct 07): All Army Commands and ASCCs will report into DRRS-A. Significantly more installations and TDA units will begin reporting into DRRS-A.

NOTE: The Oct 07 version is expected to improve reporting efficiency and have more installations and TDA units reporting into DRRS-A.
NOTE: Army Status of Resources and Training System (ASORTS): the Army readiness reporting guidance. It includes SORTS plus Army requirements; over 60% of ASORTS is Army-unique.
NOTE: ASORTS drives many Army management processes. Examples include cutting personnel orders and determining DODAACs to move equipment.
9
DRRS-A NETUSR: What Army Users Can Expect
Replacing the legacy system with a user-friendly application.

This screen shot from our current development effort shows what readiness reports will look like under DRRS-A. Commanders and their staffs will immediately see that DRRS-Army replaces the painful stubby-pencil drill with a semi-automated reports process. The new system will cut in half the time a commander spends preparing the current USR.

There will be no change to the current reporting levels: our "AA" and selected derivative UIC units will continue to report readiness. There have been concerns from the field about reporting down to platoon, detachment or even individual level. The Army will not require that level of reporting, and we will not let OSD levy an unreasonable requirement on us.

Finally, the way the chain of command reviews readiness reports will not change. The ARSTAF will not use any unit readiness report until the chain of command has approved it. Regardless of the changes you see with the new system, DRRS-Army will always remain a commander's report. Anything we automate in this tool will only assist the commander, not replace him.

NOTE: AA units include battalion and separate company-type units.
[Screen shots: current version used for USR reports; future version used for DRRS-A reports.]
10
Who Reports Readiness
DoDD requires Service Secretaries to:
- Identify and include as measured units within ESORTS the operational and support organizations within the scope of their responsibilities needed to execute mission essential tasks in support of Combatant Commanders and Service-assigned missions.
- Develop resource and training standards for all organizations designated for inclusion in ESORTS.
- Collect and report metrics and supporting data for these organizations as specified by DoD implementation instructions.

No change: The Army retains current reporting requirements.
- Operational units: MTOE units with AA UICs; MTOE units with FF UICs and major headquarters; TDA units that already submit reports.
- Training base: RC Training Divisions, Brigades, Battalions.
- APS.
- Derivative UICs (DUICs) already reporting, or when directed to report. [Note: OSD DRRS Serial 2.0 Guidance requires reports from all DUICs. Under current CJCSI and Army policy, the Services determine when DUICs report; currently HQDA and Army Commands/ASCCs/DRUs with ADCON determine if and when DUICs report readiness.]

New requirement: The Army adds reporting requirements per DRRS serial guidance.
- Institutional Army: select installations start October 2006 (focus on power generation/power projection platforms).
- Other TDA units: phased implementation pending refinement of requirements with Army Commands, ASCCs and DRUs.
11
What Readiness is Reported
DoDD directs Service Secretaries to:
- Develop resource and training standards for all organizations designated for inclusion in ESORTS.
- Collect and report metrics and supporting data for these organizations as specified by DoD implementation instructions.

OSD DRRS Serial Guidance 1.0 directs Services to report Core Tasks, Major Plans, and Named Operations MET/METL assessments IAW Y/Q/N criteria. CJCSI GSORTS requirements are not rescinded.

Recommend the Army add MET assessments to the current AR 220-1:
- Army units develop and report readiness against a METL based on the appropriate TOE mission statement and MTP for the unit SRC (applies through all phases of ARFORGEN).
- Army units with a directed mission (DEF, ILO) develop and report readiness based on the directed METs.
- An Army brigade FF report includes all assigned or OPCON units.
- An Army division FF report includes organic units (STB, CPs).
- Army division and higher headquarters develop and report readiness for Major Plans and Named Operations.
12
What Capabilities Are Measured in DRRS-A?
"Core Tasks": Army units develop and report readiness against core METs based on the appropriate TOE and MTP for the type unit. Core tasks become the basis for assessment by units in ARFORGEN. The unit commander considers MTOE resources in assessing the ability to execute the unit's core mission.

"Named Operations" (OIF/OEF/ONE): Army units sourced for a named operation, or provided a non-standard or "in lieu of" mission, are considered to have a directed mission. They develop a METL and report readiness based on that directed mission. The unit commander considers mission-required resources in assessing the ability to execute the directed mission.

"Major Plans" (e.g., Korea peninsula war plans): Army two-star headquarters and theater strategic commands develop and report METs for major plans when directed by HQDA/ASCC.

The major change to readiness reporting is the requirement to report Mission Essential Task assessments. Under DRRS-A, all reporting commanders will assess their unit's capability to execute its METs based on its core mission (the MTP relating to the unit). If a unit has been identified to support a named operation (OIF, for example), the commander's MET assessment will reflect the unit's ability to perform its assigned mission in that operation, whether "full spectrum operations" or "security force" missions. A third MET requirement is to report readiness against major plans. Requirements for OPLAN assessments will be determined by the Joint Staff. We will only require this assessment of Army headquarters with sufficient planners, i.e., two-star headquarters and above.

NOTE: Major plans will be assessed by organizations with dedicated planners; these organizations have the capability to do an accurate and proper assessment.

METs are assessed on the capability of the unit to "fight tonight."
13
How Are Unit Capabilities Measured in DRRS-A?
Army moves from C-level to Y/Q/N.

"Yes" (Y) assessment: The organization can accomplish the task to standard under specified conditions. A "Yes" assessment should reflect demonstrated performance in training or operations whenever possible. The unit possesses the necessary resources, or those resources have been explicitly identified to the unit, to allow it to execute tonight.

"Qualified Yes" (Q) assessment: The organization is expected to accomplish the task to standard, but this performance has not been observed or demonstrated in training or operations. Organizations assessing their task or mission as a "Qualified Yes" can be employed for those tasks. The unit possesses the necessary resources, or those resources have been explicitly identified to the unit, to allow it to execute tonight.

"No" (N) assessment: The organization is unable to accomplish the task to standard at this time.

These are the detailed definitions that will be used to assess unit METs using the new OSD metrics of "Yes," "Qualified Yes," or "No." The blue text highlights the Army additions to the OSD Y/Q/N definitions, which require commanders to consider the availability of resources in making the MET assessment. These new reporting metrics will be addressed in a "rapid action" revision of AR 220-1. We are currently writing the draft version and will send out the first staffing by June 2006. A key note: under the OSD guidance, assessments of "Yes" or "Qualified Yes" mean the unit is considered deployable for those tasks assessed.

NOTE: We assessed the OSD definitions for Y/Q/N as too vague and subjective. We wrote a new version and staffed it to the field. The VCSA approved the new definitions on 6 April 06 at the Army Campaign Plan DP 64 ARFORGEN Readiness Metrics briefing.

In DRRS, "Yes" or "Qualified Yes" indicates the unit is considered deployable and available for COCOM use.
14
Army’s Overall Y/Q/N Assessment
Core mission assessment: the commander considers MTOE resources in assessing the ability to execute the unit's core mission.
Directed mission assessment: the commander considers mission-required resources in assessing the ability to execute the directed mission.

Need to set thresholds for Y/Q/N in AR 220-1:
- If a MET is rated "Untrained," the commander cannot assess a "Yes" or "Qualified Yes" against that task.
- If any METs are "No," the commander cannot assess an overall "Yes" or "Qualified Yes."
- If a MET is rated "Needs Practice" and the unit lacks resources, the commander can assess the task as "Qualified Yes" only if risks can be mitigated; otherwise, the task is assessed as "No."
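The task-level threshold rules above can be sketched as simple decision logic. This is a hypothetical illustration only: the function name, the T/P/U strings, and the argument shapes are assumptions for the sketch, not any DRRS-A interface.

```python
# Hypothetical sketch of the Army task-level Y/Q/N threshold rules.
# T = Trained, P = Needs Practice, U = Untrained.

def assess_task(training: str, has_resources: bool, risks_mitigated: bool = False) -> str:
    """Roll one MET's Army training rating (T/P/U) into a Y/Q/N assessment."""
    if training == "U":
        # An Untrained MET can never be assessed Yes or Qualified Yes.
        return "N"
    if not has_resources:
        # Lacking resources: Qualified Yes only if risks can be mitigated.
        return "Q" if risks_mitigated else "N"
    if training == "T":
        # Trained with resources on hand: ready to execute tonight.
        return "Y"
    # Needs Practice with resources: expected to accomplish the task to
    # standard, but performance not yet demonstrated.
    return "Q"
```

Note that the slide does not spell out every branch (for example, a Trained unit that lacks resources); treating that case like any other resource shortfall is an assumption of this sketch.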
15
How Hard Is "Y"?

OSD DRRS Serial 2.0 Guidance requires only more Y than Q/N tasks:
- If the majority of the command-level METs are assessed as "Yes," and the remaining METs are assessed as "Qualified Yes," then the overall assessment should be "Yes."
- If the majority of the command-level METs are assessed as "Qualified Yes" and the remaining METs are assessed as "Yes," then the overall mission assessment should be "Qualified Yes."
- If any of the tasks are assessed as "No" (Red), then the commander must make a judgment as to whether the mission objectives can still be accomplished. Any "No" task would normally preclude an overall mission assessment of "Yes." If the overall mission is rated other than "No," the commander should clearly explain how the plan will be accomplished despite the inability to accomplish the MET, and any mitigation actions that will be taken.

This OSD guidance is inconsistent with Army training doctrine. Recommend the Army set a high threshold for Y: Y means ready for war now, and any MET rated N precludes a unit overall assessment of Y.
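The contrast between the OSD majority rule and the stricter Army recommendation can be sketched side by side. The function names are illustrative, and two branches are assumptions of the sketch: the "any No" case under OSD guidance requires a commander's judgment (modeled here as Qualified Yes at best), and the Army's exact Y/Q boundary is left to AR 220-1 (modeled here as "all METs must be Yes").

```python
# Contrasting overall-assessment roll-up rules (illustrative sketch only).
from collections import Counter

def osd_overall(mets: list[str]) -> str:
    """OSD DRRS Serial 2.0 reading: a majority of Yes METs (remainder
    Qualified Yes) gives an overall Yes; any No normally precludes an
    overall Yes and requires the commander's judgment and mitigation."""
    tally = Counter(mets)
    if tally["N"] > 0:
        return "Q"  # at best, pending the commander's justification
    return "Y" if tally["Y"] > tally["Q"] else "Q"

def army_overall(mets: list[str]) -> str:
    """Recommended Army threshold: Y means ready for war now, so any MET
    rated N precludes an overall Y; here only all-Yes METs yield a Y."""
    if "N" in mets:
        return "N"
    return "Y" if all(m == "Y" for m in mets) else "Q"
```

For example, a unit with METs Y/Y/Q rolls up to an overall "Yes" under the OSD majority rule but only a "Qualified Yes" under the strict Army reading.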
16
DRRS MET Assessments: DRRS Metrics vs. Army Training Metrics
DRRS METRICS
- "Yes" (Y) (Green): The organization can accomplish the task to standard under specified conditions. A "Yes" assessment should reflect demonstrated performance in training or operations whenever possible.
- "Qualified Yes" (Q) (Amber): The organization is expected to accomplish the task to standard, but this performance has not been observed or demonstrated in training or operations. Organizations assessing their task or mission as a "Qualified Yes" can be employed for those tasks.
- "No" (N) (Red): The organization is unable to accomplish the task to standard at this time.

DRRS metrics are assessed as "go to war now" with current resources (personnel, equipment, training).

ARMY TRAINING METRICS
- Trained (T) (Green): The unit is trained and has demonstrated proficiency in accomplishing the task to Army standard. Task performance is judged to be free of significant shortcomings.
- Needs Practice (P) (Amber): The unit can perform the task with some shortcomings. Performance is not achieved to standard without some difficulty, or the unit has failed to perform some tasks to standard. Shortcomings are not severe enough to require retraining.
- Untrained (U) (Red): The unit cannot demonstrate an ability to achieve wartime training proficiency. The leader prepares a comprehensive plan to train all supporting tasks not executed to standard.

Army training metrics are assessed primarily against training proficiency and may not consider resource constraints.
17
DRRS-A Task Assessment Metrics
[Flow chart, reconstructed:]
- Is the task assessed by the unit commander as TRAINED, with sufficient resources (i.e., personnel and equipment) required to successfully accomplish the task available? If so, the task is assessed "Y."
- Is the task assessed by the unit commander as NEEDS PRACTICE, or are sufficient resources lacking? The task may be assessed "QUALIFIED" (a reason code is required; comments are optional).
- Otherwise, the task is assessed "N."
18
DRRS-A “A Commander’s Report”
Like the current USR readiness system, DRRS-A will remain a "commander's report":
- Data is validated by the chain of command and quality-controlled by Army institutional processes.
- Installation and additional TDA unit readiness reports will be added over time.

DRRS-A provides enhanced features:
- Users are directly linked to the respective Army authoritative databases.
- A user-friendly web-based input tool eases report submission.
19
When is Readiness Reported
20
How will the Army Implement DRRS-A?
The DRRS-A fielding timeline to meet the October 2006 suspense is very ambitious, but we are on track. Challenges include new-equipment training for over 5,000 reporting units and fully testing the DRRS-A process before going live in October.

Key events:
- 10 Feb 06: ARWG kickoff meeting
- 22 Feb 06: Army Times interview
- 28 Feb 06: ARWG draft business rules review / training update
- 7 Mar 06: ARWG business rules review #2 / training update
- Mar-Nov 06: DRRS-A publicity ICW PAO
- 24 Mar 06: OPSDEP Tank (OSD DRRS)
- 7 Apr 06 (T): Brief ACP DP 64 (ARFORGEN measuring model)
- Apr-Jun 06: Command information
- NLT 17 May 06: Publish HQDA message for DRRS-A implementation
- Jun-Sep 06: MTTs (unit level)
- Aug-Oct 06: DRRS-A field testing
- 21 Aug 06: Army Readiness Conference
- Sep 06 and beyond: Web-based training
- Oct 06: ASORTS retired and DRRS-A implemented

Starting at the end of June, our MTTs will visit designated installations to conduct user training. We will follow the MTTs with web-based distributed training, including self-paced courses and train-the-trainer packets. Starting in August, we will conduct Proof of Principle testing. My intent is to include a number of unique units to ensure we have this process right before locking it down at the October implementation date. We are already soliciting your commands for units to assist our effort. I'll discuss specifics on this in just a minute.
21
DRRS-A Full Dress Rehearsal Proof of Principle
BASELINE UNITS (identify all units NLT Jun 06; prepare the FDR EXORD and execute the FDR twice, in Aug and Sep 06):

FT DRUM: WEPGAA 0710 CS BN BDE SPT BN INF; WGBRAA 0087 IN RGT 02 BN INF BCT; WAKLAA 0022 IN RGT 02 BN INF BCT; WD82AA 0010 CS BN BDE SPT BN INF; WJJ4AA 0071 AR RGT 03 SQDN RSTA; WA29AA 0025 FA RGT 04 BN 105T INF UA; WD8YAA 0010 IN BDE 01 HHC 1BCT 10 MTN; WJJQAA 0010 IN BDE 03 HHC 3BCT 10 MTN; WAKKAA 0032 IN RGT 01 BN INF BCT

FT STEWART: WAQJFF 0003 IN DIV INF MECH DIV; WAQ1AA 0069 AR RGT 03 BN MANEUVER; WAQGAA 0007 IN BN 02 MANEUVER BN; WAP9AA 0007 AR SQ 03 ARMORED; WAQPAA 0003 IN BDE 02 HHC HVY 2BDE; WAQEAA 0015 IN BN 03 MANEUVER BN UA; WAQZAA 0041 FA BN SP HEAVY UA; WAQNAA 0003 IN BDE 01 HHC HVY 1BDE; WAX1AA 0003 AV BN 03 AVIATION REGT; WA2SAA 0009 FA BN SP HEAVY UA; WJAUAA 0026 CS BN BDE SPT BN; WDMPAA 0007 AR SQD 05 ARMORED; WJJJAA 0003 AR BN 01 BDE STB HVY; WJATAA 0003 CS BN BDE SPT BN HVY; WAZDAA 0064 AR BN 01 MANEUVER BN; WJJ6AA 0003 AR HHC STB; WAQRAA 0003 FA BDE HHB DIVARTY; WDX9AA 0003 AV BN 02 GENERAL; WJJKAA 0003 AR BN 02 BDE TRP BN UA2

TN GUARD: WXC6AA 1174 TC CO MDM TRUCK POL; W7L6AA W7L6 45 CIVIL SPT TM TNARNG; WTR9AA 0771 OD CO MAINT NONDIV DS; WPD9AA 0230 CS HHC HHC AREA SPT GP; WVD9AA 0278 AR RGT HHT 278 CAV RGT

CA GUARD: WV7TAA 0000 AV HHC DIV AVN BDE; WTJKAA 0838 MP CO CBT SPT; WQRXAA 0140 AV RGT 01 BN GEN SPT; WV76AA 0140 AV CO CO G

USARC: WSP6AA 0367 PI DET MOBILE; WQ11AA 0212 CS CO QM SUPPLY CO; WSJ1AA 0300 MP HHC PW CMD; WR17AA 0091 JA TM LSO; WSPQAA 0209 QM CO SUPPLY DS; WSUEAA 0753 QM CO WATER SUPPLY

Proof of Principle testing will ensure we can effectively replace our legacy ASORTS system. This August and September we will test the entire DRRS-A system, from the user inputting data to sending it up the chain of command until it reaches HQDA. We cannot field this system if any of those steps fail. We will start with over 100 units from all three components in the Proof of Principle test. These are the same units that recently conducted Proof of Principle testing in February for the current ASORTS reports; they have recent testing experience and represent a good sample of Army-unique and specialized requirements.

For the first time, 18 installations will also report into DRRS-A. Installations are a new reporting requirement from OSD and will be included in the DRRS-Army Proof of Principle.

NOTE: The 18 (power projection platform) installations include Forts Stewart, Campbell, McCoy, Benning, Eustis, Drum, Dix, Bragg, Hood, Lewis, Carson, Bliss, Sill, Riley and Polk, and Camps Shelby, Atterbury and Roberts.

Participating MACOMs and components: ARNG, USAR, ARCENT, USAREUR, FORSCOM, HQDA, CCSA, OSD DRRS.

All 18 installations will also take part in the Proof of Principle Full Dress Rehearsal.
22
Training Strategy
The overall training concept will occur in two phases:

Phase I includes a DRRS-A information campaign starting in March 2006, with briefs posted on the Army readiness website, publication in professional periodicals, and interviews on the Army news channels. An Army Readiness Conference will be conducted in late August 2006.

Phase II includes executive-level leadership briefings to Army Commands and their primary staffs, with DRRS-A demonstrations via online computer conference and/or on-site presentations beginning in May 2006. The DRRS-A Mobile Training Team (MTT) will conduct distance-learning training for battalion and separate company-level USR officers beginning in July 2006. The GCCS-A PM will develop self-paced (web-enabled) courses and train-the-trainer packets.

One of the primary concerns is the ability to train everyone who will put their hands on DRRS-A data. This includes the Army leadership, who may not be inputting data but must understand the new metrics and what they mean to OSD, the Joint Staff and the COCOMs. Because of the limited time to train the soldiers who will actually enter data into DRRS-A, we are using a combination of training methods. The GCCS-A PM will train users via distance-learning sites, mobile training teams, train-the-trainer packets and web-based courses. We will closely track the individual training requirements through the Army Commands and components to ensure we achieve our objective of 100% trained.

Training is not restricted to soldiers at the unit level. We are also reaching into the schoolhouses and using Army publications to inform the widest possible audience, both up and down the chain of command. My staff is working in a variety of venues and has updated both the Pre-Command and Chemical Basic Course POIs; both courses are prepared to start teaching DRRS-Army now.
23
Where We Are Today

- The GCCS-A PM is overseeing two major contractor-supported efforts in the software development of DRRS-A. The PM is also putting together the training packets and POIs for the MTTs, distance-learning sites, and train-the-trainer/self-paced courses.
- We are preparing the staffing of a "rapid action" revision of AR 220-1 that will include the DRRS-A changes.
- We are preparing the Proof of Principle message for the August 2006 test date.

Our current status: under the GCCS-Army PM we are making great progress in the technical development of DRRS-A. I am confident that we will be ready to begin the August 2006 Proof of Principle testing, given the progress the PM has demonstrated so far. The PM is also conducting a full-court press to finish the training material needed to prepare the field to use DRRS-Army. My staff is on track to publish the new AR 220-1 providing written regulatory guidance for implementing DRRS-A by the end of August 2006. Finally, we will contact the Army Commands and components to coordinate the publication of a message outlining the details for the Proof of Principle testing that begins in August 2006.
24
NETUSR DEMONSTRATION
25
BACKUPS
26
OSD DRRS Mission Assessment Y/Q/N
Commanders/agency directors will also assess the ability of the organization to execute the mission essential task list (METL) to meet mission objectives.

- If the majority of the command-level METs are assessed as "Yes," and the remaining METs are assessed as "Qualified Yes," then the overall assessment should be "Yes."
- If the majority of the command-level METs are assessed as "Qualified Yes" and the remaining METs are assessed as "Yes," then the overall mission assessment should be "Qualified Yes."
- If any of the tasks are assessed as "No" (Red), then the commander must make a judgment as to whether the mission objectives can still be accomplished. Any "No" task would normally preclude an overall mission assessment of "Yes." If the overall mission is rated other than "No," the commander should clearly explain how the plan will be accomplished despite the inability to accomplish the MET, and any mitigation actions that will be taken.

Source: OSD DRRS Serial 2.0 Guidance, page 2 (2005)