Slide 1: Improving Confidence in the Assessment of System Performance in Differing Scenarios. T D Clayton, Cardinal Consultants. 19 ISMOR, Aug 2002.
Slide 2: Contents
1. Context
2. Scenario Dependency of Input Data
3. Choosing Scenarios to Assess
4. Modelling Widely Differing Scenarios
5. Example Study
6. Summary and Conclusions
Slide 3: SYSTEM EFFECTIVENESS ASSESSMENT
Slide 4: SYSTEM EFFECTIVENESS ASSESSMENT
Warhead Lethality

Slide 5: SYSTEM EFFECTIVENESS ASSESSMENT
Warhead Lethality; Combat modelling
Slide 6: SYSTEM EFFECTIVENESS ASSESSMENT
Warhead / Fuze Performance; Sensor Performance; Operator Performance; Guidance System; Other subsystems; Combat modelling; Wargaming; Tactical / Strategic studies
Slide 7: Purpose of System Effectiveness Studies
- Research / long term development objectives
- Medium term procurement objectives
- Design optimisation
- Procurement decisions
- Input to Operational / Tactical Studies
Slide 8: But, whatever the purpose, scenario assumptions are critical; or rather, we should assume they are unless proven otherwise.
Slide 9: Rule 1. Everything is scenario dependent.
Slides 10-11: (repeats of the SYSTEM EFFECTIVENESS ASSESSMENT diagram from slide 6, highlighting first the top-level elements and then all elements)
Slide 12: P_k = 0.47
Slide 13:
- Nature of the ground around the target
- Presence of adjacent trees, or protective earthworks
- Azimuth distribution
- Elevation distribution
- Relative value of M-kill, F-kill, P-kill, K-kill
- Likelihood of multiple hits
- Using an MFK value as a probability?
Slide 14: The Multi-Disciplinary Problem
Lethality Expert / Systems Modeller / Combat Modeller
Slide 15: The Management Solution
Establish roles and responsibilities for managing the interfaces between expert groups.
Slide 16: Responsibilities of the Interface Manager
- Understand methodologies and assumptions at all levels
- Organise training / briefings to help expert groups widen their knowledge
- Conduct studies to measure scenario dependencies of results
- Maintain a knowledge base of dependencies and "corrections"
- Involvement in planning of studies, addressing assumptions
- Involvement in reporting of studies, especially assumptions
Slide 17: Framework diagram. Studies 1, 2 and 3 feed a MAIN DATABASE OF STUDY RESULTS; comparison and analysis with 'offline' analysis tools populates a DATABASE OF SCENARIO COMPENSATION FACTORS, which supports study planning and analysis and provides data to other studies.
Slide 18: Feedback loop. SCFs calculated from new studies in the MAIN DATABASE OF STUDY RESULTS are assessed and compared, and modified SCFs are fed back into the DATABASE OF SCENARIO COMPENSATION FACTORS.
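The presentation does not define how a scenario compensation factor is computed, but as a minimal sketch it might be the ratio of a system's measured effectiveness in each scenario to its effectiveness in a reference scenario. The scenario names and effectiveness values below are illustrative assumptions, not study data.

```python
# Illustrative study results: effectiveness of one system in several
# scenarios (hypothetical numbers and scenario names).
results = {"reference": 0.50, "desert": 0.35, "urban": 0.20}

def scenario_compensation_factors(results, baseline="reference"):
    """Express each scenario's result as a ratio to the baseline scenario.

    This ratio form is an assumption for illustration; a real SCF database
    might store more structured corrections.
    """
    base = results[baseline]
    return {scenario: value / base for scenario, value in results.items()}

scfs = scenario_compensation_factors(results)
print(scfs)  # {'reference': 1.0, 'desert': 0.7, 'urban': 0.4}
```

Factors computed this way from new studies could then be compared against stored values, which is the feedback loop the slide describes.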
Slide 19: Rule 2. You will never assess the right scenarios.
Slide 20: Scenario Parameters
- Climate: temperature, precipitation
- Ground: vegetation, topology, roads
- Geography: geographic isolation and politics, neighbouring countries, local civilian population
- Opposing maximum capability: nuclear, chemical, biological; short range / long range
- Opposing troops: numbers, capability
- Opposing ground equipment: technology, numbers, own intelligence
- Posture and deployment: posture (defensive, attacking), deployment and detectability
- Air capability: aircraft types, level of technology, numbers, own intelligence
- Anti-air capability: numbers of units, capability, own intelligence
- Maritime: maritime involvement, capability
- BLUE ROLE: peace keeping, combat (defensive), combat (hunt and kill)
Slide 21: (histogram comparing Scenario 1, Scenario 2 and Scenario 3)
Slide 22: (graph of results against a continuous scenario parameter)
Slide 23: Rule 3. A combat model cannot address widely differing scenarios.
Slide 24: Example Study
Comparative assessment of two potential candidates for a cannon system for light armoured vehicles.
Slide 25: (image only; no text transcribed)
Slide 26: ORIGINAL STUDY PLAN
Input data → Engagement Model (developed for this study) → Combat model (existing), run across 3 scenarios.
Slide 27: REVIEW OF PROVIDED DATA
1. When multiple hits are likely, SSKP may not be appropriate.
2. Lethality figures give no azimuth dependency.
3. No information on range dependency.
4. Data required for a wider range of target types.
Lethality models were re-run, in concert with the Engagement model.
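The first point, that SSKP may not be appropriate when multiple hits are likely, can be sketched as follows: if hits are treated as independent with the same single-shot kill probability, the kill probability for a burst of n hits is 1 - (1 - SSKP)^n, which quickly exceeds the single-shot figure. Independence between hits is a simplifying assumption for illustration; real damage mechanisms on the same target are rarely independent.

```python
def burst_kill_probability(sskp: float, hits: int) -> float:
    """Kill probability for `hits` independent hits, each with kill
    probability `sskp`.

    The independence assumption is a simplification; correlated damage
    would need a fuller lethality model of the kind the study re-ran.
    """
    return 1.0 - (1.0 - sskp) ** hits

# With an assumed SSKP of 0.3, three hits raise the burst kill
# probability to 1 - 0.7**3 = 0.657.
print(round(burst_kill_probability(0.3, 3), 3))  # 0.657
```

This is why quoting a bare SSKP understates burst weapons: the appropriate figure depends on the expected number of hits per engagement.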
Slide 28: REVIEW OF EXISTING COMBAT MODEL
1. Tends to choose tanks as the preferred target type.
2. All targets are land vehicles.
3. Terrain in all 3 scenarios tends to give long engagement ranges.
4. No variations in met-vis or day/night, hence long ranges.
5. Same Blue positions for both System A and System B.
6. Units are static when firing.
Slide 29: (model results; image only)
Slide 30: THE ALTERNATIVE APPROACH
1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios:
- relative frequencies of target types engaged
- engagement range distributions
- azimuth distributions
- probability of kill per burst, as a function of range and target type
Slide 31: THE ALTERNATIVE APPROACH (continued)
2. Develop a simple tool to calculate specific Measures of Effectiveness from the input data and distributions:
- MoE 1: Military Worth of kills per burst
- MoE 2: Military Worth of kills per ammunition load
Slide 32: THE ALTERNATIVE APPROACH (continued)
Properties of the simple tool:
- quick to develop
- quick to run
- facilitates review and scrutiny of data
- stores data and maintains audit trails
Slide 33: THE ALTERNATIVE APPROACH (continued)
The tool also permits results to be adjusted by Military Judgement to account for factors not addressed by the calculations:
- the value of the ability to fire on the move
- the value of the greater manoeuvrability afforded by the lighter system
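The core calculation of the simple tool described above can be sketched as a weighted sum over target types: MoE 1 combines the relative engagement frequencies, the per-burst kill probabilities, and the military worth of each target type, and MoE 2 scales MoE 1 by the number of bursts in one ammunition load. All target types, distributions and numbers below are illustrative assumptions, not data from the study.

```python
# Illustrative intermediate data (assumed, not from the study).
# Relative frequency of engaging each target type (sums to 1.0).
target_freq = {"APC": 0.5, "truck": 0.3, "bunker": 0.2}
# Assumed military worth of a kill on each target type.
worth = {"APC": 1.0, "truck": 0.4, "bunker": 0.8}
# Assumed probability of kill per burst, already averaged over the
# engagement range distribution for each target type.
pk_per_burst = {"APC": 0.4, "truck": 0.6, "bunker": 0.2}

# MoE 1: expected military worth of kills per burst.
moe1 = sum(target_freq[t] * pk_per_burst[t] * worth[t] for t in target_freq)

# MoE 2: expected military worth of kills per ammunition load, assuming
# (hypothetically) 12 bursts per load and ignoring target depletion.
bursts_per_load = 12
moe2 = moe1 * bursts_per_load

print(round(moe1, 3), round(moe2, 3))  # 0.304 3.648
```

Because every intermediate quantity is an explicit table rather than an output buried in a combat model, each input is open to scrutiny and can be varied directly in sensitivity studies, which is the advantage the slides claim for this approach.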
Slides 34-39: (tool data and results screens; images only, no text transcribed)
Slide 40: SUMMARY AND CONCLUSIONS
Appropriate methods of addressing scenario dependencies are essential to ensure study conclusions are valid.
1. ALL DATA should be regarded as scenario-dependent. It is very useful to have an analyst in every team with special responsibility for addressing this problem.
2. Using combat models to compare the performance of systems can be hazardous. Consider using a range of methods to generate intermediate results which are open to scrutiny and to sensitivity studies.
Slide 41: (no text transcribed)
Slide 42: Slide map: Title; Contents; Study levels; Study purpose; Rule 1; Highlight top-level; Highlight all; TarDes pic; Lethality depends; Multi-Disciplinary; Management Soln; Responsibilities; Framework; Feedback; Rule 2; Scenario Pars; Histogram; Graph; Rule 3; Example study; Data; Original plan; Data review; Model review; Model results; Alternative approach; Data screen 1; Results screen; Conclusions; Further Dev't; Current issues.
Slide 43: Further Development of the CST Tool
1. Development of a proper library of routines
2. Improved statistical routines for increased speed
3. Automated methods for parametric studies
4. Use of EDMS technologies to manage and access study reports
Slide 44: CURRENT ISSUES / PROBLEMS WITH CST-01
1. It is not clear how best to address the problem of firing multiple bursts at a target, depending upon whether it is perceived to be killed.
2. It is not clear whether (and how) costs (or numbers of units) should be included, or handled separately.