
AMERICAN METEOROLOGICAL SOCIETY 1 Harvey Stern 22nd Conference on Interactive Information and Processing Systems, Atlanta, 30 January, 2006. Combining human and computer generated forecasts using a knowledge based system.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 2 Background Over recent years, the present author has been involved in the development of a knowledge based weather forecasting system. Various components of the system may be used to automatically generate worded weather forecasts for the general public, terminal aerodrome forecasts (TAFs) for aviation interests, and marine forecasts for the boating fraternity. The knowledge based system generates these products by using a range of forecasting aids to interpret NWP model output in terms of a range of weather parameters.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 3 February to May 2005 Trial The author conducted a 100-day trial (Feb 14, 2005 to May 24, 2005) of the performance of the knowledge based system, with twice-daily forecasts being generated out to seven days in advance.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 4 Graphical Forecasts Generated

American Meteorological Society, 30 January, 2006 {Harvey Stern} 5 Performance During the trial, the overall percentage variance of observed weather explained by the forecasts so generated (the system's forecasts) was 43.24% compared with 42.31% for the official forecasts.
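A minimal sketch of how a "percentage variance of observed weather explained" score can be computed, assuming it is the squared Pearson correlation between forecast and observed values expressed as a percentage; the slides do not spell out the exact formulation, and the temperature values below are invented for illustration only.

```python
# Sketch only: assumes "percentage variance explained" is the squared Pearson
# correlation (r^2) between forecasts and observations, expressed as a percentage.
import numpy as np

def percent_variance_explained(forecast, observed):
    """Return 100 * r^2 for the given forecast and observed series."""
    r = np.corrcoef(forecast, observed)[0, 1]
    return 100.0 * r ** 2

# Invented maximum-temperature values (deg C), for illustration only.
official = np.array([24.0, 27.5, 31.0, 22.0, 19.5, 25.0])
observed = np.array([25.5, 26.0, 33.0, 21.0, 18.0, 26.5])

print(f"Variance explained by official forecasts: "
      f"{percent_variance_explained(official, observed):.2f}%")
```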

American Meteorological Society, 30 January, 2006 {Harvey Stern} 6 Some Definitions 'Domain knowledge': 'Knowledge practitioners gain through experience as part of their jobs.' 'Contextual knowledge': 'Knowledge one develops by working in a particular environment.' 'The quality of domain knowledge is affected by the forecaster's ability to derive the appropriate meaning from the contextual (or environmental) information' (1). (1) Webby et al. (2001)

American Meteorological Society, 30 January, 2006 {Harvey Stern} 7 Correlation Between System Forecasts and Official Forecasts During the trial, the overall percentage variance of official forecasts explained by the system's forecasts was only 45.91% (that is, the system's forecasts were not highly correlated with the official forecasts). This indicates that there are significant aspects of the processes employed in deriving the official forecasts that are not taken into account by the system's forecasts (in all likelihood 'domain knowledge').

American Meteorological Society, 30 January, 2006 {Harvey Stern} 8 Combining Forecasts Regarding the two sets of forecasts as partially independent and utilising linear regression to optimally combine the forecasts, it was demonstrated that a substantial lift in the overall percentage variance of observed weather explained was possible. Indeed, the overall percentage variance of observed weather explained was lifted (by the combined forecasts) to 50.21% from 42.31% (official), a lift of 7.90%.
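A minimal sketch of the combining step described above, assuming the observed value is regressed on the official and system forecasts by ordinary least squares with an intercept; the regression form and the toy data are assumptions for illustration, not the actual trial data or coefficients.

```python
# Sketch only: combine two partially independent forecast sets by regressing
# observations on both forecasts (ordinary least squares with an intercept).
import numpy as np

def fit_combination(official, system, observed):
    """Fit observed ~ a*official + b*system + c and return (a, b, c)."""
    X = np.column_stack([official, system, np.ones_like(official)])
    (a, b, c), *_ = np.linalg.lstsq(X, observed, rcond=None)
    return a, b, c

# Invented verification data (deg C), for illustration only.
official = np.array([24.0, 27.5, 31.0, 22.0, 19.5, 25.0])
system   = np.array([23.0, 28.5, 32.5, 21.5, 20.0, 24.0])
observed = np.array([25.5, 26.0, 33.0, 21.0, 18.0, 26.5])

a, b, c = fit_combination(official, system, observed)
combined = a * official + b * system + c

# Variance explained by the combined forecasts (squared correlation, as a percentage).
r = np.corrcoef(combined, observed)[0, 1]
print(f"Variance explained by combined forecasts: {100 * r ** 2:.2f}%")
```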

American Meteorological Society, 30 January, 2006 {Harvey Stern} 9 Why the Accuracy Increases The accuracy increases because 'Combining is most effective when the forecasts combined are not correlated and bring different kinds of information to the forecasting process' (1); and because, although 'both (human) intuitive and (computer) analytic processes can be unreliable … different kinds of errors will produce that unreliability' (2). (1) Sanders and Ritzman (2001) (2) Stewart (2001)

American Meteorological Society, 30 January, 2006 {Harvey Stern} 10 Potential What these data suggested was that adopting a strategy of combining predictions has the potential to deliver a set of forecasts that explain as much as 7.90% more variance than that explained by forecasts currently issued officially.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 11 Mechanically Integrating Judgemental and Statistical Forecasts 'Consider mechanically integrating judgmental and statistical forecasts instead of making judgmental adjustments to statistical forecasts… Judgmental adjustment (by humans) of (automatically generated statistical forecasts) is actually the least effective way to combine statistical and judgmental forecasts … (because) judgmental adjustment can introduce bias (1)… (1) Mathews and Diamantopoulos (1986)

American Meteorological Society, 30 January, 2006 {Harvey Stern} 12 Human Judgement as an Input The most effective way to use (human) judgment is as an input to the statistical process… Clemen (1989) reviewed over 200 empirical studies on combining and found that mechanical combining helps eliminate biases and enables full disclosure of the forecasting process. The resulting record keeping, feedback, and enhanced learning can improve forecast quality' (1). (1) Sanders and Ritzman (2001)

American Meteorological Society, 30 January, 2006 {Harvey Stern} 13 The Strategy The strategy is: to take judgmental (human) forecasts (derived with the benefit of knowledge of all available computer generated forecast guidance); and to input these forecasts into a system that incorporates a statistical process to mechanically combine the judgmental (human) forecasts and the computer generated forecast guidance; thereby immediately yielding a new set of forecasts.
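A minimal sketch of this strategy in operation, assuming regression weights have already been fitted on past verification data (as in the earlier combining sketch); the weights and forecast values below are placeholders, not those used by the actual system.

```python
# Sketch only: mechanically combine a judgmental (human) forecast with the
# computer-generated guidance using previously fitted regression weights.
# All numbers are placeholders, not the system's actual coefficients.

judgmental = 26.0          # human forecast of tomorrow's maximum (deg C)
guidance = 24.5            # knowledge based system / NWP-derived guidance (deg C)

a, b, c = 0.6, 0.4, 0.2    # hypothetical weights from a past regression fit

combined = a * judgmental + b * guidance + c
print(f"Combined forecast: {combined:.1f} deg C")
```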

American Meteorological Society, 30 January, 2006 {Harvey Stern} 14 Modification of System The knowledge based system was modified so that it now automatically integrates judgmental (human) forecasts and the computer generated guidance, thereby incorporating the forecasters' valuable contextual knowledge into the process. It has been demonstrated (1) that judgmental forecasts based on contextual knowledge were significantly more accurate than those based on technical knowledge (and) … were even superior to (a) … statistical model. (1) Sanders and Ritzman (1992)

American Meteorological Society, 30 January, 2006 {Harvey Stern} 15 A New Trial The modified system underwent a new 'real-time' trial. This 100-day trial of the performance of the modified system, conducted with a fresh set of data (Aug 20, 2005 to November 27, 2005), involved daily forecasts being generated out to seven days in advance.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 16 The Purpose In this context, the purpose of the work was: 1. To evaluate the new set of forecasts; and, 2. To document the increase in accuracy achieved by that new set of forecasts over that of the judgmental (human) forecasts.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 17 Percentage Variance of Official Forecasts Explained Evaluation of the forecasts prepared during the new trial showed that the overall percentage variance of official forecasts explained by the system's forecasts was now lifted to 77.17% (from 45.91% previously), demonstrating that, in most circumstances, the combining strategy leaves the system's forecasts almost identical to the official forecasts.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 18 Percentage Variance of Observed Weather Explained Furthermore, the overall percentage variance of observed weather explained was now lifted by the system to 40.15% from 32.43% (official) – a rise of 7.72%, which was nearly as great as the 7.90% lift suggested previously by the present author's combined forecasts. This demonstrates that, in those few circumstances when the combining strategy substantially changes the official forecasts, the system's forecasts usually represent an improvement on the official forecasts.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 19 "Domain knowledge" now taken into account These results indicate that, on a day-to-day basis, 'domain knowledge' is now taken into account by the system.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 20 Concluding Remarks The author's February to May 2005 trial suggested that adopting a strategy of combining predictions has the potential to deliver a set of forecasts that explain about 7.90% more variance than that explained by forecasts currently issued officially. Forecast verification data from the new real-time trial, conducted on the knowledge based system modified in order to take into account forecasters' 'domain knowledge', demonstrated that a substantial increase in accuracy (an extra 7.72% variance explained) was, indeed, achievable.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 21 Why the success? In most circumstances, the combining strategy leaves the system's forecasts almost identical to the official forecasts; whilst, in those few circumstances when the combining strategy substantially changes the official forecasts, the system's forecasts usually represent an improvement on the official forecasts.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 22 The Future Role of Human Forecasts There is increasing interest in the question of what might be the appropriate future role for the human in the forecast process (1). The results presented here suggest that the future role of human forecasts may be as an input to a system that mechanically combines human predictions with computer generated forecasts. (1) Stewart (2005)

American Meteorological Society, 30 January, 2006 {Harvey Stern} 23 Acknowledgements To my Bureau of Meteorology colleagues for their helpful discussions and feedback, especially to Neville Nicholls and Frank Woodcock, for their comments on combining forecasts.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 24 Postscript The results of the August to November 2005 trial were analysed, and a number of minor modifications were made to the system. The trial was then continued. After 145 days the lift in percentage variance explained was little changed at 7.77%.

American Meteorological Society, 30 January, 2006 {Harvey Stern} 25 The End Thank You