An Independent Evaluation of the OK-FIRST Decision Support System
Thomas E. James and Paula O. Long, Institute for Public Affairs / Department of Political Science, University of Oklahoma
Mark Shafer, Oklahoma Climatological Survey, University of Oklahoma

OK-FIRST
Customized, county-level environmental information
Target group: emergency managers
Training (computer workshop, data interpretation workshop)
3 classes (June 1997, October 1997, March 1998)
Refresher courses with Focus Groups

Independent Evaluator
Unbiased and objective perspective
A collaborator, NOT a monitor
Combines strengths:
– Institute for Public Affairs: methodological expertise
– OK-FIRST staff: substantive, context-specific knowledge
Ensured collection of information useful to the program
Continuous feedback to improve training methods and data collection techniques

EVALUATION
“The systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.” (Michael Quinn Patton, 1997)

EVALUATION GOALS
(1) Determine the extent to which OK-FIRST provided access and training to enhance participants’ abilities
(2) Document the use of skills and impacts
(3) Identify lessons learned

EVALUATION DESIGN
Background Information
– self-administered questionnaires
– job experience
– sources of information
– familiarity with computers

EVALUATION DESIGN
Workshops
– pre-test / post-test
– computer workshop: observations of performed tasks coupled with written identification
– data interpretation workshop: identification, definitions, application of concepts (written)

EVALUATION DESIGN
Self-Assessment
– questionnaire administered at the beginning of the data interpretation workshop, covering computer skills
– questionnaire at refresher workshops on computer skills and use of information
– group discussions focusing on insights and experiences

COMPUTER TRAINING
Statistically significant increase in skills
– 57% average pre-test score; 79% post-test
Reduced range of post-test scores (those initially at the bottom benefited most)
After receiving training, almost all participants proficient enough to access data
Despite significant increases in some areas, still room for improvement
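The slides report these pre-test/post-test gains as statistically significant but do not name the test used. As a minimal sketch of one common way to check such a paired comparison, assuming hypothetical per-participant percentage scores (the actual OK-FIRST evaluation data are not reproduced here), a paired t-test in Python could look like:

```python
# Illustrative sketch only: the scores below are hypothetical,
# not the actual OK-FIRST evaluation data.
from scipy import stats

pre_scores  = [45, 60, 52, 70, 58, 48, 66, 55]   # hypothetical pre-test % correct
post_scores = [72, 81, 75, 88, 80, 70, 85, 79]   # hypothetical post-test % correct

# Paired t-test: did the same participants score significantly higher after training?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"mean gain: {mean_gain:.1f} points, t = {t_stat:.2f}, p = {p_value:.4f}")

# A narrower post-test range suggests that participants who started at the
# bottom benefited most, as the slide notes.
print("pre-test range: ", max(pre_scores) - min(pre_scores))
print("post-test range:", max(post_scores) - min(post_scores))
```

A paired test is the natural choice here because each participant contributes both a pre-test and a post-test score.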

DATA INTERPRETATION
Significant increase in knowledge
– 44% average pre-test score; 62% post-test
Reduced range in post-test scores
Assessing pre-test and post-test knowledge by cluster allows targeted training
Feedback on performance from the first class allowed OK-FIRST staff to adjust training, resulting in improved performance in later classes
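The slide describes grouping test knowledge into clusters so that training can be targeted. As a rough sketch of that idea, assuming hypothetical topic clusters and item-level scores (the actual OK-FIRST clusters are not shown in the presentation), grouping pre/post gains by cluster might look like:

```python
# Illustrative sketch only: cluster names and item scores are hypothetical,
# not the clusters actually used in the OK-FIRST data interpretation test.
from collections import defaultdict

# (cluster, pre-test % correct, post-test % correct) for individual test items
items = [
    ("radar products",    40, 65),
    ("radar products",    35, 60),
    ("mesonet data",      55, 70),
    ("mesonet data",      50, 62),
    ("forecast concepts", 48, 58),
]

gains_by_cluster = defaultdict(list)
for cluster, pre, post in items:
    gains_by_cluster[cluster].append(post - pre)

# Clusters with the smallest average gains are candidates for extra attention
# in the next class, mirroring the feedback loop described on the slide.
for cluster, gains in sorted(gains_by_cluster.items(),
                             key=lambda kv: sum(kv[1]) / len(kv[1])):
    print(f"{cluster}: mean gain {sum(gains) / len(gains):.1f} points")
```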

WEB PAGES
Follow-up questionnaire to determine utility
100% satisfied overall
– 98% very satisfied
91% very satisfied with content
NIDS (WSR-88D) most frequently accessed
– 61% use daily
71% said very easy to navigate

BULLETIN BOARD
Designed to share information and seek assistance
Not highly used
– 37% did not use it at all; the rest used it infrequently
Of those who used it, the majority said posted information was useful
Integration with web access may make it more useful

MEASURING OUTCOMES
Long-term Outcomes
– broad goals, which the program itself cannot accomplish alone
– e.g., saving lives or property
Intermediate Outcomes
– measurable aspects which facilitate accomplishment of long-term outcomes
– e.g., application of data, knowledge, and skills
Primarily measured from focus groups

UTILIZATION
Overall
– “I’m proactive now rather than just reactive”
– “This program has become critical to our organization”
Access / Application of Data
– “OK-FIRST information … has kept disasters from happening”
– “This allows the emergency manager to check his own specific area”

UTILIZATION
Severe Weather
– most frequent application
– identify threats earlier
– pinpoint storms more accurately
– utilize scarce resources more effectively
Floods
– better management of road closures
– fewer false alarms

UTILIZATION
Fires
– identifying approaching wind shifts
– Fire Danger Model
Hazardous Materials
– trajectories / evacuation decisions
Public Works (unanticipated outcomes)
– assessing need for snow crews
– appropriate conditions for road work
– little league baseball

CONCLUSIONS
The OK-FIRST team has been successful in meeting the key needs of program participants
Able to significantly enhance the knowledge and skills of project participants in a very short period of time
Integration and coordination of training, access, and ongoing support were vital to success

CONCLUSIONS
“The project was able to change the behavior of local public safety officials and their approach to decision making.”
Increased participants’ confidence, abilities, effectiveness, and range of application
Empowered local officials to make decisions
“The OK-FIRST organization and team should serve as a model for others.”