1 Local Public Health Preparedness: how can we measure it? Experience in Kansas. Gianfranco Pezzino, MD, MPH, Kansas Health Institute

2 Introduction Today’s goals:  Describe one project of statewide local preparedness assessment (“case study”)  Describe some lessons learned  Define challenges ahead

3 The need to measure preparedness  Accountability: billions of dollars have been invested, and results must be shown  Program planning and management: officials need to know where the gaps are to make sound plans and budget allocations

4 The “PHPPO tool”: Public Health Preparedness and Response Capacity Inventory  Released Aug 2002, updated Dec 2002  “Rapid” self-assessment for state and local PH agencies  Tracks progress for the CDC cooperative agreement  Guide to planning future activities

5 The Kansas Public Health System 105 counties  91 (87%) are rural 99 local health departments  All counties served by an LHD  Independent jurisdictions  KS Association of LHDs (KALHD) Kansas Department of Health and Environment  Support to LHDs (technical, financial)

6 The Project Baseline and follow-up surveys in 2002 and 2003, using a modified PHPPO tool  103/105 (98%) counties responded to both surveys  Preparedness indexes created and compared  Results compared by: whole state, focus areas, and population density groups  Report released in July 2004

7 Methods - Step 1: Question Achievement  Criteria for “achievement” developed  Individual answers classified

8 Example 1 (simple question) If answer = Yes, then scored as “achieved”

9 Example 2 (complex question)

10 Example 2 (complex question) If (e) = N AND more than 2 of the other options = Y, THEN Achieved = Yes
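The slide's scoring rule can be sketched in Python. This is a minimal sketch, assuming answers are recorded as 'Y'/'N' per lettered option; the option labels below are placeholders, not the actual questionnaire items:

```python
def is_achieved(answers):
    """Apply the slide's rule: achieved when option (e) is 'N'
    and more than 2 of the remaining options are 'Y'."""
    other_yes = sum(1 for opt, ans in answers.items() if opt != "e" and ans == "Y")
    return answers.get("e") == "N" and other_yes > 2

# (e) is 'N' and three other options are 'Y', so the question counts as achieved
print(is_achieved({"a": "Y", "b": "Y", "c": "Y", "d": "N", "e": "N"}))  # True
```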

11 Methods - Step 2: Critical Capacity Indexes  Questions grouped into critical capacities (e.g., Q. 1 through 6 = C.C. A-I-A; Q. 19 through 24 = C.C. B-I-A)  Local Critical Capacity Index = percentage of questions achieved for that critical capacity in each county  State index = average of local indexes

Percentage of questions achieved (state average)

14 Methods - Step 3: Focus Area Indexes  Critical capacities grouped into focus areas  Local Focus Area Index = average of critical capacity indexes for a focus area in each county  State index computed as average of local indexes

16 Methods - Step 4: Overall Preparedness Index Local Overall Preparedness Index = average of all focus area indexes in each county State index computed as average of all county indexes
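Steps 2 through 4 are nested averages over the achievement flags from Step 1. A minimal Python sketch for one county; the capacity labels loosely follow the slide's A-I-A naming and the flag values are invented for illustration:

```python
def pct_achieved(flags):
    # Step 2: Local Critical Capacity Index = percentage of questions achieved
    return 100.0 * sum(flags) / len(flags)

def mean(values):
    return sum(values) / len(values)

# One hypothetical county: achievement flags grouped by critical capacity,
# critical capacities grouped by focus area
county = {
    "A": {"A-I-A": [True, True, False, True],  # 75.0
          "A-II-B": [False, True]},            # 50.0
    "B": {"B-I-A": [True, False]},             # 50.0
}

# Step 3: Local Focus Area Index = average of its critical capacity indexes
focus_area = {fa: mean([pct_achieved(f) for f in ccs.values()])
              for fa, ccs in county.items()}

# Step 4: Local Overall Preparedness Index = average of all focus area indexes
overall = mean(list(focus_area.values()))
print(focus_area["A"], overall)  # 62.5 56.25
```

The state-level indexes are then the same averaging applied once more, across the counties' local indexes.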

17 Key Findings  Preparedness for bioterrorism improved

State Average of Local Preparedness Indexes by Focus Area and Year

Focus Area                                          2002 Baseline   2003 Follow-up   Proportional Increase
A – Planning and Assessment                             49.3%           57.1%            15.8%
B – Surveillance and Epidemiology                       35.6%           47.9%            34.3%
C – Laboratory                                          18.7%           20.6%            10.4%
E – Communication & Information Technology              42.0%           52.8%            25.7%
F – Risk Communication & Health Info Dissemination      23.6%           28.9%            22.6%
G – Education, Training                                 28.7%           42.6%            48.3%
KS-Specific Areas                                       39.2%           52.9%            34.8%
State Overall Preparedness Index                        33.9%           43.3%            27.7%
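The proportional-increase column is the relative change, (follow-up - baseline) / baseline. A quick check in Python against the Planning and Assessment row; small discrepancies in other rows are expected, since the published increases were presumably computed from unrounded values:

```python
def proportional_increase(baseline, follow_up):
    # Relative change from baseline, expressed in percent
    return (follow_up - baseline) / baseline * 100

# Focus Area A – Planning and Assessment: 49.3% -> 57.1%
print(round(proportional_increase(49.3, 57.1), 1))  # 15.8
```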

21 Key Findings

Range of County Preparedness Index in Kansas, 2003

Local Overall Index (State Average)    43.3%
Local Overall Index Range              17.3% to 75.5%
Ratio highest index : lowest index     4.4
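The ratio row is simply the highest county index divided by the lowest:

```python
# Highest and lowest Local Overall Index across Kansas counties, 2003
highest, lowest = 75.5, 17.3
print(round(highest / lowest, 1))  # 4.4
```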

23 Key Findings

Local Preparedness Index by Population Density in Kansas, 2003

                          Frontier          Rural             Densely Settled Rural   Semi-Urban        Urban
Overall Index (Average)   38.1%             41.3%             47.7%                   52.0%             55.8%
Overall Index (Range)     18.9% to 67.1%    17.3% to 60.9%    32.9% to 67.2%          35.3% to 73.3%    35.7% to 75.5%
Ratio highest index : lowest index in group

25 So, now how do we use (and not use) these numbers?


27 Conclusions  Structured assessments can provide helpful information for tracking progress and planning resource allocation  Assessments would be more meaningful with “gold standards” and clear objectives: to know if you are on the right track, you first need to know where you want to go  Therefore, we desperately need good, measurable indicators

28 We desperately need good indicators Can be quantified (i.e., measured and counted) Measure what matters Linked to public health goals Understandable to policy makers and the public Defensible and logical Allow monitoring of trends Sensitive to changes Measured in a timely manner Allow comparisons Reliable Can be monitored without excessive burden Use available data and information systems, when possible

29 The Challenges Ahead “Not everything that counts can be counted, and not everything that can be counted counts” (A. Einstein) “What gets measured gets done” (Ed Thompson et al.) “Let us not make the perfect the enemy of the good” (me)