1
Evaluating a new model of Primary Health Care service delivery in remote Queensland: Lessons Learned
Kristine Battye, Peter Stanley-Davies and Elaine Ashworth
2
Outline
- Describe NWQAHS to provide context for evaluation
- Structural issues encountered
- Difficulties in establishing a system to "measure" process and impact of PHC
- Type of data available for planning and evaluating PHC services
3
North West Qld Allied Health Service
Operational hub and spoke model:
- Mt Isa hub
- Gulf precinct: 5 communities
- Highway precinct: 3 communities
- Mt Isa precinct: 3 communities
Key features:
- Functional teams
- 6-month calendar, 6-weekly rotations, 2-3 days in each community
- Primary health care
- Centralized booking system
- Therapy assistants in each community
- Videoconference follow-up
- Case conference with resident health professionals
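The rotation calendar described above (a 6-month calendar, visits every 6 weeks, 2-3 days per community) can be illustrated with a small sketch. This is not part of the original service documentation: the community names are placeholders, the visit length is fixed at 2 days, and the scheduling logic is an assumption for illustration only.

```python
from datetime import date, timedelta

# Illustrative sketch only (not from the presentation): a six-month calendar
# in which a visiting team returns to each community every six weeks and
# spends 2-3 days there per rotation. Community names are placeholders.
ROTATION_WEEKS = 6
CALENDAR_MONTHS = 6
VISIT_DAYS = 2            # slide says 2-3 days; 2 is used here for the sketch

def rotation_starts(first_visit: date, months: int = CALENDAR_MONTHS):
    """Yield the start date of each 6-weekly rotation within the calendar."""
    end = first_visit + timedelta(days=months * 30)
    start = first_visit
    while start < end:
        yield start
        start += timedelta(weeks=ROTATION_WEEKS)

def precinct_schedule(first_visit: date, communities: list[str]):
    """Assign consecutive 2-day visit blocks to each community in a precinct."""
    schedule = []
    for rotation in rotation_starts(first_visit):
        day = rotation
        for community in communities:
            schedule.append((community, day, day + timedelta(days=VISIT_DAYS - 1)))
            day += timedelta(days=VISIT_DAYS)
    return schedule

if __name__ == "__main__":
    # Hypothetical precinct of three communities (names are placeholders).
    for community, start, finish in precinct_schedule(
            date(2003, 1, 6), ["Community A", "Community B", "Community C"]):
        print(f"{community}: {start} to {finish}")
```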
4
Organizational chart:
- NWQPHC Board
- Chief Executive Officer
- Executive Officer, Operations & Outcomes
- Executive Officer, Admin & Finance
- Area Manager
- NWQAHS Manager
5
Initial Management Structure (1.5 years):
- NWQPHC Board
- CEO
- NWQAHS Service Manager
- 9 AHPs, 1 Admin
- Community Panel (advisory)
6
Domains of the Evaluation
- Recruitment and retention strategy
- Management and operation of service
- Service delivery – access and PHC
- Impact – community and individual
- Integration with other service providers
- Comparative cost effectiveness with alternate models
7
Methodology Proposed
- Qualitative – direct, intermittent information gathering
- Quantitative – indirect and continuous monitoring, data collection, surveillance, and use of sentinel communities and health issues
8
Difficulties encountered
- Shared perspective of the purpose of the evaluation
- Time frame
- Information management system relevant to PHC activity
- Recognition of the complexity of PHC service delivery
- Management capacity and multiple demands
- Are we trying to collect the right data anyway?
9
Structural Issues impacting on Evaluation:
- Shared purpose of evaluation
- Management capacity – multiple demands
- Time frame
10
Issues impacting on the Evaluation:
- Shared purpose of evaluation
- Complexity of the service delivery model
- Information management system that captures complexity
- Is it the "right" data?
- PHC paradigm, but what data set?
- Management capacity – multiple demands
- Time frame
11
Purpose of Evaluation
Management: Summative – Did the service do what it said it would do, and do it well enough to be refunded? Developing internal systems in parallel to the evaluator.
Funder: Did it do what it said it would do? Meet objectives of RHS program; improve access – occasions of service; value for $$.
Evaluator: Formative and summative – development of systems to support roll-out; aspects of model needing modification to achieve goals; impact on clients & communities.
12
Structural Issues impacting on Evaluation:
- Shared purpose of evaluation
- Management capacity – multiple demands
- Time frame
13
Structural issues: Lessons Learned and Implications for Policy
1. Realistic timeframes for service establishment and realistic expectations of deliverables in the first 3 years
2. Adequate resource allocation to management in the service establishment phase (service and auspice)
3. Greater emphasis on formative evaluation by funders and service providers
4. Broader performance indicators for primary health care services – reduced emphasis on occasions of service
14
Practical Issues around “Measuring” Primary Health Care
15
Issues impacting on the Evaluation:
- Shared purpose of evaluation
- Complexity of the service delivery model
- Information management system that captures complexity
- Is it the "right" data?
- PHC paradigm – how is it measured?
- Management capacity – multiple demands
- Time frame
16
Information Management System Specifications:
- Client demographics and indigenous identifier
- Clinical treatment records
- Time use data – activities in conjunction with treatment; community-focused activities
- Client outcomes / client-centred goals
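As a concrete illustration of these specifications, the sketch below sets out a minimal, hypothetical data model. It is not the service's actual information management system; the class and field names are assumptions drawn only from the bullet points above.

```python
from dataclasses import dataclass
from datetime import date

# Minimal, hypothetical data model matching the specification bullets above.
# Class and field names are assumptions for illustration only.

@dataclass
class Client:
    client_id: str
    date_of_birth: date
    community: str
    indigenous_identifier: bool      # client demographics and indigenous identifier

@dataclass
class TreatmentRecord:
    client_id: str
    clinician: str                   # treating allied health professional
    visit_date: date
    clinical_notes: str              # clinical treatment record

@dataclass
class TimeUseEntry:
    clinician: str
    entry_date: date
    activity: str                    # e.g. treatment-related or community-focused activity
    hours: float

@dataclass
class ClientGoal:
    client_id: str
    description: str                 # client-centred goal
    outcome: str = ""                # recorded client outcome against the goal
    achieved: bool = False
```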
19
Quantitative data
- Referrals as a proxy for occasions of service
- Used to measure access to service by indigenous and non-indigenous people at a community level (2nd year)
- Management capacity increased – development of chart system
20
Analysis of Referrals by Community, 2002/03

Community     | No. Referrals | % Pop. referred | No. Indig. Referrals | Est. Indig. Pop. | % Indig. Pop. referred
Burketown     | 103   | 44% | 35  | 94    | 37%
Camooweal     | 23    | 7%  | 10  | 145   | 7%
Cloncurry     | 172   | 5%  | 31  | 880   | 4%
Dajarra       | 20    | 9%  | 14  | 195   | 7%
Doomadgee     | 154   | 11% | 103 | 1,231 | 8%
Mornington Is | 195   | 16% | 107 | 1,115 | 10%
Normanton     | 81    | 5%  | 47  | 866   | 5%
Richmond      | 221   | 24% | 5   | 103   | 5%
Hughenden     | 322   | 12% | 45  | 260   | 17%
Julia Creek   | 62    | 7%  | 8   | 30    | 27%
Total         | 1,419 | 11% | 409 | 5,064 | 8%
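The percentage columns in the table are simple coverage rates: the number of people referred divided by the estimated population. The short sketch below works through that arithmetic using the Burketown row above; it is illustrative only.

```python
# Illustrative arithmetic only: the percentage columns are coverage rates,
# i.e. number of people referred divided by the estimated population.
# Figures are taken from the Burketown row of the table above.

indig_referrals = 35
est_indig_pop = 94

pct_indig_referred = indig_referrals / est_indig_pop * 100
print(f"{pct_indig_referred:.0f}% of the estimated Indigenous population referred")  # ~37%
```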
21
REAL LIFE Data Collection Issues
- AHPs record information/data relevant to their job, or that they see the value of – data quality is better
- Outreach service – client information is maintained in a number of places
- Reason for referral is not "centralized", but is recorded in client notes
- Centralized database – may need a data "enterer"
- Coding? ICPC (International Classification of Primary Care), developed by WONCA
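One possible response to the issues above (a centralized database in which the reason for referral is coded rather than held only in free-text client notes) could look something like the hypothetical record below. The field names are assumptions, and the ICPC code shown is a placeholder for illustration, not a verified mapping.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of a centralized referral record in which the reason
# for referral is coded (e.g. with an ICPC rubric) rather than left only in
# free-text client notes. Field names and the example code are illustrative.

@dataclass
class Referral:
    referral_id: str
    client_id: str
    community: str
    referral_date: date
    referred_to: str        # e.g. physiotherapy, speech pathology
    reason_code: str        # ICPC rubric, entered centrally by a data "enterer"
    reason_text: str        # free-text reason, as currently held in client notes

referral = Referral(
    referral_id="R-0001",
    client_id="C-0042",
    community="Burketown",
    referral_date=date(2003, 3, 14),
    referred_to="physiotherapy",
    reason_code="L03",      # placeholder ICPC-2 rubric, for illustration only
    reason_text="Low back pain limiting daily activity",
)
print(referral.referred_to, referral.reason_code)
```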
22
Do we collect the right data to evaluate and plan for PHC?
- We measure what we can measure
- Occasions of service – a proxy for workload?
- How do you evaluate the number of oldies you keep out of institutions because they have had access to allied health interventions?
- How do you measure the impact of early intervention in services that operate across the age continuum?
- Risky!! Why do we try to plan PHC (wellness) services at a local level using secondary care (sickness) data? Because that's all there is!?
23
PHC: More Lessons Learned
- A mix of qualitative and quantitative measures is needed to evaluate PHC – perhaps with equal emphasis
- Development of information management and evaluation processes needs to be staged, and recognized in contracts with funders
- We need to re-think the data set and data collection processes we use to plan and evaluate primary health care services