A typology of evaluation methods: results from the pre-workshop survey
Rebecca Taylor¹, Alison Short¹·², Paul Dugdale¹, Peter Nugus¹·², David Greenfield²
¹ Centre for Health Stewardship, ANU
² Centre for Clinical Governance, AIHI, UNSW
Outline of presentation
1. Background
2. Aim
3. Method: how we developed and distributed the survey
4. Findings
5. So what does this mean?
6. Future directions
Background
At last year's workshop, participants reported a lack of clarity about how to evaluate CDSM tools, including which factors should be considered in evaluations.
To foster the use of world-class evaluations, we first need to know:
- What evaluations are currently being completed?
- Are there gaps in the types of evaluations currently completed?
- How can evaluation projects be strengthened?
Aim
To investigate the types of evaluations conducted around Australia, and how these evaluations are reported to other clinicians and stakeholders.
Method
1. Development of a survey to investigate the types of evaluations conducted around Australia, and how these evaluations are reported to other clinicians and stakeholders
2. Pilot testing
3. Distribution of the survey to attendees of the 'Evaluating Chronic Disease Self-Management Tools' workshop
4. Descriptive and thematic analysis of quantitative and qualitative survey data
Findings: Demographics

Respondent's profession (n = 20):
  Allied Health: 3
  Nursing: 6
  Administration/Management: 7
  Medicine: 2
  Other: 2

Setting in which respondent works (n = 20; participants were able to select more than one response):
  Community Organisation: 4
  University: 3
  Hospital: 7
  Community Health: 7
  Government Department: 5
  Other: 5

n = number of respondents
Findings: Tools evaluated
- Home Telemonitoring
- Flinders Program
- Health Coaching
- CENTREd Model for Self-Management Support
- cdmNET
- My Health My Life Program
- Stanford Program
- Living Improvement for Everyone (LIFE) – adapted Stanford Program for ATSI people
- Intel equipment for care innovations
- COACH Program
- AQOL
- QLD ONI
- The Continuous Care Pilot
Findings: Reasons for evaluation

Reasons for undertaking evaluation (participants could select more than one response):
  Academic evaluation: 7
  Management-directed evaluation: 13
  For accreditation: 1
  Patient satisfaction surveying: 7
  Other: 8

n = number of respondents
Findings: Collaborators

Collaborators (participants could select more than one response):
  Managers: 13
  Clinicians: 15
  Consumers: 13
  University-based academics: 9
  Health service-based academics: 5
  Other: 3

n = number of respondents
Findings: Data used

Data used (participants could select more than one response):
  Case study: 4
  Interviews with patients: 12
  Interviews with staff: 9
  Survey questionnaire: 12
  Patient administrative data analysis: 4
  Patient records review: 6
  Cost information: 3
  Clinical outcomes analysis: 7
  Other: 3

n = number of respondents
Findings: Outcome of evaluation

Outcome of evaluation (participants could select more than one response):
  Commence use of tool: 6
  Continue use of tool: 7
  Discontinue use of tool: 1
  Modification of tool: 5
  Change in how tool used: 3
  Other: 5

n = number of respondents
Findings: Dissemination of findings

Dissemination of findings (participants could select more than one response):
  Written reports: 12
  Seminar presentations: 24
  Conference presentations: 32
  Journal articles: 2 published, 1 submitted, 1 in progress, 3 planned
  Other: 5
  No response: 7

n = number of respondents

Outputs per person: average = 3.5; range = 0–31
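The "outputs per person" summary above is simple descriptive statistics of the kind named in the method (step 4). A minimal sketch, using a hypothetical list of per-respondent output counts constructed only to match the reported average (3.5) and range (0–31); the survey's raw data are not reproduced here:

```python
# Hypothetical dissemination-output counts for n = 20 respondents
# (illustrative values only, chosen to match the reported summary).
outputs = [0, 31, 2, 1, 4, 3, 0, 2, 5, 1, 6, 2, 0, 3, 1, 2, 4, 0, 1, 2]

average = sum(outputs) / len(outputs)          # mean outputs per person
output_range = (min(outputs), max(outputs))    # (lowest, highest)

print(average, output_range)  # prints: 3.5 (0, 31)
```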
Findings: Perspectives of evaluation
"Evaluation is an essential component of any tool introduced to a service. However, not only from the patient perspective, but how it influences (or not) health professional practice." (Participant 5)
Findings: Perspectives of evaluation
- Seeing evaluation in context, as part of a process
- Who participates in the evaluation?
- Range of results
- Interfacing with clinicians
- Need for the right evaluation method for the purpose
Findings: Perspectives of evaluation
"It is time to develop evaluation tools that measure the changes in people and relationships that we are seeing every day when we work with these tools. It might not change someone's HbA1c overnight but it might mean they connect with family again or talk to their health professionals more or ask for help before crisis hits." (Participant 16)
So what does this mean?
- When planning service delivery, also plan its evaluation (plan from the beginning)
- Use a wide range of evaluation methods and ensure they are used in the appropriate context
- Engage all of the stakeholders
- Share methods and findings with others
- Collaborate with and learn from others working in the field to avoid reinventing the wheel
Thank you
Rebecca Taylor, Postdoctoral Research Fellow
Rebecca.Taylor@anu.edu.au