The evaluation of an open access self-help web-site
Delyth Lloyd (Australian Centre for Posttraumatic Mental Health) & Chris Clarke (Defence Links, Department of Veterans’ Affairs)
A centre of excellence supported by the Australian Government. © Copyright 2011
Acknowledgements
- Andrea Phelps
- John O’Connor
- Funded by the Department of Veterans’ Affairs
- SMS Management Technology
Outline
- Background
- The web-site & audience
- Steps in the development of the evaluation
- Factors shaping the evaluation
- Data sources and IT capacity
- Policy cycle
- Challenges, lessons learned
- Your thoughts?
Context
- Department of Veterans’ Affairs ‘lifecycle’ program
- Aim: provide an on-line wellbeing resource tailored for veterans, former serving members and their families
- Target the hard-to-reach / hard-to-engage
- Adding to the range of mental health and support services available
- www.wellbeingtoolbox.net.au
Wellbeing Toolbox
Skills for psychosocial recovery:
- Generic distress-reduction skills
- Principles based on risk & resilience research
- Broad, accessible, tried and tested
Target population:
- Sensitive to requests for information
- “Younger” (<50 years)
- Difficulties with government involvement in health
Open Access
- Anyone can visit
- Personalised journey – increases relevance
- Ease of access – find what you want quickly
- Log in (optional)
- Questionnaire (optional)
- Modular (with ‘guide me’ function)
- Self-management plan (optional)
Accessibility is the priority, so the evaluation must be low-impact and non-intrusive.
Planning
1. Literature review – what can we learn from others?
- Need is present in the target group
- On-line self-help “courses” can work (disorder-specific evidence; organisational evidence)
- Youth-oriented products have been successful
- Genuine open-access self-help evaluations with an accessibility focus: not done? Or just not reported?
Planning (cont’d)
- Utilisation discussions: who is going to use the evaluation, why, and how? What do they need to know? What would they like to know? When?
- Program logic: not a clinical theory of change from use to better wellbeing, but a path of use
- Identify and agree on evaluation questions
Program logic model (reconstructed here as text from the slide’s diagram):

Actions: web-site created and promoted. It must be targeted at veterans and also relevant for family members, must be clinically sound, and must contain the key features named in the contract; usability testing & stakeholder consultation ensure it is appealing and usable to the target group.

Processes: users from the target group become aware of the web-site (via KIT, via direct marketing of the self-care site, or by navigating there independently from a web search or from other sites, e.g. At Ease); they visit the self-care web-site, perceive it as useful, sign in, and use and re-use its components to a greater or lesser extent depending on individual factors, including the self-management plan; some remain one-time visitors. Users self-refer to other services and resources as appropriate.

Outcomes: positive feedback; re-visits; use of other services & resources.

Evaluation questions mapped along this path:
- How do users reach the site? Who are the “users”?
- How effective were promotion and marketing activities?
- What do first-time users think? (many will be one-time users)
- Who / how many users choose to sign in?
- Who / how many users re-visit the site?
- What sections are visited by the most people? (indicates interest)
- What sections do people re-visit most often? (indicates which are most helpful)
- Who / how many users re-visit the self-management plan?
- Have users followed links to other recommended resources?
- Have site users seen anyone about their problems since using the site, or do they intend to do so?
- General feedback: what do users like best about the site, and what could be improved?

Note: module selections may be prompted to some extent if the ‘guide me’ function is used.
Factors shaping the evaluation 1: Data available
- What can / could the site do? e.g. reminders, return visits
- What evaluation tools can be built in?
- What can Google Analytics do?
- Ethics: what is it okay to know?
- What we don’t/won’t/can’t know – e.g. we can collect questionnaire responses and usage data, but not link the two (see the sketch below)
- Logged in versus not logged in
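A minimal sketch (all names hypothetical, not the site’s actual implementation) of one way to honour the “questionnaire and use, but not both” constraint: usage events and questionnaire answers are written to separate stores under unrelated identifiers, so answers can never be joined back to an individual’s browsing history.

```python
# Hypothetical sketch: keep usage data and questionnaire data unlinkable.
import uuid

usage_log = []          # stand-in for the usage-event store
questionnaire_log = []  # stand-in for a separate questionnaire store

def log_page_view(session_id: str, page: str) -> None:
    """Record a usage event keyed only by an anonymous session id."""
    usage_log.append({"session": session_id, "page": page})

def log_questionnaire(answers: dict) -> None:
    # A fresh random id, deliberately unrelated to any session id,
    # so answers cannot be joined back to browsing history.
    questionnaire_log.append({"response_id": uuid.uuid4().hex, **answers})

session = uuid.uuid4().hex
log_page_view(session, "/modules/sleep")
log_questionnaire({"helpful_rating": 4, "age_band": "<50"})
```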
(Program logic model shown again, this time highlighting the question: how do users reach the site?)
Data – sources and strategies
- Google Analytics: visitors, unique visitors, page views, time on site
- Rating scales within the topics: “How helpful was this module to you?” (optional)
- User survey “pop-up”: demographics, brief ratings, comment
- Feedback (ad hoc): users and non-users
- Evaluation register (structured interviews): users and stakeholders
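To make those Google Analytics metrics concrete, here is an illustrative sketch that computes the same quantities from a raw visit log. The log format and field names are assumptions for illustration only; in practice Google Analytics reports these figures directly.

```python
# Illustrative only: derive page views, unique visitors, and average
# time on site from an assumed visit-log format.
from collections import defaultdict

# One record per page view: (visitor_id, session_id, page, seconds_on_page).
visits = [
    ("v1", "s1", "/home", 30),
    ("v1", "s1", "/modules/sleep", 240),
    ("v2", "s2", "/home", 15),
]

page_views = len(visits)
unique_visitors = len({visitor for visitor, _, _, _ in visits})

time_per_session = defaultdict(int)
for _, session, _, seconds in visits:
    time_per_session[session] += seconds
avg_time_on_site = sum(time_per_session.values()) / len(time_per_session)

print(f"page views: {page_views}, unique visitors: {unique_visitors}, "
      f"avg time on site: {avg_time_on_site:.0f}s")
```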
Benchmarking
What will it all mean? “2685 people visited module X”... so what?
- User numbers (as a proportion of a sister site’s)
- Time on site (one open-access depression site: ~20 minutes)
- Extent of use (proportion reaching the start, middle, end)
- Log-in rates (1.6%? 35%? 25–90%?)
- Revise and refine benchmarks over time
- Link findings to recommendations: early indicators / likely trajectories; if this, then what?
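A hedged sketch of the “if this, then what?” step: observed figures are compared against provisional benchmarks that can be revised over time. The benchmark values echo the slide; the observed values are invented purely for illustration.

```python
# Provisional benchmarks; figures are illustrative, not findings.
benchmarks = {
    "avg_minutes_on_site": 20.0,  # open-access depression site cited above
    "login_rate": 0.016,          # one provisional log-in figure (1.6%)
}

observed = {"avg_minutes_on_site": 12.5, "login_rate": 0.02}  # invented data

for metric, bench in benchmarks.items():
    ratio = observed[metric] / bench
    verdict = "at or above benchmark" if ratio >= 1 else "below benchmark"
    print(f"{metric}: observed {observed[metric]} vs benchmark {bench} -> {verdict}")
```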
Factors shaping the evaluation 2: Policy processes & budget cycle
What information at what time point?
- Reassurance – now → mid-trial report
- Decision making – soon
- Accountability & the bigger picture – later
The challenge is providing this while the evaluation also remains:
- Rigorous (evaluating a new concept)
- Useful
- Ethical
- Practically possible, and not overly demanding on users
Progress
Key questions: reach, acceptability, benefit.
What next?
- Monitoring component (usage patterns over time) – a sketch follows below
- Benchmarks (meaning)
- Feedback (ad hoc complaints and compliments)
- Interviews (user stories)
→ A decision-making & meaning-oriented report
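One possible shape for the monitoring component mentioned above: aggregate visits by week and flag the direction of the trend. The weekly figures below are invented for illustration; real ones would come from the site’s usage logs.

```python
# Hypothetical weekly visit counts for trend monitoring.
weekly_visits = [310, 295, 342, 388, 401, 379]

changes = [b - a for a, b in zip(weekly_visits, weekly_visits[1:])]
net = sum(changes)
trend = "rising" if net > 0 else "flat or falling"
print(f"net change over the period: {net:+d} visits ({trend})")
```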
Conclusions & our thoughts so far...
1. Evaluation needs to be incorporated into the design of web products. Privacy and ethics are grey areas.
2. There are lessons and tactics for a meaningful but low-impact evaluation design; there is a trade-off between the purpose of the site and ease of evaluation.
3. Orient reporting to stakeholder needs, timing, and broader implications: what is reasonable to conclude midway?
Questions, comments?