Slide 1: Changing the Culture of Mental Health on Campus Through Evaluation

Debbie Bruckner, Senior Director, Student Wellness, Access & Support
Dr. Andrew Szeto, Director, Campus Mental Health Strategy
Dr. Susan Barker, Vice-Provost (Student Experience)
Clare Hickie, Undergraduate student (Honours), Department of Psychology
Art Assoiants, MSc student, Counselling Psychology

NASPA, January 2018

Speaker notes: Debbie & Andrew introduce themselves.
Slide 2: Outline
- Relevance of Evaluation: Why does this matter?
- UCalgary Experience: Case Study
- Role of Mental Health Strategic Plans
- Evaluation of UCalgary Campus Mental Health Strategy
- Experiences of Other Institutions: Discussion
- Key Factors & Barriers
- Lessons Learned & Best Practices
Slide 3: Relevance of Evaluation
What is the context on your campus?
- Mental health programming, strategies
- Evaluation in higher education
- Why does it matter?

Speaker notes (Debbie): Who is involved in evaluation of mental health support? Who is interested in evaluation? Who works in the area of mental health support?
Slide 4: UCalgary Experience: A Community-Wide Journey
Slide 5: Role of Mental Health Strategic Plans
Why go there?
- Campus Mental Health Strategy: 28 recommendations in 6 strategy focus areas
- Whole-campus approach, multiple levels
- How do we know this is effective? Impact, new directions, annual reporting
- Evaluation embedded in our strategy

Speaker notes (Andrew): Why did we go there? NCHA data; data from HR (EAP); Brentwood.
Slide 6: UCalgary Experience: Case Study
Evaluation Subcommittee
- Evaluation at 4 levels: program, recommendation, university, process
- Implementation
- Student involvement
- Collaboration with other subcommittees
- Continuous improvement: creating a culture of evaluation

Speaker notes (Debbie): We have been evaluating all of our health promotion and direct service initiatives since [year]. The results of these evaluation processes have informed the development of programming and prompted the facilitation of focus groups with students. UCalgary is working to create a culture of evaluation regarding mental health initiatives across the entire campus.

The Mental Health Strategy Advisory Implementation Committee's Evaluation Subcommittee hired summer students in 2017 to undertake a scoping study on the current state of evaluation practices in student programming. A broad group of campus stakeholders were interviewed about their current knowledge of, and comfort with, implementing evaluation best practices, and participant feedback was gathered on opportunities for further guidance and resources. This work will culminate in a publicly available evaluation toolkit and publication, which will supplement our existing and ongoing mental health programming evaluation.

In addition to measuring program outcomes, the Evaluation Subcommittee will also examine ways to measure student mental health, including innovative measures of social connectedness and resiliency, and program outcomes at the university level. These methods will supplement the National College Health Assessment (NCHA) data already collected at the University of Calgary.
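As an illustration of how scale-based measures like these are typically scored, here is a minimal Python sketch assuming a hypothetical five-item Likert measure with two reverse-keyed items. The item names and keying are invented for the example; this is not the actual UCalgary or NCHA instrument.

```python
# Minimal sketch: scoring a hypothetical 5-item Likert scale
# (e.g., a social-connectedness measure). The item names and the
# reverse-keyed set are assumptions for illustration only.

REVERSE_KEYED = {"item_3", "item_5"}  # hypothetical reverse-scored items
SCALE_MAX = 5                         # responses range from 1 to 5

def scale_score(responses: dict[str, int]) -> float:
    """Mean item score after reverse-coding; higher = more connected."""
    adjusted = [
        (SCALE_MAX + 1 - value) if item in REVERSE_KEYED else value
        for item, value in responses.items()
    ]
    return sum(adjusted) / len(adjusted)

# One student's (hypothetical) responses:
print(scale_score({"item_1": 4, "item_2": 5, "item_3": 2, "item_4": 4, "item_5": 1}))
# -> 4.4
```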
Slide 7: Evaluation of UCalgary CMHS
What we have done…
- Study and thematic analysis
- Use of logic models
- Best practices
- Toolkits

Speaker notes (Clare and Art): Linear discussion of the process of the summer's work:
- Identified stakeholders.
- Conducted interviews with stakeholders about the role of evaluation in their work and how evaluation on campus could be improved (i.e., what they need, what they think is important, etc.); include some sample questions.
- Talk about how we used logic models: we asked folks to use the logic model framework to describe their programs, including resources, goals, etc. (insert a picture of an example logic model into the slide; a sketch of one as a data structure follows after this list). We did this both to better understand and get context for their programs, and to see how people reacted to and engaged with the logic models (we did not gather data on this; we looked at it informally to inform our later work).
- Began analyzing the transcribed interviews using thematic analysis (an ongoing process).
- Used this to inform the creation of the toolkit: what people wanted and needed to better understand evaluation, feel supported in evaluation on campus, etc.
- Created an online toolkit, in website and PDF form, which is being officially launched by the end of January.
- Many of the comments and desired components for the toolkit, such as interactive web videos, in-person seminars, and expert consultation, were not within the capacity of the project at this point in time, but they indicate the potential for further work and development in this area, and also point to the interest that staff have in this field.
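For context, a logic model is usually drawn as a table running inputs → activities → outputs → outcomes. Here is a minimal sketch of that framework as a Python data structure; the fields mirror the framework interviewees were asked to fill in, but the example program and all of its entries are hypothetical.

```python
# Minimal sketch: a logic model captured as a data structure.
# The example program and every entry in it are hypothetical.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    program: str
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct, countable products
    outcomes: list[str] = field(default_factory=list)    # intended changes

peer_support = LogicModel(
    program="Peer Support Drop-In (hypothetical)",
    inputs=["two staff coordinators", "trained student volunteers", "a drop-in space"],
    activities=["weekly drop-in sessions", "volunteer training workshops"],
    outputs=["sessions held per term", "students served", "volunteers trained"],
    outcomes=["increased help-seeking", "greater sense of connection on campus"],
)
print(peer_support.outcomes)
```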
Best practices for on-campus evaluation:
- Program evaluation needs to be accessible, both physically and in terms of understanding. Physically, resources need to be available, and people need to know they exist. For understanding, evaluation needs to be accessible to people with varying levels of education and expertise: evidence-based and informed while also being tangible. Good program evaluation strikes a balance between being accessible and being evidence- and theory-informed.
- Program evaluation should be evidence-based when possible. This can include using established measures and methods, and thinking critically when deciding which evaluations and methods to use. For example, some respondents, when asked "Why do you evaluate this way?", said "Because the person who used to be in my role used this" or "I found it in our files." While evaluating is better than not evaluating, it is important to make the time and space to think critically about how you are evaluating and why, to ensure it is a good fit for your needs. Solicit feedback on how you evaluate, from fellow staff, student volunteers, program participants, etc., to see whether what you are asking captures what you intend and achieves the goals of your evaluation.
- Program evaluation should be iterative. Evaluating is better than not evaluating; it does not have to be perfect, and it never will be. Figuring out the best way to evaluate is an ongoing process, so don't be hard on yourself or shy about making mistakes. Consistently reflect on your evaluation practices: What are you trying to evaluate? Why? What information do you want to collect? Evaluation will also change as your program changes and as personal, professional, or institutional priorities shift: one year you might collect program numbers or metrics (say, for a grant application), and other years you may focus more on evaluating outcomes. Evaluation should reflect your needs (and your stakeholders' needs).
- Institutions should work to create a culture of evaluation. What does this mean?
  - Include evaluation in the creation and implementation of policies and procedures.
  - Communicate the importance of evaluation from the top down.
  - Make it clear who is responsible for evaluation and what is expected of them.
  - Ensure there are appropriate resources in place to support staff and faculty to evaluate effectively.
  - Make evaluation a priority in terms of workload and role responsibilities. Many of our respondents indicated they wanted to improve evaluation but did not have the time, resources, or energy within their role to do so, or lacked the experience or knowledge, or were confused about the ethics process (for example: "Can I use quotes? Can I use this data? Do I have to write any disclaimers in my survey? Do I have to contact ethics?"). These were all contributing factors, and as a result, competing priorities won out. It is vital that evaluation is made a clear priority and that staff workloads are adjusted to accommodate it; that could mean hiring someone new to evaluate, shifting other role responsibilities, etc.
  - Actively use evaluation data: use it to inform policy, funding, and program development, and celebrate milestones using data from evaluation (e.g., present it in meetings, reports, newsletters, events, etc.); a sketch of one simple summary follows below.

TO ADD INTO THIS SLIDE: link to the toolkit (make it look polished).
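As one small example of actively using evaluation data, here is a minimal Python sketch summarizing pre/post scores on an outcome measure for a report or newsletter; all of the numbers are invented.

```python
# Minimal sketch: summarizing pre/post outcome scores for reporting.
# All data values are hypothetical.

from statistics import mean

pre = [2.8, 3.1, 2.5, 3.0, 2.9]   # baseline scale scores (invented)
post = [3.4, 3.6, 3.1, 3.3, 3.5]  # same participants after the program

change = [after - before for before, after in zip(pre, post)]
print(f"Mean pre: {mean(pre):.2f}, mean post: {mean(post):.2f}, "
      f"mean change: {mean(change):+.2f}")
# -> Mean pre: 2.86, mean post: 3.38, mean change: +0.52
```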
Slide 8: Experiences of Other Institutions
Small group discussion
- Enablers: What conditions enable changing the culture of evaluation on your campus?
- Barriers: What roadblocks have you encountered with evaluation?

Speaker notes (Debbie):
- Ideally groups of 5-7; each group has sticky notes.
- Each individual writes their top 2 enablers and top 2 roadblocks.
- Arrange the sticky notes in some kind of grouping on a flip chart or wall.
- Report the top 2 enablers and top 2 roadblocks from each group.
Slide 9: Key Factors and Barriers to Evaluation
Report back from small groups
- Funding dedicated to mental health: programming & services, staff, grants
- Seeing the strategy in action
- Making it relevant to the campus: responsive

Speaker notes (Andrew): Responsive, i.e., fentanyl, cannabis, integrating other strategies, responding to the TRC (Truth and Reconciliation Commission). Two other foundational elements: dedicated funding for the CMHS ($3 million), and a hired director to oversee the implementation and work with the implementation committee and subcommittees.
Slide 10: Lessons Learned & Best Practices
Reflection
- Lessons learned: full community buy-in, engagement of all stakeholders, evidence-based communication
- Best practices: What can you take back to your institution?

Speaker notes: Clare
Slide 11: Questions?
Debbie Bruckner, Senior Director, Student Wellness, Access & Support
Dr. Andrew Szeto, Director, Campus Mental Health Strategy
Dr. Susan Barker, Vice-Provost (Student Experience)
Clare Hickie, Undergraduate Student
Art Assoiants, Graduate Student