Evaluation NIFS | 2013.


1 Evaluation NIFS | 2013

2 Why are we evaluating our programs?
There is GREAT marketing juice in this stuff – we got a lot of traction on the blog series, and we got good feedback on the post I wrote about Kathy Douglas' Fit It In program. Evaluation also lets me show prospective clients how you are spending your time and what data you can provide them.

3 Why are we evaluating our programs?
It may very much seem like we've implemented this evaluation requirement because I want you to beat your head against the wall, but that is 100% not the case. In addition to capturing actual data that has marketing/new-business value, this requirement and its accompanying structure were implemented to help you make the most of the programs you're offering.

After reading through a lot of the evaluation forms submitted in the last review cycle, I saw a lot of missed opportunities, which I felt were connected to a lack of understanding about what you were supposed to be doing with evaluation and/or why you were being asked to do it. When I talked to your assistant directors, they confirmed that there was general confusion about the tools and about why this is a requirement. In a lot of cases staff seemed to be going through the motions of filling out the forms without doing anything actionable with the information. And I know from some conversations that there are cases where the pre-evaluation was actually filled out after the program finished.

All of that said, this looks to me like a great opportunity for a do-over. I own 100% of any misunderstanding; I did not clearly communicate goals, intentions, and outcomes when we first rolled this out. I apologize. Let's move on from that and do better going forward.

4 Who should be doing evaluation?
EVERYONE on every program ALL THE TIME... JUST TEASING!!

Yes, everyone – this is an activity that all staff should get comfortable with, because all fitness-related staff have evaluation requirements in the performance review. A manager and a health fitness specialist can turn in the same evaluations; if that's happening, I would expect the responsibility for completing them to be shared.

No – not every program, not all the time. When to do evaluation is strategic in and of itself. Not everything lends itself easily to this evaluation structure, so be thoughtful about which programs you choose to evaluate. Maybe the ongoing, every-month challenge isn't the best option. Or maybe, if you really want to see how you can do better with that program, you need to set aside a month to evaluate it, but you'll have to look at the process a little differently. At the most basic level, we expect that you will evaluate your performance review-related programs.

5 What are the desired outcomes?
#1: Establish up to THREE S.M.A.R.T. GOALS for the program AND determine if you achieved those goals.

#2: Establish an IMPACT SCORE for the program AND determine if/how you can improve that score for the same program or similar programs in the future.

Is this a SMART goal? "Increase Wellness Center membership/recruit new members to participate in the program."
What about this one? "Increase average number of days per week participants are currently exercising for minutes during the program."
This one? "Educate members each week with informational s."

6 The Wellness Challenge
Presented by Reggie Porter

7 The (new and improved) Evaluation Template

8 Expectations for Evaluation
Pre-Program

The pre-program tab will ALWAYS be completed in advance of starting the program.
The pre-program outline will be shared with your AD for a quick review of goals and overall outline, ideally before you launch the program.
The pre-program goals will be SMART and will relate back to the performance review goals when the program is intended to "count" for those requirements.

The very first expectation here is that these are REQUIRED of you. The performance review files are clear on what specifically is required, and I will expect all staff to meet the criteria that has been outlined. Please make sure you're comfortable with the expectations for how many of these need to be completed and how well. If you aren't clear, ask me or your assistant director for more information.

I understand that this still feels like a "new" requirement for some of our team, and I know that for some of you, what you're doing with evaluation and how you're supposed to do it was perhaps never crystal clear. So the steps ended up occurring out of order and in a way that ultimately made the evaluation process less than helpful. That becomes a self-fulfilling prophecy: you don't understand it, so you don't do it correctly; the outcomes don't make any sense; and then you still don't understand it (or get even more confused), so you continue doing it incorrectly... and on and on. So let's press the do-over button here and get started on an understandable, clear path to using the evaluation tools correctly, to the benefit of you, your staff, your members, and our clients. Here are your basic steps:

The pre-program tab will ALWAYS be completed in advance of starting the program – preferably at least a few weeks to a month prior, so that if your pre-program setup uncovers a fatal flaw in your program design, you have time to correct it. For example, let's say you wanted to offer Maintain Not Gain. In your pre-program eval stage, you realize you want to see if you can actually educate your audience, and you determine the best way to do that is with pre- and post-program surveys of participants about their knowledge base. If you wait too long to set up the pre-program eval, you won't have time to draft and administer the survey before the program starts.

The pre-program outline will be shared with your AD for a quick review of goals and overall outline, ideally before you launch the program. This is not intended to be babysitting. Here's where I'm going with this: your ADs have a good read on what's going on with all of your peers on your larger team. Use that knowledge to your benefit to make sure your programs are tightly and expertly run. I'm not suggesting that you don't have the expertise; I'm suggesting that an outside view of your intentions may help improve your offering. It also allows your AD to better follow up with you on the initiative post-program.

The pre-program goals will be SMART and will relate back to the performance review goals when the program is intended to "count" for those requirements. We set up SMART goals for our members... let's make sure our goals for programs are also SMART. Goals like "increase membership" or "increase the percentage of members who visit 5+ times per month" aren't SMART. Get very, very specific, like you saw with Reggie's goals.

9 Expectations for Evaluation
Post-Program

The post-program evaluation will be completed in a timely fashion following the conclusion of the program.
The post-program evaluation will include an assessment of the elements that make up the impact score (areas for improvement, positives that contributed to solid outcomes, etc.).
The post-program evaluation will include an assessment of whether or not the pre-program goals were achieved. If the goals weren't achieved, data from the impact evaluation will be used to create suggestions for how to achieve them in a future offering.
The program summary will be completed for posting on the Intranet so that your peers can understand the basics of offering your program without having to download and decode all the materials.

And after the program is over, here are the expectations:

The post-program evaluation will be completed in a timely fashion following the conclusion of the program. Let's make this almost built into the time required to run the program; don't wait until your review to complete it, and here's why: there should be rich learnings in these evaluation efforts. Waiting too long dims your memory of what did and didn't happen with the numbers and the participants. If you miss the window for compiling and reporting on your data, you may have missed a lesson learned for your next initiative – and then you've offered two programs that weren't executed as well as they could have been.

The post-program evaluation will include an assessment of the elements that make up the impact score. Use the right-hand column on the post-program evaluation to record brief notes about the scores on the left. Put on your critical-thinking hat and offer some suggestions about how you could have achieved better participation, penetration, or completion of the offering. Again, this will be easier to accomplish if you do it shortly after the program has been completed.

The post-program evaluation will include an assessment of whether or not the pre-program goals were achieved. This is half the reason we're doing evaluation – to see if we're achieving our goals. How much more impactful is the program if we can say that 2 of 3 goals were accomplished and resulted in a 20% increase in participation? I can really DO something with that in our social marketing... and our potential customers will notice numbers. If you're struggling with the numbers, talk to a colleague who's good with math. (Call me, for Pete's sake – I love this stuff!)

The program summary will be completed for posting on the Intranet. This is new, and it's designed to (1) be easy to complete – maybe 5 minutes tops – and (2) provide your peers with what they need, before pulling your program off the intranet, to determine how to run it, how easy it was to run, how you used the program (based on the goals you set... perhaps you have peers with similar goals), etc.
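The deck refers to an impact score built up from element scores such as participation, penetration, and completion, but never shows how the elements combine into one number. Purely as an illustration – the element names, the 0–100 scale, and the simple average below are assumptions for this sketch, not the actual NIFS scoring rubric – the arithmetic could look like:

```python
# Hypothetical sketch only: the real NIFS impact-score rubric is not shown
# in this presentation. This assumes each element is scored 0-100 and the
# impact score is their simple average.

def impact_score(elements):
    """Average the element scores (each 0-100) into one impact score."""
    return round(sum(elements.values()) / len(elements), 1)

elements = {
    "participation": 62.5,  # e.g. 25 of 40 enrollees active in a given week
    "penetration": 8.0,     # e.g. 40 of 500 eligible members enrolled
    "completion": 80.0,     # e.g. 32 of 40 enrollees finished the program
}

print(impact_score(elements))  # 50.2
```

Recording the element numbers right after the program ends means the notes column can explain each score (say, why penetration was low) instead of reconstructing it from memory at review time.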

10 Questions?

