
1 Evaluation of the Air Quality Health Index Program in Canada
San Diego 2011
Sharon Jeffers – Environment Canada (EC)
Kamila Tomcik – Health Canada (HC)

2 Page 2 – February 10, 2016
Background
The AQHI is the first multi-pollutant, health-risk-based air quality index in the world
Multiple (really, really multiple) partners and stakeholders for both development and implementation (the whole is greater than the sum of the parts)
Piloted and implemented in different jurisdictions at different times and in different ways
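For concreteness, the multi-pollutant index referred to here combines three-hour average concentrations of ozone, nitrogen dioxide, and fine particulate matter through an exponential risk formula. The sketch below is an illustrative Python implementation of the commonly published AQHI formulation; the coefficients, units, and rounding convention are assumptions drawn from the public literature, not taken from this presentation or from the departments' operational code.

```python
import math

def aqhi(o3_ppb: float, no2_ppb: float, pm25_ugm3: float) -> int:
    """Illustrative AQHI from 3-hour average concentrations.

    Inputs: O3 and NO2 in ppb, PM2.5 in micrograms per cubic metre.
    Coefficients follow the commonly published formulation.
    """
    raw = (1000.0 / 10.4) * (
        (math.exp(0.000537 * o3_ppb) - 1.0)
        + (math.exp(0.000871 * no2_ppb) - 1.0)
        + (math.exp(0.000487 * pm25_ugm3) - 1.0)
    )
    # Reported values are rounded to the nearest integer, with a floor of 1.
    return max(1, round(raw))

# Example: moderate conditions land in the middle of the 1-10 scale.
print(aqhi(30.0, 20.0, 10.0))
```

Clean air (all inputs near zero) reports as 1, and rising concentrations of any of the three pollutants push the index upward, which is what makes it "multi-pollutant" rather than keyed to a single worst offender.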

3 Funding and Evaluation
Our funding comes in four- to five-year cycles, with a Treasury Board-required evaluation at the end of each cycle
But in reality, we have undergone some kind of evaluation almost every year since 2001

4 Definitions
Formative Evaluation – used when a program is under development or being formed; the focus is on the implementation process
Summative Evaluation – done when a program is "mature" and you want to see measurable results ("show me the numbers")

5 One more… Developmental Evaluation
A new field; see the work of Michael Quinn Patton
Deals with social change programs that do not lend themselves well to summative evaluation (complex programs, as opposed to simple or complicated ones)

6 Getting Started
We started a bottom-up process in response to a poorly organised top-down process
The logic model framework and indicators being proposed were not representative of what the program was supposed to accomplish or of how it was being implemented

7 Getting Started
We did not want to be responsible both for reporting on and for being evaluated against indicators that were disconnected from the reality of the program

8 What we did…
Put together a working group composed of both program and evaluation staff from both federal departments (EC and HC)
Developed our own program logic model and performance indicators
These were then approved by senior management in both departments and fed back up through the system
This wasn't as easy as it sounds here…

9 Initial Challenges
Getting the right people sitting around the table
Proving the value of what we were doing to some program managers and senior management
Multiple players with multiple priorities – we had to show how we fit in with higher-level priorities, often with no notice

10 Initial Challenges
Changing players at all levels – just when you got to know someone, they were gone, and we lost visibility each time with people who needed to know we were there
The older AQI and the AQHI co-exist in several jurisdictions, so most of the baseline data are for the AQI and are not necessarily transferable (this also creates confounding and public confusion between the two programs)

11 Evolving Challenges
Maintaining commitment
Still have to deal with changing players
Getting relevant, AQHI-specific data and then making sense of it (getting at "show me the numbers")
Actually measuring some of our indicators – data from partners are not standardized

12 Evolving Challenges
Validation of the program logic
Recommendations are not always followed up on
Getting at behaviour change (developmental evaluation) – attribution of any measured change to the program
AQI and AQHI still co-exist in many jurisdictions – we still need AQHI-specific baseline data, and we need them now

13 Addressing the Challenges
Persistence – hang in there
Demonstrate early the value of what you are doing – i.e., "what's in it for management?"
The entire process seemed overwhelming, so we broke it down into small bites, starting with developing the program logic model
Got training in evaluation
Collaboration between AQHI program staff with evaluation experience and evaluation specialists

14 Advantages of our approach
Gives you some influence in the process rather than having one imposed from above – in our case, it turned out to be a lot of influence
More opportunity to intervene effectively to prevent confusion from arising
Helped de-mystify the evaluation process for many program staff and managers
Builds capacity for the long term

15 Disadvantages
It wasn't easy, and it still isn't easy
Not all the partners are equally engaged, which slows things down
Raises expectations – it will be much harder once we undergo a summative evaluation
May be hard to replicate – it depended on unique opportunities (e.g., available program staff with evaluation expertise)

16 Results
Formative evaluation results were mainly positive and gave useful recommendations
The program performance measurement framework and indicators provide a focus for smaller partner agencies – better alignment of individual programs and projects with national program goals
Helped identify key data gaps – now we can work to fill those gaps
Helped make the case for continued funding for the next five years

17 Fluid process
Have to stay on top of things:
Staff change
Departmental priorities change
Funding changes
Organisational structures change
Some of the assumptions made for the program logic prove to be wrong

18 What's next
The next round of funding is coming up
Submission to Treasury Board (TB) for the next five years
TB will require an evaluation before the end of the five years
It is highly likely that this will be a summative evaluation

19 What's next
Establishing a performance management committee:
– Membership is open to all partners and stakeholders
– To have performance measurement results used in decision making and for future planning, data and information dissemination, knowledge transfer, and communication functions
– To work together to address gaps in both program logic and data collection

20 Au revoir…
If you want more details of what we did, feel free to contact us:
Sharon.Jeffers@ec.gc.ca (514-283-8621)
Kamila.Tomcik@hc-sc.gc.ca (902-426-9449)

