Clarifying the Evaluation Focus in a Complex Program Context
Howard Kress, PhD*
Natalie Brown, MPH**
*Battelle Memorial Institute, with National Center for Injury Prevention and Control, Office of the Director
**National Center for Injury Prevention and Control, Office of the Director
American Evaluation Association, November 13, 2009
The findings of this presentation are the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention.
Topics Covered
Charge of the evaluation
Tools developed
Lessons learned
Charge of the Evaluation
Assess the relevance, quality, and significance of the ICRC (Injury Control Research Center) program
Highlight specific case studies and/or success stories
Identify areas for program planning
–Funding and policy decisions
–CDC support
–Staffing
Evaluation Questions
What is the value of the ICRC program?
How has the ICRC program built the injury field?
–Value outside CDC & ICRCs
–Advantage of program vs. grants
How has the ICRC program affected injury outcomes?
–Contributions toward behavior modification
–Influences on policy and legislation
Evaluation Challenges
Variation in length of CDC funding
CDC funding only a portion of total funding
Different funding mechanisms
Variation in host university support
Limitations of CDC data systems
Single vs. multiple research foci
Monitoring of training and education
Multiple activities implemented simultaneously
Intended vs. unintended outcomes
Attribution vs. contribution
Concept Model
FOA (Funding Opportunity Announcement) Logic Model
Implementation Logic Model
Lessons Learned
Understand the stakeholder context
–Document reviews
–Conversations
–Interviews
Utility of models
–Context
–Map out barriers and opportunities
Models help guide evaluation
Contact Information
Howard Kress, Battelle Memorial Institute
Natalie Brown, Centers for Disease Control and Prevention