Learning from evaluations
Jacques Toulemonde
Shaping the future of the ESF, Brussels, 23-24/06/2010
Study on the Return on Human Capital Investment
DG Employment, October 2009 - June 2010
- Research synthesis
- Impact analyses in four countries
- Meta-evaluation
Findings: Main knowledge gaps
- Relating costs and impact
- Investing in 'at-risk' sectors, firms, and people
- Specific impact
  - per training approach (quality)
  - per target group (women, younger, older, vulnerable people)
- Impact over the business cycle
- Substitution effect and skill mismatch
Findings: Lessons about impact analysis
- Impact indicators lie
- Data collection matters
- Analysing impact takes years
- Impact estimates do not speak for themselves
- Broader studies, fewer lessons
Findings: Lessons about learning
- Four success stories of impact analyses
  - credible methods
  - relevant findings
- … but limited learning
  - supply push
  - insufficient explanation
  - problematic timeliness
Findings: What prevents us from learning?
- Limited political ownership
- Lack of explanatory studies
- Lack of selectivity
- Poor dissemination strategy
- Learning process trapped in the programming cycle
Proposals: Make learning mandatory
- Rolling evaluation plans, including impact analyses targeted at knowledge gaps
- Annual follow-up report, including quality-assessed lessons
- Knowledge syntheses
  - in all evaluations
  - before any financial decision
- Learning: a conditionality
Proposals: Towards knowledge communities
- Incentives for
  - bridging knowledge gaps
  - clustering evaluations across programmes
  - participating in knowledge communities
- Guidance for
  - sound impact analysis techniques
  - effective dissemination approaches
- Mutual learning about
  - getting evaluations used by policy-makers
Thank you for your attention