A New Focus for External Validity

1 A New Focus for External Validity
Critical Juncture: Applying Assessment Tools and Approaches to Scaling Up
American Evaluation Association Annual Conference, 2013, Washington, DC

2 Facilitating the Full Development and Evaluation Chain:
Effectiveness → Efficiency → “Impact” → “Sustainability” → “Scale”

3 Ecology: Development, Scale and Evaluation
Type of Project | Theory of Change / Innovation | Impact Level | Typical Evaluation Questions
Policy | New Rules | Sector-wide | “Was the policy adopted?”
Capacity Building | Training in new methods | Organizational | “Were skills transferred?”
Pilot | Testing of Innovation before Scaling | Local | “Did it work? / Did we meet objectives?”

4 Factors affecting development program success
Policies/Regulations; Org Culture; Resources; Capacity/Skills; Incentives; Innovation/Intervention (evaluations often focus only here)

5 Why do so few Innovations reach Scale?
Most innovations are NOT designed, monitored, or evaluated for potential or performance at scale.
DESIGN: Most projects are short-term and “innovative” rather than long-term programs.
IMPLEMENTATION: Focus is often on implementation and short-term results, not institutionalization.
ASSESSMENT: Evaluations are often retrospective and post-facto and focus on performance; few look outward/forward at permanent implementer systems.

6 Innovation Assessments: Change the Question
“Did it work?” | Internal Validity
“Did it impact the target population?” | External Validity (Populations; Impact)
“Can it work in another context or system?” | External Validity (Systems; Sustainability)

7 Ecology: Development, Scale and Evaluation
Type of Project | Theory of Change | Scale | Internal Validity Evaluations | External Validity Assessments
Policy | New Rules | Sector-wide | “Was the policy adopted?” | “Is the policy being implemented and enforced?”
Training/Mentoring | Capacity Building in new methods | Organizational | “Were skills transferred?” | “Are new skills/methods improving outcomes?”
Pilot | Testing of Innovation before Scaling | Local | “Did it work? / Did we meet objectives?” | “Can it work elsewhere?”

8 Three Entry Points for Scalability Assessments:
Design Phase (Generating Evidence): Track operational indicators in the pilot and as you go to scale.
Mid-Course (Managing Change / Monitoring Adoption): Design and insert indicators to track systems adaptation of key model components.
Post-Project (Assessing Potential for Scale; Monitoring at Scale): Assess model components for potential adaptability into the adopter’s systems; monitor quality and effectiveness at scale.

9 What’s New About This?
New Evaluation Purpose: from generating evidence for retrospective assessments to evidence for prospective change management (the ‘Second-Phase Pilot’).
New Audience, Client: from technical and program staff/implementers to decision-makers of adopters.
New Evidence Universe: from assessment of program effects within target populations to assessing the viability of the model in permanent or adopting systems (political priorities, org culture, incentives, HR, financing).

10 Gathering Evidence: Going Beyond Effectiveness
Method | Purpose
Experimental (RCT) or quasi-experimental design ideal | Prove theory of change / establish attributability (evaluations often stop here)
Supplement with studies of variance on external factors and components | Isolate salient aspects of the model that could be generalized to a larger population; identify a ‘stripped-down’ model to scale
Combine with qualitative methods | Triangulate attributability; identify the model’s full process; identify tacit elements/factors of success beyond the technical intervention
Document cost: unit cost and operational costs | Assess adoption costs for the “permanent” implementer(s) (see the sketch below)
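To make the cost-documentation row concrete, here is a minimal sketch, assuming hypothetical pilot figures, of how a pilot’s unit cost and a projected annual adoption cost for a permanent implementer might be tallied. The function names (unit_cost, projected_adoption_cost), the overhead term, and all dollar amounts are illustrative assumptions, not numbers or methods taken from the presentation.

```python
# Hypothetical illustration only: every name and figure below is assumed for
# this sketch, not drawn from the presentation or from MSI's framework.

def unit_cost(total_pilot_cost: float, beneficiaries_reached: int) -> float:
    """Average cost per beneficiary observed in the pilot."""
    return total_pilot_cost / beneficiaries_reached


def projected_adoption_cost(cost_per_beneficiary: float,
                            target_beneficiaries: int,
                            annual_operational_overhead: float) -> float:
    """Rough annual cost to the adopting ("permanent") implementer at target scale."""
    return cost_per_beneficiary * target_beneficiaries + annual_operational_overhead


if __name__ == "__main__":
    # Placeholder pilot figures, assumed for illustration.
    pilot_cost = 500_000.0   # total pilot spending, USD
    pilot_reach = 10_000     # beneficiaries reached in the pilot

    per_beneficiary = unit_cost(pilot_cost, pilot_reach)
    at_scale = projected_adoption_cost(
        per_beneficiary,
        target_beneficiaries=1_000_000,
        annual_operational_overhead=2_000_000.0,  # assumed adopter overhead, USD/year
    )
    print(f"Pilot unit cost: ${per_beneficiary:,.2f} per beneficiary")
    print(f"Projected annual adoption cost at scale: ${at_scale:,.0f}")
```

Separating the per-beneficiary unit cost from the adopter’s own operational overhead simply mirrors the slide’s distinction between unit cost and operational costs.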

11 Standards of Evidence
Innovation | Minimal objective evidence
Promising Practice | Multiple anecdotal reports
Model | Positive evidence in a few cases
Good Practice | Clear evidence from several cases
Best Practice | Evidence of impact from multiple settings and meta-analyses
Policy Principle | Proven; a “truism” essential for success
“Scalability” requires the same systematic evidence gathering and analysis as any assessment.

12 Quick Background: MSI’s History in Scale
Implementing Policy Change contract under USAID from 2009: developed the Scaling-Up Management Practitioner’s Guide.
Grants from the MacArthur and Packard Foundations to assess the ‘scalability’ of 20 pilot programs globally.

13 MSI Framework: 3 Steps, 10 Tasks
Step 1: Developing a Scaling Up Plan (Result: realistic assessment of prospects, parameters, and strategy for scale)
  Task 1: Creating a Vision
  Task 2: Assessing Scalability
  Task 3: Filling Information Gaps
  Task 4: Preparing a Scaling Up Plan
Step 2: Establishing the Preconditions for Scaling Up (Result: decisions taken and resources allocated for going to scale)
  Task 5: Legitimizing Change
  Task 6: Constituency Building
  Task 7: Realigning and Mobilizing Resources
Step 3: Implementing the Scaling Up Process (Result: sustainable provision of services at scale)
  Task 8: Modifying and Strengthening Organizations
  Task 9: Coordinating Action
  Task 10: Tracking Performance and Maintaining Momentum

14 How we do it in the field: Example from India
Initial meeting with implementing and adopting organizations:
  Preliminary assessment of scalability
  Goals for scaling up; willingness/ability to do it
Scaling Up Workshop I:
  Identification of the model and evidence of success
  Full scalability assessment: internal and external
  Goal setting: what ultimate scale and impact? multi-stage?
Interim research period: filling the evidence gaps
Scaling Up Workshop II:
  Political mapping and stakeholder analysis
  Devise specific legitimation and advocacy strategies
  Action plan, roles and responsibilities, timetable
Fundraising

