Outcomes: Libraries Change Lives — Oh yeah? Prove it. The Institute of Museum and Library Services.

Presentation transcript:

1 Outcomes: Libraries Change Lives — Oh yeah? Prove it. The Institute of Museum and Library Services. Public Library Association, Phoenix, AZ, 2002

2 Institute of Museum and Library Services The Institute of Museum and Library Services (IMLS) is a federal agency that fosters leadership, innovation, and a lifetime of learning through grants to museums and libraries. Please see the IMLS Web site for additional information about IMLS and its grant programs.

3 We will  Talk about outcome-based evaluation (OBE)  Talk about benefits  Define terms  Describe process  Discuss how to use results

4 What are outcomes? Outcomes are achievements or changes in  Skill: information literacy, basket weaving  Knowledge: a state’s population, symptoms of diabetes  Behavior: completes homework, reads to his kids

5 What are outcomes? Outcomes are achievements or changes in  Attitude: libraries are good, I support recycling  Status: high school graduate, certified librarian  Life condition: homeless, healthy

6 Example Wally goes to a reading program and  Learns childhood reading is important  Wants to read to his son  Uses a literacy program  Advances 2 literacy levels  Gets his GED (What kinds of outcomes are each of these?)

7 Outcomes Where do they come from?  Government Performance and Results Act (GPRA) 1993  New accountability, need to evaluate  Funding trends – IMLS – State government, LSTA – Foundations (Gates, Pew)

8 Evaluate? Let me count the ways!  Internal  External  Participant  Process  Inputs, Outputs (counts)  Outcomes  Formative  Summative  Cost/Benefit  ROI...

9 How does OBE fit libraries?  Inputs — How much we use  Outputs — How much we do  Performance quality — How well we do  Economic value — What we’re worth  Outcomes — What good we do

10 How does OBE fit libraries?  OBE changes focus from activities to benefits  OBE needs to be part of the program design  OBE shows to what extent a program met its own goals

11 Fears & Realities Will use too much time Will use too much time – Consider scope of the evaluation – Consider scope of the evaluation Visitor privacy may be compromised Visitor privacy may be compromised – Ask for volunteers – People like to tell you what they think

12 Fears & Realities Takes money from other priorities Takes money from other priorities – Funds are available – Results help leverage other funds Library impact cannot be measured – Show short-term, immediate impact Library impact cannot be measured – Show short-term, immediate impact

13 More Realities  Libraries collect similar information to improve services and user satisfaction  Evaluation can increase participation, improve services, and leverage funds

14 Why measure outcomes?  Know if program met purpose  Improve programs  Guide management  Communicate program impact  Satisfy funder’s need to know

15 What is a program?  Series of services & activities that lead to a goal  Has a definite beginning & end  Meant to change attitude, behavior, knowledge, skill, status, or condition

16 How to develop a program: Identify a need. Need can be based on  Your experiences  Program partner experiences  Formal or informal research

17 How to develop a program: Look at assumptions & verify needs  Students need structure after school  Students need homework help  Kids like computers  Our library can provide safety, structure, computer skills, and help with homework skills

18 How to develop a program: Identify a solution: a structured after-school program to provide kids with computer skills and homework help. Then identify desired results  Kids will learn basic computer skills  Kids will be information literate  Kids will get better grades

19 How to develop a program: Look at stakeholders. Include individuals, agencies, funding sources, competition, community groups, and national and state affiliations. They influence:  Desired outcomes  How results are communicated  Type and nature of services  Who the program serves

20 How to develop a program: Look at audience. Who is served by a program depends on several factors:  Stakeholders  Assumed need  Mission and resources

21 Example: Program purpose
Does what: provides a M-F after-school computer homework center
For whom: kids 8 to 12 in Springfield
For what outcome: kids have basic computer skills, are information literate, and get better grades

22 Program Elements
Inputs: resources; what a program uses
Services/Activities: tasks; what a program does
Outputs: products; quantity of work or products

23 Outcome-based program: Inputs → Services/Activities → Outputs → Outcomes (changes in knowledge, skills, abilities, attitudes, condition, or life status)

24 Measuring outcomes  OBE is not formal research  OBE shows contribution, not attribution  OBE shows what results the program achieved

25 Measuring outcomes: Building a logic model  Logic model is the evaluation plan  Shows how all elements fit together  Helps assure desired results

26 Outcomes Logic Model
Program: Inputs → Services/Activities → Outputs
Outcome: intended impact
Indicator: observable and measurable behaviors or conditions
Applied to: the population to be measured
Data sources: sources of information about conditions being measured
Data intervals: when data is collected
Target: the amount of impact desired
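
Viewed as a data structure, each row of the logic model pairs one outcome with its evidence plan. A minimal sketch in Python; the class and field names are illustrative, not an IMLS schema:

```python
from dataclasses import dataclass

@dataclass
class LogicModelRow:
    """One row of an outcomes logic model (names are illustrative)."""
    outcome: str        # intended impact
    indicator: str      # observable, measurable behavior or condition
    applied_to: str     # the population to be measured
    data_source: str    # where information about the condition comes from
    data_interval: str  # when data is collected
    target: float       # the amount of impact desired, as a proportion
```

Slide 33 fills in one such row for the homework-center example.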

27 Outcomes: Identify program outcomes  Immediate  Intermediate  Long term

28 Outcomes hints:  A program may have many outcomes  Pick a few important outcomes to measure  One significant outcome may be enough

29 Outcomes  Outcome 1: Participants have basic computer skills  Outcome 2: Participants are “information literate”  Outcome 3: Participants complete homework

30 Indicators  Observable, measurable, clearly identifiable  Unambiguous  Several may apply to each outcome

31 Indicators Outcome 1: Participants have basic computer skills. Indicator: # and % of participants who can word-process one complete assignment without error

32 Indicators Does using a word processor indicate basic computer skills?  A word processor is a basic tool  Word processing requires specific skills  If kids can produce a short assignment, they probably have one basic skill set

33 Logic Model
Outcome 1: Kids have basic computer skills
Indicator: # and % of kids who can word-process 1 complete assignment without error
Applied to: all kids who finished the “basic” training session
Data source: completed assignment
Data interval: after the “basic” training session
Target: 75%
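
The arithmetic behind this row is just a count, a percentage, and a comparison with the target. A minimal sketch continuing the example; the 18-of-22 figures are hypothetical:

```python
def indicator_result(successes: int, measured: int, target: float) -> dict:
    """Compute the '# and %' for an indicator and check it against the target."""
    pct = successes / measured
    return {"count": successes,
            "percent": round(100 * pct, 1),
            "target_met": pct >= target}

# Hypothetical data: 18 of the 22 kids who finished the "basic" training
# session word-processed one complete assignment without error.
print(indicator_result(successes=18, measured=22, target=0.75))
# {'count': 18, 'percent': 81.8, 'target_met': True}
```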

34 Reports What should reports say?  We wanted to do what  We did what  So what

35 Reports  Summarize participant characteristics  Summarize inputs, activities/services, outputs, and outcomes  Respond to influencers’ need for information  Compare data from program start or previous period  Interpret results and make recommendations

36 Report Elements
Inputs: What did we use? How much did we spend? How much did we consume?
Activities & Services: What did we do?
Outputs: How many units did we deliver? To whom? (audience characteristics)
Outcomes: What did we achieve for our target audience?
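
These four elements can double as the skeleton of the report itself. A minimal sketch that prints one line per element; all figures are hypothetical and continue the homework-center example:

```python
# Hypothetical report data for the after-school computer homework center.
report = {
    "Inputs": "10 computers, 2 staff, 120 staff hours",
    "Activities & Services": "M-F after-school computer homework center; 40 sessions",
    "Outputs": "22 kids served, ages 8 to 12",
    "Outcomes": "18 of 22 kids (81.8%) word-processed an assignment without error; 75% target met",
}

for element, summary in report.items():
    print(f"{element}: {summary}")
```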

37 Who will do the work? Option: Hire a consultant. Benefits:  Result may be seen as unbiased  Professionals have the most expertise  Process may be more efficient  Offers outside perspective

38 Who will do the work? Option: Hire a staff evaluator. Benefits:  May reduce cost  Greater understanding of your program  Greater commitment to the process

39 Who will do the work? Option: Train existing staff (ideal). Benefits:  Integrate evaluation into routine management activities  Staff know programs and audience  Skills transfer to all programs

40 What do you get?  Small investment: numbers, audience characteristics, and customer satisfaction  Low to moderate investment: immediate changes in knowledge, behaviors, and attitudes

41 What do you get?  Moderate to high investment: attribute short-term changes in audience skills, knowledge, behaviors, and attitudes to the program  High investment: short- and long-term impact, attribution of impact to the program, variables influencing impact

42 Your Action Plan  Discuss benefits to your library  Get more information  Consider how to apply OBE  Develop a plan

43 We have  Talked about uses of OBE  Talked about benefits  Defined terms  Described process  Discussed how results are used

44 Starting Places:  Perspectives on Outcome-based Evaluation for Libraries and Museums (2001), IMLS  IMLS bibliography

45 Starting Places:  Action Plan for Outcomes Assessment in Your Library, Peter Hernon and Robert E. Dugan (2002), ALA Editions: a comprehensive plan specifically for librarians; provides data collection tools for measuring learning outcomes that link outcomes to user satisfaction  W.K. Kellogg Foundation Evaluation Handbook (1998): thorough introduction for organizations new to OBE

46 For more information: Karen Motylewski, Institute of Museum and Library Services, 1100 Pennsylvania Avenue, NW, Washington, DC