Why Libraries Matter: The Case for Outcomes. Institute of Museum and Library Services, The Quintessential Meeting, Philadelphia, October 28, 2003.

Similar presentations
Introduction to Monitoring and Evaluation

What can outcomes based planning and evaluation do for you? Elizabeth Kryder-Reid, Director, Museum Studies, IUPUI and Principal Investigator, Shaping.
Presentation at The Conference for Family Literacy Louisville, Kentucky By Apter & O’Connor Associates April 2013 Evaluating Our Coalition: Are We Making.
How to Evaluate Your Health Literacy Project Jill Lucht, MS Project Director, Center for Health Policy
Intro to Grant-Seeking Presented by Bess de Farber Library Grants Manager George A. Smathers Libraries University of Florida February 09,
2014 AmeriCorps External Reviewer Training Assessing Need, Theory of Change, and Logic Model.
Decision Making Tools for Strategic Planning 2014 Nonprofit Capacity Conference Margo Bailey, PhD April 21, 2014 Clarify your strategic plan hierarchy.
Program Evaluation and Measurement Janet Myers. Objectives for today… To define and explain concepts and terms used in program evaluation. To understand.
The New York State Library Division of Library Development Outcome-Based Evaluation for Technology Training Projects Developed for: The New York State.
Developing a Logic Model
Getting on the same page… Creating a common language to use today.
Dennis McBride, Ph.D. The Washington Institute (253) Goal Driven Logic Models.
The Lumina Center Grantseeking Workshop Series Presents Outcomes & Evaluations April 20, 2006.
2014 AmeriCorps State and National Symposium How to Develop a Program Logic Model.
Evaluation. Practical Evaluation Michael Quinn Patton.
How to Write Goals and Objectives
HOW VOLUNTEERS MAKE A DIFFERENCE: Identifying and Measuring Volunteer Outcomes Montgomery County May 2015 Pam Saussy and Barry Seltser, Pro Bono Consultants.
1 Introduction to State Logic Models and Related Planning Stephanie Lampron, NDTAC.
Molly Chamberlin, Ph.D. Indiana Youth Institute
How to Develop the Right Research Questions for Program Evaluation
1 I N S T I T U T E of M U S E U M and L I B R A R Y S E R V I C E S Washington, DC. Knowing What Audiences Learn.
2014 AmeriCorps External Reviewer Training
Dr. G. Johnson, Program Evaluation and the Logic Model Research Methods for Public Administrators Dr. Gail Johnson.
Edward M. Haugh Jr. ESC Consultant. III. Recommendations for Applying Outcomes Planning to ESC  I. Introduction to Outcomes Planning II. A Sample ESC.
Evaluation: A Necessity for the Profession, an Essential Support for Practitioners Linking Intervention with Outcome Bryan Hiebert University of Victoria.
© 2003 IBM Corporation July 2004 Technology planning for not-for-profit organizations IBM volunteer name Title, organization.
Sustainability… Start Now for a Vibrant Future Sustainability Workshop for Persistently Dangerous Schools Grantees Philadelphia, PA Tuesday, September.
Washington State Library OBE Retreat October 20, 2006 Matthew Saxton & Eric Meyers.
Performance Measures AmeriCorps Project Director Training Saratoga, NY October 7 th – 9 th, 2013.
The Evaluation Plan.
Logic Models Handout 1.
Program Evaluation and Logic Models
WRITING EFFECTIVE GRANT PROPOSALS With an Eye toward Performance and Evaluation Issues.
Monitoring and Evaluation in MCH Programs and Projects MCH in Developing Countries Feb 10, 2011.
1 Introduction to Evaluating the Minnesota Demonstration Program Paint Product Stewardship Initiative September 19, 2007 Seattle, WA Matt Keene, Evaluation.
Fundamentals of Evaluation for Public Health Programs ROBERT FOLEY, M.ED. NIHB TRIBAL PUBLIC HEALTH SUMMIT MARCH 31,
Evaluation Assists with allocating resources what is working how things can work better.
Outcome Based Evaluation for Digital Library Projects and Services
Cathy Burack and Alan Melchior The Center for Youth and Communities The Heller School for Social Policy and Management, Brandeis University Your Program.
Logic Models and Theory of Change Models: Defining and Telling Apart
Purposes of Evaluation Why evaluate? Accountability: Justify trust of stakeholders (funders, parents, citizens) by showing program results (Summative)
Human Services Integration Building More Effective Responses to Peoples’ Needs.
Prepared by the North Dakota State Data Center July HNDECA and ECCS Evaluation Dr. Richard Rathge Professor and Director North Dakota State Data.
Program Evaluation for Nonprofit Professionals Unit 1 Part 2: Evaluation and The Logic Model.
Logic Models as Tools for Developing Performance Measures Presented by JoAnn A. Smith, MPH Community Health Administration Grants Monitoring and Program.
Monitoring and Evaluating in the PHEA Educational Technology Initiative Patrick Spaven.
The Logic Model An Outcomes-Based Program Model. What is a Logic Model? “a systematic and visual way to present and share your understanding of the relationships.
What is a Library ‘Outcome’ and How Do You Measure One? Andy Stewart Library Director C. L. Wilson Library Missouri S&T.
Using Logic Models in Program Planning and Grant Proposals The Covenant Foundation.
Community Planning 101 Disability Preparedness Summit Nebraska Volunteer Service Commission Laurie Barger Sutter November 5, 2007.
1 A QTS Web Training Writing Consumer Education & Referral Outcomes.
Catholic Charities Performance and Quality Improvement (PQI)
Planning Evaluation Setting the Course Source: Thompson & McClintock (1998) Demonstrating your program’s worth. Atlanta: CDC/National Center for Injury.
1 The project is financed from the European Union funds within the framework of Erasmus+, Key Action 2: Cooperation for innovation and the exchange of.
Monitoring and Evaluation in MCH Programs and Projects MCH in Developing Countries Feb 9, 2012.
Using Logic Models to Create Effective Programs
Outcome-based Planning and Evaluation Gloria Latimer, Ed.S, Director of Community Programs Jason Vahling, M.P.H., Community Program Specialist.
Advancing Innovation in Measuring Patient Advocacy Outcomes.
Session 2: Developing a Comprehensive M&E Work Plan.
1 Outcomes: Libraries Change Lives — Oh yeah? Prove it. The Institute of Museum and Library Services
Assessment/Evaluation Make evaluation a central part of planning – not an afterthought 1) Determine Needs 2) Determine Desired Outcomes 3) Determine Activities.
A Hierarchy for Targeting Outcomes and Evaluating Their Achievement S. Kay Rockwell Professor and Evaluation Specialist Agricultural.
Developing a Monitoring & Evaluation Plan MEASURE Evaluation.
ADRCs Do What? Using Logic Models to Document and Improve ADRC Outcomes Glenn M. Landers.
Logic Models How to Integrate Data Collection into your Everyday Work.
BEST PRACTICES IN LIBRARY INSTRUCTION FORUM November 7, 2007
Using Logic Models in Project Proposals
BOOTCAMP SOCIAL INNOVATION ACCELERATOR TO CREATE LASTING CHANGE
Presentation transcript:

1 Institute of Museum and Library Services. The Quintessential Meeting, Philadelphia, October 28, 2003. Why Libraries Matter: The Case for Outcomes

2 Overview We will talk about  IMLS  Brief history and functions of outcome-based evaluation (OBE) (advocacy and management)  Definitions and practical applications  Where to find resources, information, and assistance

3 Institute of Museum and Library Services The Institute of Museum and Library Services (IMLS) is a federal agency that fosters leadership, innovation, and a lifetime of learning through grants to museums and libraries. Please see the IMLS Web site (www.imls.gov) for additional information about IMLS and its grant programs.

4 What is IMLS and what do we do?  Provide competitive and formula-based grants to libraries, museums, archives, and associated organizations to increase the excellence of their services and to create a nation of learners  Provide information of interest to the museum, library, and archives fields and to their audiences and stakeholders

5 Who needs to know?  Policy-makers  Parent institutions  Library profession  General public  Funders
Who are your key audiences for data on the value of libraries?  Program audiences  Funders  Media  Other?

6 How do outcomes support the work of libraries?  Provide concrete, objective information to show library value  Provide information that responds to stakeholder values and background  Go beyond “what we provided” to “why do we matter?”

7 Challenges to showing the value of library services  How library products and services develop  “Let’s put on a show!”  Hopes and dreams (“democracy will flourish”)  Traditions of tracking and reporting library services  IMLS solutions

8 We will  Talk about outcome-based evaluation (OBE)  Talk about benefits  Define terms  Describe process  Discuss how to use results

9 What are outcomes? Outcomes are achievements or changes in  Skill (information literacy, competitive intelligence)  Knowledge (where to find clinical trials, symptoms of diabetes)  Behavior (avoids high-fat foods, wears a seat belt)

10 What are outcomes? Outcomes are achievements or changes in  Attitude (librarians are valuable, immunization is important)  Status (holds an MLIS, SLA or MLA member)  Life condition (employed, healthy)

11 Example Wally goes to a reading program and  Learns childhood reading is important  Wants to read to his son  Uses a literacy program  Advances 2 literacy levels  Gets his GED. What kind of outcome is each of these?

12 Outcomes: Where do they come from?  Government Performance and Results Act (GPRA) 1993  New accountability, need to show value  Funding trends  IMLS and other federal funders  State and local government  Foundations (Gates, Kellogg, local)  ROI/cost-benefit

13 Evaluate? Let me count the ways!  Internal  External  Participant  Process  Inputs, outputs (counts)  Outcomes  Formative  Summative  Cost/benefit  ROI...

14 How does OBE fit libraries? Inputs — How much we use. Outputs — How much we do. Performance quality — How well we do. Economic value — What we’re worth. Outcomes — What good we do.

15 How does OBE fit libraries? OBE changes focus from activities to benefits. OBE needs to be part of the program design. OBE shows to what extent a program met its own goals.

16 Fears & Realities Fear: Evaluation will use too much time. Reality: Consider the scope of the evaluation. Fear: Visitor privacy may be compromised. Reality: Ask for volunteers; people like to tell you what they think.

17 Fears & Realities Fear: Evaluation takes money from other priorities. Reality: Funds are available, and results help leverage other funds. Fear: Library impact cannot be measured. Reality: Show short-term, immediate impact logically related to your service and to progress toward a larger vision (e.g., becoming a lifelong learner).

18 More Realities Libraries collect similar information to improve services and user satisfaction. Evaluation can increase participation, improve services, and leverage funds.

19 Why measure outcomes?  Know if the program met its purpose  Improve programs  Guide management  Communicate program impact  Satisfy funders’ need to know

20 What is a program?  A series of services and activities that lead to a goal  Has a definite beginning and end (for the program or the participant)  Meant to change attitude, behavior, knowledge, skill, status, or condition

21 How to develop an outcome-oriented program Identify a need. Need can be based on  Your experiences  Program partner experiences  Formal or informal research

22 How to develop an outcome-oriented program Look at assumptions and verify long-term needs:  Many people use unreliable health information; this is increased by easy access to online info  Many people rely on public libraries for health information  If public libraries provide fast, high-quality health information, people might use it for health decisions

23 How to develop an outcome-oriented program Look at assumptions and verify short-term needs:  Many public library staff lack needed computer, info literacy, and health info skills  Many such staff lack confidence for health-related questions, are slow, and don’t know sources  Many public libraries lack staff to provide needed training

24 How to develop an outcome-oriented program Look at assumptions and verify short-term needs:  Our library could provide health-directed computer, info literacy, and online search skills  We could improve the speed and quality of health information service in public libraries  In the long term, we could improve health decisions among library users

25 How to develop an outcome-oriented program Identify a solution: a structured program to provide public library staff with basic computer and information literacy skills to help people answer health-related questions. Identify desired results (short-term):  Staff will learn basic computer skills  Staff will have basic health information literacy  Staff will learn to locate needed information quickly

26 How to develop a program Look at stakeholders: include individuals, agencies, funding sources, competition, community groups, and national and state affiliations. They influence:  Desired outcomes  How results are communicated  Type and nature of services  Whom the program serves

27 How to develop a program Look at the audience. Who is served by a program depends on several factors:  Stakeholders  Assumed need  Mission and resources

28 Example: Program purpose
 Does what: provides a structured training program in health information for public library staff
 For whom: public library staff in West Dakota
 For what outcome: public library staff have basic computer skills, are information literate, and can find high-quality health information quickly

29 Measuring outcomes OBE is not the same as research:  No intent to compare with other programs  No intent to prove contribution or sole responsibility for change. OBE shows contribution, not attribution. OBE shows what results one program achieved for its audience.

30 Measuring outcomes: building a logic model  The logic model is the evaluation plan  Shows how all elements fit together  Helps assure desired results  A logic model is not a form to fill in; it is a flexible, dynamic planning tool  Builds consensus about goals and roles

31 Outcomes Logic Model PROGRAM PURPOSE: Inputs → Services/Activities → Outputs → Outcomes
 Outcome: the intended impact
 Indicator: observable and measurable behaviors or conditions
 Applied to: the population to be measured
 Data sources: sources of information about the conditions being measured
 Data intervals: when data is collected
 Target (goal): the amount of impact desired
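The logic-model columns above map naturally onto a small data structure. A minimal sketch in Python (the class and field names are illustrative assumptions, not an IMLS-specified format), populated with the worked example that appears on slide 39 of this deck:

```python
from dataclasses import dataclass, field

@dataclass
class OutcomeRow:
    """One row of the outcomes logic model: outcome, indicator,
    population, data sources, data interval, and target."""
    outcome: str                 # intended impact
    indicator: str               # observable, measurable behavior or condition
    applied_to: str              # the population to be measured
    data_sources: list = field(default_factory=list)  # where evidence comes from
    data_interval: str = ""      # when data is collected
    target_pct: float = 0.0      # amount of impact desired, as a percentage

# The worked example from slide 39 of this deck:
row = OutcomeRow(
    outcome="Public library staff have basic computer skills",
    indicator="Completes 4 basic computer tasks without assistance",
    applied_to="All participants who finish the 'basic' training session",
    data_sources=["Completed assignment"],
    data_interval="After the 'basic' training session",
    target_pct=75.0,
)
```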

32 Identify program outcomes  Immediate  Intermediate  Long term

33 Hints  A program may have many outcomes  Pick a few important outcomes to measure  One significant outcome may be enough

34 Short-term outcomes Outcome 1. Participants have basic computer skills. Outcome 2. Participants are “information literate.” Outcome 3. Participants find high-quality health information fast.

35 Long-term outcomes Outcome 1. The public will seek health information assistance from public libraries. Outcome 2. The public will make health decisions based on high-quality information.

36 Indicators  Observable, measurable, clearly identifiable  Unambiguous  Several may apply to each outcome

37 Indicators Outcome 1: Participants have basic computer skills. Indicator: Participants can perform 4 basic computer tasks without assistance (set screen font size, download an application, save a file to a personal folder, print a document).

38 Indicators How does completing a few tasks indicate basic computer skills? Locating information online requires some basic skills for using a computer. If library staff can demonstrate a representative skill set without assistance, they probably have “basic skills.”

39 Logic Model Outcome 1. Public library staff have basic computer skills
 Indicator: participants who can complete 4 basic tasks without assistance
 Applied to: all participants who finish the “basic” training session
 Data source: completed assignment
 Data interval: after the “basic” training session
 Target (goal): 75%
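Reading the row: the program meets this target if at least 75% of the measured population satisfies the indicator. A minimal check of that arithmetic, with hypothetical counts invented purely for illustration:

```python
def target_met(completers: int, succeeded: int, target_pct: float) -> bool:
    """True if the observed indicator rate meets or exceeds the target (goal)."""
    observed_pct = 100.0 * succeeded / completers
    return observed_pct >= target_pct

# Hypothetical data: 40 staff finished the basic session, and 32 of them
# completed all 4 tasks without assistance -> 80%, meeting the 75% target.
print(target_met(completers=40, succeeded=32, target_pct=75.0))  # True
```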

40 Reports What should reports say?  We wanted to do what  We did what  So what

41 Reports  Summarize participant characteristics  Summarize inputs, activities/services, outputs, and outcomes  Respond to influencers’ need for information  Compare data from program start or previous period  Interpret results and make recommendations

42 Report Elements
 Inputs: What did we use? How much did we spend? How much did we consume?
 Activities & services: What did we do?
 Outputs: How many units did we deliver? To whom (audience characteristics)?
 Outcomes: What did we achieve for our target audience?
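One way to keep these four elements together when drafting a report is a simple template. A hypothetical sketch (section labels follow the slide; all names and figures are invented for illustration):

```python
def report_summary(inputs: str, activities: str, outputs: str, outcomes: str) -> str:
    """Assemble the four report elements into a plain-text summary."""
    sections = [
        ("Inputs (what we used, spent, consumed)", inputs),
        ("Activities & services (what we did)", activities),
        ("Outputs (units delivered, to whom)", outputs),
        ("Outcomes (what we achieved for our audience)", outcomes),
    ]
    return "\n".join(f"{title}: {body}" for title, body in sections)

# All figures below are invented for illustration.
print(report_summary(
    inputs="2 trainers, 40 staff hours, $3,000",
    activities="Three 'basic computer skills' workshops",
    outputs="40 staff trained across 12 branch libraries",
    outcomes="80% of participants completed all 4 basic tasks unassisted",
))
```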

43 Who will do the work? Option: hire a consultant. Benefits:  Result may be seen as unbiased  Professionals have the most expertise  Process may be more efficient  Offers an outside perspective

44 Who will do the work? Option: hire a staff evaluator. Benefits:  May reduce cost  Greater understanding of your program  Greater commitment to the process

45 Who will do the work? Option: train existing staff (ideal). Benefits:  Integrates evaluation into routine management activities  Staff know programs and audience  Skills transfer to all programs

46 What do you get? Small investment: numbers, audience characteristics, and customer satisfaction. Low to moderate investment: immediate changes in knowledge, behaviors, and attitudes.

47 What do you get? Moderate to high investment: attribute short-term changes in audience skills, knowledge, behaviors, and attitudes to the program. High investment: short- and long-term impact, attribution of impact to the program, and the variables influencing impact.

48 Your Action Plan  Discuss benefits to your library  Get more information  Consider how to apply OBE  Develop a plan

49 We have  Talked about uses of OBE  Talked about benefits  Defined terms  Described the process  Discussed how results are used

50 Starting Places:  NLG Project Planning: A Tutorial, IMLS (http://e-services.imls.gov/project_planning/)  Perspectives on Outcome-Based Evaluation for Libraries and Museums (2001), IMLS  IMLS bibliography

51 Starting Places:  Action Plan for Outcomes Assessment in Your Library, Peter Hernon and Robert E. Dugan (2002), ALA Editions. A comprehensive plan specifically for librarians; provides data collection tools for measuring learning outcomes that link outcomes to user satisfaction.  W.K. Kellogg Foundation Evaluation Handbook (1998). A thorough introduction for organizations new to OBE.

52 For more information: Karen Motylewski, Institute of Museum and Library Services, 1100 Pennsylvania Avenue, NW, Washington, DC