1 Institute of Museum and Library Services. The Quintessential Meeting, Philadelphia, October 28, 2003. Why Libraries Matter: The Case for Outcomes
2 Overview: we will talk about IMLS; a brief history and the functions of outcome-based evaluation (OBE) for advocacy and management; definitions and practical applications; and where to find resources, information, and assistance.
3 Institute of Museum and Library Services: The Institute of Museum and Library Services (IMLS) is a federal agency that fosters leadership, innovation, and a lifetime of learning through grants to museums and libraries. Please see IMLS’s Web site at http://imls.gov for additional information about IMLS and its grant programs.
4 What is IMLS, and what do we do? We provide competitive and formula-based grants to libraries, museums, archives, and associated organizations to increase the excellence of their services and to create a nation of learners, and we provide information of interest to the museum, library, and archives fields and to their audiences and stakeholders.
5 Who needs to know? Policy-makers, parent institutions, the library profession, the general public, and funders. Who are your key audiences for data on the value of libraries? Program audiences, funders, media, others?
6 How do outcomes support the work of libraries? They provide concrete, objective information to show library value; provide information that responds to stakeholder values and background; and go beyond “what we provided” to “why do we matter?”
7 Challenges to showing the value of library services: how library products and services develop (“Let’s put on a show!”); hopes and dreams (“democracy will flourish”); traditions of tracking and reporting library services; IMLS solutions.
8 We will: talk about outcome-based evaluation (OBE); talk about its benefits; define terms; describe the process; and discuss how to use results.
9 What are outcomes? Outcomes are achievements or changes in: skill (information literacy, competitive intelligence); knowledge (where to find clinical trials, symptoms of diabetes); behavior (avoids high-fat foods, wears a seat belt).
10 What are outcomes? Outcomes are achievements or changes in: attitude (librarians are valuable, immunization is important); status (MLIS, SLA or MLA member); life condition (employed, healthy).
11 Example: Wally goes to a reading program and learns that childhood reading is important; wants to read to his son; uses a literacy program; advances two literacy levels; gets his GED. What kind of outcome is each of these?
12 Outcomes: where do they come from? The Government Performance and Results Act (GPRA) of 1993 brought new accountability and the need to show value. Funding trends: IMLS and other federal funders; state and local government; foundations (Gates, Kellogg, local); ROI/cost-benefit.
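Since funders increasingly ask for ROI and cost-benefit figures alongside outcomes, a rough calculation can be useful context. The sketch below is a minimal, hypothetical illustration (the dollar figures are invented and are not from the IMLS materials) of how a library might compute a simple benefit-to-cost ratio and ROI for a program.

```python
# Hypothetical figures for illustration only; real programs would need
# locally gathered cost and benefit estimates.
program_cost = 12_000.0        # staff time, training materials, equipment
estimated_benefit = 30_000.0   # e.g., value of staff time saved and services delivered

benefit_cost_ratio = estimated_benefit / program_cost      # 2.5 : 1
roi = (estimated_benefit - program_cost) / program_cost    # 1.5, i.e., 150%

print(f"Benefit:cost ratio = {benefit_cost_ratio:.1f}:1")
print(f"ROI = {roi:.0%}")
```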
13 Evaluate? Let me count the ways! Internal, external, participant, process, inputs and outputs (counts), outcomes, formative, summative, cost/benefit, ROI...
14 How does OBE fit libraries? Inputs: how much we use. Outputs: how much we do. Performance quality: how well we do. Economic value: what we’re worth. Outcomes: what good we do.
15 How does OBE fit libraries? OBE changes focus from activities to benefits. OBE needs to be part of the program design. OBE shows to what extent a program met its own goals.
16 Fears & Realities. Fear: evaluation will use too much time. Reality: consider the scope of the evaluation. Fear: visitor privacy may be compromised. Reality: ask for volunteers; people like to tell you what they think.
17 Fears & Realities. Fear: evaluation takes money from other priorities. Reality: funds are available, and results help leverage other funds. Fear: library impact cannot be measured. Reality: show short-term, immediate impact logically related to your service and to progress toward a larger vision (e.g., becoming a lifelong learner).
18 More realities: libraries collect similar information to improve services and user satisfaction; evaluation can increase participation, improve services, and leverage funds.
19 Why measure outcomes? To know whether the program met its purpose, improve programs, guide management, communicate program impact, and satisfy the funder’s need to know.
20 What is a program? A series of services and activities that lead to a goal; it has a definite beginning and end (for the program or the participant) and is meant to change attitude, behavior, knowledge, skill, status, or condition.
21 How to develop an outcome-oriented program: identify a need. The need can be based on your experiences, program partner experiences, or formal or informal research.
22 How to develop an outcome-oriented program: look at assumptions and verify long-term needs. Many people use unreliable health information, and easy access to online information increases this. Many people rely on public libraries for health information. If public libraries provide fast, high-quality health information, people might use it for health decisions.
23 How to develop an outcome-oriented program: look at assumptions and verify short-term needs. Many public library staff lack needed computer, information literacy, and health information skills. Many such staff lack confidence with health-related questions, are slow, and don’t know sources. Many public libraries lack staff to provide the needed training.
24 How to develop an outcome-oriented program: look at assumptions and verify short-term needs. Our library could provide health-directed computer, information literacy, and online search skills. We could improve the speed and quality of health information service in public libraries. In the long term, we could improve health decisions among library users.
25 How to develop an outcome-oriented program: identify a solution. A structured program to provide public library staff with basic computer and information literacy skills to help people answer health-related questions. Identify desired results (short-term): staff will learn basic computer skills; staff will have basic health information literacy; staff will learn to locate needed information quickly.
26 How to develop a program: look at stakeholders. These include individuals, agencies, funding sources, competition, community groups, and national and state affiliations. They influence: desired outcomes; how results are communicated; the type and nature of services; and who the program serves.
27 How to develop a program: look at the audience. Who is served by a program depends on several factors: stakeholders, assumed need, and mission and resources.
28 Example: program purpose. Does what: provides a structured training program in health information for public library staff. For whom: public library staff in West Dakota. For what outcome: public library staff have basic computer skills, are information literate, and can find high-quality health information quickly.
29 Measuring outcomes: OBE is not the same as research. There is no intent to compare with other programs or to prove sole responsibility for change. OBE shows contribution, not attribution; it shows what results one program achieved for its audience.
30 Measuring outcomes: building a logic model. The logic model is the evaluation plan; it shows how all elements fit together and helps assure desired results. A logic model is not a form to fill in; it is a flexible, dynamic planning tool that builds consensus about goals and roles.
31 Outcomes logic model. Program purpose: inputs, services/activities, outputs. Outcome: intended impact. Indicator: observable and measurable behaviors or conditions. Applied to: the population to be measured. Data sources: sources of information about the conditions being measured. Data intervals: when data is collected. Target (goal): the amount of impact desired.
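To make the logic model elements concrete, a small data structure can hold one outcome row. The sketch below is a minimal illustration, not an IMLS form or template; the class and field names are assumptions, while the values are drawn from the health-information training example used later in these slides.

```python
from dataclasses import dataclass

@dataclass
class OutcomeRow:
    """One row of an outcomes logic model."""
    outcome: str        # intended impact
    indicator: str      # observable, measurable behavior or condition
    applied_to: str     # population to be measured
    data_source: str    # where the evidence comes from
    data_interval: str  # when data is collected
    target: float       # amount of impact desired, as a proportion

row = OutcomeRow(
    outcome="Public library staff have basic computer skills",
    indicator="Can complete 4 basic computer tasks without assistance",
    applied_to="All participants who finish the 'basic' training session",
    data_source="Completed assignment",
    data_interval="After the 'basic' training session",
    target=0.75,
)
```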
32 Outcomes: identify program outcomes as immediate, intermediate, or long-term.
33 Outcomes: hints. A program may have many outcomes; pick a few important outcomes to measure. One significant outcome may be enough.
34 Outcomes: short-term. Outcome 1: participants have basic computer skills. Outcome 2: participants are “information literate.” Outcome 3: participants find high-quality health information fast.
35 Outcomes: long-term. Outcome 1: the public will seek health information assistance from public libraries. Outcome 2: the public will make health decisions based on high-quality information.
36 Indicators are observable, measurable, clearly identifiable, and unambiguous. Several may apply to each outcome.
37 Indicators. Outcome 1: participants have basic computer skills. Indicator: participants can perform 4 basic computer tasks without assistance (set screen font size, download an application, save a file to a personal folder, print a document).
38 Indicators: how does completing a few tasks indicate basic computer skills? Locating information online requires some basic skills for using a computer. If library staff can demonstrate a representative skill set without assistance, they probably have “basic skills.”
39 Logic model. Outcome 1: public library staff have basic computer skills. Indicator: participants who can complete 4 basic tasks without assistance. Applied to: all participants who finish the “basic” training session. Data source: completed assignment. Data interval: after the “basic” training session. Target (goal): 75%.
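Once the assignments are scored, checking the 75% target is a simple proportion. The sketch below is a hypothetical illustration; the participant counts are invented and are not from the IMLS materials.

```python
# Hypothetical counts for illustration only.
finished_basic_training = 40   # participants who completed the "basic" session
completed_all_4_tasks = 32     # of those, how many did all 4 tasks unassisted

share = completed_all_4_tasks / finished_basic_training   # 0.80
target = 0.75

print(f"{share:.0%} of participants met the indicator "
      f"(target {target:.0%}): {'met' if share >= target else 'not met'}")
```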
40 Reports: what should reports say? We wanted to do what; we did what; so what?
41 Reports should: summarize participant characteristics; summarize inputs, activities/services, outputs, and outcomes; respond to influencers’ need for information; compare data from the program start or a previous period; and interpret results and make recommendations.
42 Report elements. Outcomes: what did we achieve for our target audience? Outputs: how many units did we deliver, and to whom (audience characteristics)? Activities & services: what did we do? Inputs: what did we use, how much did we spend, and how much did we consume?
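One way to keep these four report elements together is a small summary structure. The sketch below is a hypothetical illustration built around the West Dakota training example from earlier slides; all of the counts and dollar figures are invented.

```python
# Hypothetical report summary; every figure is invented for illustration.
report = {
    "inputs": "2 trainers, 60 staff hours, 10 laptops, $12,000",
    "activities_services": "6 half-day health information training sessions",
    "outputs": "40 public library staff trained across West Dakota",
    "outcomes": "80% of participants completed 4 basic computer tasks unassisted",
}

for element, summary in report.items():
    print(f"{element}: {summary}")
```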
43 Who will do the work? Option: hire a consultant. Benefits: the result may be seen as unbiased; professionals have the most expertise; the process may be more efficient; it offers an outside perspective.
44 Who will do the work? Option: hire a staff evaluator. Benefits: may reduce cost; greater understanding of your program; greater commitment to the process.
45 Who will do the work? Option: train existing staff (ideal). Benefits: evaluation is integrated into routine management activities; staff know the programs and audience; skills transfer to all programs.
46 What do you get? Small investment: numbers, audience characteristics, and customer satisfaction. Low to moderate investment: immediate changes in knowledge, behaviors, and attitudes.
47 What do you get? Moderate to high investment: attribution of short-term changes in audience skills, knowledge, behaviors, and attitudes to the program. High investment: short- and long-term impact, attribution of impact to the program, and the variables influencing impact.
48 Your action plan: discuss the benefits to your library; get more information; consider how to apply OBE; develop a plan.
49 We have: talked about the uses of OBE; talked about benefits; defined terms; described the process; and discussed how results are used.
50 Starting places: NLG Project Planning: A Tutorial, IMLS, http://e-services.imls.gov/project_planning/; Perspectives on Outcome-based Evaluation for Libraries and Museums (2001), IMLS, http://www.imls.gov/pubs/pdf/pubobe.pdf; IMLS bibliography, http://www.imls.gov/grants/current/crnt_obe.htm#res
51 Starting places: Action Plan for Outcomes Assessment in Your Library, Peter Hernon and Robert E. Dugan (2002), ALA Editions, a comprehensive plan specifically for librarians that provides data collection tools for measuring learning outcomes and links outcomes to user satisfaction. W.K. Kellogg Foundation Evaluation Handbook (1998), a thorough introduction for organizations new to OBE: http://www.wkkf.org/pubs/Pub770.pdf
52 For more information: Karen Motylewski, Institute of Museum and Library Services, 1100 Pennsylvania Avenue, NW, Washington, DC 20506; 202-606-5551; kmotylewski@imls.gov