1 Orientation ALA Midwinter Meeting January 15, 2010

2 Workshop Goals • Create a common understanding and increased knowledge of outcome-based planning and evaluation so grantees may refine and finalize their Smart investing@your library evaluation and marketing plans. • Set the stage for effective program implementation.

3 Overview & Agenda • Why Measure Outcomes? • What Is Outcome-Based Evaluation? • How to Measure Outcomes • How to Collect Data • How to Use Data • Closing Thoughts and Next Steps

4 Why Measure Outcomes? • See if programs really make a difference in the lives of people. • Improve programs. • Improve planning. • Increase accountability. • Ensure best use of funds. • Compare programs consistently. • Demonstrate impact. • Funders demand it.

5 Why Measure Outcomes? "Outcome measurement has the potential to be a powerful tool to help us substantiate the claims we know to be true about the impact of libraries in our institutions and in our societies. Will it be an easy road to travel? No, but it will absolutely be worth the trip!" -- Peggy Rudd, Texas State Librarian

6 What Is Outcome-Based Evaluation? • Outcome-based evaluation is a systematic way to assess how well a program has achieved its intended results. How has my program made a difference? How have the skills of program participants improved as a result of my program?

7 What Is Outcome-Based Evaluation?

8 Get Ready • Audience is key: users, staff, program partners • Key terms: situation, inputs, activities, outputs, outcomes

9 Get Ready

10 Outputs/Outcomes • Outputs: three programs held; Web site developed; print and electronic collections increased 10%; 100 PSAs run; 50 people from the target audience attend each program. • Outcomes: attendees report they would recommend the program; increased use of the investment Web site from beginning to end of the program; increased use of the investment collection; 10% of participants report the PSA motivated their attendance.

11 Outputs/Outcomes • Good: Train 10 staff. • Better: 80% of trained staff will show improved skills as a result of training. • Best: 50% of trained staff will lead trainings and receive positive evaluations from library patrons.

12 Outputs/Outcomes • Good: Program attendance will average at least 20 people. • Better: 70% of participants will demonstrate increased financial literacy. • Best: 50% of participants will open a savings account within six months of the program end.
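
The percentage targets in these "Better"/"Best" examples reduce to simple arithmetic on counts you collect anyway. A minimal Python sketch of checking the "Best" target above; the counts (120 participants, 67 accounts) are invented for illustration:

```python
# Check an outcome target against collected counts.
# All numbers here are hypothetical examples, not workshop data.
participants = 120        # people who completed the program
opened_account = 67       # confirmed by a six-month follow-up survey

share = opened_account / participants
target = 0.50             # the "Best" target: 50% open a savings account

status = "met" if share >= target else "not met"
print(f"{share:.0%} of participants opened a savings account "
      f"(target {target:.0%}): {status}")
```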

13 Get Ready • Smart investing program elements: staff training; classes/programs/training/exhibits; Web presence; collection development and positioning in the physical/virtual library; marketing/outreach; partnerships

14 Choose the Outcomes You Want to Measure • Smart investing overall goals: community members will view the library as a reliable place for unbiased investor information; community members will make increased use of library programs and resources; community members will be more knowledgeable about key investment issues.

15 Choose the Outcomes You Want to Measure • Smart investing outcomes at the library level: increased requests for investment materials; increased visits to the library Web site; increased awareness of library resources; increased staff competencies

16 Choose the Outcomes You Want to Measure • Smart investing outcomes at the library level: users demonstrate increased skills and/or knowledge; users take action with new skills (e.g., start investing, reduce debt); program partners report a positive experience working with the library; PR/marketing activities boost program participation

17 Specify Indicators for Your Outcomes • Smart investing indicator examples: program attendance; Web hits; survey feedback; increased circulation of related materials; trained staff providing patron training; new library card enrollments; new saving or investing behavior

18 Specify Indicators for Your Outcomes • Measurable: you can test for the change or observe it. If you made a movie of success, the camera would focus on people, not on the mechanisms or processes used to create the hoped-for results. • Changes in participants: remember, we’ve defined an outcome as a change in a target audience’s skills, attitudes, knowledge, behavior, status, or life condition brought about by experiencing a program. • Defines success: does the outcome represent a benefit for the target audience? Do key stakeholders accept the outcome as valid for your program? Is it realistic and attainable? • Attributable to participating in your program: is it sensible to claim your program services influenced the outcome?

19 Marketing Example • GOAL: Develop an outreach plan to inform local citizens that the library is a resource center for financial planning. The marketing plan will identify community members in each targeted audience who will benefit most from the project. Community members will be well informed of the spectrum of activities available throughout the project.

20 Staff Example • GOAL: Reference staff will know sources of accurate and unbiased information on investing. Reference staff will participate in investment education resources training. Reference staff will identify new Web resources and add them to the library’s Web page.

21 Hands-on • Take 15 minutes to review your goals and outcomes and make two to three improvements based on our conversation. How can you focus on outcomes as well as outputs? How can you make the objectives more specific and measurable?

22 Prepare to collect data on your indicators • Pre-test and post-test: you can’t measure success without a baseline. What is the “current state of affairs” – what do people know, perceive, and do before the program – and how does the program move the audience forward? • Include all stakeholders.
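
To make the baseline idea concrete, here is a minimal Python sketch comparing invented pre-test and post-test scores for the same participants; real numbers would come from your own feedback forms or skills tests:

```python
# Invented scores on a 0-10 financial-literacy quiz, taken by the same
# seven participants before and after the program.
from statistics import mean

pre_scores = [3, 5, 4, 6, 2, 5, 4]
post_scores = [6, 7, 5, 8, 5, 7, 6]

improved = sum(post > pre for pre, post in zip(pre_scores, post_scores))
print(f"Mean score: {mean(pre_scores):.1f} before, {mean(post_scores):.1f} after; "
      f"{improved / len(pre_scores):.0%} of participants improved")
```

The pre-test answers the “current state of affairs” question; the post-test measures how far the program moved the audience.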

23 Prepare to collect data on your indicators • Data source options: feedback forms/short surveys; point-of-use inquiry by staff; focus groups; interviews; skills tests; observation; instructor assessments; library use statistics

24 Prepare to collect data on your indicators • Other considerations: When will you collect data? How often will you collect data? Will you include all participants or a sample? Who will collect data? Who will record and compile data? How will confidentiality be protected? How will participants be informed about the data collection process?
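
If you decide to survey a sample rather than every participant, a simple random draw keeps the sample unbiased, and working from anonymized IDs helps protect confidentiality. A minimal sketch; the participant count, sample size, and seed are illustrative assumptions:

```python
# Draw a simple random sample of participants for follow-up, using
# anonymized IDs rather than names to protect confidentiality.
import random

participant_ids = [f"P{n:03d}" for n in range(1, 151)]  # 150 anonymized IDs

random.seed(42)                                  # fixed seed: reproducible draw
follow_up = random.sample(participant_ids, k=30) # a 20% sample

print(sorted(follow_up)[:5], "...")
```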

25 Try out Your Measurement System • Pilot or “beta test” your surveys or questionnaires: clarity of questions; ease of use. Are you measuring what you intended to measure? Are you asking the most appropriate questions? • REVISE the instrument as needed!

26 Analyze and Report Findings • Collect and input data at regular intervals: mid-project reporting; troubleshooting; revising outcomes as appropriate • It’s as much about the journey as the destination.

27 Analyze and Report Findings • Review feedback from participants • Get very familiar with the data • Look for and note oddities in reporting • Peruse the data and identify patterns • Substantiate patterns – do data sources corroborate each other?

28 Analyze and Report Findings • Organize data logically (tables/charts) • Analyze and interpret data to develop narrative for final report • Document findings • Maintain files or database of outcomes and activities • Determine outcomes you want to continue monitoring
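
As one small example of organizing data logically, this Python sketch tallies invented feedback-form responses into a table you could drop into a final report; the categories are illustrative, not real workshop data:

```python
# Tally feedback-form responses into a simple count/share table.
from collections import Counter

responses = ["more confident", "no change", "more confident",
             "opened account", "more confident", "no change"]

counts = Counter(responses)
print(f"{'Outcome reported':<18}{'Count':>6}{'Share':>8}")
for outcome, n in counts.most_common():
    print(f"{outcome:<18}{n:>6}{n / len(responses):>8.0%}")
```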

29 Analyze and Report Findings • Hiring an outside evaluator: What to look for? What to expect? Cost?

30 Improve Your System • It is more important to discontinue something that isn’t working than to report back that you did everything you said you would do at the start of the program, even though it failed.

31 Hands-on • Take 15 minutes to review your evaluation plan and the data you gathered for today’s workshop. What baseline data do you need that you don’t yet have? How will you get it?

32 Use Your Findings • Tell your story! • Marketing • Accountability and long-term assessment • Improved services and/or programs • Resource (re)allocation

33 Marketing • Outcomes can inform key messages about the library and what it does well. • Outcomes allow stakeholders to understand, in users’ own words, the powerful role of the library.

34 Marketing • Identify your audiences: funders, targeted media outlets, partners • Match audiences and outcomes • Determine dissemination strategies: annual report, press releases, flyers and brochures

35 Closing • Questions? • One thing you’ve learned today • One thing you’re going to do as a result of what you’ve learned today • Evaluation form

36 Additional Resources • IMLS National Leadership Grants tutorial: ex1.asp • NEFE Financial Education Evaluation Toolkit: • Florida State Library workbook: ffice/OutcomeEvalWkbk.doc

37 ALA Office for Research & Statistics • Denise Davis, Director • Larra Clark, Project Manager, x8213