Best-Fit Evaluation Strategies: Are They Possible? John Carlo Bertot, John T. Snead, & Charles R. McClure Information Use Management and Policy Institute.


Best-Fit Evaluation Strategies: Are They Possible? John Carlo Bertot, John T. Snead, & Charles R. McClure, Information Use Management and Policy Institute, College of Information, Florida State University

Introduction
- Received a grant from the Institute of Museum and Library Services (IMLS) to develop an online instructional system to assist public libraries in evaluating their services and resources: the Evaluation Decision Management System (EDMS)
- In year two of a three-year project
Why this project?
- Little research provided comprehensive assistance in determining which specific evaluation strategies serve libraries best relative to specific library situational factors and contexts, data needs, and other considerations
- With so many evaluation options available, there was a need to bridge information-need issues (i.e., situational factors, data needs, stakeholder questions, etc.) with evaluation approaches
- Understanding information needs and linking these needs to evaluation approaches required evaluation strategies capable of providing data that library decision makers can use to address specific problems
- Bridge practice and research

Best-Fit Evaluation Approaches
The original intent of the project was to develop best-fit evaluation strategies that:
- Match the data needs of a library within a specific situational context to the evaluation approaches most appropriate to that context
- Identify which evaluation strategies supply the best data within specific library organizational and situational contexts, for use in improving library services with the greatest impact, or in enabling libraries to better advocate the value of libraries to their institutions or the communities they serve

Best-Fit Evaluation Strategy Considerations
- What evaluation approaches are available?
- Which evaluation approaches might best meet a library's data needs, whether library-developed or imposed by external funders/organizations/etc.?
- How to develop an overall evaluation plan that makes effective and efficient use of limited library resources
- How to implement an evaluation strategy
- How to use evaluation findings to advocate for local library support

The Original Approach
- Academic literature:
  - Outcomes assessment
  - Value demonstration
  - ROI
  - Outputs
  - Service quality
  - Other
- Stand-alone literatures and approaches
- Bring these to the public library community

Different Perspectives
Stakeholder Perspective
- Answer a range of questions asked by various stakeholder groups (e.g., library boards, county or city executives) regarding library services and resources
- Make informed decisions regarding a library's range or availability of services and resources
Data Perspective: provide data to
- Answer a range of questions asked by various stakeholder groups (user-centered evaluation perspective)
- Make informed decisions regarding a library's range or availability of services and resources (library-centered evaluation perspective)
- Demonstrate the value and effectiveness of the library to the community that it serves (community-centered evaluation perspective)
- Frame the perceptions of the library in the local political environment (political context-centered evaluation perspective)
- Support the notion of the library as a public good (customer-centered evaluation perspective)

Then the Reality of Public Libraries
- Advocacy: tell the story, value, impact
- Cuts across individual evaluation approaches
- Integrated
- Fits the library's situation

Development of Best-Fit Evaluation Strategies
- Library decision makers often have difficulty matching their data needs with appropriate evaluation approaches
- There are many different kinds of evaluation data that a library may need, and many evaluation approaches that a library might employ
- As a result, many libraries struggle to choose the evaluation approaches that most effectively and efficiently demonstrate the value they provide
- And... they don't necessarily think of any of this as "evaluation"

Some Issues and Findings
- Have not quite abandoned the "logic model" approach to evaluation and "best fit," but have:
  - Modified thinking significantly
  - Changed how we make evaluation tools available: less "evaluation-like"
  - Focused more on advocacy and storytelling, with evaluation on the side (though it underlies the entire approach)

Some Issues and Findings
- In essence, developing a community of practice:
  - Resources
  - Access to experts
  - An information commons for asking questions and sharing ideas, resources, and other material (successes and failures, how-tos, other)

Some Issues and Findings
- Public libraries don't necessarily consider evaluation as part of their advocacy strategy (if they have one); the link between evaluation, data, and advocacy is not always apparent
- Once burned, twice shy: libraries that have used evaluation strategies and data collection efforts without success are reluctant to try again, seeing it as a waste of resources
- Library situational factors (organizational, community, other) affect the successful or unsuccessful use of leading evaluation approaches
- Though libraries now talk about advocacy regularly, the extent to which they are able and/or ready to engage in advocacy strategies is in question, regardless of any linkage between evaluation and advocacy
- The fear is that we may have to hide evaluation in an advocacy wrapper; these are two different things and require different skill sets
- The ability of a library director to negotiate the political environment of the library may trump evaluation

Some Issues and Findings
- Need to provide as many tools as possible to facilitate evaluation
- If-then interfaces that enable the EDMS to know something about libraries:
  - NCES data (operating budgets, population served, FTEs, other)
  - Public Library Internet data (connectivity, bandwidth, workstations, services, other)
- Prepackaged reports that incorporate data:
  - Peer comparisons
  - Brochures
  - Presentations (budget, building, other)
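To make the "if-then interface" idea concrete, here is a minimal sketch of how situational data about a library might be mapped to suggested evaluation approaches. The profile field names, thresholds, and rules are all invented for illustration; they are not the actual EDMS logic or real NCES data elements.

```python
def suggest_approaches(profile):
    """Return evaluation approaches suited to a library's situational profile.

    `profile` is a dict of hypothetical situational fields, loosely modeled
    on the kinds of data named in the slide (budget, workstations, etc.).
    """
    suggestions = []

    # Smaller operating budgets favor low-cost output counts over
    # resource-intensive outcome studies (illustrative threshold).
    if profile.get("operating_budget", 0) < 250_000:
        suggestions.append("output measures (circulation, visits, sessions)")
    else:
        suggestions.append("outcomes assessment")

    # Connectivity/workstation data could trigger technology-focused measures.
    if profile.get("public_workstations", 0) > 10:
        suggestions.append("network/technology service measures")

    # A library facing a budget vote may prioritize value demonstration.
    if profile.get("facing_budget_vote", False):
        suggestions.append("ROI / value demonstration for advocacy")

    return suggestions


example = {"operating_budget": 180_000, "public_workstations": 12,
           "facing_budget_vote": True}
print(suggest_approaches(example))
```

A production system would presumably pull these fields from the NCES and Public Library Internet datasets rather than ask the library to enter them, which is exactly what "enable the EDMS to know something about libraries" implies.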

Moving Forward
- Challenges remain; the biggest is making the link between evaluation and advocacy
- "Painless" evaluation that fits the library's situational factors
- Redefining the success of the project

Conclusions
- Some libraries do continue to engage in a wide range of evaluation efforts to assess the services and resources provided to the communities they serve
- The evaluation environment is increasingly complex, requiring knowledge of multiple evaluation frameworks, methodologies, data analysis techniques, and communication skills
- The issue is not that libraries face a lack of available evaluation approaches
- The issue is helping libraries select, from the many techniques that exist, the approach or approaches that best meet a library's advocacy needs
- Making the evaluation-advocacy link

Thank You
Questions/Comments?
Contact information
John Bertot:
Information Institute: