Evaluation of Information Literacy Education


Evaluation of Information Literacy Education. Mgr. Gabriela Šimková (Faculty of Arts, Masaryk University), Mgr. Jiří Kratochvíl, Ph.D. (MU Campus Library)

Measuring effectiveness: main reasons. "E-learning can be a powerful tool – it is scalable and less expensive than traditional training (...). But after spending a lot of money on infrastructure and content, how do you know if your e-learning content or program is really effective?" Learning activities focus on students' needs: knowledge, skills and attitudes. An evidence-based learning approach (continual research as a precondition for more effective achievement of the educational goals defined within information literacy (IL) education) is a key project activity.

Measuring as a continuing process: educational needs analysis, measuring methodology design, measuring before the activity, the learning activity itself, and measuring after the activity.

Kirkpatrick’s Four-Level Model. Our aim: to strengthen students’ satisfaction and learning results. The first level evaluates students’ immediate reactions to an educational activity (environment, content and the lecturer) using short paper questionnaires (smile sheets). The second level explores the change in knowledge and skills with a pre-test and a post-test. The third level tries to identify the long-term change in participants’ behaviour using qualitative methodology, specifically a focus group series and 360-degree feedback. The fourth level focuses on the return on investment in education.

About the model: introduced in 1959 as a reaction to the increasing pressure to prove the effectiveness, value and benefit of education for business, it is one of the most widespread models for education evaluation. It reflects the current constructivist conception of instruction well, and comprises four hierarchically ordered levels revealing, one by one, the levels of effectiveness of the educational process.
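The four levels can be sketched as a small lookup table. This is a hypothetical summary in Python; the level names follow Kirkpatrick's model and the instrument descriptions paraphrase how this project applies each level:

```python
# A minimal sketch of Kirkpatrick's four levels as applied in this project.
# Instrument descriptions paraphrase the slides; names follow the model.
KIRKPATRICK_LEVELS = {
    1: ("Reaction",  "short smile-sheet questionnaires after the lesson"),
    2: ("Learning",  "pre-test and post-test of knowledge and skills"),
    3: ("Behaviour", "focus groups and 360-degree feedback months later"),
    4: ("Results",   "return on investment in education"),
}

for level, (name, instrument) in sorted(KIRKPATRICK_LEVELS.items()):
    print(f"Level {level} ({name}): {instrument}")
```

The hierarchy matters: each level builds on the one below, so a course is usually evaluated bottom-up, starting with reactions.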

Level 1: Immediate Reaction to Education. Key question: To what extent were the participants satisfied with the educational activity? This level evaluates students’ immediate reactions to an educational activity (a seminar, a workshop, an e-learning module, etc.), i.e. how participants feel about the various aspects of a training program. The instrument needs a clear research goal, understandable questions and quantifiable answers, and should ensure participants’ anonymity and the possibility of adding a comment. We assessed students’ satisfaction with the study environment, the study content and the lecturer.

About smile sheets: a type of tool capturing students' immediate response to the educational activity. Our target group comprised mainly students aged between 20 and 25. The design of the questionnaire was based on an adjusted five-point Likert scale: instead of an evaluation ranging from "extremely satisfied" to "not at all satisfied", it consisted of five smileys indicating the level of satisfaction with a particular aspect. The three main aspects evaluated by this questionnaire are the content and organization of the seminar, the instructor, and the overall assessment of the lesson.
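Summarising such smile-sheet data amounts to a simple aggregation per aspect. A minimal sketch, assuming hypothetical responses scored 1 (least satisfied smiley) to 5 (most satisfied) over the three evaluated aspects:

```python
from statistics import mean

# Hypothetical anonymous smile-sheet responses, 1 = least satisfied, 5 = most.
responses = [
    {"content": 4, "instructor": 5, "overall": 4},
    {"content": 5, "instructor": 5, "overall": 5},
    {"content": 3, "instructor": 4, "overall": 4},
]

def summarise(responses):
    """Mean satisfaction per evaluated aspect across all sheets."""
    aspects = responses[0].keys()
    return {a: round(mean(r[a] for r in responses), 2) for a in aspects}

print(summarise(responses))
```

Keeping responses anonymous but structured this way also makes it easy to attach the free-text comments the slide mentions as an extra optional field.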

Measuring participants’ satisfaction at the MU Campus Library

Level 2: Gained Knowledge. Key question: To what extent did the participants obtain the expected knowledge and skills as a result of attending the educational activity? This level explores the change in one or more areas of participants’ knowledge, skills or attitudes due to the educational activity; the change is expressed by the quantity of knowledge transferred during a lesson. Quantitative methods with statistical evaluation are used: knowledge is measured both prior to and after the lesson for comparison (a pre-test and a post-test).
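Because the same students sit both tests, the pre-/post-test comparison can be evaluated with a paired t-test on the score gains. A minimal stdlib-only sketch with illustrative scores (not the real course data):

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t statistic for pre/post test scores of the same students."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_gain = statistics.mean(diffs)
    sd = statistics.stdev(diffs)           # sample std. deviation of the gains
    t = mean_gain / (sd / math.sqrt(n))    # t with n - 1 degrees of freedom
    return mean_gain, t

# Illustrative test scores (points), not the autumn 2013 course data
pre  = [55, 60, 48, 72, 66, 50, 58, 63]
post = [70, 68, 59, 80, 75, 61, 66, 74]
mean_gain, t = paired_t(pre, post)
```

With real data one would compare t against the critical value for n - 1 degrees of freedom (or use a library routine that also reports the p-value) to decide whether the knowledge gain is statistically significant.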

Measuring gained knowledge: e-learning Course of Information Literacy, pre- and post-test, autumn 2013 (n=1184)

Level 3: Long-Term Effects. Key question: To what extent do the participants apply the knowledge and skills obtained to their everyday work? The aim is to identify the long-term change in participants’ behaviour with the benefit of hindsight (e.g. three to six months after the lesson). Methods used for third-level measurements usually have a qualitative character: an interview, or finding out the views of students or teachers who work with the person who attended the course. A focus group is a small number of people (usually between 4 and 15, typically 8) brought together with a moderator to focus on a specific topic (satisfaction with study materials, communication aspects, etc.). Focus groups aim at a discussion instead of individual responses to formal questions, and produce qualitative data (preferences and beliefs) that may or may not be representative of the general population.

Level 4: Results. Key question: To what extent have the planned objectives of a development project and subsequent support activities been achieved? This level shows the tangible results of a programme and is accepted mainly in the commercial sphere because it is focused on the return on investment in education.

References
1. Eldredge, J.: Evidence-Based Librarianship: searching for the needed EBL evidence. Medical Reference Services Quarterly. 3, 1-18 (2000)
2. Davies, P.: What is evidence-based education? British Journal of Educational Studies. 2, 108-121 (1999)
3. Smith, A.: Scientifically Based Research and Evidence-Based Education: A Federal Policy Context. Research & Practice for Persons with Severe Disabilities. 3, 126-132 (2003)
4. Kirkpatrick, D.: The Four Levels of Evaluation: Measurement and Evaluation. American Society for Training & Development Press, Alexandria (2007)
5. Kirkpatrick, D.: Seven Keys to Unlock the Four Levels of Evaluation. Performance Improvement. 45, 5-8 (2006)
6. Explorable: Pretest-Posttest Designs, https://explorable.com/pretest-posttest-designs
7. Kirkpatrick, J.: The Hidden Power of Kirkpatrick's Four Levels. T+D. 61(8), 34-37 (2007)
8. Naugle, K.A., Naugle, L.B., Naugle, R.J.: Kirkpatrick’s Evaluation Model as a Means of Evaluating Teacher Performance. Education. 1, 135-144 (2000)