How do we know it works? Evaluating Learning Technology Projects EDUCAUSE Learning Initiative Seminar Clare van den Blink


Seminar Goals
Seminar participants will be able to:
 Select project goals that can be evaluated.
 Identify relevant indicators to evaluate project goals.
 Identify data collection methods.
 Assess considerations for developing evaluation activities.
 Create an evaluation plan for an academic technology project that is tied to their project assumptions and strategies.

Please indicate the type of projects you’re interested in evaluating.

What are the key challenges in fully evaluating your projects?

A Framework
1. Project Goals
2. Focus of Evaluation
3. Evaluation Design
    Overview
    Indicators: measures of “success”
    Data collection: methods, population, procedures
4. Timeline
5. Data Analysis
6. Reporting Findings
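One way to make the framework reusable across projects is to capture each element as a field in a simple template object. The sketch below is illustrative only, not part of the seminar; every field name and sample value is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    """One record per project, mirroring the framework elements above."""
    project_goals: list            # all stated goals
    evaluation_focus: list         # the subset of goals actually evaluated
    indicators: list               # measures of "success" for the focus goals
    methods: list                  # e.g., survey, interview, observation
    population: str                # who the data will be collected from
    procedures: str                # how and when data will be collected
    timeline: dict = field(default_factory=dict)  # task -> target date
    analysis: str = ""             # planned quantitative/qualitative analysis
    reporting: str = ""            # audience and format for the findings

# Example: a plan skeleton for a classroom polling pilot (values invented).
plan = EvaluationPlan(
    project_goals=["Use polling to encourage engagement", "Adopt Tablet PC annotation"],
    evaluation_focus=["Use polling to encourage engagement"],
    indicators=["Student-reported engagement", "In-class participation counts"],
    methods=["survey", "interviews"],
    population="Students enrolled in the pilot course",
    procedures="Survey at midterm; interviews at the end of the semester",
)
```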

Introduction
How this process was developed: the charge was “to evaluate the effectiveness of the technology enhancement and its impact on student learning”… within the constraints of limited staff, time, and budget.

Importance–Complexity
[Chart: staff effort rises as the complexity of the evaluation moves from low to high.]
 Small project with a technology intervention (low complexity): interview, survey
 LMS pilot project informing service decisions (moderate complexity): interviews, surveys, tech review, usability testing
 Quasi-experimental research on a technology intervention (high complexity): interviews, surveys, observations, control group

To Inform the Evaluation
 Is the project based on prior research? What led to the development of the project?
 Literature review: what will inform the evaluation?
 What assumptions were made about the strategies and technologies selected?

Selecting Goals
Not all project goals can be evaluated within the timeframe of the project, and some may be difficult to measure.
 How can you SELECT goals that can be evaluated within the scope of the project?
 How would you PRIORITIZE the goals that are the most critical to evaluate?

Sample Goals: Example #1
Review these example goals and identify which could be evaluated within the project constraints.

Instructional Goals:
1. Encourage active participation and critical thinking by using video clips from mainstream movies to initiate class discussions.
2. Encourage student involvement and active learning by creating a mechanism for students to record interviews in the field (part of a class assignment).
3. Create a repository of student-collected audio interviews for ongoing use in the curriculum. Audio clips will be used to illustrate the diversity of public education experiences.

Other Project Goals:
4. Develop a workflow and documentation for student recording of audio interviews and video clip processing.
5. Choose, create, and provide an archiving mechanism for cataloguing clips.

Sample Goals: Example #2
Review these example goals and identify which can be evaluated within the project constraints.

Instructional Goals:
A. Students will be able to practice the application of fluid therapy, under various conditions, using a unique computer-based simulation.
B. Students will be able to interpret symptoms presented in a sick dog, select an appropriate treatment, administer fluids, monitor the patient's reaction, and modify the treatment plan accordingly.
C. Case simulation will enable students to experience clinical variability in a manner similar to hands-on practice.

Other Project Goals:
D. Simplify the creation of a set of teaching models, or prototypes, that are the basis of the cases.
E. Create a method for generating unique computer-based cases that build from the prototypes.
F. Provide a method for saving case data for comparison.

Developing the Evaluation Plan

Evaluation Process
1. Identify project goals.
2. Focus of the evaluation: what goals will be evaluated?
3. Indicators: what type of data will be collected?
4. Methods: how will data be collected? (surveys, observations, interviews, focus groups, case studies)
5. Population: from whom will data be collected?
6. Data analysis.

Process: Select Goals and Focus
 What goals will be the FOCUS of the evaluation? What is feasible to evaluate in the project's timeframe?
 Identify what INDICATORS can be used to collect data, both indirect and direct measures.

Project Goals:
1. Use a Personal Response System (polling) with questions to encourage critical thinking and student engagement.
2. Use PowerPoint presentations with an interactive lecture.
3. Implement use of a Tablet PC for annotating presentations for visual and interactive lectures.

Focus of the Evaluation:
Use a Personal Response System (polling) with questions to encourage critical thinking and active participation (student engagement).

Evaluation Focus: Example 1
Formative: Since the project is assisting with the development of online modules, a formative evaluation of the modules will examine the interface design, navigation, usability, organization and presentation of content, and usefulness for student learning. The key focus is the “functionality” of the module.
 Interface / navigation / design (not important in Phase I)
 Technology performance: test across browsers, operating systems, at a distance, etc.
 Organization and presentation of content
 Use of images and illustrations
 Learning objectives

Summative (part of the overall program evaluation): The summative evaluation will examine the effect of implementing instructional technology in the course on students' perception of learning. Measures of success may include student perception of greater ease in learning difficult concepts and positive feedback about the new modules.

Select Methods
Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc.
What method(s) would you select to evaluate the focus area, the “functionality” of the online module?
1. Surveys
2. Interviews
3. Observation
4. Log analysis
5. All of the above
6. Other (post in chat)

Select Methods (continued)
What method(s) would you select to evaluate how well the online module met the “learning objectives”?
1. Surveys
2. Interviews
3. Observation
4. Log analysis
5. All of the above
6. Other (post in chat)

Other Considerations
 Identify the population from which the data will be collected. Can you contact this group?
 Identify other data sources, such as logs, documents, etc.
 Are there considerations for human subjects research and informed consent on your campus? For example, if grades are used as data, what permissions are necessary?

Timelines: How much time do you have? How much do you need?
 Fall (project development): project evaluation planning
 Aug–Dec (fall semester, project implementation): implement the project evaluation
 Dec–Jan (project transition and closeout): evaluation data analysis and reports

Evaluation Timeline
Develop an initial timeline and staffing effort. A sample plan:
 Develop evaluation plan: September 1
 Create an observation protocol: September 15
 Observe EDU 271 during two class sessions: September–October
 Complete the student survey instrument: October 1, 2005
 Create an interview protocol: October 15
 Administer student survey: mid-November
 Conduct student interviews: end of November
 Conduct data analysis: December–January
 Complete evaluation report: February 1

Where reality meets ideal evaluation methods…
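When dates slip, one milestone can quietly drift past the next. A minimal sketch of holding the sample milestones above as data and flagging ordering conflicts; the year is assumed from the October 1, 2005 date, and everything else is illustrative.

```python
from datetime import date

# Milestones from the sample timeline above (year assumed for illustration).
milestones = [
    ("Develop evaluation plan",                date(2005, 9, 1)),
    ("Create an observation protocol",         date(2005, 9, 15)),
    ("Complete the student survey instrument", date(2005, 10, 1)),
    ("Create an interview protocol",           date(2005, 10, 15)),
    ("Administer student survey",              date(2005, 11, 15)),
    ("Conduct student interviews",             date(2005, 11, 30)),
    ("Complete evaluation report",             date(2006, 2, 1)),
]

# Flag any pair of adjacent tasks that ends up out of order after a date slips.
for (task_a, due_a), (task_b, due_b) in zip(milestones, milestones[1:]):
    if due_a > due_b:
        print(f"Conflict: '{task_a}' is now due after '{task_b}'")
```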

Implementing the Plan & Reporting Results

Implementing Methods
Surveys:
 Identify or develop questions.
 Do the survey questions map to the indicators?
 Survey distribution and associated permissions.
Interviews:
 Develop interview questions and protocols.
 Schedule and conduct interviews.
Resources about quantitative and qualitative methods can guide the development and implementation of methods and data analysis.
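The question-to-indicator check can be scripted once the mapping is written down as data. A hypothetical sketch; the indicators and question wording below are invented examples, not instruments from the seminar.

```python
# Map each indicator to the survey questions intended to measure it (all text invented).
question_map = {
    "student engagement": [
        "I participated more actively because of the in-class polling.",
        "The polling questions made me think critically about the material.",
    ],
    "ease of learning difficult concepts": [
        "The online module made difficult concepts easier to understand.",
    ],
    "module functionality": [],  # indicator defined, but no question written yet
}

# Flag indicators that no survey question currently measures.
uncovered = [indicator for indicator, questions in question_map.items() if not questions]
if uncovered:
    print("Indicators with no mapped survey questions:", uncovered)  # ['module functionality']
```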

Analysis & Reporting
DATA ANALYSIS: What type of analysis will be completed?

Quantitative: survey analysis. Sample item: “Overall, I am satisfied with the use of instructional technology in this course.”
 1 Strongly Agree: 82%
 2 Agree: 18%
 3 Neutral: 0
 4 Disagree: 0
 5 Strongly Disagree: 0

Qualitative: interview analysis based on the interview protocols. For example: “I interviewed Prof. X about her experience with the simulation…”
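The mean of a Likert item follows directly from the response distribution. A minimal worked sketch using the sample distribution above:

```python
# Response shares for the sample item, keyed by scale value
# (1 = Strongly Agree ... 5 = Strongly Disagree).
distribution = {1: 0.82, 2: 0.18, 3: 0.0, 4: 0.0, 5: 0.0}

# Weighted mean: sum of (scale value * share of respondents).
mean = sum(value * share for value, share in distribution.items())
print(f"Mean = {mean:.2f}")  # 1.18 on this sample, i.e., between Strongly Agree and Agree
```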

Constraints

Staffing
How can all of this planning be completed within limited staff hours, while maintaining the INTEGRITY of the evaluation process?

Do you think it is feasible to re-train staff for evaluation?

 How can staff be trained in this process without having a deep evaluation background?
 What existing staff skills might be adapted?
 What other campus resources are available?

What type of existing skills could be adapted for evaluation?

Supporting Tools
 Have an overall evaluation plan template that can be adapted to other projects.
 Informed consent templates.
 Use common interview and observation protocols.
 Develop question banks for survey questions, for example on:
   - Use of video in course presentations and lectures
   - Use of online instructional tutorials
   - Use of presentations in lecture
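A question bank can be as simple as a keyed collection that each project's survey draws from. A hypothetical sketch; the topics follow the slide above, but the question text and function name are invented.

```python
# Survey question bank, keyed by topic (question wording is illustrative only).
question_bank = {
    "video in course presentations and lectures": [
        "The video clips helped me connect lecture concepts to real examples.",
    ],
    "online instructional tutorials": [
        "The online tutorial was easy to navigate.",
        "The online tutorial helped me learn the material.",
    ],
    "presentations in lecture": [
        "The annotated presentations made lectures easier to follow.",
    ],
}

def build_survey(topics):
    """Assemble a draft survey by pulling the banked items for the selected topics."""
    return [q for topic in topics for q in question_bank.get(topic, [])]

draft = build_survey(["online instructional tutorials", "presentations in lecture"])
```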

[Screenshots: examples of survey question banks]

Summary

Consider…
 How can this METHODOLOGY be applied to your projects and institution?
 How VIABLE is this as an evaluation methodology for your projects?
 When does a more ROBUST process need to be put in place?

A Framework
1. Project Goals
2. Focus of Evaluation
3. Evaluation Design
    Overview
    Indicators: measures of “success”
    Data collection: methods, population, procedures
4. Timeline: what, when, how
5. Data Analysis
6. Reporting Findings

Questions?