
E-learning Lecture 10: EVALUATING THE IMPACTS OF E-LEARNING
Week 12, Semester 4, 2009
Dr. Anwar Mousa, University of Palestine, Faculty of Information Technology

EVALUATING THE IMPACTS OF E-LEARNING
Goals: Explore issues related to the evaluation of e-learning and teaching activities. Propose a comprehensive plan for the evaluation of e-learning and teaching.

Goals of evaluation
A major goal of any evaluation activity is to influence decision making. For an organization to attain its mission, a comprehensive evaluation strategy for ascertaining the impacts of its various teaching, learning and research activities is crucial. This strategy needs to be systematic in gathering different types of data and feedback from a range of sources, using a variety of instruments.

Goals of evaluation
Gathering this kind of data and feedback is also crucial to ensuring a high quality of service and the effective use of information and communications technology in teaching and learning. The term “evaluation” is used here to refer to the systematic acquisition of feedback on the use, worth and impact of an activity, program or process in relation to its intended outcomes (see Naidu, 2005).

Goals of evaluation
The most basic distinction among types of educational evaluation is between formative, summative, and monitoring or integrative evaluation (see also Kirkpatrick, 1994; Naidu, 2002, 2005; Reeves, 1997, 1999).

Evaluation methodology
You should aim to gather data from all stakeholders (i.e., students and staff) regularly, using a set of evaluation instruments within a consistent evaluation framework that includes front-end analysis, formative, summative and integrative evaluation. You should also aim to collect a variety of data using a range of data-gathering instruments, while keeping the data-gathering process as simple and unintrusive as possible. A sketch of such a framework appears below.
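To make the framework concrete, here is a minimal sketch in Python of how an evaluation plan could be represented as data. The four phase names come from this lecture, but the EvaluationPhase and EvaluationPlan classes and all field names are hypothetical illustrations, not part of any standard evaluation tool.

from dataclasses import dataclass, field

@dataclass
class EvaluationPhase:
    # One phase of the framework, e.g. front-end analysis.
    name: str
    purpose: str
    instruments: list[str]   # e.g. surveys, focus-group interviews
    stakeholders: list[str]  # e.g. students, staff

@dataclass
class EvaluationPlan:
    phases: list[EvaluationPhase] = field(default_factory=list)

plan = EvaluationPlan(phases=[
    EvaluationPhase("front-end analysis",
                    "ascertain readiness and preferences before roll-out",
                    ["readiness survey"], ["students", "staff"]),
    EvaluationPhase("formative",
                    "identify problems during implementation",
                    ["survey", "focus-group interview"], ["students", "staff"]),
    EvaluationPhase("summative",
                    "ascertain full impacts and outcomes",
                    ["survey", "outcome data"], ["students", "staff"]),
    EvaluationPhase("integrative",
                    "monitor integration into regular teaching",
                    ["usage statistics"], ["staff"]),
])

for phase in plan.phases:
    print(phase.name, "->", phase.purpose)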

Evaluation methodology
Front-end analysis comprises the ways in which you plan to ascertain the readiness of students and staff, and their preferences, in relation to teaching and learning online. Carrying out such surveys periodically, and especially prior to the full roll-out of e-learning, will enable your organization to better understand how to align its services with the needs of prospective users. The information gathered will inform the organization about user needs, perceptions and expectations, and any gaps in the provision of existing support.

Evaluation methodology
Formative evaluation involves gathering feedback from users and other relevant groups during the implementation process. Its purpose is to identify problems so that improvements and adjustments can be made during the implementation stages of e-learning in your organization. You may wish to carry out formative evaluations routinely and regularly, ideally using a consistent set of tools comprising surveys and focus-group interviews with users. A small sketch of how such survey data might be summarized follows.
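As an illustration of routine formative data analysis, the following Python sketch summarizes Likert-scale survey responses (rated 1 to 5) per question and flags items that fall below a threshold. The question texts, the 3.5 threshold and the summarize_responses function are invented for the example; they are not prescribed by the lecture.

from statistics import mean

def summarize_responses(responses, threshold=3.5):
    # responses: dict mapping question text -> list of 1-5 ratings.
    # Returns (question, mean score, needs_attention) tuples.
    report = []
    for question, ratings in responses.items():
        score = mean(ratings)
        report.append((question, round(score, 2), score < threshold))
    return report

survey = {
    "The online materials are easy to navigate": [4, 5, 4, 3, 4],
    "I receive timely feedback from tutors":     [2, 3, 3, 2, 4],
}

for question, score, needs_attention in summarize_responses(survey):
    marker = "NEEDS ATTENTION" if needs_attention else "ok"
    print(f"{score:5}  {marker:15}  {question}")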

Evaluation methodology
Summative evaluation enables you to ascertain the full impacts and outcomes of e-learning on teaching and learning at your organization. You would usually carry it out upon the completion of an e-learning program, even though there is unlikely to be a sharp dividing line between the formative and summative evaluation phases. As part of this process, your aim is to periodically assess the cumulative impact of e-learning on teaching and learning activities in your organization.

Evaluation methodology
Data gathered should reveal how e-learning is responding to the challenges facing teaching and learning in your organization, and the extent to which you are achieving the benchmarks and milestones you have set (a simple benchmark check is sketched after this slide). Monitoring or integrative evaluation comprises attempts to ascertain the extent to which e-learning or online learning is integrated into regular teaching and learning activities at your organization. Data gathered as part of this process will reveal the extent to which, and how, teaching and learning activities in the organization have been affected by e-learning.
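As a simple illustration of checking summative results against pre-set benchmarks, the sketch below compares measured indicators with target values. The indicator names and all target and outcome figures are invented for the example.

# Hypothetical benchmarks set before the e-learning roll-out.
benchmarks = {
    "course completion rate": 0.80,  # target: 80% completion
    "student satisfaction":   4.0,   # target: mean rating of 4/5
    "staff adoption rate":    0.60,  # target: 60% of staff teaching online
}

# Hypothetical measured outcomes at the end of the program.
outcomes = {
    "course completion rate": 0.84,
    "student satisfaction":   3.7,
    "staff adoption rate":    0.65,
}

for indicator, target in benchmarks.items():
    achieved = outcomes[indicator]
    status = "met" if achieved >= target else "NOT met"
    print(f"{indicator}: target {target}, achieved {achieved} -> {status}")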


Points to Remember
Evaluation refers to the systematic acquisition of feedback on the use, worth and impact of an activity, program or process in relation to its intended outcomes. Any evaluation strategy should include front-end analysis, formative, summative and integrative evaluation activities. Evaluation of e-learning is no different, and should aim to gather data from all stakeholders.