Course evaluation for practice and educational research. Professor Elena Frolova, Department of Family Medicine, MAPS, St. Petersburg, Russia. Educational Research Task Force group, Tallinn, 6 May 2011.

Background A new e-learning course developed as part of the RESPECT project of the Department of Family Medicine and the Katholieke Universiteit Leuven (leaders: Prof. Degryse, Prof. Kuznetsova). Aim: to teach spirometry, then to conduct research on the prevalence of COPD.

Why did we decide to evaluate?

Why evaluate? Evaluation is central to ensuring the quality of academic practice. E-learning involves high levels of investment that need to be justified. Evaluation demonstrates the effectiveness and cost-effectiveness of learning technologies.

Who are the stakeholders? We as teachers, to extract lessons learned and improve our practice. Our students and future cohorts. Other lecturers and departments. Senior managers, to disseminate lessons learned and promote good practice by others. The QAA. The funding body and finance officer. (www2.warwick.ac.uk/services/ldc)

Why did we decide to evaluate? A new form of learning: e-learning. A new subject of learning: spirometry in primary care. Research in education. Too many challenges! And money, money.

When to evaluate? Diagnostic: learning from the potential users, to inform plans to change practice. Formative: learning from the process, to inform practice. Summative: learning from the product, to inform others about practice.

When did we decide to evaluate? "When the cook tastes the soup, it is formative evaluation; when the dinner guest tastes the soup, it is summative evaluation" (Jen Harvey, "Evaluation cookbook"). We used all three: diagnostic, formative and summative.

What do we evaluate? E-pedagogy? E-learning facilitates new forms of resources, communication and collaboration, and new patterns of study and group behaviors. E-learning may demand new attitudes and skills.

The objects of evaluation Do not compare the "e" group with a previous "control" group: it is difficult to separate the e-learning intervention from the complex interplay of cultural and social influences. Evaluate not only the pedagogical aims and teaching process, but also the technology itself and the support surrounding it.

“If you don't have a question you don't know what to do (to observe or measure), if you do then that tells you how to design the study” (Draper, 1996).

Ask yourself: for the top management of my company, university, government, or institution, what is the single most important measure of success?

Focus on? On e-learning materials? On 'content'? On issues concerning screen design? Navigation? This focus is probably quite superficial in terms of pedagogical impact.

Focus on? The ways in which students interact with electronic content. How the e-resources are introduced into the learning design. The ways in which the technology supports interactions between learners. The form of communication between students.

Comparing traditional and e-learning methods is quite difficult. Students like it because it's new; students hate it because it's unfamiliar. Is it possible to isolate the effect of the new medium? Is any change in scores the result of having a different cohort of students?

Evaluating technology-supported higher-order learning A key evaluation question is whether the approach resulted in students achieving the intended learning outcomes.

Evaluating cost-effectiveness Determine the costs associated with using e-learning. The outcomes may include pedagogical, social and economic benefits that cannot always be converted into market or monetary forms. The benefits of using e-learning are difficult to quantify, but may be of high social value.

Question structures Level 1: Does it work? Do students like it? How effective is it?

Question structures Level 2: How cost-effective is the approach? How scalable is the approach?

Question structures Is there sufficient need for the innovation to make it worth developing? What are the best ways of using e-learning resource X? Will this approach be an effective way to resolve problem X? How does this approach need to be modified to suit its purpose? Do you think it would be better if...(an alternative mode of engagement) was carried out? What advice would you give another student who was about to get involved in a similar activity? (Bristol LTSS guidelines)

Question structures How do students actually use the learning system/materials? How usable is the e-learning tool or material? What needs are being met by them? What types of users find it most useful? Under what conditions is it used most effectively? Which features of the support materials are most useful to students? What changes result in teaching practices and learning/study practices? What effects does it have on tutor planning, delivery and assessment activities? What are the changes in time spent on learning? Tavistock (1998) evaluation guide

How evaluation should be planned A preliminary analysis of aims, questions, tasks, stakeholders, timescales and instruments/methods. Time and budget. Sampling and randomisation. The questions should change as the aim of the evaluation changes.
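The sampling and randomisation step above can be sketched in a few lines. This is a hypothetical illustration, not part of the original course design: the student IDs, group sizes and the `randomise` helper are invented, and a fixed seed is used only to make the allocation reproducible.

```python
import random

def randomise(students, seed=None):
    """Shuffle a cohort and split it into control and intervention halves."""
    rng = random.Random(seed)  # a fixed seed makes the allocation reproducible
    pool = list(students)
    rng.shuffle(pool)
    half = len(pool) // 2
    return {"control": pool[:half], "intervention": pool[half:]}

# 20 invented student IDs, allocated 10/10
groups = randomise(["S%02d" % i for i in range(1, 21)], seed=2011)
assert len(groups["control"]) == 10 and len(groups["intervention"]) == 10
assert not set(groups["control"]) & set(groups["intervention"])  # groups are disjoint
```

Recording the seed alongside the allocation keeps the sampling auditable when the evaluation is written up.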

Methods of gathering information A pre-task questionnaire Confidence logs after each kind of activity A learning test (quiz) Access to subsequent exam (assessment) performance on one or more relevant questions Post-task questionnaire Interviews of a sample of students (also: focus group) Observation and/or videotaping of one or more individuals (also: peer observation/co-tutoring)
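The confidence logs listed above could be stored as simple records, one per student per activity, and averaged per activity. A minimal sketch; the activity names, ratings and the `mean_confidence` helper are all invented for illustration:

```python
from collections import defaultdict

# One entry per (student, activity) pair; 1-5 self-reported confidence ratings (invented).
logs = [
    ("S01", "spirometry video", 4),
    ("S01", "quiz", 3),
    ("S02", "spirometry video", 5),
    ("S02", "quiz", 2),
]

def mean_confidence(entries):
    """Average the self-reported confidence ratings for each activity."""
    by_activity = defaultdict(list)
    for _student, activity, rating in entries:
        by_activity[activity].append(rating)
    return {a: sum(r) / len(r) for a, r in by_activity.items()}

assert mean_confidence(logs) == {"spirometry video": 4.5, "quiz": 2.5}
```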

How should the data be analyzed? Quantitative data analysis Qualitative data analysis
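A minimal sketch of the quantitative side, assuming paired pre- and post-task questionnaire scores on a 1-5 confidence scale. The numbers are invented, and this only summarises gains; a real analysis would add a significance test appropriate to the design:

```python
from statistics import mean, stdev

pre  = [2, 3, 2, 1, 3, 2, 2, 3]   # confidence before the e-learning module (invented)
post = [4, 4, 3, 3, 5, 4, 3, 4]   # confidence after (invented)
gains = [b - a for a, b in zip(pre, post)]

print(f"mean gain: {mean(gains):.2f} (sd {stdev(gains):.2f})")
# prints "mean gain: 1.50 (sd 0.53)"
```

The qualitative strand (interview and focus-group transcripts) would instead be coded thematically rather than summarised numerically.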

"Research is the process of going up alleys to see if they are blind." Marston Bates, 1967

Design Samples: a control group and an intervention group. Outcomes: skills and attitudes, spirometry skills, performance?

What happened to the students? Randomized studies found no difference between groups: "no particular method of teaching is measurably to be preferred over another when evaluated by student examination performances" (Dubin and Taveggia, 1968). Methods like Problem-Based Learning are implemented very differently in different institutions.

What do we test? Students may compensate for educational interventions (the Hawthorne effect, the Van der Blij effect). A design with an educational method as the independent variable and a curricular test as the dependent variable is usually too simple. We have to know the learners' behavior.

What do we expect from better learning? Better student grades? Better patients? Better world? Better doctors? Or, finally, better learning behavior?

Conclusion Do we still want to taste this dinner?

Resources used and recommended (Evaluating online learning) "Evaluation cookbook", ed. Jen Harvey. Olle ten Cate. What Happens to the Student? The Neglected Variable in Educational Outcome Research. Advances in Health Sciences Education 6: 81-88, 2001.