1 Course evaluation for practice and educational research. Professor Elena Frolova, Department of Family Medicine, MAPS, St. Petersburg, Russia. Educational Research Task Force group, Tallinn, 6 May 2011

2 Background: a new e-learning course developed as part of the RESPECT project of the Family Medicine Department and the Katholieke Universiteit Leuven (leaders: Prof. Degryse, Prof. Kuznetsova). The aim was to teach spirometry and then to conduct research on the prevalence of COPD.

3 Why did we decide to evaluate?

4 Why evaluate? Evaluation is central to ensuring the quality of academic practice; e-learning involves high levels of investment that need to be justified; evaluation demonstrates the effectiveness and cost-effectiveness of learning technologies.

5 Who are the stakeholders? We as teachers, to extract lessons learned and improve our practice; your students or future cohorts; other lecturers and departments; senior managers, to disseminate lessons learned and promote good practice by others; the QAA; the funding body and finance officer (www2.warwick.ac.uk/services/ldc).

6 Why did we decide to evaluate? A new form of learning: e-learning; a new subject of learning: spirometry in primary care; research in education; too many challenges! And money, money.

7 When to evaluate? Diagnostic: learning from the potential users, to inform plans to change practice. Formative: learning from the process, to inform practice. Summative: learning from the product, to inform others about practice.

8 When did we decide to evaluate? "When the cook tastes the soup, it is formative evaluation; when the dinner guest tastes the soup, it is summative evaluation" (Jen Harvey, "Evaluation Cookbook"). Diagnostic, formative and summative.

9 What do we evaluate? E-pedagogy? E-learning facilitates new forms of resources, communication and collaboration, and new patterns of study and group behaviors. E-learning may demand new attitudes and skills.

10 The objects of evaluation. Do not compare the "e" group with a previous "control" group: it is difficult to separate the e-learning intervention from the complex interplay of cultural and social influences. Evaluate not only the pedagogical aims and the teaching process, but also the technology itself and the support surrounding it.

11 “If you don't have a question you don't know what to do (to observe or measure), if you do then that tells you how to design the study” (Draper, 1996).

12 Ask yourself: for the top management of my company, university, government, or institution, what is the single most important measure of success?

13 Focus on what? On e-learning materials? On 'content'? On issues concerning screen design? Navigation? This focus is probably quite superficial in terms of pedagogical impact.

14 Focus on what? The ways in which students interact with electronic content; how the e-resources are introduced into the learning design; the ways in which the technology supports interactions between learners; the forms of communication between students.

15 Comparing traditional and e-learning methods is quite difficult. Students like it because it is new; students hate it because it is unfamiliar. Is it possible to isolate the effect of the new medium? Is any change in scores the result of having a different cohort of students?

16 Evaluating technology-supported higher-order learning. A key evaluation question is whether the approach resulted in students achieving the intended learning outcomes.

17 Evaluating cost-effectiveness: determine the costs associated with using e-learning. The outcomes may include pedagogical, social and economic benefits that cannot always be converted into market or monetary forms. The benefits of using e-learning are difficult to quantify, but may be of high social value (a rough costing sketch follows below).
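
As a rough illustration only: the sketch below computes a cost per learner from invented placeholder figures (the development cost, yearly delivery cost and cohort size are assumptions, not the actual costs of the RESPECT course), and deliberately leaves out the pedagogical and social benefits that resist monetary conversion.

```python
# Minimal cost-per-learner sketch (all figures are invented placeholders, not
# actual costs of the RESPECT course). Non-monetary benefits are left out on purpose.
development_cost = 12000.0        # one-off: authoring the e-learning modules
delivery_cost_per_year = 1500.0   # hosting, tutor time, support
students_per_year = 60
years = 3

total_cost = development_cost + delivery_cost_per_year * years
cost_per_learner = total_cost / (students_per_year * years)
print(f"cost per learner over {years} years: {cost_per_learner:.2f}")
```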

18 Question structures, level 1: Does it work? Do students like it? How effective is it?

19 Question structures, level 2: How cost-effective is the approach? How scalable is the approach?

20 Question structures Is there sufficient need for the innovation to make it worth developing? What are the best ways of using e-learning resource X? Will this approach be an effective way to resolve problem X? How does this approach need to be modified to suit its purpose? Do you think it would be better if...(an alternative mode of engagement) was carried out? What advice would you give another student who was about to get involved in a similar activity? (Bristol LTSS guidelines)

21 Question structures: How do students actually use the learning system/materials? How usable is the e-learning tool or material? What needs are being met by them? What types of users find it most useful? Under what conditions is it used most effectively? Which features of the support materials are most useful to students? What changes result in teaching practices and learning/study practices? What effects does it have on tutor planning, delivery and assessment activities? What are the changes in time spent on learning? (Tavistock, 1998, evaluation guide)

22 How evaluation should be planned: a preliminary analysis of aims, questions, tasks, stakeholders, timescales and instruments/methods; time and budget; sampling and randomisation (see the allocation sketch below); the questions should change as the aim of the evaluation changes.
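
A minimal sketch of the sampling and randomisation step, assuming a hypothetical cohort of student identifiers; the function name, seed and group sizes are illustrative, not part of the course design.

```python
# Minimal sketch: randomly allocate a hypothetical cohort to control and
# intervention groups for the evaluation.
import random

def randomise(student_ids, seed=2011):
    """Shuffle the cohort and split it in half: control vs. intervention."""
    rng = random.Random(seed)   # fixed seed so the allocation is reproducible
    ids = list(student_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"control": ids[:half], "intervention": ids[half:]}

if __name__ == "__main__":
    cohort = [f"student_{i:03d}" for i in range(1, 41)]   # hypothetical cohort of 40
    groups = randomise(cohort)
    for name, members in groups.items():
        print(name, len(members), members[:3], "...")
```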

23 Methods of gathering information: a pre-task questionnaire; confidence logs after each kind of activity; a learning test (quiz); access to subsequent exam (assessment) performance on one or more relevant questions; a post-task questionnaire; interviews with a sample of students (also: focus groups); observation and/or videotaping of one or more individuals (also: peer observation/co-tutoring). A data-capture sketch follows below.
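
A minimal sketch of how the instruments above might be captured as uniform records for later analysis; the instrument names, item names, the 1-5 confidence scale and the CSV file name are assumptions made for the example, not artefacts of the course.

```python
# Minimal sketch: store each student's response to each instrument as one record,
# so the quantitative analysis can work from a single flat file.
import csv
from dataclasses import dataclass, asdict

@dataclass
class Record:
    student_id: str
    instrument: str   # e.g. "pre_questionnaire", "confidence_log", "quiz", "post_questionnaire"
    item: str         # question or activity name
    value: int        # e.g. confidence on a 1-5 scale, or a quiz score

records = [
    Record("student_001", "pre_questionnaire", "spirometry_experience", 2),
    Record("student_001", "confidence_log", "after_module_1", 3),
    Record("student_001", "quiz", "flow_volume_curve_interpretation", 8),
]

with open("evaluation_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["student_id", "instrument", "item", "value"])
    writer.writeheader()
    writer.writerows(asdict(r) for r in records)
```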

24 How should the data be analyzed? Quantitative data analysis; qualitative data analysis. A minimal quantitative example follows below.
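
To illustrate the quantitative side, the sketch below applies two SciPy t-tests to hypothetical quiz scores: a between-group comparison matching the control/intervention design of slide 26, and a paired pre/post comparison within the intervention group. All numbers and group sizes are invented for the example.

```python
# Minimal quantitative-analysis sketch on invented quiz scores.
from scipy import stats

# Post-course quiz scores for the two groups of the design (slide 26).
control      = [55, 60, 58, 62, 57, 61, 59, 63]
intervention = [66, 70, 64, 72, 69, 68, 71, 65]

# Between-group comparison: did the intervention (e-learning) group score higher?
t_between, p_between = stats.ttest_ind(intervention, control)
print(f"between groups: t = {t_between:.2f}, p = {p_between:.3f}")

# Within-group comparison: did scores improve from the pre-task to the post-task quiz?
pre  = [48, 52, 50, 55, 49, 53, 51, 54]
post = intervention
t_paired, p_paired = stats.ttest_rel(post, pre)
print(f"pre vs post (intervention): t = {t_paired:.2f}, p = {p_paired:.3f}")
```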

25 "Research is the process of going up alleys to see if they are blind." Marsten Bates, 1967

26 Design: samples; control group and intervention group; skills and attitudes; spirometry skills; performance?

27 What happened to the students? Randomized studies found no difference between groups: "no particular method of teaching is measurably to be preferred over another when evaluated by student examination performances" (Dubin and Taveggia, 1968). Methods like Problem-Based Learning are implemented very differently in different institutions.

28 What do we test? Students may compensate for educational interventions (Hawthorne effect, Van der Blij effect). A design with an educational method as the independent variable and a curricular test as the dependent variable is usually too simple. We have to know the learners' behavior.

29 What do we expect from better learning? Better student grades? Better patients? A better world? Better doctors? Or, finally, better learning behavior?

30 Conclusion: do we still want to test this dinner?

31 Resources used and recommended: http://www2.warwick.ac.uk/services/ldc; http://www.fastrak-consulting.co.uk/tactix/ (Evaluating online learning); "Evaluation Cookbook", editor Jen Harvey; Olle ten Cate. What Happens to the Student? The Neglected Variable in Educational Outcome Research. Advances in Health Sciences Education 6: 81-88, 2001.

