Proposed plan for the summative evaluation of a Technology Enhanced Learning Project
Dean Petters

– Statement of the research question
– Participants
– Method
– Artefacts and data collection instruments
– Timetable
– Dependencies and risks
– Conclusion

Statement of the research question

A plan for forming a research question from three objectives:
– Adoption of effective practice
– Experiment with new tools and approaches
– Critically reflect on practice

Tensions and synergies within the research question:
– Mould, sharpen, focus, guide
– Extend, broaden, experiment, innovate
– Reflect, deliberate, be explicit in how and why

Participants

– Novice users versus committed practitioners who may have become stakeholders
– Experienced lecturers versus participants new to HE teaching, individuals in training
– Trainers versus users

Research method

Issues to clarify:
– Evaluation of the software versus evaluation of the processes the software is trying to facilitate
– Controls in the design, contamination between conditions, artificiality of the task

An experimental within-subjects design with counterbalancing:
– Half of the subjects use the application first and the controlled condition second
– Half of the subjects use the controlled condition first and the application second

Data:
– Quantitative, from data logging and usability analysis
– Qualitative, from structured interviews, focus groups and questionnaires

Analysis:
– Statistical analysis of differences and correlations
– Thematic analysis of interviews and focus groups, with diagrammatic representation of themes
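To make the counterbalancing and the quantitative comparison more concrete, the sketch below shows one way they could be handled. It is a minimal sketch rather than project code: the participant IDs, the outcome measure (task completion time) and the hypothetical scores are illustrative assumptions, not data from the project.

```python
# Minimal sketch (illustrative, not project code) of counterbalanced ordering
# and a paired within-subjects comparison. Participant IDs, scores and the
# outcome measure (task completion time) are hypothetical.
import random
from scipy import stats

def counterbalance(participant_ids, seed=0):
    """Randomly assign half of the participants to use the application first
    and the other half to use the controlled condition first."""
    ids = list(participant_ids)
    random.Random(seed).shuffle(ids)
    half = len(ids) // 2
    return {"application_first": ids[:half], "control_first": ids[half:]}

def compare_conditions(app_scores, control_scores):
    """Paired-samples t-test on a per-participant measure recorded in both
    conditions (each participant contributes one score per condition)."""
    result = stats.ttest_rel(app_scores, control_scores)
    return result.statistic, result.pvalue

if __name__ == "__main__":
    print(counterbalance([f"P{i:02d}" for i in range(1, 21)]))
    # Hypothetical task completion times (minutes), paired by participant.
    app = [12.1, 10.4, 11.8, 9.9, 13.0, 11.2]
    control = [13.5, 11.0, 12.2, 10.8, 14.1, 11.9]
    t, p = compare_conditions(app, control)
    print(f"t = {t:.2f}, p = {p:.3f}")
```

Counterbalancing the order in this way is what limits carry-over between conditions, as noted under dependencies and risks; the thematic analysis of the qualitative data would be carried out separately by coding the interview and focus group material.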

Artefacts and data collection instruments

– A working prototype of the software, with data-logging adaptations to take measurements for the evaluation (or a usability-testing environment in which to video users)
– Questionnaires and interview schedules
– The material for users to work on and tasks to accomplish, with an optimum balance between control over these materials and the realism of the tasks
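The sketch below illustrates the kind of data-logging adaptation meant here. It is only a sketch under stated assumptions: the log file format, the event vocabulary and the idea that the prototype exposes event handlers from which a logger can be called are illustrative, not features of the actual software.

```python
# Minimal sketch of a usage logger the prototype could call from its event
# handlers. File path, column names and event names are hypothetical.
import csv
import os
import time

class UsageLogger:
    """Appends timestamped user-action events to a CSV file, giving the
    quantitative record (frequencies, durations, task completion) needed
    for the usability analysis."""

    def __init__(self, path="usage_log.csv"):
        self.path = path
        if not os.path.exists(self.path):
            with open(self.path, "w", newline="") as f:
                csv.writer(f).writerow(["timestamp", "participant", "event", "detail"])

    def log(self, participant, event, detail=""):
        with open(self.path, "a", newline="") as f:
            csv.writer(f).writerow([time.time(), participant, event, detail])

# Example usage: the application would call these from its own event handlers.
logger = UsageLogger()
logger.log("P01", "open_activity", "inquiry_task_1")
logger.log("P01", "save_design")
```

If such instrumentation cannot be built into the prototype in time, the same events can be recovered more laboriously from usability video capture, which is why video is treated as the fallback in the dependencies and risks below.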

Timetable

Before the six-month evaluation period:
– Getting appropriate data logging incorporated into the application, or designing usability testing without these capabilities
– Designing, calibrating and piloting questionnaires and structured interviews

– January to February: first five-week design period
– End of February: debrief, including post-experience interviews
– March to April: second five-week design period
– End of April: debrief, including post-experience interviews
– May: data collation and analysis, and any short follow-up data collection
– June: write-up and dissemination

Dependencies and risks

– The key dependency and risk is getting everything ready for the cohort of users in January
– Collection of results depends on working software, but the dependency on data logging may be limited by usability video capture
– Contamination between conditions is limited by counterbalancing

Conclusion

– Overview of the research question: how to combine the different elements?
– Alternatives for Activity Theory
– Evaluating the software and evaluating a way of promoting a pattern of behaviour
– Planning for future research: if the application is effective, how and why?