Effective Methods for Educational Research: The value of prototyping, observing, engaging users. Diana Laurillard.

Methods for educational software design
- An iterative design-test-analyse-redesign approach
- The focus is achievement of learning outcomes
- Align outcomes – activities – assessment
- Discover requirements – design – design tests
- Prototype – Observe – Test – Engage – Evaluate

Prototyping
- Sketch the design idea
- Design user tasks
  – Use materials to represent choices and actions
  – Allow users to construct the choices and actions
- Data collection
  – 6 target users (min)
  – User constructions (materials)
  – Users’ commentary (audio, observation notes)
  – Video (usually unnecessary)

A paper prototype (figure: proposed screen design and proposed actions)

A paper prototype (figure: user-defined actions)

A paper prototype (figure)

Observing
- Individual
  – Observation notes
  – ‘Stimulated recall’ (Bloom, 1953): video playback to act as a memory aide
  – Exercise: ‘What were you thinking at this point?’
  – Audio + notes
- Pairs: listen to them talking; audio + notes
- Recording of screen activities
- Video

Data capture and analysis tools
- To capture screen activities
- To capture eye movements and logged interactions, and to analyse eye-tracking data
- To coordinate and analyse the multiple streams of data

User engagement
- Workshops, with frequent discussions
- Focus group: question, then facilitated discussion
- Survey: closed questions + open comment
- Construct elicitation interviews, to discover users’ own categories of description

Learning analytics
Measures that correlate with final grade:
- Total discussion messages posted (+ve)
- Total number of online sessions (+ve)
- Total time online (+ve)
- Time spent on assignments (+ve)
- Time spent on assessments (-ve)
(Macfadyen and Dawson, 2010)
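The correlations above can be sketched in code. A minimal Python example, assuming a hypothetical LMS export with per-student activity counts and final grades; all data values and field names here are invented for illustration:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical LMS export: one row per student.
students = [
    {"messages": 40, "sessions": 120, "grade": 78},
    {"messages": 12, "sessions": 35, "grade": 55},
    {"messages": 25, "sessions": 80, "grade": 70},
    {"messages": 5, "sessions": 20, "grade": 48},
]
grades = [s["grade"] for s in students]
for metric in ("messages", "sessions"):
    r = pearson([s[metric] for s in students], grades)
    print(f"{metric}: r = {r:+.2f}")
```

A positive r for a metric corresponds to the “(+ve)” entries in the slide; a real analysis would also report significance rather than the coefficient alone.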

LMS tracking of those at risk
(figure: average hours online per week, plotted by week of course, for four student groups with average grades of 72%, 63%, 57%, and 44%; used for predicting those at risk)
(Macfadyen and Dawson, 2010)
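The at-risk prediction above can be sketched as a simple rule over weekly activity. A minimal sketch, assuming hypothetical per-student hours-online logs and an invented cut-off chosen for illustration (the published analysis uses more than a single threshold):

```python
# Hypothetical LMS log: student id -> hours online in each course week.
weekly_hours = {
    "s01": [5, 6, 4, 5],
    "s02": [1, 0, 2, 1],
    "s03": [3, 2, 3, 2],
}

RISK_THRESHOLD = 2.5  # assumed cut-off, in average hours per week

def at_risk(hours, threshold=RISK_THRESHOLD):
    """Flag a student whose average weekly hours online fall below the threshold."""
    return sum(hours) / len(hours) < threshold

flagged = sorted(s for s, h in weekly_hours.items() if at_risk(h))
print(flagged)  # prints: ['s02']
```

Running such a rule week by week, rather than once at the end, is what makes the tracking useful for early intervention.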

Learning analytics on outcomes
Pedagogically-driven analytics:
- Define the ecology of experiences and resources in which the software is embedded
- Design appropriate assessment of outcomes
- Use the software tasks to generate tests
- Formulate the pre-post tests that will capture the differences the software should make
- Interpret outcomes for re-design
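Formulating pre-post tests implies computing some measure of the difference the software made. One common choice (an illustrative pick, not necessarily the author’s) is the normalized gain; a minimal Python sketch with invented scores:

```python
def normalized_gain(pre, post, max_score=100):
    """Fraction of the possible improvement actually achieved (Hake's g)."""
    if pre == max_score:
        return 0.0  # no room to improve
    return (post - pre) / (max_score - pre)

# Hypothetical (pre, post) scores for one cohort.
pairs = [(40, 70), (55, 80), (30, 45)]
gains = [normalized_gain(pre, post) for pre, post in pairs]
mean_gain = sum(gains) / len(gains)
print(f"mean normalized gain: {mean_gain:.2f}")  # prints: mean normalized gain: 0.42
```

Normalizing by the room left to improve keeps a high pre-test score from masking a genuine effect, which matters when interpreting outcomes for re-design.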

Working groups
1. What kinds of problems do you see with methods of prototyping, observing, and engaging users?
2. What might be the advantages of using such methods?
3. How might an educational software developer draw on evidence-based research?