Evaluating Scrutable Adaptive Hypertext Marek Czarkowski University of Sydney, Australia Fourth Workshop on the Evaluation of Adaptive Systems July 2005.


Agenda
- What is Scrutable Adaptive Hypertext?
- Scrutinisation tools to be evaluated
- Evaluation design
- Field test evaluation: UNIX Security Course
- Controlled evaluations: Personalised TV Guide, Holiday Planner

What is Scrutable Adaptive Hypertext?
Adaptive hypertext (personalised presentation and navigation) with built-in tools that allow users to understand and control the personalisation.
Why?
- Control and transparency are good HCI principles
- Guides users in correcting misconceptions and errors in the user model
- Privacy legislation
- Supports curiosity, reflection, and exploration of alternatives
- Important for critical applications

What is Scrutable Adaptive Hypertext?
Supporting scrutinisation means allowing users to get answers to questions like:
- Why / how was this page personalised to me?
- What does the system know about me? Why does it think that?
...and to change the personalisation to better suit their needs:
- What would the system show me if it thought I was ...?

SASY: typical personalised page view

Scrutinisation Tools: Highlight Tool – explains why items were included by personalisation

Scrutinisation Tools: Highlight Tool – explains why items were removed by personalisation

Scrutinisation Tools: Evidence Tool – shows the reason why the system holds a belief about the user

Scrutinisation Tools: Profile Tool – view and change the user model to change the personalisation
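Together, the three tools expose the chain from user-model belief to adaptation decision. A minimal sketch of that idea (hypothetical code, not the actual SASY implementation; all class, field, and item names are illustrative):

```python
# Sketch of scrutable adaptation: each content item carries a condition over
# the user profile, so the system can personalise AND explain every decision.

class ScrutableAdapter:
    def __init__(self, profile, evidence):
        self.profile = profile      # attribute -> value, e.g. {"level": "novice"}
        self.evidence = evidence    # attribute -> why the system believes it

    def personalise(self, items):
        """Return (shown, hidden) items, keeping the reason for each decision
        so a Highlight Tool can display why an item was included or removed."""
        shown, hidden = [], []
        for item in items:
            attr, wanted = item["condition"]
            reason = f"profile has {attr}={self.profile.get(attr)!r}; item requires {wanted!r}"
            (shown if self.profile.get(attr) == wanted else hidden).append((item["text"], reason))
        return shown, hidden

    def explain_belief(self, attr):
        """Evidence Tool: why does the system hold this belief?"""
        return self.evidence.get(attr, "no evidence recorded")

    def set_belief(self, attr, value):
        """Profile Tool: changing the model changes future personalisation."""
        self.profile[attr] = value
        self.evidence[attr] = "told explicitly by the user"

adapter = ScrutableAdapter({"level": "novice"},
                           {"level": "inferred from quiz score of 2/10"})
items = [{"text": "Advanced ACL tricks", "condition": ("level", "advanced")},
         {"text": "What is a permission bit?", "condition": ("level", "novice")}]
shown, hidden = adapter.personalise(items)
adapter.set_belief("level", "advanced")  # user scrutinises and corrects the model
shown2, _ = adapter.personalise(items)   # re-personalised under the new belief
```

The key design point is that every include/remove decision and every belief carries a human-readable justification, which is what makes the hypertext scrutable rather than merely adaptive.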

Evaluation Design
Difficulties in evaluating scrutable adaptive hypertext:
- Users will not scrutinise often; this is understandable, as scrutinisation is not the user's main goal.
- We want to understand how users experience and perceive the user model and personalisation during interaction; for this, users should be immersed in realistic tasks (Paramythis et al. 2001).

Evaluation Design Strategy
Model the evaluation around the most common scenarios in which users might be motivated to scrutinise:
- The user believes the personalisation is faulty because it produces unexpected results
- A content author wishes to debug the adaptive content they have created
- The user is curious about what the system believes about them or how a page was personalised, and wants to explore alternatives
Evaluate across multiple domains.

Evaluation 1: UNIX Security Course Field Test
Aim: Will learners scrutinise and change the personalisation to remove material that distracts from their learning?
Method: Pre-test (knowledge), free use (logging user actions), post-test (knowledge and qualitative).
To motivate scrutinisation, we:
- planted jokes and comments in the teaching material
- populated the user model with defaults that included advanced concepts and many quiz questions
Participants: 84 computer science students learning UNIX security.

Evaluation 1: UNIX Security Course Field Test
Results – Exploring personalisation
77% scrutinised in some way (N=84)

Scrutinisation Tool | % Accessed at least once | % Accessed > 2 times
View Profile        | 51                       | 10
Changed Profile     | 39                       | 18
Evidence Tool       | 40                       | 9
Highlight Tool      | 40                       | 11
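Usage percentages of this kind can be derived from the logged user actions. A hedged sketch (the log format and event names are hypothetical, not SASY's actual logging):

```python
from collections import Counter

def tool_usage(log, n_users):
    """Per tool, compute the % of users who accessed it at least once and
    the % who accessed it more than twice.

    `log` is a hypothetical list of (user_id, tool) events, one per access."""
    counts = Counter(log)  # (user_id, tool) -> number of accesses
    tools = {tool for _, tool in counts}
    stats = {}
    for tool in tools:
        per_user = [n for (u, t), n in counts.items() if t == tool]
        at_least_once = round(100 * len(per_user) / n_users)
        more_than_two = round(100 * sum(n > 2 for n in per_user) / n_users)
        stats[tool] = (at_least_once, more_than_two)
    return stats

# Toy log for 4 users: user 1 views their profile four times,
# user 2 views it once and uses the evidence tool once.
log = [(1, "profile_view"), (1, "profile_view"), (1, "profile_view"),
       (1, "profile_view"), (2, "profile_view"), (2, "evidence")]
```

With this toy log, `tool_usage(log, 4)` reports profile viewing by 50% of users at least once and 25% more than twice.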

Evaluation 1: UNIX Security Course Field Test
Results – Control over personalisation
- Overall, 37% changed their profile to change the personalisation
- 4% removed hints and 9% removed jokes; yet in the survey, most users said jokes/hints were not annoying
- 6% reduced the number of quiz questions
- 22% changed their profile to state they knew more or knew less
Results – Qualitative Survey
- 57% strongly agreed or agreed that "it is useful to be able to inspect and control the personalisation"
- Overall tool utility: 50% positive, 40% neutral, 10% negative

Evaluation 2: Personalised TV Guide Lab Test
Aim: Measure how effectively SASY supports users to:
- Scrutinise a page to determine why adaptive content is included or removed in relation to their user profile
- Explain how and why a belief held by the system was instantiated; in this case, the belief is inferred by the system from the user's interaction with it
- Demonstrate control over the personalisation by altering their profile to change how content is included and removed
Also measured: the effect of online help/training.

Evaluation 2: Personalised TV Guide Lab Test
Method: Users complete a series of tasks using the personalisation tools and provide feedback after each step, allowing efficiency and task correctness to be measured. A qualitative survey at the end of the experiment measures user satisfaction and acceptance. One group of users was trained; the other group was not.

Questions
SASY:
SASY Evaluation: