User Modeling and Recommender Systems: evaluation and interfaces Adolfo Ruiz Calleja 18/10/2014.


Index

- Some administrative stuff
- Evaluation of recommender systems
- Interfaces for recommender systems

Index

- Some administrative stuff
  – Administrative deadlines
  – Final project deadlines
- Evaluation of recommender systems
- Interfaces for recommender systems

Administrative deadlines

- Lecturers, in agreement with the study group, must set two exam times. I open those times in the Study Information System (ÕIS). Students must register for one of these times.
- The responsible member of teaching staff has 10 working days to enter the results of exams/assessments into the Study Information System.
- A student who receives a negative result has the right to take the re-examination once, until the end of the spring semester's intermediate week (March 22, 2015). The lecturer has the right to choose when to conduct the re-exam.
- If a student does not pass the course by the end of the next semester's intermediate week, he/she must re-take the whole course.

How to organize an exam/assessment/re-examination?
– The lecturer sets the exam/assessment/re-examination times and informs me.
– I open those times.
– Students register (registration is possible until 24 hours before the exam).
– ~24 hours before the exam, the lecturer contacts me and asks for the list of participants.
– The exam/assessment/re-examination takes place. NB! Lecturers must not admit students who are not registered!
– I enter the results. NB! The system does not allow me to enter results for students who have not registered for the exam/assessment/re-examination!

Administrative deadlines

- Official evaluation dates: Dec. 20th and 21st
- Projects should be reviewed by January 1st
- Re-examination: Feb. 15th
  – Those who fail or did not present → send the project
  – Those who want a better grade → send a new version of the project + an answer to my review

Final project deadlines

- October 18th: Projects are proposed by the students.
- October 19th: Students start working on their projects.
- October 23rd: I can still be asked about the project topics.
- December 1st: First complete draft of the projects.
- December 8th: Each student should have peer-reviewed at least two projects of their colleagues.
- December 13th and 14th: Projects are presented.
- December 21st: Projects are finished.
- January 1st: Projects are reviewed.

Index

- Some administrative stuff
- Evaluation of recommender systems
  – General comment
  – Common metrics
  – Data gathering techniques
- Interfaces for recommender systems

General comment

- What do you want to evaluate?
- What characteristics are relevant for this evaluation?
- What metrics are relevant?
- Which techniques can be employed to gather data?

Common metrics (accuracy)

[Diagram: two overlapping sets, "Relevant" and "Retrieved", dividing the item space into four regions labelled a, b, c, d: relevant-and-retrieved, retrieved-only, relevant-only, and neither.]

Common metrics (accuracy)

- Precision
- Recall
- Fall-out
- F-metrics
- Mean Absolute Error, Mean Squared Error…
- …and many others
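The accuracy metrics above can be sketched in a few lines. All item sets and rating pairs below are toy values invented for illustration; the counts follow the relevant/retrieved contingency diagram.

```python
# Toy sketch of the accuracy metrics; item sets and rating pairs are invented.

relevant = {"a", "b", "c", "d"}      # items the user actually likes
retrieved = {"b", "c", "e"}          # items the recommender returned
all_items = {"a", "b", "c", "d", "e", "f"}

tp = len(relevant & retrieved)               # relevant and retrieved
fp = len(retrieved - relevant)               # retrieved but not relevant
fn = len(relevant - retrieved)               # relevant but missed
tn = len(all_items - relevant - retrieved)   # neither

precision = tp / len(retrieved)
recall = tp / len(relevant)
fallout = fp / (fp + tn)             # fraction of irrelevant items retrieved
f1 = 2 * precision * recall / (precision + recall)

# Rating-prediction errors over (predicted, actual) pairs
pairs = [(4.5, 5.0), (3.0, 2.0), (4.0, 4.0)]
mae = sum(abs(p - a) for p, a in pairs) / len(pairs)
mse = sum((p - a) ** 2 for p, a in pairs) / len(pairs)
```

Note that precision/recall/fall-out treat recommendation as retrieval (did we return the right items?), while MAE/MSE score rating prediction; the two views need different test data.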

Common metrics (beyond accuracy)

- Coverage
- Learning rate
- Serendipity
- Confidence
- User reactions
  – Explicit (ask) vs. implicit (log)
  – Outcome vs. process
  – Short-term vs. long-term
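As an example of a beyond-accuracy metric, catalog coverage measures what fraction of the catalog the recommender ever surfaces. This is a minimal sketch; the catalog and per-user recommendation lists are made up.

```python
# Minimal catalog-coverage sketch; all items and user lists are invented.

catalog = {"a", "b", "c", "d", "e", "f", "g", "h"}
recommendations = {
    "user1": ["a", "b", "c"],
    "user2": ["b", "c", "d"],
    "user3": ["a", "c", "e"],
}

# Union of everything that was ever recommended, to any user
recommended_items = {item for recs in recommendations.values() for item in recs}
coverage = len(recommended_items) / len(catalog)
```

A recommender can have high accuracy yet low coverage (it keeps recommending the same popular items), which is exactly why such metrics complement precision and recall.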

Data gathering techniques

- Feature analysis
- Test the system with synthetic data
- Test the system with developers or experts
- Test the system with users
  – Laboratory vs. field studies
  – Formal experiment vs. analysis of user behavior
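The second option above, testing with synthetic data, can be sketched as an offline evaluation: generate random ratings, hold some out, and score a baseline against the held-out values. The global-mean baseline and all the numbers here are invented for illustration.

```python
# Offline evaluation sketch on synthetic data; the global-mean baseline
# is illustrative only.
import random

random.seed(0)
# 20 users x 10 items, each with a random 1-5 rating
ratings = [(u, i, random.randint(1, 5)) for u in range(20) for i in range(10)]
random.shuffle(ratings)

split = int(0.8 * len(ratings))          # 80/20 train/test split
train, test = ratings[:split], ratings[split:]

# Baseline: always predict the mean training rating
global_mean = sum(r for _, _, r in train) / len(train)
mae = sum(abs(global_mean - r) for _, _, r in test) / len(test)
```

Any real recommender would replace the global-mean predictor, but the harness (split, predict, score) stays the same.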

[Diagram: a Wizard of Oz study. The user asks a question ("I want…"); behind the scenes a human plays the role of the algorithm, running an SQL select over the dataset and returning the results ("5 results. Do they satisfy you?").]

Index

- Some administrative stuff
- Evaluation of recommender systems
- Interfaces for recommender systems
  – Recommendations
  – Explanations
  – Interactions

Recommendations

- Top item
- Top N items
- Predicted ratings for each item
- Structured overview (divided into categories)
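The presentation styles above can be sketched from a single dict of predicted ratings. All item names, ratings, and categories below are invented:

```python
# Sketch of the four presentation styles; data is invented for illustration.

predicted = {"movie1": 4.7, "movie2": 3.1, "movie3": 4.2, "movie4": 2.5}
categories = {"movie1": "drama", "movie2": "drama",
              "movie3": "comedy", "movie4": "comedy"}

ranked = sorted(predicted, key=predicted.get, reverse=True)

top_item = ranked[0]                                  # "Top item"
top_n = ranked[:3]                                    # "Top N items"
with_ratings = [(i, predicted[i]) for i in ranked]    # predicted rating per item

# "Structured overview": items grouped by category, best-rated first
overview = {}
for item in ranked:
    overview.setdefault(categories[item], []).append(item)
```

The same ranking feeds all four styles; only the amount of information exposed to the user changes, which is the interface decision the slide is pointing at.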

Explanations: benefits

- Transparency → explain how the system works
- Scrutability → allow users to tell the system it is wrong
- Trust → increase the user's confidence
- Effectiveness → help users make good decisions
- Persuasiveness → convince users to take an item
- Efficiency → help users make decisions faster
- Satisfaction → increase the ease of use or the user's enjoyment

Explanations: guidelines

- Consider the benefits you would like to obtain from the explanations.
- How you present explanations affects the interaction model.
- The type of recommendations provided has an impact on the underlying algorithm.
- Be careful: the evaluation of explanations is related to the underlying algorithm.

Interactions: what does the user need to do?

- Nothing
- Specify their requirements
  – Maybe we can ask them to change the requirements
- Explicitly ask for recommendations
  – More like this
  – More later
  – No more like this
  – Surprise me!
  – "I'm feeling lucky"
