UBICOMP ‘01 A Hybrid Evaluation Approach for Ubiquitous Computing Environments Mark Burnett, Chris Rainsford DSTO, Australia.

Types of Evaluation: Typical Characteristics

Quantitative:
- Target problem: well defined, with agreed metrics.
- Pros: well-defined outcomes; easy to compare candidates.
- Cons: narrow focus; does not accommodate human subjectivity.
- Evaluation outcome: improved performance.

Qualitative:
- Target problem: loosely defined, new, with unknown metrics.
- Pros: allows exploration; encourages diversity; human subjectivity allowed.
- Cons: not well defined; hard to compare candidates; inconclusive.
- Evaluation outcome: discovery of new approaches, identification of strong points.

Established evaluation forums include TREC and PKDD. What's the equivalent of RoboCup for a ubicomp environment?

A Hybrid Evaluation Framework

Environment: ubiquitous/pervasive (invisible, connected, context-aware). These properties suggest a hybrid of qualitative and quantitative approaches.

How to evaluate a ubiquitous computing environment?
- Choose a well-defined environment, for example a smart room.
- Choose some well-defined tasks for a hybrid evaluation.

What to evaluate in a ubiquitous computing environment?
- The software infrastructure and services that support a task.
- Participants can build their own smart room (devices + infrastructure + software).
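The hybrid framework above could be sketched as a small evaluation harness, where each task trial carries both a quantitative measure (completion, timing) and a qualitative observation. This is a hypothetical illustration of the idea, not an implementation from the paper; all class and task names are invented for the example.

```python
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class TaskTrial:
    """One participant's attempt at a well-defined task in a smart room."""
    task: str
    completed: bool
    seconds: float
    notes: str  # qualitative observation recorded by the evaluator


@dataclass
class HybridEvaluation:
    """Combines quantitative metrics with qualitative observations."""
    trials: list = field(default_factory=list)

    def record(self, trial: TaskTrial) -> None:
        self.trials.append(trial)

    def completion_rate(self) -> float:
        # Quantitative: fraction of trials completed successfully.
        return sum(t.completed for t in self.trials) / len(self.trials)

    def mean_time(self) -> float:
        # Quantitative: mean time over successful trials only.
        done = [t.seconds for t in self.trials if t.completed]
        return mean(done) if done else float("nan")

    def observations(self) -> list:
        # Qualitative: free-form evaluator notes, kept alongside the metrics.
        return [t.notes for t in self.trials if t.notes]


ev = HybridEvaluation()
ev.record(TaskTrial("dim lights by voice", True, 12.5, "hesitated over phrasing"))
ev.record(TaskTrial("dim lights by voice", False, 60.0, "room mic missed the command"))
print(ev.completion_rate())  # 0.5
```

The point of pairing the two kinds of data in one record is that the qualitative notes can explain the quantitative outliers (here, why one trial failed), which a metrics-only benchmark would miss.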