Mobile HCI Presented by Bradley Barnes

Mobile vs. Stationary. Desktop (stationary): users can devote all of their attention to the application. Very graphical and detailed. Input comes from the keyboard and mouse.

Mobile and Wearable Devices. Users in motion cannot devote all of their attention to the application. Screen real estate is limited, and input and output capabilities are restricted for users on the move.

Mobile Device Interaction. The interfaces for mobile and wearable devices continue to mimic those of desktop computers. New interaction techniques are needed to safely accommodate users on the move. Interaction should be subtle, discreet, and unobtrusive.

Mobile Interaction Methods: keyboard; touch screen; speech recognition; head motion and 3-D sound; eyeglass displays with gestural interaction.

CHI 2005 Paper: Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller. Using a mobile device in a social context should not cause embarrassment or disruption to the immediate environment.

Intimate Interfaces. Discreet interfaces that allow control of mobile devices through subtle gestures in order to gain social acceptance. They must take into account the social context where the interaction will occur; most interaction occurs around other people (on the bus, the train, the street, etc.).

Electromyographic Signals EMG signals are generated by muscle contraction. Signals are picked up by contact electrodes. Allows a definition of “subtle” or “motionless gestures” that can be used to issue commands to mobile devices. Can sense muscular activity not related to movement.

The System. An armband controller recognizes gestures. Signals are transmitted via Bluetooth; a Bluetooth-compliant device (a PDA, mobile phone, etc.) receives them and performs the appropriate action. The armband works on all users: no calibration or training is required.
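On the device side, acting on a recognized gesture reduces to dispatching on an event received over Bluetooth. The slides do not specify the receiver logic, so the following is a hypothetical sketch; the gesture label and the action shown are illustrative, not from the paper.

```python
# Hypothetical receiver-side dispatch: the armband has already
# recognized a gesture and sent a label over Bluetooth; the device
# only needs to map labels to actions.
ACTIONS = {
    "contraction": lambda: print("silencing ringer"),  # assumed action
}

def handle_gesture(label):
    """Run the action bound to a gesture label; ignore unknown labels."""
    action = ACTIONS.get(label)
    if action is None:
        return False  # unrecognized label: do nothing
    action()
    return True
```

A real receiver would register such a handler as the callback of whatever Bluetooth serial channel the armband uses.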

When combined with eyeglass displays, the system becomes hands free. It can be operated while users are carrying items, and can be used in specific fields, such as maintenance, for assistance when the user's hands are occupied.

Design Process. An iterative process centered on users: three pilot studies and one formal study. Pilot Study 1: the biceps was chosen as the muscle, and the gesture was defined as a brief, unnoticeable contraction of the biceps.

Pilot Study 2: refine the gesture definition and create an algorithm for its detection. New subjects with a variety of muscle volumes took part, and the gesture was not fully described to them. EMG signals of the gesture were compared to those of normal activity. The algorithm detects peaks in the EMG signals.
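Detecting peaks in an EMG stream can be sketched as thresholding the rectified, smoothed signal against its resting baseline. This is a minimal illustration, not the authors' published algorithm; the window length, threshold, and quiet-start assumption are all my own.

```python
import numpy as np

def detect_contractions(emg, fs, threshold=3.0, baseline_s=1.0):
    """Flag samples where the EMG envelope rises well above rest.

    Hypothetical sketch: rectify, smooth into an envelope, estimate a
    resting level from the (assumed quiet) first `baseline_s` seconds,
    and mark samples exceeding `threshold` standard deviations above it.
    """
    rectified = np.abs(emg)
    win = max(1, int(0.05 * fs))  # 50 ms moving-average envelope
    envelope = np.convolve(rectified, np.ones(win) / win, mode="same")
    base = envelope[: int(baseline_s * fs)]
    mu, sigma = base.mean(), base.std()
    return envelope > mu + threshold * sigma
```

Runs of True in the returned mask correspond to candidate contractions; their lengths give the durations used later to classify gestures.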

Pilot Study 3: fine-tuning of the system, testing for false positives and false negatives. It consisted of new and returning users, and the algorithm was refined until the number of false positives approached zero. The researchers also decided to try a gesture alphabet with two gestures, defined as two short contractions of different duration.

Formal Study: Validation of Results. The pilot studies set the system parameters by testing gestures on subjects who were not mobile; the formal study was conducted to assess the usability of EMG as a subtle interaction technique for mobile devices, evaluating the system in a mobile context.

Formal Experiment Design. 10 adult participants, ages 23 to 34. Each performed 5 walking tasks: one with no contractions, to calculate the misclassification rate, and the other four with contractions of different durations. Subjects made laps around obstacles while performing the contractions.

Familiarization sessions preceded the walking tasks; these involved standing and making contractions. Participants were prompted to contract by a MIDI piano tone delivered through wireless headphones. The system recognized contractions between 0.3 and 0.8 seconds long, but subjects did not know the duration of their contraction, only that the system had recognized it.
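The two-gesture alphabet amounts to classifying a detected contraction by its duration. A minimal sketch: the 0.3–0.8 s acceptance window comes from the slides, but the boundary between "short" and "long" is an assumed midpoint, not a figure from the paper.

```python
def classify_contraction(duration_s, min_s=0.3, max_s=0.8, split_s=0.55):
    """Map a contraction duration (seconds) to a gesture label.

    The 0.3-0.8 s acceptance window is from the study description;
    the 0.55 s short/long split is a hypothetical midpoint.
    """
    if not (min_s <= duration_s <= max_s):
        return None  # rejected: too brief (likely noise) or too long
    return "short" if duration_s < split_s else "long"
```

In the mixed task, a low tone would prompt a contraction classified as "long" and a high tone one classified as "short".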

Tasks
1) Walking, no contractions – 10 laps
2) Standing, familiarization, generic contractions
3) Walking, stimulus-response, generic contractions
4) Standing, familiarization, short contractions
5) Walking, stimulus-response, short contractions
6) Standing, familiarization, long contractions
7) Walking, stimulus-response, long contractions
8) Walking, stimulus-response, mixed long and short contractions (low tone = long, high tone = short)

Results The online recognition rates for the four walking tasks were: Generic: 96% Short: 97% Long: 94% Mixed: 87%

Conclusion. An EMG-based wearable input device can be used for subtle and intimate interaction. The system presented can recognize motionless gestures without training or calibration, and EMG gestures can serve as a socially acceptable alternative for mobile device interaction.

Future Work. Expand the gesture alphabet, and test in more real-world scenarios, such as while lifting an object.

References
Lumsden, J., Brewster, S. A paradigm shift: alternative interaction techniques for use with mobile & wearable devices. Proc. of the 13th Annual IBM Centers for Advanced Studies Conference, CASCON 2003.
Costanza, E., Inverso, S., Allen, R. Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller. Proc. CHI 2005.