TAUCHI – Tampere Unit for Computer-Human Interaction
Adaptive Software Button: Blind Interaction Techniques for Touchscreens
Georgios Yfantidis
Project for the course New Interaction Techniques, Fall 2003

Introduction: 1
New interaction techniques and usability research engage in constant experimentation to provide solutions for effective computer-based applications. Techniques for physically challenged user groups are of special interest, and more test cases for them are being developed and evaluated.

Introduction: 2
Touch screens can take advantage of gesture recognition. Applications can be useful both in large terminals and in personal digital assistants with limited display sizes.
- Blind people have several difficulties in learning how to use keyboards.
- Blind people have difficulties pointing with a mouse or other devices.
- Blind people need constant feedback in the form of markers (audio, tactile, etc.) to comprehend the system's functionality and state.

The main objectives
- Create a new technique for gesture interaction based on the software button metaphor. The button has to be adaptive.
- Implement simple text entry functionality.
- Simplify the interaction for blind people: only one button, which appears wherever they touch, so there is no need to search for it.
- Facilitate the interaction for blind people by recognizing simple finger gestures as the input mode.

What is an adaptive button?
By an adaptive button we mean a virtual button (as opposed to a physical key) which resembles a normal button in a broad sense but has a variable caption layout and function. The layout of the button changes accordingly during the interaction. In our project the button should be adaptive in both position and functionality.

Why use an adaptive button?
In general, the button is useful for blind people, or in situations where the user can use only one finger to enter text. The advantages of an adaptive button include:
- Flexibility: the button appears wherever the user touches.
- Easy navigation with auditory feedback, so the user does not need to look at the screen to complete the text entry.
- Simplicity: only one button, operated with a single finger (or stylus), which leaves the user a free hand.

Which letters do the layouts contain?
The first layout contains the most commonly used letters in the English language: E R A S T I O N. The second layout includes characters that are used less frequently: C L P D H F G M. The third layout contains the most rarely used characters: Q U B K X V W Y.
[Chart: English letter frequencies, a–z]
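The three frequency-ordered layouts above can be written down as a small lookup table. A minimal sketch in Python (the within-layer order below is just the listing order from this slide, not necessarily the on-screen arrangement):

```python
# The three 8-letter layers, grouped by English letter frequency as on the slide.
LAYERS = [
    list("ERASTION"),  # most frequently used letters
    list("CLPDHFGM"),  # less frequently used letters
    list("QUBKXVWY"),  # most rarely used letters
]

def layer_of(letter):
    """Return the index (0-2) of the layer containing `letter`, or -1 if absent."""
    for i, layer in enumerate(LAYERS):
        if letter.upper() in layer:
            return i
    return -1
```

Note that J and Z appear in none of the three layers; as the later substitution slides explain, they are reachable only through dwelling.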

Succession and layout of layers
[Diagram: the three button layers — Layer 1 (E R A S T I O N), Layer 2 (C L P D H F G M), Layer 3 (Q U B K X V W Y) — succeeding one another cyclically after a time interval]

Software: main structure
- The initial screen comprises a text box and a square-shaped adaptive button.
- The button has 3 different layouts; one layout appears at a time.
- Each layout contains 8 characters arranged along the basic arrow directions (up, down, left, right) and the intermediate positions between them.
- The layouts change cyclically in time.

Software: main structure 2
Characters are entered by moving the finger (or stylus) in their direction and then lifting it. While the user makes this movement, sound feedback indicates the character that is about to be selected. Since there are three layouts with 8 letters each, only 24 characters can be entered this way.
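A sketch of how a finger movement could be mapped to one of the eight directions, assuming screen coordinates (y grows downward); the direction names are illustrative, not taken from the slides:

```python
import math

# Eight compass directions, counted clockwise from "right" in screen coordinates.
DIRECTIONS = ["E", "SE", "S", "SW", "W", "NW", "N", "NE"]

def classify(dx, dy):
    """Map a finger movement vector (dx, dy) to the nearest 45-degree direction."""
    angle = math.atan2(dy, dx)                 # -pi..pi, 0 points rightward
    sector = round(angle / (math.pi / 4)) % 8  # nearest 45-degree sector
    return DIRECTIONS[sector]
```

The character entered on finger lift would then be the one assigned to that sector in the currently shown layer.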

Change of characters
After pointing towards certain characters and waiting, the character is substituted with another character, symbol or special function. The user hears a sound signal that indicates the change. This means that if the user then performs a "mouse up" action, the character of the initial layout is replaced by the new character.
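The rule can be sketched as a comparison of the hold time against a dwell threshold. The threshold value below is an assumption, since the slides do not state it:

```python
DWELL_THRESHOLD = 1.0  # seconds; assumed value, not given on the slides

def entered_character(base, substitute, hold_time):
    """Character produced on "mouse up": the substitute if the user dwelt long
    enough to hear the change signal, otherwise the base character."""
    if substitute is not None and hold_time >= DWELL_THRESHOLD:
        return substitute
    return base
```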

Character substitutions: 1st layer
[Diagram: Layer 1 (E R A S T I O N); one character becomes "space" after dwelling, another enters "Next Line" after dwelling]

Character substitutions: 2nd layer
[Diagram: Layer 2 (C L P D H F G M); after dwelling, individual characters become the "backspace" function, the comma ",", the letter "J", and the period "."]

Character substitutions: 3rd layer
[Diagram: Layer 3 (Q U B K X V W Y); after dwelling, individual characters become the question mark "?", an upper-/lower-case toggle, and the letter "z"]

Layer edit feature
The software includes a special feature that initiates an editing mode for the layers, so the user can change the order of the letters according to his needs or preferences. By double-clicking within the text box, the user can invoke the editing mode and alter the layout of the layers by removing, repositioning and adding letters and characters. The new layer can be saved, giving the user a customized interaction with the button.
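A minimal sketch of editing and persisting a layer set, assuming a JSON representation (the slides only say the new layer can be saved, without naming a format):

```python
import json

def replace_letter(layers, layer_index, position, new_letter):
    """Return a copy of `layers` with one character replaced, leaving the
    original (default) layers untouched."""
    edited = [list(layer) for layer in layers]
    edited[layer_index][position] = new_letter
    return edited

def serialize_layers(layers):
    """Turn the list of 8-character layers into a JSON string for saving."""
    return json.dumps(layers)

def deserialize_layers(text):
    """Restore a saved layer list from its JSON representation."""
    return json.loads(text)
```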

Manipulation through time
In this software, the dimension of time plays an important role in guiding the interaction. The button adapts by changing position and functionality, but the main factor that determines how this adaptation and interchange happen is time.
- Layers change cyclically according to time; there is a fixed interval between the layers which, in future versions of the software, could be configurable or adaptive.
- The dwelling time after a movement (an action of the user) determines which character within the layer is selected (in case the character has a substitution feature).
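The cyclic layer change can be sketched as a function of elapsed time; the 1.5-second interval below is an assumed value, since the slides only say the interval is fixed:

```python
LAYER_INTERVAL = 1.5  # seconds between layer changes; assumed value

def active_layer(elapsed, interval=LAYER_INTERVAL):
    """Index (0-2) of the layer shown `elapsed` seconds after the button appears."""
    return int(elapsed // interval) % 3
```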

Movements and gestures
The user can move his finger in 8 different directions. Each direction returns sound feedback, and the user performs an entry by lifting the finger (or stylus) after pointing. In future versions of the software, more gesture recognition could be applied to add extra functionality. For example, different feedback could be given if the finger backtracks along the same path; a second character could perhaps be entered if the user moved towards the direction of a symbol and then returned to the center. Another option, which was implemented, is cancel.

Future investigation
Future investigation could deal with the following matters:
- Implementing a built-in statistical data monitor whose results could provide insight into the data and help address usability issues.
- Making the interchange of the layers more effective (perhaps an adaptive timer that changes the time interval according to user performance).
- Further investigating input techniques, and how gestures can be made more intuitive and perhaps faster with respect to layout and character selection techniques.
- Possibilities for entering multiple characters with this method: e.g., depending on dwelling time, the user could be presented with more options for succeeding letters based on prediction or statistics, and could then choose an option by pointing, lifting or clicking with the finger.

Conclusions
In this initial phase, the project seemed to fulfil most of the basic functionality and features an adaptive button for a touch screen could have. Gestures and time manipulation were tried out at a basic level, which revealed that single-button manipulation with one finger could really be used as an alternative to traditional text entry methods (e.g., full-keyboard emulation, which is unavailable for blind text entry on a touchscreen without special matrices). This method of text entry can be used by blind people because it allows them to touch almost anywhere on the screen and be presented with the virtual button. Sound feedback during the process provides good, understandable guidance that makes the use of a display unnecessary.

References
- Vanderheiden, G., Law, C. Ergonomics of a non-visual touchscreen interface: a case study. Trace R&D Center, University of Wisconsin-Madison.
- Vanderheiden, G. Use of audio-haptic interface techniques to allow nonvisual access to touchscreen appliances. Trace R&D Center, University of Wisconsin-Madison.
- Evreinov, G. and Raisamo, R. Information Kiosks for All: Issues of Tactile Access. Interaction Design Guide for Touchscreen Applications.
- Lesher, G.W., Moulton, B.J., Higginbotham, D.J. Techniques for augmenting scanning communication.
- Dunlop, H., Jones, M., Cunningham, S.J. A digital library of conversational expressions: a communication aid for people with profound physical disabilities. Dept. of Computer Science, University of Waikato.
- Gnatenko, V. Multidirectional input keypad TM – text input solution for mobile devices.
- Zhai, S., Smith, B.A. Alphabetically Biased Virtual Keyboards Are Easier to Use – Layout Does Matter. IBM Almaden Research Center, San Jose, CA, USA.
- Wobbrock, J.O., Myers, B.A., Hudson, S.E. Exploring Edge-Based Input Techniques for Handheld Text Entry. HCI Institute, School of Computer Science, Carnegie Mellon University.
- Geißler, J. Gedrics: the next generation of icons. German National Research Center for Computer Science (GMD).