Lens Gestures: Integrating Compound Gesture Inputs for Shortcut Activation



Current Problem and Opportunity
Questions:
1. Can compound gestures be recognized accurately?
2. Can compound gestures acquire shortcut targets accurately, and faster than touch input?
Opportunity: Lens Gestures uses the camera as a motion sensor, making it available as an alternative input source.
Implications:
– Quicker access to frequently used applications
– Risk of inaccurate recall of shortcut combinations
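The camera-as-motion-sensor idea can be illustrated with a small sketch: given blob centroid positions tracked across back-camera frames (centroid extraction is assumed to happen upstream, e.g. by frame differencing), classify the swipe direction from the net displacement. The threshold and labels here are illustrative assumptions, not values from the paper.

```python
# Sketch of using the camera as a motion sensor: classify a swipe
# direction from the net displacement of a tracked blob centroid
# across frames. Centroid extraction (e.g. from back-camera frame
# differencing) is assumed upstream; the threshold is illustrative.

def classify_swipe(centroids, min_move=10.0):
    """centroids: list of (x, y) positions over time.
    Returns 'left'/'right'/'up'/'down', or None if movement is too small."""
    (x0, y0), (x1, y1) = centroids[0], centroids[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_move:
        return None                # treat as noise, not a gesture
    if abs(dx) >= abs(dy):         # dominant axis decides the label
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# e.g. classify_swipe([(0, 0), (8, 1), (25, 3)]) -> "right"
```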

Compound Gesture Workflow
1. Initial Recognition
2. Perform Gestures
3. End Recognition
Images from paper: "LensGesture: Augmenting Mobile Interactions with Back-of-Device Finger Gestures"
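The three-step workflow above can be sketched as a small state machine: an initial gesture starts recognition, directional gestures accumulate, and an end gesture closes the sequence. The class, state, and gesture names are illustrative assumptions, not the authors' implementation.

```python
# Minimal state machine for the three-step compound gesture workflow.
# All names (START, END, direction labels) are illustrative assumptions.

IDLE, RECORDING = "idle", "recording"
DIRECTIONS = {"up", "down", "left", "right"}

class CompoundGestureRecognizer:
    def __init__(self):
        self.state = IDLE
        self.sequence = []

    def feed(self, gesture):
        """Consume one detected gesture; return the finished sequence
        when the end gesture arrives, else None."""
        if self.state == IDLE:
            if gesture == "START":       # step 1: initial recognition
                self.state = RECORDING
                self.sequence = []
        elif self.state == RECORDING:
            if gesture == "END":         # step 3: end recognition
                self.state = IDLE
                return tuple(self.sequence)
            elif gesture in DIRECTIONS:  # step 2: perform gestures
                self.sequence.append(gesture)
        return None
```

Feeding `START, up, left, END` would yield the compound sequence `("up", "left")`.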

Demonstration

Challenges of a Compound Input System
1. Accuracy – Not all gesture combinations are equally easy to perform.
2. Recall – The user must remember the specific combination that acquires a particular target.
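The recall challenge comes down to a lookup: each compound sequence maps to exactly one target, and a misremembered sequence resolves to nothing (or the wrong target). A hypothetical shortcut table, with apps and sequences invented here for illustration:

```python
# Illustrative shortcut table mapping compound gesture sequences to
# target applications. Entries are invented for this sketch; a shared
# first gesture per category is an assumed mnemonic aid, not the
# paper's design.
SHORTCUTS = {
    ("up",): "camera",
    ("up", "left"): "photo gallery",
    ("down",): "music player",
    ("down", "left"): "podcasts",
}

def resolve(sequence):
    """Return the target app, or None if the user misrecalls the combo."""
    return SHORTCUTS.get(tuple(sequence))
```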

Addressing the Challenges of a Compound Input System
1. Accuracy – Study 1: Measuring Accuracy of Combinations
– Participants will perform all one- and two-gesture inputs; the study will measure the recognition accuracy rate of each combination.
2. Recall – Study 2: Measuring Accuracy and Time of Target Acquisition
– Participants will perform the gesture that acquires a given application; the study will measure accuracy and acquisition time under four conditions: one gesture, two gestures, two gestures with a fixed first gesture, and touch input.
– Preset mappings are classified by application type (social media, music, etc.).
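The per-condition analysis for Study 2 reduces to grouping trials by input condition and computing an accuracy rate and mean acquisition time for each. A minimal sketch, assuming trial records are `(condition, correct, seconds)` tuples (field names are illustrative):

```python
# Sketch of the Study 2 analysis: accuracy rate and mean acquisition
# time per input condition (one gesture, two gestures, fixed-first
# compound, touch). The trial-record shape is an assumption.
from collections import defaultdict

def summarize(trials):
    """Return {condition: (accuracy, mean_time_seconds)}."""
    groups = defaultdict(list)
    for condition, correct, seconds in trials:
        groups[condition].append((correct, seconds))
    summary = {}
    for condition, rows in groups.items():
        n = len(rows)
        accuracy = sum(1 for c, _ in rows if c) / n
        mean_time = sum(t for _, t in rows) / n
        summary[condition] = (accuracy, mean_time)
    return summary
```

The same summary would be computed for each of the four conditions, letting the compound-gesture conditions be compared directly against the touch-input baseline.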