Capacitive Fingerprinting: Exploring User Differentiation by Sensing Electrical Properties of the Human Body (Chris Harrison, Munehiko Sato, Ivan Poupyrev)

Presentation transcript:

Capacitive Fingerprinting: Exploring User Differentiation by Sensing Electrical Properties of the Human Body. Chris Harrison, Munehiko Sato, Ivan Poupyrev. UIST 2012. Disney Research Pittsburgh, Carnegie Mellon University, University of Tokyo.

ABSTRACT Touchscreens can differentiate multiple points of contact, but not who is touching the device. Electrical properties of humans and their attire can be used to support user differentiation on touchscreens.

Title Electrical properties of users' bodies can be used for differentiation: the ability to tell users apart, but not necessarily to identify them uniquely. The system not only reports finger touch locations, but also identifies which user each finger belongs to.

Title We estimate impedance profiles of users at different frequencies. The fundamental physical principle behind Capacitive Fingerprinting is that the path of alternating current (AC) in a human body depends on the signal frequency.
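As a rough illustration of this sensing idea, the sketch below sweeps a set of frequencies, records the sensed response at each one as a feature vector, and trains a classifier to tell users apart. The frequency range, the measure_response() hook, and the linear SVM are assumptions made for illustration; the paper's hardware and classification details may differ.

```python
# Hypothetical sketch: classify which user a touch belongs to from a
# swept-frequency impedance profile. measure_response() stands in for the
# sensing hardware; the SVM choice is an assumption, not the paper's spec.
import numpy as np
from sklearn.svm import SVC

FREQUENCIES_HZ = np.logspace(3, 6, 200)   # e.g. sweep from 1 kHz to 1 MHz

def capture_profile(measure_response):
    """One feature vector: sensed amplitude at each swept frequency."""
    return np.array([measure_response(f) for f in FREQUENCIES_HZ])

def train_classifier(profiles, user_labels):
    # profiles: list of feature vectors collected while each user touches
    # the instrumented electrode; user_labels: matching user IDs.
    clf = SVC(kernel="linear")
    clf.fit(np.vstack(profiles), user_labels)
    return clf

def identify_touch(clf, measure_response):
    return clf.predict(capture_profile(measure_response).reshape(1, -1))[0]
```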

PROTOTYPE

APPLICATIONS There are many games, especially for tablets, that allow two users to play simultaneously. Using Capacitive Fingerprinting, it is possible for two players to interact in a common game space.

FUTURE WORK We are also curious how our approach could be applied to differentiating between humans and non-humans (e.g., bags, drinks) in ride systems and other applications.

WorldKit: Rapid and Easy Creation of Ad-hoc Interactive Applications on Everyday Surfaces. Robert Xiao, Chris Harrison, Scott E. Hudson. CHI 2013. Human-Computer Interaction Institute, Carnegie Mellon University.

ABSTRACT The WorldKit system makes use of a paired depth camera and projector to make ordinary surfaces instantly interactive. Using this system, touch-based interactivity can, without prior calibration, be placed on nearly any unmodified surface literally with a wave of the hand, as can other new forms of sensed interaction.

Title The system automatically selects an orientation for the interactor, which can optionally be adjusted by dragging a hand around the periphery of the selected area. Once all elements have been instantiated, the interface starts and can be used immediately.
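The slide does not spell out how the default orientation is chosen; one plausible heuristic, sketched below purely as an assumption, is to take the principal axis of the pixels the user painted when defining the interactor.

```python
# Illustrative assumption: derive a default interactor orientation from the
# dominant axis (PCA) of the painted region. Not WorldKit's documented method.
import numpy as np

def interactor_orientation(mask):
    """mask: 2D boolean array, True where the user 'painted' with a hand."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)                 # center the region
    cov = np.cov(pts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    major = eigvecs[:, -1]                  # dominant axis of the region
    return np.degrees(np.arctan2(major[1], major[0]))
```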

One-Time Projector / Depth Camera Calibration The calibration only needs to be performed once; the setup can then be transported and installed anywhere. The depth sensor is used to automatically learn about new environments without requiring explicit steps to measure or (re-)calibrate in a new space. If the environment changes temporarily or permanently after interfaces have been defined by a user (e.g., a surface being projected on is moved), it may be necessary to re-define the affected interfaces.
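As a simplified illustration of a one-time mapping between the depth camera and the projector, the sketch below fits a plane-to-plane homography from a handful of correspondences. This is an assumption valid only for a single flat surface; the real calibration relates the two devices so that arbitrary surfaces can be used, and the example points are made up.

```python
# Simplified sketch: map depth-camera pixels to projector pixels via a
# homography, assuming a single planar surface and hand-matched points.
import numpy as np
import cv2

# (depth pixel, projector pixel) correspondences, e.g. gathered by projecting
# dots and locating them in the depth camera's image (hypothetical values).
depth_pts = np.float32([[102, 80], [510, 95], [495, 390], [120, 402]])
proj_pts  = np.float32([[0, 0], [1280, 0], [1280, 800], [0, 800]])

H, _ = cv2.findHomography(depth_pts, proj_pts)

def depth_to_projector(x, y):
    """Map one depth-image pixel onto projector coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]
```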

Title When the system starts up, we capture 50 consecutive depth frames and average them to produce a background profile. The per-pixel standard deviation across those frames is used as a noise profile.
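This start-up step maps directly onto a few lines of array code. The sketch below assumes a grab_depth_frame() callable standing in for the sensor API, and the 3-sigma foreground test is an illustrative threshold, not a value from the paper.

```python
# Sketch of the described step: average 50 depth frames for a background
# profile and keep the per-pixel standard deviation as a noise profile.
import numpy as np

def build_profiles(grab_depth_frame, n_frames=50):
    frames = np.stack([grab_depth_frame() for _ in range(n_frames)], axis=0)
    background = frames.mean(axis=0)   # per-pixel background depth
    noise = frames.std(axis=0)         # per-pixel sensor noise estimate
    return background, noise

def foreground_candidates(depth, background, noise, k=3.0):
    # Pixels deviating from the background by more than k standard
    # deviations are treated as potential foreground (e.g., a hand).
    return np.abs(depth - background) > k * noise
```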

APPLICATION

LIMITATIONS The system has two notable drawbacks: the resolution of sensing and graphics can be lower than optimal, and the user may occlude the depth sensor and/or projector in certain configurations.

GaussBits: Magnetic Tangible Bits for Portable and Occlusion-Free Near-Surface Interactions. Rong-Hao Liang, Kai-Yin Cheng, Liwei Chan, Chuan-Xhyuan Peng, Mike Y. Chen, Rung-Huei Liang, De-Nian Yang, Bing-Yu Chen. CHI 2013. National Taiwan University; National Taiwan University of Science and Technology.

ABSTRACT Since non-ferrous materials, such as the user's hand, do not occlude the magnetic field, a grasped GaussBit can still be tracked in the near-surface space above the display. Existing on-surface approaches allow users only to perform 2D tangible interactions but not 3D interactions, such as grasping.

Types of GaussBits The basic operations that can be performed with the GaussBits are 3D translation, tilt, roll, and flip.
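Purely as an illustrative assumption (the paper's tracking algorithm is not reproduced here), the sketch below shows how coarse position, a height proxy, and flip state might be read off a grid of signed Hall-sensor readings behind the display.

```python
# Illustrative sketch only: the weighted-centroid position estimate, the
# peak-strength height proxy, and the polarity-based flip test are all
# assumptions, not GaussBits' published algorithm.
import numpy as np

def estimate_state(field):
    """field: 2D array of signed magnetic readings from the sensor grid."""
    mag = np.abs(field)
    total = mag.sum()
    ys, xs = np.indices(field.shape)
    cx = (xs * mag).sum() / total              # weighted centroid ~ x position
    cy = (ys * mag).sum() / total              # weighted centroid ~ y position
    height_proxy = 1.0 / (mag.max() + 1e-9)    # weaker peak ~ farther away
    flipped = field.flat[np.argmax(mag)] < 0   # polarity of strongest reading
    return (cx, cy, height_proxy), flipped
```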

Title

APPLICATIONS

Title

Combination with Other ID Techniques

Results and Discussion

SideBySide: Ad-hoc Multi-user Interaction with Handheld Projectors. Karl D.D. Willis, Ivan Poupyrev, Scott E. Hudson, Moshe Mahler. UIST 2011. Disney Research Pittsburgh; Computational Design Lab, Carnegie Mellon University.

Abstract

On-device Sensing

Title Because tracking visible projected images restricts the type of content that can be projected, we track invisible fiducial markers projected from the handheld projection device.

Combining Visible and Infrared Imagery The marker image is created by loading a monotone image into the red and green channels of an OpenGL frame buffer. The visible image is loaded from a separate monotone image into the blue channel using OpenGL's additive blending mode.
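A minimal sketch of that channel layout, using a NumPy array in place of the OpenGL frame buffer. The channel assignment follows the slide text; the additive-blend step of the real pipeline is omitted.

```python
# Sketch: pack the marker image and the visible image into one frame so the
# modified projector can emit them simultaneously (per the slide's layout).
import numpy as np

def compose_frame(marker_mono, visible_mono):
    """Both inputs: HxW uint8 monotone images of the same size."""
    h, w = marker_mono.shape
    frame = np.zeros((h, w, 3), dtype=np.uint8)
    frame[..., 0] = marker_mono    # red channel: marker image
    frame[..., 1] = marker_mono    # green channel: marker image
    frame[..., 2] = visible_mono   # blue channel: visible content
    return frame
```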

Overlapping Markers

Title Our marker recovery procedure continuously adapts to changing environmental conditions by refreshing the stored image of the reference marker when there is no overlap, resulting in more robust performance.
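A hedged sketch of that idea: keep a stored reference of our own marker, refresh it whenever no overlap is detected, and use it to remove our own contribution when markers do overlap. The overlap test and the simple subtraction are simplifications for illustration, not the authors' exact recovery procedure.

```python
# Simplified sketch of reference refresh during marker recovery.
import numpy as np

class MarkerRecovery:
    def __init__(self):
        self.reference = None

    def update(self, ir_frame, own_marker_mask, overlap_detected):
        """ir_frame: HxW uint8; own_marker_mask: HxW boolean mask."""
        if not overlap_detected:
            # Conditions drift over time, so refresh the stored reference.
            self.reference = ir_frame[own_marker_mask].copy()
            return ir_frame
        recovered = ir_frame.astype(np.int16)
        if self.reference is not None:
            # Remove our own marker's contribution in the overlap region.
            recovered[own_marker_mask] -= self.reference
        return np.clip(recovered, 0, 255).astype(np.uint8)
```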

Optical Communication To project dynamic event markers, the sender device first identifies an empty region in the projection frame.
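The empty-region search can be illustrated with a summed-area table so that each candidate window is tested in constant time. The window size and stride below are made-up parameters, not values from the paper.

```python
# Sketch: find an empty window in the projection frame for an event marker.
import numpy as np

def find_empty_region(content_mask, size=64):
    """content_mask: HxW boolean array, True where content is already drawn."""
    ii = np.cumsum(np.cumsum(content_mask.astype(np.int32), axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))          # summed-area table with zero border
    h, w = content_mask.shape
    for y in range(0, h - size + 1, size // 2):
        for x in range(0, w - size + 1, size // 2):
            occupied = (ii[y + size, x + size] - ii[y, x + size]
                        - ii[y + size, x] + ii[y, x])
            if occupied == 0:
                return x, y                    # top-left of an empty window
    return None                                # no free space in this frame
```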

APPLICATIONS

3D Puppetry: A Kinect-based Interface for 3D Animation. Robert T. Held, Ankit Gupta, Brian Curless (University of Washington), Maneesh Agrawala (University of California, Berkeley). UIST 2012.

ABSTRACT

INTRODUCTION We use the Kinect along with the ReconstructMe 3D scanning software to build a virtual 3D model of each puppet as a pre-process.

Congratulations to our advisor on being promoted to full professor.

Title With tangible 3D puppets, the physics of the real world can help to produce natural-looking motions. Our puppet-tracking algorithm does impose a few restrictions on puppeteers. Because it assumes that puppets are rigid objects, it cannot track articulated joints or easily deformable materials such as cloth. In addition, if puppeteers move the puppets very quickly over long distances, our system can lose track of them.
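Because puppets are assumed rigid, each frame's pose can be refined by registering the captured point cloud against the pre-scanned model. The sketch below shows only an ICP refinement step via Open3D as an illustration of that rigid-object assumption; the full tracking pipeline in the paper is more involved, and the distance threshold is a placeholder value.

```python
# Minimal sketch: refine a rigid puppet pose with point-to-point ICP.
import open3d as o3d

def refine_pose(frame_cloud, puppet_model, prev_pose, max_dist=0.02):
    """frame_cloud, puppet_model: o3d.geometry.PointCloud; prev_pose: 4x4."""
    result = o3d.pipelines.registration.registration_icp(
        puppet_model, frame_cloud, max_dist, prev_pose,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation   # updated 4x4 puppet pose for this frame
```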

SYSTEM OVERVIEW
