
Su-ting Chuang, 2010/8/2

Outline: Introduction, Related Work, System and Method, Experiments, Conclusion & Future Work

Outline: Introduction, Related Work, System and Method, Experiments, Conclusion & Future Work

Introduction. Problems addressed: non-uniform lighting; varying finger-touch response at different positions on the surface; low computational efficiency; and no tool that helps users determine detection parameters automatically.

Outline: Introduction, Related Work, System and Method, Experiments, Conclusion & Future Work

Related Work: FTIR (Frustrated Total Internal Reflection). J. Y. Han, "Low-cost multi-touch sensing through frustrated total internal reflection," in Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05), ACM Press, New York, NY, USA, 2005.

Related Work: DI (Diffused Illumination). J. Rekimoto and N. Matsushita, "Perceptual Surfaces: Towards a Human and Object Sensitive Interactive Display," in Workshop on Perceptual User Interfaces (PUI '97), 1997.

Related Work: TouchLib, a multi-touch development kit. Its finger detection processing chain: Background Subtraction → Simple Highpass → Scale → Threshold → Finger Analysis.

Related Work: DirectShow, a filter-based framework, and GShow, a GPU-accelerated framework combining DirectX and DirectShow.

Outline: Introduction, Related Work, System and Method, Experiments, Conclusion & Future Work

Hardware Configuration: (1) peripheral projector, (2) IR camera, (3) IR illuminator.

Hardware Configuration: order of the diffuser layer and the touch-glass layer (diagram comparing the two layer orders, showing the IR illuminator, the IR camera, and the resulting IR spot in each camera view).

Hardware Configuration. Problem: IR rays reflected by the touch glass produce hot-spot regions in the camera views. Solution: use the other cameras to recover the regions occluded by the IR spots.

Software Architecture: a detection system (Image Stitching → Finger Detection → Finger Tracking) plus a parameter determination component.

Software Architecture: Image Stitching → Finger Detection → Finger Tracking.

Image Stitching. Goal: combine the multi-camera views into a single virtual camera view.

Image Stitching. Advantages: removes the IR-spot effect, unifies finger size across different positions on the table, reduces the matching problem, and remains compatible with the existing finger detection system.

Image Stitching pipeline: IR camera (L) and IR camera (R) → Undistortion → Homography Warp (HomoWarp) → Image Blending.

Image Stitching: Homography Warp (HomoWarp).

Image Stitching: Image Blending.
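
As a concrete illustration, a minimal OpenCV sketch of this undistortion → homography warp → blending chain might look as follows; the calibration data (intrinsics, distortion coefficients, and per-camera homographies) and the simple feather-blending scheme are assumptions, not the exact implementation used in this work.

```python
import cv2
import numpy as np

def stitch_views(img_left, img_right, calib, canvas_size):
    """Combine the two IR camera views into one virtual camera view.

    `calib` is a hypothetical dict holding, per camera, the intrinsic matrix
    K, the distortion coefficients `dist`, and a homography H that maps the
    undistorted view onto the common virtual-camera canvas (width, height).
    """
    warped, masks = [], []
    for img, cam in ((img_left, "L"), (img_right, "R")):
        K, dist, H = calib[cam]["K"], calib[cam]["dist"], calib[cam]["H"]
        und = cv2.undistort(img, K, dist)                        # undistortion
        warped.append(cv2.warpPerspective(und, H, canvas_size))  # homography warp
        masks.append(cv2.warpPerspective(                        # coverage mask
            np.ones(img.shape[:2], np.float32), H, canvas_size))

    # Feather blending: weight each camera by its coverage in the overlap region.
    total = masks[0] + masks[1] + 1e-6
    blended = (warped[0] * masks[0] + warped[1] * masks[1]) / total
    return blended.astype(np.uint8)
```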

Finger Detection: TouchLib pipeline (Background Subtraction → Simple Highpass → Scale → Binarize → Finger Analysis) versus our method (Normalization → Difference of Gaussian → Background Subtraction → Binarize → Finger Analysis).
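
The two stages shared by both pipelines, binarization and finger analysis, could be sketched as below (assuming OpenCV 4's findContours signature); the threshold and finger-size limits are placeholder parameters, and the normalization and DoG stages of our method are sketched after the next two slides.

```python
import cv2

def find_fingertips(band_passed, threshold=20, min_size=10, max_size=50):
    """Binarize the band-passed image and keep blobs whose size matches a finger."""
    _, binary = cv2.threshold(band_passed, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    tips = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if min_size <= max(w, h) <= max_size:           # finger-size check
            tips.append((x + w / 2.0, y + h / 2.0))     # blob centre = fingertip
    return tips
```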

Finger Detection: Normalization. Method: model the distribution of IR illumination, use a specific material to simulate the foreground, construct a normalization map, and normalize the foreground image. Result: before normalization, mean = 75 and standard deviation = 30; after normalization, mean = 255 and standard deviation = 3.
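
A minimal sketch of this normalization idea, assuming a reference frame is captured once with a uniform IR-reflective material covering the whole surface; the target level of 255 follows the slide, everything else is illustrative.

```python
import numpy as np

def build_normalization_map(reference_frame, target=255.0):
    """Per-pixel gain map from a frame captured while the surface is covered
    with a uniform IR-reflective material (simulated foreground)."""
    ref = reference_frame.astype(np.float32)
    return target / np.maximum(ref, 1.0)   # avoid division by zero

def normalize(frame, gain_map):
    """Flatten the non-uniform IR illumination of a foreground frame."""
    out = frame.astype(np.float32) * gain_map
    return np.clip(out, 0, 255).astype(np.uint8)
```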

Finger Detection: Difference of Gaussian (DoG), modified from the simple highpass filter in TouchLib.
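
A small sketch of a difference-of-Gaussian band-pass step standing in for TouchLib's simple highpass; the two kernel sizes are illustrative, not the values chosen in this work.

```python
import cv2

def difference_of_gaussians(gray, small_ksize=5, large_ksize=21):
    """Band-pass the normalized image: fingertips (small bright blobs) survive,
    while slowly varying illumination is suppressed."""
    fine = cv2.GaussianBlur(gray, (small_ksize, small_ksize), 0)
    coarse = cv2.GaussianBlur(gray, (large_ksize, large_ksize), 0)
    return cv2.subtract(fine, coarse)   # saturating subtraction, stays uint8
```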

Fingertip Tracking. Goal: smooth the finger trajectory and recover lost detections. Method: Kalman filter; it smooths the path by predicting the new state and its uncertainty, then correcting the tracker with the new measurement, assuming white noise and constant velocity. (Figure: trajectory before vs. after Kalman filtering.)
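
A sketch of a constant-velocity Kalman tracker for a single fingertip using OpenCV's KalmanFilter; the noise covariances are placeholder values.

```python
import cv2
import numpy as np

def make_fingertip_tracker(x0, y0):
    """Constant-velocity model: state = (x, y, vx, vy), measurement = (x, y)."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3      # white process noise
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1  # detector jitter
    kf.statePost = np.array([[x0], [y0], [0], [0]], np.float32)
    return kf

def track(kf, measurement):
    """Predict the new state; correct it if a detection is available, otherwise
    keep the prediction (this is how lost detections are bridged)."""
    kf.predict()
    if measurement is not None:
        kf.correct(np.array([[measurement[0]], [measurement[1]]], np.float32))
    return float(kf.statePost[0]), float(kf.statePost[1])
```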

Parameter Determination. Requirements of an ideal finger detection system: sensitive (fewer misses) and noise-free (fewer false alarms). Goal: find an applicable set of parameters for the finger detection system that fulfills these requirements.

Parameter Determination (framework diagram): the parameter determinator sends each parameter combination to the detection system, which runs on the test set (touch data with ground-truth traces); the detection results are fed back to the determinator, which finally outputs an applicable set of parameters.

Parameter Determination: evaluation of parameters. Data collection: a ground-truth trace is depicted on the surface. Measurement: minimize the number of misses and false alarms.

Parameter Determination: ideal finger detection means only one fingertip lands on the trace and detections are continuous across frames.
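
One way such a score could be computed against the ground-truth trace is sketched below: each frame should contribute exactly one detection near the trace, extra or off-trace detections count as false alarms, and trace frames without a nearby detection count as misses. The distance tolerance and the data layout are assumptions.

```python
import math

def score_detections(detections_per_frame, trace_per_frame, tol=15.0):
    """Count misses and false alarms against the ground-truth trace.

    detections_per_frame: list of lists of (x, y) fingertips, one list per frame.
    trace_per_frame:      list of (x, y) ground-truth trace points, one per frame.
    """
    miss, false_alarm = 0, 0
    for dets, gt in zip(detections_per_frame, trace_per_frame):
        on_trace = [d for d in dets
                    if math.hypot(d[0] - gt[0], d[1] - gt[1]) <= tol]
        if not on_trace:
            miss += 1                                       # trace point not detected
        false_alarm += len(dets) - min(len(on_trace), 1)    # extra / off-trace detections
    return miss, false_alarm
```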

Outline: Introduction, Related Work, System and Method, Experiments, Conclusion & Future Work

Experiments: performance evaluation.

Experiments: Parameter Determination. Decide the parameters of our system (pipeline: Normalization → Difference of Gaussian → Background Subtraction → Binarize → Finger Analysis) using a sampling-based parameter search. Tuned parameters: subtract value, smooth kernel, threshold, and finger size.

Experiments: Parameter Determination, exhaustive search. Number of parameter combinations: 5 steps × 5 × 5 × 5 = 625; applicable combinations: 16/625 = 2.56%. Each of the four parameters (subtract value, smooth kernel, threshold, finger size) is sampled at 5 values between its low and high bound.
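
A sketch of what the exhaustive search could look like; the grid bounds and the run_detection helper (wrapping the detection system plus the miss/false-alarm scoring) are hypothetical stand-ins, chosen only so that each parameter takes 5 values and the grid has 625 combinations.

```python
import itertools

# Hypothetical bounds and steps; each parameter takes 5 values -> 5*5*5*5 = 625 combinations.
GRID = {
    "subtract_value": range(0, 25, 5),
    "smooth_kernel":  range(5, 30, 5),
    "threshold":      range(10, 35, 5),
    "finger_size":    range(10, 60, 10),
}

def exhaustive_search(run_detection, test_set, max_miss=0, max_false_alarm=0):
    """Return every parameter combination whose detections on the test set
    satisfy the miss / false-alarm requirements."""
    applicable = []
    keys = list(GRID)
    for values in itertools.product(*(GRID[k] for k in keys)):
        params = dict(zip(keys, values))
        miss, fa = run_detection(params, test_set)   # assumed detector + scoring wrapper
        if miss <= max_miss and fa <= max_false_alarm:
            applicable.append(params)
    return applicable
```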

Experiments: Parameter Determination with particle filtering, iterating between sampling parameter sets and measuring their detection quality.
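
A sketch of the sampling-and-measuring loop a particle-filter-style parameter search could follow: sample candidate parameter sets, weight each by how few misses and false alarms it produces, then resample and perturb the good candidates. The scoring wrapper and the perturbation scale are assumptions, and discrete parameters (kernel sizes, etc.) would need rounding in practice.

```python
import random

def particle_search(run_detection, test_set, bounds, n_particles=50, n_iters=10):
    """bounds: dict name -> (low, high); run_detection(params, test_set) -> (miss, fa)."""
    # 1. Sampling: start from uniformly random parameter sets.
    particles = [{k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
                 for _ in range(n_particles)]
    for _ in range(n_iters):
        # 2. Measure: fewer misses + false alarms -> larger weight.
        weights = []
        for p in particles:
            miss, fa = run_detection(p, test_set)
            weights.append(1.0 / (1.0 + miss + fa))
        # 3. Resample proportionally to weight and perturb the survivors.
        chosen = random.choices(particles, weights=weights, k=n_particles)
        particles = [{k: min(max(v + random.gauss(0, 0.05 * (bounds[k][1] - bounds[k][0])),
                                 bounds[k][0]), bounds[k][1])
                      for k, v in p.items()}
                     for p in chosen]
    # Return the best parameter set in the final population.
    return max(particles,
               key=lambda p: 1.0 / (1.0 + sum(run_detection(p, test_set))))
```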

Outline: Introduction, Related Work, System and Method, Experiments, Conclusion & Future Work
