1
Takuya Imaeda (Graduate School of Media and Governance, Keio University), Hideyuki Tokuda (Faculty of Environment and Information Studies, Keio University)
2
“uMegane” visualizes sensor information using Augmented Reality (AR) technology ◦ Users can understand sensor information visually. The filtering mechanism of the system lets users select the sensor information they need.
3
Miniaturization of sensor devices ◦ Wireless sensor nodes
Realization of the ubiquitous computing environment
4
It is difficult for ordinary users to get information from a sensor network ◦ e.g., accessing a database system
The amount of sensor information tends to become huge
Users cannot
– get information from the sensor right in front of them
– access the information they need
5
Direct access to the database
Display on a map
QR code + mobile phone + Web-based UI
6
Constructing a system that enables users to access sensor information easily
◦ it should not require special knowledge or great care from users
Temperature: 29 ℃ Humidity: 30 %
7
Traceability application ◦ in a food store ◦ consumers can check the safety of food by reading sensor data ◦ there are many objects to check (good condition, proper temperature)
8
Visualizing sensor information using AR technology
Filtering sensor information ◦ selecting the required information from a huge amount of sensor information
9
Augmented Reality ◦ overlays digital information on images of the real world
We applied AR technology to the ubiquitous computing environment
10
Weapon technology ◦ displays aircraft information on an HMD or HUD
Vehicles ◦ display information about the vehicle and the surrounding area on the windshield
AR has been of service in specific environments or situations
11
NaviCam ◦ Rekimoto, 1995 ◦ projects digital information onto an image of the real world ◦ covers static information such as calendars and bookshelves, but does not cover dynamic information such as sensor data
u-Photo ◦ Tokuda Lab, Keio University ◦ enables users to operate devices using the metaphor of a photograph, but does not cover dynamic information in real time
12
Overlaying sensor information on images of the real world. Temperature: 29 ℃ Humidity: 30 %
14
Visual marker
◦ merit: uniform objects can be distinguished individually
◦ demerit: visual markers must be attached to objects
3D model
◦ merit: no need to change the real world
◦ demerit: requires a 3D model of real-world objects; recognizes uniform objects as the same object
Feature point
◦ merit: no need for a model of the real world
◦ demerit: unsuited for object recognition
This system needs to recognize uniform objects individually, so we apply visual markers.
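As a rough illustration of why visual markers fit this requirement, the following sketch (assuming the classic ARToolkit C API) maps each detected marker pattern to an individual sensor node; the threshold, marker width, and the patternToSensor table are illustrative assumptions, not details from the slides.

// Sketch: per-frame visual-marker recognition with classic ARToolkit (assumed setup).
// Each uniform object carries its own trained pattern, so the detected pattern ID
// can be mapped to the sensor node attached to that particular object.
#include <AR/ar.h>
#include <map>

std::map<int, int> patternToSensor;   // pattern ID -> sensor node ID (illustrative)

// Returns the sensor node ID seen in this frame, or -1 if no known marker is visible.
// 'image' is the current camera frame; 'glTrans' receives the camera-to-marker transform.
int detectSensorMarker(ARUint8 *image, double glTrans[3][4]) {
    ARMarkerInfo *markers;
    int markerNum;

    // Detect all visual markers in the frame (threshold of 100 is an assumption).
    if (arDetectMarker(image, 100, &markers, &markerNum) < 0) return -1;

    for (int i = 0; i < markerNum; i++) {
        auto it = patternToSensor.find(markers[i].id);
        if (it == patternToSensor.end()) continue;        // unknown pattern

        // Estimate the marker pose so the 3D globe can be drawn on top of the object.
        double center[2] = {0.0, 0.0};
        double width = 80.0;                              // marker width in mm (assumed)
        arGetTransMat(&markers[i], center, width, glTrans);
        return it->second;                                // sensor node ID for this object
    }
    return -1;
}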
15
Visualizes sensor information as a 3D globe model
◦ Color: temperature (red = hot, blue = cold)
◦ Size: light intensity (big = bright, small = dark)
◦ Movement: acceleration (fast = fast, slow = slow)
(Screenshot)
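A minimal sketch of this mapping in C++: raw temperature, light, and acceleration readings are converted into the globe's color, size, and animation speed. The value ranges and names are assumptions for illustration only.

// Sketch: map raw sensor values onto the globe's drawing parameters.
// The value ranges are illustrative assumptions, not calibrated figures from the slides.
struct GlobeAppearance {
    float r, g, b;   // color: red = hot, blue = cold
    float radius;    // size: big = bright, small = dark
    float speed;     // movement: fast = high acceleration
};

static float clamp01(float v) { return v < 0.f ? 0.f : (v > 1.f ? 1.f : v); }

GlobeAppearance mapSensorToGlobe(float tempC, float lightLux, float accel) {
    GlobeAppearance a;

    // Temperature -> color: interpolate from blue (cold, ~0 degC) to red (hot, ~40 degC).
    float t = clamp01(tempC / 40.f);
    a.r = t; a.g = 0.f; a.b = 1.f - t;

    // Light intensity -> size: brighter readings give a larger globe.
    a.radius = 0.05f + 0.25f * clamp01(lightLux / 1000.f);

    // Acceleration -> animation speed of the globe's movement (assumes values in g).
    a.speed = clamp01(accel / 2.f);
    return a;
}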
16
It is difficult to display all sensor information ◦ there is a huge amount of sensor data
The filtering mechanism switches the visibility of sensor information
◦ Real-time filter ◦ Time-machine filter ◦ Abnormal-state-detection filter
17
We implemented three filters in the prototype
◦ Real-time filter: displays the latest sensor data
◦ Time-machine filter: shows past sensor information; users can move back and forth along the time axis
◦ Abnormal-state-detection filter: shows only sensor data that falls outside a range of values defined in advance by the user
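A minimal sketch, in C++, of how the three filters could share one interface so they can be switched at run time; the class names, fields, and thresholds are assumptions, not the actual implementation.

// Sketch: a common interface for the three filters so the filter switcher can
// exchange them at run time. Names, fields, and thresholds are illustrative assumptions.
#include <ctime>

struct SensorReading {
    int    sensorId;
    time_t timestamp;
    float  value;
};

class SensorFilter {
public:
    virtual ~SensorFilter() = default;
    // Decide whether a reading should be visualized.
    virtual bool isVisible(const SensorReading &r) const = 0;
};

// Real-time filter: show only fresh readings (freshness window assumed).
class RealTimeFilter : public SensorFilter {
    time_t maxAgeSec = 60;
public:
    bool isVisible(const SensorReading &r) const override {
        return std::time(nullptr) - r.timestamp <= maxAgeSec;
    }
};

// Time-machine filter: show readings around a user-chosen point on the time axis.
class TimeMachineFilter : public SensorFilter {
public:
    time_t focusTime = 0;      // position on the time axis, moved by the user
    time_t windowSec = 60;
    bool isVisible(const SensorReading &r) const override {
        return r.timestamp >= focusTime - windowSec && r.timestamp <= focusTime + windowSec;
    }
};

// Abnormal-state-detection filter: show only values outside a user-defined range.
class AbnormalStateFilter : public SensorFilter {
public:
    float lowerBound = 0.f, upperBound = 35.f;   // e.g. an acceptable temperature range
    bool isVisible(const SensorReading &r) const override {
        return r.value < lowerBound || r.value > upperBound;
    }
};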
18
User device ◦ camera and HMD / display ◦ mobile phone
Objects in the real world ◦ visual marker ◦ wireless sensor node
Sensor information delivery server ◦ delivers sensor information to the user device through a wireless network
19
(System diagram: wireless sensor nodes send sensor data over wireless communication to a sync node, which is connected to the sensor server via USB; the server delivers sensor data over the Internet to the user device, which uses a web camera (USB) and an HMD (VGA).)
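As a rough illustration of the data flow, the sketch below shows the kind of record the sensor information delivery server might serialize for the user device; the field layout and text wire format are assumptions, not the actual protocol.

// Sketch: a sensor record as the delivery server might serialize it for the
// user device. The wire format shown here is an assumption for illustration.
#include <cstdint>
#include <sstream>
#include <string>

struct SensorRecord {
    uint32_t sensorId;     // wireless sensor node (e.g. a uPart) identifier
    uint64_t timestamp;    // seconds since epoch
    float    temperature;  // degC
    float    light;        // relative light intensity
    float    acceleration; // 1-dimensional acceleration
};

// Serialize one record as a single text line, e.g. "17 1212303600 28.5 0.62 0.03".
std::string toWireLine(const SensorRecord &r) {
    std::ostringstream out;
    out << r.sensorId << ' ' << r.timestamp << ' '
        << r.temperature << ' ' << r.light << ' ' << r.acceleration;
    return out.str();
}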
20
(Software architecture diagram: on the user device, the ARToolkit module analyzes the camera image for the camera pose, the acceleration-analysis module processes accelerometer input, and the user interface takes touch-panel / mouse input for filter-switch operations; the sensor data manager receives sensor data over Wi-Fi; the filter switcher applies the real-time, time-machine, or abnormal-state-detection filter; the visualization module issues draw operations through OpenGL, or OpenGL ES on mobile phones.)
21
1. Scalability investigation with respect to the number of sensors 2. Usability evaluation
22
We measured how the time required to display sensor information changes as the number of sensors increases
Target data ◦ sensor data collected for one month (20080601 – 20080630) ◦ data updated every minute ◦ 100 sensors
Goal ◦ 30 fps (one frame takes 33 msec)
Sensor ◦ uPart (TecO Lab, University of Karlsruhe, Germany): light, temperature, 1-dimensional acceleration
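A minimal sketch of how the per-frame display time could be measured against the 33 msec budget; drawSensorGlobes() is a hypothetical stand-in for the real rendering call, not part of the actual system.

// Sketch: measure the average time needed to draw N sensors per frame and compare
// it with the 33 ms budget for 30 fps. drawSensorGlobes() is an assumed stand-in.
#include <chrono>
#include <cstdio>

// Stand-in for the real visualization call: simulates some per-sensor drawing work.
void drawSensorGlobes(int sensorCount) {
    volatile double sink = 0.0;
    for (int i = 0; i < sensorCount * 1000; i++) sink += i * 0.5;
}

double averageFrameMillis(int sensorCount, int frames = 1000) {
    using clock = std::chrono::steady_clock;
    auto start = clock::now();
    for (int i = 0; i < frames; i++) drawSensorGlobes(sensorCount);
    std::chrono::duration<double, std::milli> elapsed = clock::now() - start;
    return elapsed.count() / frames;
}

int main() {
    int counts[] = {1, 10, 100};
    for (int n : counts) {
        double ms = averageFrameMillis(n);
        std::printf("%3d sensors: %.4f ms/frame (%s 33 ms budget)\n",
                    n, ms, ms <= 33.0 ? "within" : "over");
    }
}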
23
Client
◦ Software: OS Windows XP Service Pack 2; Language C/C++; Compiler Visual Studio 2005; Libraries ARToolkit, OpenGL
◦ Hardware: CPU Intel Core Duo 1.5 GHz; Memory 512 MB
Server
◦ Software: OS Windows XP Service Pack 2; Language JDK 6.0; Database MySQL
◦ Hardware: CPU Pentium M 1.70 GHz; Memory 1 GB
24
The results show that the system can be used in practice
Number of sensors: 1 / 10 / 100
Average time (msec): 3.2735 / 6.7921 / 29.7655
The system meets 30 fps (※ 30 fps = one frame takes 33 msec)
25
Participants ◦ 7 people (all male), aged 20 – 25 ◦ 3 of them had experience with environment-monitoring systems using sensor devices
Content of the evaluation ◦ read 10 pieces of sensor information in the environment ◦ answer a questionnaire after the operation
26
“LOFT in LOFT Project” evaluation space ◦ contains many sensors and devices (location, temperature, light) ◦ envisioned as a living room
27
Questionnaire results (scale: No → 1, Yes → 5; number of answers per score 1 / 2 / 3 / 4 / 5, then average)
◦ Do you want to use the system in daily life? 0 / 3 / 2 / 2 / 2, average 2.8571
◦ Do you want to use the system when you want to detect an abnormality? 0 / 3 / 1 / 4 / 2, average 4.1429
◦ Is the system useful for visualizing sensor information? 0 / 0 / 0 / 2 / 5, average 4.7142
◦ Is the system more useful than direct operation of a database system? 0 / 1 / 0 / 0 / 6, average 4.5714
◦ Is the system more useful than a Web-based system? 0 / 0 / 3 / 2 / 2, average 3.8571
◦ Is the system more useful than a mobile-phone and 2D-barcode interface? 0 / 0 / 0 / 1 / 6, average 4.8571
The results show that the system is useful compared with the other techniques
28
Research forum of Shonan Fujisawa Campus, Keio University (ORF 2007) ◦ Date: 22 – 23 Nov 2007 ◦ Location: Roppongi Hills, Tokyo, Japan
We demonstrated the system ◦ and received a lot of comments from visitors
29
Intuitive interface
◦ “I am less aware of the system's presence than when using a mobile phone”
◦ “The system is more intuitive and easier to use than ordinary mobile-phone interfaces”
◦ “It is fun to see sensor information change dynamically in real time”
Dissatisfaction with visual markers
◦ “The visual markers spoil the view”
◦ “Use an LED panel in place of the visual marker”
Requests for the user interface
◦ “At some camera angles it becomes difficult to see the information”
◦ “Something should always be displayed”
30
Enhancement of the filters ◦ implement more filters for various contexts ◦ select appropriate filters automatically according to the context
Sensor recognition methods other than visual markers ◦ visible light communication using LEDs
31
We developed “uMegane”, a sensor information visualization system
◦ it projects sensor information onto real-world images using AR technology
◦ it implements a filter mechanism for selecting the required information from a huge amount of sensor information
We ran a usability evaluation ◦ and confirmed the utility of the system
We demonstrated the system at ORF 2007