Takuya Imaeda Graduate School of Media and Governance, Keio University Hideyuki Tokuda Faculty of Environment and Information Studies, Keio University
“uMegane” visualizes sensor information using Augmented Reality (AR) technology ◦ Users can understand sensor information visually ◦ The filtering mechanism of our system lets users select the sensor information they need
Miniaturization of Sensor Devices ◦ Wireless Sensor Nodes → Realization of the Ubiquitous Computing Environment
It is difficult for ordinary users to get information from a sensor network ◦ e.g. accessing a database system ◦ Sensor information tends to become huge in volume. Users cannot – get information from the sensor in front of them – access the information they need
Existing approaches ◦ Direct access to the database ◦ Display on a map ◦ QR code + mobile phone + Web-based UI
Goal: constructing a system that enables users to access sensor information easily ◦ does not require special knowledge or great care from users (screenshot overlay: Temperature: 29 ℃, Humidity: 30 %)
Traceability application ◦ in a food store ◦ consumers can check the safety of food by reading sensor data ◦ there are many objects to check for good condition / proper temperature
Visualizing sensor information using AR technology. Filtering sensor information ◦ select the required information from a huge amount of sensor data
Augmented Reality ◦ overlaying digital information on images of the real world. We applied AR technology to the ubiquitous computing environment
Weapon technology ◦ displays airplane information on an HMD or HUD. Vehicles ◦ display information about the vehicle and the surrounding area on the windshield. So far, AR has mainly been of service in specific environments or situations
NaviCam ◦ Rekimoto, 1995 ◦ projects digital information on images of the real world ◦ covers static information such as calendars and bookshelves, but does not cover dynamic information such as sensor data. u-Photo ◦ Tokuda Lab, Keio University ◦ enables users to operate devices through the metaphor of a photograph, but does not cover dynamic information in real time
Overlaying sensor information on images of the real world (screenshot overlay: Temperature: 29 ℃, Humidity: 30 %)
Visual Marker ◦ merit: distinguishes uniform objects individually ◦ demerit: markers must be attached to objects. 3D Model ◦ merit: no need to change the real world ◦ demerit: needs a 3D model of each real-world object; recognizes uniform objects as the same object. Feature Point ◦ merit: no need for a real-world model ◦ demerit: unsuited to object recognition. This system needs to recognize uniform objects individually, so we apply the Visual Marker approach
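The slides do not show how a detected marker is tied to a sensor node; as an illustration only (all names are hypothetical, not the deck's actual code), each marker ID could be bound to one physical node through a simple lookup table, which is what lets visually identical objects stay distinguishable:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical binding: each visual-marker ID maps to exactly one physical
// sensor node, so two identical-looking objects are still told apart.
struct SensorBinding {
    std::string nodeId;    // e.g. the address of a uPart node (assumed format)
    std::string location;  // human-readable placement of the object
};

// Look up the sensor node bound to a detected marker ID; returns nullptr
// for markers the system does not know.
const SensorBinding* lookupSensor(const std::map<int, SensorBinding>& table,
                                  int markerId) {
    auto it = table.find(markerId);
    return it == table.end() ? nullptr : &it->second;
}
```

In a real client the marker ID would come from ARToolkit's marker detection rather than being passed in by hand.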
Visualizes sensor information as a 3D globe model ◦ Color: temperature (red = hot, blue = cold) ◦ Size: lighting intensity (big = bright, small = dark) ◦ Movement: acceleration (fast = fast, slow = slow) (screenshot)
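The globe mapping above could be sketched as follows. This is an illustration, not the deck's actual code: the value ranges (0–40 ℃, 0–1000 lux, 0–2 G) and all names are assumptions.

```cpp
#include <algorithm>
#include <cassert>

// Visual parameters of one sensor globe.
struct GlobeStyle {
    float red, blue;  // color: red = hot, blue = cold
    float radius;     // size: big = bright, small = dark
    float speed;      // movement: fast = high acceleration
};

// Clamp a reading into [lo, hi] and normalize to [0, 1].
static float normalize(float v, float lo, float hi) {
    return (std::min(std::max(v, lo), hi) - lo) / (hi - lo);
}

// Map temperature to a red-blue blend, lighting intensity to a radius in
// [0.05, 0.25], and acceleration to an animation speed in [0, 1].
GlobeStyle styleFor(float tempC, float lux, float accG) {
    GlobeStyle s;
    float t = normalize(tempC, 0.0f, 40.0f);
    s.red = t;          // hotter -> redder
    s.blue = 1.0f - t;  // colder -> bluer
    s.radius = 0.05f + 0.20f * normalize(lux, 0.0f, 1000.0f);
    s.speed = normalize(accG, 0.0f, 2.0f);
    return s;
}
```

The resulting style would then be fed to the OpenGL drawing code each frame.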
It is difficult to display all sensor information ◦ there is a huge amount of sensor data. The filtering mechanism switches the visibility of sensor information ◦ Real-time filter ◦ Time-machine filter ◦ Abnormal-state-detection filter
We implemented 3 filters in the prototype ◦ Real-time Filter: displays the latest sensor data ◦ Time-machine Filter: shows past sensor information; users can move back and forth along the time axis ◦ Abnormal-State-Detection Filter: shows only sensor data that falls outside a value range defined in advance by users
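A minimal sketch of the three filter predicates as described above; the types and function names are hypothetical, only the visibility logic follows the slide:

```cpp
#include <cassert>
#include <vector>

// One timestamped reading from a sensor node.
struct Reading {
    long timestamp;  // seconds since some epoch
    float value;
};

// Real-time filter: a reading is visible only if it is the latest one
// (assumes the history is sorted oldest first).
bool latestOnly(const std::vector<Reading>& rs, const Reading& r) {
    return !rs.empty() && r.timestamp == rs.back().timestamp;
}

// Time-machine filter: show the reading closest to, but not after, the
// user-chosen point t on the time axis; nullptr if none exists yet.
const Reading* atTime(const std::vector<Reading>& rs, long t) {
    const Reading* best = nullptr;
    for (const Reading& r : rs)
        if (r.timestamp <= t) best = &r;
    return best;
}

// Abnormal-state-detection filter: visible only when the value leaves the
// user-defined normal range [lo, hi].
bool isAbnormal(const Reading& r, float lo, float hi) {
    return r.value < lo || r.value > hi;
}
```

The filter switcher in the prototype would pick one of these predicates to decide which globes get drawn.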
User Device ◦ camera and HMD / display ◦ mobile phone. Object in the real world ◦ visual marker ◦ wireless sensor node. Sensor Information Delivery Server ◦ delivers sensor information to the user device through a wireless network
(System diagram: Wireless Sensor Node —wireless, sensor data→ Sync Node —USB→ Sensor Server —Internet, sensor data→ User Device; the user device drives an HMD over VGA and a WebCamera over USB)
(Software architecture of the user device: inputs — camera, accelerometer, user input (touch panel / mouse), and sensor data over Wi-Fi; modules — ARToolkit camera-pose analysis, acceleration analysis, user interface, sensor data manager, and a filter switcher holding the real-time, time-machine, and abnormal-state-detection filters; the visualization software module issues draw operations via OpenGL, or OpenGL ES on mobile phones)
1. Scalability investigation with increasing sensor volume 2. Usability evaluation
We measured how the time required to display sensor information changes as the number of sensors increases. Target data ◦ sensor data collected for a month ◦ data updated every minute ◦ 100 sensors. Goal ◦ 30 fps (1 frame takes 33 msec). Sensor ◦ uPart (TecO Lab, University of Karlsruhe, Germany): light, temperature, 1-dimensional acceleration
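The 30 fps goal amounts to a per-frame budget of 1000 / 30 ≈ 33 msec. A small sketch of how averaged display times per sensor count could be checked against that budget (function names are assumed; the sample values in the usage are synthetic, not the measured results):

```cpp
#include <cassert>
#include <numeric>
#include <vector>

// Per-frame budget for the 30 fps goal: 1000 ms / 30 frames ~= 33.3 ms.
const double kFrameBudgetMs = 1000.0 / 30.0;

// Average the measured display times (in msec) for one sensor count.
double averageMs(const std::vector<double>& samples) {
    return std::accumulate(samples.begin(), samples.end(), 0.0)
           / samples.size();
}

// True when the averaged display time still meets the 30 fps goal.
bool meetsGoal(const std::vector<double>& samples) {
    return averageMs(samples) <= kFrameBudgetMs;
}
```

In the actual experiment the samples would come from timing each frame while the number of tracked sensors is increased.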
Client: Software ◦ OS: Windows XP Service Pack 2 ◦ Language: C/C++ ◦ Compiler: Visual Studio 2005 ◦ Libraries: ARToolkit, OpenGL. Hardware ◦ CPU: Intel Core Duo 1.5 GHz ◦ Memory: 512 MB
Server: Software ◦ OS: Windows XP Service Pack 2 ◦ Language: JDK 6.0 ◦ Database: MySQL. Hardware ◦ CPU: Pentium M 1.70 GHz ◦ Memory: 1 GB
The results show that the system can work in practical use (graph: average display time in msec vs. number of sensors; the system stays within the 30 fps goal ※ 30 fps = 1 frame takes 33 msec)
Participants ◦ 7 people (7 male), ages 20–25 ◦ 3 of them had experience handling environment-monitoring systems using sensor devices. Content of evaluation ◦ read 10 pieces of sensor information in the environment ◦ answer questionnaires after the operation
Evaluation space: “LOFT in LOFT Project” ◦ contains many sensors and devices (location, temperature, light) ◦ envisioned as a living room
Questionnaire (scale: No → 1, Yes → 5) ◦ Do you want to use the system in daily life? ◦ Do you want to use the system when you want to detect an abnormality? ◦ Is the system useful for visualization of sensor information? ◦ Is the system more useful than direct operation of a database system? ◦ Is the system more useful than a Web-based system? ◦ Is the system more useful than a mobile-phone + 2D-barcode interface? The results show that the system is useful compared with the other techniques
Research forum (ORF 2007) of Shonan Fujisawa Campus, Keio University ◦ Date: 22–23 Nov 2007 ◦ Location: Roppongi Hills, Tokyo, Japan. We demonstrated the system ◦ got a lot of comments from visitors
Intuitive interface ◦ “I am less aware of the system’s existence than with a mobile phone” ◦ “The system is more intuitive and easier to use than ordinary mobile-phone interfaces” ◦ “It is fun to see sensor information change dynamically in real time”. Dissatisfaction with visual markers ◦ “Visual markers spoil the view” ◦ “Use an LED panel in place of the visual markers”. Requests about the user interface ◦ “At some camera angles it becomes difficult to see the information” ◦ “Always display something”
Enhancement of filters ◦ implement more filters for various contexts ◦ select appropriate filters automatically according to context. Sensor recognition methods other than visual markers ◦ visible light communication using LEDs
We developed “uMegane”, a sensor information visualization system ◦ projects sensor information onto real-world images using AR technology ◦ implements a filter mechanism for selecting the required information from a huge amount of sensor data. We ran a usability evaluation ◦ confirmed the utility of the system. We demonstrated the system at ORF 2007