Integrating Active Tangible Devices with a Synthetic Environment for Collaborative Engineering
Sandy Ressler, Brian Antonishek, Qiming Wang, Afzal Godil (National Institute of Standards and Technology)
Presented by Jared Freeland, DAS FA CIS 4930

Abstract of the Abstract: This paper describes the creation of an environment for collaborative engineering in which the goal is to improve the user interface by combining haptic manipulation with synthetic environments. The system outlined here combines some of what Dr. Fishwick discussed on Wednesday with ideas from Scott's presentation on "Real Reality".

Introduction: The immediate goal is to determine the feasibility of using a tangible interface with a multiuser VRML environment, as applied to collaborative engineering. By "tangible" we mean the ability to pick up and interact with actual physical objects that are represented in the virtual environment.

Introduction: A secondary goal of the project was to use as much off-the-shelf software and hardware as possible, to ease transfer of the technology into the commercial world. The mediation hub is written in Java; the VE uses a commercial system, the blaxxun Community Platform; and the tangible devices are off-the-shelf, configurable LEGO Mindstorms robots.

System Overview: The overall environment is conceptually simple. Two collaborating engineers in geographically separate locations wish to manipulate and discuss a construction project. Recent work at NIST has demonstrated that VRML can represent rich construction environments, but manipulating elements such as a virtual excavator is awkward.

System Overview: Control panels with many buttons and sliders are functional but can be difficult to manipulate. The proposed answer is direct haptic manipulation, which should make interaction more intuitive: users move the tangible excavator and adjust its rotatable arm, causing the virtual "mirror" to update.

System Overview: The core of the system is the Java-based Virtual Environment Device Integration server, or JVEDI, which acts as a hub connecting all of the system's components. The server runs as a stand-alone Java application on the host computer.
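
The slides do not include any server code, so the following is only a minimal sketch of the kind of hub JVEDI is described as being: a stand-alone Java application that accepts socket connections from the other components (vision, speech, robot control, VRML browser) and rebroadcasts each line-oriented message to every other client. The class name, port number, and message format are illustrative assumptions, not the actual JVEDI interfaces.

    import java.io.*;
    import java.net.*;
    import java.util.*;

    // Hypothetical sketch of a JVEDI-style hub: each client connects over TCP,
    // and every text line received from one client is rebroadcast to the others.
    public class HubSketch {
        private static final List<PrintWriter> clients =
                Collections.synchronizedList(new ArrayList<PrintWriter>());

        public static void main(String[] args) throws IOException {
            ServerSocket server = new ServerSocket(9000);   // port is an assumption
            while (true) {
                final Socket sock = server.accept();
                final PrintWriter out = new PrintWriter(sock.getOutputStream(), true);
                clients.add(out);
                new Thread(new Runnable() {
                    public void run() {
                        try {
                            BufferedReader in = new BufferedReader(
                                    new InputStreamReader(sock.getInputStream()));
                            String line;
                            while ((line = in.readLine()) != null) {
                                broadcast(line, out);       // e.g. "red 0.42 0.17 95.0"
                            }
                        } catch (IOException e) {
                            // client disconnected; fall through to cleanup
                        } finally {
                            clients.remove(out);
                        }
                    }
                }).start();
            }
        }

        // Forward a message to every connected client except its sender.
        private static void broadcast(String msg, PrintWriter sender) {
            synchronized (clients) {
                for (PrintWriter w : clients) {
                    if (w != sender) w.println(msg);
                }
            }
        }
    }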

System Overview, The Real Environment: Two work surfaces (i.e., tables), each with a LEGO Mindstorms robot on it. Above each surface a video camera looks down at the work area, providing 2D position and orientation.

System Overview, The Virtual Environment: A multiuser blaxxun environment displays the combination of both (or all) physical environments. A simplified user interface consisting of buttons and arrows is included for collaborators who do not have access to an actual robot.

Interesting Points of the System: Unlike "graspable" interfaces, this system does not use a haptic glove or data glove of any kind. Instead, a camera tracks the robots' movements, giving the user complete, unrestricted control of the robots.

Interesting Points of the System The virtual and real environments are kept synchronized. They always mirror each other. This is accomplished by always using position values reported by the video system.

Integration Issues: The most challenging aspect of creating the work environment was integrating all the processing elements. Functionality for controlling the robots and for reading and writing position-tracking data had to be built on top of VRML's External Authoring Interface (EAI). A fully configured version of the environment requires up to six separate computers.
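
The EAI glue is described only at this level in the slides; the sketch below shows roughly how a tracked pose could be pushed into the VRML scene from Java through the classic External Authoring Interface. The node name "RED_ROBOT", the assumption that each robot is a named Transform node, and the exact EAI package names (which varied slightly between VRML browsers) are all assumptions for illustration.

    import vrml.external.Browser;
    import vrml.external.Node;
    import vrml.external.field.EventInSFVec3f;
    import vrml.external.field.EventInSFRotation;

    // Sketch only: push a tracked robot pose into the VRML world via the EAI.
    public class RobotUpdater {
        private final EventInSFVec3f setTranslation;
        private final EventInSFRotation setRotation;

        public RobotUpdater(Browser browser) {
            Node robot = browser.getNode("RED_ROBOT");   // hypothetical DEF name
            setTranslation = (EventInSFVec3f) robot.getEventIn("set_translation");
            setRotation    = (EventInSFRotation) robot.getEventIn("set_rotation");
        }

        // x and z come from the 2D vision tracker; theta is the heading in radians.
        public void update(float x, float z, float theta) {
            setTranslation.setValue(new float[] { x, 0.0f, z });
            setRotation.setValue(new float[] { 0.0f, 1.0f, 0.0f, theta });  // rotate about Y
        }
    }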

Major Components, Vision Processing: The position and orientation of the LEGO robots are computed in real time using a computer vision method based on color tracking. The vision program uses an inexpensive camera and can track multiple robots at 10 frames per second.

Major Components, Vision Processing: To track a LEGO robot, two differently colored cards are attached to it. The computer vision program uses a probability distribution to find the centers of the two colored squares; the mean of the two centers is the robot's position, and the orientation is the arctangent of the difference between the centers.
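
As a worked version of that last step (the method and variable names are made up for illustration), the pose follows directly from the two card centers:

    // (fx, fy) is the centre of the front-colour card and (bx, by) the centre of
    // the back-colour card, both in image coordinates.
    class PoseFromCards {
        static double[] robotPose(double fx, double fy, double bx, double by) {
            double x = (fx + bx) / 2.0;                     // position = mean of the two centres
            double y = (fy + by) / 2.0;
            double heading = Math.atan2(fy - by, fx - bx);  // orientation from the centre difference
            return new double[] { x, y, heading };
        }
    }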

Major Components, Speech Input: A user who is moving robots cannot easily reach a keyboard, and it can also be necessary to move robots on two surfaces simultaneously. Voice commands such as "forward", "backward", "left", "right", "select red", and "select blue" were therefore built in.
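
The slides do not say which recognition engine was used. Assuming any recognizer that delivers the spoken phrase as a plain string, dispatching commands to the hub could look roughly like the sketch below; the "select" handling and the message strings are assumptions for illustration.

    // Hypothetical mapping from recognized phrases to hub messages. "select red"
    // and "select blue" change which robot subsequent movement commands address.
    class VoiceDispatcher {
        private String selected = "red";
        private final java.io.PrintWriter hub;   // connection to the JVEDI-style hub

        VoiceDispatcher(java.io.PrintWriter hub) { this.hub = hub; }

        void onPhrase(String phrase) {
            if (phrase.startsWith("select ")) {
                selected = phrase.substring("select ".length());
            } else {
                hub.println(selected + " " + phrase);   // e.g. "red forward"
            }
        }
    }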

Major Components, Multiuser VRML: The multiuser aspect of the VRML world is handled by commercially available software from blaxxun (the blaxxun Community Platform). The first part is the virtual world, which includes the two LEGO robots (red and blue), the floor, and a cylinder; the second part is the control panel.

Auxiliary Processing, Collision Detection: Suppose the robots on two separate work surfaces collide virtually. Collision detection is performed by the VRML world, so knowledge of the collision exists only in the VE. The physical robots light up and beep when they hit something in the virtual world.
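
Because the collision is known only inside the VE, something has to relay it back out to the physical robots. The sketch below is one hypothetical way to do that: an EAI eventOut observer watching a VRML Collision node and forwarding a command to the hub. The node name "ROBOT_COLLIDER", the use of collideTime, and the hub message format are all assumptions, and EAI signatures varied between browsers.

    import vrml.external.Browser;
    import vrml.external.Node;
    import vrml.external.field.EventOut;
    import vrml.external.field.EventOutObserver;

    // Sketch only: when the VRML Collision node reports a collision, send a
    // message the robot-control client can turn into "light up and beep".
    public class CollisionRelay implements EventOutObserver {
        private final java.io.PrintWriter hub;

        public CollisionRelay(Browser browser, java.io.PrintWriter hub) {
            this.hub = hub;
            Node collider = browser.getNode("ROBOT_COLLIDER");   // hypothetical DEF name
            collider.getEventOut("collideTime").advise(this, null);
        }

        public void callback(EventOut value, double timeStamp, Object userData) {
            hub.println("red beep");   // illustrative message format
        }
    }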

Auxiliary Processing, Task Commands and Recording: Small programmatic tasks, such as movement patterns, were created for the robots. Additionally, functionality was added that lets users record the robots' movements for later playback.
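
The recording feature is not detailed in the slides; a minimal way to implement record-and-replay on top of the hub's message stream might look like the following, with the class, method names, and message format assumed for illustration.

    import java.util.ArrayList;
    import java.util.List;

    // Sketch: store timestamped hub messages while recording, then reissue
    // them with approximately the original timing on replay.
    class MovementRecorder {
        private static class Entry {
            final long t; final String msg;
            Entry(long t, String msg) { this.t = t; this.msg = msg; }
        }
        private final List<Entry> log = new ArrayList<Entry>();
        private long start;

        void startRecording() { log.clear(); start = System.currentTimeMillis(); }

        void record(String msg) { log.add(new Entry(System.currentTimeMillis() - start, msg)); }

        void replay(java.io.PrintWriter hub) throws InterruptedException {
            long t0 = System.currentTimeMillis();
            for (Entry e : log) {
                long wait = e.t - (System.currentTimeMillis() - t0);
                if (wait > 0) Thread.sleep(wait);
                hub.println(e.msg);                 // e.g. "red forward"
            }
        }
    }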

Links: The JVEDI code is publicly available online. More on the robots at legomindstorms.com; more on the blaxxun Community at blaxxun.com.

Discussion: How does this project relate or compare to what Scott discussed on Wednesday? What are the advantages and disadvantages? What could or should be improved?