Next Generation 4-D Distributed Modeling and Visualization of Battlefield (Avideh Zakhor, UC Berkeley)


Next Generation 4-D Distributed Modeling and Visualization of Battlefield
Avideh Zakhor, UC Berkeley

MURI on Mobile Augmented Battlespace Visualization
Participants:
- Avideh Zakhor (UC Berkeley)
- Bill Ribarsky (Georgia Tech)
- Ulrich Neumann (USC)
- Pramod Varshney (Syracuse)
- Suresh Lodha (UC Santa Cruz)

Battlefield Visualization
- A detailed, timely, and accurate picture of the modern battlefield is vital to the military.
- Many sources of information: eyewitness reports, aerial photographs, sonar, Synthetic Aperture Radar (SAR), Multi-Spectral Imaging (MSI), Hyper-Spectral Imaging (HSI), Foliage PENetration (FOPEN) radar, Electro-Optic (EO), Infra-Red (IR), and Moving Target Imaging (MTI).

Major Challenges
- Disparate and often conflicting sources of information must be combined.
- It is impossible for one individual to collect and comprehend all of it.
- Each information source requires its own specially trained technicians.
- The effectiveness of information combining and fusion is determined by its usability.
- Information overload must be avoided when presenting the data.

Historical Perspective
- Sand box:
  - A box filled with sand, shaped to replicate the battlespace terrain.
  - Commanders moved small physical replicas of battlefield objects around to direct the situation.

Historical Perspective
- Paper maps and acetate:
  - As intelligence arrives, technicians use grease pencils to mark the new information on acetate overlays.
  - Commanders draw on the acetate to plan battlefield situations.
  - Time consuming: several hours to print, distribute, and update.
  - Many opportunities for introducing errors.

Historical Perspective
- Joint Maritime Command Information System (JMCIS):
  - Computerized battlespace visualization.
  - Two fundamental limitations:
    - Clutter when displaying too much information.
    - Two-dimensional display: 3D information is lost.

Historical Perspective
- Responsive Workbench with Dragon:
  - Inherently 3D.
  - A workstation renders 3D imagery, back-projected onto a horizontal screen.
  - Users in the viewing area interact with the bench through a 3D mouse, pinch gloves, and speech recognition.
  - Stereographic display with LCD shutter glasses.

Improvements over WRB/Dragon
- The workbench is not suitable for a mobile soldier with a PDA.
- Augmented reality can enhance Dragon/WRB: sensors distributed among the soldiers can be used both to navigate and to update the visualization database.
- Need to deal with uncertainty: represent, compute, and visualize uncertainty information without cluttering the display.
- Time should become the 4th dimension: 4D model construction, with the ability to play back and visualize the last 24 hours.

Agile, Mobile, Collaborative Testbed
- A networked, collaborative whole-Earth terrain database linking workstations, large projected displays, and mobile handheld systems.
- Mobile users carry systems with handheld or augmented displays providing 3D terrain visualizations.
- Mobile users receive, record, and transmit information about the world.
- Users of stationary 3D displays collect and evaluate information provided by mobile users and route data and intelligence back to them.
- Collaboration is through annotation of the virtual geo-spatial database.

Wireless Architecture: Agile, Mobile, Collaborative Testbed
[Architecture diagram: a central terrain database connects over terrain datalinks to the NAVE and Virtual Workbench, and over a wireless terrain datalink to mobile displays; collaboration channels carry position, terrain markup, route finding, weather, friendly/foe advisories, etc.]
- Mobile displays are equipped with GPS, bearing, and tilt sensors and wireless data.
- Local databases download and cache data according to the bandwidth, movement, and rendering speed of each platform.
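The bandwidth-driven caching on this slide can be sketched in miniature. The function, tile sizes, and distances below are all hypothetical; a real client would also weigh movement direction and rendering speed, as the slide notes:

```python
# Minimal sketch (hypothetical names): choosing which terrain tiles a
# mobile platform downloads under a per-update bandwidth budget.

from dataclasses import dataclass

@dataclass
class Tile:
    tile_id: str
    dist_km: float   # distance from the platform's predicted position
    size_kb: int     # payload size of the tile

def plan_downloads(tiles, budget_kb):
    """Greedily fetch the nearest tiles that still fit the budget."""
    plan, used = [], 0
    for t in sorted(tiles, key=lambda t: t.dist_km):
        if used + t.size_kb <= budget_kb:
            plan.append(t.tile_id)
            used += t.size_kb
    return plan

tiles = [Tile("A", 0.5, 400), Tile("B", 2.0, 300), Tile("C", 0.1, 250)]
print(plan_downloads(tiles, budget_kb=700))  # → ['C', 'A']
```

Nearest-first is only one reasonable priority; a fielded system would fold in heading and render speed per platform.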

Research Agenda
- Model construction: models are initially constructed by registering sensor imagery to reference imagery, maps, elevation data, etc. Four-dimensional: time is treated on the same footing as space.
- Model update: distributed mobile or stationary users contribute to updating the database via their sensors.
- Mobile, real-time visualization, interaction, and navigation within the database; augmented reality sensor tracking and registration are required.
- Uncertainty processing for model construction and update, as well as uncertainty visualization.

Technical Challenges
- Tracking & registration
- Database visualization
- Model update & construction
- Uncertainty processing
- Uncertainty visualization

3-D and 4-D Model Construction
- Develop a framework for fast, automatic, and accurate 3D model construction for objects, scenes, rooms, buildings (interior and exterior), urban areas, and cities; incorporate the time element (4-D).
- Models must be easy to compute, compact to represent, and suitable for insertion into large hierarchical visualization databases, to facilitate high-quality view synthesis and visualization from views that were not necessarily captured during the data collection process.
- Strategy:
  - Fusion of multiple data sources: intensity, range, GPS, panoramic cameras.
  - Incorporate a priori models, e.g. CAD, DEM, DTED, elevation data, maps.

Geo-registration and Tracking
- Develop techniques for unencumbered, wide-area, real-time tracking for (a) augmentation and (b) visualization.
- Strategy: estimate real-time 6-DOF pose by fusing multiple data streams with variable uncertainty: GPS, vision, inertial gyros, accelerometers, compass, and laser range finders.
- A real-time prototype will use the resulting tracking algorithms on an augmented reality PDA to geo-register and navigate the user within the geo-spatial database.
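The core idea of fusing streams with variable uncertainty can be illustrated with a scalar Kalman update that blends an inertial prediction with a noisy GPS fix. This is a one-dimensional textbook sketch, not the project's tracker, which estimates full 6-DOF pose from many sensors:

```python
# Minimal sketch: scalar Kalman measurement update. The filter weights
# the inertial prediction and the GPS measurement by their variances,
# so the more certain source dominates the fused estimate.

def kalman_update(x_pred, p_pred, z_gps, r_gps):
    """Fuse a predicted state (x_pred, variance p_pred) with a GPS
    measurement z_gps of variance r_gps."""
    k = p_pred / (p_pred + r_gps)      # Kalman gain: prediction vs. GPS trust
    x = x_pred + k * (z_gps - x_pred)  # corrected estimate
    p = (1.0 - k) * p_pred             # posterior variance shrinks
    return x, p

# Inertial dead reckoning says 10.0 m (variance 4.0); GPS reads 12.0 m
# (variance 1.0): the estimate is pulled toward the more certain GPS.
x, p = kalman_update(10.0, 4.0, 12.0, 1.0)
print(round(x, 2), round(p, 2))  # → 11.6 0.8
```

The same weighting generalizes to the vector case (6-DOF pose) with matrix covariances.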

Mobile Visualization
- VGIS: a general framework for global geo-spatial data and visualization.
- Hierarchical, scalable data structures with fast access that include (a) time, (b) uncertainty, and (c) all varied products.
- Dynamic data structures with fast update: real-time data is placed in a dynamic cache until the system finds time to integrate it with the online database.
- Automated detail management for (a) uncertainty and (b) all visual products.
- Intelligent retrieval and visual data mining.
- Multi-modal interaction in multiple environments.
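The hierarchical, scalable structure with automated detail management can be illustrated with a toy quadtree refinement: terrain near the viewer is served at fine resolution, distant terrain stays coarse. The refinement threshold and function names below are assumptions for illustration, not VGIS internals:

```python
# Minimal sketch: distance-driven quadtree level-of-detail selection.
# A tile refines into four children only while the viewer is close
# relative to the tile's size.

def select_tiles(x, y, size, viewer, depth, max_depth, out):
    """Collect (x, y, size) leaf tiles, refining near the viewer."""
    cx, cy = x + size / 2, y + size / 2
    dist = ((viewer[0] - cx) ** 2 + (viewer[1] - cy) ** 2) ** 0.5
    if depth < max_depth and dist < 2 * size:   # assumed refinement rule
        h = size / 2
        for ox, oy in ((0, 0), (h, 0), (0, h), (h, h)):
            select_tiles(x + ox, y + oy, h, viewer, depth + 1, max_depth, out)
    else:
        out.append((x, y, size))
    return out

tiles = select_tiles(0, 0, 1024, viewer=(100, 100), depth=0, max_depth=4, out=[])
sizes = sorted({s for _, _, s in tiles})
print(sizes)  # small tiles near the viewer, large tiles far away
```

The leaves still tile the whole region exactly, so total coverage is preserved while rendering cost concentrates where the viewer is.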

Multi-modal Interaction in Multiple Environments
Environments:
- Desktop
- Laptop
- Mobile (palm, HMD, larger handheld, or wearable)
- Virtual Workbench
- Liveboard
- NAVE
- Others
Interaction modes:
- Mouse
- Joystick or Spaceball
- Pen-based or touch screen
- Wired 3D or wireless 3D
- Head tracking, hand tracking, gesture recognition
- Voice
- Others
Open questions: What are the environmental affordances? What are user needs? Is 3D interaction necessary? How best to navigate large virtual spaces (2D or 3D)? How to most effectively combine modes? How to evaluate?

Technical Challenges: Uncertainty Computation and Visualization

Uncertainty Processing
- Representation and computation issues: many formalisms (probability, possibility, evidence theory) and transformations between them.
- Fusion at the data, feature, and decision levels:
  - Uncertainty-aware fusion algorithms for dynamic distributed networks; various topologies for fusion networks: serial, parallel, tree, and non-tree feedback networks.
  - Decentralized statistical inference algorithms.
- Time-critical computation and quality-of-service issues:
  - Data continually arrives, requiring re-computation.
  - Tradeoff between precision and speed of uncertainty computation: fast yet imprecise answers.
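A minimal instance of uncertainty-aware data-level fusion is inverse-variance weighting of independent Gaussian reports: each sensor's estimate is weighted by how certain it is, and the fused variance is never worse than the best single sensor. This is a textbook special case, not the project's fusion network:

```python
# Minimal sketch: inverse-variance-weighted fusion of independent
# Gaussian estimates (mean, variance) of the same quantity.

def fuse(estimates):
    """Fuse (mean, variance) pairs; returns the fused (mean, variance)."""
    weights = [1.0 / var for _, var in estimates]
    mean = sum(w * m for w, (m, _) in zip(weights, estimates)) / sum(weights)
    var = 1.0 / sum(weights)          # always <= the smallest input variance
    return mean, var

# A precise radar fix and a noisier report of a target's range (km):
mean, var = fuse([(10.0, 1.0), (14.0, 4.0)])
print(round(mean, 2), round(var, 2))  # → 10.8 0.8
```

Decision-level fusion and non-Gaussian formalisms (possibility, evidence theory) replace this weighting with their own combination rules, but the trade of certainty against influence is the same.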

Uncertainty Visualization and Validation
- Present uncertainty in an intuitive, uncluttered way.
- Display devices: screen space, mobility.
- Modality: vision, audio, haptics.
- Data types: scalar/vector/tensor, discrete/continuous, static/dynamic.
- Uncertainty visualization techniques: glyphs, deformation, transparency, texture, superimposing/backgrounding, augmented reality.
- Validation with novice and expert users: tasks for the mobile battlefield; measure the accuracy and speed performance of users; conduct statistical analysis.
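One of the techniques listed, transparency, can be sketched as a simple mapping from an uncertainty value to display opacity, so unreliable data fades out instead of cluttering the view. The linear ramp and parameter names here are illustrative assumptions:

```python
# Minimal sketch: map normalized uncertainty to alpha (opacity), so
# certain data renders opaque and highly uncertain data nearly fades out.

def uncertainty_to_alpha(u, u_max=1.0, alpha_min=0.1):
    """Linearly fade opacity from 1.0 (certain) down to alpha_min."""
    u = max(0.0, min(u, u_max))               # clamp to the valid range
    return 1.0 - (1.0 - alpha_min) * (u / u_max)

for u in (0.0, 0.5, 1.0):
    print(u, round(uncertainty_to_alpha(u), 2))  # → 1.0, 0.55, 0.1
```

A floor of alpha_min keeps very uncertain items faintly visible rather than silently hidden; nonlinear ramps or glyph-based encodings are alternatives when transparency alone is too subtle.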

Proof of Concept: Multi-modal Interactions for the PDA
- Extend interaction methods for workstations to the PDA.
- Wired (electromagnetic) and unwired (vision) interaction modes; hand gestures.
- Visible and infrared sensors added to the PDA.
- Demonstrate a hybrid vision/inertial tracking PDA; demonstrate VGIS visualization in a portable display context.
- Multiple mobile users collaborate on identifying and locating features or targets.

Facilities and Equipment
- Georgia Tech: GVU Future Computing Lab (FCL); mobile augmented test-bed.
- USC: IMSC lab; CGIT lab.
- UCSC: VizLab.
- Syracuse: Sensor Fusion Laboratory (SFL).
- Berkeley: Video and Image Processing (VIP) Lab.

Collaboration and Management Issues
- Dr. Hassan Foroosh, currently with the University of Maryland, will be the technical liaison among the five universities.
- Periodic telephone conferences among team members to discuss the status of the project.
- Live seminars broadcast via video conferencing among the five campuses, once every two months.
- Annual workshops and retreats for PIs, graduate students, and all other researchers to exchange ideas.

Transitions
- Air Force Research Lab (Graniero)
- Army Research Lab (Emmerman, Tocarcik)
- Navy Research Lab (Rosenblum)
- Lawrence Livermore Lab (Ucelton)
- Sun Microsystems (Sowizraj)
- Intel (Liang)
- Planet 9 (Colleen)
- Geometrix (Zwern)
- Hughes Research Lab (Azuma)

Cross Collaboration (X = lead, x = contributor)

                            UCB   USC   G.T.   SYR   UCSC
Model const. & update        X     x     x      x
Tracking & reg.              x     X     x      x
Mobile visual. database      x     x     X      x     x
Uncertain. processing              x     x      X     x
Uncertain. visualization                 x      x     X

Outline of Talks
- 3D model construction for visualization (UC Berkeley)
- Geo-registration and tracking for augmentation and visualization (USC)
- Mobile visualization in a dynamic, augmented battlespace (Georgia Tech)
- Uncertainty processing and information fusion (Syracuse)
- Uncertainty visualization and validation (UC Santa Cruz)