1
DESIGNING 3D WEB USING VRML
Bob Hobbs
Introduction to Virtual Reality and Simulation
CE00166-1
2
Development Cycle
3
Target Platform
To establish your target computer platform you must ask yourself the following questions:
1. Who will be using my Synthetic Environment?
2. What hardware and software are they likely to be using?
3. How will my Synthetic Environment be delivered to them?
4. When will my Synthetic Environment be finished?
4
Understand the Features & Limitations of the Technology
General limits:
- Polygon count
- Textures
- Lights
- Sounds
- File size
VRML-specific limits:
- No shadow casting
- No reflections
- Performance
- Plug-in compatibility/compliance
5
Storyboard and Project Documentation
- Target platform
- Preliminary research
- Storyboard sketches
- Interactivity
- User interface
- Resource requirements
- Project timeline and milestones
6
Importing From Modeling Software
- Find a suitable model
- Modify it
- Reduce the polygon count
7
Edit Appearance
Material properties:
- Diffuse - colour of the object
- Specular - highlight colour on shiny objects
- Emissive - glow colour
- Ambient intensity
- Shininess - changes the size of the highlight
- Transparency
Apply textures - supported file formats:
- PNG - Portable Network Graphics
- JPEG - Joint Photographic Experts Group
- GIF - Graphics Interchange Format
- MPEG-1 - Moving Picture Experts Group
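A minimal sketch of these appearance settings in VRML; the colour values and the texture file name brick.png are placeholders chosen only for illustration:

    #VRML V2.0 utf8
    Shape {
      appearance Appearance {
        material Material {
          diffuseColor     0.8 0.2 0.2   # base colour of the object
          specularColor    1.0 1.0 1.0   # highlight colour on shiny surfaces
          emissiveColor    0.0 0.0 0.0   # glow colour (self-illumination)
          ambientIntensity 0.2           # how much ambient light is reflected
          shininess        0.6           # higher value = smaller, tighter highlight
          transparency     0.0           # 0 = opaque, 1 = fully transparent
        }
        texture ImageTexture { url "brick.png" }   # placeholder PNG texture
      }
      geometry Box { size 2 2 2 }
    }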
8
Assemble Scene
Modular assembly:
- Time-consuming
- Allows behaviours to be attached more easily
- Use DEF to allow cloning
- Use a scene-tree methodology
Using Inlines:
- Uses existing models
- Scaling
- Cannot animate sub-assemblies
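A short sketch of both approaches: DEF names a part once and USE clones it, while Inline pulls in an existing model file. The file name chassis.wrl is a hypothetical example:

    #VRML V2.0 utf8
    # Modular assembly: DEF the part once, then clone it with USE
    DEF WHEEL Shape {
      appearance Appearance { material Material { diffuseColor 0.1 0.1 0.1 } }
      geometry Cylinder { radius 0.5 height 0.3 }
    }
    Transform { translation  2 0 0  children USE WHEEL }
    Transform { translation -2 0 0  children USE WHEEL }

    # Inline: reuse an existing model file (its sub-assemblies cannot be animated)
    Transform {
      scale 0.5 0.5 0.5                  # scale the whole inlined model
      children Inline { url "chassis.wrl" }
    }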
9
Environmental Objects
Background:
- Texture
- Gradient
- Single colour
Adding lights:
- No shadows
- More faces - better shading
- Limit the number of lights
- Different types of lighting
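As a sketch, a gradient background might be set up with the Background node as below; a single skyColor value gives a solid-colour background instead. The colours and angles here are arbitrary:

    #VRML V2.0 utf8
    Background {
      skyColor    [ 0.2 0.2 0.6, 0.8 0.8 1.0 ]   # gradient from zenith to horizon
      skyAngle    [ 1.571 ]                      # angle (radians) where the blend ends
      groundColor [ 0.1 0.3 0.1, 0.3 0.5 0.3 ]
      groundAngle [ 1.309 ]
    }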
10
Lighting
- Directional light
- Point light
- Spot light
- Headlight - always on (deactivate it after adding your own lighting)
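A minimal sketch of the three light types, with the headlight switched off via NavigationInfo; the positions and intensities are illustrative values only:

    #VRML V2.0 utf8
    NavigationInfo { headlight FALSE }   # turn the browser headlight off once scene lights exist

    DirectionalLight {                   # parallel rays, like sunlight
      direction 0 -1 -1
      intensity 0.8
    }
    PointLight {                         # radiates in all directions from a point
      location 0 3 0
      radius   15
    }
    SpotLight {                          # cone of light
      location    0 5 0
      direction   0 -1 0
      beamWidth   0.5
      cutOffAngle 0.8
    }
    Shape {                              # something for the lights to illuminate
      appearance Appearance { material Material { diffuseColor 0.8 0.8 0.8 } }
      geometry Sphere { radius 1 }
    }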
11
Add Animation
- Key-frame animation
- Interpolation
- Tweening
- Storyboard
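In VRML, key-frame animation is built from a TimeSensor driving an interpolator, which in turn drives the animated node; the browser tweens between the key values. A minimal looping sketch (the path values are arbitrary):

    #VRML V2.0 utf8
    DEF BALL Transform {
      children Shape {
        appearance Appearance { material Material { diffuseColor 1 0 0 } }
        geometry Sphere { radius 0.5 }
      }
    }
    DEF CLOCK TimeSensor { cycleInterval 4 loop TRUE }   # 4-second looping cycle
    DEF PATH PositionInterpolator {
      key      [ 0.0, 0.5, 1.0 ]                 # key frames (fractions of the cycle)
      keyValue [ 0 0 0,  0 2 0,  0 0 0 ]         # positions at those key frames
    }
    ROUTE CLOCK.fraction_changed TO PATH.set_fraction   # tweening between key frames
    ROUTE PATH.value_changed     TO BALL.set_translation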
12
Complex Interactions
- Create drag sensors
- Create Switch nodes
- Scripting
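A sketch of a drag sensor: a PlaneSensor placed in the same group as the geometry it controls lets the user drag that geometry in the local X-Y plane. The colours and sizes are placeholders:

    #VRML V2.0 utf8
    Group {
      children [
        DEF DRAG PlaneSensor { }               # click-and-drag in the local X-Y plane
        DEF TILE Transform {
          children Shape {
            appearance Appearance { material Material { diffuseColor 0 0.6 1 } }
            geometry Box { size 1 1 0.1 }
          }
        }
      ]
    }
    ROUTE DRAG.translation_changed TO TILE.set_translation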
13
Sensors
- TouchSensor - the user clicks on, or rolls the mouse pointer over, an object to trigger an event
- PlaneSensor - the user clicks and drags to move an object in the local X-Y plane (left-right, up-down)
- CylinderSensor - the user clicks and drags to rotate an object around its Y (up-down) axis
- SphereSensor - the user clicks and drags to rotate an object freely around all axes
- ProximitySensor - an event is triggered when the user enters a specified zone
- VisibilitySensor - an event is triggered when an object enters or exits the user's view
- TimeSensor - when activated by one of the above sensors or a script, the TimeSensor outputs the absolute time plus fractional values between 0.0 and 1.0 for a specified duration. It is used to "drive" animation interpolators.
- Collision* - not technically a sensor at all but a special grouping node. It does, however, generate output events when the user collides with the group, so it is worth mentioning with the other sensors.
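As a sketch of the usual wiring, a TouchSensor can start a TimeSensor, which then drives an interpolator (here a door swing, with arbitrary geometry and timing):

    #VRML V2.0 utf8
    Group {
      children [
        DEF TOUCH TouchSensor { }                 # fires touchTime when the door is clicked
        DEF DOOR Transform {
          children Shape {
            appearance Appearance { material Material { diffuseColor 0.6 0.4 0.2 } }
            geometry Box { size 1 2 0.1 }
          }
        }
      ]
    }
    DEF TIMER TimeSensor { cycleInterval 2 }      # one-shot, 2-second animation
    DEF SWING OrientationInterpolator {
      key      [ 0, 1 ]
      keyValue [ 0 1 0 0,  0 1 0 1.571 ]          # rotate 90 degrees about Y
    }
    ROUTE TOUCH.touchTime        TO TIMER.set_startTime
    ROUTE TIMER.fraction_changed TO SWING.set_fraction
    ROUTE SWING.value_changed    TO DOOR.set_rotation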
14
Scripting
Required when:
- keeping track of an object's state (e.g. is the door open or closed?)
- the interactions of one object are conditional on the state of another (IF... THEN... ELSE type statements, etc.)
- any time mathematics is required (e.g. PI, random number generation, a sine-wave animation path, etc.)
- custom nodes must be created at runtime (e.g. build me a spiral staircase 3 metres high, etc.)
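A sketch of the first case, tracking whether a door is open or closed: a Script node holds the state and fires one of two TimeSensors (which would drive open/close interpolators, not shown). This assumes the plug-in accepts ECMAScript via a "javascript:" url (some browsers use "vrmlscript:"):

    #VRML V2.0 utf8
    Group {
      children [
        DEF TOUCH TouchSensor { }
        Shape { geometry Box { size 1 2 0.1 } }
      ]
    }
    DEF OPEN_TIMER  TimeSensor { cycleInterval 2 }   # would drive an "open" interpolator
    DEF CLOSE_TIMER TimeSensor { cycleInterval 2 }   # would drive a "close" interpolator
    DEF DOOR_LOGIC Script {
      eventIn  SFTime clicked
      eventOut SFTime openNow
      eventOut SFTime closeNow
      field    SFBool isOpen FALSE                   # the state being tracked
      url "javascript:
        function clicked(value) {
          if (isOpen) { closeNow = value; }          // start the closing animation
          else        { openNow  = value; }          // start the opening animation
          isOpen = !isOpen;
        }"
    }
    ROUTE TOUCH.touchTime     TO DOOR_LOGIC.clicked
    ROUTE DOOR_LOGIC.openNow  TO OPEN_TIMER.set_startTime
    ROUTE DOOR_LOGIC.closeNow TO CLOSE_TIMER.set_startTime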
15
Switch Nodes
A Switch node lets one position in the scene graph switch between alternative groups of nodes.
Possible uses:
- Changing fabrics on a dress model
- Changing items on a shelf
- Switching between an open and a closed door
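A sketch of the fabric-changing idea: a Switch holds two variants and a small script cycles whichChoice each time the model is clicked. The colours and geometry are placeholders:

    #VRML V2.0 utf8
    Group {
      children [
        DEF NEXT TouchSensor { }                    # click the model to change fabric
        DEF FABRICS Switch {
          whichChoice 0                             # index of the child to display (-1 = none)
          choice [
            Shape { appearance Appearance { material Material { diffuseColor 1 0 0 } }
                    geometry Sphere { } }
            Shape { appearance Appearance { material Material { diffuseColor 0 0 1 } }
                    geometry Sphere { } }
          ]
        }
      ]
    }
    DEF CYCLER Script {
      eventIn  SFTime  clicked
      eventOut SFInt32 pick
      field    SFInt32 current 0
      url "javascript:
        function clicked(value) {
          current = (current + 1) % 2;              // cycle through the two choices
          pick = current;
        }"
    }
    ROUTE NEXT.touchTime TO CYCLER.clicked
    ROUTE CYCLER.pick    TO FABRICS.set_whichChoice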
16
Navigation and User Interface
- Specify method(s) of navigation
- Create viewpoints
- Hyperlinks and HUD (Head-Up Display)
17
Specify Method(s) of Navigation
NavigationInfo node:
- Camera
- Avatar (see later lecture) - also configurable
Navigation styles:
- Walk - user is affected by gravity
- Fly - no gravity
- Examine - rotate the scene around
- None - no navigation controls; good for banner ads or if you are creating your own controls
- Any - the user may choose whichever navigation style they wish
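A sketch of a typical NavigationInfo node; the speed and avatar dimensions are example values:

    #VRML V2.0 utf8
    NavigationInfo {
      type       [ "WALK", "ANY" ]     # preferred style first; ANY lets the user switch
      speed      2.0                   # movement speed in metres per second
      headlight  FALSE
      avatarSize [ 0.25, 1.6, 0.75 ]   # collision distance, avatar height, max step height
    }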
18
Viewpoints
- Provide several preset views so the scene can be navigated easily
- Can be linked to sensors attached to objects
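A sketch of two named viewpoints; the positions and names are arbitrary. Sending TRUE to a Viewpoint's set_bind eventIn (for example from a sensor via a script) jumps the camera to that view:

    #VRML V2.0 utf8
    DEF ENTRANCE Viewpoint {
      position    0 1.6 10
      orientation 0 1 0 0
      description "Entrance"            # appears in the browser's viewpoint list
    }
    DEF BALCONY Viewpoint {
      position    0 5 -8
      orientation 0 1 0 3.142           # face back towards the entrance
      description "Balcony"
    }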
19
Hyperlinks and HUD
- Hyperlinks - any object can be linked to a URL
- A HUD (Head-Up Display) is built from a number of VRML nodes and a ProximitySensor
- The ProximitySensor keeps track of where the user is and feeds the viewer's position and rotation into the group of objects that comprise the HUD, keeping it positioned just in front of them at all times.
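A sketch of both ideas: an Anchor node for the hyperlink, and the usual ProximitySensor wiring for a HUD. The URL, panel geometry and offsets are placeholders:

    #VRML V2.0 utf8
    # Hyperlink: clicking the box loads the URL
    Anchor {
      url         "http://www.example.com/info.html"
      description "More information"
      children Shape { geometry Box { size 1 1 1 } }
    }

    # HUD: the ProximitySensor reports the viewer's position/rotation,
    # which is routed into the HUD group so it stays in front of the camera
    DEF HUD_ZONE ProximitySensor { size 1000 1000 1000 }
    DEF HUD Transform {
      children Transform {
        translation 0 -0.2 -1            # small offset so the panel sits just in view
        children Shape { geometry Box { size 0.4 0.1 0.01 } }
      }
    }
    ROUTE HUD_ZONE.position_changed    TO HUD.set_translation
    ROUTE HUD_ZONE.orientation_changed TO HUD.set_rotation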
20
Packaging and Web Integration
- Store all files in one directory so that they can be easily published
- An initial link to the .wrl file must be incorporated into the start-up page
- Provided any separate files are in the same directory, the separate .wrl files can be inlined or hyperlinked at will
21
Optimise and Test (General Guidelines)
- Reduce the polygon count
- Reduce detail
- Build only what will be seen
- Modularity
- Scale
- Create Levels of Detail (LODs)
- Materials and texturing
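As a sketch of the LOD guideline, an LOD node swaps in simpler versions of a model as the viewer moves away; the file names and switch distances are hypothetical:

    #VRML V2.0 utf8
    LOD {
      range [ 10, 50 ]                          # switch distances in metres
      level [
        Inline { url "statue_high.wrl" }        # closer than 10 m
        Inline { url "statue_medium.wrl" }      # between 10 m and 50 m
        Inline { url "statue_low.wrl" }         # further than 50 m
      ]
    }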
22
Publish
- FTP to host
- Keep the same file structure