1
Advanced Game Design Prof. Roger Crawfis Computer Science & Engineering The Ohio State University
2
Course Overview
Project-based / team-based
Little lecturing
Focus on programming for games
Systems integration: graphics, sound, AI, networking, user interfaces, physics, scripting
Utilize higher-level toolkits, allowing for more advanced progress while still developing programming skills.
3
Course Structure I will lecture for about the first week and a half. Student game project groups will provide several presentations on their game ideas and progress. Student technology teams will provide an intermediate and an advanced lecture on their findings and analysis about their area.
4
Project Goals
Large-scale software development
Team-based (synergistic development)
Toolkit-based (fast-start development)
Learn and utilize the many non-graphical elements needed for games.
Leverage and extend your software engineering, graphics, AI, …, expertise.
5
Elements
Gaming Engine: responsible for providing primitives, hardware abstraction, and handling different areas of the game (physics, AI, etc.)
Game: defined by a genre; defines the gameplay
6
Requirements of a gaming engine Stunning Visuals Immersive sound stage Varied input/output devices Scalability Simulation Animation Networking
7
Requirements of a game
Scripting
Artificial Intelligence
Supporting Tools
Optimizing game content
Developing game content
Extending game content
Debugging / tuning of game performance
8
Stunning Visuals
Adding realism
Smarter models
Use hardware
Bump-mapping
Dynamic water or other liquids
Rich textures (billboards, gloss-maps, light-maps, etc.)
Shadows
Particle systems
9
Immersive sound stage Multi-track sound support Positional sound effects (3D immersion) Dynamic sounds / movement (doppler effects)
10
Input devices
Commonly available devices: keyboard, mouse, gamepads, and joysticks
Force feedback (haptic) devices are gaining popularity: steering wheels, joysticks
Motion tracking
Output devices
Multiple monitors
Head-mounted displays
11
Scalability
Multiple hardware capabilities
Multi-resolution models
Multi-user support
LOD (level of detail)
Multiple model definitions
Multi-res models
Subdivision surfaces
12
Simulation
Virtual worlds are all well and good to look at, but immersion breaks when real-world physics is not applied.
So we need collision detection and collision response.
13
Animation Linear transformations Modeled animations Articulated motion Lip syncing Facial Expressions Blending animations
14
Networking Multi-player support essential Common problems Latency Synchronization Scalability Consistent game state Security
15
Scripting Strict coding is tedious Support for scripting is essential for RAD Scripting has added a whole new fun factor for many games.
16
Artificial Intelligence Games need specialized AI Strategy Path finding Modeling behavior Learning Non-perfect! Fast!
17
Tools
Creating varied content: models, video, images, sound
Integrating content
Common file format support
Supporting existing popular tools via plug-ins: 3DS Max, Lightwave, Maya, etc.; Adobe Premiere, Adobe Photoshop
18
Interactive Programs
Games are interactive systems: they must respond to the user. How?
19
Interactive Program Structure
Event-driven programming: everything happens in response to an event.
Events come from two sources: the user and the system.
Events are also called messages: an event causes a message to be sent…
[Diagram: Initialize → user does something or timer goes off → system updates → back to waiting]
20
User Events
The OS manages user input: interrupts at the hardware level get converted into events in queues at the windowing level, which are then made available to your program.
It is generally up to the application to make use of the event stream.
Windowing systems may abstract the events for you.
21
System Events
Windowing systems provide timer events: the application requests an event at a future time, and the system will provide an event sometime around the requested time.
Semantics vary: guaranteed not to come before the requested time; delivered as soon as possible after; almost never right on (real-time OS?).
22
Polling for Events
Most windowing systems provide a non-blocking event function: it does not wait for an event, it just returns NULL if one is not ready.
What type of games might use this structure? Why wouldn't you always use it?

while (true)
    if (e = checkEvent())
        switch (e.type)
            …
    do more work
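As a rough, self-contained illustration of this polling structure: checkEvent(), Event, updateSimulation(), and render() below are hypothetical stand-ins for whatever the windowing layer and the game actually provide (a real program would call something like SDL_PollEvent or PeekMessage).

#include <optional>

enum class EventType { Quit, KeyDown, MouseMove };
struct Event { EventType type; };

// Stubs so the sketch compiles on its own.
std::optional<Event> checkEvent() { return std::nullopt; }   // non-blocking: empty if nothing queued
void updateSimulation() {}
void render() {}

int main()
{
    bool running = true;
    for (int frame = 0; frame < 600 && running; ++frame)     // bounded here; a game runs until Quit
    {
        // Drain every event already queued, without ever blocking.
        while (auto e = checkEvent())
        {
            switch (e->type)
            {
            case EventType::Quit:    running = false; break;
            case EventType::KeyDown: /* handle input */ break;
            default:                 break;
            }
        }
        // "do more work": simulate and draw even when no events arrived.
        updateSimulation();
        render();
    }
    return 0;
}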
23
Waiting for Events
Most windowing systems provide a blocking event function: it waits (blocks) until an event is available.
Usually used with timer events. Why?
On what systems is this better than the previous method? What types of games is it useful for?

e = nextEvent();
switch (e.type)
    …
24
The Callback Abstraction
A common event abstraction is the callback mechanism: applications register functions they wish to have called in response to particular events.
A translation table says which callbacks go with which events.
Generally found in GUI (graphical user interface) toolkits: "When the button is pressed, invoke the callback."
Many systems mix methods, or have a catch-all callback for unclaimed events.
Why are callbacks good? Why are they bad?
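A minimal sketch of such a translation table in plain C++; EventType, Event, and registerCallback() are hypothetical, and real GUI toolkits supply their own registration calls.

#include <functional>
#include <iostream>
#include <map>

enum class EventType { ButtonPressed, KeyDown, Timer };
struct Event { EventType type; int data; };

// Translation table: which callback goes with which event.
std::map<EventType, std::function<void(const Event&)>> callbacks;

void registerCallback(EventType t, std::function<void(const Event&)> fn)
{
    callbacks[t] = fn;
}

void dispatch(const Event& e)
{
    auto it = callbacks.find(e.type);
    if (it != callbacks.end())
        it->second(e);   // "when the button is pressed, invoke the callback"
    // else: unclaimed event; some systems route these to a catch-all callback
}

int main()
{
    registerCallback(EventType::ButtonPressed,
                     [](const Event& e) { std::cout << "button " << e.data << " pressed\n"; });
    dispatch({EventType::ButtonPressed, 1});
    return 0;
}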
25
Upon Receiving an Event…
Event responses fall into two classes:
Task events: the event sparks a specific task or results in some change of state within the current mode (e.g. load, save, pick up a weapon, turn on the lights, …). Call a function to do the job.
Mode switches: the event causes the game to shift to some other mode of operation (e.g. start game, quit, go to menu, …). Switch event loops, because events now have different meanings.
Software structure reflects this: the menu system is separate from the run-time game system, for example.
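A minimal sketch of the mode-switch structure, with hypothetical menuLoop()/gameLoop() functions; the point is that each mode owns its own event loop, so the same input can mean different things in the menu and in the running game.

enum class GameMode { Menu, Playing, Quit };

// Hypothetical per-mode event loops: each returns the mode to switch to.
GameMode menuLoop() { /* menu event loop: start game, quit, ... */ return GameMode::Playing; }
GameMode gameLoop() { /* in-game event loop: pick up weapon, go to menu, ... */ return GameMode::Quit; }

int main()
{
    GameMode mode = GameMode::Menu;
    while (mode != GameMode::Quit)
    {
        // Mode switches swap whole event loops rather than reinterpreting one loop.
        mode = (mode == GameMode::Menu) ? menuLoop() : gameLoop();
    }
    return 0;
}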
26
Real-Time Loop
At the core of interactive games is a real-time loop:

while (true)
    process events
    update animation / scene
    render

What else might you need to do?
The number of times this loop executes per second is the frame rate, measured in frames per second (fps).
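As a rough, self-contained sketch of this loop (processEvents, updateScene, and render are hypothetical stubs here), the frame rate can be measured by timing each iteration with std::chrono:

#include <chrono>
#include <cstdio>

void processEvents() {}              // stub: poll the windowing system here
void updateScene(double /*dt*/) {}   // stub: advance animation / game state
void render() {}                     // stub: draw the frame

int main()
{
    using clock = std::chrono::steady_clock;
    auto last = clock::now();
    for (int frame = 0; frame < 600; ++frame)      // bounded here; a game loops until quit
    {
        processEvents();                           // process events
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;
        updateScene(dt);                           // update animation / scene
        render();                                  // render
        if (dt > 0.0)
            std::printf("instantaneous frame rate: %.1f fps\n", 1.0 / dt);
    }
    return 0;
}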
27
Lag
Lag is the time between when a user does something and when they see the result; it is also called latency.
Too much lag and causality is distorted.
With tight visual/motion coupling, too much lag makes people motion sick; this is a big problem with head-mounted displays for virtual reality.
Too much lag makes it hard to target objects (and track them, and do all sorts of other perceptual tasks).
High variance in lag also makes interaction difficult: users can adjust to constant lag, but not variable lag.
From a psychological perspective, lag is the important variable.
28
Computing Lag
Lag is NOT the time it takes to compute one frame!
What is the formula for maximum lag as a function of frame rate, fr? What is the formula for average lag?
[Timeline diagram: process input / update state / render repeated each frame, marking the event time, the frame time, and the resulting lag]
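One way to reason about it (a sketch, assuming the three stages above fill one frame time 1/fr and events arrive at random): an event that arrives just after input has been processed waits almost a whole frame before it is read, and then takes one more full frame to be processed and rendered, so the maximum lag is about 2/fr. On average the event waits half a frame before being read, giving an average lag of about 1.5/fr. Either way, lag is one to two frame times, not the time to compute a single frame.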
29
Frame Rate Questions
What is an acceptable frame rate for twitch games? Why?
What is the maximum useful frame rate? Why?
What is the frame rate for NTSC television?
What is the minimum frame rate required for a sense of presence? How do we know?
How can we manipulate the frame rate?
30
Frame Rate Answers (I)
Twitch games demand at least 30 fps, but the higher the better (lower lag): users see an enemy's motions sooner, and higher frame rates make targeting easier.
The maximum useful frame rate is the monitor refresh rate (the time taken for the monitor to draw one screen), because of synchronization issues: the buffer swap in graphics is timed with the vertical sweep, so the ideal frame rate is the monitor refresh rate.
You can turn off synchronization, but you get nasty artifacts on screen.
31
Frame Rate Answers (II)
NTSC television draws all the odd lines of the screen, then all the even ones (interlaced format); a full screen takes 1/30th of a second.
Use 60 fps to improve visuals, but only half of each frame actually gets drawn by the screen. Do consoles only render half the screen each time?
It was once argued that 10 fps was required for a sense of presence (being there); head-mounted displays require 20 fps or higher to avoid illness.
Many factors influence the sense of presence, and perceptual studies indicate what frame rates are acceptable.
32
Reducing Lag
Faster algorithms and hardware are the obvious answer.
Designers choose a frame rate and put as much into the game as they can without going below the threshold; this is part of the design documents presented to the publisher.
The threshold assumes the fastest hardware and all game features turned on; options are given to players to reduce game features and improve their frame rate.
There is a resource budget: how much of the loop is dedicated to each aspect of the game (graphics, AI, sound, …).
Some other techniques allow for more features and less lag.
33
Decoupling Computation
It is most important to minimize lag between the user's actions and their direct consequences, so the input/rendering loop must have low latency.
Lag between actions and other consequences may be less severe: the time between input and the reaction of an enemy can be greater, and the time to switch animations can be greater.
Technique: update different parts of the game at different rates, which requires decoupling them. For example, run graphics at 60 fps and AI at 10 fps. This is done in the Unreal engine, for instance.
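A minimal sketch of this decoupling, assuming hypothetical processInput/updateAI/render functions: input and rendering run every iteration, while the AI accumulates elapsed time and only steps at a fixed 10 Hz.

#include <chrono>

void processInput() {}              // stub: low-latency path, runs every frame
void updateAI(double /*dt*/) {}     // stub: expensive AI, runs at 10 fps
void render() {}                    // stub: draw the frame

int main()
{
    using clock = std::chrono::steady_clock;
    const double aiStep = 1.0 / 10.0;              // AI at 10 fps
    double aiAccumulator = 0.0;
    auto last = clock::now();

    for (int frame = 0; frame < 600; ++frame)      // bounded here; a game loops until quit
    {
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;

        processInput();                            // user action ...
        aiAccumulator += dt;
        while (aiAccumulator >= aiStep)            // AI catches up in fixed 100 ms steps
        {
            updateAI(aiStep);
            aiAccumulator -= aiStep;
        }
        render();                                  // ... to visible result, every frame
    }
    return 0;
}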
34
Animation and Sound
Animation and sound decisions need not be made at high frequency, but the results must be updated at high frequency. For example, switching from walk to run can happen at low frequency, but the joint angles for walking must be updated every frame.
One solution is to package multiple frames of animation and submit them all at once to the renderer; this is a good idea anyway, since it makes animation independent of frame rate.
Sound is offloaded to the sound card.
35
Overview of Ogre3D
Not a full-blown game engine
Open source
Strong user community
Decent software engineering
Cool logo
36
Features
Graphics API independent 3D implementation
Platform independence
Material & shader support
Well known texture formats: png, jpeg, tga, bmp, dds, dxt
Mesh support: Milkshape3D, 3D Studio Max, Maya, Blender
Scene features: BSP and octree plugins, hierarchical scene graph
Special effects: particle systems, skyboxes, billboarding, HUD, cube mapping, bump mapping, post-processing effects
Easy integration with physics libraries: ODE, Tokamak, Newton, OPCODE
Open source!
37
Core Objects
38
Startup Sequence
ExampleApplication:
go()
setup()
configure()
setupResources()
chooseSceneManager()
createCamera()
createViewport()
createResourceListener()
loadResources()
createScene()
frameStarted()/frameEnded()
createFrameListener()
destroyScene()
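A minimal sketch of hooking into this sequence, assuming the ExampleApplication framework from the old Ogre samples (ExampleApplication.h); the class name, the robot.mesh model, and the node name are illustrative.

#include "ExampleApplication.h"

class TutorialApplication : public ExampleApplication
{
protected:
    // Called from setup(), at the createScene() step of the sequence above.
    void createScene(void)
    {
        // Simple ambient light so the model is visible.
        mSceneMgr->setAmbientLight(Ogre::ColourValue(0.5, 0.5, 0.5));

        // Load a mesh as an Entity and attach it to a SceneNode.
        Ogre::Entity *ent = mSceneMgr->createEntity("Robot", "robot.mesh");
        Ogre::SceneNode *node =
            mSceneMgr->getRootSceneNode()->createChildSceneNode("RobotNode");
        node->attachObject(ent);
    }
};

int main(int argc, char **argv)
{
    TutorialApplication app;
    app.go();   // runs the whole startup sequence, then the render loop
    return 0;
}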
39
Basic Scene
Entity, SceneNode
Camera, lights, shadows
BSP map
Integrated ODE physics
Frame listeners
40
Terrain, sky, fog
41
CEGUI
Window, panel, scrollbar, listbox, button, static text
Media/gui/ogregui.layout

CEGUI::Window *sheet = CEGUI::WindowManager::getSingleton().loadWindowLayout((CEGUI::utf8 *)"ogregui.layout");
mGUISystem->setGUISheet(sheet);
42
Animation
Node animation (camera, light sources)
Skeletal animation

AnimationState *mAnimationState;
mAnimationState = ent->getAnimationState("Idle");
mAnimationState->setLoop(true);
mAnimationState->setEnabled(true);
mAnimationState->addTime(evt.timeSinceLastFrame);
mNode->rotate(quat);
43
Animation
Crowd (instancing vs. single entity):

InstancedGeometry *batch = new InstancedGeometry(mCamera->getSceneManager(), "robots");
batch->addEntity(ent, Vector3::ZERO);
batch->build();

Facial animation:

VertexPoseKeyFrame *manualKeyFrame;
manualKeyFrame->addPoseReference(poseIndex, influence);      // ushort poseIndex, Real influence
manualKeyFrame->updatePoseReference(poseIndex, influence);
44
Picking
45
CEGUI::Point mousePos = CEGUI::MouseCursor::getSingleton().getPosition();
Ray mouseRay = mCamera->getCameraToViewportRay(
    mousePos.d_x / float(arg.state.width),
    mousePos.d_y / float(arg.state.height));
mRaySceneQuery->setRay(mouseRay);
mRaySceneQuery->setSortByDistance(false);

RaySceneQueryResult &result = mRaySceneQuery->execute();
RaySceneQueryResult::iterator mouseRayItr;
Vector3 nodePos;
for (mouseRayItr = result.begin(); mouseRayItr != result.end(); mouseRayItr++)
{
    if (mouseRayItr->worldFragment)
    {
        nodePos = mouseRayItr->worldFragment->singleIntersection;
        break;
    } // if
}
46
Particle Effects

mSceneMgr->getRootSceneNode()->createChildSceneNode()->attachObject(
    mSceneMgr->createParticleSystem("sil", "Examples/sil"));

Examples/sil
{
    material Examples/Flare2
    particle_width 75
    particle_height 100
    cull_each false
    quota 1000
    billboard_type oriented_self

    // Area emitter
    emitter Point
    {
        angle 30
        emission_rate 75
        time_to_live 2
        direction 0 1 0
        velocity_min 250
        velocity_max 300
        colour_range_start 0 0 0
        colour_range_end 1 1 1
    }

    // Gravity
    affector LinearForce
    {
        force_vector 0 -100 0
        force_application add
    }

    // Fader
    affector ColourFader
    {
        red -0.5
        green -0.5
        blue -0.5
    }
}
47
Particle Effects
Particle system attributes: quota, material, particle_width, particle_height, cull_each, billboard_type, billboard_origin, billboard_rotation_type, common_direction, common_up_vector, renderer, sorted, local_space, point_rendering, accurate_facing, iteration_interval, nonvisible_update_timeout
Emitter attributes (point, box, cylinder, ellipsoid, hollow ellipsoid, ring): angle, colour, colour_range_start, colour_range_end, direction, emission_rate, position, velocity, velocity_min, velocity_max, time_to_live, time_to_live_min, time_to_live_max, duration, duration_min, duration_max, repeat_delay, repeat_delay_min, repeat_delay_max
Affector types: LinearForce, ColourFader, ColourFader2, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser
48
Fire and Smoke

affector Rotator
{
    rotation_range_start 0
    rotation_range_end 360
    rotation_speed_range_start -60
    rotation_speed_range_end 200
}
49
Cel Shading

vertex_program Ogre/CelShadingVP cg
{
    source Example_CelShading.cg
    entry_point main_vp
    …
    default_params { … }
}

material Examples/CelShading
{
    …
    vertex_program_ref Ogre/CelShadingVP {}
    fragment_program_ref Ogre/CelShadingFP {}
}
50
Cube Mapping
With Perlin noise to distort vertices

void morningcubemap_fp(float3 uv : TEXCOORD0,
                       out float4 colour : COLOR,
                       uniform samplerCUBE tex : register(s0))
{
    colour = texCUBE(tex, uv);
    // blow out the light a bit
    colour *= 1.7;
}
51
Bump Mapping
52
Reflections & Refractions

// Noise
texture_unit
{
    // Perlin noise volume
    texture waves2.dds
    // min / mag filtering, no mip filtering
    filtering linear linear none
}

// Reflection
texture_unit
{
    // Will be filled in at runtime
    texture Reflection
    tex_address_mode clamp
    // needed by ps.1.4
    tex_coord_set 1
}

// Refraction
texture_unit
{
    // Will be filled in at runtime
    texture Refraction
    tex_address_mode clamp
    // needed by ps.1.4
    tex_coord_set 2
}
53
Reflections & Refractions
54
Grass
Each grass section is 3 planes at 60 degrees to each other.
Normals point straight up to simulate correct lighting.
55
Post-Processing Effects
56
Supporting tools (add-ons)
Blender, Autodesk 3DS, Milkshape3D, Softimage XSI importers/exporters
Particle editors, mesh viewers
GUI editors
Physics bindings
Scene exporters, and lots more
Add-on site
57
References http://www.ogre3d.org/
58
Design Patterns First, an aside on Design Patterns
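One pattern that shows up throughout Ogre's and CEGUI's APIs is the Singleton (for example CEGUI::WindowManager::getSingleton() and CEGUI::MouseCursor::getSingleton() in the code earlier in these slides). Here is a generic sketch of that pattern; the InputManager class is purely illustrative, not Ogre's actual implementation.

class InputManager
{
public:
    static InputManager& getSingleton()
    {
        static InputManager instance;   // created on first use, exactly one per program
        return instance;
    }
    InputManager(const InputManager&) = delete;
    InputManager& operator=(const InputManager&) = delete;

    void poll() { /* read devices here */ }

private:
    InputManager() {}                   // only getSingleton() can construct it
};

int main()
{
    InputManager::getSingleton().poll();   // global access point without a global variable
    return 0;
}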
59
Ogre Foundations Root RenderSystem SceneManager ResourceManager
60
Ogre Foundations Mesh Entity Material Overlay – OverlayElement, OverlayContainer, OverlayManager Skeletal Animation
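A minimal sketch tying these objects together (Root, the RenderSystem chosen via the config dialog, a SceneManager, and a Mesh loaded as an Entity), based on the classic Ogre 1.x API; the resource path, mesh name, and config file names are assumptions.

#include <Ogre.h>

int main()
{
    Ogre::Root *root = new Ogre::Root("plugins.cfg", "ogre.cfg", "ogre.log");
    if (!root->showConfigDialog())          // lets the user pick a RenderSystem
        return 1;

    Ogre::RenderWindow *window = root->initialise(true, "Minimal Ogre");
    Ogre::SceneManager *sceneMgr = root->createSceneManager(Ogre::ST_GENERIC);

    Ogre::Camera *cam = sceneMgr->createCamera("MainCam");
    cam->setPosition(Ogre::Vector3(0, 0, 300));
    cam->lookAt(Ogre::Vector3(0, 0, 0));
    window->addViewport(cam);

    // ResourceManager side: tell Ogre where meshes/materials live, then load them.
    Ogre::ResourceGroupManager::getSingleton().addResourceLocation("media", "FileSystem");
    Ogre::ResourceGroupManager::getSingleton().initialiseAllResourceGroups();

    // Mesh + Entity + SceneNode, as listed above.
    Ogre::Entity *ent = sceneMgr->createEntity("Robot", "robot.mesh");
    sceneMgr->getRootSceneNode()->createChildSceneNode()->attachObject(ent);

    root->startRendering();                 // runs the render loop until the window closes
    delete root;
    return 0;
}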