Advanced Game Design Prof. Roger Crawfis Computer Science & Engineering The Ohio State University.


Course Overview Project-based / Team-based Little lecturing Focus on programming for games Systems integration – graphics, sound, AI, networking, user-interfaces, physics, scripting Utilize higher-level toolkits, allowing for more advanced progress while still developing programming skills.

Course Structure I will lecture for about the first week and a half. Student game project groups will provide several presentations on their game ideas and progress. Student technology teams will provide an intermediate and an advanced lecture on their findings and analysis about their area.

Project Goals Large-scale software development Team-based (synergistic development) Toolkit-based (fast-start development) Learn and utilize the many non-graphical elements needed for games. Leverage and extend your software engineering, graphics, AI, …, expertise.

Elements Gaming Engine Responsible for providing primitives Hardware abstraction Handle different areas of the game Physics, AI, etc. Game Defined by a genre Defines the gameplay

Requirements of a gaming engine Stunning Visuals Immersive sound stage Varied input/output devices Scalability Simulation Animation Networking

Requirements of a game Scripting Artificial Intelligence Supporting Tools Optimizing game content Developing game content Extending game content Debugging / Tuning of game performance

Stunning Visuals Adding realism Smarter Models Use hardware Bump-mapping Dynamic water or other liquids Rich textures (Billboards, gloss-maps, light-maps, etc.) Shadows Particle systems

Immersive sound stage Multi-track sound support Positional sound effects (3D immersion) Dynamic sounds / movement (Doppler effects)

Input devices Commonly available devices are the keyboard, mouse, gamepads, and joysticks Force feedback (haptic) devices are gaining popularity Steering wheels Joysticks Motion tracking Output devices Multiple monitors Head-mounted displays

Scalability Multiple hardware capabilities Multi-resolution models Multi-user support LOD Multiple model definitions Multi-res models Subdivision surfaces
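The multi-resolution model idea above can be sketched as a simple distance-based level-of-detail pick. This is a minimal illustration, not course code; the struct, thresholds, and function names are my own.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// One level of detail for a model: usable out to maxDistance world units.
// The fields and ordering (finest level first) are assumptions of this sketch.
struct LodLevel {
    float maxDistance;   // use this level while camera distance <= maxDistance
    int   triangleCount; // rough rendering cost of this level
};

// Pick the first (finest) level whose threshold covers the given distance;
// past the last threshold, fall back to the cheapest model.
std::size_t selectLod(const std::vector<LodLevel>& levels, float distance) {
    for (std::size_t i = 0; i < levels.size(); ++i)
        if (distance <= levels[i].maxDistance)
            return i;
    return levels.size() - 1;
}
```

A scalability option screen would then map "low detail" settings to coarser threshold tables rather than changing the selection logic.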

Simulation Virtual worlds are all well and good to look at, but immersion breaks when real-world physics is not applied So we need Collision detection Collision response
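Collision detection usually starts with a cheap axis-aligned bounding-box overlap test before any exact geometry is consulted. A minimal sketch; the types and names are mine, not the course's:

```cpp
#include <cassert>

// Axis-aligned bounding box in world coordinates.
struct Aabb {
    float minX, minY, minZ;
    float maxX, maxY, maxZ;
};

// Two boxes overlap iff their intervals overlap on every axis.
bool intersects(const Aabb& a, const Aabb& b) {
    return a.minX <= b.maxX && b.minX <= a.maxX &&
           a.minY <= b.maxY && b.minY <= a.maxY &&
           a.minZ <= b.maxZ && b.minZ <= a.maxZ;
}
```

Collision response (resolving penetration, applying impulses) happens only for pairs that pass this broad-phase test.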

Animation Linear transformations Modeled animations Articulated motion Lip syncing Facial Expressions Blending animations

Networking Multi-player support essential Common problems Latency Synchronization Scalability Consistent game state Security

Scripting Strict coding is tedious Support for scripting is essential for RAD Scripting has added a whole new fun factor for many games.

Artificial Intelligence Games need specialized AI Strategy Path finding Modeling behavior Learning Not perfect! Fast!
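Path finding can be sketched with breadth-first search on a grid; production games typically use A* with a heuristic, but the structure is the same. The grid encoding (0 = free, 1 = wall) and the function name are assumptions of this sketch:

```cpp
#include <cassert>
#include <queue>
#include <utility>
#include <vector>

// Breadth-first search on a 4-connected grid: returns the number of steps
// from (sx, sy) to (gx, gy), or -1 if the goal is unreachable.
int gridPathLength(const std::vector<std::vector<int>>& grid, // 0 = free, 1 = wall
                   int sx, int sy, int gx, int gy) {
    int h = (int)grid.size(), w = (int)grid[0].size();
    std::vector<std::vector<int>> dist(h, std::vector<int>(w, -1));
    std::queue<std::pair<int, int>> q;
    dist[sy][sx] = 0;
    q.push({sx, sy});
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!q.empty()) {
        auto [x, y] = q.front(); q.pop();
        if (x == gx && y == gy) return dist[y][x]; // first visit is shortest
        for (int d = 0; d < 4; ++d) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx >= 0 && nx < w && ny >= 0 && ny < h &&
                grid[ny][nx] == 0 && dist[ny][nx] == -1) {
                dist[ny][nx] = dist[y][x] + 1;
                q.push({nx, ny});
            }
        }
    }
    return -1;
}
```

The "non-perfect, fast" point maps onto the A* heuristic choice: a weaker (or inflated) heuristic trades path optimality for speed.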

Tools Creating varied content models, video, images, sound Integrating content Common file format support Supporting existing popular tools via plug-ins 3DS Max, Lightwave, Maya etc. Adobe premier, Adobe Photoshop

Interactive Programs Games are interactive systems - they must respond to the user How?

Interactive Program Structure Event driven programming Everything happens in response to an event Events come from two sources: The user The system Events are also called messages An event causes a message to be sent… (Diagram: Initialize → User Does Something or Timer Goes Off → System Updates → back to waiting)

User Events The OS manages user input Interrupts at the hardware level … Get converted into events in queues at the windowing level … Are made available to your program It is generally up to the application to make use of the event stream Windowing systems may abstract the events for you

System Events Windowing systems provide timer events The application requests an event at a future time The system will provide an event sometime around the requested time. Semantics vary: Guaranteed to come before the requested time As soon as possible after Almost never right on (real-time OS?)

Polling for Events Most windowing systems provide a non-blocking event function Does not wait for an event, just returns NULL if one is not ready What type of games might use this structure? Why wouldn’t you always use it?
while ( true )
    if ( e = checkEvent() )
        switch ( e.type ) …
    do more work
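The polling structure can be made concrete with an explicit queue standing in for the windowing system. `Event`, `checkEvent`, and `pollOnce` are illustrative names for this sketch, not a real windowing API:

```cpp
#include <cassert>
#include <queue>

// Stand-in for a windowing-system event.
struct Event { int type; };

// Non-blocking check: returns true and fills 'e' if an event is pending,
// returns false immediately otherwise (the slide's "returns NULL" case).
bool checkEvent(std::queue<Event>& pending, Event& e) {
    if (pending.empty()) return false;
    e = pending.front();
    pending.pop();
    return true;
}

// One iteration of the polling loop: handle every ready event, then do a
// slice of game work. Returns how many events were handled this pass.
int pollOnce(std::queue<Event>& pending) {
    int handled = 0;
    Event e;
    while (checkEvent(pending, e)) {
        switch (e.type) {     // dispatch on event type, as in the slide
            default: ++handled; break;
        }
    }
    // ... "do more work": update simulation, render a frame ...
    return handled;
}
```

Because `pollOnce` never blocks, the game keeps simulating and rendering even when the player does nothing — which is why real-time games favor this structure.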

Waiting for Events Most windowing systems provide a blocking event function Waits (blocks) until an event is available Usually used with timer events. Why? On what systems is this better than the previous method? What types of games is it useful for?
e = nextEvent();
switch ( e.type ) …

The Callback Abstraction A common event abstraction is the callback mechanism Applications register functions they wish to have called in response to particular events Translation table says which callbacks go with which events Generally found in GUI (graphical user interface) toolkits “When the button is pressed, invoke the callback” Many systems mix methods, or have a catch-all callback for unclaimed events Why are callbacks good? Why are they bad?
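A minimal version of the translation table the slide describes might look like this; `EventDispatcher` and its integer event types are my own stand-ins for what a GUI toolkit provides:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <utility>

// The application registers one callback per event type; dispatch looks the
// event up in the table and invokes the matching function.
class EventDispatcher {
public:
    using Callback = std::function<void(int /*event data*/)>;

    void registerCallback(int eventType, Callback cb) {
        table_[eventType] = std::move(cb);
    }

    // Returns true if a callback claimed the event, false if unclaimed
    // (a real system might route unclaimed events to a catch-all handler).
    bool dispatch(int eventType, int data) {
        auto it = table_.find(eventType);
        if (it == table_.end()) return false;
        it->second(data);
        return true;
    }

private:
    std::map<int, Callback> table_; // the "translation table" from the slide
};
```

The good: application code stays decoupled from the event loop. The bad: control flow becomes implicit, which makes ordering and debugging harder — exactly the trade-off the slide's closing questions point at.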

Upon Receiving an Event … Event responses fall into two classes: Task events: The event sparks a specific task or results in some change of state within the current mode, e.g. Load, Save, Pick up a weapon, turn on the lights, … Call a function to do the job Mode switches: The event causes the game to shift to some other mode of operation, e.g. Start game, quit, go to menu, … Switch event loops, because events now have different meanings Software structure reflects this - the menu system is separate from the run-time game system, for example

Real-Time Loop At the core of interactive games is a real-time loop:
while ( true )
    process events
    update animation / scene
    render
What else might you need to do? The number of times this loop executes per second is the frame rate, measured in frames per second (fps)
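The loop body can be factored so its three stages are explicit. The stage callbacks here are stubs, and `runFrames` bounds the iteration count so the sketch terminates; both names are mine:

```cpp
#include <cassert>

// Frame rate is frames divided by elapsed seconds.
double frameRate(int framesRendered, double elapsedSeconds) {
    return framesRendered / elapsedSeconds;
}

// One real-time loop, with the three stages from the slide passed in as
// callables. A real game loops forever; this runs a fixed number of frames.
template <typename Events, typename Update, typename Render>
void runFrames(int frames, Events processEvents, Update update, Render render) {
    for (int i = 0; i < frames; ++i) {
        processEvents(); // poll or drain input
        update();        // advance animation / scene state
        render();        // draw the frame
    }
}
```

"What else might you need to do?" — typical answers include audio mixing, network send/receive, and resource streaming, each slotted into this same per-frame cycle.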

Lag Lag is the time between when a user does something and when they see the result - also called latency Too much lag and causality is distorted With tight visual/motion coupling, too much lag makes people motion sick Big problem with head-mounted displays for virtual reality Too much lag makes it hard to target objects (and track them, and do all sorts of other perceptual tasks) High variance in lag also makes interaction difficult Users can adjust to constant lag, but not variable lag From a psychological perspective, lag is the important variable

Computing Lag Lag is NOT the time it takes to compute 1 frame! What is the formula for maximum lag as a function of frame rate, fr ? What is the formula for average lag? (Timeline diagram: Process input → Update state → Render, repeated each frame; lag runs from the event time to the end of the render that shows its effect)
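One common answer, under the assumption that input is sampled once per frame and the result appears at the end of that frame's render. This is my reading of the diagram, not necessarily the instructor's intended answer:

```cpp
#include <cassert>

// An event arriving just after input processing waits almost a full frame
// (1/fr) to be sampled, then takes another full frame to reach the screen:
//   maximum lag ~= 2 / fr
// An event arriving uniformly at random waits half a frame on average:
//   average lag ~= 1.5 / fr
double maxLag(double fr)     { return 2.0 / fr; }
double averageLag(double fr) { return 1.5 / fr; }
```

So at 60 fps the worst case is about 33 ms, which is why "lag is NOT the time to compute one frame" — it can be twice that.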

Frame Rate Questions What is an acceptable frame rate for twitch games? Why? What is the maximum useful frame rate? Why? What is the frame rate for NTSC television? What is the minimum frame rate required for a sense of presence ? How do we know? How can we manipulate the frame rate?

Frame Rate Answers (I) Twitch games demand at least 30 fps, but the higher the better (lower lag) Users see enemy’s motions sooner Higher frame rates make targeting easier The maximum useful frame rate is the monitor refresh rate Time taken for the monitor to draw one screen Synchronization issues Buffer swap in graphics is timed with the vertical sweep, so the ideal frame rate is the monitor refresh rate Can turn off synchronization, but get nasty artifacts on screen

Frame Rate Answers (II) NTSC television draws all the odd lines of the screen, then all the even ones (interlace format) A full screen takes 1/30th of a second Use 60 fps to improve visuals, but only half of each frame actually gets drawn by the screen Do consoles only render 1/2 screen each time? It was once argued that 10 fps was required for a sense of presence (being there) Head-mounted displays require 20 fps or higher to avoid illness Many factors influence the sense of presence Perceptual studies indicate what frame rates are acceptable

Reducing Lag Faster algorithms and hardware are the obvious answer Designers choose a frame rate and put as much into the game as they can without going below the threshold Part of design documents presented to the publisher Threshold assumes fastest hardware and all game features turned on Options given to players to reduce game features and improve their frame rate There is a resource budget: How much of the loop is dedicated to each aspect of the game (graphics, AI, sound, …) Some other techniques allow for more features and less lag

Decoupling Computation It is most important to minimize lag between the user actions and their direct consequences So the input/rendering loop must have low latency Lag between actions and other consequences may be less severe Time between input and the reaction of enemy can be greater Time to switch animations can be greater Technique: Update different parts of the game at different rates, which requires decoupling them For example, run graphics at 60fps, AI at 10fps Done in Unreal engine, for instance
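A time accumulator is one way to run two subsystems at different rates from a single loop, matching the 60 fps graphics / 10 fps AI split mentioned above. The struct and field names are illustrative:

```cpp
#include <cassert>

// Graphics updates every tick; AI updates only when enough simulated time
// has accumulated to cover its (longer) fixed period.
struct MultiRateSim {
    double aiPeriod;            // seconds between AI updates (0.1 => 10 Hz)
    double aiAccumulator = 0.0; // simulated time owed to the AI subsystem
    int    graphicsUpdates = 0;
    int    aiUpdates = 0;

    // Called once per rendered frame with that frame's duration in seconds.
    void tick(double dt) {
        ++graphicsUpdates;                  // graphics: every frame
        aiAccumulator += dt;
        while (aiAccumulator >= aiPeriod) { // AI: fixed, lower rate
            ++aiUpdates;
            aiAccumulator -= aiPeriod;
        }
    }
};
```

The `while` (rather than `if`) matters: after a long frame hitch, the AI catches up by running several fixed-period steps instead of silently skipping time.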

Animation and Sound Animation and sound decisions need not be made at high frequency, but their state must be updated at high frequency For example, switching from walk to run can happen at low frequency, but joint angles for walking must be updated at every frame The solution is to package multiple frames of animation and submit them all at once to the renderer A good idea anyway, since it makes animation independent of frame rate Sound is offloaded to the sound card
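The per-frame side of this reduces, at its core, to blending between poses every frame while the walk/run decision changes only occasionally. A single-joint sketch (real skeletons blend every joint; the function name is mine):

```cpp
#include <cassert>

// Linear blend between the same joint's angle in two animations.
// blend = 0 gives the walk pose, blend = 1 gives the run pose; ramping
// blend over a few frames produces a smooth transition between states.
float blendAngle(float walkAngle, float runAngle, float blend /*0..1*/) {
    return walkAngle + (runAngle - walkAngle) * blend;
}
```

Packaging several frames of such blended poses and submitting them together is what keeps the animation system decoupled from the render frame rate.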

Overview of Ogre3D Not a full-blown game engine. Open-source. Strong user community. Decent software engineering. Cool logo.

Features Graphics API independent 3D implementation Platform independence Material & Shader support Well-known texture formats: png, jpeg, tga, bmp, dds, dxt Mesh support: Milkshape3D, 3D Studio Max, Maya, Blender Scene features BSP, Octree plugins, hierarchical scene graph Special effects Particle systems, skyboxes, billboarding, HUD, cube mapping, bump mapping, post-processing effects Easy integration with physics libraries ODE, Tokamak, Newton, OPCODE Open source!

Core Objects

Startup Sequence ExampleApplication Go() Setup() Configure() setupResources() chooseSceneManager() createCamera() createViewport() createResourceListener() loadResources() createScene() frameStarted/Ended() createFrameListener() destroyScene()

Basic Scene Entity, SceneNode Camera, lights, shadows BSP map Integrated ODE physics Frame listeners

Terrain, sky, fog

CEGUI Window, panel, scrollbar, listbox, button, static text Media/gui/ogregui.layout
CEGUI::Window *sheet = CEGUI::WindowManager::getSingleton().loadWindowLayout((CEGUI::utf8 *)"ogregui.layout");
mGUISystem->setGUISheet(sheet);

Animation Node animation (camera, light sources) Skeletal animation
AnimationState *mAnimationState;
mAnimationState = ent->getAnimationState("Idle");
mAnimationState->setLoop(true);
mAnimationState->setEnabled(true);
mAnimationState->addTime(evt.timeSinceLastFrame);
mNode->rotate(quat);

Animation Crowd (instancing vs. single entity)
InstancedGeometry *batch = new InstancedGeometry(mCamera->getSceneManager(), "robots");
batch->addEntity(ent, Vector3::ZERO);
batch->build();
Facial animation
VertexPoseKeyFrame *manualKeyFrame;
manualKeyFrame->addPoseReference(poseIndex, influence);
manualKeyFrame->updatePoseReference(poseIndex, influence); // both take (ushort poseIndex, Real influence)

Picking

CEGUI::Point mousePos = CEGUI::MouseCursor::getSingleton().getPosition();
Ray mouseRay = mCamera->getCameraToViewportRay(mousePos.d_x / float(arg.state.width), mousePos.d_y / float(arg.state.height));
mRaySceneQuery->setRay(mouseRay);
mRaySceneQuery->setSortByDistance(false);
RaySceneQueryResult &result = mRaySceneQuery->execute();
RaySceneQueryResult::iterator mouseRayItr;
Vector3 nodePos;
for (mouseRayItr = result.begin(); mouseRayItr != result.end(); mouseRayItr++) {
    if (mouseRayItr->worldFragment) {
        nodePos = mouseRayItr->worldFragment->singleIntersection;
        break;
    }
}

Particle Effects
mSceneMgr->getRootSceneNode()->createChildSceneNode()->attachObject(mSceneMgr->createParticleSystem("sil", "Examples/sil"));
Examples/sil
{
    material Examples/Flare2
    particle_width 75
    particle_height 100
    cull_each false
    quota 1000
    billboard_type oriented_self
    // Area emitter
    emitter Point
    {
        angle 30
        emission_rate 75
        time_to_live 2
        direction
        velocity_min 250
        velocity_max 300
        colour_range_start
        colour_range_end
    }
    // Gravity
    affector LinearForce
    {
        force_vector
        force_application add
    }
    // Fader
    affector ColourFader
    {
        red -0.5
        green -0.5
        blue -0.5
    }
}

Particle Effects Particle system attributes quota material particle_width particle_height cull_each billboard_type billboard_origin billboard_rotation_type common_direction common_up_vector renderer sorted local_space point_rendering accurate_facing iteration_interval nonvisible_update_timeout Emitter attributes (point, box, cylinder, ellipsoid, hollow ellipsoid, ring) angle colour colour_range_start colour_range_end direction emission_rate position velocity velocity_min velocity_max time_to_live time_to_live_min time_to_live_max duration duration_min duration_max repeat_delay repeat_delay_min repeat_delay_max Affectors (LinearForce, ColorFader) Linear Force Affector ColourFader Affector ColourFader2 Affector Scaler Affector Rotator Affector ColourInterpolator Affector ColourImage Affector DeflectorPlane Affector DirectionRandomiser Affector

Fire and Smoke
affector Rotator
{
    rotation_range_start 0
    rotation_range_end 360
    rotation_speed_range_start -60
    rotation_speed_range_end 200
}

Cel Shading
vertex_program Ogre/CelShadingVP cg
{
    source Example_CelShading.cg
    entry_point main_vp
    …
    default_params { … }
}
material Examples/CelShading
{
    …
    vertex_program_ref Ogre/CelShadingVP {}
    fragment_program_ref Ogre/CelShadingFP {}
}

Cube Mapping With Perlin noise to distort vertices
void morningcubemap_fp(
    float3 uv : TEXCOORD0,
    out float4 colour : COLOR,
    uniform samplerCUBE tex : register(s0))
{
    colour = texCUBE(tex, uv);
    // blow out the light a bit
    colour *= 1.7;
}

Bump Mapping (image: base texture + normal map = bump-mapped result)

Reflections & Refractions
// Noise
texture_unit
{
    // Perlin noise volume
    texture waves2.dds
    // min / mag filtering, no mip filtering
    linear linear none
}
// Reflection
texture_unit
{
    // Will be filled in at runtime
    texture Reflection
    tex_address_mode clamp
    // needed by ps.1.4
    tex_coord_set 1
}
// Refraction
texture_unit
{
    // Will be filled in at runtime
    texture Refraction
    tex_address_mode clamp
    // needed by ps.1.4
    tex_coord_set 2
}

Grass Each grass section is 3 planes at 60 degrees to each other Normals point straight up to simulate correct lighting

Post-Processing Effects

Supporting tools (add-ons) Blender, Autodesk 3DS, Milkshape3D, Softimage XSI importers/exporters Particle editors, mesh viewers GUI editors Physics bindings Scene exporters and lots more Add-on site

Design Patterns First, an aside on Design Patterns

Ogre Foundations Root RenderSystem SceneManager ResourceManager

Ogre Foundations Mesh Entity Material Overlay – OverlayElement, OverlayContainer, OverlayManager Skeletal Animation