Game Development Tools, Techniques, and Tips
Getting Started A game must be planned in the same way as any other software project. Typically, this means someone must: Formulate a concept for the game. Flesh out this concept into requirements. Architect a design to meet the requirements. Implement software that embodies this design. Test and refine the software into a working game.
Getting Started: What To Do, How To Do It Game Logic Tools Content Creation and Management Tools Game Logic Game Content and Assets Game Engine Programming and Support SDKs
Getting Started: What To Do, How To Do It Programming or scripting? If starting from scratch, the game will likely need to be programmed using a traditional programming language. If using an engine, you may be able to program with a traditional language, or you may have to use scripting instead. Depending on the engine! While script writing is very much like programming, it is not done using traditional languages, which might not be what is wanted for a programming course.
Getting Started: What To Do, How To Do It 2D or 3D? 2D game development is typically easier, both in terms of programming or scripting and in terms of content and asset management. 3D game development, conversely, tends to be more challenging, but more interesting. Depending on your language and tool of choice, however, you might be forced into one of these options or the other.
Getting Started: What To Do, How To Do It Screen shots from two games developed at Western, The Misadventures of State-Man (2D, left) and Smashocosm Tournament (3D, right). Both games received perfect grades, one for gameplay and the other for technical elements.
Development Stages Develop original concept Shop to publishers Create schedule (12-24 months) Deliver work as milestones (work products or completed activity) React to customer evaluation
Game Development is Unique Must be willing to rip out features that don’t work Designers may create things customer never heard of before May require more research and experimentation than other software development Often more ideas than time to implement
Development Team Size In the 1980’s might be a single developer Most teams today have 10-60 people Programming is now a smaller part of the complete project than before (need good software engineering and media design work)
Example 1988 3 programmers 1 part-time artist 1 tester
Example 1995 6 programmers 1 artist 2 level designers 1 sound designer Contract musicians
Example 2002 2 producers 4 programmers 2 game designers 1 2D and texture artist 3 level designers 1 audio designer 4 animators QA lead and testers
Development Milestones: Development Timeline Here are some example development periods for different platforms: 4-6 months for a high-end mobile game 18-24 months for an original console game 10-14 months for a license / port 16-36 months for an original PC Game
Budgeting Personnel costs Developer/Contractor payments Salary x time x involvement % Developer/Contractor payments Equipment & software Supplies Travel & meals Shipments
Kicking Off Tasks - Audio Sound list Music specification Story text Voice-over script Creation of sounds Creation or licensing of music Recording of voice-overs
Quality Assurance Test plan The QA database QA – the view from inside The QA-producer relationship Post mortem
What are Game Objects? Anything that has a representation in the game world Characters, props, vehicles, missiles, cameras, trigger volumes, lights, etc. Need for a standard ontology Clarity Uniformity Feature, staff and tool mobility Code reuse Maintenance E.g. use of modularity/inheritance reduces duplication As a rule of thumb, if it has a position it can be a game object, although not all game objects have a position. Most games try to devise a universal representation for their objects.
Components vs Hierarchies (Diagram: a GameObject aggregating Prop, Attribute<T> and Behaviour components.) Shift of ontology from a semantic network (nested sets of functionality) to a flat aggregation of functionality.
Hulk:UD Object Model Let’s look at an example. This is the object model from the Hulk: Ultimate Destruction game, which predates the game object model. The two main classes are the character class and the prop class. The character class had to cater for a number of types of characters: Hulk, soldiers, Hulkbusters, boss characters etc. It was therefore a superset of the functionality required by all of them; e.g. it contained the AI and cloud code, which wasn’t required by Hulk. As another example, a character could hold, or be held by, another character, a prop or a civilian; these 3 cases required different code. Consequently, the file CCharacter.cpp had 11,000 lines of code and the memory footprint of a character object was over 20k. And that was on last-gen hardware. Props were slightly lighter and had different functionality, although they replicated a lot of the code and data. When it came to implementing helicopters, it proved hard to bend either the character class or the prop class to fit the requirements, so a hybrid was made by creating an instance of each and linking one to the other: the so-called character props. When it came to fleshing out the ambient system, the characters were deemed too expensive to use for pedestrians and cows. So the pedestrians were written from the ground up, and cows were implemented as animated props. Both had their own state machines and AI.
Prototype Game Objects
Alex: PhysicsBehaviour, TouchBehaviour, CharacterIntentionBehaviour, MotionTreeBehaviour, CollisionActionBehaviour, PuppetBehaviour, CharacterMotionBehaviour, MotionStateBehaviour, RagdollBehaviour, CharacterSolverBehaviour, HealthBehaviour, RenderBehaviour, SensesInfoBehaviour, HitReactionBehaviour, GrabSelectionBehaviour, GrabbableBehaviour, TargetableBehaviour, AudioEmitterBehaviour, FightVariablesBehaviour, ThreatReceiverBehaviour
Helicopter: PhysicsBehaviour, TouchBehaviour, CharacterIntentionBehaviour, MotionTreeBehaviour, CollisionActionBehaviour, PuppetBehaviour, CharacterSolverBehaviour, HealthBehaviour, RenderBehaviour, HitReactionBehaviour, GrabbableBehaviour, GrabBehavior, TargetableBehaviour, AudioEmitterBehaviour, FightVariablesBehaviour, EmotionalStateBehaviour, ThreatReceiverBehaviour, FEDisplayBehaviour
Pedestrian (HLOD): PhysicsBehaviour, CharacterIntentionBehaviour, MotionTreeBehaviour, PuppetBehaviour, HealthBehaviour, RenderBehaviour, GrabbableBehaviour, GrabBehaviour, TargetableBehaviour, AudioEmitterBehaviour, EmotionalStateBehaviour, FEDisplayBehaviour, CharacterPedBehaviour
Pedestrian (LLOD): SensesInfoBehaviour, TargetableBehaviour, PedBehaviour
Now let’s compare it to the object model in the Prototype game. Here, all entities are instances of the same class. However, every instance is tailor-made for its specific role: it has only the behaviours it needs. The main character doesn’t have any AI or cloud data. The helicopter doesn’t have the character motion, ragdoll or grabbing code. HLOD pedestrians are similar to Alex, but have fewer behaviours or use lighter versions of them. LLOD pedestrians have only three behaviours! As you can see, instead of a rigid hierarchy of classes, onto which every new type of entity has to be more or less forcefully grafted, we have a flexible pick’n’mix approach, where every object can be made into exactly what it needs to be, without incurring much overhead. Ground spike example.
An attribute or not an attribute? Attribute if accessed by more than one behaviour, or accessed by external code Otherwise a private member of the behaviour If not sure, make it an attribute … If you find yourself writing a lot of “Get” messages, consider using an attribute instead. If you are worried that something will corrupt it when it’s a public attribute, remember that you will always get a message if that happens, in which case you can either deal with it, or assert. Going forward we want to shift most if not all of data to attributes and leave behaviours stateless.
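The shared, typed attribute described above can be sketched as a small keyed store; the key type and storage scheme here are illustrative assumptions, not the actual engine implementation:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <unordered_map>

// Base class so attributes of different types can live in one container.
struct AttributeBase { virtual ~AttributeBase() = default; };

// A typed attribute: shared data that several behaviours (or external
// code) can read and write.
template <typename T>
struct Attribute : AttributeBase {
    T value{};
};

class AttributeStore {
public:
    // Return the attribute for 'key', creating it on first access.
    template <typename T>
    Attribute<T>* GetOrCreate(const std::string& key) {
        auto it = attrs.find(key);
        if (it == attrs.end())
            it = attrs.emplace(key, std::make_unique<Attribute<T>>()).first;
        // dynamic_cast guards against two behaviours disagreeing on the type.
        return dynamic_cast<Attribute<T>*>(it->second.get());
    }

private:
    std::unordered_map<std::string, std::unique_ptr<AttributeBase>> attrs;
};
```

Because every reader looks the attribute up by key, two behaviours touching "health" automatically share the same data, which is exactly why "a lot of Get messages" signals that a private member should become an attribute.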
Game Object Update
GameObject::OnUpdate(pass, delta):
    for b in behaviours:
        b.OnUpdate(pass, delta)
OnUpdate() and OnMessage() are the only two entry points to a behaviour. Finally, to complete the picture, every game object and its behaviours receive update passes. This is where regular processing is done.
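The dispatch loop above can be sketched in C++; the class and method names follow the slides, while the Message struct and the CounterBehaviour are illustrative assumptions:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Placeholder message type; real engines carry typed payloads.
struct Message { std::string type; };

// A behaviour has exactly two entry points: OnUpdate and OnMessage.
class Behaviour {
public:
    virtual ~Behaviour() = default;
    virtual void OnUpdate(int pass, float delta) = 0;
    virtual void OnMessage(const Message&) {}
};

// A game object is a flat aggregation of behaviours.
class GameObject {
public:
    void AddBehaviour(std::unique_ptr<Behaviour> b) {
        behaviours.push_back(std::move(b));
    }
    // Forward the update pass to every attached behaviour.
    void OnUpdate(int pass, float delta) {
        for (auto& b : behaviours) b->OnUpdate(pass, delta);
    }
    // Broadcast a message to every attached behaviour.
    void OnMessage(const Message& m) {
        for (auto& b : behaviours) b->OnMessage(m);
    }

private:
    std::vector<std::unique_ptr<Behaviour>> behaviours;
};

// Trivial example behaviour that counts the updates it receives.
class CounterBehaviour : public Behaviour {
public:
    int updates = 0;
    void OnUpdate(int, float) override { ++updates; }
};
```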
HealthBehaviour Example
void HealthBehaviour::OnMessage(Message* m)
{
    switch (m->type)
    {
    case APPLY_DAMAGE:
    {
        Attribute<float>* healthAttr = GetAttribute(HEALTH_KEY);
        healthAttr->value -= m->damage;
        if (healthAttr->value < 0.f)
            mGameObject->SetLogicState(DEAD);
        break;
    }
    case ATTR_UPDATED:
        if (m->key == HEALTH_KEY)
        {
            // ... (reaction elided on the slide)
        }
        break;
    }
}
There is no OnUpdate()
Interface creating the connection
Key Questions ■ How do game interfaces relate to player-centered design? ■ What are the components of game interfaces? ■ What is the difference between a manual and visual interface? ■ What is the difference between a passive and an active interface? ■ Why is usability important in game interface design?
Player-Centered Design What’s wrong with this picture?
Interface & Game Features Gameplay Story Character Audio World Platform Genre
Interface Types Manual (Physical) Bass Fishing Samba di Amigo Dance Dance Revolution
Interface Types Visual Active Puzzle Pirates (radial/pie menu) I Was an Atomic Mutant (main menu)
Interface Types Visual Passive True Crime: Streets of LA (HUD = heads-up display)
Interface Types Visual Styles Split-Screen (Adventures of Fatman) Whole Screen (Myst III: Exile) Invisible (Black & White: Creature Isle)
Visual Interface Components Score Cyclone Super Collapse
Visual Interface Components Lives & Power Super Mario Sunshine (Lives) Tour de France (Power)
Visual Interface Components Map Age of Empires II: The Age of Kings Age of Mythology: The Titans
Visual Interface Components Character Character creation & management and character inventory interfaces from The Temple of Elemental Evil
Visual Interface Components Start Screen Crazy Bunker
Saving the Game Quick-save Auto-save Save to slot (or file) “Save-game” debate
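A save-to-slot scheme can be as simple as one file per slot; here is a rough sketch, where the file naming, the saved fields, and the raw binary layout are all illustrative assumptions (a real game would version its saves and serialize each field explicitly for portability):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hypothetical saved state; real games save far more than this.
struct SaveState {
    int level = 0;
    float health = 0.f;
};

// Each slot maps to its own file (assumed naming convention).
static std::string SlotPath(int slot) {
    return "save_slot_" + std::to_string(slot) + ".dat";
}

bool SaveToSlot(int slot, const SaveState& s) {
    std::FILE* f = std::fopen(SlotPath(slot).c_str(), "wb");
    if (!f) return false;
    // Raw struct dump: simple, but not portable across platforms/versions.
    bool ok = std::fwrite(&s, sizeof s, 1, f) == 1;
    std::fclose(f);
    return ok;
}

bool LoadFromSlot(int slot, SaveState* s) {
    std::FILE* f = std::fopen(SlotPath(slot).c_str(), "rb");
    if (!f) return false;
    bool ok = std::fread(s, sizeof *s, 1, f) == 1;
    std::fclose(f);
    return ok;
}
```

Quick-save and auto-save then become nothing more than writes to reserved slot numbers on a hotkey or a checkpoint trigger.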
Guidelines for a Great Interface Be consistent Enable shortcuts Provide feedback Offer defined tasks Permit easy reversal of actions Allow for player control Keep it simple Make it customizable Include a context-sensitive pointer Implement different modes Use established conventions
Define A Color Scheme Color is a very important part of an interface. What color is your game? This is a good question to answer early. Anyone who looks at your interface should be able to see at a glance the color scheme of the entire game.
3D Solutions & Challenges 3D interfaces can be very compelling and the idea of creating a 3D interface seems really cool. 3D interfaces can also be very expensive and time-consuming. Making big changes to a 3D interface can be more difficult than making changes to a 2D interface.
Creating GUI Elements Generally, we provide the player with a Graphical User Interface (GUI) to interact with the program. The menus we use at the start-up of a game to begin play, change the setup, or quit – these screens are all examples of GUIs.
Know Your User Can we make any generalization about gamers? Technical level? Gender? Other?
Know Your User's Tasks Tasks will vary per game For example, what are the tasks: in a puzzle game? in a RTS? in an MMO? Multi-player games are interesting, as they combine aspects of instant messaging with other gameplay aspects Communication is often a necessary task
Game Audio Prototyping
Game Audio Overview Like film audio, game audio comprises speech, sound effects and music However, unlike film audio, game audio is interactive and must respond to changes in gameplay
Old-Gen Game Audio Workflow Conventional methods before the current generation of game consoles (pre-2007) required recompilation and game-coder involvement to hear results, which is typically slow: Create 30 mins Compile 5-30 mins Run Game 3 mins Locate 2-5 mins Test 5 mins
Next-Gen Audio Prototyping Allow sound designers to create their procedural sound designs in real-time while the game is running Middleware solutions such as Wwise allow real-time tweaking of parameters, but only for existing patches and basic synthesis unless plugins are used Use of Kismet within Unreal allows for some scripting
Process What do you need to do to get your audio into the game? Over-prepare for the worst What tools and resources are you going to use and what software will support your creative design? Work well with your coder & producer A good process can be found by making all the right mistakes once and taking risks - exciting! Audio is always last, so be prepared for feature drop, no money, no time & no love :)
Definitions Sound: compression waves transmitted through a medium Wave: cycle of compression and rarefaction Amplitude: measure of sound wave pressure Coming soon: cool stuff
Common Frequency Ranges Human Hearing: 20Hz – 20kHz Human Voice: 90 – 300 Hz Middle C: 261 Hz
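These pitches follow the standard equal-temperament formula f = 440 · 2^((n − 69) / 12), where n is the MIDI note number and note 69 is A440; a small sketch (standard concert tuning assumed, not anything specific to the slides):

```cpp
#include <cassert>
#include <cmath>

// Equal-tempered pitch: MIDI note 69 = A440; each semitone is a
// factor of 2^(1/12). Middle C is MIDI note 60, landing near the
// 261 Hz quoted above.
double MidiToHz(int note) {
    return 440.0 * std::pow(2.0, (note - 69) / 12.0);
}
```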
Digital Sound A “unit” of digital sound is called a sample The sampling rate is the number of samples taken per second Bit depth: size of sound sample data
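Together, sampling rate and bit depth fix the memory cost of raw audio: bytes = rate × channels × (bit depth ÷ 8) × seconds. A quick sketch of the arithmetic:

```cpp
#include <cassert>
#include <cstdint>

// Bytes needed for an uncompressed clip:
// samples/second * channels * bytes/sample * duration.
std::uint64_t ClipBytes(std::uint64_t sampleRate, unsigned channels,
                        unsigned bitDepth, double seconds) {
    return static_cast<std::uint64_t>(
        sampleRate * channels * (bitDepth / 8) * seconds);
}
```

One second of CD-quality stereo (44,100 Hz, 16-bit) already costs about 172 KB, which is why raw audio budgets matter on consoles.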
When you’re Project Lead: Create an audio asset list “How many sounds can this make?” “How often will we hear this?” “How much music can/should we use?” Excel spreadsheets are commonly used
Sound Design The interactive audio component is critical Sound when event occurs (gunshot when trigger pulled, dialog when character spoken to, …) Well done, it sounds great; poorly done, it can ruin everything Need to avoid repetition One footstep sound for 20+ hours of play is annoying Need 6-20 variations (depending upon budget) Dynamics can help (pitch, volume, stereo…) Mix pre-existing sounds with your own sounds Provides a “custom” identity for the game
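The pitch/volume dynamics mentioned above are often implemented by picking one of N recordings and applying small random jitter per play; the ranges and struct below are illustrative, not from any particular engine:

```cpp
#include <cassert>
#include <random>

// Parameters for one footstep playback.
struct FootstepParams {
    int sampleIndex;  // which of the N recordings to play
    float pitch;      // playback-rate multiplier
    float volume;     // linear gain
};

// Pick a random recording and jitter pitch/volume slightly so
// repeated footsteps never sound identical.
FootstepParams RandomFootstep(std::mt19937& rng, int numSamples) {
    std::uniform_int_distribution<int> pick(0, numSamples - 1);
    std::uniform_real_distribution<float> pitchJitter(0.95f, 1.05f);
    std::uniform_real_distribution<float> volJitter(0.8f, 1.0f);
    return {pick(rng), pitchJitter(rng), volJitter(rng)};
}
```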
Open Sound Control (OSC) OSC is an open communication protocol that allows messages to pass between the game and PD (Pure Data) OSC support is included in Pd-extended and just requires the addition of several C++ modules to the game engine Gives access to game events while the game is running
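For illustration, here is roughly what a single-float OSC message looks like on the wire (OSC 1.0 format: null-padded strings aligned to 4 bytes, followed by a big-endian float32). The address and value are made up, and a real project would normally send this over UDP via an OSC library rather than packing bytes by hand:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append an OSC-string: the characters, a null terminator, then
// padding so the buffer stays aligned to a 4-byte boundary.
static void PushPaddedString(std::vector<std::uint8_t>& out,
                             const std::string& s) {
    for (char c : s) out.push_back(static_cast<std::uint8_t>(c));
    do { out.push_back(0); } while (out.size() % 4 != 0);
}

// Encode an OSC message with one float argument:
// address | type tags ",f" | big-endian float32.
std::vector<std::uint8_t> EncodeOscFloat(const std::string& address,
                                         float value) {
    std::vector<std::uint8_t> out;
    PushPaddedString(out, address);
    PushPaddedString(out, ",f");  // one float argument
    std::uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    for (int shift = 24; shift >= 0; shift -= 8)  // big-endian order
        out.push_back(static_cast<std::uint8_t>(bits >> shift));
    return out;
}
```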
Audiomulch Independent audio processing tool that is good for granulation Doesn't support OSC but can receive MIDI control information from PD server Easy to use and quick to prototype real-time adaptive musical ideas
Audio Team Briefly, a look at some roles Book has details Production is both science (tech) and art Three teams: Music Team Sound Design Team Dialog Team
Music Team Composer Write custom music (writing, recording, mixing) Contracted per-project basis With larger budgets, 1 person will have assistants
Music Team Recording Engineer Enables production of sound through mechanical means Gets best sounds out of each component Often work out of home May often be a sound designer (coming next)
Sound Design Team Audio Director/Manager Sound Designer Manage sound design teams Keep track of resources and schedules Execute vision of game producer on sound and dialog Sound Designer Bring life-like (and beyond life-like) sound to the game Critical member, as audio has gained more capability and more importance
Sound Design Team Implementer Work with production tools to attach sounds to events, characters, etc. “Level designers” of the audio department Not too common (may often be “just” a programmer with no audio training), but increasingly more common
Dialog Team Voice Actors Dialog Editor Provide voice for characters, animations, cut-scenes Unionized (better but expensive) or non-unionized (cheaper, but often less experienced) Dialog Editor Organize files created by voice actors Master files, check for errors and submit assets to audio director Often tedious, but critical
Spatialized Audio Making audio provide physical location cues Mono – one channel, no chance for spatialization Stereo – two channels, left and right, like the ears work Different volumes create the illusion of sounds in space Gradual changes give the illusion of “moving” Surround sound – 5.1 – 5 main channels, 1 subwoofer Usually, dialog center, music left and right, and spatialized sound effects behind The environment can often affect sound Bounce off walls, objects – is the door open and the sound in the next room? Material matters (wood, metal, plastic) Climate matters (temperature, humidity) Getting better (Creative Labs’ Environmental Audio eXtensions, EAX)
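The "different volumes create the illusion of sounds in space" idea is commonly implemented as constant-power panning, where the left/right gains trace a quarter circle so total power stays constant as the source moves; a small sketch (the pan-to-angle mapping below is the usual convention, assumed here):

```cpp
#include <cassert>
#include <cmath>
#include <utility>

// Constant-power stereo pan. pan in [-1 (hard left), +1 (hard right)]
// maps to an angle in [0, pi/2]; cos/sin gains keep L^2 + R^2 == 1,
// so the source moves without getting louder or quieter.
std::pair<float, float> PanGains(float pan) {
    const float kPi = 3.14159265f;
    const float theta = (pan + 1.0f) * 0.25f * kPi;
    return {std::cos(theta), std::sin(theta)};
}
```

Feeding a mono sample through these two gains, and ramping the pan value gradually, gives exactly the "moving" illusion described above.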
MP3 – ‘MP3’ is an abbreviation of MPEG-1 Audio Layer 3 ‘MPEG’ is an abbreviation of ‘Moving Picture Experts Group’ 1990: video at about 1.5 Mbit/s (1x CD-ROM) Audio at about 64–192 kbit/s per channel MP3 differs in that it does not try to accurately reproduce the PCM waveform Instead, it uses the theory of ‘perceptual coding’ PCM attempts to capture a waveform ‘as it is’ MP3 attempts to capture it ‘as it sounds’
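The figures above imply a large size gap between raw PCM and MP3; a back-of-the-envelope sketch (CD-quality stereo PCM vs. a 128 kbit/s total MP3 stream, values assumed purely for illustration):

```cpp
#include <cassert>

// Raw PCM size: samples/s * channels * bytes/sample * seconds.
constexpr long long PcmBytes(long long rate, int channels, int bits,
                             int seconds) {
    return rate * channels * (bits / 8) * seconds;
}

// Constant-bitrate stream size: kbit/s -> bytes/s * seconds.
constexpr long long Mp3Bytes(long long kbitPerSec, int seconds) {
    return kbitPerSec * 1000 / 8 * seconds;
}
```

For one minute of audio this works out to roughly an 11:1 reduction, which is what perceptual coding buys at the cost of no longer reproducing the waveform exactly.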