1
9.2. Other notable AI Aspects / HLSL Intro
Common board game AI approaches and Strategic AI
2
Tactical and Strategic AI
[AI model diagram: Execution Management, Strategy, Decision Making, Movement, World Interface, Animation, Physics]
A brief introduction to tactical and strategic AI.
3
Tactical and Strategic AI
Tactical and strategic AI encompasses a wide range of algorithms that try to: derive a tactical assessment of some situation (possibly using incomplete or probabilistic information), and use tactical assessments to make decisions and to coordinate the behaviour of multiple characters. Aside: not every genre of game needs tactical and/or strategic forms of AI.
4
Waypoint tactics
A waypoint is simply a position in the game world. Just as path-finding waypoints hold path-finding information (e.g. terrain cost), tactical waypoints hold tactical information, e.g.: cover points, reconnaissance/sniper locations, shadowed locations, power-up spawn points, exposed locations. [Diagram: example level annotated with tactical waypoints of each type.]
5
Waypoint tactics
Tactical locations can either be set by the designer or derived from game data or analytical algorithms. Tactical nodes can be combined with pathfinding nodes to provide tactically aware pathfinding, e.g. a measure of exposure can be mapped into the pathfinding cost (sketched below).
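As an illustrative sketch only (the names TacticalWaypoint, Exposure and exposureWeight are assumptions, not taken from the slides), tactically aware pathfinding can fold an exposure score into the edge cost used by the pathfinder:

// Minimal sketch of tactically aware path costs (all names are illustrative).
using Microsoft.Xna.Framework;

enum TacticalType { Cover, Sniper, Shadow, PowerUpSpawn, Exposed }

class TacticalWaypoint
{
    public Vector3 Position;      // position in the game world
    public TacticalType Type;     // tactical information held at this waypoint
    public float Exposure;        // 0 = fully covered, 1 = fully exposed
}

static class TacticalPathfinding
{
    // Edge cost used by the pathfinder: base distance plus an exposure penalty,
    // so routes through exposed waypoints become more 'expensive'.
    public static float EdgeCost(TacticalWaypoint from, TacticalWaypoint to, float exposureWeight)
    {
        float distance = Vector3.Distance(from.Position, to.Position);
        return distance * (1.0f + exposureWeight * to.Exposure);
    }
}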
6
Influence maps
Influence mapping is widely used in strategy games to map the influence/strength of each side. The game world is split into chunks (a tile-based representation is common). Each chunk is assigned an influence score based on the combined balance of influence 'emitted' by game objects that can affect that chunk. The influence map can be used to identify points of weakness and strength and, from this, drive strategic goal selection.
7
Influence maps
The influence exerted on a particular area can depend upon the proximity of game objects (e.g. mobile units or stationary bases), the type of surrounding terrain (e.g. a mountain range may 'block' influence from passing), side-specific factors (e.g. the current financial or happiness state), etc. In most games, the influence emitted by a game object decays over distance, e.g. using a linear drop-off up to a defined maximum influence range (sketched below). [Diagram: tile-based influence map showing per-tile influence scores.]
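As a minimal sketch of this idea (the grid representation, field names such as Strength and MaxRange, and the BuildInfluenceMap signature are assumptions for illustration, not from the slides), an influence map with a linear drop-off can be built as follows:

// Minimal sketch of an influence map with linear drop-off (names are illustrative).
using System.Collections.Generic;
using Microsoft.Xna.Framework;

class InfluenceSource
{
    public Vector2 Tile;       // tile occupied by the emitting game object
    public float Strength;     // influence at the source (positive for one side, negative for the other)
    public float MaxRange;     // influence falls to zero at this distance (in tiles)
}

static class InfluenceMapping
{
    public static float[,] BuildInfluenceMap(int width, int height, IEnumerable<InfluenceSource> sources)
    {
        var map = new float[width, height];
        foreach (InfluenceSource source in sources)
        {
            for (int x = 0; x < width; x++)
            {
                for (int y = 0; y < height; y++)
                {
                    float distance = Vector2.Distance(new Vector2(x, y), source.Tile);
                    if (distance > source.MaxRange)
                        continue;                                  // outside the influence radius
                    // Linear drop-off from full strength at the source to zero at MaxRange.
                    map[x, y] += source.Strength * (1.0f - distance / source.MaxRange);
                }
            }
        }
        return map;
    }
}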
8
Jumping
[AI model diagram: Execution Management, Strategy, Decision Making, Movement, World Interface, Animation, Physics]
An overview of approaches enabling jumping in games (a movement-level concern).
9
Jumping
Unlike other forms of steering behaviour, jumps are inherently risky (i.e. they can fail, possibly with 'fatal' consequences). Steering behaviours typically re-evaluate their decisions several times per second, correcting small mistakes; a jump, in contrast, is a one-time, single-event, fail-sensitive decision. To jump, the character must be moving at the right speed and in the right direction, and must execute the jump at the right time.
10
Jumping (jump points)
The simplest approach is to place jump points into the game level. If characters can move at different speeds, then the jump point also needs a minimum jump speed. The character can then seek towards the jump pad, matching the specified speed, and jump whenever it is on the pad (sketched below).
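As an illustrative sketch (the class and field names JumpPoint, TriggerRadius, etc. are assumptions, not from the slides), the jump decision can be reduced to a simple check performed while the character seeks the jump point:

// Minimal sketch of a jump point trigger (names are illustrative).
using Microsoft.Xna.Framework;

class JumpPoint
{
    public Vector3 Position;         // where the jump must be triggered
    public Vector3 JumpDirection;    // required (normalised) direction of travel
    public float MinimumJumpSpeed;   // required speed when crossing the pad
    public float TriggerRadius;      // how close the character must be to the pad
}

static class JumpControl
{
    // Jump once the character is on the pad, moving fast enough and in roughly the right direction.
    public static bool ShouldJump(Vector3 position, Vector3 velocity, JumpPoint jumpPoint)
    {
        bool onPad = Vector3.Distance(position, jumpPoint.Position) <= jumpPoint.TriggerRadius;
        bool fastEnough = velocity.Length() >= jumpPoint.MinimumJumpSpeed;
        bool rightDirection = Vector3.Dot(Vector3.Normalize(velocity), jumpPoint.JumpDirection) > 0.95f;
        return onPad && fastEnough && rightDirection;
    }
}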
11
Jumping (difficulties)
Some forms of jump require a defined target speed or a defined direction/angle of approach to the jump pad. Additionally, some jumps may carry a higher 'price' of failure (e.g. 'death' versus a short delay to climb back up). Aside: such information can be incorporated into the jump point, but it is difficult to test extensively. [Diagrams: jumps needing a precise jump speed or a precise jump direction.]
12
Jumping (landing pads)
A good approach is to pair a jump pad with a landing pad. Doing this permits the game character to determine the needed speed and direction by solving the trajectory equations (sketched below). This approach is more flexible (different characters can differ in their movement capabilities) and is less prone to error.
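As a minimal sketch of solving the trajectory equations for a jump pad / landing pad pair (fixing the time of flight is an assumption made here for simplicity; an alternative is to fix the character's maximum vertical jump velocity and solve for the time instead):

// Minimal sketch: launch velocity that carries a character from the jump pad to
// the landing pad in a chosen time of flight (Y is 'up', gravity is the downward
// acceleration magnitude, e.g. 9.81).
using Microsoft.Xna.Framework;

static class JumpTrajectory
{
    public static Vector3 RequiredJumpVelocity(Vector3 jumpPad, Vector3 landingPad,
                                               float gravity, float timeOfFlight)
    {
        Vector3 delta = landingPad - jumpPad;
        return new Vector3(
            delta.X / timeOfFlight,
            // Vertical: delta.Y = v.Y * t - 0.5 * gravity * t^2, solved for v.Y.
            delta.Y / timeOfFlight + 0.5f * gravity * timeOfFlight,
            delta.Z / timeOfFlight);
    }
}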
13
High Level Shader Language
Introduction to HLSL
14
A bit of history (fixed function pipelines)
Early versions of the DirectX and OpenGL APIs defined a number of fixed rendering stages. This forced all games to use the same approach with only a few parameters open to change.
15
Recent history (shaders)
As GPUs increased in capability it became possible to inject small programs (called shaders) into the pipeline, giving the application greater control. A list of vertices (points) is sent to the vertex shader. In the rasterisation stage, primitives are constructed from the output vertices and then rasterised (i.e. the on-screen pixels are determined); vertex attributes are interpolated across the pixels. The pixel shader then determines the on-screen colour of each pixel.
[Pipeline diagram: Application → Vertex Shader → Rasterisation / Interpolation → Pixel Shader → Z-buffer test → Frame buffer → To screen]
16
Shaders
Shaders are small programs that run on the GPU. Different shader languages are available.
Vertex shader: can set/change rendered vertices, e.g. for object deformation, skeletal animation, particle motion, etc.
Pixel shader: sets the colour of each pixel, e.g. for per-pixel lighting and texturing. It can also be used to apply effects over an entire scene, e.g. bloom, depth-of-field blur, etc.
Aside: DirectX 10 also supports geometry shaders (not supported by XNA).
17
HLSL (High Level Shader Language)
To do: decide if you want to explore this.
HLSL is a shading language developed by Microsoft for the Direct3D API. HLSL offers a number of functions (mostly centred around branching control, maths functions and texture access). Aside: see the HLSL reference documentation for complete details.
18
HLSL (Data types)
HLSL supports the following scalar data types:
bool – true or false
int – 32-bit signed integer
half – 16-bit floating point
float – 32-bit floating point
double – 64-bit floating point
Note: vector/matrix forms can also be defined, e.g. float3, int2x2, double4x4, etc.
HLSL also provides a sampler type (used to read, i.e. sample, textures): sampler, sampler1D, sampler2D and sampler3D. The sampler type is defined using a number of different states, e.g. MinFilter, MagFilter and MipFilter controlling texture filtering, and AddressU, AddressV and AddressW controlling addressing modes:

texture textureName;
sampler2D textureSampler = sampler_state
{
    Texture = <textureName>;
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Wrap;
    AddressV = Wrap;
    AddressW = Wrap;
};
19
HLSL (Semantics)
Semantics are used to map input and output data to variables. All varying input data (from the application or between rendering stages) requires a semantic tag, e.g. all outputs from the vertex shader must be semantically tagged:

float4 vertexPosition : POSITION0;

Common semantics include:
Position[n] – vertex position in object space
Color[n] – colour (e.g. diffuse)
Normal[n] – normal vector
Tangent[n] – tangent vector
Binormal[n] – binormal vector
Texcoord[n] – texture coordinate

Note: [n] is an optional integer that provides support for multiple items of the same type, e.g. Texcoord0, Texcoord1, Texcoord2.
Aside: the only valid semantic inputs to the pixel shader are Color[n] and Texcoord[n]. Often, custom data (i.e. data not used for texture addressing) is passed using a Texcoord[n] semantic.
20
HLSL (Functions)
HLSL permits C-like functions to be specified. A shader must define at least one vertex function (which processes vertex information) and at least one pixel function (which determines pixel colours). These functions must define their inputs and outputs with semantics.
Intrinsic functions: HLSL offers a set of 'built-in' functions, mostly centred around flow control, maths operations and texture access.

float2 CalculateParallaxOffset( float3 view, float2 texCoord )
{
    view = normalize(view);
    float height = parallaxScale * tex2D(HeightSampler, texCoord).r + parallaxOffset;
    float2 viewOffset = view.xy * height;
    return viewOffset;
}
21
HLSL (Example)
The following is a simple example shader:

// Define world-view-projection matrix
float4x4 wvpMatrix : WorldViewProjection;

// Define input structure expected by the vertex shader
struct vertexShaderInput
{
    float4 position : POSITION0;
};

// Define input structure expected by the pixel shader
// (and also output by the vertex shader)
struct pixelShaderInput
{
    float4 screenPosition : POSITION;
    float3 colour : COLOR0;
};

// Vertex shader function
pixelShaderInput SimpleVS(vertexShaderInput input)
{
    pixelShaderInput output;
    // Transform from model space to screen space
    output.screenPosition = mul(input.position, wvpMatrix);
    output.colour = float3(1.0f, 1.0f, 1.0f);
    return output;
}

// Pixel shader function – output pixel colour
float4 SimplePS(pixelShaderInput input) : COLOR0
{
    return float4(input.colour.rgb, 1.0f);
}

// Technique definition – specifying the VS and PS functions and compile targets
technique SimpleShader
{
    pass
    {
        VertexShader = compile vs_1_1 SimpleVS();
        PixelShader = compile ps_1_1 SimplePS();
    }
}
22
Effects in XNA
Effects in XNA are a type of game asset (alongside textures and models). The Effect class represents an effect, permitting effect parameter configuration, technique selection and rendering. An effect can be loaded and configured as shown:

// Load the effect
Effect effect;
effect = content.Load<Effect>("effectName");

// Select the desired technique
effect.CurrentTechnique = effect.Techniques["techniqueName"];

// Define effect parameters
effect.Parameters["colour"].SetValue(Vector3.One);
effect.Parameters["tolerance"].SetValue(0.8f);

// Begin the effect and iterate over each pass
effect.Begin();
foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Begin();
    // Send vertex information to the effect, e.g.
    // graphicsDevice.DrawUserIndexedPrimitives<VertexPositionTexture>(PrimitiveType.TriangleList, ... );
    pass.End();
}
// End the effect
effect.End();

Aside: for better performance, effect.Parameters["name"] can be stored as an EffectParameter object when the effect is loaded, and SetValue(...) called on that object (sketched below). Vertex information can also be sent using other approaches.
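As a brief sketch of the performance aside above (the field name colourParameter and the parameter name "colour" are illustrative assumptions):

// Cache the parameter once, e.g. when the effect is loaded, avoiding a string
// lookup every frame.
EffectParameter colourParameter = effect.Parameters["colour"];

// Then, each frame, set the value directly on the cached parameter.
colourParameter.SetValue(Vector3.One);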
23
Summary
Today we explored:
A brief introduction to some types of strategic/tactical AI
An overview of how jumps can be supported within games
HLSL and effect usage in XNA
To do:
Complete the Question Clinic.
Consider if the material is of use within your game.
Complete the section in the project document on the alpha hand-in.
Round up work for the alpha hand-in at the end of this week.