How Navigation Meshes Work in 3D Games | AI 101

    Hi, I'm Tommy Thompson and this is AI and Games, a series on research and applications of artificial
    intelligence in video games. This video is the beginning of a new series called AI 101, where
    I'm going to be looking at many of the fundamental AI tools and techniques used in game development.
    This is aimed at helping aspiring game developers understand the basic theory of these methods,
    where they become useful in making your own games and how they're applied in some of the
    biggest titles in the industry.
    In this first video, I want to talk about navigation. One of the most critical and often
    unspoken challenges found in modern games is ensuring that non-player characters can
    move around an environment. This means walking, running, climbing, hiding behind cover, swimming
    and the like. Navigation seldom receives the attention it deserves when talking about how
    AI works in a specific game - a point even this series is guilty of. Despite this, faulty
    navigation and character movement are arguably one of the easiest ways to break the immersion
    of a game.
    So let's take a look at how navigation works in 3D games using a tool referred to as a
    navigation mesh: what it is, how it works and how some games address very specific issues
    that their core design creates for them.
    A navigation mesh or navmesh is a data structure used to represent the accessibility of a three-dimensional
    surface or space by encoding it as a series of polygons. If you watch an AI
    character move across a surface in a given game, odds are it's using a navigation mesh
    to do so. A navigation mesh stores important data about that region of space for an AI
    character to move across it. This includes:
    - What parts of the surface are accessible, given there might be obstacles or the structure
    of the space fractures it into even more polygons.
    - How individual surfaces link to one another. This is useful for environment artists, given
    they will build up a game world from smaller pieces rather than one big chunk of land.
    - A type or region for that surface. Regions can be used to identify how expensive it is to
    traverse a surface by providing a cost value, and AI agents can be told which types they can
    navigate across (a minimal sketch of this data follows below).
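    To make that list concrete, here's a minimal Python sketch of the kind of per-polygon data a navmesh might carry. The names (NavPolygon, area_type, cost, neighbours) are my own illustrative assumptions, not the layout of any particular engine.

```python
from dataclasses import dataclass, field

@dataclass
class NavPolygon:
    # Illustrative sketch of navmesh data; field names are assumptions, not an engine's API.
    id: int
    vertices: list                 # (x, y, z) corners of this convex polygon
    area_type: str = "walkable"    # e.g. "walkable", "water", "road", "grass"
    cost: float = 1.0              # multiplier applied when pathing across this area
    neighbours: list = field(default_factory=list)  # ids of polygons sharing an edge

@dataclass
class NavMesh:
    polygons: dict                 # polygon id -> NavPolygon
    links: list = field(default_factory=list)       # off-mesh connections between surfaces
```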
    Modelling the accessibility of the space is critical to achieving any sort of intelligent movement:
    we want to ensure our characters know whether they can move around the game space, in
    a manner that is cheap and easy to calculate. Meanwhile, the type of an area is important
    in crafting an AI that looks like it's making intelligent decisions about how it uses or
    ignores part of the virtual space. This is important in open world games such as Assassin's
    Creed and Far Cry, where NPCs will actively avoid moving in water and instead attack from
    the shore or move around if possible. Meanwhile, using costs to make regions more expensive
    than others can be useful for having characters prioritise surfaces such that their behaviour
    is in keeping with expectations. A great example of this is civilians in Grand Theft Auto,
    who we expect to walk along sidewalks and use signal crossings rather than just wander
    about on patches of grass and cross the road wherever they like. Although if Rockstar ever
    decide to make a GTA in Glasgow they can probably throw out those restrictions.
    Lastly, links help us manage where and how characters can cross gaps in the navmesh.
    Sure, sometimes you might want to ensure an AI can't follow you to a specific location,
    such as the inmates in Outlast who can't follow you after you run through tight gaps. However,
    you typically want AI to move between navmesh areas at specific points, such as soldiers
    climbing up and down ladders in F.E.A.R. or demons leaping to higher ground in DOOM. Links
    dictate where and how this can happen and are often heavily tied to an animation that
    should be played whilst the link is traversed: a point that I'll return to in a future video.
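    As a rough illustration of what such a link record might hold, the sketch below (again with assumed, hypothetical field names) pairs the connection with an animation to play and the agent types allowed to use it.

```python
from dataclasses import dataclass

@dataclass
class NavLink:
    # Illustrative off-mesh link; field names are assumptions, not an engine's API.
    start_poly: int            # polygon the agent leaves from
    end_poly: int              # polygon the agent arrives at
    bidirectional: bool        # a ladder usually is, a drop from a ledge is not
    animation: str             # e.g. "climb_ladder" or "leap", played while traversing
    allowed_agents: frozenset  # agent types that may use it, e.g. frozenset({"soldier"})

def can_traverse(link: NavLink, agent_type: str, from_poly: int) -> bool:
    """An agent may take the link if its type is allowed and it is at a valid end."""
    if agent_type not in link.allowed_agents:
        return False
    return from_poly == link.start_poly or (link.bidirectional and from_poly == link.end_poly)
```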
    As mentioned earlier, navigation is often the first point of failure in a game's AI and
    it's readily apparent when it's not working. Sources close to the development of DOOM 2016
    revealed to me that the SnapMap system - a pretty cool map and mode creation tool - was
    consistently broken during development, largely due to navmesh integration issues. Plus, bugs
    in behaviour in a lot of open world games, such as Watch Dogs and Assassin's Creed, have often
    stemmed from navigation issues or failing to address how to compensate for limited movement
    capabilities.
    However, before I get into the fun ways navigation is utilised in game design, let's sidestep
    into how navmeshes are typically adopted in gaming and where they come from.
    The history of navigation meshes stems back around 20 years, as in-house 3D engines arose
    in the late 1990s. The actual tech and theory behind it stems from robotics research of
    the 1980s, when researchers were faced with the problem of getting a robot to move across
    a real-world space after being given accurate map data. Some of that early tech, such
    as meadow mapping or Voronoi tessellation - which take a space and break it up into polygons
    like so - is the cornerstone of how navmeshes are built today. Once a space like this is
    broken up, an AI can then use a search algorithm to plot a path through the space and optimise
    it to become more natural.
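    As an illustration of that last step, here's a hedged Python sketch of an A*-style search over polygon centres, with each area's cost scaling the distance travelled. The layout of the 'navmesh' dictionary and the straight-line heuristic are assumptions for the example; real systems search across polygon edges and then smooth ('string pull') the resulting path.

```python
import heapq
import math

def find_path(navmesh, start_id, goal_id):
    """Illustrative A*-style search over polygon centres.
    'navmesh' is an assumed layout: polygon id -> (centre, area_cost, neighbour ids)."""
    goal_centre = navmesh[goal_id][0]
    open_set = [(0.0, start_id, [start_id])]
    best_g = {start_id: 0.0}
    while open_set:
        _, current, path = heapq.heappop(open_set)
        if current == goal_id:
            return path  # sequence of polygon ids; smoothing ("string pulling") would follow
        centre, cost, neighbours = navmesh[current]
        for n in neighbours:
            n_centre, n_cost, _ = navmesh[n]
            # Edge cost: distance between centres scaled by the dearer of the two area costs
            g = best_g[current] + math.dist(centre, n_centre) * max(cost, n_cost)
            if g < best_g.get(n, float("inf")):
                best_g[n] = g
                f = g + math.dist(n_centre, goal_centre)  # straight-line heuristic
                heapq.heappush(open_set, (f, n, path + [n]))
    return None  # the goal polygon is unreachable from the start

# Example: four polygons, where polygon 2 is "water" with a higher traversal cost
mesh = {
    0: ((0, 0), 1.0, [1, 2]),
    1: ((1, 1), 1.0, [0, 3]),
    2: ((1, -1), 5.0, [0, 3]),
    3: ((2, 0), 1.0, [1, 2]),
}
print(find_path(mesh, 0, 3))  # prefers the land route: [0, 1, 3]
```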
    One of the largest moves forward in defining navigation tools for games can be found in
    1999's Quake III Arena, which uses a navigation system called the 'Area Awareness System'.
    The AAS scans the game map to create convex hulls that define walkable regions, simulates
    movement to identify how and whether agents can move between them, and allows custom areas
    and navmesh links to be established.
    This principle has largely held consistent, with the likes of Mikko Mononen's open source
    projects Recast and Detour approaching the problem in a similar manner. This consistency, plus
    how critical navigation is to even the most basic behaviours in 3D video games, means
    it's one of the few AI technologies commonly provided in commercial game engines such as
    Unreal Engine and Unity. Each engine has its own way of allowing users to define
    and customise navmeshes, but the core logic is largely consistent (with Recast being the
    basis of the navmesh system in Unreal Engine 4).
    Navigation meshes are typically built or baked in advance in-engine during development and
    are stored as data as part of the level files, with designers and programmers building them
    in their engine of choice. However, this presents a problem: whether an area is accessible
    might not hold true once gameplay is underway. Even something as simple as another non-player
    character moving through the space can violate the integrity of the navigation mesh, though
    this is often handled with tools that identify them as moving obstacles to steer around.
    What's worse is when the topology of the map is fundamentally altered as parts of the level
    are changed, disabled or outright removed. As stated, navmeshes are baked in advance because
    building them is expensive and could impact performance. Nowadays most game engines do allow
    for re-baking of a navigation mesh at runtime, but this is still a time-consuming and
    relatively expensive process.
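    To illustrate that trade-off, the sketch below (purely an assumption of how such a system might be organised; none of these helpers come from a real engine) treats the two cases differently: moving obstacles get a cheap, temporary flag that the pathfinder routes around, while genuine topology changes queue a costly re-bake of just the affected region.

```python
def handle_world_change(navmesh, change):
    """Illustrative only: cheap avoidance for movers, expensive re-bake for topology changes.
    'polygons_overlapping' and 'request_rebake' are assumed helpers on the navmesh object."""
    if change.kind == "moving_obstacle":
        # Cheap path: temporarily flag the overlapped polygons so paths route around them.
        for poly in navmesh.polygons_overlapping(change.bounds):
            poly.blocked_until = change.expires_at
    elif change.kind == "topology":
        # Expensive path: re-bake only the region the change touches, ideally off the main thread.
        navmesh.request_rebake(region=change.bounds)
```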
    So let's take a look at how some games use nav meshes in clever ways as part of their
    core design as well as some of the problems faced along the way.
    While navmeshes are critical to achieving basic behaviour and movement, sometimes we
    can find clever ways to exploit them for the purposes of a game's core design. So let's take a
    look at some fun and often pragmatic examples of how games work with and around navmeshes
    in AAA games.
    First up, let's talk combat behaviour: knowing where to go in the world and how best to get
    there. A great example can be found in my recent case study on DOOM 2016, where the 'exposed
    cover' system aims to have demons deliberately position themselves on the navmesh in an effort
    to maintain the player's line of sight. Not just so that they can attack you, but also
    to help the player decide which target they're going to move towards next. DOOM, much
    like all id Tech engine games, uses an updated version of the AAS system from Quake III Arena.
    This allows for characters in the likes of both DOOM and Rage to be able to navigate
    to exact locations in the game world, conducting minor linear interpolation if they're slightly
    off point and animating to satisfy the discrepancy. It's what gives characters in id Tech games
    such wonderfully fluid motion.
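    The snippet below is not id's implementation, just a sketch of the general idea under assumed helpers: from a set of points already sampled off the navmesh, pick one that keeps the demon in the player's line of sight without wandering too far from where it already stands.

```python
import math

def pick_exposed_position(candidates, player_pos, current_pos, has_line_of_sight):
    """Illustrative 'exposed cover' style selection over navmesh sample points.
    'has_line_of_sight' is an assumed raycast helper; positions are (x, y, z) tuples."""
    best, best_score = current_pos, float("-inf")
    for point in candidates:
        if not has_line_of_sight(player_pos, point):
            continue  # discard anything the player couldn't see
        score = -math.dist(point, current_pos)  # prefer visible points near the demon
        if score > best_score:
            best, best_score = point, score
    return best
```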
    Meanwhile, Alien Isolation utilises the navigation mesh not just for basic movement, but for
    sensory data as well as in-game behaviours. Air vents act as locations for the alien to
    enable front-stage mode, where it wanders around the local area using the navmesh
    for realistic navigation. The air vent is actually playing an animation of the alien
    crawling out of it, and the AI in the alien activates at the end of that animation. Conversely,
    when the alien is in backstage mode it isn't crawling through vents: it simply disables its
    renderer, colliders and attack behaviours and moves around the navmesh randomly in straight
    lines with long stops to make you think it's doing something intelligent. So when you see
    it move above or below you in the vents, it's actually just walking through you, only it's
    invisible, can walk through walls and can't hurt you, which actually makes the game sound
    even scarier.
    The actual navigation in Alien Isolation is a modified version of the aforementioned Recast
    and Detour, only it's got an extra set of sensory gauges that are tied into the navigation
    system. The noise players make is passed through the navigation system to the alien such that
    it can properly understand whether it would actually be able to hear the noise and pinpoint
    its location. In fact, a source once told me that at one point they experimented with
    having the alien be able to smell you, but it was too difficult to quantify in-game for
    a player to understand. Lastly, when the alien goes to kill you, it has to sample the
    navigation mesh to ensure that the execution it's about to play fits into the space. This
    ensures it doesn't try and conduct a death animation such as this one in a tight room
    with no floor space.
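    As a hedged illustration of why routing sound through the navigation data helps (the helper below is an assumption, not Creative Assembly's code): if hearing is gated on the length of the path across the mesh rather than on straight-line distance, walls and doorways are accounted for automatically.

```python
def can_hear(navmesh, noise_pos, alien_pos, hearing_range):
    """Illustrative: gate hearing on navmesh path length rather than straight-line distance.
    'navmesh.path_length' is an assumed helper returning the shortest on-mesh path length,
    or None if no path connects the two points."""
    path_len = navmesh.path_length(noise_pos, alien_pos)
    return path_len is not None and path_len <= hearing_range
```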
    Next let's consider some open world games. In Tom Clancy's The Division, the manner in
    which non-player characters move through the environment and interact with cover is dependent
    on their class and faction, with some sprinting to cover and lying low as best they can,
    while others stay in the open to provide ample targets for players to prioritise. However,
    despite this strategic play, NPCs are limited to specific regions of the map for both story
    missions and in the open world. They're not permitted to run too far from their original
    spawning location so as to manage the experience. Otherwise players could wind up with a conga
    line of thugs chasing them from Times Square all the way down to 22nd Street.
    This same consideration is extended to the Far Cry franchise, with non-player characters,
    including wildlife, only ever spawning onto a navigable surface within a range of around
    250m of the player. This provides ample space within which to create interesting puzzles
    for players to solve, but also manages resources such that the game isn't wasting CPU power
    on processing AI that are miles away.
    Lastly, Left 4 Dead breaks up the navigation mesh into chunks and constantly monitors which
    regions players are standing in. This is done to allow the director AI to calculate ideal
    spawning locations for mobs and special infected that are both local to the team, but outside
    of your field of view. You can actually see some of this happening in-game if you enable
    the debug tools in an offline match.
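    A hedged sketch of the general idea, not Valve's code: score candidate navmesh regions by whether they sit within a sensible distance band of the survivors and outside everyone's field of view. Every name and number here is an assumption for the example.

```python
import math
import random

def pick_spawn_area(areas, survivors, in_view, min_dist=10.0, max_dist=40.0):
    """Illustrative director-style spawn selection over navmesh region centres.
    'in_view' is an assumed visibility test taking (survivor, point)."""
    candidates = []
    for area in areas:
        nearest = min(math.dist(area, s.position) for s in survivors)
        if not (min_dist <= nearest <= max_dist):
            continue  # too close to feel fair, or too far away to matter
        if any(in_view(s, area) for s in survivors):
            continue  # never spawn where a player is looking
        candidates.append(area)
    return random.choice(candidates) if candidates else None
```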
    However, it's not just enemy characters that are heavily reliant on navmeshes for core
    design; companions in a variety of forms also heavily utilise these tools to know where
    to go and how to operate in proximity of the player. In my recent case study on Far Cry
    Primal, I looked at how the companion AI utilises navmeshes to follow the player and stay within
    the view frustum when on the move. However, the game also needs to compensate for situations
    where the player has moved into a space the animal cannot reach and is getting even
    farther away. This occasionally results in the companion system looking for a valid location
    on the navmesh behind the player to teleport towards.
    Meanwhile a similar problem arises in Tom Clancy's Ghost Recon: Wildlands, where the
    player has three AI companions that assist in combat when playing offline. To keep your
    team following your lead during stealth incursions of outposts, the player drops 'breadcrumbs'
    onto the navmesh for the teammates to move towards. Plus, in much the same way as
    Far Cry Primal, in the event companions need to be close to the player but the current
    circumstances do not permit it, the game will search for positions off-camera for the characters
    to spawn onto. This is particularly important should the player be downed and close to death,
    needing that all-important revival after going off and fighting an entire encampment
    by themselves. Silly humans.
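    A minimal sketch of the breadcrumb idea, with assumed names and distances: the trail records the player's recent on-navmesh positions, and teammates consume them in order so they retrace a route the player has already proven is traversable.

```python
import math
from collections import deque

class BreadcrumbTrail:
    """Illustrative breadcrumb trail for companion following; all values are assumptions."""
    def __init__(self, spacing=2.0, max_crumbs=32):
        self.crumbs = deque(maxlen=max_crumbs)
        self.spacing = spacing

    def record(self, player_navmesh_pos):
        # Only drop a new crumb once the player has moved far enough from the last one
        if not self.crumbs or math.dist(player_navmesh_pos, self.crumbs[-1]) >= self.spacing:
            self.crumbs.append(player_navmesh_pos)

    def next_target(self, teammate_pos, reached_radius=1.0):
        # Discard crumbs the teammate has already reached, then head for the oldest remaining one
        while self.crumbs and math.dist(teammate_pos, self.crumbs[0]) <= reached_radius:
            self.crumbs.popleft()
        return self.crumbs[0] if self.crumbs else None
```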
    Plus, for companions it's sometimes vital to constrain the space within which they navigate.
    BioShock Infinite adopts a principle known as the critical path, whereby companion character
    Elizabeth will only move towards and engage with interactable objects that are between
    the player and the next objective. This allows her to express herself in a variety of
    ways, but in a manner that ensures players actually pay attention.
    Lastly, let's talk about rebuilding navigation meshes. It's not always an easy process to
    integrate into your game's core design, with a great example found in Ubisoft's Rainbow
    Six: Siege. In the game mode Terrorist Hunt, AI opponents need to navigate an environment
    that is constantly changing thanks to Siege's procedural destruction system that allows
    for surfaces to be destroyed in a variety of ways. Siege has to be ready to rebuild
    the navigation mesh at any point in time while maintaining performance as an online multiplayer
    game running at 60 fps.
    As detailed in Julien L'Heureux's 2017 Game Developers Conference talk, Siege does not
    calculate destruction changes in real time; it pre-calculates them before they happen.
    Pretty much all actions that can damage or destroy surfaces have a warm-up process, be
    it Sledge winding up with his hammer, breach charges being placed upon walls, or
    Hibana and Ash's projectile-based explosives. As the player starts the animation, their
    instance of the game spools off a separate thread of execution that calculates how the
    surface will be broken, as well as how that region of the navigation mesh will behave
    after that surface is damaged, updating whether it is fully accessible or if navmesh links
    have been created or destroyed. This is then synchronised into the main game thread and
    pushed to the server to replicate for other players at the point of destruction. This means that
    as soon as the dust has cleared, there's not just a hole in the wall, but the AI now knows
    if it can climb or shoot through it.
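    A rough sketch of that pattern under assumed helper names (this is not Ubisoft's code): the wind-up animation gives a window in which a worker thread can compute the post-destruction navmesh patch, which is then applied the instant the surface actually breaks.

```python
import threading

def start_breach(charge, navmesh, surface):
    """Illustrative only: precompute the navmesh changes during the warm-up animation.
    'compute_patch_for' and 'apply_patch' are assumed helpers on the navmesh object."""
    result = {}

    def precompute():
        # Work out how the surface breaks and how the local polygons and links will change
        result["patch"] = navmesh.compute_patch_for(surface, charge.blast_shape)

    worker = threading.Thread(target=precompute)
    worker.start()

    def on_detonate():
        worker.join()  # the patch is normally ready well before the detonation lands
        navmesh.apply_patch(result["patch"])
        # The updated mesh is then replicated to other players alongside the destruction itself

    charge.on_detonate = on_detonate
```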
    In closing, navigation meshes are a critical
    component of contemporary AI game development, but also one that requires diligence and attention
    when crafting dynamic and ever-changing environments. Figuring out how to handle and potentially
    update navmeshes is critical to the design of many contemporary games, given that it
    is ultimately one of the first points of failure for non-player characters that can break immersion.
    If your AI can't figure out how to compensate for environmental change, then your game is
    going to suffer in the long run.