Game AI Final Assignment Report

“Hiragana Hunter” (demo prototype)

The game uses three game-AI mechanics: A* pathfinding, path smoothing, and agent sensors. A* runs over a set of pre-defined nodes (the grass tiles) and uses the Manhattan-distance heuristic to calculate the shortest path to the target (the exit portal, the purple circle).
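The grid search described above can be sketched as follows. This is a minimal stand-in, not the game's actual Unity code: the `walkable` set of tiles, the 4-directional moves, and the unit step cost are assumptions for illustration.

```python
import heapq

def manhattan(a, b):
    # Manhattan-distance heuristic: suits 4-directional grid movement
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def astar(walkable, start, goal):
    """A* over a set of walkable grid cells (standing in for the grass tiles).

    walkable: set of (x, y) tuples; start, goal: (x, y) cells.
    Returns the shortest path as a list of cells, or None if unreachable.
    """
    open_heap = [(manhattan(start, goal), 0, start)]  # (f, g, cell)
    came_from = {start: None}
    g_cost = {start: 0}
    while open_heap:
        _, g, current = heapq.heappop(open_heap)
        if current == goal:
            # Rebuild the path by walking the came_from chain backwards
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        if g > g_cost[current]:
            continue  # stale heap entry; a cheaper route was found later
        x, y = current
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt not in walkable:
                continue
            new_g = g + 1  # assumed uniform step cost between tiles
            if new_g < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = new_g
                came_from[nxt] = current
                heapq.heappush(open_heap,
                               (new_g + manhattan(nxt, goal), new_g, nxt))
    return None
```

With Manhattan distance on a 4-connected grid the heuristic never overestimates, so the first time the goal is popped the path is optimal.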

Once the final A* path is calculated, path smoothing is performed, albeit in a limited manner: smoothing is only applied to the final three tiles where movement shifts from horizontal to vertical, or vice versa.

pathsmoothing_astar

With the code currently in place, the agent will only move directly from A to C across the angled tiles themselves (indicated by the thick pink line), rather than as soon as a clear path is available (indicated by the thin red line). This is a known issue, meaning the smoothing is not yet fully realised. The version of path smoothing in the code does, however, check that the piece can move to that destination and does replace those A* nodes with the smoothed path, which is the core of the path-smoothing algorithm.

While additional work remains to extend the smoothing beyond the current two-to-three-tile (Manhattan-distance) window, options were considered to resolve this: for example, casting a sphere collider from the current point towards the target point and, if the path is clear, moving directly along that line. Another option would simply be to extend what is currently implemented until it works fully.
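The extended smoothing idea above amounts to classic string-pulling: from each waypoint, jump to the farthest later waypoint that is directly reachable, discarding the nodes in between. A hedged sketch follows; `line_of_sight` is a hypothetical grid-sampling clearance test standing in for the sphere cast described above, not the game's code.

```python
def line_of_sight(walkable, a, b):
    # Hypothetical clearance test: sample cells along the straight
    # segment a -> b and require each sampled cell to be walkable
    # (a coarse stand-in for a physics sphere cast).
    steps = max(abs(b[0] - a[0]), abs(b[1] - a[1]))
    if steps == 0:
        return True
    for i in range(steps + 1):
        t = i / steps
        cell = (round(a[0] + (b[0] - a[0]) * t),
                round(a[1] + (b[1] - a[1]) * t))
        if cell not in walkable:
            return False
    return True

def smooth_path(walkable, path):
    """Greedy string-pulling over the whole A* path: from waypoint i,
    keep the farthest later waypoint j with a clear line of sight,
    skipping everything between them."""
    if not path:
        return path
    smoothed = [path[0]]
    i = 0
    while i < len(path) - 1:
        # Scan from the path's end back towards i for the farthest visible node
        j = len(path) - 1
        while j > i + 1 and not line_of_sight(walkable, path[i], path[j]):
            j -= 1
        smoothed.append(path[j])
        i = j
    return smoothed
```

Unlike the current three-tile corner fix, this pass can collapse an arbitrarily long staircase of turns into a few straight segments in a single sweep.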

The final game-AI mechanic in place is agent detection. The hiragana pieces (the agents) each have a 3D sphere collider whose sole purpose is detecting other agents; each piece also has a 3D box collider for detecting collisions with the player and its sword, as well as with the exit portal itself.

The sphere collider does currently detect when another agent is near; the code then calculates which of the two agents is closer to the exit, and the other one pauses to allow that agent to move forward, thus separating the two pieces.
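The priority rule above can be sketched as a small function. The dict-based agent shape, the `sensor_radius` value, and the tie-breaking rule (the closer agent wins ties) are illustrative assumptions, not the Unity components themselves.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def resolve_encounter(agent_a, agent_b, exit_pos, sensor_radius=1.5):
    """When two agents' sensor spheres overlap, the agent farther from
    the exit pauses so the closer one can move on.

    Agents are dicts with 'pos' and 'paused' keys (hypothetical shape,
    standing in for the Unity agent components).
    Returns the paused agent, or None if they are out of sensor range.
    """
    if dist(agent_a["pos"], agent_b["pos"]) > sensor_radius:
        return None  # sensors did not trigger
    # Compare distances to the exit portal; the farther agent yields
    if dist(agent_a["pos"], exit_pos) <= dist(agent_b["pos"], exit_pos):
        loser = agent_b
    else:
        loser = agent_a
    loser["paused"] = True
    return loser
```

Comparing distance-to-exit rather than distance-to-each-other gives a deterministic winner, so the two agents cannot both pause and deadlock.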

agent_sensor

However, the coroutine's yield, intended to wait a set number of seconds, does not appear to take effect, even though console print statements confirm that the block of code is being triggered.

The agents therefore do have working sensors; only the code meant to pause the agent during its movement ticks fails, for reasons not yet determined.
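For reference, the pause behaviour the coroutine was meant to provide can also be expressed without coroutines, as a countdown consumed by each movement tick. This is a language-agnostic sketch under assumed names (`PausableAgent`, `pause_ticks`), not the Unity code in question or a diagnosis of the bug.

```python
class PausableAgent:
    """Minimal sketch of the intended pause: while pause_ticks is
    positive, movement ticks are consumed without moving (an assumed
    alternative to the coroutine-based wait described above)."""

    def __init__(self, pos):
        self.pos = pos
        self.pause_ticks = 0

    def pause(self, ticks):
        # Ask the agent to sit out the next `ticks` movement updates
        self.pause_ticks = ticks

    def tick(self, step=(1, 0)):
        if self.pause_ticks > 0:
            self.pause_ticks -= 1  # spend this tick waiting in place
            return
        self.pos = (self.pos[0] + step[0], self.pos[1] + step[1])
```

Because the movement code itself checks the counter, the pause cannot be skipped by whatever is resuming the coroutine early.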