Hearing works similarly to vision, but it doesn't take a node's direct visibility into account, owing to the nature of sound. However, we still need a sound receiver for it to work. Instead of making an agent a direct sound receiver, in this recipe the sound travels along a sound graph and is perceived by the graph's nodes.
It is important to have grasped the chapter on pathfinding in order to understand the inner workings of the graph-based recipes.
using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class EmitterGraph : MonoBehaviour
{
    // next steps
}
public int soundIntensity;
public Graph soundGraph;
public GameObject emitterObj;
public void Start()
{
    if (emitterObj == null)
        emitterObj = gameObject;
}
public int[] Emit()
{
    // next steps
}
List<int> nodeIds = new List<int>();
Queue<int> queue = new Queue<int>();
List<int> neighbours;
int intensity = soundIntensity;
int src = soundGraph.GetNearestVertex(emitterObj);
nodeIds.Add(src);
queue.Enqueue(src);
while (queue.Count != 0)
{
    // next steps
}
return nodeIds.ToArray();
if (intensity == 0)
    break;
// drain the whole frontier before dimming, so that intensity
// decreases once per graph layer instead of once per visited node
int levelSize = queue.Count;
for (int i = 0; i < levelSize; i++)
{
    int v = queue.Dequeue();
    neighbours = soundGraph.GetNeighbors(v);
    foreach (int n in neighbours)
    {
        if (nodeIds.Contains(n))
            continue;
        queue.Enqueue(n);
        nodeIds.Add(n);
    }
}
intensity--;
The recipe returns the list of nodes affected by the sound, using the breadth-first search (BFS) algorithm. The traversal stops when there are no more nodes to visit, or when the sound's intensity has been fully dimmed by the graph traversal.
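Because the class above depends on Unity and on the Graph implementation from the pathfinding chapter, the traversal can be hard to test in isolation. The following is a minimal, engine-free sketch of the same breadth-first idea, using a plain adjacency list as a stand-in for the book's Graph class; it assumes (my assumption, not the book's wording) that intensity counts the number of graph layers the sound can cross:

```csharp
using System;
using System.Collections.Generic;

public static class SoundPropagation
{
    // Breadth-first propagation: returns the ids of all nodes
    // the sound reaches, starting from src. The adjacency list
    // is an illustrative stand-in for the Graph class.
    public static List<int> Emit(
        Dictionary<int, int[]> adjacency, int src, int intensity)
    {
        var affected = new List<int> { src };
        var queue = new Queue<int>();
        queue.Enqueue(src);
        while (queue.Count != 0)
        {
            if (intensity == 0)
                break;
            // drain one whole layer per unit of intensity
            int levelSize = queue.Count;
            for (int i = 0; i < levelSize; i++)
            {
                int v = queue.Dequeue();
                foreach (int n in adjacency[v])
                {
                    if (affected.Contains(n))
                        continue;
                    queue.Enqueue(n);
                    affected.Add(n);
                }
            }
            intensity--;
        }
        return affected;
    }
}
```

Under this assumption, an intensity of 2 affects exactly the nodes within two hops of the source, regardless of how many branches the graph has.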
After learning how to implement hearing using both colliders and graph logic, you could develop a new hybrid algorithm that relies on a heuristic taking distance as input. If a node lies beyond the sound's maximum distance, there is no need to add its neighbours to the queue.
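One way to sketch that hybrid pruning, again engine-free: here Distance, positions, and maxDistance are illustrative stand-ins (my names, not the book's) for Vector3.Distance, the vertices' world positions, and the sound's maximum range.

```csharp
using System;
using System.Collections.Generic;

public static class HybridHearing
{
    // Euclidean distance between two 2D positions,
    // standing in for Unity's Vector3.Distance.
    static float Distance((float x, float y) a, (float x, float y) b)
    {
        float dx = a.x - b.x, dy = a.y - b.y;
        return (float)Math.Sqrt(dx * dx + dy * dy);
    }

    // Breadth-first propagation pruned by distance: nodes beyond
    // maxDistance from the emitter are never queued, so their
    // neighbours are never explored.
    public static List<int> Emit(
        Dictionary<int, int[]> adjacency,
        (float x, float y)[] positions,
        int src,
        float maxDistance)
    {
        var affected = new List<int> { src };
        var queue = new Queue<int>();
        queue.Enqueue(src);
        while (queue.Count != 0)
        {
            int v = queue.Dequeue();
            foreach (int n in adjacency[v])
            {
                if (affected.Contains(n))
                    continue;
                // the distance heuristic replaces the intensity counter
                if (Distance(positions[src], positions[n]) > maxDistance)
                    continue;
                queue.Enqueue(n);
                affected.Add(n);
            }
        }
        return affected;
    }
}
```

The design choice here is that distance is measured from the emitter rather than accumulated along graph edges; either interpretation fits the text, and the accumulated-cost variant would simply carry a running distance in the queue alongside each node id.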
The following recipes in Chapter 2, Navigation: