
High Speed Priority Queue for C# / Why do I have to extend PriorityQueueNode to enqueue something

Typical data structures allow you to add an instance of any class T. To support this, they wrap each added value in an internal Node class, which also stores the bookkeeping data the structure needs:

private class Node<T>
{
    public T Value;
    public Node<T> Parent;
    public Node<T> LeftChild;
    public Node<T> RightChild;
    public double Priority;
}

This paradigm is convenient - it instantly supports storing instances of any class - but slow. Every call to .Add() performs another Node allocation. On Windows, allocations are fast (though on Xbox 360 and Windows Phone they are still pretty slow), but doing no allocation at all is even faster. You could, however, mitigate this with a pool of nodes.
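For illustration, a node pool along those lines might look like the sketch below. The NodePool name and its Rent/Return shape are assumptions for this example, not part of the library:

```csharp
using System.Collections.Generic;

// Hypothetical internal node, matching the wrapper shown above.
class Node<T>
{
    public T Value;
    public Node<T> Parent;
    public Node<T> LeftChild;
    public Node<T> RightChild;
    public double Priority;
}

// Sketch of a pool that recycles nodes so .Add() usually allocates nothing.
class NodePool<T>
{
    private readonly Stack<Node<T>> _free = new Stack<Node<T>>();

    public Node<T> Rent(T value, double priority)
    {
        // Reuse a returned node when possible; allocate only if the pool is empty.
        Node<T> node = _free.Count > 0 ? _free.Pop() : new Node<T>();
        node.Value = value;
        node.Priority = priority;
        return node;
    }

    public void Return(Node<T> node)
    {
        // Clear references so pooled nodes don't keep old values alive.
        node.Value = default(T);
        node.Parent = node.LeftChild = node.RightChild = null;
        _free.Push(node);
    }
}
```

Renting and returning keeps allocation off the hot path, at the cost of the extra bookkeeping shown here.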

Another issue is that, to implement .Contains(), we either have to search the data structure for the value (slow) or maintain a separate Dictionary<T, Node> mapping values to nodes. The latter is actually pretty fast, but we can do much better.
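A Dictionary-backed Contains might look like this sketch (the class and field names are illustrative; this is not the library's actual code):

```csharp
using System.Collections.Generic;

class DictionaryBackedQueue<T>
{
    // Hypothetical wrapper node, as in typical heap implementations.
    private class Node
    {
        public T Value;
        public double Priority;
    }

    // Maps each enqueued value to its node so Contains avoids a full search.
    private readonly Dictionary<T, Node> _nodeLookup = new Dictionary<T, Node>();

    public void Enqueue(T value, double priority)
    {
        // Record the value -> node mapping; a real queue would also heap-insert the node.
        _nodeLookup[value] = new Node { Value = value, Priority = priority };
    }

    public bool Contains(T value)
    {
        // One hash plus an equality check per call: fast, but not free.
        return _nodeLookup.ContainsKey(value);
    }
}
```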

By forcing the values to extend PriorityQueueNode, we can avoid both of those issues. For example, here is the code for .Contains() in HeapPriorityQueue:

public bool Contains(T node)
{
    return (_nodes[node.QueueIndex] == node);
}

That lookup is about 5-10x faster than using a Dictionary<T, Node>. A 5-10x speedup over "practically nothing" would typically not be a big deal; however, since IPriorityQueue.Contains() is the most-called method in many path-finding algorithms, making it as fast as possible is extremely important for our application.
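The lookup works because the queue writes each node's position in the heap array into the node itself on enqueue (and on every later swap). A minimal sketch of the idea, with illustrative names rather than the library's exact API:

```csharp
// Base class every enqueued item must extend; the queue maintains QueueIndex.
public class PriorityQueueNode
{
    public double Priority { get; internal set; }
    public int QueueIndex { get; internal set; }
}

// Example user class: just inherit to become enqueueable.
public class PathNode : PriorityQueueNode { }

public class TinyQueue<T> where T : PriorityQueueNode
{
    private readonly T[] _nodes;
    private int _numNodes;

    public TinyQueue(int maxNodes)
    {
        // Index 0 is unused, as is conventional for array-backed binary heaps.
        _nodes = new T[maxNodes + 1];
    }

    public void Enqueue(T node, double priority)
    {
        node.Priority = priority;
        _numNodes++;
        _nodes[_numNodes] = node;
        node.QueueIndex = _numNodes;  // the index that makes Contains a single array read
        // (a real heap would now cascade the node up to restore heap order)
    }

    public bool Contains(T node)
    {
        // One array read plus a reference compare: no hashing, no equality calls.
        return _nodes[node.QueueIndex] == node;
    }
}
```

Because the node carries its own index, Contains never has to search or hash; it just checks whether the slot the node claims to occupy actually holds it.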

Thus, this data structure trades off convenience for speed.