Immutability in C# Part 10: A double-ended queue


Based on the comments, the implementation of a single-ended queue as two stacks was somewhat mind-blowing for a number of readers. People, you ain't seen nothing yet.

Before we get into the actual bits and bytes of the solution, think for a bit about how you might implement an immutable queue which could act like either a stack or a queue at any time. You can think of a stack as "it goes on the left end, it comes off the left end", and a queue as "it goes on the left end, it comes off the right end". Now we want "it goes on and comes off either end". For short, we'll call a double-ended queue a "deque" (pronounced "deck"), and give our immutable deque this interface:

    public interface IDeque<T>
    {
        T PeekLeft();
        T PeekRight();
        IDeque<T> EnqueueLeft(T value);
        IDeque<T> EnqueueRight(T value);
        IDeque<T> DequeueLeft();
        IDeque<T> DequeueRight();
        bool IsEmpty { get; }
    }

Attempt #1

We built a single-ended queue out of two stacks. Can we pull a similar trick here?  How about we have the "left stack" and the "right stack". Enqueuing on the left pushes on the left stack, enqueuing on the right pushes on the right stack, and so on.

Unfortunately, this has some problems. What if you are dequeuing on the right and you run out of items on the right-hand stack?  Well, no problem, we'll pull the same trick as before -- reverse the left stack and swap it with the right stack.

The trouble with that is, suppose the left stack is { 1000, 999, ..., 3, 2, 1 } and the right stack is empty. Someone dequeues the deque on the right. We reverse the stack, swap them and pop the new right stack. Now we have an empty left-hand stack and { 2, 3, ..., 1000 } on the right-hand stack. It took 1000 steps to do this. Now someone tries to dequeue on the left. We reverse the right stack, swap, and pop, and now we have { 999, 998, ..., 3, 2 }.  That took 999 steps. If we keep on dequeuing, alternating between right and left, we end up doing on average five hundred pushes per step. That's terrible performance. Clearly this is an O(n²) algorithm.
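To make the cost concrete, here is a minimal sketch of the two-stack idea (the names and structure here are illustrative, not from the article's code): an immutable deque built from two linked "stacks", where running dry on one side forces a full reversal of the other.

```csharp
// A sketch of Attempt #1 (hypothetical names, not the article's code):
// the left stack's top is the deque's left end, the right stack's top
// is the deque's right end.
sealed class TwoStackDeque<T>
{
    private sealed class Node
    {
        public readonly T Head;
        public readonly Node Tail;
        public Node(T head, Node tail) { Head = head; Tail = tail; }
    }

    private readonly Node left;
    private readonly Node right;

    public static readonly TwoStackDeque<T> Empty =
        new TwoStackDeque<T>(null, null);
    private TwoStackDeque(Node left, Node right)
        { this.left = left; this.right = right; }

    public TwoStackDeque<T> EnqueueLeft(T value)
        { return new TwoStackDeque<T>(new Node(value, left), right); }

    public TwoStackDeque<T> DequeueRight()
    {
        if (right != null)
            return new TwoStackDeque<T>(left, right.Tail);
        // Right stack is empty: reverse the ENTIRE left stack and pop.
        // This walk is O(n), and alternating left/right dequeues can
        // trigger a full reversal on nearly every operation.
        // (Error handling for a completely empty deque is omitted.)
        Node reversed = null;
        for (Node n = left; n != null; n = n.Tail)
            reversed = new Node(n.Head, reversed);
        return new TwoStackDeque<T>(null, reversed.Tail);
    }
    // EnqueueRight and DequeueLeft are mirror images of the above.
}
```

The reversal loop is exactly the thousand-step transfer described above; nothing about it gets cheaper the second time around, because the very next dequeue on the opposite end undoes the work.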

Attempt #2

Our attempt to model this as a pair of stacks seems to be failing. Let's take a step back and see if we can come up with a recursively defined data structure which makes it more apparent that there is cheap access to each end.

The standard recursive definition of a stack is "a stack is either empty, or an item (the head) followed by a stack (the tail)". It seems like we ought to be able to say "a deque is either empty, or an item (the left) followed by a deque (the middle) followed by an item (the right)".

Perhaps you have already seen the problem with this definition; a deque by this definition always has an even number of elements! But we can fix that easily enough.  A deque is:

1) empty, or
2) a single item, or
3) a left item followed by a middle deque followed by a right item.

Awesome. Let's implement it.

    public sealed class Deque<T> : IDeque<T>
    {
        private sealed class EmptyDeque : IDeque<T>
        {
            public bool IsEmpty { get { return true; } }
            public IDeque<T> EnqueueLeft(T value) { return new SingleDeque(value); }
            public IDeque<T> EnqueueRight(T value) { return new SingleDeque(value); }
            public IDeque<T> DequeueLeft() { throw new Exception("empty deque"); }
            public IDeque<T> DequeueRight() { throw new Exception("empty deque"); }
            public T PeekLeft() { throw new Exception("empty deque"); }
            public T PeekRight() { throw new Exception("empty deque"); }
        }
        private sealed class SingleDeque : IDeque<T>
        {
            public SingleDeque(T t) { item = t; }
            private readonly T item;
            public bool IsEmpty { get { return false; } }
            public IDeque<T> EnqueueLeft(T value) { return new Deque<T>(value, Empty, item); }
            public IDeque<T> EnqueueRight(T value) { return new Deque<T>(item, Empty, value); }
            public IDeque<T> DequeueLeft() { return Empty; }
            public IDeque<T> DequeueRight() { return Empty; }
            public T PeekLeft() { return item; }
            public T PeekRight() { return item; }
        }
        private static readonly IDeque<T> empty = new EmptyDeque();
        public static IDeque<T> Empty { get { return empty; } }
        public bool IsEmpty { get { return false; } }
        private Deque(T left, IDeque<T> middle, T right)
        {
            this.left = left;
            this.middle = middle;
            this.right = right;
        }
        private readonly T left;
        private readonly IDeque<T> middle;
        private readonly T right;
        public IDeque<T> EnqueueLeft(T value)
        {
            return new Deque<T>(value, middle.EnqueueLeft(left), right);
        }
        public IDeque<T> EnqueueRight(T value)
        {
            return new Deque<T>(left, middle.EnqueueRight(right), value);
        }
        public IDeque<T> DequeueLeft()
        {
            if (middle.IsEmpty) return new SingleDeque(right);
            return new Deque<T>(middle.PeekLeft(), middle.DequeueLeft(), right);
        }
        public IDeque<T> DequeueRight()
        {
            if (middle.IsEmpty) return new SingleDeque(left);
            return new Deque<T>(left, middle.DequeueRight(), middle.PeekRight());
        }
        public T PeekLeft() { return left; }
        public T PeekRight() { return right; }
    }

I seem to have somewhat anticipated my denouement, but this is coding, not mystery novel writing. What is so awful about this implementation? It seems like a perfectly straightforward implementation of the abstract data type. But it turns out to be actually worse than the two-stack implementation we first considered. What are your thoughts on the matter?

Next time, what's wrong with this code and some groundwork for fixing it.
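One way to investigate is to count allocations. The sketch below is a stripped-down version of Attempt #2 with a hypothetical instrumentation counter bolted onto the constructor (the counter and the flattened class layout are mine, not the article's); running it shows how the per-enqueue cost behaves as the deque grows.

```csharp
using System;

interface ICountedDeque<T>
{
    ICountedDeque<T> EnqueueLeft(T value);
    bool IsEmpty { get; }
}

static class Counter { public static int Allocations; }

sealed class EmptyDeque<T> : ICountedDeque<T>
{
    public bool IsEmpty { get { return true; } }
    public ICountedDeque<T> EnqueueLeft(T value)
        { return new SingleDeque<T>(value); }
}

sealed class SingleDeque<T> : ICountedDeque<T>
{
    private readonly T item;
    public SingleDeque(T t) { item = t; }
    public bool IsEmpty { get { return false; } }
    public ICountedDeque<T> EnqueueLeft(T value)
        { return new CountedDeque<T>(value, new EmptyDeque<T>(), item); }
}

sealed class CountedDeque<T> : ICountedDeque<T>
{
    private readonly T left, right;
    private readonly ICountedDeque<T> middle;
    public CountedDeque(T left, ICountedDeque<T> middle, T right)
    {
        Counter.Allocations++;  // count every deque node constructed
        this.left = left; this.middle = middle; this.right = right;
    }
    public bool IsEmpty { get { return false; } }
    public ICountedDeque<T> EnqueueLeft(T value)
    {
        // Note that this rebuilds a node at EVERY level of the middle spine:
        return new CountedDeque<T>(value, middle.EnqueueLeft(left), right);
    }
}

class Demo
{
    static void Main()
    {
        ICountedDeque<int> d = new EmptyDeque<int>();
        for (int i = 0; i < 1000; i++)
            d = d.EnqueueLeft(i);
        // A deque of k items is nested roughly k/2 levels deep, so the
        // i-th enqueue allocates on the order of i/2 nodes.
        Console.WriteLine(Counter.Allocations);
    }
}
```

Watching that counter as n grows is a strong hint about what this "straightforward" implementation is actually doing under the covers.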


  • It does leave a little to be desired.  Now instead of an O(n²) algorithm, you have a recursive O(n²) algorithm, so we can hammer both the CPU and the stack with it.  Nice.  ;)

  • The problem here is the recursive chain that is produced when you call EnqueueXXX and DequeueXXX over a non-trivial deque.

    I'm working on a solution that has, for every general Deque, a cachedDequeLeft and cachedDequeRight, but I'm not sure if it's going to work.

    Cross your fingers :)

  • Aaron: Yep! It's truly awful. Each enqueue is O(n) in stack consumption, time and number of new nodes allocated, so enqueuing n items in a row is O(n²) in time.

    Olmo: Sounds interesting!

  • For some reason, there's been a lot of buzz lately around immutability in C#. If you're interested in

  • I've been working on it for a while and there is no end.

    In my implementation Dequeue is FAST, so enqueue has to be defined in terms of Dequeue.

    The end of the story is that, while you can reuse a lot of these trees (should we call them firs? hehe), for a deque with n elements you need about n² of them to represent every single instance of all the possible subintervals.  So having a structure that needs so much memory is stupid. Also, inserting an element is still O(n)...

    So a way to nowhere ... :S

  • I'm thinking that you could just throw together an immutable version of a doubly-linked list.  You did say that the solution was tricky, and this one is downright banal, so it probably isn't what you're looking for, but it *would* work and give O(1) enqueue and dequeue performance.

    All you'd need is an Element<T> with (read only) Left, Right and Value properties, and you could give the Deque<T> "beginning" and "end" fields of type Element<T>.  Throw in a constructor that takes two Element<T> parameters and you've got yourself an immutable Deque.

    I guess the problem is that it would have to allocate a new Deque<T> for each and every Dequeue operation.  The memory performance isn't stellar.  Then again, that's exactly what we did for the Queue<T>, so does it make a difference?

  • Actually... never mind, I can see now why that wouldn't work - the Element<T> would have to be mutable.  The Deque could look immutable on the outside, but I imagine the point of this exercise is to build the whole thing using completely immutable data structures.  So, scratch that, back to square one.  :-)

  • Yep, you got it. Doubly-linked lists are always mutable.

  • Going back to attempt #1, if when one stack is empty, you reverse and transfer just *half* the elements from the other stack, you get amortised O(1) performance, and you're done ;)  This is mentioned in Chris Okasaki's book "Purely functional data structures" which describes a lot of interesting immutable data structures.

    Looking forward to seeing your alternative as well!

  • I hope that the half-reverse idea isn't the solution; that was the first thing that popped into my head, but it's only reducing that worst-case O(n²) performance to O(n log n), and it also raises the performance cost of enqueuing everything from one end and dequeuing everything from the other end from O(n) to O(n log n).  It's better than what we've got so far, but it ain't great.

    You could get similar performance using an AVL or red-black tree as the internal data structure; the "beginning" is the outer-left leaf and the "end" is the outer-right.  But it's still a high price to pay.  Knowing Eric, I'm sure he's got some extremely clever and totally nonintuitive hack that gives O(1) performance almost all the time.

  • Aaron: no, Luke is right. If you are clever about when you rebalance the deque, you can get amortized O(1) performance. You end up having to keep around extra information about how big each queue is, but that's not hard.

    However, that is in fact not the solution I had in mind.  Rather than fixing Attempt #1, I'm going to fix Attempt #2 by being more clever about what exactly goes in the left, middle and right sections.

  • I don't think that's exactly what you meant, and I'm sure I'm missing something, but would

    return new Deque<T>(left, middle.DequeueRight(), middle.PeekRight());

    even work as planned? Wouldn't we end up with the wrong right element after we do this? Did I just say 'wrong right element'? How is it you manage to do this to me almost every time?

  • I was able to make an efficient Deque using 2 "Trimmable Stacks" and keeping Deque Count information, but I am not sure about approach #2.  I am thinking it might be possible by making left and right into Stacks and keeping middle a Deque.  EnqueueLeft would be:

     return new Deque<T>(left.Push(value), this, right);

    I am just not sure if this works for all other operations.

    Great series by the way.  Keep it going!

  • Chris: Why would we end up with the wrong right element?

    Remember, middle is IMMUTABLE.  Dequeuing it does not change the value of middle, it returns a different deque.  We can dequeue that thing all we want and its rightmost element is the same as it ever was.

    Dr. Blaise: Your intuition is good.  I'm going to stick with something-on-the-left, something-in-the-middle and something-on-the-right, and those somethings will be more complex than Attempt #2. It's not going to be exactly as you've sketched out though.

  • I'm not sure if I'm on the right track, but the thing I'm noticing is that since a deque implicitly has 3 pieces, it's easier to deal with 2 elements at a time than it is to deal with just 1.  What if you made the left and right sides a kind of "buffer" (stacks with a count, I guess), and did the recursive enqueuing only when you had 2 elements to move?

    For example... if you're enqueuing from the left, and the left buffer has one or two elements, just tack it on.  If it has three, then pop all 3 elements, enqueue the last two onto the inner deque, and put the 1st back on the stack, then enqueue the new element normally.  There is some recursion, but 50% of the time the depth is zero, 25% of the time it's one level deep, 12.5% of the time it only goes two levels, etc.  Once you hit the end, which would be a SingleDeque, then enqueuing two elements at a time is trivial; just move the left to the right, the first element into a new SingleDeque in the middle, and the last element onto the right.

    I think this is also "amortized" O(1) performance, right?  To dequeue off of either side you just reverse this process, same performance.  It can end up being lopsided, but I don't think it matters; if you run out of elements on the "requested" side, just start taking them off the other side - since you've got a maximum of 3 elements to pop at any level before you find the "tail", it's still pretty fast.

    Is this making any sense or have I officially gone off the deep end?
