Threading – a thread-safe, size-capped queue

Here is a queue class based on the implementation Marc Gravell provided in this StackOverflow answer:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

/// <summary>
/// A thread-safe, fixed-size queue implementation
/// See: http://stackoverflow.com/questions/530211/creating-a-blocking-queuet-in-net/530228#530228
/// </summary>
public sealed class SizeQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly int _maxSize;
    private readonly object _syncRoot = new object();
    public SizeQueue(int maxSize)
    {
        if (maxSize <= 0)
        {
            throw new ArgumentOutOfRangeException(nameof(maxSize), "maxSize must be positive");
        }
        _maxSize = maxSize;
    }
    public int Count
    {
        get
        {
            lock (_syncRoot)
            {
                return _queue.Count;
            }
        }
    }
    public object SyncRoot
    {
        get
        {
            return _syncRoot;
        }
    }
    /// <summary>
    /// Puts an item onto the queue, blocking while the queue is full
    /// </summary>
    public void Enqueue(T item)
    {
        lock (_syncRoot)
        {
            // don't enqueue new item if the max size has been met
            while (_queue.Count >= _maxSize)
            {
                Monitor.Wait(_syncRoot);
            }
            _queue.Enqueue(item);
            // wake up any blocked dequeue
            Monitor.PulseAll(_syncRoot);
        }
    }
    /// <summary>
    /// Removes and returns the item at the head of the queue,
    /// blocking until one is available
    /// </summary>
    public T Dequeue()
    {
        return Dequeue(1).FirstOrDefault();
    }
    /// <summary>
    /// Removes and returns up to the requested number of items from the
    /// head of the queue, blocking until at least one is available
    /// </summary>
    public IEnumerable<T> Dequeue(int count)
    {
        lock (_syncRoot)
        {
            // wait until there are items on the queue
            while (_queue.Count == 0)
            {
                Monitor.Wait(_syncRoot);
            }
            
            // read as many items off the queue as required (and possible)
            var items = new List<T>();
            while (count > 0 && _queue.Count > 0)
            {
                items.Add(_queue.Dequeue());
                count--;
            }

            // wake up any enqueue blocked on a full queue, now that
            // space has been freed up
            Monitor.PulseAll(_syncRoot);
            return items;
        }
    }
}
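To see the class in action, here is a minimal producer/consumer sketch. The capacity, item count, and batch size are illustrative choices of mine, not part of the original answer:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// a queue capped at 10 items; Enqueue would block if a producer
// ever got maxSize items ahead of the consumers
var queue = new SizeQueue<int>(10);

// produce 10 items on a background thread
var producer = Task.Run(() =>
{
    for (var i = 0; i < 10; i++)
    {
        queue.Enqueue(i);
    }
});

// drain the queue in batches; Dequeue(count) blocks until at least
// one item is available, then returns up to `count` items
var consumed = new List<int>();
while (consumed.Count < 10)
{
    consumed.AddRange(queue.Dequeue(5));
}

producer.Wait();
Console.WriteLine(string.Join(",", consumed)); // 0,1,2,3,4,5,6,7,8,9
```

With a single producer and a single consumer the items come out in FIFO order; with multiple consumers, each batch is still FIFO but the interleaving between consumers is unspecified.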
Hi, I’m Yan. I’m an AWS Serverless Hero and the author of Production-Ready Serverless.
