Circular Queues

November 19, 2025
#circular-queue #ring-buffer #queue #data-structures #algorithms

Introduction

A circular queue (also called a ring buffer) is a linear data structure that uses a fixed-size array in a circular manner. Unlike a regular queue where dequeued elements leave wasted space at the front, a circular queue reuses that space by wrapping around to the beginning when reaching the end of the array.

Think of it like a circular track where runners keep going around—when you reach the end, you simply continue from the start. This makes circular queues incredibly efficient for fixed-size buffers in streaming applications, producer-consumer problems, and resource management systems.

If you’re new to queues, review Queues, Arrays, and Big-O Notation first.

What is a circular queue?

A circular queue is a queue implementation that:

  1. Uses a fixed-size array to store elements
  2. Maintains front and rear pointers that wrap around
  3. Reuses space freed by dequeue operations
  4. Efficiently handles a bounded number of elements

Core operations:

  1. Enqueue: Add an element at the rear (if not full)
  2. Dequeue: Remove an element from the front (if not empty)
  3. Peek: View the front element without removing it
  4. isEmpty: Check if the queue is empty
  5. isFull: Check if the queue is full
  6. size: Get the current number of elements
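
Taken together, these operations might be captured in an interface sketch like the one below. The interface name and the return conventions (boolean for a failed enqueue, null for an empty dequeue) are illustrative assumptions, not part of any standard API:

```java
// A possible shape for the core operations (illustrative names and conventions).
public interface CircularQueueOps<T> {
    boolean enqueue(T value);  // add at the rear; false if the queue is full
    T dequeue();               // remove from the front; null if the queue is empty
    T peek();                  // view the front element without removing it
    boolean isEmpty();
    boolean isFull();
    int size();                // current number of elements
}
```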

Why use a circular queue?

Choose a circular queue when:

  1. You have a fixed-size buffer requirement
  2. You need to avoid memory waste in array-based queues
  3. You’re implementing producer-consumer patterns with bounded buffers
  4. You’re working with streaming data (audio, video, network packets)
  5. You need predictable memory usage (no dynamic allocation)
  6. You’re building resource pools (connection pools, thread pools)

Avoid a circular queue when:

  • You need unlimited growth (use a linked list queue)
  • You need access to arbitrary elements (use an array)
  • You need operations at both ends (use a deque)

Real-world applications

Circular queues are used extensively:

  • Audio/Video buffers: Streaming media players use ring buffers to absorb jitter between data arrival and playback

  • Keyboard buffers: Storing keystrokes before processing
  • Network traffic management: Packet buffers in routers and switches
  • CPU scheduling: Round-robin scheduling algorithms
  • Circular logging: Log files that overwrite oldest entries when full
  • Connection pools: Managing fixed-size database connection pools
  • Memory management: Circular buffer for inter-process communication
  • Game development: Managing fixed-size event queues

Circular Queue vs Regular Queue

| Feature | Circular Queue | Regular Queue (Array) | Regular Queue (Linked List) |
|---|---|---|---|
| Memory usage | Fixed, efficient | Fixed, wasteful space | Dynamic, overhead per node |
| Enqueue | O(1) | O(1) | O(1) |
| Dequeue | O(1) | O(n) (shift) or O(1) (wasteful) | O(1) |
| Space reuse | ✅ Yes | ❌ No | ✅ Yes (via GC) |
| Capacity | Fixed | Fixed or resizable | Unlimited |
| Cache locality | ✅ Excellent | ✅ Excellent | ❌ Poor |
| Memory predictability | ✅ Guaranteed | ✅ Guaranteed | ❌ Variable |

How circular queues work

The key insight is using modular arithmetic to wrap indices:

nextIndex = (currentIndex + 1) % capacity

Visual representation:

Initial empty queue (capacity = 5):
[_, _, _, _, _]
 ↑
 front=rear=0

After enqueue(10, 20, 30):
[10, 20, 30, _, _]
 ↑          ↑
 front=0    rear=3

After dequeue() twice:
[_, _, 30, _, _]
       ↑   ↑
       front=2, rear=3

After enqueue(40, 50, 60):
[60, _, 30, 40, 50]
     ↑  ↑
     rear=1, front=2

Wrapping around!
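
The wrap-around is just the modulo rule applied repeatedly. A tiny trace, assuming a capacity of 5 as in the diagram above, shows the rear index advancing from 3 past the end of the array and back to the start (class name is illustrative):

```java
// Trace of nextIndex = (currentIndex + 1) % capacity for capacity = 5,
// matching the diagram above: the index wraps from 4 back to 0.
public class WrapTrace {
    public static void main(String[] args) {
        int capacity = 5;
        int index = 3;
        for (int step = 0; step < 4; step++) {
            System.out.print(index + " -> ");
            index = (index + 1) % capacity;
        }
        System.out.println(index);  // prints: 3 -> 4 -> 0 -> 1 -> 2
    }
}
```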

Core operations and complexity

All circular queue operations run in constant time, while the structure itself occupies space proportional to its fixed capacity:

  • Enqueue: O(1) — add element at rear (if not full)
  • Dequeue: O(1) — remove element from front (if not empty)
  • Peek: O(1) — view front element
  • isEmpty: O(1) — check if empty
  • isFull: O(1) — check if full
  • size: O(1) — get current size
  • Space: O(n) — fixed capacity n

Circular queue implementation

Method 1: Using an extra variable for size
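
The simplest way to tell full from empty is to keep an explicit element count alongside the front and rear indices: the queue is empty when size == 0 and full when size == capacity. A minimal sketch, with illustrative names (not from any particular library):

```java
import java.util.NoSuchElementException;

// Method 1 sketch: an int queue with an explicit size counter.
// empty <=> size == 0, full <=> size == capacity.
public class ArrayCircularQueue {
    private final int[] data;
    private int front = 0;  // index of the current front element
    private int rear = 0;   // index where the next element will be written
    private int size = 0;   // number of elements currently stored

    public ArrayCircularQueue(int capacity) {
        data = new int[capacity];
    }

    public boolean isEmpty() { return size == 0; }
    public boolean isFull()  { return size == data.length; }
    public int size()        { return size; }

    public void enqueue(int value) {
        if (isFull()) throw new IllegalStateException("queue is full");
        data[rear] = value;
        rear = (rear + 1) % data.length;  // wrap around
        size++;
    }

    public int dequeue() {
        if (isEmpty()) throw new NoSuchElementException("queue is empty");
        int value = data[front];
        front = (front + 1) % data.length;  // wrap around
        size--;
        return value;
    }

    public int peek() {
        if (isEmpty()) throw new NoSuchElementException("queue is empty");
        return data[front];
    }
}
```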

Method 2: Sacrificing one slot (without a size variable)

An alternative approach uses one empty slot to distinguish between full and empty states:

  • Empty: front == rear
  • Full: (rear + 1) % capacity == front
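
A minimal sketch of this variant, with illustrative names: the backing array has capacity slots but stores at most capacity - 1 elements, since one slot stays empty to keep the two states distinguishable.

```java
// Method 2 sketch: one slot is left unused so that
// front == rear means empty and (rear + 1) % capacity == front means full.
public class SacrificedSlotQueue {
    private final int[] data;
    private int front = 0;  // index of the front element
    private int rear = 0;   // index of the next free slot

    public SacrificedSlotQueue(int capacity) {
        data = new int[capacity];  // holds at most capacity - 1 elements
    }

    public boolean isEmpty() { return front == rear; }
    public boolean isFull()  { return (rear + 1) % data.length == front; }
    public int size()        { return (rear - front + data.length) % data.length; }

    public boolean enqueue(int value) {
        if (isFull()) return false;  // writing here would collide with front
        data[rear] = value;
        rear = (rear + 1) % data.length;
        return true;
    }

    public Integer dequeue() {
        if (isEmpty()) return null;
        int value = data[front];
        front = (front + 1) % data.length;
        return value;
    }
}
```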

Common circular queue problems

1. Design circular buffer for producer-consumer

Implement a thread-safe circular buffer:
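
One way to make the buffer thread-safe is to guard it with a lock and two condition variables: producers wait while the buffer is full, consumers wait while it is empty. Below is a minimal sketch under those assumptions (class and method names are illustrative); in practice, java.util.concurrent.ArrayBlockingQueue provides this same behaviour out of the box.

```java
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

// A blocking, thread-safe ring buffer sketch (illustrative names).
// Producers block while the buffer is full; consumers block while it is empty.
public class BlockingRingBuffer<T> {
    private final Object[] data;
    private int front = 0, rear = 0, size = 0;
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition notFull = lock.newCondition();
    private final Condition notEmpty = lock.newCondition();

    public BlockingRingBuffer(int capacity) {
        data = new Object[capacity];
    }

    public void put(T value) throws InterruptedException {
        lock.lock();
        try {
            while (size == data.length) notFull.await();  // wait for free space
            data[rear] = value;
            rear = (rear + 1) % data.length;
            size++;
            notEmpty.signal();
        } finally {
            lock.unlock();
        }
    }

    @SuppressWarnings("unchecked")
    public T take() throws InterruptedException {
        lock.lock();
        try {
            while (size == 0) notEmpty.await();  // wait for an element
            T value = (T) data[front];
            data[front] = null;  // drop the reference so it can be collected
            front = (front + 1) % data.length;
            size--;
            notFull.signal();
            return value;
        } finally {
            lock.unlock();
        }
    }
}
```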

2. Moving average from data stream

Calculate the moving average of the last N elements:
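
A circular buffer of size N keeps exactly the last N values, so the running sum can be updated in O(1) by subtracting the value about to be overwritten. A minimal sketch (names are illustrative):

```java
// Moving average over the last N values, backed by a circular buffer of size N.
public class MovingAverage {
    private final double[] window;
    private int rear = 0;    // index where the next value is written
    private int count = 0;   // how many values are stored so far (<= N)
    private double sum = 0;

    public MovingAverage(int n) {
        window = new double[n];
    }

    public double next(double value) {
        if (count == window.length) {
            sum -= window[rear];  // evict the oldest value being overwritten
        } else {
            count++;
        }
        sum += value;
        window[rear] = value;
        rear = (rear + 1) % window.length;
        return sum / count;
    }
}
```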

Common pitfalls

  • Off-by-one errors: Carefully handle modular arithmetic with front and rear pointers
  • Full vs empty confusion: Distinguish between full and empty states correctly (use size or sacrifice one slot)
  • Not checking bounds: Always verify isFull() before enqueue and isEmpty() before dequeue
  • Incorrect size calculation: When using the sacrificed-slot method, the size formula is (rear - front + capacity) % capacity, where capacity is the length of the backing array
  • Integer overflow: In languages with fixed-size integers, ensure indices don’t overflow

When to prefer other data structures

  • Need unlimited capacity → Use linked list queue
  • Need both-end operations → Use deque
  • Need LIFO → Use stack
  • Need random access → Use array

Practice problems

  1. Implement a circular queue that automatically doubles capacity when full
  2. Design a circular buffer that tracks the maximum element in O(1) time
  3. Implement a circular queue with priority (multiple circular queues)
  4. Solve the “Design Hit Counter” problem using a circular queue
  5. Implement a circular buffer for audio processing with fade-in/fade-out
  6. Design a rate limiter using a circular queue
  7. Create a circular queue that supports bulk enqueue/dequeue operations
  8. Implement a circular log buffer that overwrites oldest entries
  9. Design a packet buffer for network simulation with circular queue
  10. Solve “Design Circular Deque” on LeetCode

Complexity summary

  • Time complexity:
    • Enqueue: O(1)
    • Dequeue: O(1)
    • Peek: O(1)
    • isEmpty: O(1)
    • isFull: O(1)
    • Size: O(1)
  • Space complexity: O(n) where n is the fixed capacity

Circular queues provide the efficiency of array-based access with optimal space utilization, making them perfect for bounded buffers and fixed-size streaming applications.
