Add FixedQueue: A Fixed Size Queue Implementation #126204


Closed · wants to merge 19 commits
2 changes: 2 additions & 0 deletions compiler/rustc_feature/src/unstable.rs
@@ -471,6 +471,8 @@ declare_features! (
(unstable, ffi_const, "1.45.0", Some(58328)),
/// Allows the use of `#[ffi_pure]` on foreign functions.
(unstable, ffi_pure, "1.45.0", Some(58329)),
/// New collection
(unstable, fixed_queue, "1.77.1", Some(126204)),
Comment on lines +474 to +475
Member:

We don't use this for library features.

Suggested change
/// New collection
(unstable, fixed_queue, "1.77.1", Some(126204)),

/// Allows using `#[repr(align(...))]` on function items
(unstable, fn_align, "1.53.0", Some(82232)),
/// Support delegating implementation of functions to other already implemented functions.
1 change: 1 addition & 0 deletions compiler/rustc_span/src/symbol.rs
@@ -859,6 +859,7 @@ symbols! {
field,
field_init_shorthand,
file,
fixed_queue,
Member:

Suggested change
fixed_queue,

float,
float_to_int_unchecked,
floorf128,
288 changes: 288 additions & 0 deletions library/alloc/src/collections/fixed_queue/mod.rs
@@ -0,0 +1,288 @@
use crate::vec::Vec;
use core::{
fmt, mem,
ops::{Index, Range},
slice,
};

#[cfg(test)]
mod tests;

/// Fixed Size Queue:
/// A linear queue implemented with a static ring buffer of owned nodes.
///
/// The `FixedQueue` allows pushing and popping elements in constant time.
///
/// The "default" usage of this type is to use [`push`] to add to
/// the queue, and [`pop`] to remove from the queue. Iterating over
/// `FixedQueue` goes front to back.
///
/// A `FixedQueue` with a known list of items can be initialized from an array:
/// ```
/// #![feature(fixed_queue)]
/// use alloc::collections::FixedQueue;
///
/// let fque = FixedQueue::from([1, 2, 3]);
/// ```
///
/// Since `FixedQueue` is an array ring buffer, its elements are contiguous
/// in memory.
///
/// [`push`]: FixedQueue::push
/// [`pop`]: FixedQueue::pop
#[derive(Debug)]
Member:

This should probably not derive Debug, but rather have an implementation that guarantees that the elements are printed from head to tail.

#[unstable(feature = "fixed_queue", issue = "126204")]
Member:

This points to a PR, not a tracking issue. It will have to be fixed before merging this.

pub struct FixedQueue<T, const N: usize> {
Member:

Have you considered an implementation where N is not a constant, but is runtime-determined, yet immutable?

buffer: [Option<T>; N],
Member:

This representation is... bad. It requires the memory leak in Index<Range<usize>> to yield [T]. Please come back with a version that uses MaybeUninit instead:

Suggested change
buffer: [Option<T>; N],
buffer: [MaybeUninit<T>; N],

Member:

This would, of course, require much more unsafe code in order to implement correctly, with which an overconfident yes-man cannot help. I strongly advise not even trying to ask ChatGPT, Codex, Copilot, or any other code-generation tool that you may or may not be using about unsafe code. They will give you wrong answers, or bad ones.

head: usize,
tail: usize,
len: usize,
}

impl<T, const N: usize> FixedQueue<T, N> {
/// Create a new FixedQueue with given fields.
#[inline]
const fn with(
buffer: [Option<T>; N],
head: usize,
tail: usize,
len: usize,
) -> FixedQueue<T, N> {
FixedQueue { buffer, head, tail, len }
}
Comment on lines +43 to +52
Member:

We do not need a private constructor fn that is simply the type's constructor.

Suggested change
/// Create a new FixedQueue with given fields.
#[inline]
const fn with(
buffer: [Option<T>; N],
head: usize,
tail: usize,
len: usize,
) -> FixedQueue<T, N> {
FixedQueue { buffer, head, tail, len }
}


/// Create a new FixedQueue with a given capacity.
#[unstable(feature = "fixed_queue", issue = "126204")]
pub const fn new() -> FixedQueue<T, N>
where
Option<T>: Copy,
Comment on lines +57 to +58
Member:

Suggested change
where
Option<T>: Copy,

{
FixedQueue::with([None; N], 0, 0, 0)
Member:

Suggested change
FixedQueue::with([None; N], 0, 0, 0)
FixedQueue { buffer: [const { None }; N], len: 0, head: 0, tail: 0 }

}

/// Return the max capacity of the FixedQueue.
#[inline]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub const fn capacity(&self) -> usize {
N
}

/// Returns the number of elements in the FixedQueue.
#[inline]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub const fn len(&self) -> usize {
self.len
}

/// Check if the queue is empty.
#[inline]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub const fn is_empty(&self) -> bool {
self.len == 0
}

/// Check if the queue is full.
#[inline]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub const fn is_full(&self) -> bool {
self.len == N
}

/// Removes all elements from the queue.
#[unstable(feature = "fixed_queue", issue = "126204")]
pub fn clear(&mut self) {
for i in 0..N {
drop(self.buffer[i].take());
}
Comment on lines +94 to +96
Member:

if N is large, this attempts to run drop N times, even if the amount of elements actually in the queue is 1 or even 0, which seems...

...computationally wasteful, even with branch predictors being real. It could thrash the entire cache, for instance.

self.head = 0;
self.tail = 0;
self.len = 0;
}

/// Fills the queue with an element.
#[unstable(feature = "fixed_queue", issue = "126204")]
pub fn fill(&mut self, item: T)
where
Option<T>: Copy,
Comment on lines +105 to +106
Member:

Per my remarks about Option<T>: Copy not being acceptable:

Suggested change
where
Option<T>: Copy,

{
self.buffer = [Some(item); N];
Member:

I believe this semantically constructs the entire array as a temporary, then writes it. In the past, this sort of thing has led to unfortunate amounts of memcpy on the stack. I think it would be better to do

Suggested change
self.buffer = [Some(item); N];
for element in self.iter_mut() {
*element = Some(item);
}

That should still optimize appropriately.

self.head = 0;
self.tail = 0;
self.len = N;
}

/// Add an element to the queue. If queue is full, the first element
/// is popped and returned.
#[unstable(feature = "fixed_queue", issue = "126204")]
pub fn push(&mut self, item: T) -> Option<T> {
// 'pop' first
let overwritten = self.buffer[self.tail].take();
// overwrite head/tail element
self.buffer[self.tail] = Some(item);
// shift tail with 'push'
self.tail = (self.tail + 1) % N;
if overwritten.is_some() {
// shift head ptr on collision
self.head = (self.head + 1) % N;
} else {
// increase len if no collision
self.len += 1;
}
return overwritten;
}
Comment on lines +114 to +132
Member:

Why should this take the front item instead of returning the same item?


/// Removes and returns the oldest element from the queue.
#[inline]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub fn pop(&mut self) -> Option<T>
Member:

Like VecDeque, this should probably be a Deque as well and have pop_front and pop_back.

where
Option<T>: Copy,
Comment on lines +137 to +139
Member:

Some of the most critical API being bounded on Option<T>: Copy, which requires T: Copy, is not really acceptable. It should work without it. This should have tests that confirm it works even if the queue is of Mutex<Vec<T>> or something else painful.

Suggested change
pub fn pop(&mut self) -> Option<T>
where
Option<T>: Copy,
pub fn pop(&mut self) -> Option<T> {

{
if self.len == 0 {
return None;
}
let popped = self.buffer[self.head].take();
self.head = (self.head + 1) % N;
self.len -= 1;
popped
}

/// Converts the queue into its array equivalent.
#[unstable(feature = "fixed_queue", issue = "126204")]
pub fn to_option_array(self) -> [Option<T>; N]
where
Option<T>: Copy,
Comment on lines +153 to +154
Member:

Suggested change
where
Option<T>: Copy,

{
let mut arr: [Option<T>; N] = [None; N];
for i in 0..N {
arr[i] = self.buffer[(self.head + i) % N];
}
arr
Comment on lines +156 to +160
Member:

...why is this not simply self.arr?

}

/// Converts the queue into its vec equivalent.
#[unstable(feature = "fixed_queue", issue = "126204")]
pub fn to_vec(self) -> Vec<T>
where
T: Copy,
Comment on lines +166 to +167
Member:

Suggested change
where
T: Copy,

{
let mut vec: Vec<T> = Vec::new();
for i in 0..N {
if let Some(e) = self.buffer[(self.head + i) % N] {
vec.push(e);
}
}
vec
Comment on lines +169 to +175
Member:

Why do you do this instead of constructing a slice reference to the array and then calling .to_vec() on that?

}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T, const N: usize> From<[T; N]> for FixedQueue<T, N> {
/// Creates a FixedQueue from a fixed size array.
fn from(array: [T; N]) -> Self {
FixedQueue::with(array.map(Some), 0, 0, N)
}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T: Copy, const N: usize> From<&[T; N]> for FixedQueue<T, N> {
Member:

Suggested change
impl<T: Copy, const N: usize> From<&[T; N]> for FixedQueue<T, N> {
impl<T, const N: usize> From<&[T; N]> for FixedQueue<T, N> {

/// Creates a FixedQueue from a fixed size slice.
fn from(array: &[T; N]) -> Self {
FixedQueue::with(array.map(Some), 0, 0, N)
Member:

Why do you use array.map here but not elsewhere when it makes sense?

}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T: Copy, const N: usize> From<&[T]> for FixedQueue<T, N> {
Member:

Suggested change
impl<T: Copy, const N: usize> From<&[T]> for FixedQueue<T, N> {
impl<T, const N: usize> From<&[T]> for FixedQueue<T, N> {

/// Creates a FixedQueue from an unsized slice. Copies at most N
/// elements from the slice, or the whole slice if it is shorter than N.
fn from(array: &[T]) -> Self {
let mut buf: [Option<T>; N] = [None; N];
let length = N.min(array.len());
for i in 0..length {
buf[i] = Some(array[i]);
}
FixedQueue::with(buf, 0, array.len() - 1, length)
}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T: PartialEq, const N: usize> PartialEq for FixedQueue<T, N> {
/// This method tests if a FixedQueue is equal to another FixedQueue.
fn eq(&self, other: &FixedQueue<T, N>) -> bool {
if other.len != self.len {
return false;
}
(0..N).all(|x| self.buffer[(self.head + x) % N] == other.buffer[(other.head + x) % N])
Comment on lines +214 to +217
Member:

Like my remarks on clear: this should take advantage of its knowledge of where the head and tail are instead of destroying the entire cache if N is large.

}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T: PartialEq, const N: usize, const M: usize> PartialEq<[T; M]> for FixedQueue<T, N> {
/// This method tests if a FixedQueue is equal to a fixed size array.
fn eq(&self, other: &[T; M]) -> bool {
if M != self.len {
return false;
}
(0..M).all(|x| self.buffer[(self.head + x) % N].as_ref() == Some(&other[x]))
}
}
Comment on lines +210 to +230
Member:

I do not think we should have two functions here, one should call the other's implementation or they should share a support function.


#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T, const N: usize> Index<usize> for FixedQueue<T, N> {
type Output = Option<T>;

fn index(&self, index: usize) -> &Self::Output {
if index >= N {
panic!("Index out of bounds");
}
&self.buffer[(self.head + index) % N]
}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T, const N: usize> Index<Range<usize>> for FixedQueue<T, N>
where
T: Copy + Default,
Comment on lines +246 to +247
Member:

Per my remarks, etc.

Suggested change
where
T: Copy + Default,

{
type Output = [T];

fn index(&self, range: Range<usize>) -> &Self::Output {
let start = range.start;
let end = range.end;

// check bounds
assert!(start <= end && end <= self.len, "Index out of bounds");

// create temporary array to store the results
let mut temp = Vec::with_capacity(end - start);
Comment on lines +258 to +259
Member (@workingjubilee, Jun 10, 2024):

...No.

  • This isn't an array, this is a Vec.
  • This is not an acceptable implementation.


for i in start..end {
let idx = (self.head + i) % N;
if let Some(value) = self.buffer[idx] {
temp.push(value);
}
}

// Return a slice from the temporary array
// SAFETY: This is safe because temp will live long enough within this function call.
let result = unsafe { slice::from_raw_parts(temp.as_ptr(), temp.len()) };
mem::forget(temp);
result
Comment on lines +268 to +272
Member:

This creates a memory leak for performing simple indexing. This should successfully run to completion if you use fixed_queue[0..N] on a FixedQueue of 65535 elements a total of 2147483647 times in a program, without increasing memory usage.

}
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T: fmt::Display, const N: usize> fmt::Display for FixedQueue<T, N> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
if self.len == 0 {
return write!(f, "{{}}");
}
write!(f, "{{")?;
for x in 0..(self.len - 1) {
write!(f, "{}, ", self.buffer[(self.head + x) % N].as_ref().unwrap())?;
}
write!(f, "{}}}", self.buffer[(self.head + self.len - 1) % N].as_ref().unwrap())
}
}
Comment on lines +276 to +288
Member:

VecDeque does not implement Display. Why should this?
