Add FixedQueue: A Fixed Size Queue Implementation #126204
Conversation
A new data structure, `FixedQueue`, which is a fixed-size queue implemented using a static ring buffer.

* Both push and pop operations should be constant time.
* Elements should be stored contiguously in memory.
* Unit tests should cover all methods and functionality.

I recognize that Rust already has `VecDeque`, but it's growable and has a more general implementation. Please correct me if I'm wrong, but I believe this might offer a more efficient solution for scenarios where a fixed-size queue is sufficient. I want to learn and am open to any suggestions or critiques.

Example Usage:

```rust
use std::collections::FixedQueue;

let mut fque = FixedQueue::<i32, 3>::new();
fque.push(1);
fque.push(2);
fque.push(3);
assert_eq!(fque.pop(), Some(1));
assert_eq!(fque.pop(), Some(2));
fque.push(4);
assert_eq!(fque.to_vec(), vec![3, 4]);
```
r? @Nilstrieb

rustbot has assigned @Nilstrieb.
mingw-check-tidy formatting errors fix
Fixing file formatting errors
Fixing file formatting errors
insert tests
moved fixed queue into separate folder
allow fixed queue to be a module
add unstable feature
include vec and string
add feature
unstable tags
including docs
add symbol
sort by alphabetical order
feature gate test
std to alloc (perhaps wrong place)
use feature in example
The CI job failed (possible cause of the failure guessed by this bot).
I see you like AI, and also have submitted some unusual Rust code that is... unidiomatic, especially considering the surrounding standard library idioms, as I hope my review comments illuminate. It more closely resembles C or JavaScript code I have read before. Before I kick this over to T-libs-api, I feel duty-bound to simply ask: Did you use an LLM to generate this code?
```rust
for i in 0..N {
    drop(self.buffer[i].take());
}
```
if N is large, this attempts to run drop N times, even if the amount of elements actually in the queue is 1 or even 0, which seems...
...computationally wasteful, even with branch predictors being real. It could thrash the entire cache, for instance.
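The reviewer's point, walking only the occupied slots instead of all `N`, could be sketched as below. This is a hedged sketch: the field names `buffer`, `head`, and `len` are assumptions about the PR's internal layout, not code from the PR itself.

```rust
// Hypothetical layout, assumed for illustration: `len` occupied slots
// starting at ring position `head`.
struct FixedQueue<T, const N: usize> {
    buffer: [Option<T>; N],
    head: usize,
    len: usize,
}

impl<T, const N: usize> FixedQueue<T, N> {
    fn clear(&mut self) {
        // Drop only the `len` occupied ring positions, not all N slots.
        for i in 0..self.len {
            let idx = (self.head + i) % N;
            drop(self.buffer[idx].take());
        }
        self.len = 0;
    }
}

fn main() {
    let mut q = FixedQueue::<i32, 8> { buffer: [None; 8], head: 0, len: 0 };
    q.buffer[0] = Some(1);
    q.buffer[1] = Some(2);
    q.len = 2;
    q.clear(); // touches 2 slots, not 8
    assert!(q.buffer.iter().all(|s| s.is_none()));
    assert_eq!(q.len, 0);
}
```

With an empty queue this loop body never runs at all, so the `N`-iteration worst case only occurs when the queue is actually full.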
```rust
/// [`pop`]: FixedQueue::pop
#[derive(Debug)]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub struct FixedQueue<T, const N: usize> {
```
Have you considered an implementation where N is not a constant, but is runtime-determined, yet immutable?
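One way to read the reviewer's question is a queue whose capacity is chosen at construction time rather than as a const generic, for example backed by a boxed slice. The type `RuntimeQueue` and its field names below are hypothetical, offered only as a sketch of that design.

```rust
// Sketch: capacity fixed after construction, but chosen at runtime.
struct RuntimeQueue<T> {
    buffer: Box<[Option<T>]>,
    head: usize,
    len: usize,
}

impl<T> RuntimeQueue<T> {
    fn with_capacity(cap: usize) -> Self {
        let mut v = Vec::with_capacity(cap);
        v.resize_with(cap, || None);
        RuntimeQueue { buffer: v.into_boxed_slice(), head: 0, len: 0 }
    }

    fn push(&mut self, item: T) -> bool {
        if self.len == self.buffer.len() {
            return false; // full: the capacity is immutable
        }
        let tail = (self.head + self.len) % self.buffer.len();
        self.buffer[tail] = Some(item);
        self.len += 1;
        true
    }

    fn pop(&mut self) -> Option<T> {
        if self.len == 0 {
            return None;
        }
        let item = self.buffer[self.head].take();
        self.head = (self.head + 1) % self.buffer.len();
        self.len -= 1;
        item
    }
}

fn main() {
    let mut q = RuntimeQueue::with_capacity(2); // capacity decided at runtime
    assert!(q.push(1));
    assert!(q.push(2));
    assert!(!q.push(3)); // full
    assert_eq!(q.pop(), Some(1));
}
```

The trade-off is one heap allocation and a non-const capacity, in exchange for not monomorphizing the type per size.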
```rust
where
    Option<T>: Copy,
{
    self.buffer = [Some(item); N];
```
I believe this semantically constructs the entire array as a temporary, then writes it. In the past, this sort of thing has led to unfortunate amounts of `memcpy` on the stack. I think it would be better to do:

```diff
- self.buffer = [Some(item); N];
+ for element in self.iter_mut() {
+     *element = Some(item);
+ }
```

That should still optimize appropriately.
```rust
pub fn pop(&mut self) -> Option<T>
where
    Option<T>: Copy,
```
Some of the most critical API being bounded on `Option<T>: Copy`, which requires `T: Copy`, is not really acceptable. It should work without it. This should have tests that confirm it works even if the queue is of `Mutex<Vec<T>>` or something else painful.
```diff
- pub fn pop(&mut self) -> Option<T>
- where
-     Option<T>: Copy,
+ pub fn pop(&mut self) -> Option<T> {
```
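The `Copy` bound is avoidable because `Option::take` moves the value out of the slot, leaving `None` behind. A hedged sketch, with `buffer`, `head`, and `len` assumed as the internal layout, tested against a deliberately non-`Copy` element type:

```rust
use std::sync::Mutex;

struct FixedQueue<T, const N: usize> {
    buffer: [Option<T>; N],
    head: usize,
    len: usize,
}

impl<T, const N: usize> FixedQueue<T, N> {
    fn new() -> Self {
        // `[(); N].map(..)` initializes the array without `Option<T>: Copy`.
        FixedQueue { buffer: [(); N].map(|_| None), head: 0, len: 0 }
    }

    fn push(&mut self, item: T) -> bool {
        if self.len == N {
            return false;
        }
        self.buffer[(self.head + self.len) % N] = Some(item);
        self.len += 1;
        true
    }

    fn pop(&mut self) -> Option<T> {
        // `take()` moves the element out; no `T: Copy` required.
        let item = self.buffer[self.head].take()?;
        self.head = (self.head + 1) % N;
        self.len -= 1;
        Some(item)
    }
}

fn main() {
    // Works even for `Mutex<Vec<i32>>`, which is neither Copy nor Clone.
    let mut q = FixedQueue::<Mutex<Vec<i32>>, 2>::new();
    q.push(Mutex::new(vec![1, 2]));
    let popped = q.pop().unwrap();
    assert_eq!(*popped.lock().unwrap(), vec![1, 2]);
    assert!(q.pop().is_none());
}
```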
```rust
/// New collection
(unstable, fixed_queue, "1.77.1", Some(126204)),
```
We don't use this for library features.
```diff
- /// New collection
- (unstable, fixed_queue, "1.77.1", Some(126204)),
```
```rust
}

#[unstable(feature = "fixed_queue", issue = "126204")]
impl<T: Copy, const N: usize> From<&[T]> for FixedQueue<T, N> {
```
```diff
- impl<T: Copy, const N: usize> From<&[T]> for FixedQueue<T, N> {
+ impl<T, const N: usize> From<&[T]> for FixedQueue<T, N> {
```
```rust
///
/// [`push`]: FixedQueue::push
/// [`pop`]: FixedQueue::pop
#[derive(Debug)]
```
This should probably not derive Debug, but rather have an implementation that guarantees that the elements are printed from head to tail.
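A hand-written `Debug` impl that walks the ring from head to tail might look like the sketch below; again, the `buffer`/`head`/`len` layout is an assumption about the PR, not its actual code.

```rust
use std::fmt;

struct FixedQueue<T, const N: usize> {
    buffer: [Option<T>; N],
    head: usize,
    len: usize,
}

impl<T: fmt::Debug, const N: usize> fmt::Debug for FixedQueue<T, N> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Print elements in logical (head-to-tail) order, not storage order.
        f.debug_list()
            .entries(
                (0..self.len).filter_map(|i| self.buffer[(self.head + i) % N].as_ref()),
            )
            .finish()
    }
}

fn main() {
    // Storage order is [3, _, 2] but the queue logically holds 2 then 3.
    let q = FixedQueue::<i32, 3> {
        buffer: [Some(3), None, Some(2)],
        head: 2,
        len: 2,
    };
    assert_eq!(format!("{:?}", q), "[2, 3]");
}
```

Deriving `Debug` would instead expose the raw buffer order and the `None` holes, which is confusing output for a queue.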
```rust
// create temporary array to store the results
let mut temp = Vec::with_capacity(end - start);
```
...No.
- This isn't an array, this is a Vec.
- This is not an acceptable implementation.
```rust
// Return a slice from the temporary array
// SAFETY: This is safe because temp will live long enough within this function call.
let result = unsafe { slice::from_raw_parts(temp.as_ptr(), temp.len()) };
mem::forget(temp);
result
```
This creates a memory leak for performing simple indexing. This should successfully run to completion if you use `fixed_queue[0..N]` on a FixedQueue of 65535 elements a total of 2147483647 times in a program, without increasing memory usage.
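A leak-free alternative, if the `Option<T>` representation is kept, is to hand out an iterator over the requested logical range instead of fabricating a `&[T]` from a leaked temporary. A sketch under the same assumed `buffer`/`head`/`len` layout:

```rust
struct FixedQueue<T, const N: usize> {
    buffer: [Option<T>; N],
    head: usize,
    len: usize,
}

impl<T, const N: usize> FixedQueue<T, N> {
    /// Borrow the elements at logical positions `start..end`,
    /// with no allocation and nothing to leak.
    fn range(&self, start: usize, end: usize) -> impl Iterator<Item = &T> {
        (start..end.min(self.len))
            .filter_map(move |i| self.buffer[(self.head + i) % N].as_ref())
    }
}

fn main() {
    let q = FixedQueue::<i32, 4> {
        buffer: [Some(10), Some(20), Some(30), None],
        head: 0,
        len: 3,
    };
    let collected: Vec<i32> = q.range(1, 3).copied().collect();
    assert_eq!(collected, vec![20, 30]);
}
```

Returning a true `&[T]` slice requires the elements to be contiguous and `T`-typed in memory, which the `[Option<T>; N]` representation cannot guarantee; that is what motivates the `MaybeUninit` suggestion below in this review.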
```rust
#[derive(Debug)]
#[unstable(feature = "fixed_queue", issue = "126204")]
pub struct FixedQueue<T, const N: usize> {
    buffer: [Option<T>; N],
```
This representation is... bad. It requires the memory leak in `Index<Range<usize>>` to yield `[T]`. Please come back with a version that uses `MaybeUninit` instead:

```diff
- buffer: [Option<T>; N],
+ buffer: [MaybeUninit<T>; N],
```
This would, of course, require much more `unsafe` code in order to implement correctly, with which an overconfident yes-man cannot help. I strongly advise not even trying to ask ChatGPT, Codex, Copilot, or any other code-generation tool that you may or may not be using about `unsafe` code. They will give you wrong answers, or bad ones.
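For orientation only, here is a minimal hedged sketch of the `MaybeUninit` representation being suggested, with the invariant that exactly `len` slots starting at `head` are initialized. The field names are assumptions, and a real implementation would need far more care (and review) around its `unsafe` blocks than this sketch shows.

```rust
use std::mem::MaybeUninit;

struct FixedQueue<T, const N: usize> {
    buffer: [MaybeUninit<T>; N],
    head: usize,
    len: usize,
}

impl<T, const N: usize> FixedQueue<T, N> {
    fn new() -> Self {
        FixedQueue {
            // SAFETY: an array of MaybeUninit is always valid uninitialized.
            buffer: unsafe { MaybeUninit::uninit().assume_init() },
            head: 0,
            len: 0,
        }
    }

    fn push(&mut self, item: T) -> bool {
        if self.len == N {
            return false;
        }
        self.buffer[(self.head + self.len) % N].write(item);
        self.len += 1;
        true
    }

    fn pop(&mut self) -> Option<T> {
        if self.len == 0 {
            return None;
        }
        // SAFETY: slot `head` is initialized whenever len > 0, and we
        // advance past it so it is never read again.
        let item = unsafe { self.buffer[self.head].assume_init_read() };
        self.head = (self.head + 1) % N;
        self.len -= 1;
        Some(item)
    }
}

// Without this, elements still in the queue would never be dropped.
impl<T, const N: usize> Drop for FixedQueue<T, N> {
    fn drop(&mut self) {
        while self.pop().is_some() {}
    }
}

fn main() {
    let mut q = FixedQueue::<String, 2>::new();
    q.push("a".to_string());
    q.push("b".to_string());
    assert_eq!(q.pop().as_deref(), Some("a"));
}
```

Note how this representation stores elements as bare `T` values, so a contiguous run of the ring really can be handed out as a `&[T]`, which the `Option<T>` layout could not do without leaking.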
I strongly recommend you actually learn to run the standard library build on a local machine... or even a remote one you have full shell access to... if you want to contribute significant data structures to rust-lang/rust, instead of using the GitHub web interface for all of your edits. Apologies if you already are, or are having difficulties. I can help on the Rust Zulip if you wish to try.
"A more efficient solution for scenarios where..." This particular clause is borderline irrelevant for deciding whether something is added to the standard library. There are, when you consider significant implementation differences, millions of data structures that are more efficient or optimal in some specific case. The standard library is not necessarily aiming to have millions of data structures.

The path to adding API is expanded on at greater length here: https://std-dev-guide.rust-lang.org/development/feature-lifecycle.html

There is probably some appetite for accepting an
A whole new data structure is a big enough thing that you absolutely need at least an API change proposal for it. It might even end up needing an RFC. As such, I'm actually just going to close this -- a new data structure is a much bigger topic than just a new inherent method here or there. In particular, there's people working on things like rust-lang/wg-allocators#93 with the goal that

I would suggest that you turn your code into a crate instead of submitting it here. Then people can use it immediately, whereas if you submit it to