Often when specifying a SmallVec, I find that I may as well round up the inline buffer size to make use of any otherwise-unused padding.
For example, a Vec<u32> takes 24 bytes on my platform. Assuming the union feature is enabled, a SmallVec<[u32; 1]> also takes 24 bytes...but so does SmallVec<[u32; 4]>. I'd like to always make full use of those padding bytes since (as far as I can tell) there is no downside. But calculating this by hand is error-prone, because the element type is usually not literally u32 but some small register-sized struct.
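A quick way to check the sizes (the 24-byte values are what I see on my 64-bit platform with the `union` feature enabled; exact numbers are platform-dependent):

```rust
use smallvec::SmallVec;
use std::mem::size_of;

fn main() {
    // All three print 24 on a typical 64-bit target with the `union` feature.
    println!("Vec<u32>:           {}", size_of::<Vec<u32>>());
    println!("SmallVec<[u32; 1]>: {}", size_of::<SmallVec<[u32; 1]>>());
    println!("SmallVec<[u32; 4]>: {}", size_of::<SmallVec<[u32; 4]>>());
}
```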
Is there a way to perform this rounding implicitly? This might be more of a version 2.0 request - e.g. a specified capacity of zero could mean "default", which would mean the greater of 1 or whatever fits in the padding.
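As a rough sketch of what I mean, something like the const fn below could compute such a capacity today. The helper name is mine, and it hard-codes an assumption about smallvec's internal `union` layout (roughly a `usize` length field plus a union of the inline array and the heap pointer/capacity pair), which is an implementation detail rather than a documented guarantee:

```rust
use smallvec::SmallVec;
use std::mem::size_of;

// Hypothetical helper: the largest inline capacity that keeps
// SmallVec<[T; N]> no bigger than a plain Vec<T>, assuming one
// usize of overhead for the length field (an implementation-detail
// assumption, not part of smallvec's API).
const fn padded_capacity<T>() -> usize {
    let budget = size_of::<Vec<T>>() - size_of::<usize>();
    let elem = size_of::<T>();
    if elem == 0 || budget / elem == 0 {
        1 // always reserve at least one inline slot
    } else {
        budget / elem
    }
}

// Usable directly as an array length for a concrete element type:
type SmallU32Vec = SmallVec<[u32; padded_capacity::<u32>()]>;
```

Having something equivalent built in (e.g. triggered by a capacity of zero) would avoid every downstream crate re-deriving this from layout guesses.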