Commit 50d3c2a

Auto merge of #78736 - petrochenkov:lazyenum, r=Aaron1011

rustc_parse: Remove optimization for 0-length streams in `collect_tokens`

The optimization conflates empty token streams with an unknown token stream, which is at least suspicious, and doesn't affect performance because 0-length token streams are very rare.

r? `@Aaron1011`

2 parents: b63d05a + 2879ab7
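To see why the conflation matters, here is a hypothetical minimal sketch (not rustc's real API; `describe` and its string labels are made up for illustration): when `None` stands both for "no stream was captured" and for "an empty stream was captured", callers cannot tell the two cases apart.

```rust
// Hypothetical sketch of the conflation the commit message describes.
// Under the removed optimization, a 0-length capture was collapsed into
// `None`, erasing the distinction between "nothing captured" and
// "an empty capture".
fn describe(stream: Option<Vec<&str>>) -> &'static str {
    match stream {
        None => "no stream captured (or was it just empty?)",
        Some(toks) if toks.is_empty() => "empty stream",
        Some(_) => "non-empty stream",
    }
}

fn main() {
    assert_eq!(describe(None), "no stream captured (or was it just empty?)");
    assert_eq!(describe(Some(vec![])), "empty stream");
    assert_eq!(describe(Some(vec!["fn"])), "non-empty stream");
    println!("ok");
}
```

After this change, `collect_tokens` always returns `Some(...)`, so an empty capture is represented as an empty (lazily built) stream rather than as `None`.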

File tree

1 file changed (+5, -9 lines)
  • compiler/rustc_parse/src/parser

compiler/rustc_parse/src/parser/mod.rs

@@ -1180,8 +1180,7 @@ impl<'a> Parser<'a> {
     /// Records all tokens consumed by the provided callback,
     /// including the current token. These tokens are collected
     /// into a `LazyTokenStream`, and returned along with the result
-    /// of the callback. The returned `LazyTokenStream` will be `None`
-    /// if not tokens were captured.
+    /// of the callback.
     ///
     /// Note: If your callback consumes an opening delimiter
     /// (including the case where you call `collect_tokens`
@@ -1203,17 +1202,14 @@ impl<'a> Parser<'a> {
 
         let ret = f(self)?;
 
-        // We didn't capture any tokens
-        let num_calls = self.token_cursor.num_next_calls - cursor_snapshot.num_next_calls;
-        if num_calls == 0 {
-            return Ok((ret, None));
-        }
-
         // Produces a `TokenStream` on-demand. Using `cursor_snapshot`
         // and `num_calls`, we can reconstruct the `TokenStream` seen
         // by the callback. This allows us to avoid producing a `TokenStream`
         // if it is never needed - for example, a captured `macro_rules!`
         // argument that is never passed to a proc macro.
+        // In practice token stream creation happens rarely compared to
+        // calls to `collect_tokens` (see some statistics in #78736),
+        // so we are doing as little up-front work as possible.
         //
         // This also makes `Parser` very cheap to clone, since
         // there is no intermediate collection buffer to clone.
@@ -1247,8 +1243,8 @@ impl<'a> Parser<'a> {
 
         let lazy_impl = LazyTokenStreamImpl {
             start_token,
+            num_calls: self.token_cursor.num_next_calls - cursor_snapshot.num_next_calls,
             cursor_snapshot,
-            num_calls,
             desugar_doc_comments: self.desugar_doc_comments,
         };
         Ok((ret, Some(LazyTokenStream::new(lazy_impl))))
