Avoid producing NoDelim values in FrameData.
nnethercote committed Apr 27, 2022
commit 9665da35cc493458e94d0b8a8931bf3d785071dc
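
For orientation before reading the diff: below is a minimal, self-contained sketch of the representational change this commit makes to FrameData. The real DelimToken, Span, and token-tree types live in rustc_ast and rustc_span; the stand-ins here are simplified placeholders, and the payload type of `inner` is elided.

// Simplified stand-ins for the real rustc types.
#[derive(Debug, Clone, Copy, PartialEq)]
enum DelimToken {
    Paren,
    Bracket,
    Brace,
    NoDelim,
}

#[derive(Debug, Clone, Copy)]
struct Span(u32, u32);

// Before this commit: the first (top-level) frame was encoded with sentinel
// values, DUMMY_SP for `open` and DelimToken::NoDelim for `open_delim`.
struct FrameDataOld {
    open: Span,
    open_delim: DelimToken,
    inner: Vec<String>, // the real code stores (AttrAnnotatedTokenTree, Spacing) pairs
}

// After this commit: the absence of an opening delimiter is explicit.
// `None` marks the first frame; every other frame records the delimiter
// token and span that opened it.
struct FrameDataNew {
    open_delim_sp: Option<(DelimToken, Span)>,
    inner: Vec<String>,
}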
27 changes: 15 additions & 12 deletions compiler/rustc_parse/src/parser/attr_wrapper.rs
@@ -5,7 +5,7 @@ use rustc_ast::tokenstream::{AttrAnnotatedTokenTree, DelimSpan, LazyTokenStream,
 use rustc_ast::{self as ast};
 use rustc_ast::{AstLike, AttrVec, Attribute};
 use rustc_errors::PResult;
-use rustc_span::{sym, Span, DUMMY_SP};
+use rustc_span::{sym, Span};
 
 use std::convert::TryInto;
 use std::ops::Range;
@@ -400,24 +400,26 @@ fn make_token_stream(
 ) -> AttrAnnotatedTokenStream {
     #[derive(Debug)]
     struct FrameData {
-        open: Span,
-        open_delim: DelimToken,
+        // This is `None` for the first frame, `Some` for all others.
+        open_delim_sp: Option<(DelimToken, Span)>,
         inner: Vec<(AttrAnnotatedTokenTree, Spacing)>,
     }
-    let mut stack =
-        vec![FrameData { open: DUMMY_SP, open_delim: DelimToken::NoDelim, inner: vec![] }];
+    let mut stack = vec![FrameData { open_delim_sp: None, inner: vec![] }];
     let mut token_and_spacing = iter.next();
     while let Some((token, spacing)) = token_and_spacing {
         match token {
             FlatToken::Token(Token { kind: TokenKind::OpenDelim(delim), span }) => {
-                stack.push(FrameData { open: span, open_delim: delim, inner: vec![] });
+                stack.push(FrameData { open_delim_sp: Some((delim, span)), inner: vec![] });
             }
             FlatToken::Token(Token { kind: TokenKind::CloseDelim(delim), span }) => {
                 // HACK: If we encounter a mismatched `None` delimiter at the top
                 // level, just ignore it.
Contributor:
Is this still necessary?
There's no longer a DelimToken::NoDelim at the top level.

Contributor:
There's also a similar // HACK comment lower in the same function.

Contributor (Author):
The "mismatched None delimiter at the top level" here refers to token, which comes from the iterator, not the frame. Indeed, the condition looks for a non-NoDelim in the frame. So this change preserves the existing behaviour, as I understand it.

More generally: there are three HACK comments in this function, and a comment at the top of the function referring to #67062. The HACK comments are on three if statements. I tried changing the bodies of those three if statements to panic!() and the test suite passed. I then tried removing all three if statements and the test suite again passed.

These if statements were added in #82608, with an explanation here. I don't know if (a) they never did anything useful, (b) they do something useful but we have no test coverage for them, or (c) something has changed in the meantime that means they are no longer necessary.

I suggest we leave this change as is, and consider removing these hacks in a separate PR. That way if there are any regressions they'll be easier to deal with.

Contributor (Author):
I have filed #96543 to remove the hacks.
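
As a side note to this thread, the following is an illustrative, heavily simplified sketch of the frame-merging behaviour that the // HACK branches implement: when a real closing delimiter arrives while the innermost frame was opened by a NoDelim, that frame's contents are folded into the frame above it and the close is retried there (turning `[ { < first second } third ]` into `[ { first second } third ]`, as the in-code comment puts it). The types and the handling of the closed frame are toy stand-ins, not rustc's.

#[derive(Debug, Clone, Copy, PartialEq)]
enum Delim {
    Bracket, // [ ]
    Brace,   // { }
    NoDelim, // invisible delimiter, written `<` in the comments above
}

#[derive(Debug)]
struct Frame {
    // None for the outermost frame, Some(delim) for frames opened by a token.
    open_delim: Option<Delim>,
    inner: Vec<String>,
}

// On a real closing delimiter: if the innermost frame was opened by NoDelim,
// fold its contents into the parent and retry the close one level up.
fn close_frame(stack: &mut Vec<Frame>, closing: Delim) {
    loop {
        let frame = stack.pop().expect("unbalanced delimiters");
        if closing != Delim::NoDelim && frame.open_delim == Some(Delim::NoDelim) {
            stack.last_mut().unwrap().inner.extend(frame.inner);
            continue;
        }
        assert_eq!(frame.open_delim, Some(closing), "mismatched delimiters");
        // The real code turns the closed frame into a Delimited token tree;
        // for illustration we simply flatten it into the parent frame.
        stack.last_mut().unwrap().inner.extend(frame.inner);
        break;
    }
}

fn main() {
    let mut stack = vec![
        Frame { open_delim: None, inner: vec![] },               // top level
        Frame { open_delim: Some(Delim::Brace), inner: vec![] }, // `{`
        Frame {
            open_delim: Some(Delim::NoDelim),                    // `<`
            inner: vec!["first".into(), "second".into()],
        },
    ];
    close_frame(&mut stack, Delim::Brace); // a `}` arrives while `<` is still open
    assert_eq!(stack.len(), 1);
    println!("{:?}", stack[0].inner); // ["first", "second"]
}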

                 if matches!(delim, DelimToken::NoDelim)
                     && (stack.len() == 1
-                        || !matches!(stack.last_mut().unwrap().open_delim, DelimToken::NoDelim))
+                        || !matches!(
+                            stack.last_mut().unwrap().open_delim_sp.unwrap().0,
+                            DelimToken::NoDelim
+                        ))
                 {
                     token_and_spacing = iter.next();
                     continue;
@@ -430,7 +432,7 @@ fn make_token_stream(
                 // merge our current frame with the one above it. That is, transform
                 // `[ { < first second } third ]` into `[ { first second } third ]`
                 if !matches!(delim, DelimToken::NoDelim)
-                    && matches!(frame_data.open_delim, DelimToken::NoDelim)
+                    && matches!(frame_data.open_delim_sp.unwrap().0, DelimToken::NoDelim)
                 {
                     stack.last_mut().unwrap().inner.extend(frame_data.inner);
                     // Process our closing delimiter again, this time at the previous
@@ -439,12 +441,13 @@ fn make_token_stream(
                     continue;
                 }
 
+                let (open_delim, open_sp) = frame_data.open_delim_sp.unwrap();
                 assert_eq!(
-                    frame_data.open_delim, delim,
+                    open_delim, delim,
                     "Mismatched open/close delims: open={:?} close={:?}",
-                    frame_data.open, span
+                    open_delim, span
                 );
-                let dspan = DelimSpan::from_pair(frame_data.open, span);
+                let dspan = DelimSpan::from_pair(open_sp, span);
                 let stream = AttrAnnotatedTokenStream::new(frame_data.inner);
                 let delimited = AttrAnnotatedTokenTree::Delimited(dspan, delim, stream);
                 stack
@@ -472,7 +475,7 @@ fn make_token_stream(
     // HACK: If we don't have a closing `None` delimiter for our last
     // frame, merge the frame with the top-level frame. That is,
     // turn `< first second` into `first second`
-    if stack.len() == 2 && stack[1].open_delim == DelimToken::NoDelim {
+    if stack.len() == 2 && stack[1].open_delim_sp.unwrap().0 == DelimToken::NoDelim {
         let temp_buf = stack.pop().unwrap();
         stack.last_mut().unwrap().inner.extend(temp_buf.inner);
     }