impl PartialEq/Eq for TokenStream and TokenTree #469
Conversation
While this is not strictly required for implementing proc-macros, it helps a lot in tests, where one can `assert_eq!()` expected results, comparing TokenStreams/TokenTrees directly with each other.
Thanks for the PR!
You mentioned `assert_eq!` in tests as the motivation, but I think this feature would be a poor fit for that. Consider a test:
let actual = derive_serialize(input);
let expected = quote! {
    impl<T> ::serde::Serialize for Thing {
        fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
        where
            S: ::serde::Serializer;
    }
};
assert_eq!(actual, expected);
With this PR, a failure looks like this:
thread 'repro' panicked at tests/repro.rs:25:5:
assertion `left == right` failed
left: TokenStream [Ident { sym: impl }, Punct { char: '<', spacing: Alone }, I
dent { sym: T }, Punct { char: '>', spacing: Alone }, Ident { sym: serde }, Punc
t { char: ':', spacing: Joint }, Punct { char: ':', spacing: Alone }, Ident { sy
m: Serialize }, Ident { sym: for }, Ident { sym: Thing }, Group { delimiter: Bra
ce, stream: TokenStream [Ident { sym: fn }, Ident { sym: serialize }, Punct { ch
ar: '<', spacing: Alone }, Ident { sym: S }, Punct { char: '>', spacing: Alone }
, Group { delimiter: Parenthesis, stream: TokenStream [Punct { char: '&', spacin
g: Alone }, Ident { sym: self }, Punct { char: ',', spacing: Alone }, Ident { sy
m: serializer }, Punct { char: ':', spacing: Alone }, Ident { sym: S }] }, Punct
{ char: '-', spacing: Joint }, Punct { char: '>', spacing: Alone }, Ident { sym
: Result }, Punct { char: '<', spacing: Alone }, Ident { sym: S }, Punct { char:
':', spacing: Joint }, Punct { char: ':', spacing: Alone }, Ident { sym: Ok },
Punct { char: ',', spacing: Alone }, Ident { sym: S }, Punct { char: ':', spacin
g: Joint }, Punct { char: ':', spacing: Alone }, Ident { sym: Error }, Punct { c
har: '>', spacing: Alone }, Ident { sym: where }, Ident { sym: S }, Punct { char
: ':', spacing: Alone }, Ident { sym: serde }, Punct { char: ':', spacing: Joint
}, Punct { char: ':', spacing: Alone }, Ident { sym: Serializer }, Punct { char
: ';', spacing: Alone }] }]
right: TokenStream [Ident { sym: impl }, Punct { char: '<', spacing: Alone }, I
dent { sym: T }, Punct { char: '>', spacing: Alone }, Punct { char: ':', spacing
: Joint }, Punct { char: ':', spacing: Alone }, Ident { sym: serde }, Punct { ch
ar: ':', spacing: Joint }, Punct { char: ':', spacing: Alone }, Ident { sym: Ser
ialize }, Ident { sym: for }, Ident { sym: Thing }, Group { delimiter: Brace, st
ream: TokenStream [Ident { sym: fn }, Ident { sym: serialize }, Punct { char: '<
', spacing: Alone }, Ident { sym: S }, Punct { char: '>', spacing: Alone }, Grou
p { delimiter: Parenthesis, stream: TokenStream [Punct { char: '&', spacing: Alo
ne }, Ident { sym: self }, Punct { char: ',', spacing: Alone }, Ident { sym: ser
ializer }, Punct { char: ':', spacing: Alone }, Ident { sym: S }] }, Punct { cha
r: '-', spacing: Joint }, Punct { char: '>', spacing: Alone }, Ident { sym: Resu
lt }, Punct { char: '<', spacing: Alone }, Ident { sym: S }, Punct { char: ':',
spacing: Joint }, Punct { char: ':', spacing: Alone }, Ident { sym: Ok }, Punct
{ char: ',', spacing: Alone }, Ident { sym: S }, Punct { char: ':', spacing: Joi
nt }, Punct { char: ':', spacing: Alone }, Ident { sym: Error }, Punct { char: '
>', spacing: Alone }, Ident { sym: where }, Ident { sym: S }, Punct { char: ':',
spacing: Alone }, Punct { char: ':', spacing: Joint }, Punct { char: ':', spacin
g: Alone }, Ident { sym: serde }, Punct { char: ':', spacing: Joint }, Punct { c
har: ':', spacing: Alone }, Ident { sym: Serializer }, Punct { char: ';', spacin
g: Alone }] }]
versus what you can already get, without this PR, using `assert_eq!(actual.to_string(), expected.to_string())`.
thread 'repro' panicked at tests/repro.rs:25:5:
assertion `left == right` failed
left: "impl < T > serde :: Serialize for Thing { fn serialize < S > (& self ,
serializer : S) -> Result < S :: Ok , S :: Error > where S : serde :: Serializer
; }"
right: "impl < T > :: serde :: Serialize for Thing { fn serialize < S > (& self
, serializer : S) -> Result < S :: Ok , S :: Error > where S : :: serde :: Seri
alizer ; }"
Unarguably the failure output looks pretty ugly; so far I am using `to_string()` already. I had the impression that `to_string()` might not be very well defined with respect to spacing in all cases. If that is not true, then this PR/idea is indeed void. EDIT: Actually this concern is moot as long as one calls `TokenStream::to_string()` on the 'expected' side too. My worries were about providing string literals or constructing/concatenating the expected side in some other way.
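A minimal sketch of the spacing concern raised above, using only plain strings (the `normalize` helper is hypothetical and not part of this PR or of proc-macro2): whitespace normalization can equate two renderings that differ only in the *amount* of whitespace, but it cannot equate `TokenStream::to_string()` output, which separates every token with a space, against a hand-written literal that omits those spaces.

```rust
// Hypothetical helper: collapse all runs of whitespace to single spaces.
fn normalize(s: &str) -> String {
    s.split_whitespace().collect::<Vec<_>>().join(" ")
}

fn main() {
    // Two renderings differing only in whitespace quantity compare equal:
    assert_eq!(
        normalize("impl < T > :: serde"),
        normalize("impl  < T >\n:: serde")
    );
    // But a hand-written literal without token-separating spaces does not
    // match the spaced-out `to_string()` style, which is the worry above:
    assert_ne!(
        normalize("impl<T> ::serde"),
        normalize("impl < T > :: serde")
    );
    println!("ok");
}
```

This is why comparing via strings only works reliably when both sides of the assertion are produced by `TokenStream::to_string()`.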
This addition is pretty trivial, so I just went ahead and implemented it in this PR. Comparing TokenTrees doesn't come entirely for free, though, since Literals have to be converted with `to_string()` just to be dropped afterward. IMO this is good enough for the intended use in tests.
What's your opinion about adding this?
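To make the `to_string()` cost mentioned above concrete, here is a simplified sketch of the structural-equality approach: the `Tree` enum below is a hypothetical stand-in, not proc-macro2's actual types, and the literal variant is compared via its rendered string form, which is then immediately discarded.

```rust
// Hypothetical simplified token tree, illustrating structural equality
// where an opaque literal can only be compared via its string rendering.
#[derive(Debug)]
enum Tree {
    Ident(String),
    Punct(char),
    Literal(u64), // stand-in for an opaque literal type
    Group(Vec<Tree>),
}

impl PartialEq for Tree {
    fn eq(&self, other: &Tree) -> bool {
        match (self, other) {
            (Tree::Ident(a), Tree::Ident(b)) => a == b,
            (Tree::Punct(a), Tree::Punct(b)) => a == b,
            // The literal is rendered to a String just to be compared and
            // then dropped -- the cost noted in the PR description.
            (Tree::Literal(a), Tree::Literal(b)) => a.to_string() == b.to_string(),
            (Tree::Group(a), Tree::Group(b)) => a == b,
            _ => false,
        }
    }
}

fn main() {
    let a = Tree::Group(vec![Tree::Ident("x".into()), Tree::Punct('+'), Tree::Literal(1)]);
    let b = Tree::Group(vec![Tree::Ident("x".into()), Tree::Punct('+'), Tree::Literal(1)]);
    assert_eq!(a, b);
    println!("equal");
}
```

Note that `Group` equality recurses through `Vec`'s `PartialEq`, so nested streams compare element by element, as the proposed impl would.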