Export an array of all tokens from ct_token_map #577
Conversation
This is a part of the system I haven't thought about for a while. Is it possible to do the same thing with …?
Yes, this seems to work.
OK, then I think we don't need to generate the array?
That will only work when the user has a generated lexer. If there's a custom lexer with …
I take your point.
Force-pushed from e8356d9 to 49ba5e3.
Took a bit of head scratching until I grokked it (building a token stream directly rather than an intermediate vector!). Seems fine to me now, unless Laurence has any further comments.
@ratmice Thanks for the review! @taminomara Thanks for the PR!
This helps with writing structured input adapters for fuzzing. When fuzzing a parser specifically (as opposed to fuzzing lexer and parser at the same time), we'd like to supply it with an array of valid lexemes. This export helps us build such an array as we don't have to manually list all tokens in a fuzzing entry point.
Note that I didn't implement this functionality for generated lexers because there's already a way to get all tokens via `mod_l::lexerdef().iter_rules()`.
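For reference, here is a minimal sketch of that generated-lexer route. The module name `calc_l` and file name `calc.l` are illustrative, and exactly how a `Rule` exposes its token ID (public field vs. accessor) varies between lrlex versions:

```rust
use lrlex::{lrlex_mod, LexerDef};

// Pulls in the lexer module that lrlex generated at build time from `calc.l`
// (module and file names are illustrative).
lrlex_mod!("calc.l");

fn all_token_ids() -> Vec<u32> {
    calc_l::lexerdef()
        .iter_rules()
        // Assumption: each rule exposes its (optional) token ID as `tok_id`;
        // rules that don't produce a token (e.g. skipped whitespace) yield None.
        .filter_map(|r| r.tok_id)
        .collect()
}
```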
Example of a fuzzing implementation after this PR:
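The example itself wasn't captured here; the following is a rough reconstruction of what such a fuzz target could look like. It assumes cargo-fuzz/libFuzzer, an lrpar-generated parser module (`calc_y` below, illustrative), and that the module generated by `ct_token_map` (here `calc_token_map`) exports the new array as `TOK_IDS`; that name, the crate name `fuzzed_crate`, and the exact custom-lexer constructor signatures (`LRNonStreamingLexer::new`, `DefaultLexeme::new`, `NewlineCache`) are assumptions from memory of the lrlex/lrpar APIs, not verbatim from this PR:

```rust
#![no_main]

use libfuzzer_sys::fuzz_target;
use lrlex::{DefaultLexeme, LRNonStreamingLexer};
use lrpar::NewlineCache;

// `calc_y` (lrpar-generated parser) and `calc_token_map` (module generated by
// `ct_token_map`) are illustrative names; `TOK_IDS` stands in for whatever
// this PR actually names the exported token array.
use fuzzed_crate::{calc_token_map, calc_y};

fuzz_target!(|data: &[u8]| {
    if data.is_empty() {
        return;
    }
    // Map every input byte onto a valid token ID so that arbitrary fuzz
    // input always becomes a structurally valid token stream.
    let lexemes: Vec<Result<DefaultLexeme<u32>, lrlex::LRLexError>> = data
        .iter()
        .enumerate()
        .map(|(i, b)| {
            let tok_id =
                calc_token_map::TOK_IDS[usize::from(*b) % calc_token_map::TOK_IDS.len()];
            // Each lexeme gets a dummy one-byte span into the input.
            Ok(DefaultLexeme::new(tok_id, i, 1))
        })
        .collect();

    // Wrap the token stream in lrlex's custom-lexer type; the input string is
    // only consulted when spans are reported, so a lossy conversion of the raw
    // bytes is good enough here.
    let input = String::from_utf8_lossy(data);
    let lexer = LRNonStreamingLexer::new(&input, lexemes, NewlineCache::new());

    // Fuzz the parser in isolation: parse errors are expected and fine,
    // panics or hangs are the bugs we're hunting.
    let _ = calc_y::parse(&lexer);
});
```

Since the generated token array is a compile-time constant, indexing it by `byte % len` gives the fuzzer a dense, stable mapping from raw bytes to valid tokens, which is what makes coverage-guided mutation effective on the parser alone.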