
Stack overflow when parsing deeply nested map #307

Closed
5225225 opened this issue Jun 30, 2021 · 3 comments · Fixed by #420

@5225225 (Contributor) commented Jun 30, 2021

Reproduction:

```rust
fn main() {
    let _: Result<ron::Value, _> = ron::from_str(&"{".repeat(10_000));
}
```

Found by fuzzing; I'll open a separate PR to add the fuzzer.

@kvark added the bug label on Jul 6, 2021
@juntyr (Member) commented Nov 22, 2021

serde_json "gets around" this issue by tracking an internal depth counter and checking it against a limit: https://github.com/serde-rs/json/blob/379412b13875cfb2e440937c4508944d48ee6c96/src/de.rs#L1272-L1287. Given serde's recursive parser architecture, I'm not sure anything better than such a custom error could be achieved.
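To illustrate the technique (this is a minimal sketch, not ron's or serde_json's actual code; the `MAX_DEPTH` value of 128 and the error type are illustrative assumptions): the parser threads a depth counter through each recursive call and returns an error before the call stack can overflow.

```rust
// Maximum nesting depth before we bail out (illustrative value).
const MAX_DEPTH: usize = 128;

#[derive(Debug, PartialEq)]
enum ParseError {
    RecursionLimitExceeded,
}

// Toy parser: consumes a run of '{' bytes, recursing once per nesting
// level, and fails gracefully instead of overflowing the stack.
fn parse_nested(input: &[u8], depth: usize) -> Result<usize, ParseError> {
    if depth > MAX_DEPTH {
        return Err(ParseError::RecursionLimitExceeded);
    }
    match input.first() {
        Some(b'{') => parse_nested(&input[1..], depth + 1),
        _ => Ok(depth),
    }
}

fn main() {
    // Shallow input parses fine...
    assert_eq!(parse_nested(b"{{{", 0), Ok(3));
    // ...while the pathological fuzz input returns an error instead of crashing.
    let deep = "{".repeat(10_000);
    assert_eq!(
        parse_nested(deep.as_bytes(), 0),
        Err(ParseError::RecursionLimitExceeded)
    );
}
```

The trade-off is that legitimate deeply nested documents are rejected, which is why the limit is usually made configurable.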

@juntyr (Member) commented Feb 5, 2022

This issue is not with ron itself but with serde's recursive design. The serde_stacker crate provides Serializer and Deserializer wrappers for users who want to parse highly recursive data structures by dynamically growing the stack as needed.
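serde_stacker grows the call stack on demand (via the stacker crate). A cruder, stdlib-only illustration of the same underlying idea, sketched here with illustrative depth and stack-size numbers, is to run the deeply recursive work on a thread whose stack was allocated large enough up front:

```rust
use std::thread;

// A non-tail recursion that could overflow a small default stack for
// large `n` (depth and stack size below are illustrative).
fn depth(n: u64) -> u64 {
    if n == 0 { 0 } else { 1 + depth(n - 1) }
}

fn main() {
    // Run the recursion on a worker thread with a 256 MiB stack,
    // instead of the spawning thread's (typically much smaller) stack.
    let handle = thread::Builder::new()
        .stack_size(256 * 1024 * 1024)
        .spawn(|| depth(1_000_000))
        .expect("failed to spawn thread");
    assert_eq!(handle.join().unwrap(), 1_000_000);
}
```

Unlike serde_stacker, this reserves the whole stack up front rather than growing it segment by segment, but it shows why recursion depth is ultimately a memory-budget question rather than a parser bug.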

@juntyr (Member) commented Oct 9, 2022

I closed this issue a bit prematurely: while the problem is fundamental to serde, and serde_stacker can alleviate it in many cases, in the end there has to be some limit or we crash. Hence, I think it is better to let users specify that limit explicitly, which #420 does.
