Running load_gltf example on integrated GPU panics #6841
Could you run with
That's... not an expected value. I'm not sure how to handle that case, the gpu is probably not reporting the correct value. Do we clamp it to 0 and consider the integrated gpu doesn't support those buffers? But it worked before, so it does. Then do we clamp it to i32::MAX? That could lead to issues later when someone tries to use 10000 buffers... |
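For illustration, here is a minimal Rust sketch (not Bevy's actual code) of why a limit reported as `u32::MAX` is awkward to carry in an `i32`, and what the clamping option discussed above would look like:

```rust
// Hypothetical sketch of the failure mode: the adapter reports its
// storage-buffer limit as a u32, and u32::MAX cannot be converted
// losslessly into an i32.
fn main() {
    let max_storage_buffers: u32 = u32::MAX; // what this iGPU reports

    // A plain `as` cast silently wraps to -1:
    let wrapped = max_storage_buffers as i32;
    assert_eq!(wrapped, -1);

    // A checked conversion makes the failure explicit instead:
    assert!(i32::try_from(max_storage_buffers).is_err());

    // One mitigation discussed above: clamp to i32::MAX when the
    // reported value doesn't fit.
    let clamped = i32::try_from(max_storage_buffers).unwrap_or(i32::MAX);
    assert_eq!(clamped, i32::MAX);
}
```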
it seems fair to me to report "no specific limit on storage buffers" using u32::MAX. obviously there'll be practical limits from total bindings, gpu memory, etc., but this card is just saying it doesn't have any lower limit specifically for storage buffers.
# Objective

- Fixes bevyengine#6841
- In some cases, the maximum number of storage buffers is `u32::MAX`, which doesn't fit in an `i32`

## Solution

- Add an option to have a `u32` in a `ShaderDefVal`
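As a rough sketch of the shape of that fix, assuming a simplified `ShaderDefVal`-style enum (the actual Bevy definition may differ), adding a dedicated unsigned variant lets values like `u32::MAX` round-trip to the shader preprocessor without a lossy cast:

```rust
// Simplified, assumed sketch of a ShaderDefVal-style enum with an
// unsigned variant; not the exact Bevy definition.
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum ShaderDefVal {
    Bool(String, bool),
    Int(String, i32),
    UInt(String, u32), // new: holds values like u32::MAX losslessly
}

impl ShaderDefVal {
    /// Render the def's value as it would be substituted into shader source.
    pub fn value_as_string(&self) -> String {
        match self {
            ShaderDefVal::Bool(_, v) => v.to_string(),
            ShaderDefVal::Int(_, v) => v.to_string(),
            ShaderDefVal::UInt(_, v) => format!("{v}u"),
        }
    }
}

fn main() {
    // The reported limit now fits without clamping or wrapping.
    // "MAX_STORAGE_BUFFERS" is a hypothetical def name for illustration.
    let def = ShaderDefVal::UInt("MAX_STORAGE_BUFFERS".into(), u32::MAX);
    assert_eq!(def.value_as_string(), "4294967295u");
}
```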
Bevy version

9c79b39. Before this commit, things work.
Relevant system information

Works when using the discrete GPU, but fails when using the integrated GPU.
What you did

Running load_gltf at the above commit on my laptop's integrated GPU causes the example to crash. Running load_gltf on the discrete GPU works, as does running on the commit before the above.