Subviewport rendering different than non-viewport rendering #86975
This is what I see on my end with 4.3.dev 13a0d6e:

This is a different variant of #77149. This one occurs because of Camera2D zoom, rather than the stretch scale factor. This is expected behavior, since you're zooming in with the Camera2D node (1.8× factor), but SubViewport textures are not automatically resized to take Camera2D zoom into account. Doing so would impact performance negatively, especially when zoom changes at runtime (as changing viewport resolution is a relatively slow operation, requiring buffers to be recreated every time).

To have crisp rendering in a SubViewport at any zoom level, you need to increase its size so it's pixel-perfect when fully zoomed in. The exact resolution you should use depends on the maximum expected zoom level of your Camera2D node. Note that this will result in graininess when fully zoomed out, since ViewportTexture doesn't generate mipmaps in real-time.

In your MRP's case, multiply the SubViewport's resolution by a factor of 1.8× on each axis, and change its scale so it appears with a smaller physical size (leading to the same physical size as before in the end). There's a Texture2D size override property that would make this easier, but it's currently not exposed.
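The resize-and-counter-scale approach described above could be sketched in GDScript roughly as follows. This is a minimal sketch, not code from the MRP; the node paths (`$SubViewport`, `$TextureDisplay`) and the script's placement are assumptions.

```gdscript
# Sketch: enlarge the SubViewport to match the maximum expected Camera2D
# zoom, then shrink the node displaying its texture so the on-screen size
# is unchanged. Node paths are hypothetical.
extends Node

const MAX_ZOOM := 1.8  # Maximum zoom factor of the Camera2D.

func _ready() -> void:
	var base_size: Vector2i = $SubViewport.size
	# Render at the resolution needed when fully zoomed in.
	$SubViewport.size = Vector2i(Vector2(base_size) * MAX_ZOOM)
	# Counter-scale the display node so its physical size stays the same.
	$TextureDisplay.scale /= MAX_ZOOM
```

Because resizing a viewport forces its buffers to be recreated, this should be done once up front (for the maximum zoom) rather than every time the zoom changes.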
Hi! I appreciate your response, @Calinou. Perhaps my MRP was not representative of my actual scenario. My actual scenario does not use a camera; the problem happens when popping into fullscreen. I've updated the MRP, here: I'm not sure if I should expect OhiraKyou's solution from #77149 to help here, but I can't get it to work :(
This is indeed the same issue as #77149 (as the updated MRP uses the
Going fullscreen causes the 2D scale factor to increase, as the fullscreen resolution is larger than windowed mode here (even on a 1920×1080 display, due to window borders appearing in windowed mode).
Here's a solution where I think I got things working, including the camera. It's mainly OhiraKyou's solution, with a couple of tweaks: the SubViewport's size 2D override set to match its size, override stretch enabled, and Display > Window > Aspect set to keep. Looks fine to me. Is there a danger with what I've done here?
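For reference, the SubViewport settings described above could also be applied from code rather than the Inspector. This is a hedged sketch: the node path is an assumption, and the Display > Window > Stretch > Aspect = "keep" part is a project setting configured in the editor, not in this script.

```gdscript
# Sketch of the described tweaks, applied at runtime. Node path is
# hypothetical; the aspect setting lives in Project Settings.
extends Node

func _ready() -> void:
	var vp: SubViewport = $SubViewport
	# Keep 2D content laid out at the design resolution while the
	# texture itself is rendered at the viewport's actual size.
	vp.size_2d_override = vp.size
	vp.size_2d_override_stretch = true
```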
Tested versions
System information
Windows 10 - v4.2.1.stable.mono.official, Forward+ and gl_compat renderers
Issue description
Having an issue with SubViewports: scaled-down sprites that I place into a 1:1 window-sized SubViewport render either jagged or blurry, compared to the same scaled-down sprite outside of the SubViewport. In this image, I've got a 1920×1080 SubViewport inside a window of the same size. In the screenshot you can see how the three different sprites render. All three are scaled by the same factor, about 0.14 (512×512 px source texture → 72×72 px). (You might need to open the image in a new tab to see it at full resolution.) Yet the sprites in the SubViewport look terrible with both types of filters, while the sprite in the root viewport looks perfect.
Steps to reproduce
Create a window at 1920×1080, create a SubViewport of the same size, put a large sprite in the SubViewport and scale it down, e.g. a 512×512 sprite at a scale of (0.14, 0.14). Add the same sprite with the same scale, but outside of the SubViewport. The sprite outside the viewport renders far more crisply than the sprite inside it. See the attached project.
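The reproduction steps above could be built entirely in code along these lines. This is a sketch, not the actual MRP scene; the texture path is a placeholder.

```gdscript
# Sketch of the reproduction: the same sprite at the same scale, once
# inside a window-sized SubViewport and once directly in the root
# viewport. Texture path is a placeholder.
extends Node2D

func _ready() -> void:
	var tex := load("res://icon_512.png")  # Placeholder 512x512 texture.

	# Sprite inside a 1920x1080 SubViewport, shown via a container.
	var container := SubViewportContainer.new()
	var vp := SubViewport.new()
	vp.size = Vector2i(1920, 1080)
	var inside := Sprite2D.new()
	inside.texture = tex
	inside.scale = Vector2(0.14, 0.14)  # ~512 px -> ~72 px
	vp.add_child(inside)
	container.add_child(vp)
	add_child(container)

	# Same sprite, same scale, directly in the root viewport.
	var outside := Sprite2D.new()
	outside.texture = tex
	outside.scale = Vector2(0.14, 0.14)
	add_child(outside)
```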
Minimal reproduction project (MRP)
SubviewportRenderingIssue.zip
Just run the scene to see the differences.