self.batch_update not working / Calling pop_screen multiple times causes a flickering effect / A new method to pop more screens at once #5009

Closed
mzebrak opened this issue Sep 16, 2024 · 1 comment


mzebrak commented Sep 16, 2024

I just want to pop a couple of screens (more than one, say two), nothing else.
Right now this is a painful experience because there is no public API for it.

It appears to be as simple as:

    while not isinstance(self.app.screen, SomeScreen):
        self.app.pop_screen()

but that's not true, because every pop_screen posts a ScreenResume message, which might cause bugs when something depends on it. It also makes old screens appear for a moment, which I think is a regression.

I know about MODES and they can be useful in some scenarios.
But not everything can be generalized to a mode, and even within a given mode there may be a need to pop several screens.
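
For context, a minimal sketch of the MODES approach (CartScreen and CheckoutScreen are hypothetical screen classes here). Switching a mode swaps the whole screen stack, which is not the same as popping a few screens within the current stack:

    class MyApp(App[None]):
        MODES = {
            "shop": CartScreen,        # each mode keeps its own screen stack
            "checkout": CheckoutScreen,
        }

        def action_back_to_shop(self) -> None:
            self.switch_mode("shop")  # replaces the active stack, does not pop within it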

This is what I do to achieve it, and it is still not ideal because of the flicker regression:

    # Assumes: from textual.await_complete import AwaitComplete
    #          from textual.events import ScreenResume
    #          from textual.screen import Screen, ScreenResultType
    # ScreenNotFoundError is a custom exception defined elsewhere in my code.
    def pop_screen_until(self, *screens: str | type[Screen[ScreenResultType]]) -> AwaitComplete:
        """
        Pop screens until one of the given screens is on top of the stack.

        Raises
        ------
        ScreenNotFoundError: if none of the given screens was found in the stack.
        """

        async def _pop_screen_until() -> None:
            for screen in screens:
                if not self._is_screen_in_stack(screen):
                    continue  # Screen not found, try next one

                with self.batch_update():
                    while not self._screen_eq(self.screen_stack[-1], screen):
                        with self.prevent(ScreenResume):
                            await self.pop_screen()
                    self.screen.post_message(ScreenResume())
                break  # Screen found and now on top of the stack, stop
            else:
                raise ScreenNotFoundError(
                    f"None of the {screens} screens was found in stack.\nScreen stack: {self.screen_stack}"
                )

        return AwaitComplete(_pop_screen_until()).call_next(self)
        
    def _is_screen_in_stack(self, screen_to_check: str | type[Screen[ScreenResultType]]) -> bool:
        return any(self._screen_eq(screen, screen_to_check) for screen in self.screen_stack)

    def _screen_eq(self, screen: Screen[ScreenResultType], other: str | type[Screen[ScreenResultType]]) -> bool:
        if isinstance(other, str):
            return screen.__class__.__name__ == other
        return isinstance(screen, other)
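
A call then looks like this (CartScreen is a hypothetical screen class; a class name passed as a string, e.g. "CheckoutScreen", works as well), assuming the methods above live on the App subclass:

    # Pops everything above the first match, suppressing intermediate ScreenResume messages:
    await self.app.pop_screen_until(CartScreen, "CheckoutScreen")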

Also related: #3126 (regression) and #3127

I think it would be really great if:

  1. batch_update could cover things like changes to the screen_stack, focus changes, etc., not only visual changes to the current screen. Consider also a situation like this:

       with self.app.batch_update():  # ensure no flicker when taking multiple actions on screens
           await self.app.pop_screen()  # drop the current "item" screen
           await self.app.push_screen(Cart())  # go back to the Cart while on TransactionSummaryFromCart when the "escape" binding is pressed
           await self.app.push_screen(TransactionSummaryFromCart())

  2. There was an official method for popping multiple screens in a correct way, e.g. something like pop_screen_until(DesiredScreen) or pop_screen(amount=2) (see the hypothetical sketch below).
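
A hypothetical sketch of how the requested API could be used (neither pop_screen_until nor an amount parameter on pop_screen exists in Textual today; the names are illustrations only):

    # pop until a Cart screen is on top of the stack
    await self.app.pop_screen_until(Cart)

    # or: pop a fixed number of screens with a single awaitable
    await self.app.pop_screen(amount=2)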

We found the following entries in the FAQ which you may find helpful:

Feel free to close this issue if you found an answer in the FAQ. Otherwise, please give us a little time to review.

This is an automated reply, generated by FAQtory

Textualize locked and limited conversation to collaborators Sep 28, 2024
willmcgugan converted this issue into discussion #5071 Sep 28, 2024

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
