
Fix flake8 bugbear version and fix some flake8 issues #1504

Merged
merged 3 commits on Apr 7, 2022

Conversation

Pierre-Sassoulas
Member

@Pierre-Sassoulas Pierre-Sassoulas commented Apr 7, 2022

Description

Type of Changes

Type
βœ“ πŸ› Bug fix
βœ“ πŸ”¨ Refactoring

@Pierre-Sassoulas Pierre-Sassoulas added the topic-performance and Maintenance (discussion or action around maintaining astroid or the dev workflow) labels Apr 7, 2022
@Pierre-Sassoulas Pierre-Sassoulas added this to the 2.12.0 milestone Apr 7, 2022
@coveralls

coveralls commented Apr 7, 2022

Pull Request Test Coverage Report for Build 2111421131

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage remained the same at 91.485%

Totals Coverage Status
Change from base Build 2078975963: 0.0%
Covered Lines: 9089
Relevant Lines: 9935

πŸ’› - Coveralls

@@ -60,6 +61,26 @@ def _dunder_dict(instance, attributes):
    return obj


@lru_cache(maxsize=None)
def str_startswith_attr_(obj: str) -> bool:
Collaborator

Should we really cache this? Seems a bit overkill?
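
For context on the overhead question: lru_cache trades the str.startswith call for a hash of the argument plus a dict lookup, so on a short string the cached version may not be any faster. A rough way to check, as a minimal sketch with an illustrative helper (the name and body below are not the PR's exact code):

import timeit
from functools import lru_cache

def startswith_attr(obj: str) -> bool:
    # The cheap check under discussion: is this an "attr_"-prefixed name?
    return obj.startswith("attr_")

# Same helper, wrapped the way the diff above wraps it.
cached_startswith_attr = lru_cache(maxsize=None)(startswith_attr)

name = "attr___dict__"
print("plain :", timeit.timeit(lambda: startswith_attr(name), number=100_000))
print("cached:", timeit.timeit(lambda: cached_startswith_attr(name), number=100_000))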



@lru_cache(maxsize=None)
def after_attr__in_str(obj: str) -> str:
Collaborator

Same here..

@@ -100,20 +121,16 @@ def __get__(self, instance, cls=None):
    def __contains__(self, name):
        return name in self.attributes()

    @lru_cache(maxsize=None)
    def attributes(self):
    def attributes(self) -> List[str]:
Collaborator

The expensive part of this function is (I assume) the iteration over the dir. The other two functions seem fairly cost-efficient? Perhaps just remove the caching here altogether?
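
To make the trade-off concrete, here is a minimal sketch of the shape of that method (assuming the "attr_" prefix convention of astroid's object model; this is illustrative, not the actual implementation). The dir() walk is the part worth caching; the per-name string checks are trivial.

from typing import List

IMPL_PREFIX = "attr_"  # assumed prefix for implemented attributes

class ObjectModelSketch:
    attr_spam = "spam"  # example "implemented" attribute

    def attributes(self) -> List[str]:
        # Costly part: iterating dir(self). Cheap parts: startswith + slicing.
        return [
            name[len(IMPL_PREFIX):]
            for name in dir(self)
            if name.startswith(IMPL_PREFIX)
        ]

print(ObjectModelSketch().attributes())  # ['spam']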

@@ -366,21 +366,19 @@ def get_children(self):
class LookupMixIn:
    """Mixin to look up a name in the right scope."""

    @lru_cache(maxsize=None)
    def lookup(self, name):
    @lru_cache(maxsize=None) # pylint: disable=cache-max-size-none # noqa
Collaborator

Don't we want a max here? 128 on the same LookupMixIn seems like it should fit most of our uses in pylint?
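
As a minimal sketch of the bounded variant being suggested (128 is the number proposed above, not a measured value; the body is elided):

from functools import lru_cache

class LookupMixIn:
    """Mixin to look up a name in the right scope."""

    @lru_cache(maxsize=128)  # bounded: least-recently-used entries are evicted
    def lookup(self, name):
        ...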


from astroid.context import _invalidate_cache

if TYPE_CHECKING:
    from astroid.nodes.node_ng import NodeNG
Collaborator

I know we have different opinions about this, but I like importing nodes here as it also keeps the line much shorter and gives some consistency with the pylint codebase. I'll leave this to you!
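
For reference, the two import styles being weighed look roughly like this (illustrative only; nothing changes at runtime since the import is guarded by TYPE_CHECKING, only the spelling of the annotation differs):

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from astroid.nodes.node_ng import NodeNG  # direct class import, as in this PR
    # from astroid import nodes               # module import, closer to the pylint codebase

def visit(node: "NodeNG") -> None:  # would read "nodes.NodeNG" with the module import
    ...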

@@ -22,18 +25,21 @@ class TransformVisitor:
    def __init__(self):
        self.transforms = collections.defaultdict(list)

    @lru_cache(maxsize=TRANSFORM_MAX_CACHE_SIZE)
    def _transform(self, node):
    def _transform(self, node: "NodeNG") -> "NodeNG":
Collaborator

Wasn't #1242 looking at how to cache this without running into memory issues?

tests/unittest_inference_calls.py: two outdated review comments (resolved)
@Pierre-Sassoulas
Member Author

Maybe we can remove all the caching temporarily and review/merge #1242 ?

@DanielNoord
Collaborator

Maybe we can remove all the caching temporarily and review/merge #1242 ?

I think that PR was almost good to go?

@Pierre-Sassoulas
Member Author

I fixed the merge conflict when merging main into #1242; it had gone stale for a long time. What about merging the non-cache fix here first, even if it fails, then #1242?

@Pierre-Sassoulas Pierre-Sassoulas changed the title from "Fix flake8 bugbear version and fix flake8/pylint 2.13.5 cache's issues" to "Fix flake8 bugbear version and fix some flake8 issues" Apr 7, 2022
@DanielNoord
Collaborator

DanielNoord commented Apr 7, 2022

I fixed the merge conflict when merging main into #1242; it had gone stale for a long time. What about merging the non-cache fix here first, even if it fails, then #1242?

Let's add TODOs then to make sure we don't forget about this. And perhaps try to contact the PR owner to see if we can get it merged.
I know that the last time I looked at it I hadn't yet worked through the implications of what I'd learned about lru_cache for that PR. That's the only thing we need to fix, I think: making sure the new cache isn't used on methods, or that it handles self better.

Edit: Ah, I misinterpreted your comment. Yeah, what you are proposing is fine as well!
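
To spell out the self issue (a minimal sketch, not astroid's code): lru_cache on a method stores self in every cache key, so each instance that ever called the method stays referenced by the cache, and with maxsize=None nothing is ever evicted. That is the memory concern behind caching methods like lookup or _transform on AST objects.

from functools import lru_cache

class Node:
    @lru_cache(maxsize=None)
    def lookup(self, name):
        return f"{id(self)}:{name}"

for _ in range(1000):
    Node().lookup("x")

# All 1000 throwaway Node instances are still alive inside the cache.
print(Node.lookup.cache_info().currsize)  # -> 1000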

@Pierre-Sassoulas
Member Author

Let's add TODOs then to make sure we don't forget about this. And perhaps try to contact that PR owner to see if we can get it merged.

Well, we won't forget about this if we merge with pre-commit failing πŸ˜„
