
'@log_exceptions_and_usage' in Python seems have memory leaks #3269

Closed
efenzha opened this issue Oct 4, 2022 · 4 comments · Fixed by #3371

Comments

efenzha commented Oct 4, 2022

Expected Behavior

An app that uses 'write_to_online_store' should consume memory in a stable way.

Current Behavior

The memory consumed by the app that uses 'write_to_online_store' keeps growing. (See the comments below, which point to usage.py as the cause.)

Steps to reproduce

The app is based on FastAPI and receives data at a throughput of 2-3 req/s. It builds a DataFrame from the data posted by the client (around 10 features) and then writes these features to the online store with 'write_to_online_store'.

We observe that the app's memory consumption keeps growing, and its write time keeps growing as well.

The online store is backed by PostgreSQL.

We also save raw data to PostgreSQL with psycopg2, which acts as the offline store; its memory consumption and write time are quite stable. (We have tried writing to PostgreSQL with psycopg2 only, without 'write_to_online_store', to check the app's memory consumption.)
[Screenshot: OOM]
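
Roughly, the app looks like the sketch below (simplified; the feature view name, the payload model, and the repo path are placeholders, not our real code):

```python
# Simplified reproduction sketch. "driver_stats", FeatureRow, and the
# repo_path are placeholders for the real application's objects.
from datetime import datetime

import pandas as pd
from fastapi import FastAPI
from feast import FeatureStore
from pydantic import BaseModel

app = FastAPI()
store = FeatureStore(repo_path=".")  # assumed feature repo location


class FeatureRow(BaseModel):
    entity_id: int
    event_timestamp: datetime
    # ... roughly 10 feature columns in the real payload


@app.post("/features")
def ingest(row: FeatureRow):
    # Build a one-row DataFrame from the posted payload and push it to
    # the online store; memory grows a little with every request.
    df = pd.DataFrame([row.dict()])
    store.write_to_online_store("driver_stats", df)
    return {"status": "ok"}
```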

Specifications

  • Version:
    Feast 0.22.2
    Python 3.7

  • Platform:
    Kubernetes

  • Subsystem:

Possible Solution

efenzha changed the title from 'write_to_online_store' in Python SDK seems have memory leaks to 'write_to_online_store' in Python seems have memory leaks on Oct 4, 2022
adchia (Collaborator) commented Oct 4, 2022

Hi! Thanks for reporting this

Any chance you could get a heap dump after a bit to see what might be leaking?

efenzha (Author) commented Oct 5, 2022

Hi @adchia ,

Thanks for the reply!

Today I used 'tracemalloc' to take a snapshot of the memory usage every 30 seconds and got the summary below:

It seems the memory is increasing because of the '@log_exceptions_and_usage' decorator on 'write_to_online_store'. Is it possible to skip or turn off this '@log_exceptions_and_usage'?

INFO - Top diff since start:
INFO - top_diffs01, /opt/venv/lib/python3.7/site-packages/feast/usage.py:0: size=962 MiB (+962 MiB), count=9955588 (+9955588), average=101 B
INFO - top_diffs02, /opt/venv/lib/python3.7/site-packages/pyarrow/pandas_compat.py:0: size=2191 KiB (+2191 KiB), count=15169 (+15169), average=148 B
INFO - top_diffs03, /usr/local/lib/python3.7/tracemalloc.py:0: size=1371 KiB (+1371 KiB), count=18572 (+18572), average=76 B
INFO - top_diffs04, /usr/local/lib/python3.7/inspect.py:0: size=1032 KiB (+1032 KiB), count=6340 (+6340), average=167 B
INFO - top_diffs05, /usr/local/lib/python3.7/threading.py:0: size=927 KiB (+927 KiB), count=8471 (+8471), average=112 B
INFO - top_diffs06, /opt/venv/lib/python3.7/site-packages/pandas/core/generic.py:0: size=637 KiB (+637 KiB), count=4395 (+4395), average=148 B
INFO - top_diffs07, /usr/local/lib/python3.7/json/encoder.py:0: size=549 KiB (+549 KiB), count=4140 (+4140), average=136 B
INFO - top_diffs08, /usr/local/lib/python3.7/linecache.py:0: size=503 KiB (+503 KiB), count=5176 (+5176), average=99 B
INFO - top_diffs09, /usr/local/lib/python3.7/asyncio/locks.py:0: size=500 KiB (+500 KiB), count=1420 (+1420), average=360 B
INFO - top_diffs10, /usr/local/lib/python3.7/uuid.py:0: size=479 KiB (+479 KiB), count=5541 (+5541), average=89 B
INFO - Top incremental:
INFO - top_incremental01, /opt/venv/lib/python3.7/site-packages/feast/usage.py:190: size=960 MiB (+325 MiB), count=9932668 (+3356430), average=101 B
INFO - top_incremental02, /usr/local/lib/python3.7/asyncio/locks.py:244: size=499 KiB (-264 KiB), count=1415 (-1602), average=361 B
INFO - top_incremental03, /opt/venv/lib/python3.7/site-packages/pyarrow/pandas_compat.py:180: size=1257 KiB (+245 KiB), count=8360 (+1642), average=154 B
INFO - top_incremental04, /usr/local/lib/python3.7/asyncio/base_events.py:404: size=137 KiB (-126 KiB), count=896 (-1158), average=156 B
INFO - top_incremental05, /usr/local/lib/python3.7/inspect.py:2882: size=521 KiB (+110 KiB), count=3173 (+668), average=168 B
INFO - top_incremental06, /opt/venv/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py:156: size=5351 B (-104 KiB), count=51 (-1937), average=105 B
INFO - top_incremental07, /opt/venv/lib/python3.7/site-packages/httpcore/_backends/anyio.py:151: size=11.7 KiB (-98.7 KiB), count=22 (-665), average=545 B
INFO - top_incremental08, /opt/venv/lib/python3.7/site-packages/feast/usage.py:195: size=537 KiB (+98.0 KiB), count=961 (+176), average=572 B
INFO - top_incremental09, /opt/venv/lib/python3.7/site-packages/feast/usage.py:368: size=517 KiB (+96.3 KiB), count=4995 (+930), average=106 B
INFO - top_incremental10, /opt/venv/lib/python3.7/site-packages/feast/usage.py:268: size=507 KiB (+94.5 KiB), count=9991 (+1860), average=52 B
INFO - Top current:
INFO - top_current01, /opt/venv/lib/python3.7/site-packages/feast/usage.py:0: size=962 MiB, count=9955588, average=101 B
INFO - top_current02, /opt/venv/lib/python3.7/site-packages/pyarrow/pandas_compat.py:0: size=2191 KiB, count=15169, average=148 B
INFO - top_current03, /usr/local/lib/python3.7/tracemalloc.py:0: size=1371 KiB, count=18572, average=76 B
INFO - top_current04, /usr/local/lib/python3.7/inspect.py:0: size=1032 KiB, count=6340, average=167 B
INFO - top_current05, /usr/local/lib/python3.7/threading.py:0: size=927 KiB, count=8471, average=112 B
INFO - top_current06, /opt/venv/lib/python3.7/site-packages/pandas/core/generic.py:0: size=637 KiB, count=4395, average=148 B
INFO - top_current07, /usr/local/lib/python3.7/json/encoder.py:0: size=549 KiB, count=4140, average=136 B
INFO - top_current08, /usr/local/lib/python3.7/linecache.py:0: size=503 KiB, count=5176, average=99 B
INFO - top_current09, /usr/local/lib/python3.7/asyncio/locks.py:0: size=500 KiB, count=1420, average=360 B
INFO - top_current10, /usr/local/lib/python3.7/uuid.py:0: size=479 KiB, count=5541, average=89 B
INFO - track_back, memory_blocks:9932668, size_kb:983241
INFO - File "/opt/venv/lib/python3.7/site-packages/starlette/routing.py", line 670
INFO - await route.handle(scope, receive, send)
INFO - File "/opt/venv/lib/python3.7/site-packages/starlette/routing.py", line 266
INFO - await self.app(scope, receive, send)
INFO - File "/opt/venv/lib/python3.7/site-packages/starlette/routing.py", line 68
INFO - await response(scope, receive, send)
INFO - File "/opt/venv/lib/python3.7/site-packages/starlette/responses.py", line 165
INFO - await self.background()
INFO - File "/opt/venv/lib/python3.7/site-packages/starlette/background.py", line 43
INFO - await task()
INFO - File "/opt/venv/lib/python3.7/site-packages/starlette/background.py", line 26
INFO - await self.func(*self.args, **self.kwargs)
INFO - File "/opt/venv/lib/python3.7/site-packages/ip_flow_feature_creator/apis/feature_creator_api.py", line 156
INFO - get_feature_store().write_to_online_store(feature_view.name, data_frame)
INFO - File "/opt/venv/lib/python3.7/site-packages/feast/usage.py", line 299
INFO - _produce_event(ctx)
INFO - File "/opt/venv/lib/python3.7/site-packages/feast/usage.py", line 190
INFO - for c in reversed(ctx.completed_calls)
INFO - File "/opt/venv/lib/python3.7/site-packages/feast/usage.py", line 190
INFO - for c in reversed(ctx.completed_calls)
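
For reference, the snapshots were collected roughly like this (a simplified sketch; the real app's logging and intervals differ slightly):

```python
# Take a tracemalloc snapshot every 30 seconds, diff it against the first
# snapshot, and print the traceback of the biggest allocation site.
import time
import tracemalloc

tracemalloc.start(25)  # keep enough frames to see the feast/usage.py call chain
baseline = tracemalloc.take_snapshot()

while True:
    time.sleep(30)
    snapshot = tracemalloc.take_snapshot()

    # "Top diff since start": aggregate by file and compare to the baseline
    for i, stat in enumerate(snapshot.compare_to(baseline, "filename")[:10], 1):
        print(f"INFO - top_diffs{i:02d}, {stat}")

    # Traceback of the largest allocation site (feast/usage.py:190 above)
    top = snapshot.statistics("traceback")[0]
    print(f"INFO - track_back, memory_blocks:{top.count}, size_kb:{top.size // 1024}")
    for line in top.traceback.format():
        print(f"INFO - {line}")
```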

adchia (Collaborator) commented Oct 5, 2022

Yep! There's an environment variable you can set (See https://docs.feast.dev/reference/usage)
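
For example, setting FEAST_USAGE to "False" before feast is imported should turn it off (worth double-checking against the linked docs):

```python
# Disable Feast usage reporting; the flag must be set before feast is
# imported (an `export FEAST_USAGE=False` in the shell works the same way).
import os

os.environ["FEAST_USAGE"] = "False"

from feast import FeatureStore  # noqa: E402  (import after setting the flag)
```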

adchia (Collaborator) commented Oct 5, 2022

Thanks for investigating; I'll rename this issue then.

adchia changed the title from 'write_to_online_store' in Python seems have memory leaks to '@log_exceptions_and_usage' in Python seems have memory leaks on Oct 5, 2022