
fix memory leak #1243

Merged (2 commits) on Oct 1, 2023
Conversation

ghostiee

Resume stream after all chunks are published to prevent memory leak

📝 Description

Memory leak scenario: we have two machines, each running a node. We use moleculer-web as a proxy to implement file uploads: files are first sent to the gateway and then forwarded to the service on the other machine using streams.
However, because files are stored more slowly than the internal network transfers them between the two machines, data starts to accumulate in the gateway.
We had not handled this situation correctly before, so it caused significant memory leaks in our production environment when users uploaded files. I identified the issue and fixed it.
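A minimal sketch of the idea (hypothetical publishChunk helper and file path, not the actual transit.js change): pause the incoming readable stream while a chunk is being published and resume it only after the publish resolves, so backpressure reaches the uploader instead of buffering in the gateway.

  const fs = require("fs");

  // Hypothetical stand-in for publishing one chunk to the remote node.
  function publishChunk(chunk) {
      return new Promise(resolve => setImmediate(() => resolve(chunk.length)));
  }

  // Pause the readable stream while a chunk is in flight and resume it only
  // after the publish resolves, so data cannot pile up in the gateway.
  function forward(stream) {
      stream.on("data", chunk => {
          stream.pause();
          publishChunk(chunk)
              .then(() => stream.resume())
              .catch(err => stream.destroy(err));
      });
      return new Promise((resolve, reject) => {
          stream.on("end", resolve);
          stream.on("error", reject);
      });
  }

  forward(fs.createReadStream("./big-upload.bin")).catch(console.error);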

🎯 Relevant issues

None.

💎 Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

🚦 How Has This Been Tested?

As described above.

🏁 Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have commented my code, particularly in hard-to-understand areas

Resume stream after all chunks are published to prevent memory leak
src/transit.js
icebob merged commit 7fb6762 into moleculerjs:master on Oct 1, 2023
176 checks passed

this.logger.debug(`=> Send stream chunk to ${nodeID} node. Seq: ${copy.seq}`);
return this.Promise.all(
Member
@icebob Leaving the number of chunks uncontrolled is not a good solution, as it can lead to excessive CPU usage in the Promise.all call.

Member

What size can cause a problem?

Member

Based on my analysis, it seems most reasonable to derive the limit from os.availableParallelism().

We are facing two issues:

  1. Unknown logic in transitPublish may cause synchronous blocking of operations inside Promise.all.
  2. We do not know how many system resources are available when the Promise.all operations are scheduled, and even if we had this information, we could not control the peak load.

A small chunk size is a bad solution when resources are available, but a good one when resources are busy.
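As a rough sketch of this suggestion (hypothetical publishInBatches and publishChunk names; os.availableParallelism() is only available in recent Node.js releases), the backlog could be published in bounded batches instead of one unbounded Promise.all call:

  const os = require("os");

  // Hypothetical batching wrapper: publish at most `limit` chunks at a time
  // instead of handing the whole backlog to a single Promise.all call.
  async function publishInBatches(chunks, publishChunk, limit = os.availableParallelism()) {
      for (let i = 0; i < chunks.length; i += limit) {
          const batch = chunks.slice(i, i + limit);
          await Promise.all(batch.map(chunk => publishChunk(chunk)));
      }
  }

This bounds the number of in-flight publishes by the host's estimated parallelism rather than by the size of the backlog.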
