fix(upload): Do not read chunks into memory but just stream file chunks #1153
Fixes `net::ERR_OUT_OF_MEMORY` when uploading large files (nextcloud/server#43627).

Instead of reading file chunks into memory and then uploading them, we now just slice the file and stream the slices for upload. This prevents the browser's memory consumption from skyrocketing.
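The idea can be sketched as follows. `Blob.slice()` only creates a lightweight view over the underlying file, so passing the slice straight to the transport lets the browser stream it from disk. The `uploadChunk` callback here is a hypothetical transport hook (in practice, e.g. a `PUT` via `fetch` with the Blob as the request body), not part of this PR's actual API:

```typescript
// Sketch: upload a file chunk by chunk without reading bytes into JS memory.
// `uploadChunk` is a hypothetical transport callback supplied by the caller.
async function uploadInChunks(
	file: Blob,
	chunkSize: number,
	uploadChunk: (chunk: Blob, start: number) => Promise<void>,
): Promise<number> {
	let chunks = 0
	for (let start = 0; start < file.size; start += chunkSize) {
		// Blob.slice() returns a cheap view; no data is copied here.
		const chunk = file.slice(start, Math.min(start + chunkSize, file.size))
		// The browser streams the Blob when it is used as a request body,
		// so memory usage stays flat regardless of file size.
		await uploadChunk(chunk, start)
		chunks++
	}
	return chunks
}
```

Contrast this with the previous approach of calling something like `chunk.arrayBuffer()` before sending, which materializes every chunk in memory.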
We no longer need `p-limit`, so that dependency is dropped.
You can test this by uploading a large file and monitoring the process memory of your browser. Note that Firefox has a browser bug: as long as the dev tools are open, request data is not freed, so with the dev tools open you will still see this issue.