[Bug]: Browser console error net::ERR_OUT_OF_MEMORY when uploading large files #43627
Possibly related to #42704
@joshtrichards I see that issue has been closed. Is there a docker build I can test to see if I can repro this?
It wasn't clear if #42704 applies to both chunked and non-chunked uploads. This is a clean setup and, from what I understand, chunking is enabled by default.
@solracsf I could not repro the bug using your image 🚀
Closing as per #42704 (comment)
It'll only return a value if the hard-coded default has been overridden. The default is 10 MiB.
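For context, the chunk size the web client uses can be checked or overridden through `occ`, per the Nextcloud admin manual (the value is in bytes; the 20 MiB override below is just an example):

```
# Returns nothing unless the hard-coded 10 MiB default has been overridden.
sudo -u www-data php occ config:app:get files max_chunk_size

# Example: override the chunk size to 20 MiB (20 * 1024 * 1024 bytes).
sudo -u www-data php occ config:app:set files max_chunk_size --value 20971520
```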
We have the same problem! It appeared after upgrading from version 27 to 28. I looked into this error in more detail, and this is what I found out. Our problem only occurs when a reverse proxy is used: if you access the web UI directly via the IP address, files of any size upload without issue, but as soon as the upload goes through a reverse proxy, it can break at any time. We then found that the break and the OUT OF MEMORY error depend on which disk the file is being uploaded from. If the file is larger than the free space on drive C:, the upload is guaranteed to fail. For some reason, when working through a reverse proxy, the browser starts duplicating all the chunks on drive C:.
@osscombat and @blmhemu are you also using a reverse proxy?
Yep.
Is it possible that you're dropping some headers with your reverse proxy setup? |
Maybe, but I've never seen anything special in this regard. I ended up with a block of recommended header settings like this:
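The actual block wasn't captured in this thread; a typical set of forwarding-header directives for Nextcloud behind nginx looks roughly like the following (illustrative only, not the poster's real config; `nextcloud-backend` is a placeholder upstream name):

```nginx
# Illustrative proxy header block for a Nextcloud vhost behind nginx.
location / {
    proxy_pass http://nextcloud-backend;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-Host $host;
}
```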
As far as I understand, this is not purely a proxy setup issue, because no other WebDAV client suffers from this bug. But the web browser client clearly eats up desktop RAM equal to the size of the uploaded file during the upload and then fails.
@skjnldsv - I ran the tests both with and without a reverse proxy (Caddy) and got the same results in both cases.
Alright! Thanks for the help :)
I use only 64-bit everywhere, browsers as well.
64-bit everything here as well.
Chrome or Firefox? What OS? |
Chrome on macOS (both latest versions).
Windows 10/11 and latest Chrome/Edge, 64-bit everything.
Our problem was observed on Windows 10/11 64-bit and the latest macOS. We checked Chrome and Firefox; the problem is the same everywhere. Regardless of browser or anything external, the upload breaks when the laptop's disk has insufficient free space.
Alright, I tried many things. @blmhemu and @osscombat you both seem to be using the same server, right?
@coinfastman you haven't given much data; are you also experiencing a net::ERR_OUT_OF_MEMORY error?
I'm using my own NC 28.0.4 instance. Firefox has the same issue, but the desktop NC client uploads just fine. I think uploading 2+ GB files via a browser and a reverse proxy is just a rare scenario, which is why there aren't many complaints.
What's your reverse proxy? Can you give us a bit more feedback so we can try to reproduce the issue?
I use nginx, a pretty standard setup with Let's Encrypt, nothing special. nextcloud.conf:
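The posted nextcloud.conf didn't survive extraction. For anyone trying to reproduce, the nginx directives most relevant to large chunked uploads through a reverse proxy are typically the following (illustrative values, not the poster's actual config):

```nginx
# Settings that commonly affect large uploads through an nginx reverse proxy.
client_max_body_size 0;          # do not cap request body size (0 = unlimited)
proxy_request_buffering off;     # stream request bodies instead of buffering them
proxy_buffering off;             # stream responses as well
proxy_read_timeout 3600s;        # allow the long-running final chunk-assembly MOVE
```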
@skjnldsv here is a screenshot and the nginx reverse proxy config. Unfortunately, I had to redact some elements for security reasons.
@skjnldsv hey! This issue led me to finally create a (long overdue) dev environment. I created a completely fresh instance and the bug is still reproducible. FWIW, I disabled all the browser extensions. In the screenshot below, I used an SSH tunnel to bypass the reverse proxy (as good as running locally) and it still failed.
I can reproduce this issue with Chromium and Firefox (I don't hit the error itself, but I see memory consumption increase to ~8 GB). I will create a patch soon.
* Backport of #44835 * Server part of #43627 Signed-off-by: Ferdinand Thiessen <[email protected]>
Yes, the issue is resolved with release 28.0.5, thank you!
Bug description
When uploading large files (~7.5 GB), the upload fails after some time, consistently from chunk 205 onwards. This behaviour is repeatable. Please see the additional info section.
Steps to reproduce
1. Create a Nextcloud Apache docker compose setup.
2. Log in to the instance and upload a large file. (I uploaded a raw Fedora image.)
3. Check both the network tab and the console tab for error logs.
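Step 1 can be sketched with a minimal compose file. This is a hypothetical reconstruction (service names, ports, and the placeholder password are assumptions, not the reporter's actual setup):

```yaml
# Minimal sketch: Nextcloud 28 (Apache image) with PostgreSQL.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: nextcloud
      POSTGRES_USER: nextcloud
      POSTGRES_PASSWORD: example   # placeholder, change it
    volumes:
      - db:/var/lib/postgresql/data
  app:
    image: nextcloud:28-apache
    ports:
      - "8080:80"
    environment:
      POSTGRES_HOST: db
      POSTGRES_DB: nextcloud
      POSTGRES_USER: nextcloud
      POSTGRES_PASSWORD: example   # placeholder, change it
    volumes:
      - nextcloud:/var/www/html
volumes:
  db:
  nextcloud:
```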
Expected behavior
Uploads should succeed.
Installation method
Community Docker image
Nextcloud Server version
28
Operating system
Other
PHP engine version
PHP 8.2
Web server
Apache (supported)
Database engine version
PostgreSQL
Is this bug present after an update or on a fresh install?
Fresh Nextcloud Server install
Are you using the Nextcloud Server Encryption module?
Encryption is Disabled
What user-backends are you using?
Configuration report
The error seems to occur on the client JS side. Typical system; happy to provide more details if needed.
List of activated Apps
Nextcloud Signing status
Nextcloud Logs
Additional info
It looks like the client JS is running out of memory; maybe it is storing all the chunks in memory and not freeing them after upload.
FWIW, I followed the large file upload section in the Nextcloud docs as well. It is also reproducible with the FPM images behind Caddy.
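The suspicion above (all chunks held reachable at once) can be illustrated with a small sketch. This is not Nextcloud's actual uploader code; the function names and logic are hypothetical, and real browser `Blob.slice` views add their own memory behaviour on top:

```javascript
// Hypothetical illustration of the suspected leak pattern, not Nextcloud code.
const CHUNK_SIZE = 10 * 1024 * 1024; // matches the 10 MiB server default

// Leaky shape: pre-slicing keeps every chunk reachable for the whole
// upload, so nothing can be garbage-collected until the end.
function sliceAllUpFront(file, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    chunks.push(file.slice(offset, offset + chunkSize));
  }
  return chunks; // the array pins all slices in memory at once
}

// Safer shape: produce one chunk at a time, so each slice can be
// dropped as soon as its upload completes.
function* sliceLazily(file, chunkSize = CHUNK_SIZE) {
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    yield file.slice(offset, offset + chunkSize);
  }
}
```

With the lazy variant, an upload loop that awaits each chunk before requesting the next holds at most one chunk's worth of references instead of the entire file's.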