"Recursive process.nextTick detected" from within stream code #6065
Comments
@isaacs Looks like a streams issue to me. Handing this off to you.
I just tested on v0.11.0 and the problem persists.
The problem seems to have vanished in 0.11.1 - that is, since upgrading it's been several hours with no recursion errors. I'll post back in a day.
So far so good: since upgrading to 0.11.1 this problem seems to have gone away.
Okay, thanks for reporting back. I'll close the issue.
Hi @bnoordhuis. This bug has resurfaced in one of my colleague's projects, and this time we've distilled it to a fairly simple program (although that simple program depends on the AWS Node.js SDK) that reproduces it consistently on several independent machines. See here: Could someone take a look? P.S. If you need AWS/S3 credentials for testing, I can send them privately.
@kessler We can only accept bug reports with test cases that have no third-party dependencies, i.e. that only use core libraries.
@bnoordhuis This problem is very elusive, but it's clearly a problem in node core. I spent several days last time trying to distill a program that reproduces it, but it would only occur in a high-traffic production environment, so I was unsuccessful. If you could tell me how to find the connection between the above asynchronous stack trace and a simple s3.putObject() call (which basically reproduces the problem when the file is large enough), I will gladly try to distill a pure-core reproduction ("point me in the right direction and I'll go from there"). I'd also like to point out that the developers of the AWS SDK closed our bug report on the grounds that the problem is in node core, and the developers of the node postgres driver did the same, so I feel this problem falls between the cracks (not sure if that's how you phrase it in English). A stroke of luck has presented us with a simple program that reproduces this problem repeatedly, so can't there be an exception in this case?
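For context, a minimal sketch of the kind of call being described. This is not a core-only test case; it assumes the third-party aws-sdk module, the bucket, key, and file path are hypothetical, and credentials are taken from the environment:

```js
var fs = require('fs');
var AWS = require('aws-sdk'); // third-party; shown only to illustrate the call

var s3 = new AWS.S3();

// Reading a large file into memory and uploading it in a single putObject call
// is roughly what the reporter describes as triggering the warning under load.
var body = fs.readFileSync('/tmp/bulk-export.csv'); // hypothetical large file

s3.putObject({
  Bucket: 'my-bucket',          // hypothetical
  Key: 'bulk/bulk-export.csv',  // hypothetical
  Body: body
}, function(err, data) {
  if (err) return console.error('putObject failed:', err);
  console.log('uploaded, ETag:', data.ETag);
});
```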
If one of the other committers wants to pick this up, I'll cheer him on (from the sidelines). From previous experience, in 19 out of 20 cases the bug turns out to be either in the test case or a dependency - meaning you just spent 30 minutes debugging someone else's broken code. No offense, but life is too short for that. That's not to say there are no bugs in node core, but I'd rather spend my time fixing those than bugs in $RANDOM_LIBRARY.
This occurs across various modules; the two above are just an example. Could you at least reopen this for a little while so people will take notice? Maybe someone could contribute a vital piece of information. I will keep digging into this myself as well.
@kessler the stack trace you posted is too short to see where the emitted event is coming from. If you increase …
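The tip is cut off in the scrape; a common way to get longer V8 stack traces, and presumably what is being suggested here (an assumption), is raising Error.stackTraceLimit before the error is generated:

```js
// Assumption: the truncated advice refers to V8's stack trace limit.
// The default is 10 frames; raising it makes subsequent traces longer.
Error.stackTraceLimit = 100; // or Infinity

// ...then run the same workload and capture the warning's stack trace again.
```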
@trevnorris hey, wonderful tip, that :) Anyway, the stack trace didn't come out that much longer:
Does that get us anywhere yet?
@kessler and I'll assume you're using v0.10.20?
I have a sorta test case:

```js
var Readable = require('stream').Readable;
var util = require('util');

var b = new Buffer(1);

function MyThing() {
  Readable.call(this);
}
util.inherits(MyThing, Readable);

MyThing.prototype._read = function read(size) {
  this.push(b);
  if (this._readableState.buffer.length > 1000) {
    throw new Error('buffer is huge!');
  }
};

var thing = new MyThing();

setTimeout(function() {
  thing.on('readable', function() {
    thing.read();
  });
}, 10);
```

@isaacs Could it be possible that if tls is reading in a very large file that it just continues to loop and call …
@trevnorris This one is from 0.10.15, but the same bug occurs up until 0.11.1 (checked when I initially submitted this). I can recheck if you like.
I just discovered this bug in version 0.10.20, with a minimal test case. This requires the …

Producer:

```js
var crypto = require('crypto');
var net = require('net');
var Readable = require('stream').Readable;
var util = require('util');

function RandomStream(options) {
  Readable.call(this);
  options || (options = {});
  this.defaultBytes = options.defaultBytes || 1024;
}
util.inherits(RandomStream, Readable);

RandomStream.prototype._read = function(bytes) {
  this.push(crypto.pseudoRandomBytes(bytes || this.defaultBytes));
};

var stream = new RandomStream();

var conn = net.connect(process.env.PORT || 3000, function() {
  stream.pipe(conn);
});
conn.resume();
```

Consumer:

```js
var net = require('net');

var server = net.createServer(function(conn) {
  console.log('new connection');
  conn.resume();
});
server.listen(process.env.PORT || 3000);
```

Error:

If you run these two in tandem, the producer crashes with a massive number of warnings. Interestingly, these aren't errors; the error only comes after all the warnings, with the expected …

Notes:

For some odd reason, you cannot create the server and connect to it in the same process. If they are in the same process, nothing terribly interesting happens - the producer produces and the consumer consumes.

```
$ node
> require('./consumer')
{}
> require('./producer')
{}
> new connection
```

Also maybe possibly related? #5337
good
@skeggse At least your example is hitting the recursion error because printing the warning itself ultimately causes infinite recursion: it uses console.error, which then calls nextTick, which triggers the warning, which causes another console.error, which then calls nextTick...
@tjfontaine any prospects on this one, TJ?
Assigning this to myself. Will take a look when I have some time.
This one is troublesome. Can I send flowers to anyone to up the priority on this one? :)
Does anyone have a non-synthetic use case where this is happening? I accept that this is a reduced reproduction, and appreciate it, but any details about the actual use case would be helpful. "I have an infinite stream of random bytes that I want to send as fast as possible" is not a real-world scenario ;) (EDIT: "this" = @skeggse's script above)
@isaacs, I opened this issue initially because in our production we are uploading several big files to Amazon S3 every second - each file is the source for a bulk copy operation into Amazon Redshift. Due to the high frequency of the operation, we get this error quite a lot. The size threshold was around 120-150 megabytes. We reduced our bulk files to 100 megabytes to be on the safe side, and we rarely see this now. I would really like to be able to upload files larger than 100 megabytes over HTTPS using node.js without needing to worry about this issue. But I guess not that many people have run into this problem after all, otherwise this thread would be much more crowded...
@isaacs Same here with S3. However, I am seeing it when uploading files only in the 50 MB range. I was considering using external cURL calls to get around it.
@skeggse Quick note: your consumer script has the issue that it's not listening for 'data' events, so that will back up the server as it's trying to write out. As for the other issue, it has nothing to do with crypto; it's how streams work. I've already spent too many hours working on this. I'll look at it again later, as soon as my eyes stop bleeding.
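A minimal sketch of what he appears to be suggesting for the consumer script above (an interpretation, not the author's code); attaching a 'data' handler consumes the incoming bytes so the producer's writes don't back up:

```js
var net = require('net');

var server = net.createServer(function(conn) {
  console.log('new connection');
  // Drain the socket by consuming (and here simply discarding) incoming data,
  // rather than relying on resume() alone.
  conn.on('data', function(chunk) {
    // discard chunk, or process it as needed
  });
});
server.listen(process.env.PORT || 3000);
```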
@kessler What size of instance are you running on? I am working off a micro.
@pmccartney m3.2xlarge :-)
@kessler Just letting you know that executing curl to upload files to S3 has been working pretty well. I'm using the child_process module, but others have used straight exec.
@pmccartney If you take this approach, then may I suggest using s3cmd, which has retries and such.
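A rough sketch of the workaround being discussed, assuming s3cmd is installed and already configured (e.g. via s3cmd --configure); the file path and bucket are hypothetical:

```js
var execFile = require('child_process').execFile;

// Hypothetical local file and bucket; s3cmd handles the upload (and retries)
// outside the node process, sidestepping the stream recursion entirely.
execFile('s3cmd', ['put', '/tmp/bulk-export.csv', 's3://my-bucket/bulk-export.csv'],
  function(err, stdout, stderr) {
    if (err) return console.error('upload failed:', stderr || err);
    console.log('upload finished');
  });
```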
This was an oversight between the streams implementation and the nextTick handling, though it would be too difficult to fix in v0.10. This specific issue is fixed in v0.12, but another issue exists, as explained in #7401. WONTFIX
Hi,
Occasionally (but not rarely) I'm getting the "Recursive process.nextTick detected" warning in huge bursts. When using "--throw-deprecation" I get this stack trace:
I chose to post this issue since the stack trace is purely node core code.
Even worse, I cannot reliably reproduce it yet (but I'm working VERY hard on that), except by running my production servers under medium-to-heavy load for arbitrary periods of time.
Servers are Amazon machines running the AWS Linux distro with node v0.10.15.
Any and all help (even sympathy) will be tremendously appreciated.
Thanks!