
Performance issue on Node 4x #54

Closed
aheckmann opened this issue Oct 17, 2015 · 19 comments

@aheckmann
Contributor

On Node.js 4.x, when lru-cache is configured with a max setting above 8192, behavior gradually becomes very slow: the longer the program runs, the slower it gets.

Starting with a max of 8193, the program abruptly slows down soon after the maximum number of elements has been reached; before long it takes upwards of 30 seconds to process 5000 set/remove operations. As max is raised higher, performance still degrades, but the program must run longer and longer to reach the same level of degradation.
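A minimal repro in the spirit of the description above (a hypothetical sketch, not the original benchmark; the key names, eviction scheme, and batch sizes are made up for illustration):

```javascript
// Hypothetical repro sketch: push an Object.create(null)-backed store past
// 8192 keys and time batches of 5000 set/remove pairs. On affected Node 4.x
// versions the per-batch time climbs steadily; on earlier releases it stays flat.
const cache = Object.create(null)
const MAX = 8193
let count = 0
let i = 0

function setKey (key) {
  if (count >= MAX) {
    // Evict the oldest key, roughly as an LRU at capacity would.
    delete cache['key' + (i - MAX)]
    count--
  }
  cache[key] = i
  count++
}

const times = []
for (let batch = 0; batch < 10; batch++) {
  const start = Date.now()
  for (let j = 0; j < 5000; j++, i++) setKey('key' + i)
  // On affected versions, each successive batch takes longer than the last.
  times.push(Date.now() - start)
}
```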

I haven't been able to reproduce this behavior on any of these earlier releases of Node/iojs:

  • iojs-3.3.1
  • iojs-2.5.0
  • iojs-1.8.4
  • node-0.12.7
  • node-0.11.16
  • node-0.10.40

This is probably a V8 bug, but it seemed worth making it known here.

Note: I forked this module and changed it to use a Map instead of Object.create(null), and the issue is resolved. I'm not sure switching to a Map is the right choice for this module, due to backward compatibility (and browser support), but it looks like it could be a good solution going forward. I haven't pushed my code public yet, but let me know if you'd be interested in going that direction.
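For illustration, a Map-backed LRU along the lines described might look like this (a hypothetical sketch, not aheckmann's actual fork; it relies on Map's insertion-order iteration to find the least recently used key):

```javascript
// Minimal Map-backed LRU sketch. Map iterates entries in insertion order,
// so the first key yielded by keys() is always the least recently used.
class MapLRU {
  constructor (max) {
    this.max = max
    this.map = new Map()
  }
  get (key) {
    if (!this.map.has(key)) return undefined
    // Delete and re-insert to mark the entry as most recently used.
    const value = this.map.get(key)
    this.map.delete(key)
    this.map.set(key, value)
    return value
  }
  set (key, value) {
    if (this.map.has(key)) this.map.delete(key)
    this.map.set(key, value)
    if (this.map.size > this.max) {
      // Evict the least recently used entry (first in iteration order).
      this.map.delete(this.map.keys().next().value)
    }
  }
}
```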

@mhart

mhart commented Oct 26, 2015

I explored this a little and created an issue over at nodejs/node#3538

@ofrobots

Can you try a work-around to see if it helps? Switch the object to dictionary mode up-front to avoid the heuristics being used by V8. This should be achievable by something like the following at startup:

this._cache[200000] = true; delete this._cache[200000];

It seems there is a pathology in V8 in how objects are transitioned from fast elements mode to dictionary mode. Judging by the usage, it seems like this object would work better as a dictionary.
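The suggested experiment, sketched as a standalone snippet (the index 200000 is arbitrary; the point is that a large, sparse numeric key forces V8's backing store into dictionary mode up front, so the slow transition never happens mid-run):

```javascript
// Force the cache object's backing store into dictionary (hash) mode at
// startup by touching and deleting a large sparse numeric index.
function makeDictionaryModeCache () {
  const cache = Object.create(null)
  cache[200000] = true
  delete cache[200000]
  return cache
}

const cache = makeDictionaryModeCache()
cache['some-key'] = 'some-value' // subsequent use is unchanged
```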

@ofrobots

To be clear, I am not suggesting this as a fix, but rather as an experiment to see if it helps.

@mhart

mhart commented Oct 28, 2015

Thanks @ofrobots – I tested this and it does indeed fix the problem. I created a fix over at #57

@isaacs
Owner

isaacs commented Oct 28, 2015

Thank you all so much for the research and workarounds here. This is great. I'll take a look at this soon (next couple days) and push a fix.

I kind of like the solution presented in #55, if only because Map is a better data structure for this use-case. However, it means losing backwards compatibility, and that sucks. I'm half-considering refactoring it in such a way that we can fork the map-specific bits out based on Node.js version, but I wonder if that'd be too weird.

The solution in #57 is obviously lighter-touch, and doesn't require any major surgery, but I worry that it's a bit too much of a kludge. (Otoh, if V8 will fix the perf regression someday, maybe it's a temporary kludge?)

@ofrobots

My suggestion is definitely a kludge and I would recommend against using it just yet until the root cause is better understood.

@aheckmann
Contributor Author

friendly ping

@joepie91

Any update on this?

@aheckmann
Contributor Author

There is a fix on the way in V8 but according to the comments we will still see "~200 millisecond" degradation.

https://code.google.com/p/v8/issues/detail?id=4518

I'll probably continue using my Map fork for a while since it works everywhere I need it to.

@isaacs
Owner

isaacs commented Nov 25, 2015

@aheckmann Can you try npm i isaacs/node-lru-cache#map and see if that works well for you?

I refactored the map branch with a polyfill for older JS engines.
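A compatibility shim along these lines could look like the following (a hypothetical sketch, not the actual polyfill shipped on the #map branch; it covers only the subset of the Map API an LRU cache needs, and only string keys):

```javascript
// Fall back to a tiny Map stand-in on engines without a native Map.
function FallbackMap () {
  this._data = Object.create(null)
  this.size = 0
}
FallbackMap.prototype.has = function (key) { return key in this._data }
FallbackMap.prototype.get = function (key) { return this._data[key] }
FallbackMap.prototype.set = function (key, value) {
  if (!(key in this._data)) this.size++
  this._data[key] = value
  return this
}
FallbackMap.prototype.delete = function (key) {
  if (!(key in this._data)) return false
  delete this._data[key]
  this.size--
  return true
}

// Prefer the native Map where available.
const MapImpl = typeof Map === 'function' ? Map : FallbackMap
```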

@ofrobots

@isaacs Maps are implemented on top of the same mechanism in V8, and suffer the same issue. Even with maps I would expect the same pathology.

@ofrobots

Running the test-case in nodejs/node#3538 with lruCache initialized as a new Map(), I see the pathology still. I am assuming that this is what isaacs/node-lru-cache#map does, in essence.

@isaacs
Owner

isaacs commented Nov 25, 2015

@ofrobots weird, that's not what @aheckmann described.

@ofrobots

@aheckmann would you be able to independently try out the Map change in the test-case from nodejs/node#3538? Your observations and mine need to be reconciled.

@aheckmann
Contributor Author

@isaacs just tested your branch, looks like it performs the same as my fork. +1
@ofrobots sure thing. will try that next

@aheckmann
Contributor Author

@ofrobots actually, not sure what you mean - are you running the exact same script but only substituting Object.create(null) with new Map()? That isn't the same thing as using the Map APIs.
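To illustrate the distinction being drawn: assigning a property on a Map instance does not create a Map entry, so only code that goes through the Map methods (set/get/delete) exercises Map's internal hash-table storage.

```javascript
const m = new Map()

m['foo'] = 1    // plain object property -- invisible to the Map methods
m.set('bar', 2) // an actual Map entry

// m.size counts only real entries; m.get('foo') is undefined even
// though m['foo'] is still 1.
```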

@ofrobots

D'Oh. My lack of knowledge about ES6 is showing. Please ignore my comments about Map. 😬

@aheckmann
Contributor Author

No worries, I'm just getting my feet wet these days too :)

@isaacs
Owner

isaacs commented Nov 27, 2015

Fixed in 3.0.0 release. See: #55 (comment)
