1.3-RC4: package loading performance regression #33615

Closed
pfitzseb opened this issue Oct 21, 2019 · 14 comments
Assignees: JeffBezanson
Labels: compiler:inference, compiler:latency, regression
Milestone: 1.3

Comments

@pfitzseb
Member

RC4 introduces a fairly big (~3x) loading-time regression in some circumstances. Compare

julia on tags/v1.3.0-rc3 [$] 
λ julia --quiet                                 
julia> @elapsed using CSV
2.604898945

julia on tags/v1.3.0-rc3 [$] 
λ julia --quiet
julia> using Atom

julia> @elapsed using CSV
3.608714034

with

julia on tags/v1.3.0-rc4 [$] 
λ julia --quiet
julia> @elapsed using CSV
2.716010038

julia> 

julia on tags/v1.3.0-rc4 [$] 
λ julia --quiet
julia> using Atom

julia> @elapsed using CSV
11.830135055

This was originally reported here.

Bisecting points to c76690e.

@JeffBezanson
Sponsor Member

Sadly we're a bit stuck here, since that commit is needed to fix an even worse slowdown (#33472).

@JeffBezanson added the compiler:inference and compiler:latency labels on Oct 21, 2019
@pfitzseb
Member Author

Yeah, I saw that. Anything I can do to help debug this?

@JeffBezanson added and then removed the compiler:inference label on Oct 21, 2019
@iamed2
Contributor

iamed2 commented Oct 25, 2019

Not sure yet, but this may be affecting us as well, on the order of minutes.

@fchorney
Contributor

Doing some tests with a private package we have:

Julia 1.2

julia> @elapsed using Foo
11.989522323

Julia 1.3-rc4.1

julia> @elapsed using Foo
592.601374137

Going to try and do some more tests to see if I can pinpoint anything specific, but this is definitely an issue for us.

@baggepinnen
Contributor

Also, memory consumption during precompilation seems to have skyrocketed. I have some packages that cannot precompile on an 8 GB RAM laptop unless Julia is the only thing running, forget Atom and Chrome. Ref https://discourse.julialang.org/t/memory-consumption-of-precompilation/30327/3

@KristofferC
Sponsor Member

People who encounter this: can you try reverting (individually) the first two commits in #33499 and see how that influences things?

@pfitzseb
Member Author

pfitzseb commented Oct 26, 2019

When I was bisecting the regression mentioned in the OP, only c76690e made any difference.

@Keno added this to the 1.3 milestone on Oct 29, 2019
@JeffBezanson self-assigned this on Oct 29, 2019
@JeffBezanson
Sponsor Member

Seems to be caused by a slight increase in backedges, leading to some unlucky slow inference. It's just normal inference time; the time is not spent in the typeintersect call added in the implicated commit. I'll work on some mitigations.
One culprit here is CategoricalArrays, which adds very low-level methods to == and convert like

Base.:(==)(x::CatValue, y::Any) = get(x) == y

which intersects with a huge number of call signatures due to the Any argument.
It's hard to tell what changed here to kick off so much inference time, so the effect is a bit subtle.

@JeffBezanson added the regression label on Oct 30, 2019
@JeffBezanson
Sponsor Member

@fchorney Can you confirm whether reverting c76690e fixes your case? If not we might need a separate issue.

@fchorney
Contributor

@JeffBezanson when I revert that commit, it definitely seems to help the loading times

@JeffBezanson
Sponsor Member

Does it help just a little, or fix it completely? You can also send the code to me privately if possible.

@fchorney
Contributor

Based on what I was seeing during testing, I was getting around the same times on 1.2 as with reverting that commit on 1.3. I wanted to do a little more testing to make sure I'm actually getting better performance.

On 1.2 I get ~11 seconds, on 1.3-rc4.1 I get ~20 seconds, and on 1.3-rc4.1 with c76690e reverted I get ~14 seconds.

Sorry for the hold-up; unfortunately I can't seem to reproduce the super slow loading times anymore. I'm not sure if it's caching something or what.

@JeffBezanson
Sponsor Member

Further interesting data point: for me, with CSV 0.5.14 and CategoricalArrays 0.7.1 the time for using CSV after using Atom is over 16 seconds. With CSV 0.5.12 and CategoricalArrays 0.6.0 the time is 11 seconds.

JeffBezanson added a commit that referenced this issue Nov 9, 2019
@JeffBezanson
Sponsor Member

This has been addressed on the backports branch.
