
classes.jsa and classes_nocoops.jsa now excluded from MacOS JDKs/JREs? #937

Open
chadlwilson opened this issue Nov 1, 2023 · 17 comments · Fixed by adoptium/temurin-build#3923


chadlwilson commented Nov 1, 2023

Question

I noticed that the JREs for 17.0.9 now exclude classes*.jsa where they were included in 17.0.8.

While smaller JRE distributions are always great, my understanding was that class data sharing is still an important part of improving JRE startup time. Can someone point me to the relevant reading and the impact of this compared to previous releases? I infer that perhaps these were not actually being used/effective earlier, or that they were removed as part of the reproducible-build effort.
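
For reference, whether a given image ships the default archives can be checked with something like the below (assuming JAVA_HOME points at the Contents/Home directory of the MacOS bundle):

# List the default CDS archives, if any, shipped in the runtime image
ls "$JAVA_HOME"/lib/server/classes*.jsa

# Require class data sharing at startup; this fails fast if no archive can be mapped
"$JAVA_HOME"/bin/java -Xshare:on -version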

Context

Java version:

openjdk version "17.0.9" 2023-10-17
OpenJDK Runtime Environment Temurin-17.0.9+9 (build 17.0.9+9)
OpenJDK 64-Bit Server VM Temurin-17.0.9+9 (build 17.0.9+9, mixed mode)

Your operating system and platform:

MacOS Sonoma 14.1 on aarch64 (but relevant for other platforms)


karianna commented Nov 5, 2023

I've asked our build team to investigate

chadlwilson (Author) commented

After looking a little bit further (but not exhaustively across platforms and archs) it sort of looks like

  • Mac aarch64 builds: classes*.jsa has been excluded for quite a while, at least as far back as 17.0.4
  • Mac x64 builds: classes*.jsa started being excluded from 17.0.9+

So I guess they are at least consistent now, but I'm not sure of the impact.

chadlwilson changed the title from "classes.jsa and classes_nocoops.jsa now excluded from JREs?" to "classes.jsa and classes_nocoops.jsa now excluded from MacOS JREs?" Jan 20, 2024
chadlwilson changed the title from "classes.jsa and classes_nocoops.jsa now excluded from MacOS JREs?" to "classes.jsa and classes_nocoops.jsa now excluded from MacOS JDKs/JREs?" Jan 20, 2024

chadlwilson commented Jan 20, 2024

Based on a recent Mac aarch64 build log, this might be something to do with cross-compilation, if the build relies on the default archives being created automatically.

Perhaps aarch64 was always cross-compiled (so no default archive generated by the build if relying on JEP 341), and from 17.0.9+ x64 was maybe also considered cross-compiled due to some build env changes moving it to Apple Silicon machines?

I note work to possibly move to arm64 machines and cross-compile x64 (which might explain why the archives disappeared for x64 builds) in places like adoptium/infrastructure#2536 but can't really make sense of all the WIP as an outsider :-)

github-actions bot commented

We are marking this issue as stale because it has not been updated for a while. This is just a way to keep the support issues queue manageable.
It will be closed soon unless the stale label is removed by a committer, or a new comment is made.

github-actions bot added the stale label Apr 20, 2024
chadlwilson (Author) commented

I'm pretty sure this is still a problem.

github-actions bot added the keep label and removed the stale label Apr 21, 2024

giger85 commented Aug 20, 2024

Is there any update?
This issue is not resolved yet. (current version: jdk-21.0.4+7)


sxa commented Aug 20, 2024

I note work to possibly move to arm64 machines and cross-compile x64 (which might explain why the archives disappeared for x64 builds) in places like adoptium/infrastructure#2536 but can't really make sense of all the WIP as an outsider :-)

Hi - to confirm what you said: yes, we are cross-compiling the x64 ones on an aarch64 Mac, and have been for about the last year. Having said that, it looks like we haven't had the shared class cache included in any of the aarch64 Mac JDK17 releases, as 17+35 also did not include it. As you say, 17.0.8.1+1 was the last JDK17 Mac (x64) release to include it, which I believe is consistent with when we switched over to the cross-compiled x64 builds.

Also noting that this is not specific to the JREs - the JDK tarballs are also missing them (filenames here were shortened for my own convenience but 11 and 21 are the latest versions. The 17 tarballs have the exact version number on them):

sxa@orangepi5plus:/dev/shm$ for A in *gz; do echo ===== $A; tar tfz $A | grep classes; done
===== 11amac.tgz
===== 11xmac.tgz
===== 170amac.tgz
===== 170xmac.tgz
jdk-17+35/Contents/Home/lib/server/classes_nocoops.jsa
jdk-17+35/Contents/Home/lib/server/classes.jsa
===== 1781amac.tgz
===== 1781xmac.tgz
jdk-17.0.8.1+1/Contents/Home/lib/server/classes_nocoops.jsa
jdk-17.0.8.1+1/Contents/Home/lib/server/classes.jsa
===== 179amac.tgz
===== 179xmac.tgz
===== 21amac.tgz
===== 21xmac.tgz

sxa added the PMC-agenda label Aug 20, 2024

chadlwilson commented Aug 20, 2024

Thanks! I'm a little surprised that no-one else on Mac x64 noticed this changing, but maybe that just reflects that most people had been on Apple Silicon/aarch64 for ages, which has never had the archives included.

If the aarch64 builds are done on Apple Silicon/aarch64 these should presumably not be cross-compiled, so it seems some possible quirk of the build env is causing the archives to be omitted from both aarch64 and x64, with both being treated as cross-compiled for this purpose.

While I don't know what Temurin's intended compliance with things like JEP 341 (default CDS archives) is, and each vendor can presumably make its own decisions, it's worth noting that this is a little inconsistent. I imagine that if the archives could be included, MacOS performance would improve relative to the Linux builds (which do seem to ship them for Alpine and regular Linux across x64 and aarch64 at least) and also compared to other distributions.


sxa commented Aug 21, 2024

Thanks! I'm a little surprised that no-one else on Mac x64 noticed this changing

Yeah I'm quite surprised too :-)

it is worth noting that this is a little inconsistent here

Absolutely - I don't believe this is intentional and I'll be discussing it with the team tomorrow, although we have some people on vacation at this time of year. Hopefully we can get something in place before the next release though. Also it's an easy thing for us to put into some of our checks so I'll make sure that's done too if we get this resolved.

it seems some possible quirk of the build env is causing the archives to be omitted from both aarch64 and x64, with both being treated as cross-compiled for this purpose.

Yes I agree - we should be consistent and I suspect we'll need to understand why they're not currently being produced so thanks for pushing this one. As you suggest, the cross-compilation /shouldn't/ be relevant here since it's failing on the natively built aarch64 one too. I'm not entirely sure what the implications might be of creating an x64 CDS archive under Rosetta though ... It might not generate something optimal.

The latest build log for aarch64 initially looks ok:

20:49:08  checking if platform is supported by CDS... yes
20:49:08  checking if JVM feature 'cds' is available... yes

But later on in the build it seems to skip the generation because it thinks it's cross-compiling, so hopefully we can sort it out!

20:49:08  checking if CDS archive is available... no (not possible with cross compilation)
20:49:08  checking if a default CDS archive should be generated... disabled, from default 'auto'
20:49:08  checking if CDS archive is available... yes
20:49:08  checking if compatible cds region alignment enabled... disabled, default

The x64 one has the same:

03:07:22  checking if the CDS classlist generation should be enabled... enabled, from default 'auto'
03:07:22  checking if any translations should be excluded... no
03:07:22  checking if static man pages should be copied... enabled, default
03:07:22  checking if CDS archive is available... no (not possible with cross compilation)
03:07:22  checking if a default CDS archive should be generated... disabled, from default 'auto'
03:07:22  checking if CDS archive is available... yes
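
If the configure-time generation stays disabled for cross-compiled builds, then in principle a post-build step along these lines could recreate the default archives in the produced image (just a sketch - the image path is assumed, and this isn't necessarily how any eventual fix would be done):

# Hypothetical post-build step: regenerate the default CDS archives inside the built image
IMAGE=build/macosx-x86_64-server-release/images/jdk    # assumed product image location
"$IMAGE"/bin/java -Xshare:dump                         # writes lib/server/classes.jsa
"$IMAGE"/bin/java -Xshare:dump -XX:-UseCompressedOops  # writes lib/server/classes_nocoops.jsa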


sxa commented Aug 21, 2024

@chadlwilson FYI in the meantime you should be able to populate the shared class files on your local machine with:
java -Xshare:dump
And if you need the nocoops version:
java -Xshare:dump -XX:-UseCompressedOops

(As an aside, doing this is generally a nicer option anyway since the cache files are always somewhat dependent on the machine you're running on, so doing it on your local machine might give you something that's more suitable and efficient)
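
For completeness, whether the dumped archive is actually picked up at startup can be verified with something like this (the cds tag in unified logging is available on recent JDKs, certainly the 17 line):

# Require sharing and log CDS activity; this errors out if no usable archive is found
java -Xshare:on -Xlog:cds -version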

sxa removed the PMC-agenda label Aug 21, 2024

sxa commented Aug 21, 2024

Removing the PMC-agenda label today as this was discussed at the PMC meeting, and @gdams took an action to replicate and investigate what's needed to resolve this, at least for aarch64, on his system.


chadlwilson commented Aug 21, 2024

Thanks @sxa - I'm aware of the workaround from reading the JEPs and earlier knowledge.

https://github.com/gocd/gocd redistributes/bundles Temurin JREs across platforms for its users. MacOS is not such a common production deployment target, so it's not critical for me to add special logic and aarch64 builders for this platform (we otherwise build cross-platform from Linux x64 - no MacOS build hardware is available).

Locally, I probably switch around mise-managed JREs and JDKs too often to worry about doing this consistently.

I can't even 100% remember what I was doing when I noticed this - probably just diffing some GoCD binary distributions to compare releases at some point, and wondering why our distribution got smaller between releases with different bundled JRE versions.

And then I went down the CDS and JEP rabbit hole out of (morbid?) curiosity :-)


sxa commented Aug 22, 2024

I can't even 100% remember what I was doing when I noticed this

:-) But good that you did notice it since it gives us a chance to fix it and make sure we can trap it if the cache files disappear in the future.


gdams commented Aug 22, 2024

Noting that running configure on my local macOS M1 laptop returns checking if CDS archive is available... yes


sxa commented Aug 22, 2024

Noting that running configure on my local macOS M1 laptop returns checking if CDS archive is available... yes

Yep, that line is also present in the CI log snippets in my earlier comment.

chadlwilson (Author) commented

Thanks folks!

I have no objection, but... should we take it that generating CDS archives for MacOS x64 is a "won't fix", given the available build hardware (Apple Silicon) and platform priority?


sxa commented Aug 27, 2024

Thanks folks!

I have no objection, but... should we take it that generating CDS archives for MacOS x64 is a "won't fix", given the available build hardware (Apple Silicon) and platform priority?

At the moment it's more that it was an easy fix to resolve it on aarch64, so we got that applied. The question would be how to generate something meaningful in the cross-compiled case. Is there any value in a cache created in the Rosetta environment, for example, or would we need to run the generation on "real" x64 for it to be worthwhile?

This issue was closed because the related PR auto closed it. I'm going to reopen it until we have closure one way or another on x64 too, but it will require someone to do some investigation I think.
