
spark: init 3.2.1 and test on aarch64-linux #160075

Merged: 3 commits merged into NixOS:master from ConnorBaker:spark_3_2_1 on Mar 16, 2022

Conversation

ConnorBaker
Contributor

Motivation for this change

There is a new version of Spark available (3.2.1).

Additionally, assuming #158613 gets merged, aarch64-linux will be a supported platform, so we should test on it.

Things done

Added Spark 3.2.1, set it as the default for spark, added a spark3 alias, and enabled testing on aarch64-linux.

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
      • See warning below
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandbox = true set in nix.conf? (See Nix manual)
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
    • I verified that spark-shell was able to successfully launch
  • 22.05 Release Notes (or backporting 21.11 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
    • (Release notes changes) Ran nixos/doc/manual/md-to-db.sh to update generated release notes
  • Fits CONTRIBUTING.md.

On aarch64-linux (using the set of changes in #158613) I see the following when opening the spark2 shell:

$ ./result/bin/spark-shell 
22/02/15 00:25:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[ERROR] Failed to construct terminal; falling back to unsupported
java.lang.NumberFormatException: For input string: "0x100"
	at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
	at java.lang.Integer.parseInt(Integer.java:580)
	at java.lang.Integer.valueOf(Integer.java:766)
	at scala.tools.jline_embedded.internal.InfoCmp.parseInfoCmp(InfoCmp.java:59)
	at scala.tools.jline_embedded.UnixTerminal.parseInfoCmp(UnixTerminal.java:242)
	at scala.tools.jline_embedded.UnixTerminal.<init>(UnixTerminal.java:65)
	at scala.tools.jline_embedded.UnixTerminal.<init>(UnixTerminal.java:50)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at java.lang.Class.newInstance(Class.java:442)
	at scala.tools.jline_embedded.TerminalFactory.getFlavor(TerminalFactory.java:211)
	at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:102)
	at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:186)
	at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:192)
	at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:243)
	at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:235)
	at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:223)
	at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:64)
	at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiater$1$1.apply(ILoop.scala:858)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiater$1$1.apply(ILoop.scala:855)
	at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:862)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$22$$anonfun$apply$10.apply(ILoop.scala:873)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$22$$anonfun$apply$10.apply(ILoop.scala:873)
	at scala.util.Try$.apply(Try.scala:192)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$22.apply(ILoop.scala:873)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$22.apply(ILoop.scala:873)
	at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
	at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
	at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
	at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
	at scala.collection.immutable.Stream.collect(Stream.scala:435)
	at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:875)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$newReader$1$1.apply(SparkILoop.scala:184)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$newReader$1$1.apply(SparkILoop.scala:184)
	at scala.Option.fold(Option.scala:158)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.newReader$1(SparkILoop.scala:184)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$preLoop$1(SparkILoop.scala:188)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:249)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
	at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
	at org.apache.spark.repl.Main$.doMain(Main.scala:78)
	at org.apache.spark.repl.Main$.main(Main.scala:58)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Spark context Web UI available at http://ip-10-0-0-172.ec2.internal:4040
Spark context available as 'sc' (master = local[*], app id = local-1644884722895).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.8
      /_/
         
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_272)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 

I also saw this behavior on x86_64-linux:

$ ./result/bin/spark-shell 
22/02/15 00:48:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[ERROR] Failed to construct terminal; falling back to unsupported
java.lang.NumberFormatException: For input string: "0x100"
	[... remainder of stack trace identical to the aarch64-linux trace above ...]

Spark context Web UI available at http://ip-10-0-0-214.ec2.internal:4040
Spark context available as 'sc' (master = local[*], app id = local-1644886090440).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.8
      /_/
         
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_272)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 

but not on aarch64-darwin:

% ./result/bin/spark-shell 
22/02/14 19:30:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://10.0.0.217:4040
Spark context available as 'sc' (master = local[*], app id = local-1644885014257).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.8
      /_/
         
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_292)
Type in expressions to have them evaluated.
Type :help for more information.

scala>

The shell still seems to work, though, so I'm not sure whether this is actually a problem. I can't tell whether it's just the OpenJDK version differing across these platforms, or something else.

@github-actions bot added the 6.topic: nixos label on Feb 15, 2022
@ConnorBaker ConnorBaker marked this pull request as draft February 15, 2022 00:52
@ConnorBaker
Contributor Author

@thoughtpolice @offlinehacker @kamilchm @illustris

Would you mind taking a look? Briefly: I factored out the fetchzip logic and added the 3.2.1 release. In all-packages.nix I made spark3 continue to alias the 3.1.2 release. Testing is now also done on aarch64-linux.
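
Roughly, the shape of the refactor is the following (a simplified sketch, not the literal diff; the mirror URL layout is an assumption and the hashes are placeholders):

{ lib, stdenv, fetchzip }:
let
  # One helper builds a Spark derivation per version, so adding a new
  # release is a single attribute carrying a version and a hash.
  mkSpark = { version, hash }: stdenv.mkDerivation {
    pname = "spark";
    inherit version;
    src = fetchzip {
      # URL shape assumed from the Apache mirror layout.
      url = "mirror://apache/spark/spark-${version}/spark-${version}-bin-without-hadoop.tgz";
      inherit hash;
    };
    # ... install phase and wrapper logic shared across versions ...
  };
in {
  spark3_1_2 = mkSpark { version = "3.1.2"; hash = lib.fakeHash; };
  spark3_2_1 = mkSpark { version = "3.2.1"; hash = lib.fakeHash; };
}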

@ConnorBaker ConnorBaker marked this pull request as ready for review March 7, 2022 15:01
@illustris
Contributor

illustris commented Mar 8, 2022

Thanks! The diff looks good. I'll try running some workloads on aarch64 in a few hours.
Was there any reason you left spark3 pointing to the older version?

@ConnorBaker
Contributor Author

ConnorBaker commented Mar 8, 2022

Thanks! The diff looks good. I'll try running some workloads on aarch64 in a few hours. Was there any reason you left spark3 pointing to the older version?

My thought was that I'd rather have a separate PR change the version that spark3 aliases. Does that make sense? I'm not familiar with the best practices around bumping to a new minor release.

EDIT: I think I'd prefer to keep it at Spark 3.1.2, since there's more tooling built around that version than the newer one; for example, AWS Glue and AWS EMR both use 3.1.2. I'd hate for an end user to develop locally against a newer version and then discover, on deploying, that they relied on a new feature or that the runtime behavior differs.

EDIT2: Although, I am not a maintainer, so I defer to your judgement!

@illustris
Contributor

illustris commented Mar 8, 2022

I'm not sure the versions of AWS managed services should be a consideration for something that can also be deployed as a self-managed service. 3.1.x is also a stable release, and it is still available in nixpkgs for anyone to use, but having spark/spark3 default to the latest stable release makes more sense. This change would also need an entry in the release notes.

The error messages you're seeing on running spark-shell are probably because the TERM environment variable is not set correctly. I don't see them on my machine.

[illustris@desktop:~]$ nix shell github:ConnorBaker/nixpkgs/spark_3_2_1#spark3_2_1 -c spark-shell
22/03/09 00:39:00 WARN Utils: Your hostname, desktop resolves to a loopback address: 127.0.0.2; using 192.168.140.57 instead (on interface enp11s0)
22/03/09 00:39:00 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/09 00:39:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://192.168.1.57:4040
Spark context available as 'sc' (master = local[*], app id = local-1646766544484).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.1
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 1.8.0_272)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
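
If they do show up again, forcing a plainer TERM value before launching may sidestep the jline terminfo parsing (a guess at a workaround, untested here):

$ TERM=xterm ./result/bin/spark-shell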

I ran NixOS tests with the updated package on x86_64. Everything looks good. It should work the same on aarch64, as there is no native code in Spark.

Finally, this isn't a big deal, but could you change the package name from spark3_2_1 to spark_3_2_1 for the sake of consistency with hadoop? The _1 could even be omitted from the name since the last digit is usually incremented for updates that fix bugs and improve stability.

@ConnorBaker
Contributor Author

I'm not sure the versions of AWS managed services should be a consideration for something that can also be deployed as a self-managed service. 3.1.x is also a stable release, and it is still available in nixpkgs for anyone to use, but having spark/spark3 default to the latest stable release makes more sense.

That's a good point! I updated it so spark3 points to the latest.

This change would also need an entry in the release notes.

Good catch! I forgot to do this for my hadoop PR... any ideas on the best way to get those changes logged? Since hadoop now supports aarch64 and 4536186 was merged, I can run Spark on my M1 Mac. I know Apple Silicon isn't a high-tier platform, but I think it's still something worth announcing.

The error messages you're seeing on running spark-shell are probably because the TERM environment variable is not set correctly. I don't see them on my machine.


You're right, I think my shell was funky. I'm unable to reproduce it now.

Finally, this isn't a big deal, but could you change the package name from spark3_2_1 to spark_3_2_1 for the sake of consistency with hadoop? The _1 could even be omitted from the name since the last digit is usually incremented for updates that fix bugs and improve stability.

Yep! I changed it so it matches the versioning used for hadoop and took your recommendation to drop the point release from the name.
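
For reference, the resulting wiring looks roughly like this (an illustrative fragment, not the exact all-packages.nix diff; the callPackage path is an assumption):

# Versioned attribute named for consistency with hadoop, point release
# dropped; the spark/spark3 aliases track the latest 3.x release.
spark_3_2 = callPackage ../applications/networking/cluster/spark { };
spark3 = spark_3_2;
spark = spark3;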

Thank you so much for the awesome feedback!

@illustris
Contributor

Yes, it's definitely worth mentioning the addition of aarch64 support to hadoop and spark. You could just add that to this PR.
That aside, this looks ready to merge.

@ConnorBaker
Contributor Author

Just amended my commit to include the notes about hadoop and R.

Thank you again for all of your help and feedback @illustris!

@infinisil infinisil merged commit 47f2ee3 into NixOS:master Mar 16, 2022
@nixos-discourse

This pull request has been mentioned on NixOS Discourse. There might be relevant details there:

https://discourse.nixos.org/t/tweag-nix-dev-update-26/18252/1

@ConnorBaker ConnorBaker deleted the spark_3_2_1 branch February 9, 2023 19:36