[FEA] Support Scala 2.13 #1525
To be clear, the above issue allows for compilation with 2.13 but does not move the default to 2.13. We probably should do the same. I don't think we want to release/test anything against Scala 2.13 until it is the default in at least one release that we are targeting. We might want to have a nightly build just to verify that we didn't break compilation, though.
Yup.
Note they are targeting Scala 2.13 for Spark 3.2, so we would need a plugin built with Scala 2.13 that matches. @sameerz
Per @tgravescs, there will be a Scala 2.13 artifact, but it will not be the default version. It would be good if we can also generate a 2.13 artifact.
@sameerz @tgravescs So we are going to build two sets of artifacts for 21.10.0, e.g. …
Yes, we would need to build two sets of artifacts for 21.10.0, one for 2.12 (default) and one for 2.13 (non-default).
As I understand it, to build … Currently, there is no official release of Scala 2.13-based … So for the …, and if so, does that mean for …
I've tried to build a SNAPSHOT version of Scala 2.13-based spark-3.1.3 binaries and ran it with our …
I also tried to build …
There should be many Scala … @tgravescs @revans2 Could you please help to make the spark-rapids build pass against 2.13, so that I can set up a build/IT pipeline against Scala 2.13? Thanks!
This will never work. Scala is not backwards binary compatible between versions, which is why the Scala version appears by convention in the artifact names. The plugin must be built against the same Scala version Spark is using.
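For illustration, the binary-version suffix convention looks like this in an sbt build (the coordinates and version numbers below are illustrative assumptions, not taken from the plugin's actual build, which uses Maven):

```scala
// build.sbt sketch: "%%" appends the Scala binary version to the artifact name,
// so the resolved artifact is spark-sql_2.12 or spark-sql_2.13 depending on scalaVersion.
scalaVersion := "2.12.15"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.0" % "provided"

// Loading a _2.12 jar on a Scala 2.13 classpath typically fails at runtime with
// NoSuchMethodError / NoClassDefFoundError, because 2.12 and 2.13 are not binary compatible.
```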
Yes, there's significant work to make the plugin source-compatible with both Scala 2.12 and Scala 2.13.
@sameerz should schedule this work.
Moving to the backlog for now.
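One concrete example of the source-level work involved (illustrative, not code from the plugin): in Scala 2.13, `Map.mapValues` returns a lazy `MapView` rather than a `Map`, so code that compiles on 2.12 can fail to type-check on 2.13 and needs a cross-compiling form:

```scala
object CrossBuildExample {
  def main(args: Array[String]): Unit = {
    val m = Map("a" -> 1, "b" -> 2)

    // Compiles on 2.12 only: on 2.13, mapValues returns a MapView, not a Map,
    // so the explicit type annotation makes this a type error there.
    // val doubled: Map[String, Int] = m.mapValues(_ * 2)

    // Cross-compiling form that type-checks on both 2.12 and 2.13:
    val doubled: Map[String, Int] = m.map { case (k, v) => k -> (v * 2) }

    assert(doubled == Map("a" -> 2, "b" -> 4))
    println(doubled)
  }
}
```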
PR apache/spark@eb7adb1b00 updated the minor version to .7, so the pom is updated to 2.13.7. Whenever this is prioritized, I think we have to make sure we compile with this version for Spark 3.3.
I am trying to run a simple Scala app using Spark. I am getting similar errors to those discussed above. See the error below.
Any help to resolve my issue is highly appreciated!
@kalyansagi I assume you're trying to run with Scala 2.13? The RAPIDS Accelerator does not yet support Scala 2.13, and this issue is to track adding that support at some point in the future. I also don't see any evidence of the RAPIDS Accelerator in the stacktrace -- are you just running a standard app with Apache Spark without the RAPIDS Accelerator? In any case, I would recommend compiling your app with Scala 2.12, as that is the Scala version supported by most (all?) Spark 3.x distributions at this point. If compiling for Scala 2.12 does not solve the issue, please file a separate issue to discuss this, as we would like to leave this ticket focused on tracking the Scala 2.13 feature for the RAPIDS Accelerator.
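A minimal sketch of pinning an application build to Scala 2.12 so it matches the Spark distribution (the patch version and Spark version below are illustrative assumptions; match them to your cluster):

```scala
// build.sbt sketch for a Spark app targeting a Scala 2.12 distribution.
// The Scala version here must agree with the one Spark itself was built with.
scalaVersion := "2.12.15"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.0" % "provided"
```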
Related Spark migration to Scala 2.13: https://issues.apache.org/jira/browse/SPARK-25075 |
Scala 2.13 will be the default (Scala 2.12 will still be available) in Spark 3.5.0 https://issues.apache.org/jira/browse/SPARK-43836 |
The discussion on the mailing list moved it to Spark 4.0 https://lists.apache.org/thread/xz7x0nkjn1d13yl9on59lfttm6f0yd45 |
Is your feature request related to a problem? Please describe.
Move our sql plugin to compile with Scala 2.13 and test it, after apache/spark@d6a68e0b67.
Describe the solution you'd like
Code should compile with Scala 2.13.
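When verifying a 2.13 build, a quick runtime sanity check of which Scala standard library is actually on the classpath can help; this uses only a standard library API, nothing plugin-specific:

```scala
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Reports the version of the scala-library jar on the classpath, e.g. "2.13.7"
    val v = scala.util.Properties.versionNumberString
    println(s"Scala library version: $v")
    assert(v.nonEmpty)
  }
}
```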