[Attn] Support for Spark 2.4.2 #60
Strong support for this one. I spent more than an hour suspecting something was wrong with my Spark configuration, because it kept raising an error about the Spark Logging class not being found while I was using 2.4.2.
I get this:

when using v. 2.4.2. Please provide directions on which version of Spark to install.
@borgdylan Please check the supported versions here: https://github.com/dotnet/spark/blob/master/docs/release-notes/0.2/release-0.2.md
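If you are unsure which Spark (and Scala) version your installation actually runs, you can confirm it from spark-shell before comparing against the release notes; this uses only standard Spark/Scala API and nothing specific to this repo:

```scala
// Run inside spark-shell; `spark` is the SparkSession the shell creates.
println(s"Spark version: ${spark.version}")
// The Scala binary version matters too (see the discussion below).
println(s"Scala version: ${scala.util.Properties.versionNumberString}")
```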
Spark 2.4.3 is now released: http://spark.apache.org/releases/spark-release-2-4-3.html, and it fixes the default Scala version. If you use microsoft-spark-2.4.x-0.2.0.jar against Spark 2.4.3, you now get the correct error message:
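(The error output was attached in the original comment and did not survive here; based on the message format quoted in the issue body below, it presumably reads along these lines:)

```
Unsupported spark version used: 2.4.3. Supported versions: 2.4.0, 2.4.1
```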
We are not going to support Spark 2.4.2 since it requires a new jar built against Scala 2.12. We will release 0.3.0 that supports Spark 2.4.3 soon.
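For context, the "Unsupported spark version" message comes from a simple allowlist-style check at startup. A minimal Scala sketch of such a guard (illustrative names only, not the actual dotnet/spark implementation):

```scala
// Sketch of a supported-version guard (illustrative; not the real code).
object SparkVersionCheck {
  val supportedVersions: Set[String] = Set("2.4.0", "2.4.1")

  def validate(version: String): Unit =
    if (!supportedVersions.contains(version)) {
      throw new IllegalArgumentException(
        s"Unsupported spark version used: $version. " +
          s"Supported versions: ${supportedVersions.toSeq.sorted.mkString(", ")}")
    }
}
```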
Summary: You cannot use .NET for Apache Spark with Apache Spark 2.4.2
Details: Spark 2.4.2 was released on 4/23/19 and using it against microsoft.spark.2.4.x results in unexpected behavior (reported in #48, #49); the expected behavior is that you would get an exception message such as
Unsupported spark version used: 2.4.2. Supported versions: 2.4.0, 2.4.1
causing less confusion. This is likely due to the Scala version upgrade to 2.12 in 2.4.2. Note that microsoft.spark.2.4.x is built with Scala 2.11. There is an ongoing discussion about this (http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-2-4-2-tc27075.html#a27139), so depending on the outcome of that discussion, 2.4.2 may or may not be supported.
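To make the binary-compatibility point concrete: Scala libraries are published once per Scala binary version, and a jar compiled against 2.11 cannot be loaded by a 2.12 runtime. In sbt terms (illustrative build snippet, not this repo's actual build):

```scala
// `%%` appends the Scala binary suffix, i.e. spark-core_2.11 vs. spark-core_2.12.
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"
// A jar built this way fails on a Scala 2.12 Spark distribution at runtime,
// e.g. with the missing Spark Logging class reported above.
```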
Do you want to help? While we are closely monitoring this and working with the Apache Spark community to address it, feel free to reply to the main thread about any problems this issue has caused, so we can avoid such mishaps in the future.