
[SPARK-34246][SQL][FOLLOWUP] Change the definition of findTightestCommonType for backward compatibility #32493

Closed

Conversation

gengliangwang
Member

What changes were proposed in this pull request?

Change the definition of findTightestCommonType from

```scala
def findTightestCommonType(t1: DataType, t2: DataType): Option[DataType]
```

to

```scala
val findTightestCommonType: (DataType, DataType) => Option[DataType]
```
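
For context on why the two forms are binary-incompatible even though call sites look identical in Scala source, here is a minimal, self-contained sketch (the `DataType` stub below stands in for `org.apache.spark.sql.types.DataType` and is not the Spark class):

```scala
// Stub standing in for org.apache.spark.sql.types.DataType.
sealed trait DataType
case object IntType extends DataType

object DefVersion {
  // A def compiles to a two-argument JVM method:
  //   Option<DataType> findTightestCommonType(DataType, DataType)
  def findTightestCommonType(t1: DataType, t2: DataType): Option[DataType] =
    if (t1 == t2) Some(t1) else None
}

object ValVersion {
  // A val of function type compiles to a zero-argument accessor:
  //   Function2<DataType, DataType, Option<DataType>> findTightestCommonType()
  val findTightestCommonType: (DataType, DataType) => Option[DataType] =
    (t1, t2) => if (t1 == t2) Some(t1) else None
}
```

In Scala source both are invoked the same way, e.g. `DefVersion.findTightestCommonType(IntType, IntType)`, but bytecode compiled against one form does not link against the other.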

Why are the changes needed?

For backward compatibility.
When running a MongoDB connector (built with Spark 3.1.1) against the latest master, the following error occurs:

```
java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.analysis.TypeCoercion$.findTightestCommonType()Lscala/Function2
```

from https://github.com/mongodb/mongo-spark/blob/master/src/main/scala/com/mongodb/spark/sql/MongoInferSchema.scala#L150

In the previous release, the function was:

```java
static public scala.Function2<org.apache.spark.sql.types.DataType, org.apache.spark.sql.types.DataType, scala.Option<org.apache.spark.sql.types.DataType>> findTightestCommonType()
```

After #31349, the function becomes:

```java
static public scala.Option<org.apache.spark.sql.types.DataType> findTightestCommonType(org.apache.spark.sql.types.DataType t1, org.apache.spark.sql.types.DataType t2)
```

This PR avoids the unnecessary API change.
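
To make the failure mode concrete, a caller-side sketch, assuming a Spark build where findTightestCommonType is the val form (as in 3.1.1); this mirrors what MongoInferSchema does, and the argument types here are only illustrative:

```scala
import org.apache.spark.sql.catalyst.analysis.TypeCoercion
import org.apache.spark.sql.types.{DataType, IntegerType, LongType}

// Compiled against the val form, this line links to the zero-arg accessor
// findTightestCommonType()Lscala/Function2; and then calls Function2.apply.
// Running the same bytecode against a Spark build that only has the def
// form fails at link time with java.lang.NoSuchMethodError.
val widened: Option[DataType] =
  TypeCoercion.findTightestCommonType(IntegerType, LongType)
```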

Does this PR introduce any user-facing change?

Yes, the definition of TypeCoercion.findTightestCommonType is consistent with the previous release again.

How was this patch tested?

Existing unit tests

@SparkQA

SparkQA commented May 10, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/42857/

@SparkQA

SparkQA commented May 10, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/42857/

@cloud-fan
Contributor

cloud-fan commented May 10, 2021

Generally speaking, we don't need to keep backward compatibility for internal APIs. But this one is just a small change to avoid breaking a MongoDB connector, which seems worthwhile to me. LGTM

@gengliangwang
Member Author

Thanks, merging to master

@SparkQA

SparkQA commented May 10, 2021

Test build #138335 has finished for PR 32493 at commit f8d7acb.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@maropu maropu changed the title [SPARK-34246][FOLLOWUP] Change the definition of findTightestCommonType for backward compatibility [SPARK-34246][SQL][FOLLOWUP] Change the definition of findTightestCommonType for backward compatibility May 10, 2021
@maropu
Member

maropu commented May 10, 2021

late lgtm.
