
[SPARK-44247][BUILD] Upgrade Arrow to 13.0.0 #42181

Closed · wants to merge 15 commits

Conversation


@panbingkun panbingkun commented Jul 27, 2023

What changes were proposed in this pull request?

This PR aims to upgrade Arrow from 12.0.1 to 13.0.0.
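
In Spark's Maven build this is a one-property change in the root `pom.xml`. A minimal sketch (assuming the property is named `arrow.version`, as in Spark's build; this is not the actual diff):

```xml
<!-- Sketch only: bump the Arrow version property in the root pom.xml.
     The property name arrow.version is an assumption based on Spark's build. -->
<properties>
  <arrow.version>13.0.0</arrow.version> <!-- previously 12.0.1 -->
</properties>
```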

Why are the changes needed?

  1. Arrow 12.0.1 vs. 13.0.0:
    apache/arrow@apache-arrow-12.0.1...apache-arrow-13.0.0

  2. Arrow 13.0.0 is the first version to support Java 21.

When Arrow 12.0.1 runs on Java 21, the following error occurs:

java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
       org.apache.arrow.memory.util.MemoryUtil.directBuffer(MemoryUtil.java:167)
       org.apache.arrow.memory.ArrowBuf.getDirectBuffer(ArrowBuf.java:228)
       org.apache.arrow.memory.ArrowBuf.nioBuffer(ArrowBuf.java:223)
       org.apache.arrow.vector.ipc.ReadChannel.readFully(ReadChannel.java:87)
       org.apache.arrow.vector.ipc.message.MessageSerializer.readMessageBody(MessageSerializer.java:727)
       org.apache.arrow.vector.ipc.message.MessageChannelReader.readNext(MessageChannelReader.java:67)
       org.apache.arrow.vector.ipc.ArrowStreamReader.loadNextBatch(ArrowStreamReader.java:145)

After this PR, we can try to enable Netty-related testing on Java 21.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

  • Pass GA.

@github-actions github-actions bot added the BUILD label Jul 27, 2023
@LuciferYang (Contributor)

We need to upgrade Arrow 13.0 and Netty 4.1.94+ together.

@LuciferYang (Contributor)

cc @dongjoon-hyun It seems Arrow 13.0.0 has been released; after this one, we can re-enable the Arrow-based test cases for Java 21 :)

@LuciferYang (Contributor)

@panbingkun It appears it hasn’t been published to the Maven Central Repository yet. We can test it using asf-staging first :)

@panbingkun (Contributor, author)

> @panbingkun It appears it hasn’t been published to the Maven Central Repository yet. We can test it using asf-staging first :)

Thanks @LuciferYang , let's test it in advance. 😄
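
To test against the staged artifacts before they sync to Maven Central, the ASF staging repository can be added to the build temporarily. A sketch (the URL is assumed to be the standard ASF staging location; remove it again before merging):

```xml
<!-- Temporary addition for pre-release testing only; not part of this PR. -->
<repositories>
  <repository>
    <id>asf-staging</id>
    <url>https://repository.apache.org/content/repositories/staging/</url>
  </repository>
</repositories>
```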


@panbingkun panbingkun marked this pull request as ready for review July 27, 2023 09:31
@LuciferYang (Contributor) commented Jul 27, 2023

@panbingkun

  1. Please update the PR title, since we are upgrading both Netty and Arrow in this one.
  2. Please update the PR description to explain why these two dependencies must be upgraded in one PR: what error occurs if we only upgrade Arrow, and what error occurs if we only upgrade Netty (I know this is because Netty changed a developer API that Arrow uses). It is best to list the related Netty and Arrow PR links or issue links.

@panbingkun panbingkun changed the title [SPARK-44563][BUILD] Upgrade Apache Arrow to 13.0.0 [SPARK-44563][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.95.Final Jul 27, 2023
@panbingkun panbingkun marked this pull request as draft July 27, 2023 11:35
@panbingkun (Contributor, author)

Let's hold this for now: while I was writing this PR, Netty released version 4.1.96.Final.
https://github.com/netty/netty/releases/tag/netty-4.1.96.Final

@LuciferYang (Contributor)

@panbingkun If we only upgrade Arrow to 13.0.0 while Netty stays at 4.1.93.Final, the tests fail like:

java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;
    at io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.newDirectBufferL(PooledByteBufAllocatorL.java:164)
    at io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.directBuffer(PooledByteBufAllocatorL.java:214)
    at io.netty.buffer.PooledByteBufAllocatorL.allocate(PooledByteBufAllocatorL.java:58)
    at org.apache.arrow.memory.NettyAllocationManager.<init>(NettyAllocationManager.java:77)
    at org.apache.arrow.memory.NettyAllocationManager.<init>(NettyAllocationManager.java:84)
    at org.apache.arrow.memory.NettyAllocationManager$1.create(NettyAllocationManager.java:34)
    at org.apache.arrow.memory.BaseAllocator.newAllocationManager(BaseAllocator.java:355)
    at org.apache.arrow.memory.BaseAllocator.newAllocationManager(BaseAllocator.java:350)
    at org.apache.arrow.memory.BaseAllocator.bufferWithoutReservation(BaseAllocator.java:338)
    at org.apache.arrow.memory.BaseAllocator.buffer(BaseAllocator.java:316)
    at org.apache.arrow.memory.BaseAllocator.buffer(BaseAllocator.java:280)
    at org.apache.arrow.vector.BaseVariableWidthVector.allocateBytes(BaseVariableWidthVector.java:462)
    at org.apache.arrow.vector.BaseVariableWidthVector.allocateNew(BaseVariableWidthVector.java:420)
    at org.apache.arrow.vector.BaseVariableWidthVector.allocateNew(BaseVariableWidthVector.java:380)
    at org.apache.spark.sql.execution.arrow.ArrowWriter$.$anonfun$create$1(ArrowWriter.scala:44)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at scala.collection.TraversableLike.map(TraversableLike.scala:286)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
    at scala.collection.AbstractTraversable.map(Traversable.scala:108)
    at org.apache.spark.sql.execution.arrow.ArrowWriter$.create(ArrowWriter.scala:43)
    at org.apache.spark.sql.execution.arrow.ArrowConverters$ArrowBatchIterator.<init>(ArrowConverters.scala:93)
    at org.apache.spark.sql.execution.arrow.ArrowConverters$ArrowBatchWithSchemaIterator.<init>(ArrowConverters.scala:138)
    at org.apache.spark.sql.execution.arrow.ArrowConverters$$anon$1.<init>(ArrowConverters.scala:226)
    at org.apache.spark.sql.execution.arrow.ArrowConverters$.createEmptyArrowBatch(ArrowConverters.scala:224)
    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.handleSqlCommand(SparkConnectPlanner.scala:2430)
    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.process(SparkConnectPlanner.scala:2374)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.handleCommand(ExecuteThreadRunner.scala:170)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:146)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:124)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:192)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:192)
    at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:94)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withContextClassLoader$1(SessionHolder.scala:179)
    at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:199)
    at org.apache.spark.sql.connect.service.SessionHolder.withContextClassLoader(SessionHolder.scala:178)
    at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:191)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:124)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:79)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:188)

I think we can supplement this section in the PR description.

  • Only upgrade netty:
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolThreadCache;
  • Only upgrade arrow:
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;

Then we can say that both dependencies must be upgraded simultaneously.
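
The two error messages make the incompatibility visible: the JVM links method calls by their full descriptor, which includes the return type, and callers compiled against different Netty versions expect different return types for the same internal method. Schematically (signatures reconstructed from the `NoSuchMethodError` messages above, not from Netty source):

```
// Expected by code compiled against Netty <= 4.1.93:
PoolThreadCache threadCache()
// Expected by code compiled against Netty 4.1.94 / 4.1.95:
PoolArenasCache threadCache()
```

So an Arrow build compiled against one signature throws `NoSuchMethodError` at runtime when the other Netty version is on the classpath.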

@panbingkun (Contributor, author)

> @panbingkun Only upgrade arrow to 13.0.0, the netty version is 4.1.93.Final, the test failed will like `java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;` […] then, we can say that both dependencies must be upgraded simultaneously

Done.

@panbingkun panbingkun changed the title [SPARK-44563][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.95.Final [SPARK-44563][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.96.Final Jul 28, 2023
@panbingkun panbingkun changed the title [SPARK-44563][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.96.Final [SPARK-44563][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.95.Final Jul 28, 2023
@panbingkun panbingkun marked this pull request as ready for review July 28, 2023 10:42
@LuciferYang (Contributor)

@panbingkun If Arrow 13.0.0 targets support for Netty 4.1.96, then the Netty upgrade is no longer coupled with the Arrow upgrade. Can you create a separate PR to upgrade Netty to 4.1.96 first?

@panbingkun (Contributor, author) commented Jul 31, 2023

> then it is no longer coupled with the upgrade of Arrow. Can you create a separate PR to upgrade Netty to 4.1.96 first?

Okay, let me do it.
About Netty: #42232

@panbingkun (Contributor, author)

Last week, I submitted a PR (apache/arrow#36926) against arrow-memory-netty so that arrow-memory-netty 13.0.0 runs well with the newest Netty version, 4.1.96.Final.
As far as I understand, that change will be released with the official arrow-memory-netty 13.0.0.

@dongjoon-hyun (Member)

Thank you for working on this, @panbingkun and @LuciferYang .
I saw the tag was created 5 hours ago. Please let us know again when the artifacts become available.

@dongjoon-hyun dongjoon-hyun changed the title [SPARK-44563][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.95.Final [SPARK-44247][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.95.Final Aug 1, 2023
@dongjoon-hyun (Member)

BTW, I updated the PR title with SPARK-44247 instead of SPARK-44563.

@LuciferYang (Contributor)

Yes, Arrow 13.0.0-rc2 has been released, and this version should support Netty <= 4.1.93 and >= 4.1.96. We can remove the Netty part from this PR and test the new RC today @panbingkun :)

@panbingkun panbingkun changed the title [SPARK-44247][BUILD] Upgrade Arrow to 13.0.0 & Netty to 4.1.95.Final [SPARK-44247][BUILD] Upgrade Arrow to 13.0.0 Aug 2, 2023
@dongjoon-hyun (Member) left a review comment

Could you make this PR ready, @panbingkun ?

@github-actions github-actions bot removed the INFRA label Aug 24, 2023
@panbingkun (Contributor, author)

> Could you make this PR ready, @panbingkun ?

Done.

@dongjoon-hyun (Member)

Thank you so much, @panbingkun .

@dongjoon-hyun (Member) left a review comment

+1, LGTM (Pending CIs)

@LuciferYang (Contributor) left a review comment

LGTM

@LuciferYang (Contributor)

This doesn't need to be merged into 3.5, right? @dongjoon-hyun

@dongjoon-hyun (Member)

Yes, it's too late for Apache Spark 3.5.0.

@LuciferYang (Contributor)

> Yes, it's too late for Apache Spark 3.5.0.

OK

@dongjoon-hyun (Member)

The affected version of SPARK-44247 (this PR) is 4.0.0.
And, the target version of SPARK-43831 (Java 21) is 4.0.0.

@dongjoon-hyun (Member)

All tests passed.

@dongjoon-hyun (Member)

Merged to master for Apache Spark 4.

dongjoon-hyun added a commit that referenced this pull request Aug 24, 2023
### What changes were proposed in this pull request?

This PR aims to re-enable `test_sparkSQL_arrow.R` in Java 21.
This depends on #42181 .

### Why are the changes needed?

To have Java 21 test coverage.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

```
$ java -version
openjdk version "21" 2023-09-19
OpenJDK Runtime Environment (build 21+35-2513)
OpenJDK 64-Bit Server VM (build 21+35-2513, mixed mode, sharing)

$ build/sbt test:package -Psparkr -Phive

$ R/install-dev.sh; R/run-tests.sh
...
sparkSQL:
SparkSQL functions: ..........................................
...
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #42644 from dongjoon-hyun/SPARK-44127.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun added a commit that referenced this pull request Aug 24, 2023
…ql.execution.arrow tests in Java 21

### What changes were proposed in this pull request?

This PR aims to re-enable `PandasUDF` and `o.a.s.sql.execution.arrow` tests in Java 21.
This depends on #42181 .

### Why are the changes needed?

To have Java 21 test coverage.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Run the following on Java 21.
```
$ java -version
openjdk version "21" 2023-09-19
OpenJDK Runtime Environment (build 21+35-2513)
OpenJDK 64-Bit Server VM (build 21+35-2513, mixed mode, sharing)

$ build/sbt "sql/testOnly *.ArrowConvertersSuite"
...
[info] Run completed in 5 seconds, 316 milliseconds.
[info] Total number of tests run: 30
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 30, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.

$ build/sbt "sql/testOnly *.SQLQueryTestSuite"
...
[info] Run completed in 12 minutes, 4 seconds.
[info] Total number of tests run: 629
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 629, failed 0, canceled 0, ignored 2, pending 0
[info] All tests passed.
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #42641 from dongjoon-hyun/SPARK-44097.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun pushed a commit that referenced this pull request Aug 24, 2023
…21 after the new arrow version release

### What changes were proposed in this pull request?
This PR aims to re-enable PySpark in Java 21.
This depends on #42181 .

### Why are the changes needed?
To have Java 21 test coverage.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Pass GA.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #42646 from panbingkun/SPARK-44302.

Authored-by: panbingkun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun added a commit that referenced this pull request Aug 24, 2023
…va 21

### What changes were proposed in this pull request?

This PR aims to re-enable Arrow-based connect tests in Java 21.
This depends on #42181.

### Why are the changes needed?

To have Java 21 test coverage.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

```
$ java -version
openjdk version "21-ea" 2023-09-19
OpenJDK Runtime Environment (build 21-ea+32-2482)
OpenJDK 64-Bit Server VM (build 21-ea+32-2482, mixed mode, sharing)

$ build/sbt "connect/test" -Phive
...
[info] Run completed in 14 seconds, 136 milliseconds.
[info] Total number of tests run: 858
[info] Suites: completed 20, aborted 0
[info] Tests: succeeded 858, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 44 s, completed Aug 23, 2023, 9:42:53 PM

$ build/sbt "connect-client-jvm/test" -Phive
...
[info] Run completed in 1 minute, 24 seconds.
[info] Total number of tests run: 1220
[info] Suites: completed 24, aborted 0
[info] Tests: succeeded 1220, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 1222, Failed 0, Errors 0, Passed 1222
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #42643 from dongjoon-hyun/SPARK-44121.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun added a commit that referenced this pull request Aug 24, 2023
(cherry picked from commit a824a6d)
Signed-off-by: Dongjoon Hyun <[email protected]>