-
After running this in IDEA, it runs for a long time and then exits with code 1. Please help me figure out how to debug this and run the unit tests in AdaptiveQueryExecSuite.
-
I was able to step into `withSparkSession` using a debugger successfully. However, that's not how I would normally debug it. If you're interested in debugging the main part of `skewJoinTest`, then I'd place the breakpoint within the lambda function you're actually interested in debugging rather than stepping your way there manually through the general `withSparkSession` code.

As to why it's exiting during your debug session, you should check the logs for an indication of what went wrong. Look at recent files within `tests/target/surefire-reports/`, such as `tests/target/surefire-reports/scala-test-detailed-output.log`, to see if an error was logged. For example, if you're seeing something like Cou…
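To illustrate where that breakpoint goes, here's a minimal, hypothetical sketch: `skewJoinTest` below is a stand-in helper and the test body is invented, not the actual suite code. The point is simply that the breakpoint belongs inside the lambda you pass in, not inside the generic session-setup plumbing.

```scala
// Hypothetical sketch of the test shape being described; skewJoinTest and the
// test body are illustrative stand-ins, not the actual AdaptiveQueryExecSuite code.
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

class SkewJoinSketchSuite extends AnyFunSuite {
  // Stand-in for the real helper, which builds a session and runs the body.
  def skewJoinTest(body: SparkSession => Unit): Unit = {
    val spark = SparkSession.builder().master("local[2]").getOrCreate()
    try body(spark) finally spark.stop()
  }

  test("debug the test body, not the session plumbing") {
    skewJoinTest { spark =>
      // Set the breakpoint here, inside the lambda you actually care about,
      // rather than stepping through the generic withSparkSession code.
      val df = spark.range(100).selectExpr("id % 10 AS key", "id AS value")
      assert(df.groupBy("key").count().count() == 10)
    }
  }
}
```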
-
No, almost all of the RAPIDS Accelerator unit tests require a GPU because they are testing a plugin that executes code on the GPU. So now we know why it failed: a unit test requiring a CUDA GPU was run on a platform without one. I too have been running IntelliJ on a MacBook to step through the unit tests, but the difference is that I'm using IntelliJ as a remote debugger rather than running the unit tests locally. On a Linux machine with a CUDA GPU, I executed this command:
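The command itself didn't survive in this archive, but judging from the version the questioner echoes back later in the thread, it was presumably of this shape (suite name, Spark test profile, and port are whatever fits your run):

```sh
# Sketch reconstructed from the command quoted later in this thread; adjust
# the suite name, Spark test profile, and debugger port to your setup.
mvn test -DwildcardSuites=com.nvidia.spark.rapids.AdaptiveQueryExecSuite \
    -DdebugForkedProcess -Pspark301tests -DdebuggerPort=8888
```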
And then, once it said it was listening for a debugger attach, I would launch a remote debug session in IntelliJ with my desired breakpoints set. Note that […]. Closing this issue since the root cause is trying to debug a unit test in an environment where it cannot run normally. Please reopen if you encounter similar issues when trying to debug on a system where the unit test can run normally.
-
@JLow I think the debug method now uses the Spark jar from the local Maven repository, i.e. `~/.m2/repository/org/apache/spark/spark-sql_2.12/3.0.1/spark-sql_2.12-3.0.1.jar`. So if I move my modified `spark-sql_2.12-3.0.1.jar` there and then run the unit test command `mvn test -DwildcardSuites=com.nvidia.spark.rapids.AdaptiveQueryExecSuite -DdebugForkedProcess -Pspark301tests -DdebuggerPort=8888`, will it work as I expect?
-
Yes, if you run the tests with […]. A possibly simpler way to do this is to update the pom to add your own test profile for your custom version of Spark, if that Spark is installed or published somewhere, e.g.:
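The profile example itself was lost from this reply; a sketch of what was presumably meant, assuming the test profiles select the Spark version via a property such as `spark.test.version` (the profile id and version string below are illustrative, not from the original reply):

```xml
<!-- Hypothetical sketch; the original example was lost. Assumes the tests
     resolve their Spark dependency from a property like spark.test.version. -->
<profile>
  <id>spark301customtests</id>
  <properties>
    <spark.test.version>3.0.1-custom</spark.test.version>
  </properties>
</profile>
```

You would then run the tests with `-Pspark301customtests` so Maven resolves your custom Spark artifact instead of the stock 3.0.1 release.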