Shutdown hook called before final status was reported. #2666
Comments
Hi @david-wb, I reformatted your comment slightly to make the stack trace more legible; I hope you don't mind. I suspect your intuition about the System.exit(0) is entirely correct, and that we haven't noticed it because we typically run in yarn client mode while you're running in cluster mode. Two questions:
It looks like we'll probably have to add a check and wait for the Spark context to properly shut down.
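A minimal sketch of what that kind of change could look like, assuming a standalone driver class (the class name and structure below are illustrative, not the actual hellbender Main):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Illustrative driver, not hellbender code: the point is that the Spark
// context is stopped (and allowed to report final status to YARN) before
// System.exit() fires the JVM shutdown hooks.
public final class SparkToolDriver {
    public static void main(String[] args) {
        final JavaSparkContext ctx =
                new JavaSparkContext(new SparkConf().setAppName("PrintReadsSpark"));
        int exitCode = 0;
        try {
            // ... run the tool's Spark pipeline here ...
        } catch (Exception e) {
            exitCode = 1;
        } finally {
            // In yarn cluster mode, stopping the context is what lets the
            // ApplicationMaster record SUCCEEDED/FAILED before the JVM
            // begins shutting down.
            ctx.stop();
        }
        System.exit(exitCode);
    }
}
```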
And the output.bam has nearly the same size in bytes as the output from the yarn cluster, so it doesn't appear to be truncated. The output .bai files have identical sizes. Here is the full stderr log.
@david-wb Thanks for the detailed information. That's really helpful. I don't think we're going to be able to get to this for the next few weeks since we're pretty swamped with work for our beta release. Hopefully it's not blocking you since it seems like the output of the tools is correct. We'll get back to you when we have a proposed PR for a fix that you can test.
Sounds good. I can also run in client mode for the time being.
Hi,
I am experimenting with submitting a PrintReadsSpark job to a yarn spark cluster in AWS. I run the job with the following command.
I can see from the output files that the job finished successfully; however, the cluster tells me that it failed, showing the error message quoted in the title of this issue ("Shutdown hook called before final status was reported.").
I believe this may be due to the System.exit(0) statement at line 144 in hellbender.Main, though I am not sure. Here is a more complete snippet from the stderr log.
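To make that suspicion concrete, here is a hypothetical, minimal illustration of the pattern (not the actual hellbender source): in yarn cluster mode the driver runs inside the ApplicationMaster, so a System.exit() issued while the Spark context is still active runs the JVM shutdown hooks before the final application status has been reported, and YARN marks the application FAILED even though the tool's output is complete.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical repro of the suspected pattern: exiting while the context is
// still active. In yarn cluster mode this can surface as
// "Shutdown hook called before final status was reported."
public final class ExitBeforeStop {
    public static void main(String[] args) {
        JavaSparkContext ctx =
                new JavaSparkContext(new SparkConf().setAppName("exit-before-stop"));
        // ... job runs and writes its output successfully ...
        System.exit(0); // shutdown hooks fire while ctx has not been stopped
    }
}
```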
Thanks for helping me solve this issue!