java.lang.ArrayIndexOutOfBoundsException #377
Spark and PySpark version?

2.2.0
I cannot reproduce this error. Can you reproduce it with what you pasted? If not, can you provide a reproduction case?
Got the same error when running it. You can find my config in this issue.
@r0mainK can you paste the complete logs for your crash?
Sure, here you go:
Managed to reproduce it. I'm trying to figure out which siva file and blob are causing this, to see whether the problem is in the siva file or somewhere else.
Opening the blob from the offending siva file triggers the error, but not with go-git, so it must be some kind of error in JGit. I'm going to put together a reproduction case and report it to JGit. UPDATE: the siva file contains corrupted objects.
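To illustrate why a corrupted object produces exactly this exception, here is a minimal, hypothetical sketch (not JGit's actual code): a parser reads a length header and then indexes into the buffer, so a truncated or corrupted payload makes the index run past the end of the array and throw `ArrayIndexOutOfBoundsException`.

```java
public class CorruptedBufferDemo {
    // Toy length-prefixed record: first byte declares the payload length.
    // A corrupted/truncated buffer makes the loop index past the array end.
    static byte[] readRecord(byte[] buf) {
        int len = buf[0] & 0xFF;
        byte[] payload = new byte[len];
        for (int i = 0; i < len; i++) {
            payload[i] = buf[1 + i]; // throws ArrayIndexOutOfBoundsException on truncated input
        }
        return payload;
    }

    public static void main(String[] args) {
        byte[] intact = {3, 'a', 'b', 'c'};
        System.out.println(new String(readRecord(intact))); // → abc

        byte[] truncated = {3, 'a'}; // header claims 3 payload bytes, only 1 present
        try {
            readRecord(truncated);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("corrupted record detected");
        }
    }
}
```

This is why the fix belongs in whatever produced the siva file rather than in the reader: the on-disk header and the actual payload disagree.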
I'm closing this, since the problem is in the siva file and not in the engine. I opened an issue on borges to track it: src-d/borges#264
I saw a similar issue with an old engine version, but it seems to have appeared again in 0.5.7. Not sure about earlier versions.
I ran this code on the staging cluster (the same happens with --master "spark://p-spark-master:7077") and in Python.
Expected Behavior
Returns the result.
Current Behavior
Fails at some point with an error:
Worker logs for the same issue from one of the experiments:
worker_logs.zip