HMS used in Docker Demo doesn't support writing tables to S3A #322
Comments
@sagarlakshmipathy do you have a Docker demo of writing Hudi to S3 + HMS?
The Docker demo only supports writing to the local file system. Feel free to open a patch.
@alberttwong you can try importing the AWS Hadoop dependency that is in the Docker demo.
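One way to pull that dependency in is at Spark startup rather than inside the notebook. A minimal sketch, assuming the demo uses a Hadoop 3.3.x build and a MinIO endpoint; the exact versions (3.3.4, 1.12.367) and the endpoint URL are assumptions and must match your environment:

```shell
# Pull hadoop-aws (which provides S3AFileSystem) plus the matching AWS SDK bundle
# at spark-shell startup. Versions below are assumptions; hadoop-aws must match
# the Hadoop version Spark was built against.
spark-shell \
  --packages org.apache.hadoop:hadoop-aws:3.3.4,com.amazonaws:aws-java-sdk-bundle:1.12.367 \
  --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
  --conf spark.hadoop.fs.s3a.endpoint=http://minio:9000 \
  --conf spark.hadoop.fs.s3a.path.style.access=true
```

This only affects the Spark JVM; HMS still needs the jars on its own classpath.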
I couldn't figure it out. Do you have instructions? The other demos I've seen, like Trino's and Starburst's, build the AWS library into a custom container.
The imports are done at the beginning of the Jupyter notebook. That would only load the dependency into that JVM, though, so I'm not sure it would work end to end.
Ideally we would somehow load the library into the HMS image from Apache. My google-fu isn't working, since I can't find out how to do it.
I used the env setup from https://chetnachaudhari.github.io/2016-02-16/how-to-add-auxiliary-jars-in-hive/ to get this working.
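That approach boils down to dropping the jars somewhere the metastore container can see and pointing `HIVE_AUX_JARS_PATH` at that directory. A sketch, assuming the container paths and jar versions below (both are assumptions, not taken from the demo):

```shell
# Inside the HMS container: fetch the jars into an aux directory
# (directory path and versions are assumptions).
mkdir -p /opt/hive/aux-jars
curl -fLo /opt/hive/aux-jars/hadoop-aws-3.3.4.jar \
  https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar
curl -fLo /opt/hive/aux-jars/aws-java-sdk-bundle-1.12.367.jar \
  https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/1.12.367/aws-java-sdk-bundle-1.12.367.jar

# Tell Hive/HMS to add every jar in that directory to its classpath,
# then restart the metastore so it picks them up.
export HIVE_AUX_JARS_PATH=/opt/hive/aux-jars
```

In docker-compose this would typically be a volume mount plus an `environment:` entry rather than commands run by hand.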
Now I'm missing aws-java-sdk-core when I try to write.
Using aws-java-sdk-core-1.12.367.jar, since that's what we use in StarRocks. Now I'm stuck; I don't know why I have this Thrift issue.
I found this issue, so you need aws-java-sdk-core-1.12.367.jar.
It seems like I just OOM'd the spark-shell trying to create my Hudi table.
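If the OOM is in the driver (the usual case for a local spark-shell, where the driver does most of the work), raising its memory may be enough. A sketch; the 4g/2g values are guesses to tune against the host:

```shell
# Give the local spark-shell more driver memory (values are assumptions;
# size them to whatever the Docker host can spare).
spark-shell \
  --driver-memory 4g \
  --conf spark.driver.maxResultSize=2g
```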
I might have to switch to the HMS provided by Starburst; it seems like the only image that may work (I tried Trino's image and it doesn't work either).
I switched to the HMS provided by Starburst: https://github.com/StarRocks/demo/blob/master/documentation-samples/datalakehouse/docker-compose.yml
I get an `org.apache.hadoop.fs.s3a.S3AFileSystem not found` message.
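That error usually means `hadoop-aws` (the jar that contains `S3AFileSystem`) is missing from the classpath of whichever JVM reports it. A quick way to check the metastore container; the lib path and container name here are assumptions for the Starburst image:

```shell
# Check whether the metastore JVM can see hadoop-aws and the AWS SDK
# (container name and lib path are assumptions).
docker exec hive-metastore sh -c 'ls /opt/hive-metastore/lib | grep -E "hadoop-aws|aws-java-sdk"'

# If nothing matches, copy the jar in and restart the container so the
# metastore reloads its classpath.
docker cp hadoop-aws-3.3.4.jar hive-metastore:/opt/hive-metastore/lib/
docker restart hive-metastore
```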