
fix: AwsGlueJobOperator change order of args for load_file #16216

Merged: 2 commits into apache:main on Jun 13, 2021

Conversation

avocadomaster (Contributor)

As far as I understand, the load_file function in the S3 hook expects the key before the bucket name.

The AwsGlueJobOperator currently calls the load_file function with s3_bucket before key.

s3_hook.load_file(self.script_location, self.s3_bucket, self.s3_artifacts_prefix + script_name)

This PR changes the call to match the argument order of load_file.
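
For context, the relevant part of the hook's signature looks roughly like the sketch below (a paraphrase, not the exact provider source; parameters and defaults may differ between Amazon provider versions). It shows why the swapped positional arguments end up bound to the wrong parameters.

# Approximate shape of S3Hook.load_file (paraphrased sketch; not the exact source):
def load_file(self, filename, key, bucket_name=None, replace=False, encrypt=False):
    """filename is the local path, key is the S3 key, bucket_name is the target bucket."""
    ...

# With the old positional call
#   s3_hook.load_file(self.script_location, self.s3_bucket, self.s3_artifacts_prefix + script_name)
# self.s3_bucket is bound to key and the prefixed script name to bucket_name,
# so the upload targets the wrong bucket/key.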

boring-cyborg bot added the area:providers and provider:amazon-aws (AWS/Amazon - related issues) labels on Jun 2, 2021

boring-cyborg bot commented Jun 2, 2021

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst).
Here are some useful points:

  • Pay attention to the quality of your code (flake8, pylint and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide, and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: [email protected]
    Slack: https://s.apache.org/airflow-slack

uranusjr (Member) commented Jun 2, 2021

Can you change the code to use keyword arguments instead? This will help prevent bugs in the future when the hook is changed.

avocadomaster (Contributor, Author)

Can you change the code to use keyword arguments instead? This will help prevent bugs in the future when the hook is changed.

👍 ✅

s3_hook.load_file(
    filename=self.script_location,
    key=self.s3_artifacts_prefix + script_name,
    bucket_name=self.s3_bucket,
)

mmenarguezpear (Contributor) commented Jun 12, 2021

Thanks for fixing this. Would it be possible to also pass the replace argument, so the file is replaced if it already exists?
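
For illustration only (not part of this PR), passing the existing replace parameter would look roughly like this; replace=True here is an assumption about the desired behaviour, not something the PR adds:

s3_hook.load_file(
    filename=self.script_location,
    key=self.s3_artifacts_prefix + script_name,
    bucket_name=self.s3_bucket,
    replace=True,  # suggested: overwrite the script if it already exists (not in this PR)
)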

Member

I think I will merge it as is, since we are releasing providers tomorrow/Monday. This might be a separate feature.

github-actions bot added the okay to merge (It's ok to merge this PR as it does not require more tests) label on Jun 12, 2021
@github-actions

The PR is likely OK to be merged with just a subset of tests for default Python and Database versions, without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full test matrix is needed, they will add the label 'full tests needed'. In that case you should rebase to the latest main or amend the last commit of the PR, and push it with --force-with-lease.


potiuk merged commit 643f3c3 into apache:main on Jun 13, 2021

boring-cyborg bot commented Jun 13, 2021

Awesome work, congrats on your first merged pull request!

Labels: area:providers, okay to merge (It's ok to merge this PR as it does not require more tests), provider:amazon-aws (AWS/Amazon - related issues)
Development

Successfully merging this pull request may close these issues.

aws Glue operator fails to upload local script to s3 due to wrong argument order
4 participants