
[WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21] #6455

Closed · wants to merge 3 commits

Conversation

@BasPH (Contributor) commented Oct 28, 2019

Mostly moves files around. The most important change is merging airflow/contrib/operators/dynamodb_to_s3.py and airflow/contrib/operators/hive_to_dynamodb.py into a single airflow/providers/aws/operators/dynamodb.py.
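Other AIP-21 moves typically leave a deprecation shim at the old contrib path so existing DAG imports keep working; whether this PR does the same is an assumption. A minimal sketch of that pattern, using the class name from the existing contrib module:

# airflow/contrib/operators/dynamodb_to_s3.py (hypothetical backwards-compatibility shim)
import warnings

# Re-export the operator from its new home so old import paths keep working.
from airflow.providers.aws.operators.dynamodb import DynamoDBToS3Operator  # noqa: F401

warnings.warn(
    "This module is deprecated. Import from airflow.providers.aws.operators.dynamodb instead.",
    DeprecationWarning,
    stacklevel=2,
)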


Make sure you have checked all steps below.

Jira

  • My PR addresses the following Airflow Jira issues and references them in the PR title. For example, "[AIRFLOW-XXX] My Airflow PR"
    • https://issues.apache.org/jira/browse/AIRFLOW-5777
    • In case you are fixing a typo in the documentation, you can prepend your commit with [AIRFLOW-XXX]; code changes always need a Jira issue.
    • In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal (AIP).
    • In case you are adding a dependency, check if the license complies with the ASF 3rd Party License Policy.

Description

  • Here are some details about my PR, including screenshots of any UI changes:

Tests

  • My PR adds the following unit tests OR does not need testing for this extremely good reason:

Commits

  • My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "How to write a good git commit message" (see the example after this checklist):
    1. Subject is separated from body by a blank line
    2. Subject is limited to 50 characters (not including Jira issue reference)
    3. Subject does not end with a period
    4. Subject uses the imperative mood ("add", not "adding")
    5. Body wraps at 72 characters
    6. Body explains "what" and "why", not "how"

Documentation

  • In case of new functionality, my PR adds documentation that describes how to use it.
    • All public functions and classes in the PR contain docstrings that explain what they do
    • If you implement backwards-incompatible changes, please leave a note in UPDATING.md so we can assign it to an appropriate release
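
For illustration, a commit message that satisfies the checklist above might look like this (hypothetical wording):

[AIRFLOW-5777] Move DynamoDB integration to providers.aws

Relocate the DynamoDB hook and transfer operators from airflow/contrib
to airflow/providers/aws as part of AIP-21, and merge the two transfer
operators into one module so the layout follows AIP-21's
target-provider rule.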

@BasPH BasPH added the provider:amazon-aws AWS/Amazon - related issues label Oct 28, 2019
@codecov-io commented Oct 28, 2019

Codecov Report

❗ No coverage uploaded for pull request base (master@59b6d8b).
The diff coverage is 93.16%.

@@           Coverage Diff            @@
##             master   #6455   +/-   ##
========================================
  Coverage          ?   83.7%           
========================================
  Files             ?     633           
  Lines             ?   36646           
  Branches          ?       0           
========================================
  Hits              ?   30673           
  Misses            ?    5973           
  Partials          ?       0
Impacted Files                                   Coverage Δ
airflow/contrib/hooks/aws_dynamodb_hook.py       100% <100%> (ø)
airflow/contrib/operators/hive_to_dynamodb.py    100% <100%> (ø)
airflow/contrib/operators/dynamodb_to_s3.py      100% <100%> (ø)
airflow/providers/aws/hooks/dynamodb.py          91.3% <91.3%> (ø)
airflow/providers/aws/operators/dynamodb.py      92.68% <92.68%> (ø)

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 59b6d8b...c3893ee.

@mik-laj (Member) commented Oct 29, 2019

Why did you move the transfer operators to another file?

@BasPH (Contributor, Author) commented Oct 29, 2019

Because AIP-21 says: "When there is only one provider as target but source is a database or another non-provider source, the operator is put to the target provider." For example, hive_to_dynamodb.py reads from Hive (a non-provider source) and writes to DynamoDB, so the operator moves to its target provider under /providers/aws.

@BasPH changed the title from [AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21] to [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21] on Oct 29, 2019
@BasPH (Contributor, Author) commented Oct 29, 2019

Set title to [WIP] because of @potiuk's message on the mailing list. Location of operators TBD.

@mingrammer (Contributor) commented Oct 29, 2019

AWSAthenaHook is prefixed with AWS, but AwsDynamoDBHook uses Aws. Isn't there a convention for class names that carry such a prefix?

@mingrammer (Contributor) commented Oct 29, 2019

I'm working on https://issues.apache.org/jira/browse/AIRFLOW-5803.

I think it would be better if we had explicit naming conventions.

@BasPH (Contributor, Author) commented Oct 29, 2019

AFAIK there is currently no convention for class naming. We definitely need consistent names, and we might as well settle this right now because users have to migrate anyway. @potiuk, thoughts?
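
If a single prefix wins, one way to get there without breaking users is a deprecated alias. A hypothetical sketch (the target name AWSDynamoDBHook is an assumption, not something decided in this thread):

# Hypothetical rename: standardize on the "AWS" prefix while keeping the
# old "Aws" spelling importable for one deprecation cycle.
import warnings


class AWSDynamoDBHook:
    """Convention-conforming name (assumed target of the rename)."""

    def __init__(self, aws_conn_id="aws_default"):
        self.aws_conn_id = aws_conn_id


class AwsDynamoDBHook(AWSDynamoDBHook):
    """Deprecated alias for AWSDynamoDBHook."""

    def __init__(self, *args, **kwargs):
        warnings.warn(
            "AwsDynamoDBHook is deprecated; use AWSDynamoDBHook instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(*args, **kwargs)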

@mingrammer (Contributor) commented Oct 30, 2019

I did #6465 (comment)

"""
This module contains the AWS DynamoDB hook
"""
from airflow.contrib.hooks.aws_hook import AwsHook
@dazza-codes (Contributor) commented Dec 10, 2019

Should this import from the new package path instead? i.e.

  • from airflow.providers.aws.hooks import AwsHook

But I'm actually a bit confused about the migrations, because that's not anywhere obvious in https://github.com/apache/airflow/tree/master/airflow/providers/amazon/aws/hooks, and it's not clear to me whether it should be

  • from airflow.providers.amazon.aws.hooks import AwsHook

from airflow.hooks.S3_hook import S3Hook
from airflow.models import BaseOperator
from airflow.providers.aws.hooks.dynamodb import AwsDynamoDBHook
from airflow.utils.decorators import apply_defaults
@dazza-codes (Contributor) commented Dec 10, 2019

Should this import from the new package path instead? i.e.

  • from airflow.providers.aws.hooks import S3Hook

But it's confusing, because https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/hooks/s3.py#L61 looks like it needs to be

  • from airflow.providers.amazon.aws.hooks.s3 import S3Hook

But that module itself uses the older contrib package:

  • from airflow.contrib.hooks.aws_hook import AwsHook

Very confusing.
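
For anyone untangling this locally, a quick probe shows which layout a given checkout actually ships. Illustrative only: the providers-side module path below is assumed from the comments above, not verified.

# Try the new provider path first, fall back to the old contrib path.
try:
    from airflow.providers.amazon.aws.hooks.aws_hook import AwsHook  # assumed new location
except ImportError:
    from airflow.contrib.hooks.aws_hook import AwsHook  # pre-migration location

print(AwsHook.__module__)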

@dazza-codes (Contributor) commented

Some of the AIP-21 and naming convention requirements are being loaded onto #6764 as well, so I'll have to watch what happens here also.

@stale (bot) commented Jan 24, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

The stale bot added the stale label (per .github/workflows/stale.yml) on Jan 24, 2020
@potiuk (Member) commented Jan 27, 2020

@mik-laj is doing an automated migration of all contrib packages now - I guess this one can be closed?

The stale bot removed the stale label on Jan 27, 2020
@mik-laj (Member) commented Jan 27, 2020

Yes. I've already done it in my fork.

@potiuk (Member) commented Jan 27, 2020

Closing, as the whole migration is being automated by @mik-laj.

@potiuk closed this on Jan 27, 2020