[WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21] #6455
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master   #6455   +/- ##
====================================
  Coverage        ?   83.7%
====================================
  Files           ?     633
  Lines           ?   36646
  Branches        ?       0
====================================
  Hits            ?   30673
  Misses          ?    5973
  Partials        ?       0
Continue to review full report at Codecov.
Why did you move the transfer operators to another file?

Because AIP-21 says: "When there is only one provider as target but source is a database or another non-provider source, the operator is put to the target provider."

Set title to [WIP] because of @potiuk's message on the mailing list. Location of operators TBD.

I'm working on https://issues.apache.org/jira/browse/AIRFLOW-5803. I think it would be better if we had explicit naming conventions.

AFAIK there is currently no convention for class naming. We definitely need consistent naming, and we might as well settle it now because users have to migrate anyway. @potiuk thoughts?

I did #6465 (comment)
""" | ||
This module contains the AWS DynamoDB hook | ||
""" | ||
from airflow.contrib.hooks.aws_hook import AwsHook |
Should this import from the new package path instead? i.e.
from airflow.providers.aws.hooks import AwsHook
But I'm actually a bit confused about the migration, because that's not obvious anywhere in https://github.com/apache/airflow/tree/master/airflow/providers/amazon/aws/hooks, and it's not clear to me whether it should be
from airflow.providers.amazon.aws.hooks import AwsHook
from airflow.hooks.S3_hook import S3Hook
from airflow.models import BaseOperator
from airflow.providers.aws.hooks.dynamodb import AwsDynamoDBHook
from airflow.utils.decorators import apply_defaults
Should this import from the new package path instead? i.e.
from airflow.providers.aws.hooks import S3Hook
But it's confusing, because https://github.com/apache/airflow/blob/master/airflow/providers/amazon/aws/hooks/s3.py#L61 looks like it needs to be imported from
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
But that module itself uses the older contrib package:
from airflow.contrib.hooks.aws_hook import AwsHook
Very confusing.
Some of the AIP-21 and naming convention requirements are being loaded onto #6764 as well, so I'll have to watch what happens here also.

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@mik-laj is doing automated migration now for all contrib packages - I guess this one can be closed?

Yes. I've already done it in my fork.

Closing as the whole migration is automated by @mik-laj
Mostly moving around files. The most important change is merging
airflow/contrib/operators/dynamodb_to_s3.py
and airflow/contrib/operators/hive_to_dynamodb.py
together into airflow/providers/aws/operators/dynamodb.py.

Make sure you have checked all steps below.
Jira
Description
Tests
Commits
Documentation
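The merge described in the summary (two contrib operator modules folded into one providers module) could be sketched as follows; the class names come from the moved files, but the constructor signatures and attributes here are illustrative placeholders, not the actual Airflow operators:

```python
# Hypothetical layout of airflow/providers/aws/operators/dynamodb.py after
# the merge; both operators live side by side in one module.


class DynamoDBToS3Operator:
    """Formerly airflow/contrib/operators/dynamodb_to_s3.py (sketch)."""

    def __init__(self, table_name: str, s3_bucket: str) -> None:
        self.table_name = table_name
        self.s3_bucket = s3_bucket


class HiveToDynamoDBOperator:
    """Formerly airflow/contrib/operators/hive_to_dynamodb.py (sketch)."""

    def __init__(self, sql: str, table_name: str) -> None:
        self.sql = sql
        self.table_name = table_name
```

Grouping both DynamoDB-related transfer operators in one target-provider module follows the AIP-21 rule quoted earlier in the thread.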