
[AIRFLOW-4908] BigQuery Hooks/Operators for update_dataset, patch_dataset, get_dataset #5546

Merged: 1 commit into apache:master on Aug 14, 2019

Conversation

@ryanyuan (Contributor) commented Jul 8, 2019

To create a BigQuery sink for GCP Stackdriver Logging, I have to grant WRITER access on the BQ dataset to the group [email protected]. However, the current BigQueryHook doesn't support updating or patching a dataset.

Reference: https://googleapis.github.io/google-cloud-python/latest/logging/usage.html#export-to-bigquery
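
A minimal sketch of the sink creation described in the linked usage guide; the sink name, filter, project, and dataset below are placeholders, not values from this PR:

from google.cloud import logging

client = logging.Client()

# Export matching log entries into a BigQuery dataset. The Logging export
# group needs WRITER access on this dataset, which is what the new
# update/patch dataset support is for.
destination = 'bigquery.googleapis.com/projects/my-project/datasets/my_logging_dataset'
sink = client.sink('my-bq-sink', filter_='severity>=ERROR', destination=destination)
sink.create()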

Related issue: AIRFLOW-4779 (Implement GCP Stackdriver Logging)

BigQueryHook is missing update_dataset and patch_dataset; it does have get_dataset, but there is no operator for it.

Features to be implemented (a usage sketch follows the list):
BigQueryBaseCursor.patch_dataset
BigQueryBaseCursor.update_dataset
BigQueryPatchDatasetOperator
BigQueryUpdateDatasetOperator
BigQueryGetDatasetOperator
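
A minimal usage sketch of the proposed operators, assuming the constructor arguments quoted later in this review (dataset_id, dataset_resource, project_id); the DAG name, project, dataset, and access entry are illustrative only:

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import (
    BigQueryGetDatasetOperator,
    BigQueryPatchDatasetOperator,
)
from airflow.utils.dates import days_ago

# Illustrative payload: grant WRITER access on an existing dataset to the
# Stackdriver Logging export group (placeholder address, as in the description).
DATASET_RESOURCE = {
    'access': [
        {'role': 'WRITER', 'groupByEmail': '[email protected]'},
    ],
}

with DAG('bq_dataset_access_example',
         start_date=days_ago(1),
         schedule_interval=None) as dag:

    get_dataset = BigQueryGetDatasetOperator(
        task_id='get_dataset',
        dataset_id='my_logging_dataset',
        project_id='my-project',
    )

    patch_dataset = BigQueryPatchDatasetOperator(
        task_id='patch_dataset',
        dataset_id='my_logging_dataset',
        dataset_resource=DATASET_RESOURCE,
        project_id='my-project',
    )

    get_dataset >> patch_dataset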

Make sure you have checked all steps below.

Jira

  • My PR addresses AIRFLOW-4908 and references it in the PR title. For example, "[AIRFLOW-XXX] My Airflow PR"

Description

  • Here are some details about my PR, including screenshots of any UI changes:

Tests

  • My PR adds the following unit tests OR does not need testing for this extremely good reason:
    test_bigquery_operator:BigQueryPatchDatasetOperatorTest
    test_bigquery_operator:BigQueryGetDatasetOperatorTest
    test_bigquery_operator:BigQueryUpdateDatasetOperatorTest
    test_bigquery_hook:TestDatasetsOperations.test_patch_dataset
    test_bigquery_hook:TestDatasetsOperations.test_update_dataset

Commits

  • My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "How to write a good git commit message":
    1. Subject is separated from body by a blank line
    2. Subject is limited to 50 characters (not including Jira issue reference)
    3. Subject does not end with a period
    4. Subject uses the imperative mood ("add", not "adding")
    5. Body wraps at 72 characters
    6. Body explains "what" and "why", not "how"

Documentation

  • In case of new functionality, my PR adds documentation that describes how to use it.
    • All the public functions and classes in the PR contain docstrings that explain what they do
    • If you implement backwards-incompatible changes, please leave a note in UPDATING.md so we can assign it to an appropriate release

Code Quality

  • Passes flake8

airflow/contrib/hooks/bigquery_hook.py (two resolved review threads)
self.delegate_to = delegate_to

self.log.info('Dataset id: %s', self.dataset_id)
self.log.info('Project id: %s', self.project_id)

Reviewer (Contributor): I would merge this log line with the previous one.

ryanyuan (Author): Makes sense. I will merge them.
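
A sketch of the merged call, mirroring the two fields in the quoted snippet above (illustrative only):

# Single log statement covering both fields, as suggested by the reviewer.
self.log.info('Dataset id: %s; Project id: %s', self.dataset_id, self.project_id)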

@ryanyuan force-pushed the bq-dataset-ops branch 3 times, most recently from a8ea9f4 to 8c28fce on Jul 9, 2019 06:12
@codecov-io commented Jul 9, 2019

Codecov Report

Merging #5546 into master will increase coverage by 0.03%.
The diff coverage is 91.42%.


@@            Coverage Diff             @@
##           master    #5546      +/-   ##
==========================================
+ Coverage      79%   79.03%   +0.03%     
==========================================
  Files         489      489              
  Lines       30726    30796      +70     
==========================================
+ Hits        24275    24341      +66     
- Misses       6451     6455       +4
Impacted Files Coverage Δ
airflow/contrib/operators/bigquery_operator.py 95.5% <100%> (+1.5%) ⬆️
airflow/contrib/hooks/bigquery_hook.py 61.72% <70%> (+0.27%) ⬆️
airflow/models/taskinstance.py 93.18% <0%> (+0.16%) ⬆️
airflow/contrib/operators/ssh_operator.py 83.75% <0%> (+1.25%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update a1f9d9a...8c28fce.

@ryanyuan (Author) commented Jul 9, 2019

@kaxil @mik-laj @potiuk PTAL

@mik-laj added the provider:google label on Jul 22, 2019
@mik-laj (Member) commented Aug 13, 2019

Are you planning to continue working on this change? This week I would like to deal with PR reviews related to GCP. I would be happy if you could respond to all the comments.

def __init__(self,
             dataset_id,
             project_id=None,
             bigquery_conn_id='google_cloud_default',

Reviewer (Member) suggested change (reference: #5734):

-             bigquery_conn_id='google_cloud_default',
+             gcp_conn_id='google_cloud_default',

dataset_id,
dataset_resource,
project_id=None,
bigquery_conn_id='google_cloud_default',

Reviewer (Member) suggested change (reference: #5734):

-             bigquery_conn_id='google_cloud_default',
+             gcp_conn_id='google_cloud_default',

dataset_id,
dataset_resource,
project_id=None,
bigquery_conn_id='google_cloud_default',

Reviewer (Member) suggested change (reference: #5734):

-             bigquery_conn_id='google_cloud_default',
+             gcp_conn_id='google_cloud_default',

@mik-laj (Member) left a review comment:

Only one small thing: bigquery_conn_id => gcp_conn_id
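
A hedged sketch of how the rename could look in one of the operator constructors; the deprecation shim below is an assumption for illustration, not code from this PR:

import warnings

from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults


class BigQueryGetDatasetOperator(BaseOperator):

    @apply_defaults
    def __init__(self,
                 dataset_id,
                 project_id=None,
                 gcp_conn_id='google_cloud_default',
                 bigquery_conn_id=None,  # deprecated alias, kept for backwards compatibility
                 *args, **kwargs):
        super(BigQueryGetDatasetOperator, self).__init__(*args, **kwargs)
        if bigquery_conn_id is not None:
            # Keep existing DAGs working while pointing users at the new name.
            warnings.warn(
                "The bigquery_conn_id parameter has been deprecated; use gcp_conn_id instead.",
                DeprecationWarning,
            )
            gcp_conn_id = bigquery_conn_id
        self.dataset_id = dataset_id
        self.project_id = project_id
        self.gcp_conn_id = gcp_conn_id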

Commit: … patch_dataset and get_dataset

Implement BigQuery Hooks/Operators for update_dataset, patch_dataset and get_dataset
@ryanyuan (Author) commented:

@mik-laj Done. PTAL

@mik-laj mik-laj merged commit 09b9610 into apache:master Aug 14, 2019
ashb pushed a commit to ashb/airflow that referenced this pull request on Oct 16, 2019:
… patch_dataset and get_dataset (apache#5546)

Implement BigQuery Hooks/Operators for update_dataset, patch_dataset and get_dataset

(cherry picked from commit 09b9610)
Labels: provider:google (Google, including GCP, related issues)
4 participants