From 75630cfd0cba3977a42e014f39b1693e46a6829a Mon Sep 17 00:00:00 2001
From: Ry Walker
Date: Wed, 2 Oct 2019 11:02:59 -0400
Subject: [PATCH 1/2] Fix whitespace

---
 UPDATING.md | 36 ++++++++++++++++++------------------
 1 file changed, 18 insertions(+), 18 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 98dc7faf07ee08..f99d393105a660 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -43,24 +43,24 @@ assists users migrating to a new version.
 FileSensor now takes a glob pattern, not just a filename. If the filename you are looking for has `*`, `?`, or `[` in it then you should replace these with `[*]`, `[?]`, and `[[]`.

 ### Change dag loading duration metric name
-Change DAG file loading duration metric from
-`dag.loading-duration.<dag_id>` to `dag.loading-duration.<dag_file>`. This is to
+Change DAG file loading duration metric from
+`dag.loading-duration.<dag_id>` to `dag.loading-duration.<dag_file>`. This is to
 better handle the case when a DAG file has multiple DAGs.

 ### Changes to ImapHook, ImapAttachmentSensor and ImapAttachmentToS3Operator

 ImapHook:
-* The order of arguments has changed for `has_mail_attachment`,
+* The order of arguments has changed for `has_mail_attachment`,
 `retrieve_mail_attachments` and `download_mail_attachments`.
 * A new `mail_filter` argument has been added to each of those.

 ImapAttachmentSensor:
 * The order of arguments has changed for `__init__`.
-* A new `mail_filter` argument has been added to `__init__`.
+* A new `mail_filter` argument has been added to `__init__`.

 ImapAttachmentToS3Operator:
 * The order of arguments has changed for `__init__`.
-* A new `imap_mail_filter` argument has been added to `__init__`.
+* A new `imap_mail_filter` argument has been added to `__init__`.

 ### Changes to `SubDagOperator`

@@ -84,15 +84,15 @@ you should write `@GoogleCloudBaseHook.provide_gcp_credential_file`

 ### Changes to S3Hook

-Note: The order of arguments has changed for `check_for_prefix`.
+Note: The order of arguments has changed for `check_for_prefix`.
 The `bucket_name` is now optional. It falls back to the `connection schema` attribute.

 ### Changes to Google Transfer Operator

-To obtain pylint compatibility the `filter ` argument in `GcpTransferServiceOperationsListOperator`
+To obtain pylint compatibility the `filter ` argument in `GcpTransferServiceOperationsListOperator`
 has been renamed to `request_filter`.

 ### Changes in Google Cloud Transfer Hook

- To obtain pylint compatibility the `filter` argument in `GCPTransferServiceHook.list_transfer_job` and
+ To obtain pylint compatibility the `filter` argument in `GCPTransferServiceHook.list_transfer_job` and
 `GCPTransferServiceHook.list_transfer_operations` has been renamed to `request_filter`.

 ### Export MySQL timestamps as UTC

@@ -123,7 +123,7 @@ Hence, the default value for `master_disk_size` in DataprocClusterCreateOperator

 ### Changes to SalesforceHook

-* renamed `sign_in` function to `get_conn`
+* renamed `sign_in` function to `get_conn`

 ### HTTPHook verify default value changed from False to True.

 This can be overwritten by using the extra_options param as `{'verify': False}`.

@@ -134,8 +134,8 @@
 * The following parameters have been replaced in all the methods in GCSHook:
   * `bucket` is changed to `bucket_name`
-  * `object` is changed to `object_name`
-
+  * `object` is changed to `object_name`
+
 * The `maxResults` parameter in `GoogleCloudStorageHook.list` has been renamed to `max_results` for consistency.

 ### Changes to CloudantHook

@@ -267,10 +267,10 @@ The `do_xcom_push` flag (a switch to push the result of an operator to xcom or n
 See [AIRFLOW-3249](https://jira.apache.org/jira/browse/AIRFLOW-3249) to check if your operator was affected.
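As an illustrative aside (not part of the patch itself): the FileSensor escaping rule quoted above — replace `*`, `?`, and `[` with `[*]`, `[?]`, and `[[]` — can be sketched as a small helper. The name `escape_glob` is hypothetical, not an Airflow API:

```python
import fnmatch

def escape_glob(filename: str) -> str:
    """Escape glob metacharacters so a glob-based sensor matches them literally.

    '[' is escaped first so the brackets introduced for '*' and '?'
    are not themselves re-escaped.
    """
    return (
        filename.replace("[", "[[]")
                .replace("*", "[*]")
                .replace("?", "[?]")
    )

# A literal '*' in the filename now matches only itself, not "anything":
assert fnmatch.fnmatch("report_*.csv", escape_glob("report_*.csv"))
assert not fnmatch.fnmatch("report_x.csv", escape_glob("report_*.csv"))
```

`fnmatch` implements the same `[...]` character-class semantics as the glob matching described in the note, which is why wrapping each metacharacter in brackets makes it literal.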
 ### Changes to Dataproc related Operators

-The 'properties' and 'jars' properties for the Dataproc related operators (`DataprocXXXOperator`) have been renamed from
+The 'properties' and 'jars' properties for the Dataproc related operators (`DataprocXXXOperator`) have been renamed from
 `dataproc_xxxx_properties` and `dataproc_xxx_jars` to `dataproc_properties`
-and `dataproc_jars` respectively.
-Arguments for dataproc_properties dataproc_jars
+and `dataproc_jars` respectively.
+Arguments for dataproc_properties dataproc_jars

 ## Airflow 1.10.4

@@ -287,12 +287,12 @@ If you have a specific task that still requires Python 2 then you can use the Py

 ### Changes to GoogleCloudStorageHook

-* the discovery-based api (`googleapiclient.discovery`) used in `GoogleCloudStorageHook` is now replaced by the recommended client based api (`google-cloud-storage`). To know the difference between both the libraries, read https://cloud.google.com/apis/docs/client-libraries-explained. PR: [#5054](https://github.com/apache/airflow/pull/5054)
+* the discovery-based api (`googleapiclient.discovery`) used in `GoogleCloudStorageHook` is now replaced by the recommended client based api (`google-cloud-storage`). To know the difference between both the libraries, read https://cloud.google.com/apis/docs/client-libraries-explained. PR: [#5054](https://github.com/apache/airflow/pull/5054)
 * as a part of this replacement, the `multipart` & `num_retries` parameters for `GoogleCloudStorageHook.upload` method have been deprecated. The client library uses multipart upload automatically if the object/blob size is more than 8 MB - [source code](https://github.com/googleapis/google-cloud-python/blob/11c543ce7dd1d804688163bc7895cf592feb445f/storage/google/cloud/storage/blob.py#L989-L997). The client also handles retries automatically
-* the `generation` parameter is deprecated in `GoogleCloudStorageHook.delete` and `GoogleCloudStorageHook.insert_object_acl`.
+* the `generation` parameter is deprecated in `GoogleCloudStorageHook.delete` and `GoogleCloudStorageHook.insert_object_acl`.

 Updating to `google-cloud-storage >= 1.16` changes the signature of the upstream `client.get_bucket()` method from `get_bucket(bucket_name: str)` to `get_bucket(bucket_or_name: Union[str, Bucket])`. This method is not directly exposed by the airflow hook, but any code accessing the connection directly (`GoogleCloudStorageHook().get_conn().get_bucket(...)` or similar) will need to be updated.

@@ -557,7 +557,7 @@ then you need to change it like this
     @property
     def is_active(self):
         return self.active
-
+
 ### Support autodetected schemas to GoogleCloudStorageToBigQueryOperator

 GoogleCloudStorageToBigQueryOperator now supports schema auto-detection when you load data into BigQuery. Unfortunately, some changes may be required.

@@ -569,7 +569,7 @@ define a schema_fields:
     gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
         ...
         schema_fields={...})
-
+
 or define a schema_object:

     gcs_to_bq.GoogleCloudStorageToBigQueryOperator(

From fb0fe4611f08fd570af23f133fa8dd19d41cd791 Mon Sep 17 00:00:00 2001
From: Ry Walker
Date: Wed, 2 Oct 2019 11:04:01 -0400
Subject: [PATCH 2/2] Make it clear that 1.10.5 wasn't accidentally omitted

---
 UPDATING.md | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/UPDATING.md b/UPDATING.md
index f99d393105a660..0d543492ffcda4 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -26,6 +26,7 @@ assists users migrating to a new version.
 **Table of contents**

 - [Airflow Master](#airflow-master)
+- [Airflow 1.10.5](#airflow-1105)
 - [Airflow 1.10.4](#airflow-1104)
 - [Airflow 1.10.3](#airflow-1103)
 - [Airflow 1.10.2](#airflow-1102)
@@ -272,6 +273,10 @@ The 'properties' and 'jars' properties for the Dataproc related operators (`Data
 and `dataproc_jars` respectively.
 Arguments for dataproc_properties dataproc_jars

+## Airflow 1.10.5
+
+No breaking changes.
+
 ## Airflow 1.10.4

 ### Python 2 support is going away
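As a side note on the `google-cloud-storage >= 1.16` change described in the first patch: the move from `get_bucket(bucket_name: str)` to `get_bucket(bucket_or_name: Union[str, Bucket])` is a union-type dispatch, so string-based call sites keep working. A minimal standalone sketch of that dispatch — the `Bucket` class and `get_bucket` function here are illustrative stand-ins, not the real client library:

```python
from typing import Union

class Bucket:
    """Illustrative stand-in for google.cloud.storage.Bucket."""
    def __init__(self, name: str) -> None:
        self.name = name

def get_bucket(bucket_or_name: Union[str, Bucket]) -> Bucket:
    # Mirrors the >= 1.16 behaviour: accept either a bucket name or an
    # already-constructed Bucket object.
    if isinstance(bucket_or_name, Bucket):
        return bucket_or_name
    return Bucket(bucket_or_name)

# Both call styles resolve to the same bucket name:
assert get_bucket("airflow-logs").name == "airflow-logs"
assert get_bucket(Bucket("airflow-logs")).name == "airflow-logs"
```

Code that calls `get_conn().get_bucket(...)` with a plain string therefore typically keeps working; only callers relying on the old keyword name `bucket_name` need updating.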