Merge pull request #13 from mineiros-io/zied/issue-11
feat: add support for new fields and increase minimum supported version
zied-elouaer authored Jul 19, 2024
2 parents 2b4154c + a76afa5 commit 5a39454
Showing 8 changed files with 40 additions and 27 deletions.
20 changes: 12 additions & 8 deletions README.md
@@ -325,7 +325,11 @@ project = "project-a"

- [**`use_topic_schema`**](#attr-bigquery_config-use_topic_schema): *(Optional `bool`)*<a name="attr-bigquery_config-use_topic_schema"></a>

When `true`, use the topic's schema as the columns to write to in BigQuery, if it exists.
When `true`, use the topic's schema as the columns to write to in BigQuery, if it exists. Only one of `use_topic_schema` and `use_table_schema` can be set.

- [**`use_table_schema`**](#attr-bigquery_config-use_table_schema): *(Optional `bool`)*<a name="attr-bigquery_config-use_table_schema"></a>

When `true`, use the BigQuery table's schema as the columns to write to in BigQuery. Messages must be published in JSON format. Only one of `use_topic_schema` and `use_table_schema` can be set.

- [**`write_metadata`**](#attr-bigquery_config-write_metadata): *(Optional `bool`)*<a name="attr-bigquery_config-write_metadata"></a>

@@ -335,6 +339,10 @@ project = "project-a"

When `true` and `use_topic_schema` is `true`, any fields that are a part of the topic schema that are not part of the BigQuery table schema are dropped when writing to BigQuery. Otherwise, the schemas must be kept in sync and any messages with extra fields are not written and remain in the subscription's backlog.

- [**`service_account_email`**](#attr-bigquery_config-service_account_email): *(Optional `string`)*<a name="attr-bigquery_config-service_account_email"></a>

The service account to use to write to BigQuery. If not specified, the Pub/Sub service agent, `service-{project_number}@gcp-sa-pubsub.iam.gserviceaccount.com`, is used.

- [**`cloud_storage_config`**](#var-cloud_storage_config): *(Optional `object(cloud_storage_config)`)*<a name="var-cloud_storage_config"></a>

If delivery to Cloud Storage is used with this subscription, this field is used to configure it. Either pushConfig, bigQueryConfig or cloudStorageConfig can be set, but not combined. If all three are empty, then the subscriber will pull and ack messages using API methods.
@@ -365,17 +373,13 @@ project = "project-a"
(Optional) The maximum bytes that can be written to a Cloud Storage file before a new file is created.
Min 1 KB, max 10 GiB. The maxBytes limit may be exceeded in cases where messages are larger than the limit.

- [**`state`**](#attr-cloud_storage_config-state): *(Optional `any`)*<a name="attr-cloud_storage_config-state"></a>

(Output) An output-only field that indicates whether or not the subscription can receive messages.

- [**`avro_config`**](#attr-cloud_storage_config-avro_config): *(Optional `bool`)*<a name="attr-cloud_storage_config-avro_config"></a>
- [**`avro_config`**](#attr-cloud_storage_config-avro_config): *(Optional `object(avro_config)`)*<a name="attr-cloud_storage_config-avro_config"></a>

If set, message data will be written to Cloud Storage in Avro format.

The object accepts the following attributes:
The `avro_config` object accepts the following attributes:

- [**`write_metadata`**](#attr-cloud_storage_config-avro_config-write_metadata): *(Optional `any`)*<a name="attr-cloud_storage_config-avro_config-write_metadata"></a>
- [**`write_metadata`**](#attr-cloud_storage_config-avro_config-write_metadata): *(Optional `bool`)*<a name="attr-cloud_storage_config-avro_config-write_metadata"></a>

When `true`, write the subscription name, `messageId`, `publishTime`, attributes, and `orderingKey` as additional fields in the output.
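
The documented `cloud_storage_config`/`avro_config` shape can be sketched in a module call. This is a hedged sketch, not code from the repository: the module `source`, the `name`/`topic` inputs, and the `bucket` attribute name are assumptions based on typical usage; the remaining attribute names follow the documentation above.

```hcl
module "pubsub_subscription" {
  source = "./modules/pubsub-subscription" # hypothetical path; use this module's actual source

  # `name` and `topic` are assumed top-level inputs of the module.
  name  = "example-subscription"
  topic = "example-topic"

  # Only one delivery mode may be set: push config, BigQuery, or Cloud Storage.
  cloud_storage_config = {
    bucket    = "example-bucket" # assumed attribute name
    max_bytes = 10485760         # 10 MiB; documented range is 1 KB to 10 GiB

    avro_config = {
      # Adds subscription name, messageId, publishTime, attributes,
      # and orderingKey to each written record.
      write_metadata = true
    }
  }
}
```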

27 changes: 17 additions & 10 deletions README.tfdoc.hcl
@@ -405,7 +405,14 @@ section {
attribute "use_topic_schema" {
type = bool
description = <<-END
When `true`, use the topic's schema as the columns to write to in BigQuery, if it exists.
When `true`, use the topic's schema as the columns to write to in BigQuery, if it exists. Only one of `use_topic_schema` and `use_table_schema` can be set.
END
}

attribute "use_table_schema" {
type = bool
description = <<-END
When `true`, use the BigQuery table's schema as the columns to write to in BigQuery. Messages must be published in JSON format. Only one of `use_topic_schema` and `use_table_schema` can be set.
END
}

@@ -422,6 +429,13 @@ section {
When `true` and `use_topic_schema` is `true`, any fields that are a part of the topic schema that are not part of the BigQuery table schema are dropped when writing to BigQuery. Otherwise, the schemas must be kept in sync and any messages with extra fields are not written and remain in the subscription's backlog.
END
}

attribute "service_account_email" {
type = string
description = <<-END
The service account to use to write to BigQuery. If not specified, the Pub/Sub service agent, `service-{project_number}@gcp-sa-pubsub.iam.gserviceaccount.com`, is used.
END
}
}

variable "cloud_storage_config" {
@@ -470,21 +484,14 @@ section {
END
}

attribute "state" {
type = any
description = <<-END
(Output) An output-only field that indicates whether or not the subscription can receive messages.
END
}

attribute "avro_config" {
type = bool
type = object(avro_config)
description = <<-END
If set, message data will be written to Cloud Storage in Avro format.
END

attribute "write_metadata" {
type = any
type = bool
description = <<-END
When `true`, write the subscription name, `messageId`, `publishTime`, attributes, and `orderingKey` as additional fields in the output.
END
10 changes: 6 additions & 4 deletions main.tf
@@ -74,10 +74,12 @@ resource "google_pubsub_subscription" "subscription" {
iterator = bqc

content {
table = bqc.value.table
use_topic_schema = try(bqc.value.use_topic_schema, null)
write_metadata = try(bqc.value.write_metadata, null)
drop_unknown_fields = try(bqc.value.drop_unknown_fields, null)
table = bqc.value.table
use_topic_schema = try(bqc.value.use_topic_schema, null)
use_table_schema = try(bqc.value.use_table_schema, null)
write_metadata = try(bqc.value.write_metadata, null)
drop_unknown_fields = try(bqc.value.drop_unknown_fields, null)
service_account_email = try(bqc.value.service_account_email, null)
}
}
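
Each `try(bqc.value.<key>, null)` above makes the corresponding key optional: when the caller omits it from the `bigquery_config` object, the argument is passed as `null` and the provider default applies. A caller-side sketch using only the fields shown in this diff (the project, dataset, table, and service account names are placeholders, not values from the repository):

```hcl
bigquery_config = {
  table                 = "project-a.dataset-b.table-c"
  use_table_schema      = true # new in this change; requires google provider >= 5.35.0
  service_account_email = "pubsub-writer@project-a.iam.gserviceaccount.com"
  # use_topic_schema, write_metadata, and drop_unknown_fields are omitted,
  # so try() yields null for each and the provider defaults apply.
}
```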

2 changes: 1 addition & 1 deletion test/terramate_google.tm.hcl
@@ -1,5 +1,5 @@
globals {
minimum_provider_version = "5"
minimum_provider_version = "5.35.0"
provider_version_constraint = ">= ${global.minimum_provider_version}, <6"

stack_basename = tm_reverse(tm_split("/", terramate.path))[0]
2 changes: 1 addition & 1 deletion test/unit-complete/_generated_google.tf
@@ -23,7 +23,7 @@ terraform {
required_providers {
google = {
source = "hashicorp/google"
version = ">= 5, <6"
version = ">= 5.35.0, <6"
}
random = {
source = "hashicorp/random"
2 changes: 1 addition & 1 deletion test/unit-disabled/_generated_google.tf
@@ -23,7 +23,7 @@ terraform {
required_providers {
google = {
source = "hashicorp/google"
version = ">= 5, <6"
version = ">= 5.35.0, <6"
}
random = {
source = "hashicorp/random"
2 changes: 1 addition & 1 deletion test/unit-minimal/_generated_google.tf
@@ -23,7 +23,7 @@ terraform {
required_providers {
google = {
source = "hashicorp/google"
version = "5"
version = "5.35.0"
}
random = {
source = "hashicorp/random"
2 changes: 1 addition & 1 deletion versions.tf
@@ -8,7 +8,7 @@ terraform {
required_providers {
google = {
source = "hashicorp/google"
version = ">= 5, <6"
version = ">= 5.35.0, <6"
}
}
}
