
aws_s3_data_source of google_storage_transfer_job should have path param #12997

Closed
hinaloe opened this issue Nov 10, 2022 · 5 comments · Fixed by GoogleCloudPlatform/magic-modules#7932, hashicorp/terraform-provider-google-beta#5641 or #14610

Comments


hinaloe commented Nov 10, 2022

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment. If the issue is assigned to the "modular-magician" user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If the issue is assigned to a user, that user is claiming responsibility for the issue. If the issue is assigned to "hashibot", a community member has claimed the issue already.

Description

In google_storage_transfer_job, other data sources such as Azure and GCS have a path parameter, but aws_s3_data_source does not.

The REST TransferSpec type and the console both expose a path parameter for the S3 source, similar to the GCS one. However, awsS3DataSchema in this provider does not appear to include it:

func awsS3DataSchema() *schema.Resource {
	return &schema.Resource{
		Schema: map[string]*schema.Schema{
			"bucket_name": {
				Required:    true,
				Type:        schema.TypeString,
				Description: `S3 Bucket name.`,
			},
			"aws_access_key": {
				Type:     schema.TypeList,
				Optional: true,
				MaxItems: 1,
				Elem: &schema.Resource{
					Schema: map[string]*schema.Schema{
						"access_key_id": {
							Type:        schema.TypeString,
							Required:    true,
							Sensitive:   true,
							Description: `AWS Key ID.`,
						},
						"secret_access_key": {
							Type:        schema.TypeString,
							Required:    true,
							Sensitive:   true,
							Description: `AWS Secret Access Key.`,
						},
					},
				},
				ExactlyOneOf: awsS3AuthKeys,
				Description:  `AWS credentials block.`,
			},
			"role_arn": {
				Type:         schema.TypeString,
				Optional:     true,
				ExactlyOneOf: awsS3AuthKeys,
				Description:  `The Amazon Resource Name (ARN) of the role to support temporary credentials via 'AssumeRoleWithWebIdentity'. For more information about ARNs, see [IAM ARNs](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_identifiers.html#identifiers-arns). When a role ARN is provided, Transfer Service fetches temporary credentials for the session using a 'AssumeRoleWithWebIdentity' call for the provided role using the [GoogleServiceAccount][] for this project.`,
			},
		},
	}
}
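
The GCS source and sink blocks already expose a path attribute, so one possible shape of the fix is a matching entry in the Schema map above. A minimal sketch only, not the provider's actual implementation; the field name mirrors the REST AwsS3Data.path field and the description wording is an assumption:

			// Hypothetical addition to awsS3DataSchema's Schema map (sketch only),
			// mirroring the path attribute of gcs_data_source / gcs_data_sink:
			"path": {
				Type:        schema.TypeString,
				Optional:    true,
				Description: `S3 Bucket path in bucket to transfer.`,
			},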

New or Affected Resource(s)

  • google_storage_transfer_job

Potential Terraform Configuration

# based on documented sample
resource "google_storage_transfer_job" "s3-bucket-nightly-backup" {
  description = "Nightly backup of S3 bucket"
  project     = var.project

  transfer_spec {
    object_conditions {
      max_time_elapsed_since_last_modification = "600s"
      exclude_prefixes = [
        "requests.gz",
      ]
    }
    transfer_options {
      delete_objects_unique_in_sink = false
    }
    aws_s3_data_source {
      bucket_name = var.aws_s3_bucket
      path        = "foo/bar/" # here
      aws_access_key {
        access_key_id     = var.aws_access_key
        secret_access_key = var.aws_secret_key
      }
    }
    gcs_data_sink {
      bucket_name = google_storage_bucket.s3-backup-bucket.name
      path        = "foo/bar/"
    }
  }

  schedule {
    schedule_start_date {
      year  = 2018
      month = 10
      day   = 1
    }
    schedule_end_date {
      year  = 2019
      month = 1
      day   = 15
    }
    start_time_of_day {
      hours   = 23
      minutes = 30
      seconds = 0
      nanos   = 0
    }
    repeat_interval = "604800s"
  }

  notification_config {
    pubsub_topic  = google_pubsub_topic.topic.id
    event_types   = [
      "TRANSFER_OPERATION_SUCCESS",
      "TRANSFER_OPERATION_FAILED"
    ]
    payload_format = "JSON"
  }

  depends_on = [google_storage_bucket_iam_member.s3-backup-bucket, google_pubsub_topic_iam_member.notification_config]
}

References

@renzepost

As a workaround, I tried appending the path to the bucket name in the bucket_name argument of aws_s3_data_source, but this throws an error:
Error: googleapi: Error 400: S3 source bucket name cannot contain a path. Please use include and exclude prefixes instead., badRequest
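
For context, the rejected configuration looked roughly like this (the bucket name and path are placeholders); it is the bucket_name value that triggers the 400 above:

aws_s3_data_source {
  bucket_name = "my-source-bucket/foo/bar/" # rejected: bucket name cannot contain a path
  aws_access_key {
    access_key_id     = var.aws_access_key
    secret_access_key = var.aws_secret_key
  }
}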

Since there is no workaround for this, could this be marked as a bug instead?

@melinath melinath added this to the Goals milestone Dec 12, 2022

hinaloe commented Dec 13, 2022

I am importing and using an existing resource, but I cannot make any changes to it because of this issue. We are treating this as a bug and hope it will be fixed as soon as possible.

@renzepost

Just to note that I did find a workaround for my issue after all. The error I was getting pointed in the right direction: I added my path to the include_prefixes list in the object_conditions block, and that solved it for me, as sketched below.
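
For anyone else hitting this, that workaround would look roughly like the following inside transfer_spec (bucket and prefix values are placeholders):

transfer_spec {
  object_conditions {
    include_prefixes = [
      "foo/bar/", # stands in for the missing path argument on aws_s3_data_source
    ]
  }
  aws_s3_data_source {
    bucket_name = var.aws_s3_bucket
    aws_access_key {
      access_key_id     = var.aws_access_key
      secret_access_key = var.aws_secret_key
    }
  }
  gcs_data_sink {
    bucket_name = google_storage_bucket.s3-backup-bucket.name
    path        = "foo/bar/"
  }
}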

@kiwamizamurai

I'll pick this one up

@github-actions

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jun 15, 2023