Add Vertex AI Dataset #4863
Conversation
Co-authored-by: upodroid <[email protected]>
Hello! I am a robot who works on Magic Modules PRs. I have detected that you are a community contributor, so your PR will be assigned to someone with a commit-bit on this repo for initial review. Thanks for your contribution! A human will be with you soon. @melinath, please review this PR or find an appropriate assignee.
Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are. Diff report: Terraform GA: Diff (8 files changed, 992 insertions(+), 2 deletions(-))
@rileykarson We need to rewrite the Operations handling code to deal with APIs that require region-specific API subdomains to be inferred for the URL. This is the first API with regional endpoints where the API subdomain has to be inferred from the resource's region (excluding Cloud Run, where we seem to poll the status field of the resource for changes instead of tracking an Operation).

I'm thinking of passing the region through to the operation waiters. Let me know what you think.
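For context, a minimal sketch of the kind of URL the polling code would need to build, assuming the regional-subdomain pattern described above; the helper name and layout here are invented for illustration and are not the Magic Modules implementation:

```go
package main

import "fmt"

// vertexAIOperationURL builds a regional polling URL of the form
// "https://{region}-aiplatform.googleapis.com/...". The helper is hypothetical;
// it only illustrates why the region has to reach the operation-polling code.
func vertexAIOperationURL(region, opName string) string {
	return fmt.Sprintf("https://%s-aiplatform.googleapis.com/v1/%s", region, opName)
}

func main() {
	fmt.Println(vertexAIOperationURL("us-central1",
		"projects/my-project/locations/us-central1/datasets/1234/operations/5678"))
}
```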
I have triggered VCR tests based on this PR's diffs. See the results here: "https://ci-oss.hashicorp.engineering/viewQueued.html?itemId=191770"
Hmm, yeah, it looks like QueryOp would likely be the place to make this call. It's a bit unfortunate that we're going to end up modifying all the other operation waiters, but I don't know that it would be worth avoiding that.
Can someone reopen hashicorp/terraform-provider-google#4624 and target it for the v4 release? I found this while reworking Operations; it is a v3 removal that didn't happen on time. I have adjusted all the Operations that are used by MM and partially handwritten products. I didn't edit operations for products like Cloud Functions, as the entire API is handwritten.
It looks like there are some file conflicts that need to be resolved. After some more consideration & talking with the team, would it be possible to implement this without the plumbing? I'm sorry to be backtracking on this; the summary is that this is the only resource we expect to need this feature in all of MMv1 ever, so it would be preferable to use a custom operation handler that extracts the location from the operation path (which should already contain it) rather than doing this plumbing.
Hmm, let me see if I can pull the location/region cleanly from the operation path.
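A rough sketch of what extracting the region from the operation path could look like, assuming the standard `projects/{project}/locations/{location}/...` layout; the function name here is made up and not necessarily what the PR ended up adding:

```go
package main

import (
	"fmt"
	"regexp"
)

// locationPattern matches the "locations/{location}" segment of a self link or
// operation name such as "projects/p/locations/us-central1/datasets/1/operations/2".
var locationPattern = regexp.MustCompile(`locations/([^/]+)`)

// regionFromOperationPath returns the location segment, or "" if none is found.
func regionFromOperationPath(path string) string {
	m := locationPattern.FindStringSubmatch(path)
	if m == nil {
		return ""
	}
	return m[1]
}

func main() {
	fmt.Println(regionFromOperationPath("projects/p/locations/us-central1/datasets/1/operations/2"))
}
```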
How do you exclude …? I keep getting: …
I think that if you leave out …
It is fixed and ready.
LGTM as a general approach; the build is currently failing because of the deletion of compute_shared_operation.go. That deletion seems unrelated to this PR; if that's correct, could you undo it for now / move it to a separate PR?
Also, could you add a few unit tests for GetRegionFromRegionalSelfLink?
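For what it's worth, a table-driven sketch of such a test, assuming GetRegionFromRegionalSelfLink takes a self-link string and returns the region; the exact signature and accepted inputs in the provider may differ:

```go
package google

import "testing"

func TestGetRegionFromRegionalSelfLink(t *testing.T) {
	// Assumed input/output pairs; the real helper may accept additional forms.
	cases := map[string]string{
		"projects/foo/locations/europe-north1/datasets/bar":               "europe-north1",
		"projects/foo/locations/us-central1/datasets/bar/operations/baz":  "us-central1",
	}
	for input, want := range cases {
		if got := GetRegionFromRegionalSelfLink(input); got != want {
			t.Errorf("GetRegionFromRegionalSelfLink(%q) = %q, want %q", input, got, want)
		}
	}
}
```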
Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are. Diff report: Terraform GA: Diff (9 files changed, 790 insertions(+), 2 deletions(-))
I have triggered VCR tests based on this PR's diffs. See the results here: "https://ci-oss.hashicorp.engineering/viewQueued.html?itemId=192739"
I have triggered VCR tests in RECORDING mode for the following tests that failed during VCR: TestAccVertexAIDataset_vertexAiDatasetExample|TestAccTags. You can view the result here: "https://ci-oss.hashicorp.engineering/viewQueued.html?itemId=192832"
This looks ready; it would be great if I can get it merged today so I can start working on other Vertex AI resources.
LGTM
The tests failed because the API wasn't enabled; here's a test run just for the new test: https://ci-oss.hashicorp.engineering/buildConfiguration/GoogleCloudBeta_ProviderGoogleCloudBetaMmUpstreamVcr/193010
Test passed.
I know this has been merged, but I'm not sure if what I'm seeing is a bug to be reported or an enhancement to be requested. The example provided in this PR and shown in the documentation creates the resource based on the metadata schema, but there doesn't seem to be a way to pass configuration details for what that schema expects. For example (slightly modifying the one provided in the documentation):

```hcl
resource "google_vertex_ai_dataset" "dataset" {
  display_name        = "terraform"
  metadata_schema_uri = "gs://google-cloud-aiplatform/schema/dataset/metadata/tabular_1.0.0.yaml"
  region              = "us-central1"
}
```

The schema refers to this document:

```yaml
title: Tabular
type: object
description: >
  The metadata of tabular Datasets. Can be used in Dataset.metadata_schema_uri
  field.
properties:
  inputConfig:
    description: >
      The tabular Dataset's data source. The Dataset doesn't store the data
      directly, but only pointer(s) to its data.
    oneOf:
      - type: object
        properties:
          type:
            type: string
            enum: [gcs_source]
          uri:
            type: array
            items:
              type: string
            description: >
              Cloud Storage URI of one or more files. Only CSV files are supported.
              The first line of the CSV file is used as the header.
              If there are multiple files, the header is the first line of
              the lexicographically first file, the other files must either
              contain the exact same header or omit the header.
      - type: object
        properties:
          type:
            type: string
            enum: [bigquery_source]
          uri:
            type: string
            description: The URI of a BigQuery table.
    discriminator:
      propertyName: type
```

Naturally I would want to configure the `inputConfig` that the schema describes. Should I open a bug or an enhancement for this issue?
@racosta It sounds like you're requesting a new field, so that would be an enhancement request.
Part of hashicorp/terraform-provider-google#9298
If this PR is for Terraform, I acknowledge that I have run `make test` and `make lint` to ensure it passes unit and linter tests.

Release Note Template for Downstream PRs (will be copied)