
Promote Vertex AI FeatureStore resources (GA only) #6565

Conversation

shotarok
Contributor

@shotarok shotarok commented Sep 18, 2022

part of hashicorp/terraform-provider-google#9298

If this PR is for Terraform, I acknowledge that I have:

  • Searched through the issue tracker for an open issue that this either resolves or contributes to, commented on it to claim it, and written "fixes {url}" or "part of {url}" in this PR description. If there were no relevant open issues, I opened one and commented that I would like to work on it (not necessary for very small changes).
  • Generated Terraform, and ran make test and make lint to ensure it passes unit and linter tests.
  • Ensured that all new fields I added that can be set by a user appear in at least one example (for generated resources) or third_party test (for handwritten resources or update tests).
  • Ran relevant acceptance tests (If the acceptance tests do not yet pass or you are unable to run them, please let your reviewer know).
  • Read the Release Notes Guide before writing my release note below.

Release Note Template for Downstream PRs (will be copied)

`google_vertex_ai_featurestore` (ga only)
`google_vertex_ai_featurestore_entitytype` (ga only)
`google_vertex_ai_featurestore_entitytype_feature` (ga only)

@modular-magician
Collaborator

Hello! I am a robot who works on Magic Modules PRs.

I've detected that you're a community contributor. @slevenick, a repository maintainer, has been assigned to assist you and help review your changes.


Your assigned reviewer will help review your code by:

  • Ensuring it's backwards compatible, covers common error cases, etc.
  • Summarizing the change into a user-facing changelog note.
  • Ensuring it passes tests, either via our "VCR" suite, a set of presubmit tests, or manual test runs.

You can help make sure that review is quick by running local tests and ensuring they're passing in between each push you make to your PR's branch. Also, try to leave a comment with each push you make, as pushes generally don't generate emails.

If your reviewer doesn't get back to you within a week after your most recent change, please feel free to leave a comment on the issue asking them to take a look! In the absence of a dedicated review dashboard most maintainers manage their pending reviews through email, and those will sometimes get lost in their inbox.


@modular-magician
Collaborator

Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are.

Diff report:

Terraform GA: Diff ( 7 files changed, 1264 insertions(+), 11 deletions(-))
Terraform Beta: Diff ( 4 files changed, 4 insertions(+), 14 deletions(-))
TF Validator: Diff ( 4 files changed, 260 insertions(+), 3 deletions(-))

@modular-magician
Collaborator

Tests analytics

Total tests: 2174
Passed tests: 1934
Skipped tests: 238
Failed tests: 2

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Click here to see the failed tests
TestAccFirebaserulesRelease_BasicRelease|TestAccComputeInstance_soleTenantNodeAffinities

@modular-magician
Collaborator

Tests passed during RECORDING mode:
TestAccFirebaserulesRelease_BasicRelease[Debug log]

Tests failed during RECORDING mode:
TestAccComputeInstance_soleTenantNodeAffinities[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

@shotarok
Contributor Author

shotarok commented Sep 18, 2022

It looks like the two tests also fail in other PRs regardless of the PR contents. BTW, I can't see the build log or the debug log either, due to permission errors. Is this expected behavior?

[UPDATED] I found the following note in README.md. I understand that community contributors aren't expected to have permission to see the logs.

The false positive rate on these tests is extremely high between changes in the API, Cloud Build bugs, and eventual consistency issues in test recordings so we don't expect contributors to wholly interpret the results- that's the responsibility of your reviewer.

@shotarok
Contributor Author

Hi @slevenick, could you please review this PR?

@slevenick
Contributor

Hmmm, it looks like some fields may not be supported in GA at the API level (or were renamed, removed or something else)

This test fails:
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample

Can you see what is going on with that test? I see this failure message:

"description": "Invalid JSON payload received. Unknown name \"monitoringInterval\" at 'entity_type.monitoring_config.snapshot_analysis': Cannot find field.",

@shotarok
Contributor Author

shotarok commented Sep 27, 2022

@slevenick Thank you for your comment! I found out the following things:

  • monitoringInterval is deprecated in v1beta1, as shown below
  • stalenessDays and monitoringIntervalDays are supported in both v1 and v1beta1

(screenshot of the Vertex AI API reference)

I'll remove monitoringInterval in this PR and keep this PR GA-only. Apart from that, I'll create another PR to support the new fields. Please let me know if you'd recommend including both changes in this PR. Thanks!

@shotarok
Contributor Author

As for the error messages, I'm afraid I couldn't find a link that's visible to my account. Could you please give me the link to see the error? The following links returned permission errors for me.

View the build log or the debug log for each test

@modular-magician
Collaborator

Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are.

Diff report:

Terraform GA: Diff ( 7 files changed, 1239 insertions(+), 17 deletions(-))
Terraform Beta: Diff ( 5 files changed, 4 insertions(+), 45 deletions(-))
TF Validator: Diff ( 4 files changed, 249 insertions(+), 3 deletions(-))

@slevenick
Contributor

Yikes, so this is going to be a bit of a tricky situation. We have pretty strict guidelines around backwards compatibility, where we cannot remove a field until a major version change. That means that we can't remove the field from the beta provider, but we also can't have it present in the GA provider. Maybe we can mark that field as min_version: beta to prevent it from being in the GA provider, and then deprecate it so we can remove it in the next major version
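
A minimal sketch of what that suggestion could look like in MMv1's api.yaml. This is an assumption about the eventual change, not the actual diff from this PR; the `deprecation_message` wording and placement are illustrative, while `min_version: beta` is the attribute slevenick names:

```yaml
- !ruby/object:Api::Type::String
  name: 'monitoringInterval'
  min_version: beta
  deprecation_message: >-
    `monitoring_interval` is deprecated and will be removed
    in a future major release.
  description: |
    Configuration of the snapshot analysis based monitoring pipeline
    running interval.
```

With `min_version: beta`, the field is generated only into the beta provider, so it never appears in GA; the deprecation message warns existing beta users before the field is dropped in the next major version.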

@shotarok
Contributor Author

Maybe we can mark that field as min_version: beta to prevent it from being in the GA provider, and then deprecate it so we can remove it in the next major version

Thank you for sharing the release policy. This approach looks good to me. I'll update the PR, and get back to you!

@modular-magician
Collaborator

Tests analytics

Total tests: 2182
Passed tests: 1942
Skipped tests: 238
Failed tests: 2

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Click here to see the failed tests
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample|TestAccComputeInstance_soleTenantNodeAffinities

@modular-magician
Collaborator

Tests failed during RECORDING mode:
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample[Error message] [Debug log]
TestAccComputeInstance_soleTenantNodeAffinities[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

@modular-magician
Collaborator

Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are.

Diff report:

Terraform GA: Diff ( 7 files changed, 1241 insertions(+), 13 deletions(-))
Terraform Beta: Diff ( 5 files changed, 9 insertions(+), 18 deletions(-))
TF Validator: Diff ( 4 files changed, 249 insertions(+), 3 deletions(-))

@modular-magician
Collaborator

Tests analytics

Total tests: 2192
Passed tests: 1938
Skipped tests: 240
Failed tests: 14

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Click here to see the failed tests
TestAccComputeInstance_soleTenantNodeAffinities|TestAccComputeForwardingRule_internalTcpUdpLbWithMigBackendExample|TestAccComputeGlobalForwardingRule_externalTcpProxyLbMigBackendExample|TestAccComputeForwardingRule_networkTier|TestAccComputeForwardingRule_update|TestAccComputeForwardingRule_forwardingRuleRegionalHttpXlbExample|TestAccComputeForwardingRule_forwardingRuleExternallbExample|TestAccClouddeployDeliveryPipeline_DeliveryPipeline|TestAccComputeRouterInterface_basic|TestAccComputeVpnTunnel_vpnTunnelBetaExample|TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample|TestAccSqlDatabaseInstance_mysqlMajorVersionUpgrade|TestAccComputeFirewallPolicyRule_update|TestAccComputeFirewallPolicy_update

@shotarok
Contributor Author

shotarok commented Sep 28, 2022

Hello @slevenick, I removed monitoring_interval from the example, but it seems TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample is still failing. Could you tell me what the error message is? I'll also try to reproduce the error locally. Thanks

@modular-magician
Collaborator

Tests passed during RECORDING mode:
TestAccComputeForwardingRule_internalTcpUdpLbWithMigBackendExample[Debug log]
TestAccComputeGlobalForwardingRule_externalTcpProxyLbMigBackendExample[Debug log]
TestAccComputeForwardingRule_networkTier[Debug log]
TestAccComputeForwardingRule_update[Debug log]
TestAccComputeForwardingRule_forwardingRuleRegionalHttpXlbExample[Debug log]
TestAccComputeForwardingRule_forwardingRuleExternallbExample[Debug log]
TestAccClouddeployDeliveryPipeline_DeliveryPipeline[Debug log]
TestAccComputeRouterInterface_basic[Debug log]
TestAccComputeVpnTunnel_vpnTunnelBetaExample[Debug log]
TestAccSqlDatabaseInstance_mysqlMajorVersionUpgrade[Debug log]
TestAccComputeFirewallPolicyRule_update[Debug log]
TestAccComputeFirewallPolicy_update[Debug log]

Tests failed during RECORDING mode:
TestAccComputeInstance_soleTenantNodeAffinities[Error message] [Debug log]
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

Contributor

@slevenick slevenick left a comment


Error is:

    provider_test.go:315: Step 1/2 error: After applying this test step, the plan was not empty.
        stdout:
        
        
        Terraform used the selected providers to generate the following execution
        plan. Resource actions are indicated with the following symbols:
          ~ update in-place
        
        Terraform will perform the following actions:
        
          # google_vertex_ai_featurestore_entitytype.entity will be updated in-place
          ~ resource "google_vertex_ai_featurestore_entitytype" "entity" {
                id           = "projects/ci-test-project-188019/locations/us-central1/featurestores/terraformxhbvcoal6r/entityTypes/terraformxhbvcoal6r"
                name         = "terraformxhbvcoal6r"
                # (4 unchanged attributes hidden)
        
              + monitoring_config {
                  + snapshot_analysis {
                      + disabled = false
                    }
                }
            }
        
        Plan: 0 to add, 1 to change, 0 to destroy.
--- FAIL: TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample (67.11s)

Looks like if monitoring_interval isn't specified we get an empty block back. Can you specify the other field in that test?

@shotarok
Contributor Author

Thank you for letting me know about the error!

stalenessDays and monitoringIntervalDays are the other fields, but they're new to the provider, so adding them would make this PR more than a GA promotion. Instead, I'll try setting monitoring_config.snapshot_analysis.disabled = true.

Apart from that, I added google_vertex_ai_featurestore_entitytype_feature to the beta provider in #6568. It's also a GA-ready feature store resource, so I'll promote it in this PR as well.

@slevenick
Contributor

Still seeing:

              ~ monitoring_config {
                  ~ snapshot_analysis {
                      - monitoring_interval = "0s" -> null
                        # (1 unchanged attribute hidden)
                    }
                }
            }

Are you able to run these tests yourself?

@shotarok
Contributor Author

Thank you for your comment. I tried reproducing the error with the auto-generated branch on my local machine as shown below, but the test passed for me, so I might have overlooked something. I attached test.log.

➜  terraform-provider-google git:(auto-pr-6565)  git show HEAD  | head -n 1
commit b769b21056499b22faf81f5f0bf14a1d74b20042

➜  terraform-provider-google git:(auto-pr-6565)  TF_LOG=TRACE make testacc GOOGLE_ORG=test GOOGLE_BILLING_ACCOUNT=test GOOGLE_USE_DEFAULT_CREDENTIALS=true GCLOUD_PROJECT=kouzoh-p-kohama GCLOUD_REGION=us-central1 GCLOUD_ZONE=us-central1-a TEST=./google TESTARGS='-run=TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample' | tee test.log
go generate  ./...
TF_ACC=1 TF_SCHEMA_PANIC_ON_ERROR=1 go test ./google -v -run=TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample -timeout 240m -ldflags="-X=github.com/hashicorp/terraform-provider-google/version.ProviderVersion=acc"
=== RUN   TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample
=== PAUSE TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample
=== CONT  TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample
...
--- PASS: TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample (96.11s)
PASS
ok  	github.com/hashicorp/terraform-provider-google/google	(cached)

@slevenick
Contributor

> I tried reproducing the error with the auto-generated branch on my local machine ... but the test passed on my machine.

It's failing on the beta provider specifically, so maybe it's a beta vs GA difference?

@shotarok
Contributor Author

It's failing on the beta provider specifically, so maybe it's a beta vs GA difference?

Ah, I could reproduce the error with the beta provider locally. I'll look into it. Thanks!

@modular-magician
Collaborator

Tests analytics

Total tests: 2197
Passed tests: 1950
Skipped tests: 240
Failed tests: 7

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Click here to see the failed tests
TestAccComputeInstance_soleTenantNodeAffinities|TestAccCGCSnippet_eventarcWorkflowsExample|TestAccFirebaserulesRelease_BasicRelease|TestAccBillingSubaccount_renameOnDestroy|TestAccBillingSubaccount_basic|TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample|TestAccVertexAIFeaturestoreEntitytypeFeature_vertexAiFeaturestoreEntitytypeFeatureExample

@modular-magician
Collaborator

Tests passed during RECORDING mode:
TestAccFirebaserulesRelease_BasicRelease[Debug log]
TestAccBillingSubaccount_renameOnDestroy[Debug log]
TestAccBillingSubaccount_basic[Debug log]
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample[Debug log]
TestAccVertexAIFeaturestoreEntitytypeFeature_vertexAiFeaturestoreEntitytypeFeatureExample[Debug log]

Tests failed during RECORDING mode:
TestAccComputeInstance_soleTenantNodeAffinities[Error message] [Debug log]
TestAccCGCSnippet_eventarcWorkflowsExample[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

@shotarok
Contributor Author

shotarok commented Oct 1, 2022

Hi @slevenick, I added a default value for monitoring_interval, and I confirmed the tests pass. Could you review the change when you have time? Thanks!

@@ -273,7 +271,11 @@ objects:
The monitoring schedule for snapshot analysis. For EntityType-level config: unset / disabled = true indicates disabled by default for Features under it; otherwise by default enable snapshot analysis monitoring with monitoringInterval for Features under it.
    - !ruby/object:Api::Type::String
      name: 'monitoringInterval'
      default_value: "0s"
Contributor


We can't add a default value in a minor version of the provider. It's technically considered a breaking change even if it was the implicit default value before

Contributor Author


@slevenick Got it! I'll revert the change. BTW, could you give me an example of how to test a beta field? I'd like to test monitoring_interval only in google-beta.

Contributor Author


For example, can we define an additional google_vertex_ai_featurestore_entitytype only in the test of the google-beta provider?

resource "google_vertex_ai_featurestore_entitytype" "entity" {
  name     = "<%= ctx[:vars]['name'] %>"
  labels = {
    foo = "bar"
  }
  featurestore = google_vertex_ai_featurestore.featurestore.id
  monitoring_config {
    snapshot_analysis {
      disabled = true
    }
  }
}


<%- if is_beta %>
resource "google_vertex_ai_featurestore_entitytype" "entity_for_beta" {
  provider = google-beta
  name     = "<%= ctx[:vars]['name'] + '_for_beta' %>"
  labels = {
    foo = "bar"
  }
  featurestore = google_vertex_ai_featurestore.featurestore.id
  monitoring_config {
    snapshot_analysis {
      disabled = false
      monitoring_interval = "86400s"
    }
  }
}
<%- end %>

Contributor


Yes, you can find examples of the version tag needed to make some code exist only in the beta provider. I think it's something like <%- unless version == "ga" %>

Contributor Author


@slevenick it turns out there is no version information in the bindings to render the example. I'm wondering if there is another way to test a beta field. Or should I pass the version to render the example?

body = lines(compile_file(
  {
    vars: rand_vars.merge(overrides),
    test_env_vars: test_env_vars.map { |k, _| [k, "%{#{k}}"] }.to_h,
    primary_resource_id: primary_resource_id,
    primary_resource_type: primary_resource_type
  },
  pwd + '/' + config_path
))

Contributor Author

@shotarok shotarok Oct 12, 2022


I learned that we can define multiple examples for a resource, and that we can set min_version on an example object. I'll add two example erb files.

Contributor Author

@shotarok shotarok Oct 12, 2022


I could define an example only for the beta provider via min_version on the example object. However, I couldn't define an example only for the GA provider using min_version alone.

object.examples
  .reject(&:skip_test)
  .reject { |e| @api.version_obj_or_closest(version) < @api.version_obj_or_closest(e.min_version) }
  .each do |example|

Contributor Author

@shotarok shotarok Oct 12, 2022


I created two examples. One omits monitoring_interval and works for both GA and beta. The other includes monitoring_interval and works only for beta.
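
For reference, a sketch of how those two examples might be declared following the MMv1 examples convention. The example names are inferred from the test names appearing in this PR's logs; the vars and resource ids are illustrative assumptions, not the exact change:

```yaml
examples:
  - !ruby/object:Provider::Terraform::Examples
    name: 'vertex_ai_featurestore_entitytype'
    primary_resource_id: 'entity'
    vars:
      name: 'terraform'
  - !ruby/object:Provider::Terraform::Examples
    name: 'vertex_ai_featurestore_entitytype_with_beta_fields'
    primary_resource_id: 'entity'
    min_version: beta
    vars:
      name: 'terraform2'
```

Only the second example, gated by min_version: beta, exercises the beta-only monitoring_interval field, so the GA provider's generated test never touches it.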

(Review thread on mmv1/products/vertexai/api.yaml — outdated, resolved)
@modular-magician
Collaborator

Hi! I'm the modular magician. Your PR generated some diffs in downstreams - here they are.

Diff report:

Terraform GA: Diff ( 10 files changed, 1846 insertions(+), 18 deletions(-))
Terraform Beta: Diff ( 7 files changed, 202 insertions(+), 30 deletions(-))
TF Validator: Diff ( 5 files changed, 339 insertions(+), 3 deletions(-))
TF OiCS: Diff ( 8 files changed, 267 insertions(+))

@modular-magician
Collaborator

Tests analytics

Total tests: 2194
Passed tests: 1949
Skipped tests: 239
Failed tests: 6

Action taken

Triggering VCR tests in RECORDING mode for the tests that failed during VCR. Click here to see the failed tests
TestAccVertexAIFeaturestoreEntitytypeFeature_vertexAiFeaturestoreEntitytypeFeatureWithBetaFieldsExample|TestAccVertexAIFeaturestoreEntitytypeFeature_vertexAiFeaturestoreEntitytypeFeatureExample|TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample|TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeWithBetaFieldsExample|TestAccComputeForwardingRule_internalTcpUdpLbWithMigBackendExample|TestAccComputeInstance_soleTenantNodeAffinities

@modular-magician
Collaborator

Tests passed during RECORDING mode:
TestAccVertexAIFeaturestoreEntitytypeFeature_vertexAiFeaturestoreEntitytypeFeatureWithBetaFieldsExample[Debug log]
TestAccVertexAIFeaturestoreEntitytypeFeature_vertexAiFeaturestoreEntitytypeFeatureExample[Debug log]
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeExample[Debug log]
TestAccVertexAIFeaturestoreEntitytype_vertexAiFeaturestoreEntitytypeWithBetaFieldsExample[Debug log]
TestAccComputeForwardingRule_internalTcpUdpLbWithMigBackendExample[Debug log]

Tests failed during RECORDING mode:
TestAccComputeInstance_soleTenantNodeAffinities[Error message] [Debug log]

Please fix these to complete your PR
View the build log or the debug log for each test

@shotarok
Contributor Author

@slevenick OK, the added tests passed in the recording mode. Could you please review this PR again when you have time? Thanks

Contributor

@slevenick slevenick left a comment


Great thanks! Good call on adding a separate test for the beta-only field
