Google Cloud SQL MySQL Connector #4949
Conversation
clients/admin-ui/cypress/fixtures/connectors/connection_types.json (outdated, resolved)
```python
class GoogleCloudMySQLSchema(ConnectionConfigSecretsSchema):
```
This should be using Google's `keyfile_creds`, similar to how we do it in `connection_secrets_bigquery`. Take a look at option two in this doc for how to authenticate using a keyfile: https://ethyca.atlassian.net/wiki/spaces/EN/pages/3029204995/2024-04-19+Google+Cloud+SQL+Integration#Option-2-(IAM-database-authentication)
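A minimal sketch of what that could look like, assuming a nested `KeyfileCreds` model along the lines of the BigQuery secrets schema; the field names and the `ConnectionConfigSecretsSchema` import path are assumptions, not the final schema:

```python
from typing import Optional

from pydantic import BaseModel

# Assumed import path, following fides conventions
from fides.api.schemas.connection_configuration.connection_secrets import (
    ConnectionConfigSecretsSchema,
)


class KeyfileCreds(BaseModel):
    """Illustrative subset of a Google service-account keyfile."""

    type: Optional[str] = None
    project_id: str
    private_key_id: Optional[str] = None
    private_key: Optional[str] = None
    client_email: Optional[str] = None


class GoogleCloudMySQLSchema(ConnectionConfigSecretsSchema):
    """Secrets schema for Google Cloud SQL for MySQL (sketch)."""

    keyfile_creds: KeyfileCreds  # nested keyfile creds, as in the BigQuery schema
```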
This is a good first pass Andres! I was able to connect to a Google Cloud SQL for MySQL instance using keyfile creds, and that was the main part of this ticket 👍 I think it's still good to test the `retrieve_data` and `mask_data` functions we're inheriting from `SQLConnector`, so I documented some test files you can reference to do this.

I commented inline, but I'll repeat it here: make sure to run `nox -s static_checks` to identify any static issues like formatting, type, or import issues.

You will also run into some database migration issues once you merge in the main branch, because your new migration and the latest migration both have the same `down_revision` (`down_revision = "efddde14da21"`). Search the repo for `down_revision = "efddde14da21"` and point your migration to the `revision` number of the other file with the same `down_revision`; a sketch of the fix follows.
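For example (a hedged sketch: the second revision id below is a placeholder, not the real value from the repo; use the `revision` of whichever other migration currently points at `"efddde14da21"`):

```python
# Revision identifiers in this PR's new migration file
revision = "7641ea685ea4"

# Before: collides with the other migration that also points here.
# down_revision = "efddde14da21"

# After: point at the other migration's own `revision` id (placeholder value).
down_revision = "0123456789ab"
```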
I know this is a lot so let me know if you want to go over this in person.
.github/workflows/backend_checks.yml (outdated)
Thanks for sorting these values.
```python
# Add google_cloud_sql_mysql to ConnectionType enum
op.execute("alter type connectiontype rename to connectiontype_old")
op.execute(
    "create type connectiontype as enum('postgres', 'mongodb', 'mysql', 'https', 'snowflake', 'redshift', 'mssql', 'mariadb', 'bigquery', 'saas', 'manual', 'email', 'manual_webhook', 'timescale', 'fides', 'sovrn', 'google_cloud_sql_mysql')"
)
```
We're missing some connection types; make sure you include all the values in the `ConnectionType` enum (a corrected statement is sketched after the enum):
```python
class ConnectionType(enum.Enum):
    """
    Supported types to which we can connect Fides.
    """

    postgres = "postgres"
    mongodb = "mongodb"
    mysql = "mysql"
    google_cloud_sql_mysql = "google_cloud_sql_mysql"
    https = "https"
    saas = "saas"
    redshift = "redshift"
    snowflake = "snowflake"
    mssql = "mssql"
    mariadb = "mariadb"
    bigquery = "bigquery"
    manual = "manual"  # Deprecated - use manual_webhook instead
    sovrn = "sovrn"
    attentive = "attentive"
    dynamodb = "dynamodb"
    manual_webhook = "manual_webhook"  # Runs upfront before the traversal
    timescale = "timescale"
    fides = "fides"
    generic_erasure_email = "generic_erasure_email"  # Run after the traversal
    generic_consent_email = "generic_consent_email"  # Run after the traversal
```
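A sketch of what the corrected statement could look like, enumerating every value from the enum above (the ordering is illustrative; only the value list matters to Postgres):

```python
op.execute(
    "create type connectiontype as enum("
    "'postgres', 'mongodb', 'mysql', 'google_cloud_sql_mysql', 'https', 'saas', "
    "'redshift', 'snowflake', 'mssql', 'mariadb', 'bigquery', 'manual', 'sovrn', "
    "'attentive', 'dynamodb', 'manual_webhook', 'timescale', 'fides', "
    "'generic_erasure_email', 'generic_consent_email')"
)
```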
```python
# Remove google_cloud_sql_mysql from the ConnectionType enum
op.execute("alter type connectiontype rename to connectiontype_old")
op.execute(
    "create type connectiontype as enum('postgres', 'mongodb', 'mysql', 'https', 'snowflake', 'redshift', 'mssql', 'mariadb', 'bigquery', 'saas', 'manual', 'email', 'manual_webhook', 'timescale', 'fides', 'sovrn')"
)
```
Same as above; make sure it's the same list as the upgrade, but without the new `google_cloud_sql_mysql` entry. A sketch of the full type-swap pattern follows.
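For context, the Postgres enum-swap usually has two more steps that the excerpts above don't show. This is a hedged sketch: the `connectionconfig` table and `connection_type` column names are assumptions based on fides conventions, not confirmed from this diff:

```python
# 1. Move the existing type out of the way.
op.execute("alter type connectiontype rename to connectiontype_old")

# 2. Recreate the type with the target value list (here, the downgrade list).
op.execute(
    "create type connectiontype as enum('postgres', 'mongodb', 'mysql', 'https', "
    "'snowflake', 'redshift', 'mssql', 'mariadb', 'bigquery', 'saas', 'manual', "
    "'email', 'manual_webhook', 'timescale', 'fides', 'sovrn')"
)

# 3. Repoint the column at the recreated type, casting through text.
#    Table/column names here are assumptions.
op.execute(
    "alter table connectionconfig alter column connection_type "
    "type connectiontype using connection_type::text::connectiontype"
)

# 4. Drop the now-unused old type.
op.execute("drop type connectiontype_old")
```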
tests/ops/integration_tests/test_connection_configuration_integration.py (outdated, resolved)
tests/ops/integration_tests/test_connection_google_sql_mysql_configuration_integration.py (outdated, resolved)
The connection test worked just fine, which is the important part of this ticket! But we want to make sure that the access and erasure logic we're inheriting from the `MySQLConnector` still works for our new Google Cloud SQL instance.

To do this, expand the fixtures to include something similar to `mysql_integration_db` from the `mysql_fixtures.py` file. This is what will create our test data in Google Cloud SQL:
```python
@pytest.fixture(scope="function")
def google_cloud_sql_mysql_integration_db(google_cloud_sql_mysql_integration_session):
    truncate_all_tables(google_cloud_sql_mysql_integration_session)
    statements = [
        """
        INSERT INTO product VALUES
        (1, 'Example Product 1', 10.00),
        ...
```
Then add a `test_integration_google_cloud_sql_mysql_example.py`, similar to `tests/ops/integration_tests/test_integration_mysql_example.py`, to test that the data was created.

Then add an equivalent of `test_mysql_access_request_task` in `tests/ops/integration_tests/test_sql_task.py`.

And finally, the most important part: a Google Cloud SQL version of `test_create_and_process_access_request_mysql` and `test_create_and_process_erasure_request_specific_category_mysql` in `tests/ops/service/privacy_request/test_request_runner_service.py`.

If you search for `@pytest.mark.integration_mysql` in the repo, you should see all of these tests for the original version of the MySQL connector; make sure you have similar tests for your new connector. Add a new pytest marker in `pyproject.toml` for `integration_google_cloud_sql_mysql` so you can tag all of your new tests, as sketched below.
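A minimal sketch of how a new test could be tagged, assuming the `integration_google_cloud_sql_mysql` marker has been registered in `pyproject.toml` and that the fixture above yields a SQLAlchemy session; the test name and query are illustrative:

```python
import pytest
from sqlalchemy import text


@pytest.mark.integration_google_cloud_sql_mysql
def test_google_cloud_sql_mysql_example_data_exists(
    google_cloud_sql_mysql_integration_db,
):
    """Smoke test: confirm the seeded rows made it into Google Cloud SQL."""
    count = google_cloud_sql_mysql_integration_db.execute(
        text("SELECT COUNT(*) FROM product")
    ).scalar()
    assert count > 0
```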
Co-authored-by: Adrian Galvan <[email protected]>
"dsr_version", | ||
["use_dsr_3_0", "use_dsr_2_0"], | ||
) | ||
def test_create_and_process_erasure_request_specific_category_mysql( |
Suggested change:

```diff
-def test_create_and_process_erasure_request_specific_category_mysql(
+def test_create_and_process_erasure_request_specific_category_google_cloud_sql_mysql(
```
Almost there! Just a few minor things. Also make sure you run `nox -s static_checks` locally to format the files and pick up any other static issues.
src/fides/api/alembic/migrations/versions/7641ea685ea4_add_google_cloud_sql_mysql_to_.py (resolved)
```python
_required_components: List[str] = [
    "db_iam_user",
    "instance_connection_name",
```
Can you add `dbname` to `_required_components` now that it's required?
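A sketch of the requested change, assuming `dbname` matches the secrets field name used elsewhere in this schema:

```python
from typing import List

# Fields that must be present for a valid Google Cloud SQL MySQL connection.
_required_components: List[str] = [
    "db_iam_user",
    "instance_connection_name",
    "dbname",  # now required, per review
]
```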
```python
yield google_cloud_sql_mysql_integration_session

# TODO: Consolidate these
```
Can we get rid of this TODO?
tests/fixtures/mysql_fixtures.py (outdated)

```
@@ -95,6 +95,33 @@ def mysql_example_test_dataset_config(
    dataset.delete(db=db)
    ctl_dataset.delete(db=db)

# TODO: Consolidate these
```
Same for this one
tests/fixtures/mysql_fixtures.py (outdated)

```
@@ -95,6 +95,33 @@ def mysql_example_test_dataset_config(
    dataset.delete(db=db)
    ctl_dataset.delete(db=db)

# TODO: Consolidate these
@pytest.fixture
def google_cloud_sql_mysql_example_test_dataset_config(
```
We can get rid of this one since we moved it to its own file, right?
Co-authored-by: Adrian Galvan <[email protected]>
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #4949      +/-   ##
==========================================
- Coverage   86.59%   86.56%   -0.03%
==========================================
  Files         351      352       +1
  Lines       21818    21874      +56
  Branches     2881     2884       +3
==========================================
+ Hits        18893    18935      +42
- Misses       2420     2433      +13
- Partials      505      506       +1
```

☔ View full report in Codecov by Sentry.
Closes PROD-2152

Description Of Changes
Added `GoogleCloudSQLMySQLConnector` and tests for it.

Code Changes

Steps to Confirm

Pre-Merge Checklist
- `CHANGELOG.md` updated
- Latest `main` merged in
- `downgrade()` migration is correct and works