Stream name splits feature should be togglable #58

Open
1 task
visch opened this issue Jan 10, 2023 · 1 comment · Fixed by #59
Comments

@visch
Member

visch commented Jan 10, 2023

re https://meltano.slack.com/archives/C01TCRBBJD7/p1673366790519269?thread_ts=1671688563.297449&cid=C01TCRBBJD7

We should be able to turn off the stream name splits feature for folks who don't want it. We should also log a message when we detect a stream split, so that folks who have no idea the feature exists at least get some indication that the behavior may not be what they want (as in the Slack thread above, where the DynamoDB stream name contains a dash, but the user didn't want the dashes to be interpreted as separate schema/table names).

  • Schemas with dashes in the name should be called out in the logs and should be documented
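
To make the request concrete, here is a rough sketch of the kind of toggle plus log message being asked for. The setting name `enable_stream_name_splits` and the helper below are hypothetical illustrations, not the target's actual config:

```python
import logging
import typing as t

logger = logging.getLogger(__name__)


def parse_stream_name(
    stream_name: str, config: t.Mapping[str, t.Any]
) -> t.Tuple[t.Optional[str], str]:
    """Split a 'schema-table' style stream name into (schema, table).

    Splitting is skipped entirely when the (hypothetical) setting is off,
    and a detected split is logged so users aren't surprised by it.
    """
    if not config.get("enable_stream_name_splits", True):
        return None, stream_name
    if "-" in stream_name:
        schema, _, table = stream_name.rpartition("-")
        logger.info(
            "Detected stream name split: %r -> schema %r, table %r. "
            "Set enable_stream_name_splits to false if this is not intended.",
            stream_name, schema, table,
        )
        return schema, table
    return None, stream_name
```
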
@visch
Member Author

visch commented Jan 12, 2023

#59 caused #62

Dug into this, and the problem with that PR comes down to a few layered issues:

  1. Our tests didn't include anything covering default_target_schema, so everything looked to be working properly in the test suite even though the target was broken for anyone with default_target_schema set.

  2. There are a number of places where we're writing SQL directly, specifically https://github.com/MeltanoLabs/target-postgres/blob/main/target_postgres/connector.py#L63-L70, https://github.com/MeltanoLabs/target-postgres/blob/main/target_postgres/connector.py#L169, https://github.com/MeltanoLabs/target-postgres/blob/main/target_postgres/sinks.py#L125-L131, and https://github.com/MeltanoLabs/target-postgres/blob/main/target_postgres/connector.py#L55-L61. The issue with writing the SQL directly is that we made some assumptions about schema and table names while compiling our own SQL. To be specific, `ALTER TABLE %(table_name)s ADD COLUMN %(column_name)s %(column_type)s` works fine until table_name has a capital letter, i.e. `ALTER TABLE CapitalTableName ADD COLUMN %(column_name)s %(column_type)s`. Naively, I wrapped table_name in quotes, so we ended up with `ALTER TABLE "CapitalTableName" ADD COLUMN %(column_name)s %(column_type)s`, which works, except it breaks when a schema is defined, i.e. `ALTER TABLE "schemaname.CapitalTableName" ADD COLUMN %(column_name)s %(column_type)s`. So the function get_fully_qualified_name (https://github.com/meltano/sdk/blob/main/singer_sdk/connectors/sql.py#L202-L243) that calculates self.full_table_name needs to wrap the schema and the table name in quotes separately. Modifying get_fully_qualified_name to do that is easy enough, but then we start passing a fully qualified, quoted name to SQLAlchemy, which breaks https://github.com/meltano/sdk/blob/f8f366cea7893e97dd9555ee5600fb3de1d15a11/singer_sdk/connectors/sql.py#L593-L614. What we really want is `ALTER TABLE "schemaname"."CapitalTableName" ADD COLUMN %(column_name)s %(column_type)s` (a small illustration of the quoting difference follows this list).
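
To illustrate the quoting difference, here is a small demonstration using SQLAlchemy's own identifier preparer. This is just an illustration, not the target's code, and it needs no database connection:

```python
from sqlalchemy.dialects.postgresql.base import PGDialect

preparer = PGDialect().identifier_preparer

# Quoting the already-joined name treats the dot as part of one identifier,
# i.e. a single table literally named 'schemaname.CapitalTableName':
print(preparer.quote_identifier("schemaname.CapitalTableName"))
# -> "schemaname.CapitalTableName"

# Quoting each part separately yields the reference we actually want:
print(".".join(preparer.quote_identifier(p) for p in ("schemaname", "CapitalTableName")))
# -> "schemaname"."CapitalTableName"
```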

To fix 2. above, we're probably going to want to call SQLAlchemy directly for our table creations and inserts. We could also do all of that ourselves manually instead of using SQLAlchemy, which would work too, but we'd have to replace everywhere that we currently rely on SQLAlchemy to take care of things for us. We should probably just fix the places where we're writing SQL directly in the target instead.
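
As a hedged sketch of that direction (illustrative table/column names only, and using the base PGDialect so nothing here needs a live connection):

```python
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql.base import PGDialect
from sqlalchemy.schema import CreateTable

metadata = sa.MetaData()
table = sa.Table(
    "CapitalTableName",     # illustrative table name
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("name", sa.Text),
    schema="schemaname",    # schema passed separately, never string-joined
)

# DDL and DML compiled by the dialect quote each identifier as needed,
# so capitalised names come out correctly:
print(CreateTable(table).compile(dialect=PGDialect()))
print(sa.insert(table).values(id=1, name="x").compile(dialect=PGDialect()))
```

With a real engine, `metadata.create_all(engine)` and `connection.execute(sa.insert(table), ...)` go through the same compiler, so the quoting assumptions never leak into hand-built strings.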

Edit:

A much easier fix could be to just add a function that builds the "correct" quoted table reference name. That way I don't have to fix all of this here.
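
In that spirit, here is a minimal sketch of what such a helper could look like. The function name is made up, and it leans on SQLAlchemy's identifier preparer rather than hand-rolled quoting:

```python
from typing import Optional

from sqlalchemy.dialects.postgresql.base import PGDialect

_preparer = PGDialect().identifier_preparer


def quoted_table_reference(table_name: str, schema_name: Optional[str] = None) -> str:
    """Return a schema-qualified table reference, quoting each part as needed."""
    if schema_name:
        return f"{_preparer.quote(schema_name)}.{_preparer.quote(table_name)}"
    return _preparer.quote(table_name)


# quoted_table_reference("CapitalTableName", "schemaname")
# -> 'schemaname."CapitalTableName"'  (lowercase parts stay unquoted, which is equivalent)
# which can be dropped into: ALTER TABLE <reference> ADD COLUMN ...
```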
