
Add error checking to all of the macros that use get_columns_in_relation #527

Closed
jwills wants to merge 3 commits

Conversation

jwills commented Mar 23, 2022

…so as to provide a more useful error message if no columns are returned by that function.

This is a:

  • bug fix PR with no breaking changes — please ensure the base branch is main
  • new functionality — please ensure the base branch is the latest dev/ branch
  • a breaking change — please ensure the base branch is the latest dev/ branch

Description & motivation

I ran into a situation where different adapters have different behaviors with respect to the case-sensitivity of the relation that is passed in to get_columns_in_relation: Snowflake/Spark largely ignore the case of the relation (so passing in "events" will return the same columns as "EVENTS"), whereas other adapters (e.g., Postgres) do case-sensitive matching of the relation/schema names via the information_schema.columns table and will return 0 columns if the cases do not match. I thought I could make the error message for this situation a bit easier to grok than what happens now, which is that you get a compiled model with a bunch of invalid SQL.
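For illustration, the kind of guard this adds looks roughly like the sketch below (the macro name and error wording here are simplified placeholders, not the exact diff): call adapter.get_columns_in_relation and raise a compile-time error when nothing comes back, so a case mismatch surfaces immediately instead of as invalid compiled SQL.

```sql
{#- Simplified sketch only: a star()-style macro guarded against an empty column list. -#}
{%- macro star_with_check(from, relation_alias=False) -%}

    {%- if not execute -%}{{ return('') }}{%- endif -%}

    {#- some adapters (e.g. Postgres) match the relation name case-sensitively here -#}
    {%- set cols = adapter.get_columns_in_relation(from) -%}

    {%- if cols | length == 0 -%}
        {{ exceptions.raise_compiler_error(
            "No columns found in relation " ~ from ~
            ". Check that it exists and that its database/schema/identifier casing matches the warehouse."
        ) }}
    {%- endif -%}

    {%- for col in cols -%}
        {%- if relation_alias %}{{ relation_alias }}.{% endif %}{{ adapter.quote(col.column) }}
        {%- if not loop.last %}, {% endif -%}
    {%- endfor -%}

{%- endmacro -%}
```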

I am happy to add tests for these checks if it's appropriate to do so, but I'll need a bit of guidance, since running make test target=postgres models=sql.test_star seeds=sql.data_star locally threw this error:

wills@wg-josh-mbp-01 integration_tests % make test target=postgres models=sql.test_star seeds=sql.data_star
docker-compose -f ../docker-compose.yml up dbt
dbt-utils_postgres_1 is up-to-date
Starting dbt-utils_dbt_1 ... done
Attaching to dbt-utils_dbt_1
dbt_1        | Running with dbt=1.0.0-b2
dbt_1        | No profile "integration_tests" found, continuing with no target
dbt_1        | Installing ../
dbt_1        |   Installed from <local @ ../>
dbt_1        | Running with dbt=1.0.0-b2
dbt_1        | Encountered an error while reading profiles:
dbt_1        |   ERROR Runtime Error
dbt_1        |   Compilation Error
dbt_1        |     Could not render {{ env_var('POSTGRES_TEST_HOST') }}: Env var required but not provided: 'POSTGRES_TEST_HOST'
dbt_1        | Defined profiles:
dbt_1        |  - integration_tests
dbt_1        |
dbt_1        | For more information on configuring profiles, please consult the dbt docs:
dbt_1        |
dbt_1        | https://docs.getdbt.com/docs/configure-your-profile

Checklist

  • I have verified that these changes work locally on the following warehouses (Note: it's okay if you do not have access to all warehouses, this helps us understand what has been covered)
    • BigQuery
    • Postgres
    • Redshift
    • Snowflake
  • I followed guidelines to ensure that my changes will work on "non-core" adapters by:
    • dispatching any new macro(s) so non-core adapters can also use them (e.g. the star() source)
    • using the limit_zero() macro in place of the literal string: limit 0
    • using dbt_utils.type_* macros instead of explicit datatypes (e.g. dbt_utils.type_timestamp() instead of TIMESTAMP)
  • I have updated the README.md (if applicable)
  • I have added tests & descriptions to my models (and macros if applicable)
  • I have added an entry to CHANGELOG.md

Josh Wills added 3 commits March 23, 2022 15:05
…ion to provide a more useful error message if no columns are returned
…les the empty column list case- though maybe it shouldn't?

jwills commented Mar 23, 2022

So I may simply be way off base here since it seems like all of these tests are intended to work with the zero-column case, so maybe the right thing to do is just a fix to the sql generated by star.sql to not be invalid in the case that there are zero columns?

joellabes (Contributor) commented:

@jwills this is interesting!

"just a fix to the sql generated by star.sql to not be invalid in the case that there are zero columns"

I think it's a feature that macros fail early if they can't do what's asked of them, compared to a fallback like returning * which would cause unpleasant surprises for someone who doesn't notice what they're getting back.

But it's definitely better for that to be at compilation time, as opposed to getting a sql error at runtime.

Perhaps you could take inspiration from the error message that comes up when there are no columns in union_relations, introduced in #473: https://github.com/dbt-labs/dbt-utils/blob/main/macros/sql/union.sql#L64-L77?
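Roughly, that pattern (sketched from memory rather than copied from union.sql, so the names are illustrative and "relations" stands in for the macro's input list) collects the relations whose column lookup came back empty and raises a single compiler error listing them:

```sql
{#- Illustrative sketch of the union_relations-style check; not the actual union.sql code. -#}
{%- if execute -%}
    {%- set relations_without_columns = [] -%}
    {%- for relation in relations -%}
        {%- if adapter.get_columns_in_relation(relation) | length == 0 -%}
            {%- do relations_without_columns.append(relation) -%}
        {%- endif -%}
    {%- endfor -%}

    {%- if relations_without_columns | length > 0 -%}
        {%- set error_message -%}
            The following relations returned no columns: {{ relations_without_columns | join(', ') }}.
            Check that they exist and that their database/schema/identifier casing matches the warehouse.
        {%- endset -%}
        {{ exceptions.raise_compiler_error(error_message) }}
    {%- endif -%}
{%- endif -%}
```

The same shape would work for the single-relation macros (star, unpivot, nullcheck_table) by checking one relation instead of a list.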


jwills commented Apr 15, 2022

@joellabes I'm going to close this out for now; I don't think I have a good enough handle on what the expected behavior should be in the case that no columns are eligible to be included in the list that gets used in star, unpivot, nullcheck_table, and union. It may just be that this is not enough of an issue in practice to be worth fixing. 🤷

jwills closed this Apr 15, 2022
jwills deleted the jwills_err_on_empty_column_list branch April 15, 2022 15:50