Cross-db incremental loads not working in Snowflake #1455

Closed
joshpeng opened this issue May 10, 2019 · 1 comment · Fixed by #1458

@joshpeng

Issue

Cross-db incremental loads aren't qualified correctly in Snowflake

Issue description

On Snowflake, when you override the target database of a model in dbt_project.yml like this:

```yaml
# in dbt_project.yml
extra:
  database: ldw
  schema: extra
```

A --full-refresh works great: it builds the initial table in the desired database, even when that database differs from the one in the target profile.

However, it errors out when doing an incremental load.

Results

This is the stack trace of the error:

```
2019-05-09 16:28:41,774 (Thread-1): On xspend: BEGIN
2019-05-09 16:28:41,940 (Thread-1): SQL status: SUCCESS 1 in 0.17 seconds
2019-05-09 16:28:41,940 (Thread-1): Using snowflake connection "xspend".
2019-05-09 16:28:41,940 (Thread-1): On xspend: use schema extra;
2019-05-09 16:28:42,283 (Thread-1): Snowflake error: 002043 (02000): 018c1420-0274-2253-0000-5a0100302056: SQL compilation error:
Object does not exist, or operation cannot be performed.
2019-05-09 16:28:42,283 (Thread-1): On xspend: ROLLBACK
```

Notice the `use schema extra;` command. It should have been the fully qualified `use schema ldw.extra;`.
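
For concreteness, here is a minimal sketch of the failing statement versus the database-qualified form that would resolve correctly (ldw and extra are the database and schema from the config above; this illustrates the observed behavior rather than dbt's exact generated SQL):

```sql
-- Unqualified: resolves against the connection's current database
-- (the target profile's database), where schema "extra" does not exist.
use schema extra;        -- fails: "Object does not exist"

-- Database-qualified: resolves against the model's configured database.
use schema ldw.extra;
```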

System information

dbt 0.13.0
macOS
python 3.6.7

Steps to reproduce

  1. Set the target profile's database to abc.
  2. In dbt_project.yml, set a model's database to def.
  3. Run the model as an incremental load.
  4. dbt errors out because the object reference is not fully qualified.
@drewbanin added this to the Wilt Chamberlain milestone on May 10, 2019
@drewbanin self-assigned this on May 10, 2019
@drewbanin
Contributor

Thanks @joshpeng! I am pretty sure the bug you encountered will be fixed in 0.14.0, as we'll be using a merge statement for incremental models instead of a temp table + delete + insert. The point remains that the create_table macro implementation on Snowflake isn't database-aware, and we should fix that too :)
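
For illustration, a rough sketch contrasting the two incremental strategies mentioned above, using hypothetical object and column names (my_model, my_model__dbt_tmp, id, col_a); this is not the exact SQL dbt generates:

```sql
-- 0.13-style pattern (sketch): temp table + delete + insert, which depends
-- on the connection's current database/schema context (hence the
-- unqualified "use schema extra;" in the log above).
delete from my_model
where id in (select id from my_model__dbt_tmp);
insert into my_model
select * from my_model__dbt_tmp;

-- 0.14-style pattern (sketch): a single merge against a fully qualified
-- target, so no schema switch on the connection is required.
merge into ldw.extra.my_model as target
using ldw.extra.my_model__dbt_tmp as source
    on target.id = source.id
when matched then update set target.col_a = source.col_a
when not matched then insert (id, col_a) values (source.id, source.col_a);
```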
