
Implement relations api #727

Merged: 74 commits into development from implement-relations-api, Apr 26, 2018
Conversation

cmcarthur (Member):

WIP

def drop(cls, profile, schema, relation, relation_type, model_name=None):
def drop_relation(cls, profile, relation, model_name=None):
    """
    In Redshift, DROP TABLE cannot be used inside a transaction.
Contributor:

nbd, but drop table... can be used inside of a transaction... it cannot be used inside a transaction if executed against an external table (i.e. Spectrum).

This drop lock exists to prevent two drop...cascades from running at the same time.

If view_x depends on tables model_a and model_b, and the following two statements run concurrently, Redshift will fire a "table was dropped by a concurrent transaction" error:

-- via model_a.sql
drop table model_a cascade;

-- via model_b.sql
drop table model_b cascade;

Presumably because dropping model_a caused view_x to be dropped while model_b was trying to drop it.
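The serialization described above can be sketched as a process-wide lock around cascaded drops. This is a hypothetical illustration (names like `DROP_LOCK` and `drop_relation` here are assumptions, not dbt's actual implementation):

```python
import threading

# A single process-wide lock serializes `drop ... cascade` statements so
# that two models cannot concurrently drop a shared downstream view.
DROP_LOCK = threading.Lock()

def drop_relation(execute_sql, relation_name):
    """Serialize cascaded drops to avoid Redshift's
    'table was dropped by a concurrent transaction' error."""
    with DROP_LOCK:
        execute_sql('drop table if exists {} cascade'.format(relation_name))
```

With the lock held, dropping model_a (and with it view_x) completes before model_b's drop begins, so the second drop never races on the shared view.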

cmcarthur (Member, Author):

yes, good memory. i knew this wasn't quite right as i was writing it, i'll fix it. thanks

logging.getLogger('urllib3').setLevel(logging.INFO)
logging.getLogger('google').setLevel(logging.INFO)
logging.getLogger('snowflake.connector').setLevel(logging.INFO)
logging.getLogger('parsedatetime').setLevel(logging.INFO)
cmcarthur (Member, Author):
@drewbanin i'm confused as to how these got in here...?

Contributor:

i tried to quiet down the verbosity of logs in the logs/dbt.log file. Tristan saw debug logs in his console -- I wonder if that's related?
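For context on how the snippet above quiets things down: calling `setLevel(logging.INFO)` on a named logger filters its DEBUG records at the logger itself, before any handler (console or logs/dbt.log) sees them. A minimal sketch:

```python
import logging

# Raising the level on chatty third-party loggers drops their DEBUG
# records at the logger, regardless of handler configuration.
for name in ('urllib3', 'google', 'snowflake.connector', 'parsedatetime'):
    logging.getLogger(name).setLevel(logging.INFO)

# DEBUG is now filtered for these loggers; INFO still passes through.
urllib3_logger = logging.getLogger('urllib3')
```

Because the filtering happens at the logger level, a DEBUG-level console handler elsewhere would not resurface these records; stray console debug output would have to come from other loggers.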

drewbanin (Contributor) left a comment:

@cmcarthur there's a lot in here, but everything looked reasonable to me. I think we can go ahead and merge this, then stress test it like crazy in development. Sounds like some folks in the community are willing to help us test too :)

@echo "Changed test files:"
@echo "${changed_tests}"
@docker-compose run test /usr/src/app/test/runner.sh ${changed_tests}

Contributor:

this is a good change :)

def approximate_relation_match(target, relation):
    raise_compiler_error(
        'When searching for a relation, dbt found an approximate match. '
        'Instead of guessing \nwhich relation to use, dbt will move on. '
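One plausible reading of "approximate match" is a relation that matches the target up to case (or quoting) differences but not exactly. A hedged, self-contained sketch of that behavior (the `find_relation` helper and `RuntimeError` stand-in are assumptions; dbt raises its own compiler error):

```python
def find_relation(target, relations):
    """Return an exact match; refuse to guess on a merely
    approximate (case-insensitive) match."""
    exact = [r for r in relations if r == target]
    if exact:
        return exact[0]
    approximate = [r for r in relations if r.lower() == target.lower()]
    if approximate:
        raise RuntimeError(
            'When searching for a relation, dbt found an approximate match. '
            'Instead of guessing which relation to use, dbt will move on.')
    return None
```

Failing loudly here is the safer design: silently picking a near-match could drop or rebuild the wrong database object.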
Contributor:

lol good change

@drewbanin drewbanin merged commit 5344f54 into development Apr 26, 2018
@drewbanin drewbanin deleted the implement-relations-api branch April 26, 2018 21:38
jon-rtr added a commit to jon-rtr/dbt that referenced this pull request May 12, 2018
…o model-aliasing

$ git merge kickstarter/feature/model-aliasing
CONFLICT (content): Merge conflict in dbt/utils.py
CONFLICT (modify/delete): dbt/include/global_project/macros/materializations/table.sql deleted in HEAD and modified in kickstarter/feature/model-aliasing. Version kickstarter/feature/model-aliasing of dbt/include/global_project/macros/materializations/table.sql left in tree.
CONFLICT (modify/delete): dbt/include/global_project/macros/materializations/bigquery.sql deleted in HEAD and modified in kickstarter/feature/model-aliasing. Version kickstarter/feature/model-aliasing of dbt/include/global_project/macros/materializations/bigquery.sql left in tree.

1. dbt/utils.py

   Some major changes are being introduced in 0.10.1: Implement relations api (dbt-labs#727)
   dbt-labs#727
   dbt-labs@5344f54#diff-196bbfafed32edaf1554550f65111f87

   The Relation class was extracted into ...
   ./dbt/api/object.py                   class APIObject(dict)
   ./dbt/adapters/default/relation.py    class DefaultRelation(APIObject)
   ./dbt/adapters/bigquery/relation.py   class BigQueryRelation(DefaultRelation)
   ./dbt/adapters/snowflake/relation.py  class SnowflakeRelation(DefaultRelation)

   Changing node.get('name') to node.get('alias') ...
   ./dbt/adapters/default/relation.py
   ./dbt/adapters/bigquery/relation.py
   ./dbt/adapters/snowflake/relation.py
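   The extracted hierarchy listed above can be sketched in miniature as follows; the `render()` body is an assumption for illustration, not dbt's actual code:

```python
class APIObject(dict):
    """dict-backed object; in dbt it also validates against a schema."""

class DefaultRelation(APIObject):
    def render(self):
        # assumed: render as schema.identifier (quoting rules elided)
        return '{}.{}'.format(self['schema'], self['identifier'])

class BigQueryRelation(DefaultRelation):
    pass

class SnowflakeRelation(DefaultRelation):
    pass
```

   Since `APIObject` subclasses `dict`, every relation is also a plain mapping, which is why the adapter files above can swap `node.get('name')` for `node.get('alias')` without structural changes.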

2. dbt/include/global_project/macros/materializations/table.sql

   This was renamed to ...
   ./dbt/include/global_project/macros/materializations/table/table.sql

3. dbt/include/global_project/macros/materializations/bigquery.sql

   This was split into ...
   ./dbt/include/global_project/macros/materializations/table/bigquery_table.sql
   and ...
   ./dbt/include/global_project/macros/materializations/view/bigquery_view.sql

4. other instances of model['name']

   The following files also mention model['name'] and probably need to change as well ...

   ./dbt/include/global_project/macros/materializations/archive/archive.sql
   ./dbt/include/global_project/macros/materializations/seed/bigquery.sql
   ./dbt/include/global_project/macros/materializations/seed/seed.sql

   Added commentary to ...

   ./dbt/exceptions.py

5. further changes

   Revert model.get('alias') to model.get('name') ...
   print_test_result_line in ./dbt/ui/printer.py (since in this context schema is NOT being used)

   Change model.get('name') to model.get('alias') ...
   print_seed_result_line in ./dbt/ui/printer.py (since in this context schema is also being used)

   Change node.get('name') to node.get('alias') ...
   _node_context in ./dbt/node_runners.py (since in this context schema is also being used)
   call_get_missing_columns in ./dbt/node_runners.py (since in this context schema is also being used)
   call_already_exists in ./dbt/node_runners.py (since in this context schema is also being used)
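   The rule applied in the three cases above boils down to: wherever a schema-qualified database object is built, use the alias (which falls back to the model name); display-only contexts keep the name. A sketch (the `relation_name` helper is hypothetical):

```python
def relation_name(node):
    # schema-qualified contexts use the alias, defaulting to the
    # model name when no alias is configured
    return node.get('alias') or node.get('name')

node = {'name': 'my_model', 'alias': 'my_model_v2', 'schema': 'analytics'}
qualified = '{}.{}'.format(node['schema'], relation_name(node))
```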

6. linting

   import lines must be kept under 80 characters, using parenthesized multi-line imports where needed
   https://www.python.org/dev/peps/pep-0328/
@jon-rtr jon-rtr mentioned this pull request May 12, 2018
iknox-fa pushed a commit that referenced this pull request Feb 8, 2022
automatic commit by git-black, original commits:
  5344f54
iknox-fa pushed a commit that referenced this pull request Feb 8, 2022
automatic commit by git-black, original commits:
  5344f54
  a37374d
@jtcohen6 jtcohen6 mentioned this pull request Feb 3, 2023
6 tasks