Make internal macros use macro dispatch pattern #72
Conversation
In cases where these macros are line-for-line identical with their counterparts in […]. In cases where you do need to redefine these macros, e.g. […]. To that end, I think the right solution is to add […].
Could I ask you to open that as an issue/PR in […]?
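
For reference, a minimal sketch of the dispatch pattern under discussion, assuming dbt's `adapter.dispatch` mechanics; the macro bodies here are illustrative, not verbatim from dbt-spark or this PR:

```jinja
{#- entry point: adapter.dispatch resolves to the most specific
    <adapter>__file_format_clause, falling back through parent
    adapters (databricks -> spark) to default__ -#}
{% macro file_format_clause() %}
  {{ return(adapter.dispatch('file_format_clause', 'dbt')()) }}
{% endmacro %}

{#- adapter implementation picked up by dispatch on Databricks;
    the body is an illustrative assumption -#}
{% macro databricks__file_format_clause() %}
  {%- set file_format = config.get('file_format', default='delta') -%}
  {%- if file_format is not none %}
    using {{ file_format }}
  {%- endif %}
{% endmacro %}
```

With this in place, the search order on the databricks adapter is databricks__, then spark__, then default__, and macros defined in the user's own project take precedence over the package's.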
Looks good!
@@ -1,19 +1,11 @@
-{% macro dbt_databricks_file_format_clause() %}
+{% macro databricks__file_format_clause() %}
Just curious: if a user has redefined the `file_format_clause` macro in an existing project, for example
{% macro file_format_clause() %}
...
{% endmacro %}
Will this change (and the upstream change in dbt-spark) break the existing pipeline when the user upgrades the adapter version?
According to @jtcohen6's explanation in #72 (comment), the user's `file_format_clause` will be used.
- Macros in the user's own project — they can always override the builtins, and they can do it by defining a macro named any of these: `file_format_clause`, `databricks__file_format_clause`, `spark__file_format_clause`, `default__file_format_clause`
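
For illustration, a hypothetical override in the user's project; any of the four names above would be picked up, with the user-project definition winning over the adapter's builtin:

```jinja
{#- hypothetical user-project override: always emit "using delta",
    ignoring any per-model file_format config -#}
{% macro databricks__file_format_clause() %}
  using delta
{% endmacro %}
```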
-r{toxinidir}/dev_requirements.txt
-r{toxinidir}/requirements.txt
Is this needed because we need to install the latest dbt-spark version first?
Yes.
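
For context, a sketch of the deps ordering being discussed; this is a hypothetical tox.ini fragment, and the comments are assumptions about intent:

```ini
[testenv]
deps =
    # listed first so the in-development dbt-spark is pulled in before anything else
    -r{toxinidir}/dev_requirements.txt
    # then the adapter's regular runtime requirements
    -r{toxinidir}/requirements.txt
```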
Thanks! Merging.
### Description
Backport of #72. Makes internal macros use the macro dispatch pattern for backward compatibility for users who override the equivalent macros in their projects. The old-name macros are still available during 1.0.x releases, but will not be available in the 1.1.0 release. This fix will be available with `dbt-spark>=1.0.1`.
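
A minimal sketch of how the old-name macros can remain available during 1.0.x; this shim is hypothetical, and the PR's actual deprecation mechanics may differ:

```jinja
{#- hypothetical backward-compatibility shim: the pre-dispatch name
    forwards to the new dispatched entry point -#}
{% macro dbt_databricks_file_format_clause() %}
  {{ return(file_format_clause()) }}
{% endmacro %}
```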