
Split Parsed and Compiled nodes into subtypes (#1601) #1610

Merged 2 commits into dev/louisa-may-alcott on Jul 18, 2019

Conversation

@beckjake (Contributor) commented Jul 16, 2019

Fixes #1601
Fixes #1486

This includes a few test changes to account for fields that were removed based on node type.

This PR doesn't remove the raw_sql field from seeds or anything like that, as that's a bit more involved; right now dbt really assumes that all nodes have that field, and it's easier to leave it for the moment. In the future, when compilation is cleaned up a bit, it will probably make sense to remove it.
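
For context, here is a rough sketch of what splitting parsed and compiled nodes into distinct dataclass subtypes can look like. The class and field names are illustrative stand-ins, not dbt's actual definitions:

# Illustrative sketch only, not dbt's real classes or fields: the point of
# the split is that a node's type determines which fields it carries.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ParsedNode:
    """Fields shared by all parsed nodes. raw_sql stays here for now, per
    the note above about seeds still carrying it."""
    name: str
    raw_sql: str


@dataclass
class ParsedSeedNode(ParsedNode):
    """Seed-specific parsed node; seed-only fields live here rather than
    on the shared base class."""
    seed_file_path: str = ""  # hypothetical field, for illustration


@dataclass
class CompiledNode(ParsedNode):
    """Compilation-only fields move onto compiled subtypes, so purely
    parsed nodes no longer need to declare them."""
    compiled_sql: Optional[str] = None
    extra_ctes_injected: bool = False

With the types split this way, the test changes mentioned above follow naturally: fields that only make sense after compilation no longer appear on purely parsed nodes.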

@beckjake force-pushed the feature/dataclasses-jsonschema-types branch from 22a480b to 49f7cf8 on July 16, 2019 16:00
@beckjake force-pushed the feature/split-parsed-things branch from 5f7d804 to e4f37ab on July 16, 2019 16:03
A few test changes to account for removed fields based on types
@beckjake force-pushed the feature/split-parsed-things branch from e4f37ab to 12e53c7 on July 16, 2019 17:04
@beckjake changed the base branch from feature/dataclasses-jsonschema-types to dev/louisa-may-alcott on July 16, 2019 17:06
@cmcarthur (Member) left a comment

this looks good to me.

core/dbt/node_types.py (review thread resolved)

def parse_from_dict(self, parsed_dict: Dict[str, Any]) -> ParsedRPCNode:
"""Given a dictionary, return the parsed entity for this parser"""
return ParsedRPCNode.from_dict(parsed_dict)
@cmcarthur (Member) commented:

all of these duplicated parse_from_dicts actually feel like an unfortunate side effect of using type annotations. is there a way to use a "generic" here and specify which node type each Parser generates?

this case actually doesn't feel like the biggest deal. this code is simple. but i can see this being troublesome in the future when we want to inherit functionality and have to redefine the entire method to annotate the return type

@beckjake (Contributor, Author) replied:

Yeah. The real thing I want is for MacrosKnownParser (or something like it) to subclass Generic[T], where T is something ParsedNode-like, so that subclasses would look like:

class AnalysisParser(MacrosKnownParser[ParsedAnalysisNode]):
   ...

I just didn't really want to spend a lot of time messing with parsing class definitions in this PR only to rip them out or be deeply inconvenienced in the PRs for #1487 or #1600 where I imagine I'll have to make a ton of changes. typing.Generic can result in some seriously weird errors because it does some metaclass craziness.
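
For reference, here is a minimal sketch of that Generic[T] shape, with simplified stand-ins for the real dbt classes (the actual MacrosKnownParser signature is more involved):

# Sketch only: a generic parser base so parse_from_dict is written once,
# and each subclass just binds the concrete node type it produces.
from dataclasses import dataclass
from typing import Any, Dict, Generic, Type, TypeVar


@dataclass
class ParsedNode:
    raw_sql: str

    @classmethod
    def from_dict(cls, data: Dict[str, Any]) -> "ParsedNode":
        return cls(**data)


@dataclass
class ParsedAnalysisNode(ParsedNode):
    pass


T = TypeVar("T", bound=ParsedNode)


class MacrosKnownParser(Generic[T]):
    # Subclasses set node_cls to the node type they emit.
    node_cls: Type[T]

    def parse_from_dict(self, parsed_dict: Dict[str, Any]) -> T:
        """Given a dictionary, return the parsed entity for this parser."""
        return self.node_cls.from_dict(parsed_dict)


class AnalysisParser(MacrosKnownParser[ParsedAnalysisNode]):
    node_cls = ParsedAnalysisNode


# e.g. AnalysisParser().parse_from_dict({"raw_sql": "select 1"})

As the comment above notes, typing.Generic's metaclass behavior can produce confusing errors, which is part of why this refactor was deferred rather than folded into this PR.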

core/dbt/parser/schemas.py (outdated, resolved)
core/dbt/parser/snapshots.py (outdated, resolved)
@drewbanin (Contributor) left a comment

I'll defer to @cmcarthur on this one!

@beckjake force-pushed the feature/split-parsed-things branch from c00f9d0 to b9a3fe5 on July 18, 2019 15:12
@beckjake merged commit 62c3318 into dev/louisa-may-alcott on Jul 18, 2019
@beckjake deleted the feature/split-parsed-things branch on July 18, 2019 18:52