Add sample app to run DBT transformations via MWAA Airflow #12

Merged: 3 commits into `main` from `airflow-sample` on Jun 25, 2024

Conversation

@whummer (Contributor) commented on Jun 19, 2024

Add sample app to run DBT transformations via MWAA Airflow. Still needs a few minor fixes in the Snowflake emulator, but mostly already working e2e. 🎉 (Update 2024-06-24: The sample is now working end-to-end with the latest Snowflake image)

Based on this Snowflake quickstart app: https://quickstarts.snowflake.com/guide/data_engineering_with_apache_airflow

TODO:

  • Add proper README to describe the sample app

@whummer whummer marked this pull request as ready for review June 24, 2024 12:44
tatiana pushed a commit to astronomer/astronomer-cosmos that referenced this pull request Jun 24, 2024
Add ability to specify `host`/`port` for Snowflake connection.

At LocalStack, we have recently started building a Snowflake emulator that allows running Snowflake queries entirely on the local machine:
https://blog.localstack.cloud/2024-05-22-introducing-localstack-for-snowflake/

As part of a sample application we're building, we have an Apache
Airflow DAG that uses Cosmos (and DBT) to connect to the local Snowflake
emulator running on `localhost`. Here is a link to the sample app:
localstack-samples/localstack-snowflake-samples#12

Currently, we're hardcoding this integration in the user DAG file
itself, [see
here](https://github.com/localstack-samples/localstack-snowflake-samples/pull/12/files#diff-559d4f883ad589522b8a9d33f87fe95b0da72ac29b775e98b273a8eb3ede9924R10-R19):
```python
...
from cosmos.profiles.snowflake.user_pass import SnowflakeUserPasswordProfileMapping
...
# Route the connection's `extra.host` / `extra.port` fields into the dbt profile:
SnowflakeUserPasswordProfileMapping.airflow_param_mapping["host"] = "extra.host"
SnowflakeUserPasswordProfileMapping.airflow_param_mapping["port"] = "extra.port"
...
```
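The `extra.host` / `extra.port` keys tell the profile mapping to read those values from the Airflow connection's JSON `extra` field rather than from top-level connection attributes. A minimal sketch of that lookup in plain Python (no Airflow dependency; the connection values below are hypothetical placeholders for a local emulator, not values taken from this PR):

```python
import json

# Hypothetical Airflow-style connection pointing at a local Snowflake emulator.
connection = {
    "login": "test",
    "password": "test",
    "extra": json.dumps({"host": "localhost", "port": 4566}),
}

def resolve(conn: dict, mapping_key: str):
    """Resolve a Cosmos-style mapping key such as 'extra.host' against a
    connection dict: keys prefixed with 'extra.' are looked up inside the
    serialized JSON 'extra' blob, everything else is a top-level field."""
    if mapping_key.startswith("extra."):
        extras = json.loads(conn.get("extra", "{}"))
        return extras.get(mapping_key.split(".", 1)[1])
    return conn.get(mapping_key)

print(resolve(connection, "extra.host"))  # read from the extra JSON
print(resolve(connection, "extra.port"))
print(resolve(connection, "login"))       # read from the top-level field
```

With the mapping patched as in the snippet above, Cosmos would hand these resolved host/port values to the dbt Snowflake profile instead of the default Snowflake account endpoint.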
@whummer whummer requested a review from robertlcx June 24, 2024 13:36
@yingw787 (Contributor) left a comment:

LGTM 🚀

@whummer whummer merged commit 6e4b023 into main Jun 25, 2024
@whummer whummer deleted the airflow-sample branch June 25, 2024 21:49
arojasb3 pushed a commit to arojasb3/astronomer-cosmos that referenced this pull request Jul 14, 2024
Add ability to specify `host`/`port` for Snowflake connection. (#1063)