- About this document
- Getting the code
- Running `dbt-snowflake` in development
- Testing
- Updating Docs
- Submitting a Pull Request
This document is a guide for anyone interested in contributing to the `dbt-snowflake` repository. It outlines how to create issues and submit pull requests (PRs).

This is not intended as a guide for using `dbt-snowflake` in a project. For configuring and using this adapter, see Snowflake Profile and Snowflake Configs.
We assume users have a Linux or macOS system. You should have familiarity with:

- Python `virtualenv`s
- Python modules
- `pip`
- common command line utilities like `git`
In addition to this guide, we highly encourage you to read the contributing guide for `dbt-core`. Almost all information there is applicable here!
Please note that all contributors to `dbt-snowflake` must sign the Contributor License Agreement (CLA) before their pull request(s) can be merged into the `dbt-snowflake` codebase. Given this, `dbt-snowflake` maintainers will unfortunately be unable to merge your contribution(s) until you've signed the CLA. You are, however, welcome to open issues and comment on existing ones.
`git` is needed in order to download and modify the `dbt-snowflake` code. There are several ways to install Git. For macOS, we suggest installing Xcode or the Xcode Command Line Tools.
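On macOS, for example, the Command Line Tools (which include `git`) can be installed from the terminal:

```sh
# Install the Xcode Command Line Tools, which include git (macOS only)
xcode-select --install

# Confirm git is available
git --version
```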
If you are not a member of the `dbt-labs` GitHub organization, you can contribute to `dbt-snowflake` by forking the `dbt-snowflake` repository. For more on forking, check out the GitHub docs on forking. In short, you will need to:

- fork the `dbt-snowflake` repository
- clone your fork locally
- check out a new branch for your proposed changes
- push changes to your fork
- open a pull request of your forked repository against `dbt-labs/dbt-snowflake`
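A minimal sketch of that workflow from the command line, assuming you have already created the fork on GitHub and `<your-username>` stands in for your GitHub username:

```sh
# Clone your fork locally (replace <your-username> with your GitHub username)
git clone https://github.com/<your-username>/dbt-snowflake.git
cd dbt-snowflake

# Check out a new branch for your proposed changes
git checkout -b my-feature-branch

# ...edit, then commit and push your changes to your fork
git add .
git commit -m "Describe your change"
git push origin my-feature-branch

# Finally, open a pull request against dbt-labs/dbt-snowflake in the GitHub UI
```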
If you are a member of the dbt Labs GitHub organization, you will have push access to the `dbt-snowflake` repo. Rather than forking `dbt-snowflake` to make your changes, clone the repository like normal, and check out feature branches.
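For organization members, the equivalent sketch without a fork (the branch name is just an example):

```sh
# Clone the repository directly and work on a feature branch
git clone https://github.com/dbt-labs/dbt-snowflake.git
cd dbt-snowflake
git checkout -b my-feature-branch
```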
- Ensure you have the latest version of `pip` installed by running `pip install --upgrade pip` in terminal.
- Configure and activate a `virtualenv` as described in Setting up an environment.
- Install `dbt-core` in the active `virtualenv`. To confirm you installed dbt correctly, run `dbt --version` and `which dbt`.
- Install `dbt-snowflake` and development dependencies in the active `virtualenv`. Run `pip install -e . -r dev-requirements.txt`.
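Taken together, a minimal sketch of those steps might look like the following (the virtualenv directory name `env` is just an example):

```sh
# Upgrade pip
pip install --upgrade pip

# Create and activate a virtualenv (directory name is an example)
python -m venv env
source env/bin/activate

# Install dbt-core, then confirm dbt is installed correctly
pip install dbt-core
dbt --version
which dbt

# From the root of your dbt-snowflake checkout, install the adapter in
# editable mode along with the development dependencies
pip install -e . -r dev-requirements.txt
```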
When `dbt-snowflake` is installed this way, any changes you make to the `dbt-snowflake` source code will be reflected immediately (i.e. in your next local dbt invocation against a Snowflake target).
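One way to sanity-check the editable install, assuming `pip` resolves to the one in your active virtualenv:

```sh
# Inspect the installed dbt-snowflake package; an editable ("pip install -e")
# install points back at your local checkout rather than site-packages
pip show dbt-snowflake
```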
`dbt-snowflake` contains unit and integration tests. Integration tests require an actual Snowflake warehouse to test against. There are two primary ways to do this:
- This repo has CI/CD GitHub Actions set up. Both unit and integration tests will run against an already configured Snowflake warehouse during PR checks.
- You can also run integration tests "locally" by configuring a `test.env` file with appropriate `ENV` variables.
```sh
cp test.env.example test.env
$EDITOR test.env
```
WARNING: The parameters in your `test.env` file must link to a valid Snowflake account. The `test.env` file you create is git-ignored, but please be extra careful to never check in credentials or other sensitive information when developing.
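For illustration only, a filled-in `test.env` might look something like the sketch below; the variable names here are assumptions, so use whatever names `test.env.example` actually lists, and never commit real values:

```sh
# Illustrative placeholders only -- copy the real variable names from
# test.env.example and never check in actual credentials
SNOWFLAKE_TEST_ACCOUNT=<your_account_identifier>
SNOWFLAKE_TEST_USER=<your_test_user>
SNOWFLAKE_TEST_PASSWORD=<your_test_password>
SNOWFLAKE_TEST_DATABASE=<your_test_database>
SNOWFLAKE_TEST_WAREHOUSE=<your_test_warehouse>
```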
There are a few methods for running tests locally.
`tox` automatically runs unit tests against several Python versions using its own virtualenvs. Run `tox -p` to run unit tests for Python 3.7, Python 3.8, Python 3.9, Python 3.10, and `flake8` in parallel. Run `tox -e py37` to invoke tests on Python version 3.7 only (use py37, py38, py39, or py310). Tox recipes are found in `tox.ini`.
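For example (assuming the relevant Python interpreters are installed locally):

```sh
# Run unit tests for all configured Python versions plus flake8, in parallel
tox -p

# Run unit tests against a single Python version, e.g. 3.8
tox -e py38
```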
You may run a specific test or group of tests using `pytest` directly. Activate a Python virtualenv with dev dependencies installed. Then, run tests like so:
```sh
# Note: replace $strings with valid names

# run all snowflake integration tests in a directory
python -m pytest -m profile_snowflake tests/integration/$test_directory

# run all snowflake integration tests in a module
python -m pytest -m profile_snowflake tests/integration/$test_dir_and_filename.py

# run all snowflake integration tests in a class
python -m pytest -m profile_snowflake tests/integration/$test_dir_and_filename.py::$test_class_name

# run a specific snowflake integration test
python -m pytest -m profile_snowflake tests/integration/$test_dir_and_filename.py::$test_class_name::$test__method_name

# run all unit tests in a module
python -m pytest tests/unit/$test_file_name.py

# run a specific unit test
python -m pytest tests/unit/$test_file_name.py::$test_class_name::$test_method_name
```
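As a concrete, purely hypothetical example of filling in those placeholders (the directory, file, class, and method names below are made up for illustration; substitute real ones from the `tests/` directory in this repo):

```sh
# Hypothetical names -- replace with real paths and test names from tests/
python -m pytest -m profile_snowflake tests/integration/simple_copy_test
python -m pytest tests/unit/test_adapter.py::TestAdapter::test_quoting
```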
Many changes will require an update to `dbt-snowflake` documentation. Here are some relevant links.
- Docs are here.
- The docs repo for making changes is located here.
- The changes made are likely to impact one or both of Snowflake Profile and Snowflake Configs.
- We ask every community member who makes a user-facing change to open an issue or PR regarding doc changes.
A `dbt-snowflake` maintainer will review your PR and will determine if it has passed regression tests. They may suggest code revisions for style and clarity, or they may request that you add unit or integration tests. These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code.

Once all tests are passing and your PR has been approved, a `dbt-snowflake` maintainer will merge your changes into the active development branch. And that's it! Happy developing 🎉