# Update pip syntax (dbt-labs#4498)
## What are you changing in this pull request and why?

Updating the `pip install` instructions per issue dbt-labs#3468.
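
The pattern applied throughout is sketched below (`dbt-core` stands in for any package name). Invoking pip with `python -m` ensures packages are installed into the environment of the interpreter that's actually active, rather than whichever `pip` shim happens to be first on `PATH`:

```shell
# Before: uses whichever `pip` executable is first on PATH
pip install dbt-core

# After: runs pip as a module of the active Python interpreter, so
# packages land in that interpreter's environment
python -m pip install dbt-core
```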

## Checklist

- [x] Review the [Content style
guide](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/content-style-guide.md)
so my content adheres to these guidelines.
- [x] For [docs
versioning](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#about-versioning),
review how to [version a whole
page](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#adding-a-new-version)
and [version a block of
content](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#versioning-blocks-of-content).
- [x] Add a checklist item for anything that needs to happen before this
PR is merged, such as "needs technical review" or "change base branch."
matthewshaver committed Nov 20, 2023
2 parents cd8e284 + 70c1fc2 commit fd66117
Showing 25 changed files with 48 additions and 48 deletions.
2 changes: 1 addition & 1 deletion website/blog/2021-11-29-open-source-community-growth.md
@@ -57,7 +57,7 @@ For starters, I want to know how much conversation is occurring across the vario

There are a ton of metrics that can be tracked in any GitHub project — committers, pull requests, forks, releases — but I started pretty simple. For each of the projects we participate in, I just want to know how the number of GitHub stars grows over time, and whether the growth is accelerating or flattening out. This has become a key performance indicator for open source communities, for better or for worse, and keeping track of it isn't optional.

-Finally, I want to know how much Marquez and OpenLineage are being used. It used to be that when you wanted to consume a bit of tech, you'd download a file. Folks like me who study user behavior would track download counts as if they were stock prices. This is no longer the case; today, our tech is increasingly distributed through package managers and image repositories. Docker Hub and PyPI metrics have therefore become good indicators of consumption. Docker image pulls and runs of `pip install` are the modern day download and, as noisy as these metrics are, they indicate a similar level of user commitment.
+Finally, I want to know how much Marquez and OpenLineage are being used. It used to be that when you wanted to consume a bit of tech, you'd download a file. Folks like me who study user behavior would track download counts as if they were stock prices. This is no longer the case; today, our tech is increasingly distributed through package managers and image repositories. Docker Hub and PyPI metrics have therefore become good indicators of consumption. Docker image pulls and runs of `python -m pip install` are the modern day download and, as noisy as these metrics are, they indicate a similar level of user commitment.

To summarize, here are the metrics I decided to track (for now, anyway):
- Slack messages (by user/ by community)
4 changes: 2 additions & 2 deletions website/blog/2022-04-14-add-ci-cd-to-bitbucket.md
@@ -159,7 +159,7 @@ pipelines:
artifacts: # Save the dbt run artifacts for the next step (upload)
- target/*.json
script:
-- pip install -r requirements.txt
+- python -m pip install -r requirements.txt
- mkdir ~/.dbt
- cp .ci/profiles.yml ~/.dbt/profiles.yml
- dbt deps
@@ -208,7 +208,7 @@ pipelines:
# Set up dbt environment + dbt packages. Rather than passing
# profiles.yml to dbt commands explicitly, we'll store it where dbt
# expects it:
-- pip install -r requirements.txt
+- python -m pip install -r requirements.txt
- mkdir ~/.dbt
- cp .ci/profiles.yml ~/.dbt/profiles.yml
- dbt deps
@@ -59,7 +59,7 @@ You probably agree that the latter example is definitely more elegant and easier

In addition to CLI commands that interact with a single dbt Cloud API endpoint, there are composite helper commands that call one or more API endpoints and perform more complex operations. One example of composite commands is the `dbt-cloud job export` and `dbt-cloud job import` pair where, under the hood, the export command performs a `dbt-cloud job get` and writes the job metadata to a <Term id="json" /> file, and the import command reads job parameters from a JSON file and calls `dbt-cloud job create`. The export and import commands can be used in tandem to move dbt Cloud jobs between projects. Another example is `dbt-cloud job delete-all`, which fetches a list of all jobs using `dbt-cloud job list` and then iterates over the list, prompting the user if they want to delete the job. For each job that the user agrees to delete, a `dbt-cloud job delete` is performed.
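
As an aside, a round trip with the export/import pair might look like the following sketch; the flag names here are assumptions for illustration rather than confirmed dbt-cloud-cli signatures:

```shell
# Hypothetical round trip: export a job's metadata to JSON, then
# recreate it in another project (flag names assumed, not verified)
dbt-cloud job export --job-id 43167 --file job_43167.json
dbt-cloud job import --file job_43167.json
```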

-To install the CLI in your Python environment run `pip install dbt-cloud-cli` and you’re all set. You can use it locally in your development environment or e.g. in a GitHub actions workflow.
+To install the CLI in your Python environment run `python -m pip install dbt-cloud-cli` and you’re all set. You can use it locally in your development environment or e.g. in a GitHub actions workflow.

## How the project came to be

@@ -310,7 +310,7 @@ The `CatalogExploreCommand.execute` method implements the interactive exploratio
I’ve included the app in the latest version of dbt-cloud-cli so you can test it out yourself! To use the app you need to install dbt-cloud-cli with extra dependencies:

```bash
-pip install dbt-cloud-cli[demo]
+python -m pip install dbt-cloud-cli[demo]
```

Now you can run the app:
@@ -79,12 +79,12 @@ Depending on which database you’ve chosen, install the relevant database adapte

```text
# install adapter for duckdb
-pip install dbt-duckdb
+python -m pip install dbt-duckdb
# OR
# install adapter for postgresql
-pip install dbt-postgres
+python -m pip install dbt-postgres
```

### Step 4: Setup dbt profile
@@ -23,8 +23,8 @@ We'll use pip to install MetricFlow and our dbt adapter:
python -m venv [virtual environment name]
source [virtual environment name]/bin/activate
# install dbt and MetricFlow
-pip install "dbt-metricflow[adapter name]"
-# e.g. pip install "dbt-metricflow[snowflake]"
+python -m pip install "dbt-metricflow[adapter name]"
+# e.g. python -m pip install "dbt-metricflow[snowflake]"
```

Lastly, to get to the pre-Semantic Layer starting state, check out the `start-here` branch.
4 changes: 2 additions & 2 deletions website/docs/docs/build/metricflow-commands.md
@@ -17,7 +17,7 @@ MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11.

MetricFlow is a dbt package that allows you to define and query metrics in your dbt project. You can use MetricFlow to query metrics in your dbt project in the dbt Cloud CLI, dbt Cloud IDE, or dbt Core.

-**Note** &mdash; MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs.
+**Note** &mdash; MetricFlow commands aren't supported in dbt Cloud jobs yet. However, you can add MetricFlow validations with your git provider (such as GitHub Actions) by installing MetricFlow (`python -m pip install metricflow`). This allows you to run MetricFlow commands as part of your continuous integration checks on PRs.
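
A minimal sketch of such a check, assuming `mf validate-configs` as the validation entry point (the exact command can vary by MetricFlow version):

```shell
# CI validation sketch per the note above; the `mf validate-configs`
# command name is an assumption and may differ between versions
python -m pip install metricflow
mf validate-configs
```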

<Tabs>

@@ -54,7 +54,7 @@ You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-star

1. Create or activate your virtual environment `python -m venv venv`
2. Run `pip install dbt-metricflow`
-* You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"`
+* You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `python -m pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `python -m pip install "dbt-metricflow[snowflake]"`

**Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow.

6 changes: 3 additions & 3 deletions website/docs/docs/cloud/cloud-cli-installation.md
@@ -155,9 +155,9 @@ If you already have dbt Core installed, the dbt Cloud CLI may conflict. Here are
- Uninstall the dbt Cloud CLI using the command: `pip uninstall dbt`
- Reinstall dbt Core using the following command, replacing "adapter_name" with the appropriate adapter name:
```shell
-pip install dbt-adapter_name --force-reinstall
+python -m pip install dbt-adapter_name --force-reinstall
```
-For example, if I used Snowflake as an adapter, I would run: `pip install dbt-snowflake --force-reinstall`
+For example, if I used Snowflake as an adapter, I would run: `python -m pip install dbt-snowflake --force-reinstall`
--------
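
Putting those steps together, the full recovery sequence (with Snowflake as the example adapter) is:

```shell
# Uninstall the dbt Cloud CLI, then force-reinstall dbt Core with
# the adapter you use (Snowflake shown as the example)
pip uninstall dbt
python -m pip install dbt-snowflake --force-reinstall
```
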
@@ -243,7 +243,7 @@ To update, follow the same process explained in [Windows](/docs/cloud/cloud-cli-
To update:
- Make sure you're in your virtual environment
-- Run `pip install --upgrade dbt`.
+- Run `python -m pip install --upgrade dbt`.
</TabItem>
2 changes: 1 addition & 1 deletion website/docs/docs/connect-adapters.md
@@ -15,7 +15,7 @@ Explore the fastest and most reliable way to deploy dbt using dbt Cloud, a hoste

Install dbt Core, an open-source tool, locally using the command line. dbt communicates with a number of different data platforms by using a dedicated adapter plugin for each. When you install dbt Core, you'll also need to install the specific adapter for your database, [connect to dbt Core](/docs/core/about-core-setup), and set up a `profiles.yml` file.

-With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `pip install adapter-name`. For example to install Snowflake, use the command `pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation).
+With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `python -m pip install adapter-name`. For example to install Snowflake, use the command `python -m pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation).

[^1]: Here are the two different adapters. Use the PyPI package name when installing with `pip`

2 changes: 1 addition & 1 deletion website/docs/docs/core/connect-data-platform/hive-setup.md
@@ -131,7 +131,7 @@ you must install the `dbt-hive` plugin.
The following commands will install the latest version of `dbt-hive` as well as the requisite version of `dbt-core` and `impyla` driver used for connections.

```
-pip install dbt-hive
+python -m pip install dbt-hive
```
### Supported Functionality
6 changes: 3 additions & 3 deletions website/docs/docs/core/connect-data-platform/spark-setup.md
@@ -35,15 +35,15 @@ If connecting to a Spark cluster via the generic thrift or http methods, it requ

```zsh
# odbc connections
-$ pip install "dbt-spark[ODBC]"
+$ python -m pip install "dbt-spark[ODBC]"

# thrift or http connections
-$ pip install "dbt-spark[PyHive]"
+$ python -m pip install "dbt-spark[PyHive]"
```

```zsh
# session connections
-$ pip install "dbt-spark[session]"
+$ python -m pip install "dbt-spark[session]"
```

<h2> Configuring {frontMatter.meta.pypi_package} </h2>
@@ -34,7 +34,7 @@ meta:

pip is the easiest way to install the adapter:

-<code>pip install {frontMatter.meta.pypi_package}</code>
+<code>python -m pip install {frontMatter.meta.pypi_package}</code>

<p>Installing <code>{frontMatter.meta.pypi_package}</code> will also install <code>dbt-core</code> and any other dependencies.</p>

@@ -255,7 +255,7 @@ The only authentication parameter to set for OAuth 2.0 is `method: oauth`. If yo

For more information, refer to both [OAuth 2.0 authentication](https://trino.io/docs/current/security/oauth2.html) in the Trino docs and the [README](https://github.com/trinodb/trino-python-client#oauth2-authentication) for the Trino Python client.

-It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.
+It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `python -m pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.

#### Example profiles.yml for OAuth

@@ -33,7 +33,7 @@ pagination_next: null

pip is the easiest way to install the adapter:

-<code>pip install {frontMatter.meta.pypi_package}</code>
+<code>python -m pip install {frontMatter.meta.pypi_package}</code>

<p>Installing <code>{frontMatter.meta.pypi_package}</code> will also install <code>dbt-core</code> and any other dependencies.</p>

2 changes: 1 addition & 1 deletion website/docs/docs/core/docker-install.md
@@ -5,7 +5,7 @@ description: "You can use Docker to install dbt and adapter plugins from the com

dbt Core and all adapter plugins maintained by dbt Labs are available as [Docker](https://docs.docker.com/) images, and distributed via [GitHub Packages](https://docs.github.com/en/packages/learn-github-packages/introduction-to-github-packages) in a [public registry](https://github.com/dbt-labs/dbt-core/pkgs/container/dbt-core).

-Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `pip install dbt-core dbt-<adapter>` takes longer to run, and will always install the latest compatible versions of every dependency.
+Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `python -m pip install dbt-core dbt-<adapter>` takes longer to run, and will always install the latest compatible versions of every dependency.

You might also be able to use Docker to install and develop locally if you don't have a Python environment set up. Note that running dbt in this manner can be significantly slower if your operating system differs from the system that built the Docker image. If you're a frequent local developer, we recommend that you install dbt Core via [Homebrew](/docs/core/homebrew-install) or [pip](/docs/core/pip-install) instead.

12 changes: 6 additions & 6 deletions website/docs/docs/core/pip-install.md
@@ -39,7 +39,7 @@ alias env_dbt='source <PATH_TO_VIRTUAL_ENV_CONFIG>/bin/activate'
Once you know [which adapter](/docs/supported-data-platforms) you're using, you can install it as `dbt-<adapter>`. For example, if using Postgres:

```shell
-pip install dbt-postgres
+python -m pip install dbt-postgres
```

This will install `dbt-core` and `dbt-postgres` _only_:
@@ -62,15 +62,15 @@ All adapters build on top of `dbt-core`. Some also depend on other adapters: for
To upgrade a specific adapter plugin:

```shell
-pip install --upgrade dbt-<adapter>
+python -m pip install --upgrade dbt-<adapter>
```

### Install dbt-core only

If you're building a tool that integrates with dbt Core, you may want to install the core library alone, without a database adapter. Note that you won't be able to use dbt as a CLI tool.

```shell
-pip install dbt-core
+python -m pip install dbt-core
```
### Change dbt Core versions

@@ -79,13 +79,13 @@ You can upgrade or downgrade versions of dbt Core by using the `--upgrade` optio
To upgrade dbt to the latest version:

```
-pip install --upgrade dbt-core
+python -m pip install --upgrade dbt-core
```

To downgrade to an older version, specify the version you want to use. This command can be useful when you're resolving package dependencies. As an example:

```
-pip install --upgrade dbt-core==0.19.0
+python -m pip install --upgrade dbt-core==0.19.0
```
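
If you find yourself pinning versions like this often, a requirements file keeps the pin alongside the project; a sketch with illustrative version numbers:

```shell
# Pin dbt packages in a requirements file (versions are illustrative)
printf 'dbt-core==1.3.0\ndbt-postgres==1.3.0\n' > requirements.txt
python -m pip install -r requirements.txt
```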

### `pip install dbt`
@@ -95,7 +95,7 @@ Note that, as of v1.0.0, `pip install dbt` is no longer supported and will raise
If you have workflows or integrations that relied on installing the package named `dbt`, you can achieve the same behavior going forward by installing the same five packages that it used:

```shell
-pip install \
+python -m pip install \
dbt-core \
dbt-postgres \
dbt-redshift \
8 changes: 4 additions & 4 deletions website/docs/docs/core/source-install.md
@@ -17,10 +17,10 @@ To install `dbt-core` from the GitHub code source:
```shell
git clone https://github.com/dbt-labs/dbt-core.git
cd dbt-core
-pip install -r requirements.txt
+python -m pip install -r requirements.txt
```

-This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `pip install -e editable-requirements.txt` instead.
+This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `python -m pip install -e editable-requirements.txt` instead.

### Installing adapter plugins

@@ -29,12 +29,12 @@ To install an adapter plugin from source, you will need to first locate its sour
```shell
git clone https://github.com/dbt-labs/dbt-redshift.git
cd dbt-redshift
-pip install .
+python -m pip install .
```

You do _not_ need to install `dbt-core` before installing an adapter plugin -- the plugin includes `dbt-core` among its dependencies, and it will install the latest compatible version automatically.

-To install in editable mode, such as while contributing, use `pip install -e .` instead.
+To install in editable mode, such as while contributing, use `python -m pip install -e .` instead.

<FAQ path="Core/install-pip-os-prereqs" />
<FAQ path="Core/install-python-compatibility" />
@@ -45,7 +45,7 @@ Global project macros have been reorganized, and some old unused macros have bee
### Installation

- [Installation docs](/docs/supported-data-platforms) reflects adapter-specific installations
-- `pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `pip install dbt-<adapter>`.
+- `python -m pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `python -m pip install dbt-<adapter>`.
- `brew install dbt` is no longer supported. Install the specific adapter plugin you need (among Postgres, Redshift, Snowflake, or BigQuery) as `brew install dbt-<adapter>`.
- Removed official support for python 3.6, which is reaching end of life on December 23, 2021

2 changes: 1 addition & 1 deletion website/docs/faqs/Core/install-pip-best-practices.md
@@ -30,6 +30,6 @@ Before installing dbt, make sure you have the latest versions:

```shell

-pip install --upgrade pip wheel setuptools
+python -m pip install --upgrade pip wheel setuptools

```
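
A common preamble to that upgrade is a fresh virtual environment, as recommended elsewhere in these docs; a sketch with an illustrative environment name:

```shell
# Create and activate a fresh virtual environment (name illustrative),
# then bring the packaging toolchain up to date inside it
python -m venv dbt-env
source dbt-env/bin/activate
python -m pip install --upgrade pip wheel setuptools
```
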
2 changes: 1 addition & 1 deletion website/docs/guides/adapter-creation.md
@@ -799,7 +799,7 @@ dbt-tests-adapter
</File>

```sh
-pip install -r dev_requirements.txt
+python -m pip install -r dev_requirements.txt
```

### Set up and configure pytest
2 changes: 1 addition & 1 deletion website/docs/guides/codespace-qs.md
@@ -61,7 +61,7 @@ If you'd like to work with a larger selection of Jaffle Shop data, you can gener
1. Install the Python package called [jafgen](https://pypi.org/project/jafgen/). At the terminal's prompt, run:

```shell
-/workspaces/test (main) $ pip install jafgen
+/workspaces/test (main) $ python -m pip install jafgen
```

1. When installation is done, run:
6 changes: 3 additions & 3 deletions website/docs/guides/custom-cicd-pipelines.md
@@ -336,7 +336,7 @@ lint-project:
rules:
- if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main'
script:
-- pip install sqlfluff==0.13.1
+- python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
# this job calls the dbt Cloud API to run a job
@@ -379,7 +379,7 @@ steps:
displayName: 'Use Python 3.7'
- script: |
-pip install requests
+python -m pip install requests
displayName: 'Install python dependencies'
- script: |
@@ -434,7 +434,7 @@ pipelines:
- step:
name: Lint dbt project
script:
-- pip install sqlfluff==0.13.1
+- python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
'main': # override if your default branch doesn't run on a branch named "main"
6 changes: 3 additions & 3 deletions website/docs/guides/set-up-ci.md
@@ -167,7 +167,7 @@ jobs:
with:
python-version: "3.9"
- name: Install SQLFluff
-run: "pip install sqlfluff"
+run: "python -m pip install sqlfluff"
- name: Lint project
run: "sqlfluff lint models --dialect snowflake"

@@ -204,7 +204,7 @@ lint-project:
rules:
- if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main'
script:
-- pip install sqlfluff
+- python -m pip install sqlfluff
- sqlfluff lint models --dialect snowflake
```
@@ -235,7 +235,7 @@ pipelines:
- step:
name: Lint dbt project
script:
-- pip install sqlfluff==0.13.1
+- python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
'main': # override if your default branch doesn't run on a branch named "main"
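
To reproduce that lint step locally before pushing, the same two commands work outside CI:

```shell
# Mirror the CI lint locally; the version and rule list match the
# pipeline configs above
python -m pip install sqlfluff==0.13.1
sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
```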
