Add makefile with common commands #23

Draft · wants to merge 3 commits into base: main
42 changes: 8 additions & 34 deletions {{ cookiecutter.repo_name }}/.github/workflows/ci.yml
@@ -20,17 +20,11 @@ jobs:
with:
python-version: "3.11"

- name: Cache pre-commit
uses: actions/cache@v3
with:
path: ~/.cache/pre-commit
key: pre-commit-3|${{ env.pythonLocation }}|${{ hashFiles('.pre-commit-config.yaml') }}

- name: Install pre-commit
run: pip3 install pre-commit
- name: Install venv and pre-commit
run: make
Comment on lines 22 to +24

Collaborator: I guess we shouldn't be so brutal :D We want to use the cached version to make this step much quicker and avoid setting up the venv from scratch.

Collaborator Author: The problem is that we can't run make targets without creating a virtual environment and sourcing it first. Because of this, I'm not sure it's worth continuing to work on the Makefile. Since we can't use it everywhere, it would require maintaining code in two places: both the Makefile and ci.yml.
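One possible middle ground, sketched here with illustrative step names and cache key (not something this PR implements): keep `make` as the single entry point and cache the `venv` directory itself, keyed on the requirements file, so the default target is cheap on cache hits. (Restored file mtimes can still trigger a rebuild after a fresh checkout, so this is a sketch rather than a guaranteed win.)

```yaml
- name: Cache venv
  uses: actions/cache@v3
  with:
    path: venv
    key: venv-${{ runner.os }}-${{ env.pythonLocation }}-${{ hashFiles('requirements-dev.txt') }}

- name: Install venv and pre-commit
  run: make   # near no-op when the cached venv is restored and up to date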


- name: Run pre-commit checks
run: pre-commit run --all-files --show-diff-on-failure --color always
run: make lint

- name: Run Trivy vulnerability scanner
uses: aquasecurity/trivy-action@master
@@ -57,16 +51,11 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
comment_tag: trivy

- name: Create venv
run: . ./setup_dev_env.sh

- name: Check licenses
run: ./check_licenses.sh
run: make lic-check

- name: Generate pip freeze
run: |
source venv/bin/activate
pip freeze > requirements-freeze.txt
run: make freeze

- name: Publish Artefacts
uses: actions/upload-artifact@v3
@@ -90,10 +79,7 @@
retention-days: 10

- name: Validate package build
run: |
source venv/bin/activate
python -m pip install -U build
python -m build
run: make build

- name: Publish Package
uses: actions/upload-artifact@v3
@@ -124,23 +110,11 @@
with:
python-version: ${{ matrix.python-version }}

- name: Cache Dependencies
uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements-dev.txt') }}-${{ hashFiles('**/setup.cfg') }}-${{ hashFiles('**/pyproject.toml') }}
restore-keys: |
${{ runner.os }}-pip-

- name: Install Dependencies
run: pip install -r requirements-dev.txt
run: make

- name: Run Tests With Coverage
run: |
# run with coverage to not execute tests twice
coverage run -m pytest -v -p no:warnings --junitxml=report.xml tests/
coverage report
coverage xml
run: make coverage

- name: Test Report
uses: mikepenz/action-junit-report@v4
@@ -24,10 +24,10 @@ jobs:
# for best results, it is better to generate
# documentation within development environment
- name: Create venv
run: . ./setup_dev_env.sh
run: make

- name: Build docs
run: ./build_docs.sh
run: make docs

- name: Deploy
uses: peaceiris/actions-gh-pages@v3
24 changes: 13 additions & 11 deletions {{ cookiecutter.repo_name }}/.gitlab-ci.yml
@@ -95,8 +95,10 @@ lint:
image: $PRECOMMIT_IMAGE:latest
stage: lint
interruptible: true
before_script:
- make
Comment on lines +98 to +99

Collaborator: As we use $PRECOMMIT_IMAGE, it is not necessary to set up a venv beforehand.

script:
- pre-commit run --all-files
- make lint
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event" || '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'
changes:
@@ -112,8 +114,10 @@ lint-precommit-changed:
image: $PRECOMMIT_IMAGE:dev-$CI_COMMIT_SHA
stage: lint
interruptible: true
before_script:
- make
script:
- pre-commit run --all-files
- make lint
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
changes:
@@ -145,21 +149,19 @@ tests:
stage: tests
interruptible: true
before_script:
- . setup_dev_env.sh
- pip install build twine bump2version
- make
- pip install build twine bump2version #???
Collaborator: I guess it may be removed, indeed.

script:
# fused check license here as it would take more time to do it in different place.
- ./check_licenses.sh
- make lic-check
# test package building
- echo ${CI_PIPELINE_IID} >> src/{{ cookiecutter.__package_name }}/VERSION
- python -m build
# generate pip freeze
- pip freeze > requirements-freeze.txt
- make freeze
- cat requirements-freeze.txt
# run with coverage to not execute tests twice
- coverage run -m pytest -v -p no:warnings --junitxml=report.xml tests/
- coverage report
- coverage xml
- make coverage
coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
artifacts:
when: always
@@ -244,9 +246,9 @@ pages:
stage: pages
interruptible: true
before_script:
- . setup_dev_env.sh
- make
script:
- ./build_docs.sh
- make docs
artifacts:
paths:
- public
1 change: 1 addition & 0 deletions {{ cookiecutter.repo_name }}/.libraries-whitelist.txt
@@ -1 +1,2 @@
{{ cookiecutter.__package_name }}
pkg_resources
56 changes: 56 additions & 0 deletions {{ cookiecutter.repo_name }}/Makefile
@@ -0,0 +1,56 @@
PYTHON_VERSION := 3.11
WHEEL_VERSION := 0.43.0

.DEFAULT_GOAL := venv/bin/activate

venv/bin/activate: requirements-dev.txt
Collaborator: Maybe just `venv`?

Collaborator Author: That's not how make works: the targets below depend on the existence of the `venv/bin/activate` file; if it doesn't exist, make runs this recipe.

Collaborator: Why do we need this requirements-dev.txt here?

Collaborator Author: make checks whether the requirements file changed; if it did, the venv is rebuilt :)
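As a minimal sketch of the mechanism being described (using `venv/bin/pip` directly here just to keep the sketch self-contained; the PR sources the activate script instead):

```make
# File target: make compares the mtime of venv/bin/activate against
# requirements-dev.txt and reruns the recipe only when the requirements
# file is newer, or when the venv does not exist yet.
venv/bin/activate: requirements-dev.txt
	python3.11 -m venv venv
	venv/bin/pip install -r requirements-dev.txt

# Downstream targets get the venv on demand by listing it as a prerequisite.
test: venv/bin/activate
	venv/bin/pytest tests/
```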

python$(PYTHON_VERSION) -m venv venv
. venv/bin/activate
pip install --upgrade pip
Collaborator: I am getting the following output:

~/Documents/Projects/ds-template/{{ cookiecutter.repo_name }}$ make
python3.11 -m venv venv
. venv/bin/activate
pip install --upgrade pip
make: pip: No such file or directory
make: *** [Makefile:9: venv/bin/activate] Error 127

Collaborator Author: Does the error still occur if you run make after generating the test project? Can you check whether pip is configured correctly on your machine?
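For context on the error above: make runs each recipe line in its own shell, so `. venv/bin/activate` on one line has no effect on the `pip` invocation on the next, and a machine without a global `pip` on PATH fails exactly as shown. Two common workarounds, sketched:

```make
# Option 1: chain the commands with && so they share one shell invocation.
venv/bin/activate: requirements-dev.txt
	python3.11 -m venv venv
	. venv/bin/activate && pip install --upgrade pip \
		&& pip install -r requirements-dev.txt

# Option 2 (alternative recipe body): skip activation entirely and
# call the venv's own executables by path, e.g.
#	venv/bin/pip install --upgrade pip
#	venv/bin/pip install -r requirements-dev.txt
```

GNU make's `.ONESHELL:` directive, which runs a whole recipe in a single shell, is a third option.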

pip install wheel==$(WHEEL_VERSION)
pip install -r requirements-dev.txt
pre-commit install

lint: venv/bin/activate
. venv/bin/activate
pre-commit run --all-files

clean:
find . -type f -name "*.pyc" -delete
find . -type d -name "__pycache__" -delete
rm -rf venv
rm -rf .pytest_cache
rm -rf .coverage
rm -rf build/
rm -rf dist/
rm -rf *.egg-info/

build: venv/bin/activate
. venv/bin/activate
python -m pip install -U build
python -m build

lic-check: venv/bin/activate
. venv/bin/activate
pip-licenses --from=mixed --ignore-packages `cat .libraries-whitelist.txt` --allow-only="$$(paste -sd ";" .license-whitelist.txt)" > licenses.txt
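One detail worth flagging in this recipe: inside a Makefile, `$(...)` is make's own variable syntax, so a shell command substitution has to be escaped as `$$(...)` or make expands it to an empty string before the shell runs. The `paste` expression itself just joins the whitelist lines into the `;`-separated list that `--allow-only` expects, e.g. (with hypothetical whitelist contents):

```shell
# Hypothetical whitelist contents, written to a temp file for the demo.
printf 'MIT License\nBSD License\nApache Software License\n' > /tmp/license-whitelist.txt

# Join the lines with ";": this is what feeds --allow-only.
paste -sd ";" /tmp/license-whitelist.txt
# → MIT License;BSD License;Apache Software License
```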

docs: venv/bin/activate
. venv/bin/activate
pip-licenses --from=mixed --format rst --with-urls --with-description --output-file=docs/licenses_table.rst
sphinx-build -d docs/_build/doctrees docs/ public/

bump-version: venv/bin/activate
. venv/bin/activate
bump2version --verbose --commit $(args)

freeze: venv/bin/activate
. venv/bin/activate
pip freeze > requirements-freeze.txt

coverage: venv/bin/activate
. venv/bin/activate
coverage run -m pytest -v -p no:warnings --junitxml=report.xml tests/
coverage report
coverage xml

.PHONY: lint clean build lic-check docs bump-version freeze coverage
18 changes: 9 additions & 9 deletions {{ cookiecutter.repo_name }}/README.md
@@ -13,7 +13,7 @@ To start, you need to set up your local machine.
You need to set up a virtual environment; the simplest way is to run, from the project root directory:

```bash
$ . ./setup_dev_env.sh
$ make
$ source venv/bin/activate
```
This will create a new venv and run `pip install -r requirements-dev.txt`.
@@ -31,7 +31,7 @@ All updated files will be reformatted and linted before the commit.

To reformat and lint all files in the project, use:

`pre-commit run --all-files`
`make lint`

The used linters are configured in `.pre-commit-config.yaml`. You can use `pre-commit autoupdate` to bump tools to the latest versions.

@@ -53,10 +53,10 @@ See also [Cookiecutter Data Science opinion](https://drivendata.github.io/cookie

In `docs/` directory are Sphinx RST/Markdown files.

To build documentation locally, in your configured environment, you can use `build_docs.sh` script:
To build documentation locally, in your configured environment, you can use the `make docs` command:

```bash
$ ./build_docs.sh
$ make docs
```

Then open `public/index.html` file.
@@ -110,15 +110,15 @@ To bump version of the library please use `bump2version` which will update all v

NOTE: Configuration is in `.bumpversion.cfg` and **this is the main file defining the version; it should be updated only with bump2version**.

For convenience there is bash script which will create commit, to use it call:
For convenience there is a make target which will create the commit; to use it, call:

```bash
# to create a new commit by increasing one semvar:
$ ./bump_version.sh minor
$ ./bump_version.sh major
$ ./bump_version.sh patch
$ make bump-version args=minor
$ make bump-version args=patch
$ make bump-version args=major
# to see what is going to change run:
$ ./bump_version.sh --dry-run major
$ make bump-version args="--dry-run major"
```
The script updates the **VERSION** file, and setup.cfg automatically uses that version.

16 changes: 0 additions & 16 deletions {{ cookiecutter.repo_name }}/build_docs.sh

This file was deleted.

3 changes: 0 additions & 3 deletions {{ cookiecutter.repo_name }}/bump_version.sh

This file was deleted.

18 changes: 0 additions & 18 deletions {{ cookiecutter.repo_name }}/check_licenses.sh

This file was deleted.

11 changes: 0 additions & 11 deletions {{ cookiecutter.repo_name }}/setup_dev_env.sh

This file was deleted.
