Commit 181f943

Merge pull request #29 from geoadmin/cb-update-only

library, docs & test updates

ltbam authored Jun 28, 2024
2 parents 0cad37a + f0632d1 commit 181f943
Showing 25 changed files with 1,442 additions and 725 deletions.
1 change: 1 addition & 0 deletions .env
@@ -9,6 +9,7 @@ ADMIN_PASS=admin
# you can change the directory to wherever you want the data to reside on the host
# MUST be an absolute path
DATA_DIR=/data/transfer/osm_v2
TMP_DATA_DIR=/tmp_data/transfer/osm_v2

VALHALLA_URL="http://localhost"

88 changes: 45 additions & 43 deletions .github/workflows/test-ubuntu.yml
@@ -21,10 +21,10 @@ jobs:
POSTGRES_DB: gis_test
ALLOW_IP_RANGE: 0.0.0.0/0
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
redis:
@@ -38,50 +38,52 @@ jobs:
- 6379:6379

steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v2

- name: Set up Python 3.12
uses: actions/setup-python@v2
with:
python-version: "3.12"

- name: Set up Python 3.10
uses: actions/setup-python@v2
with:
python-version: '3.10'
- name: Install and set up Poetry
run: |
curl -sSL https://install.python-poetry.org | python
$HOME/.local/bin/poetry config virtualenvs.in-project true
- name: Install and set up Poetry
run: |
curl -sSL https://install.python-poetry.org | python
$HOME/.local/bin/poetry config virtualenvs.in-project true
- name: Cache dependencies.py
uses: actions/cache@v2
with:
path: .venv
key: venv-3.12-${{ hashFiles('**/poetry.lock') }}

- name: Cache dependencies.py
uses: actions/cache@v2
with:
path: .venv
key: venv-3.10-${{ hashFiles('**/poetry.lock') }}
- name: Install dependencies.py
run: |
$HOME/.local/bin/poetry install
- name: Install dependencies.py
run: |
$HOME/.local/bin/poetry install
- name: Install osmium & osmctools
run: |
sudo apt-get update
sudo apt-get install -y -qq osmium-tool osmctools
echo $(osmium --version)
- name: Install osmium & osmctools
run: |
sudo apt-get update
sudo apt-get install -y -qq osmium-tool osmctools
echo $(osmium --version)
- name: linting
run: |
source .venv/bin/activate
pre-commit run --all-files
- name: linting
run: |
source .venv/bin/activate
pre-commit run --all-files
- name: pytest and coverage
run: |
source .venv/bin/activate
sudo python -m smtpd -n -c DebuggingServer localhost:1025 &
sudo docker volume create routing-packager_packages --driver local --opt type=none --opt device=$PWD --opt o=bind &
sudo docker volume create routing-packager_tmp_data --driver local --opt type=none --opt device=$PWD --opt o=bind
- name: pytest and coverage
run: |
source .venv/bin/activate
sudo python -m smtpd -n -c DebuggingServer localhost:1025 &
sudo docker volume create routing-packager_packages --driver local --opt type=none --opt device=$PWD --opt o=bind
export API_CONFIG=test
pytest --cov=routing_packager_app --ignore=tests/test_tasks.py
coverage lcov --include "routing_packager_app/*"
export API_CONFIG=test
pytest --cov=routing_packager_app --ignore=tests/test_tasks.py
coverage lcov --include "routing_packager_app/*"
- name: coveralls
uses: coverallsapp/github-action@master
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: ./coverage.lcov
- name: coveralls
uses: coverallsapp/github-action@master
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: ./coverage.lcov
4 changes: 3 additions & 1 deletion .gitignore
@@ -15,4 +15,6 @@ scan*.txt

# temp
.env_local
.env
.env

.vscode/
21 changes: 11 additions & 10 deletions .pre-commit-config.yaml
@@ -1,11 +1,12 @@
repos:
- repo: https://github.com/psf/black
rev: 22.3.0
hooks:
- id: black
language_version: python3
args: [routing_packager_app, tests]
- repo: https://github.com/pycqa/flake8
rev: 4.0.1 # pick a git hash / tag to point to
hooks:
- id: flake8
- repo: https://github.com/ambv/black
rev: 23.9.1
hooks:
- id: black
language_version: python3
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.289
hooks:
- id: ruff
args: [--fix]
29 changes: 20 additions & 9 deletions CONTRIBUTING.md
@@ -5,10 +5,12 @@
We :heart: patches, fixes & feature PRs and want to make sure everything goes smoothly for you before and while submitting a PR.

For development we use:

- [`poetry`](https://github.com/python-poetry/poetry/) as package manager
- `pytest` for testing
- Google's [`yapf`](https://github.com/google/yapf) to make sure the formatting is consistent.
- [`pre-commit`](https://pre-commit.com) hook for yapf
- [`black`](https://github.com/psf/black) to make sure the formatting is consistent.
- [`ruff`](https://github.com/astral-sh/ruff) for linting
- [`pre-commit`](https://pre-commit.com) hook for formatting and linting

When contributing, ideally you:

@@ -23,11 +25,13 @@ When contributing, ideally you:
1. Create and activate a new virtual environment

2. Install development dependencies:

```bash
poetry install
```

3. Please add a pre-commit hook for `yapf`, so your code gets auto-formatted before committing it:

3. Please add a pre-commit hook, so your code gets auto-formatted and linted before committing it:

```bash
pre-commit install
```
@@ -37,15 +41,22 @@
You'll need a few things to run the tests:

- PostgreSQL installation with a DB named `gis_test` (or define another db name using `POSTGRES_DB_TEST`) **and PostGIS enabled**
- Redis database, best done with `docker run --name redis -p 6379:6379 -d redis:6.0`, then you can use the project's defaults, i.e. `REDIS_URL=redis://localhost:6379/0`
- some fake SMTP service to handle email tests, our recommendations:
- [fake-smtp-server](https://www.npmjs.com/package/fake-smtp-server): NodeJS app with a frontend on `http://localhost:1080` and SMTP port 1025
- pure Python one-liner in a separate terminal window: `sudo python -m smtpd -n -c DebuggingServer localhost:1025`
- Redis database

Both can be quickly spun up by using the provided `docker-compose.test.yml`:

```bash
docker compose -f docker-compose.test.yml up -d
```

You'll also need a fake SMTP service to handle email tests; our recommendation is [fake-smtp-server](https://www.npmjs.com/package/fake-smtp-server),
a NodeJS app with a frontend on `http://localhost:1080` and SMTP port 1025.

We use `pytest` in this project with `coverage`:

```bash
export API_CONFIG=test
pytest --cov=routing_packager_app
```
```

A `coverage` bot will report the coverage in every PR and we might ask you to increase coverage on new code.
2 changes: 1 addition & 1 deletion Dockerfile
@@ -54,7 +54,7 @@ MAINTAINER Nils Nolde <[email protected]>
RUN apt-get update > /dev/null && \
export DEBIAN_FRONTEND=noninteractive && \
apt-get install -y libluajit-5.1-2 \
libzmq5 libczmq4 spatialite-bin libprotobuf-lite32 sudo locales wget \
libzmq5 libgdal-dev libczmq4 spatialite-bin libprotobuf-lite32 sudo locales wget \
libsqlite3-0 libsqlite3-mod-spatialite libcurl4 python-is-python3 osmctools \
python3.11-minimal python3-distutils curl unzip moreutils jq spatialite-bin supervisor > /dev/null

17 changes: 9 additions & 8 deletions README.md
@@ -41,7 +41,7 @@ curl --location -XPOST 'http://localhost:5000/api/v1/jobs' \
--header 'Content-Type: application/json' \
--data-raw '{
"name": "test", # name needs to be unique for a specific router & provider
"description": "test descr",
"description": "test descr",
"bbox": "1.531906,42.559908,1.6325,42.577608", # the bbox as minx,miny,maxx,maxy
"provider": "osm", # the dataset provider, needs to be registered in ENABLED_PROVIDERS
"update": "true" # whether this package should be updated on every planet build
@@ -51,6 +51,7 @@ curl --location -XPOST 'http://localhost:5000/api/v1/jobs' \
After a minute you should have the graph package available in `./data/output/osm_test/`. If not, check the logs of the worker process or the Flask app.

The `routing-packager-app` container running the HTTP API has a `supervisor` process running in a loop, which:

- downloads a planet PBF (if it doesn't exist) or updates the planet PBF (if it does exist)
- builds a planet Valhalla graph
- then updates all graph extracts with a fresh copy
@@ -61,9 +62,9 @@ By default, also a fake SMTP server is started, and you can see incoming messages on

### Graph & OSM updates

Under the hood we're running a `supervisor` instance to control the graph builds.
Under the hood we're running a `supervisor` instance to control the graph builds.

Two instances of the [Valhalla docker image](https://github.com/gis-ops/docker-valhalla) take turns building a new graph from an updated OSM file. Those two graphs are physically separated from each other in subdirectories `$DATA_DIR/osm/8002` & `$DATA_DIR/osm/8003`.
Two instances of the [Valhalla docker image](https://github.com/gis-ops/docker-valhalla) take turns building a new graph from an updated OSM file. Those two graphs are physically separated from each other in subdirectories `$TMP_DATA_DIR/osm/8002` & `$TMP_DATA_DIR/osm/8003`.

After each graph build finished, the OSM file is updated for the next graph build.
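
The alternation described above can be sketched in a few lines of Python. This is purely illustrative (the directory layout follows the paths named in this README; the function itself is a made-up toy model, not the project's actual build loop):

```python
from itertools import cycle

def build_schedule(tmp_data_dir: str, rounds: int) -> list:
    """Toy model: which of the two graph directories rebuilds next."""
    # The two Valhalla instances live in .../osm/8002 and .../osm/8003
    # and take turns, so one graph stays serveable during a rebuild.
    dirs = cycle([f"{tmp_data_dir}/osm/8002", f"{tmp_data_dir}/osm/8003"])
    return [next(dirs) for _ in range(rounds)]

print(build_schedule("/app/tmp_data", 2))
# ['/app/tmp_data/osm/8002', '/app/tmp_data/osm/8003']
```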

@@ -80,9 +81,9 @@ The app is listening on `/api/v1/jobs` for new `POST` requests to generate some
1. Request is parsed, inserted into the Postgres database and the new entry is immediately returned with a few job details as blank fields.
2. Before returning the response, the graph generation function is queued with `ARQ` in a Redis database to dispatch to a worker.
3. If the worker is currently
- **idle**, the queue will immediately start the graph generation:
- Pull the job entry from the Postgres database
- Update the job's `status` database field along the processing to indicate the current stage
- Zip graph tiles from disk according to the request's bounding box and put the package to `$DATA_DIR/output/<JOB_NAME>`, along with a metadata JSON
- **busy**, the current job will be put in the queue and will be processed once it reaches the queue's head
- **idle**, the queue will immediately start the graph generation:
- Pull the job entry from the Postgres database
- Update the job's `status` database field along the processing to indicate the current stage
- Zip graph tiles from disk according to the request's bounding box and put the package to `$DATA_DIR/output/<JOB_NAME>`, along with a metadata JSON
- **busy**, the current job will be put in the queue and will be processed once it reaches the queue's head
4. Send an email to the requesting user with success or failure notice (including the error message)
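
The idle/busy dispatch in step 3 can be modelled with a tiny in-process queue. This is only a hedged sketch of the behaviour; in the real app the job is enqueued with `ARQ` in Redis and a separate worker process executes it:

```python
from collections import deque

class ToyDispatcher:
    """Toy model of step 3: start immediately when idle, FIFO when busy."""

    def __init__(self):
        self.pending = deque()
        self.busy = False
        self.done = []

    def submit(self, job_name: str):
        self.pending.append(job_name)
        if not self.busy:
            self._drain()

    def _drain(self):
        self.busy = True
        while self.pending:
            job = self.pending.popleft()
            # The real worker would: pull the job row from Postgres,
            # update its `status` field as it progresses, and zip the
            # bbox's tiles into $DATA_DIR/output/<JOB_NAME>.
            self.done.append(job)
        self.busy = False
```

Jobs submitted while the dispatcher is busy simply wait in `pending` until they reach the queue's head, mirroring the **busy** branch above.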
2 changes: 1 addition & 1 deletion conf/valhalla.conf
@@ -16,4 +16,4 @@ redirect_stderr=true
# stdout_logfile_maxbytes=1MB
stdout_logfile=/proc/1/fd/1
stdout_logfile_maxbytes=0
environment=CONCURRENCY="4",DATA_DIR="/app/data"
environment=CONCURRENCY="4",DATA_DIR="/app/data",TMP_DATA_DIR="/app/tmp_data"
14 changes: 14 additions & 0 deletions docker-compose.test.yml
@@ -0,0 +1,14 @@
services:
postgres:
image: kartoza/postgis:14
environment:
POSTGRES_USER: admin
POSTGRES_PASS: admin
POSTGRES_DB: gis_test
ALLOW_IP_RANGE: 0.0.0.0/0
ports:
- 5432:5432
redis:
image: redis:6.2
ports:
- 6379:6379
20 changes: 14 additions & 6 deletions docker-compose.yml
@@ -1,17 +1,22 @@
volumes:
postgis-data:
packages: # do not change any detail of this volume
packages: # do not change any detail of this volume
driver: local
driver_opts:
type: none
device: $DATA_DIR # DATA_DIR is the host directory for the data. It has to be in the environment, e.g. in .env file
device: $DATA_DIR # DATA_DIR is the host directory for the data. It has to be in the environment, e.g. in .env file
o: bind
tmp_data: # do not change any detail of this volume
driver: local
driver_opts:
type: none
device: $TMP_DATA_DIR # TMP_DATA_DIR is the host directory for the temporary data. It has to be in the environment, e.g. in .env file
o: bind

# It's important it runs in its own private network, also more secure
networks:
routing-packager:

version: '3.2'
services:
postgis:
image: kartoza/postgis:12.1
@@ -31,7 +36,8 @@ services:
restart: always
redis:
image: redis:6.2
container_name: routing-packager-redis
container_name:
routing-packager-redis
# mostly needed to define the database hosts
env_file:
- .docker_env
@@ -50,7 +56,8 @@ services:
- routing-packager
volumes:
- packages:/app/data
- $PWD/.docker_env:/app/.env # Worker needs access to .env file
- tmp_data:/app/tmp_data
- $PWD/.docker_env:/app/.env # Worker needs access to .env file
depends_on:
- postgis
- redis
@@ -67,7 +74,8 @@ services:
- .docker_env
volumes:
- packages:/app/data
- $PWD/.docker_env:/app/.env # CLI needs access to .env file
- tmp_data:/app/tmp_data
- $PWD/.docker_env:/app/.env # CLI needs access to .env file
- $PWD/static:/app/static # static file for frontend
networks:
- routing-packager
13 changes: 8 additions & 5 deletions main.py
@@ -1,3 +1,4 @@
from contextlib import asynccontextmanager
import uvicorn as uvicorn
from arq import create_pool
from arq.connections import RedisSettings
@@ -10,21 +11,23 @@
from routing_packager_app.config import SETTINGS
from routing_packager_app.api_v1.models import User

app: FastAPI = create_app()


@app.on_event("startup")
async def startup_event():
@asynccontextmanager
async def lifespan(app: FastAPI):
SQLModel.metadata.create_all(engine, checkfirst=True)
app.state.redis_pool = await create_pool(RedisSettings.from_dsn(SETTINGS.REDIS_URL))
User.add_admin_user(next(get_db()))

# create the directories
for provider in Providers:
p = SETTINGS.get_data_dir().joinpath(provider.lower())
p = SETTINGS.get_tmp_data_dir().joinpath(provider.lower())
p.mkdir(exist_ok=True)
SETTINGS.get_output_path().mkdir(exist_ok=True)
yield
app.state.redis_pool.shutdown()


app: FastAPI = create_app(lifespan=lifespan)

if __name__ == "__main__":
uvicorn.run("main:app", host="0.0.0.0", port=5000, reload=True)
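
The diff above replaces the deprecated `@app.on_event("startup")` hook with FastAPI's lifespan pattern: everything before `yield` runs at startup, everything after it at shutdown. Here is a minimal stand-alone illustration of that pattern (plain `asyncio`, no FastAPI, so the ordering is easy to verify):

```python
import asyncio
from contextlib import asynccontextmanager

events = []

@asynccontextmanager
async def lifespan(app):
    events.append("startup")    # e.g. create tables, Redis pool, data dirs
    yield                       # the application serves while suspended here
    events.append("shutdown")   # e.g. app.state.redis_pool.shutdown()

async def main():
    # FastAPI enters/exits this context around the app's lifetime;
    # we do the same by hand here.
    async with lifespan(app=None):
        events.append("serving")

asyncio.run(main())
print(events)  # ['startup', 'serving', 'shutdown']
```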