
Can't set up environments to use secrets in docker-swarm #545

Open
aalemanq opened this issue Apr 27, 2020 · 6 comments

@aalemanq

Hello, I have spent many weeks trying to configure Airflow to use secrets in Docker Swarm.

I tried to use this config in my compose file:

environment:
  - LOAD_EX=y
  - FERNET_KEY_CMD=$$(cat /run/secrets/fernet_key) (if I put a single '$', swarm tells me I need to escape it with another '$')
  - EXECUTOR=Celery
  - AIRFLOW__CELERY__BROKER_URL_CMD=$$(cat /run/secrets/broker_url)
  - AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=$$(cat /run/secrets/sql_alchemy_conn)
  - AIRFLOW__CELERY__RESULT_BACKEND_CMD=$$(cat  /run/secrets/result_backend)
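
For context, the secrets themselves are declared in the stack file in the usual swarm way. A minimal sketch (the exact declaration in my stack may differ, e.g. external vs. file-based secrets, and the service name here is just an example):

    services:
      worker:
        # ... image, command, etc.
        secrets:
          - fernet_key
          - broker_url
          - sql_alchemy_conn
          - result_backend

    secrets:
      fernet_key:
        external: true
      broker_url:
        external: true
      sql_alchemy_conn:
        external: true
      result_backend:
        external: true

That part works: the files do show up under /run/secrets/ inside the containers.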

I tried other bash commands to cat these secrets and they all fail; Airflow never picks up the environment variables and uses the default Redis instead.

I tried to follow the official doc but...

https://airflow.readthedocs.io/en/stable/howto/set-config.html

    The _cmd config options can also be set using a corresponding environment variable the same way the usual config options can. For example:

    export AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=bash_command_to_run
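
If I read that right, the _cmd variable is supposed to contain the command string itself, and Airflow runs it when it reads the config. In compose terms I guess that would be something like the following (just my interpretation, not something I have working):

    environment:
      # the value is the command Airflow should run, not its output
      AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD: 'cat /run/secrets/sql_alchemy_conn'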

Now I'm trying to run my own entrypoint (a copy of the original entrypoint plus exported environment variables), with no luck. Configuring secrets for Airflow in Docker Swarm is a bit of a nightmare for me :(. I want to avoid copying the config file and patching it with sed... I want to use environment variables!

Regards

@aalemanq

Is there any possibility of reading secrets from environment variables in docker-compose? Like _FILE environment variables or something similar...

@wittfabian

See https://docs.docker.com/compose/environment-variables/#the-env_file-configuration-option

@aalemanq

aalemanq commented Apr 28, 2020

Thanks for the reply. I tried env_file but I have the same issue. When I use environment variables like:

LOAD_EX=y
FERNET_KEY=XXXX
EXECUTOR=Celery
AIRFLOW__CELERY__BROKER_URL=pyamqp://airflow:airflow@rabbitmq:5672/airflow
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow

it works,

but if I reference the secrets like this in the env_file, it does not work :( :

AIRFLOW__CORE__FERNET_KEY_CMD=$(cat /run/secrets/fernet_key)
AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)
AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=$(cat /run/secrets/sql_alchemy_conn)
AIRFLOW__CELERY__RESULT_BACKEND_CMD=$(cat /run/secrets/result_backend)

Airflow never gets broker_url and the RabbitMQ connection is replaced by Redis...

I checked that everything is set up right and I don't see any errors about the secrets/connection strings:

Some debug:

I deploy the stack and enter the Airflow worker container to check that the secrets exist:

airflow@03eb98bd469d:~$ cat /run/secrets/sql_alchemy_conn 
postgresql+psycopg2://airflow:airflow@postgres/airflow
airflow@03eb98bd469d:~$ cat /run/secrets/broker_url 
pyamqp://airflow:airflow@rabbitmq:5672/airflow
airflow@03eb98bd469d:~$ cat /run/secrets/result_backend 
db+postgresql://airflow:airflow@postgres/airflow

I set these environment variables by hand:

airflow@03eb98bd469d:~$ LOAD_EX=y
airflow@03eb98bd469d:~$ AIRFLOW__CORE__FERNET_KEY_CMD=$(cat /run/secrets/fernet_key)
airflow@03eb98bd469d:~$ EXECUTOR=Celery
airflow@03eb98bd469d:~$ AIRFLOW__CELERY__BROKER_URL_CMD=$(cat /run/secrets/broker_url)
airflow@03eb98bd469d:~$ AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=$(cat /run/secrets/sql_alchemy_conn)
airflow@03eb98bd469d:~$ AIRFLOW__CELERY__RESULT_BACKEND_CMD=$(cat /run/secrets/result_backend)

And the echo output is right:

airflow@03eb98bd469d:~$ echo $AIRFLOW__CORE__FERNET_KEY_CMD
46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
airflow@03eb98bd469d:~$ echo $AIRFLOW__CELERY__BROKER_URL_CMD
pyamqp://airflow:airflow@rabbitmq:5672/airflow
airflow@03eb98bd469d:~$ echo $AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD
postgresql+psycopg2://airflow:airflow@postgres/airflow
airflow@03eb98bd469d:~$ echo $AIRFLOW__CELERY__RESULT_BACKEND_CMD
db+postgresql://airflow:airflow@postgres/airflow

I don't know what else I can do: secrets tested, environment variables tested, Airflow not running with environment variables plus secrets, but without secrets it works as expected...

@wittfabian

@aalemanq

aalemanq commented Apr 29, 2020

Yes, that env file does not execute anything, but you recommended env_file to me, and I thought that maybe with _CMD it would work... no.

Can really nobody pass secrets via environment variables?! It's normal in a lot of software; I can't understand it. I tried and tried and tried and I can't.

I can't understand how this Makefile applies to my Airflow, wittfabian :(. I just want the typical workflow:

environment:
  - ENVIRONMENT_FILE=$(cat /run/secrets/file)

Can I really not do this in Airflow?!

@lxndrcx
Copy link

lxndrcx commented Sep 5, 2023

Reading this makes it sound like it is not supported, but it is quite simple. In the docker compose YAML file just use something like:

  environment:
    ...
    AIRFLOW__CORE__FERNET_KEY_CMD: 'cat /run/secrets/fernet_key'
    ...

Seems to work fine for me. No $() required.
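
Putting that together with the swarm secrets, a full service definition would look roughly like this (a sketch based on the snippet above; the image and secret names are placeholders to adapt to your stack):

    services:
      webserver:
        image: your-airflow-image
        environment:
          AIRFLOW__CORE__FERNET_KEY_CMD: 'cat /run/secrets/fernet_key'
          AIRFLOW__CELERY__BROKER_URL_CMD: 'cat /run/secrets/broker_url'
          AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD: 'cat /run/secrets/sql_alchemy_conn'
          AIRFLOW__CELERY__RESULT_BACKEND_CMD: 'cat /run/secrets/result_backend'
        secrets:
          - fernet_key
          - broker_url
          - sql_alchemy_conn
          - result_backend

    secrets:
      fernet_key:
        external: true
      broker_url:
        external: true
      sql_alchemy_conn:
        external: true
      result_backend:
        external: true

The point is that Airflow itself executes the command in the _CMD variable inside the container, so no shell expansion is needed in the compose file.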
