Docker image fails to start if celery config section is not defined #29537
Ah. So this is in the IMAGE, not in Airflow. That explains it.
potiuk added a commit to potiuk/airflow that referenced this issue on Feb 14, 2023:
The section check is not really needed and is harmful. There should be no problem if sections are missing from the configuration file. Running this check means that if someone strips the config file down to the bare minimum and moves the crucial configuration to environment variables, the `get-value` command might fail even though the configuration is perfectly sound for Airflow. Fixes: apache#29537
potiuk added a commit that referenced this issue on Feb 15, 2023:
* Remove section check from get-value command. The section check is not really needed and is harmful. There should be no problem if sections are missing from the configuration file. Running this check means that if someone strips the config file down to the bare minimum and moves the crucial configuration to environment variables, the `get-value` command might fail even though the configuration is perfectly sound for Airflow. Fixes: #29537 * Update airflow/cli/commands/config_command.py
pierrejeambrun pushed a commit that referenced this issue on Mar 7, 2023:
* Remove section check from get-value command. The section check is not really needed and is harmful. There should be no problem if sections are missing from the configuration file. Running this check means that if someone strips the config file down to the bare minimum and moves the crucial configuration to environment variables, the `get-value` command might fail even though the configuration is perfectly sound for Airflow. Fixes: #29537 * Update airflow/cli/commands/config_command.py (cherry picked from commit 06d45f0)
pierrejeambrun pushed a commit that referenced this issue on Mar 8, 2023:
* Remove section check from get-value command. The section check is not really needed and is harmful. There should be no problem if sections are missing from the configuration file. Running this check means that if someone strips the config file down to the bare minimum and moves the crucial configuration to environment variables, the `get-value` command might fail even though the configuration is perfectly sound for Airflow. Fixes: #29537 * Update airflow/cli/commands/config_command.py (cherry picked from commit 06d45f0)
Apache Airflow version
Other Airflow 2 version (please specify below)
What happened
Using Airflow 2.3.4.
We removed any config values we did not explicitly set from airflow.cfg. This was to make future upgrades less involved, as we would only have to compare configuration values we explicitly set, rather than all permutations of versions; this approach has been recommended in Slack. For example, we set AIRFLOW__CELERY__BROKER_URL as an environment variable - we do not set this in airflow.cfg, so we removed the [celery] section from the Airflow configuration. We set AIRFLOW__CORE__EXECUTOR=CeleryExecutor, so we are using the Celery executor.
Upon starting the Airflow scheduler, it exited with code 1 and this message:
Upon adding an empty [celery] section back into airflow.cfg, this error went away. I have verified that it still picks up AIRFLOW__CELERY__BROKER_URL correctly.
What you think should happen instead
I'd expect Airflow to use the defaults as listed here; I wouldn't expect the absence of a configuration section to cause errors.
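This expectation matches Airflow's documented precedence: an AIRFLOW__{SECTION}__{KEY} environment variable overrides airflow.cfg, which in turn overrides built-in defaults. A minimal sketch of that lookup order (not Airflow's actual implementation; the function name and config values are hypothetical):

```python
import configparser
import os

def get_config_value(section: str, key: str, cfg: configparser.ConfigParser, default=None):
    """Resolve a value in the documented order: AIRFLOW__{SECTION}__{KEY}
    environment variable first, then the config file, then the default.
    A section that is absent from the file is not an error by itself."""
    env_var = f"AIRFLOW__{section.upper()}__{key.upper()}"
    if env_var in os.environ:
        return os.environ[env_var]
    if cfg.has_option(section, key):  # False when the section is missing
        return cfg.get(section, key)
    return default

# An airflow.cfg with no [celery] section at all
cfg = configparser.ConfigParser()
cfg.read_string("[core]\nexecutor = CeleryExecutor\n")

os.environ["AIRFLOW__CELERY__BROKER_URL"] = "redis://redis:6379/0"
print(get_config_value("celery", "broker_url", cfg))  # redis://redis:6379/0
```

Under this precedence the stripped-down config file plus the environment variable is a perfectly sound configuration.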
How to reproduce
Set up a Docker image for the Airflow scheduler with apache/airflow:slim-2.3.4-python3.10 and the following configuration in airflow.cfg - with no [celery] section:
Run the scheduler command, also setting AIRFLOW__CELERY__BROKER_URL to point to a Celery Redis broker.
Observe that the scheduler exits.
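For illustration, the stripped-down airflow.cfg could look like this (the specific values are assumptions; the only point is that no [celery] section is present):

```ini
[core]
executor = CeleryExecutor
load_examples = False

[database]
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@postgres/airflow

; note: no [celery] section - broker_url is supplied via the
; AIRFLOW__CELERY__BROKER_URL environment variable instead
```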
Operating System
Ubuntu 20.04.5 LTS (Focal Fossa)
Versions of Apache Airflow Providers
No response
Deployment
Other Docker-based deployment
Deployment details
AWS ECS
Docker
apache/airflow:slim-2.3.4-python3.10
Separate services
Anything else
This seems to occur due to this get-value check in the Airflow image entrypoint: airflow/scripts/docker/entrypoint_prod.sh, lines 203 to 212 in 28126c1.
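The failure mode can be demonstrated in isolation: if `config get-value` insists that the section exist in the file before even attempting the lookup, it fails despite the environment variable being set. A paraphrased sketch of that problematic shape (the function name and error message are illustrative, not the verbatim Airflow code):

```python
import configparser
import os

def get_value_with_section_check(section, key, cfg):
    """Mimics the old behaviour: bail out when the section is missing
    from the file, before the env-variable fallback is ever consulted."""
    if not cfg.has_section(section):
        raise SystemExit(f"The section [{section}] is not found in config.")
    return cfg.get(section, key)

# Config file with no [celery] section, but the env var IS set
cfg = configparser.ConfigParser()
cfg.read_string("[core]\nexecutor = CeleryExecutor\n")
os.environ["AIRFLOW__CELERY__BROKER_URL"] = "redis://redis:6379/0"

try:
    get_value_with_section_check("celery", "broker_url", cfg)
except SystemExit as exc:
    print(exc)  # non-zero exit: the entrypoint check fails and the container aborts
```

The fix referenced above removes the up-front section check so that the normal lookup (including environment variables) decides whether the value is available.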
Are you willing to submit PR?
Code of Conduct