Airflow Core parallelism is running with max 32 tasks at a time #774
asif2017 asked this question in Questions & Answers (unanswered)
Hi Team,
I am running the user community Helm chart 8.7.1 with Airflow 2.5.3. Airflow is up and jobs execute correctly, but although I have set core parallelism to 512, at most 32 tasks run at a time; the rest go into the queued state.
I also see very high memory usage for the cluster. We currently have around 350 DAGs; the memory figures are below. Is this memory usage normal?
Given the pool slots, parallelism should be at least 128.
AIRFLOW_CORE_PARALLELISM=512
AIRFLOW__CORE__MAX_ACTIVE_TASKS_PER_DAG: 128
AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG: 1
Pools: 1 (128 slots)
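For context on why the ceiling sits at exactly 32: with the Celery executor, each worker runs at most `celery.worker_concurrency` tasks, and the Airflow default is 16 when it is not set explicitly. The pod list below shows two workers, so one plausible explanation is 2 × 16 = 32 concurrent slots, regardless of `core.parallelism`. A minimal sketch of how the caps combine, assuming the Celery executor and the default worker concurrency (neither is confirmed in the question):

```python
# Effective concurrency is the minimum of several independent caps.
# Values are taken from the question above; worker_concurrency = 16 is
# an ASSUMPTION (the Airflow Celery default, celery.worker_concurrency),
# since the question does not show it being set.

parallelism = 512            # core.parallelism (scheduler-wide cap)
pool_slots = 128             # the single pool's slot count
workers = 2                  # airflow-cluster-worker-0 and -1
worker_concurrency = 16      # Celery default when unset (assumed)

# max_active_tasks_per_dag (128) caps each DAG individually, so with
# ~350 DAGs it is not the global bottleneck and is omitted here.
effective_cap = min(parallelism, pool_slots, workers * worker_concurrency)
print(effective_cap)  # → 32, matching the observed ceiling
```

If this is the cause, raising `AIRFLOW__CELERY__WORKER_CONCURRENCY` or adding workers would lift the 32-task ceiling up to the pool/parallelism limits.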
NAME CPU(cores) MEMORY(bytes)
airflow-cluster-db-migrations 2m 224Mi
airflow-cluster-flower 8m 289Mi
airflow-cluster-pgbouncer 213m 22Mi
airflow-cluster-redis-master-0 7m 14Mi
airflow-cluster-scheduler 897m 2578Mi
airflow-cluster-triggerer 155m 396Mi
airflow-cluster-web 42m 1597Mi
airflow-cluster-worker-0 66m 4085Mi
airflow-cluster-worker-1 77m 6777Mi