max_tis_per_query=0 leads to nothing being scheduled in 2.0.0 #13325

Closed
bouke-nederstigt opened this issue Dec 27, 2020 · 21 comments · Fixed by #13512
Labels
affected_version:2.0 · area:Scheduler · kind:bug · priority:critical
Comments

@bouke-nederstigt

bouke-nederstigt commented Dec 27, 2020

After upgrading to Airflow 2.0.0, it seems as if the scheduler isn't working anymore. Tasks hang in the scheduled state, but no tasks get executed. I've tested this with both the Sequential and Celery executors. When using the Celery executor, no messages seem to arrive in RabbitMQ.

This is on local Docker. Everything was working fine before upgrading. There don't seem to be any error messages, so I'm not completely sure whether this is a bug or a misconfiguration on my end.

Using the python:3.7-slim-stretch Docker image. The regular setup we're using is the CeleryExecutor. MySQL version is 5.7.

Any help would be greatly appreciated.

Python packages
alembic==1.4.3
altair==4.1.0
amazon-kclpy==1.5.0
amqp==2.6.1
apache-airflow==2.0.0
apache-airflow-providers-amazon==1.0.0
apache-airflow-providers-celery==1.0.0
apache-airflow-providers-ftp==1.0.0
apache-airflow-providers-http==1.0.0
apache-airflow-providers-imap==1.0.0
apache-airflow-providers-jdbc==1.0.0
apache-airflow-providers-mysql==1.0.0
apache-airflow-providers-sqlite==1.0.0
apache-airflow-upgrade-check==1.1.0
apispec==3.3.2
appdirs==1.4.4
argcomplete==1.12.2
argon2-cffi==20.1.0
asn1crypto==1.4.0
async-generator==1.10
attrs==20.3.0
azure-common==1.1.26
azure-core==1.9.0
azure-storage-blob==12.6.0
Babel==2.9.0
backcall==0.2.0
bcrypt==3.2.0
billiard==3.6.3.0
black==20.8b1
bleach==3.2.1
boa-str==1.1.0
boto==2.49.0
boto3==1.7.3
botocore==1.10.84
cached-property==1.5.2
cattrs==1.1.2
cbsodata==1.3.3
celery==4.4.2
certifi==2020.12.5
cffi==1.14.4
chardet==3.0.4
click==7.1.2
clickclick==20.10.2
cmdstanpy==0.9.5
colorama==0.4.4
colorlog==4.0.2
commonmark==0.9.1
connexion==2.7.0
convertdate==2.3.0
coverage==4.2
croniter==0.3.36
cryptography==3.3.1
cycler==0.10.0
Cython==0.29.21
decorator==4.4.2
defusedxml==0.6.0
dill==0.3.3
dnspython==2.0.0
docutils==0.14
email-validator==1.1.2
entrypoints==0.3
ephem==3.7.7.1
et-xmlfile==1.0.1
fbprophet==0.7.1
fire==0.3.1
Flask==1.1.2
Flask-AppBuilder==3.1.1
Flask-Babel==1.0.0
Flask-Bcrypt==0.7.1
Flask-Caching==1.9.0
Flask-JWT-Extended==3.25.0
Flask-Login==0.4.1
Flask-OpenID==1.2.5
Flask-SQLAlchemy==2.4.4
flask-swagger==0.2.13
Flask-WTF==0.14.3
flatten-json==0.1.7
flower==0.9.5
funcsigs==1.0.2
future==0.18.2
graphviz==0.15
great-expectations==0.13.2
gunicorn==19.10.0
holidays==0.10.4
humanize==3.2.0
idna==2.10
importlib-metadata==1.7.0
importlib-resources==1.5.0
inflection==0.5.1
ipykernel==5.4.2
ipython==7.19.0
ipython-genutils==0.2.0
ipywidgets==7.5.1
iso8601==0.1.13
isodate==0.6.0
itsdangerous==1.1.0
JayDeBeApi==1.2.3
jdcal==1.4.1
jedi==0.17.2
jellyfish==0.8.2
Jinja2==2.11.2
jmespath==0.10.0
joblib==1.0.0
JPype1==1.2.0
json-merge-patch==0.2
jsonpatch==1.28
jsonpointer==2.0
jsonschema==3.2.0
jupyter-client==6.1.7
jupyter-core==4.7.0
jupyterlab-pygments==0.1.2
kinesis-events==0.1.0
kiwisolver==1.3.1
kombu==4.6.11
korean-lunar-calendar==0.2.1
lazy-object-proxy==1.4.3
lockfile==0.12.2
LunarCalendar==0.0.9
Mako==1.1.3
Markdown==3.3.3
MarkupSafe==1.1.1
marshmallow==3.10.0
marshmallow-enum==1.5.1
marshmallow-oneofschema==2.0.1
marshmallow-sqlalchemy==0.23.1
matplotlib==3.3.3
mistune==0.8.4
mock==1.0.1
mockito==1.2.2
msrest==0.6.19
mypy-extensions==0.4.3
mysql-connector-python==8.0.18
mysqlclient==2.0.2
natsort==7.1.0
nbclient==0.5.1
nbconvert==6.0.7
nbformat==5.0.8
nest-asyncio==1.4.3
nose==1.3.7
notebook==6.1.5
numpy==1.19.4
oauthlib==3.1.0
openapi-spec-validator==0.2.9
openpyxl==3.0.5
oscrypto==1.2.1
packaging==20.8
pandas==1.1.5
pandocfilters==1.4.3
parso==0.7.1
pathspec==0.8.1
pendulum==2.1.2
pexpect==4.8.0
phonenumbers==8.12.15
pickleshare==0.7.5
Pillow==8.0.1
prison==0.1.3
prometheus-client==0.8.0
prompt-toolkit==3.0.8
protobuf==3.14.0
psutil==5.8.0
ptyprocess==0.6.0
pyarrow==2.0.0
pycodestyle==2.6.0
pycparser==2.20
pycryptodomex==3.9.9
pydevd-pycharm==193.5233.109
Pygments==2.7.3
PyJWT==1.7.1
PyMeeus==0.3.7
pyodbc==4.0.30
pyOpenSSL==19.1.0
pyparsing==2.4.7
pyrsistent==0.17.3
pystan==2.19.1.1
python-crontab==2.5.1
python-daemon==2.2.4
python-dateutil==2.8.1
python-editor==1.0.4
python-nvd3==0.15.0
python-slugify==4.0.1
python3-openid==3.2.0
pytz==2019.3
pytzdata==2020.1
PyYAML==5.3.1
pyzmq==20.0.0
recordlinkage==0.14
regex==2020.11.13
requests==2.23.0
requests-oauthlib==1.3.0
rich==9.2.0
ruamel.yaml==0.16.12
ruamel.yaml.clib==0.2.2
s3transfer==0.1.13
scikit-learn==0.23.2
scipy==1.5.4
scriptinep3==0.3.1
Send2Trash==1.5.0
setproctitle==1.2.1
setuptools-git==1.2
shelljob==0.5.6
six==1.15.0
sklearn==0.0
snowflake-connector-python==2.3.7
snowflake-sqlalchemy==1.2.4
SQLAlchemy==1.3.22
SQLAlchemy-JSONField==1.0.0
SQLAlchemy-Utils==0.36.8
swagger-ui-bundle==0.0.8
tabulate==0.8.7
TagValidator==0.0.8
tenacity==6.2.0
termcolor==1.1.0
terminado==0.9.1
testpath==0.4.4
text-unidecode==1.3
threadpoolctl==2.1.0
thrift==0.13.0
toml==0.10.2
toolz==0.11.1
tornado==6.1
tqdm==4.54.1
traitlets==5.0.5
typed-ast==1.4.1
typing-extensions==3.7.4.3
tzlocal==1.5.1
unicodecsv==0.14.1
urllib3==1.24.2
validate-email==1.3
vine==1.3.0
watchtower==0.7.3
wcwidth==0.2.5
webencodings==0.5.1
Werkzeug==1.0.1
widgetsnbextension==3.5.1
wrapt==1.12.1
WTForms==2.3.1
xlrd==2.0.1
XlsxWriter==1.3.7
zipp==3.4.0

Relevant config

# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository.
# This path must be absolute
dags_folder = /usr/local/airflow/dags

# The executor class that airflow should use. Choices include
# SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor
executor = CeleryExecutor

# The SqlAlchemy connection string to the metadata database.
# SqlAlchemy supports many different database engines; more information
# on their website
sql_alchemy_conn = db+mysql://airflow:airflow@postgres/airflow

# The SqlAlchemy pool size is the maximum number of database connections
# in the pool.
sql_alchemy_pool_size = 5

# The SqlAlchemy pool recycle is the number of seconds a connection
# can be idle in the pool before it is invalidated. This config does
# not apply to sqlite.
sql_alchemy_pool_recycle = 3600

# The amount of parallelism as a setting to the executor. This defines
# the max number of task instances that should run simultaneously
# on this airflow installation
parallelism = 32

# The number of task instances allowed to run concurrently by the scheduler
dag_concurrency = 16

# Are DAGs paused by default at creation
dags_are_paused_at_creation = True

# When not using pools, tasks are run in the "default pool",
# whose size is guided by this config element
non_pooled_task_slot_count = 128

# The maximum number of active DAG runs per DAG
max_active_runs_per_dag = 16

# How long before timing out a python file import while filling the DagBag
dagbag_import_timeout = 60

# The class to use for running task instances in a subprocess
task_runner = StandardTaskRunner

# Whether to enable pickling for xcom (note that this is insecure and allows for
# RCE exploits). This will be deprecated in Airflow 2.0 (it will be forced to False).
enable_xcom_pickling = True

# When a task is killed forcefully, this is the amount of time in seconds that
# it has to cleanup after it is sent a SIGTERM, before it is SIGKILLED
killed_task_cleanup_time = 60

# This flag decides whether to serialise DAGs and persist them in the DB. If set to True, the webserver reads from the DB instead of parsing DAG files
store_dag_code = True

# You can also update the following default configurations based on your needs
min_serialized_dag_update_interval = 30
min_serialized_dag_fetch_interval = 10

[celery]
# This section only applies if you are using the CeleryExecutor in
# [core] section above

# The app name that will be used by celery
celery_app_name = airflow.executors.celery_executor

# The concurrency that will be used when starting workers with the
# "airflow worker" command. This defines the number of task instances that
# a worker will take, so size up your workers based on the resources on
# your worker box and the nature of your tasks
worker_concurrency = 16

# When you start an airflow worker, airflow starts a tiny web server
# subprocess to serve the worker's local log files to the airflow main
# web server, which then builds pages and sends them to users. This defines
# the port on which the logs are served. It needs to be unused, and
# visible from the main web server, which connects into the workers.
worker_log_server_port = 8793

# The Celery broker URL. Celery supports RabbitMQ, Redis and experimentally
# a sqlalchemy database. Refer to the Celery documentation for more
# information.
broker_url = amqp://amqp:5672/1

# Another key Celery setting
result_backend = db+mysql://airflow:airflow@postgres/airflow

# Celery Flower is a sweet UI for Celery. Airflow has a shortcut to start
# it `airflow flower`. This defines the IP that Celery Flower runs on
flower_host = 0.0.0.0

# This defines the port that Celery Flower runs on
flower_port = 5555

# Default queue that tasks get assigned to and that workers listen on.
default_queue = airflow

# Import path for celery configuration options
celery_config_options = airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG

# No SSL
ssl_active = False

[scheduler]
# Task instances listen for external kill signal (when you clear tasks
# from the CLI or the UI), this defines the frequency at which they should
# listen (in seconds).
job_heartbeat_sec = 5

# The scheduler constantly tries to trigger new tasks (look at the
# scheduler section in the docs for more information). This defines
# how often the scheduler should run (in seconds).
scheduler_heartbeat_sec = 5

# After how much time (in seconds) the scheduler should terminate;
# -1 indicates running continuously (see also num_runs)
run_duration = -1

# After how much time new DAGs should be picked up from the filesystem
min_file_process_interval = 60

use_row_level_locking = False

dag_dir_list_interval = 300

# How often should stats be printed to the logs
print_stats_interval = 30

child_process_log_directory = /usr/local/airflow/logs/scheduler

# Local task jobs periodically heartbeat to the DB. If the job has
# not heartbeat in this many seconds, the scheduler will mark the
# associated task instance as failed and will re-schedule the task.
scheduler_zombie_task_threshold = 300

# Turn off scheduler catchup by setting this to False.
# Default behavior is unchanged and
# Command Line Backfills still work, but the scheduler
# will not do scheduler catchup if this is False,
# however it can be set on a per DAG basis in the
# DAG definition (catchup)
catchup_by_default = True

# This changes the batch size of queries in the scheduling main loop.
# This depends on query length limits and how long you are willing to hold locks.
# 0 for no limit
max_tis_per_query = 0

# The scheduler can run multiple threads in parallel to schedule dags.
# This defines how many threads will run.
parsing_processes = 4

authenticate = False
@boring-cyborg

boring-cyborg bot commented Dec 27, 2020

Thanks for opening your first issue here! Be sure to follow the issue template!

@mik-laj
Member

mik-laj commented Dec 27, 2020

Does this problem occur for specific DAGs or for each DAG? We know that some DAG configuration options can be problematic.

@bouke-nederstigt
Author

bouke-nederstigt commented Dec 28, 2020

Based on your comment I tried about 10 different DAGs with different configurations (different schedules, depends_on_past, etc.), but they all show the same behaviour, so I guess it's not config related.

Are there specific configurations that you know are problematic? I could of course double-check the DAGs against those to be sure this isn't the issue.

@potiuk
Member

potiuk commented Dec 28, 2020

It is extremely difficult to say what's wrong without any logs. Airflow produces a lot of logs, and you can even run the scheduler manually and see the log output directly. Note that for Airflow 2.0 we've changed a lot of the CLI commands (like airflow db and airflow users), and we've also made some configuration changes.

All that is documented here: http://airflow.apache.org/docs/apache-airflow/stable/upgrading-to-2.html

If you have not followed the steps outlined there, or did not reconfigure your image/configuration appropriately, it is very likely that something is wrong.

Note that we also have the automated Upgrade Check tool (http://airflow.apache.org/docs/apache-airflow/stable/upgrading-to-2.html#step-3-install-and-run-the-upgrade-check-scripts) that you can install and run in your old environment; it might actually help point out the changes you need to make.

Finally, in the new Airflow there is the airflow info command, which dumps some of the configuration information (i.e. where the logs are stored and what configuration is used). There is also the airflow config list command, which shows which configuration parameters are in effect. Those tools are useful for analysing problems.
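
For reference, the diagnostic commands mentioned above can be run as follows (a sketch: the upgrade check runs in the old 1.10.x environment, the other commands in 2.0, and the exact output varies per installation):

  # In the old 1.10.x environment:
  pip install apache-airflow-upgrade-check
  airflow upgrade_check

  # In the new 2.0 environment:
  airflow info
  airflow config list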

I am closing this as invalid for now, but please let us know whether the problem turns out to be in your configuration once you've gone through those steps. If you do the investigation and are still lost, please add the extra information - logs, info, config, etc.

@potiuk potiuk closed this as completed Dec 28, 2020
@potiuk potiuk added the invalid label Dec 28, 2020
@bouke-nederstigt
Author

bouke-nederstigt commented Dec 29, 2020

Thanks for the pointers. I originally used the upgrade_check and upgrade scripts to execute the upgrade. Nevertheless, I double-checked everything again. Some notes:

  • The upgrade_check script only fails on Users must delete deprecated configs for KubernetesExecutor. However, none of the config items listed there are actually part of my airflow.cfg (or available as an environment variable). I assume this isn't a problem, considering that I'm not using the KubernetesExecutor.
  • After reverting back to 1.10.14, restoring the database from a backup, and changing back to old-style imports for a DAG, the scheduler immediately starts working again. Tasks are also executed correctly, as expected.
  • I ran through the upgrade process as described in https://airflow.apache.org/docs/apache-airflow/stable/upgrading-to-2.html once again. FYI: I manually removed the known_event, chart and users tables because the migrations weren't working correctly, but other than that there were no issues.

After upgrading, the scheduler shows the same behaviour as before: tasks get scheduled but not executed. Of course I might be mistaken, but I have a strong feeling this isn't related to configuration.

I've attached the logs generated by the scheduler in scheduler.log. This shows a couple of minutes of the scheduler running; there's a single DAG running, and it transitions a couple of tasks to the scheduled state. I've also added a copy of the scheduler/knmi_weather.py logs. For completeness' sake I've also added the result of airflow config list.

knmi_weather.py.txt
scheduler.log.txt
config.txt

Finally, this is the result of executing the airflow info command.

System info
OS              | Linux
architecture    | x86_64
uname           | uname_result(system='Linux', node='b3dd1f534368', release='4.19.121-linuxkit', version='#1 SMP Tue Dec 1 17:50:32 UTC 2020', machine='x86_64', processor='')
locale          | ('en_US', 'UTF-8')
python_version  | 3.7.9 (default, Dec 11 2020, 15:01:14) [GCC 6.3.0 20170516]
python_location | /usr/local/bin/python

Tools info
git             | git version 2.11.0
ssh             | OpenSSH_7.4p1 Debian-10+deb9u7, OpenSSL 1.0.2u  20 Dec 2019
kubectl         | NOT AVAILABLE
gcloud          | NOT AVAILABLE
cloud_sql_proxy | NOT AVAILABLE
mysql           | mysql Ver 15.1 Distrib 10.1.47-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2
sqlite3         | NOT AVAILABLE
psql            | NOT AVAILABLE

Paths info
airflow_home    | /usr/local/airflow
system_path     | /usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
python_path     | /usr/local/bin:/usr/local/airflow/config:/usr/local/airflow/great_expectations/plugins}:/usr/local/airflow:/usr/local/lib/python37.zip:/usr/local/lib/python3.7:/usr/local/lib/python3.7/lib-dynload:/usr/local/lib/python3.7/site-packages:/usr/local/airflow/dags:/usr/local/airflow/plugins
airflow_on_path | True

Config info
executor             | CeleryExecutor
task_logging_handler | airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler
sql_alchemy_conn     | mysql://root:root@airflow_db:3306/airflow_dev
dags_folder          | /usr/local/airflow/dags
plugins_folder       | /usr/local/airflow/plugins
base_log_folder      | /usr/local/airflow/logs

Providers info
apache-airflow-providers-amazon | 1.0.0
apache-airflow-providers-celery | 1.0.0
apache-airflow-providers-ftp    | 1.0.0
apache-airflow-providers-http   | 1.0.0
apache-airflow-providers-imap   | 1.0.0
apache-airflow-providers-jdbc   | 1.0.0
apache-airflow-providers-mysql  | 1.0.0
apache-airflow-providers-sqlite | 1.0.0

Happy to provide extra information or spend some time investigating this issue. Unfortunately I'm pretty blank at this stage as to what the issue could be, so any help would be greatly appreciated.

@potiuk potiuk removed the invalid label Dec 29, 2020
@potiuk potiuk reopened this Dec 29, 2020
@potiuk potiuk added the priority:critical label Dec 29, 2020
@potiuk potiuk added this to the Airflow 2.0.1 milestone Dec 29, 2020
@potiuk
Member

potiuk commented Dec 29, 2020

Thanks for providing the information!

The information looks really insightful now.

I've reopened the issue and marked it as critical, to investigate and fix for 2.0.1. We are now in a holiday period, so it might take a few days before someone manages to investigate it and provide a fix - apologies for that (and sorry if the initial closing seemed a bit harsh). We have rather limited capacity on the core Airflow team, and we have to make sure we have enough information to act on before acknowledging such issues.

@bouke-nederstigt
Author

No problem, completely understandable. Just let me know if there's any extra info that would be helpful.

@smowden
Contributor

smowden commented Jan 4, 2021

I'm having the same issue. Tasks sometimes hang in the queued state and never get run. Manually setting the task to failed helps (the next DAG run works then), but obviously that is not a solution ;)

@ashb
Member

ashb commented Jan 4, 2021

Are you able to try this without Snowflake? It seems very unlikely to be the problem, but we have had problems caused by Snowflake in the past.

@bouke-nederstigt
Author

What exactly do you mean by running without Snowflake? Running a DAG that doesn't use Snowflake, or building the container without any Snowflake dependencies?

@ashb
Member

ashb commented Jan 4, 2021

Build the container without any Snowflake dependencies, please -- it does some pretty horrible monkey-patching of things, so it being installed at all may break things. (It's unlikely, but worth checking.)

@bouke-nederstigt
Author

bouke-nederstigt commented Jan 4, 2021

Just tried without the Snowflake dependencies (removed snowflake-connector-python and snowflake-sqlalchemy). Still ended up with the same issue.

@ashb
Member

ashb commented Jan 5, 2021

Can someone who is having this problem put together minimal reproduction steps so we can dig into this more, please?

@bouke-nederstigt
Author

Would it be useful if I share a zip file with Dockerfile, docker-compose and minimal dependencies?

@ashb
Member

ashb commented Jan 5, 2021

Would it be useful if I share a zip file with Dockerfile, docker-compose and minimal dependencies?

Yes please, and include any dag files that are needed as well.

@bouke-nederstigt
Author

bouke-nederstigt commented Jan 5, 2021

The zip should contain everything to reproduce the issue.

airflow_scheduler_issue.zip

  • From directory rabbitmq run docker-compose up
  • From directory airflow run docker-compose up
  • Create a user account and turn on the dag knmi_weather.
  • You should see the described behaviour where tasks get scheduled, but never reach a running state.

Webserver should be running on localhost:8080.

Let me know if you run into any issues with the Docker files. We tested running the containers on multiple computers, but you never know with these things.

@ashb
Member

ashb commented Jan 6, 2021

Thanks @bouke-nederstigt - I see the behaviour. Digging in to it now.

@ashb
Member

ashb commented Jan 6, 2021

I'm having the same issue. Tasks sometimes hang in the queued state and never get run. Manually setting the task to failed helps (the next DAG run works then), but obviously that is not a solution ;)

@smowden Tasks getting stuck in queued state is likely a different case.

@ashb
Member

ashb commented Jan 6, 2021

@bouke-nederstigt Okay, I've found the problem: max_tis_per_query=0 in the config is broken. A quick workaround for you for now is to set it to a large value (say 512).

We'll fix it so 0 works as documented in 2.0.1 -- this was a bug in Airflow (it turns out we don't have any tests that set it to 0)
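
For context, here is a minimal sketch of the pitfall (with hypothetical names - this is not Airflow's actual scheduler code): a value of 0 that the config documents as "no limit" becomes a hard limit of zero when passed straight through as a query/batch limit, so the scheduler selects zero task instances per loop.

  # Sketch of the bug, with hypothetical names (not Airflow's real code).
  def pick_tis_buggy(schedulable_tis, max_tis_per_query):
      # Passing 0 straight through behaves like SQL's LIMIT 0:
      # zero rows are selected, so nothing is ever queued.
      return schedulable_tis[:max_tis_per_query]

  def pick_tis_fixed(schedulable_tis, max_tis_per_query):
      # Treat 0 as "no limit", as the config comment documents.
      if max_tis_per_query == 0:
          return list(schedulable_tis)
      return schedulable_tis[:max_tis_per_query]

  tis = [f"ti_{i}" for i in range(5)]
  assert pick_tis_buggy(tis, 0) == []          # the bug: nothing scheduled
  assert pick_tis_fixed(tis, 0) == tis         # 0 means "no limit"
  assert pick_tis_fixed(tis, 2) == tis[:2]     # positive values cap the batch

And the workaround above, in airflow.cfg terms:

  [scheduler]
  # 0 is broken in 2.0.0; use a large value until 2.0.1
  max_tis_per_query = 512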

@ashb ashb added the affected_version:2.0 and area:Scheduler labels Jan 6, 2021
@ashb ashb changed the title Scheduler seems broken after 2.0 upgrade max_tis_per_query=0 leads to nothing being scheduled in 2.0.0 Jan 6, 2021
@bouke-nederstigt
Author

Problem solved indeed! Thanks a lot for the effort! :)

@vikramkoka vikramkoka added the kind:bug label Jan 18, 2021
@halilduygulu

Happened to me as well. I wasted half a day until I tried changing all the scheduler-related configs.
Please put this into UPDATING.md or something until 2.0.1 is out. This config existed in my 1.10.14 config with a value of 0, so people can't figure out to change it without spending a lot of time.
