
dynamic allocation executor pending for addition flood fix #396

Merged

Conversation

@raviranak (Contributor) commented Dec 15, 2023

Overview

Issue:
Currently, with dynamic allocation enabled, the number of executors pending addition grows far beyond the configured maximum number of executors.

Solution:
With dynamic auto-scaling, this change ensures that no additional pending executor actors are added beyond the max executors count; otherwise executors keep running even after the job has completed.
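A rough sketch of the capping logic described above (my own illustration, not the actual RayDP code): a new executor actor is only requested while the sum of running and pending executors stays below the configured maximum.

```scala
// Hypothetical sketch: cap pending executor actor requests so that
// running + pending never exceeds the configured maximum.
class ExecutorScaler(maxExecutors: Int) {
  private var runningExecutors: Int = 0
  private var pendingExecutors: Int = 0

  /** Request one more executor actor only if the cap allows it. */
  def maybeRequestExecutor(startActor: () => Unit): Boolean = synchronized {
    if (runningExecutors + pendingExecutors < maxExecutors) {
      pendingExecutors += 1
      startActor()
      true
    } else {
      // Already at the limit: drop the request instead of flooding
      // the cluster with pending executor actors.
      false
    }
  }

  /** A pending executor finished starting and registered. */
  def onExecutorRegistered(): Unit = synchronized {
    pendingExecutors = math.max(0, pendingExecutors - 1)
    runningExecutors += 1
  }

  /** An executor was removed (e.g. released by dynamic allocation). */
  def onExecutorRemoved(): Unit = synchronized {
    runningExecutors = math.max(0, runningExecutors - 1)
  }
}
```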

@raviranak (Contributor, Author)

@kira-lin Can you please take a look at this?

@kira-lin (Collaborator)

Sorry for the late reply. I have a question: do you mean that the executors killed by Spark due to dynamic resource allocation will try to restart?

@raviranak (Contributor, Author)

> Sorry for the late reply. I have a question: do you mean that the executors killed by Spark due to dynamic resource allocation will try to restart?

With dynamic resource allocation, the number of pending executor creations is far greater than the actual number of executors, so executors stay alive long after the Spark job has completed.
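For context, a minimal sketch of the standard Spark dynamic allocation settings involved; the config keys are standard Spark options, while the app name and values are purely illustrative.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative only: enable dynamic allocation and bound the executor count.
val spark = SparkSession.builder()
  .appName("dynamic-allocation-example")
  .config("spark.dynamicAllocation.enabled", "true")
  .config("spark.dynamicAllocation.minExecutors", "1")
  // The fix caps pending executor actors at this maximum.
  .config("spark.dynamicAllocation.maxExecutors", "10")
  // Idle executors should be released after this timeout.
  .config("spark.dynamicAllocation.executorIdleTimeout", "60s")
  .getOrCreate()
```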

@rishabh-dream11

@kira-lin Can you help with the review and merge?

@kira-lin kira-lin merged commit 01e851f into oap-project:master Apr 9, 2024
12 checks passed
carsonwang pushed a commit that referenced this pull request Jun 26, 2024
* dynamicallocation executor pending for addition flood fix

* scala checkstyle fix

* scala checkstyle fix

* scala checkstyle fix