
Add threadiness to helm chart #643

Merged: 1 commit into fluxcd:master on Jul 9, 2020

Conversation

mvollman (Contributor) commented on Jul 6, 2020

Add `threadiness` as an option to the Helm chart so the value can be raised beyond the default of 2 threads. Bump the chart version to 1.1.0 to reflect the new chart feature.
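
For illustration only, here is a minimal sketch of how such an option is typically wired through a Helm chart. The `threadiness` name and the default of 2 come from this PR's description; the file layout and the `-threadiness` container flag are assumptions for the sketch, not a copy of Flagger's actual chart sources.

```yaml
# --- values.yaml (sketch) ---
# Number of controller workers; the default of 2 matches the PR description.
threadiness: 2

# --- templates/deployment.yaml (sketch) ---
# Forward the value to the controller container as a command-line flag.
spec:
  template:
    spec:
      containers:
        - name: flagger
          args:
            - -threadiness={{ .Values.threadiness }}
```

With the chart wired this way, the worker count could be raised at install time, for example by passing `--set threadiness=8` to `helm upgrade`.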

stefanprodan (Member) commented
@mvollman changing the number of workers doesn't make any difference; Flagger has its own scheduling mechanism, and each canary gets a dedicated goroutine.

mvollman (Contributor, Author) commented on Jul 6, 2020

We have increased threadiness for performance reasons in both of our k8s clusters. I set it manually before forking the helm chart. Once we got the responsiveness we wanted from the operator, I tried setting threadiness back to the default and saw the length of time between canary steps increase. I don't have good data to share, but I will try setting it back to the default and gather some timings.

@@ -134,6 +134,7 @@ Parameter | Description | Default
`resources.limits/memory` | Pod memory limit | `512Mi`
`affinity` | Node/pod affinities | None
`nodeSelector` | Node labels for pod assignment | `{}`
`threadiness` | Number of k8s wortkers | `2`

Suggested change (review comment, Member):

- `threadiness` | Number of k8s wortkers | `2`
+ `threadiness` | Number of controller workers | `2`

stefanprodan (Member) left a comment:

LGTM

stefanprodan (Member) commented
@mvollman are you using EKS? This could be related to #638

mvollman (Contributor, Author) commented on Jul 7, 2020

It is in fact the exact same issue. I am looking into the rate limiting now. I will collect some more data around timings. The rate limiting makes sense, but I did also see an improvement with the threadiness increase.

stefanprodan merged commit 997e7be into fluxcd:master on Jul 9, 2020