
Store benchmark test results from automated runs in a dedicated webserver #3

Open
ksatchit opened this issue Sep 11, 2017 · 1 comment
Comments

@ksatchit
Member

The benchmark test suite must be run automatically on every major build release, on a setup/environment for which baseline numbers have been identified.

The results of these periodic benchmark runs need to be stored on a dedicated webserver, with appropriate naming conventions and build details, for ready reference.
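A minimal sketch of what such a publishing step might look like, assuming a hypothetical `RESULTS_SERVER` HTTP endpoint on the dedicated webserver, a `BUILD_TAG` environment variable supplied by the CI job, and a `results.json` summary written by the benchmark suite; none of these names are existing project conventions. The uploaded file name encodes the build tag and run timestamp so a result can be traced back to its build:

```python
#!/usr/bin/env python3
"""Sketch: upload a benchmark result file to a dedicated webserver.

Assumptions (illustrative only, not project conventions): the webserver
exposes a simple HTTP PUT endpoint, the CI job sets BUILD_TAG, and the
benchmark suite writes its summary to results.json.
"""
import os
import time

import requests  # third-party: pip install requests

# Hypothetical endpoint on the dedicated webserver for benchmark artifacts.
RESULTS_SERVER = os.environ.get("RESULTS_SERVER", "http://benchmarks.example.com/results")


def upload_results(result_file: str = "results.json") -> str:
    build_tag = os.environ.get("BUILD_TAG", "unknown-build")
    timestamp = time.strftime("%Y%m%d-%H%M%S")
    # Naming convention: <build-tag>_<timestamp>_<original-file-name>
    remote_name = f"{build_tag}_{timestamp}_{os.path.basename(result_file)}"
    with open(result_file, "rb") as fh:
        resp = requests.put(f"{RESULTS_SERVER}/{remote_name}", data=fh, timeout=30)
    resp.raise_for_status()
    return remote_name


if __name__ == "__main__":
    print("stored as", upload_results())
```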

@ksatchit ksatchit self-assigned this Sep 11, 2017
@ksatchit
Member Author

ksatchit commented Oct 3, 2017

Please refer to this issue on the OpenEBS repo: openebs/openebs#386 for more details on the same topic.
