
Allowing collecting index-specific internal metrics #23

Open

wangtzh opened this issue Feb 28, 2020 · 2 comments
Labels: enhancement (New feature or request)

@wangtzh
Member

wangtzh commented Feb 28, 2020

Some indexes may have their own sets of internal metrics that are meaningful to collect. For example, for an LSM tree like LevelDB we can collect compaction speed. These metrics are not included in the standard PiBench metrics and are not needed for every index. It would be convenient for the index/wrapper developer to define a set of extra metrics that PiBench can summarize, for easier stats gathering, plotting, etc.

I don't have a concrete design in mind, but one potential solution may be for the index itself to keep collecting these extra metrics (which it already does) and register a callback function with PiBench that is called after the benchmark is done. PiBench would then simply summarize and display these metrics.
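A rough sketch of how such a hook might look (purely hypothetical; `benchmark_t` and `register_metrics_callback` are made-up names for illustration, not part of the current PiBench API):

```cpp
// Hypothetical sketch: the wrapper registers a callback that returns its
// internal metrics as name/value pairs; PiBench invokes it once the
// benchmark finishes and prints the values next to its standard stats.
#include <cstdio>
#include <functional>
#include <map>
#include <string>

using metrics_callback_t = std::function<std::map<std::string, double>()>;

class benchmark_t
{
public:
    // Called by the wrapper (e.g. from its constructor) to register the hook.
    void register_metrics_callback(metrics_callback_t cb) { metrics_cb_ = std::move(cb); }

    void run()
    {
        // ... existing benchmark loop ...
        if (metrics_cb_)
        {
            for (const auto& [name, value] : metrics_cb_())
                std::printf("%s: %f\n", name.c_str(), value);
        }
    }

private:
    metrics_callback_t metrics_cb_;
};
```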

wangtzh added the enhancement (New feature or request) label on Feb 28, 2020
@llersch
Collaborator

llersch commented Mar 3, 2020

What if internal metrics are left completely for the wrapper to handle?
For example, in the case of LevelDB the wrapper might set up the options required for collecting metrics in the constructor and write the metrics to a file in the destructor.
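For LevelDB that could look roughly like the sketch below (hypothetical; it only shows the constructor/destructor pieces and the stock LevelDB `GetProperty("leveldb.stats")` call, not the actual PiBench wrapper interface):

```cpp
#include <fstream>
#include <string>

#include "leveldb/db.h"

class leveldb_wrapper
{
public:
    explicit leveldb_wrapper(const std::string& path)
    {
        // Open the database; real wrappers would also set any options
        // needed for the statistics they want to collect.
        leveldb::Options options;
        options.create_if_missing = true;
        leveldb::DB::Open(options, path, &db_);
    }

    ~leveldb_wrapper()
    {
        // Dump LevelDB's internal statistics (compaction info, etc.)
        // to a file before closing the database.
        std::string stats;
        if (db_ != nullptr && db_->GetProperty("leveldb.stats", &stats))
        {
            std::ofstream out("leveldb_internal_metrics.txt");
            out << stats;
        }
        delete db_;
    }

private:
    leveldb::DB* db_ = nullptr;
};
```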

Advantage: PiBench stays agnostic to internal metrics and we don't have to change anything, since it might be hard to design a generic interface that works for all possible wrappers.

Disadvantage: Any changes regarding the collection of internal metrics must be made directly in the wrapper source code and the wrapper must be re-compiled.

@wangtzh
Member Author

wangtzh commented Mar 4, 2020

Makes sense; I'm actually torn between the two. @JonghyeokPark is probably doing this at the wrapper level, so let's see how that goes.
