
Automatically archive historical data and provide downsampling capabilities #4697

Open
killme2008 opened this issue Sep 9, 2024 · 0 comments
Labels
C-enhancement Category Enhancements

Comments

@killme2008 (Contributor)

What type of enhancement is this?

API improvement, User experience

What does the enhancement do?

Detailed metrics or log data have relatively high storage costs.
Typically, once data exceeds its TTL or reaches a certain age, it can be downsampled to reduce storage costs.
For example, retaining only error-level logs, or aggregating millisecond-level data into minute-level data with specified aggregation functions, further reduces the storage cost of historical data while keeping it available for queries.
It would be very convenient if GreptimeDB offered the ability to create archiving tasks with customizable archiving rules and computations.
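The kind of downsampling described above (aggregating millisecond-level points into coarser minute-level buckets with a chosen aggregation function) can be sketched roughly as follows. This is an illustrative sketch only; the function name, data shapes, and parameters are assumptions for the example, not GreptimeDB APIs.

```python
from collections import defaultdict

def downsample(points, bucket_ms=60_000, agg=max):
    """Aggregate (timestamp_ms, value) points into coarser buckets.

    `bucket_ms` is the target resolution (60_000 ms = 1 minute) and
    `agg` is the aggregation function applied per bucket.
    All names here are hypothetical, for illustration only.
    """
    buckets = defaultdict(list)
    for ts, value in points:
        # Align each timestamp to the start of its bucket.
        buckets[ts - ts % bucket_ms].append(value)
    return sorted((ts, agg(values)) for ts, values in buckets.items())

# Millisecond-level samples spanning two minutes:
points = [(0, 1.0), (30_500, 3.0), (59_999, 2.0), (60_001, 5.0)]
print(downsample(points))  # → [(0, 3.0), (60000, 5.0)]
```

An archiving task as proposed would essentially run such an aggregation server-side on data past its TTL, with the rule (bucket size, aggregation function, or a filter such as log level) supplied by the user.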

Implementation challenges

No response

@killme2008 killme2008 added the C-enhancement Category Enhancements label Sep 9, 2024