
[RIP-27] [Feature] Auto batch in producer #3717

Closed

guyinyou opened this issue Jan 6, 2022 · 3 comments

Comments

guyinyou (Contributor) commented Jan 6, 2022

  1. Please describe the feature you are requesting.
     Sending messages in batches gives very good throughput, but currently the business side has to assemble the batches manually, which is not a good experience.

  2. Provide any additional detail on your proposed use case for this feature.
     Add automatic message batching so that the business side only needs to call the single-message send interface and still gets the throughput of batch sending (see the sketch after this list).

  3. Indicate the importance of this issue to you (blocker, must-have, should-have, nice-to-have). Are you currently using any workarounds to address this issue?
     nice-to-have

  4. If there are sub-tasks, use -[ ] for each subtask and create a corresponding issue to map to the sub-task:
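
A minimal sketch of how such auto-batching could sit on top of the existing client API, assuming a hypothetical `AutoBatchProducer` wrapper around `DefaultMQProducer`; the `maxBatchCount` and `lingerMillis` parameters are made up for illustration, and this is not the RIP-27 implementation itself:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.apache.rocketmq.client.producer.DefaultMQProducer;
import org.apache.rocketmq.common.message.Message;

/**
 * Hypothetical wrapper that accumulates single messages and sends them to
 * the broker as one batch, either when the buffer reaches a count limit or
 * when a linger timer fires. Illustration of the idea only.
 */
public class AutoBatchProducer {
    private final DefaultMQProducer producer;
    private final int maxBatchCount;                 // flush when this many messages are buffered
    private final List<Message> buffer = new ArrayList<>();
    private final ScheduledExecutorService flusher =
            Executors.newSingleThreadScheduledExecutor();

    public AutoBatchProducer(DefaultMQProducer producer,
                             int maxBatchCount, long lingerMillis) {
        this.producer = producer;
        this.maxBatchCount = maxBatchCount;
        // Flush periodically so messages are not delayed indefinitely
        // under low traffic.
        flusher.scheduleAtFixedRate(this::flush, lingerMillis, lingerMillis,
                TimeUnit.MILLISECONDS);
    }

    /** Caller-facing API: looks like a single-message send. */
    public synchronized void send(Message msg) {
        buffer.add(msg);
        if (buffer.size() >= maxBatchCount) {
            flush();
        }
    }

    /** Drain the buffer and send the accumulated messages as one batch. */
    public synchronized void flush() {
        if (buffer.isEmpty()) {
            return;
        }
        List<Message> batch = new ArrayList<>(buffer);
        buffer.clear();
        try {
            // DefaultMQProducer can send a collection of messages (all for
            // the same topic) in a single batch request.
            producer.send(batch);
        } catch (Exception e) {
            // A real design would retry or surface the failed batch to the
            // caller; here we only log it.
            e.printStackTrace();
        }
    }

    public void shutdown() {
        flusher.shutdown();
        flush();
    }
}
```

With a wrapper like this, the business side keeps calling `send(msg)` for individual messages while the batching and flushing policy is handled internally.
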

RongtongJin (Contributor)

github-actions (bot)

This issue is stale because it has been open for 365 days with no activity. It will be closed in 3 days if no further activity occurs.

github-actions bot added the stale label on Jan 27, 2023
github-actions (bot)

This issue was closed because it has been inactive for 3 days since being marked as stale.

RongtongJin pushed a commit that referenced this issue Jul 21, 2023