Preferred production-ready eventing model: Broker & Trigger (with Kafka Broker) vs. Channel & Subscription (with Kafka Channel) #7217
Replies: 2 comments
-
Based on my understanding, a potential failure scenario would be: during a big sale event, the e-commerce platform receives a surge in orders. The order-validator service starts to experience latency due to the high volume of validation it needs to perform, and a bug in the validation logic causes the service to crash intermittently.

Observations with the source-to-sink model:
Message backlog: As the order-validator service slows down and crashes, it can't process events as quickly as they arrive. This leads to a backlog of messages in the Kafka topic.
Loss of events: If the KafkaSource is not configured with retries, or if the retry strategy is not robust enough to cover the order-validator's downtime, some events might be lost or never processed.
Order of events: If the KafkaSource processes events in parallel and the order-validator service is not designed to handle out-of-order events, this could lead to issues like an order being shipped before its payment is processed.

Alternative approach with Broker & Trigger or Channel & Subscription:
Kafka Channel & Subscription: Instead of sending events from the KafkaSource directly to the order-validator, events are sent to a Kafka Channel, and multiple Subscriptions can be set up for different services to consume events. This decouples the event producers from the consumers.
Retries & dead letter sink: With the Channel & Subscription model, we can configure retries. If the order-validator service fails to process an event after several retries, the event can be sent to a Dead Letter Sink, ensuring it's not lost. A sketch of that wiring is below.

But I am in doubt about which way to go: Broker & Trigger or Channel & Subscription. Looking for everyone's input.
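To make the Channel & Subscription option concrete, here is a minimal sketch. All names (order-events, order-validator, failed-orders) and the sizing and retry values are hypothetical placeholders, not a recommendation:

```yaml
# A KafkaChannel holding the order events; partitions/replication are illustrative.
apiVersion: messaging.knative.dev/v1beta1
kind: KafkaChannel
metadata:
  name: order-events
spec:
  numPartitions: 10
  replicationFactor: 3
---
# A Subscription routing channel events to the validator, with retries
# and a dead letter sink so exhausted deliveries are parked, not dropped.
apiVersion: messaging.knative.dev/v1
kind: Subscription
metadata:
  name: order-validator-sub
spec:
  channel:
    apiVersion: messaging.knative.dev/v1beta1
    kind: KafkaChannel
    name: order-events
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-validator
  delivery:
    retry: 5
    backoffPolicy: exponential
    backoffDelay: PT0.5S
    deadLetterSink:
      ref:
        apiVersion: serving.knative.dev/v1
        kind: Service
        name: failed-orders
```

The delivery block is what addresses the loss-of-events scenario: once retries are exhausted, the event is diverted to the dead letter sink instead of being dropped.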
-
For the Knative Kafka Broker and Channel, the data plane is the same code base, and both support the same delivery spec (retries, backoff, and a dead letter sink). However, I'd go with the Broker, since that has IMO the more flexible approach: the Broker can be seen as an event mesh, where all the consumers are loosely coupled and just do their work. A rough sketch of that wiring is below; for the Broker I also wrote a piece on recommended configuration.
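Sketching the Broker-based wiring, assuming the Kafka Broker class and a kafka-broker-config ConfigMap; the trigger name, filter type, and service names are hypothetical:

```yaml
# A Kafka-backed Broker; the broker.class annotation selects the Kafka
# data plane, and spec.config points at the cluster/topic settings.
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: default
  annotations:
    eventing.knative.dev/broker.class: Kafka
spec:
  config:
    apiVersion: v1
    kind: ConfigMap
    name: kafka-broker-config
    namespace: knative-eventing
  delivery:
    retry: 5
    backoffPolicy: exponential
    backoffDelay: PT0.5S
    deadLetterSink:
      ref:
        apiVersion: serving.knative.dev/v1
        kind: Service
        name: failed-orders
---
# A Trigger subscribing the validator to one event type on the mesh.
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: order-validator-trigger
spec:
  broker: default
  filter:
    attributes:
      type: com.example.order.created
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-validator
```

Each Trigger is consumed independently of the others, which is what makes the event-mesh picture work: you add or remove consumers by adding or removing Triggers, without touching producers or the Broker.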
-
Dear Knative community,
I've been exploring Knative Eventing for a production system and am trying to determine the most reliable and scalable eventing model. I've decided against the direct source-to-sink model due to concerns about reliability (that wiring is sketched after the two options below, for reference). Now, I'm torn between two primary models:
Broker & Trigger backed by a Kafka Broker: This model seems to offer a clean separation between event producers and consumers, with the broker acting as a central hub. The Kafka-backed broker would provide the durability and reliability of Kafka.
Channel & Subscription backed by a Kafka Channel: This model provides a direct channel for events, with the Kafka channel ensuring message durability. The subscription mechanism allows for decoupled event consumption.
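For reference, this is roughly the direct source-to-sink wiring I'm moving away from; the names, consumer group, and bootstrap address are placeholders:

```yaml
# Direct source-to-sink: every event goes straight from the topic
# to a single service, with no intermediate broker or channel.
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: orders-source
spec:
  consumerGroup: orders-consumer-group
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092
  topics:
    - orders
  sink:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-validator
```

With this wiring the topic is coupled to one consumer, and there is none of the decoupling or dead-lettering the two models above provide.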
I'd appreciate insights on the following:
Reliability: Which model offers better guarantees in terms of message delivery, especially in scenarios where consumers might be temporarily unavailable?
Scalability: As the system grows, which model scales better in terms of handling a large number of events and ensuring low latency? (The broker configuration knobs I have in mind are sketched after this list.)
Operational Complexity: From an operational standpoint, which model is easier to manage, monitor, and troubleshoot?
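For context, this is the per-broker ConfigMap a Kafka-backed Broker points at, which as far as I understand controls the sizing side of these questions; the cluster address and numbers are placeholders:

```yaml
# Settings for topics the Kafka Broker creates; values are illustrative.
apiVersion: v1
kind: ConfigMap
metadata:
  name: kafka-broker-config
  namespace: knative-eventing
data:
  bootstrap.servers: "my-cluster-kafka-bootstrap.kafka:9092"
  default.topic.partitions: "10"
  default.topic.replication.factor: "3"
```

The partition count bounds consumer parallelism and the replication factor governs durability, so these two knobs sit at the heart of the scalability/reliability trade-off I'm asking about.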
Any experiences, lessons learned, or recommendations from those who have deployed Knative Eventing in production would be invaluable.
Thank you in advance for your insights!