Hi
I know the README states explicitly that the Kafka notion of 'consumer groups' has been avoided. However, consumer groups solve one big problem: scaling consumers.
For example, consider a simple scenario: I have one stream and publish 'VariableChanged' events to it. Now I want a consumer to subscribe to the stream and validate each change; I will call this little consumer the 'Validation Service'. The problem is that validating a change is a long, complex process, and I am publishing lots of 'VariableChanged' events.
So I want to scale out my Validation Service. To do this, I need to guarantee that only one of my Validation Service instances will receive any given event; my scaling won't achieve anything if they are all processing the same event!
This is where consumer groups come in. Without this feature in dafka, is there any way to solve this (pretty fundamental) scaling problem?
Hi @JamesRamm,
that's a fair point regarding consumer groups.
For now, you'll need a single known consumer that receives the VariableChanged events and distributes them to a set of workers.
This could be achieved with ZeroMQ PUSH/PULL sockets.
It's not as convenient as Kafka's consumer group feature, because the workers need to connect to that consumer, but in terms of scaling it will work.
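A minimal sketch of that pattern, assuming pyzmq is available: the single known consumer PUSHes events to a pool of workers, and ZeroMQ load-balances each message to exactly one connected PULL socket. The `receive_variable_changed` and `validate_change` functions are hypothetical placeholders for the actual Dafka subscription loop and the validation work.

```python
import sys
import time
import zmq

def receive_variable_changed():
    """Placeholder for the single Dafka consumer that subscribes to the
    stream; here it just fabricates an event for demonstration."""
    time.sleep(0.1)
    return {"event": "VariableChanged", "ts": time.time()}

def validate_change(event):
    """Placeholder for the long-running validation work."""
    time.sleep(1.0)
    print("validated", event)

def run_distributor(endpoint="tcp://*:5557"):
    # The one known consumer: receives events and PUSHes them out.
    # ZeroMQ PUSH round-robins across all connected PULL workers,
    # so each event is delivered to exactly one worker.
    push = zmq.Context.instance().socket(zmq.PUSH)
    push.bind(endpoint)
    while True:
        push.send_json(receive_variable_changed())

def run_worker(endpoint="tcp://localhost:5557"):
    # Each worker connects its PULL socket to the distributor and
    # processes only the events it is handed.
    pull = zmq.Context.instance().socket(zmq.PULL)
    pull.connect(endpoint)
    while True:
        validate_change(pull.recv_json())

if __name__ == "__main__":
    run_worker() if sys.argv[1:] == ["worker"] else run_distributor()
```

Run one distributor and as many `worker` processes as you need; adding workers increases throughput without any two workers validating the same event.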