I have a consumer A that handles one type of message. Each message carries a conversation_id, which consumer A uses to load the corresponding content from a database. Once it has done that, consumer A passes the message to a pool of B consumers, which do some work on the content and then save it back to the database.
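Roughly, the pipeline looks like this (a simplified sketch only; the real consumers sit on a message queue, and `load_content`, `save_content` and the thread pool here are just placeholders):

```python
import queue
import threading

work_queue = queue.Queue()  # hands loaded content from consumer A to the pool of B consumers

def load_content(conversation_id):
    # placeholder for the real database read
    return {"conversation_id": conversation_id, "data": "..."}

def save_content(content):
    # placeholder for the real database write
    pass

def consumer_a(message):
    """Consumer A: load the conversation content and forward it to the B pool."""
    content = load_content(message["conversation_id"])
    work_queue.put(content)

def consumer_b():
    """One worker of the B pool: do some work on the content, then save it."""
    while True:
        content = work_queue.get()
        # ... work on the content here ...
        save_content(content)
        work_queue.task_done()

# start the pool of B consumers
for _ in range(4):
    threading.Thread(target=consumer_b, daemon=True).start()
```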
The problem is that when two messages with the same conversation_id arrive at the same time, they both get the same content loaded, are then handled by two different B consumers, and the resulting saves are inconsistent.
I was thinking of putting a kind of lock on a conversation_id (directly in the DB) while it is being handled by consumer A, and unlocking it when consumer B has finished. The problem is: how do I handle incoming messages for a locked conversation in the meantime?
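To be concrete about what I mean by a lock in the DB, I was picturing something along these lines (only a sketch; conversation_locks is a hypothetical table that doesn't exist yet, and the SQL assumes PostgreSQL-style ON CONFLICT):

```python
# Sketch of the DB-level lock idea (conversation_locks is a hypothetical table,
# and the SQL assumes PostgreSQL's ON CONFLICT syntax).
ACQUIRE_LOCK = """
    INSERT INTO conversation_locks (conversation_id)
    VALUES (%s)
    ON CONFLICT (conversation_id) DO NOTHING
"""
RELEASE_LOCK = "DELETE FROM conversation_locks WHERE conversation_id = %s"

def try_lock(db, conversation_id):
    # consumer A would call this before loading the content
    cur = db.cursor()
    cur.execute(ACQUIRE_LOCK, (conversation_id,))
    db.commit()
    return cur.rowcount == 1  # True only if we actually took the lock

def unlock(db, conversation_id):
    # consumer B would call this after saving the content
    cur = db.cursor()
    cur.execute(RELEASE_LOCK, (conversation_id,))
    db.commit()

# If try_lock() returns False, the message belongs to a conversation that is
# already being processed -- and that is exactly where I don't know what to do.
```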
I have a hack-ish solution: consumer A keeps a list of pending_messages and a set of locked_ids. When a message arrives and its conversation_id is not in locked_ids, I add the conversation_id to locked_ids and pass the message on to a B consumer. If its conversation_id is already in locked_ids, I add the message to pending_messages. I then loop over pending_messages: whenever a conversation_id is no longer locked, I handle the first queued message with that conversation_id.
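Here is a sketch of what consumer A does in this version (dispatch_to_b and on_b_finished are placeholders for the real hand-off to the B pool and its completion callback, and in reality this state would also need to be made thread-safe):

```python
from collections import deque

locked_ids = set()          # conversation_ids currently being handled by a B consumer
pending_messages = deque()  # messages parked because their conversation_id is locked

def dispatch_to_b(message):
    # placeholder: hand the message off to the pool of B consumers
    pass

def on_message(message):
    cid = message["conversation_id"]
    if cid in locked_ids:
        # this conversation is already being worked on: park the message
        pending_messages.append(message)
    else:
        locked_ids.add(cid)
        dispatch_to_b(message)

def on_b_finished(conversation_id):
    # called once a B consumer has saved the content for this conversation
    locked_ids.discard(conversation_id)
    # replay the first pending message whose conversation is no longer locked
    for i, message in enumerate(pending_messages):
        if message["conversation_id"] not in locked_ids:
            del pending_messages[i]
            on_message(message)
            break
```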
I find this solution a bit over-engineered because it forces consumer A to manage locks and queues internally.
Is there a better design for this problem using only the MQ?
Thanks in advance