Monday, July 19, 2021

Handling large simultaneous workloads using pub/sub?

I'm working on a problem where a large number of operations have to be kicked off simultaneously based on an event. For example, a user types a destination and dates and wants the best offer from over 200 "travel partners".

To satisfy this, I'm planning an event-driven architecture: when the user provides the appropriate input, a message is published to a topic, and a worker subscribed to that topic in turn generates additional events, one for each travel partner to get offers from.

So essentially:

  • (1) publish a message to the topic "TRAVEL_DESTINATION_REQUEST" when the user provides their input
  • (2) a worker is subscribed to this topic
  • (3) the worker at (2), for each travel partner in the system, publishes an event with data {date: ..., destination: ..., travel_partner_id: ..., etc.} to the topic FIND_OFFER
  • (4) workers subscribed to FIND_OFFER query the partner identified by travel_partner_id and persist the response somewhere

So with 200 travel partners, the above would push 200 events to the FIND_OFFER topic for workers to handle per user query; a rough sketch of the fan-out step follows below.
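To make the fan-out concrete, here is a minimal sketch of steps (2)-(3) in Python, assuming Google Cloud Pub/Sub; fan_out would be registered as the callback for a subscription on TRAVEL_DESTINATION_REQUEST, and the project ID, topic name, and get_travel_partner_ids() are hypothetical placeholders, not anything from the original setup.

```python
import json
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"        # placeholder project ID
FIND_OFFER_TOPIC = "FIND_OFFER"  # fan-out target topic

publisher = pubsub_v1.PublisherClient()
find_offer_topic = publisher.topic_path(PROJECT_ID, FIND_OFFER_TOPIC)


def get_travel_partner_ids():
    """Placeholder: real IDs would come from your partner registry."""
    return [f"partner-{i}" for i in range(1, 201)]


def fan_out(request_message):
    """Callback for the TRAVEL_DESTINATION_REQUEST subscription (steps 2-3):
    publish one FIND_OFFER event per travel partner."""
    request = json.loads(request_message.data.decode("utf-8"))
    futures = []
    for partner_id in get_travel_partner_ids():
        event = {
            "date": request["date"],
            "destination": request["destination"],
            "travel_partner_id": partner_id,
        }
        futures.append(
            publisher.publish(find_offer_topic, json.dumps(event).encode("utf-8"))
        )
    # Only ack the request once Pub/Sub has accepted all the messages,
    # so a crashed worker causes a redelivery rather than a lost query.
    for future in futures:
        future.result()
    request_message.ack()
```

One caveat with this pattern: Pub/Sub delivery is at-least-once, so the FIND_OFFER workers should be idempotent (e.g., upsert keyed on query + partner ID) in case the fan-out or a partner lookup is redelivered.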

Is this how you would go about solving a problem like this? If not, how would you approach it? Doing it sequentially is obviously not possible, since we can't have the user sit there waiting, and travel partner API calls may differ in response times...

In the GKE world, is Pub/Sub a good candidate for such an approach? Does anyone know if pod load-balancing would cause any issues with this model?
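For what it's worth, with a pull subscription each FIND_OFFER worker pod opens its own streaming-pull connection to the same subscription, and Pub/Sub spreads messages across all attached clients, so Kubernetes Service load-balancing shouldn't come into play for this model (it would matter for a push subscription hitting an HTTP endpoint). A minimal sketch of such a worker for step (4), again assuming the Python Pub/Sub client; query_travel_partner and persist_offer are hypothetical stand-ins for the partner API call and the storage write.

```python
import json
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"                        # placeholder project ID
FIND_OFFER_SUBSCRIPTION = "FIND_OFFER-workers"   # one subscription shared by all pods

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path(PROJECT_ID, FIND_OFFER_SUBSCRIPTION)


def query_travel_partner(partner_id, event):
    """Hypothetical stand-in for the partner's offer API call."""
    return {"partner": partner_id, "price": None}


def persist_offer(event, offer):
    """Hypothetical stand-in for writing the offer to a datastore."""
    pass


def handle_find_offer(message):
    """Step (4): query one travel partner and persist the response."""
    event = json.loads(message.data.decode("utf-8"))
    offer = query_travel_partner(event["travel_partner_id"], event)
    persist_offer(event, offer)
    message.ack()  # ack only after the result is safely persisted


# Cap in-flight messages per pod; Pub/Sub keeps feeding the other pods
# pulling from the same subscription, which is where the load balancing happens.
flow_control = pubsub_v1.types.FlowControl(max_messages=10)
streaming_pull = subscriber.subscribe(
    subscription, callback=handle_find_offer, flow_control=flow_control
)

with subscriber:
    streaming_pull.result()  # block; the pod keeps consuming until it is stopped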
