Wednesday, April 14, 2021

Design pattern for a microservice that makes REST calls to an API that frequently times out

I have just inherited a project to redesign a Spring Boot microservice that

  1. parses a large CSV file,
  2. for each line, executes a REST client getForObject request against a remote API, and
  3. saves the resulting object to a file in S3 (a rough sketch of this flow is shown below).
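
To make the flow concrete, here is a minimal sketch of what the service does today, assuming Spring's RestTemplate and the AWS SDK v2 S3 client; the class, endpoint, and bucket names are placeholders, not the real ones:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

import org.springframework.web.client.RestTemplate;

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

public class CsvExportJob {

    private final RestTemplate restTemplate = new RestTemplate();
    private final S3Client s3 = S3Client.create();

    // Placeholders: the real endpoint and bucket come from configuration.
    private final String remoteApiUrl = "https://remote-api.example.com/objects/{id}";
    private final String bucket = "my-export-bucket";

    public void process(Path csvFile) throws Exception {
        List<String> lines = Files.readAllLines(csvFile);
        for (String line : lines) {
            String id = line.split(",")[0];

            // One GET per CSV line; this is the call that frequently times out.
            String body = restTemplate.getForObject(remoteApiUrl, String.class, id);

            // Save the raw response as an object in S3.
            s3.putObject(PutObjectRequest.builder().bucket(bucket).key(id + ".json").build(),
                    RequestBody.fromString(body));
        }
    }
}
```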

We have no control over the remote API, and we are limited to a single GET request per object. The CSV files can be tens of thousands of lines long, and the remote API frequently times out under heavy load, so processing these files can take a long time and frequently throws request-timeout exceptions. Resilience4j is already part of the code, but it has not helped much, as the remote API seems very brittle.
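
For reference, the kind of Resilience4j decoration we have been experimenting with looks roughly like the sketch below: a Retry with exponential backoff for timeouts (which surface as ResourceAccessException from RestTemplate), combined with a RateLimiter so we do not hammer the remote API ourselves. The names and limits are illustrative, not our actual configuration:

```java
import java.time.Duration;
import java.util.function.Supplier;

import org.springframework.web.client.ResourceAccessException;
import org.springframework.web.client.RestTemplate;

import io.github.resilience4j.core.IntervalFunction;
import io.github.resilience4j.ratelimiter.RateLimiter;
import io.github.resilience4j.ratelimiter.RateLimiterConfig;
import io.github.resilience4j.retry.Retry;
import io.github.resilience4j.retry.RetryConfig;

public class ResilientRemoteClient {

    private final RestTemplate restTemplate = new RestTemplate();

    // Retry transient failures (timeouts) with exponential backoff between attempts.
    private final Retry retry = Retry.of("remoteApi", RetryConfig.custom()
            .maxAttempts(4)
            .intervalFunction(IntervalFunction.ofExponentialBackoff(1000, 2.0))
            .retryExceptions(ResourceAccessException.class)
            .build());

    // Throttle our own call rate so we do not push the remote API into overload.
    private final RateLimiter rateLimiter = RateLimiter.of("remoteApi", RateLimiterConfig.custom()
            .limitForPeriod(10)                         // at most 10 calls...
            .limitRefreshPeriod(Duration.ofSeconds(1))  // ...per second
            .timeoutDuration(Duration.ofSeconds(30))    // wait up to 30s for a permit
            .build());

    public String fetch(String url, String id) {
        Supplier<String> call = () -> restTemplate.getForObject(url, String.class, id);
        return Retry.decorateSupplier(retry, RateLimiter.decorateSupplier(rateLimiter, call)).get();
    }
}
```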

I am wondering if anyone has dealt with this type of problem, or could suggest any design patterns, caching, or batching solutions that would allow us to

  1. process large CSV batches efficiently despite the lack of batching capabilities in the remote API (possibly via some type of caching solution), and
  2. save off the incomplete/error state of requests for error reporting and retry purposes (a rough sketch of what I mean follows this list).
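
As a starting point for item 2, I was picturing something like the sketch below: a simple per-request state tracker (in-memory here, but it could just as well be a database table) that records which IDs failed and why, so a later pass can retry only those. All of the names here are hypothetical:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

public class RequestStateTracker {

    public enum Status { PENDING, SUCCEEDED, FAILED }

    private final Map<String, Status> stateById = new ConcurrentHashMap<>();
    private final Map<String, String> lastErrorById = new ConcurrentHashMap<>();

    public void markPending(String id)   { stateById.put(id, Status.PENDING); }
    public void markSucceeded(String id) { stateById.put(id, Status.SUCCEEDED); }

    public void markFailed(String id, Exception e) {
        stateById.put(id, Status.FAILED);
        lastErrorById.put(id, e.getClass().getSimpleName() + ": " + e.getMessage());
    }

    // IDs that still need to be replayed in a later retry pass.
    public List<String> failedIds() {
        return stateById.entrySet().stream()
                .filter(e -> e.getValue() == Status.FAILED)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    // Dump failures to a CSV for error reporting; a real job would persist this durably.
    public void writeFailureReport(Path out) throws Exception {
        List<String> rows = failedIds().stream()
                .map(id -> id + "," + lastErrorById.getOrDefault(id, ""))
                .collect(Collectors.toList());
        Files.write(out, rows);
    }
}
```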

Thanks for any suggestions.
