Saturday, July 4, 2020

Importing data into a database from Kafka

I have a general question about the "ethics" of integrating messages into a database via a Kafka consumer. I always recommend keeping a separate consumer for each topic, since each type of message tends to need its own processing logic. In practice that means a Spring-based Kafka consumer with a CrudRepository doing repo.save().

My colleagues, who are SQL veterans, want the consumers to be generic instead, with the processing logic running at the DB level (stored procedures, of course). All the consumer does is poll the topic and dump the messages into a staging table, and the integration is then taken care of by the stored procedure.

The reason I resist this is that repo.save() maintains a clean ORM mapping between the message and the DB table, so unwarranted changes at either end stay somewhat controllable. The stored-procedure approach, by contrast, gives you free rein to handle the data however you wish, which could invite table structure changes. Could anybody beat some sense into me as to what would be good design practice here?!
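For context, here is a minimal sketch of the per-topic approach I have in mind. The names (OrderEvent, OrderRecord, OrderRepository, the "orders" topic and group id) are hypothetical, and JSON deserialization of the payload is assumed to be configured elsewhere in the Spring Kafka setup.

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.repository.CrudRepository;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

// Incoming message payload for the (hypothetical) "orders" topic.
class OrderEvent {
    public Long id;
    public String customer;
    public double amount;
}

// JPA entity the message is mapped to; the ORM keeps this in step with the table.
@Entity
class OrderRecord {
    @Id
    Long id;
    String customer;
    double amount;

    protected OrderRecord() {} // required by JPA

    OrderRecord(Long id, String customer, double amount) {
        this.id = id;
        this.customer = customer;
        this.amount = amount;
    }
}

interface OrderRepository extends CrudRepository<OrderRecord, Long> {}

@Service
class OrderEventConsumer {

    private final OrderRepository repository;

    OrderEventConsumer(OrderRepository repository) {
        this.repository = repository;
    }

    // One dedicated listener per topic: the mapping and any validation for this
    // message type live here in application code, not in a stored procedure.
    @KafkaListener(topics = "orders", groupId = "orders-db-writer")
    public void consume(OrderEvent event) {
        repository.save(new OrderRecord(event.id, event.customer, event.amount));
    }
}
```

The generic alternative my colleagues prefer would replace the listener body with a plain insert into a staging table (e.g. via JdbcTemplate) followed by a stored-procedure call that does the actual integration.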
