I have been reading multiple blogs where people say that we can use streams as a database (like this).
This would have obvious benefits for scalability, atomicity, etc. Take Kafka as an example: if we use Kafka as a datastore and there are 10 topics, we can theoretically accept 10 writes concurrently without worrying about collisions (as long as the writes go to different topics). By contrast, at a very low level in most other databases, only one thread at a time commits to the file system. (Is my understanding correct?)
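To make the per-topic independence claim concrete, here is a minimal sketch (all names hypothetical) where each "topic" is its own append-only log with its own lock, so concurrent producers writing to different topics never contend for the same commit point:

```python
import threading

# Hypothetical stand-in: each "topic" is an independent append-only log.
topics = {f"topic-{i}": [] for i in range(10)}
locks = {name: threading.Lock() for name in topics}

def produce(topic, event):
    # Only writers to the SAME topic serialize on this lock; writers to
    # different topics proceed fully in parallel.
    with locks[topic]:
        topics[topic].append(event)

# Ten producers, each targeting a different topic: no cross-topic contention.
threads = [
    threading.Thread(target=produce, args=(f"topic-{i}", f"event-{i}"))
    for i in range(10)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sum(len(log) for log in topics.values()))  # 10 events landed in total
```

This is only an analogy for the question's claim; real Kafka parallelism is per partition, and a single-topic workload can also scale by adding partitions.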
If I were to implement the above, I would use Kafka as an intermediate datastore, with listeners that keep pushing the data into some persistent store (for reads/aggregate views).
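The architecture I have in mind can be sketched as follows (a minimal, in-memory stand-in; `EventLog` plays the role of the Kafka topic and `Projector` plays the role of the listener that builds the read store; all names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class EventLog:
    """Stand-in for a Kafka topic: an append-only log of events."""
    events: list = field(default_factory=list)

    def produce(self, event):
        self.events.append(event)       # write path: cheap append, no read needed
        return len(self.events) - 1     # offset of the committed event

class Projector:
    """Listener that replays the log into a read store (the aggregate view)."""
    def __init__(self, log):
        self.log = log
        self.offset = 0                 # last processed position in the log
        self.view = {}                  # stand-in for the persistent read store

    def poll(self):
        # Apply any events appended since the last poll.
        while self.offset < len(self.log.events):
            entity_id, data = self.log.events[self.offset]
            self.view[entity_id] = data
            self.offset += 1

log = EventLog()
proj = Projector(log)
log.produce(("user-1", {"name": "alice"}))
proj.poll()                             # consumer catches up
print(proj.view["user-1"])              # {'name': 'alice'}
```

The key property is that writes only touch the log, while reads only touch the view, and the two are connected asynchronously by the projector.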
In many CRUD services, we have to read the entity before updating it. But with the above approach, there may be events that have not been processed yet. (I understand that this problem exists in other databases as well, as eventual consistency, and that it can be worked around with strong quorums; but in this case we have introduced one more point of failure, and the probability of being out of sync increases.) We would also have to build pipelines to notify the user whether their operation succeeded, which can no longer be done synchronously.
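The read-before-update hazard described above can be demonstrated with the same in-memory stand-ins (all names hypothetical): a write lands in the log, but a read served from the view is stale until the listener catches up.

```python
log = []     # stand-in for the Kafka topic (append-only event log)
view = {}    # stand-in for the persistent read store
offset = 0   # consumer's position in the log

def produce(event):
    log.append(event)

def project():
    # The listener: apply unprocessed events to the read store.
    global offset
    while offset < len(log):
        key, value = log[offset]
        view[key] = value
        offset += 1

produce(("balance", 100))
project()                    # consumer is caught up
produce(("balance", 50))     # new write committed to the log, not yet projected
stale = view["balance"]      # a CRUD read here still sees 100, not 50
project()                    # listener catches up
fresh = view["balance"]      # only now does the view show 50
print(stale, fresh)          # 100 50
```

Any read-modify-write cycle done against the view in that stale window would compute against outdated state, which is exactly the lag the question is worried about.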
Hence, what are the feasible use cases for this approach? Kafka is good as an event-sourcing database, but for writing a CRUD service I see too many issues. Am I missing anything?