What are the best practices for logging data to a central entity/database that may be unavailable while the application keeps logging?
I'm logging data from a desktop application. The data should be stored in a SQL database that may be unavailable for some time (possibly days). The desktop application never reads from that database.
What I'm currently doing works, but it feels like a hack: the desktop application simply writes to a local SQLite database. Every table in that database has a 'modified' column that is set by triggers on inserts and updates. A second application polls the SQLite database for modified rows and tries to write them to the SQL server; if the rows are written successfully, it clears their modified flags.
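A minimal sketch of that setup, assuming a single table named log_entries and a placeholder write_to_server() that raises while the server is unreachable (both names are just for illustration, not my actual code):

```
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS log_entries (
    id       INTEGER PRIMARY KEY,
    payload  TEXT NOT NULL,
    modified INTEGER NOT NULL DEFAULT 1  -- new rows start flagged, covering inserts
);
CREATE TRIGGER IF NOT EXISTS log_entries_touch
AFTER UPDATE ON log_entries
WHEN NEW.modified = OLD.modified         -- skip the sync job's own flag reset
BEGIN
    -- Recursive triggers are off by default in SQLite, so this does not re-fire.
    UPDATE log_entries SET modified = 1 WHERE id = NEW.id;
END;
"""

def poll_and_sync(conn: sqlite3.Connection, write_to_server) -> None:
    """Push flagged rows to the server and clear the flag on success."""
    rows = conn.execute(
        "SELECT id, payload FROM log_entries WHERE modified = 1"
    ).fetchall()
    for row_id, payload in rows:
        try:
            write_to_server(row_id, payload)
        except Exception:
            continue  # server unavailable; the row stays flagged for the next poll
        conn.execute(
            "UPDATE log_entries SET modified = 0 WHERE id = ?", (row_id,)
        )
        conn.commit()
```

The schema is applied once with conn.executescript(SCHEMA), and poll_and_sync() runs on a timer in the second application.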
Alternatives I have thought of so far:
- A send-queue that buffers data until the server is available again. It seems Kafka offers something similar. One issue with this: if the system running the desktop application (or the application itself) crashes unexpectedly, the buffered data is lost, which I would like to avoid. (See the first sketch after this list.)
- Try to write directly to the SQL server; if that fails, write the data to a local database. Every now and then, try to write rows from the local database to the server and delete them on success. This is quite similar to the alternative above, but it ensures the data survives crashes (the local database effectively becomes the send-queue). (See the second sketch after this list.)
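To make the crash problem with the first alternative concrete, here is a deliberately minimal in-process send-queue; write_to_server() is again just a placeholder, and nothing here ever touches disk:

```
import queue
import time

pending: "queue.Queue[str]" = queue.Queue()

def enqueue(payload: str) -> None:
    pending.put(payload)  # returns immediately; the row lives only in memory

def sender_loop(write_to_server) -> None:
    """Meant to run on a background thread; retries until the server accepts."""
    while True:
        payload = pending.get()  # blocks until something is queued
        while True:
            try:
                write_to_server(payload)
                break
            except Exception:
                time.sleep(30)  # server down; retry in 30 seconds
```

Everything sitting in pending is lost if the process dies, which is exactly the drawback described above.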
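And a sketch of the second alternative, assuming a local outbox table and the same placeholder write_to_server(); the application would call flush_outbox() on a timer:

```
import sqlite3

def init_outbox(conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS outbox "
        "(id INTEGER PRIMARY KEY, payload TEXT NOT NULL)"
    )

def log_row(conn: sqlite3.Connection, write_to_server, payload: str) -> None:
    try:
        write_to_server(payload)  # happy path: straight to the SQL server
    except Exception:
        # Server unreachable: persist locally so a crash cannot lose the row.
        conn.execute("INSERT INTO outbox (payload) VALUES (?)", (payload,))
        conn.commit()

def flush_outbox(conn: sqlite3.Connection, write_to_server) -> None:
    """Drains buffered rows in insertion order while the server is up."""
    rows = conn.execute("SELECT id, payload FROM outbox ORDER BY id").fetchall()
    for row_id, payload in rows:
        try:
            write_to_server(payload)
        except Exception:
            return  # still down; keep the remaining rows for the next attempt
        conn.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
        conn.commit()
```

One caveat I see while sketching this: a crash between the server write and the local delete would resend that row later, so the server side would have to tolerate duplicates (e.g. via a client-generated key).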