Monday, February 22, 2016

Design to read 100K records from a flat file and process them into a database in Java

I am designing an API in Java with the Spring Framework that will read a flat file containing 100K records and compare them with values fetched from a database. If a database value is also present in the file, the corresponding row will be updated in the database.
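As a starting point, the file side of this comparison can be kept cheap by streaming the file and indexing records by their key. A minimal sketch, assuming a comma-separated `id,value` record layout (the actual format, the `Record` class, and the method names are all illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Stream;

public class FlatFileParser {

    // Simple holder for one parsed record (hypothetical field names).
    public static final class Record {
        public final String id;
        public final String value;
        public Record(String id, String value) {
            this.id = id;
            this.value = value;
        }
    }

    // Parse one flat-file line; the "id,value" split is an assumption.
    public static Record parseRecord(String line) {
        String[] parts = line.split(",", 2);
        return new Record(parts[0].trim(), parts[1].trim());
    }

    // Index the file's records by id so each lookup against the
    // database snapshot is O(1) instead of a scan.
    public static Map<String, Record> index(Stream<String> lines) {
        Map<String, Record> byId = new HashMap<>();
        lines.map(FlatFileParser::parseRecord)
             .forEach(r -> byId.put(r.id, r));
        return byId;
    }
}
```

In practice the `Stream<String>` would come from `Files.lines(path)`, so the whole 100K-line file never has to sit in memory as raw text.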

The main concern in the entire process is performance: I have a maximum of 7 minutes to perform the entire processing of the 100K records.

I am looking to use a caching mechanism:

1. Fetch all the data from the database into a cache bean; the cache will be refreshed every 30 minutes or 1 hour.
2. Read the file and compare its values with the values in the cache; the matched values are stored in a second cache.
3. Update the values from the second cache to the database using a threading mechanism.
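The three steps above can be sketched roughly as follows. This is only an outline under assumptions: records are keyed by a `String` id, the DB snapshot lives in a `ConcurrentHashMap` so a scheduled refresh can swap values safely, and the table/column names in the UPDATE are placeholders, not the real schema. None of the class or method names are Spring APIs:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ReconciliationSketch {

    // Step 1: the DB-side cache (id -> current DB value). In Spring the
    // refresh could be driven by @Scheduled every 30 minutes.
    private final Map<String, String> dbCache = new ConcurrentHashMap<>();

    public void refresh(Map<String, String> snapshot) {
        dbCache.clear();
        dbCache.putAll(snapshot);
    }

    // Step 2: compare file records against the cache. Here a record
    // "matches" when its id exists in the DB with a different value,
    // i.e. it needs an UPDATE.
    public List<Map.Entry<String, String>> matches(Map<String, String> fileRecords) {
        List<Map.Entry<String, String>> toUpdate = new ArrayList<>();
        for (Map.Entry<String, String> e : fileRecords.entrySet()) {
            String dbValue = dbCache.get(e.getKey());
            if (dbValue != null && !dbValue.equals(e.getValue())) {
                toUpdate.add(e);
            }
        }
        return toUpdate;
    }

    // Step 3: push the matched records back with JDBC batching, which is
    // usually the biggest lever for a tight time budget. Table and column
    // names are assumed. Worker threads could each take a slice of
    // toUpdate and run this with their own Connection.
    public static int[] batchUpdate(Connection conn,
                                    List<Map.Entry<String, String>> toUpdate)
            throws SQLException {
        String sql = "UPDATE records SET value = ? WHERE id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (Map.Entry<String, String> e : toUpdate) {
                ps.setString(1, e.getValue());
                ps.setString(2, e.getKey());
                ps.addBatch();
            }
            return ps.executeBatch();
        }
    }
}
```

For DB2 over JDBC, batching the UPDATEs (rather than issuing 100K single statements) and splitting the batch list across a small thread pool is where most of the 7-minute budget is likely to be won or lost.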

I need some opinions on this design. Does it look good? Any advice to improve the design is welcome.

P.S.: The database in use is DB2, hosted on a mainframe system.

Thanks,
Nirmalya
