Tuesday, September 1, 2020

How to improve the performance of an application that processes files of more than 1000 lines each and hits the database for every line

I am talking about a Java 7 application. The application continuously takes files as input, and each file contains thousands of lines. For each line, the application fetches a single value from an Oracle database based on a parameter taken from that line (each parameter maps to exactly one value) and sets that value on the output.
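To make the current flow concrete, here is a rough sketch of what the per-line lookup looks like today. The table and column names (lookup_values, param, value) are placeholders, not the real schema, and the parsing and output steps are stubbed out:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class PerLineLookup {

        // Placeholder query; the real table and column names are not shown here.
        private static final String SQL = "SELECT value FROM lookup_values WHERE param = ?";

        public void process(Connection conn, String file) throws Exception {
            try (PreparedStatement ps = conn.prepareStatement(SQL);
                 BufferedReader reader = new BufferedReader(new FileReader(file))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    String param = extractParam(line);        // parse the lookup key from the line
                    ps.setString(1, param);
                    try (ResultSet rs = ps.executeQuery()) {  // one database round trip per line
                        String value = rs.next() ? rs.getString(1) : null;
                        writeOutput(line, value);             // set the fetched value on the output
                    }
                }
            }
        }

        private String extractParam(String line) { return line; }   // file-format specific
        private void writeOutput(String line, String value) { }     // application specific
    }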

Performance is very poor when the file is large, because the application hits the database once for every line it parses. My idea was to fetch all the values from the database up front and cache them in a HashMap, so that each line could be resolved with a map lookup instead of a database call. The values in the database are not constant; they are updated every 15 minutes, so the plan was to reload the map from the database every 15 minutes. The problem is that the table I want to fetch holds around a million rows, and that number will grow, so caching it all in memory does not seem like a good idea and could itself hurt performance.
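For clarity, here is a rough sketch of what I had in mind with the 15-minute cache. It assumes the same placeholder lookup_values table as above; a single-threaded scheduler reloads the whole table into a map and swaps it in through a volatile reference:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import javax.sql.DataSource;

    public class LookupCache {

        // Placeholder query; the real table and column names are not shown here.
        private static final String SQL = "SELECT param, value FROM lookup_values";

        private final DataSource dataSource;
        // Replaced atomically on every refresh so readers never see a half-built map.
        private volatile Map<String, String> cache = new HashMap<String, String>();

        public LookupCache(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        public void start() {
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(new Runnable() {
                public void run() {
                    try {
                        refresh();
                    } catch (Exception e) {
                        // keep serving the previous snapshot if a refresh fails
                        e.printStackTrace();
                    }
                }
            }, 0, 15, TimeUnit.MINUTES);
        }

        public String lookup(String param) {
            return cache.get(param);
        }

        private void refresh() throws Exception {
            Map<String, String> fresh = new HashMap<String, String>();
            try (Connection conn = dataSource.getConnection();
                 PreparedStatement ps = conn.prepareStatement(SQL);
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    fresh.put(rs.getString(1), rs.getString(2));
                }
            }
            cache = fresh; // swap in the new snapshot
        }
    }

Lookups never block during a refresh because readers keep using the old snapshot until the new map is swapped in. But the memory footprint of holding a million-plus entries is exactly the part I am unsure about.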

So please let me know how I can improve the performance of this system. I would be really grateful.
