I am using Spring Boot and PostgreSQL to handle client data. Generally the input is a file containing a huge amount of data, and I need to handle it in the best way.
Currently what I am doing:
1. Reading and storing records
I read the file, build a list of POJOs, and store it with saveAll(list).
Does that give me a proper way of saving records, performance-wise?
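For what it's worth, saveAll(list) on one huge list keeps every entity in the persistence context and, by default, sends one INSERT per row. A common improvement is to enable Hibernate JDBC batching and to save in fixed-size chunks, flushing and clearing between chunks. Below is a minimal sketch only; ClientRecord and ClientRecordRepository are assumed names, and the property values and batch size are examples to tune:

```java
// Assumed application.properties (standard Hibernate batching settings):
//   spring.jpa.properties.hibernate.jdbc.batch_size=50
//   spring.jpa.properties.hibernate.order_inserts=true
//   spring.datasource.url=jdbc:postgresql://localhost:5432/clients?reWriteBatchedInserts=true

import java.util.List;
import javax.persistence.EntityManager;      // jakarta.persistence on Spring Boot 3
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ClientRecordImportService {

    private static final int BATCH_SIZE = 50; // keep equal to hibernate.jdbc.batch_size

    private final ClientRecordRepository repository; // hypothetical JpaRepository<ClientRecord, Long>

    @PersistenceContext
    private EntityManager entityManager;

    public ClientRecordImportService(ClientRecordRepository repository) {
        this.repository = repository;
    }

    @Transactional
    public void importRecords(List<ClientRecord> records) {
        // Save in fixed-size chunks instead of one giant saveAll(list).
        for (int i = 0; i < records.size(); i += BATCH_SIZE) {
            List<ClientRecord> chunk = records.subList(i, Math.min(i + BATCH_SIZE, records.size()));
            repository.saveAll(chunk);
            entityManager.flush();  // push the chunk to the database as one JDBC batch
            entityManager.clear();  // detach saved entities so memory stays bounded
        }
    }
}
```

One caveat: Hibernate silently disables insert batching when the ID uses GenerationType.IDENTITY, so a SEQUENCE-based generator is needed for the batch_size setting to take effect. For very large files, Spring Batch or plain JdbcTemplate batch updates are also worth a look.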
2. Updating existing records
This part is even more complicated. I have to search for the records to update: the client request contains only the keys of rows that are already saved, and I need to find those records by those keys in order to update them.
The problem is that I cannot pass a list of unknown size to PostgreSQL through Hibernate. More precisely, a PostgreSQL IN clause supports only around 32,700 bound values for finding records.
Is there any workaround for processing lists of unknown size? I tried passing a JSON object to a PostgreSQL function, but Hibernate did not support that within a transaction.
So do I have to split the list into pieces small enough for the IN clause limitation, or is there another workaround for this problem?
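On the IN clause limit: the simplest workaround is indeed to split the keys into fixed-size chunks and query each chunk, which stays well under the bind parameter limit no matter how many keys arrive. A minimal sketch, assuming a Spring Data repository with a derived findByClientKeyIn method (the entity, key type, and chunk size are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ClientRecordUpdateService {

    private static final int IN_CHUNK_SIZE = 1000; // well below the driver's bind parameter limit

    private final ClientRecordRepository repository; // assumed to declare:
    // List<ClientRecord> findByClientKeyIn(List<String> keys);

    public ClientRecordUpdateService(ClientRecordRepository repository) {
        this.repository = repository;
    }

    @Transactional(readOnly = true)
    public List<ClientRecord> findByKeys(List<String> keys) {
        List<ClientRecord> found = new ArrayList<>();
        // Query the keys chunk by chunk so no single IN clause exceeds the limit.
        for (int i = 0; i < keys.size(); i += IN_CHUNK_SIZE) {
            List<String> chunk = keys.subList(i, Math.min(i + IN_CHUNK_SIZE, keys.size()));
            found.addAll(repository.findByClientKeyIn(chunk));
        }
        return found;
    }
}
```

An alternative that avoids the limit entirely is to send the keys as a single PostgreSQL array parameter (WHERE client_key = ANY(?)) via JdbcTemplate and java.sql.Connection.createArrayOf, or to load the keys into a temporary table and join against it; either way the statement carries one parameter regardless of how many keys there are.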
I am also eager to hear about other technologies that would handle this kind of processing in a better way.