RE: How to bulk insert data in the DB from Liferay

Olaf Kock, modified 3 Years ago. New Member Posts: 3 Join Date: 6/18/20

Hi,

I need to insert 300k+ records into a custom table from a CSV file. Using a foreach loop with Liferay Service Builder, I get an OutOfMemoryException.

I have seen two options:

The first: using Hibernate batch inserts, calling flush and clear periodically to keep the session cache under control (see the sketch below).

The second: using the Liferay Batch Engine, but I don't know whether that is overkill for this requirement.
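Roughly what I mean with the first option - just a sketch, assuming I can get hold of the underlying Hibernate Session; the entity type and batch size are placeholders:

```java
import org.hibernate.Session;
import org.hibernate.Transaction;

// Sketch of option 1: flush and clear the Hibernate session
// periodically so the first-level cache doesn't grow with every row.
public class HibernateBatchInsert {

    private static final int BATCH_SIZE = 500;

    public void insertAll(Session session, Iterable<?> entities) {
        Transaction tx = session.beginTransaction();

        int count = 0;

        for (Object entity : entities) {
            session.persist(entity);

            if (++count % BATCH_SIZE == 0) {
                // Push pending inserts to the DB, then evict them
                // from the session cache to bound memory usage.
                session.flush();
                session.clear();
            }
        }

        tx.commit();
    }
}
```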

Does anyone know of alternative options?

Thanks.

 

Olaf Kock, modified 3 Years ago. Liferay Legend Posts: 6441 Join Date: 9/23/08

Depending on what your ServiceBuilder service does upon writing such an entry, you have different options:

If you don't need any Liferay infrastructure (such as ResourcePermissions), you can actually write directly to the custom tables (NEVER do that for built-in tables). Especially when it's INSERT operations (as opposed to UPDATE): they won't be in the cache anyway, so you won't need to take care of the cache. However: be careful with ID generation - don't cause duplicate IDs later.
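For illustration, such a direct insert could look roughly like this - plain JDBC, with made-up table and column names; the explicit ID handling is exactly the part to be careful about:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

// Sketch only: batch INSERT straight into a *custom* table, bypassing
// ServiceBuilder. Table/column names are invented, and the simplistic
// ID counter must not collide with IDs generated later.
public class DirectTableInsert {

    private static final int CHUNK_SIZE = 1000;

    public void insert(Connection connection, Iterable<String[]> rows, long startId)
        throws Exception {

        String sql =
            "INSERT INTO MyCustomEntity (entityId, name, value_) VALUES (?, ?, ?)";

        connection.setAutoCommit(false);

        long id = startId;
        int pending = 0;

        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            for (String[] row : rows) {
                ps.setLong(1, id++);
                ps.setString(2, row[0]);
                ps.setString(3, row[1]);
                ps.addBatch();

                if (++pending % CHUNK_SIZE == 0) {
                    ps.executeBatch();
                    connection.commit(); // one transaction per chunk
                }
            }

            ps.executeBatch();
            connection.commit();
        }
    }
}
```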

You certainly want to batch the inserts into smaller chunks, so that you don't end up with one humongous transaction. (This naturally also means that you'll need to think about retry options.)
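One way to frame that: treat each chunk as its own unit of work and retry a failed chunk a few times before giving up. A hypothetical helper (how you open the per-chunk transaction depends on your setup):

```java
import java.util.concurrent.Callable;

// Hypothetical helper: run the insert of one chunk and retry it a
// limited number of times (e.g. after a deadlock or timeout) before
// aborting the whole import.
public class ChunkRetry {

    public static void runWithRetry(Callable<Void> chunkInsert, int maxAttempts)
        throws Exception {

        for (int attempt = 1; ; attempt++) {
            try {
                chunkInsert.call(); // commits its own transaction
                return;
            }
            catch (Exception e) {
                if (attempt >= maxAttempts) {
                    throw e; // give up and let the caller decide
                }
            }
        }
    }
}
```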

And, of course, you likely don't want to read a file of that size into memory before starting to process it, but rather operate on the current line, or another smaller chunk of the file, at a time.
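For example, streaming the CSV line by line and handing over one chunk at a time keeps memory usage flat. A sketch (naive comma splitting - adapt it to your real CSV format):

```java
import java.io.BufferedReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch: read the CSV line by line and pass one chunk at a time to a
// handler, so the whole 300k-line file never sits in memory at once.
public class CsvChunkReader {

    private static final int CHUNK_SIZE = 1000;

    public static void process(String path, Consumer<List<String[]>> chunkHandler)
        throws Exception {

        List<String[]> chunk = new ArrayList<>(CHUNK_SIZE);

        try (BufferedReader reader = Files.newBufferedReader(
                Paths.get(path), StandardCharsets.UTF_8)) {

            String line;

            while ((line = reader.readLine()) != null) {
                chunk.add(line.split(",")); // naive split, no quoting support

                if (chunk.size() == CHUNK_SIZE) {
                    chunkHandler.accept(chunk);
                    chunk = new ArrayList<>(CHUNK_SIZE);
                }
            }
        }

        if (!chunk.isEmpty()) {
            chunkHandler.accept(chunk); // last partial chunk
        }
    }
}
```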

Last but not least: sometimes raising the memory limit is a nice quick fix for such an operation.

What's best? You judge.