RE: How to bulk insert data in the DB from liferay
Hi,
I need to insert 300k+ records into a custom table from a CSV file. Using a foreach loop with Liferay Service Builder, I got an OutOfMemoryException.
I have seen two options:
The first: use Hibernate batch inserts, calling flush and clear periodically to control the session cache.
The second: use the Liferay Batch Engine, but I don't know whether that is overkill for this requirement.
Does anyone know of any alternative options?
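The flush/clear rhythm from the first option can be sketched as follows. This is a minimal, hypothetical illustration: the real Hibernate `session.save()`, `session.flush()`, and `session.clear()` calls appear only as comments, the batch size of 50 is an arbitrary choice, and the method merely computes where those flushes would land.

```java
import java.util.ArrayList;
import java.util.List;

public class FlushClearBatch {

    // Arbitrary batch size; with Hibernate you would usually align this
    // with the hibernate.jdbc.batch_size setting.
    static final int BATCH_SIZE = 50;

    /** Returns the record indices at which a flush/clear would happen. */
    static List<Integer> flushPoints(int totalRecords) {
        List<Integer> points = new ArrayList<>();
        for (int i = 1; i <= totalRecords; i++) {
            // session.save(entity);  // real code: persist the i-th entity
            if (i % BATCH_SIZE == 0) {
                // session.flush();   // push the pending INSERTs to the DB
                // session.clear();   // evict entities so the session cache stays small
                points.add(i);
            }
        }
        return points;
    }

    public static void main(String[] args) {
        System.out.println(flushPoints(120)); // prints [50, 100]
    }
}
```

The point of the clear() is that without it every persisted entity stays in the session's first-level cache, which is what blows up the heap on 300k rows.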
Thanks.
Depending on what your Service Builder service does when writing such an entry, you have different options:
If you don't need any Liferay infrastructure (such as ResourcePermissions), you can actually write directly to the custom tables (NEVER do that for built-in tables). This is especially true for INSERT operations (as opposed to UPDATEs): the rows won't be in the cache anyway, so you won't need to take care of the cache. However: be careful with ID generation, so you don't cause duplicate IDs later.
You certainly want to batch the inserts into smaller chunks, so you don't end up with a humongous transaction. (This naturally also means you'll need to think about retry behavior.)
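One way to sketch the chunking-plus-retry idea: split the records into fixed-size chunks and hand each chunk to a commit callback, retrying a failed chunk a bounded number of times. The callback, chunk size, and retry count below are placeholders; in real code the callback would run one transaction's worth of INSERTs.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ChunkedInsert {

    /**
     * Splits records into chunks of chunkSize and passes each chunk to the
     * commit callback; a failed chunk is retried up to maxRetries times
     * before the exception is rethrown. Returns the number of committed rows.
     */
    static <T> int insertInChunks(List<T> records, int chunkSize,
                                  int maxRetries, Consumer<List<T>> commit) {
        int committed = 0;
        for (int from = 0; from < records.size(); from += chunkSize) {
            List<T> chunk =
                records.subList(from, Math.min(from + chunkSize, records.size()));
            for (int attempt = 0; ; attempt++) {
                try {
                    commit.accept(chunk); // real code: one transaction per chunk
                    committed += chunk.size();
                    break;
                } catch (RuntimeException e) {
                    if (attempt >= maxRetries) {
                        throw e; // give up on this chunk after maxRetries retries
                    }
                }
            }
        }
        return committed;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 2500; i++) data.add(i);
        int done = insertInChunks(data, 1000, 2, chunk -> { /* INSERTs here */ });
        System.out.println(done); // prints 2500
    }
}
```

Keeping each chunk in its own transaction is what makes retrying safe: only the failed chunk is replayed, not everything committed before it.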
And, of course, you likely don't want to read a file of that size into memory before starting to process it; instead, operate on the current line, or on another small chunk of the file, at a time.
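Streaming the CSV line by line keeps only the current row in memory, regardless of file size. A minimal sketch, assuming a simple comma-separated format with a header row (a real CSV with quoted fields would need a proper parser):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class CsvStreamer {

    /**
     * Reads the CSV one line at a time, so only the current line is in
     * memory; returns how many data rows were seen (header skipped).
     */
    static int streamCsv(Reader source) throws IOException {
        int rows = 0;
        try (BufferedReader reader = new BufferedReader(source)) {
            String header = reader.readLine(); // skip the header row
            String line;
            while ((line = reader.readLine()) != null) {
                // Naive split; quoted fields containing commas need a real CSV parser.
                String[] fields = line.split(",", -1);
                // real code: map fields to the entity, add it to the current batch
                rows++;
            }
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        String csv = "id,name\n1,alice\n2,bob\n";
        System.out.println(streamCsv(new StringReader(csv))); // prints 2
    }
}
```

For the actual 300k-row file you would pass a `FileReader` (or `Files.newBufferedReader`) instead of the `StringReader` used here for demonstration.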
Last but not least: sometimes simply raising the memory limit (e.g. the JVM's -Xmx setting) is a nice quick fix for a one-off operation like this.
What's best? You judge