Pardon the newbie question.
My data model is really simple (two tables: 200 rows in one, 10,000 in the other, and a natural join between the two).
I’m using Hibernate to attempt a simple read/update on these tables.
Doing so keeps throwing OutOfMemoryError (with 1 GB assigned to my JVM).
This seems very unlikely to be a genuine memory need, and more likely means I'm omitting some basic step in Hibernate.
I’m at the point where I’ve even replaced most of my actual Hibernate object accesses with direct SQL (absurd, I know). But even getCurrentSession().createSQLQuery(…) is resulting in OOM errors.
Can anyone point me in the right direction?
A common cause of OOM errors is a lack of control over the size of the session-level cache (fixed by calling flush() and clear() on the session at regular intervals, ideally the same interval as the JDBC batch size) when dealing with large collections of entities. See Chapter 13, Batch processing for more on this.

But your case is not that impressive. What are you doing exactly? And how (even pseudo code might help)?
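In the meantime, for reference, the flush/clear pattern looks roughly like the sketch below, along the lines of the example in the batch-processing chapter. The Customer entity, its setter, and the sessionFactory parameter are placeholder assumptions (you'd substitute your own mapped classes), and 20 stands in for whatever hibernate.jdbc.batch_size is set to:

```java
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class BatchUpdateSketch {

    // Hypothetical entity standing in for one of your two mapped tables.
    public static class Customer {
        private boolean processed;
        public void setProcessed(boolean processed) { this.processed = processed; }
    }

    public static void updateAll(SessionFactory sessionFactory) {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();

        // Scroll through the results instead of calling list(), so all
        // 10,000 rows are not materialized at once.
        ScrollableResults rows = session
                .createQuery("from Customer")
                .scroll(ScrollMode.FORWARD_ONLY);

        int count = 0;
        while (rows.next()) {
            Customer customer = (Customer) rows.get(0);
            customer.setProcessed(true); // hypothetical update

            // Push pending updates to the database and evict processed
            // entities, so the first-level cache stays bounded instead of
            // accumulating every entity loaded in the loop.
            if (++count % 20 == 0) { // 20 == hibernate.jdbc.batch_size
                session.flush();
                session.clear();
            }
        }

        tx.commit();
        session.close();
    }
}
```

That said, with only 200 and 10,000 rows, even loading everything at once should fit comfortably in 1 GB, which is why seeing your actual code matters here.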