
jdbc + large postgresql query give out of memory

Tags:

java

jdbc

I'm trying to execute a postgresql query which returns a large result:

connection.setAutoCommit(false);
st = connection.createStatement(
  ResultSet.CONCUR_READ_ONLY,
  ResultSet.TYPE_FORWARD_ONLY
);
st.setFetchSize(100);
logMemory();
System.out.println("start query ");
rs = st.executeQuery(queryString);
System.out.println("done query ");
logMemory();

but this uses a lot of memory:

Free memory; 4094347680  (= 3905 mb).
start query 
done query
Free memory; 2051038576  (= 1956 mb).

(printed with Runtime.getRuntime().freeMemory() )
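The logMemory() helper isn't shown in the question; a minimal reconstruction consistent with the "Free memory; … (= … mb)." lines above might look like this (the class name and formatting are assumptions):

```java
public class MemLog {
    // Hypothetical reconstruction of the question's logMemory() helper,
    // printing free heap memory in bytes and megabytes.
    static void logMemory() {
        long free = Runtime.getRuntime().freeMemory();
        System.out.println("Free memory; " + free
                + "  (= " + (free / (1024 * 1024)) + " mb).");
    }

    public static void main(String[] args) {
        logMemory();
    }
}
```

Note that freeMemory() only reports free space within the heap currently claimed by the JVM, so the numbers understate how much the process could still allocate.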

So far it works, but the database is going to be a lot bigger. I don't ever need the entire result in memory; I just need to process each row, write the results to disk and go to the next row.

I know 'setFetchSize' is only a hint, but I would find it strange if postgresql/jdbc ignored it, as it's been around for ages.

Any way to get around this? My only idea so far is to make a batch script which streams the result of the query to disk and then parse the file from Java...

asked Oct 01 '09 by kresjer

1 Answer

Ouch, this is one of the nastiest JDBC bugs I've seen. You should change

st = connection.createStatement(
  ResultSet.CONCUR_READ_ONLY,
  ResultSet.TYPE_FORWARD_ONLY
);

into

st = connection.createStatement(
  ResultSet.TYPE_FORWARD_ONLY,
  ResultSet.CONCUR_READ_ONLY
);

Maybe simply

st = connection.createStatement();

will work as well (as you've met the other criteria for a cursor).
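The reason the swap slips through is that Connection.createStatement(int resultSetType, int resultSetConcurrency) takes two plain ints, so passing them in the wrong order compiles cleanly; the driver then sees a result-set "type" that isn't TYPE_FORWARD_ONLY and silently skips the cursor-based fetch. A quick sketch of the constants involved (class name is an assumption):

```java
import java.sql.ResultSet;

public class ArgOrderDemo {
    public static void main(String[] args) {
        // Both parameters are ints, so swapping them is not a compile error.
        // With the arguments reversed, the driver receives 1007 as the
        // result-set type instead of TYPE_FORWARD_ONLY (1003), so the
        // condition for streaming with a cursor is no longer met.
        System.out.println("TYPE_FORWARD_ONLY = " + ResultSet.TYPE_FORWARD_ONLY);
        System.out.println("CONCUR_READ_ONLY  = " + ResultSet.CONCUR_READ_ONLY);
    }
}
```

With the arguments in the right order (and autocommit off plus a nonzero fetch size, as in the question), the PostgreSQL driver fetches rows in batches instead of materializing the whole result set.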

answered Oct 04 '22 by Frank Meulenaar