Is there a way to read and write from a blob in chunks using Hibernate? Right now I am getting an OutOfMemoryError because the whole blob is loaded into memory as a single byte[] array.
To be more specific, let's say I want to save a large file into a database table called File.
public class File {
private byte[] data;
}
I open the file in a FileInputStream, and then what? How do I tell Hibernate that I need to stream the content rather than hand over the whole byte[] array at once? Should I use Blob instead of byte[]? Either way, how do I stream the content?
As for reading: is there a way to tell Hibernate (beyond the lazy loading it already does) to load the blob in chunks, so that retrieving my File entity does not throw an OutOfMemoryError?
I am using:
If going the Blob route, have you tried Hibernate's LobHelper.createBlob method, which takes an InputStream? To create a Blob and persist it to the database, you supply the FileInputStream object and the number of bytes.
Your File bean/entity class could map the Blob like this (using JPA annotations):
@Lob
@Column(name = "DATA")
private Blob data;
// Getter and setter
And the business logic/data access class could create the Blob for your bean/entity object like this, taking care not to close the input stream before persisting to the database:
FileInputStream fis = new FileInputStream(file);
Blob data = getSession().getLobHelper().createBlob(fis, file.length());
fileEntity.setData(data);
// Persist the file entity to the database; close fis only after
// the transaction commits, since Hibernate reads from the stream then
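LobHelper.createBlob hands you a java.sql.Blob, which is the same interface you map on the entity. If you want to see that interface in isolation, without a Session or a database, the JDK's SerialBlob can stand in as a minimal sketch; note that SerialBlob materializes the bytes in memory, so it only demonstrates the Blob API, not the streaming behavior you get from LobHelper:

```java
import java.io.InputStream;
import java.sql.Blob;
import java.sql.SQLException;
import javax.sql.rowset.serial.SerialBlob;

public class BlobDemo {
    // Wraps a byte array in a java.sql.Blob. Unlike LobHelper's
    // stream-backed Blob, SerialBlob holds all bytes in memory,
    // so this is only an API demo, not a streaming technique.
    public static Blob blobOf(byte[] bytes) throws SQLException {
        return new SerialBlob(bytes);
    }

    public static void main(String[] args) throws Exception {
        Blob data = blobOf("hello".getBytes());
        System.out.println("length = " + data.length());
        // getBinaryStream is the same call you would use on the
        // Blob that Hibernate returns from the database.
        try (InputStream is = data.getBinaryStream()) {
            System.out.println("first byte = " + is.read());
        }
    }
}
```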
To go the other way and read the Blob from the database as a stream in chunks, you could call the Blob's getBinaryStream method, giving you the InputStream and allowing you to set the buffer size later if needed:
InputStream is = fileEntity.getData().getBinaryStream();
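Once you have that InputStream, the chunked-read part is plain java.io: read into a fixed-size buffer in a loop so only one buffer's worth of data is ever in memory, no matter how large the blob is. A minimal sketch (the buffer size of 8192 is an arbitrary choice, and the ByteArrayInputStream in main merely stands in for the blob's stream):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Copies the stream in fixed-size chunks; memory use is bounded
    // by the buffer size, not by the total size of the blob.
    public static long copy(InputStream in, OutputStream out, int bufferSize)
            throws IOException {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for fileEntity.getData().getBinaryStream()
        InputStream is = new ByteArrayInputStream(new byte[1_000_000]);
        ByteArrayOutputStream target = new ByteArrayOutputStream();
        long copied = copy(is, target, 8192); // 8 KB chunks
        System.out.println("copied " + copied + " bytes");
    }
}
```

In real code the target would be a FileOutputStream or a servlet response stream rather than a ByteArrayOutputStream, which would defeat the purpose by buffering everything in memory.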
Struts 2 has a convenient configuration available that can set the InputStream result's buffer size.