I have a JPA entity class that contains a BLOB field, like this:
@Entity
public class Report {

    private Long id;

    private byte[] content;

    @Id
    @Column(name = "report_id")
    @SequenceGenerator(name = "REPORT_ID_GENERATOR", sequenceName = "report_sequence_id", allocationSize = 1)
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "REPORT_ID_GENERATOR")
    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    @Lob
    @Column(name = "content")
    public byte[] getContent() {
        return content;
    }

    public void setContent(byte[] content) {
        this.content = content;
    }
}
Some large data (over 3 GB) has been inserted into one of its records in the database (using a DBMS procedure).
Application users are supposed to be able to download the content of these records, so I have implemented a method that streams the fetched result to the clients' browsers.
The problem is that JPQL select queries fetch the entire object from the database before handing it to the application, so whenever I try to access this record through JPA I get an "unable to allocate enough memory" exception.
I have seen some solutions to this problem that use JDBC connections to stream the data from the database, but I could not find any JPA-specific solution.
Does anyone have any clue how I should solve this problem?
This is a late answer, but for those still looking for a solution: I found a good article by Thorben Janssen on the Thoughts on Java blog. The drawback is that it's Hibernate-specific, but you seem to be using Hibernate anyway. Basically, the solution is to use java.sql.Blob (and java.sql.Clob) attributes in your entity:
@Entity
public class Book {

    @Id
    @GeneratedValue
    private Long id;

    private String title;

    @Lob
    private Clob content;

    @Lob
    private Blob cover;

    ...
}
You then use Hibernate's BlobProxy, which lets you create a Blob from an InputStream so the data is streamed rather than held in memory. Take a look at the article for the details.
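For illustration, here is a minimal sketch of that approach. It assumes the Book entity above exposes getCover()/setCover(Blob) accessors and that the code runs inside an active transaction; the method names and the file path are placeholders, while BlobProxy is Hibernate's org.hibernate.engine.jdbc.BlobProxy:

import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.persistence.EntityManager;
import org.hibernate.engine.jdbc.BlobProxy;

public class BookCoverStreaming {

    // Persist: wrap the file's InputStream in a Blob proxy so Hibernate can stream it
    // to the database instead of materializing the whole byte array in memory.
    public void saveCover(EntityManager em, Long bookId, Path coverFile) throws Exception {
        try (InputStream in = Files.newInputStream(coverFile)) {
            Book book = em.find(Book.class, bookId);
            book.setCover(BlobProxy.generateProxy(in, Files.size(coverFile)));
            em.flush(); // flush while the stream is still open so Hibernate can read it
        }
    }

    // Read: copy the Blob's binary stream to the caller's OutputStream in small chunks.
    // The Blob must be read while the persistence context (and its JDBC connection) is open.
    public void writeCover(EntityManager em, Long bookId, OutputStream out) throws Exception {
        Book book = em.find(Book.class, bookId);
        try (InputStream in = book.getCover().getBinaryStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}

Whether the JDBC driver actually streams the LOB (rather than buffering it) depends on the database and driver, so check the article and your driver's documentation for the specifics.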
I solved the problem in the following manner; note that this solution may only work with the Hibernate implementation of JPA.
Here is the DAO class that is used to stream the content:
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.hibernate.Session;
import org.hibernate.jdbc.Work;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Repository;

@Repository
public class ReportDAO {

    private static final Logger logger = LoggerFactory.getLogger(ReportDAO.class);

    @PersistenceContext
    private EntityManager entityManager;

    // streamToWrite is the stream used to deliver the content to the client
    public void streamReportContent(final Long id, final OutputStream streamToWrite) {
        // Use a dedicated EntityManager instead of reassigning the injected one
        EntityManager em = entityManager.getEntityManagerFactory().createEntityManager();
        try {
            Session session = em.unwrap(Session.class);
            session.doWork(new Work() {
                @Override
                public void execute(Connection connection) throws SQLException {
                    try (PreparedStatement stmt = connection.prepareStatement(
                            "SELECT content FROM report WHERE report_id = ?")) {
                        stmt.setLong(1, id);
                        try (ResultSet rs = stmt.executeQuery()) {
                            if (rs.next()) {
                                Blob blob = rs.getBlob(1);
                                // Copy the BLOB in small chunks instead of loading it into memory
                                try (InputStream input = blob.getBinaryStream()) {
                                    byte[] buffer = new byte[1024];
                                    int bytesRead;
                                    while ((bytesRead = input.read(buffer)) != -1) {
                                        streamToWrite.write(buffer, 0, bytesRead);
                                    }
                                } catch (IOException e) {
                                    logger.error("Failure in streaming report", e);
                                }
                            }
                        }
                    }
                }
            });
        } catch (Exception e) {
            logger.error("A problem occurred while streaming the report", e);
        } finally {
            em.close();
        }
    }
}
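As a usage sketch (the controller name, URL, and the choice of Spring's StreamingResponseBody are assumptions, not part of the original solution), the DAO can be wired to an HTTP endpoint so the content is written straight to the response:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

@RestController
public class ReportController {

    @Autowired
    private ReportDAO reportDAO;

    // Streams the report content directly into the HTTP response body.
    @GetMapping("/reports/{id}/content")
    public ResponseEntity<StreamingResponseBody> downloadReport(@PathVariable Long id) {
        StreamingResponseBody body = outputStream -> reportDAO.streamReportContent(id, outputStream);
        return ResponseEntity.ok()
                .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"report-" + id + "\"")
                .contentType(MediaType.APPLICATION_OCTET_STREAM)
                .body(body);
    }
}

With StreamingResponseBody the body is written as the servlet container consumes it, so the whole report never has to fit in memory on the application side.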