I need to return a rather large file, around 670 MB, from a web request. For the most part this works fine, but after some time the following error is thrown:
java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Bits.java:694) ~[na:1.8.0_162]
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123) ~[na:1.8.0_162]
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311) ~[na:1.8.0_162]
at sun.nio.ch.Util.getTemporaryDirectBuffer(Util.java:241) ~[na:1.8.0_162]
at sun.nio.ch.IOUtil.read(IOUtil.java:195) ~[na:1.8.0_162]
at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:159) ~[na:1.8.0_162]
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65) ~[na:1.8.0_162]
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109) ~[na:1.8.0_162]
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103) ~[na:1.8.0_162]
at java.nio.file.Files.read(Files.java:3105) ~[na:1.8.0_162]
at java.nio.file.Files.readAllBytes(Files.java:3158) ~[na:1.8.0_162]
I have set the heap size to 4096 MB, which I think should be large enough to handle these kinds of files. Furthermore, when this error occurred I took a heap dump with jmap to analyze the current state. I found two rather large byte[] arrays, which should be the file I want to return. But the heap is only around 1.6 GB in size, nowhere near the configured 4 GB it could grow to.
According to another answer (https://stackoverflow.com/a/39984276/5126654) to a similar question, I tried running a manual GC before returning the file. The problem still occurred, but only sporadically. It would show up after some time, but when I then tried running the same request again, the garbage collection seemed to take care of whatever caused it. That is not sufficient, though, since the problem apparently can still occur. Is there some other way to avoid this memory problem?
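For reference, the read itself boils down to the Files.readAllBytes call visible at the bottom of the stack trace, essentially something like this (simplified sketch; the real path and the surrounding controller code are omitted):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class WholeFileRead {
    // readAllBytes pulls the whole ~670 MB file through a temporary direct
    // buffer (see sun.nio.ch.Util.getTemporaryDirectBuffer in the trace)
    // and into a single byte[] on the heap before anything is written to
    // the HTTP response.
    static byte[] loadWholeFile(String path) throws IOException {
        return Files.readAllBytes(Paths.get(path));
    }
}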
Direct buffer memory is native OS memory used by the JVM process, outside the JVM heap. Java NIO uses it to read and write data to disk or the network quickly, without copying between the JVM heap and native memory.
A direct buffer is a chunk of native memory shared with Java from which you can perform a direct read. An instance of DirectByteBuffer can be created using the ByteBuffer.allocateDirect() factory method.
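For illustration, a minimal allocation looks like this (the 64 MB size is just an example value):

import java.nio.ByteBuffer;

public class DirectBufferDemo {
    public static void main(String[] args) {
        // Reserves 64 MB of native (off-heap) memory; this counts against
        // -XX:MaxDirectMemorySize, not against the -Xmx heap limit.
        ByteBuffer direct = ByteBuffer.allocateDirect(64 * 1024 * 1024);
        System.out.println("isDirect = " + direct.isDirect()); // prints true
    }
}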
MaxDirectMemorySize. This JVM option specifies the maximum total size of java.nio (New I/O package) direct buffer allocations. It is used with network data transfer and serialization activity. The default value for direct memory buffers depends on your JVM version.
This Oracle HotSpot option sets a limit on the amount of memory that can be reserved for all Direct Byte Buffers.
You could also try increasing the amount of memory available for DirectByteBuffer allocations with the JVM option -XX:MaxDirectMemorySize. The Java docs are not very detailed about this parameter, but according to this page it defaults to 64 MB unless you have specified the -Xmx flag. So if you haven't set that flag, the available direct memory may be too small. Or, if you have a very large file and have set -Xmx, the derived 2 GB may still be too small and you could benefit from setting a larger limit manually.
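For example, you could start the application along these lines (the sizes and jar name here are placeholders, not values taken from the question):

java -Xmx4096m -XX:MaxDirectMemorySize=1024m -jar your-app.jar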
All in all, the better approach is probably to stream the file as suggested by Stephen C.
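As a rough sketch of that streaming approach, assuming a plain servlet setup (the class name, file path and buffer size below are illustrative only):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class FileDownloadServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        Path file = Paths.get("/data/large-file.bin"); // illustrative path
        resp.setContentType("application/octet-stream");
        resp.setContentLengthLong(Files.size(file));

        // Copy in small, fixed-size chunks so neither the heap nor direct
        // buffer memory ever has to hold the whole ~670 MB file at once.
        try (InputStream in = Files.newInputStream(file);
             OutputStream out = resp.getOutputStream()) {
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = in.read(buffer)) != -1) {
                out.write(buffer, 0, bytesRead);
            }
        }
    }
}

With this approach, the temporary direct buffers allocated by NIO stay around the size of the 8 KB copy buffer rather than the size of the whole file.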