Is there a cleaner and faster way to do this:
BufferedReader inputReader = new BufferedReader(new InputStreamReader(context.openFileInput("data.txt")));
String inputString;
StringBuilder stringBuffer = new StringBuilder();
while ((inputString = inputReader.readLine()) != null) {
stringBuffer.append(inputString + "\n");
}
text = stringBuffer.toString();
byte[] data = text.getBytes();
Basically I'm trying to convert a file into a byte[], but if the file is large enough I run into an OutOfMemoryError. I've been looking around SO for a solution; I tried to do this here, and it didn't work. Any help would be appreciated.
A few suggestions:
Since we know the size of the file, roughly half of the memory can be saved by allocating a byte array of the exact size up front rather than growing a buffer as we go:
byte[] data = new byte[(int) file.length()];
FileInputStream fin = new FileInputStream(file);
// keep reading until the array is full; read may return fewer bytes than requested
int offset = 0, count;
while (offset < data.length && (count = fin.read(data, offset, data.length - offset)) > 0) {
    offset += count;
}
fin.close();
This avoids allocating unnecessary intermediate structures: the byte array is allocated only once and already has the correct size. The while loop ensures all data are loaded, because read(byte[], offset, length) may read only part of the file; it returns the number of bytes actually read, which we add to the offset before the next call.
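As an aside, on Java 7+ (or Android API level 26+) this whole read-into-array loop is available as a single library call; a minimal sketch, assuming file is the same java.io.File as above:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// reads the entire file into a correctly sized byte array in one call;
// like the manual loop, it needs the whole file to fit in heap memory
byte[] data = Files.readAllBytes(file.toPath());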
Clarification: when the StringBuilder runs out of capacity, it allocates a new buffer roughly twice the size of the old one, and both buffers are alive while the contents are copied over. At that moment we are using about twice the amount of memory minimally required, and in the worst case (the last byte does not fit into an already large buffer) nearly three times the minimal amount of RAM may be needed.
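To see the doubling in action, here is a small sketch that prints the builder's capacity each time it grows; the exact growth sequence depends on the JDK implementation, but the roughly-doubling pattern is standard:
// appends characters one at a time and reports every capacity increase
StringBuilder sb = new StringBuilder(); // default capacity is 16
int last = sb.capacity();
System.out.println("capacity = " + last);
for (int i = 0; i < 1_000_000; i++) {
    sb.append('x');
    if (sb.capacity() != last) {
        last = sb.capacity();
        System.out.println("capacity = " + last); // e.g. 34, 70, 142, ...
    }
}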