In my Android application I use a FileInputStream to read a text file (less than 100 KB) and show its contents in the app. The main problem is that, although the file is not that big, it takes around 3-4 seconds on my device to open it.
Considering that my device has 1 GB of RAM and a quad-core CPU, I want to know what's wrong with the way I read the text file and whether there is a better way to make the process faster.
String aBuffer = "";
try {
    File myFile = new File(input);
    FileInputStream fIn = new FileInputStream(myFile);
    BufferedReader myReader = new BufferedReader(new InputStreamReader(fIn));
    String aDataRow = "";
    while ((aDataRow = myReader.readLine()) != null) {
        aBuffer += aDataRow + "\n";
    }
    // Toast.makeText(getBaseContext(), aBuffer, Toast.LENGTH_SHORT).show();
    myReader.close();
} catch (Exception e) {
    Toast.makeText(getBaseContext(), e.getMessage(), Toast.LENGTH_SHORT).show();
}
return aBuffer;
Your String concatenation inside the loop is a very slow operation: every `+=` copies the whole string built so far. Use a StringBuilder for this task:
String aDataRow;
StringBuilder buffer = new StringBuilder();
while ((aDataRow = myReader.readLine()) != null) {
    buffer.append(aDataRow);
    buffer.append("\n");
}
// build the final String once, after the loop
aBuffer = buffer.toString();
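Put together, the read method from the question could look roughly like this (a sketch only; the `input` parameter and the `getBaseContext()`/Toast error handling are taken from the question's snippet, not a required API):

StringBuilder buffer = new StringBuilder();
try {
    BufferedReader myReader = new BufferedReader(
            new InputStreamReader(new FileInputStream(new File(input))));
    String aDataRow;
    while ((aDataRow = myReader.readLine()) != null) {
        buffer.append(aDataRow).append("\n");
    }
    myReader.close();
} catch (Exception e) {
    Toast.makeText(getBaseContext(), e.getMessage(), Toast.LENGTH_SHORT).show();
}
return buffer.toString();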
You can speed up the reading even more if you don't read the file line by line, because readLine() forces many small reads and allocates a new String for every line. You can read with a larger custom buffer like this:
File myFile = new File(input);
FileInputStream fIn = new FileInputStream(myFile);
// number of bytes actually read in each iteration
int byteLength;
// pick a buffer size that suits your files; 128 KB is used here
byte[] buffer = new byte[1024 * 128];
ByteArrayOutputStream out = new ByteArrayOutputStream();
while ((byteLength = fIn.read(buffer)) != -1) {
    // write only the bytes that were actually read in this iteration
    out.write(buffer, 0, byteLength);
}
fIn.close();
String output = out.toString();
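A char-based variant combines both ideas, a big read buffer and a StringBuilder, and keeps the charset explicit instead of relying on the platform default. This is a minimal sketch assuming the file is UTF-8 text; the names `chunk`, `content`, and `reader` are just for illustration:

char[] chunk = new char[1024 * 64];
StringBuilder content = new StringBuilder();
BufferedReader reader = new BufferedReader(
        new InputStreamReader(new FileInputStream(input), "UTF-8"), 1024 * 64);
int charsRead;
// read up to 64 KB of characters at a time and append them directly
while ((charsRead = reader.read(chunk)) != -1) {
    content.append(chunk, 0, charsRead);
}
reader.close();
String output = content.toString();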