For readability, I posted the code examples that my solutions refer to first, and then I listed the explanations of my solutions in a numbered list.
I have been struggling with this for a while now. I have done a lot of reading, asked questions on here, and experimented, but I have not come up with a decent solution. I need to read variously sized images from input streams and display them at as high a quality as my memory constraints allow. Below are the options I have considered, none of which seem great to me. Any help or input would be greatly appreciated.
public class NativeTest extends Activity
{
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        double nativeUsage = Debug.getNativeHeapAllocatedSize();
        Log.i("memory", nativeUsage + "");
    }
}
double getAvailableMemory()
{
    //current heap size
    double heapSize = Runtime.getRuntime().totalMemory();
    //amount available in heap
    double heapRemaining = Runtime.getRuntime().freeMemory();
    double nativeUsage = Debug.getNativeHeapAllocatedSize();
    double memoryAvailable = Runtime.getRuntime().maxMemory() - (heapSize - heapRemaining) - nativeUsage;
    return memoryAvailable;
}
Bitmap createImageTrialAndError(InputStream stream)
{
    Bitmap image = null;
    int downsample = 1;
    while (image == null)
    {
        try
        {
            Options opts = new Options();
            opts.inSampleSize = downsample;
            // note: for a retry to work, the stream has to be re-readable
            // (reset to the start, or re-obtained) after a failed decode
            image = BitmapFactory.decodeStream(stream, null, opts);
        }
        catch (OutOfMemoryError ome)
        {
            downsample = downsample * 2;
            Log.i("out of mem", "try using: " + downsample);
        }
    }
    return image;
}
1. Estimating the memory the Bitmap will need as width * height * 4, calculating the available memory, and then choosing BitmapFactory.Options.inSampleSize such that the Bitmap will use less memory than what is available. However, this option fails because I have not been able to find a remotely reliable way to calculate the available memory. The getAvailableMemory() method above seems like it should work: it calculates available memory as maximum memory minus memory used in the Java heap minus memory used in the native heap. But Debug.getNativeHeapAllocatedSize() does not appear to be an accurate representation of Bitmap memory usage. One obvious example of its inaccuracy is the NativeTest Activity above: on my Samsung Galaxy tablet it logs 3759416.0. 3.75 MB of native allocation for an empty Activity is clearly not a reliable basis for deciding how much to scale a bitmap.

2. Trial and error, as in createImageTrialAndError() above: start with inSampleSize = 1 and double it every time the decode throws an OutOfMemoryError. This is actually surprisingly effective and not terribly slow. However, it is very undesirable because I use SoftReferences elsewhere in my application, and running out of memory forces the collection of those SoftReferences, which has a significant performance impact. It would be much better to know the proper scaling factor up front and avoid that unnecessary collection.

3. Not trusting Debug.getNativeHeapAllocatedSize() at all. Instead, I track my own Bitmap memory usage: everywhere I allocate a Bitmap I tally its memory cost, and I subtract that cost again for every Bitmap I recycle. I use this value in place of nativeUsage in getAvailableMemory() to calculate the proper scaling factor, and if an OutOfMemoryError still occurs I fall back to the trial-and-error approach from option 2 to find an acceptable scale. The obvious problem with this is the massive sketchiness of trying to track my own native memory usage but, to me, it seems to be the best option (a rough sketch of such a tracker follows this list).
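To make option 3 concrete, a tracker along these lines could look roughly like the sketch below. BitmapMemoryTracker is just an illustrative name, not the class I actually use, and the synchronization is only there because decodes may happen off the main thread.

// Rough sketch of option 3: tally Bitmap allocations by hand and subtract them on recycle.
// BitmapMemoryTracker is an illustrative name only.
public class BitmapMemoryTracker
{
    private long allocatedBytes = 0;

    // call right after each successful decode
    public synchronized void onAllocated(Bitmap bitmap)
    {
        allocatedBytes += (long) bitmap.getRowBytes() * bitmap.getHeight();
    }

    // call just before Bitmap.recycle()
    public synchronized void onRecycled(Bitmap bitmap)
    {
        allocatedBytes -= (long) bitmap.getRowBytes() * bitmap.getHeight();
    }

    // use this in place of the nativeUsage term in getAvailableMemory()
    public synchronized long getAllocatedBytes()
    {
        return allocatedBytes;
    }
}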
The right approach is to decode at the size you actually need to display, and to subsample the image in tiles when you need to zoom. There's a library that does exactly this, check it out: https://github.com/davemorrissey/subsampling-scale-image-view
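If you would rather not add a dependency, the framework's BitmapRegionDecoder can handle the per-tile decoding. A rough sketch, where the visible region and sample size are placeholders you would derive from your view and zoom level:

// Sketch of tile-based decoding with BitmapRegionDecoder: decode only the region
// that is currently on screen, at a sample size matching the current zoom level.
Bitmap decodeVisibleTile(InputStream stream, Rect visibleRegion, int sampleSize) throws IOException
{
    BitmapRegionDecoder decoder = BitmapRegionDecoder.newInstance(stream, false);
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inSampleSize = sampleSize;
    Bitmap tile = decoder.decodeRegion(visibleRegion, opts);
    decoder.recycle();
    return tile;
}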