I own a Samsung Galaxy S3, which is capable of capturing pictures of roughly 3000 x 2000 pixels. I am currently developing an application that needs to capture pictures; I use the phone as my debug device and I set the largest picture size the device offers.
However, with this setting the onPictureTaken callback throws an OutOfMemoryError on its very first line, in the BitmapFactory.decodeByteArray call where I try to decode the captured bytes into a Bitmap. If I set BitmapFactory.Options.inSampleSize = 2, no out-of-memory error occurs.
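For reference, the relevant part of the callback looks roughly like this (a simplified sketch, not my exact code; it is a field inside the activity that holds the android.hardware.Camera instance):

```java
private final Camera.PictureCallback pictureCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // Decoding the full ~3000 x 2000 JPEG throws OutOfMemoryError here:
        // Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);

        // With subsampling it works, but the image is only half the resolution:
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = 2;
        Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length, options);
        // ... use bitmap ...
    }
};
```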
I want my application to be able to capture pictures at the best quality the device offers. The device manages this in its own camera application, but I can't in mine, and I don't understand why. How can I overcome this problem?
Have a look at this video: http://www.youtube.com/watch?v=_CruQY55HOk (it covers Android custom view Bitmap memory leaks). It talks about the MAT memory analyzer, which should help you track down what is holding on to the memory. Also, recycle your bitmaps when they are no longer in use.
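For context: a full-resolution decode of a ~3000 x 2000 picture needs roughly 24 MB for the pixel data alone (4 bytes per pixel with ARGB_8888), which can exceed the per-application heap limit on many devices, whereas the stock camera app presumably only needs to write the JPEG bytes to disk. To illustrate that idea together with the recycling advice, here is a minimal sketch (the class name PictureHandler and the fields outputFile and previewView are placeholders, not anything from the question): it saves the full-resolution JPEG straight to a file without ever decoding it, decodes only a downsampled copy for on-screen display, and recycles that copy once it is no longer shown.

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera;
import android.util.Log;
import android.widget.ImageView;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class PictureHandler implements Camera.PictureCallback {

    private final File outputFile;       // where the full-resolution JPEG is saved
    private final ImageView previewView; // shows a downsampled preview only

    public PictureHandler(File outputFile, ImageView previewView) {
        this.outputFile = outputFile;
        this.previewView = previewView;
    }

    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // Save the full-resolution JPEG without decoding it into a Bitmap,
        // so no multi-megapixel pixel buffer is ever allocated on the heap.
        FileOutputStream out = null;
        try {
            out = new FileOutputStream(outputFile);
            out.write(data);
        } catch (IOException e) {
            Log.e("PictureHandler", "Failed to save picture", e);
        } finally {
            if (out != null) {
                try { out.close(); } catch (IOException ignored) {}
            }
        }

        // Decode a downsampled copy only for on-screen display...
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = 4; // a preview does not need full resolution
        Bitmap preview = BitmapFactory.decodeByteArray(data, 0, data.length, options);
        previewView.setImageBitmap(preview);
        // ...and call preview.recycle() once it is no longer displayed.

        camera.startPreview();
    }
}
```

This way the best quality the device offers ends up in the saved file, while only a small bitmap ever lives on the Java heap.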