The only thing I do is use the standard Bitmap API in the jpegCallback:
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inSampleSize = 4;
bm = BitmapFactory.decodeByteArray(data, 0, data.length, opts);
bm = Bitmap.createScaledBitmap(bm , 640, 480, true);
and then write it to disk:
imageFile = new File("/sdcard/app_dir/upload.jpg");
FileOutputStream outStream = new FileOutputStream(imageFile);
bm.compress(CompressFormat.JPEG, 75, outStream);
outStream.flush();
outStream.close();
I think it did help for some users (Desire HD), but I can tell that others still get the issue (Desire S).
I really wish someone could explain why the pictures look distorted in the first place.
Most phone cameras are landscape sensors, meaning that if you take the photo in portrait, the resulting image will be rotated 90 degrees. In this case, the camera software should populate the EXIF data with the orientation in which the photo should be viewed.
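If you stay with takePicture(), a minimal sketch of honoring that tag is below. It is not from the original post; it assumes the JPEG has already been written to a path you know, and uses android.media.ExifInterface plus android.graphics.Matrix:
// Hedged sketch: rotate a decoded bitmap to match the EXIF orientation tag
// of the JPEG it was decoded from. `jpegPath` is assumed to be the file the
// camera wrote (e.g. the upload.jpg from the question).
private Bitmap rotateToExifOrientation(Bitmap bm, String jpegPath) throws IOException {
    ExifInterface exif = new ExifInterface(jpegPath);
    int orientation = exif.getAttributeInt(
            ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
    int degrees = 0;
    switch (orientation) {
        case ExifInterface.ORIENTATION_ROTATE_90:  degrees = 90;  break;
        case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
        case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
    }
    if (degrees == 0) {
        return bm;
    }
    Matrix matrix = new Matrix();
    matrix.postRotate(degrees);
    return Bitmap.createBitmap(bm, 0, 0, bm.getWidth(), bm.getHeight(), matrix, true);
}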
I can't tell you why you're getting garbled data from some devices and not others, but I can suggest a workaround that seems to be working successfully for my app.
Your example code scales the camera's JPEG down to 640x480 before saving it off to the SD card. So I'm guessing you don't require the full-sized camera image.
If this assumption is true, you can skip the Camera takePicture() API entirely and just save a preview frame to the SD card. The easiest way to do this is with setOneShotPreviewCallback():
mCamera.setOneShotPreviewCallback( new StillPictureCallback() );
The callback fires once and hands you back a buffer of data from the camera:
private class StillPictureCallback implements Camera.PreviewCallback {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        mPictureTask = new SaveStillPictureTask();
        byte[] myData = null;
        if (data != null) {
            myData = data.clone();
        }
        mPictureTask.execute(myData);
    }
}
The callback invokes a background task to compress the data and save it to the file. The only bit of code I'm leaving out is the part that queries the camera for the preview frame format, width and height via Camera.getParameters(). Note also that the Android YuvImage class was introduced with Froyo, so if you need to support earlier versions of Android you will need to roll your own conversion code (there are examples here on StackOverflow; a rough sketch follows after the code below).
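A minimal sketch of that omitted setup is below. The field names match the code that follows; how and where they get filled is my assumption, not part of the original answer:
// Hedged sketch: read the preview format and size from the open Camera's
// parameters and stash them in the fields used by the save code below.
Camera.Parameters params = mCamera.getParameters();
mPreviewImageFormat = params.getPreviewFormat();   // typically ImageFormat.NV21
Camera.Size previewSize = params.getPreviewSize();
mPreviewWidth = previewSize.width;
mPreviewHeight = previewSize.height;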
/**
 * Background task to compress captured image data and save to JPEG file.
 */
private class SaveStillPictureTask extends AsyncTask<byte[], Void, Void> {

    private static final String TAG = "VideoRecorder.SaveStillPictureTask";

    @Override
    protected Void doInBackground(byte[]... params) {
        byte[] data = params[0];
        FileOutputStream out = null;
        Bitmap bitmap = null;
        if (data == null) {
            Log.e(TAG, "doInBackground: data is null");
            return null;
        }
        try {
            out = new FileOutputStream(mSnapshotFilePath);
            // Use the preview image format, as documented in Android SDK javadoc
            if ((mPreviewImageFormat == ImageFormat.NV21) || (mPreviewImageFormat == ImageFormat.YUY2)) {
                saveYUVToJPEG(mCamera, out, data);
            } else if (mPreviewImageFormat == ImageFormat.JPEG) {
                Log.d(TAG, "directly write JPEG to storage");
                out.write(data);
            } else {
                Log.d(TAG, "try decoding to byte array");
                bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
                if (bitmap != null) {
                    bitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
                } else {
                    Log.e(TAG, "decodeByteArray failed, no decoded data");
                }
            }
        } catch (FileNotFoundException ignore) {
        } catch (IOException ignore) {
        } finally {
            if (out != null) {
                try {
                    out.close();
                } catch (IOException ignore) {
                }
                out = null;
            }
            if (bitmap != null) {
                bitmap.recycle();
                bitmap = null;
            }
            data = null;
        }
        return null;
    }
}
/**
 * Save YUV image data (aka NV21 or YUV420sp) to a JPEG file.
 *
 * @param camera
 * @param out
 * @param data
 */
protected void saveYUVToJPEG(Camera camera, FileOutputStream out, byte[] data) {
    YuvImage yuvimg = null;
    try {
        int width = mPreviewWidth;
        int height = mPreviewHeight;
        Rect rect = new Rect();
        rect.left = 0;
        rect.top = 0;
        rect.right = width - 1;
        rect.bottom = height - 1; // The -1 is required, otherwise a buffer overrun occurs
        yuvimg = new YuvImage(data, mPreviewImageFormat, width, height, null);
        yuvimg.compressToJpeg(rect, 90, out);
    } finally {
        yuvimg = null;
    }
}
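For pre-Froyo devices, where YuvImage is unavailable, here is the rough conversion sketch mentioned above. It assumes the preview format is NV21 and uses the widely circulated integer-only decodeYUV420SP routine; treat it as an untested fallback, not part of the original answer:
// Hedged sketch: convert an NV21 preview frame to ARGB and compress it with
// Bitmap instead of YuvImage (for Android versions before Froyo).
protected void saveNV21ToJPEGCompat(FileOutputStream out, byte[] yuv, int width, int height) {
    int[] rgb = new int[width * height];
    decodeYUV420SP(rgb, yuv, width, height);
    Bitmap bitmap = Bitmap.createBitmap(rgb, width, height, Bitmap.Config.ARGB_8888);
    bitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
    bitmap.recycle();
}

// Integer-only NV21 (YUV420SP) to ARGB_8888 conversion.
private static void decodeYUV420SP(int[] rgb, byte[] yuv, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & yuv[yp]) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv[uvp++]) - 128;
                u = (0xff & yuv[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000)
                    | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}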
Well, it looks like you made a mistake with the image size while decoding the bitmap from the byte array. Can you post the code you are using for:
- setting up the camera
- setting up the decode parameters
- retrieval of the image data