With the introduction of image streaming in version 0.2.8 of the Flutter camera plugin, I've tried to integrate it into my project to use alongside AWS.
Amazon requires the image to be in a specific (base64-encoded) format.
Previously I used the camera package to take a picture, load it from the filesystem, and then convert it as Amazon requires, but using an image stream is much better suited to what I'd like to do. My previous approach was:
// Take the picture
await _cameraController.takePicture(path);
// Load it from my filesystem
File imagefile = new File(path);
// Convert to amazon requirements
List<int> imageBytes = imagefile.readAsBytesSync();
String base64Image = base64Encode(imageBytes);
However, using an image stream, I cannot find any easy way to convert a CameraImage to the format that Amazon requires. I don't have much experience with images, so I'm quite stuck.
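For context, the frames come from the plugin's startImageStream callback, roughly like this (sketch only):
// Each frame arrives as a raw CameraImage (YUV420 planes on Android),
// not an encoded JPEG/PNG file like takePicture produces.
_cameraController.startImageStream((CameraImage image) {
  // image.planes holds the Y, U and V planes separately
});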
I attempted to adapt the code used in the Firebase ML & camera stream demo:
// Concatenate the bytes of all planes into a single buffer
final int numBytes =
    image.planes.fold(0, (count, plane) => count += plane.bytes.length);
final Uint8List allBytes = Uint8List(numBytes);
int nextIndex = 0;
for (int i = 0; i < image.planes.length; i++) {
  allBytes.setRange(nextIndex, nextIndex + image.planes[i].bytes.length,
      image.planes[i].bytes);
  nextIndex += image.planes[i].bytes.length;
}
// Convert as done previously
String base64Image = base64Encode(allBytes);
However, AWS responded with an InvalidImageFormatException. If someone knows how to correctly encode the image, that would be awesome! Thanks.
RaisedButton(
  onPressed: () async {
    List<int> imageBytes = await sampleImage.readAsBytes();
    base64Image = base64Encode(imageBytes);
    print(base64Image);
  },
),
SizedBox(height: 30),
Image.
// ImageProcess is the image package: import 'package:image/image.dart' as ImageProcess;
File file = File(_imagePath);
final _imageFile = ImageProcess.decodeImage(
  file.readAsBytesSync(),
);
String base64Image = base64Encode(ImageProcess.encodePng(_imageFile));
print(base64Image);
A solution to convert the image to a PNG:
// Requires: import 'dart:typed_data';
//           import 'package:flutter/material.dart';
//           import 'package:image/image.dart' as imglib; // https://pub.dartlang.org/packages/image
Future<Image> convertYUV420toImageColor(CameraImage image) async {
  try {
    final int width = image.width;
    final int height = image.height;
    final int uvRowStride = image.planes[1].bytesPerRow;
    final int uvPixelStride = image.planes[1].bytesPerPixel;
    print("uvRowStride: " + uvRowStride.toString());
    print("uvPixelStride: " + uvPixelStride.toString());

    var img = imglib.Image(width, height); // Create Image buffer

    // Fill image buffer with plane[0] from YUV420_888
    for (int x = 0; x < width; x++) {
      for (int y = 0; y < height; y++) {
        final int uvIndex =
            uvPixelStride * (x / 2).floor() + uvRowStride * (y / 2).floor();
        final int index = y * width + x;

        final yp = image.planes[0].bytes[index];
        final up = image.planes[1].bytes[uvIndex];
        final vp = image.planes[2].bytes[uvIndex];
        // Calculate pixel color (YUV -> RGB)
        int r = (yp + vp * 1436 / 1024 - 179).round().clamp(0, 255);
        int g = (yp - up * 46549 / 131072 + 44 - vp * 93604 / 131072 + 91)
            .round()
            .clamp(0, 255);
        int b = (yp + up * 1814 / 1024 - 227).round().clamp(0, 255);
        // color: 0x FF FF FF FF
        //           A  B  G  R
        img.data[index] = (0xFF << 24) | (b << 16) | (g << 8) | r;
      }
    }

    imglib.PngEncoder pngEncoder = new imglib.PngEncoder(level: 0, filter: 0);
    List<int> png = pngEncoder.encodeImage(img);
    // muteYUVProcessing = false; // flag from the original snippet's surrounding code
    return Image.memory(Uint8List.fromList(png));
  } catch (e) {
    print(">>>>>>>>>>>> ERROR:" + e.toString());
  }
  return null;
}
Source: https://github.com/flutter/flutter/issues/26348#issuecomment-462321428
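The function above returns a Flutter Image widget, but the original goal is a base64 string for AWS. Below is a minimal sketch of that last step; it assumes the YUV-to-RGB fill loop above has been factored into a hypothetical convertYUV420 helper that returns an imglib.Image, and that dart:convert is imported for base64Encode:
// Sketch: turn a CameraImage frame into a base64-encoded PNG string.
// Assumes: import 'dart:convert';
//          import 'package:image/image.dart' as imglib;
// convertYUV420 is a hypothetical helper containing the YUV -> RGB loop above.
String cameraImageToBase64Png(CameraImage image) {
  final imglib.Image converted = convertYUV420(image); // RGB image buffer
  final List<int> png = imglib.encodePng(converted);   // encode buffer as PNG bytes
  return base64Encode(png); // same base64 string as the original takePicture approach produced
}
Encoding the imglib.Image directly to PNG bytes skips the Image.memory widget entirely, which is all that's needed when the frame is only being sent to AWS rather than displayed.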