I have a CVPixelBufferRef from an AVAsset. I'm trying to apply a CIFilter to it. I use these lines:
CVPixelBufferRef pixelBuffer = ...; // source frame
CVPixelBufferRef newPixelBuffer = ...; // empty pixel buffer to fill
CIContext *context = ...; // CIContext created from an EAGLContext
CGAffineTransform preferredTransform = ...; // the AVAsset track's preferredTransform
CIImage *phase1 = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *phase2 = [phase1 imageByApplyingTransform:preferredTransform];
CIImage *phase3 = [self applyFiltersToImage:phase2];
[context render:phase3 toCVPixelBuffer:newPixelBuffer bounds:phase3.extent colorSpace:CGColorSpaceCreateDeviceRGB()];
Unfortunately, the result I get has an incorrect orientation. For example, a video captured in portrait mode comes out upside down. I suspect the problem lies in going from the AVAsset coordinate system to the Core Image one (previewing phase2 in Xcode also shows the incorrect result). How can I fix it?
I solved it by doing this. It should orient everything correctly in the Core Image coordinate space:
var preferredTransform = inst.preferredTransform
// Negate the off-diagonal terms: AVFoundation's preferredTransform is
// expressed in a top-left-origin coordinate system, while Core Image's
// origin is at the bottom left, so the rotation direction must be mirrored.
preferredTransform.b *= -1
preferredTransform.c *= -1

var outputImage = CIImage(cvPixelBuffer: videoFrameBuffer)
    .transformed(by: preferredTransform)

// The transform can move the image's extent away from the origin;
// translate it back so the extent starts at (0, 0).
outputImage = outputImage.transformed(by: CGAffineTransform(
    translationX: -outputImage.extent.origin.x,
    y: -outputImage.extent.origin.y))
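Putting the fix together with the rendering pipeline from the question, a minimal Swift sketch could look like the following. The function names `correctedTransform(for:)` and `render(pixelBuffer:into:track:context:)` are my own, and I assume the filter chain and destination buffer are set up as in the question:

```swift
import AVFoundation
import CoreImage

// Build a transform that maps the track's preferred orientation into
// Core Image's bottom-left-origin coordinate system.
func correctedTransform(for track: AVAssetTrack) -> CGAffineTransform {
    var t = track.preferredTransform
    // Negate the off-diagonal terms to mirror the rotation direction
    // between the two coordinate systems.
    t.b *= -1
    t.c *= -1
    return t
}

func render(pixelBuffer: CVPixelBuffer,
            into newPixelBuffer: CVPixelBuffer,
            track: AVAssetTrack,
            context: CIContext) {
    var image = CIImage(cvPixelBuffer: pixelBuffer)
        .transformed(by: correctedTransform(for: track))
    // Translate the image back so its extent starts at the origin
    // before rendering into the destination buffer.
    image = image.transformed(by: CGAffineTransform(
        translationX: -image.extent.origin.x,
        y: -image.extent.origin.y))
    context.render(image,
                   to: newPixelBuffer,
                   bounds: image.extent,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
}
```

Any CIFilter chain would be applied to `image` before the final `render` call, just as `applyFiltersToImage:` does in the original Objective-C code.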