I need to cut a region out of the full image using the mask, and create the masked image.
I tried the following:
UIImage *imgMask = [UIImage imageNamed:@"Mask.png"];
UIImage *imgBgImage = [UIImage imageNamed:@"Full.png"];
GPUImageMaskFilter *maskingFilter = [[GPUImageMaskFilter alloc] init];
GPUImagePicture * maskGpuImage = [[GPUImagePicture alloc] initWithImage:imgMask ];
GPUImagePicture *FullGpuImage = [[GPUImagePicture alloc] initWithImage:imgBgImage ];
[maskGpuImage addTarget:maskingFilter];
[maskGpuImage processImage];
[maskingFilter useNextFrameForImageCapture];
[FullGpuImage addTarget:maskingFilter];
[FullGpuImage processImage];
UIImage *OutputImage = [maskingFilter imageFromCurrentFramebuffer];
But my generated output image is:
Please help me out, guys. Cheers.
Also, thanks to BradLarson.
The mask is the second target, as can be seen in the filter's shader code (textureColor2):
// Averages the mask's RGB values and scales that value by the mask's alpha
//
// The dot product should take fewer cycles than computing an average directly
//
// In the typical/ideal case, R, G, and B will be the same and alpha will be 1.0
lowp float newAlpha = dot(textureColor2.rgb, vec3(.33333334, .33333334, .33333334)) * textureColor2.a;
gl_FragColor = vec4(textureColor.xyz, newAlpha);
Then you need to "invert" your mask: a white heart on a black background, as the filter uses the "weight" of the RGB pixel values to set the alpha value on the target image.
So your code should be:
// Image first, Mask next
[FullGpuImage addTarget:maskingFilter];
[FullGpuImage processImage];
[maskingFilter useNextFrameForImageCapture];
[maskGpuImage addTarget:maskingFilter];
[maskGpuImage processImage];
and your mask inverted (OK, I did an ugly quick test; use a proper image) to get the expected result.
You don't need a mask filter, but rather an alpha mask blend.
I implemented one here like this:
// GPUImage shader strings
NSString * const kNBUAlphaMaskShaderString = SHADER_STRING
(
 varying highp vec2 textureCoordinate;
 varying highp vec2 textureCoordinate2;

 uniform sampler2D inputImageTexture;
 uniform sampler2D inputImageTexture2;

 void main()
 {
     lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
     lowp vec4 textureColor2 = texture2D(inputImageTexture2, textureCoordinate2);

     gl_FragColor = vec4(textureColor.xyz, textureColor2.a);
 }
);
It simply keeps the colors from the first image and the alpha channel of the second one.
Then create a GPUImageTwoInputFilter with it:
GPUImageTwoInputFilter * alphaMask = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:kNBUAlphaMaskShaderString];
I'm not sure if a similar blend filter has been added to GPUImage since.
Just checked again to see if there's a built-in blend that does it, and there isn't. But the blend filter I used as inspiration is still there (GPUImageAlphaBlendFilter). It merges two images using the alpha mask to mix them. The filter mentioned above doesn't require a second "empty" image.