
Metal Custom CIFilter different return value

I'm writing a custom CIFilter, but the resulting pixel colors are different from the values returned by the Metal function.

kernel.metal

#include <CoreImage/CoreImage.h>

extern "C" { namespace coreimage {

    float4 foo(sample_t rgb) {
        // Ignore the input sample and return a constant color.
        return float4(0.3f, 0.5f, 0.7f, 1.0f);
    }

}}

MetalFilter.swift

import CoreImage

class MetalFilter: CIFilter {

    private let kernel: CIColorKernel

    var inputImage: CIImage?

    override init() {
        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        kernel = try! CIColorKernel(functionName: "foo", fromMetalLibraryData: data)
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override var outputImage: CIImage? {
        guard let inputImage = inputImage else { return nil }
        return kernel.apply(extent: inputImage.extent, arguments: [inputImage])
    }
}

When I read the pixels of outputImage I get these values:
R = 0.58431372549019611
G = 0.73725490196078436
B = 0.85490196078431369

It looks like some kind of post-processing (like pow(x, 1/2.373)) is applied after the Metal function returns its values.
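The observed values are in fact consistent with Core Image applying the standard sRGB encoding curve (a pow(x, 1/2.4) segment with a linear toe, per the sRGB specification) followed by 8-bit quantization. A quick sketch of that check, using only Foundation:

```swift
import Foundation

// sRGB encoding (linear -> display), per the sRGB specification.
func srgbEncode(_ x: Double) -> Double {
    return x <= 0.0031308 ? 12.92 * x : 1.055 * pow(x, 1.0 / 2.4) - 0.055
}

// The kernel returns (0.3, 0.5, 0.7); after sRGB encoding and 8-bit
// quantization these land exactly on the observed output values.
for x in [0.3, 0.5, 0.7] {
    let quantized = (srgbEncode(x) * 255).rounded() / 255
    print(x, quantized)   // 0.3 -> ~0.5843, 0.5 -> ~0.7373, 0.7 -> ~0.8549
}
```

So the "post processing" is the color matching described in the answer below, not something the kernel itself does.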

Roman Kazov asked Dec 04 '18


1 Answer

Core Image performs color matching twice when you process an image: first from the color space of the input image to the working color space of the CIContext, and then, in the final rendering step after all filters have been applied, from the working color space to the output color space of the context.

Those color spaces are configured with default values that, in my experience, depend on the device (and its display) you are running on. However, you can define both color spaces using the kCIContextWorkingColorSpace and kCIContextOutputColorSpace options when creating your CIContext.

If you set both values to NSNull(), Core Image won't perform any color matching and will treat all color values exactly as they are stored in the image buffers. However, your filter probably makes some assumptions about the color space of the input samples, so keep that in mind when dealing with inputs from sources like the camera, whose color space can vary with the device and camera configuration.
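A minimal sketch of creating such a context in Swift (the option keys are the standard CIContextOption values corresponding to kCIContextWorkingColorSpace and kCIContextOutputColorSpace):

```swift
import CoreImage

// Disable Core Image's color matching entirely: samples reach the kernel
// exactly as stored in the input buffers, and the kernel's return values
// are written to the output without the sRGB encoding step.
let context = CIContext(options: [
    .workingColorSpace: NSNull(),
    .outputColorSpace: NSNull()
])
```

With this context, rendering the filter above should produce pixels containing (0.3, 0.5, 0.7) as returned by the kernel.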

Another way to ensure the input samples are always in the color space you need is to set the kCISamplerColorSpace option when creating a CISampler that serves as input to your custom kernel.
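A sketch of that approach, assuming `inputImage` is the CIImage you feed to the kernel (kCISamplerColorSpace is the Core Image option key for this):

```swift
import CoreImage

// Force the sampler to deliver samples in an explicit color space,
// independent of the context's working color space.
let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
let sampler = CISampler(image: inputImage,
                        options: [kCISamplerColorSpace: srgb])
// Pass `sampler` in the arguments array when applying the kernel.
```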

Frank Schlegel answered Oct 17 '22