I have an NSImage. I would like to read the NSColor for a pixel at some x and y. Xcode seems to think that there is a colorAtX:y: method on NSImage, but calling it crashes with an error saying that no such method exists on NSImage. I have seen examples where you create an NSBitmapImageRep and call the same method on that, but I have not been able to successfully convert my NSImage to an NSBitmapImageRep; the pixels on the NSBitmapImageRep come out different for some reason.
There must be a simple way to do this. It cannot be this complicated.
Without seeing your code it's difficult to know what's going wrong.
You can draw the image into an NSBitmapImageRep by using the initWithData: initializer and passing in the image's TIFFRepresentation.
You can then get the pixel value using colorAtX:y:, which is a method of NSBitmapImageRep, not NSImage:
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
// colorAtX:y: takes integer pixel coordinates with a top-left origin,
// so flip the y value and work in pixels (pixelsHigh), not points.
NSInteger x = 100;
NSInteger y = [imageRep pixelsHigh] - 1 - 100;
NSColor *color = [imageRep colorAtX:x y:y];
[imageRep release]; // Omit this line under ARC
Note that you must adjust the y value because colorAtX:y: uses a coordinate system whose origin is at the top left of the image, whereas the NSImage coordinate system starts at the bottom left. Also note that the method works in pixels rather than points, so if the bitmap's pixel dimensions differ from the image's point size (as with a Retina image) you must scale the coordinates accordingly, as in the sketch below.
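If you need this lookup in more than one place, you can wrap the conversion in a small helper. This is only a sketch, not an AppKit API: ColorAtImagePoint is a made-up name, and it assumes manual reference counting like the snippet above.
#import <Cocoa/Cocoa.h>

// Hypothetical helper: reads the color at a point given in the image's
// bottom-left-origin point coordinates.
static NSColor *ColorAtImagePoint(NSImage *image, NSPoint point) {
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithData:[image TIFFRepresentation]];
    if (rep == nil) {
        return nil;
    }
    // Scale from points to pixels in case the bitmap's pixel dimensions
    // differ from the image's point size (e.g. a 2x Retina image).
    CGFloat scaleX = (CGFloat)[rep pixelsWide] / [image size].width;
    CGFloat scaleY = (CGFloat)[rep pixelsHigh] / [image size].height;
    NSInteger x = (NSInteger)floor(point.x * scaleX);
    // Flip the y axis: colorAtX:y: counts rows from the top of the image.
    NSInteger y = [rep pixelsHigh] - 1 - (NSInteger)floor(point.y * scaleY);
    if (x < 0 || y < 0 || x >= [rep pixelsWide] || y >= [rep pixelsHigh]) {
        [rep release];
        return nil; // Point lies outside the bitmap
    }
    NSColor *color = [rep colorAtX:x y:y];
    [rep release]; // Omit under ARC
    return color;
}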
Alternatively, if the pixel is visible on-screen you can use the NSReadPixel() function to get the color of a pixel in the current coordinate system.
The colorAtX:y: method of NSBitmapImageRep seems not to use the device color space, which may lead to color values that are slightly different from what you actually see on screen. Use this code to get the color in the current device color space:
[yourImage lockFocus]; // yourImage is just your NSImage variable
NSColor *pixelColor = NSReadPixel(NSMakePoint(1, 1)); // Or another point
[yourImage unlockFocus];
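If you would rather stay with the NSBitmapImageRep approach, another option is to convert the color returned by colorAtX:y: into the device color space yourself with colorUsingColorSpace:. This is only a sketch reusing the yourImage variable from above; whether the converted components exactly match what NSReadPixel() reports still depends on the source bitmap's profile.
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
NSColor *sourceColor = [imageRep colorAtX:100 y:100];
// Convert from the bitmap's own color space into device RGB so the
// component values are closer to what you see on screen.
NSColor *deviceColor = [sourceColor colorUsingColorSpace:[NSColorSpace deviceRGBColorSpace]];
CGFloat red, green, blue, alpha;
[deviceColor getRed:&red green:&green blue:&blue alpha:&alpha];
[imageRep release]; // Omit under ARC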