I'm trying to get the color of a pixel in a UIImage with Swift, but it seems to always return 0. Here is the code, translated from @Minas' answer on this thread:
func getPixelColor(pos: CGPoint) -> UIColor {
    var pixelData = CGDataProviderCopyData(CGImageGetDataProvider(self.CGImage))
    var data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
    var pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

    var r = CGFloat(data[pixelInfo])
    var g = CGFloat(data[pixelInfo+1])
    var b = CGFloat(data[pixelInfo+2])
    var a = CGFloat(data[pixelInfo+3])

    return UIColor(red: r, green: g, blue: b, alpha: a)
}
Thanks in advance!
A bit of searching led me here, since I was facing a similar problem. Your code works fine; the problem most likely comes from your image.
Code:
// At the top of your Swift file:
extension UIImage {
    func getPixelColor(pos: CGPoint) -> UIColor {
        let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(self.CGImage))
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
        let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

        let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}
This method picks the pixel colour from the image's underlying CGImage, so make sure you are sampling the right image. For example, if your UIImage is 200x200 but the original file from Images.xcassets (or wherever it came from) is 400x400, then picking point (100, 100) actually samples a point in the upper-left quarter of the image, not its centre.
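A quick way to check whether this applies to your image is to compare the UIImage's point size against the underlying CGImage's pixel size. A minimal sketch (Swift 3 syntax; the asset name is hypothetical):

// If the point size and pixel size differ, getPixelColor will sample the wrong spot.
let image = UIImage(named: "imageName")!        // hypothetical asset name
let pointWidth = image.size.width               // e.g. 200 points
let pixelWidth = CGFloat(image.cgImage!.width)  // e.g. 400 pixels
if pointWidth != pixelWidth {
    print("Scale mismatch: \(pixelWidth / pointWidth)x")
}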
Two Solutions:
1. Use an image from Images.xcassets, and put only one @1x image in the 1x field, leaving @2x and @3x blank. Make sure you know the image size, and pick a point that is within its range.
// Make sure only the 1x image is set
let image: UIImage = UIImage(named: "imageName")!
// Make sure the point is within the image
let color: UIColor = image.getPixelColor(CGPointMake(xValue, yValue))
2. Scale your CGPoint up or down in proportion to match the UIImage, e.g. let point = CGPointMake(100, 100)
In the example above:

let xCoordinate: Float = Float(point.x) * (400.0 / 200.0)
let yCoordinate: Float = Float(point.y) * (400.0 / 200.0)
let newCoordinate: CGPoint = CGPointMake(CGFloat(xCoordinate), CGFloat(yCoordinate))
let image: UIImage = largeImage
let color: UIColor = image.getPixelColor(newCoordinate)
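If the image sizes aren't fixed, the 400.0/200.0 factor can also be derived at runtime rather than hard-coded. A minimal sketch of that idea (Swift 3 syntax, calling the Swift 3 extension shown further down; `point` and `image` are assumed to be defined as above):

// Derive the pixel-per-point scale instead of hard-coding 400.0 / 200.0.
let scale = CGFloat(image.cgImage!.width) / image.size.width
let pixelPoint = CGPoint(x: point.x * scale, y: point.y * scale)
let color = image.getPixelColor(pos: pixelPoint)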
I've only tested the first method, and I am using it to get a colour off a colour palette. Both should work. Happy coding :)
SWIFT 3, XCODE 8 (tested and working)
extension UIImage {
    func getPixelColor(pos: CGPoint) -> UIColor {
        let pixelData = self.cgImage!.dataProvider!.data
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
        let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

        let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}
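For completeness, a usage sketch of the extension above (the asset name is hypothetical, and the point must lie within the image's pixel bounds):

// Sample the pixel at (50, 50) and read back its components.
let image = UIImage(named: "palette")!   // hypothetical asset name
let color = image.getPixelColor(pos: CGPoint(x: 50, y: 50))

var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
color.getRed(&r, green: &g, blue: &b, alpha: &a)
print("r: \(r), g: \(g), b: \(b), a: \(a)")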