In UIKit we could use an extension to set a hex color on almost anything: https://www.hackingwithswift.com/example-code/uicolor/how-to-convert-a-hex-color-to-a-uicolor
But when I try to do the same in SwiftUI, it's not possible; it looks like SwiftUI does not accept a UIColor as a parameter.
Text(text)
    .color(UIColor.init(hex: "FFF"))
error message:
Cannot convert value of type 'UIColor' to expected argument type 'Color?'
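(Side note: in shipping SwiftUI the modifier is .foregroundColor rather than .color, and Color can wrap a UIColor directly. A minimal sketch, assuming the failable UIColor(hex:) initializer from the linked article, which takes a "#RRGGBBAA" string; adjust the unwrap to match whatever signature your extension actually uses:)

import SwiftUI

// Wrap the UIColor produced by the UIKit extension in a SwiftUI Color.
// UIColor(hex:) here is the assumed failable initializer from the article.
Text("Hello")
    .foregroundColor(Color(UIColor(hex: "#FFFFFFFF") ?? .white))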
I even tried to make an extension for Color instead of UIColor, but I haven't had any luck.
My extension for Color:
import SwiftUI

extension Color {
    init(hex: String) {
        let scanner = Scanner(string: hex)
        scanner.scanLocation = 0
        var rgbValue: UInt64 = 0
        scanner.scanHexInt64(&rgbValue)

        let r = (rgbValue & 0xff0000) >> 16
        let g = (rgbValue & 0xff00) >> 8
        let b = rgbValue & 0xff

        self.init(
            red: CGFloat(r) / 0xff,
            green: CGFloat(g) / 0xff,
            blue: CGFloat(b) / 0xff,
            alpha: 1
        )
    }
}
error message:
Incorrect argument labels in call (have 'red:green:blue:alpha:', expected '_:red:green:blue:opacity:')
To make this work, you have to add two extensions to your codebase. The first one allows you to specify a hex color from an Int, which you can conveniently write in hexadecimal form. This is also quite a bit quicker to execute than the string version, if performance is of any concern to you.
Right-click anywhere in the empty space under "AppIcon," choose "New Folder," and name it "Colors" (this step isn't necessary, but it helps to keep things tidy). Right-click the folder you just created and choose "New Color Set." A color set is an asset catalog entry that defines a named color (optionally with separate light and dark variants).
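Once the color set exists, you can refer to it by name in SwiftUI. A quick sketch; "BrandBlue" is a placeholder for whatever you named your color set:

import SwiftUI

struct ContentView: View {
    var body: some View {
        // "BrandBlue" is a hypothetical color set name from the asset catalog.
        Text("Hello")
            .foregroundColor(Color("BrandBlue"))
    }
}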
You're almost there; you were using the wrong initialiser parameters:
extension Color {
    init(hex: String) {
        let hex = hex.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int: UInt64 = 0
        Scanner(string: hex).scanHexInt64(&int)
        let a, r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (1, 1, 1, 0)
        }
        self.init(
            .sRGB,
            red: Double(r) / 255,
            green: Double(g) / 255,
            blue: Double(b) / 255,
            opacity: Double(a) / 255
        )
    }
}
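Usage with the string version might then look like this (the hex values are illustrative; the trimming step strips any leading "#"):

Color(hex: "FFF")       // 12-bit RGB, each digit expanded
Color(hex: "#FF5733")   // 24-bit RGB; the "#" is trimmed away
Color(hex: "80FF5733")  // 32-bit ARGB, roughly 50% opacity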
Here is another alternative that uses an Int for the hex value, but of course it can be changed to String if you prefer that.
extension Color {
    init(hex: UInt, alpha: Double = 1) {
        self.init(
            .sRGB,
            red: Double((hex >> 16) & 0xff) / 255,
            green: Double((hex >> 08) & 0xff) / 255,
            blue: Double((hex >> 00) & 0xff) / 255,
            opacity: alpha
        )
    }
}
Usage examples:
Color(hex: 0x000000)
Color(hex: 0x000000, alpha: 0.2)
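And in a view, something like this (a sketch; the hex values are arbitrary):

import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Hello, world!")
            .padding()
            .foregroundColor(Color(hex: 0xFFFFFF))          // opaque white text
            .background(Color(hex: 0x000000, alpha: 0.6))   // translucent black
    }
}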