I have quite a few enums in Swift that are backed by Strings. This makes sense given what the enums represent, and it is practical because the following code never has to be written:
public enum MyEnum : String {
    ...
    func stringValue() -> String {
        switch self {
        ...
        }
    }
    // I can just use .rawValue instead.
}
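For illustration (a minimal sketch of my own, not from the original post), a String-backed enum gets both rawValue and the failable init(rawValue:) for free:

```swift
// Raw values default to the case names for String-backed enums.
enum Direction: String {
    case north, south
}

print(Direction.north.rawValue)               // "north"
print(Direction(rawValue: "south") == .south) // true
print(Direction(rawValue: "up") == nil)       // true
```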
Something I haven't been able to find is much information on the internal workings and performance of enums that are backed by non-integral types. Specifically I am looking for answers to the following questions:

Does comparing these enum values use String hashing, or do enums get their own == operator that works across their possible values regardless of the underlying type ("true O(1)")?
What is the complexity of the Enum(rawValue:) constructor for both integral and non-integral types?

Have a look at this code:
enum A {
    case A1, A2
}
enum B : Int {
    case B1 = 1
    case B2
}
enum C : String {
    case C1
    case C2
}

A.A1.hashValue // 0
A.A2.hashValue // 1
B.B1.hashValue // 0
B.B2.hashValue // 1
C.C1.hashValue // 0
C.C2.hashValue // 1
As you can see, the hashValue of any enum equals the ordinal number of its case, regardless of the rawValue and of whether there even is one. This is almost certainly because, underneath, every enum is just a number, and the rawValue is just a lookup table that maps those numbers (indices) to the corresponding values. Therefore:

Enum comparison is always just a comparison between two numbers; you can simply compare the hashValues -> O(1) complexity (untested, but I'm fairly sure).
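As a quick sketch (my own illustration, not from the original answer): enums without associated values get == for free, and the comparison never needs to touch the raw value at the call site:

```swift
enum C: String {
    case C1
    case C2
}

// Case comparison compiles without any Equatable boilerplate and
// behaves like a simple tag comparison.
print(C.C1 == C.C1) // true
print(C.C1 == C.C2) // false
```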
I don't know how initializing with a raw value works; most likely it's O(1) though (wrong, it's O(n)! see edit below), because all the possible RawValue types are Hashable and can therefore be stored in a hash map, which would mean O(1) lookup.
That said, an enum may be implemented like this:
enum M : RawRepresentable {
    case M1 // rawValue = 4, hashValue = 0
    case M2 // rawValue = 7, hashValue = 1

    typealias RawValue = Int

    // Int is Hashable -> store in a Dictionary -> O(1) lookup
    static let rawToEnum : [RawValue : M] = [
        4 : .M1,
        7 : .M2
    ]

    // hashValue is the index into this array
    static let hashToRaw : [RawValue] = [
        4,
        7
    ]

    var rawValue : RawValue {
        return M.hashToRaw[hashValue]
    }

    init?(rawValue: RawValue) {
        if let m = M.rawToEnum[rawValue] {
            self = m
        } else {
            self = .M1 // failed; assign a value so it compiles
            return nil
        }
    }
}
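Transferred to a String-backed enum (Fruit is a hypothetical example I made up for illustration), the same dictionary-backed technique, plus a usage example, looks like this:

```swift
// Hand-rolled RawRepresentable conformance backed by a Dictionary,
// mirroring the technique shown above.
enum Fruit: RawRepresentable {
    case apple, banana

    typealias RawValue = String

    // String is Hashable -> Dictionary lookup is O(1) on average.
    static let rawToEnum: [RawValue: Fruit] = [
        "apple": .apple,
        "banana": .banana
    ]

    var rawValue: RawValue {
        switch self {
        case .apple:  return "apple"
        case .banana: return "banana"
        }
    }

    init?(rawValue: RawValue) {
        guard let f = Fruit.rawToEnum[rawValue] else { return nil }
        self = f
    }
}

print(Fruit(rawValue: "banana") == .banana) // true
print(Fruit(rawValue: "kiwi") == nil)       // true
print(Fruit.apple.rawValue)                 // "apple"
```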
EDIT: It seems like initializing an enum has O(n) complexity (where n is the number of cases)!
I ran some performance tests, in which I created 4 different enums with 128, 256, 512, and 1024 cases each. I then had the program choose 128 random rawValues for each of those enums, build an array of them, and repeat that array 20 times (to get more accurate timings). Then the enum is initialized with each of these rawValues. Here are the results (release build):
Default rawValue: 20 repetitions
128 cases -> 0.003 sec (33% StDev)
256 cases -> 0.006 sec (14% StDev)
512 cases -> 0.014 sec (15% StDev)
1024 cases -> 0.025 sec (10% StDev)
You can check out the test project I created here (Xcode 7 beta 6).
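For reference, here is a minimal sketch of that measurement setup (my own approximation, not the actual test project; the case count is scaled down):

```swift
import Foundation

// Stand-in enum; the real tests used 128-1024 cases.
enum Small: Int {
    case a = 0, b, c, d
}

// Pick random raw values, then time repeated init?(rawValue:) calls,
// as described above.
let raws = (0..<128).map { _ in Int.random(in: 0..<4) }
let start = Date()
var hits = 0
for _ in 0..<20 {                       // 20 repetitions
    for r in raws where Small(rawValue: r) != nil {
        hits += 1
    }
}
let elapsed = Date().timeIntervalSince(start)
print("initialized \(hits) enums in \(elapsed) s")
```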
Another EDIT: I added enums to the test project that conform to RawRepresentable the way I showed above, again using exactly the same setup with 128, 256, 512, and 1024 cases. Turns out, as expected, it's O(1)! So apparently rolling your own rawValue is faster! I just don't understand why the Swift devs didn't do it like this... By the way, these are the timings for the custom implementation of RawRepresentable, this time with 200 repetitions (10 times the number of enum initializations compared with the default rawValue):
Custom rawValue: 200 repetitions
128 cases -> 0.008 sec ( 7% StDev)
256 cases -> 0.008 sec (15% StDev)
512 cases -> 0.010 sec (19% StDev)
1024 cases -> 0.008 sec (26% StDev)