`ExpressibleByArrayLiteral` conformed to by class and its superclass => "<superclass> is not convertible to <subclass>"

I want to be able to instantiate a subclass of UILabel, here named MyLabel, using array literals. I make use of this in my framework ViewComposer, which allows creating UIViews from an array of enum cases attributing the view, like this:

let label: UILabel = [.text("Hello World"), .textColor(.red)]

In this question I have dramatically simplified the use case, instead allowing for writing:

let vanilla: UILabel = [1, 2, 3, 4]  // works!
print(vanilla.text!) // prints: "Sum: 10"

What I want is to use the same ExpressibleByArrayLiteral syntax for a subclass of UILabel, called MyLabel. However, this fails when I try:

let custom: MyLabel = [1, 2, 3, 4] // Runtime error: "Could not cast value of type 'UILabel' to 'MyLabel'"

Instantiation of UILabel using array literals works, thanks to conformance to the custom protocol Makeable below.

Is it somehow possible to make the compiler understand that I am referring to the array literal initializer of the subclass MyLabel and not its superclass UILabel?

The following code might not make any logical sense, but it is a minimal example, hiding away what I really want:

// This protocol has been REALLY simplified, in fact it has another name and important code.
public protocol ExpressibleByNumberArrayLiteral: ExpressibleByArrayLiteral {
    associatedtype Element
}

public protocol Makeable: ExpressibleByNumberArrayLiteral {
    // we want `init()` but we are unable to satisfy such a protocol from `UILabel`, thus we need this workaround
    associatedtype SelfType
    static func make(values: [Element]) -> SelfType
}

public protocol Instantiatable: ExpressibleByNumberArrayLiteral {
    init(values: [Element])
}

// My code might have worked if it would be possible to check for _NON-conformance_ in where clause
// like this: `extension Makeable where !(Self: Instantiatable)`
extension Makeable {
    public init(arrayLiteral elements: Self.Element...) {
        self = Self.make(values: elements) as! Self
    }
}

extension Instantiatable {
    init(arrayLiteral elements: Self.Element...) {
        self.init(values: elements)
    }
}

extension UILabel: Makeable {
    public typealias SelfType = UILabel
    public typealias Element = Int

    public static func make(values: [Element]) -> SelfType {
        let label = UILabel()
        label.text = "Sum: \(values.reduce(0,+))"
        return label
    }
}

public class MyLabel: UILabel, Instantiatable {
    public typealias Element = Int
    required public init(values: [Element]) {
        super.init(frame: .zero)
        text = "Sum: \(values.reduce(0,+))"
    }

    public required init?(coder: NSCoder) { fatalError() }
}

let vanilla: UILabel = [1, 2, 3, 4]
print(vanilla.text!) // prints: "Sum: 10"

let custom: MyLabel = [1, 2, 3, 4] // Runtime error: "Could not cast value of type 'UILabel' to 'MyLabel'"

I have also tried conforming to the ExpressibleByArrayLiteral protocol by extending ExpressibleByNumberArrayLiteral instead (I suspect the two solutions are equivalent and compile to the same code), like this:

extension ExpressibleByNumberArrayLiteral where Self: Makeable {
    public init(arrayLiteral elements: Self.Element...) {
        self = Self.make(values: elements) as! Self
    }
}

extension ExpressibleByNumberArrayLiteral where Self: Instantiatable {
    init(arrayLiteral elements: Self.Element...) {
        self.init(values: elements)
    }
}

But that did not work either. The same compilation error occurs.

As noted in a comment in the big code block above, the compiler might have been able to determine which array literal initializer I was referring to if I could use negation in the where clause: extension Makeable where !(Self: Instantiatable)

But AFAIK that is not possible; at least that code does not compile. Nor does extension Makeable where Self != Instantiatable.

Is what I want to do possible?

I would be okay with having to make MyLabel a final class, but that makes no difference.
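As far as I can tell, the trap can even be reproduced without UIKit: make is only implemented on the base class and always returns a plain base instance, so Self.make still builds the superclass even when Self is the subclass, and the as! Self cast then fails at runtime. A minimal sketch (hypothetical Base/Sub names, my simplification of the code above):

```swift
class Base {
    // `make` always builds a Base, even when called as `Sub.make()`.
    class func make() -> Base { return Base() }
}
class Sub: Base {}

let instance = Sub.make()   // dynamic type is Base, not Sub
assert(!(instance is Sub))  // so `instance as! Sub` would trap
```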

Please please please say that this is possible.

asked Jun 06 '17 by Sajjon

2 Answers

After going through the Apple docs, I initially thought this was not possible. However, I did find a post that applied to Strings and other non-UI classes. From that post I took the idea that you cannot apply ExpressibleByArrayLiteral to a subclass through the inheritance approach, which is probably why you get the error (one I could reproduce many times with many other approaches).

Finally, by moving the ExpressibleByArrayLiteral adoption directly onto your UILabel subclass, it works:

public class MyLabel: UILabel, ExpressibleByArrayLiteral {

    public typealias Element = Int

    public override init(frame: CGRect) {
        super.init(frame: frame)
    }

    required public init(values: [Element]) {
        super.init(frame: .zero)
        text = "Sum: \(values.reduce(0,+))"
    }


    public convenience required init(arrayLiteral: Element...) {
        self.init()
        self.text = "Sum: \(arrayLiteral.reduce(0,+))"
    }


    public required init?(coder: NSCoder) { fatalError() }
}

let vanilla: MyLabel = [1, 2, 3, 4]

print(vanilla.text!) // prints: Sum: 10

It turns out we can't even inherit expressibility on element-type classes (Strings, Ints, etc.); you still have to re-specify the initializer for it.

With some tweaking, I also came up with these other methods, for thought:

public class MyLabel: UILabel, ExpressibleByArrayLiteral {

    public typealias Element = Int

    // Start with empty storage so that `append` works even before
    // any literal has been assigned.
    private var values: [Element] = []

    public var append: Element? {
        didSet {
            if let t = append {
                values.append(t)
            }
        }
    }

    public var sum: Element {
        return values.reduce(0, +)
    }

    public var sumString: String {
        return "\(sum)"
    }

    public var label: String {
        guard !values.isEmpty else {
            return ""
        }
        return "Sum: \(sumString)"
    }

    public override init(frame: CGRect) {
        super.init(frame: frame)
    }

    required public init(values: [Element]) {
        super.init(frame: .zero)
        self.values = values // keep storage in sync with the text
        text = "Sum: \(values.reduce(0, +))"
    }

    public convenience required init(arrayLiteral: Element...) {
        self.init()
        self.values = arrayLiteral
        self.text = label
    }

    public required init?(coder: NSCoder) { fatalError() }
}

let vanilla: MyLabel = [1, 2, 3, 4]

print(vanilla.label) //prints out Sum: 10 , without unwrapping ;)

For now, I can't seem to apply the expressibility through a protocol approach like you did. However, as workarounds go, this seems to do the trick. I guess for now we just have to apply the initializers to each subclass. Unfortunate, but still worth looking all this up!

UPDATE TO RE-AFFIRM THE ISSUE WITH THE EXTENSION APPROACH

Swift's inheritance rules prohibit such convenience inits when the compiler cannot guarantee that the superclass will not be altered dramatically. While your init does not change UILabel's properties, the strict typing for extensions just does not support the combination of required and convenience on this type of initializer.

I'm taking this from a post that was included in the link above, by the way:

Because this is a non-final class. Consider if there were a subclass to Stack that had its own required initializer. How would you ensure that init(arrayLiteral:) called it? It couldn't call it (because it wouldn't know that it existed). So either init(arrayLiteral:) has to be required (which means it needs to be part of the main declaration and not a extension), or Stack has to be final.

If you mark it final, this works like you expect. If you want it to be subclassed, just move it out of the extension and into the main body.
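To sketch the quoted rule with a hypothetical, non-UIKit class of our own (my example, not from that post): once the class is marked final, the conforming initializer can live in an extension.

```swift
// Because IntStack is final, a convenience initializer declared in an
// extension can satisfy the ExpressibleByArrayLiteral requirement.
final class IntStack {
    var items: [Int] = []
    init() {}
}

extension IntStack: ExpressibleByArrayLiteral {
    convenience init(arrayLiteral elements: Int...) {
        self.init()
        items = elements
    }
}

let stack: IntStack = [1, 2, 3]
```

Remove the final keyword and the compiler produces errors along the same lines as the ones it gives for UILabel.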

And if we look at the TWO errors you get, simply by trying to extend UILabel to ExpressibleByArrayLiteral directly, and not through a nested network of protocols like what you are doing:

Initializer requirement 'init(arrayLiteral:)' can only be satisfied by a 'required' initializer in the definition of non-final class 'UILabel'

'required' initializer must be declared directly in class 'UILabel' (not in an extension).

So first: ExpressibleByArrayLiteral requires a 'required' initializer for conformance, which, as the compiler says, you cannot implement inside an extension of the superclass you wish to customize. By this logic alone, your desired approach is unfortunately flawed.

Second: the specific initializer you want, init(arrayLiteral:), can only be added outside the main class body for final classes, that is, classes marked with the keyword final in their declaration, or for value types (String and the number types are structs, not classes). UILabel is deliberately not a final class, so that it can be subclassed, and you cannot change that without hacking the language. To illustrate the final/non-final distinction, try to subclass String: you get an error, since it is not a class or protocol. So by design, you just cannot use this method on UILabel itself via an extension.

Third: you take the custom protocol approach and try to apply it to UILabel by extension and inheritance. I do apologize, but Swift does not ignore its language rules simply because you layer some custom code between the two ends of the constraint. Your protocol approach, elegant as it is, merely nests the problem instead of addressing it, because it re-applies the same initialization constraints back onto UILabel regardless of your middleware.

Fourth, on a more speculative train of thought: if you look at the docs on the Expressible protocols inside Xcode (the code file itself), you notice that they especially apply to RawRepresentable types (Strings, Ints, Doubles, etc.):

For any enumeration with a string, integer, or floating-point raw type, the Swift compiler automatically adds RawRepresentable conformance. When defining your own custom enumeration, you give it a raw type by specifying the raw type as the first item in the enumeration's type inheritance list. You can also use literals to specify values for one or more cases.

So anything that is a data representation at the root level of the type can adopt this by extension. As you can see above, when adding this protocol immediately to UILabel, you are also imposing RawRepresentable-like behavior on it, which it cannot adopt by nature, and I'm willing to bet that is related to the "Could not cast value of type 'UILabel' to 'MyLabel'" error. UILabel is not one of these things, which is partly why it is a non-final class: it is a UI element, a combination of many RawRepresentables. So it makes sense that you should not be able to directly literal-initialize a class that is an ensemble of RawRepresentables; if some mix-up happened at init time on the compiler side and did not produce the raw type you were aiming for, it could corrupt the class instance altogether and drag everyone into a debugging nightmare.

To illustrate the RawRepresentable point I'm trying to make, here's what happens when you apply this approach to String:

extension String: ExpressibleByArrayLiteral {
    public typealias Element = Int

    public init(arrayLiteral elements: Element...) {
        self.init()
        self = "Sum: \(elements.reduce(0,+))"//WE GET NO ERRORS
    }

}


let t : String = [1,2,3,4]

print(t) // prints -> Sum: 10

Whereas...

extension UILabel: ExpressibleByArrayLiteral {
    public typealias Element = Int

    public convenience required init(arrayLiteral elements: Element...) { // The compiler even keeps suggesting you add modifiers here, without taking into account what is already there; i.e. it's confused by what you're trying to do
        self.init()
        self.text = "Sum: \(elements.reduce(0,+))" //WE GET THE ERRORS
    }

}

// Does not compile, so there is nothing to print

I'll even demonstrate how far the constraint goes when adding Expressibles to UILabel subclasses and their second-tier subclasses:

class Label : UILabel, ExpressibleByArrayLiteral {

    public typealias Element = Int

    override init(frame: CGRect) {
        super.init(frame: frame)
    }

    public required init(arrayLiteral elements: Element...) {
        super.init(frame: .zero)
        self.text = "Sum: \(elements.reduce(0,+))"
    }

    public required init?(coder: NSCoder) { fatalError() }
}

let te : Label = [1,2,3,4]

print(te.text!) //prints:  Sum: 10

class SecondLabel : Label {

    public typealias Element = Int


    required init(arrayLiteral elements: Element...) {
        //THIS IS STILL REQUIRED... EVEN THOUGH WE DO NOT NEED TO MANUALLY ADOPT THE PROTOCOL IN THE CLASS HEADER
        super.init(frame: .zero)
        self.text = "Sum: \(elements.reduce(0,+))"
    }

    public required init?(coder: NSCoder) { fatalError() }

}

let ta : SecondLabel = [1,2,3,4]

print(ta.text!) //prints:  Sum: 10

In conclusion: Swift is designed this way. You cannot apply this particular protocol directly to UILabel because it is a framework-level superclass, and the people who design this stuff don't want you to have that much reach into UILabel. So you simply cannot do this, because the protocol cannot be applied through extensions of non-final superclasses, due to the nature of UILabel and the protocol itself. They are just not compatible this way. However, you can apply it to subclasses on a per-subclass basis, meaning you have to re-declare the conforming initializer each time. It sucks, but it's just how it works.

I commend your approach; it almost gets the extension approach down to a T. However, there are some things in how Swift is built that you just cannot circumvent. I am not the only one to reach this conclusion (just check the links), so I would ask you to remove your downvote. I've given you a solution, references to validate my point, and code showing how to work around this constraint in the language. Sometimes there just is no solution for the desired approach, and another approach is the only way to get around things.

answered Nov 20 '22 by jlmurph


I've ended up with a solution using a postfix operator.

Since I need to be able to instantiate UIKit's UILabel itself using ExpressibleByArrayLiteral, I cannot use murphguy's proposed solution with Label and SecondLabel.

My original code works by adding this postfix operator:

postfix operator ^
postfix func ^<I: Instantiatable>(attributes: [I.Element]) -> I {
    return I(values: attributes)
}

This makes the code compile and work, although it feels a bit "hacky"...

let custom: MyLabel = [1, 2, 3, 4]^ // note use of caret operator. Now compiles
print(custom.text!) // prints: "Sum: 10"

If you are interested in why and how I use this you can take a look at my framework ViewComposer which enables this syntax:

let label: UILabel = [.text("Hello World"), .textColor(.red)]

But I also wanted to be able to create my own Composable subclass, called MyLabel (or just Label):

let myLabel: MyLabel = [.text("Hello World"), .textColor(.red)] // does not compile

Which did not work earlier, but now works using the caret postfix operator ^, like this:

let myLabel: MyLabel = [.text("Hello World"), .textColor(.red)]^ // works!

Which for now is the most elegant solution I think.

answered Nov 20 '22 by Sajjon