How can we create a generic Array Extension that sums Number types in Swift?


Swift lets you create an Array extension that sums Integers with:

extension Array {
    func sum() -> Int {
        return self.map { $0 as Int }.reduce(0) { $0 + $1 }
    }
}

Which can now be used to sum Int[] like:

[1,2,3].sum() //6 

But how can we make a generic version that supports summing other Number types like Double[] as well?

[1.1,2.1,3.1].sum() //fails 

This question is NOT how to sum numbers, but how to create a generic Array Extension to do it.


Getting Closer

This is the closest I've been able to get, if it helps anyone get closer to the solution:

You can create a protocol that fulfills what we need to do, i.e.:

protocol Addable {
    func +(lhs: Self, rhs: Self) -> Self
    init()
}

Then extend each of the types we want to support to conform to the above protocol:

extension Int : Addable { }
extension Double : Addable { }

And then add an extension with that constraint:

extension Array {
    func sum<T : Addable>(min:T) -> T {
        return self.map { $0 as T }.reduce(min) { $0 + $1 }
    }
}

Which can now be used against numbers that we've extended to support the protocol, i.e.:

[1,2,3].sum(0) //6
[1.1,2.1,3.1].sum(0.0) //6.3

Unfortunately I haven't been able to get it working without having to supply an argument, i.e.:

func sum<T : Addable>(x:T...) -> T? {
    return self.map { $0 as T }.reduce(T()) { $0 + $1 }
}

The modified method still works with 1 argument:

[1,2,3].sum(0) //6 

But the compiler is unable to resolve the method when it's called with no arguments, since there is then no argument or contextual type from which to infer T:

[1,2,3].sum() //Could not find member 'sum' 

Adding Integer to the method signature also doesn't help method resolution:

func sum<T where T : Integer, T: Addable>() -> T? {
    return self.map { $0 as T }.reduce(T()) { $0 + $1 }
}

But hopefully this will help others come closer to the solution.


Some Progress

From @GabrielePetronella's answer, it looks like we can call the above method if we explicitly specify the type at the call site, like:

let i:Int = [1,2,3].sum()
let d:Double = [1.1,2.2,3.3].sum()
— asked Jun 06 '14 by mythz

2 Answers

As of Swift 2 it's possible to do this using protocol extensions. (See The Swift Programming Language: Protocols for more information).

First of all, the Addable protocol:

protocol Addable: IntegerLiteralConvertible {
    func + (lhs: Self, rhs: Self) -> Self
}

extension Int   : Addable {}
extension Double: Addable {}
// ...

Next, extend SequenceType to add a sum property to sequences of Addable elements:

extension SequenceType where Generator.Element: Addable {
    var sum: Generator.Element {
        return reduce(0, combine: +)
    }
}

Usage:

let ints = [0, 1, 2, 3]
print(ints.sum) // Prints: "6"

let doubles = [0.0, 1.0, 2.0, 3.0]
print(doubles.sum) // Prints: "6.0"
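For readers on current toolchains: since Swift 5 the standard library ships AdditiveArithmetic, which plays the role of Addable out of the box, so the same technique needs no custom protocol at all. A minimal sketch of the modern equivalent (not part of the original answer; SequenceType and Generator have since been renamed Sequence and Iterator):

// Swift 5+ sketch: AdditiveArithmetic already provides `zero` and `+`
// for Int, Double, Float, etc., so no custom protocol is needed.
extension Sequence where Element: AdditiveArithmetic {
    var sum: Element {
        return reduce(.zero, +)
    }
}

print([1, 2, 3].sum)            // Prints: "6"
print([0.0, 1.0, 2.0, 3.0].sum) // Prints: "6.0"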
— answered by ABakerSmith


I think I found a reasonable way of doing it, borrowing some ideas from scalaz and starting from your proposed implementation. Basically what we want is to have typeclasses that represent monoids.

In other words, we need:

  • an associative function
  • an identity value (i.e. a zero)
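Concretely, for Int under addition the two requirements read as follows (a quick sanity check of my own, not part of the answer):

let (a, b, c) = (1, 2, 3)
assert((a + b) + c == a + (b + c)) // + is associative
assert(a + 0 == a && 0 + a == a)   // 0 is the identity value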

Here's a proposed solution, which works around the Swift type system's limitations.

First of all, our friendly Addable typeclass

protocol Addable {
    class func add(lhs: Self, _ rhs: Self) -> Self
    class func zero() -> Self
}

Now let's make Int implement it.

extension Int: Addable {
    static func add(lhs: Int, _ rhs: Int) -> Int {
        return lhs + rhs
    }

    static func zero() -> Int {
        return 0
    }
}
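The answer only shows Int, but the same conformance pattern should carry over to any numeric type; for example Double (my own sketch, not in the original answer):

// Hypothetical Double conformance, following the Int one above.
extension Double: Addable {
    static func add(lhs: Double, _ rhs: Double) -> Double {
        return lhs + rhs
    }

    static func zero() -> Double {
        return 0.0
    }
}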

So far so good. Now we have all the pieces we need to build a generic sum function:

extension Array {
    func sum<T : Addable>() -> T {
        return self.map { $0 as T }.reduce(T.zero()) { T.add($0, $1) }
    }
}

Let's test it

let result: Int = [1,2,3].sum() // 6, yay! 

Due to limitations of the type system, you need to explicitly annotate the result type, since the compiler is not able to figure out by itself that Addable resolves to Int.

So you cannot just do:

let result = [1,2,3].sum() 

I think it's a bearable drawback of this approach.
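As a side note, coercing the whole expression at the call site should also give the compiler the context it needs to fix T, though I haven't verified this against the 2014-era compiler:

let result = [1,2,3].sum() as Int // the coercion supplies the contextual type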

Of course, this is completely generic and it can be used on any class, for any kind of monoid. The reason why I'm not using the default + operator, but am instead defining an add function, is that this allows any type to implement the Addable typeclass. If you used +, then for a type which has no + operator defined you would need to implement that operator in the global scope, which I kind of dislike.

Anyway, here's how it would work if you needed, for instance, to make both Int and String 'multipliable', given that * is defined for Int but not for String.

protocol Multipliable {
    func *(lhs: Self, rhs: Self) -> Self
    class func m_zero() -> Self
}

func *(lhs: String, rhs: String) -> String {
    return rhs + lhs
}

extension String: Multipliable {
    static func m_zero() -> String {
        return ""
    }
}

extension Int: Multipliable {
    static func m_zero() -> Int {
        return 1
    }
}

extension Array {
    func mult<T: Multipliable>() -> T {
        return self.map { $0 as T }.reduce(T.m_zero()) { $0 * $1 }
    }
}

let y: String = ["hello", " ", "world"].mult()
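Tracing the fold by hand may help; with m_zero() == "" and lhs * rhs == rhs + lhs, the reduce proceeds as follows (my own walkthrough, not from the answer):

// ""       * "hello" == "hello" + ""       == "hello"
// "hello"  * " "     == " "     + "hello"  == " hello"
// " hello" * "world" == "world" + " hello" == "world hello"
print(y) // Prints: "world hello"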

Now arrays of String can use the mult method to perform a reverse concatenation (just a silly example), and the implementation uses the * operator newly defined for String, whereas Int keeps using its usual * operator and we only need to define a zero for the monoid.

For code cleanness, I much prefer having the whole typeclass implementation live in the extension scope, but I guess it's a matter of taste.

— answered by Gabriele Petronella