I have been working on a function for reducing fractions in Swift, and came across the Euclidean algorithm for finding the greatest common factor (http://en.wikipedia.org/wiki/Euclidean_algorithm).
I converted the pseudocode into Swift, but I am confused about how this gives me the greatest common factor when it returns a, which I thought was supposed to be the numerator of the fraction. Any help on this would be greatly appreciated. Thanks!
Pseudocode:
function gcd(a, b)
    while b ≠ 0
        t := b
        b := a mod b
        a := t
    return a
Swift:
var a = 2
var b = 4

func gcd(a: Int, b: Int) -> Int {
    var t = 0
    while b != 0 {
        t = b
        let b = a % b
        let a = t
    }
    return a
}

println("\(a)/\(b)")
Console output: 2/4
When you do this
let b = a % b
you are creating a new read-only constant b that shadows the parameter b from the enclosing scope; the parameter itself is never modified, so the while condition never changes and the loop cannot make progress. You need to remove both lets inside the loop and make the parameters mutable by declaring them with var, like this:
func gcd(var a: Int, var b: Int) -> Int {
    var t = 0
    while b != 0 {
        t = b
        b = a % b
        a = t
    }
    return a
}
You can call your function like this:
let a = 111
let b = 259
println("a=\(a), b=\(b), gcd=\(gcd(a,b))")
This prints a=111, b=259, gcd=37
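To tie this back to the fraction-reducing goal from the question, here is a minimal sketch (the names numerator, denominator, and divisor are just illustrative) that uses the fixed gcd to reduce 2/4:
let numerator = 2
let denominator = 4
let divisor = gcd(numerator, denominator)                    // 2
println("\(numerator / divisor)/\(denominator / divisor)")   // prints 1/2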
Taking @dasblinkenlight's answer and getting rid of t by using tuples for parallel assignment yields:
Swift 2.1:
func gcd(var a: Int, var _ b: Int) -> Int {
    while b != 0 {
        (a, b) = (b, a % b)
    }
    return a
}
gcd(1001, 39) // 13
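Incidentally, the tuple assignment evaluates the whole right-hand side before assigning, which is why the temporary t is no longer needed; a quick illustration with arbitrary values:
var x = 10
var y = 4
(x, y) = (y, x % y)   // right side is (4, 2), so afterwards x == 4 and y == 2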
var parameters are deprecated in Swift 2.2 and will be removed in Swift 3, so it becomes necessary to declare mutable copies of a and b within the function:
func gcd(a: Int, _ b: Int) -> Int {
    var (a, b) = (a, b)
    while b != 0 {
        (a, b) = (b, a % b)
    }
    return a
}
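For completeness, an equivalent recursive formulation (not from either answer, just another common way to write it) works in the same Swift 2.2 style, using the identities gcd(a, 0) = a and gcd(a, b) = gcd(b, a mod b); the name gcdRecursive is only to avoid clashing with the version above:
func gcdRecursive(a: Int, _ b: Int) -> Int {
    // Base case: gcd(a, 0) = a; otherwise recurse on (b, a % b).
    return b == 0 ? a : gcdRecursive(b, a % b)
}

gcdRecursive(1001, 39) // 13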