I have a simple task here: break a Set of n elements into m Sets based on a batch size - typically I'll want to limit my sub-Sets to 1,000 elements. I wrote something like this, where input is the master, large collection:
var strings = Set[String]() ++ input
var sets = List[Set[String]]()
while (!strings.isEmpty) {
  // take the next batch off the front, keep the remainder for the next pass
  val (head, rest) = strings.splitAt(100)
  sets = sets :+ head
  strings = rest
}
which works fine, but I am thinking there HAS to be a more elegant/functional solution to such a simple and common problem in Scala. Someone please enlighten me.
And it does exist: .grouped(batchSize). Example:
scala> List.range(1, 10).toSet.grouped(3).toList
res0: List[scala.collection.immutable.Set[Int]] = List(Set(5, 1, 6), Set(9, 2, 7), Set(3, 8, 4))
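Applied to the original problem (assuming input is your master Set[String] and the batch size of 1,000 mentioned in the question), this collapses to a one-liner. Note that grouped returns an Iterator, so call toList if you actually need a List of Sets:

val sets: List[Set[String]] = input.grouped(1000).toList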
Just call Set(1,2,3).grouped(1).toList
scala> Set(1,2,3).grouped(1).toList
res1: List[scala.collection.immutable.Set[Int]] = List(Set(1), Set(2), Set(3))
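If you still wanted to roll it by hand in a functional style, a tail-recursive version of the original splitAt loop would look roughly like the sketch below. The batch and loop names are just illustrative; grouped already does exactly this for you.

import scala.annotation.tailrec

// Hand-rolled equivalent of grouped, shown only for comparison.
def batch[A](input: Set[A], size: Int): List[Set[A]] = {
  @tailrec
  def loop(remaining: Set[A], acc: List[Set[A]]): List[Set[A]] =
    if (remaining.isEmpty) acc.reverse
    else {
      // peel off the next batch and recurse on the rest
      val (head, rest) = remaining.splitAt(size)
      loop(rest, head :: acc)
    }
  loop(input, Nil)
}

For example, batch(Set("a", "b", "c", "d", "e"), 2) returns three sub-Sets (their contents depend on the Set's hash iteration order, just as with grouped).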