I am currently doing some tests on F# maps vs C# dictionaries. I realize they are quite different implementation-wise, but they fill the same sort of role in their respective languages.
I have designed a simple test to check insertion times. Since the F# map is immutable, it has to create an entirely new map for each insertion, and I was wondering just how much of a hit that is.
Test is as follows:
//F#
open System.Diagnostics

module Test =
    let testMapInsert () =
        let sw = Stopwatch()
        let rec fillMap endIdx curr map =
            if curr = endIdx then
                map
            else
                fillMap endIdx (curr + 1) (map |> Map.add curr curr)
        sw.Start ()
        let q = fillMap 10000000 0 Map.empty
        sw.Stop ()
        printfn "%A" sw.ElapsedMilliseconds
//C#
using System.Collections.Generic;
using System.Diagnostics;

class Program
{
    static void Test(int x) {
        var d = new Dictionary<int, int>();
        for (int i = 0; i < x; i++) {
            d.Add(i, i);
        }
    }

    static void Main(string[] args) {
        Stopwatch sw = new Stopwatch();
        sw.Start();
        Test(10000000);
        sw.Stop();
        System.Console.WriteLine(sw.ElapsedMilliseconds);
        //FSHARP.Test.testMapInsert(); // F# function called from C#.
    }
}
Doing 10 million element insertions with this yields the following times measured in ms:
C#: 332
F#: 13605
I figured the C# dictionary would be a fair bit faster, but that is quite the difference.
Is there a way to speed up the F# map for this sort of use case? Or is this just the way it is, and the F# map trades off performance in these situations for thread safety?
As mentioned in the comments, the difference is not really about C# vs. F#, but about the distinction between an immutable tree-based map and a mutable hashtable-based dictionary.
Using #time, I get the following performance in F# Interactive:
#time

// Immutable tree-based F# map (~14 sec)
let mutable map = Map.empty
for i in 0 .. 10000000 do
    map <- Map.add i i map

// Mutable hashtable-based .NET dictionary (~0.3 sec)
let dict = System.Collections.Generic.Dictionary<_, _>()
for i in 0 .. 10000000 do
    dict.Add(i, i)
The interesting question is: can you make the immutable F# map faster? In principle, you can build a map faster if you know that you are working with an already sorted array. The F# map does not have any operation that would let you do this, but it could be added.
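For a rough comparison, the closest built-in bulk constructor is Map.ofArray; as far as I know it still inserts elements one at a time internally, so I would not expect it to be much faster than the loop above, but it is easy to time it yourself:

// Built-in bulk construction - as far as I know this still adds
// elements one by one, so it should be roughly as slow as the loop
let fromArray = Map.ofArray [| for i in 0 .. 10000000 -> i, i |]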
When I define my own Map type that shares the internal structure with the F# map:
type MapTree<'Key, 'Value when 'Key : comparison> =
    | MapEmpty
    | MapOne of 'Key * 'Value
    | MapNode of 'Key * 'Value * MapTree<'Key, 'Value> * MapTree<'Key, 'Value> * int
I can then define an ofSortedArray operation:
let height = function
    | MapEmpty -> 0
    | MapOne _ -> 1
    | MapNode(_, _, _, _, h) -> h

let rec ofSortedArray (data:_[]) i j =
    if i = j then MapOne(data.[i])
    elif i > j then MapEmpty
    else
        let m = i + (j - i) / 2
        let l, r = ofSortedArray data i (m - 1), ofSortedArray data (m + 1) j
        let k, v = data.[m]
        MapNode(k, v, l, r, 1 + (max (height l) (height r)))
This is still nowhere near as efficient as a mutable hashtable, but I get the following:
// Immutable tree-based F# map, using sorted array
let arr = [| for i in 0 .. 10000000 -> i, i |] // ~1 sec
let map = ofSortedArray arr 0 10000000 // ~3 sec
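To sanity-check that the resulting tree behaves like a normal map, here is a minimal lookup sketch over the same MapTree type; tryFind here is a helper name I am introducing purely for illustration, not part of the F# core library:

// Illustrative lookup over the custom MapTree (not part of the original code)
let rec tryFind key tree =
    match tree with
    | MapEmpty -> None
    | MapOne(k, v) -> if key = k then Some v else None
    | MapNode(k, v, l, r, _) ->
        let c = compare key k
        if c = 0 then Some v
        elif c < 0 then tryFind key l
        else tryFind key r

tryFind 42 map    // Some 42
tryFind (-1) map  // None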
If you actually wanted to use this, you would need your own version of the F# map - or you could send a pull request to the F# core libraries adding support for something like this!