Libraries and ecosystem

If Haskell is a niche language, then OCaml is a super-niche language. The OCaml community is much smaller. Where Haskell is doing more or less fine with libraries, OCaml has significantly less to offer. There are some nice libraries in OCaml, but in many areas the situation is far from ideal.
According to The Great Computer Language Shootout (see also the newer Computer Language Benchmarks Game), OCaml is the second-fastest language in that comparison: slower than C, but faster than C++.
That said, OCaml is not well suited to applications where performance must be very predictable, such as embedded systems. C is a language with a standard and many compilers; OCaml is a software artifact: the only compiler comes from a single source, and that compiler is the standard.
OCaml is more of a multi-paradigm language than Haskell. It's not purely functional, so you can easily fall back on imperative code when you need to: mutable variables, I/O, mutable arrays, hash tables, for loops, while loops, and so on.
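For instance, here is a minimal sketch of that imperative fallback using only the OCaml standard library (all names and values here are illustrative):

(* Sketch of OCaml's imperative escape hatches: references, arrays,
   hash tables, and loops. Everything here is illustrative. *)
let () =
  (* mutable variable via a reference cell *)
  let total = ref 0 in
  for i = 1 to 10 do
    total := !total + i
  done;
  Printf.printf "sum 1..10 = %d\n" !total;
  (* mutable array, updated in place *)
  let squares = Array.make 5 0 in
  Array.iteri (fun i _ -> squares.(i) <- i * i) squares;
  (* hash table from the standard library *)
  let tbl = Hashtbl.create 16 in
  Hashtbl.replace tbl "ocaml" 1996;
  Hashtbl.replace tbl "haskell" 1990;
  (* while loop with side effects *)
  let i = ref 0 in
  while !i < Array.length squares do
    Printf.printf "squares.(%d) = %d\n" !i squares.(!i);
    incr i
  done;
  Hashtbl.iter (fun k v -> Printf.printf "%s first appeared in %d\n" k v) tbl

None of this requires monads or any special ceremony; it is ordinary OCaml mixed freely with the functional parts of the language.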
By all accounts, both OCaml and Haskell have compilers and runtimes performant enough for almost anything. Picking between them on the basis of raw performance seems silly to me. You've already come this far: you moved away from the obviously lowest-level and most performant languages (C, C++, etc.) in the name of clearer, more succinct, more expressive, higher-level code. So why, now that the performance differences involved are much smaller, switch to that criterion?
I'd go with some broader criteria -- if you want pervasive parallelism, then Haskell's the better choice. If you want genuinely pervasive mutation, then OCaml is better.
If you want only very coarse parallelism at best, and you intend to stick with mostly functional structures, then pick based on something else, like syntax (I think Haskell is much nicer here, but that's subjective) or available libraries (Haskell wins on quantity/availability, but OCaml might edge it out in the graphics department nonetheless).
I don't think you'll go wrong either way.
With help from two very smart colleagues, I've written a dataflow-optimization library in both Objective Caml and Haskell. The Haskell version is a bit more polymorphic, has more compile-time type checking, and therefore has less run-time checking. The OCaml version uses mutable state to accumulate dataflow facts, which might be faster or slower this week, depending on the phase of the moon. The key fact is that in their intended applications, both libraries are so fast that they are not worth fooling with. That is, in the respective compilers (Quick C-- and GHC), so little time is spent in dataflow optimization that the code is not worth improving.
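To make "mutable state to accumulate dataflow facts" concrete, here is a loose OCaml sketch of the general idea, not their actual library: per-node fact sets kept in a mutable hash table and propagated to successors until a fixed point is reached. The node type, successor function, transfer function, and integer-set facts are all illustrative assumptions.

(* Loose sketch: dataflow facts accumulated in a mutable Hashtbl.
   Assumes every node returned by [succs] also appears in [nodes]. *)
module IntSet = Set.Make (Int)

let fixed_point ~succs ~transfer nodes =
  let facts = Hashtbl.create 16 in
  List.iter (fun n -> Hashtbl.replace facts n IntSet.empty) nodes;
  let changed = ref true in
  while !changed do
    changed := false;
    List.iter
      (fun n ->
        (* compute the facts this node produces from its current input *)
        let out_fact = transfer n (Hashtbl.find facts n) in
        List.iter
          (fun s ->
            let old = Hashtbl.find facts s in
            let merged = IntSet.union old out_fact in
            if not (IntSet.equal merged old) then begin
              Hashtbl.replace facts s merged;
              changed := true
            end)
          (succs n))
      nodes
  done;
  facts

With succs describing the control-flow graph and transfer producing the facts a node generates, the loop terminates because the fact sets only ever grow.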
Benchmarking is hell.
I've written numerous realtime terrain rendering engines, so this is a familiar topic.
Familiar enough to know where most time will be spent?
If so, then maybe you can write code for just that part in different languages and compare.
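For example, here is a minimal OCaml sketch of timing just one candidate hot path in isolation with the standard library; the inner loop is only a stand-in for whatever actually dominates your profile.

(* Minimal sketch: time a single candidate hot path in isolation.
   [work] and the iteration count are illustrative placeholders. *)
let time_it label f =
  let t0 = Sys.time () in
  let result = f () in
  Printf.printf "%s: %.3fs\n" label (Sys.time () -. t0);
  result

let () =
  let work () =
    let acc = ref 0.0 in
    for i = 1 to 10_000_000 do
      acc := !acc +. sqrt (float_of_int i)
    done;
    !acc
  in
  ignore (time_it "inner loop" work)

Writing the equivalent kernel in the other language and timing it the same way gives you a comparison that actually reflects your workload, rather than someone else's benchmark.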
But according to The Computer Language Benchmarks Game, Haskell often beats OCaml, if only by rather small margins. The problem remains that such a benchmark only takes very specific samples.
The benchmarks game reports four sets of results (one-core and quad-core, 32-bit or 64-bit Ubuntu), and you may find that the OCaml or Haskell benchmark programs perform better or worse depending on the platform.
All a benchmark can do is take very specific samples, and of course you should disregard comparisons on tasks unlike where most of the time will be spent in your application (large-integer arithmetic? regexes? strings?) and look at the comparisons closest to what you intend to do.
Based on all the data I've seen, they are roughly comparable. The code you write will make a bigger difference than the language itself.