I think that Haskell is a beautiful language, and judging by the benchmarks, its implementations can generate fast code.
However, I am wondering whether it is appropriate for long-running applications, or whether chasing down all the potential laziness-induced leaks that one might ignore in a short-lived application would prove frustrating.
This Reddit comment echoes my concerns:
As soon as you have more than one function calling itself recursively, the heap profile ceases to give you any help pinpointing where the leak is occurring.
(That whole discussion seems insightful and frank.)
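For concreteness, here is a minimal sketch (my own example, not one from that discussion) of the kind of laziness-induced leak I mean: a lazy left fold quietly accumulates unevaluated thunks, while its strict counterpart runs in constant space.

    import Data.List (foldl')

    -- The lazy fold builds a chain of unevaluated thunks
    -- (((0 + 1) + 2) + ...), so forcing the result needs O(n) heap.
    leaky :: Int
    leaky = foldl (+) 0 [1 .. 10000000]

    -- foldl' forces the accumulator at every step and runs in
    -- constant space.
    fine :: Int
    fine = foldl' (+) 0 [1 .. 10000000]

    main :: IO ()
    main = print fine  -- swap in 'leaky' to watch the heap grow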
I am personally interested in high-performance computing, but I guess servers and HPC have this requirement in common.
If Haskell is appropriate for such applications, are there any examples proving this point, that is, applications that run continuously for weeks or months without being restarted?
If Haskell is not appropriate for such applications, then why?
Update: The Yesod web server framework for Haskell, which was put forth as an example, may have memory issues. I wonder whether anyone has tested its memory usage after it has served requests continuously for days.
"Space leaks" are semantically identical to any other kind of resource use problem in any language. In strict languages the GC tends to allocate and retain too much data (as structures are strict).
No matter the language, you should do some "burn-in" testing to look for resource usage over time, and Haskell is no different.
See, e.g., xmonad, which runs for months or years at a time. It's a Haskell app with tiny heap usage, and I tested it by running it for weeks or months with profiling on to analyze heap patterns. This gives me confidence that its resource use is stable.
Ultimately, though, laziness is a red herring here. Use resource-monitoring tools and testing to measure and validate your resource expectations.
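As a rough sketch of what such monitoring can look like from inside the process, GHC's runtime exposes its own statistics through GHC.Stats (the program must be run with +RTS -T for them to be populated; the one-minute sampling interval and logging to stdout are arbitrary choices for illustration):

    import Control.Concurrent (threadDelay)
    import Control.Monad (forever)
    import GHC.Stats (GCDetails (..), RTSStats (..), getRTSStats,
                      getRTSStatsEnabled)

    -- Periodically log the live heap after the most recent GC.
    -- A real server would feed this to a metrics system instead
    -- of stdout.
    main :: IO ()
    main = do
      enabled <- getRTSStatsEnabled
      if not enabled
        then putStrLn "Run with +RTS -T to enable RTS statistics."
        else forever $ do
          stats <- getRTSStats
          putStrLn $ "live bytes after last GC: "
                  ++ show (gcdetails_live_bytes (gc stats))
          threadDelay 60000000  -- sample once per minute

A flat trend over days of such samples is exactly the kind of evidence of stable resource use described above.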