I want to measure the time Haskell spends executing some function, using the TimeIt package (I also tried these recommendations). But the reported time differs from the actual time the application spent (I ran the application with the +RTS -sstderr option):
CPU time: 4.85s
...
  INIT    time    0.00s  (  0.00s elapsed)
  MUT     time    0.98s  ( 61.69s elapsed)
  GC      time    0.22s  (  0.19s elapsed)
  EXIT    time    0.00s  (  0.00s elapsed)
  Total   time    1.20s  ( 61.89s elapsed)
Application source:
import qualified Data.ByteString.Lazy.Char8 as LBS
import System.Environment
import Data.Char
import Data.Int
import System.TimeIt

-- Walk the string in fixed-size chunks, forcing each chunk's length.
readChunks :: Int64 -> LBS.ByteString -> Int64
readChunks size str
    | LBS.null str = 0
    | otherwise = let (chunk, rest) = LBS.splitAt size str
                      len = LBS.length chunk
                  in len `seq` len + readChunks size rest

processFile :: String -> IO ()
processFile name = do
    putStrLn name
    content <- LBS.readFile name
    let (recNumStr, rest) = LBS.span (not . isControl) content
        recNum = LBS.readInt recNumStr
    case recNum of
        Nothing         -> putStrLn "can't parse"
        Just (value, _) -> print value
    let chunkSize = 100 * 1024 * 1024
    timeIt $ print (readChunks chunkSize rest)

-- main assumed: process each file named on the command line.
main :: IO ()
main = getArgs >>= mapM_ processFile
UPDATE: I've found that the Chronograph package shows the right execution time (information taken from this question).
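For comparison, here is a minimal sketch of measuring wall-clock time by hand with the standard Data.Time API. Note that timeIt reports CPU time (its output is labelled "CPU time:" above), which is why it disagrees with the elapsed figures from the RTS. timeItWall is a hypothetical helper name, not part of TimeIt or Chronograph:

import Data.Time.Clock (diffUTCTime, getCurrentTime)

-- Hypothetical helper: report wall-clock (not CPU) time of an IO action.
timeItWall :: String -> IO a -> IO a
timeItWall label action = do
    start  <- getCurrentTime
    result <- action
    end    <- getCurrentTime
    putStrLn $ label ++ ": " ++ show (diffUTCTime end start)
    return result

Usage would be, e.g., timeItWall "readChunks" $ print (readChunks chunkSize rest).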
Well, you are doing significant work that isn't being timed - it seems reasonable that this work makes up the difference, namely:
    putStrLn name
    content <- LBS.readFile name
    let (recNumStr, rest) = LBS.span (not . isControl) content
        recNum = LBS.readInt recNumStr
    case recNum of
        Nothing         -> putStrLn "can't parse"
        Just (value, _) -> print value
If you time that as well, then you'll probably find most of the difference. Also note that there are other operations that run before you even enter main (which is true even for C programs).
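For example, a sketch that moves timeIt outward so it wraps the whole body of processFile from the question:

processFile :: String -> IO ()
processFile name = timeIt $ do   -- time everything, not just readChunks
    putStrLn name
    content <- LBS.readFile name
    let (recNumStr, rest) = LBS.span (not . isControl) content
        recNum = LBS.readInt recNumStr
    case recNum of
        Nothing         -> putStrLn "can't parse"
        Just (value, _) -> print value
    let chunkSize = 100 * 1024 * 1024
    print (readChunks chunkSize rest)

Timing the whole body this way captures the readFile and parsing work as well, so the reported time should account for most of the gap.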