This is how I plan to build the utilities for a project:
logdump dumps log results to a file, log. If the file already exists, new results are appended to it (for example, if a new file is created every month, all results for that month accumulate in the same file).
extract reads the log result file and pulls out the relevant results, depending on the arguments it is given.
The catch is that I do not want to wait for logdump to finish writing to log before I begin processing it. I also do not want to have to remember how far into log I have already read in order to resume extracting from there.
I need live results: whenever something is appended to the log results file, extract should pick up the new data.
The processing that extract does will be generic (driven by its command line arguments), but it will certainly work on a line-by-line basis.
In short, this means reading a file while it is still being written to, and continuing to monitor it for new data even after reaching the end of the file.
How can I do this using C or C++ or shell scripting or Perl?
tail -f will read from a file and, when it reaches EOF, keep monitoring it for updates instead of quitting outright. It's an easy way to read a log file "live". It could be as simple as:

tail -f log.file | extract

Or use tail -n 0 -f so that only new lines are printed, not the existing ones; or tail -n +1 -f to display the entire file first and then continue updating thereafter.
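As a quick sketch of how such a pipeline behaves, the following assumes a system with GNU coreutils (for timeout); the file name log.file and the grep ERROR filter are hypothetical stand-ins for the real logdump output and extract logic:

```shell
#!/bin/sh
# Hypothetical demo: follow a growing log with tail -f and filter it live.
# "log.file" and "grep ERROR" stand in for logdump's output and extract.

: > log.file                                  # start with an empty log

# Simulate logdump: append lines shortly after the reader has attached.
( sleep 1; printf 'INFO start\nERROR disk full\n' >> log.file ) &

# Follow the file from its current end (-n 0) and keep only ERROR lines.
# timeout ends the demo after 3 seconds; a real pipeline would run forever.
timeout 3 tail -n 0 -f log.file | grep ERROR > extracted.txt

cat extracted.txt                             # prints: ERROR disk full
```

Because tail keeps the pipe open, the downstream filter sees each line as soon as it is appended, which is exactly the "live" behavior asked for.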