I'd like to monitor the average rate at which lines are being added to a log file in a bash shell.
I can currently monitor how many lines are in the file each second via the command
watch -n 1 'wc -l log.txt'
However, this gives me the total line count, whereas I would prefer a rate. In other words, I would like a command that periodically outputs the number of lines added to the file since the command was started, divided by the number of seconds the command has been running.
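Conceptually, something like this rough polling sketch is what I have in mind (only an illustration of the desired behaviour; it assumes log.txt already exists and rounds down via integer division):
# Poll wc -l and report the average number of lines added per second
# since this script started.
start_lines=$(wc -l < log.txt)
start_time=$(date +%s)
while sleep 1; do
    now_lines=$(wc -l < log.txt)
    elapsed=$(( $(date +%s) - start_time ))
    if (( elapsed > 0 )); then
        echo "$(( (now_lines - start_lines) / elapsed )) lines per second on average"
    fi
done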
For a rough count of lines per second, try:
tail -f log.txt | { count=0; old=$(date +%s); while read line; do ((count++)); s=$(date +%s); if [ "$s" -ne "$old" ]; then echo "$count lines per second"; count=0; old=$s; fi; done; }
(Bash required.)
Or, spread out over multiple lines:
tail -f log.txt | {
    count=0
    old=$(date +%s)
    while read line
    do
        ((count++))
        s=$(date +%s)
        if [ "$s" -ne "$old" ]
        then
            echo "$count lines per second"
            count=0
            old=$s
        fi
    done
}
This uses date to record the current time in seconds. Meanwhile, it counts the lines produced by tail -f log.txt. Every time another second passes, the count of lines seen during that second is printed.
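As an aside, a minor variant (just a sketch, not part of the original answer) uses bash's built-in SECONDS counter instead of calling date for every line read, which avoids forking an external process per line:
tail -f log.txt | {
    # SECONDS is a bash builtin that counts seconds since the shell started.
    last=$SECONDS
    count=0
    while IFS= read -r line; do
        ((count++))
        if (( SECONDS != last )); then
            echo "$count lines per second"
            count=0
            last=$SECONDS
        fi
    done
}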
To test this, run the following command in one terminal:
while sleep 0.1; do echo $((count++)); done >>log.txt
This command appends one line to the file log.txt roughly every tenth of a second.
In another terminal, run:
$ tail -f log.txt | { count=0; old=$(date +%s); while read line; do ((count++)); s=$(date +%s); if [ "$s" -ne "$old" ]; then echo "$count lines per second"; count=0; old=$s; fi; done; }
15 lines per second
10 lines per second
10 lines per second
10 lines per second
9 lines per second
10 lines per second
The first count is off (tail -f starts by printing the last few lines already in the file). Subsequent counts are fairly accurate.
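If you really want the running average since the command started, as the question asks, the same approach can be adapted. Here is a rough sketch (it starts tail with -n 0 so that lines already in the file are not counted, and it uses integer division):
tail -n 0 -f log.txt | {
    # -n 0 makes tail start at the end of the file, so pre-existing lines
    # are not included in the count.
    start=$(date +%s)
    last=$start
    total=0
    while IFS= read -r line; do
        ((total++))
        now=$(date +%s)
        if (( now != last )); then
            echo "average: $(( total / (now - start) )) lines per second"
            last=$now
        fi
    done
}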