I'm trying to write a Ruby script to filter the output of a tailed file (tail -f log.log | ./my_filter.rb). I believe I've set stdin and stdout to be read and written synchronously, but my output still comes out in delayed batches of 20 or so lines at a time, rather than in real time.
I can reproduce the problem with code as simple as:
#!/usr/bin/ruby
$stdout.sync = true
$stdin.sync = true
ARGF.each do |line|
puts line
end
Am I missing a setting to eliminate buffering, or something along those lines?
Edit: To clarify, if I just run tail -f on the log directly, I see many lines written per second.
If you're writing to files, you probably want IO#fsync, whose documentation says:
Immediately writes all buffered data in ios to disk. Note that fsync differs from using IO#sync=. The latter ensures that data is flushed from Ruby’s buffers, but does not guarantee that the underlying operating system actually writes it to disk.
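As a minimal sketch of the file case (the filename my.log is just an example for illustration), you can call fsync after each write to push the line through both Ruby's buffer and the OS cache:

```ruby
# Append a line to a log file and force it to disk immediately.
File.open("my.log", "a") do |f|
  f.puts "a line we want on disk right away"
  f.fsync # flushes Ruby's buffer AND asks the OS to write to disk
end
```

Note that fsync is comparatively expensive, so calling it per line trades throughput for durability.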
If you're just dealing with standard input and output, you might also try requiring io/console to see if IO#ioflush gives you the behavior you need. The documentation says:
Flushes input and output buffers in kernel. You must require ‘io/console’ to use this method.
As an example, consider:
require 'io/console'
ARGF.each do |line|
$stdout.puts line
$stdout.ioflush
end