I have a bash script that I created to process videos within a folder and its subfolders:
find . -type f -name '*.mkv' | while read file;
do
ffmpeg -i $file ...
done
The problem: instead of the while loop waiting for ffmpeg to complete, it keeps iterating through the loop. The end result is that files don't get processed. I need a way to make the current while loop iteration wait until ffmpeg is finished before continuing to the next, or alternatively a way to queue these items.
Edit: The solution when iterating over a set of files is to pass the -nostdin param to ffmpeg. Hope this helps anyone else who runs into a similar issue.
Also, file --> $file was a copy/paste typo.
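For completeness, here is a minimal sketch of the corrected loop; the codec flags and output name are placeholders, not my actual command:
#!/bin/bash
# Process every .mkv under the current directory, one file at a time.
# -nostdin keeps ffmpeg from swallowing the file names still in the pipe.
find . -type f -name '*.mkv' | while IFS= read -r file; do
    ffmpeg -nostdin -i "$file" -c:v libx264 -c:a copy "${file%.mkv}.mp4"
done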
I realize I posted this a while ago, but I found the solution. Thanks for all of the responses. Passing the -nostdin param to ffmpeg will do the trick: it processes only the current file before moving on to the next one.
ffmpeg's -nostdin option keeps it from trying to read user input from stdin; without it, ffmpeg swallows the rest of the file list coming through the pipe, so those files never reach the loop.
ffmpeg -i <filename> ... -nostdin
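An alternative that should behave the same, if your ffmpeg build is old enough to lack -nostdin, is to redirect standard input from /dev/null (the output name and codec options below are just placeholders):
# Give ffmpeg an empty stdin so it cannot consume the file names
# still waiting in the pipe. Same effect as -nostdin.
ffmpeg -i "$file" -c copy "${file%.mkv}.remux.mkv" < /dev/null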
The best part about -nostdin is that you can still control verbosity in case an error shows up in the output:
ffmpeg -i <filename> ... -nostdin -loglevel panic
Or, if you would rather write the output to a report file, do it this way:
# Custom log name (optional). Helpful when multiple files are involved.
# FFREPORT takes a file= key; %H.%M.%S gives hour.minute.second.
# export FFREPORT="file=./${filename}-$(date +%H.%M.%S).log"
ffmpeg -i <filename> ... -nostdin -report
You can also use a combination of the two. Thanks @Barmar for the solution!
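Put together inside the loop, it might look roughly like this; the copy/encode options and output names are placeholders rather than my real command:
#!/bin/bash
find . -type f -name '*.mkv' | while IFS= read -r file; do
    # One report file per input (optional). FFREPORT takes a file= key.
    export FFREPORT="file=${file##*/}-$(date +%H.%M.%S).log"
    # Placeholder encode: copy streams into an .mp4 container.
    ffmpeg -nostdin -report -loglevel error -i "$file" -c copy "${file%.mkv}.mp4"
done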
I think this is as simple as you missing the $ before file.
find . -type f -name '*.mkv' | while read -r file;
do
ffmpeg -i "$file" ...
done
Is this good for you?
find . -type f -name '*.mkv' -exec ffmpeg -i {} \;
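If you need to derive an output name from each input with this approach, one common pattern is to wrap the command in a small sh -c snippet; the output name and -c copy here are only examples:
# Run one ffmpeg per file; the inner sh receives each path as "$1".
find . -type f -name '*.mkv' -exec sh -c '
    ffmpeg -nostdin -i "$1" -c copy "${1%.mkv}.mp4"
' _ {} \;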
I'm a vicious answer snatcher. I found one:
ffmpeg -i "$file" &
wait $!
Thanks to puchu, here: apply ffmpeg to many files
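For what it's worth, this works because in a non-interactive script (job control off) a backgrounded command gets its standard input from /dev/null, so ffmpeg can't eat the file list. Inside the original loop it would look roughly like this, with placeholder options:
find . -type f -name '*.mkv' | while IFS= read -r file; do
    # Background the job (its stdin becomes /dev/null), then wait for it.
    ffmpeg -i "$file" -c copy "${file%.mkv}.mp4" &
    wait $!
done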