Motivation: I am currently trying to synchronize two videos on two networked raspis. I tried live streaming from a desktop (HTTP and UDP), but each raspi still opened the stream with a noticeable delay. I next tried installing VLC on each raspi and synchronizing with the desktop VLC, but that did not work either. I tried using a shell script to launch omxplayer at nearly the same time on both raspis, and that failed too. Finally, I used a C program to launch the two omxplayers at nearly identical times, which also failed. Ultimately, I don't think it is possible to control when omxplayer actually starts playing the video file.
Current Progress:
Therefore, I am now modifying omxplayer's code to synchronize the two omxplayer instances using sockets, but I want to know what approach something like VLC takes when synchronizing its video clients, so as to not reinvent the wheel. I could be wrong, but I noticed, by looking at the verbose output and debug statements, that one player would lose time with respect to the other, so that as the video played this difference would build up, exceeding 200 ms after 2-3 minutes. I find this extremely disturbing: at roughly 200 ms per 2 minutes, 2 hours of footage contains about 60 such intervals, so the difference would grow to 60 * 200 ms = 12000 ms, or around 12 s. I thought the precision of modern-day computing would be more like that of an atomic clock, losing maybe 1 s after 1000 hours of footage, which is why I thought it would be sufficient to synchronize the feeds only once.
Question: If the different players have to be synchronized constantly, how does something like VLC do it?
NOTE: I am not streaming the actual video files as they are all accessed remotely via NFS on each of the raspis.
Sorry, I am not directly answering your questions; instead, this is how I would do it:
1. synchronize time between the server and the clients, so they all share one time base (this is the time synchronization referenced below)
2. I would use MCI to control the players (I am Windows friendly, but I think all other players must have something similar)
3. start play on each client at a precise server time (a minimal sketch of this follows the list)
4. do not expect exact synchronism (without precise time synchronization). Also, the play command can execute at different speeds on different machines, but with far less variation than the open stream command (hence the delay in bullet #3)
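For bullet #3, here is a minimal client-side sketch in C of waiting for an agreed start time. It assumes the clocks are already synchronized (bullet #1, e.g. via NTP), that the server has already delivered the start time as a UNIX timestamp in milliseconds, and that start_playback() is a hypothetical stand-in for whatever actually launches the player:

```c
/*
 * Client-side sketch: wait until an agreed wall-clock start time, then
 * begin playback.  Assumes the machines' clocks are already synchronized
 * (e.g. via NTP) and that start_time_ms was received from the server
 * beforehand.  start_playback() is a hypothetical hook.
 */
#include <stdint.h>
#include <stdio.h>
#include <time.h>

static int64_t now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    return (int64_t)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
}

static void start_playback(void)            /* hypothetical: launch the player */
{
    puts("playback started");
}

static void wait_and_start(int64_t start_time_ms)
{
    int64_t remaining = start_time_ms - now_ms();
    if (remaining > 10) {                   /* coarse sleep for most of the wait */
        struct timespec ts = { (time_t)((remaining - 10) / 1000),
                               (long)(((remaining - 10) % 1000) * 1000000) };
        nanosleep(&ts, NULL);
    }
    while (now_ms() < start_time_ms)        /* spin for the last few milliseconds */
        ;
    start_playback();
}

int main(void)
{
    /* for demonstration: pretend the server told us to start 2 s from now */
    wait_and_start(now_ms() + 2000);
    return 0;
}
```

Even with this, the two machines only agree on when the play command is issued; how long each player then takes to decode and render its first frame is exactly the part you found you cannot control, which is why the corrections described below are still needed.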
The problem with this approach is that it assumes playback is synchronized with time. This is often not true, especially with network streaming. Most players drop frames to compensate, but sometimes, if the stream is not decoded for a longer time, it can cause a cumulative offset. In that case you can implement playback progress reporting (your tics); a sketch of exchanging such tics over a socket follows.
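To make the tic idea concrete, here is a minimal sketch in C of the receiving side of such an exchange. It assumes the master periodically sends its playback position in milliseconds as a raw int64_t over UDP (same endianness on both Pis); the port number and get_position_ms() are illustrative assumptions, not anything omxplayer or VLC actually provides:

```c
/*
 * Sketch of the "tics" exchange, slave side: the master periodically sends
 * its playback position in milliseconds as a raw int64_t over UDP, and this
 * side compares it with its own position.  get_position_ms() is a
 * hypothetical hook into the player you control; port 5005 and the raw
 * wire format are arbitrary assumptions.
 */
#include <arpa/inet.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

static int64_t get_position_ms(void)        /* hypothetical: query the local player */
{
    return 0;                               /* replace with the real player query */
}

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(5005);     /* arbitrary port */
    bind(sock, (struct sockaddr *)&addr, sizeof addr);

    for (;;) {
        int64_t master_pos;
        if (recv(sock, &master_pos, sizeof master_pos, 0) != (ssize_t)sizeof master_pos)
            continue;                       /* ignore malformed packets */

        /* positive offset: this player is ahead of the master */
        int64_t offset = get_position_ms() - master_pos;
        printf("offset vs master: %lld ms\n", (long long)offset);
    }

    close(sock);                            /* not reached in this sketch */
    return 0;
}
```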
Tics can be the playback frame number, the playback progress time, or the playback progress percentage.
Tics synchronization:
In all cases you must implement the time synchronization from bullet #1 (or some other scheme). There are three basic approaches:
1. frame synchronization is the best, but it requires implementing your own player, or using a player capable of frame navigation, which is very hard to implement correctly.
2. playback progress time is the next best thing. If you notice an offset bigger than some threshold, then either pause or rewind back/forward. The problem with rewind is that it is unpredictable how much time it will take, so measure the time it takes and iterate the rewind in a few steps, applying that time to match the synchronized playback time (it's a bit tricky). A minimal sketch of this threshold-based correction follows the list.
3. playback progress percentage is almost the same as playback progress time, but the resolution is far worse. It is applicable only to very big time offsets, so it is not suitable for the synchronization itself, only for problem detection. If a problem is detected, stop all clients and restart at a new exact server time, and/or rewind plus delay before starting playback again. This sucks, I know, but not all players support playback frame/time announcements.
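As a rough illustration of approach #2, the sketch below applies a correction only when the measured offset exceeds a threshold, by pausing the player that is ahead or seeking the one that is behind. pause_playback(), resume_playback() and seek_relative_ms() are hypothetical hooks into whatever player you control, and the 100 ms threshold is an arbitrary choice:

```c
/*
 * Sketch of approach #2: correct the offset only when it exceeds a
 * threshold, by pausing the player that is ahead or seeking the one that
 * is behind.  pause_playback(), resume_playback() and seek_relative_ms()
 * are hypothetical player hooks; the 100 ms threshold is arbitrary.
 */
#include <stdint.h>
#include <time.h>

#define SYNC_THRESHOLD_MS 100                /* tolerate smaller offsets */

static void pause_playback(void)             { /* hypothetical player hook */ }
static void resume_playback(void)            { /* hypothetical player hook */ }
static void seek_relative_ms(int64_t ms)     { (void)ms; /* hypothetical player hook */ }

static void sleep_ms(int64_t ms)
{
    struct timespec ts = { (time_t)(ms / 1000), (long)((ms % 1000) * 1000000) };
    nanosleep(&ts, NULL);
}

/* offset_ms > 0: this player is ahead of the master; < 0: it is behind */
void correct_offset(int64_t offset_ms)
{
    int64_t magnitude = offset_ms < 0 ? -offset_ms : offset_ms;
    if (magnitude < SYNC_THRESHOLD_MS)
        return;                              /* small drift: leave playback alone */

    if (offset_ms > 0) {
        /* ahead of the master: pause for roughly the offset so it catches up */
        pause_playback();
        sleep_ms(offset_ms);
        resume_playback();
    } else {
        /* behind the master: jump forward; a seek itself takes an
         * unpredictable amount of time, so re-measure and iterate */
        seek_relative_ms(magnitude);
    }
}
```

Pausing the side that is ahead is usually the safer correction, because a seek takes an unpredictable amount of time (as noted in point #2); if you do seek, re-measure the offset afterwards and iterate.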
Hope it helps a little.