Playing video while buffering: relating buffered bytes to buffered duration

I am working on an application that requires streaming a video from one computer (let's call it the video computer) to another (the user computer). The streaming model is such that the video file's bytes are sent from the video computer to the user computer as-is, and decoding is done at the user's end.

The bytes received at the user end are stored in a System.IO.FileStream object. The length of the FileStream (in bytes) is set at the start of buffering, because there is provision for sending metadata about the video file at the beginning.

As buffering starts, the source of a System.Windows.Controls.MediaElement object is set to the FileStream object.

All goes well as long as the user has no desire to seek and the buffering rate stays higher than the playback rate. However, one cannot rely on luck. I need a mechanism to check whether the duration of video buffered is less than the current play time, so that the video can be paused (this can happen when the user seeks to a point far ahead, or when the buffering rate is slow). Corrective measures should then be taken, and playback should resume only once a minimum duration has been buffered.

Thus I need a mechanism to determine the buffered duration in seconds (i.e. find the position of the buffer pointer in seconds) given the position of the buffer pointer in bytes, OR to determine the number of bytes that have been played (i.e. have passed behind the play pointer) given the current play pointer position in seconds.

At any instant, the following quantities are known:

  • position of buffer pointer in bytes
  • position of play pointer in seconds
  • duration of video
  • length of video in bytes

It is possible to pause/play the MediaElement or seek it to a position in seconds.
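
To illustrate, this is the control flow I have in mind (a C# sketch; BufferedSeconds() is exactly the mapping I am asking how to compute, and the 15-second threshold is arbitrary):

```csharp
// Sketch of the desired pause/resume policy.
// ASSUMPTIONS: mediaElement is the MediaElement described above;
// BufferedSeconds() is the unknown bytes-to-seconds mapping this
// question asks about; MinBufferSeconds is an arbitrary threshold.
const double MinBufferSeconds = 15;
bool paused;

void OnBufferProgress(long bufferedBytes)
{
    double buffered = BufferedSeconds(bufferedBytes);     // seconds buffered so far
    double playPos = mediaElement.Position.TotalSeconds;  // current play pointer

    if (!paused && buffered <= playPos)
    {
        mediaElement.Pause();   // playback has caught up with the buffer
        paused = true;
    }
    else if (paused && buffered - playPos >= MinBufferSeconds)
    {
        mediaElement.Play();    // enough duration buffered again
        paused = false;
    }
}

// Hypothetical: this is the mapping I am asking how to compute.
double BufferedSeconds(long bufferedBytes) => throw new NotImplementedException();
```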

Any help will be appreciated.

[Note that one cannot simply say bufferPositionInSeconds = bufferPositionInBits / videoBitRate, because in practice the bitrate is variable for most videos, and also because of the metadata present in the file.]

asked Nov 10 '13 by CtrlAllDelete

1 Answer

I have a solution for you.

You simply need to build a calibration table for each video you want to transfer.

The idea is rather simple. Let's say I have a video file named video1.mpg, and let's say the length of the file is exactly 1 MB (1,048,576 bytes).

On the serving side, that is the computer which needs to transfer the video, I will use the MediaElement locally to play the video, and every 5 seconds I'll add a record to a table containing:
{position of buffer pointer in bytes | position of play pointer in seconds}

When done, I will save the table as a simple text file, a binary file, or XML, whatever makes you feel good.
(This process only needs to be done once per video!)
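
A minimal C# sketch of such a recorder. Note that obtaining consumedBytes, the number of file bytes needed up to the current play position, is an assumption here; MediaElement does not expose it directly, so it has to come from however you feed the file:

```csharp
// Sketch of the one-time calibration pass on the serving side.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public class CalibrationRecorder
{
    private readonly List<(long Bytes, double Seconds)> entries =
        new List<(long Bytes, double Seconds)>();

    // Call from a timer (e.g. a DispatcherTimer ticking every 5 seconds)
    // while the MediaElement plays the file locally.
    public void AddEntry(long consumedBytes, TimeSpan playPosition)
    {
        entries.Add((consumedBytes, playPosition.TotalSeconds));
    }

    // Persist in the simple "bytes seconds" text format shown below.
    public void Save(string path)
    {
        File.WriteAllLines(path, entries.Select(e => $"{e.Bytes} {e.Seconds}"));
    }
}
```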

An example of a computed calibration table could look like this:

video1.mpg
bytes   | seconds
150     | 5
350     | 10
800     | 15
900     | 20
...     | ...
923544  | 445
1000000 | 450
1048500 | 455

Based on that table you can build any mechanism that allows translating seconds of video into bytes of the file to serve, and vice versa.
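
For example, a minimal C# sketch of such a lookup, interpolating linearly between the two surrounding calibration entries (the table type matches the recorder sketch above):

```csharp
// Translate a byte offset into buffered seconds by linear interpolation
// over the calibration table (entries sorted by byte offset).
public static double BytesToSeconds(
    IReadOnlyList<(long Bytes, double Seconds)> table, long bytePos)
{
    if (table.Count == 0) return 0;
    if (bytePos <= table[0].Bytes) return table[0].Seconds;

    for (int i = 1; i < table.Count; i++)
    {
        if (bytePos <= table[i].Bytes)
        {
            var (b0, s0) = table[i - 1];
            var (b1, s1) = table[i];
            double t = (double)(bytePos - b0) / (b1 - b0);  // fraction of interval
            return s0 + t * (s1 - s0);
        }
    }
    return table[table.Count - 1].Seconds;  // past the last calibration point
}
```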

In your specific issue you want to know whether enough has been buffered to start playing the video.
This can be done by sending the table file to the client before starting to send the video, so the client can decide whether to start playing; in other words, whether at least the next 15 seconds have already been buffered. A quick client-side check is sketched below.
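
For example, assuming the table has been loaded on the client (BytesToSeconds is the lookup sketched above; bytesReceived and media are assumed to be the client's buffer counter and MediaElement):

```csharp
// Client-side check, using the table that was sent ahead of the video.
double bufferedSeconds = BytesToSeconds(table, bytesReceived);
bool enoughBuffered = bufferedSeconds - media.Position.TotalSeconds >= 15;
```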

Another option is to keep the table on the serving side: when the server recognizes that it has served X bytes, it can use another TCP channel to notify the client that it may play the movie up to Z seconds.

The only thing still arbitrary is that you need to decide how much time (or bytes accumulated, translated to time via the calibration table) is good enough to allow the player to begin playing. Oh, and of course, whether you want a calibration resolution of 5 seconds or maybe 30 seconds is up to you to decide.

answered Oct 26 '22 by G.Y