 

FFmpeg skips rendering frames

Tags: c#, windows, ffmpeg

While extracting frames from a video I noticed that FFmpeg won't finish rendering certain images. The problem turned out to be the byte "padding" between two JPEG images: if my buffer size is 4096 and that buffer holds bytes from both the previous image and the next image, with no separating bytes between them, then the next image is not rendered properly. Why is that?

-i path -f image2pipe -c:v mjpeg -q:v 2 -vf fps=25 pipe:1

Rendered frame:

Code sample:

public void ExtractFrames()
{
    string FFmpegPath = "Path...";
    string Arguments = $"-i { VideoPath } -f image2pipe -c:v mjpeg -q:v 2 -vf fps=25/1 pipe:1";
    using (Process cmd = GetProcess(FFmpegPath, Arguments))
    {
        cmd.Start();
        FileStream fStream = cmd.StandardOutput.BaseStream as FileStream;

        bool Add = false;
        int i = 0, n = 0, BufferSize = 4096;
        byte[] buffer = new byte[BufferSize + 1];

        MemoryStream mStream = new MemoryStream();

        while (true)
        {
            if (i.Equals(BufferSize))
            {
                i = 0;
                buffer[0] = buffer[BufferSize];
                if (fStream.Read(buffer, 1, BufferSize) == 0)
                    break;
            }

            if (buffer[i].Equals(255) && buffer[i + 1].Equals(216)) // 0xFF 0xD8 = JPEG Start Of Image (SOI) marker
            {
                Add = true;
            }

            if (buffer[i].Equals(255) && buffer[i + 1].Equals(217)) // 0xFF 0xD9 = JPEG End Of Image (EOI) marker
            {
                n++;
                Add = false;
                mStream.Write(new byte[] { 255, 217 }, 0, 2);
                File.WriteAllBytes($@"C:\Path...\{n}.jpg", mStream.ToArray());
                mStream = new MemoryStream();
            }

            if (Add)
                mStream.WriteByte(buffer[i]);

            i++;
        }
        cmd.WaitForExit();
        cmd.Close();
    }
}

private Process GetProcess(string FileName, string Arguments)
{
    return new Process
    {
        StartInfo = new ProcessStartInfo
        {
            FileName = FileName,
            Arguments = Arguments,
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = false,
        }
    };
}

A video sample (> 480p) with a length of 60 seconds or more should be used for testing purposes.

asked Aug 06 '18 by Srdjan M.


2 Answers

If the video file is stored on disk, then it might be easier to just tell FFmpeg to convert that video file into Jpegs directly.

(1) Read video file and output frame Jpegs (no pipes or Memory/File streams involved):

string str_MyProg = "C:/FFmpeg/bin/ffmpeg.exe";
string VideoPath = "C:/someFolder/test_vid.mp4";

string save_folder = "C:/someOutputFolder/";

//# Setup the arguments to directly output a sequence of images (frames)
string str_CommandArgs = "-i " + VideoPath + " -vf fps=25/1 " + save_folder + "n_%03d.jpg"; //the n_%03d replaces "n++" count

System.Diagnostics.ProcessStartInfo cmd_StartInfo = new System.Diagnostics.ProcessStartInfo(str_MyProg, str_CommandArgs);

cmd_StartInfo.RedirectStandardError = false; //set false
cmd_StartInfo.RedirectStandardOutput = false; //set false
cmd_StartInfo.UseShellExecute = true; //set true
cmd_StartInfo.CreateNoWindow = true;  //don't need the black window

//Create a process, assign its ProcessStartInfo and start it
System.Diagnostics.Process cmd = new System.Diagnostics.Process();
cmd.StartInfo = cmd_StartInfo;

cmd.Start();

//# Started process. Check output folder for images...
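
If the caller needs to block until FFmpeg has finished writing every frame, an optional wait (not part of the original answer) can be added after the Start call:

//# Optional: block until FFmpeg exits, then release the process handle
cmd.WaitForExit();
cmd.Dispose();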

(2) Pipes method:

When using pipes, FFmpeg will stream back the output like a broadcast. If the last video frame is reached, that same last-frame "image" will be repeated indefinitely. You must manually tell FFmpeg when to stop sending to your app (there is no "exit" code in this situation).

This line in the code specifies how many frames to extract before stopping:

int frames_expected_Total = 0; //is... (frame_rate x Duration) = total expected frames

You can calculate the limit as output-FPS * input-Duration.
Example: the video duration is 4.88 secs, so 25 * 4.88 = 122 frames is the limit for this video.
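
For reference, the frame limit could also be computed instead of hard-coded. The sketch below is not part of the original answer; it assumes an ffprobe.exe is available alongside ffmpeg.exe and reads the container duration (in seconds) via ffprobe's -show_entries format=duration output:

//# Sketch only: derive frames_expected_Total from the video duration (names/paths are placeholders)
private static int GetExpectedFrameCount(string ffprobePath, string videoPath, double outputFps)
{
    var psi = new System.Diagnostics.ProcessStartInfo(
        ffprobePath,
        "-v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 \"" + videoPath + "\"")
    {
        UseShellExecute = false,
        RedirectStandardOutput = true,
        CreateNoWindow = true
    };

    using (var probe = System.Diagnostics.Process.Start(psi))
    {
        string durationText = probe.StandardOutput.ReadToEnd().Trim(); //# e.g. "4.880000"
        probe.WaitForExit();

        double durationSeconds = double.Parse(durationText, System.Globalization.CultureInfo.InvariantCulture);
        return (int)System.Math.Round(outputFps * durationSeconds); //# e.g. 25 * 4.88 = 122
    }
}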

"If my buffer size is 4096... then next image is not rendered properly. Why is that?"

You have "glitched" images because the buffer is too small to hold a complete image...

Buffer size formula is:

int BufferSize = ( video_Width * video_Height );

Because the final compressed jpeg will be smaller than this amount, it guarantees a BufferSize that can hold any complete frame without errors. Out of interest, where are you getting the 4096 number from? Standard Output typically gives a maximum packet size of 32 KB (32768 bytes).
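
As a concrete illustration of that formula (the 1280 x 720 dimensions are an assumption matching the tested example further below; substitute your own video's width and height):

int video_Width = 1280, video_Height = 720;   //# assumed source dimensions
int BufferSize = video_Width * video_Height;  //# 921600 bytes, comfortably larger than any single -q:v 2 jpeg frame
byte[] buffer = new byte[BufferSize + 1];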

Solution (tested):
This is a complete working example that solves the "glitch" image issue; check the code comments...

using System;
using System.IO;
using System.Net;
using System.Drawing;
using System.Diagnostics;
using System.Collections.Generic;


namespace FFmpeg_Vid_to_JPEG //replace with your own project "namespace"
{
    class Program
    {
        public static void Main(string[] args)
        {
            //# testing the Extract function...

            ExtractFrames();
        }

        public static void ExtractFrames()
        {
            //# define paths for PROCESS
            string FFmpegPath = "C:/FFmpeg/bin/ffmpeg.exe";
            string VideoPath = "C:/someFolder/test_vid.mp4";

            //# FFmpeg arguments for PROCESS
            string str_myCommandArgs = "-i " + VideoPath + " -f image2pipe -c:v mjpeg -q:v 2 -vf fps=25/1 pipe:1";

            //# define paths for SAVE folder & filename
            string save_folder = "C:/someOutputFolder/";
            string save_filename = ""; //update name later on, during SAVE commands

            MemoryStream mStream = new MemoryStream(); //create once, recycle same for each frame

            ////// # also create these extra variables...

            bool got_current_JPG_End = false; //flag to begin extraction of image bytes within stream

            int pos_in_Buffer = 0; //pos in buffer(when checking for Jpeg Start/End bytes)
            int this_jpeg_len = 0; // holds bytes of single jpeg image to save... correct length avoids cropping effect
            int pos_jpeg_start = 0; int pos_jpeg_end = 0; //marks the start/end pos of one image within total stream

            int jpeg_count = 0; //count of exported Jpeg files (replaces the "n++" count)
            int frames_expected_Total = 0; //number of frames to get before stopping

            //# use input video's width x height as buffer size //eg: size 921600 = 1280 W x 720H 
            int BufferSize = 921600;  
            byte[] buffer = new byte[BufferSize + 1];

            // Create a process, assign its ProcessStartInfo and start it
            ProcessStartInfo cmd_StartInfo = new ProcessStartInfo(FFmpegPath, str_myCommandArgs);

            cmd_StartInfo.RedirectStandardError = true;
            cmd_StartInfo.RedirectStandardOutput = true; //set true to redirect the process stdout to the Process.StandardOutput StreamReader
            cmd_StartInfo.UseShellExecute = false;
            cmd_StartInfo.CreateNoWindow = true; //do not create the black window

            Process cmd = new System.Diagnostics.Process();
            cmd.StartInfo = cmd_StartInfo;

            //# Start() returns true if the process launched (calling it only once avoids spawning FFmpeg twice)
            if (cmd.Start())
            {
                //# holds FFmpeg output bytes stream...
                var ffmpeg_Output = cmd.StandardOutput.BaseStream; //replaces: fStream = cmd.StandardOutput.BaseStream as FileStream;

                cmd.BeginErrorReadLine(); //# start draining FFmpeg's stderr (its log output) so the pipe doesn't block

                //# get (read) first two bytes in stream, so can check for Jpegs' SOI (xFF xD8)
                //# each "Read" auto moves forward by read "amount"...
                ffmpeg_Output.Read(buffer, 0, 1);
                ffmpeg_Output.Read(buffer, 1, 1);

                pos_in_Buffer = this_jpeg_len = 2; //update reading pos

                //# we know first jpeg's SOI is always at buffer pos: [0] and [1]
                pos_jpeg_start = 0; got_current_JPG_End = false;

                //# testing amount... Duration 4.88 sec, FPS 25 --> (25 x 4.88) = 122 frames        
                frames_expected_Total = 122; //122; //number of Jpegs to get before stopping.

                while(true)
                {
                    //# For Pipe video you must exit stream manually
                    if ( jpeg_count == (frames_expected_Total + 1) )
                    {
                        cmd.Close(); cmd.Dispose(); //exit the process
                        break; //exit if got required number of frame Jpegs
                    }

                    //# otherwise read as usual    
                    ffmpeg_Output.Read(buffer, pos_in_Buffer, 1);
                    this_jpeg_len +=1; //add 1 to expected jpeg bytes length

                    //# find JPEG start (SOI is bytes 0xFF 0xD8)
                    if ( (buffer[pos_in_Buffer] == 0xD8)  && (buffer[pos_in_Buffer-1] == 0xFF) )
                    {
                        if  (got_current_JPG_End == true) 
                        {   
                            pos_jpeg_start = (pos_in_Buffer-1);
                            got_current_JPG_End = false; 
                        }
                    }

                    //# find JPEG ending (EOI is bytes 0xFF 0xD9) then SAVE FILE
                    if ( (buffer[pos_in_Buffer] == 0xD9) && (buffer[pos_in_Buffer-1] == 0xFF) )
                    {
                        if  (got_current_JPG_End == false) 
                        { 
                            pos_jpeg_end = pos_in_Buffer; got_current_JPG_End = true;

                            //# update saved filename 
                            save_filename = save_folder + "n_" + (jpeg_count).ToString() + ".jpg";

                            try
                            {
                                //# If the Jpeg save folder doesn't exist, create it.
                                if ( !Directory.Exists( save_folder ) ) { Directory.CreateDirectory( save_folder ); }
                            } 
                            catch (Exception)
                            { 
                                //# handle any folder create errors here.
                            }

                            mStream.Write(buffer, pos_jpeg_start, this_jpeg_len); //# copy this frame's bytes (SOI..EOI) out of the buffer

                            //# save to disk...
                            File.WriteAllBytes(@save_filename, mStream.ToArray());

                            //recycle MemoryStream, avoids creating multiple = new MemoryStream();
                            mStream.SetLength(0); mStream.Position = 0;

                            //# reset for next pic
                            jpeg_count +=1; this_jpeg_len=0;

                            pos_in_Buffer = -1; //allows it to become 0 position at incrementation part
                        }
                    }

                    pos_in_Buffer += 1; //increment to store next byte in stdOut stream

                } //# end While

            }
            else
            {
               // Handler code here for "Process is not running" situation
            }

        } //end ExtractFrame function


    } //end class
} //end program

Note: When modifying the above code, make sure to keep the Process creation within the ExtractFrames() function itself; this will not work if you use some external function to return the Process. Don't set it up as: using (Process cmd = GetProcess(FFmpegPath, Arguments)).

Good luck. Let me know how it goes.

(PS: Excuse the abundance of code comments; they are there for the benefit of future readers who may not understand what this code is doing to work around the buffer issue.)

answered Oct 20 '22 by VC.One


This issue happens globally; for reference, taken from the Adobe site:

The answer is all there - the default render output is uncompressed, which yields such high data rates that even quite beefy computers will never be able to play it back smoothly.

The thing here is simple: you are rendering at high data rates, even when using low quality. The max buffer size for that case is indeed 4096. If within that buffer there are bytes from the previous and the next image, and they are not separated by a comma, then FFmpeg cannot decide which frame to render, so it skips the frame because it deems that the right thing to do instead of randomly guessing which frame to refresh.

If you separate the bytes with a comma, you help FFmpeg bound the bytes of the previous and next images, making it easier to distinguish which frame to render and thus avoid skipping frames.

answered Oct 20 '22 by Barr J