
Detect bad frames in OpenCV 2.4.9

I know the title is a bit vague but I'm not sure how else to describe it.

CentOS with ffmpeg + OpenCV 2.4.9. I'm working on a simple motion detection system that uses a stream from an IP camera (H.264).

Once in a while the stream hiccups and throws in a "bad frame" (see pic-bad.png link below). The problem is that these frames differ greatly from the previous frames and cause a "motion" event to be triggered even though no actual motion occurred.

The pictures below will explain the problem.

Good frame (motion captured):

Good Frame

Bad frame (no motion, just a broken frame):

Bad Frame

The bad frame gets caught randomly. I guess I could make a bad-frame detector by looping through the pixels going down from a certain position to check whether they are all the same (see the sketch below), but I'm wondering whether there is a more efficient, "by the book" approach to detecting these kinds of bad frames and simply skipping over them.
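Something along these lines is what I had in mind, just a rough sketch; the column position and the run-length threshold are arbitrary values I picked for illustration:

#include <opencv2/opencv.hpp>

// Sketch of the column-scan idea: walk down one column of the grayscale frame
// and report a "bad frame" if a long run of identical pixel values is found.
bool LooksLikeBrokenFrame(const cv::Mat &frame, int x = 0, int minRun = 200) {
    cv::Mat gray;
    cv::cvtColor(frame, gray, CV_BGR2GRAY);

    int run = 1;
    for (int y = 1; y < gray.rows; ++y) {
        if (gray.at<uchar>(y, x) == gray.at<uchar>(y - 1, x))
            ++run;
        else
            run = 1;
        if (run >= minRun)
            return true;   // long vertical run of identical pixels -> likely a broken frame
    }
    return false;
}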

Thank You!

EDIT UPDATE:

The frame is grabbed by a C++ motion detection program via cvQueryFrame(camera), so I do not interface with ffmpeg directly; OpenCV handles it on the back end. I'm using the latest version of ffmpeg compiled from git source, and all of the libraries are also up to date (h264, etc., all downloaded and compiled yesterday). The data comes from an RTSP stream (ffserver). I've tested multiple cameras (Dahua 1-3 MP models) and the frame glitch is pretty persistent across all of them, although it doesn't happen continuously, just once in a while (e.g. once every 10 minutes).
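For context, the capture loop looks roughly like this (a sketch only; the URL and variable names are placeholders, not from my actual program):

#include <opencv2/opencv.hpp>

int main() {
    // Hypothetical RTSP URL for the ffserver stream
    CvCapture *camera = cvCaptureFromFile("rtsp://...");
    if (!camera)
        return 1;                              // could not connect to the stream

    while (true) {
        IplImage *frame = cvQueryFrame(camera); // decoded by ffmpeg behind the scenes
        if (!frame)
            break;                              // grab failed or stream ended
        // ... motion detection on `frame` ...
    }

    cvReleaseCapture(&camera);
    return 0;
}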

asked May 12 '14 by user3630380

2 Answers

My first idea is to measure the dissimilarity between a known valid frame and the frame being checked by counting the pixels that are not the same. Dividing that count by the frame area gives a fraction that measures dissimilarity. I would guess that above 0.5 you can say the tested frame is invalid, because it differs too much from the valid example.

This assumption only holds if the camera is static (it does not move) and moving objects cannot get too close to it (how close depends on the focal length; with a wide lens, for example, objects should not appear closer than about 30 cm, otherwise an object could "jump" into the frame out of nowhere and cover more than 50% of the frame area).

Below is an OpenCV function that does what I described. You can raise the dissimilarity threshold if you expect motion between frames to be more rapid. Note that the first parameter should be an example of a valid frame.

#include <opencv2/opencv.hpp>

// Returns true if nextFrame differs from a known-good frame in more than
// half of its pixels, which usually indicates a broken frame.
bool IsBadFrame(const cv::Mat &goodFrame, const cv::Mat &nextFrame) {
    CV_Assert(goodFrame.size() == nextFrame.size());

    cv::Mat g, g2;
    cv::cvtColor(goodFrame, g, CV_BGR2GRAY);
    cv::cvtColor(nextFrame, g2, CV_BGR2GRAY);

    // Per-pixel inequality mask: 255 where the grayscale values differ.
    cv::Mat diff = g2 != g;

    // Fraction of pixels that differ between the two frames.
    float dissimilarity = (float)cv::countNonZero(diff) / (float)goodFrame.total();

    return dissimilarity > 0.5f;
}
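One way to wire this into the capture loop (just a sketch; the variable names and the RTSP URL are illustrative, not taken from your program):

cv::VideoCapture cap("rtsp://...");   // or however you already obtain frames
cv::Mat lastGoodFrame, frame;

while (cap.read(frame)) {
    if (!lastGoodFrame.empty() && IsBadFrame(lastGoodFrame, frame))
        continue;                      // skip broken frames instead of running motion detection
    frame.copyTo(lastGoodFrame);       // remember the latest valid frame as the reference
    // ... run motion detection on `frame` ...
}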
answered Nov 03 '22 by marol

You do not mention whether you use the ffmpeg command line or its libraries, but in the latter case you can check the bad-frame flag (I forget its exact name) and simply ignore those frames.
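A rough sketch of what that check might look like with the libav* decoding API; the exact flag names are an assumption on my part and may differ between ffmpeg versions:

extern "C" {
#include <libavcodec/avcodec.h>
}

// Sketch only: codecCtx, frame and pkt come from the usual libav* demux/decode
// setup, which is omitted here.
bool DecodeUsableFrame(AVCodecContext *codecCtx, AVFrame *frame, AVPacket *pkt) {
    int gotFrame = 0;
    if (avcodec_decode_video2(codecCtx, frame, &gotFrame, pkt) < 0 || !gotFrame)
        return false;                       // nothing decoded

    // decode_error_flags is set when the bitstream for this frame was damaged;
    // the AV_FRAME_FLAG_CORRUPT bit is only set if the decoder was opened with
    // CODEC_FLAG_OUTPUT_CORRUPT (otherwise corrupt frames are normally dropped).
    bool corrupt = frame->decode_error_flags != 0 ||
                   (frame->flags & AV_FRAME_FLAG_CORRUPT) != 0;

    return !corrupt;                        // true means the frame is safe to use
}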

answered Nov 03 '22 by Mike Versteeg