Finding entropy in OpenCV

Tags:

opencv

matlab

I need a function like entropyfilt() in MATLAB, which doesn't exist in OpenCV.

In MATLAB, J = entropyfilt(I) returns the array J, where each output pixel contains the entropy value of the 9-by-9 neighborhood around the corresponding pixel in the input image I.

I wrote a function to implement it in C++. For each pixel, it computes the entropy like this:

  1. Use cvCalcHist with the image ROI set to the 9x9 neighborhood around the pixel.
  2. Normalize the histogram so the sum of its bins is equal to 1.
  3. Use the formula for (Shannon) entropy.

I list the C++ code below:

void GetLocalEntropyImage(const IplImage* gray_src, IplImage* entropy_image)
{
    int hist_size[] = {256};
    float gray_range[] = {0, 256};   // upper bound is exclusive
    float* ranges[] = {gray_range};
    CvHistogram* hist = cvCreateHist(1, hist_size, CV_HIST_SPARSE, ranges, 1);
    IplImage* src = const_cast<IplImage*>(gray_src);
    for (int i = 0; i < gray_src->width; i++) {
        for (int j = 0; j < gray_src->height; j++) {
            //calculate entropy for pixel (i,j)
            //1. set the ROI rect (9x9), clamped at the image borders
            CvRect roi;
            roi.x = MAX(0, i - 4);
            roi.y = MAX(0, j - 4);
            roi.width  = MIN(gray_src->width  - 1, i + 4) - roi.x + 1;
            roi.height = MIN(gray_src->height - 1, j + 4) - roi.y + 1;
            cvSetImageROI(src, roi);

            //2. calcHist; here I chose CV_HIST_SPARSE to speed up
            cvCalcHist(&src, hist, 0, 0);
            cvNormalizeHist(hist, 1.0);

            //3. get the entropy by iterating over the non-empty bins
            float entropy = 0;
            CvSparseMatIterator it;
            for (CvSparseNode* node = cvInitSparseMatIterator((CvSparseMat*)hist->bins, &it);
                 node != 0;
                 node = cvGetNextSparseNode(&it)) {
                float gray_frequency = *(float*)CV_NODE_VAL((CvSparseMat*)hist->bins, node);
                entropy -= gray_frequency * (log(gray_frequency) / log(2.0f));
            }
            ((float*)(entropy_image->imageData + j * entropy_image->widthStep))[i] = entropy;
        }
    }
    cvReleaseHist(&hist);   // release once, after the loops, not per pixel
    cvResetImageROI(src);
}

However, the code is too slow. I tested it on a 600x1200 image and it took 120 s, while entropyfilt in MATLAB takes only 5 s.

Does anyone know how to speed it up, or know of another good implementation?

asked Dec 04 '13 by zhen lee


1 Answer

The big slowdown in your code is this: log(gray_frequency)/log(2.0f).

You should not call cvNormalizeHist(). You know the bin counts are going to sum to 81, so you can work with the raw integer counts and fold the normalization into a per-pixel constant: with counts c, the entropy is H = log2(81) - (sum of c*log2(c))/81, and log2(81) is a constant that does not need to be recalculated inside your loop. If you don't normalize the histogram, its entries will be integers and you can use them to index a lookup table.

Since you have a 9x9 kernel, the maximum value of gray_frequency is 81 (as long as you don't normalize the histogram), so you can replace those two calls to log() with a single lookup into a precalculated table. This will make a huge difference. You can initialize the table like this:

    double entropy_table[82]; // 0 .. 81
    const double log2 = log(2.0);
    entropy_table[0] = 0.0;
    for(int i = 1; i < 82; i ++)
    {
        entropy_table[i] = i * log(double(i)) / log2;
    }

Then later it is just:

entropy -= entropy_table[gray_frequency];
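
For reference, the per-window loop with raw counts might end up looking like this. This is a sketch, not the answer's exact code: counts[256] (a dense bin-count array for the current window) and n (the window area, 81 in the interior) are hypothetical names.

    // Sketch: entropy of one window from raw bin counts.
    // counts[256] and n are hypothetical names, not from the original code.
    double sum = 0.0;
    for (int b = 0; b < 256; b++) {
        sum += entropy_table[counts[b]];  // counts[b] * log2(counts[b]); 0 for empty bins
    }
    // Recover the normalized entropy: H = log2(n) - sum / n
    double H = log(double(n)) / log(2.0) - sum / n;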

Also, you may find that implementing your own histogram code is a win. E.g., if you have a small kernel you can keep track of which bins you are going to use and only clear those. But since you are using up to 81 of 256 bins, this might not be worth it.
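
A bigger win from your own histogram code is updating it incrementally: sliding the 9x9 window one column to the right only changes 18 pixels, so you never rebuild the histogram from scratch. A minimal sketch for interior pixels, assuming the same hypothetical dense counts[256] array (border clamping omitted):

    // Sketch: slide a 9x9 window centered at (x-1, y) to (x, y).
    // The column at x-5 leaves the window, the column at x+4 enters it.
    void slideRight(const IplImage* gray, int x, int y, int counts[256]) {
        for (int dy = -4; dy <= 4; dy++) {
            const uchar* row = (const uchar*)(gray->imageData + (y + dy) * gray->widthStep);
            counts[row[x - 5]]--;   // pixel leaving on the left
            counts[row[x + 4]]++;   // pixel entering on the right
        }
    }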

Another place you can get a speedup is in border-pixel handling. You are checking this for every pixel, but if you had separate loops for the border pixels and the inner pixels, a lot of the calls to max and min could be avoided.

If that still isn't fast enough, you may consider using parallel_for on stripes. For a good example of how to do that, have a look at the source code for OpenCV's morphological filter.
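
A minimal sketch of striped parallelism using cv::ParallelLoopBody (available in OpenCV 2.4); computeEntropyRows is a hypothetical helper that runs your per-pixel entropy loop over the given row range:

    #include <opencv2/core/core.hpp>

    // Sketch: compute entropy for rows [r.start, r.end) in parallel stripes.
    class EntropyBody : public cv::ParallelLoopBody {
    public:
        EntropyBody(const cv::Mat& src, cv::Mat& dst) : src_(src), dst_(dst) {}
        void operator()(const cv::Range& r) const {
            computeEntropyRows(src_, dst_, r.start, r.end);  // hypothetical helper
        }
    private:
        const cv::Mat& src_;
        cv::Mat& dst_;
    };

    // usage:
    // cv::parallel_for_(cv::Range(0, src.rows), EntropyBody(src, dst));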

answered Sep 21 '22 by Bull