Drawn area in UIImage not recognized correctly

I am running into a strange problem in my project. A user paints or draws with swipe gestures over an overlay on top of an image, and I need to crop the area of the underlying image that lies below the painted region. My code works correctly only when the UIImageView beneath the painted region is 320 pixels wide, i.e. the width of the iPhone screen. If I change the width of the image view, I do not get the desired result.

I am using the following code to construct a CGRect around the painted part:

// Computes the bounding rectangle of the painted (non-transparent) pixels,
// searching only inside the region defined by leftX/rightX/topY/bottomY.
-(CGRect)detectRectForFaceInImage:(UIImage *)image{
    int l,r,t,b;
    l = r = t = b = 0;

    // Raw pixel buffer of the image (assumed 4 bytes per pixel).
    CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
    const UInt8* data = CFDataGetBytePtr(pixelData);

    BOOL pixelFound = NO;

    // Scan columns left to right; the first painted pixel fixes the left edge.
    for (int i = leftX ; i < rightX; i++) {
        for (int j = topY; j < bottomY + 20; j++) {
            int pixelInfo = ((image.size.width  * j) + i ) * 4;
            UInt8 alpha = data[pixelInfo + 2];
            if (alpha) {
                NSLog(@"Left %d", alpha);
                l = i;
                pixelFound = YES;
                break;
            }
        }
        if(pixelFound) break;
    }

    // Scan columns right to left for the right edge.
    pixelFound = NO;
    for (int i =  rightX ; i >= l; i--) {
        for (int j = topY; j < bottomY ; j++) {
            int pixelInfo = ((image.size.width  * j) + i ) * 4;
            UInt8 alpha = data[pixelInfo + 2];
            if (alpha) {
                NSLog(@"Right %d", alpha);
                r = i;
                pixelFound = YES;
                break;
            }
        }
        if(pixelFound) break;
    }

    // Scan rows top to bottom for the top edge.
    pixelFound = NO;
    for (int i = topY ; i < bottomY ; i++) {
        for (int j = l; j < r; j++) {
            int pixelInfo = ((image.size.width  * i) + j ) * 4;
            UInt8 alpha = data[pixelInfo + 2];
            if (alpha) {
                NSLog(@"Top %d", alpha);
                t = i;
                pixelFound = YES;
                break;
            }
        }
        if(pixelFound) break;
    }

    // Scan rows bottom to top for the bottom edge.
    pixelFound = NO;
    for (int i = bottomY ; i >= t; i--) {
        for (int j = l; j < r; j++) {
            int pixelInfo = ((image.size.width  * i) + j ) * 4;
            UInt8 alpha = data[pixelInfo + 2];
            if (alpha) {
                NSLog(@"Bottom %d", alpha);
                b = i;
                pixelFound = YES;
                break;
            }
        }
        if(pixelFound) break;
    }

    CFRelease(pixelData);

    return CGRectMake(l, t, r - l, b - t);
}

In the above code, leftX, rightX, topY, and bottomY are the extreme values (floats taken from the CGPoints) recorded while the user swipes a finger across the screen; together they define a rectangle that bounds the painted area (this keeps the loops small).

    leftX   -  minimum on the X-axis
    rightX  -  maximum on the X-axis
    topY    -  minimum on the Y-axis
    bottomY -  maximum on the Y-axis

Here l, r, t, b are the computed edges of the actual painted rectangle.
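
For context, this is roughly how the extremes are tracked during the swipe (a minimal sketch of the technique, not necessarily my exact code; the imageView property and the CGFloat ivars are stand-ins):

// Hypothetical: update the swipe's bounding extremes on every touch move.
// leftX/rightX/topY/bottomY are assumed CGFloat ivars, initialised to the
// first touch location in touchesBegan:withEvent:.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self.imageView];
    leftX   = MIN(leftX,   p.x);
    rightX  = MAX(rightX,  p.x);
    topY    = MIN(topY,    p.y);
    bottomY = MAX(bottomY, p.y);
}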

As mentioned earlier, this code works well when the image view being painted on is 320 pixels wide and spans the full screen width. But if the image view is narrower, say 300 pixels, and placed at the center of the screen, the code gives a false result.

Note: I am scaling the image to match the image view's width.
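
For illustration, a typical width-based scale looks something like this (a sketch of the general technique, not necessarily the exact code in my project):

// Sketch: scale a UIImage so its width matches the image view's width.
- (UIImage *)imageScaledToWidth:(CGFloat)targetWidth image:(UIImage *)source {
    CGFloat factor = targetWidth / source.size.width;
    CGSize newSize = CGSizeMake(targetWidth, source.size.height * factor);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0); // scale 1.0: points == pixels
    [source drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}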

Below are the NSLog outputs:

  1. When the image view's width is 320 pixels (these are the values of the color component at the matched, i.e. non-transparent, pixel):

    2013-05-17 17:58:17.170 FunFace[12103:907] Left 41
    2013-05-17 17:58:17.172 FunFace[12103:907] Right 1
    2013-05-17 17:58:17.173 FunFace[12103:907] Top 73
    2013-05-17 17:58:17.174 FunFace[12103:907] Bottom 12
    
  2. When the image view's width is 300 pixels:

    2013-05-17 17:55:26.066 FunFace[12086:907] Left 42
    2013-05-17 17:55:26.067 FunFace[12086:907] Right 255
    2013-05-17 17:55:26.069 FunFace[12086:907] Top 42
    2013-05-17 17:55:26.071 FunFace[12086:907] Bottom 255
    

How can I solve this problem? I need the image view centered, with padding on both sides.

EDIT: OK, it looks like my problem is due to the image orientation of JPEG images (from the camera). PNG images work fine and are not affected by a change in the image view's width. But JPEGs still do not work, even though I am handling the orientation.
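
The usual way to normalize orientation is to redraw the image, roughly like this (a sketch of the standard technique, not necessarily my exact code):

// Sketch: redraw the image so its pixel data matches the displayed orientation.
- (UIImage *)normalizedImage:(UIImage *)image {
    if (image.imageOrientation == UIImageOrientationUp) return image;

    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:(CGRect){CGPointZero, image.size}]; // drawInRect: applies the orientation
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}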

asked May 17 '13 by Gaurav Singh


1 Answer

First, I wonder if you're accessing something other than 32-bit RGBA. The index value you store in pixelInfo is advanced by +2 bytes rather than +3, which lands you on the blue byte, not the alpha byte. If your intent is to use RGBA, that alone would affect all of the results of your code.
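
For RGBA8888 data, indexing the alpha byte would look something like this (a sketch; it also uses CGImageGetBytesPerRow rather than width * 4, since CGImage rows can carry padding, which is worth checking in your case as it would corrupt results for some widths):

// Sketch: alpha of the pixel at (x, y), assuming 32-bit RGBA data.
static UInt8 AlphaAtPixel(const UInt8 *data, CGImageRef cgImage, int x, int y) {
    // Use the image's actual row stride; it may be larger than width * 4.
    size_t bytesPerRow = CGImageGetBytesPerRow(cgImage);
    size_t pixelIndex = bytesPerRow * y + x * 4;
    return data[pixelIndex + 3]; // offset 3 = alpha; offset 2 is blue
}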

Moving on, assuming you were still getting flawed results despite having the correct alpha component, I'd expect your fixed code to log Left/Right/Top/Bottom alpha values well below the full-on 255, something close to 0. In that case, without further code, I'd suggest the problem lies in the code you use to scale the image down from your 320x240 source to 300x225 (or perhaps any other scaled dimensions). I could imagine your image having alpha values of 255 at the edges if your "scale" code is performing a crop rather than a scale.
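
To make the crop-versus-scale distinction concrete, here is a hypothetical sketch (source stands in for the original 320x240 UIImage; none of this is from your code):

// Scale: resamples all of the source pixels into the smaller size.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(300, 225), NO, 1.0);
[source drawInRect:CGRectMake(0, 0, 300, 225)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Crop: keeps only a 300x225 window of original pixels and discards the rest,
// so painted edges can survive at full alpha.
CGImageRef cropRef = CGImageCreateWithImageInRect(source.CGImage,
                                                  CGRectMake(0, 0, 300, 225));
UIImage *cropped = [UIImage imageWithCGImage:cropRef];
CGImageRelease(cropRef);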

answered Oct 05 '22 by Tom Pace