Save and restore rotation and size of a UIImageView

I'm developing an app which, among other things, needs to handle image views: rotate, resize, and move them, save that state in a Core Data model, and restore it on demand.

Rotation and scaling are performed via gesture recognizers. I have read here that after applying a transform you can't use the image view's frame to retrieve the real rect; you have to use its bounds and center instead.
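For reference, the handlers follow the standard incremental pattern, roughly like this (a simplified sketch, not my exact code; the selector names are just illustrative):

- (void)handleRotation:(UIRotationGestureRecognizer *)gr {
    myImageView.transform = CGAffineTransformRotate(myImageView.transform, gr.rotation);
    gr.rotation = 0; // apply the rotation delta incrementally
}

- (void)handlePinch:(UIPinchGestureRecognizer *)gr {
    myImageView.transform = CGAffineTransformScale(myImageView.transform, gr.scale, gr.scale);
    gr.scale = 1; // apply the scale delta incrementally
}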

I have tried several combinations of saving/restoring the view frame, the view layer frame, the view bounds, and the view layer bounds, with no luck. This is my best approach so far, but the restored image view is bigger, though only when the rotation angle is not 0 (the higher the angle, the bigger the restored image).

I understand that storing frame values and then using them to restore the bounds is not coherent, but this is the approach giving me my "best" results.
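To illustrate the mismatch with a quick sketch (not code from my app): once a rotation is applied, the frame grows to the rotated bounding box while the bounds keep the original size:

    UIView *v = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 200, 100)];
    v.transform = CGAffineTransformMakeRotation(M_PI_4);
    // v.bounds.size is still {200, 100};
    // v.frame.size is now roughly {212.1, 212.1} (the rotated bounding box).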

This is the code used to save the image view settings:

- (void)updateModelCoords {

    CGFloat angle = atan2f(myImageView.transform.b, myImageView.transform.a);
    CGRect frame = [myImageView.layer frame];
    [myModel setW:[[NSNumber alloc] initWithFloat:frame.size.width]];
    [myModel setH:[[NSNumber alloc] initWithFloat:frame.size.height]];
    [myModel setX:[[NSNumber alloc] initWithFloat:frame.origin.x]];
    [myModel setY:[[NSNumber alloc] initWithFloat:frame.origin.y]];
    [myModel setCenterX:[[NSNumber alloc] initWithFloat:[myImageView center].x]];
    [myModel setCenterY:[[NSNumber alloc] initWithFloat:[myImageView center].y]];
    [myModel setAngle:[[NSNumber alloc] initWithFloat:angle]];

    // db save stuff...
}
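For reference, this is how I understand the decomposition of the transform, assuming it only ever contains rotation and uniform scale:

    CGAffineTransform t = myImageView.transform;
    CGFloat angle = atan2f(t.b, t.a); // rotation in radians
    CGFloat scale = hypotf(t.a, t.b); // uniform scale factor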

This is the code used to restore the image view settings:

- (void) restoreImage {
    float angle = [[myModel angle] floatValue];
    CGAffineTransform transform = CGAffineTransformMakeRotation(angle);
    myImageView.transform = transform;
    CGRect bounds = [myImageView.layer bounds];
    bounds.size.width = [[myModel w] floatValue];
    bounds.size.height = [[myModel h] floatValue];
    bounds.origin.x = [[myModel x] floatValue];
    bounds.origin.y = [[myModel y] floatValue];
    [myImageView.layer setBounds:bounds];
    CGPoint center = CGPointMake([[myModel centerX] floatValue], [[myModel centerY] floatValue]);
    [myImageView setCenter:center];
}

I apply a blue border to the image view's layer as a debugging helper. After restoring, the orientation and position are the same as before saving, but the shape is a square, whereas it was a rectangle before saving.
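(For completeness, the border helper is roughly this; the exact width doesn't matter:)

    // requires #import <QuartzCore/QuartzCore.h>
    myImageView.layer.borderColor = [UIColor blueColor].CGColor;
    myImageView.layer.borderWidth = 2.0f;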

Hints will be very welcome.

Let me know if more code is needed to clarify.

Thank you so much

Héctor asked Nov 24 '11 at 13:11


1 Answer

I was running into the same issue and, after wasting four hours on it, I think I was able to fix it. The key point is: DO NOT store the frame. Store the BOUNDS and load the BOUNDS. Also, the order is VERY important when loading:

  1. Bounds
  2. Center
  3. Transform

Here is what I did:

-(void)writeImageState {
    NSMutableDictionary *dict = [[NSMutableDictionary alloc] init];
    [dict setObject:NSStringFromCGAffineTransform(self.transform) forKey:@"transform"];
    [dict setObject:NSStringFromCGPoint(self.center) forKey:@"center"];
    [dict setObject:NSStringFromCGRect(self.bounds) forKey:@"bounds"];

    [NSKeyedArchiver archiveRootObject:dict toFile:forUser.imageTransformationPath];
    [dict release];
}

-(void)readImageState {
    id obj = [NSKeyedUnarchiver unarchiveObjectWithFile:forUser.imageTransformationPath];
    if (obj == nil) return;

    CGAffineTransform transformed = CGAffineTransformFromString([obj objectForKey:@"transform"]);
    CGPoint centered = CGPointFromString([obj objectForKey:@"center"]);
    CGRect rect = CGRectFromString([obj objectForKey:@"bounds"]);

    // Restore in this order: bounds, then center, then transform.
    self.bounds = rect;
    self.center = centered;
    self.transform = transformed;
}
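If you are saving into a Core Data model like in the question, the same idea should map onto those properties. Here is a rough sketch (I'm assuming w/h/x/y store the untransformed bounds and angle stores the rotation, as in the question's model):

- (void)updateModelCoords {
    // Store the UNTRANSFORMED geometry: bounds and center, never the frame.
    CGRect bounds = myImageView.bounds;
    [myModel setW:[NSNumber numberWithFloat:bounds.size.width]];
    [myModel setH:[NSNumber numberWithFloat:bounds.size.height]];
    [myModel setX:[NSNumber numberWithFloat:bounds.origin.x]];
    [myModel setY:[NSNumber numberWithFloat:bounds.origin.y]];
    [myModel setCenterX:[NSNumber numberWithFloat:myImageView.center.x]];
    [myModel setCenterY:[NSNumber numberWithFloat:myImageView.center.y]];
    // NOTE: if pinch scale lives in the transform, store it too
    // (or store the whole transform as a string, as above).
    [myModel setAngle:[NSNumber numberWithFloat:atan2f(myImageView.transform.b,
                                                       myImageView.transform.a)]];
}

- (void)restoreImage {
    // Same order as above: 1. bounds, 2. center, 3. transform.
    myImageView.bounds = CGRectMake([[myModel x] floatValue],
                                    [[myModel y] floatValue],
                                    [[myModel w] floatValue],
                                    [[myModel h] floatValue]);
    myImageView.center = CGPointMake([[myModel centerX] floatValue],
                                     [[myModel centerY] floatValue]);
    myImageView.transform = CGAffineTransformMakeRotation([[myModel angle] floatValue]);
}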

Let us know whether it worked for you or not.

user1092179 answered Oct 17 '22 at 01:10