I want to draw a cross over an NSImage; here's my code:
// Convert a size in pixels to a size in points, using the main screen's backing scale factor.
- (NSSize)convertPixelSizeToPointSize:(NSSize)px
{
    CGFloat displayScale = [[NSScreen mainScreen] backingScaleFactor];
    NSSize res;
    res.width = px.width / displayScale;
    res.height = px.height / displayScale;
    return res;
}
- (void)awakeFromNib
{
    CGFloat scale = [[NSScreen mainScreen] backingScaleFactor];
    NSLog(@"backingScaleFactor : %f", scale);

    // Load the image and work out its size in pixels and in points.
    NSImage *img = [[[NSImage alloc] initWithContentsOfFile:@"/Users/support/Pictures/cat.JPG"] autorelease];
    NSBitmapImageRep *imgRep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
    NSSize imgPixelSize = NSMakeSize([imgRep pixelsWide], [imgRep pixelsHigh]);
    NSSize imgPointSize = [self convertPixelSizeToPointSize:imgPixelSize];
    [img setSize:imgPointSize];
    NSLog(@"imgPixelSize.width: %f , imgPixelSize.height:%f", imgPixelSize.width, imgPixelSize.height);
    NSLog(@"imgPointSize.width: %f , imgPointSize.height:%f", imgPointSize.width, imgPointSize.height);

    // Draw a diagonal cross over the whole image.
    [img lockFocus];

    NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
    [trans scaleBy:1.0 / scale];
    [trans set];

    NSBezierPath *path = [NSBezierPath bezierPath];
    [[NSColor redColor] setStroke];
    [path moveToPoint:NSMakePoint(0.0, 0.0)];
    [path lineToPoint:NSMakePoint(imgPixelSize.width, imgPixelSize.height)];
    [path moveToPoint:NSMakePoint(0.0, imgPixelSize.height)];
    [path lineToPoint:NSMakePoint(imgPixelSize.width, 0.0)];
    [path setLineWidth:1];
    [path stroke];

    [img unlockFocus];

    [imageView setImage:img];

    // Save the result as a JPEG.
    imgRep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
    NSData *imageData = [imgRep representationUsingType:NSJPEGFileType properties:nil];
    [imageData writeToFile:@"/Users/support/Pictures/11-5.JPG" atomically:NO];
}
On a non-retina display the result is:
and the console displayed:
2012-07-06 00:53:09.889 RetinaTest[8074:403] backingScaleFactor : 1.000000
2012-07-06 00:53:09.901 RetinaTest[8074:403] imgPixelSize.width: 515.000000 , imgPixelSize.height:600.000000
2012-07-06 00:53:09.902 RetinaTest[8074:403] imgPointSize.width: 515.000000 , imgPointSize.height:600.000000
But on a retina display (I didn't use a real retina display, just HiDPI mode):
Console:
2012-07-06 00:56:05.071 RetinaTest[8113:403] backingScaleFactor : 2.000000
2012-07-06 00:56:05.083 RetinaTest[8113:403] imgPixelSize.width: 515.000000 , imgPixelSize.height:600.000000
2012-07-06 00:56:05.084 RetinaTest[8113:403] imgPointSize.width: 257.500000 , imgPointSize.height:300.000000
If I omit these lines:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans scaleBy:1.0 / scale];
[trans set];
However, if I change the -[NSAffineTransform scaleBy:] factor to 1.0, the result is correct:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans scaleBy:1.0];
[trans set];
Console:
2012-07-06 01:01:03.420 RetinaTest[8126:403] backingScaleFactor : 2.000000
2012-07-06 01:01:03.431 RetinaTest[8126:403] imgPixelSize.width: 515.000000 , imgPixelSize.height:600.000000
2012-07-06 01:01:03.432 RetinaTest[8126:403] imgPointSize.width: 257.500000 , imgPointSize.height:300.000000
Could anyone give an explanation, please? Is HiDPI mode different from a retina display?
I think I've found the answer. If an NSAffineTransform is set on the NSImage's drawing context, it switches the coordinate system to pixel dimensions, which are 2 × the point dimensions. It does this even if the transform is the identity, like this:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans set];
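One way to check this (a rough sketch, using the same 10.7-era AppKit and CoreGraphics calls as above) is to log the context's CTM before and after -set; on a 2x backing store this shows whether the point-to-pixel scale survives the call:

// Sketch: inspect the current transformation matrix around -[NSAffineTransform set]
// to see how lockFocus sets up the coordinate system on a HiDPI backing store.
[img lockFocus];

CGContextRef ctx = (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort];
CGAffineTransform before = CGContextGetCTM(ctx);
NSLog(@"CTM before set: a=%f d=%f tx=%f ty=%f", before.a, before.d, before.tx, before.ty);

NSAffineTransform *identity = [[[NSAffineTransform alloc] init] autorelease];
[identity set];

CGAffineTransform after = CGContextGetCTM(ctx);
NSLog(@"CTM after set: a=%f d=%f tx=%f ty=%f", after.a, after.d, after.tx, after.ty);

[img unlockFocus];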
I don't know whether that's a bug or just the way it works, though.
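In any case, the cross comes out right on both 1x and 2x if the context's transform is left alone and the drawing uses the image's point coordinates rather than pixels. A minimal sketch of that approach (the drawCrossOnImage: helper is just illustrative, not part of the code above):

// Sketch: draw the cross in point coordinates, without touching the context's transform.
- (void)drawCrossOnImage:(NSImage *)img
{
    NSSize pointSize = [img size];   // size in points (already adjusted from the pixel size above)

    [img lockFocus];
    [[NSColor redColor] setStroke];

    NSBezierPath *path = [NSBezierPath bezierPath];
    [path moveToPoint:NSMakePoint(0.0, 0.0)];
    [path lineToPoint:NSMakePoint(pointSize.width, pointSize.height)];
    [path moveToPoint:NSMakePoint(0.0, pointSize.height)];
    [path lineToPoint:NSMakePoint(pointSize.width, 0.0)];
    [path setLineWidth:1.0];
    [path stroke];

    [img unlockFocus];
}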