I'm using an image picker library to allow the user to select many images from their photo library. They are returned as an array of PHAssets. I then want to convert all the PHAssets to UIImages and write them to the app's storage.
At the moment, I'm looping through all the assets and calling requestImageForAsset on each one. My issue is that there is an incredibly high memory usage spike while this loop runs (with 30 images, it spikes up to 130MB). I would like to prevent this.
Here is my code:
int i = 0;
for (PHAsset *asset in self.assets) {
    NSLog(@"started requesting image %i", i);
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:PHImageManagerMaximumSize
                                               contentMode:PHImageContentModeAspectFit
                                                   options:[self imageRequestOptions]
                                             resultHandler:^(UIImage *image, NSDictionary *info) {
        dispatch_async(dispatch_get_main_queue(), ^{
            assetCount++;
            NSError *error = [info objectForKey:PHImageErrorKey];
            if (error) {
                NSLog(@"Image request error: %@", error);
            } else {
                NSString *imagePath = [appDelegate.docsPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%i.png", i]];
                NSData *imageData = UIImagePNGRepresentation(image);
                if (imageData) {
                    [imageData writeToFile:imagePath atomically:YES];
                    [self.imagesArray addObject:imagePath];
                } else {
                    NSLog(@"Couldn't write image data to file.");
                }
                [self checkAddComplete];
                NSLog(@"finished requesting image %i", i);
            }
        });
    }];
    i++;
}
Based on the logs, I see that all of the "started requesting image x" messages are printed first, and only then do all of the completion blocks run ("finished requesting image x"). I think this might be contributing to the memory issue. It would probably be less memory intensive to make sure the completion block for each iteration has finished, and its resources have been freed, before moving on to the next iteration. How can I do this?
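For reference, this is roughly the sequential flow I have in mind. It's only a sketch using a dispatch semaphore (the background queue and the semaphore are illustrative, they're not in my current code), and I'm not sure it's the right approach:
// Sketch only: run the whole loop off the main thread so it can block
// between iterations without freezing the UI.
dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
    int i = 0;
    for (PHAsset *asset in self.assets) {
        dispatch_semaphore_t done = dispatch_semaphore_create(0);
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                    targetSize:PHImageManagerMaximumSize
                                                   contentMode:PHImageContentModeAspectFit
                                                       options:[self imageRequestOptions]
                                                 resultHandler:^(UIImage *image, NSDictionary *info) {
            // ...same file-writing body as above...
            // Note: with the default (opportunistic) delivery mode this
            // handler can fire more than once per asset.
            dispatch_semaphore_signal(done); // allow the loop to move on
        }];
        // Block this background thread until the handler has finished.
        dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
        i++;
    }
});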
Please use @autoreleasepool for memory management:
for (PHAsset *asset in self.assets) {
    // This autorelease pool seems good (a1)
    @autoreleasepool {
        NSLog(@"started requesting image %i", i);
        [[PHImageManager defaultManager] requestImageForAsset:asset
                                                    targetSize:PHImageManagerMaximumSize
                                                   contentMode:PHImageContentModeAspectFit
                                                       options:[self imageRequestOptions]
                                                 resultHandler:^(UIImage *image, NSDictionary *info) {
            dispatch_async(dispatch_get_main_queue(), ^{
                // You can add an autorelease pool here as well (a2)
                @autoreleasepool {
                    assetCount++;
                    NSError *error = [info objectForKey:PHImageErrorKey];
                    if (error) {
                        NSLog(@"Image request error: %@", error);
                    } else {
                        NSString *imagePath = [appDelegate.docsPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%i.png", i]];
                        NSData *imageData = UIImagePNGRepresentation(image);
                        if (imageData) {
                            [imageData writeToFile:imagePath atomically:YES];
                            [self.imagesArray addObject:imagePath];
                        } else {
                            NSLog(@"Couldn't write image data to file.");
                        }
                        [self checkAddComplete];
                        NSLog(@"finished requesting image %i", i);
                    }
                } // a2 ends here
            });
        }];
        i++;
    } // a1 ends here
}
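If the requests still all start before any result handler runs, a further option is to make each request synchronous and run the loop on a background queue, so each image is written and released inside the pool before the next request begins. This is only a sketch of that variant (PHImageRequestOptions.synchronous is a real Photos option, but the rest reuses names from the question):
// Sketch: with options.synchronous = YES the result handler runs exactly
// once, on the calling thread, before requestImageForAsset returns.
dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.synchronous = YES;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    options.networkAccessAllowed = YES;

    int i = 0;
    for (PHAsset *asset in self.assets) {
        @autoreleasepool {
            NSString *imagePath = [appDelegate.docsPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%i.png", i]];
            [[PHImageManager defaultManager] requestImageForAsset:asset
                                                        targetSize:PHImageManagerMaximumSize
                                                       contentMode:PHImageContentModeAspectFit
                                                           options:options
                                                     resultHandler:^(UIImage *image, NSDictionary *info) {
                // Runs synchronously on this background thread.
                NSData *imageData = image ? UIImagePNGRepresentation(image) : nil;
                if (imageData) {
                    [imageData writeToFile:imagePath atomically:YES];
                }
            }];
        } // the image and its PNG data are released here, before the next asset
        i++;
    }
});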
@Inder Kumar Rathore's trick did not work for me, so I read more about PHImageManager here.
I found that if I switch from
- requestImageForAsset:targetSize:contentMode:options:resultHandler:
to
- requestImageDataForAsset:options:resultHandler:
I receive an image with the same dimensions {5376, 2688}, but the size in bytes is much smaller, so the memory issue is solved.
Hope this helps!
(Note: use [UIImage imageWithData:imageData] to convert the NSData to a UIImage.)
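A rough sketch of what the original loop could look like with requestImageDataForAsset:options:resultHandler: — the Photos call and its handler signature are real, but self.assets, appDelegate.docsPath and [self imageRequestOptions] are simply carried over from the question:
// Sketch: request the raw image data and write it straight to disk,
// skipping the UIImage -> UIImagePNGRepresentation round trip.
int i = 0;
for (PHAsset *asset in self.assets) {
    [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                       options:[self imageRequestOptions]
                                                 resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        NSError *error = [info objectForKey:PHImageErrorKey];
        if (error || imageData == nil) {
            NSLog(@"Image request error: %@", error);
            return;
        }
        // The ".jpg" extension is illustrative; the actual format of the
        // bytes is described by dataUTI.
        NSString *imagePath = [appDelegate.docsPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%i.jpg", i]];
        [imageData writeToFile:imagePath atomically:YES];
        // If a UIImage is needed in memory: [UIImage imageWithData:imageData]
    }];
    i++;
}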