Roughly 10% of the time, PHImageManager.defaultManager().requestImageForAsset returns nil instead of a valid UIImage, after first returning a valid though "degraded" UIImage. No error or other clue that I can see is returned in the info dictionary along with the nil.
This seems to happen with photos that need to be downloaded from iCloud, with iCloud Photo Library and Optimize iPad Storage both enabled. I've tried changing the options, size, etc. but nothing seems to matter.
If I retry requestImageForAsset after the failure, it will usually return a valid UIImage, though sometimes it takes a couple of retries (see the sketch after the code below).
Any idea what I might be doing wrong? Or is it just a bug in the Photos framework?
func photoImage(asset: PHAsset, size: CGSize, contentMode: UIViewContentMode, completionBlock: (image: UIImage, isPlaceholder: Bool) -> Void) -> PHImageRequestID? {
    let options = PHImageRequestOptions()
    options.networkAccessAllowed = true
    options.version = .Current
    options.deliveryMode = .Opportunistic
    options.resizeMode = .Fast

    let requestSize = !CGSizeEqualToSize(size, CGSizeZero) ? size : PHImageManagerMaximumSize
    let requestContentMode = contentMode == .ScaleAspectFit ? PHImageContentMode.AspectFit : PHImageContentMode.AspectFill

    return PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: requestSize, contentMode: requestContentMode, options: options) { (image: UIImage!, info: [NSObject : AnyObject]!) in
        if let image = image {
            let degraded = info[PHImageResultIsDegradedKey] as? Bool ?? false
            completionBlock(image: photoBlock.rotatedImage(image), isPlaceholder: degraded)
        } else {
            let error = info[PHImageErrorKey] as? NSError
            NSLog("Nil image error = \(error?.localizedDescription)")
        }
    }
}
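The retry is essentially just re-issuing the same request when the image comes back nil, something like this (an illustrative wrapper, not my exact code; the retry limit is arbitrary):

func requestImageWithRetry(asset: PHAsset, targetSize: CGSize, contentMode: PHImageContentMode, options: PHImageRequestOptions, retriesLeft: Int, handler: (UIImage?) -> Void) {
    PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: targetSize, contentMode: contentMode, options: options) { (image, _) in
        if image == nil && retriesLeft > 0 {
            // The request came back nil; issue the same request again.
            requestImageWithRetry(asset, targetSize: targetSize, contentMode: contentMode, options: options, retriesLeft: retriesLeft - 1, handler: handler)
        } else {
            // Pass through both the degraded preview and the final image.
            handler(image)
        }
    }
}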
I just went through this too. From my tests, the issue appears on devices that have the "Optimize Storage" option enabled, and it comes down to the difference between the two methods below:
[[PHImageManager defaultManager] requestImageForAsset: ...]
This will successfully fetch remote iCloud images if your options are correctly configured.
[[PHImageManager defaultManager] requestImageDataForAsset:...]
This method only works for images that already reside in the phone's storage, or that were recently fetched from iCloud by your app or any other one.
Here's a working snippet I'm using (bear with me, it's Obj-C :)
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat; // I only want the highest possible quality
options.synchronous = NO;
options.networkAccessAllowed = YES;
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
    NSLog(@"%f", progress); // follow progress + update progress bar
};

[[PHImageManager defaultManager] requestImageForAsset:myPhAsset
    targetSize:self.view.frame.size
    contentMode:PHImageContentModeAspectFill
    options:options
    resultHandler:^(UIImage *image, NSDictionary *info) {
        NSLog(@"response %@", info);
        NSLog(@"got image %f %f", image.size.width, image.size.height);
}];
Full gist available on GitHub.
Updated for Swift 4:
let options = PHImageRequestOptions()
options.deliveryMode = PHImageRequestOptionsDeliveryMode.highQualityFormat
options.isSynchronous = false
options.isNetworkAccessAllowed = true
options.progressHandler = { (progress, error, stop, info) in
    print("progress: \(progress)")
}

PHImageManager.default().requestImage(for: myPHAsset,
    targetSize: view.frame.size,
    contentMode: PHImageContentMode.aspectFill,
    options: options,
    resultHandler: { (image, info) in
        print("dict: \(String(describing: info))")
        print("image size: \(String(describing: image?.size))")
})
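If you do need the raw data, requestImageData(for:) behaves as described above, but you can at least detect the iCloud-only case by checking PHImageResultIsInCloudKey in the info dictionary and falling back to requestImage(for:). A rough sketch (not part of the gist; myPHAsset as above):

let dataOptions = PHImageRequestOptions()
dataOptions.isNetworkAccessAllowed = false

PHImageManager.default().requestImageData(for: myPHAsset, options: dataOptions) { (data, dataUTI, orientation, info) in
    if data == nil, let inCloud = info?[PHImageResultIsInCloudKey] as? Bool, inCloud {
        // The bytes are only in iCloud; fall back to requestImage(for:) with
        // isNetworkAccessAllowed = true, as in the snippet above.
        print("asset is only in iCloud")
    } else {
        print("got \(data?.count ?? 0) bytes of image data")
    }
}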
I found that this had nothing to do with the network or iCloud. It occasionally failed even on images that were completely local; sometimes they were images from my camera, sometimes images saved from the web.
I didn't find a fix, but a workaround inspired by @Nadzeya that worked 100% of the time for me was to always request a target size equal to the asset's pixel size.
E.g.:
PHCachingImageManager().requestImage(for: asset,
    targetSize: CGSize(width: asset.pixelWidth, height: asset.pixelHeight),
    contentMode: .aspectFit,
    options: options,
    resultHandler: { (image, info) in
        if image == nil {
            print("Error loading image")
            print("\(info)")
        } else {
            view.image = image
        }
})
I believe the drawbacks to this would be that we're getting the full image back in memory, and then forcing the ImageView to do the scaling, but at least in my use case, there wasn't a noticeable performance issue, and it was much better than loading a blurry or nil image.
A possible optimization here is to re-request the image at its full asset size only if the first request comes back nil, as in the sketch below.
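A minimal sketch of that fallback, assuming a PHAsset called asset, a UIImageView called view, and the same options as above (the helper name is just for illustration):

func loadImage(for asset: PHAsset, into view: UIImageView, targetSize: CGSize, options: PHImageRequestOptions) {
    PHCachingImageManager().requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, _) in
        if let image = image {
            view.image = image
        } else {
            // The first request came back nil; retry at the asset's full pixel size.
            let fullSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
            PHCachingImageManager().requestImage(for: asset, targetSize: fullSize, contentMode: .aspectFit, options: options, resultHandler: { (retryImage, _) in
                view.image = retryImage
            })
        }
    })
}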