When using the Assets Library framework you could fetch an album's poster image from ALAssetsGroup. How do you achieve the same with the Photos framework (PhotoKit)?
I do it this way. In cellForRowAtIndexPath: of the albums table view, add the following code.
Objective-C:
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
PHFetchResult *fetchResult = [PHAsset fetchKeyAssetsInAssetCollection:[self.assetCollections objectAtIndex:indexPath.row] options:fetchOptions];
PHAsset *asset = [fetchResult firstObject];

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;

CGFloat scale = [UIScreen mainScreen].scale;
CGFloat dimension = 78.0f;
CGSize size = CGSizeMake(dimension * scale, dimension * scale);

[[PHImageManager defaultManager] requestImageForAsset:asset targetSize:size contentMode:PHImageContentModeAspectFill options:options resultHandler:^(UIImage *result, NSDictionary *info) {
    dispatch_async(dispatch_get_main_queue(), ^{
        cell.imageView.image = result;
    });
}];
Swift 3:
let fetchOptions = PHFetchOptions()
let descriptor = NSSortDescriptor(key: "creationDate", ascending: true)
fetchOptions.sortDescriptors = [descriptor]

let fetchResult = PHAsset.fetchKeyAssets(in: assets[indexPath.row], options: fetchOptions)
guard let asset = fetchResult?.firstObject else {
    return
}

let options = PHImageRequestOptions()
options.resizeMode = .exact

let scale = UIScreen.main.scale
let dimension = CGFloat(78.0)
let size = CGSize(width: dimension * scale, height: dimension * scale)

PHImageManager.default().requestImage(for: asset, targetSize: size, contentMode: .aspectFill, options: options) { (image, info) in
    DispatchQueue.main.async {
        cell.imageView.image = image
    }
}
Edit: It turns out fetchKeyAssetsInAssetCollection does not always return the expected result (the most recently captured image/video); Apple's definition of a key asset is vague. It is more reliable to use
+ (PHFetchResult *)fetchAssetsInAssetCollection:(PHAssetCollection *)assetCollection options:(PHFetchOptions *)options
to get the fetch result and then take its firstObject, as described above. :)
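For example, a minimal Swift sketch of that approach, assuming collection is the PHAssetCollection for the current row and cell and size come from the snippets above:

// Fetch every asset in the album, newest first, and use the first one as the cover.
let fetchOptions = PHFetchOptions()
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let albumAssets = PHAsset.fetchAssets(in: collection, options: fetchOptions)

if let coverAsset = albumAssets.firstObject {
    PHImageManager.default().requestImage(for: coverAsset, targetSize: size, contentMode: .aspectFill, options: nil) { image, _ in
        DispatchQueue.main.async {
            cell.imageView.image = image
        }
    }
}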
Here's what I'm using now. It handles different types of photo collections and the case when the key asset is missing.
static func fetchThumbnail(collection: PHCollection, targetSize: CGSize, completion: @escaping (UIImage?) -> ()) {

    func fetchAsset(asset: PHAsset, targetSize: CGSize, completion: @escaping (UIImage?) -> ()) {
        let options = PHImageRequestOptions()
        options.deliveryMode = PHImageRequestOptionsDeliveryMode.highQualityFormat
        options.isSynchronous = false
        options.isNetworkAccessAllowed = true
        // We could use PHCachingImageManager for better performance here
        PHImageManager.default().requestImage(for: asset, targetSize: targetSize, contentMode: .default, options: options, resultHandler: { (image, info) in
            completion(image)
        })
    }

    func fetchFirstImageThumbnail(collection: PHAssetCollection, targetSize: CGSize, completion: @escaping (UIImage?) -> ()) {
        // We could sort by creation date here if we want
        let assets = PHAsset.fetchAssets(in: collection, options: PHFetchOptions())
        if let asset = assets.firstObject {
            fetchAsset(asset: asset, targetSize: targetSize, completion: completion)
        } else {
            completion(nil)
        }
    }

    if let collection = collection as? PHAssetCollection {
        let assets = PHAsset.fetchKeyAssets(in: collection, options: PHFetchOptions())
        if let keyAsset = assets?.firstObject {
            fetchAsset(asset: keyAsset, targetSize: targetSize) { (image) in
                if let image = image {
                    completion(image)
                } else {
                    fetchFirstImageThumbnail(collection: collection, targetSize: targetSize, completion: completion)
                }
            }
        } else {
            fetchFirstImageThumbnail(collection: collection, targetSize: targetSize, completion: completion)
        }
    } else if let collection = collection as? PHCollectionList {
        // For folders we get the first available thumbnail from sub-folders/albums
        // possible improvement - make a "tile" thumbnail with 4 images
        let inner = PHCollection.fetchCollections(in: collection, options: PHFetchOptions())
        inner.enumerateObjects { (innerCollection, idx, stop) in
            self.fetchThumbnail(collection: innerCollection, targetSize: targetSize, completion: { (image) in
                if image != nil {
                    completion(image)
                    stop.pointee = true
                } else if idx >= inner.count - 1 {
                    completion(nil)
                }
            })
        }
    } else {
        // We shouldn't get here
        completion(nil)
    }
}
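For completeness, here is a rough sketch of how this could be called from a table view data source. ThumbnailFetcher is a hypothetical name for whatever type declares the static function, and collections, indexPath, and cell are assumed to come from the usual data-source code:

let scale = UIScreen.main.scale
let targetSize = CGSize(width: 78.0 * scale, height: 78.0 * scale)

// ThumbnailFetcher is a placeholder for the type that declares fetchThumbnail above.
ThumbnailFetcher.fetchThumbnail(collection: collections[indexPath.row], targetSize: targetSize) { image in
    DispatchQueue.main.async {
        cell.imageView?.image = image
    }
}

As the comment in fetchAsset suggests, a PHCachingImageManager could replace PHImageManager.default() when many thumbnails are shown at once. A minimal sketch, assuming visibleAssets is an array of the PHAssets about to appear on screen:

let cachingManager = PHCachingImageManager()
// Warm the cache for assets that are about to be displayed.
cachingManager.startCachingImages(for: visibleAssets, targetSize: targetSize, contentMode: .aspectFill, options: nil)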
It is a very simple thing...
PHFetchOptions *userAlbumsOptions = [PHFetchOptions new];
userAlbumsOptions.predicate = [NSPredicate predicateWithFormat:@"estimatedAssetCount > 0"];

PHFetchResult *userAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAny options:userAlbumsOptions];

[userAlbums enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
    NSLog(@"album title %@", collection.localizedTitle);

    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
    PHAsset *asset = [assetsFetchResult firstObject];

    CGFloat retinaMultiplier = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(80 * retinaMultiplier, 80 * retinaMultiplier);

    // Note: this overload (with targetLocalAssetSize:) comes from updated/custom SDWebImage classes, not stock SDWebImage.
    [[SDWebImageManager sharedManager] downloadImageWithURL:[NSURL URLWithString:asset.localIdentifier] options:SDWebImageProgressiveDownload targetLocalAssetSize:retinaSquare progress:^(NSInteger receivedSize, NSInteger expectedSize) {
    } completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
        if (image) {
            albumCoverImg.image = image;
        }
    }];
}];
If you have not updated the SDWebImage classes, load the image the normal way instead (e.g., via PHImageManager, as shown in the answers above).