Search iOS photos by keyword?

Apple's Photos app allows users to query photos with search keywords like "surfing", "food", "sky".

How could a third-party iOS app with Camera and Photos permissions search the phone's camera roll using arbitrary strings?

Searching for Photos is part of the SiriKit API. Could we pass strings as a possible approach or is there a better way?

[screenshot of the Photos app search screen]
sgarza62 asked Oct 09 '19 18:10

People also ask

Can you search for words in Photos on iPhone?

In iOS and iPadOS, you can use the overall search feature (swipe down and tap to enter text) to find matches in photos.

How do I use keywords in Apple Photos?

Click the Info button in the toolbar. In the Info window, click the Add a Keyword field (or the field where other keywords appear if you've already added some), then type a keyword and press Return to add it to the photos. As you type, Photos suggests keywords that you've used before.

How do you search for pictures by text on iOS 15?

Take a photo of text and put it directly into the Notes app. A camera window will appear on the bottom half of the screen. Point it at text and the software will automatically recognize it.


1 Answer

I could not find a way to do this even after extensive research. The good news is that you can use the same neural network that powers the Photos app's search to build your own search index. Here is some sample code:

let fetchOptions = PHFetchOptions()
let fetchResult = PHAsset.fetchAssets(with: fetchOptions)
fetchResult.enumerateObjects { asset, index, pointer in
    // assetManager is a PHImageManager (e.g. PHImageManager.default())
    self.assetManager.requestImage(for: asset,
                                   targetSize: CGSize(width: 150, height: 100),
                                   contentMode: .aspectFit,
                                   options: nil) { image, metadata in
        guard let cgImage = image?.cgImage else { return }
        let requestHandler = VNImageRequestHandler(cgImage: cgImage)
        let request = VNClassifyImageRequest()
        try? requestHandler.perform([request])
        guard let results = request.results as? [VNClassificationObservation] else { return }
        // Filter the ~1300 classification results and use them in your app.
        // In my case, fulfill a promise with a wrapper object:
        promise(.success(ClassifiedImage(imageIdentifier: asset.localIdentifier,
                                         classifications: results
                                             .filter { $0.hasMinimumPrecision(0.9, forRecall: 0.0) }
                                             .map { ($0.identifier, nil) })))
    }
}

More info: https://developer.apple.com/videos/play/wwdc2019/222

Also note: when building a search index, you can use the localIdentifier property of PHAsset to refer back to each photo after fetching the assets.
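To make those classification results searchable by arbitrary strings, one option (a minimal sketch, not part of the original answer; the type and label names here are hypothetical) is a simple inverted index that maps lowercase classification labels to asset localIdentifiers:

```swift
import Foundation

// Minimal inverted index: classification label -> set of PHAsset localIdentifiers.
// In a real app you would persist this and rebuild it as the photo library changes.
struct PhotoSearchIndex {
    private var index: [String: Set<String>] = [:]

    // Record each classification label produced for an asset.
    mutating func add(assetIdentifier: String, labels: [String]) {
        for label in labels {
            index[label.lowercased(), default: []].insert(assetIdentifier)
        }
    }

    // Return identifiers of all assets whose labels contain the query substring.
    func search(_ query: String) -> Set<String> {
        let q = query.lowercased()
        var hits: Set<String> = []
        for (label, ids) in index where label.contains(q) {
            hits.formUnion(ids)
        }
        return hits
    }
}

// Hypothetical usage with identifiers/labels like those from VNClassifyImageRequest:
var searchIndex = PhotoSearchIndex()
searchIndex.add(assetIdentifier: "asset-1", labels: ["surfing", "sea"])
searchIndex.add(assetIdentifier: "asset-2", labels: ["food"])
let matches = searchIndex.search("surf")  // contains "asset-1"
```

The matched identifiers can then be passed back to PHAsset.fetchAssets(withLocalIdentifiers:options:) to display the photos.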

Falco Winkler answered Nov 11 '22 02:11