Is there a way to accomplish something similar to what the iTunes and App Store Apps do when you redeem a Gift Card using the device camera, recognizing a short string of characters in real time on top of the live camera feed?
I know that iOS 7 introduced the AVMetadataMachineReadableCodeObject
class, which, AFAIK, only represents barcodes. I'm more interested in detecting and reading the contents of a short string. Is this possible using publicly available API methods, or some third-party SDK that you might know of?
There is also a video of the process in action:
https://www.youtube.com/watch?v=c7swRRLlYEo
Live Text is just another type of OCR. When your iPhone or iPad camera detects text in an image, it reacts by displaying a small indicator icon. If you want to grab all the text in an image, just tap that icon.
It's a built-in feature. You may have to re-import a handwritten note so it's processed by the OCR engine.
This feature is similar to how Google Lens works on Android, and to the Google Search and Photos apps on iOS. With Live Text, iOS recognises any text in a photo, screenshot, or camera preview: optical character recognition (OCR) lets Apple extract text from any image.
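You don't need Live Text's UI to get at this capability: from iOS 13 onward, the Vision framework exposes the same kind of text recognition through VNRecognizeTextRequest. Here's a minimal sketch of running OCR on a single CGImage; the function name and completion-handler shape are my own, not part of any Apple sample:

```swift
import Vision

// Recognize text in a CGImage with the Vision framework (iOS 13+).
// `completion` receives one string per detected text region.
func recognizeText(in cgImage: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep only the single best candidate for each region.
        let strings = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(strings)
    }
    request.recognitionLevel = .accurate  // use .fast for live video

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

For a gift-card-style short string you'd typically run this per frame and debounce until the same candidate appears a few frames in a row.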
I'm working on a project that does something similar to the Apple app store redeem with camera as you mentioned.
A great starting place for processing live video is a project I found on GitHub. It uses the AVFoundation framework: you implement the AVCaptureVideoDataOutputSampleBufferDelegate methods to receive camera frames.
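The skeleton of that delegate setup looks roughly like this; the class name, queue label, and the omission of error handling are mine, but the AVFoundation calls are the standard ones:

```swift
import AVFoundation

// Minimal capture pipeline: the camera delivers each frame to
// captureOutput(_:didOutput:from:) as a CMSampleBuffer.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: frameQueue)
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per frame on `frameQueue`.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Hand `pixelBuffer` to your OpenCV / OCR processing here.
    }
}
```

Remember to add an NSCameraUsageDescription entry to Info.plist, or the session will fail silently on device.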
Once you have the image stream (video), you can use OpenCV to process the frames. You need to determine the area of the image you want to OCR before running it through Tesseract. You'll have to play with the filtering, but the broad steps you take with OpenCV are:
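As an aside: if you'd rather stay inside Apple's frameworks instead of bundling OpenCV and Tesseract, Vision (iOS 13+) can handle the "restrict OCR to an area" step directly via regionOfInterest. A sketch, with an illustrative rectangle covering a horizontal band where a redeem code might sit (the rect values are assumptions, not Apple's):

```swift
import Vision

// Build a text-recognition request restricted to a sub-rectangle.
// Vision's regionOfInterest uses a normalized coordinate space:
// origin at the bottom-left, all values in 0...1.
func makeBandRequest() -> VNRecognizeTextRequest {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .fast           // live video favors speed
    request.usesLanguageCorrection = false     // gift-card codes aren't dictionary words
    // Middle band of the frame: 80% wide, 20% tall (illustrative values).
    request.regionOfInterest = CGRect(x: 0.1, y: 0.4, width: 0.8, height: 0.2)
    return request
}
```

Disabling language correction matters for random alphanumeric codes, which an OCR language model would otherwise "fix" into real words.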
Some other hints: