
Apple’s Camera Can Read Text, Spotlight Now Searches Photos


Announced today at WWDC 2021, Apple has added “Live Text” to photos with the forthcoming iOS 15. This feature will enable Apple devices to recognize text and numbers within an image and let users copy that text from the image and paste it into other apps like Mail or Messages.

This new-to-Apple Live Text feature works on new images taken with an iPhone or iPad as well as on any images users already have stored on their Apple devices. iPhone and iPad users can use features such as Look Up to learn more about the content of an image, copy the text they need from an image and send it, or even potentially grab the WiFi password directly from a coffee shop’s sign.

New data detectors can recognize phone numbers in photos and make them tappable. Live Text will support seven languages across iPhone, iPad, and Mac, and could be a groundbreaking feature for students taking notes as well as for business people who want to easily turn notes on a whiteboard into digital text.

On top of the new Live Text feature, users can now search through photos in Spotlight using a new feature called Visual Lookup. Using Visual Lookup, users can search by people (facial recognition), scenes, elements (snow or rain), locations and landmarks, or even specific text within their photos. According to Apple, this will also include rich results for contacts in Spotlight, helping iPhone, iPad, and Mac users look for landmarks, animals, plants, and other things in nature.

While these features may be familiar to Google users, Apple’s update finally brings the technology to iOS. As is always the case with Apple, the demonstration seemed very simple, intuitive, and easy to use. It remains to be seen how accurate Spotlight will be, however.

Apple says the feature is enabled by “deep neural networks” and “on-device intelligence.” The company did not go into much depth on its version of Google Lens, but Apple did say the tool will recognize books, art, nature, pets, and landmarks within users’ photos. Regardless of its actual performance, its inclusion indicates that the company is taking steps to apply machine learning to users’ images and make the information that can be gleaned from them increasingly useful. Since Google is the king of search, any competition in this space is only good for consumers.
