
Here’s why “Live Text” is Apple’s best feature from WWDC 2021


Until now, Apple hasn't been as adept at handling text in images as Google or Samsung. When I want to find, say, a business receipt I've photographed, I open the Google Photos app rather than searching my iPhone library – it's quicker and more accurate – but that looks set to change with iOS 15 and its new Live Text feature.

The new update means all those photos of favourite recipes, receipts or handwritten notes currently languishing somewhere in your iPhone photo library will be given a new lease of life.

They will not only be easily searchable – using Spotlight – but the text can then be copied, looked up, translated, shared or simply pasted into emails or documents straight from the image.

Text comes alive in iOS 15.

For example, phone numbers or other text captured in photos can be selected and saved to your contacts, or called directly from the Camera app. Simply tap the Live Text button, select the text and you're good to go.

Been daydreaming through the morning meeting? No worries, just take a photo of the whiteboard on the way out, extract the text, paste it into your document and catch up over coffee.

Apple's Craig Federighi said in a virtual announcement at Apple's WWDC that Live Text will work on iPhone, iPad and Mac, across all photos – including screenshots and pictures on the web – and will be available in seven languages: English, Chinese (both simplified and traditional), French, Italian, German, Spanish, and Portuguese.

Additionally, the text recognition takes place directly on the device – Apple calls it "on-device intelligence" – rather than in the cloud, ensuring both privacy and speed and giving users the ability to search their libraries without an internet connection.
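Developers have had access to similar on-device text recognition since iOS 13 through Apple's Vision framework, which is generally understood to sit behind features like this (Apple hasn't said so explicitly). A minimal Swift sketch, assuming you already have a `CGImage` loaded – the function name `recognizeText` is mine, not Apple's:

```swift
import Vision

// On-device OCR via Vision (iOS 13+ / macOS 10.15+).
// The image never leaves the device; no network connection is required.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate      // slower, but better for receipts and notes
    request.recognitionLanguages = ["en-US"]  // must be one of the supported languages
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Each observation is one detected line of text; take its top candidate string.
    let observations = request.results as? [VNRecognizedTextObservation] ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

This runs entirely on the device's Neural Engine/CPU, which is what makes the privacy and offline-search claims above possible.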

That search function, as presented by Federighi, looked especially interesting and, in the quick demo viewers were given, seemed a lot more intuitive than Google's.

As with Google Lens, you can search images by people, locations, scenes or other elements in your photos – with Federighi promising that the new tool would also "search visually" and recognise "art, books, nature, pets, and landmarks".

If Live Text is more of a catch-up than other WWDC announcements, it’s an important one. Apple users can only ask – why did it take so long?

