
Apple Live Text takes on Google Lens, can read your photos

Live Text can pull information out of Apple Photos, much like Google Lens does.

Scott Stein, Editor at Large

Apple is pulling more information out of photos: its new Live Text feature, coming to iOS 15 and announced at its virtual WWDC developer conference, looks like a variation on the computer vision smarts Google has baked into Lens.

The feature will recognize text in photos or live through Apple's Camera app, and will support seven languages to start. The computer vision-based tool will let you search for text found in photos, much like Google Lens already does.
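Apple hasn't detailed how Live Text works under the hood, but its existing Vision framework already exposes on-device text recognition that developers can call today. As a rough illustration of the general approach (the function name and language choice here are illustrative, not Apple's Live Text code):

```swift
import UIKit
import Vision

// A minimal sketch of on-device text recognition using Apple's Vision
// framework, which likely underpins features like Live Text.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // The request returns observations; each holds ranked text candidates.
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate
    request.recognitionLanguages = ["en-US"] // Live Text launches with seven languages

    // Run the request against the still image, entirely on-device.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Because this runs on-device rather than in the cloud, recognized text never has to leave the phone, which fits Apple's privacy pitch for the feature.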

While Live Text isn't quite augmented reality as defined by technologies like Apple's ARKit, it could very well become a key tool for upcoming AR software (or hardware). The ability to recognize and pull in information on the fly looks like a necessary trick for smart glasses, although how it plays out in iOS 15 remains to be seen.
