
Apple Live Text takes on Google Lens, can read your photos

Live Text can pull information out of Apple Photos, much like Google Lens does.


Apple is pulling more information out of photos: its new Live Text feature, coming to iOS 15 and announced at the company's virtual WWDC developer conference, looks like a variation on the clever computer vision smarts Google has baked into Lens.

The feature will recognize text in photos or through Apple's Camera app, supporting seven languages at launch. The computer vision-based tool will let users search for text found in their photos, much as Google Lens already does.
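Apple hasn't said what powers Live Text under the hood, but its public Vision framework already offers on-device text recognition, which gives a sense of how this kind of feature works. A minimal sketch using that framework (the language list here is a placeholder, not Live Text's actual launch set) might look like:

```swift
import Vision
import CoreGraphics

// Sketch: on-device text recognition with Apple's Vision framework.
// Live Text's internals are unannounced; this shows the closest public API.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate       // favor accuracy over speed
    request.recognitionLanguages = ["en-US"]   // placeholder language list

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Each observation carries ranked candidate strings; take the top one.
    return (request.results ?? []).compactMap {
        $0.topCandidates(1).first?.string
    }
}
```

Because the request runs on-device, no image data has to leave the phone, which fits Apple's usual privacy framing for camera features.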

While Live Text isn't quite augmented reality as defined by technologies like Apple's ARKit, it could well become a key tool for upcoming AR software (or hardware). The ability to recognize and pull in information on the fly looks like a necessary trick for smart glasses, although how it plays out in iOS 15 remains to be seen.
