
Google's Gemini Will Bring Richer Image Descriptions to TalkBack Screen Reader

The company is also adding Project Gameface, an open-source hands-free gaming mouse, to Android.

Abrar Al-Heeti Technology Reporter

AI is helping to advance Google's accessibility features.

Andrew Lanxon/CNET

Google is using its AI prowess to bolster its accessibility features. At its I/O developer conference on Tuesday, the company said it's using Gemini Nano, its AI model that runs on-device on smartphones, to enhance its TalkBack screen reader, giving blind and low-vision users richer, clearer image descriptions.

First launched in 2009, TalkBack reads aloud what's on a screen and lets users navigate their device using custom gestures. It also supports voice commands and a virtual braille keyboard.   

Google says TalkBack users come across an average of 90 unlabeled images a day. Gemini can help fill in any gaps, such as details about what's being shown in a photo someone sent or the style and cut of clothes while online shopping. And because Gemini Nano works on-device, the descriptions are generated quickly and can continue to work without a network connection. 

Google shared an example depicting a dress, for which TalkBack generated the description, "A close-up of a black and white gingham dress. The dress is short, with a collar and long sleeves. It is tied at the waist with a big bow."

TalkBack can provide richer image descriptions with the help of AI.

Google

Bringing Project Gameface to Android

At last year's I/O, Google launched Project Gameface, an open-source, hands-free gaming "mouse" that lets people control a computer's cursor using head movements and facial gestures. Now, Google is open-sourcing more code for Project Gameface on GitHub to allow developers to expand this capability to Android.

With this expansion, Android cameras can track facial expressions and head movements and translate them into controls. According to Google's blog post, "Developers can now build applications where their users can configure their experience by customizing facial expressions, gesture sizes, cursor speed and more."


With Project Gameface, users can link facial expressions and head movements to various controls.

Google

For instance, a user's head movements can control how the cursor moves, and gestures like raising an eyebrow or looking up can be custom-mapped to commands like "Select."
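To give a rough sense of what that kind of configuration could look like, here is a minimal Kotlin sketch that maps a facial "blendshape" signal from MediaPipe's Face Landmarker (the face-tracking library Project Gameface builds on) to a "Select" action. The gesture name, threshold and performSelect hook are hypothetical placeholders for illustration, not Project Gameface's actual code or API.

// Illustrative sketch only -- not Project Gameface's actual implementation.
// Assumes MediaPipe's Face Landmarker task with blendshape output enabled.
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Hypothetical user-configurable settings (names invented for illustration).
data class GestureConfig(
    val selectGesture: String = "browInnerUp",  // raise eyebrows to trigger "Select"
    val triggerThreshold: Float = 0.6f          // gesture sensitivity
)

fun handleFaceResult(result: FaceLandmarkerResult, config: GestureConfig) {
    // Blendshapes arrive as per-face lists of named categories with confidence scores.
    val blendshapes = result.faceBlendshapes().orElse(emptyList()).firstOrNull() ?: return
    val score = blendshapes
        .firstOrNull { it.categoryName() == config.selectGesture }
        ?.score() ?: 0f
    // If the configured expression is strong enough, fire the mapped command.
    if (score >= config.triggerThreshold) {
        performSelect()
    }
}

fun performSelect() { /* hypothetical hook: dispatch a click at the cursor position */ }

In practice, an app built on this idea would expose the gesture name and threshold as user settings, which is the kind of customization Google's blog post describes.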

The updates come ahead of Global Accessibility Awareness Day, which this year falls on May 16. Google is one of many tech companies that have doubled down on efforts to expand digital accessibility across their devices and platforms. In recent years, it's launched features like Guided Frame, which helps blind and low-vision Pixel users take selfies; Magnifier, which makes it easier to see small text and objects; and Sound Notifications, which alerts people with hearing loss to "critical household sounds" like appliances beeping or water running.

Editors' note: CNET is using an AI engine to help create a handful of stories. Reviews of AI products like this, just like CNET's other hands-on reviews, are written by our human team of in-house experts. For more, see CNET's AI policy and how we test AI.