Google's approach to the Pixel 6 is to make it work well for a variety of people.
Thematically, Google wanted the Pixel 6 and Pixel 6 Pro to work well for many different people in many different ways. The opening of Google's fall event Tuesday featured a video of a woman running to catch a train to work. It showed her using her phone to access a transit pass, eating a hard-boiled egg under her face mask, taking a selfie on the train under yellow lighting, translating a question someone asks her in Japanese and taking a photo of her dog.
On the surface, none of those events seems extraordinary. It's just an average person commuting. Perhaps during a pandemic, the commuting part seems the least relatable. What is significant is that the Pixel 6 worked for her in the ways she expected. At first, that might not seem like a big deal. But depending on how you speak or the color of your skin, not every phone feature works equally well.
For decades, cameras and film have been built and designed around people with a lighter complexion. It's a bias that still pervades our phones today, because the hardware and software behind the cameras on our phones are not tested on a diverse group of people. The result can be photos that don't look like you or features that don't work well because of your complexion. Frequently, darker skin tones skew green or even gray in photos. And nobody wants their skin to look gray.
With the Pixel 6 and 6 Pro, Google set out to design the phones to be more inclusive. Two of the more conspicuous areas where the company did that were the cameras and voice recognition. Google says that the cameras on the Pixel 6 and 6 Pro work better for people with different skin tones and that voice recognition is more adept at identifying different speech patterns and accents.
If there's one thing Google's Pixel cameras are known for, it's software and computational photography. Google says it designed its cameras and computations to be more equitable no matter your complexion. For the past 18 months, Google worked with numerous directors, photographers and cinematographers like DP Ava Berkofsky (Insecure) and DP Kira Kelly (13th), known for their beautiful depictions of communities of color.
The idea was to put the phones into the hands of people who know how to take stunning photos of people with darker complexions. They took thousands of portraits, which Google said made its datasets 25 times more diverse. These images, along with feedback from the artists, helped Google improve the Pixel's cameras and algorithms in several ways.
The first is detection. Google wants the Pixel to be able to identify a face, no matter how light or dark someone's skin is, or how complex the lighting is. If you are backlit by a bright window, the Pixel 6 and 6 Pro can find your face.
Once a face is detected, the phone usually selects a white balance based on the scene. With Real Tone, the Pixel 6 also factors in the skin tone of the subject to select a white balance that brings out the natural beauty of someone's complexion. Google also improved the auto-exposure tuning to ensure that photos actually look like you do in real life, as opposed to skewing green or gray.
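Google hasn't published how Real Tone actually computes white balance, but the idea of letting the subject's skin tone influence an otherwise scene-driven estimate can be sketched. The following is a minimal, hypothetical illustration, assuming white balance is expressed as per-channel (R, G, B) gains; `choose_white_balance` and the blending weight are invented names, not Google's API:

```python
def choose_white_balance(scene_gains, face_gains, face_weight=0.4):
    """Blend a scene-wide white-balance estimate with one computed from
    the detected face region. Gains are per-channel (R, G, B) multipliers;
    face_weight controls how strongly the subject's skin tone steers the
    result. The blending scheme here is hypothetical, for illustration only.
    """
    return tuple(
        (1.0 - face_weight) * s + face_weight * f
        for s, f in zip(scene_gains, face_gains)
    )

# A scene-only estimate can over-correct a backlit portrait; folding in
# gains measured over the detected face keeps the complexion accurate.
scene = (2.1, 1.0, 1.6)   # e.g. from a gray-world estimate of the whole frame
face = (1.7, 1.0, 1.8)    # e.g. measured over the detected face region
balanced = choose_white_balance(scene, face)
```

The point of the sketch is the design choice: the face is treated as a first-class input to white balance rather than just another patch of the scene.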
The artists and photographers Google invited to test the phones gave feedback directly to software engineers to improve the aesthetics of how photos were finished. Stray light is minimized so people's complexions don't look ashy or washed out. When a subject has a darker complexion, the algorithm now adds more nuance to mid- and undertones. The end result is cameras that can take beautiful photos of all skin tones equally.
But these improvements are not just limited to the Pixel 6 and 6 Pro. Google Photos' auto-enhance feature will also have all of this baked in so that edits are more inclusive. And third-party apps that use the cameras can also take advantage of Real Tone.
Real Tone is an obvious example of meaningful improvement in inclusivity. Voice recognition is nearly as obvious. Google uses its new Tensor chip to better understand what you're saying no matter your speech pattern or accent. It can learn which terms and names are important to you that might be inconsequential to others. Google's natural language processing on the Pixel 6 is even faster than it was on the Pixel 5.
Google uses Tensor for on-device voice recognition and transcription, and by doing so these processes use half as much power as they did on previous Pixel phones. One of the benefits of this is an improved voice typing experience. The Pixel 6 can use your contacts list to get the correct spelling of a name you say. In the event video, we see a text message being voice typed to someone named Rani. The Pixel 6 correctly spells Rani, instead of, say, Ronnie, because the user has Rani in his contacts.
One of the more impressive additions to voice typing is how the Pixel handles all of the punctuation, so you don't have to dictate commas and question marks. When it comes to transcription suggestions, like Katherine versus Catherine, the Pixel makes suggestions based on the phonetics of what you're saying instead of keystroke-oriented corrections, which aren't always useful for dictation.
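Google hasn't detailed how Tensor matches on sound rather than spelling, but the general technique can be sketched with a toy phonetic key: fold letters that sound alike, drop vowels, and compare keys instead of raw strings. The key rules and function names below are invented for illustration, not Google's implementation:

```python
def phonetic_key(word: str) -> str:
    """Toy phonetic key: map similar-sounding letters together, drop
    vowels and 'h' after the first letter, and collapse repeats, so that
    'Katherine'/'Catherine' and 'Rani'/'Ronnie' produce the same key."""
    w = word.lower()
    for src, dst in (("ph", "f"), ("ck", "k"), ("c", "k"), ("z", "s")):
        w = w.replace(src, dst)
    key = w[0] + "".join(ch for ch in w[1:] if ch not in "aeiouyh")
    out = []
    for ch in key:
        if not out or out[-1] != ch:  # collapse doubled letters
            out.append(ch)
    return "".join(out)

def correct_from_contacts(heard: str, contacts: list[str]) -> str:
    """Prefer a contact whose name sounds like the transcribed word;
    fall back to the transcription as heard."""
    target = phonetic_key(heard)
    for name in contacts:
        if phonetic_key(name) == target:
            return name
    return heard

# If the recognizer hears "Ronnie" but "Rani" is in the contacts list,
# the phonetic match picks the contact's spelling.
print(correct_from_contacts("Ronnie", ["Alex", "Rani"]))  # prints "Rani"
```

Production systems typically use far richer models than this, but the sketch captures why phonetic comparison beats keystroke-oriented spell-check for dictation: the input error is in sound, not typing.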
And an added benefit of all of this is more accurate emoji transcriptions.
Tensor not only allows simple translation between languages to be done on-device, but it also improves the quality of those translations by 18%, Google says. This lets people communicate with each other, each in their own language. In the demo, we see an English speaker texting someone who speaks Japanese. The Pixel 6 translates the messages from Japanese to English for the English speaker and vice versa for the Japanese speaker.
At the end of the day, all of these changes and additions add up to a more meaningful update that goes beyond big specs and a flashy design. The Pixel 6 seems like it can truly transform your relationship with your phone, by making your phone understand you better.