
Hey Google, stop trying to make Assistant my friend

Machines are not people, too.


Google wants to trick you into thinking machines are human.

That's a theme the tech titan pushed as it rolled out new features for its Google Assistant voice software at Google I/O, Google's annual conference for developers. Google Assistant is in nearly every Android phone and the Google Home smart speaker, and it drives the smart displays of the future. That means robo-voices that sound and act more human than perhaps any you've ever heard will live all around you. And that's exactly what Google wants.


You'll also be able to make your kids say "please" before the Google Home responds. Soon you'll be able to choose singer John Legend as your new Google Assistant voice. And the oddly named Google Duplex makes digital voices as lifelike as they've ever been.

AI is inescapable. It's the future. But here's my issue: Google Assistant is simply trying too hard. It's too chummy. Say "Good night" to your Google Home and it triggers your bedtime routine (turning off the smart lights and adjusting the thermostat, say), then responds with "Let's get ready for bed" and "Sleep well."

Open Google Assistant on your phone and it cheerily greets you: "Hi, I'm your Google Assistant. I'm here to help you." Ask it who it is and it tells you, "I'm a friend."


Yes, but no.


That's the thing. I don't need to be friends with my technology. I don't need it to greet me by name, make me wait for a response or pretend to know me. I just want it to turn off the lights or open a website… silently.

For me, Google Assistant is solely a tool, not a social experience. And that's a key point Google seems to miss as it falls over itself to own the AI experience we'll most certainly grow to depend on.

Just look at how the internet reacted to Google's eerily lifelike Duplex software, which sounds human enough to fool real humans as it calls real people to make appointments on your behalf.

Google has since clarified that Duplex's robot voices will identify themselves while making calls, which resolves the initial ethical question. But the early uproar still makes one thing unequivocally clear: Plenty of people just don't want to treat machines like flesh and blood.

At last week's I/O conference, Google channeled its desperate thirst to create, and then fill, a need for humanlike AI straight to the developers who will make many of the apps that use the (admittedly impressive) Google Assistant technology.


"Imagine what it's like to have conversations with technology," Saba Zaidi, a Google interaction designer, told developers during a session at the show.  

Zaidi went on to urge developers to model Google Assistant's conversations on real human interaction. "Try to observe the relevant conversation in the real world," she said, noting that "not everything that does well in your app does well in human conversation."

In case that proves a challenge for developers, Google even went so far as to create a tutorial of best practices for structuring conversations, called, I kid you not, Conversation 101.

Of course, designing a conversation for a computer program to simulate real human exchange is different from engaging in one yourself. The fact that Google feels it needs to pass out pointers perhaps suggests the crux of the problem.

Maybe instead of trying to make machines have meaningful conversations with human beings, Google Assistant should butt out and let real people focus on having meaningful conversations -- with each other.

Read: The 5 best things from Google I/O this year

Read: Google Assistant could become the most lifelike AI yet