Love 'em or hate 'em, robots will start playing a larger role in our everyday lives in the coming years. Cooley looks at the market and what developers are doing to make robots more relatable.
I hate robots.
But I think I'm changing my mind.
I'm Brian Cooley from CNET in search of the next big thing.
Humanoid robots have been on the scene for quite a while, at least a decade now, give or take. And so far, they've had a market share and an importance that's kind of down there with the Segway and the home 3D printer.
Appealing mostly to geeks who are fascinated by the possible, and not so concerned about the practical.
As a result, we ended up with these things that are extremely humanoid, trying almost too hard to be like mechanical people.
Honda's Asimo dancing like your father, or Toyota's robot playing the violin.
Both amazed us, but maybe didn't speak to a role in your home.
But now something's changing, as we're seeing robot developers get more focused on relatability and the user, not just technical ability.
LG just released a full line of robots that kinda have a face and an ability to communicate.
But notice, no arms or excessively humanoid features.
And their smarts are powered by Amazon Alexa on the back end. A bigger model is meant to be an airport employee who not only listens to you and helps you, but also guides you where you may need to go.
It's also got a partner model.
The impact of that trend on employment is fascinating, but beyond the scope of our show.
Startup SmartBeings is developing a robot they have, unfortunately, named WooHoo, that you might think is just an Amazon Echo with a screen.
But it has a rotating head, so its camera can look where its ears and microphones tell it the action is, to better understand the scene and, again, be somewhat relatable.
It's envisioned as handling information communication to you, updates, weather, things like that.
As well as integrating with your Smart Home, for example, as part of its function.
And note that it has facial and voice recognition planned, an important part of allowing it to identify you as an individual, not just as a human.
And Mayfield Robotics sorta stole the show at CES with a robot called Kuri that doesn't do much but be very relatable.
Without arms, legs, a screen, or even a voice, it communicates with head and eye gestures, as well as a series of blips and beeps.
[SOUND] Sounds crazy, I know, but a lot of skeptics, including me, were kind of impressed by how much they've done to make those simple cues and gestures really understandable.
It too promises advanced facial recognition, so it can better anticipate the needs of whoever it's looking at by knowing who it's looking at.
I think you can see now that the future of robots is shaping up to be not necessarily about their abilities, which are likely to become a commodity, but instead about their nuance and relatability, as something you have in your life but don't necessarily have to cater your life to.
And this will be based on three particular attributes that I'm seeing.
One, of course, is integration.
Robots that are fully engaged in my preferences, history, calendar, contacts, all of the things that make everything smart in my life.
Another is recognition as we've talked about.
The ability for a robot to understand who I am based on my face and my voice.
It allows it to really dial in and serve me.
Which leads to the big one, anticipation.
We already have a lot of the functions that we're seeing in these robots in our phones and tablets.
But those are more tools than assistants because they have not taken the bar of anticipation and raised it.
That is the sweet spot for the coming robot.
Know what's next at CNET.com/NextBigThing.
I'm Brian Cooley.
Brian Cooley checks the tech and always tells it like it is. His favorite car? Anything that nails what it set out to do, which may make him the only auto journalist who gets as excited by an Optima as a Huracán. He's been focused on high-tech cars and modern driving since he did CNET's first car review in 2005.