Google's Most Advanced Robot Brain Just Got a Body
Speaker 1: Google wants to make robots smarter by teaching them to understand human language and then act on it in the real world, melding the physical capabilities of walking, roaming robots with the kind of intuitive AI powers you'd expect from a voice assistant or a smart speaker. It's a new technology called PaLM-SayCan, and it takes Google's smarts in natural language processing and machine learning and bakes them into robots built by a company called Everyday Robots. And it's [00:00:30] something we haven't seen before. This robot doesn't need to be programmed with really specific instructions, like "if this, then that." It can take vague instructions like "I'm hungry" or "I'm thirsty" and then work out the steps it needs to take to solve that problem. Up until now, we've seen robots out in the real world doing parkour and really physical activities, and we've seen conversational, AI-driven voice assistants, [00:01:00] but now Google has combined the two. This is a huge deal for the future of robotics and human assistance. So we thought for this week's episode of What the Future, we would try something a little bit different. I have my colleague Stephen Shankland here to tell me why it's such a game changer. Now, Shanks, you and I were both at this Google demo. It was kind of impressive to see. Can you give me the basic rundown of what Google was doing?

Speaker 2: Sure. This is a technology called PaLM-SayCan, and it combines two very different technologies. The first one is called [00:01:30] PaLM, which is Google's very sophisticated, very complicated natural language processing engine. So this is an AI system that's trained on millions of documents, mostly from the internet. And that is combined with the physical abilities of a robot. They have trained a robot to take a number of actions, like moving around a kitchen, grasping objects, recognizing objects. They start with this language model. You can give the robot a natural language command, like "I've spilled my drink, I need [00:02:00] some help." The robot comes up with a number of possible actions, but then it grounds those possible actions in what the robot actually knows how to do. So the marriage of the language model and the real-world abilities is what's interesting here.

Speaker 1: We saw these great demos of a robot picking up a number of different balls and blocks in different colors, and it knew that the yellow ball stood for the desert and the blue ball stood for the ocean. How is it recognizing those things?

Speaker 2: This is what it learns from [00:02:30] the real-world language information that it's been trained on. It knows, sort of at a metaphorical level, that green means jungle, blue means ocean and yellow means desert. So, for example, by reading the novel Dune, it can learn that "the yellow desert" might be a phrase that shows up somewhere, so it can learn to associate these things. So it actually attains a sort of metaphorical reasoning level that's much more humanlike than what we've seen in most robots, which are extremely literal, extremely precisely scripted, and strictly programmed to do a very narrow [00:03:00] set of operations. So this is much more open-ended.

Speaker 1: Yeah, I remember that hamburger demo. They showed us a couple of demonstrations of stacking blocks and bowls, but then I asked whether they could ask the robot to make a hamburger, and it just picked up the pieces and put them in order. It did put an entire bottle of ketchup in the middle of the hamburger, which was peak robot behavior, but I loved that. You don't actually have to say "put down hamburger patty, put lettuce on top of hamburger patty, if lettuce, then tomato." It [00:03:30] kind of just knows how to do that all at once.

Speaker 2: Yeah. So a traditional industrial robot that's maybe installing windshield wipers or soldering capacitors onto a circuit board, that's a very specific, very scripted activity. This is very open-ended. And because it's learned from this incredible wealth of knowledge that's on the internet, it knows what the components of a hamburger might be. It was a pretty interesting demonstration, and it was not something that Google had planned out in advance. That was your random, in-the-moment question. So this was [00:04:00] a good example, a good illustration of how this robot can be more improvisational.

Speaker 1: We've seen plenty of robots before, from the likes of Boston Dynamics running over obstacles. Or I saw the Ameca robot at CES, which has this very humanoid face and was able to respond with natural language. But those are examples of, like, a physical real-world robot, and then natural language in kind of a humanlike suit, right? This is something that's quite different to [00:04:30] those.

Speaker 2: One of the reasons it's such an interesting demonstration is it combines the brains and the brawn. It's got the AI language processing, and it's got some physical ability to actually go out in the real world. The robots themselves were designed by an Alphabet subsidiary called Everyday Robots, and they want to build everyday robots that will show up in your house or your workplace. So they're designed to actually move around and grasp things, and they have digital vision. And with that combined with the Google framework, [00:05:00] it's something that's potentially more useful in the house, if they can develop this for another few years to get it out of the research lab and into your home.

Speaker 1: Yeah. So, I mean, we've seen robots like, say, Astro from Amazon, which is a little home helper that can bring you a can of Coke from the fridge and wheel it into your bathtub. I saw that demo from our smart home team. What would be the future of this kind of robot in the home context, compared to some of the other home helpers we've seen before?

Speaker 2: If you look at a lot [00:05:30] of these other alternatives, it's kind of a smartphone with a bit of navigation glued on top. So Amazon's Astro, it's impressive, it's a first step, but this is another level entirely when it comes to understanding what humans want and understanding what the robot itself can do. It's much more potentially open-ended and therefore much more versatile. I guess one of the interesting things I saw from the robot demonstration at Google is that this is [00:06:00] designed for the chaos and unpredictability of the real world. If you compare it to Boston Dynamics, they have very impressive physical, real-world navigation abilities. The Atlas robot can do parkour, can do flips; the Spot dogs can go up and down stairs and deal with very complicated terrain. But those don't really have a lot of abilities in terms of actually executing commands. They can go places, but they can't do things. The Google robot is a combination of going places and doing things.

Speaker 1: Yeah, I feel like you're kind of combining [00:06:30] the football team with the chess club into one robot. So if you think about where this goes in the future, maybe 5, 10, 20 years from now, what could the future of this kind of technology bring us? Obviously it's very early days, but it's pretty exciting, right?

Speaker 2: Yeah. What we've seen with the AI revolution is a complete transformation of the computer industry, from machines that could do a very specific task to machines that can handle really complicated real-world situations. Some of those things [00:07:00] are very difficult, like driving a car on a street, with an incredible number of unpredictable events that could happen in that situation. But AI technology is good enough that it can start to deal with this really complicated landscape, instead of something very limited like driving a shuttle bus down a track and back, and down a track and back, right? So this is what AI opens up. When you build that into a robot, it's very complicated, and I think your [00:07:30] 10- or 20-year time horizon is more likely what we're looking at here. But when you combine that AI with this physical ability to navigate the real world and take actions, then that's potentially very transformative.

Speaker 1: So there you have it. But I'm interested to know what you think. Is this the future of robotics, or is it kind of terrifying, or is it both? Because sometimes robotics and technology is like that. Let me know in the comments down below, and while you're here, throw us a like and subscribe for plenty more What the Future videos. We've got amazing stuff on robotics, flying machines, everything you [00:08:00] could possibly want. All right, until next time, I'm Claire Reilly for CNET, bringing you the world of tomorrow, today.
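The grounding idea Shankland describes, where a language model proposes steps and the robot weights each one by what it can actually do right now, can be sketched in miniature like this. Everything below (the skill names, the hard-coded scores) is a made-up stand-in for illustration, not Google's actual models:

```python
# A toy sketch of SayCan-style grounding: combine "is this step useful
# for the request?" (language model) with "can the robot do it here?"
# (affordance), and pick the highest-scoring skill. All numbers are
# invented placeholders standing in for learned models.

def language_model_score(instruction, skill):
    """Stand-in for PaLM: how relevant is this skill to the request?"""
    relevance = {
        ("I spilled my drink", "find a sponge"): 0.8,
        ("I spilled my drink", "pick up the sponge"): 0.6,
        ("I spilled my drink", "make a hamburger"): 0.05,
    }
    return relevance.get((instruction, skill), 0.01)

def affordance_score(skill, scene):
    """Stand-in for the robot's value function: is this feasible now?"""
    return 0.9 if skill in scene["visible_skills"] else 0.1

def choose_next_skill(instruction, skills, scene):
    # Grounding: multiply usefulness by feasibility for each candidate.
    scored = {s: language_model_score(instruction, s) * affordance_score(s, scene)
              for s in skills}
    return max(scored, key=scored.get)

scene = {"visible_skills": {"find a sponge", "pick up the sponge"}}
skills = ["find a sponge", "pick up the sponge", "make a hamburger"]
print(choose_next_skill("I spilled my drink", skills, scene))  # find a sponge
```

This is why the robot never proposes "make a hamburger" for a spill: even if the language model ranked it higher, the feasibility term would pull its combined score down.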
