Google's Most Advanced Robot Brain Just Got a Body
Speaker 1: Google wants to make robots smarter by teaching them to understand human language and then act on it in the real world, melding the physical capabilities of walking, roaming robots with the kind of intuitive AI powers you'd expect from a voice assistant or a smart speaker. It's a new technology called PaLM-SayCan, and it takes Google's smarts in natural language processing and machine learning and bakes them into robots built by a company called Everyday Robots. And it's [00:00:30] something we haven't seen before. This robot doesn't need to be programmed with really specific instructions, like "if this, then that." It can take vague instructions like "I'm hungry" or "I'm thirsty" and then work out the steps it needs to take to solve that problem. Up until now, we've seen robots out in the real world doing parkour and really physical activities, and we've seen conversational, AI-driven voice assistants, [00:01:00] but now Google has combined the two. This is a huge deal for the future of robotics and human assistance. So we thought for this week's episode of What the Future, we would try something a little bit different. I have my colleague Stephen Shankland here to tell me why it's such a game changer. Now, Shanks, you and I were both at this Google demo. It was kind of impressive to see. Can you give me the basic rundown of what Google was doing? Speaker 2: Sure. This is a technology called PaLM-SayCan, and it combines two very different technologies. The first one is called [00:01:30] PaLM, which is Google's very sophisticated, very complicated natural language processing engine. It's an AI system that's trained on millions of documents, mostly from the internet. And that is combined with the physical abilities of a robot. They have trained a robot to take a number of actions, like moving around a kitchen, grasping objects and recognizing objects. They start with this language model.
You can give the robot a natural language command, like "I've spilled my drink, I need [00:02:00] some help." The robot comes up with a number of possible actions, but then it grounds those possible actions in what the robot actually knows how to do. So the marriage of the language model and the real-world abilities is what's interesting here. Speaker 1: We saw these great demos of a robot picking up a number of different balls and blocks in different colors, and it knew that the yellow ball stood for the desert and the blue ball stood for the ocean. How is it recognizing those things? Speaker 2: This is what it learns from [00:02:30] the real-world language information that it's been trained on. It knows, sort of at a metaphorical level, that green means jungle, blue means ocean and yellow means desert. So for example, by reading the novel Dune, it can learn that "the yellow desert" might be a phrase that shows up somewhere, so it can learn to associate these things. It actually attains a sort of metaphorical reasoning level that's much more humanlike than what we've seen in most robots, which are extremely literal, extremely precisely scripted and strictly programmed to do a very narrow [00:03:00] set of operations. So this is much more open-ended. Speaker 1: Yeah. I remember with that hamburger demo, they showed us a couple of demonstrations of stacking blocks and bowls, but then I asked whether they could ask the robot to make a hamburger, and it just picked up the pieces and put them in order. It did put an entire bottle of ketchup in the middle of the hamburger, which was peak robot behavior, but I loved that. You don't actually have to say, "put down hamburger patty, put lettuce on top of hamburger patty, if lettuce, then tomato." It [00:03:30] kind of just knows how to do that all at once. Speaker 2: Yeah. So a traditional industrial robot might be installing windshield wipers or soldering capacitors onto a circuit board.
That's a very specific, very scripted activity. This is very open-ended. And because it's learned from this incredible wealth of knowledge that's on the internet, it knows what the components of a hamburger might be. It was a pretty interesting demonstration, and it was not something that Google had planned out in advance. That was your random, in-the-moment question. So this was [00:04:00] a good example, a good illustration of how this robot can be more improvisational. Speaker 1: We've seen plenty of robots before from the likes of Boston Dynamics running over obstacles, or I saw the Ameca robot at CES, which has this very humanoid face and was able to respond with natural language. But those are kind of examples of a physical, real-world robot on the one hand, and natural language in a kind of humanlike suit on the other, right? This is something that's quite different to [00:04:30] those. Speaker 2: One of the reasons it's such an interesting demonstration is that it combines the brains and the brawn. It's got the AI language processing, and it's got some physical ability to actually go out in the real world. The robots themselves were designed by an Alphabet subsidiary called Everyday Robots, and they want to build everyday robots that will show up in your house or your workplace. So they're designed to actually move around and grasp things, and they have digital vision. With that combined with the Google framework, [00:05:00] it's something that's potentially more useful in the house, if they can actually develop this for another few years to get it out of the research lab and into your home. Speaker 1: Yeah. So we've seen robots like, say, Astro from Amazon, which is a little home helper that can bring you a can of Coke from the fridge and wheel it into your bathtub. I saw that demo from our smart home team.
What would be the future of this kind of robot in the home context compared to some of the other home helpers we've seen before? Speaker 2: If you look at a lot [00:05:30] of these other alternatives, it's kind of a smartphone with a bit of navigation glued on top. So Amazon's Astro is impressive, it's a first step, but this is another level entirely when it comes to understanding what humans want and understanding what the robot itself can do. It's much more potentially open-ended and therefore much more versatile. I guess one of the interesting things I saw from the robot demonstration at Google is that this is [00:06:00] designed for the chaos and unpredictability of the real world. If you compare it to Boston Dynamics, they have very impressive physical, real-world navigation abilities. The Atlas robot can do parkour and flips; the Spot dogs can go up and down stairs and deal with very complicated terrain. But those don't really have a lot of abilities in terms of actually executing commands. They can go places, but they can't do things. The Google robot is a combination of going places and doing things. Speaker 1: Yeah. I feel like you're kind of combining [00:06:30] the football team with the chess club into one robot. So if you think about where this goes in the future, maybe 5, 10, 20 years from now, what could the future of this kind of technology bring us? Obviously it's very early days, but it's pretty exciting, right? Speaker 2: Yeah. What we've seen with the AI revolution is a complete transformation of the computer industry, from machines that could do one very specific task to machines that can handle really complicated real-world situations. Some of those things [00:07:00] are very difficult, like driving a car on a street, with an incredible number of unpredictable events that could happen in that situation.
But AI technology is good enough that it can start to deal with this really complicated landscape, instead of something very limited, like driving a shuttle bus down a track and back, down a track and back, right? So this is what AI opens up. When you build that into a robot, it's very complicated, and I think your 10- or 20-year time horizon is more likely what we're [00:07:30] looking at here. But when you combine that AI with this physical ability to navigate the real world and take actions, then that's potentially very transformative. Speaker 1: So there you have it. But I'm interested to know what you think. Is this the future of robotics, or is it kind of terrifying, or is it both? Because sometimes robotics and technology is like that. Let me know in the comments down below, and while you're here, throw us a like and subscribe for plenty more What the Future videos. We've got amazing stuff on robotics, flying machines, everything you [00:08:00] could possibly want. All right, until next time, I'm Claire Reilly for CNET, bringing you the world of tomorrow, today.
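The "grounding" idea described in the interview, where a language model proposes possible actions and the robot filters them by what it can actually do right now, can be sketched roughly as a scoring loop. This is a minimal illustration, not Google's implementation: the skill list, scores and function names are all invented for the example, with the language model and the robot's value function replaced by hard-coded stand-ins.

```python
# Hypothetical sketch of PaLM-SayCan-style grounding: a language model
# scores how relevant each known skill is to the request, an affordance
# model scores how feasible that skill is in the current situation, and
# the robot picks the skill with the best combined score.

# The fixed set of skills the robot actually knows how to perform.
SKILLS = ["find a sponge", "pick up the sponge", "bring it to you",
          "go to the kitchen", "pick up the ketchup"]

def language_score(request: str, skill: str) -> float:
    # Stand-in for the language model: how useful is this skill
    # as the next step toward satisfying the request?
    relevance = {"find a sponge": 0.50, "pick up the sponge": 0.30,
                 "go to the kitchen": 0.15, "bring it to you": 0.04,
                 "pick up the ketchup": 0.01}
    return relevance[skill]

def affordance_score(skill: str, world_state: dict) -> float:
    # Stand-in for the robot's learned value function: can it
    # actually do this skill here and now?
    return 1.0 if skill in world_state["feasible"] else 0.05

def choose_next_skill(request: str, world_state: dict) -> str:
    # Combined score: the chosen skill must be both relevant
    # to the request AND doable in the current world state.
    return max(SKILLS, key=lambda s: language_score(request, s)
                                     * affordance_score(s, world_state))

state = {"feasible": {"find a sponge", "go to the kitchen"}}
print(choose_next_skill("I've spilled my drink, I need some help", state))
# → "find a sponge" (picking up the sponge is more relevant in isolation,
#   but it isn't feasible until a sponge has been found)
```

The point of the multiplication is exactly what Shankland describes: the language model alone might suggest a sensible-sounding action the robot can't currently perform, so the feasibility term keeps the plan grounded in the robot's real abilities.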