Inside Meta's Reality Labs: Hands-On With the Future of Metaverse
Speaker 1: I sat across from Mark Zuckerberg as he demonstrated moving things around on a screen, wearing a motor-neuron-sensing wristband that he was controlling with micro gestures.

Speaker 2: Here's what I'm seeing while I'm using this: just a gentle flick of my thumb to check my messages, and with another quick movement I can answer while I'm on the move, or I can even take a photo.

Speaker 1: Is this the future of the metaverse? Well, it's one of several things that Meta is counting [00:00:30] on, and I got to look at these technologies firsthand at its reality research lab: stuff that you won't get a chance to see for years.

Speaker 1: At Meta's Reality Labs Research facility, I got to look at the Quest Pro, which is Meta's next VR headset; for more on that, watch the separate video that I have. But in this video, I'm also gonna talk about all of the future tech demos that I got to try. I've been curious, just like many people, about where Meta is going to go with the metaverse [00:01:00] beyond VR: to AR, to what else? How is this going to take over some of the stuff that we do in our everyday lives? So when the company invited me to come out to its Reality Labs Research facility, I was super excited, also because this is the first time the company has invited journalists out there. I had to shoot my own photos and video when I was there, except for a few parts where I wasn't allowed to shoot, in which case Meta shot footage and photos using its own equipment on site.

Speaker 1: Meta is based down in Silicon Valley, but [00:01:30] its future tech team, Reality Labs Research, is up in Washington state, kind of near Microsoft. When I arrived at this lab facility, it was a bunch of nondescript office buildings in an office park near DigiPen, and I was super curious what was inside. Michael Abrash, who's the chief scientist of Reality Labs and has always been Meta's far-future futurist, was the one who guided us through a lot of these demos. So what I saw was kind of a tasting of four different demos that were meant to [00:02:00] represent technologies that Meta is not able to make happen yet for everyday people, but is shooting for, to advance different areas. One of them was neural input, and it's kind of the wildest one. It's one that Meta has talked about before, and Meta acquired a company called CTRL-Labs in 2019 to work on it.

Speaker 1: Previous conversations, demos and reports have shown people using this band to control things with micro gestures, and we got to see some people using it just a few feet away [00:02:30] from us. I didn't get a chance to demo it myself, unfortunately, but I got to see both Mark Zuckerberg and a couple of people who were trained over a period of time to use it. The way this works is it senses individual motor neurons that are firing, and it can kind of sense a muscle movement in a way where you don't necessarily even have to fully move your fingers, according to Meta.

Speaker 3: These are the zero-one binary events that your brain is communicating down to your muscles. And what Diego has learned how to do is to voluntarily [00:03:00] control them.

Speaker 1: So some of the stuff could look like gestures, but over time, apparently, this will start feeling like almost microscopic, invisible movements that will then control things, kind of looking like mind reading, but it's actually reading your intent to move your hand.
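Meta hasn't published how the wristband's signal pipeline actually works, but the basic idea it describes, turning electrical muscle activity into discrete input events, can be sketched. Here's a minimal, hypothetical Python sketch that rectifies one surface-EMG channel, smooths it into an envelope, and flags threshold crossings as "click" events; every name and parameter here is illustrative, not Meta's.

```python
import numpy as np

# Hypothetical parameters; Meta hasn't disclosed its actual pipeline.
SAMPLE_RATE_HZ = 2000     # plausible surface-EMG sampling rate
ENVELOPE_WINDOW_MS = 50   # smoothing window for the signal envelope
THRESHOLD = 3.0           # event threshold, in baseline standard deviations

def detect_micro_gestures(emg: np.ndarray) -> np.ndarray:
    """Return sample indices where a discrete 'click' event is detected.

    Rectifies one EMG channel, smooths it into an envelope, and flags
    rising edges that cross a baseline-relative threshold -- a crude
    stand-in for the per-motor-neuron 'zero-one events' Abrash describes.
    """
    rectified = np.abs(emg - emg.mean())
    window = int(SAMPLE_RATE_HZ * ENVELOPE_WINDOW_MS / 1000)
    envelope = np.convolve(rectified, np.ones(window) / window, mode="same")
    baseline = envelope.std()
    above = envelope > THRESHOLD * baseline
    # Keep only rising edges so one thumb flick yields exactly one event.
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1
```

A real system would classify multi-channel patterns into distinct gestures rather than thresholding one channel, which is presumably where the machine learning comes in.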
In the first demos I saw, Mark Zuckerberg showed us a whole bunch of little things, moving icons around, and it looked like he was kind of moving his thumb like a mouse and tapping. Meta has shown some of these demos before. It looks almost [00:03:30] like the way you'd use a mouse or some sort of futuristic control device, but with no actual device. Apparently it's not precise enough or fast enough yet for typing, but it's meant to eventually be used for things like smart glasses. That's where Meta is looking at this technology the most, because the idea is that you would not be carrying a controller around.

Speaker 1: You'd wanna be able to interact really quickly, but right now it's all about trying to prove that it just works and how easy it could be [00:04:00] to use. The other demo I saw with it was people sitting down playing an endless-runner game, where you move back and forth and try to survive. Usually it would involve some hand motions, but after a while they kept their hands still and did these micro gestures that I couldn't even tell were happening. They claim this uses a technique called co-adaptation, or co-adaptive learning: the feedback you get from moving can eventually be whittled down to something so small that the AI [00:04:30] still picks it up, and you can start making motions that feel really, really tiny. This is all pretty wild and hard to imagine in everyday use, but there are a lot of possibilities, not just for general control but possibly for assistive uses for people who may not have full use of their limbs or have other motor complications, because this technology is not that different from the types of tech that could be used, say, to create a prosthetic limb.

Speaker 1: So you could potentially use it to operate something even if you didn't have full use of [00:05:00] your hand. The second demo, which I did get to try, involves spatial audio. Now, spatial audio has been around in AirPods and VR; it's basically 3D audio that you can hear around you. It can be interesting, it can feel gimmicky, and in VR it can be really useful for trying to locate where things are. Where Meta is trying to go with spatial audio, though, is to AR: to eventually be able to place 3D audio in a real room and make it feel like it's there. The company has been working on technology to [00:05:30] not just measure where audio is coming from in a room, but where the echoes are coming from in the space of that room. They took us to an anechoic chamber, which is soundproof, and showed us an array of dozens of speakers designed to create a soundscape; microphones would be used to measure the echoing in a room and also to tune the sound to specific ears. So what I got to listen to after that were these two demos. They put microphones inside my ears and [00:06:00] then I wore these over-ear headphones. Somebody in the room walked around me and recorded this 40-second clip of themselves making various noises and things like that, and then I listened to the audio played back.

Speaker 1: Even having listened to immersive 3D audio before, it was surprisingly convincing at times. I kept my eyes closed both times, and it really felt like somebody was [00:06:30] moving around to the side and whispering near me; I thought there was someone behind me. So it recreates that soundscape, but I had to stay perfectly still.
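Binaural audio like this demo is conventionally produced by convolving a dry sound with a head-related impulse response (HRIR) measured, as here, with microphones in the listener's own ears. Meta hasn't detailed its renderer, so this is a sketch of the standard technique only, with the HRIR arrays assumed to come from such a measurement:

```python
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono: np.ndarray,
                    hrir_left: np.ndarray,
                    hrir_right: np.ndarray) -> np.ndarray:
    """Place a dry mono source at the direction encoded by an HRIR pair.

    hrir_left / hrir_right are head-related impulse responses, hypothetically
    measured with in-ear microphones; convolving the source with them bakes in
    the timing, level and ear-shape filtering cues that make playback sound
    like it's happening in the room rather than inside the headphones.
    """
    left = fftconvolve(mono, hrir_left, mode="full")
    right = fftconvolve(mono, hrir_right, mode="full")
    stereo = np.stack([left, right], axis=1)
    return stereo / np.max(np.abs(stereo))  # normalize to avoid clipping
```

Head tracking, as in the next demo, would then amount to swapping in the HRIR pair that matches the listener's current head orientation every few milliseconds.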
The other demo that I tried was in a room with four speakers, and I wore headphones with a tracking device so the audio could follow me as I moved. They played back audio both on the speakers and on the headphones to see if I could tell the difference between what was being projected and what was real.

Speaker 4: You'll hear sound [00:07:00] coming from the first loudspeaker. Okay, good.

Speaker 1: I failed the first test, where I identified sounds as playing on the speakers, and then I realized midway through: are they tricking me? I took off the headphones; it was all playing on the headphones. It was pretty shocking. They were over-ear headphones that kind of floated a few centimeters over my ears, tech that they're creating specifically for this work. So why is that any different than anything else? I think, again, it's that if you could create audio that feels convincing enough that it's [00:07:30] in the room with you, then eventually, if you have holographic avatars, you know, like the Marvel-type things that would appear and beam down to talk to you, it could actually sound like they're in the room with you versus just being in your earbud. And based on these couple of demos, this spatial audio was a lot better than anything I've ever heard before, but it won't necessarily be here anytime soon.

Speaker 1: The other demo I tried involved 3D scanning. This is the sort of stuff that I've looked at on iPads and iPhones with lidar, and 3D scanning is already a big trend all [00:08:00] across the landscape for VR and AR. What's new here is Meta showed some of the ways that phone-based 3D scanning could improve. They used my shoe for one of the scans: they took my sneaker and scanned it, so I got a good, familiar look at my own shoe in AR. That first scan, which took a few minutes to make, was good. It looked like a better scan than I've seen most times doing it by myself using lidar, though I could still see some flaws with it. The second demo [00:08:30] that they showed with 3D scanning was a lot more interesting, looking at something called radiance fields. They used a technology where they could look at the light patterns around an object and create a 3D scan that would be a lot more detailed and realistic. What I'm basically saying is they showed some 3D objects that they scanned into VR that were very complicated: a very fuzzy teddy bear and a very spiky cactus with tons of little spikes.

Speaker 1: And I thought, OK, how good is this gonna be? But when I looked at it in VR, I could see all the [00:09:00] little curlicues of the hair and the spikes of the cactus, which were really fine. And when I brought it over to a virtual lamp, I could see light being reflected off of it. These objects looked really good and crafted, but they were 3D scans.

Speaker 2: You know, when you throw it in the air, against a wall, or if you bounce it off the ground, it's gonna respond the same way that the physical object would.

Speaker 1: Now, that's how good they eventually could make 3D scanning into VR. That could be huge, because it's the whole dream of scanning in your furniture, your clothing or other people. [00:09:30] Right now a lot of that stuff looks kind of melted and weird, but could it eventually look good enough to not feel like it was glued into the scene?
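Radiance fields (the research line behind NeRF, neural radiance fields) store color and density at every point in a volume and render views by integrating along camera rays, which is why fine fuzz and spikes, and light from a virtual lamp, come through. As a rough sketch of that standard rendering step, not Meta's implementation:

```python
import numpy as np

def render_ray(densities: np.ndarray, colors: np.ndarray,
               deltas: np.ndarray) -> np.ndarray:
    """Alpha-composite samples along one camera ray, NeRF-style.

    densities: (N,)   volume density at each sample point along the ray
    colors:    (N, 3) RGB color predicted at each sample point
    deltas:    (N,)   spacing between consecutive samples
    """
    alphas = 1.0 - np.exp(-densities * deltas)  # opacity of each segment
    # Transmittance: how much light survives to reach each sample.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = alphas * trans                    # each sample's contribution
    return (weights[:, None] * colors).sum(axis=0)  # final RGB for this ray

# One ray with three samples; a full image repeats this per pixel.
rgb = render_ray(np.array([0.1, 2.0, 0.5]),
                 np.array([[0.9, 0.8, 0.7], [0.4, 0.3, 0.2], [0.1, 0.1, 0.1]]),
                 np.array([0.05, 0.05, 0.05]))
```

The hours of processing mentioned next would be the optimization that fits those densities and colors to the captured photos in the first place.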
I feel like some of those later demos that I saw showed a lot of possibility, but those take hours to process right now and aren't ready yet. The final demo I looked at involved avatars. We've seen them all the time in VR, and they're cartoony, and Meta has been promising these photorealistic avatars that will start to look like we're really talking with somebody. Codec Avatars are what Meta [00:10:00] calls them, and I'd never seen a demo of them until now. I got to look at three different types of avatar demos. One of them was a more boiled-down 3D-scan avatar that is meant to be made using a phone, something you eventually could maybe use to scan yourself and pop yourself in.

Speaker 5: She scans her face from different angles with a neutral expression for about 30 seconds, then spends another minute and a half making a variety of expressions. That's really all there is to it.

Speaker 6: Hi [00:10:30] guys. My 3D avatar is ready for use on my phone or in VR. It just took a few hours to generate after my scan, and the team's working on making that processing a whole lot faster.

Speaker 1: The conversation I had with somebody remotely looked not bad, although kind of like an animated bust. I felt like they were talking to me, but they were a little bit still, and if I had a conversation with somebody like that, I would say: why aren't you moving very much? What's going on here? It was a little uncanny. The second demo I had was [00:11:00] really surprising, and that was Codec Avatars 2, the next-generation, kind of moonshot avatar that they're building. I talked to somebody in Pittsburgh who's building this, and I got to see basically their head, almost lit by candlelight.

Speaker 7: So what you're seeing actually is a relightable volumetric representation of my head, my face, my hair, my shoulders and neck. And to enable this interaction, there are a few cameras mounted on the headset that I'm wearing. They're observing [00:11:30] my eyes and my mouth and allow me to animate the <inaudible> in various ways.

Speaker 1: And I felt like I was talking to them in some weird dark room. It almost felt like a PlayStation 5 or Xbox video game, where you look at it and it looks so incredibly rendered that you wonder if it's photoreal, but it's actually that person talking. So I kept thinking: is this animated? Am I looking at the actual person? I got really close to them; I think I was close-talking, but it felt intimate, like I was really talking to them. I wanted to see what the expressions [00:12:00] and the smiles were like, so I asked them to make different faces. It was pretty good. I don't know when that's ever gonna become available, but if I saw that in a game or in an app, I'd be really curious to try it.

Speaker 1: The third avatar demo I saw was looking at how Meta is gonna try to actually add legs to avatars, along with clothing. It was a scan of an actor in one of those rooms studded with cameras, used to create a quick captured clip of that avatar that I could then walk around. Also, the clothing that was being draped [00:12:30] on that person was all virtual, and it looked as good as the avatar: the shirt kind of rippling and moving, and nothing felt glued on.
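The "codec" in Codec Avatars refers to an encoder-decoder: cameras on the headset compress your current expression into a small latent code, only the code is transmitted, and a decoder reconstructs the volumetric head on the other end. Meta's version uses deep neural networks; purely as a loose, hypothetical illustration of that compress-transmit-decode idea, here is a PCA stand-in with made-up data and dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
captures = rng.normal(size=(500, 3000))  # stand-in: 500 flattened face captures

mean = captures.mean(axis=0)
# Principal components act as a hypothetical shared 'decoder' basis,
# agreed on ahead of time by both ends of the call.
_, _, components = np.linalg.svd(captures - mean, full_matrices=False)
basis = components[:32]                  # 32-number expression code

def encode(frame: np.ndarray) -> np.ndarray:
    """Sender side: headset capture -> tiny latent code (32 floats)."""
    return (frame - mean) @ basis.T

def decode(code: np.ndarray) -> np.ndarray:
    """Receiver side: latent code -> reconstructed face for rendering."""
    return code @ basis + mean

# Only `code` (32 floats) crosses the network, not the full capture.
code = encode(captures[0])
face = decode(code)
```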
A lot of the metaverse world has been talking a big game about commerce and fashion, with a lot of companies going into this space trying to make things for people. Is that Meta trying to flex and show some of those possibilities? I think so. The tour wrapped up with Michael Abrash talking about this big future of where we're going in the [00:13:00] metaverse, where he feels that this is a real phase change for people.

Speaker 1: Maybe the biggest shift since personal computing and the internet. Well, I think what it feels like is that a lot of things we know are starting to evolve to a next level. Where Meta wants it to go is stuff that bridges VR and AR and avatars and 3D objects. And Meta is not the only company trying to get to this point. Apple has been going there; Google's been talking about it; Microsoft's been talking about it; Nvidia has been talking about it. You've got Snap. [00:13:30] Pretty much every player in the tech landscape has been exploring it, which makes it feel like it could actually start to happen, because a lot of companies are willing it to happen. I got to see all of Meta's prototype VR and AR headsets on a wall at Reality Labs Research. Some of them were looking at things like adding mixed reality. Some of them were adding virtual eyes to the outside of your headset. Some were trying to be slimmer. I saw one that was shooting for the way in which [00:14:00] VR glasses could eventually be small enough to almost feel like sunglasses.

Speaker 8: This is our North Star, in a sense. You know, can we make this faultlessly realistic, comfortable to wear all day, and open up this productivity vision?

Speaker 1: And when I look at that wall, I get a sense of how much change is still happening in this landscape. I got the sense, looking at Meta's research lab, that there's still a lot of work left to be done, but it was fascinating to get a taste of where things are going, even if some of that stuff is [00:14:30] going to take five years or more to get there.
