How Facebook's AR glasses will work with neural wristbands
Speaker 1: With AR devices, we're asking the question: how do we build a computing platform that is truly human-centric?
Speaker 2: Facebook Reality Labs is a pillar of Facebook that's dedicated to bringing AR and VR to people, to us, to consumers of the world.
Speaker 3: Every single new computing platform requires new input devices, a new set of interactions that really make it possible.
Speaker 2: With AR glasses, I think the key is to communicate with our computers in a way that is intuitive at an entirely new level. The wrist is a great starting point for us, technologically, because it opens up new and dynamic forms of control. This is where some of our core technologies, like EMG, come into play.
Speaker 1: Neural interfaces, when they work right (and we still have a lot of work to do here), feel like magic.
Speaker 4: So if you send a control to your muscle saying, "I want to move my finger," it starts in your brain and goes down your spine through motor neurons, and this is an electrical signal. So you should be able to grab that electrical signal at the muscle and say, okay, the user wants to move their finger.
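A rough way to picture what Speaker 4 describes, grabbing the electrical signal at the muscle, is sketched below in Python. This is a toy illustration, not Facebook's decoder: it flags a muscle-activation burst when the RMS envelope of a window of surface-EMG samples rises above a resting baseline, and every name and threshold in it is invented for the example.

```python
import math
from typing import Sequence

def rms_envelope(window: Sequence[float]) -> float:
    """Root-mean-square amplitude of one window of surface-EMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_intent(window: Sequence[float], threshold: float = 0.2) -> bool:
    """Flag a muscle-activation burst ('the user wants to move their finger')
    when the envelope rises above a resting baseline. The threshold is
    arbitrary here; a real system would calibrate it per user."""
    return rms_envelope(window) > threshold

# Simulated window: 100 samples of quiet baseline, then a 100-sample burst.
quiet = [0.01 * math.sin(0.3 * i) for i in range(100)]
burst = [0.5 * math.sin(0.3 * i) for i in range(100)]
print(detect_intent(quiet))          # False: muscle at rest
print(detect_intent(quiet + burst))  # True: electrical burst reached the muscle
```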
Speaker 1: What is it like to feel like you're pushing a button without actually pushing it? It could be as simple as, hey, I just want to move this cursor up or move it left. Normally I would do that by actually moving, but here you're able to move that cursor left, and it's because you and a machine agreed which neurons mean left and which neurons mean right. You're in this constant conversation with the machine.
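The "agreement" Speaker 1 describes, where you and the machine converge on which activity means left and which means right, resembles a co-adaptive decoder. Here is a minimal sketch under that assumption, with two hypothetical EMG channels and a simple reinforcement-style weight update; the real system is not public, and all of this is illustrative.

```python
import random

# Toy "agreement" loop: two EMG channels, two cursor directions.
# The decoder learns which channel this user recruits for LEFT vs RIGHT
# by nudging per-direction weights whenever the user confirms an intent.
weights = {"left": [0.0, 0.0], "right": [0.0, 0.0]}

def decode(activity):
    """Pick the direction whose learned weights best match the activity."""
    scores = {d: sum(w * a for w, a in zip(ws, activity))
              for d, ws in weights.items()}
    return max(scores, key=scores.get)

def adapt(activity, intended, lr=0.1):
    """Feedback step: reinforce the intended direction's weights."""
    for i, a in enumerate(activity):
        weights[intended][i] += lr * a

# Calibration: this user happens to drive channel 0 for LEFT, channel 1 for RIGHT.
for _ in range(50):
    if random.random() < 0.5:
        adapt([1.0 + random.gauss(0, 0.1), random.gauss(0, 0.1)], "left")
    else:
        adapt([random.gauss(0, 0.1), 1.0 + random.gauss(0, 0.1)], "right")

print(decode([0.9, 0.1]))  # 'left'  -- the machine learned what this user means
print(decode([0.1, 0.9]))  # 'right'
```

In practice the conversation runs both ways: the decoder updates its weights while the user learns which muscle recruitment the decoder responds to.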
Speaker 5: This new form of control requires us to build an interface that adapts to you and your environment.
Speaker 2: Everything starts with a click.
Speaker 5: The intelligent click is the ability to do these highly contextual actions in a very low-friction manner.
Speaker 3: It's kind of the purest form of superpower: you are in control, but the system is inferring exactly the right thing for you to control. All you have to do to operate it is just click.
Speaker 5: So, for example, if I'm cooking and I'm pulling some noodles out of a box, the interface could ask me, would you like to start boiling the water? The wrist can also be a spot where the technology communicates back to the user.
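A minimal sketch of how an intelligent click could be wired up, assuming some upstream context recognizer: the system surfaces the single most likely action for the current context, and one low-friction wrist click either confirms it or is ignored. The context labels and actions below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Suggestion:
    prompt: str
    action: Callable[[], None]

# Hypothetical context-to-action table; a real system would infer context
# from cameras, location, time of day, and so on.
CONTEXT_ACTIONS = {
    "holding_noodles": Suggestion("Start boiling the water?",
                                  lambda: print("Kettle on.")),
    "approaching_door": Suggestion("Unlock the door?",
                                   lambda: print("Door unlocked.")),
}

def intelligent_click(context: str, clicked: bool) -> Optional[str]:
    """Surface the one most likely action for the context; a single
    wrist click executes it, and doing nothing costs nothing."""
    suggestion = CONTEXT_ACTIONS.get(context)
    if suggestion is None:
        return None
    if clicked:
        suggestion.action()
    return suggestion.prompt

print(intelligent_click("holding_noodles", clicked=True))
# -> prints "Kettle on." then "Start boiling the water?"
```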
Speaker 2: Haptics, the sensation of touch, is part of how we learn and use motor control, and it's critical to AR and XR.
Speaker 6: We wondered: as you pull back the string on a bow, if we tied that not to the tension growing on people's fingers, but rather to a squeeze on the wrist, would it add to that experience of pulling back the bowstring? The answer is yes.
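A sketch of the feedback mapping behind that experiment, assuming a wristband actuator that accepts a normalized pressure value: draw distance on the virtual bowstring is mapped to squeeze intensity on the wrist. The linear mapping mirrors an ideal spring (tension grows with draw) and is only illustrative; the function and parameter names are made up.

```python
def wrist_squeeze_for_draw(draw_fraction: float, max_pressure: float = 1.0) -> float:
    """Map bowstring draw (0 = slack, 1 = full draw) to haptic pressure on
    the wristband, so growing string tension is felt as a growing squeeze.
    Linear here, like an ideal spring; a real mapping would be tuned by feel."""
    draw = min(max(draw_fraction, 0.0), 1.0)  # clamp to the valid range
    return draw * max_pressure

# Each frame of the VR archery interaction updates the squeeze as the user pulls.
for frame_draw in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"draw {frame_draw:.2f} -> squeeze {wrist_squeeze_for_draw(frame_draw):.2f}")
```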
Speaker 3: That future really is the computer seamlessly integrated into your day-to-day life. The next computing platform is the mixed reality platform, the one that totally blends your virtual environment and your real environment in a seamless way.
Speaker 2: We're in this moment where we can move from personal computing to personalized computing.
Speaker 1: What if you and a computer agreed to design a keyboard together, and you could type faster on it than anybody else in the world could type on your keyboard?
Speaker 2: I think what this enables is the ability to not have to focus on a computer or phone and still be able to interact with other people. It's going to open up a new generation of communication and access to navigation.
Speaker 1: It leads to this phenomenon of increased agency, of feeling a level of control you've never had before. We want computing experiences where the human is the absolute center of the entire experience.