Speaker 1: Imagine a future VR/AR headset where you're using your brain to control everything, where there's a loop of feedback between what your brain is saying, what the computer is saying and everything else. That's a glimpse of what I saw in Brooklyn wearing a headset named Galea. Galea is a collaboration between OpenBCI's sensor array and a VR headset maker named Varjo, which has very high-resolution VR displays and the ability to mix them with the outside world via pass-through cameras, [00:00:30] in a similar way to the Apple Vision Pro. Now, the idea here is to create a platform, something where there are enough sensors looking at enough little bits of information to study what could possibly be at play in the future, and to develop new types of interfaces. So when I visited, I got a handful of demos that were little glimpses of the types of things this does, but it's not fully baked into a product per se.

Speaker 1: This is a research platform. The Galea headset has a number [00:01:00] of sensors, and there are so many acronyms here that it may be numbing. There's EEG, electrode arrays that measure your brain's electrical activity. There's EMG, which measures the electrical impulses of motor neurons and muscles. There's EDA, which measures the sweat level in your skin; it's a stress-sensing type of sensor. There's PPG, which is optical heart rate. There's also eye tracking, done in a similar way to VR and AR headsets that are already out there, including [00:01:30] PlayStation VR2, Quest Pro and Apple Vision Pro. All of this is meant to combine and maybe open new doorways for interaction. One demo I got was controlling a little cat, moving it back and forth using tiny muscle movements in my face, picked up by the EMG sensors.

Speaker 1: This is the type of technology that Meta is also exploring on the wrist. In fact, that cat game demo is something they've been using for their neural input testing [00:02:00] on the wrist. OpenBCI thinks the inputs and interactions here are more useful on the head than on the wrist, with the idea that maybe you'll use hand tracking and eye tracking plus a whole neural array of interfaces in the future. I also got to try a kind of synesthetic feedback demo where I was able to see my different brainwave states, and the colors and sounds in the room would begin to adjust based on which brainwave I had activated.

Speaker 2: You get to see and hear your mind for the first time, essentially. [00:02:30] Wow.

Speaker 1: It was a bit trippy, and I didn't really know if I was making it work or how I was making it work. It reminded me of different meditation apps I've tried that use brain sensors. That was interesting, and again, it was meant to show that there could be some level of biofeedback in this. In fact, a lot of neurotech involves this concept of feedback. And the headset itself, the sensor array, can be used independently of the VR part; that's the idea, that it could be a little more flexible and not have to be a VR/AR interface. [00:03:00] But one of the things I had to get used to was that the electrodes for the EEG are these soft, rubbery tips that you push into your hair to try to get a perfect reading. So you kind of have to dig them in.

Speaker 1: It doesn't like thick hair that much. In the prototype I demoed, I used ear clips to get the heart rate sensing, although it'll be built in eventually, and the face mask had a lot of the other sensors, so it has to press in tight against your face.
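To make the cat demo a little more concrete: below is a minimal sketch of an EMG "flex to move" loop, written against BrainFlow, the open-source streaming library OpenBCI supports for its hardware. This is not OpenBCI's actual demo code; the synthetic board, the channel choice, the window size and the threshold are all illustrative assumptions, and a real Galea session would use its own board ID and facial EMG channels.

```python
# Sketch: detect a facial-muscle "flex" from a streamed biosignal window.
# Assumptions: BrainFlow's synthetic board stands in for real hardware,
# one channel is treated as facial EMG, and the threshold is arbitrary.
import time
import numpy as np
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

board_id = BoardIds.SYNTHETIC_BOARD.value           # stand-in for real hardware
board = BoardShim(board_id, BrainFlowInputParams())
board.prepare_session()
board.start_stream()

fs = BoardShim.get_sampling_rate(board_id)          # samples per second
channel = BoardShim.get_eeg_channels(board_id)[0]   # pretend: a facial EMG channel
THRESHOLD = 50.0                                    # would be calibrated per user

try:
    for _ in range(100):                            # poll for ~10 seconds
        time.sleep(0.1)
        window = board.get_current_board_data(int(0.1 * fs))[channel]
        if window.size == 0:
            continue
        # Muscle activation shows up as a burst of signal energy, so a
        # rectified mean amplitude works as a crude "effort" envelope.
        effort = np.mean(np.abs(window - window.mean()))
        if effort > THRESHOLD:
            print("flex detected -> nudge the cat")  # game input would go here
finally:
    board.stop_stream()
    board.release_session()
```

In practice systems like this band-pass filter the signal first (surface EMG energy sits roughly in the tens to hundreds of hertz) and calibrate the threshold per user, which is part of why these demos involve a bit of training.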
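The synesthetic demo points at the classic neurofeedback recipe: estimate how much of the signal's power falls into standard EEG bands, such as alpha at roughly 8-12 Hz, over a short sliding window, then map those levels onto color or sound. The sketch below is a generic, self-contained version of that idea, not Galea's actual pipeline; the sampling rate, band edges and fake input signal are assumptions.

```python
# Sketch: relative EEG band power from one window, the raw material a
# neurofeedback loop would map to color or sound. Illustrative only.
import numpy as np

FS = 250                                              # assumed sampling rate, Hz
BANDS = {"alpha": (8.0, 12.0), "beta": (13.0, 30.0)}  # textbook band edges

def relative_band_power(window, fs, lo, hi):
    """Fraction of 1-45 Hz power that falls in [lo, hi), via a periodogram."""
    tapered = window * np.hanning(window.size)        # taper to reduce leakage
    psd = np.abs(np.fft.rfft(tapered)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    band = psd[(freqs >= lo) & (freqs < hi)].sum()
    total = psd[(freqs >= 1.0) & (freqs < 45.0)].sum()
    return float(band / total) if total > 0 else 0.0

# Fake one second of "EEG": a 10 Hz (alpha-band) rhythm buried in noise.
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(FS)

for name, (lo, hi) in BANDS.items():
    level = relative_band_power(window, FS, lo, hi)
    # A feedback system would map `level` onto hue, brightness or pitch here.
    print(f"{name}: {level:.2f}")
```

One-second estimates like this are noisy, which is part of why it's hard to tell, moment to moment, whether you're "making it work"; real feedback systems typically smooth these levels over several seconds before driving the visuals.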
Speaker 1: So there's a lot of stuff going on in that headset, [00:03:30] but again, it's meant to get a lot of different types of readings. One of the most interesting to me is certainly EEG. EMG is fascinating in that it senses muscle movement, but EEG is a little more elusive: it's looking at brain activity, and it's hard to know when that's happening and when it isn't, whereas EMG, being based on muscle impulses, is a little more tactile for controlling things. Heart rate and EDA, the stress sensing, I've seen on smartwatches, and eye tracking, [00:04:00] which is also on this headset, is becoming a pretty common standard across VR and AR headsets.

Speaker 1: And Apple has now opened up the doorway to eye tracking plus hand tracking as a standard. What this technology is looking at is whether there are ways we can refine that by also adding some extra inputs from around our brain. OpenBCI has also explored ways you could use this headset to control things like drones. A neurohacker named Christian [00:04:30] Bayerlein worked with OpenBCI and was able to wear the headset and control a drone using muscle impulses in his face at a TED Talk demonstration. This was meant to show not only the ability to control something like a drone, but also to open up pathways of accessibility for this type of technology. The whole field of neurotech comes with a lot of questions: in terms of privacy, in terms of how much of this works, how much of it extends into our personal lives or [00:05:00] external lives, and how much of it we feel like we're able to control.

Speaker 1: I don't know. It's fascinating, and what I see in the technology OpenBCI is building is a doorway to sensor technology that's already out there and may begin interconnecting with all the other devices we already have. We're already looking at our eyes and our ears, but what about all of our other inputs? This is the beginning, I think, of exploring a new wave of wearable tech that's going to be coming in the next decade. Anyway, those are my thoughts right now. I'm sure [00:05:30] we're gonna be looking at neurotech a lot more. Thanks for watching, and make sure to like and subscribe below. Thanks.