
I Wore the Future With a Brain-Connected AR-VR Headset

The next frontier might be neurotech: OpenBCI's Galea headset, along with advances in assistive controls, points to a wild, wearable road ahead.

Scott Stein, Editor at Large

A prototype version of OpenBCI's Galea, a sensor-studded brain-sensing VR/AR headset array.

Scott Stein/CNET

A few weeks ago, I saw the best quality mixed reality headset with an interface controlled using my fingers and eyes: Apple's Vision Pro. But a few months before its announcement, I saw something perhaps even wilder. Clips on my ears, a crown of rubbery-tipped sensors nestled into my hair and a face mask lowered in front of my eyes. Suddenly I was looking at my own brain waves in VR and moving things around with only tiny movements of my facial muscles. I was test driving OpenBCI's Galea.

The future of VR and AR is advancing steadily, but inputs remain a challenge. For now, it's a territory moving from physical controllers to hand- and eye-tracking. But there are deeper possibilities beyond that, and they're neural. 

Watch this: I Wore the Future With OpenBCI's Brain-Sensing VR Headset, Galea

I don't even know how to describe my experience trying the Galea headset, because, well, it's a platform to explore what happens next. And my experiences in neural tech are still embryonic.

OpenBCI, a Brooklyn-based company building research tools for noninvasive brain-computer-interface technology, has been adapting its own sensor systems into a mixed-reality headset called Galea, which will become available later this year. I tried a prototype version of Galea at OpenBCI's Brooklyn offices, curious about how brain-computer interfaces could work in VR and AR. I was also wondering what the future could hold for our interactions with computers in general.


An OpenBCI team member demos Galea, which runs on Steam VR. (To see me in the headset, watch the video.)

Scott Stein/CNET

A new sensor platform for VR and AR

I've tried simpler sets of EEG sensors focused on a particular visual task. NextMind allowed me to focus on particular spots to trigger actions, for example. (NextMind was acquired by Snap last year.) It felt almost like a mouse click, but with my mind. OpenBCI's Galea, by contrast, is a complete mix of all sorts of sensors: EEG, EMG, EDA, PPG and eye tracking. It's an acronym festival.

EEG, or electroencephalography, measures the electrical activity of brain signals. OpenBCI measures this with rubbery-tipped sensors that push in close to my scalp, much like the NextMind hardware I tried back in 2021. The electrodes work when dry, but need to stay clear of too much hair to get a good signal. 

EMG, or electromyography, measures nerve and muscle electrical activity. Here, the sensors sit around a facemask on the headset, pressing in around my forehead, eyes and cheeks. When I make small movements of my facial muscles, the sensors pick up the corresponding electrical signals. But unlike face cameras on VR headsets like the Quest Pro, which look for actual physical movement, the readings here are purely electrical. You could theoretically make movements so small they'd be more like simple neural impulses. Meta is also developing EMG technology for wristbands that could be used with future headsets. But that wrist tech only measures finger and hand movement via signals at the wrist. OpenBCI's sensors are looking at the face.


A look into the Galea prototype's headband: The rubbery-tipped EEG electrodes are positioned close to the scalp for better readings.

Scott Stein/CNET

EDA, or electrodermal activity, is an electrical measurement of sweat on the skin. It's often used for stress-sensing: Fitbit built its own EDA sensors into its Fitbit Sense smartwatch for stress measurements. OpenBCI's EDA sensors are on the forehead part of the headset.

PPG, or photoplethysmography, is optical heart-rate sensing, similar to what's already on most smartwatches. On the final Galea headset, PPG is measured on the forehead as well. But in my demo, I wore earclips that measured PPG.

The sensor array is married to an existing VR-AR headset, the Varjo XR-3 (or the lower-cost Varjo Aero), and connects to a PC to run the software and analyze the data. Varjo's high-resolution display and passthrough video mixed reality give OpenBCI's sensors a lot of software possibilities to work with in VR and AR-type scenarios. But OpenBCI's sensor array can work independently of a VR headset, or connect with others.
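For a rough sense of what that PC-side analysis can look like, here's a minimal sketch using BrainFlow, the open-source SDK OpenBCI points developers to for streaming data from its hardware. It uses BrainFlow's built-in synthetic board rather than assuming any Galea-specific device ID, so the numbers are simulated, but the flow is the same idea: open a stream, grab a few seconds of samples and estimate EEG band power.

```python
import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds
from brainflow.data_filter import DataFilter

# Synthetic board: generates fake data so the sketch runs without hardware.
params = BrainFlowInputParams()
board = BoardShim(BoardIds.SYNTHETIC_BOARD, params)

board.prepare_session()
board.start_stream()
time.sleep(5)                      # collect roughly 5 seconds of samples
data = board.get_board_data()      # 2D array: one row per channel
board.stop_stream()
board.release_session()

eeg_channels = BoardShim.get_eeg_channels(BoardIds.SYNTHETIC_BOARD)
sampling_rate = BoardShim.get_sampling_rate(BoardIds.SYNTHETIC_BOARD)

# Average band powers across the EEG channels: delta, theta, alpha, beta, gamma.
avg_powers, _ = DataFilter.get_avg_band_powers(data, eeg_channels, sampling_rate, True)
print(dict(zip(["delta", "theta", "alpha", "beta", "gamma"], avg_powers)))
```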

Apple's Vision Pro could also be an ideal platform for OpenBCI because of its processing power and standalone function. According to OpenBCI's CEO and co-founder, Conor Russomanno, working on something like Vision Pro, or future AR and VR platforms, is totally possible. He sees Apple's recent moves as emphasizing the computer aspects of mixed reality, which is exactly how OpenBCI thinks of the opportunity, too.


Christian Bayerlein learned to use Galea to control a drone using EMG sensors measuring facial muscle movements.

Björn Lubetzki

Accessibility goals

OpenBCI's sensor array could pursue multiple possibilities at once. It's not about one particular goal, but about how the system's sensors could enable research and also open doorways to interactions with computers. Most recently, OpenBCI worked with Christian Bayerlein, a hacker with spinal muscular atrophy, who used OpenBCI's sensor array to control a drone with facial muscle impulses. That presentation, given as a TED Talk, was a demonstration of how brain-computer interfaces could open up new doorways of accessibility and control of virtual and real-world tech.

My demos included one EMG-based control game called Cat Runner. I moved a cartoon character back and forth with little movements of my facial muscles, which were recognized by the EMG sensors in the Galea facepiece. This is the same game Meta has been using to test and demonstrate its neural input wristband tech, which I saw last fall at Meta's Reality Labs Research headquarters in Redmond. But Meta is looking at sensing movement at the wrist, whereas OpenBCI's Russomanno sees better opportunities on the head, where the sensors won't interfere with existing efforts in camera-based hand tracking.
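To give a flavor of what that kind of input mapping involves, here's an illustrative sketch, not OpenBCI's or Meta's actual code: rectify two EMG channels, smooth them into amplitude envelopes and fire a left or right command when an envelope crosses a threshold. The channel assignments and threshold values here are made up and would need per-user calibration in practice.

```python
import numpy as np

def emg_to_command(channels, fs=250, threshold=12.0):
    """Map two EMG channels (say, left and right cheek) to a move command.

    Rectify each channel, smooth it with a 100 ms moving average to get an
    amplitude envelope, then check whether the envelope crosses a threshold.
    The threshold is an illustrative value, not a calibrated one.
    """
    window = int(0.1 * fs)
    kernel = np.ones(window) / window
    envelopes = [np.convolve(np.abs(ch), kernel, mode="valid") for ch in channels]
    left_active = envelopes[0].max() > threshold
    right_active = envelopes[1].max() > threshold
    if left_active and not right_active:
        return "move_left"
    if right_active and not left_active:
        return "move_right"
    return "idle"

# Fake signals: baseline noise on both channels, plus a burst on the right one.
rng = np.random.default_rng(0)
left = rng.normal(0, 2, 500)
right = rng.normal(0, 2, 500)
right[200:300] += 30 * np.sin(np.linspace(0, 40, 100))
print(emg_to_command([left, right]))   # prints "move_right"
```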

EMG technology is meant to sense electrical impulses so subtle that perhaps no muscles seem to move at all, but that level of relationship between sensors, algorithms and human input could take a while to finesse. OpenBCI's multiple types of sensors could provide a ton of data that could indicate future directions for research, or new interfaces. They can also provide feedback on how using VR and AR affects the brain or attention. There have been previous efforts to use sensors on VR headsets to study cognitive processes, including the HP Omnicept, which had a heart-rate sensor, and most eye-tracking-enabled headsets.

Another demo, for the EEG sensors, created a meditative "synesthesia room" where my different brain-wave states were turned into colors of ambient light. My brain waves apparently changed the colors I saw. I started to try to see if I could focus in certain ways to bring out different colors. It seemed to be working? Feedback is a key part of how many of OpenBCI's sensors work, and of how they could eventually be used to train and improve how we control things with our own neural impulses.
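As a loose illustration of that kind of neurofeedback mapping, my guess at its shape rather than OpenBCI's implementation, relative band power could be turned into an ambient light color along these lines: more alpha (relaxed) pushes toward blue, more beta (alert) toward red, more theta toward green.

```python
def brainwaves_to_color(band_powers):
    """Turn relative EEG band power into an RGB ambient-light color.

    band_powers: dict of 'alpha', 'beta' and 'theta' power estimates, for
    instance from BrainFlow's get_avg_band_powers. The mapping is purely
    illustrative, not OpenBCI's actual one.
    """
    total = band_powers["alpha"] + band_powers["beta"] + band_powers["theta"]
    red = int(255 * band_powers["beta"] / total)     # alert
    green = int(255 * band_powers["theta"] / total)  # drowsy, meditative
    blue = int(255 * band_powers["alpha"] / total)   # relaxed
    return (red, green, blue)

print(brainwaves_to_color({"alpha": 0.6, "beta": 0.2, "theta": 0.2}))  # bluish
```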


The design of the final version of Galea (left) versus the prototype I tried. You can see the rubbery-tipped EEG electrodes in the prototype's headband.

Scott Stein/CNET

Russomanno sees what Bayerlein is doing with drone control, using EMG to let a system extend his own brain and body functions, as a sign of how neurofeedback will change how we interact with computers just as much as AI will.

"It's not to say that AI is not useful, it's just not the end-all be-all, not the only holy grail solution that's going to change the world," Russomanno said. "We actually like neurofeedback, and really smart UI, and really smart design, which is optimizing the other direction of the loop. Using technologies to make computers teach humans better is this other cool revolution that we're going to go through."

It reminds me of the path of smartwatch tech, where optical heart-rate sensors started to open up flows of data that resulted in new health features on watches over time. Fitbit layered multiple new sensors on its Sense watch, which also included an EDA stress sensor. Is OpenBCI's Galea, and efforts like it, going to open up new doorways to future wearable sensors that interface with what we see and hear, and what our hands interact with?

Russomanno is certain of it. He sees the arrival of better standalone VR and AR headsets, including Apple's, as a pathway to new inputs and peripherals. 

"We're just not going to be able to know until these headsets are out there, and people start building the bi-directional applications," Russomanno said to me via video chat months after our demo, referring to more advanced AR and mixed-reality devices to come. "The cool thing about an AR headset is that it has every external world sensor that you would want to know about anything about your local environment. And then what we do is like the internal world. When you bring those two datasets together, we still just don't know what's going to be possible."

While Russomanno mentions neurofeedback as compared to AI, I also think of the dovetailing of both. AI needs datasets to work its magic; so, too, do future sensory tech systems. As neurotech evolves, the possibilities of AI co-evolving with it do too.


Galea is a sensor platform fused with a VR/AR headset, but the sensor array could also work without VR and AR. Future ambient computing possibilities could emerge.

Scott Stein/CNET

A sensor platform that could expand beyond headsets

OpenBCI's Galea is actually a VR and AR headset, but its interface with Varjo's hardware is one part of the equation. The sensor array could also be used on its own. That interests me even more when I think about a world of future wearables that could eventually interact with other wearables on our bodies. A world where our everyday interactions are possibly enhanced by more advanced sensors. That's all a long way off, but the beginnings of that future look like they're taking root in some of the sensors OpenBCI's put together in Galea. 

Right now, it's enough of a struggle to convince people of the value of VR and AR and wearable visual tech. But improving the ways we can eventually interface with spatial computing or the real world might be part of the answer to how VR/AR could evolve into something far more meaningful…and possibly unsettling. It already feels like personal tech is developing a deeper relationship with our senses and our brains. But what I've seen suggests we haven't even gotten started yet.