After laying off over 11,000 employees and running into widespread skepticism about its huge metaverse investments, Meta has been one of the big tech companies hammered hardest in 2022. It didn't help that the company also raised the price of its 2-year-old consumer VR headset, the Quest 2, by $100 over the summer. To cap it off, gaming industry pioneer and former Oculus CTO John Carmack announced his departure as consulting CTO last week with a sendoff that accused Meta of "self-sabotage."
This, after a 2021 in which Facebook's vision for the metaverse triggered a company-wide rebranding to Meta and massive spending on a VR/AR future that CEO Mark Zuckerberg is convinced is inevitable.
For now, Meta is sticking with its metaverse plans. A Dec. 19 post from Meta's CTO, Andrew Bosworth, detailed the continued commitment the company plans for 2023 and beyond. It'll be a challenge, though. Meta's most recent VR headset, the Quest Pro, is more of an expensive developer device for future technologies than an everyday headset, and it comes with a massive price tag. And in 2023, the more affordable Quest 2 will face rising competition from companies including Sony, HTC and maybe Apple. But despite Meta's brutal year, the tech industry's interest in AR and VR shows no sign of fading.
The last time I saw Bosworth was at Meta's Redmond, Washington-based Reality Labs research headquarters, where I looked at emerging technologies including neural input wristbands. Those wristbands are still years away from becoming a reality, but what happens in the meantime? The company's Quest Pro, with its mixed reality and face-tracking capabilities, is the most likely pointer to where things are heading in the nearer future.
Bosworth answered some of our questions about what comes next for Meta. This conversation was edited for clarity and length.
With everything that happened economically in 2022, how do you feel the metaverse played out compared with what you imagined it would be?
Bosworth: Horizon, for all of the snarky tweets about it, is not like this big expensive thing. In the scheme of Reality Labs, it's quite a small fraction of the overall work that we're doing, and tremendously important. We will continue doing that, and have ample funding from a very successful core business to keep doing that indefinitely.
There was a leak of a post that Vishal [Shah], who runs that program, had put up, talking about getting the quality in order before we go multiplatform, which is the next obvious step. If you look at people like VRChat and Rec Room, and other people who are offering really compelling social experiences, a lot of that usage is cross-platform. That's still a huge opportunity for us, but we don't want to take that step until we feel like we're proud of the core product.
Certainly with the economy changing, we're constantly looking at our plans and trying to figure out how we can be more efficient, and making some big, tough changes. Sad to say goodbye to Portal, a product that I love, that customers who had it loved, that was well reviewed. But it was growing kind of slowly, and we just decided it wasn't a bet we could afford anymore. In the case of Horizon, it's very core to what we do as a company, and it's not the most expensive thing, so it can continue unabated.
The Quest has been one of the more affordable VR products. Do you imagine that even in an economic downturn, VR growth still looks the same?
Bosworth: You've got macroeconomic pressure that's putting pressure on entire categories, we're not exempt from that. In every change in the economy, every recession, some things outperform. One thing that we've been trying to push on for VR for a long time is moving beyond just entertainment. We love entertainment, we're committed to that, that's a core use case for us, gaming and entertainment. But it's not all we do.
If you're using this device, and you say, hey, what if instead of going down and paying for a gym subscription, I pay less for a VR fitness subscription and I'm working out from home? As you expand the usefulness of the device, you improve its capacity to grow, in spite of or even because of macroeconomic changes. We're obviously in the middle of that journey. We're not there yet. Obviously, you expect some impact from the economy on us. But that's one of the reasons we want to continue to increase the usefulness of the device.
Do you think there'll be any sort of philosophical change in terms of having more subscriptions on Quest [versus app purchases]?
Bosworth: Different applications have different business models. In the case of something like Supernatural, you have an ongoing investment in trainers, and fresh worlds, and fresh routines. It makes sense that you have to have a recurring revenue stream for that to be a viable business. And that's pretty different than something like Beat Saber, where we do these DLCs, they're releasable packs, and they don't have to be every week, every day.
I think I joined quite a few people, probably including our friends in Cupertino, who felt like the App Store model not supporting subscriptions early enough really had a big negative impact on the software available for the iPad, and the software available for phones. If you were Adobe, and you have these big software packages that need to be kept up to date, and people rely on them as tools, there's a mismatch in the business model that was supported. So we're trying to support all these different business models as soon as we can to get those things off the ground.
I was thinking of how Apple bundles things in, or Microsoft does with Game Pass. Would you ever consider that with Quest?
Bosworth: You're still going to see those types of behaviors, from us and from everybody, where you say, hey, let's collaborate, we'll prebuy a certain number of copies so that we can package it in with the hardware or so we can deliver it as part of a value-added bundle for consumers. If we drive enough volume with it, it can be positive for us. Those types of things are certainly totally possible.
We're all, as an industry on the gaming side of things, looking with eager anticipation at Microsoft's exercise there. As a gamer myself, I love it. However, there's a lot of IP that they're never going to be able to deliver as part of their subscription program. Am I really going to end up net saving money versus going a la carte? It's not obvious to me. I think there's a lot of ground to cover as an industry to learn more about that. For now, I think we're pretty content to just continue to connect developers with consumers and work that way.
You mentioned social apps are getting the most traction [on Quest]. For Horizon, you mention it going cross platform. Do you see it interconnecting more with other metaverse platforms? I know it's getting more interconnected with Facebook. Do you see the metaverse becoming more a part of social media [like Twitter] in the immediate future?
Bosworth: Social media has probably never been sufficiently well-defined as a term. We call Horizon a synchronous social network. There's a natural integration there with Messenger and WhatsApp – calling and video calling is more synchronous. How do you know that I'm online and that it's actually gonna be fun to get in there? The answer is probably through Facebook and Instagram and these other tools where you might be exploring your phone. Even if your functionality is a little bit more limited, can you just drop in with me from your phone and be there with the community? We see those integrations as very promising over time. We see Horizon as a part of the fundamental fabric of applications that we already build as a company.
It seems like an opportunity for live chat spaces, like Twitter Spaces, that type of thing?
Bosworth: We don't know exactly what shape it'll take. As somebody who has a pretty long history of creating these social spaces, I'm a little bit loath to be prescriptive about it. I thought Kashmir Hill's article on Horizon was excellent. She found these niche creator communities that were tremendously supportive and positive and fun. That's a kind of "find your people" story that we love about the internet, those of us that have been on it for a long time. There's certainly going to be a lot of people who only want to go and hang out with people they already know. And they should be able to do that. And there's going to be people who explicitly want to explore people they don't know, and what those communities are about. I'm not sure it's going to be one thing.
You also get an equal amount of trouble trying to be all things to all people. So it's not that Horizon doesn't have a point of view – it does – it's that things should be fun and social, and done better together. But we do want to give people that decision, who's a part of that togetherness story. That's the balance we're trying to strike, between having a strong opinion about what this space is, and also letting people make it their own.
Regarding Quest Pro, and the role that it plays now, do you see mixed reality and eye and face tracking as a kind of a foot in the door? What do you see as emerging opportunities for those? It feels in a lot of ways like a developer kit. What does that end up meaning in 2023?
Bosworth: The nice thing about face tracking and eye tracking is, those are scalable features. If you don't have them, and I do, it's still better for both of us than if neither of us has them. We don't both have to have them to appreciate the value that they bring. Eye tracking, of course, has secondary benefits in terms of efficiency with which we render things through gaze-foveated rendering, which will get better over time. Eye tracking has long-term technical importance. Whereas face tracking is a feature that anybody doing anything with social presence is going to value. I think a little bit differently about things like hand tracking and mixed reality – they are core fundamental technology that if you don't have them, these experiences are just not available to you.
We are trying to differentiate where we can on where we think there is a bar of quality that should be demanded by consumers and developers alike. One of the things that we really want to work on for 2023 is making hand tracking an easier part of how you integrate with the system. Just put the headset on and rely on just your hands until you get to the content that needs the controllers. And on mixed reality, yeah it's early there because before this device came out, there wasn't a great development environment for that kind of technology. That is the growth phase we're talking about. We obviously have a lot of use cases already built in around the Guardian, and ways that keep you safe and make it feel comfortable. But that's just the tip of the iceberg. And we're excited to see what people come up with when they play with it.
You get to VR 2.0 when you get to a baseline that every single headset should be measured on by consumers and developers. Without mixed reality, and without hand tracking, you're going to miss out on experiences; there are going to be great things that you're not going to do.
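The gaze-foveated rendering Bosworth mentions is worth unpacking: the headset renders at full resolution only where the tracked eye is actually looking, and shades the periphery more cheaply. A minimal sketch of the idea looks like this (an illustrative toy, not Meta's actual pipeline; the radii and scale tiers are invented for the example):

```python
# Toy illustration of gaze-foveated rendering: return a shading-resolution
# scale for a pixel based on its distance from the tracked gaze point.
# All units are in pixels; the thresholds are invented for this sketch.

def foveation_scale(pixel, gaze, inner_radius=200.0, outer_radius=600.0):
    """Scale in [0.25, 1.0]: full resolution at the fovea,
    quarter resolution in the far periphery, linear falloff between."""
    dx = pixel[0] - gaze[0]
    dy = pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= inner_radius:
        return 1.0   # fovea: render at full resolution
    if dist >= outer_radius:
        return 0.25  # periphery: render at quarter resolution
    t = (dist - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t  # smooth transition band
```

The GPU savings come from the large peripheral area being shaded at a fraction of full cost, which is why Bosworth calls eye tracking an efficiency feature and not just a social one.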
You said 50% of Reality Labs research is focused on AR. Where do you see that in 2023? How much do you think AR is going to be surfacing [for Meta] next year?
Bosworth: As the old saying in technology goes, you tend to get less done in one year than you think and more done in 10 years. My confidence, by the end of the decade, in us having experienced really rich augmented reality is really high. We are still a few fundamental breakthroughs away from making this something that's scalable. We have line of sight on those problems and those breakthroughs – obviously, we could be surprised, they still have a lot of uncertainty around them. But we've got multiple paths we're pursuing in 2023. I think you're gonna continue to have people developing for mixed reality, a great deal of which will be useful in augmented reality. In our case, through the Spark AR platform available on mobile phones, but also hopefully, we can bring it to Quest Pro as well – I think you'll see a lot of advances there.
I think you'll see quite a few advances in experiences that are augmented reality-like – our Ray-Ban Stories glasses are a good example. It's not augmented reality, we wouldn't call it that. But you're getting yourself into the place where you've got form factors and sensors and microphones in the right place. Can you do some audio feedback, be a little more intelligent with the captures … what can you do inside of those spaces? And then in between those is a whole spectrum of products that you'll see: products that have a display, but aren't world-locked; products that have short battery lives, so there's on-demand usage throughout the day. There's a whole spectrum of products that will start to emerge over the decade, culminating in full AR glasses as you and I would think of them.
You mentioned in your post that this year was so defined by creative AI tools. How much do you see that factoring into what's going to be happening in VR and AR?
Bosworth: It's hugely important. We finally have kind of cracked the technology required on the infrastructure side … the user interface required to put the human in the loop, with OpenAI's models, with Midjourney and Stable Diffusion. And it's this union of a technology and a query language that puts the human in the loop that is a tremendously powerful tool. They're a little bit haphazard today. But they're very expansive, and they haven't really been focused on a single critical problem. That's the opportunity that we see. We are at the absolute vanguard here, in terms of the research we've done, and the technical capacity that we have on the AI side, to apply that in all kinds of domains.
Conventional domains – on messaging apps, in groups on Facebook – but also in VR and AR, where the ability to use natural language, or to have a rich set of sensor inputs serve as a query that produces an output based on its experience, is a very powerful tool. It's one of the challenges that we've always known about with augmented reality. It's a great "human in the loop" device from a display perspective, because the display is right there. But closing the loop with what your input to the machine is, is hard. We're focusing on EMG [electromyography], and these breakthrough technologies there. If you can, with fewer bits of information from me, get to a good outcome in the machine, [that's] tremendously powerful for augmented reality.
The history of humans with tools is that when tools become widely available, we find more uses for them. I do think of these generative AIs as tools. And so they democratize and they create a new opportunity that we didn't even conceive of when we created the tool initially.
How can VR dovetail better with workflows people already have?
Bosworth: We're thinking a lot about this. I'm pretty excited about the upcoming Microsoft integration that we announced. It's a pretty reasonable thing to believe that, over time, having a small device that takes the place of multiple expensive large devices, that you can take with you anywhere you go and have your workstation, is a positive development. There's a lot of work to get the text clarity there, to get the keyboard inputs right. We're not saying mission accomplished. There's use cases that are being built around VR, and so it's not a dovetailing question at all. If you're in architecture, if you're an automotive designer, it's like, yeah, of course, I want to do this in virtual reality, it was weird that we ever did it another way. I'm sure there will be workflows that are very hard to adapt to, because they're just very deep, integrated workflows. You take what's immediately adjacent to the technology that you're able to build, and you kind of grow from there.
You mention the evolution of avatars in 2023, and I got to see some of the more advanced avatars at the research labs. What's coming next?
Bosworth: I think this is a reasonably poorly kept secret: Our avatars, and we started from behind, were built for a very specific kind of visual appeal in a very specific environment, way back in Spaces, before Horizon was even a twinkle in the eye. And now that we're in Horizon, we want this to work, we want these to be much more robust to serve a lot more environments. We want these to be able to stretch from work to a fun game, to stickers in a family of apps, to animate your video calls on Messenger – we're just doing a lot of fundamental work on the avatar system itself, to be able to be flexible to all the different demands, to be able to look good, in all those different environments where the expectations are so different.
When you get into VR, and you experience it, this is actually pretty compelling. When you see a screenshot of it, it's not compelling. And there's a screenshot problem – a lot of our media is consumed in two dimensions, and it's hard to express those things. But that's a real thing, that people want to look good, whether they're in 2D or 3D. A lot of our work is to build out this kind of world-class, highly flexible system that adapts your same avatar specification to all these different environments really fluidly, and with different levels of detail as rooms get more crowded and we're trying not to tax the compute. So that's the kind of thing that's happening.
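The level-of-detail idea Bosworth describes, where avatar fidelity drops as a room fills up so rendering cost stays bounded, can be sketched in a few lines (a hypothetical illustration; the triangle budget and LOD tiers are invented, not Meta's numbers):

```python
# Toy illustration of avatar level-of-detail (LOD) selection: as more
# avatars enter a room, each one gets a smaller share of a fixed
# triangle budget, clamped to discrete LOD tiers. Numbers are invented.

def pick_avatar_lod(num_avatars, triangle_budget=600_000):
    """Return the triangle count per avatar for the chosen LOD tier."""
    tiers = [50_000, 20_000, 8_000, 2_000]  # high -> low detail
    per_avatar = triangle_budget // max(num_avatars, 1)
    for tier in tiers:
        if per_avatar >= tier:
            return tier  # highest tier that fits the per-avatar share
    return tiers[-1]     # crowded room: everyone drops to minimum detail
```

With a handful of avatars everyone renders at the top tier; with hundreds, the system degrades gracefully instead of blowing the frame budget.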
How do you see expressiveness [with face tracking] translating to people who don't have face tracking? When one person can move their face and one person can't, how does that get resolved in social spaces?
Bosworth: We've been pretty delighted at people's response to our efforts to infer facial expression from tone of voice. Using your hands to express different emojis, both with the hands and with your face, those have been pretty popular. As we have done in text messaging, we are finding avenues to express ourselves. Our generation has decades of experience adding emotion to an otherwise emotionless line of text. We're actually quite good at this. But reducing that mental burden on people is one of the things that we think face tracking and eye tracking can help accomplish. So I'm pleased with how much we've been able to fake it with the existing tools. I don't think it's wildly out of sync for two people, one with the functionality, one without, to be interacting together. But the person without it does have to kind of really express themselves through the tools available, as opposed to doing it naturally.
Do you see opportunities for new controllers or peripherals in the near term? Some companies are looking at body trackers and wearable trackers, and you're working on EMG. Do you feel these [VR] controllers are the way to go, do you imagine more opportunities there?
Bosworth: We're pretty happy with the Touch controllers. We're lucky, we have a world expert team on inputs. There isn't one input method to rule them all. For a lot of games, going to a hands-based model … even if you had the perfect tracking, it wouldn't be satisfying to me as the gamer. Having access to controllers for certain experiences is probably always going to be great. It doesn't mean they can't be improved. And we have some exciting ideas there: I'm sorry, I'm not going to share with you what those are.
I hope we continue to expand into more use cases that don't necessarily need them. You know, it's completely reasonable to imagine, if I'm using this device primarily to collaborate in a work environment, that I can just use my hands. I can just use a keyboard, a mouse and my hands, and it's totally fine. Likewise, you can certainly imagine specialized experiences that need their own control scheme, and some kind of an attachment. I've seen the Golf Plus attachments for the Touch controller, where you want to have that feeling of a golf club. I think it's pretty possible that someday we could ship a headset that didn't have controllers with it, for audiences whose use cases didn't need the extra weight and expense. But we're not there yet.