I am at Microsoft's headquarters to get a demo of the HoloLens 2. I'm led down a basement hallway. Large sliding metal doors separate rooms. As I reach my destination, I find a re-creation of an automotive workshop. It smells like rubber tires. In front of me, there's an ATV on a pedestal. Tools and parts are spread around the room.
I'm handed a HoloLens 2 and told I'm going to learn how to fix the bike.
Microsoft's HoloLens is back, and retooled. The new version of the company's 3-year-old mixed-reality headset, which is aimed at businesses and is coming later this year, fits easily over my head and glasses. It feels like an industrial tool, a welder's mask. I can see just fine. I go through the eye-tracking setup, and a grid of dots appears. I follow the dots, from one corner to the other, side to side. It works. Now, Microsoft's new Dynamics 365 Guides app launches. This is my instruction set.
The best way I can describe it is like Google Maps' turn-by-turn directions for real-world instructions -- or like a floating Lego manual for reality. I move my eyes over each step-by-step card that floats in the air in front of me. I'm told to put the bike into neutral. Now, a floating arrow arcs through 3D space to show me the gearshift on the bike and where I should move it. I do it. I move my eyes to the next step. Now, I adjust a loose bar on the bottom. A long dotted arrow arcs to a toolkit against the wall, pointing to a ratchet. I almost grab the wrong one, before realizing the arrow is pointing me to another tool. My eyes move to the next card. Another arcing line shows me the way to a bin of nuts, to grab the right part to screw in.
Video: Microsoft HoloLens 2: A first dive into the future of AR
Sometimes, the arrows don't line up perfectly. Would I grab the wrong tool, I wonder? Would the program correct me if I made a mistake? I should have tried grabbing the wrong tool. Still, the placement of pointing arrows in space feels weird, like a 3D navigator. My mind, weirdly, leaps to Nintendo and my experience with Labo, a game for the Nintendo Switch that expertly guides players to make elaborate cardboard creations through on-screen instructions. Imagine that, but in 3D, with the instructions spread out everywhere, pointing to real things. The world as a Lego kit.
There's no spider-zapping game here. I'm not playing Minecraft or Halo. Four years after the first HoloLens was unveiled, Microsoft has shifted its pitch: the original was positioned as a game-playing device in addition to being an enterprise device, but now the focus is squarely on work. The HoloLens 2 is a practical device for companies to help their employees. AR has become a tool for getting things done. The headset's improved comfort, better field of view, and better eye and hand tracking are immediately apparent. It's also connected to more Microsoft cloud services, which will fold into iOS and Android devices, too.
After the wild, Willy Wonka dreams Magic Leap conjured last year, Microsoft's HoloLens 2 feels like the ego to Magic Leap's id. But both sides might be needed to figure out where this tech needs to go next.
And this isn't for you -- the mass consumer market -- at all. This is for factory workers, in places that can spend thousands on a work tool. In fact, Microsoft may not have eyes on a consumer AR headset for years. For now, the company is just trying to make a better way to mix reality in practical ways... and patiently waiting for the apps, developers and the rest of the connected world to keep evolving with it.
Augmented reality isn't a pipe dream anymore. It's a field being explored by Apple and Google in phones, and plenty of headset-making hopefuls have been trying to crack how to design a comfortable, functional way to carry holograms around all the time.
The first HoloLens felt like an accomplishment because it was self-contained, had no wires and wasn't connected to anything at all. It just worked. Microsoft's new HoloLens 2 isn't really a whole new concept, but it pushes forward on three key things the last headset needed: eye tracking, a larger field of view and better hand tracking. It's also much more comfortable, and it allows people who wear prescription lenses, like me and my CNET colleagues, to use the headset by just slipping it on over our glasses. A Qualcomm Snapdragon 850 mobile processor drives everything, along with Microsoft's own AI engine, replacing the Intel processor of the previous HoloLens.
At a casual glance, the new design may look similar to someone who's never worn a HoloLens before, but it's notably less bulky and feels lighter, too. It does weigh less, by a fraction -- 566 grams (1.25 pounds) versus the original HoloLens' 579 grams (1.28 pounds) -- but it feels like much less because the weight distribution has shifted: a thicker section at the back now fits around the rear head strap, while the front-facing visor is smaller. The center of balance is now slightly behind the ears, and it's meant to feel like "putting on a baseball cap." I loosened the head strap, popped the headset over my glasses, and everything felt fine. The shift in balance makes the headset significantly less painful to wear for more than 5 minutes. It's like putting on a backpack with better-designed straps.
It even has a flip-up visor, which tilts up to let you make eye contact or do regular work more easily, much like the flip-up designs some of Microsoft's partners have made. I loved that I could stop in the middle of a demo to quickly clean my glasses, or scratch my forehead.
The HoloLens 2 hardware is still self-contained, just like the first one, with no extra belt-worn pack like the one the Magic Leap One uses -- but that also means the headset is bigger than Magic Leap's $2,295 head-worn goggles. (It's also more than a thousand dollars more expensive.) But it's much friendlier to my vision. The Magic Leap One requires me to put in contact lenses, or wait for prescription lenses for the headset; Magic Leap doesn't even support my prescription currently. The HoloLens 2 just works over my glasses. I know which one I prefer.
Eye tracking: A step away from mind reading?
Eye tracking hasn't been a big factor in VR and AR -- yet. But it will be. The first HoloLens didn't have eye tracking. The Magic Leap One does, and higher-end enterprise-targeted VR headsets like the HTC Vive Pro Eye and Varjo VR-1 are starting to include it. Eye tracking can recognize where you're looking with internal cameras, so you don't even have to move your head at all. The HoloLens 2 has added eye tracking, too.
The HoloLens 2's eye tracking has a double purpose: It can measure eye movement and use it to interact with virtual objects. Microsoft uses the new eye-tracking cameras for biometric security, too. The HoloLens 2 has iris scanning via Windows Hello, so users can instantly log in to Windows and launch their personal account, and the headset can remember their personal preferences.
More impressively, the HoloLens 2's eye tracking works with regular glasses, even thick ones like mine. Most eye-tracking tech I've used before had a few hiccups when I used my glasses. Early demos of the HTC Vive Pro Eye sometimes wouldn't work unless I loosened the VR headset, and similar things happened with a few experiences using Tobii's eye-tracking VR tech. No such problems happened during the few HoloLens 2 demos I had.
The only real use of eye tracking I got to experience was a brief demo showing how quick eye movements could pick a virtual object without my even moving my head. I made little virtual crystals explode by staring at them and commanding them to burst. But there are plenty of practical uses: companies already use eye tracking to create analytics and heat maps of where you're looking, to improve training.
The bigger-picture possibilities of eye tracking get a lot weirder. The HoloLens 2's eye-tracking cameras could, in theory, also measure your emotions via tiny eye changes, as well as where your gaze lands.
How much will HoloLens begin to anticipate what you're feeling, maybe even thinking?
At Microsoft's Human Factors Lab, where hardware is tested for comfort and accessibility, we step into a room surrounded by prototype headset models, and a table full of different rubbery ears. Microsoft's Senior Director of Design, Carl Ledbetter, shows us how the new headset's fit was carefully measured against a wide range of heads and ears, testing for fatigue and eye comfort. But also, in one corner, a mannequin head studded with a net of sensors sits on a table, looking like a Minority Report prop. It's an EEG-sensing headpiece.
"We use this to measure brainwave activity, and we can measure how much load is being put on somebody's mind," says Ledbetter. "We didn't necessarily use this much on HoloLens, but we see this as an opportunity ... we're using it on some other things."
"Mind reading on HoloLens 3?" CNET's Ian Sherr, who toured the Redmond, Washington facilities with me, asks.
"Mmm-hmm," Ledbetter says, maybe half-joking. Maybe not.
Finally, a wider field of view
Holographic magic isn't so magical if your ghostly beings get cut off mid-gaze. The first HoloLens had roughly a 30-degree field of view, which felt like viewing virtual objects through a window the size of a pack of playing cards held a few inches from your face.
The HoloLens 2 expands its field of view to 52 degrees, which Microsoft says is over double the effective viewing area. It feels like viewing holograms through a window the size of a softcover book. The vertical viewing area is taller, too. It makes a big difference when looking at tabletop holograms and monitor-size virtual displays. It still means some of the 3D effects are cut off, because my peripheral vision isn't covered by anything, and I can see in all directions. But still, it's considerably better than the Magic Leap One, which already had a larger field of view than the original HoloLens.
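Microsoft's "over double the effective viewing area" claim checks out with some back-of-envelope math. This sketch compares the two headsets' angular viewing areas; the per-axis figures (roughly 30 by 17.5 degrees for the original, 43 by 29 degrees for the HoloLens 2) are commonly cited estimates rather than numbers from this article, so treat them as assumptions.

```python
# Back-of-envelope: treat each field of view as a flat angular rectangle
# and compare areas in square degrees. Per-axis figures are estimates.
import math

def fov_area(h_deg: float, v_deg: float) -> float:
    """Angular viewing area in square degrees (small-angle approximation)."""
    return h_deg * v_deg

def diagonal(h_deg: float, v_deg: float) -> float:
    """Approximate diagonal field of view in degrees."""
    return math.hypot(h_deg, v_deg)

hololens1 = fov_area(30.0, 17.5)   # ~525 square degrees
hololens2 = fov_area(43.0, 29.0)   # ~1,247 square degrees

print(f"HoloLens 1 diagonal: {diagonal(30.0, 17.5):.1f} deg")
print(f"HoloLens 2 diagonal: {diagonal(43.0, 29.0):.1f} deg")  # ~51.9 deg
print(f"Area ratio: {hololens2 / hololens1:.2f}x")             # ~2.4x
```

Under those assumed figures, the diagonal lands right around the quoted 52 degrees, and the area ratio comes out near 2.4x -- comfortably "over double."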
Video: The creator of HoloLens 2 discusses its future
The effective resolution has increased, to the equivalent of a 2K display per eye versus the original HoloLens' 720p per eye, while the density of the image stays the same, at 47 pixels per degree. PPD is a way of measuring the density of pixels in optics, like pixels per inch on a phone or tablet. HoloLens chief Alex Kipman calls this effectively "retina" level resolution. I'd still say I can see pixels if I look really closely, versus my everyday eyesight. It's crisper than the typical VR headset, too. (Varjo's new VR headset has an even denser PPD resolution at its center, but not everywhere else.) But the hologram-like effects still look ghostly, much like on Magic Leap and the first HoloLens. They're bright and present enough to interact with in the indoor spaces I tried. Will they hold up in brighter settings, though? Microsoft says so.
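Pixels per degree is just pixels along an axis divided by the field of view along that axis. As a quick illustration -- assuming "2K" means roughly 2,048 horizontal pixels and the horizontal field of view is about 43 degrees, neither of which is an official spec from this article:

```python
# Pixels per degree (PPD): how many display pixels span one degree of
# your view. The 2,048-pixel count and 43-degree FOV are assumptions
# for illustration, not official HoloLens 2 specifications.
def pixels_per_degree(pixels: int, fov_deg: float) -> float:
    return pixels / fov_deg

ppd = pixels_per_degree(2048, 43.0)
print(f"{ppd:.1f} pixels per degree")  # ~47.6, in line with the quoted 47 PPD
```

The same arithmetic explains why the original headset could match the density with far fewer pixels: 720p spread over a much narrower window works out to roughly the same pixels per degree.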
The graphics punch isn't shockingly different from the first HoloLens -- except, that is, for how much more expansive the viewing area is. And based on raw processing power, Magic Leap might still be able to push better-looking graphics -- at least, when not leaning on the cloud.
It all works in ways that would make your eyes cross if you're not an optics specialist. Zulfi Alam, Microsoft's GM of Optics Engineering, explains how the displays work with many charts, and a magnifying glass. Like the first HoloLens, there are waveguides on the visor, which bend light and aim it into the eye. But where the original used LCOS (liquid crystal on silicon) displays, the HoloLens 2 uses a MEMS (micro-electromechanical system) laser-scanning display.
This is new to the HoloLens 2: the original HoloLens used a small display that limited the field of view more. The new display uses a mirror-based laser system that creates a 120-frames-per-second image with three lasers, literally drawn on the fly like an old-fashioned CRT monitor, but at high speed. It allows the extra-wide field of view, and it also means the nonlit areas are fully transparent. We stared at a tiny, fast-vibrating mirror on the table. Suffice it to say, it's complicated. But the end results look good.
Reaching out and almost touching things (with fingers, and even eyes)
There are no physical controllers with the HoloLens 2: Microsoft is relying completely on hand tracking and voice controls. Hand tracking has taken a pretty big step forward, too. The sensors can now recognize up to 25 points of articulation per hand, through the wrist and fingers, and can recognize the direction of your palms, which enables finger-bending, hand motion and an ability to pick things up. The first HoloLens used gesture-based finger clicks and simple moves. This time around, it's pinching, pulling, pressing. It feels significantly more advanced than the Magic Leap One's hand tracking, but I wasn't given that many chances to use it.
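One way to picture "25 points of articulation per hand" is as a per-frame skeleton: a list of named joints, each with a 3D position. This is a toy sketch of such a data structure -- the joint names and the 5-fingers-by-5-segments layout are hypothetical choices for illustration, not Microsoft's actual hand-tracking API.

```python
# Hypothetical sketch of a 25-joint hand pose, the kind of data a
# hand-tracking system might report each frame. Joint names and layout
# are illustrative only, not the real HoloLens API.
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    x: float
    y: float
    z: float  # position in meters, in some headset-relative frame

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
SEGMENTS = ["metacarpal", "proximal", "intermediate", "distal", "tip"]

def empty_hand_pose() -> list:
    """Build a 25-joint skeleton: 5 fingers x 5 segments. (Palm direction
    would be tracked separately, as the article describes.)"""
    return [Joint(f"{f}_{s}", 0.0, 0.0, 0.0) for f in FINGERS for s in SEGMENTS]

pose = empty_hand_pose()
print(len(pose))  # 25 joints per hand
```

A gesture recognizer would then compare joint positions frame to frame -- thumb tip approaching index tip reads as a pinch, for instance.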
I try out the whole experience in what Microsoft calls the "Shell Demo." In a living room-like setting off the Great Room on campus, I see a series of holographic objects on tables: a piece of an engine, a windmill. I walk up to them and see an outline of a box surrounding them. I'm told to grab a corner of the box and pull. I do and the whole hologram gets bigger or smaller, like dragging the corner of a window.
This is, literally, 3D windows. To move something, I stick my hand into the center of the object, make a fist, and move my fist around. I can also put both fists in, and pull my fists apart, and the object expands. It's weird. Not having haptic feedback is a little disorienting. But it all works.
I then see a glowing crystal-like shape across the room, near the real sofa. There's another box, but this time, a triangular play button. I press it with my finger. This starts a demo of eye tracking.
I look at four crystals and as my eye flits to each one, it sparkles. It feels effortless. I say "explode," and whichever one I'm looking at bursts. Microsoft then shows another example of eye tracking: A fluttering hummingbird hologram floats next to a Wikipedia-like text box with an article on hummingbirds. I read and as my eyes move down, the text box starts to scroll. Sometimes the scrolling is too slow or too fast, but I learn how to pace it with my eyes. The point being, eye control works to move things too, no hands necessary.
I've seen demos of eye tracking before, but even so, the whole package here is fascinating. I want to try something more artistic, even more advanced, but this demo was all I got, so I can't say how more detailed interactions could work.
What will this really feel like in the field?
I think back to my virtual fix-the-bike training session in Guides. I wonder: Would I remember these instructions later? Would the training hold, or would I become dependent on the turn-by-turn directions? Weeks later, as I write this now, I can't really remember what I was doing in that room. I'm reminded of how I become "Google Maps blind" sometimes when I drive, and how I forget where I'm going, submitting myself to the directions. Some cab drivers prefer to memorize maps and navigate from memory instead. I ask whether this type of navigational, step-by-step education might diminish learning -- or help it. It's a good question, I'm told.
Clearly, there may not be definitive answers yet. But the HoloLens 2 shows possibilities. The step-by-step guides are meant to be as easy to make as PowerPoint decks. Maybe, in some future, it'll be how people will leave instructions in the real world for others to find later and navigate, like ghostly guides. Google's playing in similar territory on phones with AR in Google Maps.
Connecting in the cloud
Microsoft is also emphasizing multiuser mixed reality this time around, leaning on cloud services through Microsoft's Azure that will place shared points in mixed reality that multiple people can experience at once, in HoloLens or even on phones -- via ARKit on iOS, for instance.
Imagine being able to share the same 3D object, working together on the same 3D model. Microsoft's Dynamics 365 Layout app will remember objects so they stay anchored to a location for others to find, and these "cloud anchors" will be stored in the cloud as well, making sure everyone's sharing the same vision -- even on iPhones, iPads and Android devices that use AR, via Microsoft apps that share the same AR tools.
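The core idea behind a "cloud anchor" is simple: a shared store maps anchor IDs to real-world poses, so every device that resolves the same ID places the hologram in the same physical spot. Here's a minimal in-memory sketch of that concept -- a toy model, emphatically not the actual Azure Spatial Anchors API.

```python
# Conceptual sketch of shared "cloud anchors": a server-side store maps
# anchor IDs to a pose, and every client resolves the same ID to place
# a hologram at the same real-world spot. This is a toy in-memory model,
# not the real Azure API.
import uuid

class AnchorStore:
    """Stands in for a cloud service shared by all devices."""
    def __init__(self):
        self._anchors = {}

    def create(self, position, rotation) -> str:
        """A device pins a hologram; the service hands back a shareable ID."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = {"position": position, "rotation": rotation}
        return anchor_id

    def resolve(self, anchor_id: str) -> dict:
        """Any other device looks up the same pose by ID."""
        return self._anchors[anchor_id]

# One device (a HoloLens, say) pins a hologram...
store = AnchorStore()
anchor_id = store.create(position=(1.2, 0.0, 3.4), rotation=(0, 0, 0, 1))

# ...and another device (a phone, via ARKit or ARCore) resolves the same ID.
anchor = store.resolve(anchor_id)
print(anchor["position"])  # same spot for everyone
```

The hard part in a real system -- which this sketch skips entirely -- is that each device must map that shared pose into its own coordinate frame, which is where the computer-vision world scanning comes in.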
We're only given a brief demo of group collaboration, in a room with a large, circular table, a HoloLens 2 on my head. Next to me are my CNET colleagues, Ian Sherr and Gabriel Sama. They're wearing HoloLens 2s, too. We gather around like an intergalactic delegation. On the table is a glowing 3D virtual map of Microsoft's campus, created in-headset. A woman rises from the table, stands and talks about the future plans for Microsoft's expanded campus. Ian and Gabriel are watching, too. We see the same thing, from three different perspectives. Above each of their heads, their name floats. Mine floats over my head, too, though I can't see it. This is a demonstration of where Microsoft wants to be next with mixed reality: collaborative, multiuser. To show it can all be cross-platform, a few assistants stand near us with phones in hand, demonstrating how the 3D holograms can be seen on a phone screen, too. We share the same thing.
I immediately notice how the field of view has improved. I see everything on the table without the effect being cut off.
It's not all perfect. Microsoft warns us ahead of time that we're using early engineering prototypes. Sure enough, the landscape of Microsoft's campus that spreads out in semitransparent 3D on the table ends up tilting a bit when I turn around and look at the room I'm in. It readjusts. But it makes me wonder: If corporate customers come to rely on spatial computing to precisely render on top of reality, how many glitches will they tolerate?
Microsoft's cloud services aim to improve the quality of mixed reality to a serious degree. If the current HoloLens enables centimeter precision, then added cloud computing boosts will give it millimeter precision. Similarly, the quality of 3D renders will improve. Kipman shows us in a few slides how current mixed-reality graphics may be good enough to view, but not good enough to create with. An engine block, he demonstrates, could look far more detailed with added Azure cloud rendering.
Microsoft's also planning on leaning on Azure services to render more of what HoloLens does, to improve graphics on-device from rendering 100,000 polygons for a 3D object to rendering 100 million polygons. The goal, eventually, is to reduce the headset size and push as much to the cloud as possible. Right now, it'll mean that the HoloLens 2 should be able to tap into cloud accounts and documents far more. But it's hard to see how that will all come together.
Right now, the HoloLens 2 is designed to be largely self-sufficient. It's still made to be standalone, able to work offline, like the original HoloLens was. It connects via Wi-Fi, but not with cellular. Part of that, according to Microsoft, has to do with where the HoloLens 2 is aiming to be used. But eventually, when 5G networks arrive to blanket the world in high-speed data, the HoloLens will likely evolve as well, into a far more cloud-reliant and more powerful device.
Read: Microsoft's HoloLens 2: Why it's really all about the cloud (from ZDNet's Mary Jo Foley)
Still missing: No haptics, no controls
Notably absent from the HoloLens 2 is any sort of physical controller. Just like the original HoloLens, the headset is designed to work with just hand and voice controls. While the hand-sensing has been greatly improved, the lack of any tactile controller or force feedback threw me off, and made me feel just a little unnerved. The Magic Leap One has a single one-handed physical controller that, while limited, adds some tangible sense of reality to interacting with things, and offers haptic, vibrating feedback.
Microsoft is thinking about controls and haptics -- but it's not here yet.
"We love haptics," Alex Kipman tells me. "The minute I can throw a hologram to you and you catch it and it pushes you back ... ooh, immersion just took one crank forward. The minute I'm holding a hologram and there's temperature to it -- cold, warm, lukewarm -- it changes the level of immersion and believability of the experience."
While Kipman says that haptics are "absolutely in our dreams," Microsoft isn't yet using any controllers like those that ship with its Windows Mixed Reality VR headsets. "We don't have any dogma that you can't have something in your hands. In fact, in our virtual reality headsets we have some pretty decent things that work with the same sensor set that's in our HoloLens." But Kipman doesn't see those current Windows VR controllers coming into play yet, despite Microsoft's "mixed reality" branding on those accessories.
Maybe with the HoloLens 3? "It's absolutely also in our road map to think about holding things in the hand. By the way, not just holding things that we create. What if I am a person with a real physical hammer? And my hand's occupied, or I'm holding a coffee cup, and I still want to touch my hologram?"
I'm also curious how the new controls will feel over time. Grabbing objects and pushing buttons feels more realistic. But is realism what I want, or shortcuts and comfort? How will that play out on the HoloLens 2?
"That's stuff we're thinking about," says Ledbetter, about testing for fatigue over time. "Are you trying to do things and your hands are full and you're talking to somebody ... what's the best interaction? That gets us into the software world ... but you're in the right spot to be thinking about that."
The future's still weird and wide open
It's clear that Microsoft's leaning on its Azure cloud computing to make the HoloLens 2 do more, and if that means more accurate placement of objects in 3D space, and more detailed graphics and mapping, then great. Kipman also emphasizes that the future of products like the HoloLens is really part of a larger continuum. The computer vision-enabled world-tracking sensors on the HoloLens 2 are kin to the navigational camera sensors on autonomous vehicles and drones, and to the world-scanning cameras in homes, factories and appliances. (In fact, Microsoft's selling a new Azure Kinect camera that incorporates the HoloLens sensors.)
It sounds like a physical world full of edge-computing devices, leaning on a cloud that moves at greater and greater speeds. Maybe 5G will enable all these future HoloLens devices and AR headsets to be more cloud-based than on-headset. The HoloLens 2 doesn't have cellular; it's Wi-Fi and Bluetooth only. The LTE world isn't ready for it yet.
Even Kipman admits that the HoloLens 2 isn't for everyone, or for all situations. While Kipman uses the HoloLens for several hours a day, "there are plenty of times when I'm in my office, and I'm using my keyboard, mouse and my PC monitor to do any number of things."
But when 5G comes, and haptics, what then? Microsoft is clearly playing the long game, just like everyone else in the AR/VR/MR world. The next HoloLens may not be far off after all. Kipman's even hesitant to give any predictions for five years from now: "I'm not going to guess five years, to be honest with you. Let me say for the duration of this product, let's say more in the one to two category ... I think all the successful ones will be enterprise-bound."
And maybe, by then, it'll be the 5G super device I've been expecting all along.
(Originally published at 9:20am PT)