After sitting in the Steve Jobs Theater at Apple's Sept. 12 event and seeing the iPhone X and its new technology for myself, I kept thinking about that front-facing camera array. TrueDepth, it's called. It's a bundle of sensors. It can detect faces and track moving facial muscles. It can see objects in 3D. It can enable more advanced augmented reality.
And, I can see it living on well beyond what we think of as an iPhone.
A larger screen is great. A new design is welcome. But if there's one piece of tech that defines that new iPhone, it's TrueDepth. And it's Apple's first attempt at a more advanced type of magic camera, one in the vein of Microsoft Kinect.
I see this tech being used in a bunch of different ways, from mixed reality headsets to car dashboards.
Mixed reality headsets need cameras like these. I think of HoloLens, a mixed reality headset with a special set of cameras and sensors that track and recognize the environment well enough to place virtual objects realistically. Or other emerging, experimental headsets like Meta's. These devices aim to blend virtual and real objects into everyday space, realistically, the way Magic Leap has promised but still hasn't delivered.
Could more advanced sensors like TrueDepth start to pave the way for higher-end applications? It's hard to know TrueDepth's range (I suspect it's pretty short, hence it being front-facing rather than rear-facing), but it's a start. Or maybe TrueDepth could sit on the inside of future headsets, tracking faces and eye movements to control input.
Biometrics everywhere. Face ID is mostly hands-free. Maybe that frees it to work on door locks, or laptop screens, or car dashboards, or maybe even the Apple Watch someday. A face scanner seems more flexible to install in more places than a fingerprint reader, because reaching your finger up to a sensor is awkward. Maybe we're heading for that Minority Report future of facial scanning everywhere.
Robots, drones, vehicles and location-aware devices. TrueDepth is meant for photos, Face ID and fun AR tricks right now. But what if it was employed to help robots navigate, or enable a vehicle to see a parking garage better? Cameras and SLAM (simultaneous localization and mapping) technologies are what allow robots like Kuri to find their way around, or help an autonomous vehicle drive. TrueDepth is probably short-range, but what if future sensors could scan longer distances and help with obstacle avoidance?
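To make the obstacle-avoidance idea concrete, here's a toy sketch (not Apple's API, and nothing like a full SLAM system): a depth sensor such as TrueDepth produces a grid of distances, and even a naive scan of that grid can tell a robot whether something is too close. The function name and the depth values are hypothetical, purely for illustration.

```python
def nearest_obstacle(depth_map, threshold_m):
    """Return (row, col, distance) of the closest point nearer than
    threshold_m, or None if nothing is within that range."""
    closest = None
    for r, row in enumerate(depth_map):
        for c, dist in enumerate(row):
            if dist < threshold_m and (closest is None or dist < closest[2]):
                closest = (r, c, dist)
    return closest

# A made-up 3x3 depth map in meters; the 0.4 m reading is a nearby obstacle.
depth = [
    [2.0, 2.1, 1.9],
    [1.8, 0.4, 2.2],
    [2.5, 2.4, 2.3],
]
print(nearest_obstacle(depth, 1.0))  # -> (1, 1, 0.4)
```

Real robots fuse many such frames over time to build a map of the space, which is the "mapping" half of SLAM; this only shows how depth data answers the immediate "is something in my way?" question.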
A foot in the door for future computer vision. Cameras are the doorway to a next wave of AI that can analyze and "see" the world, then put what it sees to use across all sorts of applications. Perhaps the next wave of Siri will gain eyes, and finally enable the sorts of tricks that Bixby Vision and Google Lens promised. Maybe next year?
As it is now, TrueDepth on the iPhone X may mostly be about face unlocking. But as Apple said at its event, neural engines can adapt. Maybe, down the road, this will be about a whole lot more.