The AR/VR headset would come with multiple sensors.
Corinne Reichert, Senior Writer
The application was published July 18 after being filed in March 2019.
According to the filing, the headset could come with:

- left and right displays
- light sensors to collect light information such as color, intensity and direction
- head pose sensors to track the user's orientation and motion
- world mapping sensors to track movement and location
- eye tracking sensors
- lower jaw sensors to track expression
- hand sensors to track position, movement and gestures
- eyebrow sensors to track facial expressions
- an inertial measurement unit to augment the sensor information
- left and right cameras
The controller would also have one or more processors, Apple said.
"The controller is configured to render an avatar of the user's face for display in the 3D virtual view based at least in part on information collected by the one or more eye tracking sensors, the one or more eyebrow sensors, and the one or more lower jaw sensors," the filing says.
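To make the described flow concrete, here is a minimal sketch of how facial sensor readings might feed a controller that produces an avatar frame. All names, fields and values below are illustrative assumptions for explanation only; they do not appear in Apple's filing.

```python
from dataclasses import dataclass

# Hypothetical data flow per the patent's description: readings from the
# eye tracking, eyebrow and lower jaw sensors are combined by a controller
# to render the user's avatar in the 3D virtual view.
# Every name and field here is an assumption, not from the filing.

@dataclass
class FaceSensorData:
    eye_gaze: tuple          # from the eye tracking sensors
    eyebrow_position: float  # from the eyebrow sensors
    jaw_position: float      # from the lower jaw sensors

class Controller:
    def render_avatar(self, data: FaceSensorData) -> dict:
        # Fold the facial readings into a simple expression state
        # that a renderer could use to draw the avatar.
        return {
            "gaze": data.eye_gaze,
            "brow": data.eyebrow_position,
            "jaw": data.jaw_position,
        }

frame = Controller().render_avatar(FaceSensorData((0.1, -0.2), 0.5, 0.3))
```

The point of the sketch is only the shape of the pipeline the filing describes: several facial sensors feeding one controller, which produces the rendered avatar state.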
Apple said the VR system could use LCD, digital light processing or liquid crystal on silicon technology -- or even "a direct retinal projector system that scans left and right images, pixel by pixel, to the subject's eyes."