
Spider eyes inspire future-tech that could let nanobots see

Harvard researchers are looking to nature for answers on tiny camera tech.

Jumping spiders have evolved an efficient depth perception system, allowing them to accurately pounce on unsuspecting targets from several body lengths away. 
Harvard/Paul Shamble, Tsevi Beatus, Itai Cohen and Ron Hoy

When a jumping spider tackles a fly from a distance, its pounce must be precisely executed. To achieve this, the spiders have multiple layers of retinas in each of their eyes. Because the same scene appears sharper on one layer and blurrier on another, the spider can compare the defocus between layers to instantly judge the exact distance needed for a lethal jump. The setup has also inspired Harvard researchers to develop a sophisticated new lens, or "metalens," for microbots and other tiny tech. 

In a study published earlier this month, a team of researchers designed a metalens depth sensor that can simultaneously produce two images with different blur. But instead of using layered retinas to capture multiple images simultaneously, as jumping spiders do, the metalens splits the light and forms two differently defocused images side by side. That data is then fed to an algorithm that compares the blur in the two images to compute depth across the scene. 
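The core idea, known as depth from defocus, can be illustrated with a toy sketch. This is not the researchers' actual algorithm; it is a simplified, hypothetical example showing how comparing the sharpness of two differently defocused copies of the same edge yields a distance cue:

```python
# Toy depth-from-defocus sketch: two copies of a 1-D step edge are
# blurred by different amounts, simulating two focal planes (like the
# spider's retinal layers or the metalens's split images). Comparing
# their sharpness tells us which focal plane the target is nearer to.

def box_blur(signal, width):
    """Blur a 1-D signal with a box filter of the given odd width."""
    half = width // 2
    n = len(signal)
    return [sum(signal[max(0, i - half):min(n, i + half + 1)]) /
            len(signal[max(0, i - half):min(n, i + half + 1)])
            for i in range(n)]

def sharpness(signal):
    """Peak absolute gradient: higher means the edge is in better focus."""
    return max(abs(b - a) for a, b in zip(signal, signal[1:]))

# A step edge, as captured through two different focus settings.
edge = [0.0] * 20 + [1.0] * 20
image_a = box_blur(edge, 3)   # mild defocus
image_b = box_blur(edge, 9)   # stronger defocus

s_a, s_b = sharpness(image_a), sharpness(image_b)
# The image with higher peak sharpness indicates the target lies closer
# to that focal plane; the ratio of blur gives a graded distance cue.
print(s_a > s_b)  # → True
```

A real sensor would apply this comparison at every pixel to build a full depth map, but the single-edge case captures the principle: blur difference encodes distance.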

"Metalenses are a game changing technology because of their ability to implement existing and new optical functions much more efficiently, faster and with much less bulk and complexity than existing lenses," said the paper's co-author Frederico Capasso in a Harvard release

Currently, depth sensors in phones, cars and video game consoles use multiple cameras to measure distances. Facial identification on smartphones, for instance, uses thousands of laser dots to map your face's shape. But the new metalens development, researchers hope, could allow camera integration with nanotechnology, microbots and smaller wearables. 
