MIT researchers use drones as photography lights

Researchers at MIT have come up with a novel use for the humble drone, turning it into a flying photography light.

Lexy Savvides, Principal Video Producer

(Image: MIT News Office)
Rim lighting is a technique used to illuminate the side of a subject, with the light source usually positioned behind or next to the subject.

Also known as backlighting, the technique requires the photographer to position the lights and the subject precisely in order to achieve the desired effect.

If the photographer or subject changes position, the lights also need to move. Getting the look just right can be a time-consuming process, so researchers at MIT and Cornell University have come up with a clever solution involving drones.

Equipped with a flash unit, the drone can change positions automatically thanks to a camera-mounted interface.

The drone is a modified Parrot AR.Drone fitted with a wireless flash unit, a halogen light as a continuous light source, and a laser rangefinder to measure distance.

The computational process behind getting the drone to move around is more complex than you might think. Rather than simply flying circles around the subject, the prototype system has the drone respond to the subject's movement.

The photographer first indicates roughly how the rim light should fall on the subject, and the drone moves into position. The photographer then specifies how wide the rim should be, as a percentage of its initial value. As the subject changes position, the drone moves as well to maintain that rim width.

On top of the subject's position, the system also keeps an eye on where the photographer is situated. Every second, the camera produces around 20 images that are transmitted to a computer running the control algorithm.

For each incoming image, the algorithm measures the rim width and then repositions the drone as required.
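
To make the idea concrete, here is a rough sketch in Python of how such a measure-and-correct loop could work. It is not the researchers' code: the measurement function is a toy stand-in for the real image analysis, and the gain, target and position values are invented for illustration, but the structure mirrors the description above: measure the rim width in each frame, compare it to the target fraction, and nudge the drone to close the gap.

```python
# A minimal, illustrative sketch of a proportional feedback loop like the one
# described above. This is NOT the MIT/Cornell code: measure_rim_fraction is a
# toy stand-in for the real image analysis, and every name and number here is
# assumed for illustration.

GAIN = 0.5     # how strongly each measured error nudges the drone (assumed)
TARGET = 0.8   # hold the rim at 80% of its initial width (assumed)


def measure_rim_fraction(drone_pos, subject_pos):
    """Toy measurement: in this model the rim gets wider the further the
    drone sits behind the subject, reaching full width at one unit of offset."""
    return max(0.0, drone_pos - subject_pos)


def control_step(drone_pos, subject_pos):
    """One loop iteration: measure the rim, compute the error against the
    target fraction, and move the drone in proportion to that error."""
    error = TARGET - measure_rim_fraction(drone_pos, subject_pos)
    return drone_pos + GAIN * error


# Simulate about 5 seconds at roughly 20 control updates per second, with the
# subject slowly drifting forward while the drone keeps the rim on target.
drone_pos, subject_pos = 1.0, 0.0
for _ in range(100):
    subject_pos += 0.01
    drone_pos = control_step(drone_pos, subject_pos)

print(f"rim held at {measure_rim_fraction(drone_pos, subject_pos):.2f} of initial width")
```

In the real system, of course, the measurement comes from analyzing the camera's images and the correction is sent to the drone's flight controller rather than to a simulated position.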

"Rim lighting is a particularly interesting effect, because you want to precisely position the lighting to bring out silhouettes," said Ravi Ramamoorthi, professor of computer science and engineering at the the University of California, San Diego.

Apart from this specific example, the technology has potential to be applied to other lighting techniques and situations.

"Other effects are in some sense easier -- one doesn't need as precise positioning for frontal lighting. So the technique would probably generalize to other light effects. But at the same time, as-precise control and manipulation may not be needed. Manual static positioning might be adequate."

The prototype system will be presented at the International Symposium on Computational Aesthetics in Graphics, Visualization and Imaging in August.

(Via PetaPixel)