Apple launches its new advanced photography system, Deep Fusion
We want to give you a sneak peek at a new camera feature that will arrive with the software update this fall, but it's so cool we have to tell you about it now.
It uses the Neural Engine of the A13 Bionic to create a brand-new kind of image processing system.
We call it Deep Fusion.
And it is so cool.
Let me tell you what it's doing while we look at an image.
This photo was shot on an iPhone 11 Pro using Deep Fusion, and this kind of image would not have been possible before.
We used machine learning to take this photo in low to medium light, and it's unlike anything possible with an iPhone camera before.
So what is it doing?
How do we get an image like this?
All right, you ready for this?
What it does is shoot nine images.
Before you even press the shutter button, it has already shot four short images and four secondary images.
When you press the shutter button, it takes one long exposure.
Then, in just one second, the Neural Engine analyzes the fused combination of the long and short images, picking the best among them.
It goes through the image pixel by pixel, all 24 million pixels, to optimize for detail and low noise, like you see in the sweater there.
It is amazing.
This is the first time a neural engine is responsible for generating the output image.
It is computational photography Mad Science.
It is way cool.
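Apple has not published how Deep Fusion actually combines its frames, but the idea described above (several sharp-but-noisy short exposures fused pixel by pixel with one clean-but-blur-prone long exposure) can be sketched in a toy form. Everything below is an assumption for illustration: the weights, the motion threshold, and the per-pixel selection rule are invented, not Apple's algorithm.

```python
# Illustrative sketch only: Deep Fusion's real algorithm is unpublished.
# Frames are flat lists of grayscale intensities (0-255) for simplicity.

def fuse_pixels(short_frames, long_frame, motion_threshold=30):
    """Fuse several short exposures with one long exposure, pixel by pixel.

    short_frames: list of equal-length frames captured before the shutter
                  press (sharp, but each one noisy).
    long_frame:   one long exposure (low noise, possible motion blur).
    """
    fused = []
    n = len(short_frames)
    for i, long_px in enumerate(long_frame):
        # Averaging the short exposures suppresses per-frame sensor noise.
        short_avg = sum(frame[i] for frame in short_frames) / n
        if abs(long_px - short_avg) <= motion_threshold:
            # Long and short frames agree: the scene was static here, so
            # lean on the cleaner long exposure (weights are assumptions).
            fused.append(round(0.7 * long_px + 0.3 * short_avg))
        else:
            # Disagreement suggests motion blur in the long exposure:
            # fall back to the sharp short-frame average.
            fused.append(round(short_avg))
    return fused

# Tiny usage example: pixel 0 is static (values agree), pixel 1 moved.
result = fuse_pixels([[100, 200], [104, 196], [96, 204]], [98, 120])
```

A real implementation would work on full 12-megapixel frames per color channel and use learned, spatially varying weights rather than a single threshold; the sketch only shows the per-pixel select-and-blend structure the transcript describes.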