Not that obvious.
With the introduction of iOS 13.2, Apple has enabled Deep Fusion for the cameras in the iPhone 11 and 11 Pro. Apple has described elsewhere how the feature works. Note that Deep Fusion does not work with the ultrawide lens; it's limited to the wide and telephoto lenses. You must also turn off "Photo capture outside the frame" in Camera settings. Finally, Deep Fusion does not work in burst mode and requires a current top-of-the-line iPhone, meaning the 11 or 11 Pro.
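For the curious: third-party apps don't get a Deep Fusion switch either. As best I can tell from Apple's AVFoundation documentation, an app opts into the highest-quality processing path and the system then decides whether Deep Fusion actually runs. A minimal sketch, assuming iOS 13 and supported hardware:

```swift
import AVFoundation

// Usually done once, when the capture session is configured: allow the
// most aggressive processing the device supports (iOS 13+).
func configure(_ output: AVCapturePhotoOutput) {
    output.maxPhotoQualityPrioritization = .quality
}

func makePhotoSettings() -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    // Prefer quality over shutter lag; on an iPhone 11/11 Pro this is the
    // path on which the system can engage Deep Fusion. There is no API
    // that reports whether it actually did.
    settings.photoQualityPrioritization = .quality
    return settings
}
```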
By sampling and combining the best parts of multiple exposures, Deep Fusion promises to further improve the already stellar results from the iPhone 11's camera. It's not easy to test, however, as it only kicks in with moderately lit subjects and there's no indication that it's working.
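To make "combining the best parts of multiple images" concrete, here's a toy sketch of the general idea of multi-frame fusion. This is purely illustrative and nothing like Apple's actual pipeline, which is neural and proprietary: each pixel is averaged across frames, weighted by local sharpness, so the frame that resolved the most detail at that spot contributes the most.

```swift
// A toy illustration of multi-frame fusion — not Apple's algorithm.
struct Frame {
    let width: Int
    let height: Int
    let pixels: [Float] // grayscale samples, row-major
}

// Crude sharpness proxy: absolute difference from the 4-neighbour mean.
func sharpness(_ f: Frame, _ x: Int, _ y: Int) -> Float {
    let i = y * f.width + x
    let neighbourMean = (f.pixels[i - 1] + f.pixels[i + 1]
                       + f.pixels[i - f.width] + f.pixels[i + f.width]) / 4
    return abs(f.pixels[i] - neighbourMean)
}

// Weight each frame's pixel by its local sharpness: detailed regions
// dominate, while flat (noisy) regions simply get averaged down.
func fuse(_ frames: [Frame]) -> Frame {
    let w = frames[0].width, h = frames[0].height
    var out = [Float](repeating: 0, count: w * h) // borders left at 0
    for y in 1 ..< h - 1 {
        for x in 1 ..< w - 1 {
            var weighted: Float = 0
            var totalWeight: Float = 0
            for f in frames {
                let weight = sharpness(f, x, y) + 1e-4 // avoid divide-by-zero
                weighted += weight * f.pixels[y * w + x]
                totalWeight += weight
            }
            out[y * w + x] = weighted / totalWeight
        }
    }
    return Frame(width: w, height: h, pixels: out)
}
```

Note the two effects this toy version already produces: noise averages out where nothing is sharp, and detail is preserved where one frame caught it. Those are exactly the differences to look for in the comparisons below.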
To compare results, I took one indoor image using iOS 13.1, installed the update, and took a second image of the same scene with iOS 13.2 and Deep Fusion active.
Here’s the original:
Exposure and ISO were identical in both images, and no processing was applied in Lightroom.
Here is the center of the image magnified to a print size of 60″ × 80″:
Regular (left) and Deep Fusion images.
There is just a little more detail in the butterfly's wings in the Deep Fusion version, along with less aggressive sharpening by the iPhone, less smearing, and slightly lower contrast.
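For context on just how large this enlargement is, here's the back-of-the-envelope arithmetic, assuming the iPhone 11's 12 MP output of 4032 × 3024 pixels (my assumption; the file dimensions aren't quoted above):

```swift
// Rough arithmetic behind the 60″ × 80″ enlargement, assuming a
// 4032 × 3024 pixel image straight from the iPhone 11.
let imageWidth = 4032.0, imageHeight = 3024.0   // pixels
let printWidth = 80.0, printHeight = 60.0       // inches
let ppi = imageWidth / printWidth               // 50.4; imageHeight / printHeight agrees
print("Effective print resolution ≈ \(ppi) ppi")
```

At roughly 50 ppi, against the ~300 ppi commonly used for fine prints, these crops really are enormous enlargements.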
Now let’s take a look at the shadows nearer the edge:
Regular (left) and Deep Fusion images.
Again there is a very small gain in definition, but a significant reduction in grain and less smearing of detail. Contrast in the Deep Fusion version is once more lower.
So does Deep Fusion improve things? Yes. Is that improvement really significant? No.
But the above comparisons (and these are enormous enlargements) confirm that the days of gargantuan sensors are numbered. A pinhead-sized sensor combining multiple images shows barely any grain and more definition than any photographer looking at a 30″ display will ever need. And unlike the sensor in your DSLR, it is sealed against dust and water. As for web publication, it bears repeating: no one needs more than 3 MP.