The iPhone 16 range and 'Fusion' photos - how does it work and how WELL does it work?

Even though I maintain to this day (2025) that photo resolutions of over 12MP are 'over the top' for 99% of people's needs, I have to concede that Apple's 2024 implementation of 24MP 'Fusion' images is impressive and does work well. If the extra file size doesn't bother you, then it gives enough of a resolution margin to downscale, crop, or process later, as needed.

(Note that this is all aside from my favoured technique for important shots on the iPhone 'Pro' models, which is to use ProRAW, giving all the HDR and texture intelligence but without sharpening and edge enhancement, at the expense of huge file sizes, over 20MB per image.)

Apple's idea stemmed from looking at the two alternatives from its 48MP camera sensors:

  • a pixel-binned 12MP shot, with lower digital noise and greater 'purity' (see the binning sketch just after this list).
  • a full resolution 48MP shot, with higher native resolution but more noise and more uncertainty at the pixel level.
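To make the pixel-binning idea concrete, here's a minimal sketch in Python/NumPy. This is nothing like Apple's real imaging pipeline (which works on raw Quad-Bayer data with far more sophistication); it just shows the basic trade: averaging each 2x2 block of sensor pixels turns a 48MP readout into a 12MP one with less noise per pixel. The array sizes are illustrative.

```python
# Minimal sketch of 2x2 pixel binning (illustrative only, not Apple's pipeline):
# four neighbouring sensor pixels are averaged into one output pixel,
# trading resolution (48MP -> 12MP) for lower noise.
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of an (H, W) sensor readout into one pixel."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Example: a hypothetical 8000x6000 (48MP) readout becomes 4000x3000 (12MP).
raw_48mp = np.random.rand(6000, 8000).astype(np.float32)
binned_12mp = bin_2x2(raw_48mp)
print(binned_12mp.shape)  # (3000, 4000)
```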

So I guess someone at Apple thought 'What if...?' and created 'Fusion'. Essentially, it takes the fully realised 12MP and 48MP shots, each with its own imaging pipeline in place (such is the computing power in modern phone chipsets), and combines the two algorithmically to produce a 24MP image with more detail than the 12MP shot but more purity (less noise) than the 48MP one.
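Apple hasn't published how Fusion actually works, but the general shape of the idea can be sketched: bring the clean 12MP frame and the detailed 48MP frame to a common 24MP grid and blend them. The sketch below (Python with Pillow/NumPy) uses a single global blend weight purely for illustration; the function name, weight, and resampling choice are all my assumptions, not Apple's method.

```python
# Purely illustrative sketch of the 'Fusion' idea (Apple's real algorithm is
# proprietary and far more sophisticated): upsample the clean 12MP frame and
# downsample the detailed 48MP frame to a common 24MP grid, then blend them.
import numpy as np
from PIL import Image

def fuse_to_24mp(img_12mp: Image.Image, img_48mp: Image.Image,
                 detail_weight: float = 0.5) -> Image.Image:
    """Blend an upsampled 12MP frame with a downsampled 48MP frame."""
    # Target grid: roughly 24MP, i.e. sqrt(2) x the 12MP dimensions.
    target = (int(img_12mp.width * 2 ** 0.5), int(img_12mp.height * 2 ** 0.5))
    clean = np.asarray(img_12mp.resize(target, Image.LANCZOS), dtype=np.float32)
    detailed = np.asarray(img_48mp.resize(target, Image.LANCZOS), dtype=np.float32)
    fused = (1 - detail_weight) * clean + detail_weight * detailed
    return Image.fromarray(np.clip(fused, 0, 255).astype(np.uint8))
```

In reality you'd expect the blend to vary across the frame, favouring the 48MP frame where there's fine texture and the 12MP frame in smooth or noisy areas, but the basic 'two frames in, one 24MP frame out' shape is the point.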

Of course, iOS lets you choose in Settings whether to do this or whether to stick to (e.g.) 12MP photos. And it's also intelligent enough not to even attempt a Fusion image when light is low, when you're already zooming (smart-cropping) into the sensor, when shooting in Portrait mode, or indeed in a dozen other niche modes. This was actually a worry of mine, since I knew that trying to achieve 24MP from a 48MP sensor was bound to blow up a bit in tough conditions. Phew.
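For what it's worth, the fallback behaviour described above can be imagined as a simple gate in front of the capture pipeline. This is a hypothetical sketch only; the conditions come from the paragraph above, but the threshold values and function name are made up for illustration.

```python
# Hypothetical decision sketch (not Apple's code) of when the camera pipeline
# might fall back to plain 12MP instead of attempting a 24MP Fusion capture.
def should_attempt_fusion(lux: float, zoom_factor: float,
                          portrait_mode: bool, night_mode: bool) -> bool:
    """Return True only when conditions suit a 24MP Fusion capture."""
    if night_mode or lux < 50:      # illustrative threshold: too dark, binned 12MP is cleaner
        return False
    if zoom_factor > 1.0:           # already smart-cropping into the sensor
        return False
    if portrait_mode:               # one of the niche modes that skips Fusion
        return False
    return True
```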

Some examples, so that you can see how well Apple's Fusion tech actually works:

Here's the scene, in Henley:

And here are 1:1 crops from, in turn, my old iPhone 14 Pro Max (with a physically larger sensor, but two generations older in terms of sensor and chipset), and then the iPhone 16 (note that the higher resolution means a closer view in terms of pixels):

I'd say that there's definitely more detail here, even if you'd never notice or need it when looking at the overall photo. Still, it shows that the Fusion system works, without introducing too many jaggies or too much extra noise.

Another example, also in Henley, shooting a Tudor house across the road:

And here are 1:1 crops from, in turn, my old iPhone 14 Pro Max, shooting at pixel-binned 12MP, and then the iPhone 16, shooting 'Fusion' 24MP shots:

Again you can see the difference with your own eyes. So we can file 'Fusion' as just another aspect of computational photography that enhances images through clever algorithms and lateral thinking.

The geek in me cries out '24MP is so much better, I want it!' But the pragmatist in me says 'But you'll never actually USE all those extra pixels!' Still, if I wanted to crop out a section of a 24MP image then the option is there. A little niche, though...

Purely as an imaging geek though, I'm impressed.

PS. If you like my work then think about buying me a beer at paypal.me/stevelitchfield - thanks!
