Apple's Deep Fusion camera technology is now available in iOS 13.2. If you have an iPhone 11 or iPhone 11 Pro, you can use this new image-processing technology to take better photos. Here's how it works.
What is Deep Fusion?
Smartphones are not yet complete replacements for professional cameras, but Apple is making the iPhone better every year.
This feature is available on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max. These phones shipped with Apple's iOS 13 and received several significant improvements to the camera setup, including improved sensors, an ultra-wide lens, Night mode, and slow-motion selfies. However, one improvement that didn't come out of the box with the newest flagships is the Deep Fusion camera, released with the iOS 13.2 update on October 28, 2019.
Apple's Phil Schiller called it "computational photography mad science." While many smartphones are taking big steps toward improving image quality in very dark environments with Night mode and very bright environments with HDR, most of the photos we take fall somewhere in between. The Deep Fusion camera is meant to reduce noise and greatly improve detail in photos taken in medium- to low-light conditions, mainly for indoor shooting.
To demonstrate, Apple used several samples of people wearing sweaters, an item of clothing that often loses detail in photographs. Sweaters and other items in images taken with the Deep Fusion camera are more detailed and retain their natural texture.
How does it work?
According to Apple, the new mode uses the iPhone 11's new A13 Bionic chip to do "pixel-by-pixel processing of photos, optimizing for texture, details, and noise in every part of the photo." Basically, it works similarly to the iPhone's Smart HDR camera, which takes multiple shots at different exposures and combines them to maximize the clarity of the finished image. The difference lies in the amount of information that has to be processed.
What Deep Fusion does in the background is pretty tricky. When the user presses the shutter button in medium light, the camera takes nine shots at once: four short-exposure images, four standard images, and one long-exposure photo. It combines the long exposure with the best of the short shots. The processor then goes pixel by pixel and selects the best elements from each to create the most detailed photo possible. All of this happens in about one second.
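Apple hasn't published the actual algorithm, but the pixel-by-pixel selection step described above can be sketched roughly in code. Everything in this illustration is an assumption: the local-variance "detail" score and the simple 50/50 blend with the long exposure are stand-ins for whatever machine-learned measures Apple really uses.

```python
import numpy as np

def local_detail(frame, k=3):
    """Score each pixel by the variance in its k-by-k neighborhood --
    a rough, assumed stand-in for Apple's undocumented detail measure."""
    pad = k // 2
    padded = np.pad(frame, pad, mode="edge")
    # Stack the k*k shifted views of the frame and take the variance
    # across the window at every pixel position.
    windows = np.stack([
        padded[i:i + frame.shape[0], j:j + frame.shape[1]]
        for i in range(k) for j in range(k)
    ])
    return windows.var(axis=0)

def fuse(short_frames, long_frame):
    """For each pixel, pick the short frame with the most local detail,
    then blend it with the long exposure to keep the latter's low noise."""
    scores = np.stack([local_detail(f) for f in short_frames])
    best = np.argmax(scores, axis=0)          # index of sharpest frame per pixel
    stacked = np.stack(short_frames)
    rows, cols = np.indices(long_frame.shape)
    detailed = stacked[best, rows, cols]      # per-pixel selection
    return (detailed + long_frame) / 2.0      # naive blend with long exposure
```

On a real device this runs on dedicated hardware over full-resolution sensor data, which is why the A13's neural engine matters; this toy version just shows the per-pixel "pick the best source frame" idea.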
When you take a photo, your iPhone immediately starts post-processing the image in the background. That way, by the time you open your camera roll to take a look, the effect has already been applied. This is made possible by the A13 Bionic chip, which Apple bills as the most powerful processor ever used in a smartphone.
How to get Deep Fusion on your iPhone
You need an iPhone 11, iPhone 11 Pro, or iPhone 11 Pro Max to use Deep Fusion. It will probably also be compatible with future iPhones, but it is not compatible with previous iPhone hardware.
Your iPhone also needs iOS 13.2. If it's running an older version of iOS, Deep Fusion will not be available. To update your phone, go to Settings > General > Software Update, and make sure you're connected to Wi-Fi.
After updating your phone, go to Settings > Camera and turn off "Photos Capture Outside the Frame." While this is a handy feature, it is not compatible with Deep Fusion.
How to use the Deep Fusion Camera
One of Apple's other features introduced this year is Night mode, which combines multiple shots to produce a brighter image. It's available via a toggle in the Camera app, or it activates automatically in very dark lighting conditions.
Unlike Night mode, you cannot activate Deep Fusion manually; there's no indicator that it's even on. Apple's software automatically detects when an image is best suited for Deep Fusion and captures the shot in a way that is invisible to the end user. That's because Apple wants you to take better photos in normal lighting conditions without having to worry about which mode to choose.
However, there are a few conditions under which you cannot use the Deep Fusion camera. For now, it is only compatible with the wide and telephoto lenses. Photos taken with the ultra-wide camera will default to Smart HDR if lighting conditions are sufficient.
Also, since each image takes a second to process, it is not compatible with continuous shooting.
How do you know it's working?
In many cases, you may not even notice Deep Fusion at work. On paper, this technology is a huge step forward in mobile photography. In practice, the differences can be hard for most people to spot, especially if you're not comparing two images side by side.
Deep Fusion is especially noticeable in objects with a lot of textures. Things like hair, detailed fabrics, textured walls, fur, and some food items will be more detailed, especially if you enlarge them. With the holidays approaching, expect to see very detailed images of people donning sweaters in your feed.
Will Deep Fusion come to my phone?
Deep Fusion is only compatible with the iPhone 11, 11 Pro, and 11 Pro Max. Older Apple devices, including the iPhone X and XS, lack the A13 Bionic chip, which powers most of the processing features of the latest models' cameras, so it is unlikely to be added in a future software update.
On the other hand, if you don't use an iPhone, Deep Fusion definitely won't be coming to your phone. However, other smartphone makers, such as Google with its Pixel lineup, will likely see this as a challenge and develop their own processing tools to compete with Apple's new mode.