iPhone 11 Deep Fusion camera is now available to try in iOS 13.2 public beta


This photo was taken using Apple’s new Deep Fusion process, which optimizes detail and minimizes image noise.


Apple

Deep Fusion, Apple’s new image-processing technique for the iPhone 11, 11 Pro and 11 Pro Max, is now available as part of both the developer beta and the public beta of iOS 13.2. Deep Fusion only works on iOS devices with an A13 Bionic processor, which currently means just the newest iPhones.

When the iPhone 11 and 11 Pro were first announced in September, Apple showed off the new ultrawide-angle camera, Night Mode and an improved selfie camera, all of which represented a significant step forward for iPhone photography and videos. And now that they’re in the wild, we’ve tested the new iPhone cameras and can confirm their improvements as well as the absolute enjoyment we feel using that ultrawide-angle camera. But there’s one camera feature that Apple teased at its fall iPhone event that no one has gotten to try: Deep Fusion. 

While it sounds like the name of an acid jazz band, Apple claims the brand-new photo processing technique will make your pictures pop with detail while keeping the amount of image noise relatively low. The best way to think of Deep Fusion is that you’re not meant to. Apple wants you to rely on this new technology but not think too much about it. There’s no button to turn it on or off, or really any indication that you’re even using the mode.





Video: We compare the cameras on the iPhone 11 Pro and iPhone… (8:23)

Right now, anytime you take a photo on an iPhone 11, 11 Pro or 11 Pro Max, the default mode is Smart HDR, which takes a series of images before and after your shot and blends them together to improve dynamic range and detail. If the environment is too dark, the camera automatically switches into Night Mode to improve brightness and reduce image noise. With Deep Fusion, anytime you take a photo in medium- to low-light conditions, like indoors, the camera automatically switches into the mode to lower image noise and optimize detail. Unlike Smart HDR, Deep Fusion works at the pixel level. And if you’re using the “telephoto” lens on the iPhone 11 Pro or 11 Pro Max, the camera will drop into Deep Fusion pretty much anytime you’re not in the brightest light.
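Apple doesn’t publish how the camera picks between these modes, but the behavior described above amounts to a brightness check per lens. Here’s a hedged Swift sketch of that decision; the CaptureMode and Lens types, the sceneLux parameter and the cutoff values are all illustrative assumptions, not anything from Apple’s SDK.

```swift
// Illustrative sketch only. Apple exposes no API or thresholds for this
// decision; the types, lux values and cutoffs below are invented to make
// the behavior described above concrete.
enum CaptureMode {
    case smartHDR    // bright scenes
    case deepFusion  // medium-to-low light, such as indoors
    case nightMode   // very dark scenes
}

enum Lens {
    case wide, telephoto
}

func captureMode(for lens: Lens, sceneLux: Double) -> CaptureMode {
    // Hypothetical brightness cutoffs; the real values aren't public.
    let brightCutoff = 1_000.0
    let darkCutoff = 10.0

    switch lens {
    case .telephoto:
        // The telephoto lens drops into Deep Fusion in pretty much
        // anything short of the brightest light.
        return sceneLux >= brightCutoff ? .smartHDR : .deepFusion
    case .wide:
        if sceneLux >= brightCutoff { return .smartHDR }
        if sceneLux <= darkCutoff { return .nightMode }
        return .deepFusion
    }
}
```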

This means the iPhone 11, 11 Pro and 11 Pro Max have optimized modes for bright light, low light and now medium light. And I’d argue that most people’s photos are taken in medium- to low-light situations like indoors. The impact that Deep Fusion will have on your photos is enormous. It’s like Apple changed the recipe of Coke.

At the iPhone event, Apple’s Phil Schiller described Deep Fusion as “computational photography mad science.” And when you hear how it works, you’ll likely agree. 

Essentially, anytime you go to take a photo, the camera captures multiple images. Again, Smart HDR does something similar. The iPhone takes a reference photo that’s meant to stop motion blur as much as possible. Next, it combines three standard exposures and one long exposure into a single “synthetic long” photo. Deep Fusion then breaks down the reference image and the synthetic long photo into multiple regions, identifying skies, walls, textures and fine details (like hair). Next, the software does a pixel-by-pixel analysis of the two photos, 24 million pixels in total. The results of that analysis determine which pixels to use and optimize in building the final image.
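Laid out as code, the sequence Apple describes looks roughly like the Swift sketch below. Every type and function here is hypothetical and exists only to name the stages; the real pipeline runs inside Apple’s image signal processor and neural engine and isn’t exposed to developers.

```swift
// Hypothetical sketch of the Deep Fusion stages described above.
// None of these types or functions exist in Apple's SDK.
struct Frame { /* raw pixel data for one exposure */ }
struct Region { /* a segmented area: sky, wall, texture, fine detail */ }

func deepFusion(candidateFrames: [Frame],
                standardExposures: [Frame],
                longExposure: Frame) -> Frame {
    // 1. Pick a reference photo, chosen to stop motion blur
    //    as much as possible.
    let reference = pickSharpest(from: candidateFrames)

    // 2. Combine three standard exposures and one long exposure
    //    into a single "synthetic long" photo.
    let syntheticLong = merge(standardExposures + [longExposure])

    // 3. Break both photos down into regions: skies, walls,
    //    textures and fine details like hair.
    let regions = segment(reference, syntheticLong)

    // 4. Analyze the two photos pixel by pixel (24 million pixels
    //    across the pair) and decide, region by region, which pixels
    //    to use and optimize in the final image.
    return fusePixelByPixel(reference, syntheticLong, guidedBy: regions)
}

// Stubs standing in for Apple's unpublished algorithms.
func pickSharpest(from frames: [Frame]) -> Frame { frames[0] }
func merge(_ frames: [Frame]) -> Frame { frames[0] }
func segment(_ a: Frame, _ b: Frame) -> [Region] { [] }
func fusePixelByPixel(_ a: Frame, _ b: Frame, guidedBy: [Region]) -> Frame { a }
```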

Here’s another iPhone photo taken with Deep Fusion. Details like hair, beard hair, the sweater, the fabric of the couch and the texture of the wall are all optimized by the Deep Fusion process.


Apple

Apple says the entire process takes a second or so. But to allow you to continue snapping shots, all of the information is captured immediately and then processed when your iPhone’s A13 processor has a chance. The idea is that you won’t be waiting on Deep Fusion before taking the next photo.
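In app-development terms, that’s the familiar pattern of returning from the shutter immediately and handing the expensive work to a background queue. The Swift sketch below illustrates the idea with a plain DispatchQueue; the CapturedBurst and FusedPhoto types and the processDeepFusion and saveToLibrary functions are made-up stand-ins, and this is not how Apple’s camera app is actually implemented.

```swift
import Dispatch

// Hypothetical stand-ins; Apple's real types and fusion code are private.
struct CapturedBurst { /* the frames gathered around the shutter press */ }
struct FusedPhoto { /* the finished Deep Fusion image */ }

func processDeepFusion(_ burst: CapturedBurst) -> FusedPhoto { FusedPhoto() }
func saveToLibrary(_ photo: FusedPhoto) { /* write out the result */ }

// A background queue drains fusion work whenever the processor has spare
// cycles, so pressing the shutter never waits on the second or so of
// Deep Fusion processing.
let fusionQueue = DispatchQueue(label: "deep-fusion.processing", qos: .utility)

func shutterPressed(burst: CapturedBurst) {
    // Capture returns immediately; the heavy work is deferred.
    fusionQueue.async {
        let photo = processDeepFusion(burst)
        saveToLibrary(photo)
    }
}
```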

The release of Deep Fusion comes just a couple of weeks before Google formally announces the Pixel 4, its latest flagship phone in a line renowned for camera prowess.

I should note that Deep Fusion will only be available on the iPhone 11, 11 Pro and 11 Pro Max because it needs the A13 Bionic processor to work. I’m excited to try it out and share the results now that the beta is available.





Video: iPhone 11: 3 phones, reviewed. Which do you choose? (10:05)

Originally published Oct. 1. 

Update, Oct. 2: Gives current availability info.



