Shutterstock CEO Jon Oringer shares his thoughts on the dual-lens camera technology in Apple’s highly anticipated iPhone release.
Most of the chatter around Apple’s September 7th iPhone event has been about the removal of the headphone jack, but what deserves real attention is the new dual-lens camera system. If this sounds like another incremental improvement that won’t change much, think again: It will change photography forever.
Until now, the main difference between a camera with a proper lens (a digital SLR, rangefinder, or compact camera) and the tiny flat ones found in any smartphone has been a feature called depth of field (DOF). On a conventional camera, the DOF effect is created with the aperture. The aperture controls the amount of light that reaches the sensor, but a secondary effect of opening and closing it is how deep the field of focus will be. Open the aperture to let in more light and the focus field shortens; close it to let in less light and the focus field lengthens. A tight DOF makes for a more dramatic image.
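To make that aperture-to-focus relationship concrete, here is a rough sketch using the standard thin-lens depth-of-field formulas. The lens, f-stops, subject distance, and circle-of-confusion value below are hypothetical example numbers, not anything specific to the iPhone:

```python
def depth_of_field(f_mm, n_stop, s_mm, c_mm=0.03):
    """Return (near, far, total) limits of acceptable focus, in mm.

    f_mm: focal length, n_stop: f-number, s_mm: subject distance,
    c_mm: circle of confusion (0.03 mm is a common full-frame value).
    """
    # Hyperfocal distance: H = f^2 / (N * c) + f
    h = f_mm ** 2 / (n_stop * c_mm) + f_mm
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    if s_mm >= h:
        # Subject at or beyond the hyperfocal distance: everything
        # from `near` to infinity is acceptably sharp.
        return near, float("inf"), float("inf")
    far = s_mm * (h - f_mm) / (h - s_mm)
    return near, far, far - near

# Same hypothetical 50 mm lens, subject 2 m away: wide open vs. stopped down.
wide = depth_of_field(50, 1.8, 2000)   # f/1.8: shallow focus field
narrow = depth_of_field(50, 16, 2000)  # f/16: deep focus field
print(f"f/1.8 DOF: {wide[2]:.0f} mm, f/16 DOF: {narrow[2]:.0f} mm")
```

With these assumed numbers, the wide-open shot keeps only a band a few centimeters deep in focus, which is exactly what isolates a subject against a blurred background, while the stopped-down shot keeps well over a meter sharp.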
Below is an example of an image shot with a wide aperture (a low f-number). The subject of the photo is a microphone, and the background drums are out of focus. Our brains are programmed to look to the focused object to understand what the subject of a photo is. Generally, a glass lens on a camera body with some distance from the sensor can create this DOF: you focus on a specific point, and the physics of the glass blurs objects in front of and behind that point. A photo like this one is impossible to take with a camera phone's single flat lens:
A flat lens right in front of a sensor (like a typical camera phone lens) doesn't optically produce DOF: the focus is effectively set to infinity, so there is no distance information to work with. (The technical details of photography are beyond the scope of this post; if you're interested in learning more, watch this video.) And because today's camera phones can't measure distance, they can't digitally re-create the DOF drama that a conventional lens produces on its own, either. This next photo is more like one taken with a camera phone: most of the image is in focus, and there is little depth or drama to it.
There is no correct way to shoot a photo. Sometimes you don’t need DOF. But the lack of DOF has been a huge issue with camera phones — and it’s the reason I still carry around my Leica Rangefinder from time to time. It’s also the reason why professional photographers often need some sort of lens to do their work. A camera phone can’t produce a photo like this, which I shot with my Leica:
Instead, your shots will usually look flat, like this one taken with my iPhone:
Until today, that is.
Just as our two eyes work together to detect depth, so do two lenses. By using the disparity of pixels between the two lenses, the camera's processor can figure out how far away each part of the image is. (If you're interested, this paper goes into specific detail about how the method works.)
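The geometry behind this is simple triangulation: two lenses a fixed baseline apart see a nearby object shifted by more pixels than a distant one. Here is a minimal sketch of that relationship; the focal length, baseline, and disparities are made-up illustrative numbers, not the new iPhone's actual specs:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Triangulate distance (mm) from the pixel shift between two lenses.

    Z = f * B / d: the larger the disparity, the closer the object.
    """
    if disparity_px <= 0:
        # No measurable shift between the two views: treat as very far away.
        return float("inf")
    return focal_px * baseline_mm / disparity_px

# Hypothetical phone: ~2800 px focal length, lenses 10 mm apart.
near_obj = depth_from_disparity(2800, 10, 56)  # big shift  -> 500 mm away
far_obj = depth_from_disparity(2800, 10, 7)    # small shift -> 4000 mm away
```

Once the processor has a depth estimate for every pixel, software can blur everything outside a chosen depth band, mimicking the optical DOF a big lens produces for free.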
The magic is in how software takes the information from the two lenses and processes it into an image. Between the extra data collected by this new hardware and the advancement of machine vision technology, the new iPhone camera is going to be incredible. Depth of field is one of the last features needed to complete the full migration from handheld camera to camera phone. Soon, both amateur and professional photographers will only need to carry their mobile devices.
This isn’t the first time a camera manufacturer has put a dual-lens system into a camera phone, but with the support of Apple’s software application ecosystem, photography will be forever changed on September 7th, 2016.