Everything You Need to Know About the Camera in the iPhone 7 and iPhone 7 Plus
The leaks and rumors leading up to today’s Apple event suggested that the iPhone 7 would have a better flash and come with a dual-lens rear camera capable of DSLR-quality images. This is big news for Apple, and it may just become the stand-out feature of the new handset.
For those in the Android world, this isn’t big news: both the LG G5 and Huawei P9 already have dual-lens rear cameras. But this is Apple, people! Surely they’re going to make it better, right?
Well, let’s see. Let’s go through what was announced today.
Phil Schiller took the stage to announce what’s new in the iPhone 7. Since I am mainly interested in the camera, I will jump to the part where he talks about what’s new with the camera.
The camera is perhaps one of the most beloved features of the iPhone, and the new one promises a huge advance in smartphone photography. Everything about the camera in the iPhone 7 series is entirely new.
There is optical image stabilization in both the iPhone 7 and the 7 Plus, which helps steady the camera against shaky hands.
There’s a wider f/1.8 aperture lens that lets 50% more light onto the sensor.
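To put that figure in perspective: the light a lens gathers scales with the inverse square of its f-number. Assuming the previous iPhone lens was f/2.2 (the iPhone 6s spec, not stated in the keynote), a quick check confirms the roughly 50% claim:

```python
# Light gathered scales with (1 / f-number) squared.
# Assumption: the previous iPhone lens was f/2.2.
old_f, new_f = 2.2, 1.8
gain = (old_f / new_f) ** 2
print(f"about {(gain - 1) * 100:.0f}% more light")  # prints "about 49% more light"
```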
There’s a new six-element lens, so you get a sharp image edge to edge, even at that f/1.8 aperture.
There’s an all-new 12MP sensor. It’s 60% faster and 30% more energy efficient.
Even the flash is all new. The True Tone flash now has four LEDs; it puts out 50% more light and reaches 50% further. The engineering team also came up with a new feature called the flicker sensor.
The flicker sensor reads the flicker of artificial lighting and compensates for it in the photos and videos you take. This is big news for image quality.
But behind it all is the brain of the camera: the Image Signal Processor (ISP). It enables so much of the unique quality we get in the pictures we take with the iPhone, and the new ISP chip in the iPhone 7 has twice the throughput of previous versions. So what does the ISP do?
Here’s what the ISP does every time we take a picture. It reads the scene and uses machine learning to detect objects, people, and bodies. Then it automatically sets the exposure, sets focus using focus pixels, sets the color with white balance, and, for the first time, captures wide color — cinema-standard wide color. It balances everything off with tone mapping and noise reduction, and can even take multiple photos and fuse them together into one photo to get you the perfect image. All of this happens every time we take a picture. The ISP is so smart that it performs 100 billion operations per shot, and it does all this in just 25 milliseconds. It’s like a supercomputer for photos.
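Most of those stages are proprietary, but the multi-frame fusion step is easy to illustrate. Here’s a toy NumPy sketch (my own, not Apple’s pipeline): averaging several aligned captures of the same scene cancels zero-mean sensor noise, which is the basic idea behind fusing multiple photos into one.

```python
import numpy as np

def fuse_frames(frames):
    """Naive multi-frame fusion: average aligned captures to suppress noise."""
    return np.mean(np.stack(frames, axis=0), axis=0)

# Toy example: four noisy captures of the same flat 2x2 scene.
clean = np.full((2, 2), 128.0)
offsets = [8.0, -8.0, 4.0, -4.0]          # stand-in for per-shot sensor noise
frames = [clean + d for d in offsets]
fused = fuse_frames(frames)
print(fused)  # the zero-mean noise cancels, recovering the 128.0 values
```

A real pipeline also has to align the frames and reject motion first; this sketch shows only why fusing helps.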
Live Photos got even better with the iPhone 7. Apple now applies video image stabilization when you take a Live Photo, and you can edit them after you’ve shot them, either cropping them or applying filters. Developers can also capture and edit Live Photos in iOS 10. On top of this, developers can now capture RAW files directly from the camera sensor and do even more complex editing. And, for the first time, they can also capture wide color in the photos they take with the iPhone 7.
The front camera also got an update: a new 7MP FaceTime HD camera. This is great for the selfies we take and the FaceTime calls we make, and a nice step up from the 5MP camera in previous models. The sensor includes some of the technologies used in the rear camera, such as deep trench isolation. (As a reminder, this helps keep images sharp as pixels get packed closer together.) It also captures wide color images and does automatic image stabilization.
What’s the Plus in the iPhone 7 Plus?
The Plus stands for the second camera: there are now two 12MP cameras built into the iPhone 7 Plus.
1. One camera is a wide angle with a 28mm lens (the same as the iPhone 7’s)
2. The other camera is a telephoto with a 56mm lens
Why have two completely different cameras? With two cameras and two different lenses, Apple created a zoom feature built right into the iPhone.
How does it work? Pretty much the same as before: go into the Camera app to take a picture. Except now there’s a new button labeled 1X, just above the shutter. You can still take a picture exactly as you always have, with the wide-angle lens, but the button lets you do one of two things. You can tap it to jump to 2X, at which point you’re shooting with the telephoto lens — the same high-quality 12MP picture through a beautiful optical lens, no software needed. Or you can put your finger on it and drag across it to zoom anywhere from 1X to 5X.
As you go beyond 2X you’re doing software zoom, but since it now starts from the telephoto lens, the image quality is 4X better than software zoom was before. It’s so good that Apple decided to push it all the way to 10X. So with the iPhone 7 Plus, you can go from 1X to 2X optical zoom, and all the way to 10X with great software zoom.
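The lens-switching logic described above can be sketched in a few lines. This is a simplified illustration — the function name and structure are mine, not Apple’s, and the real camera also weighs factors such as lighting when choosing a lens:

```python
def select_capture(zoom):
    """Illustrative sketch of the iPhone 7 Plus zoom logic (not Apple's API):
    use the wide lens below 2x, the telephoto at 2x and beyond, and apply a
    digital crop for any factor past the chosen lens's native field of view."""
    if not 1.0 <= zoom <= 10.0:
        raise ValueError("zoom must be between 1x and 10x")
    lens = "wide" if zoom < 2.0 else "telephoto"
    native = 1.0 if lens == "wide" else 2.0
    digital_crop = zoom / native   # 1.0 means purely optical
    return lens, digital_crop

print(select_capture(2.0))   # ('telephoto', 1.0): purely optical 2x
print(select_capture(10.0))  # ('telephoto', 5.0): a 5x digital crop on top
```

Starting the crop from the telephoto’s narrower field is exactly why 10X looks better than the old purely digital zoom.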
There’s one other use for this camera, a challenge Apple set for its engineering team, and it’s a big breakthrough in photography. Phil pulled up a picture taken on a very high-end camera to illustrate what they’re trying to achieve: the person in the foreground was sharply focused while the background had a beautiful blur. He was showing shallow depth of field, a technique that’s really useful for things like portraiture. It’s the hallmark of a great camera, typically one with a very big sensor (like a full-frame sensor) or a fast lens. The quality of the background blur is called bokeh; the better the bokeh, the more advanced the lens and camera system. The end result is almost 3D. Apple’s goal is to do something like this using the iPhone 7 Plus’s fast lens.
Here’s what Apple did. When we take a picture, the camera uses the ISP to scan the scene, recognize people and faces, and then build a depth map of the image from the two cameras and software. It keeps the people in the foreground sharp and in focus and applies a beautiful blur to the background. This is a huge breakthrough in what a smartphone can do with photography.
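Once you have a depth map, the core idea is simple: mask the near pixels and blur the rest. Here’s a toy NumPy sketch of that idea (my own illustration, not Apple’s algorithm — a crude box blur stands in for a proper bokeh kernel):

```python
import numpy as np

def box_blur(image, k=3):
    """Crude box blur: average each pixel over its k-by-k neighborhood."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    out = np.zeros((h, w), dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def portrait_blur(image, depth, near=1.0):
    """Toy depth effect: pixels at depth <= `near` stay sharp,
    everything farther away gets the blurred copy."""
    return np.where(depth <= near, image, box_blur(image))

# Left half of the scene is "near" (the subject), right half is "far".
image = np.arange(16.0).reshape(4, 4)
depth = np.zeros((4, 4))
depth[:, 2:] = 2.0
result = portrait_blur(image, depth, near=1.0)
```

A real portrait mode blends by depth gradually and shapes the blur to mimic lens bokeh, but the mask-and-blur structure is the same.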
So how do you do it? It couldn’t be any easier. Go to the Camera app, and where you normally select the style of the picture (e.g., panorama, square), there’s a new style called Portrait. When you select Portrait, the app jumps to the telephoto lens and automatically lets you see the depth effect. What’s really cool is that it’s generated in real time, right on your screen! Even high-end DSLRs can’t do that.
This is by far the best camera Apple has ever put in an iPhone. Should you leave your DSLR at home? No! But it is the best camera ever made for a smartphone, and for some people it will be the best camera they’ve ever owned, letting them create beautiful pictures with creative tools.
So, that’s the roundup of what’s new in the iPhone 7 and iPhone 7 Plus. With everything we’ve mentioned above, do you think you’ll be waiting in line to get the new iPhone? Let us know in the comments section below!