The most important hardware feature of any phone, after the screen, is probably its cameras. Apple knows this, which is why it pushes them forward every year. And this year, even though only the iPhone 14 Pro gets the new A16 system-on-a-chip (SoC) and the regular iPhone 14 sticks with last year’s A15, Apple has still thoroughly revamped the cameras in the non-Pro iPhone. Even your selfies will be better.

“The new main camera with a larger ƒ/1.5 aperture and 1.9 µm pixels [means] that any average Joe [with] a semi-steady hand can capture a clear and sharp photo in the kind of low light situations that, if you were using a dedicated DSLR or mirrorless camera, would normally require photography experience, an expensive lens and camera combination,” Mark Condon, a photographer and the CEO and founder of Shotkit, told Lifewire via email.
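To put rough numbers on that claim: the light each pixel gathers scales with the pixel’s area and inversely with the square of the f-number. The short Python sketch below compares the ƒ/1.5 aperture and 1.9 µm pixels quoted above against an assumed ƒ/1.6, 1.7 µm baseline standing in for the previous standard iPhone main camera, so treat the exact figures as illustrative rather than official.

```python
# Back-of-the-envelope estimate of the low-light gain from a wider aperture
# and bigger pixels. The f/1.6 / 1.7 µm baseline is an assumption; the
# f/1.5 / 1.9 µm figures are the ones quoted above.
old_f, old_pixel_um = 1.6, 1.7   # assumed previous-generation main camera
new_f, new_pixel_um = 1.5, 1.9   # figures quoted in the article

# Light per pixel scales roughly with (pixel pitch)^2 / (f-number)^2.
gain = (new_pixel_um / old_pixel_um) ** 2 * (old_f / new_f) ** 2
print(f"Roughly {gain - 1:.0%} more light per pixel")   # ~42%
```

That lands in the same ballpark as the “around 50 percent more light” figure cited for the new main camera below.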
Just Like a Pro
The iPhone 14’s cameras are similar to the iPhone 13 Pro’s cameras from last year, with improvements front and back. The main rear-facing camera has a bigger sensor and a larger aperture, letting in around 50% more light. As you’d expect, that should mean better low-light photos, which covers pretty much every photo that isn’t taken outdoors.

Apple has also indulged its passion for naming absolutely every feature on its devices: the built-in image processing is now called the Photonic Engine. Specifically, the camera grabs several exposures when you take a photo and “merges the best pixels” from each to create an image with better color and detail. As far as I can tell, it’s an evolution of the existing Deep Fusion, aka Sweater Mode, only now the image data is grabbed at an earlier stage in the processing pipeline.

“For me, the increased resolution of the sensor and the improved low light performance are reasons enough for me to upgrade to the latest iPhone 14,” says Condon.

One interesting fact: the sensor in the non-Pro iPhone 14 is now comparable in size to those in Canon’s PowerShot G-series compact cameras from the 2000s, according to DP Review’s Richard Butler. Those were considered some of the highest-end compact cameras of their time. I used to own one, and I can tell you the iPhone already produces far better photos. A big sensor also changes what you can do, like relying less on flash. “The increased sensor size is especially exciting since using the built-in iPhone flash is so unflattering,” says Condon.
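Apple hasn’t published how the Photonic Engine actually merges those frames, but the general idea of combining several exposures and favoring the best-looking data at each pixel can be sketched in a few lines. The function below is a toy illustration only; the sharpness-based weighting is my own stand-in, not Apple’s pipeline.

```python
import numpy as np

def merge_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Toy multi-frame merge, loosely in the spirit of Deep Fusion and the
    Photonic Engine: blend several exposures of the same scene, giving more
    weight to whichever frame looks sharpest at each pixel. Illustrative
    sketch only; assumes single-channel frames of identical shape."""
    stack = np.stack(frames).astype(np.float32)          # shape (N, H, W)
    # Per-pixel "quality" proxy: local gradient magnitude (detail/sharpness).
    gy, gx = np.gradient(stack, axis=(1, 2))
    sharpness = np.hypot(gx, gy) + 1e-6                  # avoid all-zero weights
    weights = sharpness / sharpness.sum(axis=0, keepdims=True)
    # Weighted blend: detailed regions come from the frame that resolved them best.
    return (weights * stack).sum(axis=0)
```

In the real pipeline, the point is that this kind of merging now happens earlier, on less-processed image data, which is where the claimed gains in color and detail come from.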
Selfie AF
The real news here, though, is that the front-facing selfie camera now has autofocus. This is a huge deal, and it may even be surprising. After all, didn’t the selfie camera already have autofocus? No. The selfie cam was always fixed to focus at a set distance. One of the “features” of the tiny sensors in phone cameras is that they produce a very deep depth of field, meaning everything is always more or less in focus. This is why phones need computational tricks like Portrait Mode to fake the shallow depth of field that comes naturally to cameras with larger sensors.

Now, though, the front-facing camera no longer relies on that fixed focus. It focuses on the person in front of it, which should make for sharper, better-quality selfies. Combined with the front-facing camera’s amazing Portrait Mode, which can use the Face ID camera’s 3D depth sensing to make accurate depth maps, your duck faces should look better than ever. And speaking of Portrait Mode, the iPhone 14’s cameras can now blur the foreground in addition to the background.

The trend seems clear. Every year, Apple adds great new camera features to the iPhone Pro, and the next year it incorporates many of those features into the regular model. Next year, then, we can probably expect the amazing new 48-megapixel sensor from this year’s iPhone 14 Pro to end up in the iPhone 15. And that’s fine. If you really want those new features now, you’re probably a photography enthusiast who is willing to pay extra for the Pro. Everyone else still gets better cameras every year, which is great.
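Speaking of those depth maps: once you have one, the basic Portrait Mode trick of keeping the subject sharp while softening everything nearer or farther is easy to mimic. The sketch below is a deliberately simple illustration, mixing in a Gaussian blur by distance from the subject’s depth; the real feature uses learned segmentation and proper lens-blur rendering, and the `fake_portrait_blur` name and parameters here are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_portrait_blur(image: np.ndarray, depth: np.ndarray,
                       subject_depth: float, falloff: float = 1.0,
                       strength: float = 6.0) -> np.ndarray:
    """Toy Portrait Mode: blur pixels in proportion to how far their depth
    is from the subject's, so the background *and* the foreground soften
    while the subject stays sharp. Expects single-channel arrays of the
    same shape; illustrative sketch only."""
    blurred = gaussian_filter(image, sigma=strength)
    # 0 = at the subject's depth (stay sharp), 1 = far from it (fully blurred).
    mix = np.clip(np.abs(depth - subject_depth) / falloff, 0.0, 1.0)
    return (1.0 - mix) * image + mix * blurred
```

Blurring by distance in both directions is exactly why the new foreground blur matters: without it, anything between you and the camera stays distractingly sharp.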