For the past couple of years, Apple has offered a standard iPhone and a Pro iPhone in the same series. The standard iPhone is a product for the general audience, whereas the Pro lineup is aimed at professional users (photographers, videographers, etc.).
Apple has finally upgraded from a 12MP sensor after seven long years. The iPhone 14 Pro lineup now features a 48MP camera with a quad-pixel sensor, widening the gap between the standard number series and the Pro series in the same lineup. Let’s dive in and look at all its features.
What changes does a 48MP sensor introduce?
Recently, Apple had been falling behind the competition in photography, as other smartphones ship with larger sensors and higher megapixel counts. Granted, a phone’s camera cannot be judged on hardware specifications alone, but these specifications do make a difference.
For these very reasons, the primary camera on the iPhone 14 Pro series has been upgraded to a 48MP quad-pixel sensor. But what is a quad-pixel sensor? Does the term mean something, or is it just a marketing gimmick? Let’s find out.
What is a quad-pixel sensor?
The quad-pixel sensor is Apple’s marketing term for pixel binning, a process where the sensor combines four individual pixels into one large virtual pixel. This lets the sensor capture roughly four times more light, producing overall brighter images.
However, it comes with a drawback: it compromises on detail and sharpness. So why does Apple use this technology at all? A sensor with a higher megapixel count means smaller individual pixels, and smaller pixels capture less light, which leads to poorer image quality, especially in low light.
To avoid this, Apple uses pixel binning. An image shot with the iPhone 14 Pro’s quad-pixel sensor is captured at 12MP, not 48MP, because every group of four pixels is combined into one, dividing the overall pixel count by four: 48 / 4 = 12.
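To make the arithmetic concrete, here is a minimal sketch of 2×2 pixel binning in Python. The tiny NumPy grid of brightness values is purely illustrative; a real sensor readout is far larger and the on-chip binning circuitry is more sophisticated than a simple average.

```python
import numpy as np

# Toy "sensor" readout: a 6x8 grid of pixel values, i.e. 48 pixels
# standing in for 48 megapixels (a real 48MP sensor is roughly
# 8000x6000 pixels; the principle is the same).
sensor = np.arange(48, dtype=float).reshape(6, 8)

# 2x2 binning: merge each block of four neighboring pixels into one
# larger "virtual" pixel, quartering the pixel count.
binned = sensor.reshape(3, 2, 4, 2).mean(axis=(1, 3))

print(sensor.shape)  # (6, 8) -> 48 "pixels"
print(binned.shape)  # (3, 4) -> 12 "pixels", i.e. 48 / 4
```

Each virtual pixel gathers the light of four physical ones, which is why the binned image is brighter at the cost of resolution.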
So while the overall images will be brighter and better, they’ll be captured at 12MP unless you’re using Apple’s ProRAW format.
An understanding of focal length will help you frame shots better. The focal length of a lens determines the angle of view; in simpler terms, it tells us how much of a scene will be captured and what the zooming capabilities are.
Now, the focal length of the 48MP sensor on the iPhone 14 Pro series is 24 mm, which is wider than the 26 mm focal length found on the iPhone 13 Pro series. The wider focal length is an improvement for landscape photography and will accommodate more of your friends in group shots.
The overall focal length of the iPhone 14 Pro’s camera system ranges from 24 mm to 77 mm, a range very similar to that of the zoom lenses used with most digital cameras. Moreover, the ultra-wide lens doubles as a macro camera, and its focal length is as wide as 13 mm.
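The link between those 35mm-equivalent focal lengths and how much of a scene fits in the frame can be sketched with basic trigonometry. The figures below are back-of-the-envelope approximations, not Apple’s official specifications:

```python
import math

def diagonal_aov(focal_mm, sensor_diag_mm=43.27):
    """Approximate diagonal angle of view (in degrees) for a
    35mm-equivalent focal length; 43.27 mm is the diagonal of a
    full-frame (36x24 mm) sensor."""
    return math.degrees(2 * math.atan(sensor_diag_mm / (2 * focal_mm)))

for f in (13, 24, 26, 77):
    print(f"{f} mm -> ~{diagonal_aov(f):.0f} degrees")
```

Shorter focal lengths give a wider view: by this estimate, 24 mm covers roughly 84 degrees versus roughly 80 degrees at 26 mm, which is why the new main camera fits more of a landscape or group shot into the frame.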
Another change ushered in by the 48MP sensor is the 2x optical-quality telephoto zoom. Your zoom options are now 0.5x, 1x, 2x, and 3x. But how is optical-quality zoom possible without a dedicated sensor?
The 48MP sensor crops in and uses the central 12MP of the frame, and this is what enables the 2x optical-quality zoom. You can now use the 2x telephoto for Portrait mode and panorama shots, and it also enables 2x zoomed-in time-lapse video.
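The crop-to-zoom idea can be illustrated with a short sketch. The 8064×6048 resolution used here is the commonly cited dimension of a 48MP ProRAW image; treat it as an assumption for illustration:

```python
def crop_zoom_2x(width, height):
    # 2x zoom by cropping: keep the central half of each
    # dimension, which is one quarter of the total pixels.
    return width // 2, height // 2

full = (8064, 6048)         # ~48.8 million pixels (~48MP)
w, h = crop_zoom_2x(*full)  # (4032, 3024)
print(w * h)                # ~12.2 million pixels -> a 12MP image
```

Because the cropped region still maps one sensor pixel to one image pixel, the result keeps full detail at the 12MP output size, which is why Apple can call it "optical-quality" rather than digital zoom.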
Flashy iPhone with a new flash
The flash on the iPhone 14 Pro series has received a massive upgrade, and I’m genuinely looking forward to testing it. It is now an array of nine LEDs controlled by software, and its pattern and intensity adapt to the focal length of the shot.
Apple has unique names for all the technology it introduces in its devices, and for once the name fits: “Adaptive True Tone Flash.” Apple’s hardware and software work in unison; that is how this and most other iPhone features function seamlessly and deliver excellent results.
So, what hardware apart from the camera sensor enables the iPhone 14 Pro series to go beyond its capabilities and process images in a league of their own? The answer is the new and improved Photonic Engine.
Improvements to the Photonic Engine
The A16 Bionic, the fastest chip in any smartphone available today, houses the all-new Photonic Engine. It combines hardware, software, and machine learning to deliver up to two times better images in low light, and it improves on Apple’s Deep Fusion technology.
The CPU, GPU, Neural Engine, and Image Signal Processor of the A16 Bionic work together to perform up to 4 trillion operations per photo! To learn more, check out our explainer, “What is Photonic Engine and how does it work?”
Unbeatable video recording capabilities
Videos shot on the iPhone are uncontested, period. ProRes recording is exclusive to the iPhone, though that is unsurprising since ProRes is Apple’s proprietary video codec. Moreover, the iPhone is the only smartphone that supports Dolby Vision HDR video recording.
To widen the gap even further, Apple has introduced Action mode, which offers excellent gimbal-like stabilization. From what I can see, the sensor crops in, and the frame becomes much narrower.
Apple describes Action mode as follows: “It uses the full sensor with more overscan and advanced roll correction to make video look incredibly stable when you’re in the middle of the action. Simply toggle it on for great-looking, smooth video without having to carry extra gear like a gimbal. It also supports Dolby Vision HDR.”
Cinematic mode, introduced with the iPhone 13 Pro series, also gets an update and now supports up to 4K HDR. You can shoot at either 30 fps or cinema-grade 24 fps.
That’s all we know as of now.
These are all the upgrades facilitated by the new camera system on the iPhone 14 Pro series that we are aware of so far. I hope they finally help Apple catch up with the rest of the competition in smartphone photography.
In terms of video, no smartphone offers as reliable a camera as the iPhone. That said, I was hoping Apple would introduce a telephoto lens with greater zooming ability on the iPhone 14 Pro.
The “Far Out” tagline strengthened my hope for a periscope lens on the iPhone. Sadly, Apple doesn’t think it’s the right time to capture far-out scenes, and we will have to wait at least another year to see a periscope lens on the iPhone.