Here’s a brief tour of how iPhone camera quality has improved over the years, starting with the iPhone X and moving forward right up to the current iPhone 14 series and the incoming iPhone 15…



Apple’s iPhone – alongside Android-powered smartphones – has been a trailblazer not just in the smartphone market but also in mobile photography. Platforms like Instagram, TikTok, and other image-based social networks are so popular because everybody is now a photographer, and the catalyst for that change is the smartphone.

Apple’s iPhone is perhaps the best-known smartphone brand. People from all walks of life, old and young, use iPhones, and this mass adoption of smartphones has completely democratized photography.

Over the years, starting with the iPhone X, each iteration of Apple’s iPhone has introduced impressive camera upgrades, improving on what came before and bridging the gap between DSLR and smartphone camera performance. Nowadays, thanks to billions spent on R&D, that gap is narrower than ever.

Feature films have been shot on iPhones, along with millions of hours of YouTube content and trillions of photographs. But how did we get here? Let’s chart the development and evolution of Apple’s iPhone camera over the past several years to find out…

iPhone Camera Quality & Resolution

iPhone Model | Camera Resolution | Megapixels
iPhone 6 | 3264 x 2448 pixels | 8 MP
iPhone 6S to iPhone X | 4032 x 3024 pixels | 12 MP
iPhone XR | 4032 x 3024 pixels | 12 MP
iPhone XS and XS Max | 4032 x 3024 pixels | 12 MP
iPhone 11, 11 Pro, and 11 Pro Max | 4032 x 3024 pixels | 12 MP
iPhone 12, 12 Mini, 12 Pro, and 12 Pro Max | 4032 x 3024 pixels | 12 MP
iPhone 13 Mini, 13, 13 Pro, and 13 Pro Max | 4032 x 3024 pixels | 12 MP
iPhone 14 | 4032 x 3024 pixels | 12 MP
iPhone 14 Plus | 4032 x 3024 pixels | 12 MP
iPhone 14 Pro | 8064 x 6048 pixels | 48 MP
iPhone 14 Pro Max | 8064 x 6048 pixels | 48 MP

It’s important to note that these figures apply to the default settings. You can choose to capture images in different formats (like HEIF, JPEG, or RAW), different aspect ratios, or use features that might affect the final resolution (like Panorama or Burst mode).
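
If you’re curious where those megapixel figures come from, they’re simply the pixel dimensions multiplied together and rounded for marketing. A quick illustrative sketch in Swift:

```swift
// Megapixels are just width × height, expressed in millions of pixels.
func megapixels(width: Int, height: Int) -> Double {
    Double(width * height) / 1_000_000
}

// The default still-image resolutions from the table above.
print(megapixels(width: 4032, height: 3024)) // ≈ 12.19, marketed as 12 MP
print(megapixels(width: 8064, height: 6048)) // ≈ 48.77, marketed as 48 MP
```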

Furthermore, starting with the iPhone 12 Pro, you can take ProRAW images, which contain more detail and allow for more flexibility in editing but also result in much larger file sizes. The resolution remains the same, but the additional data in ProRAW files allows for more detail to be captured and preserved.

What Are ProRAW Images on iPhone?

Apple ProRAW is a feature introduced with the iPhone 12 Pro and 12 Pro Max that combines the benefits of shooting in RAW with the computational photography capabilities that iPhones are known for.

In traditional photography, RAW is a file format that captures all image data recorded by the sensor when you take a photo. Unlike JPEG or HEIF images which are processed and compressed, RAW images are uncompressed and unprocessed. This gives photographers more control over parameters like white balance, exposure, tone mapping, and noise reduction in post-production.

Shooting in RAW does mean that you miss out on many of the advanced photography features that modern iPhones offer, like Deep Fusion and Smart HDR, which are applied at the moment of capture.

This is where ProRAW comes in. It gives you all the standard RAW information, along with the image processing and noise reduction techniques that iPhones apply. This allows photographers to manually adjust things like exposure, color, and tone in post-production without losing the benefits of the iPhone’s computational photography.

You can think of ProRAW as a hybrid between a traditional RAW format and the processed JPEG or HEIF formats that iPhones typically capture. It’s designed for people who want to manually edit their photos to achieve a specific look, while still benefiting from the iPhone’s advanced photography features.
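
For developers, ProRAW capture is exposed through AVFoundation. The sketch below is a minimal illustration only – it assumes you already have an AVCapturePhotoOutput attached to a running capture session and a delegate to receive the resulting DNG data:

```swift
import AVFoundation

// A minimal sketch of opting a photo output into Apple ProRAW capture.
// Assumes `photoOutput` is already attached to a configured, running
// AVCaptureSession and `delegate` conforms to AVCapturePhotoCaptureDelegate.
func captureProRAWPhoto(with photoOutput: AVCapturePhotoOutput,
                        delegate: AVCapturePhotoCaptureDelegate) {
    // ProRAW is only available on supported hardware (iPhone 12 Pro and later Pro models).
    guard photoOutput.isAppleProRAWSupported else { return }
    photoOutput.isAppleProRAWEnabled = true

    // Pick a RAW pixel format that is specifically an Apple ProRAW format.
    guard let proRAWFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) else { return }

    // Request the capture; the delegate receives the ProRAW (DNG) data to save.
    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```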

iPhone Camera Comparison Over The Years

The Leap with iPhone X


With the iPhone X, Apple signaled a significant shift in iPhone camera technology. Boasting a 12MP dual-camera system, the iPhone X offered a wide and telephoto lens combination, allowing optical zoom and Portrait mode, which had previously been exclusive to the Plus models.

The wide-angle lens featured an f/1.8 aperture, while the telephoto lens had an f/2.4 aperture. What stood out was the improved image signal processor (ISP), which offered faster autofocus in low-light conditions, and the introduction of Portrait Lighting.

What is Portrait Lighting on iPhone?

Portrait Lighting uses complex algorithms and the iPhone’s depth-sensing capabilities to manipulate the lighting on the subject of your photograph as you take it, almost as if you were changing real-world lighting setups in a studio environment.

There are six different Portrait Lighting effects:

  1. Natural Light: Your subject’s face in sharp focus against a blurred background.
  2. Studio Light: A clean look with your subject’s face brightly lit.
  3. Contour Light: Dramatic directional lighting that highlights the subject’s facial features.
  4. Stage Light: The subject’s face is spotlit against a deep black background.
  5. Stage Light Mono: Like Stage Light, but in classic black and white.
  6. High-Key Light Mono: The subject appears in grayscale against a white background.
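
Portrait Lighting itself isn’t something third-party apps can invoke directly, but the depth data and portrait matte it relies on are available to developers. A rough sketch, assuming an already-configured AVCapturePhotoOutput on a depth-capable camera and a capture delegate:

```swift
import AVFoundation

// A sketch of requesting the depth data and portrait effects matte that power
// effects like Portrait Lighting. Assumes `photoOutput` is attached to a running
// session using a depth-capable camera, and `delegate` handles the results.
func capturePortraitPhoto(with photoOutput: AVCapturePhotoOutput,
                          delegate: AVCapturePhotoCaptureDelegate) {
    // Depth and matte delivery must be enabled on the output before capturing.
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
    photoOutput.isPortraitEffectsMatteDeliveryEnabled = photoOutput.isPortraitEffectsMatteDeliverySupported

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    settings.isPortraitEffectsMatteDeliveryEnabled = photoOutput.isPortraitEffectsMatteDeliveryEnabled

    // The delegate's photoOutput(_:didFinishProcessingPhoto:error:) callback can
    // then read photo.depthData and photo.portraitEffectsMatte.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```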

The Refinement with iPhone XS and XS Max


In the next iteration, the iPhone XS and XS Max, Apple focused on refining its camera capabilities. On paper, the specs looked identical to the iPhone X’s. However, Apple pushed deeper into computational photography with a feature called Smart HDR, which used multiple exposures to create a single image with more detail in both the shadows and highlights, elevating the camera’s dynamic range.

What is Smart HDR on iPhone?

Traditionally, cameras have struggled with scenes that include both very bright and very dark areas, also known as high-contrast scenes. HDR technology is designed to remedy this by taking multiple photos at different exposure levels (one for highlights, one for shadows, and one in between) and then combining them into one photo. This results in an image that more accurately represents the range of light and dark areas that the human eye can see in the real world.

Smart HDR takes this concept a step further. When you take a photo, the iPhone’s camera actually captures a series of images almost simultaneously. These images are taken at different exposure levels: some are optimized for highlights, and others for shadows.

The iPhone then analyzes these images, chooses the best parts of each one, and merges them into a single photo. This happens almost instantly, thanks to the incredibly fast processing power of the iPhone’s chip.

Moreover, Smart HDR uses machine learning to recognize faces in your photos, ensuring that people are always well-exposed and detailed.

The result is photos that show more detail, have better-balanced lighting, and look more like what you see with your own eyes. It’s particularly useful when you’re shooting in challenging lighting conditions, such as backlit scenes or situations with high contrast between bright and dark areas.
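
Smart HDR runs automatically and isn’t something apps control directly, but the core idea – capturing several exposures of the same scene – can be sketched with AVFoundation’s bracketed capture API. The example below is illustrative only; it assumes a configured AVCapturePhotoOutput and delegate, and leaves out the merging step that Smart HDR performs for you:

```swift
import AVFoundation

// A rough sketch of manual exposure bracketing, the basic idea behind HDR:
// capture under-, normally, and over-exposed frames of the same scene.
// Smart HDR does this (and the merging) automatically in Apple's pipeline.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // One frame biased darker for highlights, one neutral, one brighter for shadows.
    let exposureSettings = [-2.0, 0.0, 2.0].map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: Float($0))
    }

    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // 0 = no RAW, processed output only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: exposureSettings
    )

    // The delegate receives one callback per frame; merging them into a single
    // HDR image is left to the app (or, in Apple's case, to Smart HDR).
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```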

The Game Changer: iPhone 11 Series


The iPhone 11 series brought with it a notable advancement: the introduction of an ultra-wide lens. The new lens allowed users to capture a much wider field of view, opening up creative possibilities in landscapes, cityscapes, and group shots.

Night Mode was also introduced, enabling the capture of remarkably bright photos in low-light environments. The iPhone 11 Pro models offered a triple-lens setup for the first time, bringing together wide, ultra-wide, and telephoto lenses.

What is Night Mode on iPhone?

Traditionally, photos taken in dimly lit environments have been a challenge for most smartphones due to digital noise, loss of detail, and inaccurate colors. Night Mode on iPhone addresses these issues using a combination of longer exposure times, multiple shots, and computational algorithms.

Here’s how it works:

  1. When you take a photo in a low-light situation and Night Mode is engaged, the camera automatically takes a series of images over a period of time (the exact duration varies depending on the light conditions). Each of these images is at a different exposure level to capture various levels of detail.
  2. Then, the iPhone’s powerful chip and advanced software go to work. They align the images to correct for any movement (either from the subject or the camera), combine the best parts of each image, reduce noise, and enhance detail.
  3. The result is a single, well-exposed photo that appears much brighter and more detailed than what you could typically achieve in a similar low-light situation.
One key thing to note about Night Mode is that it engages automatically when the camera sensor determines that the lighting conditions require it. You can tell when Night Mode is active because the Night Mode icon (a moon) at the top of the Camera app turns yellow. You can also manually adjust the exposure time when Night Mode is active by tapping the Night Mode icon and using the slider that appears.
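
Night Mode isn’t exposed as a developer API, but the frame-combining idea described in step 2 can be illustrated with a toy example: averaging several already-aligned frames suppresses random sensor noise while keeping the scene itself intact. This is only a conceptual sketch – Apple’s pipeline also handles alignment, weighting, and tone mapping:

```swift
// A toy illustration of the frame-stacking idea behind Night Mode. Each frame
// here is a flat array of grayscale pixel values (0–255) that are assumed to
// be already aligned with one another.
func stackFrames(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    var accumulated = [Double](repeating: 0, count: first.count)
    for frame in frames {
        for (index, value) in frame.enumerated() {
            accumulated[index] += value
        }
    }
    // Noise is random from frame to frame, so it averages away;
    // the true scene brightness is the same in every frame, so it survives.
    return accumulated.map { $0 / Double(frames.count) }
}
```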

The Bold Innovator: iPhone 12 Series


The iPhone 12 series brought about significant improvements, with the Pro models retaining the triple-camera system of the iPhone 11 Pro. The entire iPhone 12 lineup saw Night mode extended across all cameras, including the front-facing camera, a first for iPhones. HDR video recording became possible with Dolby Vision HDR, raising the bar for video quality in smartphones.

The flagship iPhone 12 Pro Max introduced sensor-shift optical image stabilization for its wide camera, a technology previously found only in dedicated cameras. This change helped reduce camera shake and improved low-light photography. It also brought a larger sensor for the wide camera and a longer telephoto lens.

What is Sensor-Shift Optical Image Stabilization?

In traditional optical image stabilization, the lens moves to counteract any minor movements of the camera (such as your hand shaking slightly) that might blur the image. The goal is to keep the lens stable so that the image hitting the sensor is as sharp as possible.

Sensor-shift OIS takes a different approach: instead of moving the lens, it moves the sensor. This allows for potentially more precise correction of camera movement, resulting in even sharper photos and steadier videos. It’s especially beneficial in situations where the camera might be moving a lot (like walking or riding in a vehicle) or in low-light conditions, where the camera needs to keep the shutter open longer and is therefore more susceptible to blur from camera shake.

One of the main advantages of sensor-shift OIS over traditional OIS is that it can correct movement along five axes (up/down and left/right shifts, plus pitch, yaw, and roll rotations) rather than just two, providing a more comprehensive stabilization system.
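
As a rough, simplified illustration of the idea (and not Apple’s actual control loop): a small angular wobble of the camera shifts the projected image on the sensor by roughly the focal length multiplied by the tangent of the angle, so the stabilizer moves the sensor by the same amount in the opposite direction. The focal length below is a hypothetical value used only for illustration:

```swift
import Foundation

// A simplified illustration of the sensor-shift idea, not Apple's control loop:
// an angular shake of `shakeDegrees` moves the projected image by roughly
// focalLength * tan(angle), so the sensor is shifted by the same amount in the
// opposite direction to keep the image steady on the sensor.
func sensorShiftMM(forShakeDegrees shakeDegrees: Double, focalLengthMM: Double) -> Double {
    let radians = shakeDegrees * .pi / 180
    return -focalLengthMM * tan(radians) // negative: move against the shake
}

// Hypothetical example: a 0.5° wobble with a 7 mm focal length.
print(sensorShiftMM(forShakeDegrees: 0.5, focalLengthMM: 7)) // ≈ -0.06 mm
```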

This feature, which was previously found only in high-end DSLR and mirrorless cameras, is part of Apple’s ongoing effort to make the iPhone a serious tool for professional photographers and videographers.

The Ultimate Powerhouse: iPhone 13 Series


With the iPhone 13 series, Apple continued to focus on low-light photography. The wide camera saw an increase in sensor size, enabling it to capture more light, and sensor-shift optical image stabilization extended across the entire iPhone 13 range. Notably, Cinematic Mode was introduced, using depth effects to automatically change focus during videos – a feature usually associated with high-end cameras and professional filmmaking.

The Pro models continued to differentiate themselves with ProRAW and added ProRes video recording, giving professionals more control over their images and videos.

What is Cinematic Mode on iPhone?

The standout characteristic of Cinematic Mode is its ability to automatically change the focus between subjects in a video in a way that mimics the focus changes in professional movies, hence the name “Cinematic”. This is combined with a shallow depth-of-field, or “bokeh”, effect, where the subject is in sharp focus while the background is blurred.

Here’s how it works:

  1. When you’re shooting a video in Cinematic Mode, the iPhone uses machine learning and AI to identify people, animals, and objects in the scene. It then automatically creates a depth map of the scene, which allows it to keep certain subjects in sharp focus while blurring the background.
  2. One of the most impressive aspects of Cinematic Mode is its ability to automatically switch focus when a subject enters the frame or even when a subject in the frame looks away. For example, if two people are in the frame and one looks at the other, Cinematic Mode will automatically shift the focus to the person being looked at.
  3. Furthermore, you can manually change the focus and even adjust the level of the bokeh effect during and after recording, offering a great deal of creative control. This is possible because the depth information is saved alongside the video.
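
The “subject sharp, background blurred” behaviour in step 1 ultimately comes down to mapping each pixel’s depth to an amount of blur. The toy sketch below shows that mapping in its simplest form; Cinematic Mode’s actual rendering is far more sophisticated:

```swift
// A toy version of the depth-to-blur mapping behind a bokeh effect: pixels at
// the chosen focus depth get no blur, and blur grows with distance from that
// depth, capped at a maximum radius.
func blurRadius(forDepth depth: Double,
                focusDepth: Double,
                maxRadius: Double = 12) -> Double {
    let depthDifference = abs(depth - focusDepth)
    // Scale the difference into a blur radius and cap it.
    return min(depthDifference * 4, maxRadius)
}

// A subject 2 m away stays sharp; a background 10 m away gets the maximum blur.
print(blurRadius(forDepth: 2.0, focusDepth: 2.0))  // 0.0  (in focus)
print(blurRadius(forDepth: 10.0, focusDepth: 2.0)) // 12.0 (fully blurred)
```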

The New Era: iPhone 14 Series


The iPhone 14 series, while still fresh on the market, has shown that Apple is not slowing down in pushing the boundaries of mobile photography. There’s a noticeable enhancement in both photo and video quality, thanks to the upgraded ISP and Neural Engine, the new Photonic Engine image pipeline and, on the Pro models, a 48MP main sensor. Night mode has become more capable, and Photographic Styles allow users to apply preferred tonal adjustments that persist across shots, ensuring a consistent style without compromising photo quality.

What Are Photographic Styles on iPhone?

Unlike filters, which are applied after a photo is taken and can often drastically alter an image, Photographic Styles are built into the photographic process itself. When you choose a style, it shapes how the iPhone processes photos at the moment of capture, affecting factors like color, contrast, and tone mapping.

Apple provides four preset Photographic Styles to choose from:

  1. Rich Contrast: For strong, vibrant colors and a deeper contrast. Great for high-impact, bold images.
  2. Vibrant: For brighter, more vivid colors and a bit of boosted contrast.
  3. Warm: Adds a warm color temperature to your photos, enhancing yellows and reducing blues.
  4. Cool: Applies a cooler color temperature, emphasizing blues and reducing yellows.

Each of these styles is designed to create a different look and feel, allowing you to choose the one that best matches your creative intent or the mood of the scene you’re capturing. Furthermore, you can customize each style to further match your preferences, adjusting the Tone and Color of each style.

Once set, your chosen Photographic Style applies to all photos you take, ensuring a consistent look across your images. However, you can change your Photographic Style at any time, and it won’t affect the original quality of the images.
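
Because Photographic Styles are applied inside Apple’s capture pipeline, third-party code can’t reproduce them exactly, but a look like Vibrant can be loosely approximated after the fact with Core Image. The values below are illustrative only:

```swift
import CoreImage

// A loose, after-the-fact approximation of a "Vibrant"-style look (brighter,
// more vivid colors with a little extra contrast) using Core Image. The real
// Photographic Styles are applied during capture inside Apple's pipeline and
// are smarter about things like skin tones, so this is only an illustration.
func approximateVibrantLook(for image: CIImage) -> CIImage? {
    guard let filter = CIFilter(name: "CIColorControls") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(1.2, forKey: kCIInputSaturationKey)  // more vivid colors
    filter.setValue(1.05, forKey: kCIInputContrastKey)   // slightly deeper contrast
    filter.setValue(0.02, forKey: kCIInputBrightnessKey) // a touch brighter
    return filter.outputImage
}
```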
