The iPhone 14 is almost certainly around the corner, and the rumor mill is spinning furiously as we head towards launch. There's plenty to speculate about, but it's the camera that I particularly want to see take real steps forward.
Apple’s phone cameras have always been superb, with the iPhone 13 Pro capable of taking the kind of photos you’d expect from professional cameras, and even its cheapest iPhone SE able to capture great shots on your summer vacation. But rivals like the Galaxy S22 Ultra pack incredible camera systems that mean Apple no longer has the edge it once had.
So I sat there dreaming about how I would go about redesigning Apple’s camera system for the iPhone 14 to hopefully secure its position as the best photography phone. Apple, take note.
A much larger image sensor on the iPhone 14
Image sensors inside phones are tiny compared to those in pro cameras like the Canon EOS R5. The smaller the image sensor, the less light it can gather, and light is everything in photography. More captured light means better images, especially at night, and that’s why professional cameras have sensors many times larger than those found in phones.
Why are phone cameras lacking in this regard? Because image sensors have to fit inside slim phone bodies, where space is limited. But there is definitely room to play. Some phones, going back as far as the 2015 Panasonic CM1, have packed 1-inch camera sensors, which can offer vastly improved dynamic range and low-light flexibility, so it’s not too crazy to hope for a much larger image sensor inside the iPhone 14.
Sure, Apple is doing incredible things with its computational photography to extract every ounce of quality from its small sensors, but if it combined those same software skills with a much larger image sensor, the difference could be dramatic. A 1-inch image sensor surely shouldn’t be out of the question, but I’d really like to see Apple go even further with an APS-C sized sensor, like those found in many mirrorless cameras.
Admittedly, all three cameras couldn’t have massive sensors or they just wouldn’t fit in the phone, but maybe the main one could get a size upgrade. Either that, or Apple could use a single massive image sensor and place the lenses on a rotating dial on the back, letting you physically change the angle of view to suit your scene. I’ll be honest, though, that doesn’t sound like a very Apple thing to do.
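To put those sensor sizes in perspective, here’s a quick back-of-the-envelope comparison. This is a minimal sketch: the phone entry is an assumption standing in for a typical large phone sensor, while the other dimensions are the standard format sizes.

```python
# Rough sensor dimensions in millimetres (width, height). The phone entry is
# an assumption for a typical large phone sensor, not a specific iPhone model.
SENSORS = {
    "typical phone (1/1.7-type)": (7.6, 5.7),
    "1-inch type": (13.2, 8.8),
    "APS-C": (23.6, 15.6),
    "full frame": (36.0, 24.0),
}

def area_mm2(size):
    """Surface area of a sensor, roughly proportional to the light it gathers."""
    width, height = size
    return width * height

phone_area = area_mm2(SENSORS["typical phone (1/1.7-type)"])
for name, size in SENSORS.items():
    print(f"{name}: {area_mm2(size):.0f} mm^2 "
          f"({area_mm2(size) / phone_area:.1f}x the phone sensor)")
```

Even a 1-inch type sensor gathers well over twice the light of a large phone sensor, and APS-C is several times larger again, which is why the low-light gap between phones and dedicated cameras persists.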
A zoom to finally compete with Samsung
While I generally find images taken on the main iPhone 13 Pro camera to be better than those taken on the Galaxy S22 Ultra, there is one area where Samsung wins hands down: the telephoto zoom. The iPhone’s optical zoom peaks at 3x, but the S22 Ultra offers up to 10x optical zoom.
And the difference it makes in the shots you can get is amazing. I love zoom lenses because they allow you to find all sorts of hidden compositions in a scene, instead of just using a wide lens and capturing everything in front of you. I find they allow for more artistic and thoughtful images, and while the iPhone’s zoom helps you get those compositions, it’s no match for the S22 Ultra.
So what the phone needs is a proper zoom lens that relies on good optics, not just digital cropping and sharpening, which still results in pretty muddy photos. It should have at least two levels of optical zoom: 5x for portraits and 10x for more detailed landscapes. Or even better, it would allow continuous zooming between those levels to find the perfect composition, rather than having to simply choose between two fixed zoom options.
Personally, I think 10x is the maximum Apple should go for. Sure, Samsung claims its phone can zoom up to 100x, but the reality is that those shots rely heavily on digital cropping and the results are terrible. 10x is huge, the equivalent of carrying a 24-240mm lens on your DSLR: wide enough to take in sweeping landscapes, with enough reach for wildlife photography too. Ideal.
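That 24-240mm comparison is easy to sanity-check. A minimal sketch, assuming the phone’s widest lens is 24mm-equivalent (the exact figure varies by model, so treat it as an illustration):

```python
# Assumed 35mm-equivalent focal length of the phone's main (widest) lens.
WIDE_EQUIV_MM = 24

def equivalent_focal_length(zoom_factor: float) -> float:
    """Convert an optical zoom factor to a 35mm-equivalent focal length."""
    return WIDE_EQUIV_MM * zoom_factor

for zoom in (1, 3, 5, 10):
    print(f"{zoom}x optical zoom ~ {equivalent_focal_length(zoom):.0f}mm equivalent")
```

Under that assumption, 10x lands right at 240mm, the long end of a classic travel-zoom range.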
Professional video controls built into the default camera app
With the introduction of ProRes video on the iPhone 13 Pro, Apple has given a strong signal that it sees its phones as a genuinely useful video tool for professional creatives. ProRes is a video codec that captures a huge amount of data, allowing greater editing control in post-production software like Adobe Premiere Pro.
But the camera app itself is still pretty basic, with video settings limited mostly to turning ProRes on or off, switching zoom lenses, and changing resolution. And that’s kind of the point: make it as easy as possible to shoot and capture beautiful footage hassle-free. But pros who want to use ProRes will likely want more manual control over things like white balance, focus, and shutter speed.
Granted, that’s why apps like Filmic Pro exist, giving you incredible control over all those settings to get exactly the look you want. But it would be nice to see Apple make these settings more accessible in the default camera app. That way, you could launch the camera from the lock screen, change a few settings, and start shooting right away, confident that you’re getting exactly what you want from your video.
In-camera focus stacking on the iPhone
Imagine you’ve found a beautiful mountain wildflower with a towering alpine peak behind it. You move in close and tap the flower to bring it into sharp focus, but now the mountain behind it is blurred; tap the mountain instead and the flower blurs. This is a common problem when trying to focus on two elements of a scene that are far apart, and experienced landscape and macro photographers work around it using a technique called focus stacking.
Focus stacking means taking a series of images with the camera held still while focusing on different elements of the scene. These images are then blended, usually in desktop software like Adobe Photoshop or dedicated tools like Helicon Focus, to create a single image that is sharp from the extreme foreground to the background. It’s the opposite of the iPhone’s Portrait mode, which deliberately defocuses the background around a subject for that nifty shallow depth of field, or “bokeh”.
It might be a niche desire, but I would love to see this focus stacking capability built into the iPhone, and it might not even be that hard to do. After all, the phone already uses image fusion technology to combine different exposures into a single HDR image; it would just be doing the same thing with focus points rather than exposures.
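The core of the technique is simple to sketch: measure how sharp each frame is at every pixel, then keep the sharpest frame’s pixel. Here’s a minimal NumPy illustration (a toy version, not Apple’s or any dedicated tool’s actual pipeline, and it assumes the frames are already aligned):

```python
import numpy as np

def laplacian_sharpness(img):
    """Per-pixel sharpness: absolute response of a simple Laplacian filter."""
    return np.abs(-4 * img
                  + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
                  + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))

def focus_stack(frames):
    """Blend aligned frames by keeping, at each pixel, the sharpest frame."""
    frames = np.stack(frames)                        # (n, height, width)
    sharpness = np.stack([laplacian_sharpness(f) for f in frames])
    best = np.argmax(sharpness, axis=0)              # sharpest frame per pixel
    return np.take_along_axis(frames, best[None], axis=0)[0]
```

Given one frame focused on the flower and another on the mountain, this selection step produces an image that is sharp in both regions; a real implementation would also align the frames and smooth the transitions between them.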
Much better long exposure photography
Apple has had the ability to take long exposure photos on the iPhone for years. You’ll have seen these shots: images of waterfalls or rivers where the water has been artfully blurred while the rocks and landscape around it remain sharp. It’s a great technique for really emphasizing movement in a scene, and it’s something I like to do on both my dedicated camera and my iPhone.
And while it’s easy to do on the iPhone, the results are only okay. The problem is that the iPhone uses a moving image, a Live Photo, to detect movement in the scene and then digitally blur it, which usually means anything that moves gets blurred, even parts of the scene that shouldn’t be. The result is some pretty soft shots, even when you mount the phone on a tripod for stability. They’re fine to send to family or post on Instagram, but they won’t stand up to being printed and framed on your wall, and I think that’s a shame.
I’d like to see Apple make better use of its optical image stabilization to enable very sharp long exposure shots, not just of water but also of night scenes, perhaps car headlights snaking through the streets. This would be another great way to get creative with your phone photography and utilize the excellent quality of these cameras.
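The basic idea behind a computational long exposure can be sketched in a few lines: average a burst of aligned frames so that static elements stay sharp while moving ones blur. A toy NumPy version, assuming alignment has already been handled (by a tripod or stabilization), not a description of Apple’s actual Live Photo processing:

```python
import numpy as np

def long_exposure(frames):
    """Simulate a long exposure by averaging an aligned burst of frames.

    Static parts of the scene keep their value (and stay sharp), while
    anything that changes between frames, like water or light trails,
    is smoothly averaged into a blur.
    """
    return np.mean(np.stack(frames), axis=0)

# Toy 1-D "scene": pixel 0 is static, pixel 1 changes between frames.
burst = [np.array([1.0, 0.0]), np.array([1.0, 1.0]), np.array([1.0, 0.5])]
print(long_exposure(burst))  # static pixel stays 1.0; the moving one averages to 0.5
```

The quality problem the section describes comes down to that alignment assumption: the better the frames are registered before averaging, the sharper the static parts of the final image.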