
Here’s Why The Pixel 3 Camera Is The Best On Any Phone


There once was a time when the average Android user felt incredibly insecure around their iOS-owning friends for one key reason: photography. Traditionally, Android-based phones have struggled to match Apple’s iPhone in this vital element of smartphone design; from the earliest days of the smartphone era, Apple’s devices have tended to be faster to focus and better at capturing images than the vast majority of handsets running Google’s OS. Even when Google made a big song and dance about its latest Nexus phone having the “best camera we’ve ever produced”, the results were usually depressingly second-best when compared to that year’s iPhone.

Since the arrival of the Pixel line of phones, that has changed completely. Sure, Apple’s devices are still great at taking photos, but Google’s phones are now considered to be the best in class when it comes to image capture. The first Pixel blew everyone away with its photographic abilities, while last year’s Pixel 2 upped the ante by including a dedicated chip focused solely on image capture: the Pixel Visual Core.

However, Google has outdone itself with the camera on this year’s Pixel. On paper it looks rather ordinary (and yes, it’s still just a single sensor – no four-camera gimmicks here), but behind the scenes the company has invested a lot of time and effort into making this the industry standard in mobile photography.

The Visual Core Gets An Upgrade

When Google introduced the Pixel Visual Core on the Pixel 2, it was something of a game-changer. The chip’s aim was to give the phone’s photographic power a boost with some dedicated hardware, as well as allowing third-party applications to use the fancy new HDR+ feature for more striking images. On this year’s phone, the Visual Core has been given a host of new tasks to undertake, all aimed at improving the quality of your images.

Top Shot is perhaps the most notable example; instead of taking a single image when you tap the ‘capture’ button, the phone takes several and uses machine learning to pick out the best ones – in theory, this means fewer blurry images and more shots of open eyes and smiling faces. The funny thing is, if you didn’t know Top Shot existed you might not even notice it’s there; instead, you’d just think you had really good timing when taking a photo. It’s a case of the phone making you feel like a better photographer.
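The core of that burst-and-pick approach can be sketched in a few lines. The real Top Shot scores frames with on-device ML models looking at faces, open eyes and blur; the sketch below stands in a simple gradient-energy “sharpness” score, and all names and values are invented for illustration:

```python
# Hypothetical sketch of burst-frame selection in the spirit of Top Shot.
# The real feature uses learned models; here a crude sharpness proxy:
# blurrier frames have smaller pixel-to-pixel changes, so a lower score.

def sharpness(frame):
    """Sum of squared differences between horizontal neighbours."""
    return sum(
        (row[i + 1] - row[i]) ** 2
        for row in frame
        for i in range(len(row) - 1)
    )

def pick_best_frame(burst):
    """Return the index of the sharpest frame in a captured burst."""
    return max(range(len(burst)), key=lambda i: sharpness(burst[i]))

# Toy burst: frame 0 is "blurry" (smooth values), frame 1 has crisp edges.
blurry_frame = [[100, 120, 110, 115]] * 4
sharp_frame = [[0, 255, 0, 255]] * 4
burst = [blurry_frame, sharp_frame]
print(pick_best_frame(burst))  # → 1 (the sharper frame)
```

In practice the scoring, not the selection loop, is where all the cleverness lives – which is exactly why the feature feels invisible to the user.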

Photobooth works in very much the same way, and is best used for selfies or group shots. It again uses AI to look for the best time to take the photo, so it will wait until it sees smiling faces and open eyes before capturing the shot. The process is pretty much automatic, with the phone snapping the images it thinks you’ll like best and allowing you to pick the one you like.

Welcome to Night Sight

The problem of low-light photography is something that smartphone makers have been wrestling with for years; the small sensors on most phones aren’t great at taking images in the dark, and this has led to some interesting solutions – perhaps the most famous being Samsung’s variable aperture on its Galaxy S9 device, a mechanical system which adjusts the amount of light being let into the sensor.

Google has opted for a software-based system which blows Samsung’s approach clean out of the water. Night Sight – which wasn’t available at launch but is being rolled out to Pixel 3 phones as we speak – works by capturing a burst of up to 15 frames, each with an exposure of up to a third of a second, then using machine learning to merge them and ‘fix’ the image so it looks brighter. The system works in much the same way as HDR+, which also combines several photos to build its final image. In short, the phone is smart enough to read a scene and understand what things would look like if there were more light, and the resultant image is remarkable – the only downside is that because of the extended capture period, you have to hold the phone really steady.
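The multi-frame idea at the heart of Night Sight can be illustrated with a toy sketch: averaging several noisy exposures of the same scene cancels out random sensor noise (by roughly the square root of the frame count), which is why merging a burst yields a cleaner low-light image than any single frame. This is a simplified illustration, not Google’s pipeline; the real system also aligns frames, rejects motion and applies learned colour processing. All values below are invented:

```python
# Hedged sketch: why merging many short exposures tames low-light noise.
import random
import statistics

random.seed(0)

TRUE_BRIGHTNESS = 20.0   # the dim scene's "real" pixel value (invented)
NUM_FRAMES = 15          # Night Sight merges up to 15 short exposures

def noisy_capture():
    """One short exposure: the true value plus random sensor noise."""
    return TRUE_BRIGHTNESS + random.gauss(0, 5)

# Error of a single frame vs. the error after merging NUM_FRAMES frames,
# averaged over many trials so the comparison is stable:
single_errors = [abs(noisy_capture() - TRUE_BRIGHTNESS) for _ in range(200)]
merged_errors = [
    abs(statistics.mean(noisy_capture() for _ in range(NUM_FRAMES))
        - TRUE_BRIGHTNESS)
    for _ in range(200)
]

print(statistics.mean(merged_errors) < statistics.mean(single_errors))  # True
```

The averaging is also why steadiness matters: if the frames don’t line up, the merge smears detail instead of sharpening it.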

When One Sensor is Better Than Two

While Google has decided to equip the Pixel 3 with two selfie-cams – one of which is wide-angle and means you can cram in more co-workers and family members than ever before – it has stuck with a single sensor on the back. The reason? Google’s engineers think that any more than that is overkill.

Instead, Google is adamant that it can do everything a second camera makes possible simply by using clever software tricks. For example, a lot of other dual-camera phones use that second sensor for depth information so that ‘portrait’ effect shots – those with ‘bokeh’ applied – look more convincing. Google insists that this kind of depth data can be captured by a single sensor, and given how reliable and accurate the Pixel 3’s bokeh tricks are, we’re inclined to agree.
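Google’s single-sensor depth trick is widely understood to rest on dual-pixel autofocus: each pixel on the sensor is split into two photodiodes, producing two views separated by a tiny baseline, and the disparity between them hints at depth. As a hedged illustration only – this is a toy one-dimensional alignment search, not Google’s actual algorithm, and every name and value here is invented:

```python
# Toy disparity search: how two slightly offset views of the same scene
# can reveal shift (and hence, with a known baseline, depth).

def best_shift(left, right, max_shift=3):
    """Find the integer shift that best aligns `right` onto `left`."""
    def cost(shift):
        return sum(
            (left[i] - right[i + shift]) ** 2
            for i in range(max_shift, len(left) - max_shift)
        )
    return min(range(-max_shift, max_shift + 1), key=cost)

# A feature seen in the "left" view appears 2 pixels later in the "right":
left = [0, 0, 5, 9, 5, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 5, 9, 5, 0, 0, 0]
print(best_shift(left, right))  # → 2
```

Nearby objects shift more between the two views than distant ones, which is enough signal (combined with learned models) to decide what to blur for that ‘bokeh’ look.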

Optical Zoom? Who Needs It?

Both the iPhone XS and Galaxy S9+ offer 2x optical zoom thanks to their secondary cameras – something that naturally isn’t possible with the Pixel 3’s single sensor. To overcome this, Google has created Super Res Zoom, which again uses software to achieve something that would normally be done via hardware: it combines multiple frames, using the tiny movements of your hand between shots to reconstruct extra detail in the zoomed-in photo. Google has also sharpened up the lens itself so it can resolve those fine details in the first place. The result is a convincing zoom which looks optical but is, in fact, digital, all done via a combination of that improved lens and AI.
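The counter-intuitive part – that hand tremor helps rather than hurts – comes down to sub-pixel sampling: successive frames land on slightly different spots between the original pixel grid, so their samples can be interleaved onto a finer grid. Here is a deliberately tiny one-dimensional sketch of that idea, with invented numbers and none of the alignment or robustness work a real pipeline needs:

```python
# Toy sketch of multi-frame super resolution: two low-res captures offset
# by half a (low-res) pixel together contain the full high-res signal.

high_res = [3, 7, 1, 9, 4, 6, 2, 8]   # the scene's "true" fine detail

# Two low-res captures of the same scene, offset by half a pixel:
frame_a = high_res[0::2]              # samples at even positions
frame_b = high_res[1::2]              # hand tremor shifted the grid

def merge(a, b):
    """Interleave two half-pixel-shifted frames onto a 2x finer grid."""
    out = []
    for x, y in zip(a, b):
        out.extend([x, y])
    return out

print(merge(frame_a, frame_b) == high_res)  # True: detail recovered
```

Real handshake is random rather than a clean half-pixel offset, which is why the feature leans on AI to estimate each frame’s shift before merging.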

Video Recording Is Even Better This Year

Google has tended to pour its efforts into still images in the past, and as a result video recording on Pixel phones has always been merely good rather than amazing. That changes this year with the introduction of Motion Auto Focus, which tackles a familiar smartphone problem: the camera losing focus on a moving subject and hunting to refocus as you or the subject move around. In footage recorded in collaboration with award-winning Hollywood director Terrence Malick (The Thin Red Line), the Pixel 3’s new focus tricks help capture some seriously impressive results, even when you’re up close to the subject or the scene is really busy.

Thanks to Mobile Fun for supplying the Pixel 3 used in this feature.
