With the Pixel 2, Google is doubling down on the single camera

The Google Pixel 2 has one camera. And that's actually a really big deal.

There are a lot of interesting aspects to the Google Pixel 2 and Pixel 2 XL, but one of the most notable is also the least controversial: Google is sticking with a single rear camera.

Every major phone manufacturer, from Apple to Samsung to LG and Huawei, has transitioned over the last few years to flagships with dual camera setups. While the implementation varies between handsets — a second monochrome sensor, or a wide-angle lens, or telephoto/portrait abilities — the strategy is the same: augment the primary shooter with additional functionality in an attempt to stand out from the quickly maturing field of competitors, and in turn sell more phones.

Google wants people to take great photos every time, and it's using its expertise in software to make sure that happens.

With the Pixel 2 and Pixel 2 XL, Google is doing the exact opposite. It is doubling down on the single camera and investing heavily in software-based solutions to augment the 12MP sensor's natural abilities. Sure, both the Pixel 2 and Pixel 2 XL benefit from new physical hardware, in this case the addition of optical image stabilization as well as a wider, faster f/1.8 lens, but any portrait effects, noise reduction for digital zoom, or tightly stitched panoramas are all done using Google's increasingly powerful, and incredibly impressive, suite of software tools marketed under the banner of "computational photography."

As Google showed with the first-generation Pixel's HDR+ mode, computational photography has real-world advantages. Sure, most manufacturers, from Apple to Samsung, use software to influence the output of photos to some extent, but Google's strategy is to completely mitigate the disadvantages of having only one sensor — indeed, to pour all of its resources into that one digital pathway — through lines of code. And while HDR+ has existed in the Nexus line as far back as 2013's Nexus 5, it wasn't until 2016 with the Pixels that the hardware's speed caught up to the software's ambition. Back in 2014, with the Nexus 6, Phil Nickinson wrote this about the oversized phone's camera:

Again, I've gotten some really good low-light shots. And I've gotten some really bad ones.

Google's HDR+ mode helps with that some, bringing a little better balance. But it also exposes our chief gripe with the camera app. It's just slow. It takes more than a few beats to launch from a cold start, and even worse if you don't manage to actually launch it on the first try from the lock screen shortcut. And you can take an HDR+ shot then have to wait a good 5 or 10 seconds for it to finish processing before you can tell if you need to take another one.

That frustrating wait time became less significant in 2015 with the Nexus 6P, and considerably more tolerable with the Pixels. Today, when you aim and shoot with a Pixel or Pixel XL, it's safe to leave HDR+ on all the time, since processing is practically instantaneous. And the processing abilities are profoundly better: HDR+ smooths the skin tones of a portrait, captures the vibrancy of a sunny day, properly exposes a delicate sunset, and brings out detail in low light.
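To get a rough sense of why merging frames in software can stand in for extra hardware, consider that Google has described HDR+ as capturing a burst of short exposures and combining them. The sketch below is a toy illustration of the noise-averaging idea at its heart, not Google's actual pipeline; it assumes numpy, frames that are already aligned, and a hypothetical merge_burst helper.

    import numpy as np

    def merge_burst(frames):
        """Average a burst of short-exposure frames.

        Averaging N aligned frames cuts random sensor noise by roughly
        sqrt(N), which is the core idea behind burst merging. A real
        pipeline like HDR+ also aligns frames tile-by-tile and
        tone-maps the result; this toy version assumes the frames are
        already aligned.
        """
        stack = np.stack([f.astype(np.float64) for f in frames])
        merged = stack.mean(axis=0)
        return np.clip(merged, 0, 255).astype(np.uint8)

    # Hypothetical usage: five pre-aligned 8-bit frames of one scene.
    burst = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
             for _ in range(5)]
    photo = merge_burst(burst)

The payoff of this approach is that five cheap exposures, merged well, can behave like one exposure from a much better sensor, which is exactly the trade Google is making by skipping a second camera.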

Part of the reason the Pixel does so much better at this stuff than the Nexus 6P before it is Google's stringent oversight of the hardware and software; though the company does not build its own phones, it spent a lot of time working with its hardware partners HTC and Sony to perfectly tune the camera for its software.

With the Pixel 2, we can expect even more control over this process, and more impressive results.
