I’ve used Google Pixels and Apple iPhones for my daily smartphone photography needs for years. I’ve mostly relied on Pixels because of Google’s pioneering computational photography software, which wrings superior image quality out of limited hardware. My current iPhone, an XS Max, has been relegated to occasions when I’ve needed a telephoto lens.
But two recent smartphone launches — of Google’s Pixel 5 and Apple’s iPhone 12 lines — have changed my mind. The Pixel 5’s midrange camera hardware, set against the iPhone 12 Pro Max’s high-end camera array, larger image sensor and new software options, is pushing me into the Apple camp.
It wasn’t supposed to be this way. I’ve been impressed by Google’s ability to convert cutting-edge image processing research into superior smartphone photos. Google demonstrated how profoundly computers can modernize cameras, as it surpassed smartphone rivals and traditional-camera makers.
Google’s decision to build a midrange phone with just two cameras feels like an abandonment. There’s just no way to make up for the multiple cameras that rivals like Samsung, Huawei and Apple employ. Sure, rivals haven’t necessarily matched all of Google’s camera software, but Google’s hardware isn’t close to theirs.
Telephoto vs. ultrawide cameras
In 2019, Google’s Pixel 4 took a step up by adding a second rear-facing camera, a telephoto option for distant subjects. That was the same year Apple added a third camera to its higher-end iPhone 11 Pro models, an ultrawide camera that sat alongside its main and telephoto cameras.
Google tried to match Apple’s prowess this year by replacing the telephoto camera with an ultrawide camera in the Pixel 5. But Apple made major camera improvements with its iPhone 12 Pro, including a bigger image sensor, a longer-reach telephoto lens, improved image stabilization to counteract shaky hands, Dolby Vision HDR video at 60 frames per second and Apple’s more flexible ProRaw format. It’s clear Apple is sinking enormous resources into better photography.
Google’s Pixel 5 smartphone has ultrawide and wide-angle cameras, but no telephoto for more distant subjects.
Google may have made the right call for the broad market. I suspect ultrawide cameras are better for mainstream smartphone customers than telephotos. Ultrawide cameras for group shots, indoor scenes and video are arguably more useful than telephoto cameras for portraits and mountains.
But I want both. I enjoy the different perspectives. Indeed, for a few years I usually carried only telephoto and ultrawide lenses for my DSLR.
In response to my concerns, Google says it’s improved the Super Res Zoom technique for digital zooming on the Pixel 5, with better computational photography and AI techniques that can now magnify up to a factor of 7X.
“We studied carefully to determine what’s really important to folks, and then we focused on that — and shaved off literally hundreds of dollars in the process,” said camera product manager Isaac Reynolds. Having a telephoto camera would have helped image quality, but Google’s priority this year “was to produce a phone that compared well to the top end but at a much lower price — and we did that.”
I’m not so convinced. Even at just 2X zoom, my 2-year-old iPhone XS Max and my 1-year-old Pixel 4 both produce far superior imagery compared with the Pixel 5.
What I do like so far about the Pixel 5 cameras
I want to be clear: Google’s new phone has its merits, and I’ve experienced some of its strengths while testing the Pixel 5 cameras over the past few days. Here are a handful:
- Google’s computational raw offers photo enthusiasts the best of both worlds when it comes to photo formats. It marries the exposure and color flexibility of unprocessed raw photo data with the exposure range and noise reduction of the multishot HDR+ processing ordinarily used to make a JPEG.
- The ultrawide camera really is fun. It also dramatically improves video options, particularly indoors.
- Based on earlier Pixel phones, I share my colleague Lynn La’s concern that Google’s video stabilization can be “drone-like,” but my early tests of video I shot while walking looked more natural.
- Double-tapping the phone’s power button launches the camera app fast. It’s not new with the Pixel 5, but it’s so much faster than tapping the camera icon on the iPhone’s lock screen.
- Night Sight, particularly astrophotography mode, still is amazing for low-light shots.
Google also pointed to other Pixel 5 perks: a portrait light ability to control the apparent light source brightening a subject’s face; portrait shots that work in Night Sight mode; 4K video that now works at a fast 60 frames per second; more advanced HDR+ high dynamic range processing, now boosted by exposure bracketing for better shadow detail in shots like a backlit face; and better video stabilization.
Here’s the rub, though: As Google slips in hardware, rivals are improving their software.
Google’s rivals in computational photography are catching up
Apple didn’t comment on its photography plans for this story, but it spent more than 11 minutes of its launch event touting the iPhone 12 Pro’s photo and video abilities, and its actions speak volumes.
Last year, Apple matched most of what was best about Google’s HDR+ for challenging scenes with bright and dark elements. This year’s Pixel 5 boosts HDR+ by building exposure bracketing into its multishot blending technique. Apple’s Smart HDR alternative, however, is now in its third generation of refinement. Apple is improving the iPhone’s nighttime photos, too. And by using special-purpose processing engines on its A14 chip, Apple’s Deep Fusion technology for preserving detail in low-light shooting now works on all four of the iPhone 12 Pro’s cameras.
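The core bracketing idea is simple to sketch, even if Google’s and Apple’s actual pipelines are far more sophisticated: shoot the same scene at several exposure times, normalize each frame by its exposure, and blend the frames with weights that favor well-exposed pixels, so blown highlights and noisy shadows contribute little. The snippet below is a hypothetical toy illustration of that blending step, not either company’s real algorithm.

```python
def merge_bracketed(frames, exposure_times):
    """Blend exposure-bracketed frames into one estimate of scene brightness.

    frames: lists of pixel values in [0, 1], one list per exposure.
    exposure_times: relative shutter time for each frame.
    A triangle weight peaking at mid-gray downweights clipped highlights
    (long exposures) and noisy shadows (short exposures).
    """
    merged = []
    for pixel_values in zip(*frames):  # walk the frames pixel by pixel
        numerator = denominator = 0.0
        for value, t in zip(pixel_values, exposure_times):
            weight = 1.0 - abs(value - 0.5) * 2.0  # 1 at mid-gray, 0 at extremes
            numerator += weight * value / t        # normalize by exposure time
            denominator += weight
        merged.append(numerator / denominator if denominator > 1e-8 else 0.0)
    return merged


# Simulate a scene captured at three exposures, then merge.
scene = [0.1, 0.4]  # "true" brightness of two pixels
frames = [[min(r * t, 1.0) for r in scene] for t in (0.5, 1.0, 2.0)]
recovered = merge_bracketed(frames, [0.5, 1.0, 2.0])
```

Because none of the simulated exposures clip here, the merge recovers the scene brightness exactly; the weighting only starts to matter once some frames saturate or sink into noise.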
Photo enthusiasts like me prefer unprocessed, raw photo formats so we can fine-tune color balance, exposure, sharpening and noise reduction. That flexibility matters when the camera makes the wrong choices while “baking” raw image data into a more convenient but limited JPEG. Google’s computational raw blended HDR processing with raw’s flexibility, but now Apple plans to release its answer, ProRaw, in an update coming later this year to iPhone Pro models.
“We want to give our pros even more control over the images they capture,” said Alok Deshpande, Apple’s senior manager of camera software engineering, during Apple’s launch event.
Relatively few people use Pixel phones, and that weighs on Google too. Imaging software powerhouse Adobe calibrates its Lightroom photo software to correct lens problems and adapt its HDR tool for some cameras and lenses. No surprise that Pixel phones aren’t on that list. “We tend to provide support based on the popularity of the devices with our customers,” Adobe said in a statement.
In contrast, Adobe is “partnering closely with Apple” to tap into ProRaw abilities. And a Google computational photography guru, Marc Levoy, has left Google and is now at Adobe, where he’s building photo technology into Adobe’s camera app.
Selling a midrange smartphone like a Pixel 5 or Pixel 4a 5G might well make sense when the COVID-19 pandemic has cost millions of jobs and made a $1,099 iPhone Pro Max unaffordable. But for people like me with a photography budget and appreciation for Google’s computational photography smarts, it’s tragic that Google has lost its lead.