In the smartphone industry, it's commonly held that you should buy the Google Pixel for its computational photography and post-processing prowess. The hardware is secondary.
Real-time image processing and HDR+ are still the bread and butter of the Pixel camera experience, but it's the side dishes like Photo Unblur, Guided Frame, and Magic Eraser that will resonate with most users -- professional or not. And Google continues to bet on making the most inclusive cameras in the business. One example is putting its machine learning to work to optimize for a wide variety of skin tones. But there are other examples, too.
Also: I went to the zoo with a $2,500 camera and a Pixel 7 Pro. The results surprised me
I recently caught up with Navin Sarma, Product Manager of Google Research, to discuss the thought process behind the latest software features on the Pixel 7 and Pixel 7 Pro, as well as how the company is further differentiating its cameras from the rest of the market.
Google's camera team is largely made up of professional photographers and videographers, including Sarma, who specializes in landscape photography. If, like Sarma, you've shown any ounce of knowledge in the field, you've probably been asked, "What is the best camera setting?" He chuckles at the thought of it. There is no "best" camera setting.
Also: Magic Eraser is finally coming to iPhones (and other Android phones)
Instead, Sarma and his team lean into the idea of one-tap capture: a series of user-friendly tools and features for every shooting scenario.
Creating such a flexible smartphone camera is what excited Sarma about scaling the Pixel experience. "We aren't catering to a specific demographic with the Pixel camera. The general philosophy is that if you have any inclination to take a picture, then this camera's for you." Night Sight and Top Shot are among the tools that Sarma categorizes as "accessible creativity."
Accessible creativity extends to Photo Unblur, Magic Eraser, and Portrait Light, features that are somewhat different from the real-time settings within the Google Camera app. Instead of guiding your shooting experience, they live in the Photos app, built to correct the past and double as a safety net for bad photos. This is where Google's AI and machine learning really come into play.
More: How to use Pixel's Magic Eraser
"There are a bunch of issues that the team faces on a day-to-day basis. Naturally, as photographers in and outside of work, there's an intuition of common challenges and roadblocks, such as blurriness and the need for, say, a lightbox to capture evenly-lit portraits," Sarma said. "We consider these pervasive problems from a professional level, democratize them from the context of general users, and then validate the problems that are worth solving."
Photo Unblur lets you dial in the sharpness of previously captured images.
That's why Photo Unblur and Magic Eraser exist, even if they aren't as robust as professional software like Adobe Photoshop. In both cases, Google's computational system -- with the help of its in-house silicon, Tensor -- studies your images at the pixel level to identify consistent patterns, lines, and color profiles. It can then distinguish foreground objects from the background and make the changes you want.
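To make that idea concrete, here's a minimal sketch of the same two concepts using classical image-processing tools rather than Google's proprietary models: inpainting to fill an erased region from its surroundings (a rough Magic Eraser analogue) and unsharp masking to dial sharpness back in (a rough Photo Unblur analogue). The file names and mask coordinates are hypothetical, and none of this is Google's actual Tensor pipeline.

```python
# Toy Magic Eraser: mark a region, then fill it from surrounding pixels.
# A classical stand-in for Google's ML-based pipeline, not its actual code.
import cv2
import numpy as np

image = cv2.imread("photo.jpg")  # hypothetical input photo

# Mask of the object to erase: white where pixels should be replaced.
# A real pipeline would get this from a segmentation model or a user scribble.
mask = np.zeros(image.shape[:2], dtype=np.uint8)
cv2.rectangle(mask, (220, 140), (320, 300), color=255, thickness=-1)

# Telea inpainting estimates the missing pixels from nearby patterns and
# colors -- a rough analogue of "discerning the background" described above.
erased = cv2.inpaint(image, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
cv2.imwrite("photo_erased.jpg", erased)

# Toy Photo Unblur analogue: unsharp masking sharpens after the fact by
# subtracting a blurred copy. Google's feature uses learned deblurring models.
blurred = cv2.GaussianBlur(erased, (0, 0), sigmaX=3)
sharpened = cv2.addWeighted(erased, 1.5, blurred, -0.5, 0)
cv2.imwrite("photo_sharpened.jpg", sharpened)
```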
Also: How to use Pixel's Photo Unblur
Ultimately, Google is encouraging users to look forward to the images they capture, and to look back at the ones from the past.
Representation matters, and building for inclusivity is central to Sarma and the Google Pixel team. For instance, last year's Pixel 6 introduced Real Tone, a camera-tuning feature that corrects white balance and highlights when capturing people of color. For the longest time before then, subjects with darker skin tones would often appear washed out by brighter backgrounds, and overall skin brightness looked unnatural. Real Tone addressed that problem.
Also: Google Pixel 7 Pro vs Pixel 6 Pro: Should you upgrade?
With the Pixel 7, Google introduced Guided Frame, a feature that leverages Android's TalkBack screen reader and haptic feedback to help users who are blind or have low vision take selfies. It's a handy tool and one that brings meaningful innovation to smartphone cameras. "Our job isn't done yet," Sarma added. "It's an endless goal to make everyone feel represented. Again, accessible creativity."
More: Smartphones with interchangeable camera lenses: Hardware chaos or pure genius?
Google's machine learning and AI-powered camera features are like cheat codes: shortcuts that sidestep hardware limitations and simplify even the most tedious photography tasks, like long-exposure shots and balancing dynamic range.
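For a feel of what that kind of shortcut looks like in code, here's a minimal sketch of classical exposure fusion, which blends bracketed shots into one balanced frame. HDR+ actually works differently -- it aligns and merges a burst of short, equal exposures with ML-guided processing -- so treat this as an analogue rather than Google's method. The file names are hypothetical.

```python
# Toy HDR-style merge: fuse bracketed exposures into one balanced image.
import cv2

# Bracketed exposures of the same scene (hypothetical file names).
exposures = [cv2.imread(p) for p in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens fusion weights each pixel by contrast, saturation, and exposedness,
# then blends -- no exposure times or tone mapping required.
fused = cv2.createMergeMertens().process(exposures)  # float32, roughly [0, 1]

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```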
Also: Google commits to building AI model that supports 1,000 languages
Putting megapixel counts and sensor sizes aside, Sarma said, "If I had to describe the Pixel camera with one word, it would be authentic. It's all about you: from the images you capture to how your subjects are portrayed to what you're able to do with them after the fact."