Another obvious example is when a model lifts a surgical mask up to their face: the model’s skin appears darker in the Pixel 8 Pro video, while the change is less dramatic on the Pixel 9 Pro. This also plays out when multiple people with different skin tones are photographed together; Koenigsberger says there should be less exposure distortion between them.
Analyzing photos taken in the field, I found the algorithm update wasn’t hard to spot, especially with the luxury of having a model in front of you. Even in normal lighting conditions, the Pixel 9 Pro’s skin tones looked much closer to real life than the Pixel 8 Pro’s. Koenigsberger says this is also down to major changes to Google’s HDR+ imaging pipeline (more on that later), allowing the system to produce more accurate shadows and mid-tones.
Another new change is Auto White Balance Segmentation. This process allows Auto White Balance to adjust the color of people and the background in your photo separately. Before, you might have noticed colors bleeding in from the background, such as a blue sky making skin tones look cooler. The new system helps people “separate from the background and keep their natural appearance,” says Koenigsberger.
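Google hasn’t published how this works under the hood, but the core idea of segmentation-based white balance can be sketched in a few lines. The snippet below is a conceptual illustration, not Google’s implementation: it assumes you already have a person mask (from a segmentation model) and simply applies different per-channel gains to the person and the background, so a cool cast from a blue sky can be corrected in the background without cooling down skin tones. All function and variable names here are hypothetical.

```python
import numpy as np

def white_balance_segmented(image, person_mask, person_gains, background_gains):
    """Apply separate per-channel white balance gains to the person
    region and the background of an RGB image.

    image:           float array of shape (H, W, 3), values in [0, 1]
    person_mask:     bool array of shape (H, W), True where a person is
    *_gains:         length-3 per-channel multipliers for (R, G, B)
    """
    mask3 = person_mask[..., None]  # broadcast the mask over the color axis
    out = np.where(mask3, image * person_gains, image * background_gains)
    return np.clip(out, 0.0, 1.0)

# Toy example: a uniform gray frame with a pretend person in the center.
image = np.full((4, 4, 3), 0.5)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True

# Warm the person region slightly; leave the background untouched.
result = white_balance_segmented(image, mask, [1.2, 1.0, 0.9], [1.0, 1.0, 1.0])
```

The key design point is that the two regions get independent corrections, which is what prevents the background’s color cast from bleeding into skin tones.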
This year’s Pixel 9 series also marks the first time that Google’s skin tone classification feature is fully compliant with the Monk Skin Tone Scale, a publicly available, 10-point scale that represents a wide range of skin tones to help with everything from computational photography to healthcare. Koenigsberger said the change allows for more granular color tuning.
Perhaps most important is the fact that Real Tone has been tested on all of Google’s “hero” features across the entire Pixel 9 series for the first time. Koenigsberger says his team has been able to expand testing to ensure that new features like Add Me are vetted with Real Tone before launch. This is important: Koenigsberger says his team sometimes can’t spend enough time testing on A-series Pixel phones, which may be why there were issues with Real Tone on the Pixel 8a. He hopes that expanding the process will help move Real Tone from a specific set of technologies into more of an operating philosophy at Google.
“Ultimately, this is going to be someone’s memory,” Koenigsberger said. “An experience with your family, a trip with your best friend, the more we can recreate those experiences as we test them, the more we can ensure we’re delivering something that people will enjoy.”
Artificial memory
Memories are the underlying theme driving many of the new features announced by Google’s camera team. Earlier in the day, I spoke with Isaac Reynolds, group product manager for Pixel camera, who’s been in the job since the first Pixel phone launched in 2016. As he nears his 10th anniversary at Google, Reynolds is “probably more passionate than a lot of other people” about mobile photography and believes there’s still a lot of room for cameras to evolve. “I know there are memories that people can’t capture because of technological limitations.”
While the new camera features on Pixel phones are more focused on specific situations than broad changes to the general camera experience, Reynolds said the Pixel 9 series has a reworked HDR+ pipeline. Exposure, sharpness, contrast, and shadow blending have all been retuned, along with updates to Real Tone, to help create more “real” and “natural” images, which he suggests people prefer over the more processed, punchy, and filter-heavy images that were popular a decade ago.