Image processing has long put people of colour at a disadvantage, a problem that dates back to the early days of film: cameras and film stocks were tuned to favour lighter skin tones, leaving black and brown subjects poorly rendered.
After years of waiting, Google has finally addressed the matter, announcing that its cameras and imaging products will focus on making images of people of colour “more beautiful and more accurate.”
On the technical side, Google is adjusting its auto-white-balance and exposure algorithms to improve accuracy for subjects with dark skin tones, basing the tuning on a broader data set of images featuring black and brown subjects.
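Google hasn’t published the details of its algorithms, but the general idea of auto-white balance can be illustrated with the classic “grey-world” method: scale each colour channel so the image averages out to neutral grey. This is a minimal textbook sketch, not Google’s actual pipeline, and the sample pixel values are made up for illustration.

```python
def grey_world_white_balance(pixels):
    """Apply grey-world white balance to a list of (r, g, b) pixels.

    Each channel is scaled so that all three channel means become equal,
    removing a uniform colour cast. Values are floats in [0, 1].
    """
    n = len(pixels)
    # Per-channel means across the whole image.
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    overall = sum(means) / 3
    # Gain that pulls each channel's mean toward the overall mean.
    gains = [overall / m for m in means]
    return [
        tuple(min(1.0, p[c] * gains[c]) for c in range(3))
        for p in pixels
    ]

# Hypothetical pixels with a blue cast (blue channel systematically brighter).
cast = [(0.2, 0.3, 0.6), (0.4, 0.5, 0.8), (0.3, 0.4, 0.7)]
corrected = grey_world_white_balance(cast)
```

A data-set-driven approach like Google describes would go further than a single global assumption, but the sketch shows where white-balance gains enter the picture.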
Through these tweaks, the company aims for its photos to represent people of colour more accurately, avoiding over-brightening and de-saturating their skin tones.
These changes will come to Google’s own Pixel cameras this fall, and the company says it will share what it has learned with the broader Android ecosystem.
Additionally, Google has improved portrait-mode selfies, striving to be more inclusive of different hair types, such as curly and wavy hair. The company has developed a more accurate depth map that avoids simply cutting around the subject’s hair, producing a more faithful rendering of it.
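The difference between a hard cutout and a depth-aware one can be sketched with a toy soft-masking function. This is a hypothetical illustration of the concept only — Google’s depth maps come from learned models, not a simple threshold — but it shows why a soft ramp preserves fine detail like hair strands that a hard binary mask would clip.

```python
def soft_foreground_mask(depth, threshold, softness):
    """Map per-pixel depth values to foreground weights in [0, 1].

    Pixels well in front of `threshold` get weight ~1 (subject),
    pixels well behind it get ~0 (background to blur), and depths
    within a ramp of width `softness` blend smoothly instead of
    being cut off hard, as a binary mask would do around hair.
    `depth` is a list of rows of float depth values; `threshold`
    and `softness` are hypothetical tuning parameters.
    """
    mask = []
    for row in depth:
        out = []
        for d in row:
            # Linear ramp centred on the threshold.
            w = (threshold + softness / 2 - d) / softness
            out.append(max(0.0, min(1.0, w)))
        mask.append(out)
    return mask

# Toy depth row: near subject, wispy hair edge, far background.
weights = soft_foreground_mask([[1.0, 2.0, 3.0]], threshold=2.0, softness=1.0)
```

With a hard cutout, the middle pixel (a hair-edge depth) would be forced to fully subject or fully background; the soft mask gives it an intermediate weight instead.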
Together with the other inclusive features announced, it’s a welcome step in the right direction.