Image Recognition
It’s a long-standing problem that dates back to the days of film: image processing tends to be tuned for lighter skin tones rather than for black and brown subjects. To address this, Google is updating its auto-white-balance and exposure algorithms, drawing on a broader data set of images featuring black and brown faces, to improve accuracy for darker skin tones. With these tweaks, Google aims to stop over-brightening and de-saturating people of color in photos, producing more accurate representation. Google has also improved portrait-mode selfies, creating a more accurate depth map for curly and wavy hair types rather than simply cutting around the subject’s hair. The company says it still has much to do (and it has certainly stumbled in the past on the image-recognition and inclusion front), but this is a welcome step in the right direction.
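Google has not published the details of its updated algorithms, but to give a sense of what auto-white balance does, here is a minimal sketch of the classic "gray world" technique, which rescales each color channel so its average matches the image's overall average. This is a textbook illustration only, not Google's method, and the image data is synthetic:

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Scale each RGB channel so its mean matches the overall mean.

    `image` is an H x W x 3 float array with values in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)
    gray = channel_means.mean()
    gains = gray / channel_means          # per-channel correction factors
    return np.clip(image * gains, 0.0, 1.0)

# Synthetic image with a color cast: red and green are attenuated,
# so the image skews blue before correction.
rng = np.random.default_rng(0)
img = rng.random((4, 4, 3)) * np.array([0.8, 0.8, 1.0])
balanced = gray_world_white_balance(img)

# After balancing, the three channel means are (nearly) equal.
print(balanced.reshape(-1, 3).mean(axis=0))
```

The point of Google's change is precisely that simple global heuristics like this one fail on under-represented skin tones; tuning against a broader data set corrects the bias that a one-size-fits-all average introduces.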