According to experts, facial-recognition technology is still very much a work in progress, with false matches disproportionately affecting people with darker skin. But as Ali Breland writes in The Guardian, this hasn’t stopped police departments in the US from deploying the technology to identify suspects, leading to questionable arrests, especially of African-Americans. Breland spoke to a number of technology researchers who argue that the racial biases of facial-recognition software can be traced back to the overwhelmingly white makeup of the technology industry. Here’s an excerpt from the piece:
Experts such as Joy Buolamwini, a researcher at the MIT Media Lab, think that facial recognition software has problems recognizing black faces because its algorithms are usually written by white engineers who dominate the technology sector. These engineers build on pre-existing code libraries, typically written by other white engineers.
As coders construct the algorithms, they focus on facial features that may be more visible in one race than in another. These choices can stem from previous research on facial recognition techniques and practices, which may carry its own biases, or from the engineer’s own experience and understanding. The resulting code is geared toward white faces and is mostly tested on white subjects.
And even though the software is built to get smarter and more accurate with machine learning techniques, the training data sets it uses are often composed of white faces. The code “learns” by looking at more white people, which doesn’t help it improve across a diverse array of races.
Technology spaces aren’t exclusively white, however. Asians and South Asians tend to be well represented, but this may not widen the pool of diversity enough to fix the problem. Research in the field suggests that the status quo simply isn’t working for all people of color, especially for groups that remain underrepresented in technology. According to a 2011 study by the National Institute of Standards and Technology (NIST), facial recognition software is actually more accurate on Asian faces when it’s created by firms in Asian countries, suggesting that who makes the software strongly affects how it works.