
e-flux conversations

The quiet racism of Instagram filters

At Racked.com, Morgan Jerkins writes about the tendency of Instagram filters to lighten the skin of the people in photos, rendering people of color more fair-skinned than they are in real life. She tests the filters on various models of color, and in almost every case, the filters that purport to “enhance” the images brighten the models’ skin. As Jerkins notes, technological bias against people of color has a long and sordid past in the history of photography:

People often think of technology as inherently unbiased, but photography has a history of racism. In Technologies of Seeing: Photography, Cinematography and Television, British academic Brian Winston writes, “Colour photography is not bound to be ‘faithful’ to the natural world. Choices are made in the development and production of photographic materials.” In other words, what you see in a photo is never pure reality—it’s the world as someone has chosen to depict it. And for the first hundred or so years of filmmaking, camera technology chose to ignore people of color entirely, leaving photographers’ tools with built-in biases.

The way that racism operates aesthetically is to neglect or, in extreme cases, erase whoever is not white. In the 1950s, for example, Kodak measured and calibrated skin tones in still photography using a reference card featuring “Shirley,” a white model dressed in high-contrast clothing. Ultimately, Shirley ended up being the standard for image processing in North American photography labs. It didn’t matter if the photo in question contained entirely black people; Shirley’s complexion was still treated as the ideal.

Kodak’s film was so bad at capturing the different hues and saturations of black skin that when director Jean-Luc Godard was sent on an assignment to Mozambique in 1977, he flat-out refused to use Kodak on the grounds that its stock was “racist.” Only when the candy and furniture industries began complaining that they couldn’t accurately shoot dark chocolate and brown wood furniture did Kodak start to improve its technology.

Technology may be biased, but it functions according to the ethics embedded in it by its designers and users.
What is ultimately to blame here is a Eurocentric standard of beauty.
Beyond Instagram, consider photo apps and technologies in Asia. Japan’s Purinto Kurabu (purikura) photo booths have long enlarged users’ eyes, slimmed their faces, and lightened their skin; now, with Meitu XiuXiu, the most popular photo app in China, bigger eyes and lighter skin can be achieved with a single click. And let’s not forget plastic surgery in South Korea, where many women end up looking similar, with double-eyelid eyes and V-shaped faces.