e-flux conversations

Proxy Politics: Signal and Noise

A while ago I met an extremely interesting software developer who was working on smartphone camera technology. Photography is traditionally thought to represent what is out there by means of technology, ideally via an indexical link. But is this really true anymore? The developer explained to me that the technology for contemporary phone cameras is quite different from traditional cameras: the lenses are tiny and basically crap, which means that about half of the data being captured by the camera sensor is actually noise. The trick, then, is to write the algorithm to clean the noise, or rather to discern the picture from inside the noise.

But how can the camera know how to do this? Very simple: it scans all other pictures stored on the phone or on your social media networks and sifts through your contacts. It analyzes the pictures you already took, or those that are associated with you, and it tries to match faces and shapes to link them back to you. By comparing what you and your network already photographed, the algorithm guesses what you might have wanted to photograph now. It creates the present picture based on earlier pictures, on your/its memory. This new paradigm is being called computational photography.
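The actual pipelines inside phone cameras are proprietary and far more elaborate, but the basic gamble described above can be sketched as a toy model: treat the archive of earlier pictures as a prior, and blend the noisy new capture toward that prior. Everything here (the images, the blending weight, the error measure) is a hypothetical illustration, not any vendor's real algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a "true" scene, a few earlier photos of similar
# scenes (the phone's memory), and a new capture where a large share
# of the sensor data is noise.
true_scene = rng.uniform(0.0, 1.0, size=(8, 8))
earlier_photos = [true_scene + rng.normal(0, 0.05, true_scene.shape)
                  for _ in range(5)]
noisy_capture = true_scene + rng.normal(0, 0.5, true_scene.shape)

# Build a prior from the archive: the average of earlier, similar pictures.
prior = np.mean(earlier_photos, axis=0)

# "Denoise" by blending the capture with the prior. alpha encodes how much
# the algorithm trusts its memory over the sensor.
alpha = 0.8
estimate = alpha * prior + (1 - alpha) * noisy_capture

# Mean squared error against the true scene, before and after blending.
err_capture = np.mean((noisy_capture - true_scene) ** 2)
err_estimate = np.mean((estimate - true_scene) ** 2)
print(err_estimate < err_capture)
```

The blended image lands closer to the true scene than the raw capture does, but only because the prior happened to match it. Point the same camera at something genuinely new and the prior pulls the estimate toward what it has seen before, which is exactly the inertia the article describes.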

The result might be a picture of something that never even existed, but that the algorithm thinks you might like to see. This type of photography is speculative and relational. It is a gamble with probabilities that bets on inertia. It makes seeing unforeseen things more difficult. It will increase the amount of noise just as it will increase the amount of random interpretation.

And that’s not even to mention external interference with what your phone is recording. All sorts of systems are able to remotely switch your camera on or off: companies, governments, the military. It could be disabled in certain places—one could for instance block its recording function close to protests or conversely broadcast whatever it sees. Similarly, a device might be programmed to autopixelate, erase, or block secret, copyrighted, or sexual content. It might be fitted with a so-called dick algorithm to screen out NSFW (Not Suitable/Safe For Work) content, automodify pubic hair, stretch or omit bodies, exchange or collage context, or insert location-targeted advertising, pop-up windows, or live feeds. It might report you or someone from your network to the police, PR agencies, or spammers. It might flag your debt, play your games, broadcast your heartbeat. Computational photography has expanded to cover all of this.