Could robots write poetry?

For the Guardian, Marta Bausells reports on Poetry for Robots, a new initiative that attempts to teach computers about poetry and metaphor by using crowd-written poems as metadata for images. Read the text below.

In 1989, American author Norman Cousins wrote that poetry was the key to preventing computers from dehumanising us: “The company of poets may enable the men who tend the machines to see a larger panorama of possibilities than technology alone may inspire. Poets remind men of their uniqueness.”

Twenty-six years later, researchers in the US are testing that idea, starting with search engines and image databases. Any nuance or metaphor gets lost on an engine such as Google: search “sorrow”, for example, and you’ll get pictures of people crying, whereas a human might associate the word with a more varied range of images, such as a foggy seascape or an empty forest. This is because computers use metadata (the data search engines associate with the millions of digital objects out there, from YouTube videos to Instagram pictures) in a completely different way to the human brain. Our human “metadata” tends to be far more symbolic and less literal. But what if an image bank were populated by poems? Can robots learn from our view of the world?

The Poetry for Robots project has created an online image bank of 120 pictures, which anyone can access in order to write poetry inspired by what they see. By feeding poems to the robots, the researchers want to “teach the database the metaphors” that humans associate with pictures, “and see what happens,” explains Corey Pressman from Neologic Labs, who are behind the project, along with Webvisions and Arizona State University.
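The mechanics are simpler than they might sound. Here is a minimal sketch, purely illustrative, of how crowd-written poems could serve as image metadata; the filenames, sample poems, and word-overlap indexing below are assumptions for the example, not the project's actual data or code:

```python
# Illustrative sketch only: index images by the words of poems written about
# them, so a search for "sorrow" can surface a foggy seascape if a poem about
# that picture evokes sorrow, not just literal pictures of crying.

from collections import defaultdict

# Hypothetical image IDs paired with crowd-written poems about them.
poems_by_image = {
    "foggy_seascape.jpg": ["Grey water swallows the horizon, / sorrow without a face."],
    "empty_forest.jpg": ["No birds today; the pines keep their own counsel."],
    "crying_child.jpg": ["Tears like small rivers find the chin."],
}

# Build an inverted index from poem words to the images they describe.
index = defaultdict(set)
for image, poems in poems_by_image.items():
    for poem in poems:
        for word in poem.lower().replace("/", " ").split():
            index[word.strip(".,;:!?")].add(image)

def search(query: str) -> set:
    """Return images whose associated poems contain the query word."""
    return index.get(query.lower(), set())

print(search("sorrow"))  # -> {'foggy_seascape.jpg'}
```

A real system would go further, weighting terms, handling synonyms, and learning associations statistically from many poems per image, but the core move is the same: the poems, rather than literal tags, become the searchable description of the picture.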

The project also poses the question: “Can an algorithm, informed by our poetic input, generate compelling works of its own?” The answers will start coming in September, when the researchers perform initial searches at the Webvisions technology conference in Chicago.

The hope is that, with a big enough dataset, “we’ll be delighted to see we can teach the robots metaphors, that computers can be more like us, rather than the other way around,” says Pressman. “I’d like them to meet us more halfway.”