
Asynchronous! On the Sublime Administration of the Everyday

A cold pail of water passes down a line of workers, sloshing from hand to hand. Another follows behind it. And another. To coordinate this bucket brigade, the line of busy hands moves according to a fixed rhythm, each movement keeping time like a metronome. The analogy illustrates the primary principle of synchronous processing: no matter how fast any single movement, the chain can move no faster than its slowest transfer. This familiar scene is the basic unit of Fordism—an assembly line of exchanges locked in linear progression. One thing at a time. One thing after another. All you can really do is speed it up.

This dictatorship of synchrony—from clocked computer chips to supply chains and back again—hamstrings productivity and constrains the marketplace. For the designers of scalable systems, it represents the ultimate barrier to progress. To break through this barrier, engineers dream of the asynchronous: a vision of the world where the bucket brigade stops following the tick of the metronome. If one worker finishes passing their bucket early, they can accept the next from anywhere along the line. Instead of waiting for the second worker to pass their bucket, the third takes it directly from the first, or from a different line entirely. Work flows to available resources, regardless of where these resources are located in the traditional sequence. At first, the line becomes chaotic. But suddenly, the light accelerates past the heavy. Soon we have an asynchronous system, and a new transaction can begin without waiting in line.
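
To make the contrast concrete as software, here is a minimal sketch in Python (not from the essay; the workers, buckets, and timings are made up for illustration). The synchronous brigade advances one hand-off at a time, so its total time is the sum of every transfer; the asynchronous brigade schedules all transfers at once and lets each complete as soon as its worker is free.

```python
import asyncio
import random
import time

async def pass_bucket(worker: int, bucket: int) -> None:
    # Each hand-off takes a different, unpredictable amount of time.
    await asyncio.sleep(random.uniform(0.1, 0.3))
    print(f"worker {worker} passed bucket {bucket}")

async def synchronous_brigade(buckets: int, workers: int) -> None:
    # Lockstep: one transfer at a time, in fixed order, so the total time
    # is the sum of every hand-off.
    for bucket in range(buckets):
        for worker in range(workers):
            await pass_bucket(worker, bucket)

async def asynchronous_brigade(buckets: int, workers: int) -> None:
    # Work flows to whichever worker is free: every transfer is scheduled
    # at once and completes on its own time, so the light passes the heavy.
    await asyncio.gather(
        *(pass_bucket(worker, bucket)
          for bucket in range(buckets)
          for worker in range(workers))
    )

start = time.perf_counter()
asyncio.run(synchronous_brigade(buckets=2, workers=3))
print(f"synchronous:  {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
asyncio.run(asynchronous_brigade(buckets=2, workers=3))
print(f"asynchronous: {time.perf_counter() - start:.2f}s")
```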

Inside every computer is a microprocessor ticking back and forth billions of times a second. This tick paces each operation, signaling when one step has completed and when the next can begin. Just as the bucket brigade’s linear rhythm constrains the movement of the water, so too do synchronous computer chips limit the performance of our fastest information transmissions. At Sun Microsystems in the 1990s, Ivan Sutherland and Jo Ebergen used the bucket brigade metaphor to explain the advantages of their experimental research into asynchronous chip design. When computer chips become asynchronous, “actions can start as soon as the prerequisite actions are done, without waiting for the next tick of the clock.” But in the early days of computing, market pressure for a straightforward, reliable solution meant that synchronous chip design, which was simpler, won out over grander theoretical plans for asynchronous computing. The processor that runs your MacBook is synchronous and clocked, running at about 2.7 GHz. Despite intense research, truly asynchronous chips took years to get out of the lab—and even then their commercial use was limited.
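
Read back into software, Sutherland and Ebergen’s point can be sketched as follows (a loose analogy, not their circuit design; the stages and the half-second clock period are invented for illustration). The clocked version idles until the next tick even when a stage finishes early, while the self-timed version starts each action the moment its prerequisite is done.

```python
import time

CLOCK_PERIOD = 0.5  # an arbitrary tick length; every stage must fit inside it

def clocked(stages):
    # Synchronous: each stage may only begin on a tick boundary, so any
    # time left over after a fast stage is spent idle.
    for stage in stages:
        start = time.perf_counter()
        stage()
        time.sleep(max(0.0, CLOCK_PERIOD - (time.perf_counter() - start)))

def self_timed(stages):
    # Asynchronous: each stage begins as soon as the prerequisite stage
    # is done, with no waiting for the next tick.
    for stage in stages:
        stage()

stages = [lambda: time.sleep(0.1), lambda: time.sleep(0.3), lambda: time.sleep(0.1)]

start = time.perf_counter()
clocked(stages)
print(f"clocked:    {time.perf_counter() - start:.2f}s")   # roughly three full ticks

start = time.perf_counter()
self_timed(stages)
print(f"self-timed: {time.perf_counter() - start:.2f}s")   # only the actual work
```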

But something funny happened on the way out of R&D. Asynchronous processing hasn’t simply left the lab and entered our devices and networks. Instead, the asynchronous principle—that complex systems should be designed to allow tasks to run independently as resources become dynamically available—has moved outwards from the chip to the server, from the server to the data center, from the data center to the workplace, and from the workplace to the city. Asynchronous processing has emerged as a new ideal, and it is increasingly being applied in fields as diverse as software design, biomedical engineering, and labor-force management.

Read the full article here.