Learning to Wait: Asynchronous Becomes Normal

Software grows patient

The biggest change I’ve seen in the last few years of software development isn’t a new language, a new environment, or magical new algorithms. The biggest change is that programmers in many different arenas, working independently, have come to accept waiting.

Part of the joy of computers, a magic that grew and grew as computers and networks got faster and faster, was the confidence that making a change was immediate. Yes, it took time to get a message over a network or for a CPU to execute calls—but things happened.

The event loop has been around for a long time; computers have always had to wait for us slow humans. Transactions have long provided a buffer against simultaneous changes to the same data, and they certainly slowed things down, but the time that buffering took was generally considered a cost, not a feature. Message queues are similarly venerable, and while they seemed bulletproof, they also seemed expensive.
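
All three techniques amount to the same basic move: put work in a queue and drain it in order, so "later" becomes an ordinary part of the control flow. Here's a toy sketch in plain JavaScript; the queue and task names are mine, not any particular library's, and real runtimes interleave I/O completions into the same loop:

    // A toy event loop: a queue of pending tasks, drained one at a time.
    const queue = [];

    function enqueue(task) {
      queue.push(task);
    }

    function runLoop() {
      while (queue.length > 0) {
        const task = queue.shift(); // take the oldest pending task
        task();                     // run it to completion before the next
      }
    }

    enqueue(() => console.log("first"));
    enqueue(() => console.log("second")); // waits its turn, like a queued message
    runLoop();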

Over the last few years, these approaches have become more common and better understood. I suspect that there are two main drivers of this change:

  • As larger-scale projects have become more common, the knowledge needed to use these approaches has become more widely available. The average project may be larger scale as well, but these techniques are appearing even in cases where I wouldn’t have thought them necessary. (Of course, I also like playing with Erlang in tiny single-user environments.)

  • The Web always has latency, and JavaScript practice has evolved to support that. Asynchronous JavaScript may not be everyone’s dream work, but the pattern has evolved from UI events to Ajax to promises, deferreds, and much more; the sketch after this list shows the shift. Node is built on these foundations. It’s not just JavaScript, though: I just found Asynchronous Processing with PHP on App Engine.
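
To make that evolution concrete, here is the same hypothetical request written both ways. The getUser functions and the /users/ URL are invented for illustration; XMLHttpRequest and Promise are the real browser APIs:

    // Callback style: the wait is buried in a function argument.
    function getUserWithCallback(id, callback) {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", "/users/" + id);
      xhr.onload = () => callback(null, JSON.parse(xhr.responseText));
      xhr.onerror = () => callback(new Error("request failed"));
      xhr.send();
    }

    // Promise style: the wait becomes a value you can pass around and chain.
    function getUserWithPromise(id) {
      return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        xhr.open("GET", "/users/" + id);
        xhr.onload = () => resolve(JSON.parse(xhr.responseText));
        xhr.onerror = () => reject(new Error("request failed"));
        xhr.send();
      });
    }

    getUserWithPromise(42)
      .then((user) => console.log(user.name))
      .catch((err) => console.error(err));

The promise version returns the wait itself as a value, which is what lets later steps chain onto it.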

I saw a great talk last week on IndexedDB. It wasn’t the data storage or the asynchronous API that grabbed my attention, but the conversation about promises and ways to make “wait for it” seem like a normal programming idiom. There are a lot of those conversations happening now, about many environments.
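
That conversation is easy to make concrete: IndexedDB fires success and error events on request objects, and a thin promise wrapper turns the wait into one more sequential step. A sketch using the standard open/onsuccess/onerror events; the database and store names are made up:

    // Wrap IndexedDB's event-based open() in a promise, so callers can
    // simply wait for the result instead of wiring up event handlers.
    function openDatabase(name, version) {
      return new Promise((resolve, reject) => {
        const request = indexedDB.open(name, version);
        request.onupgradeneeded = () => {
          // Runs on first open (or a version bump): create an object store.
          request.result.createObjectStore("notes", { keyPath: "id" });
        };
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
      });
    }

    openDatabase("demo", 1).then((db) => {
      console.log("opened", db.name); // the wait reads like a normal step
    });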

Should we make asynchronous code seem normal with syntactic sugar? Or should we flag it, call attention to it, and make sure programmers remember that their code is waiting?
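
For a concrete sense of that trade-off, compare a promise chain, where every wait is spelled out as a .then() step, with the async/await sugar JavaScript later adopted, where only a keyword marks the pause. The url parameter and items field below are placeholders:

    // Explicit: each wait is visibly a .then() step in the chain.
    function loadExplicit(url) {
      return fetch(url)
        .then((response) => response.json())
        .then((data) => data.items);
    }

    // Sugared: async/await makes the waits read like straight-line code;
    // only the await keyword reminds you the program may pause here.
    async function loadSugared(url) {
      const response = await fetch(url);
      const data = await response.json();
      return data.items;
    }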
