Back to the future

Imagine a computing technology where your data is instantly available from anywhere on the network: no matter where you are, there are your files — just as you left them. You work with relatively simple tools, with a consistent user interface, and the whole thing is entirely location-independent. If you’re over at a friend’s, just open up your account and do your thing. There’s no need to haul USB drives around, or worry about who has the most current version; it’s all Out There, Somewhere.

Hey! Welcome to 1984!

I can’t be the only person who thinks we have, once again, come full circle in the computing industry. It’s hard to tell whether this is a good thing or not. But the innovations in processing power, the cheapening of mass storage, and the hellacious pace of network deployment and development have brought us to a place where the solutions don’t seem all that different from what we started with back at the beginning of the whole experiment. At the dawn of networked computing systems, the idea of having one master copy that you worked with wasn’t particularly radical; that was just how things worked. The rise of the killer micros and PCs, with steadily improving power and accessibility, meant that the centralized computing facility, shared by scores of users, began to decline in importance. But those PCs weren’t really useful until we networked them together, and then we were left with the problem that although they could talk to one another, they were still individual machines.

So now here comes cloud computing, which is going to unshackle us from our PCs for all time (or some such nonsense; I don’t read the PR) by turning every computational device we have in our homes into what is basically a dumb terminal with a much prettier interface (and longer boot times). I remember this from the early 1990s; we called them X Terminals. And I’m hard-pressed to think what the point of all that development work was, if we’re going to return to a model from the past. In the case of Google’s Application Suite, so long as you can get a compliant Web browser to run on a device of some kind, why on earth would you still need a multicore processor-driven machine with 3D graphics capability?

I’m not saying this is a bad model. Believe me, there are a whole bunch of things in my life that would be a lot easier with ubiquitous cloud computing (free of bandwidth limitations, mind you, which is my biggest worry about this whole concept). A few of us have even half-assedly kicked around the idea of starting up a private cloud — a decidedly low-tech cloud, mind you, but a cloud nonetheless, with the same intentions as the more staid offerings from the usual suspects. It’s not a bad idea at all. It does promise to be very liberating, and to make technology work in a way that might actually be useful.

I’m just trying to figure out why it’s better than a remote account on a Unix box in a data center somewhere, plus an X client and/or copies of PuTTY and rcp. Because it’s pretty?

One thought on “Back to the future”

  1. “Indeed, it would not be an exaggeration to describe the history
    of the computer industry for the past decade as a massive attempt
    to keep up with Apple.”
    — Byte, December 1994

    “Indeed, it would not be an exaggeration to describe the history of
    microcomputer technology for the past 2.5 decades as a massive
    attempt to (catch) up with 70’s mainframes.”
    — Mike Fidler

    As you note, this observation is more general and thus more infuriating.
    My own rhetorical question is to wonder how the world would look if C had at least had first-class functions and garbage collection. Though it would probably have been easier to arrange things so that Apple nicked Smalltalk as the base of their skunkworks project, instead of just the fancy but comparatively shallow design goodies made with it.