Saturday, December 29, 2007

Computing on the Edge of the Cloud

While you read these words, fork yourself a little mindlet and send it off a few years into the future. Imagine it reading a page just like this one - with a little extra something, a computational payload to be processed along with the rest of the content.

The concept is simple enough. The various volunteer computing projects have long since established the core principles of partitioning a huge problem into many smaller problems, shooing them off to thousands of volunteer client computers. As the results trickle in, they are swallowed, digested and - pardon my French - nicely excreted, so as to enlighten the operators, whatever their quest. Although we have found no aliens yet, nor discovered a cure for cancer, the method actually works. A lot of numbers are being crunched that way, each and every day.

The weak link is this voluntary thing. People participating in such programs must consciously decide to do so. Downloading and installing a distributed computing client (such as a screensaver) is trivial and well within the capabilities of the average patron. More often than not, though, it is one of those things we are likely to put off to some other day. Or completely forget about. In essence, the success (in terms of numbers crunched) of an endeavor like Folding@home hinges on communicating the worthiness of the cause to the widest possible audience. On that vehicle called marketing.

So to harness at least a bit of that awe-inspiring potential of a billion computers, we might want to tackle the issue from a different angle. We might put it into banner ads.

In the Browser
That concept should not be too hard to digest, either. JavaScript has been with us for well over a decade, and while that supremely underestimated language is neither down-to-the-metal-C nor all-your-meta-LISP, its scripting nature makes it very well suited for wrapping up a minuscule piece of action along with the data in distress. These cadgets (code-and-data widgets) would be stamped out in huge amounts by industrial-strength code generators. As for reporting back, a simple XMLHttpRequest-based scheme should do the trick nicely.
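A minimal sketch of such a cadget might look like the following. Everything specific here - the job id, the payload, the /collect endpoint - is an assumption for illustration; only the idea of bundling code with data and reporting back over XMLHttpRequest comes from the text above.

```javascript
// A hypothetical "cadget": the data and the piece of action travel together.
// Job id, payload and endpoint are invented for this sketch.
const cadget = {
  id: "job-42",                     // assumed job identifier
  data: [3, 1, 4, 1, 5, 9, 2, 6],   // the data "in distress"
  // the minuscule piece of action: a stand-in computation (sum of squares)
  compute(xs) {
    return xs.reduce((sum, x) => sum + x * x, 0);
  },
  // report back via a simple XMLHttpRequest-based scheme (browser only)
  report(result) {
    if (typeof XMLHttpRequest === "undefined") return result; // not in a browser
    const xhr = new XMLHttpRequest();
    xhr.open("POST", "/collect?job=" + encodeURIComponent(this.id), true);
    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.send(JSON.stringify({ id: this.id, result: result }));
    return result;
  },
};

const result = cadget.report(cadget.compute(cadget.data));
```

The page embedding this script never notices: the computation runs, the result goes quietly home, and the visitor reads on.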

On top of that, of course, there is the ubiquity of a language born and bred in the browser. In terms of deployment, JavaScript is second to none.

Performance, however, looms. For this prospect to gain any traction, a bare minimum of speed is required, and JavaScript was never known to dispatch itself with abandon. There are probably a lot of reasons for that (though I suspect neglect to be chief among them), but if we are to believe the testimony of Jeff Atwood over at Coding Horror, times may get very interesting out there in browserland. The various implementations are getting faster and more efficient with each release, and Atwood even speculates that this may become a significant battleground in an imminent browser war.

Which makes perfect sense in this AJAX-crazed society. There are a lot of tricks in the book, so in a few years, JavaScript should be able to hold its own with ease. As interpreted scripting languages go, JavaScript has quite a lot going for it, not least an open, formal specification, which lowers the barrier to entry for new - or improved - implementations enormously. I believe we are set for a rather interesting ride.

The Infrastructure
Thus satisfied with the state of affairs in the browser, we are left with the minor issues of distributing these cadgets and collating their feedback. The whole business of generating possibly billions of distinct cadgets, disseminating them redundantly across the web, and keeping track of what may or may not return, is daunting in a distinctly Darwinian sense.

Sort of like combining Google Adsense and Google Analytics.

Indeed, the constituents of this scheme lie mostly in well-trodden territory. While I'm certainly no expert on massively distributed computing, I reckon that the few remaining principles could be fleshed out fairly easily. This isn't exactly MapReduce, since the code and the data are bundled, but it isn't that different, either. Recombining a function with a set of arguments sounds a lot like splicing dynamic content with an HTML template, and how hard is that?
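To make that template analogy concrete, here is a hypothetical generator that splices one worker function (as source text) with many argument sets, stamping out self-contained cadget scripts. The function name and job-id scheme are invented for this sketch.

```javascript
// Hypothetical cadget generator: one worker function, many argument sets,
// each combination emitted as a self-contained script string - much like
// merging dynamic content into an HTML template.
function makeCadgets(workerSource, argSets) {
  return argSets.map((args, i) =>
    "(function () { " +
    "var data = " + JSON.stringify(args) + "; " +
    "var work = " + workerSource + "; " +
    "return { job: 'job-" + i + "', result: work(data) }; " +
    "})()"
  );
}

// Two distinct cadgets from one worker (here, a simple sum):
const scripts = makeCadgets(
  "function (xs) { return xs.reduce(function (a, b) { return a + b; }, 0); }",
  [[1, 2, 3], [10, 20]]
);

// The browser's engine would evaluate each script on arrival:
const first = eval(scripts[0]); // { job: 'job-0', result: 6 }
```

A real generator would of course emit the reporting call as well, but the splicing itself is no harder than filling a template.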

To ease the burden on cadget generators and networks alike, helper libraries might be deployed to a CDN like CacheFile. That way, only the essential computation would have to be generated and sent off to the browser.

The resulting cadget would end up alongside the bitmaps, banner ads, blog rolls and whatever else travels along with modern web pages. It would execute and quietly send back the result.

Sideshow or Big Picture?
Case by case, such a system could never compete with MapReduce at a Google datacenter, or an equivalent number of carefully orchestrated Folding@home patrons. The overhead of plain-text transmissions to and from finicky, flickering scripting agents is likely too high. On the other hand, the accumulated effect of millions of JavaScript engines chipping away at some problem or other, even for short runs, should not be discounted out of hand. We are dealing with hitherto undisclosed laws of very large numbers.

To hark back to the issue of banner ads, one might imagine an ecology of computing projects and sponsorship leagues, vying for placements at blogs, portals and corporate websites alike. Personally, I might choose to flag my support for an effort like this - should they ever need some large-scale geological analysis - by sporting a cadget banner right here, on this blog.

Who knows, the dreaded Slashdot Effect might well end up being eagerly anticipated by some.

Tuesday, December 18, 2007

The Vista Blessing

Windows Vista is a blessing upon us all. A gift from Microsoft to humanity. I should know, since I don't use it - and hopefully never will. Well, at least not in any of its current incarnations, and not on a daily basis.

To conclude, by now, that Vista is not an unadulterated success probably wouldn't be a stretch. A disaster (for Microsoft), according to some, as unmitigated as they come. In the wee hours of a drunken night on the town, even Steve Ballmer might admit to some of these assertions. Reluctantly, much diluted, but still.

In plain speak, Vista is too late, it's too little and the price is far too high - in terms of cash, hardware requirements and, well, the sheer burden of migrating my digital life, my habits and settings into this new environment.

In all fairness, Windows XP was and is quite decent. It does, after all, provide the majority of computer users with a huge selection of services, and it is not too unfriendly, unstable or unreasonable. Most of the time it just works, more or less, despite loads and loads of well-publicized idiosyncrasies, security issues and plain bloat. Microsoft is not an incompetent bunch of fools. Over the years they have striven to meet a lot of very diverse requirements, and mostly they have succeeded.

I was never particularly well disposed towards Microsoft. Or the reverse, for that matter. I like my options, and in the department of operating systems, they are getting plentiful as of late. Apple is tempting me more and more with their slickness and style, while Ubuntu, given its open source nature, is nothing short of impressive in terms of coherence, versatility and, frankly, innovation.

But far more important, I'm getting seriously motivated to actually take the jump and detach myself from relying on any operating system, whatsoever.

And that is the blessing.

The thing is, operating systems are becoming a generic commodity. Somewhat like cars. Something we just use, regardless of brand, model or color. We may have our preferences, inclinations and favourites, but, by and large, the operating system just doesn't matter. It is not the focal point of our activities.

In truth, this has been the case for most people, most of the time. "Ordinary" people never quite cared for what happened beneath their gaming, spreadsheeting, e-mailing and browsing experience. The operating system is the default, a neccessary evil, not terribly more important than what kind of fuel the car is using.

But this is changing, and not quite the way some would have it. The "Year of the Linux Desktop" might never come to pass, and a number of trends point firmly in that direction:

The Cloud: we are moving online in droves. Not just technically inclined people, but everyone. And not even for our own personal productivity. We are calling out, meeting up, working with each other in venues far removed from the actual piece of hardware in front of us. Our stuff, the documents, pictures, e-mails, will increasingly move out there on a permanent basis. What is left in our physical vicinity will be some sort of local cache, containing what we need on the go. Google Apps and Google Gears are but a vague precursor of things to come. I'm not breaking any news here, I know, but I do think a lot of people are severely underestimating the impact and the timescale. Even at Google.

The Tools: We will still need an operating platform, a substrate, for animating our personal kit, something akin to Windows, Linux, FreeBSD, OS X or whatever. My guess is, personal virtualization will take off in the near future. The descendants of VMware, Virtual PC and Parallels will come to dominate the scene between the physical hardware and somewhere well into what we presently understand as operating systems. In turn, those will become smaller, possibly morphing into appliances. Migrating to faster, smaller or just different hardware will be a simple matter of copying one file, if even that. Backups will be a snap, too. And, since our data is already segregated for online consumption, what we need may not be terribly big and unwieldy. I imagine that I will be able to keep the lot, including essential files, virtual keychains, digital signatures and such, in one pocketable device. Like a USB flash stick, though preferably with a little something in the way of preventing unauthorized access.

The Gear: The hardware is changing, too. Getting smaller, unobtrusive, integrated. The mobile phones, media players, GPS, pocket hard disks, bluetooth, flash keys, the Nintendo DS, Wi-Fi ... well, you get the picture, for sure. Some call it personal area networks - which is too bland for my taste, but what the heck. The point is, I will carry it with me, and it will interface with whatever is around me. Including a full-blown desktop setting with large monitors, keyboards and pointing devices. Or a Tablet PC for the couch. Or a gaming console. Or my mobile phone. Or the car. The actual hardware will become conduits for my virtual environment, with only passing significance in itself.

I'm getting way ahead of myself here. Personally, for now, I'm taking small steps. Already, my e-mail resides out there, between Gmail and my rented IMAP host. My bookmarking is with, all my documents reside on a pocket USB hard disk, and I'm starting to experiment with private, online version control repositories. Lately, I have taken to doing all my professional work on a couple of VMware images that I haul around as needed (FreeBSD, if you must know). Soon I will move most of the rest in there, as well. I'm very consciously aiming for my next Windows migration to be the very last, ever.

Then, wherever I lay my hat ...

So, does any of this spell the doom of Microsoft? Probably not, if you consider the width and depth of the software giant's portfolio. They are, after all, the largest software company in the world, in a position comparable to the one held by IBM in the 1980s. I wouldn't be surprised if, say, 5 years from now, Microsoft were to open source large parts of Windows in a grand gesture of good intention. Needless to say, they will milk it for every scrap of publicity. And by then, of course, they would have secured themselves a nice corner of this new landscape, a rich source of revenue, what with shareholders and all. No, Microsoft will not perish anytime soon.

Friday, December 7, 2007


Or, like 8 years later, I actually pulled over to engage in some weakly witty monologue.