Monday, September 15, 2008

Why this should send a chill down every ISV's spine

Independent software vendors (ISVs) have recently been touting some of the most useful applications I have seen in my entire career. From 37signals to GitHub to CulturedCode, the genius of these applications is simple interfaces to complex sets of requirements. That last one, CulturedCode, has a problem though. Their flagship product, about to launch, just got pwned by some Russians.

While 37signals and GitHub are already web applications with a service-based revenue stream, CC's product is an OS X desktop application that works beautifully as a get-things-done task organizer, with a perpetual licensing model. The aforementioned Russians took every piece of this application and made it into what appears to be an almost complete equivalent of the desktop application. And if that were not bad enough, they integrated it with Google Gears for offline use.

What can CC do about this? Not much, unless they already have a huge bankroll for lawyers. And this situation should send chills down the spine of every small software shop releasing small, useful tools. The very nature of these shops constrains feature creep, which in turn forces the designers and developers to squeeze the most they can out of what they have in place, which in turn makes the software simple and powerful at the same time. But it also makes these applications vulnerable to xeroxing via the web, (relatively) cheap labor pools, and a robust distribution network with firewall-like immunity to legal action in the form of international borders ("litigation-wall"?).

Anyone want to take bets on who is next on the feed tray? My guess is a Delicious Library web clone.

So the take-home message is: design software that is not easily replicated, either by feature or connectivity, or realize that your app really is easy to xerox and have a large marketing scheme ready to drown out any news of the enemy.

Speaking of Google, there was that little hiccup in relations when they released a clone of 37signals' Campfire alongside the launch of App Engine. And then there is Chrome+Gears, the browser-database combination that makes web applications even more desktop-like. Not that they are alone in this; Adobe (AIR) and Microsoft (Silverlight will certainly "extend" the reach of IE) are walking the same browser-as-platform path, trying to build their market share on the "everything should be free" web culture, and that really raises my mercury. Fucking piracy enablers.

Friday, September 12, 2008

Licensing makes Cloud Computing ... difficult

I have thought a lot in recent months about how best to leverage the cloud computing (or utility computing) resources that are increasingly becoming available to the general development community, and one issue in particular makes me cringe: LICENSING.

It just so happens that in my field, proteomics, the open source set of tools gathers a lot of press, but most manuscripts submitted for publication still rely on commercial algorithms for the initial data analysis, even though plenty of research has shown (and been published showing) that results from commercial and open source algorithms are comparable. There is an inherent level of trust that manuscript reviewers place in the commercial offerings which is hard to overcome, so most researchers still opt to use the commercial algorithms as the gold standard.

Not that this is a bad thing, mind you. As someone who supports the informatics efforts of many researchers, I find that the commercial offerings are much more stable, and are subsequently easier to support and maintain than most of the current crop of open source offerings.

The trouble lies with the rigid licensing models of commercial offerings. Specifically, you must purchase perpetual licenses for a certain number of compute cores. Such a model is just not compatible with what I would like to do as a service center, namely provide software-as-a-service billing to researchers. True, the high up-front licensing cost can be amortized over the life of the support contract, but that assumes the computers running the algorithm are already procured and dedicated to the software (not a bad assumption in most cases, since the hardware costs pale in comparison to the licensing).

In effect, there is no way to make a utility computing model, such as the one offered by Amazon Web Services, work with these sorts of license restrictions. The set-up and tear-down cost of a compute job is too high for it to be a viable full-time solution.
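The amortization argument can be made concrete with a back-of-the-envelope sketch. Every figure here (license price, core count, cloud rate) is a made-up placeholder, not a real vendor quote; the point is only the shape of the math, namely that the effective hourly cost of a perpetual license balloons as utilization drops:

```python
# Hypothetical comparison: perpetual per-core licensing vs. pay-as-you-go
# cloud capacity. All numbers are placeholders, not real prices.

PERPETUAL_LICENSE_PER_CORE = 10_000   # up-front cost per compute core (made up)
SUPPORT_YEARS = 3                      # life of the support contract
CORES = 8

CLOUD_RATE_PER_CORE_HOUR = 0.40        # hypothetical utility-computing rate
HOURS_PER_YEAR = 24 * 365

def perpetual_cost_per_core_hour(utilization):
    """Effective hourly cost of a perpetual license, amortized over the
    support contract, at a given fraction of full utilization."""
    total_license_cost = PERPETUAL_LICENSE_PER_CORE * CORES
    hours_actually_used = CORES * HOURS_PER_YEAR * SUPPORT_YEARS * utilization
    return total_license_cost / hours_actually_used

# A dedicated, always-busy cluster amortizes well; a crunch-time-only
# workload (say 5% utilization) makes the same license look very expensive.
for util in (1.0, 0.25, 0.05):
    eff = perpetual_cost_per_core_hour(util)
    print(f"utilization {util:>4.0%}: perpetual ~${eff:.2f}/core-hour "
          f"vs cloud ${CLOUD_RATE_PER_CORE_HOUR:.2f}/core-hour")
```

With these placeholder numbers, a fully utilized dedicated box comes out roughly on par with the cloud rate, while burst-only use costs an order of magnitude more per hour, which is exactly why dedicated licensing and utility computing do not mix.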

What I would like to do is augment my current computing capacity during crunch times. Dedicated licensing prevents this. As does the way most networked algorithms work, but that's another post.