July 4, 2004
Beware the Platform II
A long time ago we said “The implementation platform is relatively unimportant.” This was a statement made at a time when a lot of people were insisting that to do “real” cyberspace (whatever that is), you needed an $80,000 Silicon Graphics system (at least), whereas we came along and somewhat arrogantly claimed that all the stuff that really interested us could be done with a $150 Commodore 64. And we still believe that, notwithstanding the fact that today’s analog of the C64, a $150 PlayStation 2 or Xbox, has performance specs that exceed those of 1989’s $80,000 SGI machine. (Hey, we never said that we didn’t want cool 3D graphics, just that they weren’t the main event.)
So it should come as no great surprise, at least to those of you who have come to recognize us for the crusty contrarians that we are, when I tell you that one of the lessons that we’ve had our noses rubbed in over the past decade or so is that the platform is actually pretty darned important.
Our point about the platform being unimportant was really about performance specs: pixels and polygons, MIPS and megabytes. It was about what our long-time collaborator Doug Crockford calls the “threshold of goodenoughness”. We were in an era when the most salient characteristic of a computational platform was its performance envelope, which seemed to define the fundamental limits on what you could do. Our thesis was simply that much of what we wanted to do was already inside that envelope. Of course we always hunger for more performance, but the point remains. What we didn’t pay enough attention to in our original story, however, was that a platform is characterized by more than just its horsepower.
No matter how much our technical capabilities advance, there will always be something that acts as a limiting constraint. But though there are always limits, our experience has always been that these limits keep moving outward with the march of progress. While we were always champing at the bit for the next innovation, we were also fundamentally optimistic that the inexorable workings of Moore’s Law would eventually knock down whatever barrier was currently vexing us.
In the past 5-10 years, however, we have begun to encounter very different kinds of limits in the platforms that are available in the marketplace. These limits have little to do with the sorts of quantitative issues we worry about in the performance domain, and none of them are addressed (at least not directly) by Moore’s Law. They include such things as:
- Operating system misfeatures
- Dysfunctional standards
- The ascendancy of the web application model
- The progressive gumming up of the workings of the Internet by the IT managers and ISPs of the world
- Distribution channel bottlenecks, notably customer reluctance or inability to download and/or install software
- A grotesquely out of balance intellectual property system
- The ascendancy of game consoles and attendant closed-system issues
- Clueless regulators, corrupt legislators, and evil governments
As with the performance limitations that the march of progress has overcome for us, none of these are fundamental showstoppers, but they are all “friction factors” impeding development of the kinds of systems that we are interested in. In particular, several of these problems interact with each other in a kind of negative synergy, where one problem impedes solutions to another.
For example, the technical deficiencies of popular operating systems (Microsoft Windows being the most egregious offender in this regard, though certainly not the only one) have encouraged the proliferation of firewalls, proxies, and other function-impeding features by ISPs and corporate network administrators. These in turn have shrunk many users’ connectivity options, reducing them from the universe of IP to HTTP plus whatever idiosyncratic collection of protocols their local administrators have deigned to allow. (Folks should remind me, once I get the current batch of posts I’m working on through the pipeline, to write something about the grotty reality of HTTP tunneling.) Furthermore, the security holes in Windows have made people rationally hesitant to install new software off the net (setting aside for a moment the additional inhibiting issues of download bandwidth and the quantum leap in user confusion caused by any kind of “OK to install?” dialog). Yet such downloaded software is the major pathway by which one could hope to distribute workarounds to these various connectivity barriers. And working around these barriers in turn often comes down to overcoming impediments deliberately placed by self-interested vendors who attempt to use various kinds of closed systems to achieve by technical means what they could not achieve by honest competition. And these workarounds must be developed and deployed in the face of government actions, such as the DMCA, which attempt to impose legal obstacles to their creation and distribution. Although we enjoyed a brief flowering of the open systems philosophy during the 1990s, I think this era is passing.
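To give a concrete flavor of that tunneling workaround before I get around to the promised post, here is a minimal sketch of the client side of the trick, in Python. Everything specific in it is invented for illustration: relay.example.com, the /tunnel path, and the X-Tunnel-Session header are all hypothetical, standing in for whatever relay service one might actually deploy. The point is only that once a user’s connectivity has shrunk to “HTTP only”, every other protocol has to be smuggled through requests that look like these.

```python
import http.client

# A minimal sketch of HTTP tunneling from the client side, assuming a
# hypothetical relay at relay.example.com that accepts raw protocol bytes
# in POST bodies and returns the far end's reply bytes. None of these
# names refer to a real service.

RELAY_HOST = "relay.example.com"   # hypothetical relay server
RELAY_PATH = "/tunnel"             # hypothetical endpoint on that relay

def tunnel_send(session_id: str, payload: bytes) -> bytes:
    """Smuggle one chunk of an arbitrary (non-HTTP) protocol through a POST."""
    conn = http.client.HTTPConnection(RELAY_HOST)
    headers = {
        "Content-Type": "application/octet-stream",
        "X-Tunnel-Session": session_id,  # lets the relay demultiplex sessions
    }
    conn.request("POST", RELAY_PATH, body=payload, headers=headers)
    response = conn.getresponse()
    reply = response.read()            # the far end's bytes, relayed back
    conn.close()
    return reply

if __name__ == "__main__":
    # Pretend these bytes belong to some protocol the firewall would block.
    print(tunnel_send("session-42", b"HELLO, NON-HTTP PROTOCOL\r\n"))
```

The grottiness I alluded to is everything this sketch omits: simulating server push with long polling or chunked responses, coping with proxies that buffer, cache, or mangle request bodies, and so on.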
Note that, barring the imposition of a DRM regime that is both comprehensive and effective (which strikes me as unlikely in the extreme), the inexorable logic of technological evolution suggests that these barriers will be both permeable and temporary. That is a hopeful notion if you are, for example, a human rights activist working to tunnel through “The Great Firewall of China”. On the other hand, these things are, as I said, friction factors. In business terms that means they increase the cost of doing business: longer development times due to more complex systems that need to be coded and debugged, the time and expense of patent and intellectual property licensing requirements, more complicated distribution and marketing relationships that need to be negotiated, greater legal expenses and liability exposure, and the general hassle of other people getting into your business. This in turn means a bumpier road ahead for people like Randy and me if we try to raise investment capital for The Next Big Thing.