- Software takes forever to die - it's really hard to throw it out and start over again.
- Network effects are very strong - once all the apps are on Windows, everyone wants to run Windows. Once everyone runs Windows, we want to write apps for Windows.
What's weird here is that the iPhone was pretty much invented out of whole cloth. It doesn't run software from any other platform, it builds its UI on Objective-C and Cocoa (which, to the non-Kool-Aid-drinking half of the Apple third-party development community, looks like a new way to force us to use what we've been ignoring for years), and Apple has had the device locked up from day 1. This couldn't be more different from how Windows gained domination. So how did we get here?
Clearly having a beautiful device way before everyone else makes a huge difference. But I want to focus on another idea: is it possible that technology "productivity dividends" have fundamentally changed the calculus of building a new platform?
Development of applications for the original Macintosh was, by modern standards, brutal. You had 128K for the OS and your app, and it was a tight squeeze. Every line of code was performance critical and size critical. Those first GUI-based apps were written by some seriously brilliant programmers who had to sweat bullets.
Fortunately for us working programmers, computers are now much, much faster and bigger. Instead of writing apps that are millions of times faster (which no one would care about - at some point the window appears to open instantly, and any further speed improvement is moot), we write at a higher level of abstraction, which means we write apps more quickly. To draw a supply-and-demand analogy, apps for the iPhone (or any computer now) are less expensive in man-hours, because we have better tools that trade hardware horsepower for ease of development.
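To make that trade concrete, here's a rough illustration of my own (not anything from Apple's toolchain): a word-frequency count that a 1984 Mac programmer would have written as a hand-rolled hash table, carefully squeezed into that 128K, is a couple of lines at a modern level of abstraction. We cheerfully spend machine cycles on dynamic typing and garbage collection to save programmer hours.

```python
from collections import Counter

def word_counts(text):
    """Count word frequencies -- once a hand-optimized hash table,
    now a one-liner that burns cycles to save programmer time."""
    return Counter(text.lower().split())

print(word_counts("the quick brown fox jumps over the lazy dog the end"))
```

Nobody would ship this on a 128K Mac; the point is that nobody has to think about it anymore.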
So that might partly explain why Apple now has 140,000 apps or so on their phone. It's not that hard to write them. But what about this business where Apple hand-picks apps and rejects the ones they don't like? My first reaction as an iPhone app developer was "hrm... it sure looks like a real computer, but man is it locked down." It certainly wasn't what I was used to.
The iPhone is a surprising device to develop for, because as an app developer, you aren't given the tools to hose the machine. As a Windows developer you might be grumpy that, after decades, Microsoft has finally said you can't dump files randomly in the system folder without user permission, but the iPhone takes things more seriously. It's somebody's phone, dammit, and your app isn't getting outside of its sandbox, let alone into the OS.
I see the fact that the iPhone has successfully developed a third-party market despite being locked down as an indication that user demands may be changing. In the old world, where apps were rare and expensive to write, what we wanted was: more software. Perhaps in the new world, where writing apps isn't so hard, what users want is an experience that focuses on quality rather than quantity of apps.
(Or to put it another way: if you would agree to audit every single piece of software that a user might put on their Windows computer and guarantee that none of it was going to wreck that computer, you'd have a service you could sell. The iPhone comes with that out of the box.)
Of course, I could be missing the point entirely; the iPhone cuts distributors out of the loop, with revenue going only to the store and the studio - perhaps that's enough to launch 140,000 apps.