Auto-update is a Killer Technology
By Adrian Sutton
There’s a huge amount of new technology pouring out into the world these days – lots of it focussed around the web and consumers – and the key term that keeps being talked about is ubiquity. Flash brought video to the web because it was ubiquitous; HTML5 will kill Flash once it’s ubiquitous; mobile web access is becoming ubiquitous; etc. The most striking thing in all of this is just how quickly these new technologies are actually finding their way into real-world usage.
The path to ubiquity is suddenly an awful lot shorter than it has ever been. Take, for example, the rate of upgrades from IE 6 to IE 7 compared with the rate from IE 7 to IE 8. Flash support for H.264 is comparatively new, but it’s now ubiquitous. Anyone supporting Firefox 1.0 anymore? What about 2.0? The upgrade cycle is so fast that the range of older tech needing to be supported has shrunk dramatically in the past few years.
How has all this happened? Auto-update became standard.
When IE 6 was first released, I’m not sure whether Windows Update existed or not, but it certainly wasn’t turned on by default. Not so long before that, downloading software updates was really quite uncommon – you went to the store and bought the upgrade. These days, it’s expected that software includes an auto-update function, and you assume it will be on by default or will at least ask on first start-up.
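It’s worth noticing just how little machinery that function actually needs. Here’s a minimal sketch of the check at the heart of a typical auto-updater: fetch a small version manifest from a known URL, compare it with the running version, and flag an update if we’re behind. The manifest URL and its format are made up for illustration – this isn’t any particular product’s update API.

```typescript
// Minimal auto-update check: compare the running version against a
// published manifest. The URL and manifest shape are hypothetical.
interface UpdateManifest {
  latestVersion: string; // e.g. "1.5.2"
  downloadUrl: string;
}

// Compare dotted version strings numerically, so "1.10.0" > "1.9.3".
function isNewer(latest: string, current: string): boolean {
  const a = latest.split(".").map(Number);
  const b = current.split(".").map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const diff = (a[i] ?? 0) - (b[i] ?? 0);
    if (diff !== 0) return diff > 0;
  }
  return false;
}

async function checkForUpdate(currentVersion: string): Promise<void> {
  const res = await fetch("https://example.com/app/update.json");
  const manifest: UpdateManifest = await res.json();
  if (isNewer(manifest.latestVersion, currentVersion)) {
    // A real updater would download the new build, verify a signature,
    // and install it; here we just report that an update exists.
    console.log(`Update available: ${manifest.latestVersion} at ${manifest.downloadUrl}`);
  }
}

checkForUpdate("1.4.0").catch(console.error);
```

The interesting part isn’t the code – it’s that once this check runs on every start-up instead of waiting for the user to go looking, the whole upgrade curve changes shape.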
The best example of this that I know of is actually Java. EditLive! still supports the ancient and no-longer-supported-by-Sun Java 1.4 – not because we like it, but because it’s a key customer requirement and it keeps showing up in real usage. Java 1.4 didn’t include an auto-update mechanism, so its users had to take specific action to upgrade it, and most of them didn’t care to. Java 1.5, on the other hand, did add auto-update, and as a result, while it still turns up a fair bit, it’s almost always at the latest 1.5 release; the same goes for Java 6. And that’s with the terrible upgrade experience Java provides.
The thing is, this rate of upgrading is only going to increase. iPhone software, for example, is updated incredibly rapidly because it all comes through the App Store, and the more users see upgrades happening smoothly and successfully, the more they trust the process and the more frequently they’ll upgrade. Not to mention the more willing they’ll be to accept software that just silently updates itself.
For developers, this changes the way to view the world. It’s actually viable in many cases to target just the latest version of many new technologies, because it’s easier for users to upgrade and more likely that they’ll do it, either now or at least very soon. It also means that things like the Flash vs HTML5 debate change quite significantly – the time required for the HTML5 tech to become ubiquitous enough is probably shorter than most people expect, and much of it is already there. So the argument that Flash will have moved on by the time HTML5 catches up is a lot weaker than history would make it seem.
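To make that concrete, here’s a hedged sketch of the kind of feature detection that lets you target the new tech today: try native HTML5 video first and fall back to a Flash player only when it’s missing. The container id, file names, and Flash embed details are placeholders, not a specific player’s API.

```typescript
// Feature-detect HTML5 video support and fall back to a Flash embed.
// The container id and fallback markup are illustrative placeholders.
function supportsHtml5Video(): boolean {
  const video = document.createElement("video");
  // canPlayType only behaves usefully on browsers with HTML5 video;
  // an empty string means this codec definitely can't be played.
  return typeof video.canPlayType === "function" &&
    video.canPlayType('video/mp4; codecs="avc1.42E01E"') !== "";
}

function embedVideo(containerId: string, mp4Url: string): void {
  const container = document.getElementById(containerId);
  if (!container) return;
  if (supportsHtml5Video()) {
    container.innerHTML =
      `<video src="${mp4Url}" controls width="640" height="360"></video>`;
  } else {
    // Hand the same H.264 file to a Flash-based player instead.
    container.innerHTML =
      `<object type="application/x-shockwave-flash" data="player.swf"` +
      ` width="640" height="360">` +
      `<param name="flashvars" value="file=${mp4Url}" /></object>`;
  }
}

embedVideo("player", "movie.mp4");
```

The point of the sketch is that the fallback branch is temporary scaffolding: as auto-updating browsers roll HTML5 support out, the else branch simply stops being hit.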
Look out a bit further than today and start thinking of the power that auto-update brings. Imagine the world in, say, five years’ time – it’s likely that by then IE 6 market share will be small enough to be ignored by most or all people. So everyone would have upgraded to a browser of today’s standard or better – significantly, one with a solid auto-update mechanism. How likely do you think it is that there’d be significant market share for anything but the latest, and maybe second-latest, major release of a browser? That would mean new web technologies could be rolled out to effective ubiquity within one or two years – potentially much quicker if they’re released in a minor update, which is also increasingly common.
The same thing is happening outside of web technologies as well – in roughly everything except desktop OSes. The bottom line is that developers have never had it better: they can leverage newer and better platforms sooner to deliver more features, more easily. All thanks to the unnoticed workhorse that is auto-update.