Monday, July 24, 2006

Time to dump the desktop?

The lead article in the Technology section of today's Wall Street Journal (July 24, 2006) is titled "Is It Time to Dump Your Desktop?" It's interesting to see the question being asked so prominently, but the answer offered in the article is somewhat disappointing - perhaps because the question, as asked, is premature.

The WSJ stacks the deck heavily in favor of the desktop by focusing on the two main components of the standard "office suite," word processing and spreadsheets. It's pretty obvious that people aren't going to switch to using online spreadsheets to run multinational corporations any time soon.

Nonetheless, the trend away from the desktop is clearly established. I've blogged on this subject before. In an entry in June, I listed some of the things I actually do now in my web browser without launching any other locally installed application, such as reading and writing email, managing my calendar, managing projects for my clients (using Backpack), blogging, doing my banking online, paying bills, making travel arrangements, buying almost anything that can be bought (books, CDs, computers, clothes, groceries), managing my online photo collection, reading the news, doing FTP, building a Web site, looking up almost anything in an encyclopedia or dictionary, and on and on. Actually, I also mentioned word processing and spreadsheets: I do use Google's spreadsheet, and I was a Writely subscriber before Google bought it. But even without word processing and spreadsheets, the fact remains: the application I spend the most time in is my web browser - and apparently, I'm not alone in this respect.

It's easy to grasp the appeal of web apps, if they are good. Users don't have to install anything (usually). The application is stored in some central location (on the application server), and users simply open a browser (or the appropriate client application) and connect. Because the application used by everybody is in just one location, it can be updated easily by the owners of the app, and users get the benefits immediately. Data storage is also centralized, so it's much easier to establish a sound backup program. And (as the WSJ article does note), if the documents being edited are on a shared server, then you can start talking about sharing the documents themselves, doing collaborative editing, publishing documents from the server, etc. Many of these benefits are available with any client-server system - for example, a database hosted on a machine running FileMaker Pro (in which case the appropriate client application would probably be FileMaker Pro, rather than a web browser). The web is, in many respects, simply a huge client-server system with a remarkably flexible client application (the browser).

So the question boils down to this: how good can web apps get? Or to put it differently, can they compete with desktop apps?

The answer to this is, at least in many cases, not just yes, but hell yes. The most complex apps on my desktop (say, Microsoft Word and Excel) are not going to be replaced by web-based services until web programming gets a lot more powerful or our expectations of these kinds of applications get simpler. Or both. In fact, both things are happening. The tools available to Web 2.0 developers - Ajax, Ruby on Rails, etc. - are relatively easy for programmers to use and provide plenty of power to develop neat and useful web apps.

In the last twenty years, or perhaps the last thirty, software developers constantly asked the computer to do more. The ultimate product of that era in software development is Microsoft Office, the product that can do almost anything, from making grocery lists to running a multinational company, from writing a business letter to writing and collaboratively editing a scholarly book with illustrations, notes, indexes, table of contents, and computer-generated typesetting. I think that we've learned that computers can do a lot, but that they do some things better than others, and some of the things that computers don't do terribly well (or at least don't do very easily) may be done better by human beings using other technologies or other systems. The one thing that software developers have talked about for decades but never achieved is real ease of use. I cannot predict the future, but I think easy-to-use computing must be the Next Big Thing. And if computers are going to be easy to use, then applications need to have easy-to-use interfaces, but they must also be shared and centrally administered.
