Hello, reader!

My name is Eugene Lazutkin.

I'm a software developer based in the Dallas-Fort Worth metroplex. Lazutkin.com is my professional blog, which contains short writeups and presentations regarding modern web applications. All articles here are written by me, unless stated otherwise.

Clients: thin vs. thick

Brad Neuberg wrote a good article comparing two different approaches to AJAX: the thick client (e.g., Dojo style) and the thin client (e.g., Prototype style). While it does a good job contrasting the two approaches, I want to underscore that the underlying problem is a clash of two cultures: "local application" developers versus "web site" developers.

There is no doubt that local applications create the most satisfying end-user experience. Their typical weakness is that the underlying data is restricted to the local installation, which makes any collaboration impossible. "Connected applications" can help to alleviate this problem, but networking is hard in general, and many local app programmers try to avoid it. They don’t have the proper culture for it, existing network APIs are hard to combine with GUI code, and so on. In general, they don’t get it.

On the other side of the spectrum we have web site developers. They tend to use AJAX as a crutch for improving the user interface, or for providing some whiz-bang eye candy. In doing so they try to stay on the familiar territory of web servers. They don’t have the proper culture to deal with the client side: JavaScript is notoriously hard to deal with, browser environments impose a funky (== unfamiliar) style of programming, and different browsers support JavaScript and the related infrastructure (e.g., DOM, events) differently. In general, they don’t get it either. That’s why they desperately cling to the "it should degrade seamlessly" requirement. In the majority of cases there is no rational foundation for this requirement; we are talking about pure psychology here.

Another gap is due to historical reasons. Ideas (just like applications) tend to be developed incrementally. While there is nothing wrong with that, it is the reason why we see only baby steps in bridging the gap from both sides. One camp adds the ability to update local programs over the Internet, or even to access some databases from applications, and claims a "new revolutionary approach". The other camp adds AJAX to web platforms as an afterthought, trying to keep all the logic on servers like in the good old days.

In public perception there is a coolness factor, which presently tips the balance against client-side applications. Everything not related to the web is deemed archaic. In reality there are a lot of applications that cannot be web-based: even when it is technically possible, it doesn’t make any sense to do so. If your application requires a lot of data (image processing, video processing, data crunching, and so on) or relies on near real-time response (device control, some video games, and so on), it is not practical to access the required data over the Internet.

In order to understand what’s going on, let’s take a look at the big picture:

  • The combined processing power of client computers exceeds that of all servers on every front (CPU, memory, and so on). On a global level it makes sense to distribute the load as evenly as possible, increasing the overall throughput.
  • Don’t forget "the round trip problem": response delay is ultimately limited by the speed of light, and we have no practical or theoretical means to reduce it, even for connections with unlimited bandwidth (a back-of-the-envelope sketch follows this list).
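
To put the round trip problem in numbers, here is a quick back-of-the-envelope calculation in JavaScript. The 8,000 km distance is a made-up example; real routes are longer, run through fiber at roughly two thirds of c, and add switching delays on top of the physical limit.

    // Lower bound on the round trip time dictated by the speed of light.
    var SPEED_OF_LIGHT_KM_PER_MS = 300;   // ~300,000 km/s in vacuum

    function minRoundTripMs(distanceKm){
        return 2 * distanceKm / SPEED_OF_LIGHT_KM_PER_MS;
    }

    // Example: a client in Dallas talking to a server ~8,000 km away.
    alert(minRoundTripMs(8000));   // ~53 ms before the server does any work

No amount of server tuning gets you below that floor; the only cure is to make fewer round trips, which is exactly what a smarter client can do.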

Historically, all robust systems tend to self-optimize, increasing the ratio of "useful work" to "required resources". The first two rules of server load optimization are (a small illustration follows the list):

  • Conserve bandwidth:
    • Do not send unnecessary data.
    • Cache is your best friend.
  • Conserve CPU load:
    • Don’t do unnecessary calculations.
    • Don’t repeat expensive calculations, if possible. (Cache is your best friend).
    • Off-load everything possible to clients. Let somebody else’s computer do the work.
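
The caching rule is the easiest one to illustrate. Below is a minimal memoization sketch in JavaScript; sumOfSquares is just a stand-in for any expensive calculation, and the bandwidth rules are usually covered by HTTP itself (Cache-Control, ETag/conditional GET, compression).

    // Generic memoization: do an expensive calculation once, then serve
    // repeat requests from the cache.
    function memoize(fn){
        var cache = {};
        return function(arg){
            if(!cache.hasOwnProperty(arg)){
                cache[arg] = fn(arg);   // the expensive call happens only once
            }
            return cache[arg];
        };
    }

    // A deliberately expensive stand-in: sum of squares up to n.
    function sumOfSquares(n){
        var s = 0;
        for(var i = 1; i <= n; ++i){ s += i * i; }
        return s;
    }

    var cachedSumOfSquares = memoize(sumOfSquares);
    cachedSumOfSquares(1000000);   // computed the hard way
    cachedSumOfSquares(1000000);   // served from the cache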

How can the client side help to optimize a system? Easy (a short JavaScript sketch follows the list):

  • Conserve bandwidth:
    • Do not ask for unnecessary data.
    • Reduce bandwidth with available means (data caching, data compression, partial data reconstruction, and so on).
  • Conserve CPU load:
    • Don’t do unnecessary calculations. In some cases it may be cheaper to ask the server to provide precalculated results.
    • Don’t repeat expensive calculations, if possible. Other clients or servers may have done it for you already.
    • Take on as much processing as practically possible. A server is a shared component of the global infrastructure, which can be overloaded by other clients, while your client is pretty much dedicated to the task most of the time.
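
Here is a minimal sketch of the same idea on the client: a small cache around XMLHttpRequest, so that identical GET requests are answered from memory instead of going back to the server. The URL in the usage example is hypothetical, and invalidation and error handling are left out for brevity.

    // Minimal client-side cache for GET requests.
    var ajaxCache = {};

    function cachedGet(url, callback){
        if(ajaxCache.hasOwnProperty(url)){
            callback(ajaxCache[url]);   // no network round trip at all
            return;
        }
        var xhr = window.XMLHttpRequest ?
            new XMLHttpRequest() : new ActiveXObject("Microsoft.XMLHTTP");
        xhr.open("GET", url, true);
        xhr.onreadystatechange = function(){
            if(xhr.readyState == 4 && xhr.status == 200){
                ajaxCache[url] = xhr.responseText;
                callback(xhr.responseText);
            }
        };
        xhr.send(null);
    }

    // Usage: the second call never touches the server.
    cachedGet("/data/cities.json", function(text){ /* render the data */ });
    cachedGet("/data/cities.json", function(text){ /* render the data */ });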

My point is that the direction of development can be easily predicted from these global trends. I foresee a merger of existing technologies that will bring the best of both worlds, producing truly distributed applications and services.

It looks like recent developments (some call it Web 2.0) go in the right direction: web sites provide various client-side helpers, ranging from "smart" bookmarklets for web browsers (e.g., Reddit) to browser plugins (e.g., Del.icio.us) to full-blown client-side helper applications: uploaders (e.g., Flickr Uploadr), notifiers (e.g., GMail Notifier), updaters (e.g., Google Updater), and so on. Connected applications are there as well (e.g., NASA World Wind). AJAX has its place too: just take a look at all the high-flying companies and new startups.

In my opinion it is not enough. We have yet to see new applications that leverage all the possibilities of the Internet. How long will it take? Not until developers soak in the new trends and understand how to design such applications properly. New tools will have to emerge as well. Do we have to wait for Web 3.0 for that? I hope not.