You can spend a lot of time debating the relative merits of fat-client and server-based computing. One side says server-based computing is easier to manage. The other side says clients don't want to depend on a server for their applications and data. We've all heard the debate. And, from a usability standpoint, both arguments have merit. However, the debate might be moot now or in the near future because it might become impossible to maintain fat-client platforms.
The problem doesn't lie in supporting the software; it lies in the computers themselves. Fat clients are getting increasingly paunchy, and we're limited in how much of that bulk we can support. Just think about issues such as power use, heat generation, and the disposition of old computers when you buy newer, more powerful ones.
Anyone who uses a laptop has seen how more powerful systems draw more power. This summer, California faces more rolling blackouts. The EPA says that an average PC draws about 50 watts (some estimates say 200 watts), and many people leave their computers on because they take so long to power up. Thin-client devices draw only about 10 watts, and because they boot quickly, you don't need to leave them on when you're not using them.
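To put those wattage figures in perspective, here's a rough back-of-the-envelope comparison. The usage patterns (a PC left on around the clock because it's slow to boot, versus a thin client powered only during roughly 2,000 working hours a year) are illustrative assumptions of mine, not figures from the sources cited above:

```python
# Rough annual energy comparison (illustrative assumptions, not measured data).
PC_WATTS = 50            # the EPA's average-PC figure cited above
THIN_CLIENT_WATTS = 10   # the typical thin-client draw cited above

# Assumption: the PC stays on 24 hours a day, 365 days a year.
pc_kwh_per_year = PC_WATTS * 24 * 365 / 1000

# Assumption: the thin client runs ~8 hours a day, ~250 workdays a year.
thin_client_kwh_per_year = THIN_CLIENT_WATTS * 8 * 250 / 1000

print(f"PC:          {pc_kwh_per_year:.0f} kWh/year")   # 438 kWh/year
print(f"Thin client: {thin_client_kwh_per_year:.0f} kWh/year")  # 20 kWh/year
```

Under those assumptions the always-on PC uses more than twenty times the energy of the thin client, and most of the difference comes from the hours it sits idle.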
According to the Environmental Energy Technologies Division of Lawrence Berkeley National Laboratory, office buildings are the biggest electricity consumers among US nonresidential customers because of their huge amount of floor area. Bodies generate heat—little can be done about that. But computers—and monitors—not only generate heat but also operate poorly above a fairly low internal temperature. To keep the inside of a computer below 110 degrees Fahrenheit, you must maintain a cool ambient temperature. Thin-client devices run cooler than PCs and have the added advantage of having no fans, which means a much quieter work environment.
What happened to those 486s and Pentiums and Pentium IIs? What's going to happen to your new 1GHz system 2 years from now? Older computers aren't living out their twilight days at Ye Olde Computer Home, and I doubt that most of them are being used as Apache servers, either. Because the new computers are relatively inexpensive and newer software is so hardware-hungry, it's not easy to give older computers away, let alone sell them. One estimate is that more than 100,000 tons of old PCs and their components end up in landfills each year. Of course, thin-client devices can become obsolete too—otherwise, Wyse, NCD, and every other thin-client manufacturer would be out of business—but the usable lifespan of a terminal is longer than that of today's PC.
I don't know whether we can persuade organizations to move to a server-based model. Honestly, scrapping all existing PCs and purchasing thin clients addresses part of the problem by reducing heat and power requirements, but exacerbates the landfill problem: You must figure out what to do with the PCs you replace. One way or the other, we'll have to do something about our computers' ever-increasing power demands. In the opinion section of Monday's New York Times, Evar D. Nering, professor emeritus of mathematics at Arizona State University, observes that if the demand for a nonrenewable power supply increases by 5 percent each year, today's 100-year supply of that resource will last only 36 years. At the same rate of demand growth, today's 1000-year supply would last 79 years. If increasing the supply can't resolve power problems, we need to cut demand—and one way to cut that demand is to keep 1GHz computers off people's desks.
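Nering's arithmetic checks out under a continuous-growth model: if annual demand grows at rate r, cumulative consumption after T years is (e^(rT) − 1)/r times today's annual demand. Setting that equal to a supply of S years at today's rate and solving for T reproduces his figures. A minimal sketch (the function name is mine):

```python
import math

def years_until_exhaustion(supply_years, growth_rate=0.05):
    # Continuous-growth model: cumulative demand after T years is
    # (e**(r*T) - 1) / r times today's annual demand. Set that equal
    # to supply_years and solve for T.
    return math.log(1 + growth_rate * supply_years) / growth_rate

print(round(years_until_exhaustion(100)))   # 36, matching Nering's figure
print(round(years_until_exhaustion(1000)))  # 79
```

The striking part is how little a tenfold-larger supply buys: at 5 percent annual growth, 1000 years of supply lasts barely twice as long as 100 years of supply.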
I don't think it's possible to halt computer development. Nor do I want to; it's exciting to see what we can do today that we couldn't do 10, 5, or even 2 years ago. Some of those advances even let us reduce consumption of other goods, such as the paper needed to print newspapers. But most people simply don't need huge amounts of processing power on their desktops. A solution that would let us keep most of the heat and power concentrated in the server room and put low-power devices on desktops would surely help the power situation. We have such a solution—it's just not as widely used as it could be. And the "biggerbetterfastermore" trend is still noisier than the voices that ask, "Do we really NEED all this power on the desktop? Are we USING it?"
A major reason to embrace server-based computing is to get off the client-upgrade treadmill. Doing so is not only a feel-good measure for the environment, but also a way to keep using our computers while we figure out how to power them in years to come.