Google focuses significant resources on keeping search latency low (Google ppt presentation); they view low latency as a user-experience necessity.
At the Web 2.0 Summit I was amazed at how many companies declared that the high cost of power was becoming a real business driver for them. Articles like this one emphasize that the problem of power cost is being recognized throughout the industry, and it isn't just about cooling anymore; the cost of the power consumed is the issue. It even turns out that a Second Life avatar consumes more power than most real-world humans (and that's just in the data center; it doesn't count the power consumed at the user's end).
So how long will it be before companies like Google begin to consider metrics like the marginal cost per click of improving response time as a function of increased power consumption (since reducing latency requires more machines)?
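A metric like that is easy to sketch on the back of an envelope. The snippet below is purely illustrative: the function name and every number in it (server count, wattage, electricity price, click volume) are my own invented assumptions, not anything Google has published.

```python
# Hypothetical model (all figures invented for illustration): the extra
# power cost per click of adding servers to shave latency.

def marginal_power_cost_per_click(extra_servers, watts_per_server,
                                  cost_per_kwh, clicks_per_hour):
    """Extra power cost attributable to each click, in dollars."""
    extra_kw = extra_servers * watts_per_server / 1000.0  # total added load, kW
    hourly_cost = extra_kw * cost_per_kwh                 # dollars per hour
    return hourly_cost / clicks_per_hour                  # dollars per click

# Say 1,000 extra servers at 300 W each, $0.10/kWh, 10 million clicks/hour:
cost = marginal_power_cost_per_click(1_000, 300, 0.10, 10_000_000)
print(f"${cost:.8f} per click")  # → $0.00000300 per click
```

Tiny per click, but at billions of clicks the dollars add up, which is exactly why it could become a metric worth tracking.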
In addition to the trends already under way toward more efficient chips running in more efficient data centers with the latest hardware virtualization optimizations, will application designers one day design for an energy-efficient user experience? Will they no longer take for granted the number of cycles required to execute a particular task?
Web 1.0 companies were built around the idea that the marginal cost of serving each additional customer was near zero. With existing technologies, at what cost per kilowatt-hour does this assumption break down?
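One way to frame the question: find the electricity price at which power stops being a negligible fraction of revenue per user. This sketch uses entirely made-up numbers (revenue per user, kWh per user, and the 1% threshold are all my assumptions), just to show the shape of the calculation.

```python
# Hypothetical break-even sketch (all figures invented): the electricity
# price above which power cost exceeds some "negligible" slice of revenue.

def breakeven_cost_per_kwh(revenue_per_user_month, kwh_per_user_month,
                           negligible_fraction=0.01):
    """Price at which power equals `negligible_fraction` of monthly revenue."""
    return revenue_per_user_month * negligible_fraction / kwh_per_user_month

# If a user brings in $2/month in revenue and consumes 5 kWh/month of
# data-center power, power exceeds 1% of revenue above this price:
price = breakeven_cost_per_kwh(2.00, 5.0)
print(f"${price:.3f}/kWh")  # → $0.004/kWh
```

With those invented inputs, the break-even lands well below typical retail electricity rates, which suggests the "near zero" assumption may already be strained for power-hungry applications.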