Perceived speed is a great way to make your end users think that your application is snappy.

It's especially important for web applications, because people have no tolerance for waiting on the web. We want things to be responsive, snappy, fast. Developers can give us that, but it takes more foresight (and experience failing to do this) than it does to take the easy way out.

The easy way out is to send everything back and forth between the browser and the server. It's easy because you do not need to keep track of anything. You put it all out there and you get it all back.
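In code, the easy way out looks something like the sketch below. The endpoint and payload shape are hypothetical, made up for illustration:

```typescript
// The "easy way out": ship the entire target list to the server on every
// save, and get the entire list back on every load. The /audit endpoint
// and payload shape are assumptions, not any particular app's real API.
async function saveAllTargets(auditId: number, targets: string[]): Promise<void> {
  await fetch(`/audit/${auditId}/targets`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Fine with 50 targets; painful with 1500+.
    body: JSON.stringify({ targets }),
  });
}
```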

The problem arises when it comes time to deal with larger volumes of data. You did plan on your software managing larger volumes of data than you tested with...right?

Most people don't. I know I didn't. With nessquik, I figured, "Why do I need to page this target list? I can't imagine any single person having more than 50 or so targets." And then I found myself adding site-audit functionality... with exclusion lists of 1500+ targets. In the words of the grail knight from The Last Crusade, "he chose poorly."

I had made the decision to send everything back and forth between the client and the server, and my backend code operated this way. That turned out to be unacceptable, because it made the user experience abysmal: loading several megs of HTML makes most browsers kack.

This problem turned me in the direction of paging. I could page the target list, loading only a small subset up front, 15 or so, and then loading the rest on demand, a page at a time.
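The on-demand fetch might look something like this sketch, assuming a hypothetical /targets endpoint that accepts page and pageSize parameters (nessquik's real routes may differ):

```typescript
// Hypothetical paged-fetch client. The endpoint, parameter names, and
// response shape are illustrative assumptions.
interface TargetPage {
  targets: string[]; // one page of audit targets
  total: number;     // total count, so the UI can draw page controls
}

const PAGE_SIZE = 15;

async function fetchTargetPage(page: number): Promise<TargetPage> {
  const params = new URLSearchParams({
    page: String(page),
    pageSize: String(PAGE_SIZE),
  });
  const response = await fetch(`/targets?${params}`);
  if (!response.ok) {
    throw new Error(`Failed to load page ${page}: ${response.status}`);
  }
  return response.json();
}
```

On the backend, this typically maps to a LIMIT/OFFSET-style query, so only the requested slice ever gets serialized and shipped to the browser.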

This works, but it requires modifying both the backend and the frontend to support paging. On top of that, I also need to modify both ends to support updating the audit targets in a totally different way: by sending only the changes.

Since the whole list will almost never be on the screen, I can no longer rely on sending the whole list back to the server to be saved. I now need to track the changes to the list and apply those changes on the backend when the audit is saved.
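One way to track that delta (a sketch under my own assumptions, not nessquik's actual implementation) is to keep additions and removals in two sets as the user edits, then submit just those on save:

```typescript
// Hypothetical change tracker for the audit target list. The class name
// and the save endpoint are illustrative assumptions.
class TargetChangeTracker {
  private added = new Set<string>();
  private removed = new Set<string>();

  add(target: string): void {
    // Re-adding a target the user just removed cancels the removal;
    // Set.delete() returns true if the element was present.
    if (!this.removed.delete(target)) {
      this.added.add(target);
    }
  }

  remove(target: string): void {
    // Removing a just-added target cancels the addition.
    if (!this.added.delete(target)) {
      this.removed.add(target);
    }
  }

  // The delta is all the server needs to bring the stored list up to date.
  delta(): { added: string[]; removed: string[] } {
    return { added: [...this.added], removed: [...this.removed] };
  }
}

async function saveAudit(auditId: number, tracker: TargetChangeTracker): Promise<void> {
  await fetch(`/audit/${auditId}/targets`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(tracker.delta()), // send only the changes
  });
}
```

The backend can then apply the delta with a handful of inserts and deletes instead of rewriting the entire list.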

These changes are not overly complicated, but they require time, and time is a precious commodity.

Unfortunately, you usually do not have these revelations until you've experienced doing it wrong. Even for small applications, or home-grown applications that will stay in-house, you develop the false belief that your data volume will always be small. That's usually not the case.