Caching and performance
Caching is one of the most important concepts in computer systems. Understanding how caching works is the key to understanding system performance at all levels. On a PC, caching is everywhere. We'll focus our attention here on CPU-related cache issues; specifically, how code and data are cached for maximum performance. We'll start at the top and work our way down, stopping just before we get to the RAM.
Feeding the beast
So what it all boils down to is this: if the CPU needs something, it checks its fastest cache. If what it needs isn't there, it checks the next fastest cache. If that's no good, then it checks the next fastest cache . . . all the way down the storage hierarchy. The trick is to make sure that the data that's used most often is closest to the top of the hierarchy (in the smallest, fastest, and most expensive cache), and the data that's used least often is near the bottom (in the largest, slowest, and cheapest cache).
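That walk down the hierarchy can be sketched in a few lines of code. This is a toy model, not how hardware actually does it: each "cache level" is just a dictionary, and the names `lookup`, `levels`, and `backing_store` are made up for illustration.

```python
def lookup(address, levels, backing_store):
    """Check each cache level in order, fastest to slowest; if every
    level misses, fall back to the backing store (RAM, in this analogy)."""
    for level in levels:
        if address in level:
            return level[address]      # hit: stop at the fastest level that has it
    return backing_store[address]      # miss everywhere: go all the way to memory

# Usage: L1 misses on 0x20, L2 hits, so the search stops at L2.
l1_cache = {0x10: "a"}                          # smallest, fastest
l2_cache = {0x10: "a", 0x20: "b"}               # bigger, slower
ram = {0x10: "a", 0x20: "b", 0x30: "c"}         # biggest, slowest
print(lookup(0x20, [l1_cache, l2_cache], ram))  # prints "b"
```

Note that the only thing that makes the top of the hierarchy "fast" in this sketch is that it gets checked first; in real hardware, of course, each level is also physically faster than the one below it.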
Most discussions of caching break things down according to issues in cache design, e.g. cache coherency and consistency, caching algorithms, etc. I'm not going to do that here. If you want to read one of those discussions, then you should buy a good textbook. The approach I'll take here is to start at the top of the cache hierarchy and work my way down, explaining in as much detail as I can each cache's role in enhancing system performance. In particular, I'll focus on how code and data work with and against this caching scheme.
One more thing: I look forward to getting feedback on this article. As always, if I'm out of line, then please feel free to correct me. And as always, leave the 'tude at the door. We all try to be professional and courteous here; we'll respect you if you respect us. That having been said, let's get on with the show.