Monday, March 28, 2011
Computer Memory and Making Dinner
Memory in a computer is organized as a hierarchy. At the very top you have the ultra-expensive and ultra-fast memory called registers. And at the bottom you have mind-numbingly slow but large storage such as a hard disk. Why does this matter? From a programmer's point of view, understanding the memory hierarchy allows you to optimize a program to work efficiently within it, keeping the correct bits of information at the highest (and fastest) levels. And from a non-programmer's point of view, these concepts provide insight into how a computer works.
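To make that optimization point concrete, here is a minimal sketch in C (entirely my own example, with an arbitrary array size) that sums the same grid of numbers twice. The first loop walks memory in the order C lays it out, so every value pulled into the fast levels gets used; the second loop skips a whole row between accesses and keeps throwing away perfectly good data.

#include <stdio.h>

#define ROWS 2048
#define COLS 2048

static double grid[ROWS][COLS];   /* roughly 32 MB: far too big for any cache */

int main(void) {
    double sum = 0.0;

    /* Cache-friendly: consecutive iterations touch consecutive addresses,
       so each chunk of data fetched from RAM is fully used before it is evicted. */
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            sum += grid[r][c];

    /* Cache-hostile: each access lands COLS doubles away from the last,
       so the fast levels rarely get to reuse anything they fetched. */
    for (int c = 0; c < COLS; c++)
        for (int r = 0; r < ROWS; r++)
            sum += grid[r][c];

    printf("sum = %f\n", sum);
    return 0;
}

The exact gap depends on the hardware, but on large arrays the second loop is typically several times slower, even though both do the exact same arithmetic.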
It has been rumored* that the entire modern computer memory hierarchy is based on a single late-night conversation between Dr. Simon Babbage XI and Chef Louis Von-Register IX (aka the Iron Cache). The two had been arguing late into the night about the most efficient way to organize ingredients in a kitchen. It was a heated debate between empirical evidence and learned rules of thumb. Unfortunately, the argument became so heated that neither man realized that they were both proposing the exact same scheme. Ultimately, the night marked both the end of a years-long friendship and a major step forward in computer architecture.
Thus, a good analogy for understanding how computer memory works is, naturally, food storage and dinner preparation:
Registers - Registers hold values that you are using at this moment. For example, these could be the inputs into an addition operation in the CPU. This storage is analogous to your own hands or maybe the current mixing bowl. It is literally the "stuff" that you are working with at that moment.
L1 Cache - The L1 Cache holds data that is cached from main memory, but may not be in use at the moment. It is relatively small, but very fast to access. This cache is analogous to the kitchen counter right next to where you are working. You can put stuff down there that you know you will need in just one minute. Just do not leave open containers of milk there all night.
L2 Cache - The L2 Cache is a larger cache between the L1 cache and main memory. It is like the far end of the counter - still easy to get things, but somehow a little more annoying.
RAM - RAM is your kitchen cabinets: larger, but farther away. You can store a lot more ingredients there than you will need for just the current meal. But, when you are carefully stirring the soup while trying to keep it at the perfect simmer, it can be a little annoying to run to the cabinet to get more salt.
Hard Drive / Network / Flash Drive - These represent the slow but very large storage devices. They are the neighborhood supermarket of the memory system. There is a lot of food there, but if you need to go there to get something for a particular recipe, you will probably miss dinner. Ideally, you want to limit your trips to the supermarket and not dash out at every step of the recipe (a small sketch of this batching idea follows the list).
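To put the "limit your trips" idea in code, here is a rough C sketch (again my own, and the file name "pantry.dat" is just a placeholder) that counts the bytes in a file two ways: first one byte per request to the operating system, then 64 KB per request.

#include <stdio.h>
#include <stdlib.h>

#define CHUNK (64 * 1024)   /* one "shopping trip" = 64 KB */

int main(void) {
    FILE *f = fopen("pantry.dat", "rb");
    if (!f) { perror("fopen"); return 1; }

    /* Many small trips: turn off stdio's buffer so every byte
       really is a separate request to the slow end of the hierarchy. */
    setvbuf(f, NULL, _IONBF, 0);
    long slow_count = 0;
    while (fgetc(f) != EOF)
        slow_count++;

    /* Few big trips: fetch a large chunk at a time, then work
       out of that buffer, like stocking the cabinets once and
       cooking from them. */
    rewind(f);
    char *buf = malloc(CHUNK);
    if (!buf) { fclose(f); return 1; }
    long fast_count = 0;
    size_t n;
    while ((n = fread(buf, 1, CHUNK, f)) > 0)
        fast_count += (long)n;

    printf("one byte at a time: %ld bytes, chunked: %ld bytes\n",
           slow_count, fast_count);

    free(buf);
    fclose(f);
    return 0;
}

Both loops produce the same count; the only difference is how many times we bother the supermarket.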
Of course, most of these levels are conveniently hidden from even low-level programmers. The obvious analogy here is the assistant chef that almost reads your mind: helping you get the correct ingredients, putting things on the counter, cleaning up unused items from the counter, and running simple errands. Fortunately, the computer's memory system does not get cranky and talk about you behind your back (I think).
*Actually, no such rumor existed until I made up that story. There is probably a much better (or at least more accurate) story for how the computer memory hierarchy came into existence, but I doubt it involves a thrown soufflé.