Aside from PSUs, another source of confusion for ordinary consumers buying PCs is the graphics card. For everyday desktop use, any cheap graphics card will do as long as it fits the motherboard's slot (AGP or PCIe) and has an output that matches the monitor (VGA, DVI-I, or both). For gaming, however, the graphics card spells the difference between smooth and choppy gameplay.
When I've tried to explain how to buy graphics cards in the past, I'd mention the naming scheme used by both nVidia and ATI/AMD: the final three digits indicate the card's intended market (e.g. an nVidia 8200 is for entry-level machines while an 8800 is for gaming), while the leading digits indicate the card's generation (e.g. an ATI HD4850 is five generations newer than an ATI 9800). Simple as that sounds, I find it still confuses many people.
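If it helps, the naming rule above can be sketched as a tiny bit of code. This is just an illustration of the convention, not an official decoder; real product lines have exceptions (suffixes like GT/XT, five-digit model numbers, and so on), and the function name is my own.

```python
def decode_model(model: str):
    """Split a GPU model number into (generation, market tier).

    Illustrative only: follows the rule of thumb that the last three
    digits are the market tier and the leading digit(s) the generation.
    """
    digits = "".join(ch for ch in model if ch.isdigit())
    generation, tier = digits[:-3], digits[-3:]
    return generation, tier

print(decode_model("8800"))    # leading '8' = GeForce 8 series, '800' = gaming tier
print(decode_model("HD4850"))  # leading '4' = HD4000 series, '850' = upper tier
```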
[EDIT : Image removed due to hotlinking. If you're going to steal my paid bandwidth, at least ask permission first.]
You can use this hierarchy to compare the pricing of two cards, see which one is the better deal, and decide whether an upgrade is worthwhile. I don't recommend upgrading your graphics card unless the replacement is at least three tiers higher; otherwise the upgrade is more of a sidegrade, and you may not notice a worthwhile difference in performance.
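The three-tier rule of thumb boils down to a one-line check. The tier numbers here are hypothetical positions on a hierarchy chart (higher = faster), just to make the rule concrete:

```python
def worthwhile_upgrade(current_tier: int, new_tier: int, min_gap: int = 3) -> bool:
    """Rule of thumb: upgrade only if the new card sits at least
    `min_gap` tiers higher on the hierarchy chart."""
    return new_tier - current_tier >= min_gap

print(worthwhile_upgrade(5, 7))  # False: only two tiers up, a sidegrade
print(worthwhile_upgrade(5, 9))  # True: four tiers up, worth it
```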
Some of the more tech-savvy people in the audience might ask: “what about benchmarks?”
IMO, benchmarks are good indicators of a card's performance relative to other cards. However, if two cards' benchmark results differ by less than 10%, you probably won't notice the difference in real-world gameplay. Because of this, I don't recommend relying on small benchmark differences when judging how a card will actually perform. Benchmarks are fine, though, for judging the price/performance ratio of the cards in question.
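To put that 10% rule in concrete terms, here's a quick sketch. The threshold and the frame-rate numbers are illustrative, not measured results:

```python
def noticeable_difference(fps_a: float, fps_b: float, threshold: float = 0.10) -> bool:
    """Rule of thumb: if two cards' benchmark results are within ~10%
    of each other, the gap is unlikely to be felt in actual gameplay."""
    slower, faster = sorted((fps_a, fps_b))
    return (faster - slower) / slower > threshold

print(noticeable_difference(60.0, 64.0))  # False: ~6.7% apart, effectively a tie
print(noticeable_difference(60.0, 75.0))  # True: 25% apart, you'd feel this one
```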