Three-dimensional game engines running on low-end hardware (PlayStation, Nintendo 64, early PC) have to resort to dirty tricks to keep the framerate up. One popular trick is simulated fog, which limits the visible depth of the scene so the engine doesn't have to spend time drawing objects beyond a certain distance at all. It's easy to implement: blend the colors of distant polygons toward a constant fog color based on depth (or just dim their lighting), and set the far clipping plane really close so anything fully fogged never gets drawn.
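
Concretely, distance fog is little more than interpolating each vertex (or pixel) color toward a constant fog color as depth increases, then pulling the far clipping plane in to wherever the fog becomes solid. Here's a rough C sketch; the fog_start/fog_end constants, the Color type, and the numbers are made up for illustration and aren't from any console SDK.

    /* A minimal sketch of per-vertex distance fog, assuming a software
     * pipeline with colors in the 0..255 range.  fog_start, fog_end and
     * fog_color are made-up numbers, not from any particular SDK. */

    #include <stdio.h>

    typedef struct { float r, g, b; } Color;

    static const float fog_start = 200.0f;   /* fog begins here            */
    static const float fog_end   = 600.0f;   /* fully fogged by this depth */
    static const Color fog_color = { 128.0f, 128.0f, 160.0f };

    /* Blend a color toward the fog color based on depth:
     * factor 0 = untouched, factor 1 = pure fog. */
    static Color apply_fog(Color c, float depth)
    {
        float factor = (depth - fog_start) / (fog_end - fog_start);
        if (factor < 0.0f) factor = 0.0f;
        if (factor > 1.0f) factor = 1.0f;
        c.r += (fog_color.r - c.r) * factor;
        c.g += (fog_color.g - c.g) * factor;
        c.b += (fog_color.b - c.b) * factor;
        return c;
    }

    int main(void)
    {
        Color green = { 0.0f, 200.0f, 0.0f };
        float depth;
        /* Past fog_end everything is the same color, so the far clipping
         * plane can be pulled in to fog_end with no visible popping. */
        for (depth = 100.0f; depth <= 700.0f; depth += 200.0f) {
            Color c = apply_fog(green, depth);
            printf("depth %5.0f -> (%3.0f, %3.0f, %3.0f)\n",
                   depth, c.r, c.g, c.b);
        }
        return 0;
    }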

The result, however, can be annoying. First-person and behind-the-player games depend on a long draw distance to create a sense of space. Not to pick on one console, but N64 games such as Buck Bumble, Turok: Dinosaur Hunter, Star Fox 64's multiplayer, South Park, and Superman have far too much fog for their own good, and no storyline to back it up. (Atari was much better at routing around gaming hardware deficiencies with the storyline, but that's another node.)

Graphics coders: if you're running into performance problems, don't just fog the whole world up. Try cutting polygon count in other ways first, for instance by swapping in lower-detail models as objects get farther from the player (sketched below) or by using more efficient world data structures, or optimize the heck out of your code before you make the game unplayable with too thick a layer of fog. If you need a target to shoot for, Super Mario 64, Zelda 64, and GoldenEye show what can be done without dense fog.
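
To make the level-of-detail suggestion concrete, here's a rough C sketch of picking a model by distance instead of fogging it out of view. The Mesh type, the distance thresholds, and draw_mesh are all made-up placeholders, not any real console library.

    /* A rough sketch of distance-based level-of-detail selection.
     * Everything here (Mesh, the thresholds, draw_mesh) is a made-up
     * placeholder, not any real console library. */

    #include <stdio.h>

    #define NUM_LODS 3

    typedef struct {
        const char *name;
        int         polygons;
    } Mesh;

    /* Stand-in for the real renderer call. */
    static void draw_mesh(const Mesh *m)
    {
        printf("drawing %s (%d polys)\n", m->name, m->polygons);
    }

    typedef struct {
        const Mesh *lods[NUM_LODS];             /* lods[0] = highest detail */
        float       lod_distance[NUM_LODS - 1]; /* switch-over distances    */
    } Object;

    /* Draw the coarsest mesh that still looks acceptable at this
     * distance from the player. */
    static void draw_object(const Object *obj, float distance_to_player)
    {
        int i;
        for (i = 0; i < NUM_LODS - 1; i++)
            if (distance_to_player < obj->lod_distance[i])
                break;
        draw_mesh(obj->lods[i]);
    }

    int main(void)
    {
        Mesh high = { "raptor_high", 900 };
        Mesh mid  = { "raptor_mid",  300 };
        Mesh low  = { "raptor_low",  100 };
        Object raptor = { { &high, &mid, &low }, { 150.0f, 400.0f } };

        draw_object(&raptor, 50.0f);    /* -> raptor_high */
        draw_object(&raptor, 250.0f);   /* -> raptor_mid  */
        draw_object(&raptor, 900.0f);   /* -> raptor_low  */
        return 0;
    }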