The GeForce2 MX (aka NV11) was nVidia's first value product based on the GeForce architecture, essentially a stripped-down GeForce2 GTS. Introduced in June 2000, it surprised many reviewers with its level of performance compared with significantly more expensive cards. Intended as the replacement for the aging TNT2, it was the first 'mainstream' video card with hardware T&L, and thus drove the adoption of that feature by game developers.
There are three basic differences between the MX (NV11) and GTS (NV15) versions of the GeForce2 engine. They are:
- Fewer rendering pipelines: The GTS has four double-texturing pipelines, while the MX has two.
- Narrower memory bus: The GTS has a 128-bit DDR memory bus, while the MX is restricted to either a 64-bit DDR bus or a 128-bit SDR bus.
- Lower clock speeds: The GTS has a core clock of 200 MHz, while the MX is clocked at 175 MHz. Both cards have a memory clock of 166 MHz (although the GTS's memory is DDR).
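Taken together, these differences work out to the following theoretical peaks (a back-of-the-envelope calculation from the figures above; real-world throughput is always lower):

  GTS fill rate: 4 pipelines x 2 textures x 200 MHz = 1600 Mtexels/s
  MX fill rate:  2 pipelines x 2 textures x 175 MHz = 700 Mtexels/s
  GTS memory bandwidth: 128 bits x 166 MHz x 2 (DDR) = roughly 5.3 GB/s
  MX memory bandwidth:  128 bits x 166 MHz (SDR) = roughly 2.7 GB/s

Note that the MX's two memory options are equivalent: halving the bus width and doubling the data rate (64-bit DDR) yields the same roughly 2.7 GB/s as the 128-bit SDR configuration.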
One positive side-effect of the first and third restrictions is that the NV11 chip does not require active cooling and can be cooled sufficiently with a relatively small heatsink. In fact, it generates only 4W of heat, compared to the GTS's 8W and the original GeForce 256's 16W. The first restriction is not a huge problem for the MX's intended market, as the only 3D card at the time with a greater texel fill rate was the GTS. The throughput of the MX's pipelines is similar to that of the original GeForce's four single-texturing pipelines, except that the original GeForce was clocked at only 120 MHz.
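To put that comparison in numbers (theoretical figures again): against the MX's 700 Mtexels/s computed above, the original GeForce manages only

  4 pipelines x 1 texture x 120 MHz = 480 Mtexels/s

so on paper the newer budget part actually comes out ahead.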
The largest restriction is without a doubt number 2. The original GeForce 256 had a major memory bandwidth problem, which prevented the rendering engine from living up to its potential. The DDR version of the original GeForce alleviated this problem, but the GeForce2 MX brought it back: the NV11 core was never allowed to perform at its best because it was saddled with exactly the same memory bandwidth as the original GeForce. However, some graphics card manufacturers, such as Hercules, provided slight relief. My Hercules GeForce2 MX card has 183 MHz SDR memory on it rather than the specified 166 MHz, which produces a noticeable performance increase over cards that follow the reference design.
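Some rough bandwidth figures put the Hercules tweak in perspective (computed from bus width and memory clock; 150 MHz is the commonly quoted memory clock of the GeForce 256 DDR):

  GeForce 256 SDR: 128 bits x 166 MHz = roughly 2.7 GB/s (same as the MX)
  GeForce 256 DDR: 128 bits x 150 MHz x 2 = roughly 4.8 GB/s
  Hercules MX:     128 bits x 183 MHz = roughly 2.9 GB/s

Which leads to the reason why many gamers chose the MX over the GTS...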
Overclocking! The memory bandwidth limitations of the MX allowed many people to increase the performance of their video card simply by overclocking the memory. Core overclocking also produces an improvement, as always, but judicious memory overclocking can allow the MX to almost reach the performance of a (stock) GeForce DDR.
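As a hypothetical illustration, suppose an MX whose SDR memory tolerates 200 MHz (an assumed figure, not a guarantee for any particular card):

  128 bits x 200 MHz = roughly 3.2 GB/s

That is still short of the GeForce DDR's roughly 4.8 GB/s, but the MX starts with a fill rate advantage (700 vs. 480 Mtexels/s on paper), so it does not need to match the DDR card's bandwidth to approach its overall performance.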
After the introduction of the GeForce3 in early 2001, the GeForce2 MX was replaced by the GeForce2 MX 400, which is exactly the same card except with a core clock of 200 MHz. Unfortunately, this meant that the new MX cards required either a fan on the core in addition to a heatsink, or a significantly larger heatsink. Also released at that time was the GeForce2 MX 200, an even more cut-down chip featuring only a 64-bit SDR memory interface, leaving it with half the memory bandwidth of the original MX (64 bits x 166 MHz = roughly 1.3 GB/s). With the introduction of the GeForce4 MX in early 2002, the GeForce2 MX line began to fade away, just as the TNT2 did back in 2000.
(CC)
This writeup is copyright 2002-2004 D.G. Roberge and is released under the Creative Commons Attribution-NoDerivs-NonCommercial licence. Details can be found at http://creativecommons.org/licenses/by-nd-nc/2.0/ .