The GeForce 3 chip is graphics chip maker NVidia's successor to the GeForce 2. It is a substantially reengineered piece of hardware, sharing a number of innovations with the more customized component slated to power 3D graphics in Microsoft's Xbox. With 3dfx eliminated and ATI a latecomer and at best a dark horse, expect this card to be the mainstay of home computer graphics acceleration.

The trend toward offloading more and more work onto the graphics hardware takes a rather dramatic leap with the new architecture, which centers on a programmable geometry engine and rendering system: what NVidia calls a Vertex Processor and a Pixel Processor, respectively. Software developers are expected to "customize the chips". I can't tell you yet just how such customization will work, but you can think of them as two new specialized CPUs, with all the attendant benefits (incredible power) and pitfalls (nightmarishly complicated to program against) such systems usually have.

Other hallmarks of the new design are the expected benefits of better fabrication techniques and the overall maturation of the design principles involved. The system is substantially faster despite a decrease in core clock speed, and the anti-aliasing system is improved. The memory architecture is based on four controllers operating in parallel; reminiscent of NUMA, each controller is associated with one bank, but the other banks remain accessible for load balancing. Texture compression is now the norm, and a new kind of z-buffer occlusion culling is used to optimize rendering.
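To make the load-balancing idea concrete, here is a toy sketch in C. It is purely illustrative — NVidia has not published its actual address-mapping scheme, and the constants and function name here are my own — but it shows why interleaving addresses across four narrow controllers keeps them all busy during a linear sweep of memory:

```c
#include <stdint.h>

/* Toy model of a crossbar memory system: four controllers, each
   serving a narrow slice of the address space. Interleaving on the
   low-order bits means consecutive accesses land on different
   controllers, so a linear framebuffer sweep keeps all four busy
   instead of serializing behind one wide controller. The mapping
   below is illustrative, not NVidia's real design. */
#define NUM_CONTROLLERS 4
#define SLICE_BYTES     32   /* bytes served per access by one controller */

static unsigned controller_for(uint32_t addr)
{
    return (unsigned)((addr / SLICE_BYTES) % NUM_CONTROLLERS);
}
```

With this mapping, addresses 0, 32, 64 and 96 go to controllers 0 through 3 respectively, and address 128 wraps back to controller 0.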

As most of you know, NVidia is about as in bed with Microsoft as you can be. The features on the card are tied intimately to DirectX 8. This is potentially bad for OpenGL, the platform-agnostic alternative graphics API developed by SGI and popularized by id Software, and thus bad for Apple and Linux.

Lest you think NVidia is just fucking around, take a look at some numbers:

57 million transistors
12x12mm die size
Fabbed at 0.15 micron
800 billion ops per second
76 billion floating-point ops per second
Up to 36 pixel shading operations per pixel
Fully anti-aliased Quake III at 1024x768 in 32-bit colour runs at over 70fps
Quake III frame rate over 117 per cent faster than GeForce 2 Ultra (previous generation card)

Sources: theregister.co.uk, nvidia.com

Correction: Rather than saying that NVidia is in bed with Microsoft (with regard to the integration of GeForce 3 features with DirectX 8), it is probably more accurate to say that Microsoft is in bed with NVidia, and designed the DX 8 API around the new features of the chip. Microsoft, without question, had plenty of input as to what those features should be, but by all reports so did traditional OpenGL advocates such as John Carmack. More on that in a moment.

NVidia continues to expose its new features in the OpenGL API through well-recognized OpenGL extensions. Many of these extensions will eventually become part of the OpenGL standard (for a clear instance of this, look at the evolution of SGIS_multitexture into ARB_multitexture), and in the meantime other vendors are free to implement any of the NVidia features they deem worthy, in either software or hardware (just as they are free to do the same for their DirectX 8 products).
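For the curious, detecting one of these extensions from a program looks something like the sketch below. In a real application the extension list comes from glGetString(GL_EXTENSIONS) on a live rendering context; here the parsing helper (my own, hypothetical has_extension) is shown on its own with a hardcoded string. Note that a naive strstr() is not enough, since one extension name can be a prefix of another:

```c
#include <string.h>

/* Check for a single extension in a space-separated extension string
   of the kind returned by glGetString(GL_EXTENSIONS). A bare strstr()
   can match a prefix of a longer name, so verify token boundaries. */
static int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        int start_ok = (p == extensions) || (p[-1] == ' ');
        int end_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (start_ok && end_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

A driver advertising "GL_ARB_multitexture GL_NV_vertex_program" would pass this check for GL_NV_vertex_program, while a hypothetical "GL_NV_vertex_program1_1" alone would not falsely match.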

Certainly NVidia has an awfully close relationship these days with Microsoft; it is providing not just the graphics engine for the Xbox console but also the sound and system chipset (new directions for the company, as of this writing). But it is premature to suggest that, because the features of the GF3 are well exposed in DX8, NVidia is fleeing its roots as a solid supporter of OpenGL. John Carmack has publicly expressed his support for the chip, calling it a "must have" for developers. In fact, developers of all persuasions have generally hailed the GF3 as the most important advance in consumer 3D technology since the original 3dfx Voodoo. There has been little concern in that community that NVidia's OpenGL support would be sub-par or lag behind the DX 8 support.

The new NVidia OpenGL extensions include NV_evaluators, NV_vertex_program, and NV_texture_shader.
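As a taste of what programming the Vertex Processor looks like, here is the classic pass-through program in the NV_vertex_program assembly language. It assumes the modelview-projection matrix has been tracked into constant registers c[0]..c[3] (via glTrackMatrixNV); it transforms each incoming vertex into clip space and copies the primary colour through unchanged:

```
!!VP1.0
# c[0]..c[3] hold the rows of the tracked modelview-projection matrix
DP4 o[HPOS].x, c[0], v[OPOS];
DP4 o[HPOS].y, c[1], v[OPOS];
DP4 o[HPOS].z, c[2], v[OPOS];
DP4 o[HPOS].w, c[3], v[OPOS];
# pass the primary colour through untouched
MOV o[COL0], v[COL0];
END
```

Every effect, from skinning to per-vertex lighting tricks, is built out of short programs like this one running on each vertex in place of the fixed-function transform pipeline.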
