The official release of the GeForce FX 5950 and 5700 is just days away, and today Nvidia held a conference in San Francisco to inform the press of its current and future plans for its 3D graphics product line. The company did not have much positive to say about ATI.
Nvidia chief technical officer Kurt Akeley, chief scientist David Kirk, software engineering vice president Dwight Diercks, and software engineering director Nick Triantos gave a presentation to show the press why the GeForce FX line is better than ATI's Radeon 9800.
The panel highlighted what Nvidia apparently believes to be the most important aspects of designing new graphics hardware. One of these is creating cards whose physical architecture allows them to be powerful yet affordable. The company's current line of cards based on the GeForce FX architecture (which underlies the GeForce FX 5800 and 5900 series) attempts to maximize high-end graphics performance by supporting both 16-bit and 32-bit per-color-channel shaders--most DirectX 9-based games use a combination of 16-bit and 32-bit calculations, since the former provides speed at the cost of flexibility, while the latter provides a greater level of programming control at the cost of processing cycles. The panel went on to explain that 24-bit calculations, such as those used by the Radeon 9800's pixel shaders, often aren't enough for more-complex effects, which can require 32-bit math. As the panel explained, the GeForce FX architecture favors long shaders and textures interleaved in pairs, while the Radeon 9800 architecture favors short shaders and textures in blocks.
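The precision trade-off the panel described can be illustrated with a minimal Python sketch (this is not actual shader code, and `round_to` and `accumulate` are hypothetical helper names) that emulates 16-bit and 32-bit shader arithmetic using the standard `struct` module's IEEE 754 half (`'e'`) and single (`'f'`) formats:

```python
import struct

def round_to(fmt, x):
    # Round a Python float to a narrower IEEE 754 width by packing
    # and unpacking it: 'e' = 16-bit half, 'f' = 32-bit single.
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

def accumulate(fmt, steps=1000, inc=0.001):
    # Repeatedly add a small increment, rounding after every
    # operation the way a fixed-width shader ALU would.
    x = 0.0
    step = round_to(fmt, inc)
    for _ in range(steps):
        x = round_to(fmt, x + step)
    return x

full = accumulate('f')  # 32-bit result stays very close to 1.0
half = accumulate('e')  # 16-bit result drifts noticeably below 1.0
print(full, half)
```

The 16-bit run loses a little precision on almost every addition, so after a thousand steps the error is plainly visible, while the 32-bit run remains accurate to several decimal places--a toy version of why long, complex shader programs can need the extra precision.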
The panel then discussed Nvidia's optimization policy, which it says prevents the company from optimizing drivers for specific benchmarks that emphasize features not found in real games. According to Nvidia, this is one of the reasons its latest GeForce FX cards don't show universally high performance in benchmarks.
The company also reiterated its commitment to image fidelity--rather than opting not to draw certain parts of a scene, GeForce FX cards draw every last part and effect. As an example, the panel showed two screenshots of an explosion from an overdraw benchmark, in which the GeForce card drew the entire explosion as a bright white flare, while the ATI Radeon card didn't draw every layer of the explosion (the upper-right corner had only a slight reddish tinge).
They also stressed the importance of the advanced graphical effects in id Software's upcoming Doom III.
Nvidia CEO Jen-Hsun Huang also made the interesting claim that although his company has recently lost market share (Nvidia has traditionally sold more graphics cards, across its entire product line, than any other manufacturer), the loss was due not to competition from ATI, but rather to competition from Intel's integrated graphics. According to Huang, Intel's integrated graphics hardware (or "Free-D"), which comes bundled with new pre-built Intel PCs, is very attractive to mainstream users because of its price point--and the mainstream is, of course, the market in which Nvidia makes its biggest profits.
Nvidia also plans to launch its 52.16 drivers soon, with 55-series drivers to follow early next year and the 60-series a few months after that.
However, according to an Nvidia representative, although the company has traditionally released reference drivers quarterly, it won't necessarily continue to do so unless needed; ideally, the company hopes to release as few as one driver update per year. On this point, the representative acknowledged that the GeForce FX line of cards has a known issue with full-scene anti-aliasing (especially in recent games such as Halo for the PC), and that the company hopes to address this issue soon.