Fudzilla claims NVIDIA's upcoming GeForce GTX 280 and 260 won't feature big architectural changes and that these chips are basically G92 GPUs with a lot more shaders:
Putting 240 shader units in a chip that basically remains on the G80 and G92 design will naturally get things moving faster. More shader units at faster clocks will always make your card faster, especially at higher resolutions.
The 65nm G92 has 128 shaders, while GT200 has 240, almost twice as many. The die size of GT200 is much bigger than G92's, and that is how you get the fastest chip around.
Our developer friends added that the last real innovation from Nvidia was G80 and that G92 is simply a die shrink of the same idea. You can look at GT200 as a G92 with 240 shaders.
This results in GT200 running quite hot, but it will compensate with sheer power, so let's just hope that Nvidia's yields on such a huge chip (rumoured to be bigger than 550mm²) will be acceptable.
Some official info will be available within a week or two.
Re: NVIDIA GT200 = a G92 with 240 shaders? by Anonymous on Saturday, May 31 2008 @ 04:52:16 CEST
But the shaders are "new and improved, up to 50% faster!"
Yeah, believe that and they've got a few bridges to sell you too...
ATI is going to win on cost, win in the mobile arena because the 700 can easily be made into a mobile chip while the GT200 can't, and is going to win in the mainstream while Nvidia flounders.
Nvidia is going to strike out this round (while having the fastest single chip as they'll tell anyone who will listen). They have another dustbuster on their hands and they know it. 55nm is easy. 45 would have been smart. But if you look at how they are duking it out with Intel you know "smart" isn't on Nvidia's resume.
Nvidia is kind of reminding me a bit of Yahoo right now... or Creative... I can't decide which.
Reply by Anonymous on Saturday, May 31 2008 @ 18:37:32 CEST
Don't get your hopes up too high about ATI's new boards, they're just catching up with NVIDIA's previous-gen cards, and some price cuts from NVIDIA could easily spoil ATI's party. It's sad but no matter how the fanboys spin it, ATI barely offers any competition for NVIDIA anymore. For competition's sake I hope it won't happen, but I fear ATI will gradually become a niche player like Matrox is nowadays.