DarkVision Hardware - Daily tech news

Intel chap claims discrete graphics card market will die

Posted on Thursday, April 03 2008 @ 20:29:26 CEST


Ron Fosner, an Intel Graphics and Gaming Technologist and former video game programmer, claims multi-core processors will drive multi-GPU "madness" out of the market. Fosner says people probably won't need discrete cards in the future, a pretty odd remark considering how much money Intel is pouring into the development of Larrabee.
Fosner made these comments while demonstrating Intel’s ‘Smoke’ demo at the Technology Showcase at IDF. We posted a short version of the demo a few days ago and you can watch that video here. Today we went back to get some more detailed answers.

...

Fosner told us that multi-core CPUs are more than capable of rendering complex scenes that used to be reserved for top-end graphics cards. He argued that Intel processors offered “more bang for the buck” and that it was more economical to move from a single-core to a multi-core processor than to pop multiple graphics cards into a machine. “The fact of the matter is that you’re going to have one graphics card, you may have a dual graphics card, but you’re not going to have a four graphics card or eight graphics card system,” said Fosner.
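Fosner didn't show any code, but the approach he describes is easy to sketch. Below is a minimal, hypothetical C++ example (the shade routine and all names are ours, not Intel's): a software renderer that splits the frame buffer into horizontal bands and shades one band per hardware thread, so the same code speeds up as core counts climb.

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Placeholder per-pixel routine; a real renderer would rasterize
// triangles or trace rays here. Purely illustrative.
static uint32_t shade(int x, int y) {
    return static_cast<uint32_t>((x ^ y) & 0xFF) * 0x010101u;
}

// Render one frame by splitting its rows into one horizontal band
// per hardware thread, so throughput scales with CPU core count.
void render_frame(std::vector<uint32_t>& fb, int width, int height) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    int band = (height + static_cast<int>(cores) - 1) / static_cast<int>(cores);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < cores; ++i) {
        int y0 = static_cast<int>(i) * band;
        int y1 = std::min(height, y0 + band);
        workers.emplace_back([&fb, width, y0, y1] {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    fb[static_cast<size_t>(y) * width + x] = shade(x, y);
        });
    }
    for (std::thread& t : workers) t.join();
}
```

Doubling the core count roughly halves the time per frame in a sketch like this, which is the scaling story Intel is selling against multi-GPU setups.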

Another advantage of CPU graphics and physics programming is that people won’t need to continually keep up with the latest programming techniques for each generation of new cards; futzing around with shader models and DirectX programming would be a thing of the past. Fosner said that “everybody” knows how to program for a CPU and that this new way of programming will “get rid of” a path of graphics obsolescence.
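That familiarity argument is simple to illustrate. A toy physics step like the one below is ordinary C++ that any application programmer can read and debug, with no shader model, driver path, or DirectX version to worry about (the structure and constants are our illustration, not code from the Smoke demo):

```cpp
#include <vector>

struct Particle {
    float x, y, z;    // position (metres)
    float vx, vy, vz; // velocity (metres/second)
};

// Advance every particle by one time step: plain loops and floats,
// the kind of code "everybody" already knows how to write.
void step(std::vector<Particle>& particles, float dt) {
    const float g = -9.81f; // gravity along z
    for (Particle& p : particles) {
        p.vz += g * dt;     // integrate velocity
        p.x  += p.vx * dt;  // integrate position
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        if (p.z < 0.0f) {   // crude ground-plane bounce
            p.z  = 0.0f;
            p.vz = -0.5f * p.vz;
        }
    }
}
```

A loop like this could be handed band-by-band to worker threads exactly as in the rendering sketch above, which is the kind of multi-core scaling the Smoke demo was built to show off.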

When asked if discrete graphics cards will be needed in the future, Fosner answered, “Probably not”. He explained that computers didn’t have discrete graphics in the 1980s and that CPUs are becoming powerful enough to take over that role again.
Source: TG Daily