Apple to adopt NVIDIA's CUDA technology?

Posted on Friday, June 06 2008 @ 23:58:17 CEST


CNET suggests Apple may adopt NVIDIA's CUDA technology to boost the performance of some of its applications, like video transcoding. More information about this may follow next week at Apple's Worldwide Developers Conference.

"Apple knows a lot about CUDA," Huang said, implying the company might be ready to formally embrace Nvidia's technology to make it easier to exploit graphics chips inside Macs. Apple's implementation "won't be called CUDA, but it will be called something else," Huang said in an interview here at Nvidia's headquarters on Wednesday.

Software developers are interested in the potential of graphics chips because of their ability to embrace parallelism, or the simultaneous execution of different types of problems. CPUs from Intel and AMD are designed as general-purpose processors, able to handle any kind of code a programmer can throw at the chip. But until multicore chips became all the rage, those CPUs were basically designed to tackle one problem, and then move onto the next problem: and software for those chips has been designed accordingly.

GPUs, on the other hand, break up a problem into much smaller bits and process it in parallel with other problems at a very high rate of speed. To this point, however, only specialized applications such as graphics software or high-performance computing applications have been able to take advantage of that raw horsepower. Nvidia, AMD, and Intel are all working on ways to allow everyday programmers to exploit the unique characteristics of graphics processors.
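
To put that parallelism point in concrete terms, here is a minimal CUDA sketch of our own (purely illustrative; not code from NVIDIA, CNET or Apple, and the kernel and variable names are invented for this example). Each GPU thread computes a single array element, which is the "break the problem into smaller bits and process them simultaneously" model described above; a CPU loop would instead walk through the elements one at a time.

// vector_add.cu -- illustrative CUDA sketch (hypothetical example, compile with nvcc)
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements; thousands of threads run at once.
__global__ void addArrays(const float *a, const float *b, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique element index for this thread
    if (i < n)
        out[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // about one million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMalloc(&a, bytes);                  // device buffers (host-side data setup and
    cudaMalloc(&b, bytes);                  // copy-back omitted to keep the sketch short)
    cudaMalloc(&out, bytes);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    addArrays<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(a);
    cudaFree(b);
    cudaFree(out);
    return 0;
}

Whatever Apple ends up calling its implementation, the appeal for developers is the same: a workload like video transcoding can be split across thousands of such lightweight GPU threads instead of being handled step by step on a general-purpose CPU core.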
