Intel strikes back at NVIDIA, says video encoding belongs on CPU

Posted on Wednesday, June 04 2008 @ 20:59 CEST by Thomas De Maesschalck
NVIDIA boasts that its GPUs can handle tasks like video encoding much faster than the fastest Intel quad-core CPUs, but Intel snaps back by saying video encoding is better done on the CPU. Bit-tech had a chat with François Piednoel, senior performance analyst at Intel, at Computex, and was told you need to do video encoding on the CPU because GPUs will deliver poor video quality.
Intel’s representatives said that DivX uses the CPU to dynamically adjust bit allocation across the scene, ensuring that the most detailed areas get the bit-rate they require.

“When you’re encoding on the CPU, the quality will be higher because we’re determining which parts of the scene need higher bit-rates applying to them,” said Piednoel.

Piednoel claimed that the CUDA video encoder will likely deliver poor-quality encodes because it uses a brute-force method, splitting the scene up and treating each pixel the same. It’s interesting that Intel is taking this route, because one thing NVIDIA hasn’t really talked about so far is video quality.

“The science of video encoding is about making smarter use of the bits and not brute force,” added Piednoel.
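
To make that concrete, here is a minimal sketch of the kind of per-region bit allocation Piednoel is describing: measure how much detail each macroblock contains (here via luma variance) and give busy blocks a finer quantizer than flat ones. The block size, thresholds, and QP offsets are invented for illustration; this is not DivX’s actual rate-control algorithm.

#include <stdio.h>

#define BLOCK 16                       /* macroblock edge length in pixels */

/* Luma variance of one 16x16 block: a crude measure of local detail. */
static double block_variance(const unsigned char *luma, int stride,
                             int bx, int by)
{
    double sum = 0.0, sum_sq = 0.0;
    for (int y = 0; y < BLOCK; y++)
        for (int x = 0; x < BLOCK; x++) {
            double p = luma[(by * BLOCK + y) * stride + (bx * BLOCK + x)];
            sum    += p;
            sum_sq += p * p;
        }
    double n = (double)(BLOCK * BLOCK);
    double mean = sum / n;
    return sum_sq / n - mean * mean;
}

/* Map detail to a quantizer: busy blocks get a finer quantizer (more bits),
 * flat blocks a coarser one (fewer bits). Thresholds/offsets are made up. */
static int adaptive_qp(double variance, int base_qp)
{
    if (variance > 500.0) return base_qp - 2;   /* detailed: spend bits */
    if (variance < 50.0)  return base_qp + 2;   /* flat: save bits      */
    return base_qp;
}

int main(void)
{
    enum { W = 64, H = 32 };            /* tiny 64x32 test "frame" */
    static unsigned char luma[W * H];

    /* Left half flat grey, right half noisy: two kinds of content. */
    for (int i = 0; i < W * H; i++)
        luma[i] = (i % W < W / 2) ? 128 : (unsigned char)(i * 37 % 256);

    for (int by = 0; by < H / BLOCK; by++)
        for (int bx = 0; bx < W / BLOCK; bx++) {
            double v = block_variance(luma, W, bx, by);
            printf("block (%d,%d): variance %7.1f -> QP %d\n",
                   bx, by, v, adaptive_qp(v, 26));
        }
    return 0;
}

A real encoder feeds this decision back into an overall bit budget (MPEG-2’s TM5 rate control uses a similar activity measure, for instance); the point of the sketch is simply that the quantization decision is content-dependent, which is the “smarter use of the bits” Piednoel means.
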
When asked about Larrabee, Piednoel said you can't compare this upcoming graphics board to a traditional GPU because Larrabee has full x86 cores:
I asked Piednoel what will happen when Larrabee turns up because that is, after all, a massively parallel processor. I thought it’d be interesting to see whether Intel would change its tune once it had something with the raw processing power to deliver application performance similar to what is being claimed with CUDA. Intel said that comparing this to a GPU is impossible, because the GPU doesn’t have full x86 cores. With CUDA, you can only code programs in C and C++, while x86 allows developers to choose whatever programming language they prefer – that’s obviously a massive boon to anyone who doesn’t code in C.
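
For contrast, this is roughly what “C for CUDA” looks like in practice: plain C extended with kernel and launch syntax, one thread per sample, every sample treated identically. It’s a hedged sketch of the programming model Piednoel is criticizing, not NVIDIA’s actual encoder; the kernel and variable names are invented for illustration.

#include <cstdio>
#include <cuda_runtime.h>

/* One thread per coefficient; every coefficient is divided by the same
 * quantizer step. A deliberately uniform, "brute force" treatment --
 * an illustration of the CUDA programming model, not NVIDIA's encoder. */
__global__ void quantize_uniform(const short *in, short *out, int n, int q)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] / q;             /* same step size everywhere */
}

int main()
{
    const int n = 1 << 20;              /* one megasample of coefficients */
    short *d_in, *d_out;
    cudaMalloc(&d_in,  n * sizeof(short));
    cudaMalloc(&d_out, n * sizeof(short));
    cudaMemset(d_in, 0, n * sizeof(short));

    /* 256 threads per block, enough blocks to cover all n samples. */
    quantize_uniform<<<(n + 255) / 256, 256>>>(d_in, d_out, n, 8);
    cudaDeviceSynchronize();

    std::printf("done: %d samples quantized with one uniform step\n", n);
    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}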

Intel claimed that not every developer understands C or C++ – while that may be true to an extent, anyone who has done a Computer Science degree is likely to have learned C at some point; the first language I learned during my Computer Science degree was, you guessed it, C. After learning procedural programming in C, I applied that knowledge to write procedural programs in other languages.
Sounds like Intel is a bit scared that GPUs will hurt sales of its high-end processors.

