Dave Orton kicked things off with an overview of the whole concept, offering a fairly realistic outlook on what GPU computing can and cannot do. Far from pitching it as the best tool for every job, Orton began by noting that it may or may not apply to your workload. If your problem maps well to a GPU, you can see speedups of 10x to 40x, a massive increase; doing the same math purely on CPUs would take a very expensive machine.