I believe that real-time rendering will be largely reinvented over the next few years: still with plenty of rasterization, but now with not just one, but two really exciting new tools in the toolbox, neural nets and ray tracing. I think a lot of rapid progress will come as we revisit the algorithms used for real-time rendering and reconsider them in light of the capabilities of those new tools.

Pharr is one of the people credited with pioneering physically based rendering. He worked at Pixar on A Bug's Life and Toy Story 2, co-founded a company that was acquired by NVIDIA, founded another that was bought by Intel, worked on the latter's Larrabee GPU, and joined Google in 2013.
I wanted to be in the thick of all of that and to help contribute to it actually happening. Therefore, last Friday was my last day working at Google. I'm in the middle of a three-day spell of unemployment, and on Tuesday I'll be starting at NVIDIA Research, joining the real-time rendering group. I can't wait to get started.
PC Perspective notes that we're at an interesting time for computer graphics. On one hand, real-time ray tracing is finally becoming a realistic possibility for future video games; on the other, we're seeing the emergence of neural networks that can create convincing, though not physically accurate, effects at a lower hardware cost.