James McCombe on ray tracing & the gaming industry

02.06.2009

Ultimately, the vision behind this is that today, in professional graphics and offline rendering, people who work in that space already understand the benefits of ray tracing. [They] use it on a regular basis and build their whole workflows around the assumption that ray tracing takes an incredibly long time. And they're used to that.

When they want to do rapid previews, they don't use ray tracing. They use rasterization. They tend to build preview tools around the GPU. Of course, it means that there's a tremendous visual gap between what they get when they preview and what they ultimately get when they submit their work to a render farm or to a large computer and wait a long time for the results to come out.

Then you've got gaming. There is no ray tracing in gaming. Intel talks about it, but there is no game out there today that uses ray tracing. There's nothing out there that's even close to being able to bring truly, fully ray traced [technology] in and close that gap.

Our vision is [to solve] the fundamental problem of ray tracing, which is 'how do you take advantage of massively parallel stream processors to bring all that computing to bear on the ray tracing problem, and how do you create a straightforward and standardized programming interface to allow people to migrate their existing game engines or rendering engines over to use ray tracing?'
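
To make the parallelism point concrete, here is a minimal sketch (not any shipping ray tracing API; the scene, camera, and all names in it are illustrative assumptions) of why ray tracing maps well onto massively parallel hardware: every primary ray is independent, so per-pixel work can be split freely across threads or stream processors.

```cpp
// Illustrative sketch only: one primary ray per pixel, traced against a
// single hypothetical sphere, with rows of the image split across threads.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };

// Hypothetical scene: a unit sphere at the origin, camera at z = -3.
static bool hitSphere(const Ray& r) {
    // Standard ray-sphere intersection test (direction assumed normalized).
    float b = 2.0f * (r.origin.x * r.dir.x + r.origin.y * r.dir.y + r.origin.z * r.dir.z);
    float c = r.origin.x * r.origin.x + r.origin.y * r.origin.y + r.origin.z * r.origin.z - 1.0f;
    return b * b - 4.0f * c >= 0.0f;
}

int main() {
    const int W = 256, H = 256;
    std::vector<unsigned char> image(W * H, 0);

    // Each worker traces an independent slice of rows; rays never depend on
    // one another, which is what makes the workload stream-processor friendly.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) {
        pool.emplace_back([&, t] {
            for (int y = (int)t; y < H; y += (int)workers) {
                for (int x = 0; x < W; ++x) {
                    // One primary ray per pixel through a simple pinhole camera.
                    float u = (x + 0.5f) / W * 2.0f - 1.0f;
                    float v = (y + 0.5f) / H * 2.0f - 1.0f;
                    float len = std::sqrt(u * u + v * v + 1.0f);
                    Ray r{{0.0f, 0.0f, -3.0f}, {u / len, v / len, 1.0f / len}};
                    image[y * W + x] = hitSphere(r) ? 255 : 32;
                }
            }
        });
    }
    for (auto& th : pool) th.join();

    std::printf("traced %d primary rays on %u threads\n", W * H, workers);
    return 0;
}
```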

Our vision is that in four or five years' time, the same fundamental underlying technologies will be used for both production rendering needs and for game rendering needs. The only difference will be that for games, they may not cast quite so many rays, because they need to maintain interactivity, whereas for production rendering, they'll need to cast thousands of rays for every pixel, because they're trying to create absolute beauty that's going to be on the cinema screen.
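
To put rough numbers on that difference, here is an illustrative back-of-the-envelope comparison; the resolutions and sample counts are assumptions for the sake of the arithmetic, not figures from the interview.

```cpp
// Hypothetical comparison of ray counts per frame: an interactive game frame
// with a few rays per pixel versus a production film frame with thousands.
#include <cstdio>

int main() {
    // Interactive preset: modest resolution, a handful of rays per pixel,
    // because the frame budget is measured in milliseconds.
    const long long gameW = 1920, gameH = 1080, gameSpp = 4;
    // Production preset: cinema resolution, thousands of rays per pixel,
    // with no interactivity constraint at all.
    const long long filmW = 4096, filmH = 2160, filmSpp = 4096;

    long long gameRays = gameW * gameH * gameSpp;
    long long filmRays = filmW * filmH * filmSpp;

    std::printf("game frame: %lld rays\n", gameRays);
    std::printf("film frame: %lld rays (about %.0fx more)\n",
                filmRays, (double)filmRays / (double)gameRays);
    return 0;
}
```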