With AMD tech powering the next generation consoles, NVIDIA explains why PC gaming won’t lag behind
Bennett Ring speaks with NVIDIA’s Senior Vice President of Content and Technology, Tony Tamasi, about the impact the next generation consoles will have on the PC.
PCPP: In the past, when a new console launched, its graphics were on par with, if not better than, a reasonably well-specced PC of the time. Yet at this year’s E3, we noticed that the Xbox One and PS4 demos didn’t look as good as the earlier PC demos of the same games. Do you think the lead that consoles enjoyed on launch day is over?
Tamasi: It’s no longer possible for a console to be a better or more capable graphics platform than the PC, and I’ll tell you why. In the era of the first PlayStation and the PS2, there weren’t really good graphics on the PC. 3D only started coming to the PC around the time of the PS2; before that, 3D was the domain of Silicon Graphics and other 3D workstations. So Sony, Sega or Nintendo could invest in bringing 3D graphics to a consumer platform. In fact, the PS2 was faster than a PC.
By the time of the Xbox 360 and PS3, the consoles were on par with the PC. If you look inside those boxes, they’re both powered by graphics technology from AMD or NVIDIA, because by that time all the graphics innovation was being done by PC graphics companies. NVIDIA spends 1.5 billion US dollars per year on graphics research and development, every year, and over the course of a console’s lifecycle we’ll spend more than 10 billion dollars on graphics research. Sony and Microsoft simply can’t afford to spend that kind of money; they just don’t have the investment capacity to match the PC guys. We can do it thanks to economies of scale, as we sell hundreds of millions of chips, year after year.
“It’s no longer possible for a console to be a better or more capable graphics platform than the PC”
The second factor is that everything is limited by power these days. If you want to go faster, you need a more efficient design or a bigger power supply. The laws of physics dictate that the amount of performance you’re going to get from graphics is a function of the efficiency of the architecture and how much power budget you’re willing to give it. The most efficient architectures are from NVIDIA and AMD, and you’re not going to get anything significantly more power efficient in a console, as it’s using the same core technology. Yet the consoles have power budgets of only 200 or 300 Watts, so they can sit in the living room, using small fans for cooling, and still run quietly and cool. That’s always going to be less capable than a PC, where we spend 250W on the GPU alone. There’s no way a 200W Xbox is going to beat a 1000W PC.
PCPP: But how is this different from the launch of the PlayStation 3 and Xbox 360, which had the same power limitations but were still on par with the PC?
Tamasi: Because at that time, the PC graphics industry wasn’t operating at the limits of device physics and power. If you wind back the clock, a high-end graphics card was maybe 75W or 100W at most. We weren’t building chips on the most advanced semiconductor processes, with billions of transistors. Now we’re building GPUs at the limits of what’s possible with fabrication techniques; nobody can build anything bigger or more powerful than what’s in the PC right now. That simply wasn’t the case in the last generation of consoles. Taken to the theoretical limit, the best any console could ever do is ship with hardware equal to the best PC of the day, but a year later it would be slower anyway, and even that parity isn’t achievable given the power limits.
There’s been a shift here: the R&D budgets required to build the PC’s level of graphics are enormous, and only a few companies can sustain them. The technology we’re applying to PC graphics is literally state of the art, at the limits of semiconductor technology. That’s why I don’t think it’s possible any more for a console to outperform the PC.
PCPP: On that note, we’ve heard developers say that they can extract roughly three times the performance from a console’s hardware compared to a similarly-specced PC, thanks to the fixed nature of the hardware: they know exactly what they’re developing for, and there is less driver and OS overhead. Is this still true, or have newer versions of DirectX helped tap into the PC’s performance? Will the next-gen consoles still punch well above their weight?
Tamasi: I think a console can punch above its weight to some degree, but not by a factor of three. I wouldn’t even say a factor of two. That’s partly because they have leaner, meaner operating systems and APIs that sit closer to the metal, and partly because developers can hand-craft code for these fixed platforms.
Things have changed, though. DirectX has gotten much, much better than it used to be; the gap between the DirectX interface and a to-the-metal interface used to be huge, but it has narrowed considerably. PCs and consoles also look alike these days: the PS4 and Xbox One have an x86 CPU and a PC-style GPU. Each is essentially a giant integrated-graphics PC.
“It’s great for gamers, as games can be better on all platforms”
That’s great, because if devs are spending all this time optimising for the PS4 or Xbox One, a good portion of that work will benefit the PC, because they’re essentially doing PC architecture optimisation. It’s good for everyone: developers no longer have all these wildly different architectures to sort through, and 80% of their work is now applicable to all platforms. It’s great for gamers, as games can be better on all platforms. And it’s great for the PC, as there’s less weird divergence between consoles and PC, which gives devs a lot more leverage to raise the bar. If there were technological reasons games weren’t ported to the PC in the past, there will be far fewer of them come next-gen.