Are Cheap Graphics...GOOD?

Den Den 27 April 2020
If you're a gamer, you probably know that using the integrated graphics built into your computer's processor, instead of a full-fledged discrete graphics card, is a bit like eating meatloaf instead of steak.

In a nutshell, cramming high-performance graphics into a teeny tiny CPU is tough, because there just isn't enough space for all the extra transistors you'd need, not to mention that thermals would become a big concern. But it's also true that integrated graphics have gotten a lot better over the years, and there's actually a huge demand for quality integrated solutions.

To explore this further, we spoke with Nick Majeskovic and John Webb of Intel, and we'd like to thank them for their time and contribution.

Part of the reason integrated graphics have improved to the point where you can play games in HD (assuming you have reasonable expectations about frame rates and visual quality) is that transistor sizes have continued to shrink. We've gone from CPUs based on the 32 nanometer process in 2010 to under half that in the last ten years. More transistors mean not only more processing muscle, but also less electrical power needed for the same level of performance, which reduces heat output. But manufacturers could have used those extra transistors to boost CPU performance instead. So why improve graphics that many customers won't even use?
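As a rough back-of-the-envelope illustration, here's what that shrink buys you, assuming (as a simplification real process nodes only loosely follow) that transistor density scales with the inverse square of the feature size:

```python
# Illustrative only: modern node names no longer map cleanly to
# physical feature sizes, and real density gains vary by design.
old_node_nm = 32  # Intel's 32 nm process, circa 2010
new_node_nm = 14  # under half that, a decade later

# Idealized area scaling: density grows with the inverse square
# of the feature size.
density_gain = (old_node_nm / new_node_nm) ** 2
print(f"Same-area transistor budget: about {density_gain:.1f}x")
```

Even this idealized math hints at why chip designers suddenly had transistors to spare, and had to decide where to spend them.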

Thing is, the market for quality iGPU solutions has been there for a while, and it's absolutely huge. While many PC gamers are more interested in higher-tier discrete graphics cards, there are plenty of others who just want their favorite titles to be playable, without breaking the bank or draining their laptop's battery. And the progress has been such that with solutions like AMD's RX Vega graphics-equipped APUs and Intel's upcoming Xe-based products, the market for budget graphics cards is all but dead; they can't compete with what's already on the system for free.

And aside from gaming, onboard GPUs handle a variety of everyday tasks, like accelerating video playback and productivity workloads. Many people actually use their integrated graphics even when they have a discrete card, without even knowing it, which has driven companies like Intel and AMD to design their chips to deliver a certain baseline of graphics performance for a broad audience. With this in mind, chip makers have set out to make integrated graphics not only powerful enough to handle gaming, but also efficient enough for other tasks on laptops and tablets. They've done this by adding fixed-function units, that is, parts of the processor that are extremely good at one thing and do basically nothing else, and through more general architectural improvements.

The architecture of a chip describes how its transistors are organized and connected, and engineers are always finding new ways to improve it. Manufacturers also use tricks like dynamic tuning, which uses software to allocate power between the CPU and GPU portions of the chip depending on what the system is doing, drawing on data from thermal sensors. This was seen on Kaby Lake G, a series of laptop processors from a few years ago that combined separate CPU and GPU dies on a single package.
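To make the idea concrete, here's a minimal sketch of the kind of decision a dynamic-tuning scheme makes. The budget, thresholds, and function names are all invented for illustration; the real logic lives in firmware and drivers, not in user-level code like this:

```python
# Hypothetical sketch: split a shared package power budget between
# the CPU and GPU based on their relative load, backing off when hot.
# All numbers and names here are made up for illustration.

TOTAL_POWER_BUDGET_W = 15.0  # e.g. a thin-and-light laptop chip

def rebalance(cpu_load: float, gpu_load: float, temp_c: float):
    """Return (cpu_watts, gpu_watts) for the current conditions."""
    budget = TOTAL_POWER_BUDGET_W
    if temp_c > 90.0:
        budget *= 0.8  # throttle the whole package when it runs hot
    total_load = cpu_load + gpu_load
    if total_load == 0:
        return budget / 2, budget / 2  # idle: split evenly
    cpu_watts = budget * cpu_load / total_load
    return cpu_watts, budget - cpu_watts

# A GPU-heavy gaming workload gets most of the budget:
print(rebalance(cpu_load=0.2, gpu_load=0.8, temp_c=70.0))
```

The point isn't the exact numbers, it's that a shared budget lets whichever part of the chip is busiest borrow headroom from the part that isn't.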

There are still limits to how large you can make an integrated GPU, as we discussed previously. But manufacturers have, at times, dedicated extra die space to graphics, like Intel's Iris Plus for Ice Lake, which uses more die area for graphics execution units as well as for media encoding and decoding, which is increasingly important as we demand higher- and higher-definition audio and video without killing our batteries. And the use of chiplets could make it cheaper for manufacturers to take this approach, putting a little less graphics silicon on a basic product and a little more on a multimedia-focused product.

So the answer, as it turns out, is that the higher performance is something of a side effect of a huge industry push to optimize the cost and power efficiency of mobile devices in particular, through a combination of hardware and software improvements.

DX12 Ultimate, for example, could actually enable integrated graphics to run ray tracing in the future. Not necessarily well, mind you; we're not saying to ditch your $500 graphics card. It's just that the future looks pretty bright for gamers on tighter budgets, especially if you like to take your games with you on the road.

Comments (1)

  1. "We've went from 32nm to half that (14nm) in ten years"... yeah and if we also count AMD, half that again ;)
