GPUs Are Getting MORE Power-Hungry


Pointless! That's how a lot of people these days would describe high-wattage power supplies that promise 1000 watts or more at an unreasonably high price. They used to be more common when people rocked multiple graphics cards in their rigs, but as SLI and CrossFire fell out of favor due to stability and performance issues, so did those hefty power supplies. But are the days coming when they might be relevant once again?

Yeah, they very well may be if trends in graphics card power continue the way they're going. And if you haven't paid super close attention to exactly what's going on in GPU land, this might seem surprising, considering power efficiency has been a huge trend across electronics. I mean, we're getting to the point where it's disappointing if a high-end smartphone only gives you one day of battery life, and the latest Apple Silicon-equipped MacBook Pros boast over 20 hours.

And although we don't run our desktop gaming rigs off of batteries, graphics cards have been getting more efficient. The issue is that although the number of frames you get per watt has been increasing, total power draw has also been increasing. Not to mention the fact that both chipmakers and game developers keep pushing the boundaries of visual fidelity. It's just always more photorealistic.
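To make that frames-per-watt point concrete, here's a quick back-of-the-envelope sketch. The FPS and wattage figures below are made-up illustrative numbers, not measured benchmarks; the point is just that efficiency can rise even while total power climbs:

```python
# Illustrative numbers only (not real benchmark data): a newer card can be
# more efficient per watt while still drawing far more total power.
cards = {
    "older high-end card": {"fps": 60, "watts": 250},
    "newer high-end card": {"fps": 140, "watts": 450},
}

for name, c in cards.items():
    efficiency = c["fps"] / c["watts"]  # frames rendered per watt of board power
    print(f"{name}: {efficiency:.2f} frames per watt at {c['watts']} W")
```

Here the hypothetical newer card delivers more frames per watt, yet its total board power is still 200 watts higher.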

Just a few years ago, Nvidia's then top-end Titan Xp drew around 250 watts at load, with AMD's competing Radeon VII weighing in at 300 watts. But now Team Red's current best, the Radeon RX 6950 XT, has increased to 330 watts, while Nvidia's 3090 Ti has a TDP of a whopping 450 watts. May as well have just come out of a flame broiler.

And the expectation is that the upcoming RTX 4080, featuring Nvidia's new Ada Lovelace architecture, could clock in at around 400 to 500 watts, while the 4090 could suck down as much as 600 watts of power on its own. But why does more power automatically have to be the way we try to get more performance?

Now, one big reason that manufacturers might not be paying too much attention to how much power their cards are guzzling is simply because they don't particularly have to. I mean, sure, you can advertise a desktop card as being power efficient or having a good cooling solution, but at the end of the day, the thing that's going to sell cards is performance.

AMD and Nvidia would much rather compete on FPS benchmarks than try to one-up each other by saying, "Hey, our GPU uses 15% less power!" And this trend might continue due to the rise of chiplets in CPUs and GPUs instead of one big monolithic die.

If you don't know, chiplets are modular chip pieces that can be combined to act as a single processor. They're gaining popularity in fabs because they have better yield, meaning a defect on the wafer will only affect one small chiplet rather than a whole, complete processor. It's likely that chiplets will allow companies to build bigger GPUs more profitably, something AMD in particular seems quite interested in, and we can't rule out that Nvidia may move in that direction someday either.
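The yield argument for chiplets is easy to see with a simple model. The sketch below uses the classic Poisson yield approximation (die yield falls off exponentially with area times defect density); the defect density and die sizes are illustrative assumptions, not figures for any real process:

```python
import math

def poisson_yield(defect_density, die_area_cm2):
    """Fraction of dies expected to be defect-free under a Poisson yield model."""
    return math.exp(-defect_density * die_area_cm2)

D = 0.2           # defects per cm^2 -- an illustrative assumption
monolithic = 6.0  # one big 600 mm^2 GPU die
chiplet = 1.5     # the same silicon split into four 150 mm^2 chiplets

mono_yield = poisson_yield(D, monolithic)
chip_yield = poisson_yield(D, chiplet)

# With chiplets, a wafer defect kills only one small piece, which gets
# discarded individually, so far more of the wafer ends up usable.
print(f"monolithic die yield: {mono_yield:.1%}")
print(f"per-chiplet yield:    {chip_yield:.1%}")
```

Under these made-up numbers, roughly 30% of the big dies come out clean versus about 74% of the chiplets, which is exactly why building big GPUs out of small pieces can be so much cheaper per working processor.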

Of course, this doesn't mean there isn't an upper limit to how much power a card of the future will draw. High-wattage power supplies are expensive, and expecting folks to save up for both an expensive GPU as well as a 200 plus dollar power supply might just be too much to ask. Not to mention companies that build pre-built PCs won't be too happy about having to spend extra money on nicer power supplies which they've traditionally cheaped out on.

And even though there are plenty of spacious gaming-oriented cases on the market, an insanely high wattage card means a super bulky cooling solution that would take up an unpalatable amount of space or spit out unacceptable amounts of heat. Not ideal if you're in a small, poorly ventilated room, like you are, right now. 

Den W.
I'm a passionate tech enthusiast who loves diving into the world of software, programming, and tech reviews.