What happens when we connect Nvidia’s new RTX 3090 graphics card to a small 13” laptop? I’ve tested 10 games at 4K, 1440p and 1080p resolutions, as well as some content-creator workloads, to find out!
For testing I’m using MSI’s GeForce RTX 3090 Gaming X Trio, installed inside the Mantiz Saturn Pro Gen II external GPU enclosure and connected to the Razer Blade Stealth over Thunderbolt. As the Blade Stealth is just a small 13” gaming laptop, the specs are on the lower side, and in general you need to run AAA titles at 720p for decent frame rates, but this setup should take us to the next level. Although an eGPU setup is more niche than buying a gaming laptop or gaming PC, the appeal is that if you buy a laptop with Thunderbolt, you can upgrade GPU performance in the future with a setup like this.
I’ll be comparing the 3090 against a 2080 Ti to see how much of a performance boost the GPU difference actually gives us, as the laptop’s weaker processor and the Thunderbolt connection are both expected to act as bottlenecks compared to running the 3090 in a desktop PC.
I’ve tested with an external monitor connected directly to the graphics card, as performance is worse using the laptop’s own screen: rendered frames have to travel back over the same Thunderbolt link, eating into its already limited bandwidth.
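To put those bottlenecks in rough perspective, here’s a back-of-the-envelope sketch. The link rates are nominal published figures rather than anything I’ve measured, and real-world Thunderbolt throughput is typically lower still due to protocol overhead:

```python
# Nominal link rates (assumed figures, not measurements).
TB3_PCIE_GBPS = 32         # Thunderbolt 3 caps PCIe tunnelling at ~32 Gbps (PCIe 3.0 x4)
DESKTOP_X16_GBPS = 16 * 8  # PCIe 3.0 x16: ~8 Gbps per lane across 16 lanes

print(f"Desktop x16 slot vs eGPU link: ~{DESKTOP_X16_GBPS / TB3_PCIE_GBPS:.0f}x the raw bandwidth")

# Driving the laptop's internal screen sends finished frames back over the
# same link, e.g. an uncompressed 1080p 60 Hz feed at 24 bits per pixel:
return_gbps = 1920 * 1080 * 24 * 60 / 1e9
print(f"Internal display return traffic: ~{return_gbps:.1f} Gbps")  # ~3 Gbps
```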
Battlefield 5 was tested by running through the same mission in campaign mode on both graphics cards. I’ve got the 3090 shown by the purple bars and the 2080 Ti shown by the red bars, and the three resolutions tested are listed on the left, with 1080p at the bottom, 1440p in the middle, and 4K up top. The 3090 was further ahead the higher the resolution, which is typically expected, as the GPU can better stretch its legs. At 4K the 3090 was 24% faster in average frame rate, a little below the 29% 10-game average at this resolution.
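As an aside, the percentage figures quoted throughout are just the ratio of the two cards’ average frame rates; a minimal sketch with made-up numbers:

```python
def percent_uplift(fps_new: float, fps_old: float) -> float:
    """Percentage by which fps_new exceeds fps_old."""
    return (fps_new / fps_old - 1) * 100

# Hypothetical averages: 62 FPS vs 50 FPS works out to a 24% uplift.
print(f"{percent_uplift(62.0, 50.0):.0f}%")  # 24%
```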
Red Dead Redemption 2 was tested with the game’s benchmark tool. This time at 4K there was a much bigger difference, with the 3090 now 38% ahead of the 2080 Ti in average frame rate. The 3090 was still able to sit around 60 FPS while the 2080 Ti was a fair bit lower, impressive stuff considering this is a 13” laptop.
For Control we’ll start with the RTX off results. The differences in average frame rate were quite small at 1080p and 1440p, though the 3090 was seeing a much larger 19% boost to 1% lows. At 4K, though, the 3090 was around 38% faster than the 2080 Ti, which definitely had more noticeable stuttering.
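Since 1% lows come up a lot in these charts, here’s a minimal sketch of one common way to compute them from a capture of per-second FPS samples; note that different tools define the metric slightly differently, some working from frame times instead:

```python
def one_percent_low(fps_samples: list[float]) -> float:
    """Average of the slowest 1% of samples - a rough worst-case smoothness metric."""
    ordered = sorted(fps_samples)
    worst = ordered[:max(1, len(ordered) // 100)]
    return sum(worst) / len(worst)

# Hypothetical capture: mostly ~100 FPS with a few dips.
samples = [100.0] * 197 + [60.0, 55.0, 50.0]
print(one_percent_low(samples))  # 52.5 - the average of the two slowest of 200 samples
```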
These are the results with RTX and DLSS now enabled. At 4K there’s an 18% higher average frame rate with the 3090, and while this sounds nice, it’s actually the smallest 4K difference out of all 10 games tested.
Shadow of the Tomb Raider was tested using the game’s built-in benchmark. This was the only game that saw no difference between the two GPUs at 1080p. At 1440p there was just a couple of frames’ difference, then at 4K the 3090 was 22% faster than the 2080 Ti, though this result was still below the 10-game average.
Death Stranding was tested by running through the same section of the game on both graphics cards. Again there was a small difference in average FPS at 1080p, then a much larger 29% boost with the 3090 at 4K. Interestingly, the 1% lows were actually lower on the 3090 at 1080p and 1440p, though I’m not exactly sure why; this was the only game tested where this behaviour was seen.
Assassin’s Creed Odyssey was tested using the game’s benchmark tool. This game saw one of the smallest differences out of all titles tested, with a 21% higher average frame rate at 4K and just 13% at 1080p.

Call of Duty: Modern Warfare was slightly ahead of the 10-game average with the 3090, coming out 30% faster than the 2080 Ti at 4K, 17% at 1440p, and 10% at 1080p. It’s also worth noting that at 4K, even the 1% low with the 3090 was ahead of the average from the 2080 Ti.
Metro Exodus was tested with the built-in benchmark. This seems to be one of the more GPU-heavy games, as even the high settings preset showed some of the biggest differences: the 3090 was 19% faster at 1080p, 26% faster at 1440p, and 34% faster at 4K.
The Witcher 3 was tested with the same test pass on each GPU. The 2080 Ti was still playable at 4K with ultra settings, but the 3090 gave us a further 24% performance boost.
Rainbow Six Siege was tested with the game’s benchmark tool using Vulkan. The 1% lows in this test are down because the first 5 to 10 seconds chugged hard on both graphics cards. That doesn’t normally happen, so it might be some eGPU-specific issue; either way, it was much smoother after the initial hiccups. This game saw the biggest improvement at 4K out of all 10 games tested, with the 3090 39% faster in average frame rate than the 2080 Ti.
On average over all 10 games tested at 1080p, the RTX 3090 came out just under 10% ahead of the RTX 2080 Ti in average FPS, though this varied anywhere from 0 to 20% depending on the specific game.
When we step up to 1440p, the 3090’s lead increases to 15% over the 2080 Ti, a better result, as higher resolutions tend to be more GPU bound.
Finally, at 4K there’s a bigger 29% improvement with the 3090, quite a decent result. Many AAA games were definitely playable at 4K even at the higher settings I’ve tested, which I think is quite impressive when you consider I’m just using a Razer Blade Stealth with a lower-powered quad-core CPU.
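For anyone wondering how these summary numbers are derived, it’s just the per-game uplifts averaged together. A minimal sketch with hypothetical stand-in values rather than my actual measurements:

```python
# Hypothetical (3090 FPS, 2080 Ti FPS) pairs at one resolution - stand-ins only.
results = {
    "Game A": (62.0, 50.0),
    "Game B": (83.0, 60.0),
    "Game C": (54.0, 45.0),
}

# Per-game percentage uplift, then a simple average across titles.
uplifts = [(new / old - 1) * 100 for new, old in results.values()]
print(f"average uplift: {sum(uplifts) / len(uplifts):.1f}%")
```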
Let me know if you’d be interested in seeing a comparison of these results against a better laptop CPU, like a 6-core i7-10750H, to see how much that matters.
Outside of gaming, I’ve also tested DaVinci Resolve with the Puget Systems benchmark, which is generally considered more GPU heavy than alternatives like Adobe Premiere. In this test the 3090 scored 16% higher than the 2080 Ti. I’ve also used the V-Ray benchmark, where the 3090 produced a massive 112% higher score. So depending on the workload and how GPU bound you are, the 3090 can make a big difference, and if you’re doing things like modelling with large projects and can actually take advantage of the 24GB of VRAM, the 3090 might do even better.
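If you’re not sure whether your own projects get anywhere near that 24GB, you can watch usage while working. A minimal sketch that shells out to the standard nvidia-smi tool (the printed output is illustrative):

```python
import subprocess

# Query GPU name plus total and currently used VRAM via nvidia-smi.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total,memory.used",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "GeForce RTX 3090, 24576 MiB, 1234 MiB"
```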
The improvement in the games wasn’t super crazy, but I’ll test the 3090 in a desktop PC soon to compare the results and get an idea of how much performance is being left on the table.
In any case, if you’re willing to pay a premium (and let’s face it, if you’re planning an eGPU setup you’ve likely accepted that), it is possible to get some decent gains in gaming and other tasks over what was available last generation, despite the clear CPU and Thunderbolt bottlenecks in my setup. Let me know down in the comments if you’re running an eGPU setup and what else you’d like to see me test in future reviews.