How much memory your graphics card has access to is crucially important to gaming performance, but is a bigger buffer always worth spending more money on? The AMD Radeon RX 7600 XT puts this theory to the test, delivering 16 gigabytes of VRAM alongside a very familiar GPU and starting at ~$329.

The underlying core of the RX 7600 XT is pretty much identical to the RX 7600, which launched in May of last year. Both use the same Navi 33 GPU, featuring 2,048 shaders split across 32 compute units (CUs). What’s unfortunate for the XT model is that the regular RX 7600 already uses the full complement of 2,048 cores available on this chip, meaning there’s no room for improvement on the XT without an entirely different GPU design. Since that’s not happening, we’re left with an XT that resembles the non-XT in almost every regard.

From the identical shader count to the memory subsystem, the RX 7600 XT offers no surprises. There’s 32MB of Infinity Cache, which is used to help limit calls out to the VRAM, and when the GPU does need to go further afield, it does so over a 128-bit memory bus.
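For a rough idea of what that 128-bit bus means for raw bandwidth, here’s a back-of-the-envelope sketch. The 18 Gbps GDDR6 data rate is an assumption based on the commonly quoted spec for this card, so adjust it if your figures differ.

```python
# Back-of-the-envelope memory bandwidth for a 128-bit GDDR6 bus.
# The 18 Gbps per-pin data rate is an assumption (the commonly quoted
# spec for this card); change data_rate_gbps if yours differs.

bus_width_bits = 128
data_rate_gbps = 18  # Gbit/s per pin

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(f"Raw memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # ~288 GB/s
```

It’s that relatively modest figure the Infinity Cache is there to paper over, by keeping as many memory accesses on-die as possible.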

The XT does come with faster clock speeds than the non-XT card, even on the standard non-OC models. That said, the PowerColor Hellhound I’m looking at here is a factory OC model and runs up to 2,539MHz game clock and 2,810MHz boost clock, above the reference specification of 2,460MHz and 2,760MHz, respectively.
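Those clocks sound generous, but a quick check using the figures above shows how small the factory overclock’s headroom really is, which is worth keeping in mind when reading the benchmarks below.

```python
# Percentage uplift of the Hellhound's factory OC clocks over AMD's
# reference specification, using the figures quoted above.

clocks_mhz = {
    "game clock":  {"hellhound": 2539, "reference": 2460},
    "boost clock": {"hellhound": 2810, "reference": 2760},
}

for name, c in clocks_mhz.items():
    uplift = 100 * (c["hellhound"] - c["reference"]) / c["reference"]
    print(f"{name}: +{uplift:.1f}% over reference")
# game clock: +3.2%, boost clock: +1.8% -- a small bump on paper
```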

It’s the 16 gigabytes of GDDR6 memory that AMD is hoping to draw in the punters with, however.

That’s a bounty of VRAM compared to other graphics cards on the market at similar prices. Both the $299 (~£300) Nvidia GeForce RTX 4060 and $269 (~£250) AMD Radeon RX 7600 come with just eight gigabytes. The benefit of having more is that there’s little chance of brushing up against the limits of the onboard memory chips and having to resort to much slower system memory far away on your motherboard, far away in latency terms, anyway.

Running out of VRAM on your graphics card can be pretty catastrophic for game performance. However, having access to a greater amount of memory is not necessarily going to net you higher performance in every game, only in those where 8GB of capacity is a problem, which is very few of them, as is clear from the benchmarks laid out below.
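If you want to check for yourself whether a game is actually pressing up against an 8GB buffer, the usual route is an overlay such as the Radeon software’s own or MSI Afterburner. On Linux, the amdgpu driver also exposes raw VRAM counters in sysfs; here’s a minimal sketch, assuming the card sits at card0 (the index may differ on your system).

```python
# Minimal sketch: read VRAM usage from the Linux amdgpu driver's sysfs
# counters. Assumes the GPU is card0, which may differ on your system;
# this interface is Linux/amdgpu-specific.
from pathlib import Path

DEV = Path("/sys/class/drm/card0/device")

used = int((DEV / "mem_info_vram_used").read_text())
total = int((DEV / "mem_info_vram_total").read_text())

print(f"VRAM used: {used / 2**30:.2f} GiB of {total / 2**30:.2f} GiB "
      f"({100 * used / total:.0f}%)")
```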

Test rig

CPU: Intel Core i9 12900K
Motherboard: Asus ROG Strix Z690-F Gaming WiFi
Storage: 2TB Sabrent Rocket 4.0 Plus
Cooler: Asus ROG Ryujin II
PSU: Gigabyte Aorus P1200W
Memory: G.Skill Trident Z5 Neo DDR5-6000 CL30 2x 16GB

Graphics cards:
PowerColor Hellhound Radeon RX 7600 XT 16GB
AMD Radeon RX 7600 (MBA)
Acer Predator BiFrost Arc A770 16GB
MSI Ventus Black 2X RTX 4060

Given the RX 7600 XT’s higher clock speeds, you’d expect some improvement across every resolution tested, and that’s largely true throughout my results. I’m seeing consistent improvements to average and minimum frame rates with the XT, if quite small ones most of the time. That improvement is certainly helped along by the fact I’m testing a reference RX 7600 against a factory overclocked RX 7600 XT.

How much of that small uplift across my benchmarking suite is down to increased memory demand is another question entirely. 

Of the games I’ve benchmarked that report expected memory use, Far Cry 6 and Red Dead Redemption 2, neither claims to max out 8GB even at 4K with the highest graphical presets enabled. That said, whereas RDR2 doesn’t get anywhere close to 8GB, Far Cry 6 is nearly at the limit. That would appear to explain why the 8GB RX 7600 falls apart at 4K in Far Cry 6, managing only a snail’s pace and being easily outperformed by the RX 7600 XT.

The Hellhound lights up the test bench while benchmarking. (Image credit: Future)

But I’m not entirely satisfied with this explanation. That’s due to a thorn in AMD’s side, named Nvidia. The RTX 4060 comes with 8GB and performs almost identically to the RX 7600 XT in Far Cry 6, which flies in the face of the theory. Both Nvidia and AMD stuff a lot of cache into their modern cards, though, and perhaps Nvidia’s approach of vastly increased L2 cache on the RTX 40-series is a better fit for Far Cry 6 than AMD’s Infinity Cache and smaller L2, and that somehow keeps the wolves from the door for Nvidia’s plucky card.

So while I do feel like the huge performance drop off in Far Cry 6 for the RX 7600 is due to its smaller memory buffer, which the RX 7600 XT avoids by having much more memory available, I’m not convinced it’s the be-all and end-all for performance just yet.


In most other games, the RX 7600 XT and RX 7600 perform within a few frames of one another at almost every resolution. The only other exception is Cyberpunk 2077 at 4K. You could suggest, looking at the numbers, that the RX 7600 XT doesn’t collapse like the non-XT card under the sheer weight of Cyberpunk 2077’s ray-traced glory, though this is clutching at straws. All budget graphics cards struggle in the single digits here due to the intense ray tracing used throughout this benchmark. It’s a bloodbath.

Cranking up the resolution to 4K will often result in higher memory demands, and on rare occasions in excess of 8GB. Though I want to try and be realistic here: that’s pretty much the opposite of what this affordable graphics card is intended to be used for.
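To give a rough sense of why resolution pushes memory use up at all, render targets alone scale with pixel count. The sketch below is purely illustrative: the four bytes per pixel and six targets are assumptions, and real engines vary wildly, but the ratio between 1080p and 4K holds.

```python
# Illustrative only: render-target memory scales with pixel count.
# The 4 bytes per pixel and six targets are assumptions; real engines
# use many more buffers of varying formats.

def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    per_target = target_mb(w, h)
    print(f"{name}: {per_target:.1f} MB per target, "
          f"~{per_target * 6:.0f} MB across six targets")
# 4K carries four times the per-target cost of 1080p, before textures.
```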


Buy if…

✅ You must have 16GB: If you’re desperate to ditch 8GB, you could maybe justify picking up the RX 7600 XT. Though don’t sleep on the Intel Arc A770 16GB, either. Or the RX 6700 XT!

Don’t buy if…

❌ You want the best value graphics card: You could save some cash and net equal or better performance with an RTX 4060, or you could spend the same on last gen’s RX 6700 XT for more frames and still plenty of memory.

❌ You’re not bothered by 8GB: It might not be the perfect amount of VRAM for the future, but right now 8GB suits most games just fine. For a more affordable card such as this, it’ll likely be passable for a while yet.

For the most part, this RX 7600 XT is a 1080p card, and a good one at that. It’ll usually crank out over 60 frames per second in the latest games at the highest settings, and you can use AMD’s wonderful FidelityFX Super Resolution and Frame Generation to pump those numbers up even further in supported titles. It’s also reasonable to expect solid 1440p performance, depending on the game. But a 4K-capable card, it is not.

I do realise that the number of games demanding greater memory capacity is only set to rise in the coming years. We’ve already seen The Last of Us Part 1 arrive with high VRAM demands, though that’s largely been chalked up to a poor PC port rather than an intrinsic limitation of the game’s engine. I’m sure there will be games that require pulling back some graphics settings to run well on 8GB cards at some point in the future, but my argument today is that this card costs too much to justify taking a punt on 16GB when it comes attached to a fairly small GPU.

What’s more is the price of this specific model: the PowerColor Hellhound. It’s a $350 card, and while it runs quietly and maintains low temperatures under load, the increased price makes the RX 7600 XT even tougher to justify. For that money, you could consider the wildcard option: the frequently discounted RX 6700 XT. With 12GB of memory and performance consistently ahead of the RX 7600 XT, for around the $340 mark today, it’s a solid buy while stocks last.

Personally, I’d stick with AMD’s last generation card, or failing that, the RTX 4060, which can be found under $300, for any affordable PC build I was planning today. If you really must have access to 16GB on a budget, say for some AI experimentation you’re doing, an Intel Arc A770 16GB will cost you less than AMD’s card and occasionally outperform it. Intel’s Arc card does bounce around in benchmarking more than AMD or Nvidia’s alternatives, but it’s still a solid performer and often going cheap. 
