Colstan
Site Champ · Joined Nov 9, 2021 · Posts: 822
As Nvidia and AMD continue to roll out their latest GPU offerings, another phenomenon appears to be developing in PC gaming, one which may have significant implications for anyone shopping for a new graphics card.
Hardware Unboxed recently released a video revisiting the RX 6800 compared to the RTX 3070. Their results were shocking, and it comes down to VRAM. Modern games, released *this* year, are hitting memory limits. The 16GB on the 6800 lets it outclass the 8GB 3070, even in ray tracing, which isn't AMD's strongest suit, particularly on RDNA 2. I timestamped the most interesting benchmark, from the recently released "Hogwarts Legacy":
What is most striking is how textures keep popping in and out of existence, because the 3070 doesn't have enough VRAM to keep them resident. This isn't just one unoptimized game; it's happening across multiple titles where 8GB simply isn't cutting it. Even though Nvidia has an edge in ray tracing, the memory pressure keeps this card from keeping up with the 6800. It's not just lower performance; the visuals themselves suffer, which isn't obvious until you're actually playing a game rather than just running a benchmark.
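To be clear about what "popping" means mechanically: engines stream textures against a VRAM budget, and when the resident set overflows that budget, high-resolution mips get evicted and you see the blurry fallback until they stream back in. Here's a rough sketch of that idea in C++; it's purely illustrative, not how Hogwarts Legacy or any particular engine is actually written, and all the names are made up:

```cpp
// Rough sketch of a texture-streaming budget, to illustrate why an 8GB card
// starts "popping" textures. Illustrative only -- not any real engine's code.
#include <algorithm>
#include <cstdint>
#include <vector>

struct StreamedTexture {
    uint64_t highResBytes;            // full mip chain
    uint64_t lowResBytes;             // low-res fallback that always stays resident
    float    priority;                // e.g. screen coverage / distance to camera
    bool     residentHighRes = false;
};

// Keep the highest-priority textures at full resolution until the budget runs out.
void UpdateResidency(std::vector<StreamedTexture>& textures, uint64_t vramBudgetBytes)
{
    std::sort(textures.begin(), textures.end(),
              [](const StreamedTexture& a, const StreamedTexture& b) {
                  return a.priority > b.priority;
              });

    uint64_t used = 0;
    for (auto& t : textures) {
        used += t.lowResBytes;                           // everyone gets the fallback
        const uint64_t upgradeCost = t.highResBytes - t.lowResBytes;
        if (used + upgradeCost <= vramBudgetBytes) {
            used += upgradeCost;
            t.residentHighRes = true;                    // sharp texture
        } else {
            t.residentHighRes = false;                   // this is the blurry "pop"
        }
    }
}
```

With a 16GB budget, nearly everything lands in the "sharp" bucket; with 8GB, more and more textures fall into the fallback path, which is exactly the behavior the Hardware Unboxed footage shows.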
This is further backed up by the latest episode of "Broken Silicon", Tom's podcast at Moore's Law is Dead, whose guest is a VFX artist at developer Infinity Ward.
The short version, from this Call of Duty VFX artist, is that he considers these to be the appropriate amounts of VRAM for each product segment:
Entry-level: 12GB
Mid-range: 20GB
High-end: 32GB
This is where he expects PC game requirements to be headed over the next few years, and he argues that consumers shouldn't have to buy a new graphics card every generation. Obviously, the current market doesn't reflect this.
Right now, I have an RX 580 eGPU hooked up to my 2018 Intel Mac mini. Despite being an ancient GPU, it has 8GB of VRAM, as most 580s have since the card was introduced in early 2017. For a long time it seemed like few games took advantage of anything more, but that time is over. Anyone with an 8GB card is looking at significantly reduced settings and 1080p for many titles being released this year.
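As an aside, if you're on Windows and curious what your own card actually reports, DXGI will list the dedicated VRAM per adapter. A bare-bones sketch follows (error handling trimmed; on my Mac I just check the eGPU in System Information instead):

```cpp
// Minimal sketch: list each GPU and the dedicated VRAM the driver reports.
// Windows-only (DXGI); build with MSVC, links dxgi.lib via the pragma below.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    Microsoft::WRL::ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // DedicatedVideoMemory is the adapter's reported VRAM, in bytes.
        wprintf(L"%s: %.1f GB dedicated VRAM\n", desc.Description,
                desc.DedicatedVideoMemory / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

Note this shows the card's total VRAM, not what a given game is using; for per-game usage, something like Task Manager's dedicated GPU memory graph or an overlay tool is the easier way to watch it.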
For anyone considering an upcoming graphics card purchase, this should be part of the equation. Nvidia has been particularly stingy with VRAM. In the current generation, you have to move up to the RTX 4080, currently $1,150 USD on Newegg as of this writing, to get 16GB from an Nvidia card. There are far more affordable options from AMD and Intel that feature 16GB of VRAM. Whether Nvidia is doing this intentionally to make its cards obsolete after a few years is an open question; they clearly don't want to spend the extra $30 to up the VRAM.
So, we're finally getting to the point where VRAM is impacting in-game quality, and with 12GB Nvidia cards starting at $600, perhaps gamers should look at alternatives, if they don't want to dial down settings on their expensive new GPU.