PC 16GB of VRAM: The new normal.

Colstan

Site Champ
As Nvidia and AMD continue to roll out their latest GPU offerings, another phenomenon appears to be developing in PC gaming, one which may have significant implications for anyone shopping for a new graphics card.

Hardware Unboxed recently released a video revisiting the RX 6800 versus the RTX 3070. Their results were shocking, and it comes down to VRAM. Modern games, released *this* year, are hitting memory barriers. The 16GB on the 6800 allows it to outclass the 8GB 3070, even in ray tracing, which isn't AMD's strongest feature, particularly on RDNA 2. I timestamped the most interesting benchmark, from the recently released "Hogwarts Legacy":



What is most striking is how textures keep popping in and out of existence, because the 3070 doesn't have enough VRAM to keep them resident. This isn't just a single unoptimized game; it's happening across multiple titles where 8GB simply isn't cutting it. Even though Nvidia has an edge in ray tracing, the memory demands don't allow this card to keep up with the 6800. It's not just performance that suffers, but visuals too, something that isn't obvious until you're actually playing a game rather than simply running a benchmark.
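To be clear about what's going on mechanically: texture streamers work against a fixed VRAM budget, and once that budget is blown they start evicting mips from whatever they consider least important, which is exactly the pop-in and blurring in the video. Here's a toy sketch of that behavior in Python; it's not code from any real engine, and the numbers (a ~6.5GB texture budget on an 8GB card, ~22MB per 4K texture, 400 unique materials) are just assumptions for illustration:

```python
# Toy illustration (not real engine code): why an 8GB card shows texture pop-in.
# When the streaming budget is exhausted, the streamer evicts mips from the
# least-recently-used textures, so they visibly degrade or vanish on screen.

from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> MB of mips currently resident

    def request(self, name, full_size_mb):
        """A visible material asks for this texture at full resolution."""
        self.resident.pop(name, None)        # re-insert so it becomes most recently used
        self.resident[name] = full_size_mb
        self._evict_until_within_budget()

    def _evict_until_within_budget(self):
        while sum(self.resident.values()) > self.budget_mb:
            # Sacrifice the least-recently-used texture: drop its top mip level,
            # which holds ~75% of the chain's memory, or unload it entirely.
            victim, size = next(iter(self.resident.items()))
            if size <= 1:
                self.resident.pop(victim)            # gone until re-requested -> pop-in
            else:
                self.resident[victim] = size * 0.25  # stays on screen, but blurry

# Hypothetical numbers: ~6.5GB of an 8GB card left for textures, a heavy scene
# with 400 unique 4K materials at ~22MB each (more than the budget can hold).
streamer = TextureStreamer(budget_mb=6500)
for i in range(400):
    streamer.request(f"material_{i}", full_size_mb=22)

degraded = sum(1 for i in range(400)
               if streamer.resident.get(f"material_{i}", 0) < 22)
print(f"{degraded} of 400 materials are missing or stuck at reduced resolution")
```

With a 16GB card the same scene fits entirely within the budget, so the eviction path never runs, which is the difference the benchmarks are showing.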

This is further backed up by the latest episode of "Broken Silicon", Tom's podcast on the Moore's Law is Dead channel, whose guest is a VFX artist at developer Infinity Ward.



The short version, from this Call of Duty VFX artist, is that he considers these to be the appropriate amounts of VRAM for each product segment:

Entry-level: 12GB
Mid-range: 20GB
High-end: 32GB

This is where he expects PC game requirements to head over the next few years, and he argues that consumers shouldn't have to buy a new graphics card every generation. Obviously, the current market doesn't reflect this.

Right now, I have an RX 580 eGPU hooked up to my 2018 Intel Mac mini. Despite being an ancient GPU, it has 8GB of VRAM, as most 580s have had since the card was introduced in early 2017. For a long time, it seemed like few games took advantage of anything more, but that time is over. Anyone with an 8GB card is looking at significantly reduced settings and running at 1080p for many titles being released this year.

For anyone considering an upcoming graphics card purchase, this should be part of the equation. Nvidia has been particularly stingy with VRAM. In the current generation, you have to move up to the RTX 4080, which is $1,150 USD on Newegg as of this writing, to get 16GB from an Nvidia card. There are far more affordable options from AMD and Intel that feature 16GB of VRAM. Whether Nvidia is doing this intentionally, to make their cards obsolete after a few years, is an open question. They clearly don't want to spend the extra $30 to up the VRAM.

So, we're finally getting to the point where VRAM is impacting in-game quality, and with 12GB Nvidia cards starting at $600, perhaps gamers should look at alternatives, if they don't want to dial down settings on their expensive new GPU.
 

diamond.g

Power User

A lot of the blame can be laid at the feet of the PS4/X1 lasting as long as they did. Now "next-gen" games are targeting the PS5/XSX, which have around 10GB usable as VRAM (last gen it was more like 6.5GB).
 

Colstan

Site Champ
A lot of the blame can be laid at the feet of the PS4/X1 lasting as long as they did. Now "next-gen" games are targeting PS5/XSX where they have 10GB useable for vram (last gen it was like 6.5GB).
Tech Yes City has done a video on the subject. Bryan gets the same results as Hardware Unboxed.



The short version is that 12GB is the bare minimum for the latest PC games being released right now, unless you're willing to turn settings down to 1080p Low and have the game look like mud. What a lot of folks don't seem to realize is that 8GB, at any resolution or settings, isn't enough to hold the assets for some titles. That's not just textures, but also geometry, physics data, etc.
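To put some rough numbers on that (my own back-of-envelope math, not figures from either video): with standard BC7 block compression a 4K texture costs about 16MiB for the base level and roughly 21MiB with its full mip chain, and a typical PBR material uses several such textures. The sketch below just does the multiplication, assuming a couple of gigabytes of an 8GB card are already spoken for by render targets, geometry, and ray tracing structures:

```python
# Back-of-envelope VRAM math (my assumptions, not from the videos).
# BC7 block compression stores 1 byte per texel, and a full mip chain
# adds roughly a third on top of the base level.

MIB = 1024 * 1024

def texture_mib(width, height, bytes_per_texel=1.0, with_mips=True):
    size = width * height * bytes_per_texel
    return size * 4 / 3 / MIB if with_mips else size / MIB

one_4k = texture_mib(4096, 4096)   # ~21.3 MiB
material = 3 * one_4k              # e.g. albedo + normal + packed roughness/metal/AO
print(f"one 4K texture: {one_4k:.1f} MiB, one 4K material: {material:.1f} MiB")

# Assume ~2GiB of an 8GiB card goes to render targets, geometry, BVHs, etc.,
# leaving ~6GiB for textures:
texture_budget = 6 * 1024
print(f"unique 4K materials that fit at full resolution: {texture_budget / material:.0f}")
```

Under those assumptions, fewer than a hundred unique full-resolution 4K materials fit on an 8GB card, which is why asset-heavy scenes force the engine to degrade things, while a 16GB card roughly doubles that headroom.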

Unless a user is willing to upgrade every two years, it's insanity to purchase anything less than a 16GB card. That's partially why the old AMD 6000-series cards are outselling the 4070. This does limit options among current-gen cards: a customer who refuses to look at anything other than Nvidia is stuck with a $1,150 USD 4080 at minimum, while everyone else is looking at $800 for a 7900 XT at least, or waiting for the mid-range AMD cards. I think AMD could score a victory if the 7800 XT and 7700 XT come with 16GB and are somewhat reasonably priced. Nvidia handed that to them on a platter, but AMD never seems to miss an opportunity to shoot themselves in the foot.
 