RTX 4070 Super launch day sales are rumored to be a ‘disaster’ – what’s going on with Nvidia’s new GPU?

Hypx@kbin.social to Technology@lemmy.world – 299 points –
techradar.com

Do PC gamers feel 12 GB of VRAM is simply not enough for the money in 2024?



$600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.

12 GB of VRAM is not a bottleneck in any current game on reasonable settings. There is no playable game/settings combination where the 7800 XT's 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of future potential bottlenecks? Maybe, but I wouldn't be so sure.

The 4070 Super offers significantly better ray tracing performance, much lower power consumption, better upscaling (and frame generation) technology, better streaming/encoding support, and even slightly better rasterization performance than the 7800 XT. Are these things worth giving up for €100 less and 4 GB more VRAM? For most people they aren't.

AMD's offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, doesn't even use AMD GPUs. Hell, LTT even made a series of videos about how they had to "suffer" using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

I have an AMD RX 580 and have bought and recommended AMD GPUs to people since the 9500/9700 Pro series, but my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is that Nvidia can make a better GPU (as the 4090 shows) but chooses not to, while AMD literally can't make better GPUs and chooses to only price its GPUs "competitively" instead of offering something better. Both companies suck.

D4 on Linux. Literally the only bottleneck is that it eats 11 GB of my 1080 Ti's VRAM for breakfast and then still wants lunch and dinner. It plays at 4K on high with perfect fps otherwise, then starts glitching like crazy once VRAM is exhausted after 10-15 minutes.

Zero issues on a 20 GB card. I understand that shitty code in a single game is not exactly a universal example, but it is a valid reason to want more VRAM.
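If you want to check that the glitching really lines up with VRAM running out, you can just watch the usage while playing. A minimal sketch, assuming an Nvidia card at index 0 and the nvidia-ml-py (pynvml) package; the 90% warning threshold is an arbitrary illustration, not anything the game defines:

import time
import pynvml

# Poll GPU memory usage via NVML and flag when it gets close to full.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used/.total are in bytes
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        flag = "  <-- near VRAM limit" if mem.used > 0.9 * mem.total else ""
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GiB{flag}")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

The same numbers show up in nvidia-smi, so a terminal running it next to the game works just as well.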

Is this indicative of future potential bottlenecks? Maybe, but I wouldn't be so sure.

This is exactly what I expect. I have seen what happened to my friends with their GTX 970s when 3.5 GB of VRAM wasn't enough anymore: even though the cards were still rasterizing quickly enough, they weren't useful for certain games anymore. That's why I now make sure to get enough VRAM to extend the useful service life of my cards.

And I'm not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1 GB, then got my friend's HD 5870, also with 1 GB (I don't remember if I used it in CrossFire or just as a replacement). Then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran those in CrossFire, until I bought a new R9 380 with 4 GB when a game that was important to me at the time couldn't deal with CrossFire well. After that I bought a used RX 580 with 8 GB, and finally the RX 6800 with 16 GB two years ago.

At some point I also bought a used GTX 960 because we were doing some CUDA stuff at university, but that was pretty late, when it wasn't current anymore, and it was only used in my Linux server.
