Why even push for more realistic graphics anymore?

Doods@infosec.pub to Gaming@beehaw.org – 272 points

I am probably unqualified to speak about this, as I am using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of that time, as anyone who has looked at an older game can confirm - I am a person who has fun making fun of weird-looking 3D people.

But I feel games' graphics have reached the point of diminishing returns. Today's AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on electricity-consuming, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn't need anything more powerful than a 1080 Ti for years. I think game studios should just slow down their graphical improvements, as they are unnecessary - in my opinion - and just prevent people with lower-end systems from enjoying games. And who knows, maybe we will start seeing 50 watt gaming GPUs being viable and capable of running games at medium/high settings, going for cheap - even iGPUs render good graphics now.

TLDR: why pay for more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is not a big difference between the two pictures - Tomb Raider 2013 vs. Shadow of the Tomb Raider 2018 - but can you really call either of them bad, especially the right picture (5 years old)?

Note 2: this is not much more than a discussion starter that is unlikely to evolve into something larger.


Is it diminishing returns? Yes, of course.

Is it taxing on your GPU? Absolutely.

But, consider Control.

Control is a game made by the people who made Alan Wake. It's a fun AAA title that is better than it has any right to be. Packed with content. No microtransactions. It has it all. The only reason it's as good as it is? Nvidia paid them a shitload of money to put raytracing in their game to advertise the new (at the time) 20 series cards. Control made money before it even released thanks to GPU manufacturers.

Would the game be as good if it didn't have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn't be as good if Nvidia hadn't paid them, and that means raytracing has to be included.

A lot of these big-budget AAA "photorealism" games for PC are funded, at least partially, by Nvidia or AMD. They're the games you'll get for free if you buy their new GPU that month. Consoles are the same way. Did Bloodborne need to have shiny blood effects? Did Spider-Man need to look better than real-life New York? No, but these games are made to sell hardware, and the tradeoff is that the games don't have to make piles of money (even if some choose to include mtx anyway).

Until GPU manufacturers can find something else to strive for, I think we'll be seeing these incremental increases in graphical fidelity, to our benefit.

This is part of the problem, not a justification. You are saying that companies such as Nvidia have so much power/money that the whole industry must spend useless effort on making more demanding games just to keep their products relevant.

Right? "Vast wealth built on various forms of harm is good actually because sometimes rich people fund neat things that I like!" Yeah sure, tell that to somebody who just lost their house to one of the many climate-related disasters lately.

I'm actually disgusted that "But look, a shiny! The rich are good actually! Some stupid 'environment' isn't shiny cool like a videogame!" has over fifty upvotes to my one downvote. I can't even scrape together enough sarcasm at the moment to bite at them with. Just... gross. Depressing. Ugh.

Maybe it’s not what you’re saying but how you’re saying it.

Yeah, it's too bad that that's always true for every game and not just the ~30 AAA titles a year.

Too bad Dave the Diver, Dredge, and Silksong will never get made 😔

So the advantage is that it helps create more planned obsolescence and make sure there will be no one to play the games in 100 years?

Is that a real question? Like, what are we even doing here?

The advantage is that game companies are paid by hardware companies to push the boundaries of gamemaking, an art form that many creators enjoy working in and many humans enjoy consuming.

"It's ultimately creating more junk so it's bad" what an absolutely braindead observation. You're gonna log on to a website that's bad for the environment from your phone or tablet or computer that's bad for the environment and talk about how computer hardware is bad for the environment? Are you using gray water to flush your toilet? Are you keeping your showers to 2 minutes, unheated, and using egg whites instead of shampoo? Are you eating only locally grown foods because the real Earth killer is our trillion dollar shopping industry? Hope you don't watch TV or go to movies or have any fun at all while Taylor Swift rents her jet to Elon Musk's 8th kid.

Hey, buddy, Earth is probably over unless we start making some violent changes 30 years ago. Why would you come to a discussion on graphical fidelity to peddle doomer garbage? Get a grip.

Would the game be as good if it didn’t have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn’t be as good if Nvidia hadn’t paid them, and that means raytracing has to be included.

To add another point to this: Control got added interest because its graphics were so good. Part of this was Nvidia providing marketing money that Remedy didn't have before, but I think the graphics themselves helped this game break through to the mainstream in a way that their previous games did not. Trailers came out with these incredible graphics, and critics and laygamers alike said "okay, I have to check this game out when it releases." Now, that added interest would mean nothing if the game wasn't also a great game beyond the initial impressions, but that was never a problem for Remedy.

For a more recent example, see Baldur's Gate 3. Larian plugged away at the Divinity: OS series for years, and they were well-regarded, but I wouldn't say they quite hit "mainstream". Cue* BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics. The actual gameplay is not dramatically different from the Divinity games, but the added graphics made people go "this is a Mass Effect" and suddenly this is the biggest game in the world.

We are definitely at a point of diminishing returns with graphics, but it cannot be denied that high-end, expensive graphics drive interest in new game releases, even if those graphics are not cutting-edge.

This comment is 100% on-point, I'm just here to address a pet peeve:

Queue = a line that you wait in, like waiting to check out at the store

Cue = something that sets off something else, e.g. "when Jimmy says his line, that's your cue to enter stage left"

So when you said:

Queue BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics.

What you meant was:

Cue BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics.

"We are definitely at a point of diminishing returns with graphics,"

WTF. How can you look at current NeRF developments and honestly say that? This has never been further from the truth in the history of real-time graphics. When have we ever been so close to unsupervised, ACTUALLY photo-realistic (in the strict sense) graphics?

I played Control on a 10 year old PC. I'm starting to think I missed out on something

I played it on PS5 and immediately went for the higher frame rate option instead.

I think Ghostwire Tokyo was a much better use of RT than Control.

Also found Ratchet and Clank to be surprisingly good for 30fps. I can't put my finger on exactly what was missing from the 60fps non RT version, but it definitely felt lesser somehow.

Until GPU manufacturers can find something else to strive for

Machine Learning / AI has been a MASSIVE driver for Nvidia, for quite some time now. And bitcoin....

Shadow can definitely look a lot better than this picture suggests.

The biggest advancements in game graphics have not occurred in characters, except for perhaps in terms of animation and subsurface scattering tech.

The main character always gets a disproportionate graphical resource allocation, and we achieved "really damn good" in that category a while ago.

Adam Jensen didn't look that much better in Mankind Divided than he did in Human Revolution, but Prague IS SO MUCH MORE DETAILED than Detroit was.

Then there are efficiency improvements in rendering brought by systems like Nanite, material shader improvements, more detailed lighting systems, and more efficient ambient occlusion.

Improvements in inverse kinematics are something I'm really excited about, as well.

Sounds like I need to try Mankind Divided. Ultra-detailed Prague as a level sounds badass.

It is. Adam works for a secret underground Interpol base in the middle of the city. There are abusive secret societies to dismantle, murder cases to solve, drug rings to bust, corrupt cops to beat up. Mankind Divided is a prime example of a medium-sized but super-detailed hub world being just as good as, if not better than, a huge one full of nothing.

I can not elaborate on that as I am unqualified - remember, I have never played newer titles.

My main point is that a headshot of the main character is not a good yardstick. The mc is always going to be rendered with enough oomph to look good, no matter the settings or game generation.

The difference in recent years has been in environment detail, material shading, and lighting - things you maybe can't even enable due to playing on older hardware.

While I agree ray tracing is a total energy hog, that's not the only area seeing advancement. Rendering pipelines like Nanite enable better graphics AND less power consumption.

Three thoughts:

  1. I wonder if you would still have this take if you played a newer, high quality AAA game on a high end setup. I don’t mean to imply that your mind will definitely be blown — really don’t know — but it would be interesting to see what doing so would do to your opinion.

  2. Gaming is about entertainment. There is no denying that better/bigger/smoother/more immersive tends to add to the entertainment. So devs push those boundaries both for marketing reasons and because they want to push the limits. I have a hard time seeing a world in which gaming development as a whole says “hey, we could keep pushing the limits, but it would be more environmentally friendly and cheaper for our customers if we all just stopped advancing game performance.”

  3. There are SO MANY smaller studios and indie devs making amazing games that can run smoothly on 5/10/15 year old hardware. And there is a huge number of older games that are still a blast to play.

Another point in favour of new graphics tech: you mentioned you're worried about artists needing to do more work. As someone who has done 3D work, I can tell you that it's actually easier to make something photo-real. The hard part is making it look good within the limitations of a game engine - how to get something that looks just as good with simpler material shaders and fewer polygons.

Tech like Nanite actually eliminates the need for all that work. You can give the game engine the full-quality asset, and it handles all the difficult stuff to render it efficiently. This is why we are now seeing games that look as good as Unrecord coming from tiny new studios like DRAMA.

The thought that today's state of technology is enough and we should stop improving sounds pretty Amish to me.

The word you want is Luddite

The Luddites - the original ones - were pretty rad. They were anti-tech for anti-capitalist reasons.

I agree that Luddite is the more correct term since it's more general now, but I hate that the term got warped over time to mean anyone that hates any new tech

I quite like what Cory Doctorow has to say about it. (Author of Little Brother, coiner of the term "enshittification" (and much much more obviously))

Article

Podcast version

I didn't mean we should stop improving, what I meant is we should focus more on the efficiency and less on the raw power.

I think game and engine developers should do both. If it's possible to improve efficiency and performance it should be done. But at the same time hardware is improving as well and that performance gain should be used.

I'm kinda worried a little bit about the recent development in hardware, though. At the moment GPU power mostly increases with energy consumption and only a little with improved architecture. That was different some years ago. But in my eyes that's a problem the hardware manufacturers have, not the game developers.

Performance is always about doing as much as possible with as little as possible. Making a game run faster automatically makes it more efficient, because the only way it can run faster is by doing less work. It's just that whenever it can run faster, the game has more room for other things.

Its resource consumption and graphics output are directly linked. If you gain more efficiency, that gives you more headroom to either reduce resource consumption or increase the graphics output. But you can't maximize both. You have to decide what to do with the resources you have. Use them, or not use them. You can't do both at the same time.

To kinda reiterate the point in a different light, I'd like more games with cel shading, which can look somewhat timeless if done right. I think Risk of Rain 2 is cel-shaded, and it looks fantastic while also not being the centerpiece of the game.

I understand the sentiment, but it seems like you're drawing arbitrary lines in the sand for what is the "correct" amount of power for gaming. Why waste 50 watts of GPU (or more like 150 total system watts) on a game when something like a Steam Deck will draw 15 watts to do almost the same thing? Ten times less power for definitely not ten times less fidelity. We could go all the way back to the original Game Boy for 0.7 watts - the fidelity drops, but so does the power. What is the "correct" wattage?

I agree that the top-end GPUs are shit at efficiency and we could cut back. But I don't agree that fidelity and realism should stop advancing. Some type of efficiency requirement would be nice, but every year games should get more advanced and every year GPUs should get better (and hopefully stay efficient).

I agree that the top-end GPUs are shit at efficiency and we could cut back.

According to the Steam survey, the 4090, 3090, 6900 XT, and 7900 XTX combined are being used by about 1.7% of gamers.

This number is, of course, inflated (at least slightly) because people who have money to buy these cards are also more likely to buy games and people owning older/cheaper cards are more likely to be playing pirated copies.

The top-tier cards are a showcase of technological advancement. They are not really used by a large number of people, so there's not much point in cutting them. It would only reduce the baseline for the next generation, leading to less advancement.

That's a very good point, but a little misleading. A better number would be to add up all the top-tier cards from every generation, not just the past two. Just because they're old doesn't mean they aren't still relatively inefficient for their generation.

If we kept the generations exactly the same but got rid of the top one or two cards, technological advancement would be happening just as fast. Because really, the top-tier cards are about the silicon lottery and putting in as much power as possible while keeping stable clocks. They aren't different from an architecture perspective within the same generation. It's about being able to sell the best silicon and more VRAM at a premium.

But as you said, it's still a drop in the bucket compared to the overall market.

I agree that I shouldn't have set the arbitrary 50 watt thing; I just saw my GPU and bigger ones and came up with that number.

Yes, it basically boils down to diminishing returns; it also eats up all the good potential for creative graphic design.

I don't think higher graphics requirements hurt creativity - you can have an unrealistic-looking game that is very GPU-intensive. I was mainly concerned about the costs and wasted money/effort.

But lowering the graphics budget - and the budget in general - can make creativity/risk-taking a more appealing option for AAA studios.

Edit : I just noticed both sentences kind of contradict each other but you get the point.

Because it impresses people and so it sells. If they didn't do that, all those EAs and Ubisofts would have to find a new selling point like making their games good or something.

I like seeing advances in graphics technology, but if the cost is a 10-year dev cycle and the game still comes out s-s-s-stuttering on high-end PCs and current-gen consoles, then scale back some.

I think we hit a point where it's just not feasible enough to do it anymore.

"I want shorter games with worse graphics made by people who are paid more to work less and I'm not kidding"

EDIT: Never mind, I thought that was a sarcastic comment mocking the other user.


And what's wrong with that, exactly? Would you prefer broken games made by underpaid and overworked people?

As for "worse graphics", AC: Unity came out in 2014, The Witcher 3 came out in 2015, and the Arkham Knight is also from 2015. All of those have technically worse graphics, but they don't look much different from modern games that need much beefier systems to run.

And here's AC: Unity compared to a much more modern game.

I'm pretty sure that's in support of the concept.

Ah, my bad. I didn't even realize it was a known quote, I just thought it was a sarcastic reply making fun of the other user.

It's from a tweet. It's earnest. You can google the quote to get more context.

You picked the absolute best examples of their respective years while picking the absolute worst example of the current year; that makes the comparison a bit partial, doesn't it? Why not compare them to Final Fantasy XVI or one of the remakes like Dead Space or Resident Evil 4? Or pick the worst example of previous years, like Rambo: The Video Game (2014)?

While good graphics don't make a good game, better hardware allows devs to spend less time making better graphics. Two of the three examples you gave have static lighting (ACU and BAK), while the bad example you gave has dynamic lighting. Baking static lighting into the map is a hugely time-consuming part of making a game; I can assure you from my second-hand experience that at least one of those two games had to compromise gameplay because they couldn't change the map after the light baking was done.

And I'm just scratching the surface of the things that are time-consuming when making good graphics like the games you mentioned. As an example, you have the infamous "squeezing through the gap" cutscene that a lot of AAA games of the last generation had, because it allowed the game to load the next area. That was time wasted on choosing the best moments to do it, recording the scenes, scripting, testing, etc. All because the hardware was a limiting factor. Now that consumers have better hardware, that isn't a problem anymore - but consumers had to upgrade to allow it. The same was true for a lot of other techniques like tessellation, GPU particles, etc. Consumers all had to upgrade to allow devs to make the game prettier at less cost. And it will also be true of ray tracing and Nanite: both cut a LOT of dev time while making the game prettier, but require consumers to upgrade their hardware.

Graphics are not all about looks; they are also about facilitating dev time, which makes the worst-looking graphics look better. If Rambo The Video Game (2014) was made with the tech of today, it would look much better while costing the devs the same amount of time. Please don't take my comment as a critique - I'm just trying to make you understand that not everything is black and white, especially on something as complex as AAA development.

EDIT: I guess the absolute worst example of the current year would be Gollum, not Forspoken.

If Rambo The Video Game (2014) was made with the tech of today, it would look much better while costing the devs the same amount of time.

I don't think this is quite correct. A while back devs were talking about a AAApocalypse. Basically, as budgets keep on growing, having a game make its money back is exceedingly hard. This is why today's games need all sorts of monetisation, are always sequels, have low-risk game mechanics, and ship in half-broken states. Even setting aside the industry basically abandoning novel game engines to focus on Unreal (which is also a bad thing for other reasons), game production times are increasing, and the reason is that while some of the time is amortised, the greater graphical fidelity makes the lower-fidelity work stand out. I believe an "indie" or even AA game could look better today for the same amount of effort than it could 10 years ago, but not a AAA game.

For example, you could not build Baldur's Gate 3 in Unreal. This is an unhealthy state for the industry to be in.

Yeah, I agree with everything you said. But what I was trying to say is that it is not the whole graphics push that is hurting production; I believe that in this generation alone we have many new graphics techniques that aim to improve image quality while taking load off the devs. Just look at Remnant II, which has the graphical fidelity of a AAA game on the budget of a AA one. Also, some of the increase in production time is due to the feature creep that a lot of games have. Every new game has to have millions of fetch quests, a colossal open-world map, skill trees, an online mode, crafting, a looting system, etc., even if it makes no sense for the game to have them. Almost every single game mentioned in this thread suffers from this, with Batman being notorious for its Riddler trophies, The Witcher having more question marks on the map than an actual map, and Assassin's Creed... well, do I even need to mention it? So the production time increase is not all the fault of the increase in graphical fidelity.

I've been honestly blown away with how newer games look since I upgraded my graphics card.

Remnant 2 is not even a AAA game but does such a good job with light and reflections that it looks better than anything released 5+ years ago.

Then you have games like Kena: Bridge of Spirits, which have a very nice art style but take advantage of current hardware to add particles everywhere.

I already wrote another comment on this, but to sum up my thoughts in a top comment:

Most (major) games nowadays don’t look worlds better than the Witcher 3 (2015), but they still play the same as games from 2015 (or older), while requiring much better hardware with high levels of energy consumption. And I think it doesn't help that something like an RTX 4060 card (power consumption of a 1660 with performance slightly above a 3060) gets trashed for not providing a large increase in performance.

It's not so much the card itself, it's the price and the false marketing around it (two versions to trick the average buyer, one with literally double the memory). Also, it's a downgrade from the previous-gen, now cheaper 3070. It's corporate greed with purposeful misleading. If the card were €100 cheaper, it would actually be really good. I think that is the consensus among reviewers like GN, but don't quote a random dude on the Internet ahah

It's less the fact that there is a version with double the memory and more the fact that the one with less memory has a narrower memory bus than the previous generation, resulting in worse performance than the previous-generation card in certain scenarios.

I don't know about the 2 versions, but the 3070 bit is part of what I mean.

Price has been an issue with all hardware recently - even with other things, due to inflation in the last few years - and it's not exclusive to the 4060. But more importantly, from what I can tell, the 3070 has a 1.2x to 1.4x increase in performance in games, but it consumes about 1.75x the power (rough numbers, I'm kinda busy rn). Because I don't have much time right now I can't look at prices, but when you consider the massive difference in consumption, the price difference might start making more sense and only seem ridiculous if you just focus on raw power.

still play the same as games from 2015

I wish they played more like games from the late 90's, early 2000's, instead of stripping out a lot of depth in favor of visuals. Back then, I expected games to get more complex and look better. Instead, they've looked better, but played worse each passing year.

I remember seeing an article somewhere about this. Effectively, there are really bad diminishing returns with these game graphics. You could triple the detail, but there's only so much that can fit on the screen, or in your eyes.

And at the same time, they're bloating many of these AAA games sizes with all manner of garbage too, while simultaneously cutting the corners of what is actually good about them.

There's definitely something to be said about the proper use of texture quality, instead of relying on VRAM to push detail. For games that go for realism, I think it's interesting to look at games like Battlefield 1 - which even today looks incredible despite very clearly having low-quality textures. Makes sense - the game is meant to be you running around and periodically stopping, so the dirt doesn't need to be much more than some pixelated blocks. On the other hand, even just the ground in Baldur's Gate 3 looks as polished as the rest of Battlefield 1's visual appeal.

Both these games are examples of polish put in the right places (in regards to visual aesthetics) and seem to benefit from it greatly, without a high barrier for displaying it. Meanwhile, still visually compelling games like 2077 or RDR2 do look great overall but take so many more resources to push those visuals. Granted, there are other factors at play, like genre, which of course dictates other measures taken to maintain the ratio of performance and fidelity - and both these games are much larger in scope.

I think in some cases there's a lot of merit to it - for example Red Dead Redemption. Both games are pretty graphically intensive (if not cutting-edge), but it's used to further the immersion of the game in a meaningful way. Red Dead Redemption 2 really sells this rich natural environment for you to explore and interact with, and it wouldn't quite be the same game without it.

Also that example of Tomb Raider is really disingenuous, the level of fidelity in the environments is night and day between the two as well as the quality of animation. In your example the only real thing you can tell is the skin shaders, which are not even close between the two, SotTR really sells that you are looking at real people, something the 2013 game approached but never really achieved IMO.

If you don't care, then good for you! My wallet wishes I didn't, but it's a fun hobby nonetheless to try and push things to their limits, and I am personally fascinated by the technology. I always have some of the fastest hardware every other generation and I enjoy playing with it and doing stuff to make it all work as well as possible.

You are probably correct in thinking that, for the average person, we are approaching a point where they just really don't care. I just wish they would push for more clarity in image presentation at this point; modern games are a bit of a muddy mess sometimes, especially with FSR/DLSS.

It mattered a lot more early on, because doubling the polygon count on screen meant you could do a lot more gameplay-wise: larger environments, more stuff on screen, etc. These days you can pretty much do what you want if you are happy to drop a little fidelity in individual objects.

Also that example of Tomb Raider is really disingenuous, the level of fidelity in the environments is night and day between the two as well as the quality of animation. In your example the only real thing you can tell is the skin shaders, which are not even close between the two, SotTR really sells that you are looking at real people, something the 2013 game approached but never really achieved IMO.

I've noticed this a lot in comparisons claiming to show that graphics quality has regressed (either over time, or from an earlier demo reel of the same game), where the person trying to make the point cherry-picks drastically different lighting or atmospheric scenarios that put the later image in a bad light. Like, no crap Lara looks better in the 2013 image, she's lit from an angle that highlights her facial features and inexplicably wearing makeup while in the midst of a jungle adventure. The Shadow of the Tomb Raider image, by comparison, is of a dirty-faced Lara pulling a face while being lit from an unflattering angle by campfire. Compositionally, of course the first image is prettier -- but as you point out, the lack of effective subsurface scattering in the Tomb Raider 2013 skin shader is painfully apparent versus SotTR. The newer image is more realistic, even if it's not as flattering.

I'm someone who doesn't care about graphics a whole lot. I play most modern games at 1080p Mid/high on my RTX 3060.

And yet, I totally agree with your points. Many times, older games had rich-looking environments from a distance, but if you go close or try to interact with them, it just breaks the illusion. Like, leaves can't move independently, or plants just don't react to you trampling them, etc.

A lot of graphical improvements are also accompanied with improvements in how elements interact with other elements in the game. And that definitely adds to the immersion, when you can feel like you're a part of the environment.

Yahtzee from The Escapist recently did a video on this exact topic.

Here's the link for those interested.

I agree with everything he said. But I've also been saying things like that for thirty years. I remember when Morrowind came out complaining about companies using extra processing for shitty 3D graphics instead of sticking with high quality 2d that works perfectly fine and putting that extra processing power to work on better AI or something.

I think the problem is that better graphics are the one thing they can do that will please a mass audience. Sure, there are plenty of other things they could be doing, but I would bet that each of them has a niche appeal with fewer fans to spread the cost among. Thus producers of "AAA" titles pretty much by definition have to pursue that mass audience. The question is when they reach that point of diminishing returns and it becomes more profitable to produce lower-cost niche titles for smaller audiences. And we also have to factor in that part of that "profit" is pleasing the assumption our society has that anything with niche appeal is necessarily "lower" in status than mass-appeal stuff.

I think we are approaching that point, if we haven't already reached it. Indie stuff is becoming more and more popular, and more prevalent. It's just hard to tell because indie stuff tends to target a smaller but more passionate audience. For example, while I am looking forward to trying Starfield out, I may be too busy playing yet more Stardew Valley to buy it right away, and end up grabbing it in a sale. (I haven't even really checked if it'll run on my current gaming laptop.)

I'm with you. I barely notice the changes in graphics, just the increase in my GPU fan speeds over the years.

I'm more interested in games with graphics that look good enough, but do more interesting things with the extra horsepower we have these days.

But what will all these hardware companies do if we stop raising system requirements!

Yeah there seems to be a planned obsolescence element here overall!

Games don’t need better, more complex graphics. They need adequate time and resources during the development process. They need to actually be completed by their release date, not just barely playable. They need to be held to a higher standard of quality when publishers judge if they’re ready to sell.

Pushing for even more realistic graphics will make the cost of making games even higher with no significant change in players' enjoyment.

Players enjoyed games when we had Super Nintendos and DOS games. They actually gave players more room for imagination.

So... this is only partially correct IMHO.

Yes, it will continue to be expensive for the studios that push the envelope. However, as those studios continue to invest large amounts of cash, the smaller studios are continually getting access to better and better tools because of it. That means that a small studio can create something that is not quite-as-good as the major studios, but still very competitive, and for significantly cheaper.

As technology progresses, last-year's tech will always fall in price.

As to the point of enjoying Super Nintendo and DOS games, sure. Much of that style has returned in the form of pixel art games and what have you. But the conservative viewpoint of '8-bit was good enough in my day, why improve on it' is just short-sighted in my opinion. Why keep pumping out Atari-grade stuff when so much more is possible? Why not advance and improve?


Everything is ruined by marketing and its capitalist roots, and game development is no exception.

They push for fidelity just because it sells well; the fact that this creates the need for much more powerful hardware is not a drawback for them. It's actually good, since it's something you can profit on.

Games need artistic direction and vision, much more than they need photorealism (which is great for some kind of games, but not a universal standard).

Photorealism really only fits a few genres; almost all indie games have some kind of interesting art style, like The Long Dark for example.

Art style will always be more important than graphic fidelity, imo. Having a less realistic-looking game with a strong art style allows the game to age better with time.

Take a look at the first Borderlands - it’s from like 2006 or 2008, but it still looks good today because of the cel-shaded art style it went with. Meanwhile the first Uncharted looks goofy as hell today because it was trying to look realistic during that same timeframe that Borderlands was released.

Art style, fidelity, and realism are all somewhat separate. Yeah, a good art style can make a game timeless. There are countless examples of that. But it doesn't mean that realistic-looking games can't have a great art style. I think cohesion is a huge part of what people mean by "art style". Realistic-looking games that focus on having a cohesive look can also be fairly timeless. They may not compete with today's fidelity (resolution, technology, etc), but Assassin's Creed, various Battlefield games, Crysis 1, and plenty more all look great if you look at them today.

There's also the concept of something not realistic, but high-fidelity. Minecraft RTX is simply amazing. Taking that already timeless art style and increasing the fidelity doesn't suddenly remove the timeless factor, and I love that. I want more of that, even if it means I need a fairly beefy GPU for otherwise simple graphics.

Realistic graphics fucking suck: not only do they require a pretty high-end PC to get somewhat playable fps, they also eat up so much storage space - which is unnecessary if you play on medium or low graphics, which defeats the purpose of "realism".

That's why I almost exclusively play indie games. They don't invest massively in graphics, microtransactions, or dumb features not related to the game (like the chess/darts/drinking simulators in Watch Dogs). Instead, they focus on making games that do one thing and do that one thing great.

I prefer good gameplay over great graphics. The brain knows how to fill in the gaps and allow for immersion. That's why, most recently, BattleBit Remastered has left such a big impression. Its gameplay is really good and it is a much better game than the last Battlefield, although Battlefield has much better graphics.

The same goes for some of my other all time favourites. Deep Rock Galactic, Terraria, Minecraft all do well without amazing graphics and take up much less drive space as a nice side effect.

I think some of the issue also has to do with art style more than graphics. Realism is by far the hardest style to achieve, and it seems to be the preferred one for a lot of gamers probably because it makes them feel more adult or something. But I think a lot of games could gain a lot from striving to look good rather than realistic, settling for an art style and committing to it.

From what I've seen around, it seems like Baldur's Gate is doing just that, and it seems to have shaken up the entire industry.

Art style makes a huge difference. And it's one of those things that you just see and don't really register as much as "more shadows" or whatever.

I don't understand how anyone can play Battlebit with the UI as it is currently.

Blue dots floating around everywhere.

I am playing mostly 32v32, so I've never been bothered by the blue dots. It can be overwhelming in the largest game mode, but then again you can also modify the HUD in the settings.

Play something like The Quarry and you'll want them to be a tad more realistic, cuz it's not quite there and triggers the uncanny valley effect. Seeing the likeness of a real person in a video game, even with the best graphics available, is still very easily recognized as a video game. Some stuff in the works right now, yet to be released and built on UE5, is even closer. Some things have been shown to look almost photorealistic, such as Unrecord or The Matrix tech demo - so much so that people thought they were fake at first.

It also depends incredibly on the team handling the animations. Supermassive Games has nothing on Naughty Dog when talking facial animations. There are some seriously deep emotions conveyed in Uncharted 4 or The Last of Us. Since they're both on PC now, I'd recommend giving them a play if you haven't. While I really like the kind of narratives SMG makes, I think their quality is not as high-level as some others'.

Mostly I just want games with good stories that are really, really fun to play. And games where I can play with 1-8 of my friends. Games like Sons of the Forest or Raft are perfect for this.

I'm a big proponent of going with a distinct art style over "realism", because the latter tends to kind of fall apart over time as technology improves. The former will always look good, though.

I like it the way it is: There's both and gamers decide what to buy. In the end, we are talking about a MASSIVE economy so of course there are also a lot of people who WANT you to upgrade your PC / console every 2 years or so.

Even if you do go with realism, we've hit a point of diminishing returns. Most PS5 games just look like the best-looking PS4 games, but in 4K. I'd rather developers start using the system resources for things that actually matter instead of realistically simulating every follicle of a character's ass hair. Like, give me better NPC AI, give me more interactive environments, give me denser crowds, more interconnected systems. Just something.

Honestly, I agree to an extent. I like looking at well-designed scenery, but I think it hurts games if it takes priority. I'm not playing games for that, but for cool game-design ideas and my own experiences with mechanics. That's the tl;dr; what follows is my rant, for I had a long bus ride.

Graphics are very marketable and ad-friendly, easier to implement/control than changes to the engine or scripts (which you need to understand first), and they can cover up a lack in other departments. Effective managers love that. The CGI guys at Disney are on strike because this sentiment held true in the movie industry too, and they are overloaded, filming whole movies over chroma key. Computer graphics have almost replaced everything else.

From my perspective, this trend in AAA lowers the quality of the end product: it makes games safer to develop (formulaic reiteration) but just OK to play, mostly unremarkable. Indie and small game studios can't compete with them on visuals, so they take risks and try to experiment, bring novelty, and sometimes win a jackpot.

Like, obviously, Minecraft, which was initially coded by Notch alone. It invented the indie scene as we know it now. It put tech and mechanics over looks, and the whole world was playing it. No one cared how abstract it was while being addicted to the gameplay.

Playing older games, I see that they were in this race too - like how (the recently remastered) Quake 2 was a great visual upgrade over Quake 1. People sold an arm and a leg to play them on HIGH at the time, and nodded like, "Yeah, now it's just like real life," while looking at a 640x320 screenshot, or however the marketing people sold it. But somehow those games were made completely differently in many ways, not graphics alone, and that's for a braindead shooter. I feel it in my fingers. I see it in how the game logic works. That sensation was greater for me than anything I see on the screen.

Not being able to recall what happened in which CoD game, I become more amused by how game design, expressed via code, affects the feel of a game. How in Disco Elysium all those mental features made it stand out. How Hotline Miami made extreme violence so stylish. How Dwarf Fortress taught me to care about ASCII symbols on my screen while accepting the fun of losing them. How the first MGS's Psycho Mantis read my save files from other games and vibrated my controller on the floor with his psychic power.

These moments and feelings can't be planned and managed like the creation of visual assets. And they are why I like games, as outdated as NES ones or as ugly as a competitive Quake config looks. Like making love with a loving partner, they hit different than the polished act of a fit and thin sex worker. They bring a unique experience instead of selling you a donkey painted as a horse.

And that's why I don't really care about graphics and dislike their unending progress.

This describes my thoughts pretty well - in short, the chase for better graphics tends to hamstring innovation and other creative ideas; we end up playing mostly the same games but with better graphics. But I wanted to add something else, so I figured I'd use your comment as a jump-starting point.

The never-ending race for better graphics, which ends up giving us diminishing returns, also sets us on a race for pure computing power where I feel like efficiency comes second - or not at all. It doesn't help that poor optimization is also so common in a lot of games.

I don't want to shill for Nvidia, but the response to the launch of the 4XXX generation was a pretty good example of that, and of how a big part of the issue is also consumers. The 4060 has great power consumption, to the point of being on par with the 1660 while performing about 2x better than it, and 1.5x better than a 2060. To put it another way, it's slightly better than the 3060 while using only 60% to 70% of its power. Yet the card was widely trashed by both reviewers and consumers, most of whom (both the former and the latter) never mention the efficiency of the card. A quick look at performance videos on YouTube will show you that people usually just show FPS, VRAM usage, and maybe memory usage too; quite a few will only show you FPS; a surprising number will show you pretty much anything you can think of except GPU power usage.
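
To spell the efficiency point out with a quick sketch - the relative performance numbers are derived from the rough ratios above, and the wattages are assumed ballpark board powers, so treat this as an illustration rather than benchmark data:

```python
# Illustrative only: relative performance is the rough ratio quoted above
# (vs. a GTX 1660), and the wattages are assumed ballpark board powers.
cards = [
    ("GTX 1660", 1.0, 120),
    ("RTX 2060", 1.33, 160),
    ("RTX 3060", 1.9, 170),
    ("RTX 4060", 2.0, 115),
]

for name, rel_perf, watts in cards:
    print(f"{name}: {rel_perf / watts:.4f} relative performance per watt")
```

Framed as performance per watt instead of raw FPS, the 4060 comes out well ahead of everything else in that list, which is exactly the number most of those videos never show.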

This is especially worrying and disheartening, when I think about how we seem to be on the verge of an energy crisis.

Most (major) games nowadays don't look worlds better than the Witcher 3 (2015), but they still play the same as games from 2015, while requiring much better hardware with high levels of energy consumption.

And they would even require Windows 11 soon, banning older CPUs!

I agree with you. One factor I still have hope for on that front is new handhelds. We had the Switch, we have the Steam Deck and its newer competitors. They are all judged by their battery life, and they also have small screens where graphics don't matter as much. Players on a long road trip or shift intuitively choose less power-hungry games over those that eat the battery in an hour. I wonder if Steam could make a special category for energy-light games, just for that obvious reason - and that, in turn, could inspire devs to make games catering to that use case. I can dream.

I care about the story of the game and general enjoyment. As long as I can see and understand what is going on on my screen, I do not care that much about the graphics. I am fine with playing old games with potato graphics. Also, I know we could have both, but if studios had the choice between more accessibility options for more demographics or better graphics, I wish they would always choose to put the resources into accessibility.

I feel that sometimes realistic graphics are what a game needs - like some simulators or horror titles going for that form of immersion. We're not quite over the uncanny valley in AAA titles, and pushing the boundaries of real-time rendering technology can lead to improvements in the efficiency of existing processes.

Other times, pushing the boundaries of realism can lead to new game mechanics. Take Counter-Strike 2 and their smoke grenades: they look more realistic and shooting through them disturbs only a portion of the cloud.

I do miss working mirrors in games, though.

Some great games make use of that increased power. Many more do not.

maybe we will start seeing 50 watt gaming GPUs being viable and capable of running games at medium/high settings, going for cheap

The Steam Deck runs lots of games at those settings at about 15W, more or less, depending on how you count, and at lower resolutions too. Strides are being made toward this end of better performance per watt, especially in ARM architectures, which is a step that Apple arguably took too soon, as it relates to gaming.

So I'm an environmental criminal just because I enjoy graphics and have an expensive hobby?

You're hardly responsible for the actions and decisions of several large companies. Your choice to buy a fancy card or run a powerful rig is not even a drop in the bucket compared to the actions of companies that shoehorn in the tech even when it isn't wanted, just so they can sell more.

Ray Tracing makes a huge difference regardless of everything else.

Games are still fun at low quality (I mostly play on steam deck), but we haven't reached diminishing returns until every game is fully ray traced for all lighting including special effects.

And audio!

Genuinely 3D audio with a full soundscape is awesome. I don't play with headphones frequently enough to comment on what games really shine there, but it can absolutely make a huge difference.

I still intend to play Hellblade: Senua's Sacrifice with headphones after hearing so much about it. I've put it off only because I don't like jump scares.

you shouldn’t need anything more powerful than a 1080 TI for years

This seems true to me. The 1080 Ti runs everything today and is likely going to be okay for 3 years or more. Though the 1080 Ti is a very powerful card...

I think a 1060 3GB should be where the line is drawn; the absolutely trash-looking textures that some games have on minimum graphics shouldn't use more than 2 gigs of VRAM (yes, I'm looking at you, BG3).

As far as people with lower-end systems being unable to enjoy newer titles, I'd like to point you to the prolific and endlessly creative indie scene. Yes, you may not be able to play the latest AAA "Skyrim Ultra Remaster DX: Penultimate Boogaloo - GotY Edition," but there's lots of gems out there that don't require the latest GPUs.

But as for realistic graphics, there's actually an aspect you may not be aware of: game companies actually influence the movie industry. Because games not only have to deliver realism but realism in real time, they have to constantly invent new techniques. The movie industry picks up these techniques and makes better movies.

Plus, IMO, improvements are always welcome. Death Stranding had some of the best mocap I've seen in both the movie and videogame industries, and I don't feel those improvements were wasted effort.

I buy games based on the following tier scale:

Gameplay > Performance > Price > Expected time playing > Graphics

I agree with your point in the post, especially after playing Darktide, which chucked performance out the window for fog and lighting effects. It doesn't matter how pretty your game is if it's rendering at 3fps.


I'm with you. I've wanted more players, larger maps, and less emphasis on graphics (because they've been pretty good) for a while now. I think back on games that were fun and addictive: they weren't pretty, they were blocky. Yet it didn't matter.

Contemporary graphics are amazing, I admit. I also find it difficult to put a release date on any games from the last 5-8 years. They look pretty similar in graphics to my eye.

Am I wrong for being willing to sacrifice some graphic sophistication for the sake of say a 128v128 game in a destructible environment?

This person is evoking the idea of a game like BattleBit Remastered, by the way, if anyone reading this is living under a rock.

Oh I know I live under one. Thanks for the insight! This looks like too much fun. I hope these things were/are eventually addressed.

I haven’t watched the second video, but I will say the game is solid and fun with friends. And it seems like community criticism is affecting a lot of the balancing and updates they do.

I think whatever concerns people had during EA the devs are attempting to address.

So I watched a video on its history and I agree about the devs. They seem like they are responsive, hard-working and passionate about their game.

It has come a long way. I'm still shocked I'm just now learning about it, and that it isn't more visible. It's exactly what I've been wanting to play for a long time.

I'd much rather play this than Battlefield or Call of Duty.

It depends on the type of game for me. For (competitive) multiplayer games, I don't really care much. As long as it looks coherent and artistically well done, I'm fine with it.

For single player games, especially story-based games, I like when there's a lot of emphasis on graphical fidelity. Take The Last of Us: Part II for example. To me, especially in the (real-time) cutscenes, the seemingly great synergy between artists and engine developers really pays off. You can visually see how a character feels, emotions and tensions come across so well. Keep in mind they managed to do this on aging PS4 hardware. They didn't do anything revolutionary per se with this game in terms of graphical fidelity (no fancy RT or whatever), but they just combined everything so well. Now, would I have enjoyed the game if it looked significantly worse? Probably yes, but I have to say that the looks of the game likely made me enjoy it more. Low resolution textures, shadows or unrealistic facial impressions would've taken away from the immersion for me. Now, some would say that the gameplay of TLoU:II was rather bland because it didn't add or change a lot over TLoU (1), but for me, bringing the story across in this very precise way was what made it a great game (people will have other opinions, but that's fine).

I agree with you on the power consumption part though. Having GPUs consuming north of 400 watts while playing a game is insane. I have a 3080 and it's what, 340 watts TDP? In reality it consumes around 320 watts or whatever under full load, but that's a lot of power for playing games. Now, this generation GPUs are a lot more efficient in theory (at least on the Nvidia side, a 4070 uses 100-150 watts less to achieve the same output as a 3080), which is good.

But there's a trend in recent generations where manufacturers set their GPUs (and CPUs) to be way beyond their best power/performance ratio, just to "win" over the competition in benchmarks/reviews. Sure you can tweak it, but in my opinion, it should be the other way around. Make the GPUs efficient by default and give people headroom to overclock with crazy power budgets if they choose to.

I remember when AMD released the FX-9590 back in 2013 and they got absolutely laughed at because it had a TDP of 220 watts (I know, TDP != actual power consumption, but it was around 220). Nowadays, a 13900K consumes 220 watts out of the box no problem and then some, and people are like "bUt LoOk At ThE cInEbEnCh ScOrE!!111!111". Again, you can tweak it, but out of the box it sucks power like nobody's business.

This needs to improve again. Gaming GPUs should cap out at 250 watts at the high-end, and CPUs at like 95 watts, with mainstream GPUs targeting around 150 watts and CPUs probably around 65 watts.
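
For a rough sense of what a 100-150 watt gap like the 3080-vs-4070 one adds up to over a year, here is a quick back-of-the-envelope sketch - the gaming hours and electricity price are assumptions for illustration, not figures from this thread:

```python
# Illustrative arithmetic only. The 125 W figure is the midpoint of the
# 100-150 W difference mentioned above; hours and price are assumptions.
watt_savings = 125        # W saved at the same output
hours_per_day = 2         # assumed daily gaming time
price_per_kwh = 0.30      # assumed electricity price (EUR/kWh)

kwh_per_year = watt_savings / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh and ~{kwh_per_year * price_per_kwh:.0f} EUR per year")
```

Small per machine, but it scales with every card that ships tuned past its efficiency sweet spot by default.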

My stance on graphics is that the realistic graphics of today, compared to a game like Fallout: New Vegas, don't look very noticeably better to me unless I actively pay attention to them (which I don't do when I play a game).

Then again, I usually don't play AAA studio games anymore, so my perception is skewed by either older games or ones with much more stylized graphics.

Development has always been incremental, but as gaming engines get better, they start being used for more and more things. AAA games don't always develop the graphics and game engine from scratch, most often they develop one technology and use it for many games, or even buy a pre-built engine and "just" build the game around it.

Unreal Engine has been used not just for games, but also for real time, or near real time film making. The same engine that is used for playing a game can be used to create the background effects in TV shows, or whole scenes.

It's crazy to suggest we just stop working on something because it's good enough, because that's not what people do.

It’s crazy to suggest we just stop working on something because it’s good enough, because that’s not what people do.

Came here to say this, glad it's already been posted.

Also, why is it that every time someone is being critical of advancements in "realistic graphics" they always post screenshots of Lara Croft?

Lara Croft has been around since the triangle boobs. There aren't too many other characters that have been in 3D as long as Lara Croft (Mario 64 was released the same year, but Mario hasn't come as far as Lara Croft has in terms of photorealism). Plus, she's instantly recognizable. Personally, I don't think there's any deeper reason than that.

That might make sense if they were posting pictures of PS1-era Lara Croft, but they aren't; they always use the newest examples... They would show a bigger gap in time if they used the Dragonborn from the initial release of Skyrim vs the newest remastered version.

I also mostly played slightly older games on not-quite-cutting-edge hardware, but since upgrading to something half modern, there are a few games that really impress with their graphics. Cyberpunk 2077 springs to mind. It's the near photorealism of the environments that impresses most. The depth and detail in a scene must be really demanding, but damn if it doesn't look really nice. I think we're now at a point where only the bigger developers can really make the most of the hardware, as the worlds are so unique, detailed, and time-consuming to design. Smaller developers aren't going to have the time or budget to build such detailed worlds that will push top hardware (assuming equally well-optimized code).

I would look at some of the tech demos for Unreal Engine 5 and its capabilities. We're at the point now where the engine itself, with sufficient art assets, can look photorealistic more or less out of the box.

Where this is still hardest- and likely always will be- is in faces, because our brains are so thoroughly wired to spot micro-expressions and subtle facial and body language as social animals.

That said, for a game like Doom or Path of Exile, sure, I think we're way past the point of diminishing returns. But for games where you're actually experiencing a story and interested in the relationships and characters involved? Things like Death Stranding, RDR, Cyberpunk 2077, or more recently BG3? All of those same nuances of expression can definitely add to the experience- it's the difference between a mediocre actor and a good or great one.

You're not wrong, but the fact is that photorealistic graphics sell games. Every time a AAA title releases gameplay footage prior to release, the graphics fidelity is always one of the most talked-about things. Hell, there have been a lot of games where the majority of the hype has been around the graphics quality.

I'm not saying this is a good thing, quite the opposite, but the fact is that game studios are primarily interested in building a game that makes money (because that's how capitalism works), and having the best graphics is a proven way to boost sales. The only way it's going to stop is for people to stop basing their game purchasing decisions on graphics and start basing them on gameplay. But for casual gamers, that seems highly unlikely

Once I get a 3080 or 4080 I’ll agree with you.

There are 2 different ways in which I managed to interpret this, care to elaborate further?

I used to have a top of the line GPU, and when I did, I would have agreed with you. Now that I don’t, I can see the difference between my experience & the videos & screenshots I see online.

So yeah, once I have a newer card, that can handle the pretty games I like, I’ll start agreeing with you. Probably because I don’t want to have to upgrade the GPU eventually lol

I am pretty sure we just want to run the games we want at ultra without fearing stutters, even if said games have PS3 level graphics.

It's not really worth it considering the jump in processing power that is needed imo. My last PC lasted me about 10 years while my current one can't play current titles after 3 years and I already have to upgrade the CPU.

A better approach would be to revoke gamer privileges for filthy casuals.

I gotta be honest, the old Tomb Raider looks more real than the newer ones.