Todd Howard asked on-air why Bethesda didn't optimise Starfield for PC: 'We did [...] you may need to upgrade your PC'

Edgelord_Of_Tomorrow@lemmy.world to Games@lemmy.world – 411 points –
Todd Howard asked on-air why Bethesda didn't optimise Starfield for PC: 'We did [...] you may need to upgrade your PC'
pcgamer.com

You heard him, 4090 users: upgrade to a more powerful GPU.


Damn, this is a pathetic response. He could've said "We've tried our best to make it as polished as possible before launch, and we're working towards further optimising it to give you the best experience, wherever you play." Even if they did jackshit, it would not have come across as condescending and snarky. Maybe he wasn't prepared for a tough question on the spot right at the beginning of the interview, but it does show how he thinks about his games. In his mind, the game running at all on PC is optimised enough.

I'm not saying he's bad for not making Creation Engine the most optimised engine on the planet, I'm saying he's bad for not acknowledging that it's currently the most demanding engine despite looking merely half as good as Cyberpunk 2077 or, idk, Arkham Knight.

It's not even about graphics alone.

They're clearly building their games in an extremely inefficient way. Starfield isn't doing anything that other games with much lower requirements haven't already done.

You see evidence of this in their previous games. One of the major performance issues with Fallout 4 for example, was that instead of building their cities in performant ways, they literally plonked every building as an individual asset into the world which thrashed the CPU for no reason. Modders just had to merge them all into one model to significantly improve performance. Their games are full of things like this and Starfield will be no different.

Unless I'm completely mistaken here, modders didn't combine the buildings together, that's how they are by default. Mods, however, sometimes needed to break said system which resulted in massively degraded performance.

Nah the Boston performance was terrible in vanilla. The precombination fixes made huge performance improvements. There were issues with mods breaking precombined meshes but that was a separate issue.
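For anyone curious, the precombination trick described above is essentially draw-call batching: the geometry is identical either way, but issuing one draw call instead of thousands removes the per-call CPU overhead. A toy Python sketch of the idea (made-up names, not Bethesda's actual system):

```python
# Toy illustration of draw-call batching: rendering N separate building
# meshes costs N draw calls; pre-merging them into one combined static
# mesh costs a single draw call for the same geometry.

class Mesh:
    def __init__(self, vertices):
        self.vertices = vertices

def draw_calls_separate(meshes):
    # One draw call per mesh: the per-call CPU overhead (state setup,
    # driver validation) is what thrashes the CPU with thousands of objects.
    return len(meshes)

def precombine(meshes):
    # Merge all vertex data into one static mesh, done once offline.
    return Mesh([v for m in meshes for v in m.vertices])

def draw_calls_combined(combined):
    return 1

# A "city" of 5000 individually-placed buildings, 3 vertices each.
city = [Mesh([(i, 0, 0), (i, 1, 0), (i, 0, 1)]) for i in range(5000)]
merged = precombine(city)

print(draw_calls_separate(city))   # 5000 draw calls
print(draw_calls_combined(merged)) # 1 draw call
print(len(merged.vertices))        # 15000 -- same vertex count either way
```

The real system is far more involved (materials, culling, and lighting all complicate merging, which is also why mods that edit one building have to break the whole precombined chunk), but the CPU-cost argument is the same.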

Why would he? Todd hates everyone who plays his games and cares only about separating money from pockets. Fallout 76 made that quite clear to everyone.

If he gave a standard appeasing PR statement without following it up at all, that would somehow be preferable? This may be snarky, but at least you know what to expect.

I mean, yeah I guess this does help temper expectations that they are done optimising, so maybe you're right, being blunt is probably for the best.

The missing part is that the user with a 4090 complaining had a CPU from 2017 🥴

What's a CPU bottleneck? I have the magic GPU

Yeah, I'm not buying that either. I'm on a 2014 i7 and a 3060 playing on ultra. My sole issue was not running on an SSD which I resolved yesterday. That kid is clearly playing on a potato and lying.

I'm shocked at how many PC users are still running HDDs given that SSDs have been standard on consoles for three years now.

They've pretty much been standard for gaming and the OS on PC for 5 years if not more. HDDs are still good for storage, but only luddites and people trying to save money in the stupidest way would keep their games on them.

Playing on ultra on a 3060? So you're getting 20-30 fps? Because that's what it gets on mine with a much newer CPU. I had to turn it down to med-high to average 45 fps.

Considering that this thing runs great on a Series S (which is CPU-heavy, but with a weak graphics card) that makes so much more sense.

Gotta love the Bethesda fanboys upvoting this one cherry-picked comment. There are like 70 comments in there with all different combos of system specs complaining about performance.

https://www.youtube.com/watch?v=uCGD9dT12C0

Get a new game engine, Todd. Bethesda owns id Software. id Tech is right there.

Exactly this. It was only two generations ago that idTech was an open-world engine. id can make it do whatever they want, and to suggest that despite Bethesda money (let alone MICROSOFT money) id couldn't make a better engine with similar development workflows to Creation is just dishonest.

It's a shame idTech is no longer released publicly. It would've been amazing to see what people could do with the beast of an engine that powered DOOM Eternal, especially modders.

I assume you're talking about Rage, which had an open-world map, but nowhere near the level of simulation systems of a Bethesda game. In fact I remember back at the time most of us saying the map was pointless, as it was just a way to travel between levels with nothing to do in it.

There are no "levels of simulation systems" in Starfield. NPCs don't even have schedules in this game, they literally just stand around in the same spot 24/7.

It's still keeping track of lots of variables across a big play space at any time regardless of NPC schedules.

They tried that once with Oblivion and clearly it didn't add enough to the game and players' experience to return to it.

They tried that once with Oblivion

They advertised that with Oblivion's AI but never delivered on half the claims.

Go look at the pre-release claims of the Radiant AI and what was actually delivered.

"keeping track of lots of variables" doesn't cost CPU time though, since nothing that isn't on the same map as you is ever relevant for anything. Their engine just fucking sucks.

Keeping track of variables doesn't use CPU time? Ok man. I'm all for hating on Bethesda's shitty engine but that's just not true. At the very least it does track what NPCs are doing off screen which is how they end up at your ship when you tell them to go there. They will actually walk to your ship if you don't get there first.

On the other hand it's basically guaranteed that Bethesda spent zero effort optimizing that. I bet it's the same code they ran for Skyrim.
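For what it's worth, the usual way engines make this kind of off-screen tracking cheap is event scheduling rather than per-frame simulation: compute when something will happen once, then do nothing until that time arrives. A generic sketch of the technique (this is not Creation Engine's actual code; the NPC-to-ship scenario is just the example from this thread):

```python
import heapq

# Lazy off-screen simulation: instead of pathing an NPC every frame,
# compute an arrival time once and store it as a scheduled event.
# "Keeping track of variables" is then nearly free until the event fires.

events = []  # min-heap of (fire_time_seconds, description)

def send_npc_to_ship(npc, distance_m, speed_mps, now):
    arrival = now + distance_m / speed_mps
    heapq.heappush(events, (arrival, f"{npc} arrives at ship"))

def advance_to(t):
    # Pop and return every event whose time has come; zero work otherwise.
    fired = []
    while events and events[0][0] <= t:
        fired.append(heapq.heappop(events)[1])
    return fired

send_npc_to_ship("Sarah", distance_m=300, speed_mps=1.5, now=0.0)
print(advance_to(100.0))  # [] -- still "walking", no per-frame cost
print(advance_to(250.0))  # ['Sarah arrives at ship']
```

Whether Bethesda does it this way is unknown, but it illustrates why stored state by itself doesn't have to eat CPU time.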

You realise custom engines are built for specific game types, right? id Tech is great for creating high-fidelity FPS games with linear levels and little environment interactivity. That's not what Bethesda makes though.

They could do everything they usually do, but better, if they used Unreal. They don't need a custom engine. They just need an engine that isn't over two decades old with a bunch of shit taped to it to make it look modern. Not to mention, id already made a custom-built engine that handles much of what Bethesda RPGs do when they made RAGE. They could have used that, with the only issue being learning it. Not sure what their turnover rate is like... maybe they're just too used to GameBryo/Creation to be able to switch now. It might take too long to learn anything new. Plus it would have to have a toolset: if they didn't release those easy-to-use modding tools, there would be rioting in the streets.

As far as I know, Bethesda is unusual among modern devs in that they have a small team for the size of game they make, but they have strong retention of staff and so huge amounts of institutional knowledge about how they do things. Shifting to a new engine would basically mean starting from scratch at a company level. Unlike Ubisoft or Activision, they can't just throw several thousand devs at a game to brute-force development either.

But that's their biggest problem. There's no reason for them to have a small, unchanging team. It's very, very obvious that they never get an influx of new ideas. Starfield feels like it was made in 2016, and the optimization effort is comically bad. The writing is still mostly boring, campy and naive, like it was written by a 15-year-old Mormon. The facial animations are incrementally better than Fallout, but still noticeably worse than much older games like Witcher 3. I could go on.

It's not a bad game at all but it could've been so much better if Bethesda execs weren't greedy cheapasses and the dev team was open to changing their process.

This is why Bethesda needs to be criticized instead of constantly getting fellated by fanboys. ES6 will be an outdated mess because Bethesda never sees any feedback except over-the-top praise for half-assing their games.

Fanboys downvote you but you are right. Even though I love the Fallout franchise: the same gameplay loop, the same engines, potato faces in 2023, outdated animations, etc. Right now I would prefer Microsoft to force Obsidian to take care of the next Fallout, and to ban Todd Howard from ever setting foot in the dev team, or even in the building. He can go fuck himself and his shit engine.

id Tech is nowhere near flexible enough for something like Starfield or even Skyrim. That's partially why it's so efficient. It simply isn't fit for the task.

And the Bethesda developers are intimately familiar with the Creation Engine; achieving the same level of productivity with something new would take a long time. Switching engines is not an easy thing.

Not to say that the Creation Engine isn't a cumbersome mess. It has pretty awful performance and stability and is full of bugs, but on the other hand it's extremely flexible, which has allowed its games to have massive mod communities.

If Bethesda can't take the time to do it then who can? People act like they're some small time developer but they're not. They simply refuse to expand their dev team to do things like a redesign.

Creation engine is not going to hold up well for another 6 years, there's no way their cell loading system will be considered acceptable by the time ES6 comes out. The amount of loading screens in Starfield is insane for a modern game. This company needs new talent badly.

I know they don't want to switch, but it would be worth it to make the swap to something like unreal, even if it takes a few years of customization to get the open world stuff right. Creation Engine just feels so old.

I have a 3080 ti, and a 12700k, and 32 gigs of ddr5, and a 2 terabyte ssd. It runs great for me. I don’t understand the problem. /s

So, this system runs it fine? Good to know. I was worried that my computer would not be able to run it smoothly, but now no worries at all.

I've got an 8086K and a 3080, running on a 4K screen. With ultra settings and FSR at 80% I'm getting 35-40fps, which honestly doesn't feel too bad. It's noticeable sometimes, but it's a lot smoother than the numbers suggest.

Because my CPU is a little long in the tooth, I've probably gone a bit hard on the visuals, but my framerate didn't improve much by lowering them. The engine itself has never really liked going past 60fps, so I don't know why people expected to be able to run 100+ frames at 4K or something.

Sorry mate but 35fps on a 3080 with FSR is just objectively bad performance.

Starfield is not doing anything in terms of graphics or gameplay that other games that run 3-4 times as well aren't doing.

That's because they're CPU limited, mate.

Easy 1440p60 on ultra everything with no scaling on my 3090. Frequently up in the 80-90 FPS range. This game runs fine. It's not a "teetering mess" as you say.

What exactly is it doing that an 8086k is CPU limited to 35fps?

Completed some testing on my end using Intel's PresentMon: sitting at 35fps average in New Atlantis, my GPU Busy is pegged at about 99% of the frame time, so nothing really.

I do get a bit of a CPU limitation when it's raining, but nothing significant; it dropped to about 30fps.
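The PresentMon reasoning above boils down to one comparison: if the GPU is busy for nearly the whole frame time, the GPU is the limit; otherwise something else (usually the CPU) is holding it back. A minimal sketch of that check (the 95% threshold is my own assumption; PresentMon reports the per-frame numbers you'd feed in):

```python
def classify_bottleneck(frame_time_ms, gpu_busy_ms, threshold=0.95):
    # GPU busy for ~all of the frame -> the GPU is the limiting factor;
    # a large idle gap -> the frame is waiting on the CPU (or elsewhere).
    ratio = gpu_busy_ms / frame_time_ms
    return "GPU-bound" if ratio >= threshold else "CPU-bound"

# ~35 fps with GPU Busy at ~99% of frame time, as measured above:
frame_ms = 1000 / 35  # about 28.6 ms per frame
print(classify_bottleneck(frame_ms, frame_ms * 0.99))  # GPU-bound

# A rain dip to ~30 fps with the GPU partly idle would instead look like:
print(classify_bottleneck(1000 / 30, 24.0))            # CPU-bound
```

This is why the commenter can say "nothing really": at 99% GPU Busy, a faster CPU wouldn't raise that 35fps figure much.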

Trying 1440p with the same settings as the 3090 above got me around 50fps. 1440p is about 70% of the pixels of 80%-scaled 4K as well, so that's not helping my GPU much!
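Working the actual pixel counts (assuming FSR's 80% is a linear scale applied to each axis, which is how its resolution-scale slider works):

```python
# Pixel-count comparison: native 1440p vs 4K rendered at 80% FSR scale.
p1440 = 2560 * 1440                             # 3,686,400 pixels
p4k_fsr = round(3840 * 0.8) * round(2160 * 0.8) # 3072 x 1728 = 5,308,416
print(p1440, p4k_fsr, round(p1440 / p4k_fsr, 2))  # 3686400 5308416 0.69
```

So 1440p is roughly 70% of the rendered pixels, and 35 fps × (5,308,416 / 3,686,400) ≈ 50 fps, meaning the observed gain actually tracks the resolution change almost exactly.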

I'd really not expect the performance difference between a 3090 and a 3080 to be that large, and the only difference I can think of in our systems is the CPU (5800X3D vs 8086K).

New Atlantis is a smooth 60+ fps with every setting maxed out at 1440p.

Considering that CPU is less powerful than what's in the Xbox Series S, which does 1080p30, I'm not at all surprised they're getting a similar frame rate.

If this was a "teetering mess" you would have heard it in the Gamers Nexus benchmarks. Steve says nothing to this end, and the game benches predictably across the spectrum of hardware tested.

For me the game runs pretty well, with ~90+ fps on high and FSR2 activated.

5800X3D, 64 gigs of RAM and a 6900XT I snagged cheap during the great GPU collapse. And by the looks of the game, this seems pretty reasonable to me.

AMD users are having a better time with it, unsurprisingly. I wish I hadn't gone for Nvidia but too late for that.

I think there will be patches and some updates to NVIDIA's shitty driver that will fix things in the future :) Otherwise, yeah, maybe get an AMD GPU next time; don't fall for the NVIDIA marketing. I've been using Radeons since the 9800 Pro bundle with Half-Life 2 and never had any issues with them or their drivers.

Hopefully. I've always been more of an AMD/ATI fan, but for this laptop the deal worked out to be better with an Nvidia card. But next time I'm not settling for it. AMD CPU and GPU is the way to go. Especially because I'm trying to daily on Linux now and the driver side is much much nicer with AMD.

I've got that but with a 4080 - no issues.

I admittedly feel like I went full retard on my build and seriously hope these specs aren't what's necessary...

Hey at least you don't have to upgrade for a while, could probably run it for another yr if todd is generous

You had me in the first sentence, and then I realized it was sarcasm. 🤪 I'm running a similar rig, but it's primarily for rendering work, etc., so for juuust a second there, I wondered if it was falling behind. 😅🤓

I'm a game developer and I'm ashamed by this.

When chip production halts because of the climate, you'll see programmers optimizing their code again.

Jeez I hope this economy crashes.

Honestly, what do you expect someone to say when asked a question like that? There's no answer there.

"We have worked a lot on PC performance. We wanted to reach performance parity with consoles on similar hardware for release, and we achieved that. However, our teams will continue working on improvements and integrating technologies like FSR and DLSS in the future."

Umm... honesty. Games used to run on the bleeding edge of performance. Not Bethesda games, but games in general. Now they release half-broken blatant cash grabs and think no one's gonna call them out for it.

They don't think that. They just know that the people will pay up anyway, bringing in the profits for shareholders and the C-suite, and that's all that matters.

The DLCs, cosmetics, MTX, etc. are all pretty much alive and well despite everything just because enough people cash out, so why change their ways?

AAA gaming is a big industry, and big industries are nothing wholesome.

Seriously? Just say that we're always trying to optimize our games and we'll continue working on it. It's such an easy question to tackle. I refuse to believe you can't see that. People just think Bethesda is above criticism for some inane reason.

That's not an answer that people would have accepted either and no matter what answer was said, it would have been dissected and criticized by the syllable.

The point I'm trying to make here is that "optimize your game" doesn't help anybody. Especially not as an interview question. You might as well have asked "why didn't you make your game fun?"

People really need to put the nostalgia goggles down... back in the day nobody played Crysis with full details and a steady framerate.

You were at 1024x768 with everything turned down just to play the game at barely 30fps, and you know what, it was still dope as fuck. So yeah guys, get used to lowering your settings or upgrading your rig, and if you don't want to do that, get an Xbox.

Crysis was built by a company specialising in building a high fidelity engine. It was, by all accounts, meant primarily as a tech demo. This is absolutely not the case with Starfield - first, the game doesn't look nearly good enough for that compared to Crysis, and second it's built on an engine that simply can't do a lot of the advanced stuff.

The game could be playable on max settings on many modern computers if it was optimised properly. It isn't.

Sure, mister gamedev, please continue to tell me more about how an engine you clearly worked on should run...

I don't say that Starfield is a well-optimised game, and performance will get better with upcoming patches. But I also don't think it's an unoptimized mess. I think it runs reasonably, and people really should start reviewing their rigs, because modern games will need modern components.

Oh, and also, other games did not run as well as you maybe remember ;)

You don't have to be a game dev to see that games that came out before Starfield look and perform better. If you bought the game and you enjoy it, that's all fine and I won't make fun of you for it, but let's not defend what is an obvious point of incompetence on Bethesda's side.

Why buy Starfield when it's on Game Pass 😅

And buddy, I've been playing Bethesda games since Daggerfall, and believe me, Starfield is a fucking polished diamond compared to their good old games and compared to their latest shitshows like Fallout 4 and Fallout 76...

I'm not your buddy

You're comparing Bethesda games to Bethesda games, which we all know are buggy messes. Starfield falls short of my expectations for what a polished diamond looks like.

Okay, not-buddy 😂 I think we are also pretty much done here, since I don't see any point in discussing this any further with you. So byeeee, and have a pleasant day not playing Starfield, I guess.

Sure, mister gamedev, please continue to tell me more about how an engine you clearly worked on should run...

I can easily compare between what different game companies do. Why are you acting like I need to be a developer on a game to criticise that game?

I don't say that Starfield is a well-optimised game, and performance will get better with upcoming patches.

Todd could have said so. He didn't. Why?

But I also don't think it's an unoptimized mess. I think it runs reasonably, and people really should start reviewing their rigs, because modern games will need modern components.

I never stated this. I simply said: comparing Starfield and Crysis is deliberately disingenuous, because Crysis was fundamentally meant to break boundaries, which Starfield doesn't do.

Oh, and also, other games did not run as well as you maybe remember ;)

Okay, what's the argument here? Do you think I say for those games "well, you're not Bethesda, so I'm fine with you not running well"?

you don’t have to know the internals of the engine. you just need some basic deduction powers.

does it look good compared to other AAA games? no

does it run fast? no

ergo. the engine is crap.

the same thing happened to CD Projekt Red, but they ditched their engine after the Cyberpunk fiasco. they will just pay Epic

I don't know why they keep using that piece-of-shit engine. Microsoft should order them to format every PC and start again with UE5, the engine that's actually next gen.

does it look good compared to other AAA games? no

Well, I beg to differ on that, but it's quite a subjective topic, right ;)

does it run fast? no. ergo, the engine is crap.

Again, very subjective, very dependent on your hardware, and also a pretty dumb conclusion, since an engine has more qualities than running "fast".

As I already mentioned in this thread, the game runs quite well for me, and I would call fps in the range of 80 to 124 quite fast for a Bethesda open-world game. So what do we do now with our subjective opinions 🤔

Well, you can put your "not on my computer" opinion in your ass. Widespread benchmarks by established gaming journalists show good computers struggling.

complains about others wearing nostalgia goggles

calls Crysis dope

After all this time, I don't think I've ever heard anything about how Crysis plays or what the story is and such. People only talk about how hard it was to run and how fancy the graphics were. Doesn't make it sound all that great.

The story is meh, but lots of people will say how the open-ended nature of Crysis was fun, and that it's a pity it was removed for a more linear CoD style in Crysis 2.

Wtf Crysis 1 was awesome... At least the first part without the aliens... And not because of the graphics

Except this time, even at 1024x768 and lowest settings, you can barely break 60 FPS due to the huge CPU overhead.

And that's with a Ryzen 7 5800X.

I have the same processor and no issues. 1440p, 80-125 fps, high details, 100% resolution and FSR2.

In New Atlantis City outdoors? Mine barely stays above 60 FPS, sometimes dipping under.

It varies system by system. I have the same CPU and do fairly well, admittedly with it boosting to 4.5GHz. My wife has the same CPU and it struggles on her machine. It feels like the game just wasn't tested well.

There will always be that game that pushes the boundary between current gen and next gen, sometimes even more. Crysis is the perfect example from the past. Starfield seems to do a decent job right now, even if it's probably not even close to what Crysis did. When people spend a lot of money, we feel entitled; that's only natural. No one did anything wrong, so no need to point a finger anywhere.

But it didn't though. It looks shit and hogs more resources compared to other games like Cyberpunk, which is probably a better example of next-gen graphics.

Please explain in detail how Starfield is pushing the edge graphically in any way that's comparable to Crysis.

Also please explain how you expect them to improve as a developer when you refuse to criticize them.

You seem to have missed the part where I wrote that Starfield is probably not even close to pushing the boundaries in the same way that Crysis did. So I can't do much explaining in detail about that it is.

"We optimized it for the very high end of computers. The issue is your wallet." Kek, mf'ing W

With my experiences playing the game with an unsupported GPU and getting a solid 60 fps still as long as no NPCs are in the vicinity, I don't think it's the GPU side of things that needs optimization. It's whatever uses the CPU.

It's the CPU. I had to throttle the process to be able to play. This game is a CPU resource hog.


First game to have constant crashes on my seven-year-old RX480, which is great, since otherwise the game runs completely fine. Support doesn't seem to want my crash reports either. I guess in Todd's world, I should just throw the thing in the trash for a game that does literally nothing special in the tech department.

To be fair, that GPU is long past EoL. Even the 7xx series doesn't receive support/driver updates anymore.

EDIT: It was late, totally misread it as GTX not RX.

RX480 is an AMD card that came out 7 years ago, not the GTX 480.

Maybe he thought you typoed RTX, since Nvidia uses that now instead of GTX. But there is no RTX 480, so idk.

It was late and I totally misread it as GTX instead of RX.

Also my bad for not replying to this sooner. I thought I did.

Their idea of optimization on console was to cap the frame rate to 30, even on the Series X. So you can wonder what that means for PC.

Since negative opinions travel fast, I'm just gonna say my GPU is actually below the minimum requirements, though admittedly I upgraded my CPU last year. The game's minimum is a GTX 1070 Ti; I just have a regular GTX 1070.

In my case, it's doing a LOT of dynamic resolution and object blurring nonsense to get the game to run smoothly, but it does run smoothly. I get to see the character faces during conversations, I can see what I'm doing, there's no hitching, etc. New Atlantis looks ugly, but that might change if I get a new GPU.

It's perfectly optimized. I'm getting a rock solid 30fps. /s

Seriously though, I think it's fine. Especially indoors and in space, it performs well and looks incredible. New Atlantis is kinda ugly and janky though.

If there's an Xbox One version, then there's really no excuse for it not to load on a PC with similar or better CPU/memory/graphics specs.

There isn't.

Looks like you're right. You have to scroll to the bottom of the Xbox page to see Xbox Series X and S as the system requirements.

https://www.xbox.com/en-US/games/starfield

I think it's been known for 2 years now, so maybe they didn't bother publicizing it on the page. In fact, MSFT said they won't be putting any more games on Xbox One.

There isn't, MS stopped supporting the One last year.

Damn, glad I didn't buy it on day 1. Got Baldur's Gate instead and am enjoying that.

lol, no they didn't. They didn't even test adequately; more than a few GPUs that meet the requirements didn't work when early access launched.

Running on i5-8400, 3080ti. Runs good to great depending on whether I'm in New Atlantis.

Wish they had asked him why the console version isn't optimized either, running at only 30.

WHERE IS MY CLIMBABLE LADDER, TODD?

But uhhh, you can climb ladders.

Maybe they really lost their ladder. Why Todd would know where it is beats me, though. Or maybe Todd is one of those people you lend stuff to who never gives it back.

I've got a mid-grade PC and haven't had any issues except the potato people with weird speaking animations and that ugly green filter over everything. Used a mod to take care of the second thing. While I am enjoying the game quite a bit, I've long wished that Bethesda could up their art style and animations. Thank fuck they got more than 5 voice actors now.

I have a 4060 Ti and it crashes constantly on low. It runs fine between crashes, so it's not a performance issue...

Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU, those came out 2 years ago now, and that's averaging about 50fps on a 4k monitor.

If that isn't optimized, idk what is. Yes, it was high-end stuff 2 years ago, but now it's solidly mid-range.

People are so damn entitled. There used to be a time in PC gaming where, if you were more than a year out of date, you'd have to scale down to windowed 640x480. If you want "ultra" settings you need an "ultra" PC, which means swapping out parts every few years. Otherwise, be content with high settings at 1080p, a very valid option.

I mean, this was also before video cards cost as much as some used cars or more than a month's rent for some people.

yea idk if used cars or rent are good comparisons.

4090 MSRP: $1,599

Rent for a 3 bedroom in a nearby town: $1,495/month

For only 300 more I have a mortgage on a 2000 sq ft home in a large American city…

I have a 6900XT because I got a promotion recently and wanted to treat myself and finally get off the R9 300 series, but it wasn't 1600; I think I paid 1100.

1,500 gets you a closet with a window around me. Prices are fucked.

Funny enough I picked one hell of a deal to be close-ish

It's averaging 1.9 to 2k round here for a 1bdrm


I'm not saying it's not an expensive hobby, it is. PC gaming on ultra is an incredibly expensive hobby. But that's the price of the hobby. Saying that a game isn't optimized because it doesn't run ultra settings on hardware that came out 4+ years ago is nothing new, and to me it's a weird thing to demand. If you want ultra, you pay for ultra prices. If you don't want to/can't, that's 100% acceptable, but then just be content to play on High settings, maybe 1080p.

If PC gaming is too expensive in general that's why consoles exist. You get a pretty great experience on a piece of hardware that's only a few hundred dollars.

PC gaming didn't used to be THIS expensive.

You could build an entire machine for the cost of a 4090.

I don't know if you noticed, but everything became more expensive in the last year. Food, housing, etc, it's called inflation and PC parts aren't immune.

The 4090 is definitely nuts, but with inflation the 4080 is right about on par. As usual, team red is very close in comparison for a much lower cost. You don't have to constantly run the highest of the high end to get those sweet graphics, but it's about personal taste. Personally, paying 40% more for a 10% jump in graphics is not for me, but every 2-3 generations is when I usually step back and reanalyze. Tbh, it's usually a game like Starfield that makes me think about whether I should get a new one. It runs great for now though; I probably have at least 1, hopefully 2, more generations before I upgrade again.

4090 is definitely nuts, but with inflation the 4080 is right about on par.

On par with the competing product? Sure. On par with inflation? Not by a long shot. GPU prices tripled a couple years back. Inflation accounted for only a small fraction of that. They have come down somewhat since then, but nowhere close to where they should be even with inflation.

As usual team red very close in comparison

Indeed. Both brands being overpriced doesn't make them any less overpriced. Cryptocurrency and scalping may be mostly gone now, but corporate greed persists.

That's not Todd Howard's fault, but when he makes a snarky comment expecting everyone to cough up that kind of money to play his game, it's more than a little tone deaf.

I'll admit I didn't know the 4000 series was that high, but yeah, 1200 for the midrange card is too much. If it stays like this I may switch back to team red. I do believe costs are probably higher (I remember buying my first board with an AGP slot; the ones now are a bit more complicated and complex to make), but the jump from 800 in 2020 to 1200 in 2023 is too much.

the 4080 is right about on par

Adjusted for inflation in the US, the 1080 ti cost only $876 in today's money when it came out. The 4080 launched at $1231 in today's money. You are simply incorrect
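The adjustment above is just nominal launch price × CPI ratio. A quick sketch of the arithmetic (the cumulative-inflation factors are my own approximations from rough US CPI figures, not exact; the launch MSRPs are the real ones):

```python
def inflation_adjust(nominal, cumulative_factor):
    # adjusted price = nominal price x (CPI_today / CPI_at_launch)
    return round(nominal * cumulative_factor)

# GTX 1080 Ti: $699 at launch (March 2017); ~25% cumulative US inflation since.
print(inflation_adjust(699, 1.253))   # ~876 in today's dollars
# RTX 4080: $1199 at launch (Nov 2022); only a couple percent since.
print(inflation_adjust(1199, 1.027))  # ~1231 in today's dollars
```

So even after inflating both prices to the same year, the 80-class card's real price rose by roughly 40%, which is the commenter's point.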

The dude digs a hole and then grabs a bigger shovel

Some people just really love a company and will do anything to excuse their shortcomings

Starfield is poorly optimized and that's really all there is to it. I'm sure in a few weeks modders will (once again) fix some obvious issues. Bethesda has no incentive to do the work themselves when the community will do it for free

Okay, I'll admit I didn't know that's how much the 4080 was; the last time I checked was the 3000 series, and yeah, that's a lot. (I thought it started around 8-900.) I stick to my points though: if you want ultra gaming, it's going to cost an arm and a leg. My main point is still that you shouldn't expect older hardware to get ultra settings, and that's okay. You can play a game on medium settings and still have a blast.


People are entitled because they don't want to spend thousands of dollars on components only for them to be outdated within a fraction of the lifecycle of a console?

How about all the people that have the minimum or recommended specs and still can't run the game without constant stuttering? I meet the recommended specs and I'm playing on low everything with upscaling turned on and my game turns into a laggy mess and runs at 15fps if I have the gall to use the pause menu in a populated area. I shouldn't have to save and reload the game just to get it to run smoothly.

Bethesda either lied about the minimum/recommended requirements or they lied about optimization. Let's not forget about their history of janky PC releases, dating back to Oblivion, which was 6 games and 17 versions of Skyrim ago.

Consoles don't even last their whole lifetime anymore; both machines required Pro models to keep up with performance last gen, and rumour has it Sony is gearing up for one this gen too.

And no one is saying they have to; that's my point that keeps getting overlooked. If someone wants to play at sick 4K 120fps, that's awesome, but you're going to pay a premium for that. If people are upset because they can't play ultra settings on hardware that came out 5 years ago, to me that's snobby behavior. The choice is either pay up for top-of-the-line hardware, or be happy with medium settings, and maybe you go back in a few years and play it on ultra.

If the game doesn't run at all on lower-end hardware (like Cyberpunk on release), that's a different matter and needs to be addressed. It wasn't about how well the game played; it's that it didn't play.

4k 120fps would be great

But the 4090 only averages 75fps at 4K with the high preset. The 7900 XTX averages 74fps.

You can go skim the Gamers Nexus review of the 7700xt, it has a portion dedicated to Starfield in it.

You need a 6700xt or 4060ti to get 60fps on high at 1080p

Idk what to tell you mate, I'm on a 3080 at 1440p and averaging 60fps. My settings are all ultra except for a couple, with FSR on at 75% resolution scale. To me, that's optimized; I don't even expect 60fps in an RPG. In Cyberpunk I've never had higher than 50.

Why don't you set Cyberpunk to ultra "except for a couple" until you get 60+ there?

You’re missing the point.

There are a lot of games that look much better AND run much better.

It’s not about how often you upgrade.

I mean, yeah, but by what metric? There are a thousand things that can affect performance beyond what we see. We know Starfield has a massive drive footprint, so most everything is probably high-end textures, shaders, etc. Then the world sizes themselves are large. How do you directly compare two games that look alike? Red Dead 2 still looks amazing, but at five years old it's starting to show its age; it also had a fixed map size, so it got away with a few things. Every game is going to have differences.

My ultimate point is that you can't expect ultra settings on a brand new game unless you're actively keeping up on hardware. There's no rule saying you have to play at 4K ultra settings, and people getting upset about that are nuts to me. It's a brand new game; my original comment was me saying I'm surprised it runs as well as it does on last-generation hardware.

I played Borderlands 1 on my old ATI card back in 2009 in windowed mode, at 800x600, on Low settings. My card was a few years old and that's the best I could do, but I loved it. The expectation that a brand new game has to work flawlessly on older hardware is a new phenomenon to me, it's definitely not how we got started in PC gaming.

I have an AMD 3800X and an RTX2070 and I am barely seeing 30fps on the lowest settings at 1080p and 1440p.

DOOM Eternal runs just fine at 144fps on High and looks miles better.

It's just not optimised.

Doom eternal also came out 3.5 years ago now, and your card is nearly 5 years old. That's the performance I would expect from a card that is that old playing a brand new game that was meant to be a stretch.

I'm sorry, but this is how PC gaming works. Brand new cards are really only awesome for about a year, then good for a few years after that, and then new releases start making you think it's about time. I've had the 3000 series and the 1000 series; before that I was an ATI guy with a Sapphire card, and before that the ATI 5000 series. It's just how it goes in PC gaming. This is nothing new.

Why do people use "entitled" like it's a bad thing? Why wouldn't consumers feel entitled, as opposed to treating spending money like an act of charity? It's pretty weird how the mindset of gamers has shifted over the years to the point where the fact that they are consumers has been forgotten.

I say entitled because gamers should just be happy, be happy with the hardware you have even if it can't put out 4k, turn off the FPS counter, play the game. If you're enjoying it, who cares if it occasionally dips down to 55? The entitlement comes from expecting game makers to produce games that run flawlessly at ultra settings on hardware that's several years old. If you want that luxury, you have to spend a shitload of money on the top of the line gear, otherwise just be happy with your rig.

Products are just products designed to get money out of people. I don't have an appreciation for them like they're some sports team. It comes down to simply whether something is worth spending money on or not. Being entitled is a good thing, since it discourages consumerist behavior; a lot of people could use less frivolous spending in their lives.

You can try to spin it as a negative, but I find this hail corporation approach to consumerism very odd. Wanting more value for the money is a good standard to have.

I'm actually agreeing with you, people should be happy to play the games on their older hardware even if it can't pull down the ultra specs. We don't need to always be buying the latest generation of GPUs, it's okay to play on medium specs. We don't have to have the top of the line latest card/processor/drive, we can enjoy ours for years, even if it means newer games don't play on ultra. If you have the funds to buy new ones every generation, more power to you, but I buy my cards to last 8-10 years. The flipside is just expect that the games won't run on ultra.

People should expect more optimization from the games they look into and better price-for-performance offerings from hardware. The approach of pushing what is acceptable further into the premium tier leads to worse consumer offerings over the long run. What counts as acceptable hardware has gotten more and more out of reach each generation, while disposable income has not kept up.

Complacency and a constantly falling bar for what is acceptable lead to worse standards. Bad prices and poor optimization should not get a pass. Running PR for companies ("be happy with your hardware or performance") is not the job of consumers, aside from those who are paid to run those kinds of campaigns.

Hmm, I don't know if you've ever noticed, but there's usually very little difference between ultra and high/very high, yet a big difference in performance. Ultra settings were always designed to make the PC sweat, and I assume it's similar with Starfield. Then there's the advent of 4K, which pushed this ridiculous standard even higher (and on PC it makes very little sense unless you play from your couch like on a console). In fact, old graphics cards faring this well is an anomaly rather than the standard.

That's the thing - I'd say this game is pretty well optimized. People have unrealistic expectations of what their hardware can do. That's a better way of putting it than "entitled".

None of the 3D Bethesda games played this well at release. I speak from first hand experience building PCs since 1999 and playing Oblivion, FO3, NV, Skyrim, and FO4 at release. Playing those games on years old hardware required lower than native resolutions and medium settings - exactly what you see in Starfield currently.

I'm running it on a Ryzen 1600 AF and a 1070. NOT Ti. 1440 at 66% resolution. Mix of mostly low some medium. 100% GPU and 45% CPU usage. 30 fps solid in cities. I won't complain at all. I'm just happy it runs at all solidly under minimum spec.

This is a great way to view it, and I think you're getting excellent performance for that card. Kudos to you for getting it running!

Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU

Just specifying the series doesn't really say much. Based on that and the release year you could be running a 5600X and RTX3060 or you could be running a 5950X and RTX3090. There's something like a ~2.5x performance gap between those.

I'm happy with my games at 1080 and I'm going to be sad when they start requiring higher resolutions.

Curious if you can name one thing Starfield is doing that wasn't possible in a game from 2017.

I mean, there isn't one thing you can point to and say "ah ha, that's causing all the lag"; things just take up more space, more compute power, more memory as games grow. As hardware capabilities grow, software will find a way to utilize them. But if you want a few things:

  • Textures are larger; 4K was just getting rolling in 2017 (pre-RDR2, after all), and to accommodate 4K, textures had to be scaled up (and remember, that's width and height, so 4x the memory and 4x the space on the drive)
  • Engines have generally grown to be more high-fidelity, including more particles, more fog (not in Starfield, but ray tracing, which is younger than 2017), etc. All of these higher-fidelity items require more compute power. Things like anti-aliasing, for example: they're always something like 8x, but that's 8x the resolution, and resolutions have only gone up, again rising with time.
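The 4x figure in the first bullet is just arithmetic: doubling both width and height quadruples the pixel count. Here's a rough sketch of the VRAM footprint, assuming uncompressed RGBA8 textures (real games use block-compressed formats like BC7, which cut this by roughly 4-8x, but the scaling ratio stays the same):

```python
# Rough VRAM footprint of an uncompressed RGBA8 texture (4 bytes per pixel).
# A full mipmap chain adds about one third on top of the base level.
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

MB = 1024 * 1024
print(round(texture_bytes(2048, 2048) / MB, 1))  # ~21.3 MB for a 2K texture
print(round(texture_bytes(4096, 4096) / MB, 1))  # ~85.3 MB for a 4K texture
```

Going from 2K to 4K really is 4x the memory per texture, which is why texture packs dominate install sizes even though (as pointed out below) they barely touch frame rate as long as everything fits in VRAM.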

I don't know, what do you want? A list of everything that's happened since then? Entire engines have come and gone in that time. Engines we used back then are on at least one new version by now, Starfield's included. I don't understand what you're asking, because to me it comes off as "Unreal 5 has the same settings as 4, so it's basically the same."

Textures are larger; 4K was just getting rolling in 2017 (pre-RDR2, after all), and to accommodate 4K, textures had to be scaled up (and remember, that's width and height, so 4x the memory and 4x the space on the drive)

Texture resolution has not considerably affected performance since the 90s.

Changing graphics settings in this game barely affects performance anyway.

Things like anti-aliasing, for example: they're always something like 8x, but that's 8x the resolution, and resolutions have only gone up, again rising with time.

Wtf are you talking about? Nobody uses SSAA these days. TAA has basically no performance penalty, and FSR actually improves performance when used.

If you're going to try and argue this point at least understand what's going on.

The game is not doing anything that other games haven't achieved in a more performant way. They have created a teetering mess of a game that barely runs.

Texture resolution has not considerably affected performance since the 90s.

If this were true, there wouldn't be low-resolution textures at lower settings; higher resolutions take up far more space, memory, and time to compute. I'm definitely not going to re-learn what I know about games from Edgelord here.

You're being disingenuous mate. On a machine with adequate VRAM there is zero performance difference.

ohhhh so IT DOES affect performance at all 😱

Only if you run out of VRAM. If there's sufficient VRAM the frame rate barely changes between Lowest and Highest texture quality.

Texture resolution has not considerably affected performance since the 90s.

lol. Try playing a game with 4K textures at 4K on an NVIDIA graphics card without enough VRAM and you'll see how it affects your performance 😅

I wouldn't say that Starfield is optimized as hell, but I think it runs reasonably and many people will fall flat on their asses in the next months because they will realize that their beloved "high end rig" is mostly dated as fuck.

To run games on newer engines (like UE5) with acceptable framerates and details you need a combination of modern components and not just a "beefy" gpu...

So yeah get used to low framerates if you still have components from like 4 years ago

Changing graphics settings in this game barely affects performance anyway.

That sounds like you are CPU bound...

If a 5950 is CPU bound then the game is badly optimised.

I don't know and I don't care what is wrong with your system, but the AMD driver tells me I'm averaging 87fps with high details on a 5800X and a Radeon 6900, a system that is now two years old, and I think that's just fine for 1440p.

So yeah, the game is not unoptimized. Sure, it could use a few patches and performance will get better (remember, it's a fucking Bethesda game for Christ's sake...), but for many people the answer will be to upgrade their rig or play on Xbox.

The game might be much more CPU bound on Nvidia cards. Probably due to shitty Nvidia drivers.

I have a 5800X paired with a 3080 Ti and I can't get my frame rate to go any higher than 60s in cities.

Sorry to hear that, no problems here with an AMD card, but I've been team AMD all my life so I have no experience with NVIDIA cards and their drivers.

PC gamers enjoyed a bit of a respite from constantly needing to upgrade during the PS4/Xbone era. Those machines were fairly low end even at launch and with them being the primary development formats for most games, it was easy to optimize PC ports even on old hardware.

Then the new consoles came out that were a genuine jump in tech again as consoles used to be, and now PCs need to be upgraded to keep up and people that got used to the last decade on PC are upset they can't rock hardware for multiple years anymore.

I'm running it on a Ryzen 5 2600 and an RX 570, and it seems to run relatively well other than CTD every hour or so.

I have a PC with 5800X, 3080 Ti, and 64 GB DDR4-3600. I play at 1440p with 80% render scale, Medium-High settings (mostly Medium) and it's barely above 60 FPS outdoors. It runs like shit.

Luckily it can go 140+ FPS indoors.

I'm curious, I have a 3080 as well and I'm getting ultra across the board averaging 60fps (maybe a setting or two at high), also at 1440p. Installed on an SSD, right? Render scale for me is 75%. The only other thing I can think of is that I overclocked my RAM, but I don't think that'd account for that huge of a jump.

Exactly my point. I want 90 FPS at least and lowering the settings didn't help at all.

Oh, well then I'd readjust expectations. Doom and other fast-paced shooters usually go that high because they have quick, fast-paced combat, but RPGs focus on fidelity over framerate. Hell, Skyrim at launch only offered 30fps, and as I mentioned, I never got above 45 in Cyberpunk. 60 in an RPG is really good; don't let the number on the screen dictate your experience. Comparing a fast shooter and an RPG like this is apples and oranges.

I'm honestly shocked a game like this can run at 60fps. Below 45 I start to get annoyed in RPGs. I'd expect that if you want framerates that high, you may need to window it at 1080p and lower the settings further.

Nah 60 is not good enough for me. I'm fine if it's a mobile game or handheld. I have no problems getting 90 FPS minimum in A Plague Tale: Requiem and Cyberpunk 2077.

In Starfield, not even 720p with lowest settings will help because the game is very heavily dependent on CPU. Looking at HW Unboxed benchmarks, the 5800X only managed to do 57 FPS average. You need a 7800X3D or a 13600K to get 90 FPS average.

As long as you know you're definitely not in the key demographic then; for RPGs, 60fps is pretty much the standard. Fine if you want more, but the game was not built as an FPS, it was built as an RPG. The people I'm annoyed with are the ones complaining at Bethesda for not building an RPG that runs like you describe on hardware that's several years out of date already; that's just not possible.

Bullshit, there's no "standard" FPS for a certain genre. Also the 3080 Ti is a $1200 last gen GPU and the 5800X is a $450 last gen CPU. It's ridiculous that they can't even push 100+ FPS at the lowest settings. The CPU overhead in this game is insane. I used to target 120 FPS minimum for all games I play, hence the high-end build, but now even 90 FPS is too much? lmao

How about people with a Ryzen 5 5600 and RTX 3060 that wants to play at 60 FPS? Keep in mind that we're not talking about 120 FPS, just measly 60 FPS and those parts are barely 2 years old.

Why does it need to go above 60fps? It's not a twitch FPS where every bit of latency counts. It's an RPG and 60 is perfectly smooth.

60 FPS is quite smooth and playable but far from perfectly smooth. There's still noticeable juddering on continuous camera motion.
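The smoothness argument above really comes down to frame-time arithmetic: the per-frame budget is 1000 ms divided by the frame rate, so the jump from 60 to 90 or 120fps shrinks each frame's window considerably, and a single frame that overshoots its budget reads as a visible judder even when the average still says "60". A quick sketch:

```python
# Per-frame time budget in milliseconds for a given frame rate.
def frame_time_ms(fps):
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))   # 16.7 ms per frame at 60fps
print(round(frame_time_ms(90), 1))   # 11.1 ms at 90fps
print(round(frame_time_ms(120), 1))  # 8.3 ms at 120fps
```

This is also why frame-time consistency (1% lows) matters more to perceived smoothness than the average fps number alone.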


That's why I will never PC game. You spend thousands on your gaming PC and it can't play a game that comes out a year later.

Wait, what? I have to buy a PC about every 10 to 15 years and it doesn't cost me "thousands". Last year, I bought one for about $700 and I can run every game at maximum settings with no issues.

Just wait for components to be on sale (it happens often) and you'll have a good pc for a very good price.

What did you buy for $700 that can play every game at max?

I spend something equivalent to around $2k in USD every 5-7 years and I'm fine.

To be fair that's a lot more money than a console.

To be fair people who pay thousands are probably perfectly fine with Starfield, although they may have to be satisfied with 120fps instead of 240fps.

The ones mainly hurting are the ones with similar budgets as console gamers. And console gamers are hardly unfamiliar with performance issues.

Being a pc gamer has much more to do with what ecosystem you get to tap into, rather than how much you’re spending.

Actually, it's usually quite the opposite: games on consoles run well, while on PC they can be a buggy mess. Granted, on PC they can look better, but optimization is mostly done for console players.

The game averages 75fps with a 4090 or 7900xtx at 4k high settings

I'm sure a $1k+ PC can run pretty much any game, but not at the same graphics quality as a $4k HPC that helped create it.

Just stop chasing trends and play everything on a 5-10 year delay. Or better yet, just play indie. I save so much money and my backlog is so long I don't even have time to play all of it.

Any recommendations?

I've just installed STALKER: Anomaly, a total conversion mod for STALKER: Call of Chernobyl. If you've never played the STALKER series before, give it a shot; it's one of my favourite games of all time.

Operation Harsh Doorstop and Ravenfield are fun FPSes to casually dick around in. I like them for their growing Steam Workshop mod scenes, especially Ravenfield's.

Return of the Obra Dinn and Disco Elysium are two games on my backlog that sadden me every time I see them, because I'd love to finish them but I just can't find the time.

Project Zomboid and Deep Rock Galactic are fun times with people.

I was gifted Frostpunk and Outer Wilds. I haven't got around to playing them yet, but my brother loved them.
