This incessant nagging about fps is the most tiresome thing in gaming since gamergate.
I agree up to a point. If a game is at 30 and feels good to play, then I’m OK. For example, Zelda feels great. Controlling Link is tight and snappy.
On the other hand, if the game has bad frame pacing (like Bloodborne), playing at 30 feels really bad (some rough numbers on what I mean at the end of this comment).
I try not to get too crazy about frames, but sometimes some games just don’t feel good.
I will say, though, that while I really like channels like Digital Foundry, I sometimes wonder if them picking apart games to show the most minor frame dips is slowly teaching us to see these things, and as a result we kind of subconsciously will be like, “Well now I noticed this game had some moments where the frames dropped during an explosion. Obviously it’s a bad game.” I know that’s some hyperbole, but still.
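To put rough, made-up numbers on the frame-pacing point (this is just an illustration, not anything measured from Bloodborne): two games can both average "30 FPS" and feel completely different.

```python
# Two hypothetical "30 FPS" games: one with even pacing, one alternating a
# fast frame with a slow frame. The average is the same, the feel is not.
even = [33.3] * 30          # every frame ~33.3 ms
uneven = [16.7, 50.0] * 15  # frames alternate 16.7 ms / 50 ms

for name, times in [("even", even), ("uneven", uneven)]:
    avg_fps = 1000 / (sum(times) / len(times))
    print(f"{name}: avg ~{avg_fps:.0f} FPS, worst frame {max(times):.1f} ms")

# Both counters read ~30 FPS, but the uneven one keeps holding a frame for
# 50 ms, and that regular hitch is what you perceive as judder.
```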
It's also heavily dependent on what you're used to. There are games with a quality and a performance mode where I sometimes start to think I'm at 60 FPS, until I switch to the actual 60 FPS mode and realize it's a completely different feeling.
Switching back makes those 30 FPS seem pretty bad. But if I didn't have the option of switching between the two, I would've been happy with the 30.
But as you said, it has to be rock stable. I played GoW Ragnarök on my PS5 and that Quality 30 FPS mode was just terrible, it felt like 20 FPS. The Final Fantasy 16 demo's is better, but there it's the overdone motion blur that bugs me enough to want to switch to the 45-60 FPS mode, where the blur is weaker.
I have no problem playing 30 fps games. It's when it goes really low, like 10 fps, that it gets choppy. Like Dark Souls in Blighttown. Now that isn't the best to play, but it's still doable and an awesome game. 30 fps games are great. 60 is butter, but not necessary.
Yeah how dare consumers expect their products to be good
Good ≠ a single metric.
Sure, but a game is objectively better if it can run at a higher framerate.
Bloodborne is excellent, but it would 100% be better if it ran at a solid 60 FPS.
Computers (including consoles) have limited resources, so at some point you need to deal with tradeoffs. Do you prioritize graphics quality, or do you prioritize FPS? Do you want/need more resources available for the physics engine? That eats into the maximum possible FPS. Do you want to do real-time procedural generation? Do you want to use the GPU to run some kind of AI? All of these are design considerations, and there's no one-size-fits-all prioritization for every videogame. Clearly the people working on Starfield believe that, for their intended game experience, graphical fidelity is more important than FPS, and that's a perfectly valid design choice even if you don't agree with it.
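A back-of-the-envelope way to see the tradeoff (the workload numbers here are made up, purely to show the budget math):

```python
# Per-frame time budget vs. a hypothetical workload:
# 18 ms rendering, 6 ms physics, 4 ms AI/simulation.
def headroom(target_fps, render_ms, physics_ms, ai_ms):
    budget_ms = 1000 / target_fps  # time available per frame
    return budget_ms - (render_ms + physics_ms + ai_ms)

for fps in (30, 60):
    print(f"{fps} FPS target: {headroom(fps, 18, 6, 4):+.1f} ms of headroom")

# At 30 FPS this workload leaves about +5.3 ms spare; the same workload at
# 60 FPS is roughly 11.3 ms over budget, so something (resolution, effects,
# simulation) has to give.
```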
It's a matter of optimization, and Bethesda games have all had pretty poor optimization. They could get it running at a higher framerate, but there's no need, because people will buy it even if it runs at 30 fps.
If it was only a matter of optimization we would all still be playing games on the original NES.
What's so revolutionary or ambitious about Starfield that it couldn't be optimized to hit an "acceptable" framerate? Pretty much everything Starfield does has been done before, and the Creation Engine isn't some visual marvel that would burn down graphics cards. So where's the performance going?