They’ll save 60fps for the Special Edition in 3 years.
Only on the “Pro” consoles that will have been released by then :).
Xbox Series X X Xbox X (Elite edition)
XxXboxelite69XxX
deleted by creator
Every 10fps a separate DLC.
Only comes in a $50 bundle with horse power armor and various skins directly imported from older games.
Most generous ZeniMax executive.
I’d rather see consoles be limited to what they can handle than a game to be limited for everyone because of what a single console can handle.
I want this game to be huge and look beautiful. If my PC can handle 60fps I don’t want to be locked to 30fps because that’s all an Xbox can handle. And if I want to play it on an Xbox I don’t want it to be a blurry mess just to get 60fps; I want it to look as good as it possibly can. Especially in a game like this, where the visuals do a great majority of the storytelling when it comes to exploration and finding new things.
Fully agree, I hate this recent trend of consoles effectively bottlenecking PCs.
It will always happen. Games are going to generally target the lowest common denominator: the weakest of the current generation mainstream consoles. Right now, that means Xbox Series S.
Yeah, but they can go above the bottleneck on the PC by committing to less on consoles, as per the article.
It still seems way too common for an engine to have other systems tied to FPS, so e.g. running at a higher framerate will mean the physics engine also runs faster, or all animations are faster.
As a game dev: this is 100% the developers fault. The engine knows how long it’s been between frames. Devs can use that information to keep everything running at the same pace regardless of 30fps, 10, or 120.
Next time you see a game with its speed tied to the frame rate, make sure you make some noise! Maybe devs will finally fucking learn.
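To make the delta-time point concrete, here’s a minimal sketch (toy Python, hypothetical names, not any real engine’s code) of frame-rate-independent movement:

```python
# Frame-rate-independent movement: scale each per-frame update by the
# elapsed time (delta time), so simulation speed is the same at any fps.

def advance(position, velocity, dt):
    """Move by velocity * elapsed seconds, not by a fixed step per frame."""
    return position + velocity * dt

def simulate(seconds, fps, velocity=10.0):
    """Run a fixed wall-clock duration at a given frame rate."""
    dt = 1.0 / fps
    position = 0.0
    for _ in range(int(seconds * fps)):
        position = advance(position, velocity, dt)
    return position

# Same distance travelled whether the game renders at 30 or 120 fps:
print(round(simulate(2.0, 30), 6))   # 20.0
print(round(simulate(2.0, 120), 6))  # 20.0
```

Tie the update to a fixed per-frame constant instead of `dt` and the world literally runs faster at higher framerates, which is exactly the bug being complained about.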
Oh, agreed with you. It’s a problem that shouldn’t happen. Yet somehow it still does 🤷
I wouldn’t be too sure that fps and physics are tied; they managed to separate them for 76.
Good then, because this is quite literally what Bethesda is doing with Starfield.
The fps lock isn’t on PC.
Yes, I was praising that. I may have worded it in a confusing way.
As someone who doesn’t mind 30fps, there shouldn’t be games running at 30 on new gen hardware anymore lol
If you target 60fps you have to be more conservative with poly counts, draw calls, shader complexity, rendering capabilities, etc. At 30fps you have more you can play with on the rendering side and can technically have better visuals. It’s a dev decision. Devs will always need to make that decision until there are no hardware limitations.
And in this case they made the wrong decision imo.
Games like Minecraft, RuneScape, or WoW are still popular, so why the hell are studios spending this much of their performance budget on having 4K resolution on every rock, tree, and dust mite?
Beth has historically had to make serious gameplay concessions because of consoles. Console limitations killed open cities and levitation on their engine in Oblivion.
I don’t mind if they play it safe with Starfield.
And the PlayStation ports of their games were always terrible. Like the further south you went in Oblivion, the longer it would take to load a town. Sometimes Leyawiin would take 5+ minutes to load.
Skyrim had that stuff with data corruption and the upside down dragons.
While I don’t remember, I’m sure the fallout PlayStation versions had their own issues. So I’m glad Bethesda is solely Xbox/pc now because the PS versions were a distant afterthought anyways.
30fps is fine so long as it’s not a crutch. And since it’s on game pass day one, if it’s terrible all I’ve done is waste bandwidth downloading it, and not $70.
Double the frame rate is always better than marginally better visuals no one would even notice unless you have a magnifying glass to compare side by side
This right here. As a 40+ gamer, I don’t mind 30fps. Been dealing with lower fps for a long, long time and it’s fine for me. But that just seems like an unreasonably low expectation for AAA video games these days.
What’s really weird to me is the hard 30fps cap. Why not have at least an option to disable the cap and let VRR do its job?
deleted by creator
Not a deal breaker for this kind of game but a 60fps performance mode on series x at 1080p would’ve been a nice option.
Playing TOTK right now on switch and it really proves how great games can overcome technical limitation. A masterpiece at 30fps is still a masterpiece. Here’s hoping Starfield can deliver as a great game first and foremost.
If the bottleneck is something like AI or physics calculations on the CPU then lowering the rendering resolution won’t help achieve a higher framerate unfortunately.
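A toy model of why (illustrative numbers only, not measurements of any real game): a frame can’t finish until both the CPU work and the GPU work are done, and only the GPU cost scales with resolution.

```python
def fps(cpu_ms, gpu_ms):
    """Simplified frame-rate model: the slower of CPU and GPU sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: dropping 4K -> 1080p (~1/4 the pixels) helps a lot.
print(round(fps(cpu_ms=12.0, gpu_ms=28.0), 1))      # 35.7
print(round(fps(cpu_ms=12.0, gpu_ms=28.0 / 4), 1))  # 83.3

# CPU-bound (heavy AI/physics): the same resolution drop changes nothing.
print(round(fps(cpu_ms=28.0, gpu_ms=12.0), 1))      # 35.7
print(round(fps(cpu_ms=28.0, gpu_ms=12.0 / 4), 1))  # 35.7
```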
I suspect most games shipping this gen without 60 FPS modes are CPU bound.
That’s a great point.
The game isn’t even out yet and it is already a disappointment
I love how Microsoft said that with their exclusive titles they’ll only have to focus on one console and as such the performance will be better. Now here we are and seemingly all of these titles run at 30 FPS. I just hope they will offer a performance option, even if it runs at a lower resolution. Having these options is exactly what keeps me on the PC platform.
This isn’t surprising. Todd Howard already stated that given the choice between fidelity or framerate they would choose fidelity every time. It’s disappointing that he thinks that’s still what people want in 2023, but it’s not surprising.
That’s absolutely what I want in 2023. Anything over 30fps is completely unnecessary outside of competitive multiplayer.
I find 30 FPS strictly bearable with a controller, unplayable on mouse and keyboard
Yeah 30 to 60 is a big difference. Past 60 things definitely start looking real samey though.
I wouldn’t say that’s accurate at all. Especially since providing more options to players is never a bad thing, letting them pick performance vs quality.
At least give a performance/fidelity toggle like many games. Especially with how similar the Xbox architecture is to Windows, I’ve always wondered why devs can’t use some of the same tools to give console players more graphical options.
I don’t think I’ve played a native/ported PS5 game yet that didn’t have a toggle. This is an odd choice for sure.
This helps the game be better on PC and later consoles. I’m down for it, it’s a welcome decision. And yes, I’m still salty they nixed levitation in Oblivion because of consoles.
I’m curious about this kind of thing from an engine and console architecture perspective. Any gamedevs able to shed some light?
I work in the industry, but not directly on low-level engine implementation details. Personally, my gut feeling is that the Creation Engine is falling behind in terms of modern asset streaming techniques.
Similarly, I wonder if a lack of strong virtualized geometry / just-in-time LOD generation tech could be a huge bottleneck?
From what I understand, efforts like Nanite in UE5 were an enormous engineering investment for Epic, and unless Bethesda has a massive engine team of their own (they don’t), they simply won’t be able to benefit from an in-house equivalent in tech.
Ultimately, I do think the lack of innovation in the Creation Engine is due to internal technical targets being established as “30FPS is good enough”, with frame times below 33ms being viewed as “for those PC gamers with power to spare.”
My best guess would be that the engine just has vast amounts of technical debt. Skyrim (pre-LE at least) had a savegame corruption bug that has been around since Morrowind. And while I’m sure they have rewritten huge parts of the engine over the decades it’s not rare to see bugs persist over generations, and modders complaining loudly about it. The engine has never been great about asset streaming either so no surprise here.
I think they should at least give console players the choice between 4K 30fps or 1080p 60fps. Let’s be realistic here: 4K 60fps for a game of this size in this engine will require a BEEFY machine, nothing a current gen console can offer.
The game is probably CPU bound, not GPU bound, based on past Bethesda games. If that is the case, decreasing the resolution will not necessarily increase the frame rate a proportional amount.
Idk, if they release a game in 2023 that is still CPU bound that would be a big L from them. I really hope that’s not the case.
Especially because I bought a freaking 7900 XTX mainly for Starfield :D
Idk, if they release a game in 2023 that is still CPU bound that would be a big L from them.
This is Bethesda we’re talking about here lmao. Starfield is still running on the Creation Engine, which they’ve been hacking together since the Morrowind days.
I feel that’s an unfair thing to say. Let’s be real here, the Creation Engine that runs Starfield isn’t the same engine that ran Fallout 4 or Skyrim; it’s a new version of that engine. When Unity, Unreal, or Anvil (just to name a few) release new versions of their engine, everyone is like “wow, so much better, so many more possibilities.” When Bethesda releases a new version of the CE, everyone is like “yeah, but it’s still the garbage CE,” even though the CE is a very powerful engine when you think about what it really enables them to do (and how easily accessible it is for modders).
My point is, of course we can end up with a CPU bound game again, but before we know for sure, we should give Bethesda the benefit of the doubt.
Idk, facial animation is still honest to god the worst in the industry. Worse than Gollum. Can’t blame people for saying it’s the same engine when the faces you spend 30% of the game staring at look as bad as they do.
I mean, the lack of animations in general is something that bothers me about Bethesda games. I am currently playing RDR2 and people in this game actually feel like people, not videogame characters.
Oh come on Todd, if there’s headroom you could’ve at least given an option to run it at 40fps.
40 FPS is such a good compromise. It feels great on a compatible TV.
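The math behind why 40fps is such a sweet spot, and why it needs a 120Hz-capable TV: in frame-time terms (the metric that matters) 40fps sits exactly halfway between 30 and 60, and 120Hz divides all three modes into a whole number of refreshes.

```python
# 40fps = 25 ms/frame, the exact midpoint of 33.3 ms (30fps) and 16.7 ms
# (60fps); at 120Hz each mode holds a frame for a whole number of refreshes.
for target in (30, 40, 60):
    frame_ms = 1000 / target
    print(f"{target} fps = {frame_ms:.1f} ms/frame, {120 // target} refreshes at 120Hz")
```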
So on a series X you are forced to run it at 4k?
I think that’s just the display resolution. I expect this game will use dynamic render resolution like most games these days. The render resolution will probably not hit 4K often (if at all).
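For anyone unfamiliar, a toy sketch of how dynamic render resolution typically works (thresholds and step size are made up, not from any real console SDK):

```python
TARGET_MS = 33.3  # frame budget for a 30fps cap

def adjust_scale(scale, last_frame_ms, step=0.05):
    """Shrink the render scale when over budget, grow it back when there's slack."""
    if last_frame_ms > TARGET_MS:        # over budget: render fewer pixels
        return max(0.5, scale - step)
    if last_frame_ms < TARGET_MS * 0.9:  # comfortable headroom: sharpen up
        return min(1.0, scale + step)
    return scale                         # close to budget: hold steady

scale = 1.0
for frame_ms in (36.0, 35.0, 31.0, 28.0, 27.0):
    scale = adjust_scale(scale, frame_ms)
    print(f"{frame_ms} ms -> render at {scale:.0%} of output resolution")
```

The output resolution (what the TV receives) stays 4K the whole time; only the internal render target shrinks and grows, which is why a “4K” label tells you little about the actual pixel count.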
Nothing wrong with that. They want stability instead of poor performance, and I can appreciate that decision. Games are best played on a system you can actually upgrade anyway. If you play console, this is something you should expect, maybe not now but in the near future as gen (pick a number) comes out.
I think this is more of a “decent graphics and 30 fps, or worse graphics and 60 fps” question. Personally I would like to have a performance setting where one can choose to downgrade the graphics for more fps.
Part of me hopes that they do a demo or open beta or something so that we can see what it’s like at 30fps
Peasants get the shaft again, unsurprising
In an imaginary world where I’ve pored over Bethesda’s engine source for days, I wonder if I might discover that:

- Asset formats and/or orchestration code used for asset streaming in the Creation Engine are not optimized to a degree where scene graphs can be effectively culled based on camera frustum or player proximity without noticeable dips in frame-time. It simply takes too long to pause actor simulations, or too long to stream assets back into memory and reintroduce objects to the scene graph.
- Virtualized geometry or other magical low-overhead auto-LOD solutions aren’t in place. As far as I understand it, efforts like Nanite in UE5 were an enormous engineering investment for Epic, and unless Bethesda has a massive engine team of their own (they don’t), they simply won’t be able to benefit from an in-house equivalent in tech.
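The culling idea in the first bullet, as a deliberately simplified sketch (hypothetical structure, not the Creation Engine’s actual scene graph):

```python
from dataclasses import dataclass

@dataclass
class Actor:
    name: str
    distance_from_player: float
    active: bool = True

SIM_RADIUS = 100.0  # made-up simulation/streaming radius

def cull(actors, radius=SIM_RADIUS):
    """Pause actors outside the radius, wake those inside, report who's live.

    The hard part in a real engine isn't this check; it's making the
    pause/stream-back-in transitions cheap enough to do every frame.
    """
    for actor in actors:
        actor.active = actor.distance_from_player <= radius
    return [actor.name for actor in actors if actor.active]

scene = [Actor("guard", 12.0), Actor("merchant", 95.0), Actor("dragon", 400.0)]
print(cull(scene))  # ['guard', 'merchant']
```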