I’m going to sound a little pissy here but I think most of what’s happening is that console hardware was so limited for such a long time that PC gamers got used to being able to max out their settings and still get 300 FPS.
Now that consoles have caught up and cranking the settings actually lowers your FPS like it used to, people are shitting themselves.
If you don’t believe me then look at these benchmarks from 2013:
https://pcper.com/2013/02/nvidia-geforce-gtx-titan-performance-review-and-frame-rating-update/3/
https://www.pugetsystems.com/labs/articles/review-nvidia-geforce-gtx-titan-6gb-185/
Look at how spiky the frame-time graph was for Battlefield 3. Look at how, even with triple-SLI Titans, you couldn’t hit a consistent 60 FPS in maxed-out Hitman: Absolution.
And yeah, I know high-end graphics cards are even more expensive now than the Titan was in 2013 (due to the ongoing parade of BS that’s been keeping GPU prices high), but the systems in those reviews are close to the highest-end hardware you could get back then. Even if you were a billionaire you weren’t going to be running Hitman much faster (you could put one more Titan in SLI, which had massively diminishing returns, and maybe overclock everything).
If you want to prioritize high and consistent framerate over visual fidelity / the latest rendering tech / giant map sizes then that’s fine, but don’t act like everything was great until a bunch of idiots got together and built UE5.
EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.
The issue is not that the game’s performance requirements at reasonable graphics settings are absolutely destroying modern HW. The issue is that once you set the game to low settings it still performs like shit while looking worse than a 10-year-old game.
EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.
You can preload them if you want, but that leads to load screens. It’s a developer issue, not an Unreal one.
No matter what, you’ve got to compile the shaders, either on launch or on demand. The game should be caching the results of that step, though, so the next time it’s needed it can be skipped entirely.
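That compile-once-then-cache idea is simple to sketch. A toy version in Python (all names here are hypothetical, and `compile_shader` is a stand-in for the expensive driver/engine compile step, not any real API):

```python
import hashlib
import tempfile
from pathlib import Path

def compile_shader(source: str) -> bytes:
    """Stand-in for the expensive compile step; returns fake bytecode."""
    return b"BYTECODE:" + source.encode()

class ShaderCache:
    """Caches compiled shader bytecode on disk, keyed by a hash of the source.

    The first request pays the compile cost; later requests (even across
    game launches) hit the disk cache and skip compilation entirely.
    """
    def __init__(self, cache_dir: Path):
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(parents=True, exist_ok=True)
        self.compiles = 0  # how many real compiles we paid for

    def get(self, source: str) -> bytes:
        key = hashlib.sha256(source.encode()).hexdigest()
        cached = self.cache_dir / f"{key}.bin"
        if cached.exists():
            return cached.read_bytes()       # cache hit: no compile
        bytecode = compile_shader(source)    # cache miss: compile once
        self.compiles += 1
        cached.write_bytes(bytecode)
        return bytecode

# The second lookup of the same shader skips compilation entirely.
cache = ShaderCache(Path(tempfile.mkdtemp()) / "shaders")
cache.get("float4 main() { return 1; }")
cache.get("float4 main() { return 1; }")
print(cache.compiles)  # -> 1
```

In a real engine the cache key would also include the GPU model and driver version, since compiled shader binaries aren’t portable across them, which is why a driver update can re-trigger a full compile pass.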
GPUs do cache them.
That’s why compiling on launch/loading screens works.
I rarely have a good time with UE4/UE5 games, performance is often rough and while on a technical level the graphics are ‘better’, I often don’t think they look as pleasant or feel as immersive as older games.
I’ve seen a lot of talented devs explain that UE5 does give devs the tools to pre-cache shaders but since AAA studios rush everything, it ends up being low priority compared to maximizing the graphics. It’s not hard to believe considering games are pushed out the door with game-breaking bugs nowadays.
But it does raise the question of why the engine doesn’t do that itself. UE4 games ran like a dream, but this generation has felt like nothing but stuttering and 20 minutes of compiling shaders every time you open a game for the first time…
A lot of UE4 games had big issues with shader compilation stutter. This is nothing new.
20 minutes of compiling shaders every time you open a game for the first time…
Shiiit, Stalker 2 be compiling shaders every time I launch it!
Most games made in UE are AAA games, where every A stands for more scam, jankiness, and less value overall. Very rushed, no love, made to barely work on “my machine” (4090). Many Unity games are smaller cash grabs.
Most devs that fulfill at least one criterion well (e.g. gameplay, performance, stability) are either small studios with their own engine (4A Games, Croteam, Minecraft) or publishers with one banger per five years or so: Valve (lost it with CS2 tho), Rockstar. Because those devs put either love, time, or both into their games.
Hugely disappointed in Stalker 2…
But after that article I’ll give it another shot sooner than I was going to. I never thought that horrible performance could have been shaders loading in the background.
If that’s what was going on, then they really need to make that more obvious, or lock people in a sort of training area until it’s done and then start the actual game.
A couple weeks and it’ll probably be a lot better.
But my initial thought, before the article: I think the mistake was watching huge-budget games designed from the ground up to be a showcase for the engine and assuming that’s what any third-party studio could crank out.
UE5 has amazing potential, but it still needs good code running on good hardware to get Selene’s results.
Wasn’t this the game developed under siege for a while, then the studio fled to set up in another country?
That’s what I mean.
Everybody had unrealistic expectations, myself included.
My PC isn’t a slouch, but everybody who got early play has top of the line shit and there’s a large discrepancy in PC hardware these days.
Apparently it’s not shaders, but I had to check what resolution it was at, thinking it was defaulting to 720p or something. With everything cranked to 4K and only the usual performance hogs off the highest settings, it looked bad. 1080p with everything turned down still had stuttering tho.
I didn’t put much effort in and my experience was launch day.
So people should definitely try it for themselves if they have it for free from Xbox for PC…
I just expected it to be amazing on boot when I shouldn’t have.
I ran it on default recommended settings (High, 3440x1440) and it’s smoother than any of the originals were, even after they had years of patches. I experience some mild stuttering when I approach a hub area with lots of NPCs but it’s not terrible. I can’t really complain. 3090 and 5800X3D.
Pretty fun for me so far. There’s some weirdness with dudes spawning too close and A-Life AI seems to be missing but I’m enjoying the zone so far after 15 hours or so.
3090 and 5800X3D.
Yeah. I’m on a 4070 Super and a 7800X3D.
Like I said, I went in expecting it to look like Senua’s Saga: Hellblade II on boot. And there was just no reason for me to have done that.
I’ll give it a month or so and then mess with settings/drivers/etc and it’ll probably be fine. It’s just even when I tried turning stuff down I was having issues, but I haven’t put a lot of effort into getting it right.
Just because the engine is capable of crazy stuff, doesn’t mean every game will push it to its full potential, and that’s fine. That’s how engines last for a long time and that’s good for all of us in the long run.
They’re definitely not pushing the engine to its limits and it’s a shame. No Ray Reconstruction for example and no hardware ray tracing. I was wondering why shadows and reflections lacked clarity at first. This is apparently why.
It’s a weird one though because despite all the flaws I can’t stop playing the game. Maybe I just love STALKER that much. I also have a bunch of mods installed, granted.
It’s an emergent simulation fps. Nothing like it.
The game does not suffer shader compilation stutters. Rather, it’s heavily CPU limited for whatever reason.
5800X3D / 6800 XT / 32 GB, all overclocked/undervolted, playing at 1440p with DLSS and FSR enabled. My settings are a combo of Epic and High.
I didn’t encounter game-breaking bugs and hover around 130 FPS, with dips to 80 in heavy weather. No macro or micro stutter.
The only complaints I have are the spawning system combined with weak enemy AI, and my expectation that the factions would be fighting each other more, because that’s what I’ve seen and enjoyed in the modded Stalker games before.
I really like the game as it is; it looks amazing and the atmosphere is top notch.
I’ve seen a lot of complaints online about the game’s performance. But my ‘okay’ computer is handling the game at max settings just fine. I’m kind of confused. Is it because I’m using Linux?
But my ‘okay’ computer is handling the game at max settings just fine.
Yeah, that’s the issue.
Your comp is running maxed settings at what you consider a serviceable framerate, while you admit your PC is just “okay”.
Everyone with a better comp than you is also running at max settings, seeing the same graphics you are, at probably close to the same average frames and dips. But we’re used to better graphics at higher frame rates with zero stutter/dips.
I’ve talked about this issue in the past, and it’s hard to explain. But a properly optimized game shouldn’t really run with everything maxed out on release except on the very top hardware.
What’s currently the max setting should be “medium”, because lots of people can handle it.
Your experience wouldn’t change at all, there’d just be the higher graphical settings available for people who could run them.
Think of it like buying the game on PS5 Pro, and then finding out that it plays exactly the same on the PS4. It’s not that you’d be mad that the PS4 people get a playable version, it’s that you don’t understand why that’s comparable to the newest gen console version. And compared to games that use your PS5 pro’s full power, it’s going to seem bad.
People (myself included) just assumed that since it was UE5, they’d at least be giving us the options UE5 was updated to support.
It seems they did it for future proofing the game, which 100% makes sense. Hopefully they add that stuff in with updates later.
Like, it doesn’t support hardware ray tracing…
And it doesn’t have non-ray-based lighting either. It forces everything to software ray tracing, which is a huge performance hit for people with hardware that can do ray tracing, but is completely unnoticeable to people with hardware that can’t. They may even see better graphics than in a game that uses traditional lighting.
Like, I’m just a hobbyist nerd; I don’t really know all the ins and outs of what’s going on with Stalker 2. But it seems like this is a game that caters to the average PC gamer to the point that everyone with an above-average PC wasn’t even an afterthought.
I’m sure there are going to be a lot of people who know more than me looking a lot closer at why the reaction to this game has been so varied.
I’m using Linux with a Radeon 7900 XTX and I can’t get over 120 fps
People are finally catching on.