PC Imagine spending $1.5-2k on a GPU and not being able to get 60 fps in a game on max settings

Nameless King

Someone is taking a pi$$ in here.

Alan Wake 2

PC gaming is expensive, and it has been for ages. But back in the day, when you bought a top-shelf GPU, it lasted you years playing games on max settings. Now the 4090, released just last year, can't handle a very recent game.

I was going to upgrade my GPU, but to what? What's even the point of considering GPUs with less performance than the 4090? It's just a waste of money. The 4090 itself is ridiculously expensive as it is, and it's already diving.
 
<Lmaoo> @ spending $1.5-2K for a GPU.

Would have thought there's plenty of games available you could play that'd look gorgeous and at a high framerate that you haven't played yet.

Instead, you bought a brand-new game without bothering to check how it ran on your setup, and it's rarely the case that games are 100% finished on their release date. There are always patches released over the following weeks or months.

I hate it as much as you do, but that's the reality of the current gaming industry. It's best to adapt rather than insist that it change.
 
Well, back in the day you worried about a lot fewer things when maxing out a game. Really it was just setting everything to max and hoping it got like 23-30 fps. Today there's more to worry about: ray tracing, hitting x number of fps, resolutions, etc.

But yeah, that would probably annoy me. I doubt it's the graphics card either. My bet is Alan Wake needs some optimization. I also would not recommend the 4090, due to the horror stories of them catching fire, even for people who took care plugging them in. That part is absolutely ridiculous imo, and Nvidia is blaming customers and not honoring warranties on them. Fuck that
 
Is Alan Wake 2 going to get DLSS 3.5 support at some point? If it does then you should be able to get 60+ fps on a 4090.

Have you considered playing with path tracing off? You could just sacrifice path tracing in favour of better performance with regular ray tracing.
 
Is Alan Wake 2 going to get DLSS 3.5 support at some point? If it does then you should be able to get 60+ fps on a 4090.

Have you considered playing with path tracing off? You could just sacrifice path tracing in favour of better performance with regular ray tracing.
It already supports DLSS 3 frame generation:
https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/8.html

With Path Tracing enabled (the most intensive possible graphic setting), for "Quality" DLSS mode, with everything at the highest setting, at 4K resolution, the 4090 averages 63 FPS.
[chart: dlss-pt-3840.png]


And if we look at just raw rasterization at 1080p (no upscaling, but also no ray tracing):
[chart: performance-1920-1080.png]



The truth is that anyone who has fiddled with .ini settings in the past, or played games that offered more extensive control over certain graphics settings, could bring any GPU to its knees by cranking up certain features that had almost no return in graphical quality. It's been that way for decades. The developers simply tune the simplified presets most users access through the in-game menus according to the processing power of the cards in circulation at any given time. It's up to them, philosophically, how they want their game to perform at those presets. Because otherwise, why not tap the processing power that's available to you even if you mostly don't need it, even if the bulk of what you tap provides diminishing returns?

That's not really the tell. The tell is how most GPUs run relative to the "Low" or "Normal" presets in most games. If you look at things in this context, never before have GPUs been so far beyond what is actually required by the 'eye candy' AAA games.
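For context on those DLSS numbers, here is a back-of-envelope sketch of the internal pixel load. The 2/3-per-axis scaling factor for "Quality" mode is an assumption on my part, not something stated in the benchmark:

```python
# Back-of-envelope look at why DLSS "Quality" helps so much at 4K.
# Assumption (not from the benchmark itself): Quality mode renders
# internally at ~2/3 of output resolution per axis, then upscales.

def pixels(w, h):
    """Total pixel count of a w x h frame."""
    return w * h

out_w, out_h = 3840, 2160                       # 4K output
int_w, int_h = out_w * 2 // 3, out_h * 2 // 3   # 2560x1440 internal

ratio = pixels(int_w, int_h) / pixels(out_w, out_h)
print(f"internal render: {int_w}x{int_h}")
print(f"pixel load vs native 4K: {ratio:.0%}")
```

So under that assumption the card is shading less than half the pixels of native 4K before the upscale, which is why the path-traced average can clear 60 FPS.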
 
It already supports DLSS 3 frame generation:
https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/8.html

With Path Tracing enabled (the most intensive possible graphic setting), for "Quality" DLSS mode, with everything at the highest setting, at 4K resolution, the 4090 averages 63 FPS.
[chart: dlss-pt-3840.png]


And if we look at just raw rasterization at 1080p (no upscaling, but also no ray tracing):
[chart: performance-1920-1080.png]



The truth is that anyone who has fiddled with .ini settings in the past, or played games that offered more extensive control over certain graphics settings, could bring any GPU to its knees by cranking up certain features that had almost no return in graphical quality. It's been that way for decades. The developers simply tune the simplified presets used by most users according to what's on the market at any given time. It's up to them, philosophically, how they want their game to perform at those presets. Because otherwise, why not tap the processing power that's available to you even if you mostly don't need it, even if the bulk of what you tap provides diminishing returns?

That's not really the tell. The tell is how most GPUs run relative to the "Low" or "Normal" presets in most games. If you look at things in this context, never before have GPUs been so far beyond what is actually required by the 'eye candy' AAA games.

Take this perspective and forget Alan Wake for a second.

If you wanted to run a game on low to normal, then you would think a 'console-cost' GPU should do the job? I'm spending 2k on a GPU, 2k brother, that is. You could get 3-5 PS5s or whatever they sell these days for that price. What exactly am I paying for here? All these game devs and Nvidia brag about ray tracing and whatnot. How does that relate to the actual performance you get for the money you spend? It's a freaking joke, to be honest with you.
 
I'm getting 67-100 fps in Alan Wake 2 with every setting set to high, including ray tracing, and with DLSS on.
Laptop 4080 with 32 GB of RAM.
(edit) Not 4K; I have a QHD monitor.
 
Take this perspective and forget Alan Wake for a second.

If you wanted to run a game on low to normal, then you would think a 'console-cost' GPU should do the job? I'm spending 2k on a GPU, 2k brother, that is. You could get 3-5 PS5s or whatever they sell these days for that price. What exactly am I paying for here? All these game devs and Nvidia brag about ray tracing and whatnot. How does that relate to the actual performance you get for the money you spend? It's a freaking joke, to be honest with you.
This seems to miss the point. I'm merely pointing out there is more GPU overkill than ever before pertaining to the raw rasterization big picture. This doesn't mean someone on a 4090 should expect to play on low or normal settings.

As I've said many times in the past, the Xbox Series X and PS5 are the best bang for your buck in gaming hardware, hands down. There's a diminishing return on money spent as you go up the GPU ladder. You're not buying the best value; you're buying the best experience. The consoles still offer tremendous value and stability by virtue of the volume of units they move, and the fact that Sony/Microsoft put so much behind making those products successful.

However, no, they still aren't giving you the same experience. The XSX and PS5 are rendering at 1276p (2268x1276) then upscaling to 4K using FSR. Not only is there no path-tracing, there is zero ray-tracing. They target 30fps. Graphics settings are probably a combination of Medium and Low. This doesn't even qualify as "Medium" settings on a PC running at native 1440p+ at a higher framerate. Additionally, only the XSX matches PC load times. The PS5 is much slower.
https://twistedvoxel.com/alan-wake-2-ps5-vs-xbox-series-vs-pc-comparison/


Just look at the image comparisons with and without path-tracing using TPU's slider tool in the performance analysis I sent you. You don't have to squint. It's stark:
https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/5.html

Finally, as has been pointed out, the 4090 will average above 60fps on 4K with path-tracing and every highest setting if you turn on DLSS. Furthermore, for TPU on their test bench, it averaged 117fps with all Max settings at 1440p with no ray-tracing.

I don't understand what has you so upset.
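Putting the numbers quoted above side by side (the consoles' internal resolution and 30fps target vs. the 4090's 1440p result), a rough pixel-throughput comparison looks like this. It ignores the settings and ray-tracing differences entirely, so it understates the real gap:

```python
# Back-of-envelope pixel throughput from the figures quoted above:
# PS5/XSX render internally at 2268x1276 targeting 30fps, while the
# 4090 averaged 117fps at native 1440p max settings on TPU's bench.
# Settings and ray tracing differ, so this understates the real gap.

def px_per_sec(w, h, fps):
    """Pixels rendered per second at a given resolution and framerate."""
    return w * h * fps

console = px_per_sec(2268, 1276, 30)
pc_1440 = px_per_sec(2560, 1440, 117)

print(f"console: {console / 1e6:.0f} Mpx/s")
print(f"4090:    {pc_1440 / 1e6:.0f} Mpx/s (~{pc_1440 / console:.1f}x)")
```

Even on this crude measure, the 4090 is pushing roughly five times the pixels per second, at much higher settings.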
 
Take this perspective and forget Alan Wake for a second.

If you wanted to run a game on low to normal, then you would think a 'console-cost' GPU should do the job? I'm spending 2k on a GPU, 2k brother, that is. You could get 3-5 PS5s or whatever they sell these days for that price. What exactly am I paying for here? All these game devs and Nvidia brag about ray tracing and whatnot. How does that relate to the actual performance you get for the money you spend? It's a freaking joke, to be honest with you.
I’m not sure what you’re mad about. It still runs the game way better than a PS5 can. @Madmick said it was upscaling to 4k and only managing 30fps on much lower detail settings. They aren’t even close in performance. Set the 4090 to those settings and it would be doing way more frames than that.

Dollar for dollar, is it a good deal? No. It's marketed to people who want the best, and it delivers on that.
 
Eh, video cards were the same back in the day; you're wearing nostalgia goggles. The most expensive card in 1998 was $600 (the Quantum 3D Obsidian X2, which was two Voodoo 2s on one board), and it was basically obsolete within 2-3 years (unless you wanted to play at 640x480 on low settings).
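For what it's worth, a quick inflation adjustment on that $600 puts it north of $1,100 in today's money. The CPI figures below are approximate assumptions on my part, not pulled from a source:

```python
# Rough inflation adjustment for that $600 card from 1998.
# CPI-U annual averages below are approximate assumptions,
# not pulled from a live source.
CPI_1998 = 163.0   # approx. U.S. CPI-U annual average, 1998
CPI_2023 = 304.7   # approx. U.S. CPI-U annual average, 2023

price_1998 = 600
price_today = price_1998 * CPI_2023 / CPI_1998
print(f"${price_1998} in 1998 is roughly ${price_today:.0f} today")
```

So the top-end card of 1998 was not far off a 4080 in real terms, though still well short of a 4090.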
 
That's more of a knock on a badly optimized game than on the hardware lacking.
Correct. I don't think there's currently a game that a 4090 doesn't cut through like butter unless it's horribly optimized.
 
Don't see any scenario where the typical PC gamer can justify buying a GPU at or above a $1.5K price tag, especially when GPU prices have skyrocketed from crypto miners and now AI training. A top-of-the-line gaming CPU, motherboard, and RAM will run someone around $700. Spending double that on a single component with a ~5-year usable life cycle is more of a stupid tax.
 
Don't see any scenario where the typical PC gamer can justify buying a GPU at or above a $1.5K price tag, especially when GPU prices have skyrocketed from crypto miners and now AI training. A top-of-the-line gaming CPU, motherboard, and RAM will run someone around $700. Spending double that on a single component with a ~5-year usable life cycle is more of a stupid tax.
For the average/typical gamer, you're right, it's pointless. Historically, the type of people I've played with over the years who have that kind of investment in hardware do so mainly because it's really their only hobby and the only thing they drop money on. They're not dropping thousands of dollars on trips to make their Instagram look good or $600 a month to join the local country club.
 
Us old timers getting Crysis flashbacks

[image: a2608355651-10.jpg]


I spent like $5000 on a brand new rig specifically so I could max out Crysis on release. It broke me. I've only bought mid-range rigs ever since and never looked back
 
Us old timers getting Crysis flashbacks

[image: a2608355651-10.jpg]


I spent like $5000 on a brand new rig specifically so I could max out Crysis on release. It broke me. I've only bought mid-range rigs ever since and never looked back
Three 8800 Ultras in SLI?
 
I dunno, I have a 4090 (thanks to @Madmick for picking components) and it handles everything I have attempted to play on it at max settings, easily.

If a game is somehow not able to play properly on a 4090... it was probably just shit coding.
 
For the average/typical gamer, you're right, it's pointless. Historically, the type of people I've played with over the years who have that kind of investment in hardware do so mainly because it's really their only hobby and the only thing they drop money on. They're not dropping thousands of dollars on trips to make their Instagram look good or $600 a month to join the local country club.
Agreed, there's a huge gulf between what gamers (who are passionate enough to post about it regularly) think a "normal GPU" is and what it actually is. The NPD numbers are pretty clear on that: the 1660 Super is still in one of the top 5 best-selling gaming desktops, and only one of the top 25 best-selling desktops was over $1,500.
 