Based on a chart I've seen, the 5090 vs 4090 has:
32% more CUDA cores (21760 vs 16384)
-4% boost clock (2.41 vs 2.52 GHz)
33% more memory (32 vs 24 GB)
GDDR7 vs GDDR6X
33% wider memory bus (512 vs 384 bit)
27% higher TDP (575 vs 450 W)
25% higher launch price ($1999 vs $1599)
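Those deltas can be recomputed from the raw spec numbers; a quick Python sanity check (the row labels are just mine):

```python
# Spec comparison: (RTX 5090 value, RTX 4090 value)
specs = {
    "CUDA cores":        (21760, 16384),
    "Boost clock (GHz)": (2.41, 2.52),
    "Memory (GB)":       (32, 24),
    "Memory bus (bit)":  (512, 384),
    "TDP (W)":           (575, 450),
    "Launch price ($)":  (1999, 1599),
}

for name, (new, old) in specs.items():
    # Percent change of the 5090 relative to the 4090's value.
    print(f"{name}: {100 * (new / old - 1):+.1f}%")
```

Each figure is just the relative change `new/old - 1`, so the boost clock comes out slightly negative and everything else positive.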
What does this mean for actual gaming? It probably depends a lot on the game. I don't know enough about how the architecture actually gets used, but at first blush some of these upgrades seem more useful for computation than for gaming (what should be a huge increase in memory bandwidth on top of the 8GB of extra memory). I imagine this will be an AI monster, hence why nVidia's presentation was so focused on stuff like DLSS and frame generation. There was one infographic (I believe taken directly from their presentation) showing a 1080p render upscaled to 4K and then triple frame-genned, meaning only ~6% of what you'd be seeing was traditionally rendered -- the rest was an AI making its best guess.
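The ~6% figure falls out of the two ratios in that pipeline; a minimal back-of-envelope sketch:

```python
# A 1080p frame holds 1/4 the pixels of a 4K frame, and with 3 generated
# frames per rendered frame, only 1 in 4 displayed frames is "real".
rendered_px  = 1920 * 1080
displayed_px = 3840 * 2160
pixel_fraction = rendered_px / displayed_px      # 0.25

real_frame_fraction = 1 / 4                      # 4x frame generation

native_fraction = pixel_fraction * real_frame_fraction
print(f"{native_fraction:.2%}")                  # 6.25%
```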
I've never had an nVidia card (ATI x300 to x1550 to HD4870 to HD5950 to AMD RX5700 to RX6800XT) so all this fancy AI stuff does not apply to me, plus I'm a Patient Gamer and I don't think a single game I own has raytracing. My monitor is 1440p/144Hz so I'm not trying to drive big res or piles of frames. Which is a long way of saying: on a personal level, I could not give less of a shit.
I think you're approaching it all rationally, except I would say this: even at 1440p/144Hz, the eye-candy AAA games from the past 7-8 years can crunch you below your 144Hz cap on High/Ultra settings, especially if they do have ray tracing. I didn't hesitate to mock DLSS/DLAA in its infancy, as new technologies almost always suck at launch, and DLSS 1.0 did -- not only because it was a blurry mess, but mostly because there were no games that supported it. Go look at my old posts comparing the 5700 XT vs. 2060 Super for that, or perhaps in particular this one from Oct-2022. Pay attention to this bit:
NVIDIA's DLSS was introduced on February 15, 2019. Right now, not counting demos, there are still only 171 games that support DLSS (165 with DLSS 2.0+). A handful of those are VR games that nobody plays, too. That's a rate of 47 games added per year.
165 games with DLSS 2.0 support on Oct-14-2022. Know how many have DLSS 2.0 support today?
542. Yeah. NVIDIA upped the pace:
(NVIDIA's full list of games, engines, and applications supporting DLSS 2, DLSS 3, DLSS 3.5, Ray Tracing, Frame Generation, and Ray Reconstruction is at www.nvidia.com.)
Meanwhile, DLSS 3.5+ with frame generation is magic. Just dark freaking sorcery. It's still not perfect, you still get weird shimmering/glittering effects sometimes, but it's usually pretty damn jaw-dropping. It's nearing the point where the retention of image quality in real time is comparable to what a skillful mp4 re-encode achieves for movie playback (the next time you watch a ~2GB-4GB YouTube movie of that length, consider that an uncompressed 90-minute 1080p HDR movie at 30fps would be about 1.52 terabytes before you even added audio).
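That ~1.52 TB figure checks out if you assume 12-bit-per-channel RGB (the bit depth is my assumption; 10-bit would land closer to 1.26 TB):

```python
# Uncompressed size of a 90-minute 1080p HDR movie at 30fps,
# assuming 12 bits per channel x 3 channels = 36 bits/pixel.
width, height = 1920, 1080
bits_per_pixel = 3 * 12
fps = 30
runtime_s = 90 * 60

bytes_per_frame = width * height * bits_per_pixel // 8   # 9,331,200
total_bytes = bytes_per_frame * fps * runtime_s
print(f"{total_bytes / 1e12:.2f} TB")                    # 1.51 TB
```

Against a ~3GB re-encode, that's roughly a 500:1 compression ratio.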
And the performance gains in "Quality" mode aren't the 10%-15% figures we were seeing years ago. We're talking gains of ~30% all the way up to 70%, sometimes. That's nearly doubling the framerate.
Finally, and most importantly, it's on the midrange cards that most gamers actually buy where DLSS/DLAA matter the most, because unlike the 5090/4090 they can't just eat through pretty much any game out there on the highest settings on rasterization alone.