
Tech Gaming Hardware discussion (& Hardware Sales) thread

At the high end, Nvidia smacks AMD around.
But once you get down to the 4060 price range, AMD starts to become a better option. Nvidia just doesn't compete on the low end anymore.
Sad but true. The base model RTX 4060 was on a 107-series die, which is literally an xx50-tier chip by classification (transistor count and die size were both on par with last-gen iPads). The PCB and memory bus were xx50-tier as well. It's why it couldn't even match a previous-gen 3060 Ti. Pretty embarrassing if you think about it; like you said, they don't even compete anymore on the low end.

Interesting to see if the 5060 rebounds; most likely not. The problem is that even if the raw performance increases significantly, they'll just raise the price and keep the RAM at 8GB.
 
This video shows why NVIDIA is so cocky with pricing aimed at gamers and doesn't feel the need to be remotely competitive on rasterization-per-dollar anymore. Skip to 4:13, starting with the difference in the Hogwarts dinner hall in last year's Hogwarts Legacy, to see how plain and gaping the advantage of DLSS 4's new Transformer-based upscaling is over all other competing technologies in terms of image quality. It's drastically superior even to native 4K. FSR and XeSS lack detail and clarity, especially with lighting and reflections. Furthermore, they are unstable, shimmering, noisy messes in motion, and you can see that throughout the video (e.g. the panning shot in the Hogwarts courtyard; the ceiling fan in Alan Wake). Quite plainly, everything else looks like crap compared to it. You don't have to squint.


But as GN covered in their recent video, this doesn't really make the new RTX 50 series cards terribly appealing over, for example, RTX 40 series cards that will remain on the market, presumably with much stronger rasterization-per-dollar, because older NVIDIA cards are also getting the Transformer upgrade. The exclusivity of MFG (multi-frame generation) doesn't make RTX 50 appealing because, first, it isn't one of those DLSS-suite technologies that makes a huge visible difference versus traditional FG in image quality (only in image smoothness); and second, even for what benefit you do see, the only people who truly benefit are gamers on 240+ Hz monitors, and that is still a fringe minority of PC gamers. Because it's glorified interpolation. Remember the "soap opera" effect on TVs in the last decade? So it's funny watching Steve try not to crack up laughing when he talks about trying to objectively compare the merits of "fake frames".
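To put the "glorified interpolation" and 240+ Hz points in rough numbers, here's a small Python sketch of the arithmetic. It's a deliberate simplification under stated assumptions: frame generation treated as a pure multiplier, roughly one real frame of hold-back latency, and no Reflex or generation overhead, which is not how NVIDIA's actual optical-flow/AI pipeline works.

```python
# Rough arithmetic for frame generation: generated frames multiply the
# *presented* framerate, but the renderer still produces the same number of
# real frames, and the next real frame has to be held back before in-between
# frames can be inserted. All numbers here are illustrative assumptions.

def frame_gen_estimate(base_fps: float, multiplier: int, display_hz: int):
    """Estimate presented FPS and the approximate hold-back latency of frame gen."""
    base_frame_time_ms = 1000.0 / base_fps      # time to render one real frame
    presented_fps = base_fps * multiplier       # real + generated frames per second
    shown_fps = min(presented_fps, display_hz)  # the display caps what you can see
    added_latency_ms = base_frame_time_ms       # roughly one real frame held back
    return presented_fps, shown_fps, added_latency_ms

for hz in (144, 240):
    presented, shown, latency = frame_gen_estimate(base_fps=60, multiplier=4, display_hz=hz)
    print(f"{hz} Hz panel: {presented:.0f} FPS presented, "
          f"{shown:.0f} FPS actually shown, ~{latency:.1f} ms extra latency")
```

On a 144 Hz panel most of the generated frames are never actually displayed, which is exactly the "240+ Hz minority" point above.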


TLDR: NVIDIA are dicks, but their arrogance isn't like the arrogance Intel showed while AMD was cultivating Zen, because NVIDIA isn't complacent. They aren't resting on their laurels. They continue to innovate, engineer, and distance themselves from their rivals.
 

Nvidia RTX 5090 eBay Price Soars to $9,000 as Users Revolt With Framed Photo Listings to Trick Bots and Scalpers



damn, so that's why i was seeing people listing pictures of 5090s for 5090 prices. they are scamming the bots and scalpers!

this one sold for $2500 USD


i wonder if the buyers will be able to get a refund? i mean they're clearly getting scammed, but they didn't take the time to read the description and were using automated software to try to buy up an item before the next person can get it, so it's kinda on them a little too.

would suck to be the regular gamer just hoping to luck out and find one for $2500 USD only to end up getting duped by these. but my heart bleeds purple piss for the scalpers and botters.
 
Man, ever since COVID, computer building has been a really annoying process of trying to get your parts at MSRP. The graphics card segment has been particularly frustrating.
 

Nvidia RTX 5090 eBay Price Soars to $9,000 as Users Revolt With Framed Photo Listings to Trick Bots and Scalpers



damn, so that's why i was seeing people listing pictures of 5090s for 5090 prices. they are scamming the bots and scalpers!

this one sold for $2500 USD


i wonder if the buyers will be able to get a refund? i mean they're clearly getting scammed, but they didn't take the time to read the description and were using automated software to try to buy up an item before the next person can get it, so it's kinda on them a little too.

would suck to be the regular gamer just hoping to luck out and find one for $2500 USD only to end up getting duped by these. but my heart bleeds purple piss for the scalpers and botters.
I'd bet almost anything they get refunded. This happened like 20 years ago when people sold cardboard boxes with an X written on them as an "X Box."
 
Would appreciate your thoughts. Since I've been searching a lot for a gaming 4K OLED, I'm getting a lot of notifications. One was saying OLEDs have maintenance and burn-in issues. This seems to me to be 99% bullshit. I understand it's theoretically possible to get burn-in, but I've never heard of anyone in practice actually having a monitor ruined by it. I just don't see it as a realistic reason not to go OLED. Thoughts?
 
This video shows why NVIDIA is so cocky with pricing aimed at gamers and doesn't feel the need to be remotely competitive on rasterization-per-dollar anymore. Skip to 4:13, starting with the difference in the Hogwarts dinner hall in last year's Hogwarts Legacy, to see how plain and gaping the advantage of DLSS 4's new Transformer-based upscaling is over all other competing technologies in terms of image quality. It's drastically superior even to native 4K. FSR and XeSS lack detail and clarity, especially with lighting and reflections. Furthermore, they are unstable, shimmering, noisy messes in motion, and you can see that throughout the video (e.g. the panning shot in the Hogwarts courtyard; the ceiling fan in Alan Wake). Quite plainly, everything else looks like crap compared to it. You don't have to squint.


But as GN covered in their recent video, this doesn't really make the new RTX 50 series cards terribly appealing over, for example, RTX 40 series cards that will remain on the market, presumably with much stronger rasterization-per-dollar, because older NVIDIA cards are also getting the Transformer upgrade. The exclusivity of MFG (multi-frame generation) doesn't make RTX 50 appealing because, first, it isn't one of those DLSS-suite technologies that makes a huge visible difference versus traditional FG in image quality (only in image smoothness); and second, even for what benefit you do see, the only people who truly benefit are gamers on 240+ Hz monitors, and that is still a fringe minority of PC gamers. Because it's glorified interpolation. Remember the "soap opera" effect on TVs in the last decade? So it's funny watching Steve try not to crack up laughing when he talks about trying to objectively compare the merits of "fake frames".


TLDR: NVIDIA are dicks, but their arrogance isn't like the arrogance Intel showed while AMD was cultivating Zen, because NVIDIA isn't complacent. They aren't resting on their laurels. They continue to innovate, engineer, and distance themselves from their rivals.

Wish AMD and DeepSeek would put some viable competition together and really get them worried.
 
Would appreciate your thoughts. Since I've been searching a lot for a gaming 4K OLED, I'm getting a lot of notifications. One was saying OLEDs have maintenance and burn-in issues. This seems to me to be 99% bullshit. I understand it's theoretically possible to get burn-in, but I've never heard of anyone in practice actually having a monitor ruined by it. I just don't see it as a realistic reason not to go OLED. Thoughts?
This is of course anecdotal, but I have had my OLED monitor for a year now with no burn-in. It was a major fear of mine, but no, no sign of burn-in. I run the pixel refresh cycle (every 4 hours); sometimes I wait 5 hours if I'm in the middle of playing a game or doing something. It lasts 4 minutes, so I use it as a break.

If you leave one static image on there for a week straight, I could see it, but otherwise I think we're at the point where OLED has matured enough to mitigate a lot of burn-in.
 
This is of course anecdotal, but I have had my OLED monitor for a year now with no burn-in. It was a major fear of mine, but no, no sign of burn-in. I run the pixel refresh cycle (every 4 hours); sometimes I wait 5 hours if I'm in the middle of playing a game or doing something. It lasts 4 minutes, so I use it as a break.

If you leave one static image on there for a week straight, I could see it, but otherwise I think we're at the point where OLED has matured enough to mitigate a lot of burn-in.


Do OLED monitors not run automatic refresh cycles once you've put them into standby / turned them off, like OLED TVs do?
 
Do OLED monitors not run automatic refresh cycles once you've put them into standby / turned them off, like OLED TVs do?
Mine prompts me to do it every 4 hours; if I leave it and the monitor goes to standby, it will run the cycle automatically after about 5-10 minutes.
 
Mine prompts me to do it every 4 hours; if I leave it and the monitor goes to standby, it will run the cycle automatically after about 5-10 minutes.

Must be different compared to TVs then. For those, it's advised to run the manual one no more than once a year, otherwise it ages the panel, but I know on my LG at least it does a large automatic one after every 1,000 hours of use, give or take, once it goes to standby, and that one lasts an hour.
 
Mine prompts me to do it every 4 hours; if I leave it and the monitor goes to standby, it will run the cycle automatically after about 5-10 minutes.
Don't you set your monitor to turn off automatically if not in use for X amount of minutes? Isn't that the easiest and most surefire way to ensure no issues? Sorry for typos, I am drinking
 
Don't you set your monitor to turn off automatically if not in use for X amount of minutes? Isn't that the easiest and most surefire way to ensure no issues? Sorry for typos, I am drinking
So basically, to summarise from my posts above:

My monitor will turn off after 2 minutes, so I don't have to worry about static images showing if I have to step away and forget to put my PC to sleep.

After 4 hours, the pixel refresh prompt will pop up; it takes 4 minutes. If I dismiss it, it will show up again after another hour, and it will keep doing that until I run it, but it won't force it while my monitor is on. If I leave my monitor to idle and it turns off after 2 minutes of inactivity, it takes about 5-10 minutes for the pixel refresh to kick in if I dismissed it while the monitor was on. The same thing happens when I put my PC to sleep or turn it off.

Also, from what I've seen, all OLED monitors use pixel shift like OLED TVs, which is another good feature. Some also have logo dimming, which can further mitigate burn-in.
 
Would appreciate your thoughts. Since I've been searching a lot for a gaming 4K OLED, I'm getting a lot of notifications. One was saying OLEDs have maintenance and burn-in issues. This seems to me to be 99% bullshit. I understand it's theoretically possible to get burn-in, but I've never heard of anyone in practice actually having a monitor ruined by it. I just don't see it as a realistic reason not to go OLED. Thoughts?

Modern OLED monitors have three features to prevent burn-in. The first is pixel shift, the second is a variant of that applied just to the area where the taskbar sits, and the third is OLED care, which prompts every 4, 8, or 16 hours. If you delay it all the way to 16 hours, it'll run automatically after a ~2 minute prompt.

The OLED care feature can be an annoyance depending on how long you spend at the monitor in a single sitting.
 
It's a pity the hardware advances and pricing are so pathetic, but the GPU battle continues to shift more and more toward the software side of things, so that's what I'm continuing to scrutinize. DSO Gaming were more impressed with their hands-on experience of MFG (Multi-Frame Generation) than Hardware Unboxed were. I think the key difference is that DSO focused on games and settings that would otherwise cripple even the most powerful PCs, choking them down to the 20-45 FPS range, where MFG's framerate uplift becomes a massive difference maker:

[Image: DLSS-4-benchmarks.png]

Let’s start with Cyberpunk 2077. With DLSS 4 Quality Mode, we were able to get a minimum of 56FPS and an average of 63FPS at 4K with Path Tracing. Then, by enabling DLSS 4 Multi-Frame Gen X3 and X4, we were able to get as high as 200FPS. Now what’s cool here is that I did not experience any latency issues. The game felt responsive and it looked reaaaaaaaaaaaaaally smooth.

And that’s pretty much my experience with all the other titles. Since the base framerate (before enabling Frame Generation) was between 40-60FPS, all of the games felt responsive. And that’s without NVIDIA Reflex 2 (which will most likely reduce latency even more).

I cannot stress enough how smooth these games looked on my 4K/240Hz PC monitor. And yes, that’s the way I’ll be playing them. DLSS 4 MFG is really amazing, and I can’t wait to try it with Black Myth: Wukong and Indiana Jones and the Great Circle. With DLSS 4 X4, these games will feel smooth as butter.

And I know what some of you might say. “Meh, I don’t care, I have Lossless Scaling which can do the same thing“. Well, you know what? I’ve tried Lossless Scaling and it’s NOWHERE CLOSE to the visual stability, performance, control responsiveness, and frame delivery of DLSS 4. If you’ve been impressed by Lossless Scaling, you’ll be blown away by DLSS 4. Plain and simple.

Because they were even able to run games like Alan Wake 2 and Cyberpunk 2077 on Ultra settings with path tracing at 8K at 80+ and 90+ FPS thanks to the "fake frames".
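For a sense of scale on DSO's Cyberpunk numbers, a quick back-of-envelope in Python. The base and reported framerates come straight from the quote above; treating MFG as a simple multiplier and calling the shortfall "overhead" is my own simplification, not anything NVIDIA or DSO states.

```python
# Back-of-envelope check of the DSO figures quoted above: MFG X4 on a 63 FPS
# average would give 252 FPS if generating frames were free; the reported
# ~200 FPS suggests roughly 20% is lost to generating and pacing the extra
# frames. The two input numbers are from the quote; the rest is assumption.

base_avg_fps = 63    # Cyberpunk 2077, 4K path tracing, DLSS 4 Quality (quoted)
reported_fps = 200   # with MFG X4 enabled (quoted)

ideal_fps = base_avg_fps * 4
overhead = 1 - reported_fps / ideal_fps
print(f"Ideal X4: {ideal_fps} FPS, reported: {reported_fps} FPS, "
      f"implied overhead ≈ {overhead:.0%}")
```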



Also, as was just brought up recently, the 4060 Ti has been beaten up by the 7700 XT since it released, but in Spider-Man 2 with the latest DLSS, even without MFG enabled, the 4060 Ti shoots past it. And remember, the true superiority of DLSS usually isn't about framerates but image quality. Perhaps the even more important takeaway is the effect of VRAM at high resolutions with DLSS: the 4060 Ti 16GB outpaces the more mainstream 8GB variant by a surprising 36%.
[Image: performance-upscaling-3840-2160.png]


Ironically, this does bring us back around to an eyeroll at NVIDIA for continuing to gimp the VRAM so badly in head-to-heads against AMD. Maybe this will be AMD's saving grace once FSR 4 releases (if they can catch up in supported titles).
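For anyone wondering why the 8GB card falls over specifically at 4K with upscaling and RT, here's a very rough back-of-envelope in Python. Every per-component figure is an illustrative guess of mine, not something measured from the TechPowerUp chart, and real usage varies wildly per game and per settings.

```python
# Very rough 4K VRAM budget to illustrate why an 8GB card chokes where the
# 16GB variant doesn't. All numbers below are illustrative assumptions.

GB = 1024 ** 3

budget = {
    "render targets (4K, HDR, several G-buffers)": 1.5 * GB,
    "high-res textures (ultra preset)":            4.5 * GB,
    "geometry + BVH for ray tracing":              1.5 * GB,
    "DLSS / frame-gen working buffers":            0.7 * GB,
    "OS / driver / other apps":                    0.8 * GB,
}

total = sum(budget.values())
for name, size in budget.items():
    print(f"  {name}: {size / GB:.1f} GB")
print(f"Estimated total: {total / GB:.1f} GB")
print("Fits in 8 GB:", total <= 8 * GB, "| Fits in 16 GB:", total <= 16 * GB)
```

Once the total spills past the card's VRAM, assets start streaming over PCIe instead, and that's the kind of cliff that produces a 36% gap between two otherwise identical GPUs.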
 
Because they were even able to run games like Alan Wake 2 and Cyberpunk 2077 on Ultra settings with path tracing at 8K at 80+ and 90+ FPS thanks to the "fake frames".



Also, as was just brought up recently, the 4060 Ti has been beaten up by the 7700 XT since it released, but in Spider-Man 2 with the latest DLSS, even without MFG enabled, the 4060 Ti shoots past it. And remember, the true superiority of DLSS usually isn't about framerates but image quality. Perhaps the even more important takeaway is the effect of VRAM at high resolutions with DLSS: the 4060 Ti 16GB outpaces the more mainstream 8GB variant by a surprising 36%.
[Image: performance-upscaling-3840-2160.png]


Ironically, this does bring us back around to an eyeroll at NVIDIA for continuing to gimp the VRAM so badly in head-to-heads against AMD. Maybe this will be AMD's saving grace once FSR 4 releases (if they can catch up in supported titles).


Well, if you're spending the extra $100 to step up to the 16GB version of the 4060 Ti, you could instead have spent $50 to step up from a 7700 XT to a 7800 XT. Both the 4060 Ti 16GB and the 7800 XT released at $500.
 