Tech Gaming Hardware discussion (& Hardware Sales) thread

I tried out Fluid Motion Frames on Cyberpunk 2077 with a 7900 XTX. The gap between Nvidia and AMD, in my opinion, has mostly been software, not hardware. It's legit. I haven't gotten a chance to test FSR 3 yet, but Fluid Motion Frames is miles better than FSR 2.
 
I got a very, very early look at the 8090 GPU, as I secretly got a photo from a friend lol.

 
So the Arc A580 dropped, and Intel is once again simply mollywhopping the big two in raw rasterization bang-for-your-buck, but maybe the more impressive thing I've noticed is that XeSS already appears to my eye to be outperforming FSR 2.2 in the ultra quality modes at higher resolutions (where these technologies are most relevant). FSR 3 is supposedly about to hit, but wow, AMD is way behind. I remember looking at these comparisons just last year and having to squint with a magnifying glass for 2 minutes to notice any differences in the photos. Not anymore. It's taken 4 years for DLSS to mature, but it has arrived. DLSS 3.5 is the clear frontrunner now. The image quality is just shitting on FSR if you compare it in these photos. And AAA titles are finally being supported with regularity.
Assassin's Creed Mirage: DLSS vs FSR vs XeSS Comparison Review
 
Don't enable Anti-Lag+ in AMD's latest GPU driver. It'll result in a VAC ban in CS2 because it messes with Source 2's DLLs.

 

I went with an AMD card this time, and I'm regretting it. Their software is still bad.
I've run into issues like driver crashes when streaming, I've gotten the old green screen after driver updates a couple of times, etc.
 
With all the stories of crappy AMD drivers, I must be the luckiest guy on the planet. Almost 20 years of exclusively ATI/AMD cards* and not once have I had a crash that I could point to and say, "yup, bad driver." I'm a cheap bastard and don't play brand new games, so that might be part of it: issues get resolved long before I'm willing to spend $10 in a Steam sale for a 4+ year old game.

*not a specific brand choice, they were just the best bang-for-buck mid-range option each time I've been in the market to upgrade
 
With all the stories of crappy AMD drivers, I must be the luckiest guy on the planet. Almost 20 years of exclusively ATI/AMD cards* and not once have I had a crash that I could point to and say, "yup, bad driver." I'm a cheap bastard and don't play brand new games, so that might be part of it: issues get resolved long before I'm willing to spend $10 in a Steam sale for a 4+ year old game.

*not a specific brand choice, they were just the best bang-for-buck mid-range option each time I've been in the market to upgrade
My last 3 have been AMD: 5700 XT > 6800 XT, and now 7900 XTX.

I only had one driver issue earlier this year, which thankfully was addressed pretty quickly. It seemed to affect 165Hz monitors and caused BSODs if FreeSync was on.
 
Don't enable Anti-Lag+ in AMD's latest GPU driver. It'll result in a VAC ban in CS2 because it messes with Source 2's DLLs.


Apparently Anti-Lag+ tampers with game DLLs. If that's the case, what the flying fuck was AMD thinking?
 
I went with an AMD card this time, and I'm regretting it. Their software is still bad.
I've run into issues like driver crashes when streaming, I've gotten the old green screen after driver updates a couple of times, etc.

I have gotten none of that. The only time I got crashes was when I undervolted too much. I don't stream, though.
 
Reviews of Intel's 14th gen are coming in. Looks like the increase in gaming performance hovers just below 2% over the 13th gen :confused:
 
Reviews of Intel's 14th gen are coming in. Looks like the increase in gaming performance hovers just below 2% over the 13th gen :confused:
You can't expect much since it's just a refresh but that's still disappointing.
 
Reviews of Intel's 14th gen are coming in. Looks like the increase in gaming performance hovers just below 2% over the 13th gen :confused:

You can't expect much since it's just a refresh but that's still disappointing.
It's not a surprise. From the paper specifications we knew it would be a yawn. I already covered this in the "Intel's fucked" thread a month ago:
The 14th Gen Intels definitely aren't worth the buzz. They're more like a "tock" in the classic Intel "tick-tock" cycle despite being the first 7nm chips Intel produces. It's weird. The 14700K just bumps the clock frequency +200MHz and the cache +10%. It also has four more efficiency cores than the 13700K, which is why it does so much better in this multi-threaded benchmark, but that's a bit misleading, because without those cores there would be a very modest difference between the two. It's going to bring a pitiful improvement in gaming performance: probably 2%-4%. And it's jacking the power draw again to achieve this.
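That 2%-4% estimate checks out as back-of-envelope math. Here's a rough sketch: the boost clocks (~5.4 GHz for the 13700K, ~5.6 GHz for the 14700K) and the frequency-scaling factors are my assumptions, since games rarely scale 1:1 with clock speed:

```python
# Back-of-envelope check on the "+200 MHz" clock bump.
# Assumed boost clocks: 13700K ~5.4 GHz, 14700K ~5.6 GHz.
old_boost_ghz = 5.4
new_boost_ghz = 5.6

clock_gain = (new_boost_ghz - old_boost_ghz) / old_boost_ghz
print(f"Raw clock uplift: {clock_gain:.1%}")

# Games rarely scale 1:1 with frequency; assume 50%-80% scaling.
for scaling in (0.5, 0.8):
    print(f"Expected gaming uplift at {scaling:.0%} scaling: {clock_gain * scaling:.1%}")
```

Which lands you right around the ~2% the reviews are reporting.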

The only meaningful praise I think one can muster for this release is that Intel is at least rapidly turning out new "gens," however modest the improvements. The Intel 10th gen came out in August 2019. This 14th gen is expected to launch in October. That's a whopping 5 generations released in an impressively compact 49-month window. Conversely, Zen 2 came out in July 2019, and Zen 5 is nowhere on the immediate horizon. However, AMD's strategy has been that rather than focus on new lineups with refined manufacturing processes, they just release the 3D-cache variants of CPUs from the same gen, and that has appeared to offer a superior bump for actual gaming performance (for games that accommodate it).

Intel
10th Gen (14nm) --> 11th Gen (14nm) --> 12th Gen (10nm) --> 13th Gen (10nm) --> 14th Gen (7nm)
AMD
Zen 2 (7nm) --> Zen 3 (7nm) --> Zen 4 (5nm)
The sole silver lining is that the temps have come down despite the power draw going up. So between the refinement of the fabrication process, the ratio change of performance:efficiency cores, and whatever they're doing with dynamic clocking, there is at least some benefit to temps.

*Edit* Oh, to be clear...as it turns out this isn't truly their next gen, what should be the 14th gen, but Intel is calling it the 14th gen anyway. The paper specifications with clocks and core counts above from last month's leaks are accurate, except it isn't Meteor Lake. Meteor Lake isn't yet released, and when it is, it will be called "Core Ultra", and they won't label it with a generation (i.e. it won't be the "15th Gen"). They've completely borked their branding. This is still 10nm, just 10nm+++ now, or whatever the fuck they call it, not the new 7nm that we thought was coming. This underwhelming "14th Gen" is just a Raptor Lake refresh (Raptor Lake = 13th Gen).
 
It's not a surprise. From the paper specifications we knew it would be a yawn. I already covered this in the "Intel's fucked" thread a month ago:

Sorry, forgot to pay homage to the duke of all things hardware related @Madmick on these hallowed grounds of Sherdog. I'll inform the hardware review plebs of this slight.
 
Sorry, forgot to pay homage to the duke of all things hardware related @Madmick on these hallowed grounds of Sherdog. I'll inform the hardware review plebs of this slight.
Do you want some TNT programming to go along with all this drama?
 
Reviews of Intel's 14th gen are coming in. Looks like the increase in gaming performance hovers just below 2% over the 13th gen :confused:

Did they ever fix the E-core issues with CS2? I haven't paid much attention to new launches, but I did see somewhere that higher-end i7s and i9s were having huge frame rate drops because of the E-cores.
 
Did they ever fix the E-core issues with CS2?

Wasn't even aware of that issue. The only one I heard about was certain AMD CPUs having stuttering issues, which kind of makes sense now: some posters were posting that their game would be getting 300+ fps but felt like 60 fps.
 
Apparently Anti-Lag+ tampers with game DLLs. If that's the case, what the flying fuck was AMD thinking?

A CS2 update today put a check in place at game launch warning AMD GPU users about their drivers. Also, Valve has begun reverting the VAC bans that Anti-Lag+ caused.

Can't think of a bigger self-inflicted L from AMD in a driver release.
 