Tech Gaming Hardware discussion (& Hardware Sales) thread

Do you guys think this will hurt NVIDIA enough to make them dial back their insane prices? I'm getting mixed feedback all over
 
AMD just unloaded on NVIDIA. Team Green will retain the performance crown, but on value, this is a massacre.


December 13, 2022
RX 7900 XTX = $999
RX 7900 XT = $899

The 7900 XTX won't be too far off the 4090 in terms of performance. It actually has superior pixel throughput, equal VRAM, and only ~6% inferior VRAM throughput; it trails furthest in texturing, processing power, and ray tracing*, at ~34% inferior. Weirdly, it also has very little L2 cache, but a greater amount of the slower L3 cache. It also has more advanced output, with 2 x DisplayPort 2.1 instead of 3 x DisplayPort 1.4. All of that for $600 less.
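If you want to check the napkin math yourself, here's a rough sketch of where numbers like that come from. The spec values are approximate reference-card figures as I understand them (treat them as assumptions, not gospel), so the exact percentages shift a point or so depending on which clocks you plug in:

specs = {
    # card: ROPs, boost clock (GHz), per-pin memory speed (Gbps), bus width (bits)
    "RX 7900 XTX": {"rops": 192, "boost_ghz": 2.50, "mem_gbps": 20, "bus_bits": 384},
    "RTX 4090":    {"rops": 176, "boost_ghz": 2.52, "mem_gbps": 21, "bus_bits": 384},
}

def pixel_fill_gpix(card):
    # Theoretical pixel fill rate = ROPs x clock (Gpixels/s)
    return card["rops"] * card["boost_ghz"]

def mem_bandwidth_gbs(card):
    # Memory bandwidth = (bus width / 8) x per-pin data rate (GB/s)
    return card["bus_bits"] / 8 * card["mem_gbps"]

amd, nv = specs["RX 7900 XTX"], specs["RTX 4090"]
print(f"Pixel fill: {pixel_fill_gpix(amd):.0f} vs {pixel_fill_gpix(nv):.0f} Gpix/s")
print(f"Bandwidth:  {mem_bandwidth_gbs(amd):.0f} vs {mem_bandwidth_gbs(nv):.0f} GB/s "
      f"({1 - mem_bandwidth_gbs(amd) / mem_bandwidth_gbs(nv):.0%} lower)")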

It shits on the 4080 16GB, and it will cost $200 less.

*Correction: It has 96 ray-tracing cores to the 4090's 128 (so the 4090 has ~33% more), not an equal amount. My mistake.


Yeah, when they announced the price I was like, that's almost how much people are charging for 3090s. Why is anyone going to fuck with NVIDIA when you can get the card and a bomb monitor for the price of a 4090?
 
When are they going to get their hands on the new AMD cards for benchmarks?
 
On paper. AMD has pulled this type of embellishment before.
LOL, "on paper". Remind me when AMD embellished specifications before? In fact, they're the ones who have distinguished "boost clock" from a more realistic and sustainable "game clock".
 
Main reason I was considering the XT was my Corsair SF750 PSU; the recommended PSU for that card is a 700W one, so I figured I would just drop the new card in and wouldn't have to worry about anything else.
Gotcha. Yeah, these new GPUs are upending assumptions about PSU adequacy that have stood for a long time.
 
Gotcha. Yeah, these new GPUs are upending assumptions about PSU adequacy that have stood for a long time.
Now I'm considering maybe getting the XTX. The recommended PSU is 750W, but do you reckon I should upgrade my PSU for that one or just stick with the 750W I already have?
 
Now I'm considering maybe getting the XTX. The recommended PSU is 750W, but do you reckon I should upgrade my PSU for that one or just stick with the 750W I already have?
Wait for reviews to see what its power consumption is.
 
The AM4 launch. Though that dealt with performance, not specifications.
Indeed, and that's why your comment is silly.

We're not talking about hype graphs showing claims about fps increases over the competition, or abstractions like IPC uplift. This isn't even CPU specs, which are difficult to gauge. This is GPU rasterization. It's as straightforward as it gets. There would only be room for confusion if the GPUs traded blows in throughput, which is how AMD's RX 6000 series ended up performing so much better than its specifications suggested: we learned that pixel and texel throughput is apparently more important for gaming performance supremacy than memory bandwidth and processing power.

But they aren't trading blows. The 7900 XTX shits on the 4080 in everything for $200 less. Hell, it would still destroy it if you used the Game Clock for calculations, not the Boost Clock, though I'm still unclear why AMD is making that distinction, since the more sophisticated reviewers have shown the past two generations of AMD cards sustaining the Boost Clock in benchmarks for long periods of time, not fractions of a second.
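For what it's worth, this is the paper-spec math I mean, as a minimal sketch. The shader count and clocks are the announced figures as I understand them, so treat them as assumptions; RDNA 3's dual-issue is why the marketing TFLOPS number is roughly double the naive one:

SHADERS = 6144          # RX 7900 XTX stream processors (announced figure)
GAME_CLOCK_GHZ = 2.3    # AMD's sustained "Game Clock"
BOOST_CLOCK_GHZ = 2.5   # AMD's peak "Boost Clock"

def fp32_tflops(shaders, clock_ghz, dual_issue=False):
    # FP32 throughput: each shader does 2 FLOPs per clock (FMA), doubled again if dual-issued
    ops_per_clock = 2 * (2 if dual_issue else 1)
    return shaders * ops_per_clock * clock_ghz / 1000   # shaders x GHz = GFLOPS, /1000 = TFLOPS

for label, clk in [("Game Clock", GAME_CLOCK_GHZ), ("Boost Clock", BOOST_CLOCK_GHZ)]:
    print(f"{label}: {fp32_tflops(SHADERS, clk):.1f} TFLOPS "
          f"({fp32_tflops(SHADERS, clk, dual_issue=True):.1f} with dual-issue)")

The gap between the two clocks is only around 8%, so it doesn't change the comparison either way.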

NVIDIA got greedy. AMD didn't follow suit, which only highlights that greed and makes NVIDIA look worse. Team Red is taking them down a peg.
 
No 7600 or 7700 announced.

 
Indeed, and that's why your comment is silly.

People gave them a pass then because the price accurately reflected the performance. Today, people are giving them a pass because the card is priced $200 less than its competitor, even though the supplied performance graphs are obfuscated.

Both companies are acting like divas with their MSRPs.
 
People gave them a pass then because the price accurately reflected the performance. Today, people are giving them a pass because the card is priced $200 less than its competitor, even though the supplied performance graphs are obfuscated.

Both companies are acting like divas with their MSRPs.
LMFAO, nobody ever cared about the manufacturer-supplied pre-launch performance hype graphs. When did gamers ever pay attention to that nonsense?

What is this fanboy bullshit I'm reading? Stop white knighting NVIDIA. Just knock it off. There is no equivalence, here. NVIDIA alone is acting like a diva. NVIDIA is being a proud, greedy bitch trying to normalize COVID+Cryptoboom prices as if nobody would notice. Everybody noticed. Then, on top of that, they shined up their GPU naming like a used card salesman with the 4070. When they got called on that, they threw a tantrum like a child who had their toy taken from them, and ran back into their room, announcing they were canceling the card altogether, rather than just admitting they mislabeled it.

Meanwhile, what has AMD done? That's right. They've integrated a new physical feature that anyone can find useful: an intake air thermistor. Now even the everyday gamer without sophisticated testing equipment can get a very good idea of how hot the interior of their case is.
 
What is this fanboy bullshit I'm reading? Stop white knighting NVIDIA. Just knock it off.

I stated multiple times prior that the 1080 Ti I bought years ago was the first and last time I'd ever spend $700+ on a GPU. Recently I've even said that with these new prices I'll be 1 to 1.5 generations behind on the GPU I use moving forward.

End of the day, I'm not the target customer. $799 being the starting price point for the lowest-end version of the next line of GPUs has me telling both manufacturers to fuck off.
 
Samsung's next $2,200 Odyssey Neo G9 will be the 'first' 8K ultrawide gaming monitor (release date: Jan-2023)
I don't give a shit about "8K", and it's eye-rolling that Samsung is calling what will be a 7680x2160 superwide display "8K" on the basis of horizontal pixel count alone, but this is a significant moment in the advance of gaming displays, perhaps above all for competitive online gamers who don't participate in formal events.

I believe the reason manufacturers have been hesitant to make higher-framerate ultrawides is the state of ports. It wasn't just superwides like the Odyssey series, and almost nobody buys those anyway. It was ultrawides. 3440x1440@240Hz and 2560x1080@360Hz displays weren't even possible until HDMI 2.1, which only appeared with the last generation of GPUs. Despite that, to this day, none exist. Gamers have noticed, and are irritated. Meanwhile, 3440x1440@360Hz hasn't been possible until just now, with the advent of DisplayPort 2.0/2.1 ports. USB4 can also support it. So hopefully we'll see some 240Hz+ 1440pUW and 1080pUW monitors actually start to roll out.
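Back-of-envelope on the bandwidth, if anyone's curious why those combos line up with those ports. This ignores blanking overhead and DSC, and the effective link rates are my own assumptions, so treat the results as ballpark only:

def raw_gbps(width, height, hz, bits_per_pixel=30):   # 30 = 10-bit RGB
    # Uncompressed pixel data rate in Gbit/s, no blanking, no DSC
    return width * height * hz * bits_per_pixel / 1e9

# Approximate effective (post-encoding) link rates in Gbit/s (my assumptions):
links = {"DP 1.4 (HBR3)": 25.9, "HDMI 2.1 (FRL 48G)": 42.7, "DP 2.1 (UHBR20)": 77.4}

for w, h, hz in [(3440, 1440, 240), (2560, 1080, 360), (3440, 1440, 360)]:
    need = raw_gbps(w, h, hz)
    fits = [name for name, rate in links.items() if rate >= need]
    print(f"{w}x{h}@{hz}Hz needs ~{need:.0f} Gbit/s, fits on: {fits}")

The first two land under HDMI 2.1's effective rate but over DP 1.4's, and 3440x1440@360Hz only fits on DP 2.x, which is the point I'm making above.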

Casually competitive gamers may find these especially appealing, because high-framerate ultrawides offer a competitive advantage that is prohibited in eSports tournaments: they allow you to see a wider field of view. This is an advantage not only in first- and third-person shooters, but also in games with a God's-eye overhead view, like League of Legends. Only some games, like Arma 3, implement aspect-ratio multipliers that prevent you from seeing more of the field than gamers on the traditional 16:9 aspect ratio.

For example, in the recently released Call of Duty: Modern Warfare 2 (2022), the max FOV setting is 120 degrees. Because it uses hDeg, if you're on a 16:9 display that's actually your true FoV (with most games it isn't, by the way, so know that if you set your FoV to 120 and think that means your actual FoV is 120, it probably isn't). However, with a standard 21:9 ultrawide, like a 2560x1080 or 3440x1440 monitor, when you set your FoV slider to 120, your true FoV becomes about 133 degrees.
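If you want to sanity-check that conversion, it's just the usual Hor+ trig, holding the vertical FoV constant. A minimal sketch, nothing game-specific assumed:

import math

def horplus_hfov(hfov_at_169_deg, target_aspect, base_aspect=16/9):
    # Convert a horizontal FoV defined at 16:9 to another aspect ratio,
    # holding the vertical FoV constant (standard Hor+ scaling)
    half = math.radians(hfov_at_169_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half) * target_aspect / base_aspect))

print(horplus_hfov(120, 3440 / 1440))   # ~133 degrees, the MW2 example above
print(horplus_hfov(80, 2560 / 1080))    # ~96 degrees, the Fortnite numbers below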

A more extreme example would be Fortnite, where a 21:9 display allows an effective FoV of 96, relative to just 80 on 16:9, giving you a 19% increase in FoV.

Of course, ultrawides still aren't favored by casually competitive gamers on PC, because most would opt for 240Hz. If 240Hz+ ultrawides start to roll out, you'll be able to have your cake and eat it, too.


*Edit* Note a correction: Fortnite's FoV is not restricted outside eSports environments. At home, you do get a FoV benefit with ultrawides, and it in fact offers one of the greatest increases in effective FoV.
 
Thanks to all of the thread regulars. I decided it was time to build a new PC, so I've been lurking for the past week. Put the order in last night. Thanks for all the info, sherdudes.
 


Digital Foundry is showing off benchmarks of Intel's Arc cards. The architecture is a generation behind, but in a lot of games it punches at a level comparable to the 3060 and 6600. Their cards surprisingly punch above their weight in ray tracing.
 
Does anyone have experience with the Thermalright Peerless Assassin series of coolers?
I'm looking at the Thermalright PA120 for around $40 on Amazon. Dual tower with six 6mm heat pipes.

Hardware Canucks called it an NH-D15-class cooler, but I haven't kept up with them the past couple of years to know if their testing methods are reliable.
 
Now I'm considering maybe getting the XTX. The recommended PSU is 750W, but do you reckon I should upgrade my PSU for that one or just stick with the 750W I already have?


I'm running an EVGA SuperNOVA G3 750-watt PSU in my latest build. I thought that would be plenty of headroom to upgrade from my 2080 down the road. I didn't expect the newer generations of GPUs to be as power hungry as they are right now. Like, I don't think I could even accommodate a 3090 Ti in here, let alone a 4080 or 4090, without ripping my PC apart and getting at least a 1000-watt PSU.

Unfortunately, even running at factory speeds (5.0 GHz on one core, 4.6-4.9 GHz on all the rest), my 11700K can draw up to 250 watts of power on its own. I never really get close to that during the heaviest gaming; it's usually drawing between 80 and 140 watts under load.

But I'm still holding out hope that the XTX won't be quite as power demanding. I really don't want to tear my shit apart, and I could use a bit of an upgrade for down the road. My options just seem kind of limited, or nonexistent, the way things are right now, because with any significant upgrade over my 2080 it seems like I won't have the power needed to run it.
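If it helps, here's the crude steady-state math I'm doing in my head. The 355 W figure is the announced TBP and the CPU numbers are from my own observations above, so it's all assumptions until reviews show real draw and transient spikes:

PSU_WATTS = 750

def worst_case_draw(gpu_tbp, cpu_peak, rest_of_system=75):
    # Crude steady-state estimate: GPU board power + CPU package + everything else
    return gpu_tbp + cpu_peak + rest_of_system

builds = {
    "7900 XTX (355 W announced TBP) + 11700K gaming load": worst_case_draw(355, 140),
    "7900 XTX + 11700K all-core worst case": worst_case_draw(355, 250),
}

for name, watts in builds.items():
    headroom = PSU_WATTS - watts
    print(f"{name}: ~{watts} W total, {headroom} W headroom ({headroom / PSU_WATTS:.0%})")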
 
Come to think of it, even if I wanted to go with NVIDIA and grab a 4080 or 4090, I'd probably want to get a new PSU with ATX 3.0 compatibility anyway, so there goes that idea. I'm hoping it won't be the same kind of bullshit with the 7900 XTX, or I'm pretty well hooped here for just a simple set-it-and-forget-it GPU swap.
 
Come to think of it, even if I wanted to go with NVIDIA and grab a 4080 or 4090, I'd probably want to get a new PSU with ATX 3.0 compatibility anyway, so there goes that idea. I'm hoping it won't be the same kind of bullshit with the 7900 XTX, or I'm pretty well hooped here for just a simple set-it-and-forget-it GPU swap.
Neither the 7900 XTX nor the XT will use the new ATX 3.0 connector, so you don't have to worry about that, at least.
 