Tech Gaming Hardware discussion (& Hardware Sales) thread

Neither the 7900 XTX nor the XT version will use the new ATX 3.0 connector, so you don't have to worry about that at least

i'm looking at the power draw of all my components, but i swapped the 2080 for a radeon 6950 xt in the list because apparently the 7900 xtx draws about the same amount of power

power supply calculator recommends 700-800 watts

but apparently 6950 xt manufacturers recommend 850 watt psus for most aftermarket cards. some even higher.

on pcpartpicker, my entire build including extra case fans, with the 2080 swapped for an asus strix 6950 xt (i always buy the asus strix models), comes out to an estimated 619 watts. but that's with my 11700k cpu only drawing 125 watts. everything is unlocked on my motherboard, the cpu power limit was disabled, and it can supposedly hit up to 300 watts straight out of the box! even though i've never quite gotten it that high: on the most brutal torture tests i've seen around 230.

there's a way in the bios to limit the cpu power draw to a value i set, but i prefer to leave everything alone because it runs beautifully as it is. with that said, i'll add another 125 watts to the ~620 watt pcpartpicker estimate just to make up for the power-starved cpu, and that oughtta bring me up to around 745 watts. mind you, most of the time my cpu only runs 80-140 watts tops, but if it's working harder or running avx workloads, it will likely draw more.

i have no idea, but it seems as if i'm cutting it real close. my only solution would be to limit the cpu power draw to around 150-200 watts, which should bring my whole pc down to around 650-700 watts. i really don't know if that's an ideal solution, but then i could keep my cpu in check and still give it some room to breathe, and i'd still be able to keep it running 4.9 ghz on all cores, which it seems to achieve quite well at stock even on a much lower power draw. i think the only reason i can even achieve this is because the noctua nh-d15 is a beast of an air cooler.
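for anyone who wants to sanity-check that math, here's a quick back-of-the-envelope sketch. the wattages are just my rough numbers from above (pcpartpicker's estimate plus my own guesses), not measured figures:

```python
# rough psu headroom check. all inputs are assumptions pulled from
# the estimates above, not measurements.
ESTIMATE_W = 619        # pcpartpicker total with a 6950 xt and a 125 w cpu
CPU_ESTIMATE_W = 125    # what pcpartpicker assumed for the 11700k
CPU_WORST_CASE_W = 250  # roughly the most i'd expect under torture tests
PSU_W = 750             # my current power supply

# swap the optimistic cpu number for the worst case and see what's left
total = ESTIMATE_W - CPU_ESTIMATE_W + CPU_WORST_CASE_W
print(f"worst-case draw ~{total} w on a {PSU_W} w psu ({total / PSU_W:.0%} loaded)")
# common rule of thumb: keep sustained draw under ~80% of the psu rating
# to leave margin for transient spikes.
```

run it and you get ~744 watts, basically the whole psu, which is exactly why this feels so sketchy.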

don't get me wrong, the 11700k is a more than capable cpu, but man can it ever draw power. under extreme stress tests it's capable of hitting 300 watts on its own if not kept in check. that's insane for a cpu compared to AMD's 5800x, which was roughly the same in performance but only draws like 110 watts and is limited to 142 watts tops. the 11700k got shit on in reviews because nobody would judge it on the raw performance it excels at; they'd just compare it to the 5800x and write it off, since the 5800x was quite a bit cheaper, ran a lot cooler, and sucked a whole lot less power in exchange for only a small hit in performance.

either way i feel like i'm really cutting it close. i should have gone with the 5800x and i'd be laughing, but i didn't care about the extra cost at the time; i just wanted more raw performance for gaming and even emulating, and the 11700k pulled ahead of the 5800x back then. i wasn't going to go any higher up the stack, or i probably would have gone for the 5900x, since the 11900k is a joke of an "upgrade" over the 11700k and the 12th gen intel cpus weren't quite ready at the time

but oh boy, i'm gonna have to wait and see here. something tells me i might have to bite the bullet and get a new psu anyways. ahh fuck it, i'm not gonna worry about it... i'm just gonna play me some diablo 2 resurrected this morning. the fuck do i need a gpu upgrade to play a 20 year old game for? well, not really, but lol, i ain't gonna dwell on it. though i'm sure a 7900 xtx would be an ideal solution for 4k 60fps or 1440p 120fps. i have a 120hz display that is totally up for the task, but my 2080 is starting to show its age when i have to turn a few settings down to hit that target in a bunch of my games these days. i'm sure a 4080 or 7900 xtx would be more than enough to destroy that for quite a few years to come.

i think i'm just gonna have to wait until the card hits the market and the reviews come out and see what others are saying. surely there's gotta be someone else out there who is crazy enough to pair a 7900 xtx with a 750 watt power supply and an extremely power hungry 11700k. maybe i'll let them bite the bullet and see what the end result is rather than trying it out for myself.

i've already had enough random experiments, like when i originally decided to upgrade my 970 to a 2080... on an old i5 4690k haswell setup. yeah, after a couple years of cpu bottleneck city, i started to realise why i couldn't find anyone with completed builds on pcpartpicker that paired a 2080 with a 4690k! a while ago i finally gave my 2080 a proper home and threw the 970 back in the old pc. i'll wait for someone else to try this kind of weird 11700k + 7900 xtx + 750 watt psu combination and then i'll ask them how it's going.

i really didn't think i'd ever need more than a 750 watt psu for anything, but here we are
 
ahh fuck it, i ain't even gonna worry about it. i doubt i'll be able to find an xtx at a reasonable price any time soon... maybe not even in a year. this card's gonna get scalped to shit. everybody and their dog is going to want it.
 
Does anyone have experience with the Thermalright Peerless Assassin series of coolers?

I use the Scythe Fuma 2, which reviewers tend to describe as "the $50 version of the Noctua".

That heatsink design looks near identical to the Fuma 2, so performance is likely the same. It will keep the CPU at low to mid temps exactly like the Noctua. It's when the CPU starts reaching its temperature threshold that it will struggle, unlike the Noctua, strictly because the Noctua's pipes, fins, and towers are so massive for heat dissipation.
 
Has anyone heard about ray tracing being offloaded to the CPU? There was a guy on reddit claiming the future of ray tracing will put the burden on the CPU rather than the GPU. Sort of made sense, considering there's headroom on the CPU in most cases.
 
this is great news. I have a G-Sync monitor, not sure if that means I should stick to nvidia video cards, but maybe I will move to an amd build instead of an intel/nvidia one. Seems like Nvidia is really taking the piss and annoying me.
 
We can wait until the benchmarks are out, but on spec, I'd suggest the XTX. For just $100 more it's the better value.
Bad news correction from yours truly for AMD fans.

I never watched AMD's full event video, but I noticed the Techpowerup figures I quoted Thursday above were from previous rumors, not from the event itself, and hadn't been updated. Now they have been. Most importantly, the 7900 XTX and 7900 XT shader counts were halved, in line with what was shown at the event.
https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941
https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xt.c3912

It gets worse. Unless I'm mistaken, Techpowerup's throughput figures are inaccurate even after the update. With the new decoupled clocks there is now a separate "Shader Clock", and unlike the "Game Clock" (which, as I previously mentioned, seems to be a responsible new specification standard AMD introduced to avoid misleading consumers), the shader clock is the one that should actually be used for throughput calculations. In this case, that means the pixel throughput and FLOPS come down. It also means ray-tracing performance comes down.
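To make that concrete, here's a rough sketch of how spec-sheet throughput is usually derived. The shader count, ROP count, and clock below are my reading of the corrected figures and should be treated as assumptions, and I'm using the classic shaders × 2 × clock formula, ignoring RDNA 3's dual-issue wrinkle:

```python
# Sketch of the usual spec-sheet throughput math; all inputs are
# assumptions (corrected 7900 XTX figures as I understand them).
shaders = 6144          # halved from the rumored 12,288
rops = 192
shader_clock_ghz = 2.3  # the decoupled shader clock, not the front-end clock

tflops = shaders * 2 * shader_clock_ghz / 1000  # 2 ops per FMA
gpixels = rops * shader_clock_ghz               # pixel fill rate
print(f"~{tflops:.1f} TFLOPS FP32, ~{gpixels:.0f} GPixel/s at the shader clock")
# Plug in a higher clock instead and every derived throughput figure
# inflates, which is exactly the error described above.
```

At 2.3 GHz that works out to roughly 28 TFLOPS; run the same formulas at a higher front-end or boost clock and the numbers balloon, which is why calculating from the wrong clock misleads.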

So things look a bit more competitive, but in reality it probably doesn't change my former assessment much at all, because the previous generation already showed us that processing power doesn't matter nearly as much as we previously assumed. For example, the 6900 XT barely has higher FLOPS than the RTX 3070: 23.04 TFLOPS vs. 20.31 TFLOPS. And it has far less than the RTX 3080's 29.77 TFLOPS. Yet it still mollywops both due to superior pixel and texturing performance. But now, at least, NVIDIA's 4080 has one lone crutch to lean on: superior FLOPS.

Against the 4080, the 7900 XTX:
  • +83% pixel throughput
  • +26% texel throughput
  • +50% VRAM (+8GB)
  • +29% VRAM bandwidth
  • -42% FLOPS
  • +16% ray-tracing performance
  • No tensor cores
  • -17% price (-$200 USD)

Also, @Slobodan, with that shader clock adjustment there's no longer any question: the XTX is the way to go. Far, far better value.
 
When are the reviewers going to get their hands on the new AMD cards for benchmarking?
 
Bad news correction from yours truly for AMD fans.
[...]
The reference card looks really nice; I'd be happy to get one, as it would look great with my system/case. I also want to see what design XFX comes up with. Imo they've had the nicest-looking cards out of any AIB. I love the look of my 6800 XT.

When are the reviewers going to get their hands on the new AMD cards for benchmarking?
Maybe in the next few weeks; that's usually how it goes. But they won't be able to release any benchmarks until the NDA is lifted.
 
lol bought a 2nd copy of diablo 2 resurrected for pc today. running two pcs off two tvs side-by-side, and my g613 keyboard and g604 mouse can switch inputs at the push of a button, so it's really convenient for me

i really dont like having to change characters all the time in diablo just to move stuff around, and ive got a really bad hoarding problem and 12 online characters just wont do it for me.

i was doing it before with my xbox series x and an xbox one x, and i could open up basically as many accounts as i wanted. unfortunately the loading times on the xbox one x were abysmal because it doesn't have an ssd. it was hilarious: on my series x and the tv it's hooked up to, i could see my other character join the game, and i could even move around for almost a whole minute before the loading screen on the other tv would finally go away and let me play.

and it was a chore to have to keep setting up games all the time because of those long loads, especially if i was just farming one area and then restarting, so i kinda gave up on that after a while. eventually i bought a pc copy because trading on consoles is shit.

but it's 50% off right now on pc, so i ended up buying a second copy, because pc won't let me game "share" the way xbox does with anyone who owns a digital copy on a console with an active xbox live account

i thought i was crazy for buying this game 3 damn times, 4 technically if you count the original diablo 2. but i've also bought gta 5 on four different platforms, and diablo 3 on three. i guess i'm a sucker.

anyways, this oughtta speed things up a little. now i've got an extra 5 stash tabs to just dump my shit into, and i can sort it all later while i keep on farming. gonna make muling great again lol
 
oh fuck me, the game won't run on my older pc. at first it was giving me a driver error, but it turns out the game only runs on windows 10 and 11

well i guess i gotta figure out if windows 10 will let me share the same license key on multiple machines and then upgrade my older pc, otherwise i gotta find a fucking pirated copy of windows 10, buy it again, or accept that i wasted 25 bucks for nothing. i can now use two accounts on my main pc, but it's essentially useless because i can't transfer items between them; i'd have to go back to using my consoles for that. what was supposed to save me a bit of time and inconvenience turned out to be a waste of time and an inconvenience.

i totally forgot about that windows 10 requirement lol. whoops, i should have known! that's why i originally ended up buying the game for my series x first. i didn't do my next build until like half a year after i bought the game on console, and that's when i finally caved in to windows 10.

alright bill gates, you win. damn you to hell! won't let me play a 20 year old game on a somewhat modern pc. way to piss in my corn flakes, asshole.
 
And only an NVIDIA fanboy would be dumb enough to believe that shit.

It's troubling that you still believe this.

In their GPU presentation, AMD excluded Nvidia's performance from all their charts, and focused a good amount of time on 8K gaming for their "budget" GPU.
 
It's troubling that you still believe this.

In their GPU presentation, AMD excluded Nvidia's performance from all their charts, and focused a good amount of time on 8K gaming for their "budget" GPU.
WTF does that have to do with the price of tea in China?
 
Presenting the performance of a $900-1000 GPU labeled "budget" on a display that costs over two thousand dollars.
Are you being deliberately obtuse, or is this an instance where you're just genuinely lost?
https://en.wikipedia.org/wiki/What's_that_got_to_do_with_the...?

I mocked you for believing a rumor that AMD was withholding GPUs for review, and you respond with this unrelated nonsense about marketing spin. What does marketing spin have to do with your gullibility?
 
What does marketing spin have to do with your gullibility?

Enjoy another paper launch where GPU manufacturers think doubling the price is acceptable for a resolution the majority of gamers won't be using until over a decade from now.
 
Enjoy another paper launch where GPU manufacturers think doubling the price is acceptable for a resolution the majority of gamers won't be using until over a decade from now.
Stop objecting to being labeled an NVIDIA fanboy. It's on full display here.
 