Tech Gaming Hardware discussion (& Hardware Sales) thread

@Madmick are there any gaming benchmarks for the non-K series? I looked all over youtube and the web but only found cinebench. If you have something please do share
 
Your Google isn't broken; there's just nothing out there in the way of sophisticated benchmarking that compares these processors running PL2 at higher power limits against stock. You'll have to wait to see if there's a more pronounced difference in gaming than there was with the 12th gen.

In the meantime, perhaps so you'll stop obsessing over the 13900K:
https://www.techpowerup.com/review/...-i9-13900k-e-cores-enabled-vs-disabled/2.html
0.1% faster at 4K or 0.5% faster at 1440p with E-cores enabled.

In other words, those extra cores give you virtually nothing. It's all about the frequency of the 8 P-cores. That's where all of the 13900K's (small) advantage over the 13700K in gaming benchmarks comes from.
 
Sometimes it’s about having the biggest e-peen.
 
Indeed. Although when I saw this I also realized I'd overlooked that the i9 has an extra 6MB of cache for its eight P-cores (36MB vs. 30MB). That contributes, too. But really it pretty much all comes down to binning: the 13900Ks are culled as the cream of the crop for frequency potential at the same power consumption.
 
well even though my strix 2080 is still a rather capable card, ive been meaning to get a new gpu. not that i really need to. i kinda just want to.

the thing is im only running a 750 watt psu and my 11700k is a bit power hungry. sometimes when im just gaming the thing can draw around 170 watts, but thats the highest ill ever get it to draw just from using my pc. usually its around 100-140 watts, but sometimes itll draw more. i can make it take 200 watts and even 250+ if i torture test it with multiple programs, but i will likely never see that kind of power draw unless im specifically trying to push it with avx torture tests.

but for the most part, with all of my fans and my drives and everything, i'm stretched a little thin with my 750 watt supernova g3. sure its gold rated and a good psu that can deal with spikes and run a little over its rated wattage if it had to, but im not about to try to push it to its limits. so my upgrade options are limited just because i didnt predict the future and go with an 850 or 1000 watt psu to be safe.

im not about to buy a new psu and tear apart my pc to wire in a new power supply, so a 4090 and a 7900xtx are out of the question. if i had gone with a 5800x instead of my 11700k i could probably get away with an xtx. a 4080 would cut things a little too close and im way too wary of it, because i know the aftermarket 4080's are factory overclocked and will draw well over the 320 watts they are rated for, and 320 watts is right on the tipping point of what my current setup can handle. heres the rough back of the napkin math behind that tipping point, just as a sketch (the rest-of-system number is purely a guess on my part):
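
```python
# quick psu headroom sanity check -- ballpark figures only, not measurements.
# the cpu/gpu numbers are the ones i quoted above; "rest of system" is just a guess.
cpu_gaming_w = 170        # about the most my 11700k pulls in games
gpu_board_power_w = 320   # 4080 rated board power (aftermarket cards pull more)
rest_of_system_w = 75     # guess: board, ram, fans, drives, usb stuff
psu_capacity_w = 750      # evga supernova g3 750

total_w = cpu_gaming_w + gpu_board_power_w + rest_of_system_w
headroom_w = psu_capacity_w - total_w

print(f"estimated draw: {total_w} W")    # 565 W
print(f"headroom left:  {headroom_w} W")  # 185 W on paper
# transient spikes on modern gpus can briefly blow way past rated board power,
# which is why a 320 watt class card feels like the tipping point on a 750 watt unit
```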

plus all the good aftermarket 4080's are either sold out or marked up to absurd canadian prices, and im not going all out on a 4090 either since its overkill for my display

that leaves me with a 4070ti and a 7900xt as my most feasible options.

i have a sony x930e tv with a 120hz panel, but it doesnt have an hdmi 2.1 port so i can only run my shit at either 4k 60hz or 1440p/1080p 120hz. the processor and upscaler in my tv are really fucking good, so i prefer to run my display at 1440p 120hz instead of its native 4k just because of all the extra frames and the lower input delay, with really no discernable difference in visual fidelity. even though 4k 60fps is fine for me, 1440p 120fps is my target. they both take roughly the same amount of gpu power to hit.

my 2080 is "showing its age" in most newer games when i have to turn down some settings a bit to achieve that target. i figure to hell with it. i wanna go back to just buying a new game, throwing my shit to ultra and forgetting about it.

so ive been doing my research between the two cards, and quite frankly the 7900xt's extra vram and wider memory bus are quite appealing. but a good aftermarket 7900xt is gonna set me back $200-$400 more than a good 4070ti, and at that point id rather just spend even more and get a new psu and a 4080 or an xtx, and i dont really want to go that route. plus theres a big thermal design flaw with the xtx which has kinda put me off it as well

the other thing about the 7900xt that has put me off is the drivers, and only being able to set a frame target instead of a hard framerate cap inside of the control panel. screen tearing has been an issue when im pushing more frames than my refresh rate, and i prefer to just set a hard cap instead of running vsync and adding the extra input lag and slowdowns

but the final nail is that my display's native setting over hdmi is 4k 60, and i have to enable custom resolutions either through my xbox series x or my nvidia control panel in order to get 1440p 120hz to work. and while i can enable a custom resolution through AMD's control panel, a lot of 7900xt and xtx users have been reporting issues getting their custom resolutions to save and work with the new AMD drivers, when they worked just fine with their nvidia gpu's.

its been hit or miss with these new AMD cards to see whether or not custom resolutions will play well with their displays. and i absolutely dont want to take the chances and spend all this money on a new gpu just to find out that i cant get 120 hz to work with it, and have to buy a native 1440p 120hz panel when i dont fucking need to and it works just fine with both of my nvidia cards.

i also have another 50 inch 1080p tv beside my x930e, and i wouldnt mind running multiple displays, even though i already have a separate pc hooked up to it. but right now there is an issue with the 7900xt where it consumes 100 watts of power just at idle while running multiple displays. AMD has acknowledged this and i'm sure it will be fixed eventually, though.

but still, there seem to be a whole lot of driver issues with these cards, and its been hit or miss for people. after thinking long and hard, i just want to stick with nvidia. i only plan on gaming at 1440p 120fps or 4k 60, and despite the cut down memory bus and the 12 gb of vram, from what i'm seeing in all the benchmarks the 4070ti seems like a more than capable card for that.

i dont want to wait around for years for the next generation of cards to release in case they end up not being much of a step up, only slightly better but more expensive....like when i went from a 970 to a 2080 when i should have gone from the 970 to a 1080ti, because i expected the 2000 series to be as big a step up as pascal was over maxwell. i dont want to wait and take that chance either.

so ultimately i dont want to end up paying out the ass for a new psu and a 4080 or 4090 which is a stud card too but costs way too much money to justify the price tag if all i want is 4k60 or 1440p120 for years to come, and i've pretty much ruled AMD out just because of the drivers.

so i pulled the trigger on an asus tuf gaming 4070ti. my last two gpu's were asus strix cards, but at the current price of a strix 4070ti i may as well just buy a cheaper brand of 4080 and see if my psu could handle it.

a 4070ti seems like a good enough upgrade over a 2080, but man the gpu prices these days are a little discouraging to say the least. the tuf 4070ti was just a few bucks over the $799 USD MSRP but i can work with that. to be honest i'm not happy about the prices of any of these new generation cards, but it is what it is and i'm just going to swallow my pride.

ive been on the fence about this for a while. i mean my 2080 does a decent enough job, and itll still be fine for years to come. not sure if i want to try to sell it, or retire my 970 once and for all and park the 2080 back in its old home to run alongside my trusty old 4690k. i refer to that setup as bottleneck city. back then i wondered why i couldnt find any completed builds on pcpartpicker that paired a 2080 with a 4690k. then i bought a 2080, stuck that bastard in the machine, and quickly found out why. my 2080 was bored as hell most of the time while my cpu was always pegged to the limit. i couldnt even hit 120fps in most games just because of the cpu. going from the old i5 haswell to an 11700k has been a significant improvement. im a bit bummed i didnt wait the extra 6 months for the 12700k to be released before building, because the newer series of cpu's seem to be a bit more impressive, but i'm not gonna lose any sleep over it. i think a 4070ti + 11700k setup should be able to outlast this console generation and be good for some time to come

but yeah thats where i'm at. fuck turning down settings from ultra to high just to hit my frame target. that shit is for peasants! im just gonna go and buy an overpriced gpu!

the 4070ti is a really fast and capable card though. ive never cared much for ray tracing, but after seeing the gains from DLSS, i am sold on that. im not sure what to make of this new frame generation technology, but i probably wont dabble with it as long as i can hit 60 fps at 4k or 120 fps at 1440p some other way.

i also very much like the power draw of the 4070ti, and this particular asus card to begin with. its factory overclocked and still rated to draw 285 watts, same as the reference spec, though in all the benchmarks im seeing its always drawing less than that, typically in the 260's, and the max theyve been able to get this card to draw was 293 watts. my 2080 overclocked draws in the 270 watt range, sometimes even into the 280's, so i have no power concerns at all with this card. its actually rather impressive how much more powerful it is than my 2080 while using the same and often even a little less power. the power draw is almost a selling point for the card, and it puts my power limit paranoia to bed!

ive always been an nvidia guy, and aside from the pricing, the only things that irk me about the 4070ti itself are the 12 gb of vram compared to the rest of the newer generation cards, and the cut down memory bus. but this shouldnt be an issue at all for me at 1440p, and the general speed and power of the card should be able to make up for the smaller bus. i dont think i need to worry about that, but it would definitely be an issue if i was trying to drive really high framerates at 4k on ultra. but if i was doing that i would rather go all out on a 4080/4090.

the only redeeming thing about the gpu prices these days is that newegg is giving free shipping on them. at least for the one i ordered. that would be a kick in the dick if i had to pay an extra 40 bucks or whatever on top of something thats already priced a few hundred bucks more than what it should be.
 
fuck i went off there, but it is what it is. ive been meaning to upgrade my gpu for a while now, but aside from the stupid pricing, theres been a fair bit of logistics involved, mainly because of my concerns with power draw and my 750 watt psu

think i'm gonna be happy enough with the 4070ti. to be fair i probably would have bought the 7900xt...had it not been $200 more expensive for a half decent aftermarket card, and had it been an nvidia card. i have too many concerns with AMD's software, drivers, and whether or not it would play nice with my custom resolutions. its been hit or miss for people all across the board, and i'm not about to take my chances when im spending over a thousand dollars on a computer part.

i might have found a way to justify the ridiculous price difference between two cards that more or less perform the same, at least at 1440p which is what i'm mostly interested in, but it was definitely the AMD drivers and software thing that put the final nail in the coffin for me.

that extra 8 gb of vram in the 7900xt might have been nice at 4k, but unless i buy a newer more capable display that can do 4k 120hz, i'll likely never even be able to take advantage of that and i'll probably just stick to 1440p. and i am perfectly happy with my tv and i dont really have any more room in here to park another new display anyways.
 
I gave up after the second sentence. Only to scroll down to see a dear diary essay....lol


yeah this shits been a dilemma for me for like the last 2 months. figured id have a good bit of a rant here. ive been pounding my head against the wall for the last month or two trying to decide if i should upgrade my shit or not. i know i dont need to, but at the same time for some reason ive been wanting to

finally i just decided to say fuck it and bite the bullet. then i realised there was a bit of logistics as to how i could even go about doing it and what i should buy. i think ive got that ironed out now.

i still dont like these prices one bit. still i cant control the prices, and im not gonna wait for the next cryptomining craze or whatever to drive this shit off the market completely, so i'm just gonna ignore the price tag and just try to enjoy this thing for what it does rather than how much i had to pay.

its still a kick in the dick with these prices though. i'll have to say that one last time. probably not the last time actually. but it is what it is and i cant really do anything about these corporations being stingy with their prices. and im not gonna sit on my 2080 for a couple more years waiting to see how much farther they want to bend me over for the next series of cards either. fuck it. you got me there nvidia. well played i guess.
 
Not sure if this is the best place to ask, but for my gaming laptop should I buy a cooling pad? Or only if I’m actually going to be putting it on my lap if I’m on say a couch as opposed to just sitting at my desk?

Also, if I want to occasionally hook it up to my TV are there any recommendations for a good wireless keyboard + mouse combo (I say combo as my laptop apparently only has one usb port) or should I just not do that and instead get a good wired mouse?
 
Microsoft's DirectStorage, which debuted on the Xbox Series X and has now been brought to PCs, is turning out to be the truth. Not only does it spare CPU resources, but nothing will load games more quickly if developers utilize the technology with strong compression.
https://www.tomshardware.com/news/directstorage-performance-amd-intel-nvidia

Decompression, and therefore load time potential, is 3x as fast when offloaded to the GPU. Surprisingly, Intel's Arc GPUs were bettering both AMD and NVIDIA, not that the difference between the three was substantial.

Yes, shit, sorry, I just realized how confusing my last post could be.

The point of DirectStorage is that you can transfer data in compressed form, then the CPU or GPU decompresses it. This means you can effectively exceed the bandwidth ceiling the protocol would have if it were shuttling the data in uncompressed form. These are those limits:
  • NVMe PCIe 4.0 x4 (SSD) ---> 7.9 GB/s ceiling bandwidth
  • NVMe PCIe 3.0 x4 (SSD) ---> 3.9 GB/s ceiling bandwidth
  • SATA III (HDD/SSD) -------> 0.6 GB/s ceiling bandwidth

If a game uses DirectStorage, yes, your 7900 XTX can effectively achieve a 12.6 GB/s exchange rate on your PCIe 3.0 x4 SSD. As you can see, that's over three times the peak possible on that protocol (assuming your SSD actually achieves read/write speeds close to the ceiling, and most fall short by quite a ways; only the best 3.0 x4 SSDs come close).
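
To put the arithmetic behind that "effective" number in one place, here's a minimal sketch; the compression ratio is an assumed value chosen to roughly match the 12.6 GB/s figure, not a published spec:

```python
# what the "effective" bandwidth means: the link still moves bytes at its
# physical ceiling, but each compressed byte expands into more usable data
# after decompression on the GPU (or CPU).
# NOTE: the ~3.2x ratio is an assumption picked to roughly line up with the
# 12.6 GB/s figure above, not a measured or published number.

link_ceiling_gbs = {
    "NVMe PCIe 4.0 x4": 7.9,
    "NVMe PCIe 3.0 x4": 3.9,
    "SATA III": 0.6,
}
assumed_compression_ratio = 3.2

for link, raw in link_ceiling_gbs.items():
    effective = raw * assumed_compression_ratio
    print(f"{link}: {raw} GB/s on the wire -> ~{effective:.1f} GB/s effective")
```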

And yes, I don't see any reason why a game that was built from the ground up to support DirectStorage on the Xbox couldn't do the same on PC. This will become a standard.
DirectStorage Causes 10% Performance Hit On RTX 4090 In Forspoken
Oof. Obviously we were all expecting a slight performance hit to the GPU due to it handling another demanding task during load sequences, but an ~11% hit on the 4090 (of all GPUs) with the fastest PCIe 4.0 x4 SSDs is definitely steep. Hopefully that's something that diminishes with fine-tuning.
 
Not sure if this is the best place to ask, but for my gaming laptop should I buy a cooling pad? Or only if I’m actually going to be putting it on my lap if I’m on say a couch as opposed to just sitting at my desk?

Also, if I want to occasionally hook it up to my TV are there any recommendations for a good wireless keyboard + mouse combo (I say combo as my laptop apparently only has one usb port) or should I just not do that and instead get a good wired mouse?

cant help u with the laptop cooling stuff, but you should just get a usb hub for your laptop so you can hook multiple usb devices up to it. those things are rather cheap and easy to find in stores.

as for wireless keyboards and mice, everybody has their own preference. me, i go with a logitech G613 mechanical keyboard and a logitech G604 mouse. they both use a usb dongle but will also work over bluetooth. this is very good for me because i run two computers side by side on multiple displays, and i can switch devices with just the press of a button on both my keyboard and my mouse. this saves me from having to park a 2nd mouse and keyboard in front of me, or having to get off the couch and start unplugging usb dongles or cords from one computer to another just to switch devices. they will also work with consoles, and if you use the onboard memory modes they will remember all your mouse button and sensitivity settings and your keyboard hotkeys no matter which device you connect them to. so you dont have to fuck around with software on multiple devices or anything like that just to use all your extra buttons and stuff

the battery life on these devices is insane, and in usb dongle mode the input lag is non-existent. bluetooth connectivity is pretty good too, but i wouldnt recommend bluetooth if youre going to be gaming with it. theres about as many buttons as youd ever need for gaming on the mouse as well, and its about as ergonomic as its gonna get for something with that many buttons anyways.

ive used my fair share of mice and keyboards and right now ive been using the logitech G613 + G604 setup and i couldnt be happier with it. but i dont know what your budget is, or what these things are selling for these days. i think theyve come down a lot in price, but i paid a fair bit when they were first released. you can probably find something cheaper that would work just as well and get the job done.

i used to run a g700s mouse, which was wireless but had a usb connector in the front. i preferred to have it wired and almost always ran it plugged in, because the battery life on that mouse was utter shit and the thing was a little on the big and heavy side, to where it would start sliding off the arm rest of my couch and hitting the floor sometimes if it wasnt plugged in. it was a good mouse for what it was, but it was a bit too big and had no option to adjust its weight either. these days i just go wireless. theyve got the technology down now to the point where the input delay is all but eliminated and you will never notice the difference between wired and wireless, at least with logitech and their lightspeed usb shit anyways. and its always a bonus to not have to run usb cables and extension cables all along my living room anymore.
 
DirectStorage Causes 10% Performance Hit On RTX 4090 In Forspoken
Oof. Obviously we were all expecting a slight performance hit to the GPU due to it handling another demanding task during load sequences, but an ~11% hit on the 4090 (of all GPUs) with the fastest PCIe 4.0 x4 SSDs is definitely steep. Hopefully that's something that diminishes with fine-tuning.
They put a retraction in the article, so some good news:

Update 01/27/2023 6:30 pm PT

PC Games Hardware has retracted the original claim that DirectStorage affects frame rates. The outlet used CapFrameX, a utility based on Intel PresentMon, to record the frame rates. Unfortunately, the publication didn't consider that the Forspoken benchmark contains black screens with high FPS that affect the average frame rate.

Logically, the SATA SSD took longer to load than the M.2 SSDs, resulting in higher overall frame rates. So while the data was accurate, the conclusion was wrong. PC Games Hardware has issued the following statement (machine translation) on its YouTube channel:

"After some questions came up: PCGH measured graphics card and SSD with full PCI Express connection. However, the measurement did not take into account that a slower SSD has longer loading phases with a black screen that has very high fps. This falsifies the values and we will take a look at Forspoken again in a detailed test - then hopefully with GPU compression deactivated in order to work out more precisely where the advantages and disadvantages of Direct Storage lie."
 
Not sure if this is the best place to ask, but for my gaming laptop should I buy a cooling pad? Or only if I’m actually going to be putting it on my lap if I’m on say a couch as opposed to just sitting at my desk?

Also, if I want to occasionally hook it up to my TV are there any recommendations for a good wireless keyboard + mouse combo (I say combo as my laptop apparently only has one usb port) or should I just not do that and instead get a good wired mouse?

If you want a keyboard/mouse for basic things like clicking YouTube videos, the Logitech K400 is the way to go. It’s usually under $25, and the battery life is insane. It requires 2 AAA batteries, but I don’t remember the last time I changed the batteries. I want to say it’s been close to a year. I never turn it off and it gets used daily.
It’s not a gaming keyboard though.
 
@Law Talkin’ Guy
Seconded on this keyboard, i've got one for my Media server and have had no problems with it.
 