Tech Gaming Hardware discussion (& Hardware Sales) thread

The 16-pin power connectors on a few 4090s have apparently been burning up. At first I thought this was a joke, but no, apparently it's real, or at least seems real. It uses the new 12VHPWR connector with the sense pins from Intel's ATX 3.0 spec, and the recommendation is to upgrade to a power supply equipped with that feature and to follow proper handling of the cable.

NVIDIA-GeForce-RTX-4090-Graphics-Card-16-Pin-Connector-Burned-Melted-_1-555x740.webp





It was recently spotted that the NVIDIA 16-pin adapter cable for the GeForce RTX 4090 graphics cards is just too thick and not easy to manage in smaller PC cases, as it hits the side panel really often. So users will have to bend the cable a lot to close the panel, but once again, this bending is not recommended, as it could heat up the cables and cause the adapter to burst into flames.

These monster GPUs require careful cable routing to keep the connector from overheating and burning up as the card heats up. This card may be a bridge too far as far as the requirements go, but until more problem cases turn up, we'll see.


NVIDIA-GeForce-RTX-4090-Graphics-Card-16-Pin-Connector-Burned-Melted-_4.png



Holy crap. This is terrifying
 
I put together two options for myself, an AMD and an Intel build. I'm leaning towards the AMD build because of the lower estimated power consumption. Is it safe to assume that less power = less heat dissipation? I've always hated how a powerful PC can turn my office into a sauna. The price points on these builds are basically equal, but there are some pros and cons to each. IMO AMD pros are: lower power, less heat, and the possibility of a CPU upgrade without a MB + RAM upgrade in ~3 years. Intel pros: a more premium MB (no plans to OC, so maybe I can reduce cost here), and maybe better performance in certain programs/circumstances, but for gaming I think they are about neck and neck. Am I missing anything here?
AMD build
You've got it in a nutshell.

Although "heat dissipation" is the rate at which heat is being removed, not the rate at which it is being generated. You are concerned with the former when you are considering your CPU cooler, case airflow, and fans; you are concerned with the latter when you are considering how much heat your components-- including your CPU-- are producing, which you will have to cope with.

Yes, power consumption and heat production have always been inextricably linked. That's thermodynamics. However, it's gotten a bit more complicated recently. First, just as with the last generation, Alder Lake, know that Intel's power consumption in gaming typically isn't nearly as rapacious as its power consumption in stress testing. There is a much bigger gap between the two than there is for AMD. Second, due to a change in how the CPUs operate, know that in the most demanding games the AMD processors will scale up to their 95C limit and let that thermal ceiling dictate their performance limit. So the only time they don't hit 95C is when the CPU cooling is strong enough, and the task isn't so demanding, that there is overhead left to operate at max performance without hitting the thermal ceiling.
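To put a rough number on the power-equals-heat point, here's a quick back-of-the-envelope sketch. The wattage delta and session length are made-up illustration values, not measurements of either build:

```python
# Essentially all of the electrical power a PC draws ends up as heat in
# the room, so a build that draws less power warms your office less.

def heat_energy_kj(watts: float, hours: float) -> float:
    """Energy dumped into the room over a session, in kilojoules."""
    return watts * hours * 3600 / 1000  # W * s -> J, then J -> kJ

# Hypothetical example: one build draws 150 W more than the other while
# gaming. Over a 3-hour session that's an extra:
extra_kj = heat_energy_kj(150, 3)
print(f"{extra_kj:.0f} kJ of extra heat")  # 1620 kJ, i.e. the same as
                                           # running a 150 W space heater
                                           # for 3 hours
```

That's why the "sauna office" effect tracks total system power draw, regardless of how good the CPU cooler is: the cooler only moves the heat from the chip into the room faster.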

Here's what that means. Below is a chart from TechPowerUp testing the 7950X on three different coolers: the Wraith Spire, the Noctua NH-U14S, and the Arctic Liquid Freezer 420mm. The U14S is a moderately less powerful cooler from Noctua than the NH-D15. As you can see, in rendering tasks, none of the CPU coolers could prevent the 7950X from hitting the thermal limit-- not even the monstrous 420mm liquid cooler. In gaming, the Wraith Spire couldn't prevent it hitting this limit. The Noctua only kept it below 80C at 80% and 100% fan speeds, and never got below 75C (historically a ceiling we tried to stay under for optimal CPU longevity).
https://www.techpowerup.com/review/amd-ryzen-9-7950x-cooling-requirements-thermal-throttling/
temperatures.png


But that's the top end, where Intel and AMD have gotten a bit crazy competing for dominance. This is another reason why we avoid the 13900K and 7950X for gaming. Much more directly to the point, under gaming load, on that Noctua NH-U14S there isn't much difference at all between the 13600K and 7600X: just 2C (72C vs. 70C).
cpu-temperature-gaming.png



One last departing pro tip for you and the forum. You want a stronger SSD. The 970 EVO is an older, PCIe 3.0 x4 drive. The read/write ceiling for that interface is 3.94 GB/s. You want a PCIe 4.0 x4 drive. PCPP finally added a filter to make this easier. It's the "M.2 PCIe 4.0 X4" under "Interface" on the left.
https://pcpartpicker.com/products/i...rt=ppgb&page=1&A=1000000000000,22000000000000

If I were buying today, I'd opt for the "ADATA Premium for PS5". $90 for 1TB.
https://pcpartpicker.com/product/kX...-m2-2280-nvme-solid-state-drive-apsfg-1t-csus
7600 MB/s Read, 6800 MB/s Write. Has its own DRAM (not HMB offloading to your own DRAM). TLC NAND. Micron-manufactured. Quad-core controller & channel configuration. This thing shits on the 970 EVO.
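As an aside on where that interface ceiling comes from, here's a quick sketch of the PCIe math. The link rates and 128b/130b encoding figures are from the PCIe 3.0/4.0 specs; this is the theoretical one-direction bandwidth, and real drives land below it:

```python
# PCIe 3.0 signals at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both using
# 128b/130b line encoding (128 payload bits carried per 130 bits on the wire).

def pcie_ceiling_gb_s(gt_per_s: float, lanes: int) -> float:
    """Theoretical one-direction bandwidth of a PCIe 3.0+ link, in GB/s."""
    payload_fraction = 128 / 130               # 128b/130b encoding overhead
    bits_per_s = gt_per_s * 1e9 * lanes * payload_fraction
    return bits_per_s / 8 / 1e9                # bits -> bytes -> GB

print(f"PCIe 3.0 x4: {pcie_ceiling_gb_s(8.0, 4):.2f} GB/s")   # 3.94 GB/s
print(f"PCIe 4.0 x4: {pcie_ceiling_gb_s(16.0, 4):.2f} GB/s")  # 7.88 GB/s
```

So a Gen4 x4 drive has roughly double the interface headroom, which is how a drive like the one above can advertise 7600 MB/s reads while the 970 EVO tops out around the Gen3 ceiling.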
 

Yeah dissipate wasn't the right word.
So since there isn't a major difference in heat, just 2 degrees, it sounds like the real-world differences are negligible? Are there any major drawbacks to either system I might be missing? I read your post about the AM5 socket being new and the possibility of it staying a standard longer than Intel's, which is why I looked into it.

I might get a 3090 from a friend for a song. If that happens, I'll have an extra $400-500 in my budget to play with.
 
If you don't care about overclocking then just get a 12400 (or wait for the 13400) for Intel or get a 5600 for AMD.
 
My friend was telling me he underclocks his 3090 and takes a 5% hit in performance but keeps the heat and noise down. Any legitimacy to this?
 
Yes


(also search for 3090 undervolt and you'll get dozens of hits)
 
Indeed, for gaming, there is no appreciable difference in heat output between the 7600X and 13600K. However, in gaming performance, the 13600K holds a more significant advantage-- unless one opts for DDR4 RAM, in which case the 13600K instead gains a significant advantage in terms of value. You can pore over dozens of charts to observe this:


If this were the only consideration, there'd be no question: the 13600K wins. However, that's not the whole picture, based on what you read that I wrote. This is the ground level for the AM5 socket, and with AM4, AMD supported the X370 motherboards for several CPU generations, even across die shrinks, including in some cases the ability to run Zen 3 5000-series CPUs on the original X370/B350 motherboards. Running a 5000-series CPU on those oldest motherboards isn't always possible, isn't entirely without issues, and certainly isn't ideal for realizing the peak performance of a 5000-series CPU across the RAM & SSDs, but they made it an option for some motherboards, and that is damn impressive.
Ryzen-Motherboard-Chart-Slide-scaled.jpg


To this point, the ASUS TUF X670E-Plus (and its WiFi variant) has been the most thrifty option I've seen, when at its best price point the past few weeks, for anyone considering such a road ahead. Don't count on these MoBos being great upgrade-support options for the best CPUs (& other hardware) in 5 years, but out to 3 years from now, I estimate, they will still be able to support robust new standards.
 
Trying to decide if I should look into getting a gaming laptop semi-soon (Black Friday/Cyber Monday, maybe?) with, say, a 3070 Ti, or if I should wait until February or March to see what the 4000-series laptop offerings look like-- maybe I'd be better off with, say, a 4060 that would (theoretically) be able to take advantage of DLSS 3.0.
 
Dude, I've been looking at laptops A LOT lately. I'm in the same boat. There are so many pros and cons for each model. Lenovo and Dell seem the safest options based on their warranties. Also, Costco has a 90-day return policy and an extra year of free warranty. Less selection on laptops, though, but worry-free, so I'd suggest seriously considering Costco. I'd avoid Memory Express, as it's only a 7-day return, so you could get stuck with a lemon. Best Buy gives 14 days, but they put up a fight last time I returned an open-box laptop that was defective.
 
DLSS 3.0 is turning out to be a pretty big deal, at least from Nvidia's end-- at least judging by the number of DLSS 3 games I have seen so far. I have to admit I was not too jacked about the 4060, 4070, or 4080, but this seems to make a strong case for them.
 
Just when I was getting happy about the 13000 series of Intel processors, Meteor Lake may be out early, in an end-of-December-to-February time frame, maybe a little longer since Intel has let timelines slip. It's the first of a new class of CPU that you could say is part of Intel CEO Pat Gelsinger's era of chips. Pat let it slip during a conference call and was more than happy to talk about it.

"Intel 4 Meteor Lake has now successfully booted Windows, Chrome, and Linux. The speed at which the team was able to achieve this milestone is a significant sign of the health of both Meteor Lake and our Intel 4 process technology."

— Pat Gelsinger, Intel CEO

This is an older graphic based on early information about the timeline; Pat or one of his engineers talked about shooting for a late December release. And "Intel 4" apparently doesn't mean 4nm-- it's just a name, with the process apparently built on what was formerly called their 7nm node?
Intel-Meteor-Lake-700x426.jpg
 
The conference call was more just to appease/hype up shareholders. There's no way desktop Meteor Lake is coming anywhere close to that time frame, for 2 reasons:

1. They still have like 10 different 13000 series chips along with 700 series B boards that are all releasing around CES 2023 (should roll out Jan/Feb).

2. Zen 5 (which is what Meteor Lake is truly competing against) isn't even slated to come out until at least the second half of 2024.

This happens every generation, people see "oh it's coming early" and it ends up being prototypes, laptops, and/or OEM only for like 9 months.
 
I know this is probably going to be a stupid question, but has anyone noticed that you're getting slight screen tearing with G-Sync unless you also toggle V-Sync ON in the game? Am I doing something wrong?
 
I don't understand why. The 13900K is just 5% stronger in gaming according to both Techspot (aka Hardware Unboxed) and Tom's Hardware's suites. It has 8 more total efficiency cores than the 13700K, and thus only 8 more total threads. Whether or not you choose to upgrade that in several years as a matter of strategy is irrelevant to the fact the 13900K isn't going to outlive the 13700K as a relevant gaming processor for an appreciable period of time. Their window as viable gaming processors, whether for Ultra/High settings, or to meet mandatory minimums, is going to be nearly identical.

I don't understand why you favor the inferior strategy for the long-term of going huge, then not updating for a decade. You can wait, but I doubt it behooves you much, because the focus for Intel & AMD going forward is on power efficiency. But since that's what you're doing, you might as well do it. Run that 3820 into the ground. Upgrade only once the game sputters, and won't start.

Well, you jinxed it, my friend ;) Just had this message trying to run Uncharted:

upload_2022-10-30_21-58-18.png
 
The reason for AMD's delay is the tiled architecture, where I'm hearing Intel has had a bit more success at producing these chips. So the timeline could favor Intel releasing these chips first. As far as overlap, I would imagine these chips use an entirely different motherboard, so only 14th-gen chips will go into those boards; the 600 and 700 series will be home for the 12th and 13th gen versions, and people will have to spend money to move up to 14th gen. So people with those boards will still buy 13th-gen chips.
 