Tech Gaming Hardware discussion (& Hardware Sales) thread

Oh, I'm surprised I missed this headline a few days ago when it dropped. There is finally a new most popular desktop GPU. The GTX 1060 is dead; long live the GTX 1060 (it held this title longer than any other GPU). Below, I've ordered the succession of that crown:
https://www.techspot.com/news/96835-new-graphics-card-tops-steam-survey-first-time.html
  • November 2022 --> GTX 1650 (launched Jun-2020)
  • December 2017 --> GTX 1060 (launched Jul-2016)
  • June 2017 --> GTX 750 Ti (launched Feb-2014)
  • November 2015 --> GTX 970 (launched Sep-2014)
  • December 2013 --> Intel HD 4000 series (launched May-2012)
  • July 2012 --> Intel HD 3000 series (launched Feb-2011)
  • Prior to the above --> GTX 560 (launched May-2011)
 
What exactly is the GTX 1650, and where does it fit in the Nvidia lineage? If it came out in June 2020, that would make it only a couple of months older than the 3070/3080/3090, right? And younger than the 2000 series, which I thought launched in… 2018?
 
What exactly is the GTX 1650, and where does it fit in the Nvidia lineage? If it came out in June 2020, that would make it only a couple of months older than the 3070/3080/3090, right? And younger than the 2000 series, which I thought launched in… 2018?
With AMD, I find it makes more sense to look at architecture: GCN 1.0 --> GCN 2.0 --> GCN 3.0, etc., RDNA 1.0 --> RDNA 2.0

With NVIDIA, because their architecture naming schemes are stupid, it's more logical to look at series: GeForce 900 --> GeForce 10 --> GeForce 16 --> GeForce 20 --> GeForce 30 --> GeForce 40.

The 16 series was a tweener series. The 1650 has flourished because it's the most powerful card that draws under 75W, which means it can run entirely off the PCIe slot and be dropped into any repurposed comp. So even office comp PSUs can run it.
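To put that 75W point in concrete terms, here's a minimal Python sketch of the bus-power test a repurposed office box effectively applies (the board-power figures are the vendors' official numbers, listed purely for illustration):

```python
# A PCIe x16 slot supplies at most 75W, so a card at or under that limit
# needs no external power cable and works in a stock office PC.
SLOT_LIMIT_W = 75

# Official board-power (TDP/TGP) figures for a few budget cards.
cards = {
    "GTX 1650": 75,
    "GTX 1050 Ti": 75,
    "RX 6400": 53,
    "GTX 1660": 120,  # over the limit: needs a 6-pin connector
}

for name, watts in cards.items():
    verdict = "bus-powered OK" if watts <= SLOT_LIMIT_W else "needs a PCIe power cable"
    print(f"{name}: {watts}W -> {verdict}")
```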
 
A worthy alternative. Although it's showing $155 on Amazon for me for the R5-5600 as of this reading, and even at $137, I'm not sure I would sacrifice 200MHz to spare a mere $22. One Andrew Jackson seems worth it. The focus here is gaming, which depends more heavily on that first-core superiority.
[Chart: TechPowerUp relative gaming performance, 1280x720]


Most YouTube head-to-head game benchmarks also show roughly a 3% advantage at stock for the 5600X, which is consistent with what you'd expect from the base/boost clock differences. That's not a large gap, but it's also not much smaller than the gap between the i7-12700K and i5-12600K when you scrutinize gaming specifically at low resolutions, as TPU does above.
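For the curious, here's where that expectation comes from -- a quick Python sanity check using AMD's published clocks (3.5/4.4GHz for the 5600, 3.7/4.6GHz for the 5600X):

```python
# Rough expectation for the stock gaming gap, derived purely from the
# published clock speeds. Real games land somewhere between the two ratios.
base_5600, boost_5600 = 3.5, 4.4    # GHz
base_5600x, boost_5600x = 3.7, 4.6  # GHz

base_gap = base_5600x / base_5600 - 1     # ~5.7%
boost_gap = boost_5600x / boost_5600 - 1  # ~4.5%

print(f"base-clock gap:  {base_gap:.1%}")
print(f"boost-clock gap: {boost_gap:.1%}")
```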


And it would seem to be in line with what Tom's Hardware found: like Gamers Nexus, they showed a marginal overall difference (less than 1%) between the 5600X and 5600, but that's because both reviewers include more editing and general-purpose benchmarks than strictly gaming. Focus on single-threaded performance, which offers the most insight into gaming, and the gap reappears. After all, if you look strictly at Tom's Hardware's 1080p gaming benchmarks, they put the 5600X 4% ahead of the 5600.


You have to go to the other sellers to see the $137 option, like I said in the first post.

Keep in mind Tom's tested with a 3090. Are you going to be pairing a $1,200 GPU with a $150 CPU? No, you're not. If you can afford an 80- or 90-series card, you buy the best of the best for the rest of the system.
When you pair it with something more in line with the price, like a GTX 1650, that fps gap is going to drop down to a 2-3fps difference.
Are you going to notice a 2-3fps difference in game? Absolutely not.
So you'd be paying an extra $23 for what? Being able to see a slightly bigger number on paper, but when it comes to real-world performance, the difference is negligible.
Take that $23 savings and put it towards a better cooler.
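If you want that value argument in raw numbers, here's a rough fps-per-dollar sketch in Python, using the $137/$159 prices from this thread and the 2.1%-4% gap under discussion (illustrative only):

```python
# How much extra money buys how much extra performance, per the thread's
# own numbers. A ratio under 1.0 means the 5600X delivers less fps per
# dollar than the plain 5600.
price_5600 = 137.0
price_5600x = price_5600 + 22.0  # $159

premium = price_5600x / price_5600 - 1  # ~16% more money

for perf_gap in (0.021, 0.04):
    value_ratio = (1 + perf_gap) / (1 + premium)
    print(f"{perf_gap:.1%} faster for {premium:.0%} more money "
          f"-> {value_ratio:.2f}x the fps per dollar")
```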

edit: Lol, I didn't even notice your chart was 720p. Let's look at the actual resolutions that people would be using this chip for, 1080p and 1440p.
[Chart: TechPowerUp relative gaming performance, 1920x1080]

[Chart: TechPowerUp relative gaming performance, 2560x1440]


I circled the button you click to see prices from other sellers:
[Screenshot: Amazon listing with the 'other sellers' button circled]
 
You have to go to the other sellers to see the $137 option, like I said in the first post.
That wasn't clear. I wasn't sure what the "you have to look at other sellers" was about; I thought that meant Amazon as opposed to Best Buy. Your language indicated it was $137 on Amazon itself, and I followed your link. I didn't realize you meant it was shipped directly by Amazon at that price if you looked in the Amazon Marketplace.
The 5600 is $137 on Amazon, sold and shipped by Amazon, and comes with Uncharted.

As for this:
Keep in mind Tom's tested with a 3090. Are you going to be pairing a $1,200 GPU with a $150 CPU? No, you're not. If you can afford an 80- or 90-series card, you buy the best of the best for the rest of the system.
When you pair it with something more in line with the price, like a GTX 1650, that fps gap is going to drop down to a 2-3fps difference.
Are you going to notice a 2-3fps difference in game? Absolutely not.
So you'd be paying an extra $23 for what? Being able to see a slightly bigger number on paper, but when it comes to real-world performance, the difference is negligible.
Take that $23 savings and put it towards a better cooler.

edit: Lol, I didn't even notice your chart was 720p. Let's look at the actual resolutions that people would be using this chip for, 1080p and 1440p.
[Chart: TechPowerUp relative gaming performance, 1920x1080]

[Chart: TechPowerUp relative gaming performance, 2560x1440]


I circled the button you click to see prices from other sellers:
[Screenshot: Amazon listing with the 'other sellers' button circled]
720p minimizes any GPU binding, to keep the gaming performance assessment more pure; you know this. And even at 1080p, it's still 2.1% behind at stock across the gaming gamut. Comparing PBO or overclock results to the 5600X at stock in those charts is misleading, because the 5600X will also PBO or overclock, so you have to compare like with like.

All charts will show this. It's 2%-4% behind in gaming. That's perfectly consistent with the "slightly bigger number on paper". While 2.1% isn't much, again, it's roughly identical to the difference between the 12700K and 12600K in gaming, for example. And that's just the average across the gamut. As any suite will show you, when you look at individual scores, CPUs or GPUs separated by that average will sometimes see much larger fps gaps in a given game. Sometimes 10%-15%. That's less likely with the 5600 and 5600X, given their similarity, but still, some games will lean more heavily on that first-core frequency advantage. That can be the difference between hitting the 120fps or 144fps or 240fps cap for your monitor. No one should dispute the 12600K is the better gaming value, nor would I dispute the 5600 is the better value at these prices, but it's a question of what the buyer desires.

If the difference were greater than $22 I would get it, but here, what's the opportunity cost? That amount won't cover the difference between the 6600 and 6600 XT, for example. There's not much to be done with it in this scenario. The better binning seems worth it. It's up to the buyer.
 
That wasn't clear. I wasn't sure what the "you have to look at other sellers" was about; I thought that meant Amazon as opposed to Best Buy. Your language indicated it was $137 on Amazon itself, and I followed your link. I didn't realize you meant it was shipped directly by Amazon at that price if you looked in the Amazon Marketplace.

I said "The 5600 is $137 on Amazon, sold and shipped by Amazon". How can you not realize I meant to convey it was shipped directly from Amazon? I literally said it shipped by Amazon. Literally. I don't know how I could have been more clear on that.
Then I said you have to look at other sellers. Are you unfamiliar with how to check for other sellers on Amazon? Was this your first time on Amazon?


As for this:

720p minimizes any GPU binding, to keep the gaming performance assessment more pure; you know this. And even at 1080p, it's still 2.1% behind at stock across the gaming gamut. Comparing PBO or overclock results to the 5600X at stock in those charts is misleading, because the 5600X will also PBO or overclock, so you have to compare like with like.

All charts will show this. It's 2%-4% behind in gaming. That's perfectly consistent with the "slightly bigger number on paper". While 2.1% isn't much, again, it's roughly identical to the difference between the 12700K and 12600K in gaming, for example. And that's just the average across the gamut. As any suite will show you, when you look at individual scores, CPUs or GPUs separated by that average will sometimes see much larger fps gaps in a given game. Sometimes 10%-15%. That's less likely with the 5600 and 5600X, given their similarity, but still, some games will lean more heavily on that first-core frequency advantage. That can be the difference between hitting the 120fps or 144fps or 240fps cap for your monitor. No one should dispute the 12600K is the better gaming value, nor would I dispute the 5600 is the better value at these prices, but it's a question of what the buyer desires.

If the difference were greater than $22 I would get it, but here, what's the opportunity cost? That amount won't cover the difference between the 6600 and 6600 XT, for example. There's not much to be done with it in this scenario. The better binning seems worth it. It's up to the buyer.


Let's do some basic math with your 2.1%. If the 5600X is hitting 120fps, the 5600 would be hitting 117.5fps. If the 5600X is hitting 90fps, the 5600 would be hitting 88.1fps. If the 5600X is hitting 60fps, the 5600 would be hitting 58.7fps.
Now let's do some math with the 4%. If the 5600X is hitting 120fps, the 5600 would be hitting 115.2fps. If the 5600X is hitting 90fps, the 5600 would be hitting 86.4fps. If the 5600X is hitting 60fps, the 5600 would be hitting 57.6fps.
If you're telling me you can notice a less than 5fps difference in games, I'm going to call bullshit. Of course, if it drops you below 30fps, then it can have an effect. But we both know that.
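For anyone who wants to plug in their own framerate target, here's that arithmetic as a few lines of Python (same method as above: the gap taken as a straight percentage off the faster chip's number):

```python
# fps the slower chip hits when the faster one leads by `gap` at a target.
def slower_fps(target_fps: float, gap: float) -> float:
    return target_fps * (1 - gap)

for gap in (0.021, 0.04):
    for target in (120, 90, 60):
        print(f"gap {gap:.1%}: {target}fps -> {slower_fps(target, gap):.1f}fps")
```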

The 12700K vs. 12600K comparison is meaningless here; we're talking about the 5600X and 5600.

If you're buying a $150 CPU, you're building a budget system, and every dollar counts.
You could take that $22 savings, throw in an extra dollar, and go from a 512GB SSD to a 1TB SSD. Or it could bump you up to a case with much better airflow. Or, like I posted above, a better cooler. Or better RAM.
 
As for this:

Let's do some basic math with your 2.1%. If the 5600X is hitting 120fps, the 5600 would be hitting 117.5fps. If the 5600X is hitting 90fps, the 5600 would be hitting 88.1fps. If the 5600X is hitting 60fps, the 5600 would be hitting 58.7fps.
Now let's do some math with the 4%. If the 5600X is hitting 120fps, the 5600 would be hitting 115.2fps. If the 5600X is hitting 90fps, the 5600 would be hitting 86.4fps. If the 5600X is hitting 60fps, the 5600 would be hitting 57.6fps.
If you're telling me you can notice a less than 5fps difference in games, I'm going to call bullshit. Of course, if it drops you below 30fps, then it can have an effect. But we both know that.

The 12700K vs. 12600K comparison is meaningless here; we're talking about the 5600X and 5600.
It's meaningful because it provides context for the general difference in the processors' capabilities. Anyone who routinely pores over gaming-suite average-fps differences knows that while 2.1% doesn't look like much, it typically separates entirely different classes of processors within the same generation. I don't understand the purpose of downplaying a general percentage difference when it can often be meaningful. Staying with TechPowerUp, at 1080p, note the following differences:

[Chart: TPU 1080p gaming deltas between several same-generation CPU pairs]

Imagine the above processors were like the 5600X vs. 5600, and we were only discussing games, nothing else. You wouldn't take the 12700K over the 12600K for $22 more? The 3700X over the 3600? The 7700X over the 7600? Even if only for games, even for a mere binning advantage, this cost seems a minor premium.

Because projecting the general average onto every scenario or framerate target can be misleading. For example, Techspot often does rundowns. They were a bit of an outlier in showing the 7600X as faster in gaming than the 13600K-- by 3.1%. Now, one could do the same math and say we're looking at 139 vs. 144 fps, for example, or you can see that the difference from game to game can often be far greater and more meaningful.
[Chart: Techspot 1080p gaming benchmarks, 7600X vs. 13600K]
See that? While the average is only 3.1% overall, the difference in individual games is sometimes far greater. And as you can see, while the 13600K enjoyed its own big win of 14% in Halo Infinite, that is not nearly as great as the 36% advantage for the 7600X in Battlefield V. That small overall advantage will often resolve into these larger per-game swings.

36%. Suddenly we are talking about 60fps vs. 38fps.... 144fps vs. 92fps.... 240fps vs. 154fps.

Granted, the variances won't be as great from game to game as with other processor pairs carrying a similarly narrow average margin, since frequency is the only variable here, but I'm illustrating the point. In games that lean heavily on single-core performance, you will sometimes see these greater variances.
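To see how a ~3% average can coexist with swings like that, here's a toy Python example. The per-game ratios are hypothetical, picked only to mimic the pattern in that Techspot chart (one ~36% win, one game where the other chip leads by ~14%, the rest near parity):

```python
import math

# HYPOTHETICAL per-game fps ratios (CPU A / CPU B); 1.36 = A 36% ahead,
# 0.88 = B roughly 14% ahead. Not measured data.
per_game = [1.36, 0.88, 1.02, 0.99, 1.05, 1.01, 0.97, 1.03]

# Geometric mean is the right average for ratios.
geomean = math.exp(sum(map(math.log, per_game)) / len(per_game))
print(f"average gap: {geomean - 1:+.1%}")   # ~ +3.1%
print(f"per-game spread: {min(per_game) - 1:+.1%} to {max(per_game) - 1:+.1%}")
```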
Jefferz said:
If you're buying a $150 cpu, you're building a budget system, and every dollar counts.
You could take that $22 savings, throw in an extra dollar, and go from a 512gb SSD to a 1tb SSD. Or it could bump you up to a case with much better airflow. Or, like I posted above, a better cooler. Or better ram.
He's not building a budget system, he's upgrading his current comp-- CPU and GPU. Could you not be bothered to read his post?
 
It's meaningful because it provides context for the general difference in the processors' capabilities. Anyone who routinely pores over gaming-suite average-fps differences knows that while 2.1% doesn't look like much, it typically separates entirely different classes of processors within the same generation. I don't understand the purpose of downplaying a general percentage difference when it can often be meaningful. Staying with TechPowerUp, at 1080p, note the following differences:

[Chart: TPU 1080p gaming deltas between several same-generation CPU pairs]

Imagine the above processors were like the 5600X vs. 5600, and we were only discussing games, nothing else. You wouldn't take the 12700K over the 12600K for $22 more? The 3700X over the 3600? The 7700X over the 7600? Even if only for games, even for a mere binning advantage, this cost seems a minor premium.

Because projecting the general average onto every scenario or framerate target can be misleading. For example, Techspot often does rundowns. They were a bit of an outlier in showing the 7600X as faster in gaming than the 13600K-- by 3.1%. Now, one could do the same math and say we're looking at 139 vs. 144 fps, for example, or you can see that the difference from game to game can often be far greater and more meaningful.
[Chart: Techspot 1080p gaming benchmarks, 7600X vs. 13600K]
See that? While the average is only 3.1% overall, the difference in individual games is sometimes far greater. And as you can see, while the 13600K enjoyed its own big win of 14% in Halo Infinite, that is not nearly as great as the 36% advantage for the 7600X in Battlefield V. That small overall advantage will often resolve into these larger per-game swings.

36%. Suddenly we are talking about 60fps vs. 38fps.... 144fps vs. 92fps.... 240fps vs. 154fps.

Granted, the variances won't be as great from game to game as with other processor pairs carrying a similarly narrow average margin, since frequency is the only variable here, but I'm illustrating the point. In games that lean heavily on single-core performance, you will sometimes see these greater variances.

He's not building a budget system, he's upgrading his current comp-- CPU and GPU. Could you not be bothered to read his post?

We aren't talking about all those other CPUs, we're talking about the 5600 vs the 5600X.
But you be you, kid. Move them goalposts. Spam a bunch of stuff not relevant to the discussion. This is the way.
 
We aren't talking about all those other CPUs, we're talking about the 5600 vs the 5600X.
But you be you, kid. Move them goalposts. Spam a bunch of stuff not relevant to the discussion. This is the way.
We were discussing the 2.1% overall average framerate difference in gaming. I simply demonstrated how that plays out more meaningfully for individual games when you parse them. This is why such numbers should not be dismissed so nonchalantly. Your analysis was too abstract, and misleading.

In the case of the 5600X vs. 5600, the $22 would effectively be for superior binning to the tune of about 4.5%. This extends to efficiency and heat resilience, but it isn't clear if the latter is relevant in Jav's case since he only mentioned upgrading. If he is perhaps considering an upgrade to his CPU cooler, since he didn't mention whether he already has a more powerful aftermarket one, that's the only instance where I think he could make better use of the $22. Otherwise, again, I noted the 5600 is a worthy alternative, but I explained why I favor a superior-binned CPU for such a modest premium. Gamers have long paid far more than that for superior binning at places like the Silicon Lottery.
 
We were discussing the 2.1% overall average framerate difference in gaming. I simply demonstrated how that plays out more meaningfully for individual games when you parse them. This is why such numbers should not be dismissed so nonchalantly. Your analysis was too abstract, and misleading.

In the case of the 5600X vs. 5600, the $22 would effectively be for superior binning to the tune of about 4.5%. This extends to efficiency and heat resilience, but it isn't clear if the latter is relevant in Jav's case since he only mentioned upgrading. If he is perhaps considering an upgrade to his CPU cooler, since he didn't mention whether he already has a more powerful aftermarket one, that's the only instance where I think he could make better use of the $22. Otherwise, again, I noted the 5600 is a worthy alternative, but I explained why I favor a superior-binned CPU for such a modest premium. Gamers have long paid far more than that for superior binning at places like the Silicon Lottery.

I wasn't misleading anyone; we were talking about the 5600X vs the 5600. You're the one that brought up all those other CPUs for some reason, and then tried to steer the conversation to CPUs we weren't even talking about.

Now it's 4.5%. It's been 2.1%, 3%, 4%, and finally 4.5%. If you keep it up, it'll be over 10% in a couple of posts.

Bringing up Silicon Lottery is pointless in this conversation. We're talking about an under $150 cpu.

 
Oh, I'm surprised I missed this headline a few days ago when it dropped. There is finally a new most popular desktop GPU. The GTX 1060 is dead; long live the GTX 1060 (it held this title longer than any other GPU). Below, I've ordered the succession of that crown:
https://www.techspot.com/news/96835-new-graphics-card-tops-steam-survey-first-time.html

It's hilarious when new CS:GO gamers complain about Valve's MM not being 128-tick. They don't understand that the overwhelming majority of PC gamers in 2012 had bad PCs and dial-up internet connections.

Amazing how the 2010s revitalized PC gaming.
 
I wasn't misleading anyone; we were talking about the 5600X vs the 5600. You're the one that brought up all those other CPUs for some reason, and then tried to steer the conversation to CPUs we weren't even talking about.

Now it's 4.5%. It's been 2.1%, 3%, 4%, and finally 4.5%. If you keep it up, it'll be over 10% in a couple of posts.

Bringing up Silicon Lottery is pointless in this conversation. We're talking about an under $150 cpu.

2.1% was the difference you emphasized from TPU's 1080p suite. I mentioned that the difference between the two CPUs varied from suite to suite (my opening post highlighted 2.4%, ~3%, and <4% differences). I showed half a dozen other CPU comparisons with differences around that 2.1% figure to demonstrate that a seemingly small number like that in these kinds of benchmark suites may belie much larger differences from game to game. This is directly on topic.

The 4.5% difference I mention is the most basic: the difference in their turbo clocks. On that critical first core, the difference will be 5.7%.

The Silicon Lottery is not irrelevant. It demonstrates that gamers value higher binning. The 5600X chips are the highest-binned of the 5600-class dies. That's why, under PBO, they outperform the 5600 by a margin consistent with their advantage at stock.

Another off-topic point raised was your blathering about what to do with $22 in a brand-new budget build. This recommendation was to a Sherdogger who explicitly stated that he was looking to upgrade his CPU and GPU. Perhaps he might consider other additions/upgrades to his PC with that $22 that would make more sense than the 5600X over the 5600, but you didn't ask him about those possible opportunity costs. I was going on what he shared. He is on the B350-F, so this will be his final CPU upgrade on his MoBo, and his first CPU change since buying the R7-1700X (a CPU that came out 5 years ago). If that is how long he retains his CPUs, then from my point of view, $22 is a small premium for an advantage that will sit in his comp for years to come. He would know best how that might impact the specific GPU upgrade options he weighs. Maybe it would be the difference between a lesser and a greater AIC. As I always do, I'd be happy to go over those options with him if he inquires further.

I don't understand why you get so emotional about this stuff.
 
2.1% was the difference you emphasized from TPU's 1080p suite. I mentioned that the difference between the two CPUs varied from suite to suite (my opening post highlighted 2.4%, ~3%, and <4% differences). I showed half a dozen other CPU comparisons with differences around that 2.1% figure to demonstrate that a seemingly small number like that in these kinds of benchmark suites may belie much larger differences from game to game. This is directly on topic.

The 4.5% difference I mention is the most basic: the difference in their turbo clocks. On that critical first core, the difference will be 5.7%.

The Silicon Lottery is not irrelevant. It demonstrates that gamers value higher binning. The 5600X chips are the highest-binned of the 5600-class dies. That's why, under PBO, they outperform the 5600 by a margin consistent with their advantage at stock.

Another off-topic point raised was your blathering about what to do with $22 in a brand-new budget build. This recommendation was to a Sherdogger who explicitly stated that he was looking to upgrade his CPU and GPU. Perhaps he might consider other additions/upgrades to his PC with that $22 that would make more sense than the 5600X over the 5600, but you didn't ask him about those possible opportunity costs. I was going on what he shared. He is on the B350-F, so this will be his final CPU upgrade on his MoBo, and his first CPU change since buying the R7-1700X (a CPU that came out 5 years ago). If that is how long he retains his CPUs, then from my point of view, $22 is a small premium for an advantage that will sit in his comp for years to come. He would know best how that might impact the specific GPU upgrade options he weighs. Maybe it would be the difference between a lesser and a greater AIC. As I always do, I'd be happy to go over those options with him if he inquires further.

I don't understand why you get so emotional about this stuff.

I don't get emotional at all, I just get tired of your long-winded posts that usually have nothing to do with the topic. You always have to "be right" and have the last word. The number of times I've seen you move goalposts to "be right" is nauseating.
And I'm kinda surprised you're actually using the reply button; you haven't used it in a couple of years now when responding to me. It's like trying to have a conversation with a passive-aggressive, snotty teenage girl.
 
I hear early benchmarks on the XTX are really disappointing.
The benchmarks that leaked yesterday looked like they were either improperly tested or fake (the OpenCL and Vulkan scores were both lower than the 6900 XT's).

The retest today scored 50% higher on OpenCL and almost double on Vulkan, which is more realistic (that would put the 7900 XTX's scores between the 4080's and the 4090's).
 
The benchmarks that leaked yesterday looked like they were either improperly tested or fake (the OpenCL and Vulkan scores were both lower than the 6900 XT's).

The retest today scored 50% higher on OpenCL and almost double on Vulkan, which is more realistic (that would put the 7900 XTX's scores between the 4080's and the 4090's).
Wish they had a ray-tracing solution-- it would take away any leverage Nvidia has. As I understand it, FSR is getting better and better.
 
Wish they had a ray-tracing solution-- it would take away any leverage Nvidia has. As I understand it, FSR is getting better and better.
They do have ray-tracing. They just don't have discrete, dedicated ray-tracing cores. Ray-tracing is executed by their Ray Accelerators, which are built into the compute units. This isn't pseudo-emulation like in the RX 5000 series; it's hardware-level ray-tracing.
 
They do have ray-tracing. They just don't have discrete, dedicated ray-tracing cores. Ray-tracing is executed by their Ray Accelerators, which are built into the compute units. This isn't pseudo-emulation like in the RX 5000 series; it's hardware-level ray-tracing.
Holy crap, I gotta watch that Red dude on YouTube and see what he says about this. Thank you for this info.
 
I hear early benchmarks on the XTX are really disappointing.
Dude, it's early testing. They've still got bug fixes and drivers to work on. It's not unheard of to have a new driver or BIOS come out the day before a launch because they're continuously working on them.
 
Wish they had a ray-tracing solution-- it would take away any leverage Nvidia has. As I understand it, FSR is getting better and better.
I've been playing Cyberpunk with ray tracing on, and honestly I'm not really sure it adds much. I guess I'll flip the settings around some and play with it more this weekend.
 