Tech Gaming Hardware discussion (& Hardware Sales) thread

You just proved yourself wrong.
First you said “nobody looked at them”, then said “It's a tiny fraction of the market.”
How can nobody buy them and a tiny fraction buy them?
In b4 a long, drawn out post with the goalposts being moved.
Ah, I see, you're going to try to hide behind the absurdly absolutist interpretation of a word rather than the commonly understood connotation of the phrasing. Yeah, that's what I meant! AMD and NVIDIA produced those cards as art pieces. Never intended a single one for consumption. I certainly didn't mean reference cards were a "running joke" that moved a tiny number of units because buyers overwhelmingly avoided them in favor of aftermarket partner cards, which AMD themselves clearly always intended as the mode of consumption for their customers.

"Plenty" is an abstract term that doesn't hold up to that microscope, either, after all. Readers of this thread will decide for themselves which characterization of the reference cards presence on the market more accurately described their actual market presence.
 

Sucks when your own tactics are used against you, doesn’t it?
I should have added a 1000 word post moving the goalposts and you could have had the full Madmick experience.
 
Concession accepted. Not sure why you sought to pick a fight for no reason, but I see I'm still living #rentfree.
 
How many times do we have to go over this? Gaming and editing are not the same thing. Cinebench is not a benchmark designed to assess CPU gaming performance.
@Madmick any opinion on these? TDP appears to be decent, not sure about performance


https://www.techpowerup.com/cpu-specs/core-i9-13900.c2854
Those are just the lower-performance non-K variants that release every generation, like they always do. For example, the 13900 vs. the 13900K:
  • -200 MHz = Single Core P-Turbo ("P" for performance cores)
  • -200 MHz = P-Turbo 3.0
  • -200 MHz = P-Turbo
  • -100 MHz = E-Turbo ("E" for efficiency cores)
  • -1000 MHz = P-core base clock
  • -700 MHz = E-core base clock

For benchmark comparisons, including gaming, it's the turbo comparisons that tend to matter most, although for more recent generations the gap between the K and non-K variants has widened a bit because the non-K parts get lower power targets now that power consumption has grown so great. For gaming, you're better off going with a lesser CPU from the K line if that's your focus (i.e. get the 13700K instead of the 13900, or the 13600K instead of the 13700).
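If it helps to see the scale of those cuts, here's a quick back-of-the-envelope sketch; the 13900K baseline clocks are from memory (assumptions on my part), so double-check them against the TechPowerUp spec pages before leaning on the exact percentages:

```python
# Rough illustration: apply the listed non-K deltas to the 13900K's clocks.
# Baseline 13900K figures are from memory (assumptions) -- verify on TechPowerUp.
k_clocks_mhz = {
    "P-core max turbo (single core)": 5800,
    "P-core Turbo Boost Max 3.0":     5700,
    "P-core turbo":                   5400,
    "E-core turbo":                   4300,
    "P-core base":                    3000,
    "E-core base":                    2200,
}
deltas_mhz = {
    "P-core max turbo (single core)": -200,
    "P-core Turbo Boost Max 3.0":     -200,
    "P-core turbo":                   -200,
    "E-core turbo":                   -100,
    "P-core base":                    -1000,
    "E-core base":                    -700,
}

for name, k_clock in k_clocks_mhz.items():
    non_k = k_clock + deltas_mhz[name]
    print(f"{name}: {k_clock} -> {non_k} MHz ({deltas_mhz[name] / k_clock:+.1%})")

# The turbo clocks (what games actually run at) only drop ~2-4%,
# while the base clocks (largely an artifact of the lower power target) drop ~30%.
```

In other words, the spec-sheet gap that looks scary (the base clocks) is mostly about the power target, not about what the chip actually does in a game.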
However, I also recently noted:
Meanwhile, there is also an i5-13500, and it is particularly attractive running in the PL1 Unlimited mode where it mops the floor with the 12600K in Cinebench even if the latter is running in the same state. The two early samples each of the 13500 and 13400 showing up on Passmark are notching a similarly impressive average; particularly when you note single core scores.

Ten years ago the value strategy for tech-savvy buyers with self-built PCs containing powerful cooling was to buy a "K" processor, and then overclock it. Today, it appears the strategy is to buy a non-K processor, and run it in an unlimited power state.

Thus, synthesizing those two sensibilities: if you wish to achieve the best bang for your buck at the highest end of current CPU performance with a strict focus on gaming, the strategy would be to buy the i7-13700 (not yet released) and run it in PL1 Unlimited mode. If you want to make these comparisons, I suggest you dig for that once reviews start to appear. Find a reviewer who has run the i9-13900K at stock on the same test bench hardware as the i7-13700 in the PL1 unlimited power mode across a dozen or so different games, and compare the average framerates. If the i9-13900 and i5-13600 are also benched, all the better.
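If you want a concrete way to run that comparison once those reviews exist, here's a minimal sketch; every fps value below is a placeholder to be replaced with a reviewer's per-game numbers, and the prices are rough launch figures worth verifying:

```python
# Sketch of the comparison described above: average fps across a game suite,
# then fps per dollar. All fps values are PLACEHOLDERS -- fill in real results
# from a review that tests both chips on the same bench hardware.
from statistics import mean

results_fps = {
    # "game": (i9-13900K stock, i7-13700 PL1 Unlimited)
    "Game A": (0.0, 0.0),
    "Game B": (0.0, 0.0),
    # ... a dozen or so titles from the review
}

prices_usd = {"i9-13900K": 589, "i7-13700": 384}  # rough launch pricing; verify

k_avg    = mean(fps[0] for fps in results_fps.values())
nonk_avg = mean(fps[1] for fps in results_fps.values())

print(f"i9-13900K stock average:        {k_avg:.1f} fps")
print(f"i7-13700 PL1 Unlimited average: {nonk_avg:.1f} fps")
print(f"i9-13900K fps per dollar:       {k_avg / prices_usd['i9-13900K']:.3f}")
print(f"i7-13700 fps per dollar:        {nonk_avg / prices_usd['i7-13700']:.3f}")
```

If the 13700's fps-per-dollar comes out meaningfully higher at near-identical average framerates, the strategy above holds.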
 
I’m not conceding anything. You proved yourself wrong. If anything, you should be bowing out. But of course, you’ll never do that.
 
My goodness. I made a casual observation to Slob about the irony that reference cards used to be a largely ignored product because presently there is much ado about AMD's latest reference cards suffering a design flaw.

So you whine about custom loop GPU liquid coolers as a serious market presence to represent the "plenty of people" who bought reference cards in the pre-RTX 2000 era. I challenged you to provide statistics on what percentage of gamers use such custom loops, or any statistics on the number of gamers who bought reference cards back then; after enlightening you, of course, ignorant as you were, to the fact that AMD never produced more than a one-off launch batch of stock of their reference cards back then.

And you're clinging to "Nobody means 0 people! You said 'nobody'! Nobody!!!" Of course this comes from the guy who repeatedly championed the RTX 2060 as the better buy over the RX 5700 XT (when, years later, the latter stomps the RTX 2060 Super). Just let it go, man.
 

Just admit you were wrong.

Do you ever reread your posts and realize how much of an ignorant ass you are?
 
#rentfree

Not even close. And that post just further reinforces my point.
And you still won’t admit you were wrong.


It's ironic AF that you say to let it go, yet you bring up something from a couple of years ago.
 
Last edited:
We discussed specific variants, and how many of them were out of stock at their original targeted pricing (whether MSRP or a manufacturer-indicated premium above it); that was true across the board at numerous points during that exchange, if you snapshotted them. The point was that the 4080 was out of stock at MSRP and at its lowest intended pricing at major retailers. This was toward highlighting the larger truth about the relationship of supply to demand: it is healthy, but, as I pointed out, that's because NVIDIA never intended to furnish as robust a supply. They adapted their strategy to a market that is buying less and less, and has been for a decade.

The downturn isn't just for GPUs. This is industry-wide, globally. Everyone is slashing orders from TSMC (Intel, NVIDIA, AMD). All of the major players are expecting depressed sales and revenue in 2023. NVIDIA is seeking to maximize ROI.
Well, no, the argument started over you saying that scalpers were not having a problem selling and that everyone was reporting fake news. My point in bringing up the partner cards was that they were still in stock and near MSRP. That was true at the time and it's true now. And anyway, in my roughly 15 years of buying graphics cards, I think this is the first time I've not seen NVIDIA sell out on a high-end card. Historically they're sold out for a few months at least before you start seeing them in stock at retailers. That's doubly bad if they aren't producing as many as they usually do. The 4080 is a dud at launch.
 
Anybody here have any experience with Xidax? Seems to me they price considerably cheaper than competitors.
 
I have them bookmarked, that's it. They're not cheaper than CyberpowerPC as far as custom builders, but their appeal is a lifetime warranty, IIRC.
 
Looks overpriced. Cheapest system is $1250 with a 12400 and 3050.
Yeah, that's definitely not their appeal. But rarely is the cheapest configuration the best value for any custom builder. Probably the best value they offer:
https://www.xidax.com/desktop/x-2?saved_config=436567
  • R5-5600X
  • Be Quiet Pure Rock Slim 2 CPU Cooler
  • RX 6650 XT 8GB
  • ASUS Prime B550 Plus HES-AC WiFi ATX
  • 32GB (2x16GB) DDR4-3200MHz "Xidax Ultra" RAM
  • 1TB WD Blue SN570 NVMe SSD
  • 650W Gold Xidax PSU
  • Windows 11 Home
    $1311

That's not terrible, and for +$121, you can opt for the RX 6750 XT upgrade to the GPU. Not counting Windows, that's a little under a +$300 premium over building oneself. If counting Windows, it's only about +$150.
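If anyone wants to sanity-check that premium, the arithmetic is just this; the DIY totals below are placeholders matching my rough cart at the time, so price out your own parts before trusting the exact figures:

```python
# Quick sanity check of the Xidax premium over a DIY build of the same config.
# The DIY totals are PLACEHOLDERS (rough cart figures) -- use your own prices.
xidax_price  = 1311   # quoted configurator price, as listed above
diy_hardware = 1015   # placeholder: sum of your own part prices, no OS
windows_key  = 140    # placeholder: whatever you'd actually pay for Windows 11 Home

print(f"Premium, not counting Windows: ${xidax_price - diy_hardware}")
print(f"Premium, counting Windows:     ${xidax_price - (diy_hardware + windows_key)}")
```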
 
Snapshot of the officially announced upcoming non-K Raptor Lake processors:
[image: Intel 13th-gen non-K lineup]
 
Just saw a video from Gamers Nexus totally shitting on the 4070 Ti.

Apparently horrible value.
There’s just no card you can really recommend. The 4090 draws a nutty amount of power and might burn itself out at the connector. The 4080 is super expensive to the point where you feel like you might as well jump to a 90. And now this one.

I guess the AMD 7900 xtx is looking like the best option right now. You save 200 bucks off the 4080 for similar performance. Of course you lose out on DLSS and RTX.

I will say I’ve played with RTX on a 3080 and a 3080 Ti some recently. At least on those cards, the performance hit was so profound I ended up turning it off altogether. That was even with DLSS 2 running. Yeah, the image looked nice, but the games were running janky.

Last weekend I was running Witcher 3 at 4K on a 3080 Ti. With RTX off and everything else at max, I was getting 80-90 fps average. Turning it on, I dropped to about 45 in more open scenes and struggled to stay above 30 in more crowded scenes. RTX looks nice, but no thanks; it’s not worth that performance hit. It wasn’t just Witcher 3 either. I messed about with a few other games to similar effect; I just don’t remember those numbers as exactly.

Looking at the charts for the 4070 and 4080, it doesn’t look like it’s gotten a whole lot better at handling the hit. At least with the 4090 it seems to stay in the 75+ range on a lot of games, which I consider a must these days.

I say all that to say maybe RTX isn’t even worth considering right now. Of course I’m using 30 series hardware at 4k, so it’s not a perfect comparison. Looking at the charts though, 4090 is probably the only card I would switch RTX on with.
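To put rough numbers on the hit I described above, here's a quick sketch using my Witcher 3 figures; these are eyeballed observations, not a controlled benchmark:

```python
# Rough cost of RT from my Witcher 3 run above (3080 Ti, 4K, everything else maxed).
# Eyeballed numbers from the post, not a controlled benchmark.
rt_off_fps      = (80 + 90) / 2   # ~85 fps average with RT off
rt_on_open_fps  = 45              # RT on, more open scenes
rt_on_crowd_fps = 30              # RT on, crowded scenes (often dipping below this)

for label, on_fps in [("open scenes", rt_on_open_fps), ("crowded scenes", rt_on_crowd_fps)]:
    hit = 1 - on_fps / rt_off_fps
    print(f"RT on, {label}: ~{on_fps} fps ({hit:.0%} drop from ~{rt_off_fps:.0f} fps)")
```

A roughly 45-65% hit is why I'd only consider flipping it on with a 4090-class card right now.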
 