Tech Gaming Hardware discussion (& Hardware Sales) thread

You could save some money by not getting that OS. You can buy a key for like 3 bux, or use the product key from your current Windows PC.

But the rest looks good. A beast of a PC, and very goddamn costly lol.
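
If you go the reuse route and your current PC shipped with Windows preinstalled, the key is usually embedded in the firmware and you can read it back. Rough sketch below (Python wrapping the stock wmic query; this only returns something on OEM machines, and it just prints nothing on a self-built box):

Code:
# Sketch: read the OEM Windows product key that many prebuilt PCs embed in
# firmware, so it can be reused when activating a fresh install.
# Assumes Windows with the standard wmic tool available on PATH.
import subprocess

def firmware_product_key() -> str:
    # SoftwareLicensingService exposes the OA3.x key on machines that shipped
    # with Windows preinstalled; self-built systems return an empty field.
    out = subprocess.run(
        ["wmic", "path", "SoftwareLicensingService",
         "get", "OA3xOriginalProductKey"],
        capture_output=True, text=True, check=True,
    ).stdout
    lines = [line.strip() for line in out.splitlines() if line.strip()]
    return lines[1] if len(lines) > 1 else ""  # first line is the column header

if __name__ == "__main__":
    print(firmware_product_key() or "No firmware-embedded key found")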

Lol, where can I get a key for 3 bucks? I already have win10 ready to go on a flash drive for my new build, but saving $100 on a new key would be great...
 


Apple M1 silicon upsets the establishment by skipping past the AMD Ryzen 7 5800X and coming close to Intel Core i7-11700K single-thread performance on PassMark

Apple is looking like a bull in a china shop. Intel and AMD might not be too threatened right now, because Apple only makes chips for their own computers, but if this is an indication of the direction Apple's M-series line will take, then every PC gamer in the world will be pining for their chips. I have no idea how to predict the impact on PC gaming. I strongly doubt Apple would give Windows manufacturers access to their hardware; the margins are far too low for them. But all of the tools for game development are on PC, and everyone will want this processing power.

It's not a leak or an engineering sample or anything like that. It's already showing up on the official charts (26 samples):
https://www.cpubenchmark.net/high_end_cpus.html
https://www.cpubenchmark.net/singleThread.html

This is a freaking 15.1W chip under load when actively cooled. The 5800X is 131W. The 11700K is 225W.

Apple just swooped in with a Fat Man level bitch slap.
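
Just lining those wattages up (nothing here but the load power figures above, no benchmark scores involved):

Code:
# Back-of-the-envelope on the load power figures quoted above: how many times
# the M1's draw each x86 chip pulls. Watts only, scores are a separate question.
load_watts = {
    "Apple M1": 15.1,
    "Ryzen 7 5800X": 131.0,
    "Core i7-11700K": 225.0,
}

m1_watts = load_watts["Apple M1"]
for chip, watts in load_watts.items():
    print(f"{chip}: {watts:.1f} W under load ({watts / m1_watts:.1f}x the M1)")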

 
Apple M1 silicon upsets the establishment by skipping past the AMD Ryzen 7 5800X and coming close to Intel Core i7-11700K single-thread performance on PassMark

Have you seen this?

https://www.anandtech.com/show/16535/intel-core-i7-11700k-review-blasting-off-with-rocket-lake/19
 
Yep. Always read them first. That was my reference for real-world power consumption. It turned out not to be quite as impressive at 14nm as the early leaks suggested, but it still delivered what was expected: a return to per-core and per-thread supremacy. I expect gaming performance will improve more with driver maturation for these CPUs than it does with most releases, given the weirdness of the transplanted architecture. But maybe not. It could be that Intel knew all they would achieve was benchmark hype, and that was all they wanted, because it gives them something to sell and avoids a black hole of press.

As you can see, most reviewers are extremely disappointed with the release. Intel has to be praying AMD won't hit back that hard with Zen 3+.
 
Apple M1 silicon upsets the establishment by skipping past the AMD Ryzen 7 5800X and coming close to Intel Core i7-11700K single-thread performance on PassMark


It's a synthetic benchmark. Whooopie do. Did they win a JD Power award as well?



Remember, that's a pre-release review, meaning things haven't been optimized/released for the chip yet.
 
a return to per-core and per-thread supremacy.

WAT

The link showed the opposite.

Our results clearly show that Intel’s performance, while substantial, still trails its main competitor, AMD. In a core-for-core comparison, Intel is slightly slower and a lot more inefficient. The smart money would be to get the AMD processor.

That said, I don't believe it. Even Intel isn't THAT bad. And I think a BIOS update or something was already announced that puts this back on par. Supposedly.
 
It's a synthetic benchmark. Whooopie do. Did they win a JD Power award as well?
Sure. This is only a harbinger of the impending death of x86 architecture.

Silly me with my silly story.
 
Sure. This is only a harbinger of the impending death of x86 architecture.

Silly me with my silly story.

1 synthetic chart and it's the end of x86. GTFO.

Let's look at some real world benchmarks.

[benchmark screenshots]
 
Why on earth anyone would assess CPU strength/potential in terms of GPU-accelerated software is... well, it's pretty pointless.
 
Why on earth anyone would assess CPU strength/potential in terms of GPU-accelerated software is... well, it's pretty pointless.

It's showing that in real world applications, x86 is still vastly superior.
 
LOL, dafuq? No, it's not. It's showing that the RTX 3080 is superior to the SoC GPU. That ain't x86. That's like dismissing an APU's CPU potential because of a game benchmark where another APU with a more robust GPU rendered a higher framerate.

What a dumbfounding argument. The first M1 was assessed months ago, and across the preponderance of benchmarks, synthetic and real-world alike, it isn't the top CPU. It's not the top processor in PassMark (the 11700K is still beating it), and it's still decisively routed in multicore performance. That isn't the point. The point is that even across real-world performance assessments, including those stressing all cores simultaneously, it's challenging x86 for single-threaded dominance, and it's a 15W (real draw) chip. That's freakish. The entire M1 Mac Mini didn't even exceed 27W of active power in stress testing. Also, this insistence on "real-world benchmarks" is absurdly stupid; the correlation between synthetic benchmarks and real-world application performance is strongly positive.

Apple already said this isn't even the premier chip they're going to release. They're rolling out more over the next two years, and the M1 is aimed at their lower-end devices. This wasn't supposed to happen. This shouldn't happen. Not this soon, if ever.
 
LOL, dafuq? No, it's not. It's showing that the RTX 3080 is superior to the SoC GPU. That ain't x86.

The proof is there that an x86 system destroys the M1, whether you choose to believe it or not.
 
This was built for sherdog rollers. lol

 
I guess the CPU shortage is done (for now). Microcenter has actually lowered the price on both the 5600X and 5800X. 25+ in stock for each.
 
I guess the CPU shortage is done (for now). Microcenter has actually lowered the price on both the 5600X and 5800X. 25+ in stock for each.

Amazon and Newegg have had them in stock for a while now.
 
At this rate I don't think we will be able to get our hands on a GPU till 2023. I just heard that, due to GPU sharing (I don't understand how it works), people with 8 GB RTX boards have been netting $500-plus a month (rough math at the end of this post). Here's where it gets more interesting: people who run massive cryptocurrency mines are spending upwards of 3 grand for 3080 GPUs, but don't seem that interested in 3090 boards.

Each dot contains a GPU and CPU plus memory. According to this mega-miner, they used to cost around $1,500 each and now run 4 to 5 grand each. They still use GPUs in many cases instead of ASIC mining rigs due to scaling limitations.

[photo of the GPU mining farm]
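
And the rough shape of the math behind "$500-plus a month per card" (every number below is a placeholder I made up just to show the calculation, not a real rate; actual payouts swing with coin price, network difficulty, and your power bill):

Code:
# Hypothetical mining-income sketch. Every figure here is a made-up placeholder
# purely to show how the monthly estimate gets put together.
hashrate_mh        = 60.0   # MH/s an 8 GB card might manage (placeholder)
usd_per_mh_per_day = 0.30   # payout per MH/s per day (placeholder)
card_power_w       = 220.0  # wall draw while mining (placeholder)
usd_per_kwh        = 0.12   # electricity price (placeholder)

gross_per_month = hashrate_mh * usd_per_mh_per_day * 30
power_cost      = card_power_w / 1000 * 24 * 30 * usd_per_kwh
net_per_month   = gross_per_month - power_cost

print(f"gross ${gross_per_month:.0f}/mo, power ${power_cost:.0f}/mo, "
      f"net ${net_per_month:.0f}/mo")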
 
Each dot contains a GPU and CPU plus memory.

[photo of the GPU mining farm]

Jesus Christ, that reminds me of the interior of the Borg ship on TNG

[picture of a Borg cube]
 
