Tech Gaming Hardware discussion (& Hardware Sales) thread

For anyone who is confused: no, the new Mac Pro carries 802.11ac + Bluetooth 5.0 by default, and Apple's store page is simply telling you the adapter is compatible with older networks/standards.
https://www.apple.com/mac-pro/specs/
Those familiar with Mac spec sheets know they list "optional" if something like that requires you to select it as an upgrade, or add it yourself, as they did with the 2008 Mac Pro models and older. One may also notice this for the Magic Trackpad 2 among the "Input Devices" listed for this model.

One hiccup is that this is WiFi 5 (ac), not WiFi 6 (ax). I screwed that up. Those numbers aren't yet second nature to me, but I used the Asus PCE-AC55BT 2x2 1200 Mbps network card as a reference price point at $35 because it is also WiFi 5 (though only Bluetooth 4.2).
 
I just saw a story about a new server, but imagine it for some future desktop. It packs over a hundred Intel SSDs at 32 terabytes each, 3.2 petabytes total, in something the size of six pizza boxes. The CPU is 32 cores / 64 threads with 2 terabytes of ECC memory. The interesting part is that it uses AMD CPUs and GPUs; I think it is headed into Cray's next-generation supercomputer. All of this fits into something about the size of six pizza boxes stacked three up and two down.
32 terabytes of storage per drive: don't ask how much it costs, because if you have to ask you cannot afford it.
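Quick napkin math on the headline storage figure (drive count rounded to an even hundred just for illustration):

drives = 100                    # "over a hundred" Intel SSDs
tb_per_drive = 32               # terabytes each
total_pb = drives * tb_per_drive / 1000
print(total_pb, "petabytes")    # 3.2 petabytes, matching the quoted total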
2018-08-08-image-33.jpg



A hundred of these things are in this monster. This photo is not the one I saw; that one had more going on, including GPU-looking slots.

Open-Server-Right_360.jpg
 
Incredibly exciting indeed. It has come up before ITT:
https://forums.sherdog.com/posts/150034903/
https://forums.sherdog.com/posts/147106105/
https://forums.sherdog.com/posts/146153929/

 
It does have extreme specs and will support 4 GPUs and an insane 1.5 terabytes of RAM, so this thing is really designed for the most high-end applications. That being said, the 8-core/16-thread base unit is really useless for a desktop designed this way. In fact, the base spec for this thing should be 500 gigabytes of RAM and 8 terabytes of solid-state storage with a 28-core CPU. This monster would cost more than a used BMW.
Help me understand here: what the fuck kind of “work” workload would utilize 4 GPUs?

I’m guessing it would need to be CGI rendering for a Hollywood movie, or something to that effect?

Which puts it outside “consumer” desktop “base” workloads anyway?

What programs out there can offload to 4 GPUs simultaneously anyway?

The only other thing I can think of would be CAD, set up so that multiple engineers could render CAD on this PC remotely from their laptops?

I’m pretty hardware-engaged as a hobby, more so than software for sure, and I run workloads at work that would take forever on a laptop APU, and I still can’t think of anything a normal “consumer” would need something like this for at home, or in a “normal” business environment, where the cost would be justified by the performance gains (if any at all).

In a business where ROI is a concern, time is also money, but too much money that only cuts a little time doesn’t turn cents into dollars, and can turn dollars into cents quickly.

A LOT of programs out there cut a massive portion of time with a GPU versus an APU running the same processes, but having ANY GPU at all is where the most significant time cut comes from.

Let’s say you are rendering a video, with color correction and post-processing, for example.

If you did this on a laptop APU, it would take HOURS, depending on the video’s complexity and length.

Basic desktop, say a 4-core CPU and 16 GB of RAM, chuck a 1050 Ti in there. That cuts rendering down to minutes instead of hours.

Replace the 1050 Ti with a 1080; now your cost has grown substantially, but you’ve only shaved seconds, or maybe a whole extra minute, versus the 1050 Ti.

Add another 1080: same thing (if the program will even utilize dual GPUs), seconds or maybe a minute versus the single 1080.
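To put that diminishing-returns point in rough numbers (every time and price below is a made-up placeholder, not a measured render, just to show the cost-per-minute-saved math):

# Hypothetical render times for the same project; not benchmarks.
configs = {
    "laptop APU":    {"extra_cost": 0,    "minutes": 180},
    "4c + 1050 Ti":  {"extra_cost": 150,  "minutes": 12},
    "4c + 1080":     {"extra_cost": 550,  "minutes": 11},
    "4c + 2x 1080":  {"extra_cost": 1100, "minutes": 10},
}

baseline = configs["laptop APU"]["minutes"]
for name, c in configs.items():
    saved = baseline - c["minutes"]
    if c["extra_cost"]:
        print(f'{name}: saves {saved} min, {c["extra_cost"] / saved:.2f} $ per minute saved')
    else:
        print(f'{name}: baseline, {baseline} min')

The dollars-per-minute-saved figure explodes with each GPU upgrade, which is exactly the ROI trap described above.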

Anyway, point being, I run a vast array of workloads and cannot fathom a reason to have a 28-core, 4-GPU workstation in a normal home or even an office engineering environment.

This is targeted at a very specialized workload and anyone buying one NEEDS to have a need for it, or just the money to say they have a pimp Apple for no fucking reason.
 

Classic f-ing shit post from Sherdog. I am sure a lot of people would have been saying the same thing back in 1948: I don't see the need to put money into working on transistors when vacuum tubes work just fine. Or imagine if back in 1959 Noyce had decided to give up on developing the integrated circuit because he saw no need for it when a computer the size of a football field works just fine. 1968: handheld calculators will be too expensive for average people. 1972: imagine Federico Faggin thinking at length about Computer Terminal Corporation's need for an 8-bit chip from Intel and then deciding we can get away with people just using typewriters and punch card readers.

1975: why would people need color computer graphics on a terminal? 1977: Steve Wozniak and Steve Jobs decide Apple will never sell a home computer. 1980: an Intel math co-processor, why would someone need such a thing? 1981: why would people need a 16-bit computer with 256K of RAM when 64K works fine? 1985: a graphics accelerator, for what? Pong is already too fast to play on my 16-bit computer and Pitfall! looks great without an accelerator. 1987: a 32-bit processor in a home computer to run a spreadsheet is too expensive, and people will never need a 32-bit computer with a gig of RAM. Well, you kinda get where I am going. More to the point, Nvidia's latest self-driving controller board uses 4 high-end GPUs and pumps out 320 TOPS (trillion operations per second), making it about 20 times more powerful than the average desktop with a 1060 GPU.

nv-drive-hardware-banner-background-767-m@2x.jpg


The funny thing is that even with that much processing, Nvidia may not be getting enough to run their self-driving platform on it. With AI, deep learning, and virtual reality, we need ever more processing to create the environment best suited for a given situation.
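For anyone wondering earlier what actually spreads across 4 GPUs: deep-learning frameworks do it almost for free. A minimal PyTorch sketch (the model and sizes here are toy placeholders, not anything Nvidia actually runs):

import torch
import torch.nn as nn

# Toy stand-in network; real self-driving / deep-learning models are far larger.
model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 10))

if torch.cuda.device_count() >= 4:
    # Replicate the model onto four GPUs; each one processes a slice of every batch.
    model = nn.DataParallel(model, device_ids=[0, 1, 2, 3])
if torch.cuda.is_available():
    model = model.cuda()

batch = torch.randn(256, 1024)
if torch.cuda.is_available():
    batch = batch.cuda()
out = model(batch)   # the forward pass is automatically split across the GPUs
print(out.shape)     # torch.Size([256, 10])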
 
All of that is 100% correct and I wholeheartedly agree, except for the part that includes this overpriced Apple bullshit.

What does any of that have to do with Apple building something like this as a “home system” that’s way the fuck overpriced for what it is, with no reason given why anyone needs it?

Now, if Apple had made their OWN new CPU or GPU, or had Nvidia develop a new-tech GPU FOR them that moves tech forward, it would be a different deal.

But they haven’t. They’ve taken basically off-the-shelf shit, put it in a shiny case, and thrown a huge price tag on it.
 


Look, I have only owned two Apples in my life: the original Mac (which I still own) and a MacBook Pro laptop. The original Mac I bought through the university discount; the MacBook Pro I bought because it had problems and I was confident I could fix it. I paid only 140 dollars for it, replaced the keyboard and the screen (not easy to do), and it ran like a top for 5 years before I gave it to a friend of mine to use in his spare time running his gas station. I only pointed out Apple because at that time you had to buy kits like the MITS Altair or the IMSAI 8080, and neither one was particularly user friendly. Apple really was the first with a mass-market home computer. All my computers today are either Windows or Linux.
 
I don’t disagree with any of that; Apple used to be at the forefront of development and in some ways still is.

My first “home” computer growing up was an Apple, and my family has had several “Macintosh” computers over the years.

The last two personal laptops I had before my newest one were MacBooks.

I have an iPhone, iPads, iPods, Apple TV, etc., so I don’t have anything “against” Apple per se, but the PC referenced here is an example of their desire to rape their buyers for no reason, and it includes no benefit for the suggested target purchaser.

That was my point. Look at that damn stand, that’s a perfect example.
 

We kinda got derailed with the whole Apple thing. My point was that there will always be applications that push the bounds of where computers are today. Seeing what they can squeeze into something not much larger than six pizza boxes is really impressive: 3.4 petabytes, 64 cores, 2 terabytes of RAM, and 4 AMD GPUs. The point I was making is that it is truly remarkable what has been achieved, and who knows where the next 10 years take us based on that progress, even factoring in Moore's law and the limits on how small we can make chips. At first 7nm was thought to largely be the viability limit for how complex and dense we could make these chips; now they are talking about sub-4nm, and then optical computing chips, OSOAC ("Optical System On A Chip"), where an entire optical computer would reside on a single chip. Never mind quantum computers and similar radical ideas. In the future we could be saying that 3.4 petabytes on a desktop is pretty normal, while talking to your car as it drives you to the fitness center for a late-night workout.
 
And your robot folds your laundry.
 
There's an Xbox One S 1TB digital version bundle for $200 on sale at major retailers. The bundle includes Minecraft, Sea of Thieves, and Forza Horizon 3.
 
I've barely touched my Xbox One X since I bought my new computer. I might call it quits on Xbox and stick to PC and Switch.
 
AMD stock is again surging on the heels of the Ryzen 3000 announcement at Computex, and peaked today at an all-time high of $34.23.

48040638827_0e489558a0_z.jpg
 
It was also probably because of today's announcement. Intel is fucked.

AMD 16-Core Ryzen 9 3950X: Up to 4.7 GHz, 105W, Coming September
48041813238_6786f64f55_z.jpg

That didn't deliver to the letter of the leak, but it was damn exciting.

The 15% IPC gain matched the 13%-15% leaks exactly. That might prove to be the most critical feature delivered, as it means Ryzen will likely challenge or exceed Intel on a clock-for-clock basis. The most significant inaccuracy from the leaks was on pricing/naming, and the most glaring absence is the deficit in the best turbo clock for any announced processor versus the leaks (4.6 GHz vs. 5.1 GHz in the leaks), which suggests the glorious 5 GHz barrier might still be a solid distance out of their reach.
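Rough back-of-envelope on what that means clock-for-clock (the Zen+ boost clock below is my assumption for a 2700X-class part, purely for illustration); per-thread performance is roughly IPC times clock:

zen_plus_ipc = 1.00     # Zen+ as baseline
zen2_ipc     = 1.15     # the ~15% IPC gain AMD quoted
zen_plus_ghz = 4.3      # assumed R7-2700X-class peak boost
zen2_ghz     = 4.6      # R9-3900X peak boost as announced

uplift = (zen2_ipc * zen2_ghz) / (zen_plus_ipc * zen_plus_ghz)
print(f"Estimated per-thread uplift over Zen+: {uplift:.2f}x")   # ~1.23x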

Here was the real pricing announced by CEO Lisa Su at the keynote.
47943995476_aa6bb0844a_z.jpg


Even at $499, a 12c/24t processor with a 4.6 GHz peak turbo (that is 15% better per-thread than Zen+) is going to stiffly leech sales from the i9-9900K. The R5-3600 at $199 is going to be slightly superior to the R5-2600X. But I think the meanest value of all there is the R7-3700X in the middle at $329. Looks to me like that one will have the best overclocking potential, judging by the TDP and turbo. People will buy it to OC instead of the R7-3800X, like they did with the R5-1600 or R5-2600 before.

The 16-core isn't being released immediately, but the motherboard manufacturers have already confirmed it is on the way, and that they have had to prepare exceptional cooling for CPU VRMs that can deliver up to 300W of power. That processor will be discussed in more detail at E3 in June. Fingers crossed that its turbo adds at least +100 MHz over the R9-3900X (for 4.7 GHz), because that appears to be how they're stepping the product line, and if it were also just $599 MSRP, instead of closer to the $799 TR-2950X, that would be a huge win for consumers.
Nailed-It-Baby-Meme-06.jpg

They also announced new additions to the APU lineup:
AMD Ryzen 3000 APUs: Up to Vega 11, More MHz, Under $150, Coming July 7th
 
They are also attempting to add pressure to NVIDIA. They can't compete at the top, but they are after the midrange:



https://www.theverge.com/2019/6/10/18660063/amd-radeon-5700-xt-dent-announcement-price-release-date
The Verge said:
But these two new blower-style graphics cards may be doing something more important: taking on Nvidia where it actually counts in the parts more PC gamers tend to buy. AMD says they’re designed to destroy the upper-mid-range Nvidia GeForce RTX 2070 and RTX 2060. You’ll pay $450 for the Radeon 5700 XT, compared to the $500 for Nvidia’s 2070, while the Radeon 5700 costs $380, a bit more than the $350 that Nvidia’s 2060 retails for today.
  • RX 5700XT is advertised by AMD to be stronger in games than RTX 2070 for $50 less ($450).
  • RX 5700 is advertised by AMD to be stronger in games than the RTX 2060 for $30 more ($380).

We'll see. I'm much more cynical about this announcement. The FLOP power quoted in the images below is not promising. The "up to" suggests to me those figures are peak single-precision throughput at boost clocks. Unless they've achieved some coup here in terms of translation to real-world performance that puts them on par with NVIDIA, then staying apples to apples (quick back-of-envelope arithmetic after the list):
  • The RX 5700 XT below is only 92.4% as powerful as the RX Vega 56 (cheapest Brand New variants starting at $290 today)
  • The RX 5700 below is only 75.2% as powerful as the RX Vega 56, and only 11.7% faster than the RX 590 (cheapest Brand New variants start at $215 today)
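Here's the apples-to-apples arithmetic behind those percentages (shader counts and boost clocks are the commonly cited figures, used only for this peak-throughput comparison; real game performance can differ):

def peak_tflops(shaders, boost_ghz):
    # Peak single-precision throughput: 2 FMA ops per shader per clock.
    return 2 * shaders * boost_ghz / 1000

vega56   = peak_tflops(3584, 1.471)   # ~10.5 TFLOPS
rx5700xt = peak_tflops(2560, 1.905)   # ~9.75 TFLOPS
rx5700   = peak_tflops(2304, 1.725)   # ~7.95 TFLOPS
rx590    = peak_tflops(2304, 1.545)   # ~7.1 TFLOPS

print(f"RX 5700 XT vs RX Vega 56: {rx5700xt / vega56:.1%}")    # ~92%
print(f"RX 5700    vs RX Vega 56: {rx5700 / vega56:.1%}")      # ~75%
print(f"RX 5700    vs RX 590:     {rx5700 / rx590 - 1:+.1%}")  # ~+12%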

They are falling further and further behind NVIDIA. Not good.

amd_radeon_5700_012.jpg


amd_radeon_5700_017.jpg
 