Tech Gaming Hardware discussion (& Hardware Sales) thread

It's the best I've got which is better than nothing.

Unless he wants to order a prebuilt from CyberpowerPC / iBuyPower / CLX with a rush delivery, I don't see alternatives with PSUs in this class.
 
Yeah, PSU pricing is pretty insane atm.

How much wattage do you need? This 650W SilverStone Strider is a fully modular, Platinum-rated unit for $122, manufactured by Sirtec. It can be preordered and is advertised as coming back in stock on June 23rd, so that's a week away.
www.amazon.com/dp/B017WL5UFO?tag=pcpapi-20&linkCode=ogi&th=1

This specific unit wasn't reviewed by JonnyGuru, but two others from the same line also manufactured by Sirtec were. The 750W scored an 8.1, and the 550W scored a tremendous 9.3:
http://www.jonnyguru.com/blog/2016/03/06/silverstone-strider-platinum-st55f-pt-550w-power-supply/
http://www.jonnyguru.com/blog/2016/01/17/silverstone-strider-platinum-st75f-pt-750w-power-supply/

As you can see from the 550W review, the component quality and construction are excellent:
http://www.jonnyguru.com/blog/2016/03/06/silverstone-strider-platinum-st55f-pt-550w-power-supply/5/

750-850 watts would do it for me.

Here are the specs of my upcoming build:
AMD Ryzen 7 3800X
MSI X570 Tomahawk WiFi - Out of all the motherboards I've researched, this is not only the best priced, but it has everything I need. I'm not looking to do crazy overclocks; I just need it to run stable and not glitch on me during intense editing sessions.
128GB Corsair Vengeance DDR4 RAM
Sabrent NVMe M.2 1TB SSD
Samsung 1TB SATA III SSD (used solely to store and edit footage)
20TB of hard drives set in RAID 1, so technically only 10TB usable
The RTX 3080 Ti, whenever that's released

It sounds massively overkill, but aside from gaming, I do color correction and edit massive uncompressed 4K-6K video files that require power and, above all else, stability. I'm also eyeing the ASUS 4K ProArt PA32UCX, but I'm going to wait until the end of the year to see what else is available, since that's an insane investment and I have to make sure I've got a steady flow of work later this year or next year before pulling the trigger on such an expensive display. I was editing and color correcting an indie director's film before COVID sent the entire production to shit. Personally, I'm still happy with the 27" ASUS ProArt 1440p display I've had for the last 3 years. It still does everything I need it to do and hasn't failed me once.

Currently rocking the i7-6800K. It's done me great, but it's time to upgrade.
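
As a rough sanity check on that 750-850 W figure, here's a back-of-the-envelope power estimate. Every wattage in it is a ballpark assumption, not a measurement; the GPU figure especially is a guess, since the 3080 Ti isn't out yet.

Code:
# Back-of-the-envelope PSU sizing. All wattages are assumptions / ballparks.
estimated_peak_draw_w = {
    "Ryzen 7 3800X (105 W TDP, ~142 W package power limit)": 142,
    "High-end GPU (pure guess for a 3080 Ti class card)": 350,
    "Motherboard, 128 GB RAM, fans": 60,
    "NVMe SSD, SATA SSD, two HDDs": 30,
}

total_w = sum(estimated_peak_draw_w.values())
# Leave roughly 30% headroom so the PSU isn't running flat out
# and has margin for transient spikes.
suggested_psu_w = total_w / 0.7

print(f"Estimated peak draw: ~{total_w} W")
print(f"Suggested PSU size:  ~{suggested_psu_w:.0f} W")  # lands in the 750-850 W range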
 

There's nothing really available at any tier right now. I've seen some budget Thermaltake stuff that's still at reasonable prices, but I would never buy one. I haven't seen a good deal on a PSU in a long time.

Did you see the Ryzen 5 3600 is on sale for $167?
I'm really thinking of upgrading my HTPC from a 1700 to the 3600. Yes, it may sound silly going backwards in cores, but there's reasoning behind it. First-gen Ryzen did great in games and such, but it didn't have that Intel snappiness in Windows. The 3000-series Ryzen closed that gap; I don't know if they improved the Infinity Fabric or what changed, but it's much better now.
 
Gotcha. I researched the RAM question for a sister-in-law. If you're working with files so large that you need that much RAM, it's absolutely not overkill.

Unless you're getting a special deal on the 3800X, I think it makes the least sense of all your choices. It's still quite sensible, but the 3700X is $50 cheaper now, and you're taking less than a 2% hit to performance. For an editing build especially, and with the overall budget this build is commanding, I think it makes sense to just keep going and get the R9 3900X (+$88) or R9 3950X (+$271). Your gaming performance will be the same, but your editing performance will improve drastically. The motherboard you've selected is more than capable of handling them. The 3900X in particular seems well suited for you without a big increase to your budget.
https://benchmarks.ul.com/compare/best-cpus
https://www.cpubenchmark.net/high_end_cpus.html
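
To put those price deltas in perspective, here's a quick cost-vs-cores sketch. The dollar figures are just the deltas relative to the 3800X quoted above (they drift week to week); the core counts are fixed.

Code:
# Cost-vs-cores comparison. Prices are the deltas vs the 3800X from the post above.
cpus = {
    # name        (cores, price delta vs 3800X in $)
    "R7 3700X": (8,  -50),
    "R7 3800X": (8,    0),
    "R9 3900X": (12, +88),
    "R9 3950X": (16, +271),
}

base_cores, base_price = cpus["R7 3700X"]
for name, (cores, price) in cpus.items():
    extra_cores = cores - base_cores
    extra_cost = price - base_price
    note = f" (~${extra_cost / extra_cores:.0f} per extra core)" if extra_cores else ""
    print(f"{name}: {cores} cores, ${extra_cost:+d} vs the 3700X{note}")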

Not sure when you're buying, or if you already have, but if you're looking at the Sabrent Rocket, I'd suggest taking whichever of these is cheapest when you go to buy (I just went over this in depth helping another poster with his build). Right at this moment the Team MP34 at $137 is the most attractively priced. That's if you don't just skip them all and step up to the Samsung 970 EVO:
  • ADATA XPG SX8200 Pro
  • ADATA XPG Gammix S11 Pro
  • HP EX950
  • Inland Premium
  • Mushkin Pilot-E
  • PNY XLR8 CS3030
  • Sabrent Rocket (not the Rocket Q)
  • Team MP34
 
Was so fortunate to have gone overboard with my last PC (2013) and a 750W PSU. I didn't want to reuse it for this new build of mine but after weeks of holding out for PSU restocks I just said fuck it, I'm reusing it.
 
I see no problem with that, but I'd be concerned about running it outside the warranty window. Once a new PSU becomes available at a reasonable price, I'd make the switch.
 
So I hear Intel is having a big internal shake-up as far as staffing goes. Apparently the delays in releasing their new chips are causing some serious rethinking about their direction.
In the comments he said it cost $5800 to build. You can get a real Stern cabinet for that money.


You missed the point: it's got crazy flexibility and tons of games to play, and look at all the features.

[image]
 


I didn't miss the point; I'm telling you to try one before you decide to build one. Pinball relies heavily on physical feedback, something virtual machines can't reproduce. There are also a lot of moves you can do on physical cabinets that you can't do on a virtual cabinet. It's not even close to the same experience.
You're also having to rely on Pinball FX for tables, and they're a notoriously shitty company.
 
More credible-looking rumors on the upcoming RTX 3000 series:
https://wccftech.com/nvidia-geforce-rtx-3090-rtx-3080-titan-rtx-ampere-gaming-gpu-specs-rumor/
WCCF Tech said:
NVIDIA GeForce RTX 3090 - The Fastest Ampere Gaming Graphics Card
The NVIDIA GeForce RTX 3090 is an interesting graphics card based on its naming scheme. I already talked about the nomenclature of this card in my previous post, mentioning why NVIDIA would be choosing this particular naming scheme that was previously reserved for dual-chip graphics cards. It is highly likely that the RTX 3090 could be an internal codename for the RTX 3080 Ti, but if we dive too much into those details, we will just be moving away from today's topic.

So let's call it the GeForce RTX 3090 for now. According to the details, the GeForce RTX 3090 will feature the GA102-300-A1 GPU. This is one of the three SKUs that we will be talking about in this particular post. The GA102-300-A1 GPU is said to be comprised of 5248 CUDA cores, or 82 SMs. In total, that's a 20% increase in cores over the GeForce RTX 2080 Ti. No details such as clock speeds or TMU/ROP counts are provided, and we can't take the Ampere A100 die as a reference since gaming and HPC parts have different configuration hierarchies.

Moving on to the memory side, things start to look interesting, as it is stated that the card will get up to 12 GB of memory with speeds of up to 21 Gbps. Since that's a 384-bit bus we are looking at, it will be able to offer up to 1 TB/s of bandwidth. This overclocked memory design is being referred to as GDDR6X, and there's no such information available yet from any graphics DRAM vendor, unlike the previous GDDR6 DRAM, which was officially reported months before its integration on the Turing graphics cards.

NVIDIA GeForce RTX 3080 - GA102 For High-End Gaming
Moving on to the GeForce RTX 3080, the rumor reports that the card will be featuring the GA102-200-Kx-A1 SKU. This cut down SKU will feature the same 4352 CUDA cores as the RTX 2080 Ti that will be arranged in a total of 68 SMs. The card is reportedly going to feature 10 GB of memory that is also going to be GDDR6X, running at 19 Gbps across a 320-bit bus interface with a bandwidth of 760 GB/s.

The switch to GA102 from TU104 on the RTX 2080 is definitely an interesting one. A high-end SKU would also result in higher wattage and thermals, but I believe this could be done to raise the bar in the sub-$500 segment, which should comprise the GA104-based RTX 3070 and RTX 3060. If that's the case, then we can expect a good performance jump in that segment, which we recently talked about in one of our articles.
NVIDIA Ampere Powered GeForce RTX 30 Series Reportedly Launching With RTX 3080 & RTX 3090 In September

The RTX 3090 is reported to have 20.5% more cores than the RTX 3080 (and RTX 2080 Ti) and a memory bus as wide as the Titan's. Assuming it's capable of sustaining frequencies similar to those rumored in the earlier story I talked about on the previous page, it looks like we might expect somewhere around 30-35% better performance than the RTX 2080 Ti. That's an encouraging bump.
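
For what it's worth, the bandwidth and core-count figures in the rumor check out arithmetically. Quick sanity check using the standard GDDR bandwidth formula (bus width in bytes times per-pin data rate):

Code:
# GDDR bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps -> GB/s
def gddr_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(gddr_bandwidth_gb_s(384, 21))  # rumored RTX 3090: 1008 GB/s, i.e. ~1 TB/s
print(gddr_bandwidth_gb_s(320, 19))  # rumored RTX 3080: 760 GB/s

# CUDA core delta quoted in the article: 5248 vs the 2080 Ti's 4352
print(f"{5248 / 4352 - 1:.1%}")      # ~20.6% more cores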
 

AMD's APUs have been dominating Intel's onboard graphics-- both in desktops and laptops-- since 2011: well before the arrival of Vega. The most significant advancement with Vega is that the most powerful of these APUs are now actually legitimate AIO gaming processors. They're no longer glorified retro gaming machines.

R5-3400G: Onboard GPU = 1.975 TFLOPS (> PS4 & Xbox One S)
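
In case anyone wonders where that ~1.975 TFLOPS figure comes from, it's just the standard peak-FP32 formula. Assuming Vega 11's 704 stream processors at roughly 1.4 GHz (the 3400G's iGPU boost clock) lands you right in that neighborhood:

Code:
# Peak FP32 = shaders * 2 ops per clock (FMA) * clock in GHz -> TFLOPS
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

print(f"{fp32_tflops(704, 1.4):.2f}")   # Vega 11 in the 3400G: ~1.97 TFLOPS
print(f"{fp32_tflops(1152, 0.8):.2f}")  # PS4 GPU for comparison: ~1.84 TFLOPS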


It doesn't matter, for the key reason that they cost too much. I don't know why AMD hasn't figured out a niche for them when there are obvious niches they could fill, but a niche doesn't exist unless you can fill it at the appropriate cost. It has made no damn sense to buy a laptop with (for example) an R7-3700U, R5-3550H, or R5-3500U when those laptops have typically started around $800. Sure, they beat the crap out of $400-$600 i5 laptops with no discrete graphics. The problem is they get shit on by gaming laptops with an i5 (or i7) and a GTX 1050, 1050 Ti, 1650, 1650 Ti, 1660, or 1660 Ti that have run between $650 and $1100.

Ironically, the most relevant their APUs have ever been is recently, in the desktop market as raw CPUs. The 2200G, 2400G, and 3200G have at various times over the past year been the best overall CPUs in the sub-$100 space; better than both Intel's chips and AMD's own pure CPUs at the same price points.

I don't understand why their laptop APUs apparently cost so much to manufacture, but obviously a separate CPU + GPU combo is still just an outright superior strategy considering cost.

On the other hand, I never understood why they didn't position their desktop APUs to devour the barebones "NUC" kits and the SFF prebuilt niche that Intel monopolizes. They could sell something like the famous product pictured below, if they could figure out a sleek enough cooling solution, and it would be a legit gaming machine. There has to be a market for something like that, right?

[product image]
 


The issue for me is that I can buy a used 6th-generation i5 with a 1060 GPU and 16 gigs of RAM for around 500 dollars. There is really not a big enough difference in performance between a 6th-generation and a 9th-generation to make people run out and spend 500 just on a CPU. Yes, there is a performance difference, but hardly enough to make you want to move.
 


Okay. What does this have to do with the silly title of that video, that "Intel might finally be able to compete," when, as I pointed out, competition against hardware that has zero market presence is a pointless race? Intel has monopolized the laptop market for over a decade, and up until Ryzen's 2nd gen, the desktop market too. AMD's APUs are the hardware that has something to prove. Maybe one day they'll finally be relevant.

You're just reiterating my point about the superior value of Intel CPUs + NVIDIA GPUs in laptops, but taking the sensibility of business out of it. There is little competition between new and used products.
 


Ian Cutress recently did a video on why there aren't a lot of AMD SFF computers. The quick version is that Intel works with manufacturers on development, advertising, etc., while AMD doesn't.
 
Noob here with a question.

I’ve been staring at YT videos of 30fps vs 60fps games and I just can’t tell the difference.
Yet everyone swears it’s a major improvement.
What gives?
Is it like a 3D TV thing where some people just can't see it?

Are the YouTube videos encoded in 60fps? Does your monitor have a 60Hz refresh rate? The difference between 30fps and 60fps is pretty substantial. I can't play console games after getting used to PC gaming.
 
Performance benchmarks of the Nvidia RTX 3080 have been leaked, and they show a big boost. Nvidia seems able to do what CPU manufacturers have not: deliver a significant performance gain generation to generation. The RTX 3080 shows a 30 percent performance boost without optimized drivers. That's a crazy bump. The odd thing is it all but kills the Titan RTX, beating it by 20 percent.

https://www.tomsguide.com/amp/news/...chmarks-just-leaked-and-amd-should-be-worried
 
I sometimes randomly mess around on this site: https://humanbenchmark.com/dashboard/reactiontime.
Anyway, I don't think I've visited it since I got my new 144 Hz monitor with G-Sync compatibility (G-Sync can reduce input lag even further if you cap your fps just below your monitor's refresh rate). The reason I bring it up is that I just averaged 182 ms, and at 60 fps I was always stuck at around 220-250.

I can't necessarily always see the difference between 30 and 60 without having videos side by side, and I certainly can't see the difference between 60 and 141, but I really can feel the difference. Everything feels a lot less laggy.
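
The "feel" part is easy to put rough numbers on: it's mostly frame time, i.e. how long you wait for the next frame. A quick sketch (the 141 fps figure is the cap-just-below-refresh trick mentioned above):

Code:
# Frame time = 1000 ms / fps. Higher refresh rate = less waiting per frame.
for fps in (30, 60, 141, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 -> 16.7 ms, 141 -> 7.1 ms, 144 -> 6.9 ms.
# Capping at ~141 fps on a 144 Hz G-Sync panel keeps you inside the
# variable-refresh range, so you avoid V-Sync's extra latency.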
 

This is why I’ve disabled crossplay in M.W 2019.
 
You can do that?? Mother fucker, wish I’d known that. I play on PS4 and have been consistently massacred. Fuck’s sake.

Lol.
Go to Options, then Account. It's the first option.

Fuck playing against 10 ms monitors with 240 Hz refresh rates while I'm on 60 fps. Or the modded controllers, SSDs, etc.

I play on a PS4 on a regular 1080p big screen.
 
