
Tech Gaming Hardware discussion (& Hardware Sales) thread

AMD Ryzen processors have been priced high at launch, and in about six months they start to drop. I don't think we'll see that this time, though.

In fairness they're not even that expensive this time round. The 5800x cost me roughly the same my 3700x did.
 
True, and the 3900X has a virtually identical total power draw to the 5800X, 5900X, and 5950X, while its per-core peak draw actually exceeds that of any of these, even the 5800X.

So if a Wraith Max/Prism was enough for a 3900X, it would be enough for any of these. Would have been nice, but considering the penchant among builders who spend this much to opt for more powerful coolers, I'm not sure how many will lament the loss.

The 5600X, on the other hand, comes with the Wraith Stealth.

Have you heard the Prism in person? They do the job in the cooling department, but they're loud, especially the newer ones. I have one from a 1700 and one from a 3700X, and there's a noticeable acoustic difference; the first-gen ones were quieter.
 
What's the deal with AMD GPU drivers? I always hear they're trash, but nobody can articulate what that means. I'm on the fence between a 3080 and a 6900 XT (if I can get either). I had AMD/ATI back in the day, but I've been with team green for a while now. Thoughts?
 
Have you heard the Prism in person? They do the job in the cooling department, but they're loud, especially the newer ones. I have one from a 1700 and one from a 3700X, and there's a noticeable acoustic difference; the first-gen ones were quieter.
Yes, I have, with a 3700X, like yourself. It would definitely be nice if they were a bit quieter. I also wouldn't want one for a CPU pulling 142W, either. The temps are perfectly fine with a 3700X, but that CPU only pulls ~90W max. Again, I think AMD chose the path of no waste by not including the Wraith coolers.
In fairness they're not even that expensive this time round. The 5800x cost me roughly the same my 3700x did.
The MSRP down under is the same for these two? That's nuts.

In the States, the 3700X launched for $329 ($453 AUD) and held true to that price for a long time. The 5800X launched for $449 ($619 AUD).
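For what it's worth, those AUD figures line up with a straight exchange-rate conversion. A quick sketch in Python, assuming a rate of about 0.7255 USD per AUD (an assumption for illustration, not a quoted historical rate):

```python
# Hypothetical sketch: converting the USD launch MSRPs to AUD at an
# assumed fixed exchange rate. Real rates varied day to day; this only
# shows how the quoted AUD figures are derived.
USD_PER_AUD = 0.7255  # assumption, not a historical quote

def usd_to_aud(usd):
    """Convert a USD price to AUD at the assumed rate."""
    return usd / USD_PER_AUD

for name, msrp_usd in [("3700X", 329), ("5800X", 449)]:
    print(f"{name}: ${msrp_usd} USD -> ${usd_to_aud(msrp_usd):.0f} AUD")
```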
 
What's the deal with AMD GPU drivers? I always hear they're trash, but nobody can articulate what that means. I'm on the fence between a 3080 and a 6900 XT (if I can get either). I had AMD/ATI back in the day, but I've been with team green for a while now. Thoughts?

They're poorly optimized and can cause stability issues leading to crashes. Since at least 2017, AMD drivers have been known to cause a green screen after install. It's a known issue, but AMD hasn't been able to come up with a fix yet.
 
Yes, I have, with a 3700X, like yourself. It would definitely be nice if they were a bit quieter. I also wouldn't want one for a CPU pulling 142W, either. The temps are perfectly fine with a 3700X, but that CPU only pulls ~90W max. Again, I think AMD chose the path of no waste by not including the Wraith coolers.

Great thing about Ryzen is that you don't need an expensive cooler.
The Hyper 212 Evo is on sale for $25 on Newegg
 
What's the deal with AMD GPU drivers? I always hear they're trash, but nobody can articulate what that means. I'm on the fence between a 3080 and a 6900 XT (if I can get either). I had AMD/ATI back in the day, but I've been with team green for a while now. Thoughts?
It's a vestige of a time long gone. It's chatter you mostly hear from NVIDIA fanboys these days.

Choosing AMD vs. NVIDIA (Driver Stability discussed near the top)
GPUs like Fine Wine
 
Yes, I have, with a 3700X, like yourself. It would definitely be nice if they were a bit quieter. I also wouldn't want one for a CPU pulling 142W, either. The temps are perfectly fine with a 3700X, but that CPU only pulls ~90W max. Again, I think AMD chose the path of no waste by not including the Wraith coolers.

The MSRP down under is the same for these two? That's nuts.

In the States, the 3700X launched for $329 ($453 AUD) and held true to that price for a long time. The 5800X launched for $449 ($619 AUD).
I could only find it for $699 AUD.
 
It's a vestige of a time long gone. It's chatter you mostly hear from NVIDIA fanboys these days.

Choosing AMD vs. NVIDIA (Driver Stability discussed near the top)
GPUs like Fine Wine

The reason AMD GPUs age like fine wine is that, going back to 2012 with the HD 7000 series through the RX 500 series in 2017, they all used GCN. They didn't innovate the way Nvidia did. The RX 5000 series is RDNA 1 and the new RX 6000 series is RDNA 2, wholly different architectures. The RX 5000 series will not get the love the GCN models did and won't age like fine wine.
 
AMD doesn't have the same pipeline-queue bottleneck problem anymore, that is true, but the RX 5000 series (relative to the RTX 2000 series) ought to enjoy a similar aging advantage, even though it won't be nearly as stark.

Take the RX 5700 and RTX 2060 Super. Now, the former is not considered the latter's competitor (that would be the XT). Totally different price classes. The RTX 2060 Super has averaged well north of $400. The 5700 has averaged about $70-$80 less. We've seen sales for dual-fan variants as low as $280, and triple-fan variants as low as $300.
AMD Radeon RX 5700 vs. Nvidia GeForce RTX 2060 Super: 40+ Game Mega Benchmark (Oct-2019)
RTX 2060 Super was 4% superior (@1440p).
[attached image: 1440p benchmark chart]

Yet, the RX 5700 has a higher pixel rate, higher texel rate, and higher FLOPS. It has equal memory bandwidth, speed, and VRAM. Thus, it's equal or better everywhere except for the lack of ray-tracing and tensor cores.
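Those spec claims follow from the standard theoretical-throughput formulas. A rough sketch, where the shader/TMU/ROP counts and boost clocks are approximate public spec figures (assumptions for illustration, not measurements):

```python
# Back-of-the-envelope throughput math. Shader/TMU/ROP counts and boost
# clocks below are approximate public spec figures, not measured values.
def throughput(shaders, tmus, rops, clock_ghz):
    """Return (TFLOPS, texel rate in GT/s, pixel rate in GP/s) using
    FLOPS = 2 * shaders * clock, texels/s = TMUs * clock,
    pixels/s = ROPs * clock."""
    tflops = 2 * shaders * clock_ghz / 1000.0
    gtexels = tmus * clock_ghz
    gpixels = rops * clock_ghz
    return tflops, gtexels, gpixels

rx_5700 = throughput(shaders=2304, tmus=144, rops=64, clock_ghz=1.725)
rtx_2060_super = throughput(shaders=2176, tmus=136, rops=64, clock_ghz=1.650)

for label, (tf, gt, gp) in [("RX 5700", rx_5700),
                            ("RTX 2060 Super", rtx_2060_super)]:
    print(f"{label}: {tf:.2f} TFLOPS, {gt:.1f} GT/s, {gp:.1f} GP/s")
```

At those clocks the RX 5700 comes out ahead on all three metrics, which is the point: raw theoretical throughput alone doesn't decide the benchmark roundup.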

Just like the RX 480, it was conservatively clocked when it launched. Unlike the RX 480, I don't think anyone expects an identical but more finely fabricated successor, analogous to the RX 580, but we've already seen how massive the RX 5700's headroom is in overclocking forums. Once you unlock the power limit, you can send the clock through the roof with manageable heat levels (in well-constructed AIBs). It's not quite the RX 5700 XT's equal, but it comes close, with vastly more to gain from overclocking, so it's obvious AMD deliberately nerfed the voltage supply to create a more distinct product to sell at a lower price point. They also shipped it with slightly fewer shaders, TMUs, and CUs. All of them do this; it's reminiscent of NVIDIA disabling shaders and TMUs while halving VRAM with the GTX 1060.

Conversely, the RTX 2060 Super is more aggressively clocked than the RTX 2060 and shows marginally less overclocking headroom.

Even without voiding your warranty to achieve a proper overclock with the RX 5700, don't be surprised if it closes this gaming performance gap after a few years.
 
@Cygnus A
I can't offer any useful predictions, since AMD has overhauled their architectural design so radically, even relative to RDNA 1.0, but here are the pipelines including the throughputs using the real boost clock for the AMD GPUs (aka the "Game" clock):

[attached image: pipeline and throughput comparison table]
 
That’s a whole wall of text not addressing my post at all.
 
Of course it does. You're the one who loves to remind everyone, every 4-7 years when console gamers suddenly care and need things simplified, that reducing GPUs to TFLOP performance is overly simplistic. Well, here are those other things.

Do you think it some mystery that the RX 5700 is superior in everything to the RTX 2060 Super, but still loses the game benchmark roundup?
 
Of course it does. You're the one who loves to remind everyone, every 4-7 years when console gamers suddenly care and need things simplified, that reducing GPUs to TFLOP performance is overly simplistic. Well, here are those other things.

Do you think it some mystery that the RX 5700 is superior in everything to the RTX 2060 Super, but still loses the game benchmark roundup?

and here you go....
 
In the States, the 3700X launched for $329 ($453 AUD), and held true to that for price for a long time.

The 3700X went on sale two weeks after launch for $299. Each month that followed had a week-long sale at $299; in some instances it hit $279. A certain game (can't remember the specific game) was even bundled with the CPU purchase during these sale periods.
 
The reason AMD GPUs age like fine wine is that, going back to 2012 with the HD 7000 series through the RX 500 series in 2017, they all used GCN. They didn't innovate the way Nvidia did. The RX 5000 series is RDNA 1 and the new RX 6000 series is RDNA 2, wholly different architectures. The RX 5000 series will not get the love the GCN models did and won't age like fine wine.
and here you go....
Okay, I'll make this simpler.

Yes, the RX 5000 series is RDNA 1.0, and the RX 6000 series is RDNA 2.0. Meanwhile, for GCN, the HD 7000 series was GCN 1.0 (2012), the Radeon 200 series was GCN 2.0 (2013), the Radeon 300 series was GCN 3.0 (2015), the RX 400/500 series were both GCN 4.0 (2016), and the RX Vega was GCN 5.0 (2017).

So do you have a point?
 
Okay, I'll make this simpler.

Yes, the RX 5000 series is RDNA 1.0, and the RX 6000 series is RDNA 2.0. Meanwhile, for GCN, the HD 7000 series was GCN 1.0 (2012), the Radeon 200 series was GCN 2.0 (2013), the Radeon 300 series was GCN 3.0 (2015), the RX 400/500 series were both GCN 4.0 (2016), and the RX Vega was GCN 5.0 (2017).

So do you have a point?

I already made my point.
Whenever someone points something out to you or proves you wrong, you write these long-winded posts that have nothing to do with the topic and try to move the goalposts so you can "win".
This usually happens after you lose a conversation in the War Room; then you come in here and try to take it out on everyone else.
 
The 3700X went on sale two weeks after launch for $299. Each month that followed had a week-long sale at $299; in some instances it hit $279. A certain game (can't remember the specific game) was even bundled with the CPU purchase during these sale periods.
This is a silly point to nitpick, but I don't recall this, unless you're counting Microcenter in-store sales, rebates, special payment card registration discounts, or combo deals, and that's dumb.

PCPP shows it didn't drop below the MSRP until November. Amazon sales trackers show the same.
[attached image: price history chart]


There is no instance of this price on PC Part Picker or Slickdeals in July 2019, or the following months. I don't see any in here, either.
 
I already made my point.
Whenever someone points something out to you or proves you wrong, you write these long-winded posts that have nothing to do with the topic and try to move the goalposts so you can "win".
This usually happens after you lose a conversation in the War Room; then you come in here and try to take it out on everyone else.
I'm failing to grasp any semblance of a point, and you're the one who always turns it personal when I correct your ignorance on tech, as you are here. I'm not interested.

You just said RDNA 1.0 and RDNA 2.0 were "completely different architectures", which is wrong, while appearing to believe that every GCN card from 2012-2017 was built on an identical fabrication, which it was not. I wondered if you were implicitly referring to the inefficiencies of GCN's throughput pathways, which RDNA does not suffer.

You don't appear to know what your argument is.
 
I'm failing to grasp any semblance of a point, and you're the one who always turns it personal when I correct your ignorance on tech, as you are here. I'm not interested.

You just said RDNA 1.0 and RDNA 2.0 were "completely different architectures", which is wrong, while appearing to believe that every GCN card from 2012-2017 was built on an identical fabrication, which it was not. I wondered if you were implicitly referring to the inefficiencies of GCN's throughput pathways, which RDNA does not suffer.

You don't appear to know what your argument is.

I don't even know why I try with you. You'd argue the sun isn't yellow just to be "right".
 
