Official AMD "Ryzen" CPU Discussion

I am not telling you to upgrade, but there are definitely significant improvements to be had, and I don't want misinformation spreading.

Just saying "I play on ultra settings with no problems" does not make the picture clear. Are you locked at 60FPS? What do you mean by ultra settings? Maxed AA/AF? Blur? HDR? DoF? 1080p? 1440p? Ultrawide? There are a lot of variables at play here. Some people are fine with sub-60FPS. Some people want 120FPS.

I am with @Madmick on this: your CPU is probably adequate for a lot of games, but there is a threshold where upgrading will bring noticeable improvements. My kids play on an FX-6300 system I built years ago, and it is "fine" for most games too, at least for them.
Ultra everything, max everything: every single possible setting at the highest it can be. I shoot for 60FPS; I see no noticeable difference after that.
 
The FX-8350 wasn't a great choice; the FX-8320 was an underclocked 8350. Unless you really lost the silicon lottery, the 8320 performed just as well when overclocked.
Another benefit of the AMD FX series was you had a nice little space heater.
Depends on when you bought it. I got a great deal on the 8350 with the Wraith cooler, and it doesn't get hot at all. Even after gaming for 12+ hours the CPU isn't that hot. I can't speak to the 8350 on the whole, only to the one I have.
 
Meh. I wouldn't even look at an i3 anymore. For anything.



Even after its huge price drop, the 7350K ($150 reduced MSRP) is an inferior buy to both the R3-1300X ($130 MSRP) and the R5-1400 ($160 MSRP). The i3 is king of neither world, and its price drop doesn't spare it against the barely pricier R5s. Don't forget the inferior stock cooler on the Intel vs. the Wraith.

Meanwhile, if you plan on an aftermarket cooler: while the 7350K is the only i3 that overclocks, it requires a Z170/Z270-chipset motherboard for fully unlocked OC capability. The R3-1300X can be OC'd on any AM4 board. I just checked PCPP, and the cheapest Intel Z170/Z270 board (of any form factor) is currently the ASRock Z170A-X1/3.1 at $98; the cheapest AM4 board is the ASRock A320M-DGS at a paltry $54 by comparison. So even if you plan on ramping up that IPC advantage, the 7350K carries an effective real-world premium of ~$65 over the 1300X.
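Putting those quoted prices together, here's a trivial worked comparison (prices from PCPP at the time of writing, so they'll obviously drift):

```python
# Cheapest fully-unlocked OC platform on each side, using the prices
# quoted above (PCPartPicker at time of writing; prices fluctuate).
i3_7350k, z270_board = 150, 98   # 7350K + cheapest Z170/Z270 board
r3_1300x, am4_board = 130, 54    # 1300X + cheapest AM4 board

intel_total = i3_7350k + z270_board   # 248
amd_total = r3_1300x + am4_board      # 184

print(f"Intel: ${intel_total}, AMD: ${amd_total}, "
      f"premium: ${intel_total - amd_total}")  # premium: $64 (~$65)
```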

Until Coffee Lake drops, AMD is just ruling south of $250 right now.


I'm taking my "I screwed up" comment back, and I'm going back to buying whichever one is on sale between the R3-1300X and i3-7100
[Benchmark charts: Ashes of the Singularity, Civilization VI, Far Cry Primal, GTA V, Rise of the Tomb Raider]

https://www.pcper.com/reviews/Proce...-and-1200-Processor-Review/Gaming-Performance

[Ryzen 3 benchmark charts]

http://www.hardwarecanucks.com/foru...-ryzen-3-1300x-1200-performance-review-2.html
 
I'm taking my "I screwed up" comment back, and I'm going back to buying whichever one is on sale between the R3-1300X and i3-7100
http://www.hardwarecanucks.com/foru...-ryzen-3-1300x-1200-performance-review-2.html
That seems misguided. The i3-7100 can't overclock, while the R3-1300X can be overclocked fully on any board you buy for it; and if you stick with the stock cooler, Intel's is inferior for serious tasking like gaming:

http://www.gamersnexus.net/hwreviews/3001-amd-r3-1300x-review-vs-7350k-intel-response

Additionally:
http://cpu.userbenchmark.com/Compare/Intel-Core-i3-7100-vs-AMD-Ryzen-3-1300X/3891vs3930

The i3-6300 is nearly identical to the i3-7100 (~2% inferior):
A clear indication that heavy multitasking (such as simultaneous streaming or concomitant media-server duty) will favor the 1300X. The longevity that processors like Chael's FX-8350 have enjoyed will also undoubtedly favor the R3-1300X; the Firestrike physics score is the real tell on that front, and even the 7350K falls substantially behind. Of course, editing, rendering, or video re-encoding will also clearly favor the 1300X.

Also, those are good reviewers, but I was particularly surprised to see the i3 nearly equaling the 1300X in Ashes of the Singularity & GTA V while beating it significantly in Rise of the Tomb Raider, so I sought out a second opinion. Tom's Hardware and Anandtech yielded different results (possibly because some tests are DX12 vs. DX11, or because AMD is still polishing the Zen optimizations this early in the 1300X's rollout):

https://www.tomsguide.com/us/ryzen-3-1300x-benchmark-tests,review-4548.html
http://www.anandtech.com/show/11658/the-amd-ryzen-3-1300x-ryzen-3-1200-cpu-review


Nevertheless, Anandtech feels there is validity to the i3-7100 as a value purchase, just not based on dollar-for-dollar performance equality. For my own part, I don't see any valid reason to purchase the i3-7100 when the G4560 exists, so I don't perceive it as the most viable purchase in the $100-$150 range. Hopefully this lukewarm press reception does what it always does and depresses the cost of the Ryzen processor by ~10%. At the same cost you want the 1300X, no doubt:
Anandtech said:
Conclusion
For as much hype and excitement as has been generated around Zen and the Ryzen products so far, with everything focused on the high end, when we hit the lower elements of the stack and the volume parts, not much 'excitement' is to be had. We've already gone through the new fancy microarchitecture and the platform, and what matters at this end of the spectrum is a pure performance-per-dollar metric. So far the Ryzen 7 parts have certainly hit that goal, especially when originally compared to Broadwell-E when the Ryzen 7 parts were launched. For the Ryzen 3, the direct competition is Kaby Lake, and CPUs with a much higher IPC. But where Intel has two cores, AMD has four.

Diving straight into the graphs...

For our combined all-in-one graph, we included our mixed workload data and weighted the results 40:50:10 for single:multi:mixed thread workloads.



If we ignore the Ryzen 5 1500X in the top right corner, there are a few stories here.

First is that the Ryzen 3 1200 does not look like an attractive option. It performs within +2-3% of the Pentium but is $30 more expensive, and the Core i3-7100 beats it by 8% for only a sub-$10 premium.

Then there is the Ryzen 3 1300X. Compared to the Core i3-7300/7320 and the Core i5-7400, it clearly wins on performance per dollar all around. Compared to the Core i3-7100, though, it offers almost 5% more performance for around $10-15 more, which is just under 10% of the cost. Depending on budgets, each one could be an attractive option.
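To make Anandtech's combined metric above concrete, here's a minimal sketch of that 40:50:10 weighting; the component scores below are made-up placeholders, not Anandtech's actual data:

```python
# Anandtech's combined metric, as described in the quote:
# single-, multi-, and mixed-thread results weighted 40:50:10.

def combined_score(single: float, multi: float, mixed: float) -> float:
    """Weighted 40:50:10 blend of normalized benchmark scores."""
    return 0.40 * single + 0.50 * multi + 0.10 * mixed

# Hypothetical normalized scores (i3-7100 = 100 in every category):
i3_7100 = combined_score(100, 100, 100)   # 100.0 (baseline)
r3_1300x = combined_score(90, 120, 105)   # 106.5 (weaker IPC, more cores)
print(i3_7100, r3_1300x)
```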
 
That seems misguided. The i3-7100 can't overclock, while the R3-1300X can be overclocked fully on any board you buy for it; and if you stick with the stock cooler, Intel's is inferior for serious tasking like gaming.
How is it misguided? I posted the benchmarks that back up my claim. They both perform damn near the same.
If you're going to claim the G4560 is a better deal than the 7100, then you should be recommending the G4560 over the 1300X, since the 1300X and 7100 go neck and neck.
I don't care about synthetic benchmarks; I want to see real-world gaming benchmarks.
For an HTPC, either chip will do the job just fine. You wouldn't notice any difference between the two side by side on video playback. Intel still wins this category because of 10-bit HEVC.
The 7100 and 1300X are both poor choices for video editing; you're better off waiting until your next paycheck to bump up to an i5 or R5.
What you stated in red only enhances my statement about buying the 1300X vs. the 7100: buy whichever one is on sale, since they perform equally.
 
Can you still get G4560s over there? No stock anywhere here, otherwise I'd definitely give them the nod for bang for buck. After that, for longevity or multitasking, I'd save up for the Ryzen 5 1600.
 
Can you still get G4560s over there? No stock anywhere here, otherwise I'd definitely give them the nod for bang for buck. After that, for longevity or heavy multitasking, I'd save up for the Ryzen 5 1600.
LOL, silly Kangaroo, Intel chips aren't for you.

Yeah, we can get them, but they've inflated to $80 atm. Remember you can always just change your PC Part Picker to the US version if you want to see our prices.
How is it misguided? I posted the benchmarks that back up my claim. They both perform damn near the same.
For the reasons I just stated and showed, plus several more:
  1. Overclockable (with motherboards starting at the same price as Intel's locked entry-level B150/B250 series mobos)
  2. Superior stock cooler, capable even of modest overclocks at acceptable temperatures and better for the long-term intensive tasking a gaming machine endures
  3. ~5% superior overall performance in a game-heavy benchmark roundup by Anandtech
  4. 22% superior UserBenchmark score (samples are incredibly low for the 1300X, but all were benched at stock frequencies with no turbo mode or OC juju)
  5. Quad-core vs. dual-core, meaning that in non-gaming programs that scale well across cores but still don't hyperthread (the way synthetics do), the R3-1300X will be drastically superior
  6. More importantly, the core-count advantage means the R3-1300X will also have a drastic advantage in multitasking, because if a program doesn't hyperthread, and very few do, it can't just be handed off to the additional "virtual" cores (see the sketch after this list)
  7. The Zen chips are considerably younger, and AMD has already made miles of improvements via updates for things like memory support. Kaby Lake, by contrast, has reached its asymptote, so the X-factor favors Ryzen.
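Here's the sketch mentioned in point 6: a crude, hypothetical way to feel core-count scaling on your own machine. Launch several single-threaded, CPU-bound jobs at once and time the batch; on four physical cores, four jobs finish in roughly the time of one, while on two cores with Hyper-Threading they won't.

```python
import time
from multiprocessing import Pool

def burn(n: int) -> int:
    # Single-threaded busy work standing in for a non-threaded program.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    # Time 1, 2, and 4 concurrent copies of the same job. Near-flat
    # times mean spare physical cores; rising times mean contention.
    for jobs in (1, 2, 4):
        start = time.perf_counter()
        with Pool(jobs) as pool:
            pool.map(burn, [20_000_000] * jobs)
        print(f"{jobs} concurrent job(s): {time.perf_counter() - start:.2f}s")
```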
If you're going to claim the G4560 is a better deal than the 7100, then you should be recommending the G4560 over the 1300X, since the 1300X and 7100 go neck and neck.
I supplied game benchmarks, and that transitive argument isn't fair: the G4560 is about neck-and-neck with the i3-7100, which is about neck-and-neck with the R3-1300X. There's a gap there, and the i3-7100 sits in the netherworld. You spoke as if they were equal values on a dollar-per-dollar basis. I objected to that, but I also quoted AT, who said the i3-7100 could be a worthy buy if its cost deficit were attractive enough.
For an HTPC, either chip will do the job just fine. You wouldn't notice any difference between the two side by side on video playback. Intel still wins this category because of 10-bit HEVC.
The 7100 and 1300X are both poor choices for video editing; you're better off waiting until your next paycheck to bump up to an i5 or R5.
What you stated in red only enhances my statement about buying the 1300X vs. the 7100: buy whichever one is on sale, since they perform equally.
That's a valid argument for people who aren't buying a discrete GPU and are going the pure-HTPC route, but then why wouldn't they just get the G4560? R3-1300X users will naturally require a discrete GPU, and NVIDIA cards as far back as the GTX 950/960 (PureVideo feature set F and higher) support 10-bit HEVC decoding, for example. So you're no longer arguing better value for the Intel based on a CPU that can double as an HTPC, but rather one that has an advantage as a pure HTPC.

Surely this will become more significant in the future, but of course it's most critical for 4K (where compression is most needed), while the majority of gamers/consumers still don't own 4K television sets. Isn't 10-bit color still incredibly rare in TVs, too? I thought that was something appearing more and more in higher-end monitors, but I must admit I never pay as close attention to the evolution of HDTV specifications. I thought it was still rare at the affordable end.
 
Can you still get G4560s over there? No stock anywhere here, otherwise I'd definitely give them the nod for bang for buck. After that, for longevity or multitasking, I'd save up for the Ryzen 5 1600.
It took me over 3 months to find a G4560. When they are available over here, they sell out quick.
I paired one up with a 1050 Ti, and it will do 1080p/60fps in most games at medium-to-high settings. Even at 4K, a lot of stuff was playable, but I wouldn't recommend it.
For LoL or Dota type games, it's more than plenty.
I can't remember the exact number, but the last time I looked at Steam's monthly hardware survey, 720p and 1080p gaming accounted for over 70% of systems.
 
It took me over 3 months to find a G4560. When they are available over here, they sell out quick.
I paired one up with a 1050 Ti, and it will do 1080p/60fps in most games at medium-to-high settings. Even at 4K, a lot of stuff was playable, but I wouldn't recommend it.
For LoL or Dota type games, it's more than plenty.
I can't remember the exact number, but the last time I looked at Steam's monthly hardware survey, 720p and 1080p gaming accounted for over 70% of systems.

Yeah, it's the GPU shortage that's killing gaming-PC value propositions at the moment. Practically overnight, 3GB GTX 1060s jumped from about $270 (AUD) to $380, if you can find one. 4GB RX 570s jumped from $300 to over $400 (and are even harder to find).
RAM and SSD prices aren't helping either.
 
It took me over 3 months to find a G4560. When they are available over here, they sell out quick.
I paired one up with a 1050 Ti, and it will do 1080p/60fps in most games at medium-to-high settings. Even at 4K, a lot of stuff was playable, but I wouldn't recommend it.
For LoL or Dota type games, it's more than plenty.
I can't remember the exact number, but the last time I looked at Steam's monthly hardware survey, 720p and 1080p gaming accounted for over 70% of systems.
I don't think there is that significant of a shortage anymore:


Amazon has 19 in stock, and there's another 15 in stock on the Amazon Marketplace in "New" condition.
 
Yeah, it's the GPU shortage that's killing gaming-PC value propositions at the moment. Practically overnight, 3GB GTX 1060s jumped from about $270 (AUD) to $380, if you can find one. 4GB RX 570s jumped from $300 to over $400 (and are even harder to find).
RAM and SSD prices aren't helping either.
Unless you're buying low end or extreme high end, GPUs aren't worth it atm.
I flipped an RX 480 that I picked up locally for $200 and sold it for over $400.
For SSDs, Seagate just released a decent one for the price. I've been waiting for SSD prices to drop to try RAID 0 NVMe, but I'm not paying 20% more than I paid for the first one.

My co-worker just built a PC very similar to mine, and he paid about $200 more than I did 6 months ago for the same performance.
 
I don't think there is that significant of a shortage anymore:


Amazon has 19 in stock, and there's another 15 in stock on the Amazon Marketplace in "New" condition.

$82 is ridiculous; I paid a little over $60. You may as well step up to the G4600 for the price of a Starbucks coffee.
 
$82 is ridiculous; I paid a little over $60. You may as well step up to the G4600 for the price of a Starbucks coffee.
Agreed, but it still charts as a far superior value to any i3 at that price. That's why the i3s sit in no-man's land.
 
@UJustGotChaeld Here's an FX-8370 vs. the R3-1200 using a GTX 1060.



@Madmick, have you gone through the Gamers Nexus R3 1300X review yet? The i5-2500K is still a beast considering it's 6 years old.
 
I don't think there is that significant of a shortage anymore:


Amazon has 19 in stock, and there's another 15 in stock on the Amazon Marketplace in "New" condition.

Micro Center is selling them for the MSRP of $57 again. Jet had them on sale for $52 yesterday, but they're back up to $80 now.
 
AMD's Radeon RX Vega 64 Might Be The Ultimate Ethereum Mining Card, Could Top 100 MH/s
Hot Hardware said:
It is a tough time to be a gamer in need of a graphics card upgrade. There are some excellent options for gaming on the PC, but with the cryptocurrency gold rush that has taken place over the past several months, it can be difficult to (A) find certain graphics cards in stock and (B) not pay an inflated price for one. Digital coin miners have created a shortage, and if AMD's Radeon RX Vega 64 proves to be a mining powerhouse as rumored, we could be looking at a similar situation for many months to come.

One of the staff members at OCUK claims the hash rate on "Vega" is 70-100 million hashes per second (MH/s). He does not specify which Vega graphics card, but it is a safe assumption it would be the higher end Radeon RX Vega 64 that might be capable of approaching 100 MH/s. And as he indicates, anywhere close to that is "insanely good."

[Image: Radeon RX Vega 64]


"Trying to devise some kind of plan so gamers can get them at MSRP without the miners wiping all the stock out within 5 minutes of product going live," OCUK says.

Therein lies the problem for gamers. There are certain types of cryptocurrencies that are best mined with graphics cards. One of them is Ethereum, and it has skyrocketed in value over the last several months. At the beginning of the year, it was trading at around $8. It's currently valued at just under $224, and at one point recently, it hit the $400 mark.

The good news here is that nothing is yet official, though rumors of Radeon RX Vega 64 being great at mining are coming from multiple corners of the web. Videocardz, for example, claims to have heard from a "good source" that AMD has informed its hardware partners of the situation, presumably so they can brace for a shortage. The source also says the hash rate will be lower than 100 MH/s, but still at least double that of the Radeon RX Vega Frontier Edition, which has a hash rate of around 30 MH/s.

To put these numbers into perspective, most of the higher end cards that digital coin miners flock to have hash rates ranging from 20 MH/s to 30 MH/s. If the Radeon RX Vega 64 is significantly better at mining Ethereum, which is still trading high, then it's a safe bet there will be a shortage no matter how hard AMD tries to keep up.
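For rough context on why miners chase those hash rates, here's a back-of-the-envelope revenue sketch. Every constant in it is an illustrative mid-2017 assumption (network size, block time, block reward) rather than data from the article, and it ignores transaction fees, uncle rewards, pool cuts, and power costs:

```python
# Rough expected Ethereum mining revenue per card per day.
BLOCK_REWARD_ETH = 5            # assumed per-block reward at the time
BLOCKS_PER_DAY = 86_400 / 14    # assumed ~14 s average block time
NETWORK_MHS = 70_000_000        # assumed ~70 TH/s total network rate
ETH_PRICE_USD = 224             # price quoted in the article

def daily_revenue_usd(card_mhs: float) -> float:
    """Gross USD/day: your share of the network times daily issuance."""
    share = card_mhs / NETWORK_MHS
    return share * BLOCKS_PER_DAY * BLOCK_REWARD_ETH * ETH_PRICE_USD

for name, mhs in [("Typical high-end card", 25),
                  ("Vega Frontier Edition", 30),
                  ("Rumored RX Vega 64", 100)]:
    print(f"{name:22} {mhs:3} MH/s -> ${daily_revenue_usd(mhs):5.2f}/day")
```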
Watch AMD stock tank for a month or two because of Reddit chatter about poor gaming performance before the suits on Wall Street realize that even after the new GPUs drop there's been something like a 60% increase in GPU pricing year over year, that it's harder to find a decent shrinkwrapped fucking gaming GPU these days than a Nintendo Switch, that the PC gaming industry is soaring, that AMD continues to control both console markets (including the newer, more premium variations), and that all of this erases AMD's principal challenge, which is its profit margin, not its market viability.

Chart: 2016. Rinse, wash, repeat.
 
