Official AMD "Ryzen" CPU Discussion

Status
Not open for further replies.
If you were looking for the top performance from a quad-core processor... you've already got it. The R5 1600 isn't going to be superior to any of the Kaby Lake i5s for gaming. That's what insight into the IPC gives us.

When Nickerson says it might beat the i5 on value, he means it will give you more bang for the buck overall, not more gaming bang. No way in the world, especially considering how quickly we're seeing the 1700/1700X/1800X hit the heat-transfer wall (~4.2GHz) on air coolers. That's a strong indication that simply removing 2-4 cores from the Zen design won't let them push the frequency up by a massive margin, so don't expect to see a 4.5GHz core clock on the quad-cores when they come out (and if you do see that, it will probably be because they're repeating the FX-9xxx snake-oil strategy).

Yup the i5 will still be top as far as performance, but the difference in gaming could very well be small enough that passing up on two additional cores for the same money would be hard to do.
 
Yeah, in essence, it looks like we may have a repeat of the FX-6300 situation, but with the 6-core staring down the i5s, not the i3s.
 
AMD dropped hints of a 16-core, 32-thread Ryzen CPU for $700, versus $1,700 for Intel's biggest 10-core. Unfortunately it will not use the current socket, but it supports a full 16 lanes of data transfer. It will have six more cores than the i7-6950X and will support a crazy number of PCIe lanes.
 
Was reading about some nice improvements on Ryzen today: Dota 2 got a patch that supposedly improved performance by 20%, and there's some more good stuff over on the AMD Reddit.

That 16 core 32 thread CPU mentioned though, probably months out eh? Too bad.
 
Is that the one with 6 channel memory as well?
 

There's a new leak of a 32-core monster chip from Intel with 6-channel memory, likely going to cost $4,000. Intel is countering AMD's 32-core chip coming out next year with one that is already being tested.

[Image: Intel Xeon E5-2699 V5 (Skylake-EP)]

[Image: Skylake-EP Xeon, 32 cores / 64 threads]
 
Just downloaded Ryzen Master, and I didn't realize my 1700 was already running at 3.6GHz. Did I win the binning lottery, or is this not accurate?
 
3.7GHz is the native Turbo binning for the 1700:
http://www.cpu-world.com/CPUs/Zen/AMD-Ryzen 7 1700.html

That's part of why the 1700 has become the bestseller of the three and such a killer value, as I've expounded on in my posts above. From the information I've reviewed, these are the numbers you can expect with the most popular CPU coolers:


*Turbo Mode means it will run at 3.7GHz during challenging sequences in games, and throttle only if those sequences endure long enough without intermittent, less stressful periods for it to cool down.
 
So turbo mode is only activated during games? I haven't overclocked my 1700, and since I only have an HD 6670, I'm waiting till Vega or maybe the refreshed 500 series. Anyway, I haven't been able to test new games (tried running Andromeda with this graphics card), but I've used Cinebench, and it maxes out at 3.2GHz on all cores, so I was just wondering why I never see it kick up to 3.7. I am also running the stock Wraith cooler.
 
"Turbo Mode" (Intel) or "Turbo Core" (AMD) is just the dynamic frequency that will automatically be activated whenever it is demanded according to their algorithms and possibly the software being run, essentially, but it doesn't activate on all cores. This is optimal for workloads that are often optimized for fewer cores like games. So, in theory, the CPU is running at 3.0 GHz across all eight cores, and things are okay. But then all of a sudden you come to some new load area and it's dropping frames because the workload on the first two cores suddenly becomes incredibly demanding, but nothing was really added to the last four cores (possibly it will underclock the additional cores or shut them down). So it clocks those down, and overclocks the first two cores.

It may produce more overall heat for a short time, but generally speaking, this dynamic setup is intended to shift the burden of heat the processor can dissipate as a whole around to where it really needs to be used in that moment. It isn't intended to run 3.7GHz across all 8 cores (it would likely become toast pretty quickly, I imagine).

The CPU core clocks are chosen because the CPU can burn at that across the board and should never have to worry unless ambient temps get crazy, or the case gets choked. Games are one such stress. If you're CPU is never going into that, then that could mean that it is already working at full capacity, and can't really boost the primary cores for any appreciable duration (since all cores are firing).

Alternatively, it could be a motherboard issue. It might not be activated by default on your board. That's assuming your motherboard supports the technology.
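If you want to see for yourself whether boost is kicking in, one quick way on Linux is to sample the per-core clocks from /proc/cpuinfo (Ryzen Master shows the same information on Windows). A rough sketch, assuming a Linux box:

```python
#!/usr/bin/env python3
"""Print the current clock of each logical core by parsing /proc/cpuinfo."""
import re

def parse_core_mhz(cpuinfo_text):
    """Return the per-core clocks (in MHz) parsed from /proc/cpuinfo text."""
    return [float(m) for m in
            re.findall(r"^cpu MHz\s*:\s*([\d.]+)", cpuinfo_text, re.MULTILINE)]

if __name__ == "__main__":
    try:
        with open("/proc/cpuinfo") as f:
            clocks = parse_core_mhz(f.read())
        for core, mhz in enumerate(clocks):
            print(f"core {core:2d}: {mhz:7.1f} MHz")
    except OSError:
        print("no /proc/cpuinfo here (not Linux?)")
```

Run it in one terminal while a single-threaded load (e.g. Cinebench single-core) runs in another; the boosted core(s) should read noticeably higher than the rest, while an all-core load like the multi-core Cinebench run will pin everything at the lower all-core clock.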
 
"Turbo Mode" (Intel) or "Turbo Core" (AMD) is just the dynamic frequency that will automatically be activated whenever it is demanded according to their algorithms and possibly the software being run, essentially, but it doesn't activate on all cores. This is optimal for workloads that are often optimized for fewer cores like games. So, in theory, the CPU is running at 3.0 GHz across all eight cores, and things are okay. But then all of a sudden you come to some new load area and it's dropping frames because the workload on the first two cores suddenly becomes incredibly demanding, but nothing was really added to the last four cores (possibly it will underclock the additional cores or shut them down). So it clocks those down, and overclocks the first two cores.

It may produce more overall heat for a short time, but generally speaking, this dynamic setup is intended to shift the burden of heat the processor can dissipate as a whole around to where it really needs to be used in that moment. It isn't intended to run 3.7GHz across all 8 cores (it would likely become toast pretty quickly, I imagine).

The CPU core clocks are chosen because the CPU can burn at that across the board and should never have to worry unless ambient temps get crazy, or the case gets choked. Games are one such stress. If you're CPU is never going into that, then that could mean that it is already working at full capacity, and can't really boost the primary cores for any appreciable duration (since all cores are firing).

Alternatively, it could be a motherboard issue. It might not be activated by default on your board. That's assuming your motherboard supports the technology.
Gotcha

 
Don't get your hopes up. Nothing having to do with RAM speeds is going to close that gap.
But this does bring up the question of whether faster RAM is worth the performance-per-price now. Some games are showing up to a 10% improvement.
If I were building a high-end machine, I'd spend the extra $20-$30 to jump up to 3000+ MHz.
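Back-of-the-envelope on the value question: whether the RAM jump pays off comes down to FPS gained per extra dollar on the whole build. A toy sketch with made-up illustrative numbers (not benchmarks):

```python
def fps_per_dollar(fps, system_cost):
    """Whole-system price/performance: frames per second per dollar spent."""
    return fps / system_cost

# Hypothetical figures for illustration only.
base_fps, base_cost = 100.0, 1000.0   # build with baseline RAM
fast_fps, fast_cost = 110.0, 1025.0   # same build + $25 for faster RAM, +10% FPS

print(fps_per_dollar(base_fps, base_cost))  # 0.1
print(fps_per_dollar(fast_fps, fast_cost))  # ~0.107 -- better value in this scenario
```

The point being: a small price bump with even a mid-single-digit FPS gain improves the value ratio, because the RAM delta is tiny relative to the total system cost.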
 
I did not.
Check it out, it's worth it imo

The guy saw some noticeable improvements after switching to faster RAM. I'm curious about your thoughts on it; he also mentioned a Windows update, I believe.
 
Okay, I'll take a look. I just remember there being false hype about this the past several times, and I'm definitely burned on AMD's spin after they pulled what appears to be another glorified binning release with the R7 chips (i.e. the great TDP bamboozle). As far back as Sandy Bridge or Ivy Bridge we've been seeing 3-7% framerate improvements from RAM overclocks each generation, and it's never made much of any difference between AMD and Intel. I guess I'm just jaded.

I'll give it fresh eyes and my best attempt at a blank slate later tonight.
 

I definitely agree about some of the overhype, but this is one you should dig into. It has something to do with the neural network. Wendell went into it a little bit as well; I'll try to dig up the video.

edit: it's in here somewhere
 