Intel says chips to become slower but more energy efficient

jefferz

Well, this sucks for the master race

Intel has said that new technologies in chip manufacturing will favour better energy consumption over faster execution times – effectively calling an end to ‘Moore’s Law’, which successfully predicted the doubling of density in integrated circuits, and therefore speed, every two years.

“We’re going to see major transitions,” said Holt. “The new technology will be fundamentally different.” He continued: “The best pure technology improvements we can make will bring improvements in power consumption but will reduce speed.”
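To put the doubling claim in rough numbers, here's a back-of-the-envelope sketch (the starting figure is the transistor count of an early-1970s microprocessor; the projection is just compounding the two-year doubling, not a real chip roadmap):

```python
# Rough illustration of Moore's Law: density doubling every two years.
start = 2_300                 # transistors in an early-1970s microprocessor (the 4004)
years = 44                    # 1971 -> 2015
doublings = years / 2         # one doubling per two years, per Moore's Law

projected = start * 2 ** doublings
print(f"{doublings:.0f} doublings -> roughly {projected:,.0f} transistors")
# ~2,300 * 2^22 is on the order of 10 billion, which is about where the
# biggest chips actually landed by the mid-2010s.
```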
 
Well, consoles use the same chips, so I'm not sure what your point is.
 
Yeah, who gives a shit about performance gains. The only benefit from this is that it gives AMD a chance to catch up.
 
To be fair, for the vast majority of uses computers are more than powerful enough right now, and better thermals and lower power consumption are very worthwhile pursuits.
At work we have something like 26k client computers and thousands of servers spread across 5 data centers, and a small increase in power consumption can have a big effect at that scale. And we're still fairly small, all things considered.
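Just to put a rough number on that (every figure below is a made-up assumption for illustration, not data from any real environment):

```python
# Back-of-the-envelope fleet power saving. All inputs are assumed values.
clients = 26_000            # client machines, per the post above
watts_saved = 5             # assumed average saving per machine, in watts
hours_per_year = 24 * 365   # assumes machines are powered on around the clock
price_per_kwh = 0.12        # assumed electricity price, USD

kwh_saved = clients * watts_saved * hours_per_year / 1000
print(f"{kwh_saved:,.0f} kWh/year ≈ ${kwh_saved * price_per_kwh:,.0f}/year")
# 26,000 machines saving 5 W each works out to over a million kWh a year --
# before you even count servers or the cooling load.
```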
And the really insane number crunching is generally handled by specialized hardware anyways.
I could easily swap out my i7 for an i5 and not notice any negative effects on normal use. Rendering and video editing, maybe, but even that is more GPU-dependent now with CUDA and OpenCL and the like offloading a lot of the work from the CPU.
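For anyone curious what that offload looks like in code, here's a minimal sketch. It assumes you have an NVIDIA GPU with CUDA and the CuPy package installed; the matrix size is arbitrary and just there to make the work measurable:

```python
# Minimal CPU-vs-GPU offload sketch using NumPy and CuPy.
import time
import numpy as np
import cupy as cp

n = 4096
a_cpu = np.random.rand(n, n).astype(np.float32)
b_cpu = np.random.rand(n, n).astype(np.float32)

t0 = time.perf_counter()
np.matmul(a_cpu, b_cpu)
print(f"CPU matmul: {time.perf_counter() - t0:.2f} s")

# Move the data to the GPU and do the same work there.
a_gpu, b_gpu = cp.asarray(a_cpu), cp.asarray(b_cpu)
t0 = time.perf_counter()
cp.matmul(a_gpu, b_gpu)
cp.cuda.Stream.null.synchronize()   # wait for the GPU to finish before timing
print(f"GPU matmul: {time.perf_counter() - t0:.2f} s")
```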
 
For phones and handheld systems this could be a boon, for desktops not so much.
 
Well, I think it is about getting the best performance per watt now.

Real-world performance gains in processors have not been significant since the first-generation i7 chips. Yes, IPC has gone up around 30%, but what that means to the average consumer is negligible. You can still max out any game using an old i7 920.

For rendering/modeling applications, the focus for a while has been on energy-efficient multi-core processors. Instead of a single blazing-fast 5 GHz processor that draws 140 W, servers use dual octa-core Xeons that operate at lower frequencies and power draw.
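A crude way to see why that trade-off wins for servers (all figures below are invented for illustration, and the model ignores IPC, memory, and everything else that matters in practice):

```python
# Toy performance-per-watt comparison for an embarrassingly parallel workload.
fast_chip = {"cores": 4,  "ghz": 5.0, "watts": 140}   # hypothetical hot-clocked desktop part
xeon_pair = {"cores": 16, "ghz": 2.6, "watts": 190}   # hypothetical dual octa-core setup

def throughput(chip):
    # Pretend throughput scales with cores * clock.
    return chip["cores"] * chip["ghz"]

for name, chip in [("5 GHz chip", fast_chip), ("dual Xeon", xeon_pair)]:
    t = throughput(chip)
    print(f"{name}: {t:.1f} core-GHz, {t / chip['watts']:.3f} core-GHz per watt")
# The slower, wider configuration comes out well ahead per watt.
```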

As an example, I was benching my heavily overclocked 4770K (@4.6 GHz) against my old Xeon X5660 (@4.5 GHz). Despite the latter processor being almost 7 years old, they both turned in the same scores when gaming at 1440p. In fact, the X5660 turns out to be faster in games like Civ 5 and Battlefield because they take advantage of the extra cores.

P.S: Lower voltages/power draw leaves more overclocking headroom for the enthusiast community. Skylake is supposed to overclock like crazy at lower temps/voltages than Haswell and Ivy Bridge.
 
Also, as devices get thinner and thinner, lower power draw lets the chip do more work before it has to throttle due to heat or power limits. Lower TDP probably isn't a very sexy stat on marketing materials, but it's very important.
 
The chips aren't going to become slower; they're going to stop getting faster, and development will be focused on energy efficiency.
 
All the money is in phones & laptops now, guys. This was inevitable.
 