1080p 60fps or 4k 30fps?

Equivalent hardware pretty much means equal gaming performance. Optimization for any one platform can only do so much; it is not black magic. Even more so when you consider that the Xbox One uses DirectX for its API, and the PS4 has two APIs that are very similar to DX. The fact is the PlayStation APIs are designed to give developers finer control over things such as pixel shading, so they can reduce quality in some areas to be less demanding on the hardware.
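
For a rough sense of what that kind of trade-off buys, here is a minimal sketch (Python, purely illustrative) assuming pixel-shading work scales roughly with the number of pixels shaded per frame; the 900p figure is just an example, not a claim about any particular game:

Code:
# Rough, illustrative arithmetic only: pixel-shading cost scales roughly
# with the number of pixels shaded per frame (real workloads vary a lot).

def pixels(width, height):
    return width * height

def relative_shading_load(native, reduced):
    """Fraction of the native per-frame shading work left after dropping
    the internal render resolution, everything else held equal."""
    return pixels(*reduced) / pixels(*native)

if __name__ == "__main__":
    native_1080p = (1920, 1080)
    internal_900p = (1600, 900)   # hypothetical reduced internal resolution
    load = relative_shading_load(native_1080p, internal_900p)
    print(f"900p internal rendering: ~{load:.0%} of the 1080p shading work")
    # -> roughly 69%, i.e. about a 31% cut before upscaling back to 1080p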

You are right, it is hard to build a PC for $300. Spending more on a PC also gets you a lot more in the way of options. A $500 PC can match or beat a PS4 in terms of gaming performance and is still a fully functioning PC. Of course, that is if someone is starting from scratch. If you already have a PC, there is a good chance that a $100-$200 upgrade can make it a gaming PC.


I built my current gaming PC for 1.2k about a year before the PS4 release, and it's getting to the point where new games look better on my PS4 than at the settings I can run them on my PC.

Consoles simply have great value, no matter how many benchmarks and hardware price tags diehard PC fans come up with.

As a passionate gamer it's obviously best to have both, but for someone who doesn't have or doesn't want to spend the money, consoles are a better deal IMO.
 
As an ancient UT player, I still prefer FPS.
 
I built my current gaming PC for 1.2k about a year before the PS4 release, and it's getting to the point where new games look better on my PS4 than at the settings I can run them on my PC.

Consoles simply have great value, no matter how many benchmarks and hardware price tags diehard PC fans come up with.

As a passionate gamer it's obviously best to have both, but for someone who doesn't have or doesn't want to spend the money, consoles are a better deal IMO.

If you built a PC in 2012, spent $1,200, and it can't beat a PS4, you did not know what you were doing and got ripped off. Not only that, but the PS4 hardware has not changed since release (not counting the Pro), so if your PC hardware was more powerful when you bought it, it is still more powerful now.

Not spending money or not wanting to does not make consoles a better deal. It simply makes them cheaper.

I would ask what your PC has in it for a CPU and GPU.
 
If you built a PC in 2012, spent $1,200, and it can't beat a PS4, you did not know what you were doing and got ripped off. Not only that, but the PS4 hardware has not changed since release (not counting the Pro), so if your PC hardware was more powerful when you bought it, it is still more powerful now.

Not spending money or not wanting to does not make consoles a better deal. It simply makes them cheaper.

Core i5-4670
OC'd GTX 770
Good motherboard, SSDs, case, cooling, etc.

No extreme high-end stuff that drops in value quickly, just solid gaming components for the time at a normal price in my country.

Now look at some benchmarks for current games with these specs...
 
I built my current gaming PC for 1.2k about a year before the PS4 release, and it's getting to the point where new games look better on my PS4 than at the settings I can run them on my PC.

Consoles simply have great value, no matter how many benchmarks and hardware price tags diehard PC fans come up with.

As a passionate gamer it's obviously best to have both, but for someone who doesn't have or doesn't want to spend the money, consoles are a better deal IMO.
Question: how did you build a PC with an i5-4670 in 2012 when that CPU launched in June 2013?
 
Question: how did you build a PC with an i5-4670 in 2012 when that CPU launched in June 2013?

By not remembering the time correctly. But yeah, that CPU was new when I bought it.
 
Core i5-4670
OC'd GTX 770
Good motherboard, SSDs, case, cooling, etc.

No extreme high-end stuff that drops in value quickly, just solid gaming components for the time at a normal price in my country.

Now look at some benchmarks for current games with these specs...

If you are going to compare the benchmarks of your system to a newer PC in the same range, then yes, it will fall behind. However, the GTX 770 is still much more powerful in terms of graphical processing than a PS4. Look at the benchmarks for something like The Witcher 3: the PS4 can hardly hold 30 fps, and your GTX 770 can do that at higher graphical settings than the PS4.

Of course, look at how much better a situation you are in now when you want to upgrade. The Xbox Scorpio is more than likely going to be around $450 or so. By the time it hits stores we are going to be staring at the next generation of video cards, and you will be looking at GTX 1080 performance for that price.

That is not even mentioning the advantages you have with an SSD, among other things.
 
If you are going to compare the benchmarks of your system to a newer PC in the same range, then yes, it will fall behind. However, the GTX 770 is still much more powerful in terms of graphical processing than a PS4. Look at the benchmarks for something like The Witcher 3: the PS4 can hardly hold 30 fps, and your GTX 770 can do that at higher graphical settings than the PS4.

Of course, look at how much better a situation you are in now when you want to upgrade. The Xbox Scorpio is more than likely going to be around $450 or so. By the time it hits stores we are going to be staring at the next generation of video cards, and you will be looking at GTX 1080 performance for that price.

That is not even mentioning the advantages you have with an SSD, among other things.

IDK, turning details down makes a lot of games look kinda bland. Maybe I'm disregarding the higher resolution I get on PC.

Do you think getting a GTX 1080 is worth it for me, or would the CPU bottleneck it?
 
IDK, turning details down makes a lot of games look kinda bland. Maybe I'm disregarding the higher resolution I get on PC.

Do you think getting a GTX 1080 is worth it for me, or would the CPU bottleneck it?
Fuck no, your CPU is still one of the best gaming CPUs on the market:
http://cpu.userbenchmark.com/Compare/Intel-Core-i7-7700K-vs-Intel-Core-i5-4670/3647vsm630
The most important single number there is the "QC Mixed" score. Your CPU is still just 28% off the #1 gaming CPU on the market at stock frequencies in that metric, which is a very modest margin. It is still hanging around the top 30 gaming CPUs in existence, and that list includes editing processors up to $1.6K and server processors far beyond that.
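
Just to show where a figure like that comes from, here is a few lines of Python; the scores are placeholders, not the actual UserBenchmark numbers:

Code:
# How a "% off the leader" figure is derived from two benchmark scores.
# The scores below are made up, not the real "QC Mixed" results.

def percent_behind(leader_score, your_score):
    """How far your_score trails leader_score, as a percentage of the leader."""
    return (leader_score - your_score) / leader_score * 100

print(f"{percent_behind(100.0, 72.0):.0f}% behind")  # hypothetical scores -> 28%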

You might look into overclocking your memory a bit if that's possible. If you don't intend to build a whole new computer in the next 2-3 years, I would also advise upgrading to 16GB now to beat the skyrocketing price of DDR3 as it disappears from the market; it could come in handy if you want to ride this build out with a new GPU. It probably won't be necessary, though: 8GB of RAM should remain sufficient.

Also, just so you know, you appear to have a misconception about what PC gamers argue, or you've been talking to the wrong crowd. Personally, I have long argued the very point you raised as one of consoles' great virtues: their longevity of relevance. It is undoubtedly superior to PCs', generally speaking, but that advantage has diminished considerably of late.

The first and biggest reason was the move to x86 architecture by not one, but both consoles.

The second is that AMD's open-sourced approach to API development has meant that more and more PC games are seeing those schemes integrated into PC drivers. This is true for Vulkan-supported titles in particular, but DX12 apparently borrowed heavily from Mantle too, despite what you might read to the contrary; the evidence is that AMD GPUs saw a much larger relative boost in DX12 titles than NVIDIA GPUs when DX12 was first introduced. This has served to close the gap between the capability of AMD's hardware pipeline and NVIDIA's. In other words, per-game software optimization has become less decisive.

The third and final reason relates to what you are considering right now, though it is exclusive to Intel owners, whose builds have shown phenomenal endurance thanks to the performance potential of those CPUs: the option to simply upgrade your GPU and nothing else. This goes back to the Sandy Bridge generation of quad-core i5 and i7 processors. Considering a GPU bump is all PS4 and XB1 users are getting with their console "refreshes", I see little difference, and therefore little remaining advantage for consoles. In another thread the PC guys were marveling at the value of the Sandy Bridge i5-2500K, especially when purchased with the market-leading aftermarket cooler of the time, the Cooler Master Hyper 212 Plus ($245 was the combined MSRP in 2011). That combination can still hold its own against an i5-7600K or i7-7700K at stock frequencies today.

Also, to offer some perspective: yes, developers bend over backwards to squeeze the most out of the late-2013 PS4/XB1 hardware in newer games, and that works against PC longevity, but it does nothing to reverse your computer's clear graphical superiority in titles that were tuned for your GTX 770 when it was on the cutting edge. So while it might be tougher to tell the difference in games made in 2017, don't lose sight of the fact that your PC will still shit on the consoles, graphically, in just about every game from the first half of 2015 and earlier. Even PS3 games ported to the PS4, or Xbox 360 games ported to the Xbox One, are a serious afterthought unless they're remastered and sold à la carte as actual boxed retail: no different than in the world of PCs!

But yeah, slot in a GTX 1080, and you'll be right back near the top of the world.
 
IDK, turning details down makes a lot of games look kinda bland. Maybe I'm disregarding the higher resolution I get on PC.

Do you think getting a GTX 1080 is worth it for me, or would the CPU bottleneck it?

That's the thing with PC graphics. You turn a setting down and then you start looking to see the difference.

Your CPU is still a great gaming CPU today. If your monitor is 1080p, then yes, your CPU will hold the GTX 1080 back a bit. Of course, you are going to be getting FPS in the hundreds, most likely much higher than your monitor's refresh rate, so you will not see the bottleneck. If you are playing above 1080p, the bottleneck will shift back to the GTX 1080.
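
If it helps, here is a toy model of that point in Python; the frame rates are made up, but the logic is just that the displayed rate is capped by whichever of the CPU, the GPU, or the monitor is slowest:

Code:
# Toy model: what you actually see is capped by the slowest of the CPU
# limit, the GPU limit, and the monitor refresh rate. Numbers are invented.

def displayed_fps(cpu_limit_fps, gpu_limit_fps, refresh_hz):
    return min(cpu_limit_fps, gpu_limit_fps, refresh_hz)

def limiter(cpu_limit_fps, gpu_limit_fps, refresh_hz):
    rates = {"CPU": cpu_limit_fps, "GPU": gpu_limit_fps, "monitor": refresh_hz}
    return min(rates, key=rates.get)

# Hypothetical 1080p rig on a 60 Hz panel: the CPU caps the GPU at 140 fps,
# but the monitor only shows 60, so the bottleneck is invisible in practice.
print(displayed_fps(140, 220, 60), limiter(140, 220, 60))    # -> 60 monitor
# Same rig on a 144 Hz panel: now the CPU cap is what you actually see.
print(displayed_fps(140, 220, 144), limiter(140, 220, 144))  # -> 140 CPU
# Push the resolution up and the limit shifts back to the GPU.
print(displayed_fps(140, 90, 144), limiter(140, 90, 144))    # -> 90 GPU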
 
That's the thing with PC graphics. You turn a setting down and then you start looking to see the difference.

Your CPU is still a great gaming CPU today. If your monitor is 1080p, then yes, your CPU will hold the GTX 1080 back a bit. Of course, you are going to be getting FPS in the hundreds, most likely much higher than your monitor's refresh rate, so you will not see the bottleneck. If you are playing above 1080p, the bottleneck will shift back to the GTX 1080.
I think this is why the gaming community has tended to eschew the term "bottleneck" more and more over the past half decade. Which component limits a game is not purely a matter of resolution and framerate, although those are obviously highly relevant for any given title. The reason is that the most intensive portion of the load varies from game to game, depending on how heavily each uses the CPU. While the CPU might bottleneck full GPU utilization for someone wanting 144 Hz on a 1080p monitor in The Witcher 3, for example, it won't in the majority of other PC games out there, such as DOOM. The difference between "will" and "could" matters here: it is too conditional to blanket reliably with that term.

This Redditor maintains a channel that specializes in re-purposing old hardware:



After all, for perspective, this YouTuber mentions that his i5-4690K even "bottlenecks" his GTX 970 sometimes on his new 1080p @ 144 Hz panel. That mentality leads him to opine that the i5 is no longer the "sweet spot" and that games are now designed with 8-core CPUs in mind. Obviously that perspective places him in an extreme minority.

I must say he does offer an interesting argument, in that he discounts benchmark results where the i5 doesn't render everything on screen that the i7 does during the game's internal benchmark runs (the assets that "pop" in):




The point I am trying to convey is that unless the CPU and GPU are under perfectly even loads in every single game at that game's peak demands, then technically there will always be a "bottleneck". One should be careful about obsessing over this. That is why, for me, the term is reserved for when your CPU will never let your GPU spread its wings and run at its fullest capacity in any gaming situation you throw at your machine. That is when your GPU is truly being bottlenecked.
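
A quick sketch of that "will vs could" point, with invented numbers rather than real benchmarks, since which side limits you genuinely changes from game to game:

Code:
# Per-game toy illustration: the limiting component depends on the game.
# The frame rates below are invented for one hypothetical rig, not benchmarks.

games = {
    "The Witcher 3 (crowded city)": (95, 120),   # (CPU-limited fps, GPU-limited fps)
    "DOOM (2016)": (200, 150),
    "A heavy late-game RTS battle": (60, 140),
}

for name, (cpu_fps, gpu_fps) in games.items():
    limit = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{name}: ~{min(cpu_fps, gpu_fps)} fps, {limit}-limited")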
 
Am I the only one who doesn't care about resolution at all? I dunno, from 720p onward I can't really see any relevant difference. FPS, on the other hand, is a big deal IMO.
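
For what it's worth, the raw pixel math behind the thread title (plain arithmetic, nothing rig-specific):

Code:
# Pixels per frame for common resolutions, and pixels per second for the
# two options in the thread title. Simple arithmetic only.

RESOLUTIONS = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h:,} pixels per frame")

for label, res, fps in (("1080p @ 60 fps", "1080p", 60), ("4K @ 30 fps", "4K", 30)):
    w, h = RESOLUTIONS[res]
    print(f"{label}: {w * h * fps:,} pixels per second")
# 4K at 30 fps still pushes twice the pixels per second of 1080p at 60 fps.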
 