idk, turning details down makes a lot of games look kinda bland. maybe I'm disregarding the higher resolution I get on pc.
you think getting a gtx1080 is worth it for me or would the cpu bottleneck it?
Fuck no, your CPU is still one of the best gaming CPUs on the market:
http://cpu.userbenchmark.com/Compare/Intel-Core-i7-7700K-vs-Intel-Core-i5-4670/3647vsm630
The most important single number there is the "QC Mixed" score. At stock frequencies, your CPU is still only about 28% off the #1 gaming CPU on the market in that metric: a very modest margin. It's still hanging around the top 30 gaming CPUs in existence, and that list includes editing processors up to $1.6K and server processors far beyond that.
You might look into overclocking your memory a bit if that's possible. If you don't intend to build a whole new computer in the next 2-3 years, I'd also advise upgrading to 16GB now, before the price of DDR3 skyrockets as it disappears from the market. That could come in handy if you want to ride this build out with a new GPU. It probably won't be necessary, FYI; 8GB of RAM should remain sufficient.
Also, so you know, you appear to have a misconception of what PC gamers argue, or you've been talking to the wrong crowd. Personally, I have long argued the very point you raised as one of consoles' great virtues: their longevity of relevance. It is undoubtedly superior to that of PCs, generally speaking, but that advantage has been severely diminished of late.
The first and biggest reason was the move to x86 architecture by not one, but
both consoles.
The second was AMD's open-source approach to API development, which has meant that more and more PC games integrate those low-level schemes in their PC drivers. This is true for Vulkan-supported titles in particular, but DX12 apparently borrowed heavily from Mantle too, despite what you might read to the contrary; the evidence is that AMD GPUs saw a much larger relative boost than NVIDIA GPUs in DX12 titles when DX12 was first introduced. This has served to narrow the gap between the capability of AMD's hardware pipelines and NVIDIA's. In other words, performance depends less than it used to on per-game software optimization.
The third and final reason is exactly what you are considering right now, though it's exclusive to Intel owners, whose builds have reaped phenomenal endurance from the performance headroom of those CPUs: the option to simply upgrade your GPU and nothing else. That dates back to the Sandy Bridge generation of quad-core i5 and i7 processors. Considering a GPU bump is all PS4 and XB1 users are getting with their console "refreshes", I see little difference, and therefore little advantage to consoles, anymore. In another thread the PC guys were marveling at the value of the Sandy Bridge i5-2500K, especially when purchased with the market-leading aftermarket cooler of the time, the Cooler Master Hyper 212 Plus (a combined MSRP of about $245 in 2011). That combination can still hold its own today against an i5-7600K or i7-7700K at stock frequency.
Also, to offer some perspective: yes, developers bend over backwards, figuratively, to maximize the performance of newer games on late-2013 PS4/XB1 hardware, and that works against PC longevity, but it does nothing to reverse your computer's incredible graphical superiority in titles that were tuned for your GTX 770 back when it was cutting-edge. So while it might be tougher to tell the difference in games made in 2017, don't forget that your PC will still shit on the consoles, graphically, in just about every game from the first half of 2015 and earlier. Even PS3 games ported to the PS4, or Xbox 360 games ported to the Xbox One, are a serious afterthought unless they're remastered and sold à la carte as actual boxed retail: no different than in the world of PCs!
But yeah, slot in a GTX 1080, and you'll be right back near the top of the world.