Ohh, been looking for a reason to post this:
Caught this in my YouTube feed a few days ago. Bit of a Titanic headline, and one that also needs a slight correction:
4K PC Gaming is Dumb.
This was the consensus the PC gaming community had already reached back in May 2016 when the GTX 10 series was released, and we realized that (except for the increasingly rare games that support SLI) there still isn't enough GPU horsepower to drive a 4K display at desirable framerates in the more challenging games. Unfortunately, the RTX series didn't materially change that two and a half years later. Still, that only applies to demanding games without SLI support, and the vast majority of games aren't that demanding.
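For a sense of scale, it's worth noting that the two setups being compared aren't even far apart in raw pixel throughput. A quick back-of-the-envelope check (raw fill rate only; per-pixel shading cost varies with settings, so this is a scale comparison, not a benchmark):

```python
# Raw pixels-per-second pushed by each setup
res_4k = 3840 * 2160      # 8,294,400 pixels per frame
res_1440 = 2560 * 1440    # 3,686,400 pixels per frame

px_per_s_4k60 = res_4k * 60          # 497,664,000 px/s
px_per_s_1440p144 = res_1440 * 144   # 530,841,600 px/s

print(px_per_s_1440p144 / px_per_s_4k60)  # ≈ 1.07
```

In other words, 1440p at a sustained 144fps actually asks the GPU for about 7% more pixels per second than 4K at 60fps; the real difference is that few games will hold either target in the demanding titles.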
However, I was a bit disappointed that he is stacking a 4K 60Hz monitor up against a 1440p 144Hz monitor in November 2018. Supposedly, this video is intended to assess the question under the premise of an unlimited budget (as he said). So why isn't he using the monitors below for the 4K setup? I understand that the video is more relevant to the widespread market reality of 4K monitors and actual gaming setups, but if you set a hypothetical where you dismiss budget concerns, then you should stick to it. For 4K, 27" is scraping the floor, but if you sit close enough, the average human eye can resolve the difference, according to the charts describing that average:
G-SYNC HDR 4K 144Hz Monitors Available Now From Acer and ASUS (August 1, 2018)
With these monitors you get 4K display resolutions at up to 144Hz, 1,000 nits of peak brightness, DCI-P3 color, a 384-zone controllable backlight, a 50,000:1 contrast ratio, Quantum Dot Enhancement Film, and of course, G-SYNC variable refresh rate technology.
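The "sit close enough" point can be put in rough numbers with an angular pixel density calculation. A quick sketch (the ~24" viewing distance and the commonly cited ~60 pixels-per-degree figure for 20/20 acuity are my assumptions, not from the video):

```python
import math

def pixels_per_degree(h_pixels, diag_in, aspect=(16, 9), distance_in=24):
    """Angular pixel density for a flat screen at a given viewing distance."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # screen width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

# 27" monitors viewed from about two feet away:
ppd_4k = pixels_per_degree(3840, 27)    # ≈ 74 PPD
ppd_1440 = pixels_per_degree(2560, 27)  # ≈ 49 PPD
```

At that distance, 1440p sits comfortably below the ~60 PPD acuity figure while 4K sits above it, which is consistent with the difference being visible on a 27" panel up close and shrinking as you move back.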
We're still waiting on those NVIDIA BFGD 65" displays to actually reach the market. They'll probably arrive before a GPU that's really capable of driving them does:
BIG FORMAT GAMING DISPLAYS
Still, the further we stray from theory into reality, the more complicated this question becomes. Competitive FPS gamers don't give a shit how the game looks. Last I checked pro players' setup pages, I was seeing 1080p screens, some with the resolution scale nerfed to 70%, and some running 720p outright. Some play with 4:3 aspect ratios, stretched aspect ratios, or even with
black bars.
They use mixed graphical settings to nerf eye candy that doesn't confer a competitive advantage, no matter how it looks. To almost all of us, these setups are also "stupid". But the guys who don't game competitively, or who don't play games where you're constantly spinning around like a ballerina, games with more emphasis on cutscenes and ambience, for example
Ori and the Blind Forest or
Metal Gear or
Resident Evil 7: Biohazard, those gamers might prefer the higher resolution at the lower framerate, especially on a larger screen.
Furthermore, for console gamers, who are locked to 60fps maximum anyway, what the hell is learned by comparing a 1440p@144Hz monitor to a 4K@60Hz monitor? Comparing stills and slow-moving videos, his staff all picked the 4K monitor as the best, and that's on 27" monitors. Most console guys game on TVs. Compare
Forza on that super-smooth 27" 1440p screen to a
77" LG C8 OLED running on an Xbox One X, and your deck will be just as stacked.
Generally speaking, anyone sane will concur with Linus, and I appreciate what he is doing, but in its own way, this video is a bit misleading.