Is Framerate a big deal to you in video games?

What is more important to you?


  • Total voters: 43

Steve-French

What A Rush!!!
@Gold
Joined
Jun 11, 2012
Messages
21,832
Reaction score
21,443
I think it's fine as long as it's at least 30. I have an Xbox One X, and I don't know if it's my TV, but I don't really notice that much of a difference between COD (60fps) and Far Cry (30fps). I feel I'm in the minority when it comes to this, as I would rather have better textures and 4K resolution at 30fps vs 1080p at 60fps.

I do like what a lot of people call soap opera mode on my TV, though. It's actually kind of funny how a lot of the people who need 60fps minimum in a video game are the same people who can't stand the judder effect when they're watching television.

I think a lot of it is a way for PC gamers to flex their gear too, but with this generation of consoles it seems like 30fps is going to be a thing of the past.
 
If I have to pick one, it has to be frame rate, because that actually has an impact on the game beyond its aesthetic value. I'm not a frame rate junkie by any means, and I'm not one of the people on here making a huge deal about 30fps vs 60fps, but a noticeable drop in frame rate really sucks. I also have a 4K TV, so I do care about resolution when it's available.
 
i often can't even play shit with bad frame rates, between getting nausea and migraines...
 
30 FPS is fine for me as well. I can tell the difference between 30 and 60, but that's it. I can't tell the difference between 60fps and 144Hz. Even 30 vs 60 can be tough sometimes.

I have a cousin who can tell right away though and he tends to be better at the FPS games we play. Makes me wonder if that's part of the reason why.
 
If I have to pick one, it has to be frame rate, because that actually has an impact on the game beyond its aesthetic value. I'm not a frame rate junkie by any means, and I'm not one of the people on here making a huge deal about 30fps vs 60fps, but a noticeable drop in frame rate really sucks. I also have a 4K TV, so I do care about resolution when it's available.

I'm not really that good at FPS games. I never play online either. If I'm playing an FPS, it's always the campaign on medium difficulty with aim assist on. I guess it would be a bigger deal if I played online.
 
framerate over resolution

There's nothing sexier than a game moving at smooth top speed, like Doom Eternal.
 
Frame rate is more integral to a positive gaming experience. A bad one can render something unplayable.
 
If I have to pick one, it has to be frame rate, because that actually has an impact on the game beyond its aesthetic value. I'm not a frame rate junkie by any means, and I'm not one of the people on here making a huge deal about 30fps vs 60fps, but a noticeable drop in frame rate really sucks. I also have a 4K TV, so I do care about resolution when it's available.

I'm not a competitive gamer, but I work in the digital display market so I do a lot of calculations of visual characteristics.

Hypothetically, they should both have roughly the same impact.

The human eye with 20/20 vision can resolve about one arcminute (1/60 of a degree).
When an object moves across your screen, the accuracy of that movement is limited by the arcminutes of detail being rendered.

Say an object moves across your screen at exactly 60 arcminutes per second, and your resolution works out to exactly one arcminute per pixel.

If you lower your resolution by half, the object will appear to move in jumps of 2 arcminutes at a time, because each pixel will be 2 arcminutes wide.
Similarly, if you drop the refresh rate by half, the object will also appear to move in jumps of 2 arcminutes at a time, because each refresh will skip a pixel.

I guess it comes down to speccing both resolution and refresh rate up to an optimal point before you start getting diminishing returns.
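
To make that concrete, here's a rough sketch of the arithmetic in Python, using the numbers from above (60 arcminutes per second, 1 arcminute per pixel, 60Hz baseline) as the assumed inputs; the helper function is just for illustration:

```python
# Rough sketch of the arcminute argument above.
# Assumed baseline: object moving at 60 arcminutes/second,
# 1 arcminute per pixel, 60Hz refresh rate.

def apparent_step_arcmin(speed_arcmin_per_s, arcmin_per_pixel, refresh_hz):
    """Smallest jump the object appears to make per refresh, in arcminutes.

    The object can only land on pixel boundaries and only gets redrawn
    once per refresh, so the visible step is whichever is coarser:
    the pixel pitch or the distance travelled between refreshes.
    """
    travel_per_refresh = speed_arcmin_per_s / refresh_hz
    return max(arcmin_per_pixel, travel_per_refresh)

speed = 60.0  # arcminutes per second

baseline  = apparent_step_arcmin(speed, arcmin_per_pixel=1.0, refresh_hz=60)  # 1.0
half_res  = apparent_step_arcmin(speed, arcmin_per_pixel=2.0, refresh_hz=60)  # 2.0
half_rate = apparent_step_arcmin(speed, arcmin_per_pixel=1.0, refresh_hz=30)  # 2.0

print(baseline, half_res, half_rate)  # halving either one doubles the jump
```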
 
i often can't even play shit with bad frame rates, between getting nausea and migraines...
Same here. Must be an age thing, because I am from the generation of frame rates in the teens, and never had a problem.
 
Same here. Must be an age thing, because I am from the generation of frame rates in the teens, and never had a problem.

maybe, i dunno. i always kinda had sensitive eyes to sunlight (and couldn't do homework on the bus as a kid or i'd get carsick). first noticed this with vidya when i bought a sweet new cutting-edge lcd monitor to replace the standard crts... and it was a migraine machine. and that was a long time ago, so i'm not sure if it's age or not. it seems like age would have the opposite effect with macular degeneration.
 
maybe, i dunno. i always kinda had sensitive eyes to sunlight (and couldn't do homework on the bus as a kid or i'd get carsick). first noticed this with vidya when i bought a sweet new cutting-edge lcd monitor to replace the standard crts... and it was a migraine machine. and that was a long time ago, so i'm not sure if it's age or not. it seems like age would have the opposite effect with macular degeneration.
Yeah, in reality, I’m not sure. Just know it fucks with me now and never did before, while I still have 20/20 vision.
 
Unless it's an absolute slide show, no. It's a nice little luxury I guess, but it's nothing I'd seek out to improve from the standard 30fps. It's nothing that will stop me from enjoying a game.
 
Honestly, I'm not too picky about either. I can barely perceive the difference between 30 and 60 fps in most games, and as for resolution, I've been gaming since the Atari days, which according to Google was 160x192 pixels. Other graphical bells and whistles were always more important to me in my PC gaming days than resolution. For example, back when I was playing something like Unreal Tournament, I would much rather have things like coloured lighting, reflections, shadows and particle effects than anti-aliasing. Resolution was always the first graphical setting I was willing to compromise on if my system was struggling with performance.

Even today I think I’d rather have 30 FPS at 1080p with Ray tracing than 4K/60 without it.
 
Same here. Must be an age thing, because I am from the generation of frame rates in the teens, and never had a problem.

My first console was an Atari 2600 in 1986. Video games have come a long way since then. Maybe that's why frame rate isn't an issue to me.
 
No, high FPS isn't just for PC people to flex; it makes a huge difference. You're also playing with a controller, and it makes an even bigger difference with a mouse and keyboard.

 
My first console was an Atari 2600 in 1986. Video games have come a long way since then. Maybe that's why frame rate isn't an issue to me.
I started on the Atari 2600, then moved to the NES, then everything else afterwards. Had a PowerPC Macintosh as well. But the family got our first color PC in '95, and we naturally upgraded every few years after that. So I saw the difference as I gamed on both PC and consoles throughout the generations. It wasn't really until the 360/PS3 era that frame rate became super noticeable, but PC resolutions were already much higher at that point.
 