Tech Gaming Hardware discussion (& Hardware Sales) thread

Question here that might get answered in this thread and not need a thread of its own.

I have a light gaming laptop that for the most part really suits my needs. I don't have space for a desktop, I also don't play the latest and greatest shooter games, and I don't care about having to run graphics on medium rather than ultra.

A few games I play, like hunting games, I like to connect via HDMI to my TV and play with a controller on the big screen like it's a game console. The game looks crystal clear and smooth on my laptop screen, but on the TV (two different TVs) it's choppy and there are lines that occasionally drift through the screen, even though my FPS readout stays in the 60-80 fps range.

The laptop is a TRACER III EVO bought in 2020, the graphics card is a (laptop) GTX 1660 Ti 6GB, and the motherboard uses the Intel HM370 chipset, if that's what's controlling the HDMI output.

I'm having this issue both on an older higher-end Samsung 70" 4K TV in my basement and on a cheap 2020 65" TCL Roku-based 4K TV in my living room.

So yeah, I'm low on graphics power, but why do I get clipping (not sure how to describe it) on the HDMI output to the TV, and nice smooth video on the laptop screen? Is there anything to do?
 
There’s just no card you can really recommend. The 4090 draws a nutty amount of power and might burn itself out at the connector. The 4080 is super expensive to the point where you feel like you might as well jump to a 90. And now this one.

I guess the AMD 7900 XTX is looking like the best option right now. You save 200 bucks off the 4080 for similar performance. Of course, you lose out on DLSS and RTX.

I will say I've played with RTX on a 3080 and a 3080 Ti some recently. At least on those cards, the performance hit was so profound I ended up turning it off altogether. That was even with DLSS 2 running. Yeah, the image looked nice, but the games were running janky.

Last weekend I was running Witcher 3 at 4K on a 3080 Ti. With RTX off and everything else at max, I was getting 80-90 fps on average. Turning it on, I dropped to about 45 in more open scenes and struggled to stay above 30 in more crowded scenes. RTX looks nice, but no thanks; it's not worth that performance hit. It wasn't just Witcher 3 either. I messed about in a few other games with a similar effect, I just don't remember those numbers as exactly.
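If it helps put numbers on why that hit feels so rough, here's a quick back-of-the-envelope frame-time conversion (a rough sketch, using the fps figures from my run above):

```python
# Rough frame-time math: converting average fps into milliseconds per frame
# shows why dropping from ~85 fps to 45/30 fps feels like such a big hit.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

runs = [
    ("RTX off, open scenes", 85),      # midpoint of the 80-90 fps I was seeing
    ("RTX on, open scenes", 45),
    ("RTX on, crowded scenes", 30),
]

for label, fps in runs:
    print(f"{label}: {fps} fps = {frame_time_ms(fps):.1f} ms per frame")

# Output:
# RTX off, open scenes: 85 fps = 11.8 ms per frame
# RTX on, open scenes: 45 fps = 22.2 ms per frame
# RTX on, crowded scenes: 30 fps = 33.3 ms per frame
```

Each frame takes roughly two to three times as long to render with RTX on, which is exactly how it felt.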

Looking at the charts for the 4070 and 4080, it doesn’t look like it’s gotten a whole lot better at handling the hit. At least with the 4090 it seems to stay in the 75+ range on a lot of games, which I consider a must these days.

I say all that to say maybe RTX isn’t even worth considering right now. Of course I’m using 30 series hardware at 4k, so it’s not a perfect comparison. Looking at the charts though, 4090 is probably the only card I would switch RTX on with.


Interesting. I have a 3440 x 1440 monitor. Obviously not 4K, but I'm not sure if the ultrawide demands more power than a non-ultrawide 1440p. Regardless, yeah, the 7900 XTX is the card I want, but I'm now scared given the heat problems posted in this thread LOL. A 4080 on sale would be nice too.
 
What specific games do you play at this resolution?

I haven't been buying anything high end AAA because my current card is a 1070 (and processor is dated too, a 6700). So the more recent stuff I have sunk some time into would be Final Fantasy Remake, Yakuza Dragon, Death Stranding, AC Odyssey, etc
 
Question here that might get answered in this thread and not need a thread of its own.

I have a light gaming laptop that for the most part really suits my needs. I don't have space for a desktop, I also don't play the latest and greatest shooter games, and I don't care about having to run graphics on medium rather than ultra.

A few games I play, like hunting games, I like to connect via HDMI to my TV and play with a controller on the big screen like it's a game console. The game looks crystal clear and smooth on my laptop screen, but on the TV (two different TVs) it's choppy and there are lines that occasionally drift through the screen, even though my FPS readout stays in the 60-80 fps range.

The laptop is a TRACER III EVO bought in 2020, the graphics card is a (laptop) GTX 1660 Ti 6GB, and the motherboard uses the Intel HM370 chipset, if that's what's controlling the HDMI output.

I'm having this issue both on an older higher-end Samsung 70" 4K TV in my basement and on a cheap 2020 65" TCL Roku-based 4K TV in my living room.

So yeah, I'm low on graphics power, but why do I get clipping (not sure how to describe it) on the HDMI output to the TV, and nice smooth video on the laptop screen? Is there anything to do?
I assume your laptop is 1080p or 1440p. Do you change the settings?
 
I assume your laptop is 1080p or 1440p. Do you change the settings?
I've tried it both with manually changing the settings and with leaving it on auto-detect.
 
@HockeyBjj Try this:

1. Right click on desktop and open display settings.
2. Change primary monitor to your TV.
3. Restart PC.
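If you'd rather skip digging through menus, here's a minimal sketch of a shortcut, assuming Windows 10/11 where the built-in DisplaySwitch.exe tool (the same thing Win+P drives) is available. It sends the picture to the TV only, which side-steps the primary-monitor question entirely:

```python
import subprocess

# Switch to "Second screen only" (the TV). DisplaySwitch.exe lives in System32,
# so it is already on PATH. Use "/extend" to keep both screens instead,
# or "/internal" to go back to the laptop panel afterwards.
subprocess.run(["DisplaySwitch.exe", "/external"], check=True)
```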

I'm thinking your newer TV is 4K at 120 Hz though and you need an HDMI 2.1 cable.
@Madmick is the SME on stuff like this.
 
Interesting. I have a 3440 x 1440 monitor. Obviously not 4K, but I'm not sure if the ultrawide demands more power than a non-ultrawide 1440p. Regardless, yeah, the 7900 XTX is the card I want, but I'm now scared given the heat problems posted in this thread LOL. A 4080 on sale would be nice too.
It is only the reference model that has those issues, and those cards are being recalled. AIB cards are not affected. I would recommend an AIB card anyway.
 
Interesting. I have a 3440 x 1440 monitor. Obviously not 4K, but I'm not sure if the ultrawide demands more power than a non-ultrawide 1440p. Regardless, yeah, the 7900 XTX is the card I want, but I'm now scared given the heat problems posted in this thread LOL. A 4080 on sale would be nice too.
I have an ultrawide monitor for my desktop also; the 4K numbers were from when I'm playing on my TV. I swear by ultrawide too lol. It does take more juice to run than a regular 1440p monitor, but not as much as 4K. I forget exactly how much more demanding it is, but just split the difference and you're close enough.
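For a rough sense of scale, here's the raw pixel math (just a sketch of resolution pixel counts; actual GPU load also depends on the game and settings):

```python
# Raw pixels per frame: ultrawide 1440p is roughly a third more than regular
# 1440p, and roughly 60% of 4K.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "Ultrawide (3440x1440)": 3440 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}

baseline = resolutions["1440p (2560x1440)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x of 1440p)")

# Output:
# 1440p (2560x1440): 3,686,400 pixels (1.00x of 1440p)
# Ultrawide (3440x1440): 4,953,600 pixels (1.34x of 1440p)
# 4K (3840x2160): 8,294,400 pixels (2.25x of 1440p)
```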

I can try to mess around in a couple of games tomorrow on the ultrawide and see what RTX does to it. I played some Cyberpunk on it and RTX tanked it. Haven't tried Witcher 3 on ultrawide though.

It is hard to give a solid GPU recommendation right now. It feels like each of them has a major drawback. I'm also just always leery of recommending AMD cards, as I had a bad experience with their drivers in the past.

 
Question here that might get answered in this thread and not need a thread of its own.

I have a light gaming laptop that for the most part really suits my needs. I don't have space for a desktop, I also don't play the latest and greatest shooter games, and I don't care about having to run graphics on medium rather than ultra.

A few games I play, like hunting games, I like to connect via HDMI to my TV and play with a controller on the big screen like it's a game console. The game looks crystal clear and smooth on my laptop screen, but on the TV (two different TVs) it's choppy and there are lines that occasionally drift through the screen, even though my FPS readout stays in the 60-80 fps range.

The laptop is a TRACER III EVO bought in 2020, the graphics card is a (laptop) GTX 1660 Ti 6GB, and the motherboard uses the Intel HM370 chipset, if that's what's controlling the HDMI output.

I'm having this issue both on an older higher-end Samsung 70" 4K TV in my basement and on a cheap 2020 65" TCL Roku-based 4K TV in my living room.

So yeah, I'm low on graphics power, but why do I get clipping (not sure how to describe it) on the HDMI output to the TV, and nice smooth video on the laptop screen? Is there anything to do?

@HockeyBjj Try this:

1. Right click on desktop and open display settings.
2. Change primary monitor to your TV.
3. Restart PC.

I'm thinking your newer TV is 4K at 120 Hz though and you need an HDMI 2.1 cable.
@Madmick is the SME on stuff like this.
I think @My Spot has correctly identified the issue. It must be trying to output in 4K to your TV. The HDMI port does not provide sufficient bandwidth to output that resolution. Your laptop's port is almost certainly HDMI 1.3 or 1.4 (doesn't matter which). The effective bandwidth ceiling for that port is 8.16 Gb/s. The bandwidth required to run 4K@60fps in full 8-bit RGB is roughly 14 Gb/s, well beyond that ceiling.

  1. Change the primary monitor to the TV as he suggested.
  2. In the NVIDIA Control Panel, set the output to a lower resolution to override its attempt to match the TV's native 4K. Make sure the Display Resolution is set to 1920x1080 at 60 Hz (or 120 Hz), and make sure the TV's refresh-rate setting matches.
  3. In "Manage 3D Settings", you can also try forcing Image Scaling onto the GPU via the dropdown menu, either globally or per game in the "Program Settings" tab.
  4. Try running the game in Borderless mode instead of Full Screen if you're still getting issues.
  5. You may also check "Adjust Desktop Color Settings" to override to Reference Mode if the color is weird.
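For reference, here's a rough sketch of the bandwidth math behind the 1080p recommendation in step 2 (assuming standard CTA pixel clocks and uncompressed 8-bit RGB; real-world overheads vary a bit):

```python
# Uncompressed HDMI data rate ~= pixel clock x bits per pixel (8-bit RGB = 24 bits).
# Standard CTA pixel clocks: 1080p60 = 148.5 MHz, 1080p120 = 297 MHz, 4K60 = 594 MHz.
HDMI_1_4_EFFECTIVE_GBPS = 8.16   # 10.2 Gb/s raw TMDS minus 8b/10b encoding overhead

modes = {
    "1080p @ 60 Hz": 148.5e6,
    "1080p @ 120 Hz": 297.0e6,
    "4K @ 60 Hz": 594.0e6,
}

for name, pixel_clock in modes.items():
    gbps = pixel_clock * 24 / 1e9
    verdict = "fits" if gbps <= HDMI_1_4_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{name}: ~{gbps:.2f} Gb/s -> {verdict} within HDMI 1.4's 8.16 Gb/s")

# Output:
# 1080p @ 60 Hz: ~3.56 Gb/s -> fits within HDMI 1.4's 8.16 Gb/s
# 1080p @ 120 Hz: ~7.13 Gb/s -> fits within HDMI 1.4's 8.16 Gb/s
# 4K @ 60 Hz: ~14.26 Gb/s -> does NOT fit within HDMI 1.4's 8.16 Gb/s
```

That's why the picture goes choppy only on the TV: the laptop panel never has to push a 4K signal through that port.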
 
I think @My Spot has correctly identified the issue. It must be trying to output in 4K to your TV. The HDMI port does not provide sufficient bandwidth to output that resolution. Your laptop's port is almost certainly HDMI 1.3 or 1.4 (doesn't matter which). The effective bandwidth ceiling for that port is 8.16 Gb/s. The bandwidth required to run 4K@60fps in full 8-bit RGB is roughly 14 Gb/s, well beyond that ceiling.

  1. Change the primary monitor to the TV as he suggested.
  2. In the NVIDIA Control Panel, set the output to a lower resolution to override its attempt to match the TV's native 4K. Make sure the Display Resolution is set to 1920x1080 at 60 Hz (or 120 Hz), and make sure the TV's refresh-rate setting matches.
  3. In "Manage 3D Settings", you can also try forcing Image Scaling onto the GPU via the dropdown menu, either globally or per game in the "Program Settings" tab.
  4. Try running the game in Borderless mode instead of Full Screen if you're still getting issues.
  5. You may also check "Adjust Desktop Color Settings" to override to Reference Mode if the color is weird.
Awesome. Thank you and @My Spot. I'll try these suggestions over the weekend.
 
I haven't been buying anything high end AAA because my current card is a 1070 (and processor is dated too, a 6700). So the more recent stuff I have sunk some time into would be Final Fantasy Remake, Yakuza Dragon, Death Stranding, AC Odyssey, etc

The latest GPUs from both manufacturers are designed for that kind of resolution; it mainly comes down to your budget. Look at the recent JayzTwoCents 4070 Ti video. He shows an endless number of graphs on performance among the newer GPUs at your resolution, with game titles you'll likely play.

Just keep in mind that your current resolution is in the process of becoming standardized. That means the current GPU hardware holds little value when it comes to typical hardware lifespan.
 
The latest GPUs from both manufacturers are designed for that kind of resolution; it mainly comes down to your budget. Look at the recent JayzTwoCents 4070 Ti video. He shows an endless number of graphs on performance among the newer GPUs at your resolution, with game titles you'll likely play.

Just keep in mind that your current resolution is in the process of becoming standardized. That means the current GPU hardware holds little value when it comes to typical hardware lifespan.

Thank you. I'm not sure exactly what the last bit means - except that you think the new cards aren't going to be cutting edge for long?

Regardless, I think I'll be happy either way; the 7900 XTX or the 4080 will serve me fine. I'm going to stick with my current monitor for at least another 5 years, I think, and it's not a fancy 240 Hz refresh; it's an older Acer Predator that I think overclocks to like 100 or 120. That's ample for me. I like my picture to look as pretty as possible even if it means sacrificing frames, so I'd turn on all the extra bells and whistles when it comes to graphics settings.

I'm not really fussed about spending money, I just don't want to waste money (i.e. pay one vendor for something when I can get nearly identical performance cheaper from another vendor).
 
Interesting. I have a 3440 x 1440 monitor. Obviously not 4K, but I'm not sure if the ultrawide demands more power than a non-ultrawide 1440p. Regardless, yeah, the 7900 XTX is the card I want, but I'm now scared given the heat problems posted in this thread LOL. A 4080 on sale would be nice too.
As @Slobodan pointed out, there isn't a problem with the GPU itself; the problem is with the reference cooler design by AMD.
Tom's Hardware said:
A small batch of RX 7900 XTX produced by AMD came with a faulty cooling system that caused the affected cards to overheat. The overheating boards will be replaced, Herkelman promised. The problem only concerns 'Made by AMD' add-in-boards. Radeon RX 7900 XTX devices designed and produced by its partners are not affected.
It's a great GPU, still easily the best value of the new cards. So don't buy a reference variant. Buy one with a cooler made by one of the AIB partners (e.g. Sapphire, PowerColor, XFX, Gigabyte, ASUS), but not a reference variant issued by one of those partners. Example: for Sapphire, you want the "Pulse" or "Nitro" variants, not the reference "Radeon" variant.
https://www.techpowerup.com/review/sapphire-radeon-rx-7900-xtx-nitro/35.html
[Chart: noise-normalized GPU temperature comparison from the TechPowerUp review above]


Or, LOL, since "plenty of people" buy the AMD reference cards just to strip them and put a super-expensive custom liquid GPU cooler kit on from niche luxury retailers like EKWB, yeah, you could totally do that. It's only gonna run you ~$600 and entirely defeat the point of going with AMD in the first place, since at that expense you could simply opt for the 4090 instead, but I'm sure you'll find that appealing, as "plenty" of people have over the years.
 
Thank you. I'm not sure exactly what the last bit means - except that you think the new cards aren't going to be cutting edge for long?

Regardless, I think I'll be happy either way; the 7900 XTX or the 4080 will serve me fine. I'm going to stick with my current monitor for at least another 5 years, I think, and it's not a fancy 240 Hz refresh; it's an older Acer Predator that I think overclocks to like 100 or 120. That's ample for me. I like my picture to look as pretty as possible even if it means sacrificing frames, so I'd turn on all the extra bells and whistles when it comes to graphics settings.

I'm not really fussed about spending money, I just don't want to waste money (i.e. pay one vendor for something when I can get nearly identical performance cheaper from another vendor).
Any of these new GPUs will run 3440x1440 just fine for the next 5 years.
 
Any of these new GPUs will run 3440x1440 just fine for the next 5 years.

Perfect. I booted up Judgment yesterday thinking it wasn't a very demanding game and was only getting 30-40 fps on high settings (not ultra). That doesn't cut it anymore; I need to upgrade lol
 
Perfect. I booted up Judgment yesterday thinking it wasn't a very demanding game and was only getting 30-40 fps on high settings (not ultra). That doesn't cut it anymore; I need to upgrade lol
I had a 1080 Ti up until like August, when they cut all the prices on 3080s. It was still chugging along at 3440 without much issue. In fact, Cyberpunk was the only game I remember having to really dial the settings back on, and that game was as poorly optimized as they come.

I didn't really feel like I needed to upgrade. I just wanted to.
 