
Tech Gaming Hardware discussion (& Hardware Sales) thread

The cynic in me sees planned obsolescence in neon lights. Lose a few sales in the short term but force upgrades quicker.

I'm a little peeved that it's the 4060 Ti that at least gets a 16 GB model - though why? You're not going to be gaming at higher resolutions on that thing.

I could have used that option with my 4070 Ti. I game at 1440p, but even right now in the odd game I'm just barely scraping by with 12 GB. Just running Zelda: Tears of the Kingdom uses a bit over 8 gigs, and my GPU is hardly even working at all while emulating that game lol
 
It'll help some with bottlenecks, but it smacks of classic upselling. Which is...what it is. Nvidia's RTX 40 stack is getting comically crowded and I'm not even sure the finest analytics can save you from leaving sales on the table. Then again, Nvidia doesn't really need its gaming segment, so it can treat gamers poorly for the foreseeable future.
 
Excuse my ignorance, but how did Nvidia go from manufacturing gaming GPUs to pioneering AI development? Is that a byproduct of making GPUs, or what?
 

It is the other way around. NVIDIA was ahead of everyone else on the productivity side of things and decided to spill their AI expertise into their graphics cards. DLSS and those tensor cores are carryovers. NVIDIA is ahead of everyone not really due to hardware, in my opinion, but software. In general, there isn't that much difference between manufacturers, less than 2x, but NVIDIA is more useful in the productivity space due to better software, drivers, and third-party software adoption. For a lot of AI-related tasks, you are more concerned with VRAM than anything else. The reason people believe they are being skimpy with VRAM in comparison to AMD's cards is that their productivity cards, which have 60+ gigs of VRAM, go for $10k plus. They don't want their graphics cards to eat into their productivity market, which has extremely high margins.

My opinion on this is that this is short lived. AMD and Intel are rolling hard into this same AI space. I have a strong suspicion that Intel's graphics card lines were not about video games but about dipping their feet into the graphics card space a generation early to be ready when the big demand for AI-capable cards comes. I think they knew full well ahead of time that it was going to be a money loser. It is looking more and more like graphics cards will be a spin-off of AI tech just due to how much enhancement, upscaling, frame generation and such they do rather than just straight-up rasterization and computation. I think we are maybe 3-4 years from games having graphics that are touched up with AI to the point that they are difficult to distinguish from streaming video. I think the jump is going to be incredible due to piggybacking off the AI industry and there being a lot of people coming of age with disposable money who are willing to spend $10k on a computer.
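To put the VRAM point above in rough numbers, here is a quick back-of-envelope sketch (the model sizes and bytes-per-parameter figures are generic rules of thumb, not tied to any particular card):

# Approximate VRAM needed just to hold a model's weights, by precision.
# Rule of thumb: bytes = parameter count * bytes per parameter; activations,
# caches, and framework overhead come on top of this.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weights_vram_gb(params_billion, precision):
    return params_billion * 1e9 * BYTES_PER_PARAM[precision] / 1e9

for size in (7, 13, 70):  # common open-model sizes, in billions of parameters
    row = ", ".join(f"{p}: {weights_vram_gb(size, p):.0f} GB" for p in ("fp16", "int8", "int4"))
    print(f"{size}B model -> {row}")
# A 7B model in fp16 is already ~14 GB before any overhead, which is why VRAM,
# not raw compute, is usually the first wall for this kind of work.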
 
I think we are maybe 3-4 years from games having graphics that are touched up with AI to the point that they are difficult to distinguish from streaming video. I think the jump is going to be incredible due to piggybacking off the AI industry and there being a lot of people coming of age with disposable money who are willing to spend $10k on a computer.
That's an interesting take, but I think it's going to take longer since games are built first and foremost for consoles. Just see how slow ray-tracing optimization has been in recent years. Unless the PS6 etc. make a serious push for it, I'm not sure we'll see it go mainstream on PC.

I also think people are really overestimating the number of people with $10k in disposable income to blow on a PC.
 
Speaking of Nvidia, Jensen's presentation was hilarious; here are the highlights:



Hilariously bad from a gamer or video card customer point of view. Like many, my thought was that it was a shit show. But when it's all said and done, it's clear now Jensen didn't do this presentation for people like us. It's for the investors and tech speculators. Nvidia's stock has gone up dramatically since the keynote, and its valuation momentarily hit $1 trillion two days ago. It came down a bit, but I fear GPU prices will keep getting worse even if demand stays low, so long as money keeps flowing to the company.
 
God I must be getting old. Back in the day I was all about keeping up with the latest tech and building my own PCs... now I just couldn't care less.

I mean I am also drunk but I don't imagine it would make too much difference.

Wake me when they make OLED gaming monitors with decent resolution.
 
They make 4K OLED gaming monitors.
 

Not the ones I've been looking at, and @Madmick mentioned that the current HDMI/DP standards don't allow for those resolutions on the big OLED gaming monitors yet.
It's because of current port limitations. The QD-OLEDs are 3440x1440: the ultrawide format. They are also true 10-bit color displays. The corresponding 4K ultrawide resolution would be 5040x2160. To run that display even at just 120 Hz would require far more throughput than DisplayPort 1.4 or HDMI 2.1 are capable of carrying. Until DisplayPort 2.0+ becomes a standard (or USB4), you won't see a higher resolution in ultrawides.
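As a rough sanity check on that (the DP 1.4 and HDMI 2.1 payload rates are the published figures; the blanking overhead is an approximation and DSC is ignored):

# Uncompressed bandwidth for a hypothetical 5040x2160 ultrawide at 120 Hz,
# 10-bit RGB, vs. what DP 1.4 and HDMI 2.1 can actually deliver as payload.
def required_gbps(h_px, v_px, hz, bits_per_channel=10, blanking=0.12):
    raw = h_px * v_px * hz * bits_per_channel * 3  # active pixels, RGB
    return raw * (1 + blanking) / 1e9              # assumed ~12% blanking overhead

needed = required_gbps(5040, 2160, 120)
dp14 = 25.92    # Gbit/s payload: 4 HBR3 lanes after 8b/10b encoding
hdmi21 = 42.67  # Gbit/s payload: 4x 12 Gbit/s FRL lanes after 16b/18b encoding
print(f"needed ~{needed:.1f} Gbit/s vs DP 1.4 {dp14} / HDMI 2.1 {hdmi21}")
# ~44 Gbit/s uncompressed exceeds both links, so without DSC (or DP 2.x) the
# panels stay at 3440x1440.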
 
He was talking about ultrawide. Regular 4K they do. Ultrawides have more lengthwise pixels than the standard format and require more throughput.

34in 1440 ultrawide is the way to go anyway. I play on both that and 4K and prefer the ultrawide format to the higher pixel density.
 

I'll have to go and have a look at them in store.

Spending nearly all my time on a 77" OLED at home, the difference between the dark colours and blacks in Diablo 4 on a standard 4k monitor was very noticeable... just looks like grey and shit by comparison.
 
You’re talking about the color contrast not the pixel density. That’s a product of the OLED. So an Ultrawide OLED will have that same contrast at 1440 as your TV
 

Yeah I know. I want a single large, high quality monitor that I can use for gaming and work.

Currently it's either the 4K 32in Dell UltraSharp, which is great for large diagrams etc. for work but looks meh in games, or a gaming monitor that won't have the detail I need for work.

I'm looking at the Alienware 34in OLED for ultrawide gaming but concerned that work will look like arse on it.
 
It won’t look like ass. It’s not like you’re going to see pixelation or anything; 1440 is pretty good.

4K is better all things equal, I won’t argue against that. Overall I would take a 1440 ultrawide OLED over any other package.

The one caveat there is for those newer super big ones that are 45in 1440s. On some of those I’ve heard you can see pixelation up close.
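For what it's worth, the pixel-density math backs that up (the sizes below are just the examples from this thread):

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(w_px, h_px, diag_in):
    return hypot(w_px, h_px) / diag_in

for name, w, h, d in [("32in 4K", 3840, 2160, 32),
                      ("34in 3440x1440 ultrawide", 3440, 1440, 34),
                      ("45in 3440x1440 ultrawide", 3440, 1440, 45)]:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
# ~138 vs ~110 vs ~83 PPI: the 34in ultrawide gives up some sharpness to the
# 32in 4K, but the 45in panel is where the density drops far enough to notice.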
 
I'll have to go and have a look at them in store.

Spending nearly all my time on a 77" OLED at home, the difference between the dark colours and blacks in Diablo 4 on a standard 4k monitor was very noticeable... just looks like grey and shit by comparison.

Embrace the dungeon vibes? Seriously I agree it would be a huge contrast.
 
I'm a little peeved that it's the 4060 Ti that at least gets a 16 GB model - though why? You're not going to be gaming at higher resolutions on that thing.

I could have used that option with my 4070 Ti. I game at 1440p, but even right now in the odd game I'm just barely scraping by with 12 GB. Just running Zelda: Tears of the Kingdom uses a bit over 8 gigs, and my GPU is hardly even working at all while emulating that game lol
The narrow memory bus isn't helping either. One of my Steam friends plays VR, and the narrow bus on the 4070 Ti is crippling his performance with bad micro stutters and huge ms spikes. It's borderline unplayable. The card he had previously (a 3090) had lower averages and higher power consumption but was 10x smoother.
 
I'm surprised reviewers didn't go out of their way a bit more than they did to highlight this weakness for gamers on really high-resolution displays (HDR 4K and 1440p ultrawide gamers). 504 GB/s is a pitiful VRAM throughput for its class. To put what @560ti is talking about here into perspective:

4070 Ti memory throughput, as a percentage of each card's bandwidth:

vs. RDNA 3.0
52.5% vs. RX 7900 XTX
62.6% vs. RX 7900 XT

vs. Intel Arc
98.4% vs. Arc A770
98.4% vs. Arc A750

vs. Ampere
50.0% vs. RTX 3090 Ti
53.8% vs. RTX 3090
66.3% vs. RTX 3080 Ti
66.3% vs. RTX 3080
82.8% vs. RTX 3070 Ti
112.5% vs. RTX 3070
112.5% vs. RTX 3060 Ti
140.0% vs. RTX 3060

vs. RDNA 2.0
87.5% vs. RX 6950 XT
98.4% vs. RX 6900 XT
98.4% vs. RX 6800 XT
98.4% vs. RX 6800
116.6% vs. RX 6750 XT
131.2% vs. RX 6700 XT

vs. Older
49.2% vs. Radeon VII
81.8% vs. RTX 2080 Ti
101.6% vs. RTX 2080 Super
104.1% vs. GTX 1080 Ti
104.1% vs. RX Vega 64
112.5% vs. RTX 2080
112.5% vs. RTX 2070 Super
112.5% vs. RTX 2070
112.5% vs. RTX 2060 Super
112.5% vs. RX 5700 XT
112.5% vs. RX 5700
123.0% vs. RX Vega 56
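For anyone wondering where those bandwidth figures come from, memory bandwidth is just bus width times the effective memory data rate; a quick sketch below (the bus widths and data rates are the published specs as I understand them):

# GPU memory bandwidth (GB/s) = bus width in bits / 8 * effective data rate in Gbps.
# The 4070 Ti's narrow 192-bit bus is what caps it at 504 GB/s despite fast GDDR6X.
CARDS = {               # (bus width in bits, effective data rate in Gbps)
    "RTX 4070 Ti": (192, 21.0),
    "RTX 3090":    (384, 19.5),
    "RTX 3080":    (320, 19.0),
    "RX 7900 XTX": (384, 20.0),
}

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

ref = bandwidth_gbs(*CARDS["RTX 4070 Ti"])
for name, spec in CARDS.items():
    bw = bandwidth_gbs(*spec)
    print(f"{name}: {bw:.0f} GB/s (4070 Ti is {ref / bw:.1%} of this)")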
 
Desktop GPU Sales Lowest in Decades: Report
 
Just bought a 4070 for my new build (it came with Diablo 4 for free). Any suggestions on what Intel CPU to pair it with? I mostly just game and do some streaming. I play at 2K and don't really see myself going to 4K anytime soon. I have an i5 2500 right now that has served me well and is still going strong. i9s seem like overkill.
The i5 13400 runs cool but isn't as powerful as the i5 13600K. Or should I go for an i7? I don't have a problem spending more, or spending on an air/liquid cooler either.
Any suggestions? Motherboard suggestions are welcome too.
 