Tech Gaming Hardware discussion (& Hardware Sales) thread

w8kgvre86yee1.png
 
So the minimums in CS:GO look good, but overall, this release is disappointing as hell. I saw this Reddit post several days ago portending this:
tivre2hpxbee1.jpeg


average-fps-1920-1080.png

  • 5090 vs. 4090 = +11.6%
  • 4090 vs. 3090 = +58.6%
  • 3090 vs. 2080 Ti = +41.2%
    (the below is not from the above benchmark, but from TechPowerUp's launch review...)
  • 2080 Ti vs. 1080 Ti = ~+23.0% (<-- remember, the RTX 2xxx series was panned as the worst GPU value improvement ever seen from NVIDIA when it released)
Maybe one could offer a defense, however reaching, if those Tensor cores used for DLSS had been superstacked, but as you can see from the chart above, even the increase there is pitiful compared to past bumps. NVIDIA is so dominant now that they can read the massive blowback on Reddit complaining about the price in the weeks since they announced it, and respond with...
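For anyone sanity-checking those bumps: each figure is just relative average FPS, (new / old - 1) x 100. A trivial sketch with made-up FPS numbers, not the chart's actual data:

```python
def gen_uplift_pct(new_avg_fps, old_avg_fps):
    """Generational uplift as a percentage of the older card's average FPS."""
    return (new_avg_fps / old_avg_fps - 1) * 100

# Made-up FPS values for illustration only:
print(round(gen_uplift_pct(223.2, 200.0), 1))  # 11.6
```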
geez-thats.gif
 
What did Linus do to piss everyone off again? I haven't been keeping up in the tech space, but I keep seeing videos popping up about him, including from Louis Rossmann.
 

well in terms of rasterization this makes it look like the 5090 isn't anywhere close to being twice as powerful as my 4070ti despite being well over twice the price so this just affirms my decision to hold out until the 6000 series before i build a new rig. still looks like a stud card though. i just can't justify the price tag compared to what i already have.
 
Gamers Nexus did their "we're suing Honey!" video and took the opportunity to shit on LTT again. Honey was, at one time, an LTT sponsor, but LTT dropped them when they learned about the link hijacking--this was years ago, before anybody knew Honey was also ripping off consumers, not just creators. LTT made a post on their forum at the time explaining why they were dropping Honey, but Steve, with his powers of 20/20 hindsight, swears they didn't do enough, and in his video took a Linus quote out of context as his 'proof' that Linus=Bad. Linus had enough and did a pseudo-C&D on his podcast thingy, but worded it in a way that left an opportunity to bury the hatchet. Unsurprisingly, Steve basically said, 'nah, fuck you,' and released a bunch of personal texts between him and Linus as more Linus=Bad. Reactions are, unsurprisingly, pretty tribal. Rossmann is flinging shit because he's got some co-channel venture with GN starting up.
 
A shame so many marks still trust Linus after the water block fiasco.
 
So, Linus found out that a sponsor he took money from was engaging in shady business practices, and instead of making a video warning people about it, they made a forum post.
When your main audience is on video, no, a forum post that probably 1% of those viewers visit isn't enough. All he had to do was bring it up on the WAN Show; they could have made a clip from it and posted it.
 
From what I can tell, they only make a stink about dropping sponsors when the sponsor is doing something with consumer impact, like when they dropped anker/eufy (or however the fuck it's spelled) for storing security camera images improperly and without user consent. At the time, Honey was only known for the link hijacking, which wasn't impacting consumers, only creators. Yeah, they probably still should have been more vocal about it, but it sounds like it was pretty well known in the creator space at the time, and LTT weren't even the ones who discovered it; they heard about it elsewhere.

EDIT: apparently there is a big hour-long Rossmann video I wasn't aware of. I had only heard about him making loaded tweets earlier in the week. So my understanding of the whole situation appears to be out of date.
 
This is interesting

1737798157442.jpeg
 
On paper the 5090 has over double the 4070 Ti's pipelines of every kind, and the throughputs are all more than double, sometimes more than triple. In reality that doesn't necessarily mean it will offer double the frames or benchmark performance.
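To put a rough number on "on paper": a quick ratio from the published CUDA core counts (the counts below are my own numbers pulled from memory of NVIDIA's spec pages; double-check them before relying on this):

```python
# CUDA core counts as I recall them from NVIDIA's spec pages (verify yourself):
cores = {"RTX 4070 Ti": 7680, "RTX 4090": 16384, "RTX 5090": 21760}

# Raw pipeline ratio, 5090 vs. 4070 Ti -- well over double on paper,
# but raw ratios rarely translate 1:1 into frames.
ratio = cores["RTX 5090"] / cores["RTX 4070 Ti"]
print(round(ratio, 2))  # 2.83
```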

*Edit* Well, damn, I had such a pretty head-to-head table to share, as I saw it before I actually hit post. Then it became a jumbled pile of shit.
 
Oh come on, just LOOK at the difference in graphical fidelity and detail DLSS 4 makes with these new cards.

cp 2077 5090.webp


That's not something you can easily quantify with numbers.

Day 0 purchase for me. I might have to take out a mortgage but it is so worth it.
 

Sarcasm aside, since DLSS 4 will run on all RTX cards (2000, 3000, 4000, and 5000 series), this is where I thought there might be the most massive increase over past cards. But nope. Not with the AI-accelerating or ray-tracing cores, either.

GPU                 RTX 2080 Ti   RTX 3090   RTX 4090   RTX 5090
Tensor Cores        544           328        512        680
Ray-Tracing Cores   68            82         128        170
 
Welp, I'm sad to say I don't think my laptop is gonna make it much past my current trip. It's an i7-1065G7 CPU with 16 GB RAM and Intel Iris graphics (an integrated GPU that I think is total shit by today's standards).

It's done its duty, and I really liked it since it's the only OLED laptop I've ever had, but i) the battery is getting fried: a battery report puts it at half its original capacity at full charge, and it tends not to calibrate right (it shuts off without warning once it gets to about 30% left, and more recently it seems to jump to full charge at some point while charging), and ii) only one of the USB charging ports works now, with the other being finicky.

I was originally thinking of a Zenbook S14 (portability is really important to me); the new ones have an Ultra 7 258V processor with Arc graphics, though I have no idea what kind of raw performance that gives, say relative to a Steam Deck. But I also see their Zephyrus line seems very compact for a dedicated-GPU laptop, which might have the portability I want, with a lower-tier (though superior to Arc?) card like a 4060.

Anybody have any thoughts on what the Intel Arc would be capable of versus a dedicated card? I figure with a 14-inch screen you don't need toooooo much power to do some gaming. But I'm curious whether a title like, say, Like a Dragon: Infinite Wealth could run respectably on Arc laptop graphics.
 
GPU                         RTX 4060 Laptop   Arc 140V         Iris Plus G7 64EU   Steam Deck OLED
                                              (Ultra 7 258V)   (i7-1065G7)
Pixel Bandwidth (GP/s)      90.7              65.6             8.4                 25.6
Texel Bandwidth (GT/s)      181.4             131.2            33.6                51.2
Memory Bandwidth (GB/s)     256.0             102.4*           25.6*               88.0
Processing Power (TFLOPS)   11.60             4.20             1.08                1.64

*Memory bandwidth for these iGPUs varies depending on system RAM speed. For the Arc 140V, I used the typical LPDDR5X speed in the Zenbook S14 of 6400. For the Iris Plus I quoted the throughput at DDR4-3200, but of course 2133 is the baseline for DDR4, and 2666 was also common for that generation of laptops.
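If you want to sanity-check those starred figures, peak memory bandwidth is just transfer rate times bus width. A quick sketch; the bus widths are my assumptions (128-bit for the Arc 140V's LPDDR5X, a single 64-bit channel for the Iris Plus figure, and the Steam Deck's quad 32-bit LPDDR5-5500):

```python
def mem_bandwidth_gbps(transfer_mtps, bus_width_bits):
    """Peak bandwidth in GB/s: (transfers per second) x (bytes per transfer)."""
    return transfer_mtps * (bus_width_bits / 8) / 1000

print(mem_bandwidth_gbps(6400, 128))  # 102.4  (Arc 140V, LPDDR5X-6400)
print(mem_bandwidth_gbps(3200, 64))   # 25.6   (Iris Plus, DDR4-3200, one channel)
print(mem_bandwidth_gbps(5500, 128))  # 88.0   (Steam Deck OLED, LPDDR5-5500)
```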

There are a lot of empty fields, but for how the above translates to real-world performance, you can't do better than this:
RTX 4060 Laptop vs. Arc 140V vs. Steam Deck

The one thing the above doesn't account for is DLSS, and especially for lower-powered GPUs like those in laptops, it's becoming even more significant than it already is for desktop GPUs. The 4060 will have a stark advantage there.
 
I have the Zephyrus G14 (2024 model) with the 4060, I would recommend it
 
@Slobodan @Madmick

Thanks for the quick replies. Amazing data, Madmick. I'm surprised at how much those numbers favor Arc compared to the Steam Deck. I'm leaning toward the Zenbook then, because I'm assuming those numbers mean it can be decent enough with games at low settings (I'm not looking for a laptop to replace my desktop for gaming), it would have notably better battery life, and I do like touch screens for surfing when using a laptop in bed, which I do quite a bit. I'd also assume the Zenbook to be quieter, though I could be wrong. The Zephyrus does look cool as fuck tho; it doesn't look beefy and seems legit very portable for the juice inside.

Edit: I just timed my current laptop, and it goes from full charge to sudden off in more or less exactly 2 hours. That's with just casual web browsing and very little video playback (maybe 5 to 10 minutes' worth; the rest was browsing Reddit, Sherdog, etc.). I think modern laptops would be leaps and bounds better even if my battery were as good as it was on Day 1, which it obviously isn't lol.
 
If you're in the US, I'd keep an eye on the G14 at Best Buy. Clearance for the last generation 4070 G14 was last week, the 4060 will get clearance pricing between now and Best Buy's Q2 restock.
 
Just shows the Steam Deck's age. The OLED didn't change the base chipset, and the original came out in Feb 2022, but it's built on RDNA 2, an architecture that debuted in Oct 2020. The MSI Claw A1M debuted last year with an Arc chip that is also significantly more powerful. Although, when I revisited that, I thought I'd forgotten a modifier for LPDDRX memory that meant the figure must be halved, so I fixed that in the chart above. ***disregard this, I got it right the first time*** That's one thing Valve went out of their way to do with the Steam Deck: Valve ordered a custom quad-channel configuration for its RAM so the VRAM transfer rate would be faster.

Also be mindful that the original Arc cards looked incredibly impressive on these spec sheets due to their hardware pipelines, but fell far below what you'd expect for real-world performance due to driver inferiority and other issues. However, as the recent release of the desktop Xe2 cards showed, Intel has closed most of that gap (the Arc 140V is an Xe2-architecture iGPU).
 