Tech Gaming Hardware discussion (& Hardware Sales) thread

LOL, obviously you haven't been to the NVIDIA subreddit in the last month. The #2 top post during that time:


The memes have been gold.
[meme images]

Titled "just installing my new 4090"...
[images]
 
Everything I have read has been that Intel is offering very capable cards at a great bang-for-the-buck price. Not high end, but a lot of performance for the price. I was sort of hoping they would compete at the high end, but maybe that wasn't realistic with this being their first generation of cards.
It's a good deal in comparison to Nvidia but not AMD.
 
I'm in the market to replace my 1070 Ti, but the cards have too many shortcomings right now when it comes to DX11 and older games. I'm sure they'll get it sorted in due time.
For example, my main game is R6 Siege, and with the A770 I'd get worse performance than with my current 1070 Ti. When you can get the RX 6600 for $250 right now and not have to deal with the driver issues, the A770 and A750 don't make sense.
[R6 Siege benchmark chart]
 
It depends on the game. You're cherry-picking something the card is currently bad at because of immature drivers, or that it isn't designed to excel at. I don't think this is a good test, because it just pushes the card until it chokes. No one is playing a game at 600 frames per second. I get tired of people picking old games and using some ultra-high frame rate as a benchmark.
 
It's a good deal in comparison to Nvidia but not AMD.

I have a strong suspicion that the biggest difference between NVIDIA and AMD is DLSS, which needs to be trained to even work for a game.
 
I'm not cherry-picking at all; Siege is a game I play every day. Why would I buy a card that gets worse performance? This is the real world to me.
Sometimes I also play CSGO, the most played game on Steam, and the performance there is abysmal.
Why would you get an Arc GPU when you can pick up an RTX 3060 or RX 6700 for around the same price and not have to worry about performance tanking in certain games?

edit: To further show Intel Arc's underperformance:
Valorant, one of the most popular FPS games out there. Chart from Hardware Canucks:
[Valorant benchmark chart]
 
Watch the GN review. The A770 had many of the same issues as the card they tested months ago, even though this is the release product. In Steve's own words, they're retailing an "unfinished product". They didn't even get image output with several of the monitors in the studio, and with one monitor the picture was a tiny box in the middle of the screen. That's how green the Intel drivers are.

Performance in DX11 is atrocious. Even Intel doesn't deny this. They devote all their energy to talking about DX12 performance, which is still pitiful considering the A770 is only on par with the RX 6600 at 1080p and the RX 6600 XT at 1440p.

The lone silver lining for the Arc cards is that in the long run, maybe 3-4 years down the road, if Intel catches up on driver sophistication, the A770 should perform on par with the RX 6800 and RTX 3070 Ti (which start at $510 and $610, respectively, today), and it will carry 8GB-16GB of VRAM to go along with that. But that's a maybe, and it's a long way away. Expect suffering until then, and RX 6600-ish performance along the way.
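
To put that value argument in numbers, here's a minimal cost-per-frame sketch. The RX 6800 / RTX 3070 Ti prices are the ones quoted above; the A770 price and all the FPS figures are placeholder assumptions you'd swap for real benchmark averages:

```python
# Rough cost-per-frame comparison. RX 6800 / RTX 3070 Ti prices are from
# the post above; the A770 price and every FPS figure are assumptions --
# substitute averages from a review you trust.
cards = {
    "A770":        {"price": 349, "fps": 75},   # assumed price and FPS
    "RX 6800":     {"price": 510, "fps": 110},  # assumed FPS
    "RTX 3070 Ti": {"price": 610, "fps": 105},  # assumed FPS
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['fps']:.2f} per average frame")
```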

It's an awful buy. Nobody should recommend it. It's good that Intel is getting into the game, but older, wiser, responsible gamers should steer kids away from those cards. It's truly an alpha generation. Don't be an alpha tester.
 
Disappointing that over 20 years later Intel still can't unfuck the drivers for its own video cards.
 
I think part of the issue Intel is facing is that many games have been optimized for AMD, even more so than for Nvidia, due to the popularity of AMD's APUs/SoCs in console gaming, with PlayStation and Xbox making up a sizable portion of gaming today. Almost every major PC port has a console variant, so it's in developers' best interest to ensure their titles are optimized for those platforms. The thing I'm really still interested in is how the A770 performs with its encoders and decoders for streaming and for exporting 4K and 8K video.
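
If anyone wants to poke at the media engine themselves, here's a minimal sketch. It assumes an ffmpeg build with Intel QSV (oneVPL) support and current Arc drivers, and the filenames are placeholders:

```python
# Sketch: drive Arc's hardware AV1 encoder through ffmpeg's QSV path.
# Assumes ffmpeg was built with QSV/oneVPL support and that input.mp4
# exists; adjust the bitrate and filenames to taste.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",    # decode on the GPU where possible
    "-i", "input.mp4",    # placeholder source file
    "-c:v", "av1_qsv",    # hardware AV1 encoder exposed by Arc
    "-b:v", "12M",        # target bitrate; tune for your content
    "-c:a", "copy",       # pass the audio through untouched
    "output_av1.mkv",     # placeholder output file
]
subprocess.run(cmd, check=True)
```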
 
I hope you are wrong but you are probably going to be more right than wrong.
 
Yeah, that's how I feel about it. Obviously it's an alpha generation (it won't sell a ton and won't get recommended much, for obvious reasons), and the DX11 performance should be better, but at the same time I see a bunch of snobs/elitists on other forums crying about unrealistic scenarios to make it seem worse than it really is.

For example, one guy on another forum was crying (I'm talking 10+ posts on the same subject) that it was only getting 250 FPS in CSGO when a card in its class should be getting closer to 400. Really? Are you actually being serious? Literally nobody who buys a lower-midrange bang-for-your-buck card will give a shit that it's "only" getting 250 FPS. Obviously it should perform better, but these people are kind of delusional about actual practicality (the average buyer will never tell the difference; only elitists staring at charts will).
 
I wonder how much more powerful the 4090 will be than the top-tier 3000 series. I have a 3080 Ti and I love it, but they're saying the jump this gen is higher than usual.
 
I think the embargo ends on the 11th (or 12th?) so you'll find out very soon!
 
When there are hundreds of frames a second it really doesn't make a lick of difference, but there are some really unoptimised games out there, and the more you like 'retro' gaming, the more likely you are to find shit-tier performance on your new card. It's also worth looking at the 1% lows, which can often inform your experience more than the average frame rate: you'll notice stutters, but you won't notice 120 vs 140 FPS, for example.
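
If you're curious how those metrics fall out of raw data, here's a minimal sketch computing average FPS and the 1% low from per-frame render times. The frame-time list is invented for illustration, and tools differ slightly in how they define the 1% low; this uses the average of the slowest 1% of frames:

```python
# Derive average FPS and the "1% low" from per-frame render times.
# The frame times below (in milliseconds) are invented for illustration;
# a real capture would come from a tool such as PresentMon.
frame_times_ms = [6.9, 7.1, 7.0, 7.2, 6.8, 7.0, 25.0, 7.1, 6.9, 30.0] * 100

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Average FPS over the slowest 1% of frames (definitions vary by tool).
worst = sorted(frame_times_ms, reverse=True)
slowest = worst[: max(1, len(worst) // 100)]
low_1pct_fps = 1000 * len(slowest) / sum(slowest)

print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
```

With these made-up numbers the average comes out around 90 FPS while the 1% low is about 33 FPS, which is exactly the kind of stutter an average hides.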
 
Crucial MX500 2TB internal SATA SSD for $120 at Best Buy. edit: it's $106 on Amazon
Sennheiser HD 599 for $90 on Amazon; that's an insane deal. You can use an adapter like this and pair it with a V-MODA BoomPro mic to create an insanely good headset. Another mic option to pair it with is the ModMic.


 
Yeah monster card.

Obviously it will be niche due to the high price and power draw, but 45-58% better than the 3090 Ti at 4K, all while using less power, is not shabby. Hopefully the rest of the lineup (along with RDNA3) will follow suit: similar power draw to the previous gen but significantly better performance.
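
To sanity-check what that means for efficiency, a quick perf-per-watt sketch. The uplift range is from the post above; both power draws are rough assumptions rather than measured figures:

```python
# Back-of-the-envelope perf-per-watt gain. The uplift range comes from
# the post above; both power figures are assumptions, not measurements.
uplift_low, uplift_high = 1.45, 1.58  # 4090 vs 3090 Ti at 4K
power_3090ti_w = 450                  # assumed average gaming draw (W)
power_4090_w = 420                    # assumed average gaming draw (W)

for uplift in (uplift_low, uplift_high):
    gain = uplift / (power_4090_w / power_3090ti_w)
    print(f"{uplift:.0%} relative perf -> {gain:.2f}x perf per watt")
```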
 
Ryzen 5600 for $130 at Newegg
6650 XT for $265 at Newegg
 