LOL, obviously you haven't been to the NVIDIA subreddit in the last month. The #2 top post during that time was one titled "just installing my new 4090"...
The memes have been gold.
Everything I have read has been that Intel is offering gaming cards with a lot of bang for the buck. Not high end, but a lot of performance for the price. I was sort of hoping they would compete at the high end, but maybe that wasn't realistic with this being their first gen of cards.
I'm in the market to replace my 1070 Ti, but the cards have too many shortcomings right now when it comes to DX11 and older games for me. I'm sure they'll get it sorted in due time.
For example, my main game is R6 Siege. With the A770, I'd get worse performance than my current 1070 Ti. When you can get the RX 6600 for $250 right now and not have to deal with the driver issues, the A770 and A750 don't make sense.
It's a good deal in comparison to Nvidia but not AMD.
It depends on the game. You are cherry picking something it is currently bad at because of bad drivers, or that it isn't designed to excel at. I don't think this is a good test because it just drives the card up until it chokes. No one is playing a game at 600 frames per second. I get tired of people picking old games and using some ultra-high frame rate as a benchmark.

I'm not cherry picking at all; Siege is a game I play every day. Why would I buy a card that gets worse performance? This is the real world to me.
Watch the GN review. The A770 had many of the same issues as the previous card they tested months ago, despite this being the release version. In Steve's own words, they're retailing an "unfinished product". They didn't even get image output with several monitors they have in the studio. With one monitor, the picture was a tiny box in the middle of the screen. That's how green the Intel drivers are.
Performance on DX11 is atrocious. Even Intel doesn't deny this. They devote all their energy to talking about DX12 performance, which is obviously still pitiful considering the A770 is on par with the RX 6600 at 1080p and the RX 6600 XT at 1440p.
The lone silver lining for the Arc cards is that, in the long run, maybe 3-4 years down the road, if Intel has caught up on driver sophistication, the A770 should perform on par with the RX 6800 and RTX 3070 Ti (which start at $510 and $610, respectively, today), and it carries 8GB-16GB of VRAM to go along with that. But that's a maybe, and that's a long way away. Expect suffering until then, and RX 6600ish performance along the way.
It's an awful buy. Nobody should recommend it. It's good that Intel is getting into the game, but older, wiser, responsible gamers should steer kids away from those cards. It's truly an alpha generation. Don't be an alpha tester.

I think part of the issue Intel is facing is that many games have been optimized for AMD even more so than Nvidia, due to the popularity of AMD's APUs/SoCs in consoles, with PlayStation and Xbox making up a sizable portion of gaming currently. Almost every major PC port has a console variant, so it's in developers' best interest to ensure their titles are best optimized for those platforms. The thing I am still really interested in is how the A770 performs with its encoders and decoders for streaming and exporting 4K and 8K video.
I wonder how much more powerful the 4090 will be than the top-tier 3000s? I have a 3080 Ti and I love it, but they're saying the jump with this new gen is bigger than usual.
Yeah, that's how I feel about it. Obviously, it's an alpha generation (won't sell a ton and won't get recommended much for obvious reasons) and the DX11 performance should be better, but at the same time I do see a bunch of snobs/elitists on other forums who will cry about unrealistic scenarios in order to make it seem worse than it really is.
For example, one guy on another forum was crying (I'm talking 10+ posts on the same subject) that it was only getting 250 FPS in CS:GO when a card in its class should be getting closer to 400. It's like, really? Are you actually being serious? Literally nobody who buys a lower-midrange, bang-for-your-buck card will actually give a shit that it's only getting 250 FPS. Obviously it should perform better, but at the same time these people are kind of delusional in terms of actual practicality (the average buyer will never tell the difference, only elitists staring at charts).
No, they're not.