
Tech Gaming Hardware discussion (& Hardware Sales) thread

I don't even know why I try with you. You'd argue the sun isn't yellow just to be "right".
I've asked you several times to clarify your reasoning, and I've tried to divine it. The only other possible reasoning I could deduce from what you wrote was that GCN aged well because it endured for so long, but that is also faulty because:
  1. My specific "Fine Wine" post showed the remarkable improvement of the RX 480/580 over their lifespans; they are identical cards on GCN 4.0, with the latter more aggressively clocked, and they launched a mere 10 months apart.
  2. If GCN were monolithic, as you wrongly believe, and its age accounted for its driver improvements, then the RX 580 should have already been perfected at launch, since it came more than five years after GCN debuted.
  3. Presuming your assumption is that RDNA won't endure as long as GCN: you don't know how long AMD will keep using RDNA. We only have their roadmap through 2022, which promises RDNA 3.0.

You can persist in flinging impotent ad hominems, or you can admit you put forward a claim that, for whatever reason, you believed to be valid yet isn't logical.
 
What's the deal with AMD GPU drivers? I always hear they're trash, but nobody can articulate what that means. I'm on the fence between a 3080 and a 6900 XT (if I can get either). I had AMD/ATI back in the day, but have been with team green for a while now. Thoughts?

tl;dr - I wouldn't worry about it.
 
I see the raging debate about AMD vs Nvidia. If pure gaming is what you want, go AMD; if you use open-source software to design and model parts, Nvidia is the way to go.

You really don't need to go into pipeline depth, clock speed, volumetric modeling, specular shading, bump maps, light reflections and the rest; it comes down to support. CUDA carries over to Iray and real-time modeling, and there is nothing like it on AMD at this level.

If AMD offered something similar I would jump on it, but right now only Nvidia gives me a decent solution in this regard. I am sticking with the easiest option.
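
To make that concrete, here is a rough sketch (my own illustration, not Iray's actual code) of how a CUDA-based renderer typically probes for usable GPUs through the CUDA runtime API. On a machine without an Nvidia card the probe fails or reports zero devices, which is exactly why these tools can't offload to a Radeon:

```cpp
// Hypothetical GPU probe, in the style of a CUDA-based renderer.
// Build with nvcc, or with a C++ compiler linked against cudart.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        // No NVIDIA GPU (or no CUDA driver): renderer falls back to CPU.
        std::printf("No CUDA device available: %s\n",
                    err == cudaSuccess ? "zero devices"
                                       : cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);  // capabilities the renderer checks
        std::printf("GPU %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

Renderers like Iray build their device selection on top of this CUDA stack; that ecosystem is the "support" I mean.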
 
I already made my point; no need to make it again.
All you're trying to do is find some tiny little thing you can pick apart to get a "win".
 
($100) ASUS Radeon R9 380 2GB STRIX -R9380-DC2OC-2GD5-GAMING Video Graphics Card GPU

eBay. Seller has a 99.1% positive rating with over 5K sales. Two units left. Condition: New.

Note that this is a special 2GB VRAM edition of the card, so it's only recommended for shoestring-budget builds. It beats the hell out of other $100 options: it's faster than the GTX 1650 (~$150), though with half the VRAM, and roughly equal to an RX 580.
https://benchmarks.ul.com/hardware/gpu/AMD+Radeon+R9+390+review
 
Cyberpunk 2077 will only be DirectX 12 compatible, which means you must be on Windows 10 or 7 to play it. In addition, ray tracing will be enabled on all DXR (DirectX Raytracing) compatible GPUs.

CD Projekt RED has just confirmed that Cyberpunk 2077 will be compatible exclusively with the DirectX 12 API, and more precisely with the DirectX 12 Ultimate API, in order to take advantage of ray tracing on DXR-compatible GPUs such as Nvidia's latest GeForce RTX 3000 graphics cards or AMD's Radeon RX 6900 XT, 6800 XT and 6700 XT.

"We chose DX12 for two main reasons. First, it is the standard API for Xbox platforms. As the game is also available on Xbox One, we naturally wanted to implement it as quickly as possible. Secondly, it is the birthplace of DXR, and since we had planned to invest in DXR very early on, the choice of DX12 was quite straightforward," explains Marcin Gollent, the Polish studio's lead graphics programmer.

https://www.phonandroid.com/cyberpu...tibles-dxr.html/amp?__twitter_impression=true
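
For anyone wondering what "DXR compatible" means in practice: below is a minimal C++ sketch (my illustration, not CDPR's engine code) of how a DX12 application can ask the GPU whether ray tracing is supported at all.

```cpp
// Minimal DXR capability check via D3D12 (Windows SDK required).
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the DX12 baseline level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No DirectX 12 capable GPU found.");
        return 1;
    }

    // OPTIONS5 carries the ray tracing tier introduced alongside DXR.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            std::puts("DXR supported: ray tracing can be enabled.");
        else
            std::puts("DX12 card, but no DXR: ray tracing stays off.");
    }
    return 0;
}
```

Cards reporting tier 1.0 or higher (GeForce RTX, Radeon RX 6000) can take the ray-traced path; older DX12 GPUs still run the game, just without ray tracing.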
 
@Slobodan Now that the thermal paste has settled, my 5800X is idling at 36 degrees with an AIO liquid cooler.
 
Well I guess all those people who were keen to play Cyberpunk 2077 on their Windows 98 computer are shit outta luck.
 
I'm surprised Win 7 is supported. It's 11 years old at this point and no longer receiving updates; it's time to move on.
 
Very nice. Got the text from AusPost earlier this morning; mine will be arriving today.

Can't wait!
 
Can you come and tell that to my dad, who at this point still pines for the days of Win XP.
 