Here is Steve discussing everything that has happened.
Even though they walked it back, the damage has been done. The community and reviewers won't forget this.
Here is Steve discussing everything that has happened.
I’ve been watching him for a couple of years now. He definitely has a different style of educating people.
He and Louis Rossmann got me into electronics and helped me understand the basics.
i can't fathom how they thought this was a good idea. and the "apology" was crap, too.
I only made my post because, after I started posting about simulators, Linus built a $6,000 or so motion simulator rig with a NextGen motion seat.
The timing is funny because COVID-19 has created a number of motion simulator sales. I posted about this guy and Linus does a confab with him lol is Linus secretly a Sherdogger da da daaaa! Lol

Here is Steve discussing everything that has happened.
Here is why I believe in the spectacular RTX future, but not in the RTX 3000-series GPUs...
However, what I do not believe is that Nvidia’s RTX 3000-series GPUs will be a meaningful part of that future, and I have a few reasons to believe that.
The first one has to do with the fact that my very own RTX 2060 (mobile) is simply not capable of using any ray-tracing effects in any modern title. Yes, technically, I can turn RTX on, but practically I cannot, and that is a very important distinction to consider. So, technically, yes, you will be able to switch real-time ray tracing on in the future RTX titles with a 3000-series card, but in practice that might result in unplayable frames or severely compromised visuals due to low resolutions, at which playable frames with RTX on can be achieved. So, practically, no, in all likelihood you will not be able to enjoy real-time ray tracing in future titles, because of practical considerations such as super low frame rates and/or resolutions. Granted, this will still be an RTX experience, but like Cyberpunk 2077 on a base PS4, it will not be a truly enjoyable one....
My second reason...there are still only four RTX titles that I have personally played that seem to offer good RTX implementations. These are the following: Control, Cyberpunk 2077, Battlefield 5 and Watch Dogs: Legion. But what all of these titles have in common is that all of them tank the frame rates with RTX on, while less demanding RTX titles provide only a modest visual uplift.
So to summarise, the reason why I do not believe in the RTX 3000-series GPUs is that they offer neither enough rasterisation performance nor enough RTX performance to run very demanding ray-tracing-focused titles of the next 3 to 5 years. They are struggling in Cyberpunk 2077 and Watch Dogs: Legion now, which is why it is very hard to imagine these GPUs being able to take full advantage of the RTX titles of the future.
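His "technically on, practically off" distinction is easy to make concrete. Here is a toy sketch of it (Python; the fps numbers and the 60/30 fps thresholds are my own illustrative assumptions, not figures from the article):

# Toy version of the "technically on vs. practically playable" distinction.
# All numbers here are illustrative assumptions, not the article's data.
PLAYABLE_FPS = 60        # assumed comfort threshold
TOLERABLE_FPS = 30       # assumed bare-minimum threshold

measured = {             # hypothetical RTX-on results for one card
    "4K": 22,
    "1440p": 38,
    "1080p": 55,
}

for resolution, fps in measured.items():
    if fps >= PLAYABLE_FPS:
        verdict = "practically playable"
    elif fps >= TOLERABLE_FPS:
        verdict = "technically on, barely tolerable"
    else:
        verdict = "technically on, practically off"
    print(f"{resolution}: {fps} fps -> {verdict}")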
Ultimately, this whole imbroglio shined a light on the fact that the importance of certain "features" unique to NVIDIA is overstated by NVIDIA fanboys. Here's a recent article by Stanislav Kokhanyuk of Notebookcheck, prompted by the scandal:
I believe in ray tracing, but I do not believe in Nvidia's RTX 3000-series GPUs
That's precisely what Linus forecast as his main critique of the RTX 2000 series when it launched. It was excessively expensive, and chiefly due to a feature that wouldn't practically benefit gamers. Consumers were being involuntarily enlisted as capital investors in the alpha stage of a hardware technology.
Meanwhile, the same is true of DLSS 2.0, not because of any hardware deficiency, but simply because not enough titles support it to meaningfully counter superior rasterization performance and value. Rasterization performance is still king. These other features, such as ray-tracing, VRS (variable rate shading), mesh shaders, sampler feedback, and deep learning, are only significant when a GPU already holds the rasterization advantage.
This is what I tried articulating to the console crowd concerning the Xbox Series X and PS5. Sony confirmed the PS5 lacks both VRS and sampler feedback. Yet if the PS5 held the raw processing advantage, those missing features wouldn't be terribly meaningful for Microsoft, because most of them are designed simply to overcome rasterization limits. But the XSX holds the rasterization advantage already. Over the course of these consoles' lives, the advantages will mature in Microsoft's favor as developers become more familiar with the tools and integrate them into games.
Conversely, where they don't mean much is when you're comparing an AMD card with a large rasterization advantage over an NVIDIA card at the same price point that happens to enjoy these DX12 Ultimate features. Who cares about ray-tracing if you can't run it? Who cares about improved framerates that merely close the gap?
I have already explicated this wisdom several times in this very thread. I'm grateful this debacle drew more eyes to the underlying truth NVIDIA wants to brainwash the more easily duped into ignoring. Hopefully this fosters education across the gaming community.
- GPU War > NVIDIA vs. AMD -- The $400 Price Point King: RX 5700 XT vs. RTX 2060 Super
- Why Superior Raw Rasterization Capability Ages Better in GPUs (the case of the RX 480/580 vs. GTX 1060)
- Buyer's Guide 101: What to Look for in a GPU
- Techspot UPDATED Comparison: RX 5700 XT vs. RTX 2060 Super
- Techspot UPDATED Frame-per-dollar analysis (simplified): RX 5700 XT vs. RTX 2060 Super
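For anyone who wants to run the simplified frame-per-dollar arithmetic from that last link themselves, a minimal sketch looks like this (Python; the prices and average fps below are placeholders for illustration, not the Techspot numbers):

# Simplified frames-per-dollar comparison. Prices and fps are placeholder
# values only; plug in real benchmark averages yourself.
cards = {
    "RX 5700 XT":     {"price_usd": 400, "avg_fps": 100},  # placeholder
    "RTX 2060 Super": {"price_usd": 400, "avg_fps": 95},   # placeholder
}

for name, c in cards.items():
    fpd = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {fpd:.3f} average fps per dollar")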
...for Quake II RTX alone.
Quake II is one of my favourite games of all time.
i loved it, too. but i can't fathom using it as any reason to buy hardware. haha
I get the point of that article, but I think it's a bit of a stretch to say the 3080 or 3090 are struggling with ray tracing. If I could afford a 3080, I would certainly get it over a 6800 XT for the ray-tracing advantages alone, and while ray tracing is limited right now, a person would still be more "future proof" with a 3080 than a 6800 XT say two years from now, as long as the industry doesn't give up on ray tracing.
Lord knows that if I do follow through on my plan to build a gaming PC next year, I'd be looking at a 3070 or 3080 over a 6800 or 6800 XT for Quake II RTX alone.

Okay, so there is a lot of ignorance to tackle here.
First, the 6800 XT is capable of ray-tracing. In fact, it leverages more physical ray-tracing "intersection engines" than the 3080 possesses. It's the previous generation of AMD GPUs that lacked ray-tracing. That doesn't necessarily mean that AMD's implementation is equal to NVIDIA's; they're definitely playing catch-up, as the recent Vulkan tests show, but the card can physically ray-trace.
Second, you didn't comprehend the thrust of his argument. He's not just saying that RTX 2060 cards are too weak to sustain a playable framerate with ray-tracing turned on, as Linus forecast. He's saying that because ray-tracing is calibrated to these generations of GPUs, where even the most powerful cards offer ray-tracing performance that this "future" you speak of will consider pitiful, the effect is typically so underwhelming that it isn't even worth the feature. He used his own RTX 2060 Mobile's almost immediate irrelevance as the basis for that:
"So, technically, yes, you will be able to switch real-time ray tracing on in the future RTX titles with a 3000-series card, but in practice that might result in unplayable frames or severely compromised visuals due to low resolutions, at which playable frames with RTX on can be achieved."
Third, meanwhile, even for the RTX 3090, which nobody is buying, because you're not the only one who can't afford it, there are incredibly few titles where it is even worth turning ray-tracing on. He testifies there are only four titles he has played where he felt ray-tracing was worth it. Furthermore, he adds that turning RTX on still destroys the performance even with this mighty card. Take DLSS out of the equation, because it isn't apples-to-apples, and let's examine Watch Dogs: Legion. At 4K, with RTX on and DLSS disabled, running an i9-10900K + RTX 3090, you get... 32 fps. So what's the point? Sure, you could drop to 1440p, but now you're making a choice between a more satisfying 60-ish fps at 4K without ray-tracing and a similar 1440p framerate with ray-tracing. Now imagine that AMD (or NVIDIA themselves) offered a competitor card at the same price point without ray-tracing but with significantly superior rasterization performance. Which card would you buy? Suddenly, the "future-proof" concern from the second point above favors the latter card, because it's only a matter of time before games at 1440p start to break the RTX 3090 in the same way.
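To put the 4K-versus-1440p trade-off in raw numbers, here is the back-of-the-envelope arithmetic (Python; the 32 fps figure is the Watch Dogs: Legion result cited above, and the linear pixel scaling is a rough ceiling of my own assumption, not a measurement):

# Back-of-the-envelope: how much more pixel work 4K demands than 1440p.
pixels_4k = 3840 * 2160      # 8,294,400 pixels per frame
pixels_1440p = 2560 * 1440   # 3,686,400 pixels per frame
ratio = pixels_4k / pixels_1440p
print(f"4K renders {ratio:.2f}x the pixels of 1440p")   # ~2.25x

# If performance scaled linearly with pixel count (a rough ceiling,
# since other bottlenecks intrude), the 32 fps 4K RTX-on result would
# land near the 4K RTX-off experience once you drop to 1440p:
fps_4k_rtx_on = 32           # figure cited above
print(f"Naive 1440p RTX-on estimate: {fps_4k_rtx_on * ratio:.0f} fps")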
Fourth, finally, to backtrack a bit, your comparison of the RTX 3080 to the RX 6800 XT within the framework of my conceptual theme pitting raw rasterization against ray-tracing (and other more advanced features) is purposeless. The RTX 3080 possesses superior rasterization performance. In fact, I'm getting triggered now every time I see a headline about the impending RTX 3080 Ti being a welcome arrival for NVIDIA as a "competitor" to the RX 6900 XT. I think...
...wtf are you talking about? Have these airheads even looked at benchmarks? The RTX 3080 is crushing the RX 6900 XT in game benchmark roundups. The 3070 Ti will probably be its peer when it arrives. Anyone who spends $999 on the 6900 XT over the $699 3080 is a moron. NVIDIA doesn't need a competitor to the RX 6900 XT. AMD needs to drastically lower their pricing.
I know the AMD 6000 cards are capable of ray tracing, I never said they weren’t.
Okay, my mistake. You never mentioned an awareness of their ray-tracing capability either, but your post appeared to imply that they didn't have it.
Lastly, 6900xt was marketed as the 3090 killer while 6800xt was marketed as the 3080 killer. I haven’t looked at the benchmarks that closely beyond ray tracing performance. In any event I don’t appreciate being called a fucking moron.
I called gamers who would buy the 6900 XT for $999 over the 3080 for $699 morons. Do you need your own posts read back to you?
pssh, what are YOU talking about? i can't find a $699 3080 anywhere.
You can't find a $999 RX 6900 XT, either. Fuck "Ampere" and "Big Navi". They should have issued a joint announcement that they were collectively naming all their cards the "Keyser Soze" generation.
that's my point, though. i can't talk shit on anyone buying any current gen gpu for retail since they're unicorns. if you sight one, might as well get it.
Except that point fails the hypothetical. If you buy a 6900 XT simply because it was available, you didn't buy it over the 3080 as an alternative at a lower price.
You're creating a new hypothetical to rebut an existing hypothetical to which it bears no relevance.