> says the one whose hypothetical also doesn't exist - which was the point. hence, unicorns.
> Please, learn what words mean before you use them.
Okay, so there is a lot of ignorance to tackle here.
First, the 6800 XT is capable of ray-tracing. In fact, it leverages more physical ray-tracing "intersection engines" than the 3080 possesses. It's the previous generation of AMD GPUs that lacked ray-tracing. That doesn't necessarily mean AMD's implementation is equal to NVIDIA's -- they're definitely playing catch-up, as the recent Vulkan tests show -- but the card can physically ray-trace.
Second, you didn't comprehend the thrust of his argument. He's not just saying that RTX 2060 cards are too weak to sustain a playable framerate with ray-tracing turned on, as Linus forecast. He's saying that because ray-tracing is calibrated to these generations of GPUs -- where even the most powerful cards deliver ray-tracing performance that this "future" you speak of will consider pitiful -- the effect is typically so underwhelming that it isn't worth enabling at all. He used his own RTX 2060 Mobile's almost immediate irrelevance as the basis for that:
"So, technically, yes, you will be able to switch real-time ray tracing on in the future RTX titles with a 3000-series card, but in practice that might result in unplayable frames or severely compromised visuals due to low resolutions, at which playable frames with RTX on can be achieved."
Third, meanwhile, even for the RTX 3090, which nobody is buying because you're not the only one who can't afford it, there are incredibly few titles where it is even worth turning ray-tracing on. He testifies there are only four titles he has played where he felt it was worth it. Furthermore, he adds that turning RTX on still destroys performance even on this mighty card. Take DLSS out of the equation, because it isn't apples-to-apples, and let's examine Watch Dogs: Legion. At 4K with RTX on and DLSS disabled, running an i9-10900K + RTX 3090, you get... 32 fps. So what's the point? Sure, you could drop to 1440p, but now you're making a choice between a more satisfying 60-ish fps at 4K without ray-tracing and a similar framerate at 1440p with ray-tracing. Now imagine that AMD (or NVIDIA themselves) offered a competitor card at the same price point without ray-tracing but with significantly superior rasterization performance. Which card would you buy? Suddenly, the "future-proofing" concern from the second point above favors the latter card, because it's only a matter of time before games at 1440p start to break the RTX 3090 in the same way.
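For anyone who wants to sanity-check that trade-off, here's a back-of-envelope sketch. It assumes the game is purely GPU-bound and that framerate scales linearly with pixel count, which is an optimistic simplification, so treat the output as a ballpark, not a benchmark:

```python
# Back-of-envelope: if Watch Dogs: Legion runs at ~32 fps at 4K with RTX on,
# what might 1440p with RTX on look like? Assumes a purely GPU-bound workload
# where framerate scales linearly with pixel count (an optimistic simplification).

PIXELS_4K = 3840 * 2160      # ~8.3 million pixels
PIXELS_1440P = 2560 * 1440   # ~3.7 million pixels

fps_4k_rtx_on = 32           # the i9-10900K + RTX 3090 figure cited above

scale = PIXELS_4K / PIXELS_1440P               # 2.25x fewer pixels at 1440p
fps_1440p_estimate = fps_4k_rtx_on * scale

print(f"Pixel-count ratio: {scale:.2f}x")
print(f"Naive 1440p estimate with RTX on: ~{fps_1440p_estimate:.0f} fps")
# ~72 fps -- in the same ballpark as the 60-ish fps you'd get at 4K with
# ray-tracing off, which is exactly the either/or choice described above.
```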
Fourth, finally, to backtrack a bit: your comparison of the RTX 3080 to the RX 6800 XT within my framing of raw rasterization versus ray-tracing (and other more advanced features) is pointless. The RTX 3080 possesses superior rasterization performance anyway. In fact, I'm getting triggered every time I see a headline about the impending RTX 3080 Ti being a welcome arrival for NVIDIA as a "competitor" to the RX 6900 XT. I think...
...wtf are you talking about? Have these airheads looked at benchmarks? The RTX 3080 is crushing the RX 6900 XT in game benchmark roundups. The 3070 Ti will probably be its peer when it arrives. Anyone who spends $999 on the 6900 XT over the $699 3080 is a moron. NVIDIA doesn't need a competitor to the RX 6900 XT. AMD needs to drastically lower their pricing.
> Tough week to be an NVIDIA fanboy.

They still catching flak for the shit with Gaming Unboxed or whatever?
> I still need to sell my 1070ti and I can't believe they are still going for over $250 on ebay. I see some selling for $300. This is all preowned, not even new. Didn't expect to get that much back for it.

It's insane how fast it went from a buyer's market to a seller's market after Black Friday. It flipped overnight. Take advantage. As soon as they are able to supply the new GPUs, this will end.
> It's insane how fast it went from a buyer's market to a seller's market after Black Friday. It flipped overnight. Take advantage. As soon as they are able to supply the new GPUs, this will end.

Hardware Unboxed and Bitwit were the first two channels I stumbled on when looking up ways to upgrade my computer.
That's because stock vanished overnight. Every price-tracking chart of specific GPUs looks like the following. These are all relevant previous gen GPUs (RTX 2000 and RTX 2000 Super series):
[Four price-tracking charts for RTX 2000 and RTX 2000 Super series GPUs]
> They still catching flak for the shit with Gaming Unboxed or whatever?

This isn't going to go away for a long time. NVIDIA just tried to strongarm one of the most reliable game hardware reviewers in the world into saying whatever they want them to say. They literally issued the ultimatum from their executive of global marketing: "change your editorial direction". In other words, "Cover NVIDIA exactly the way we want you to cover us -- be our marketing puppet -- or we'll cut you off."
Fuck them. Arrogant scumbags.
Even if they aren't Top 5 by traffic, Techspot / Hardware Unboxed are Top 5 by reputation, I'd wager. They're easily Top 5 by usefulness. Nobody but TechPowerUp puts in as much time on the drudgery of just benchmarking games. That's boring, tedious, monotonous work. That's why none of the others want to do it, and yet it's the most necessary function these tech journalists provide to us.
Not sure if this is the right thread for this or not, but does anyone here know much about HDMI switches? My TV only has 3 HDMI ports, but I have more devices than that: my cable box/PVR/DVR (whatever the correct nomenclature happens to be), my PS5, a Nintendo Switch I'm getting the kids, and an Xbox One and a PS3 that I might want to hook up here while I move my PS4 to my other TV (kind of redundant having my PS4 and PS5 hooked up to the same TV imo). Lastly, I may occasionally want to hook up my NES Classic or SNES Classic, not to mention my laptop, so I'm basically short by two, if not 3 HDMI ports on my TV.
Prior to 2012 I was using an old 36" CRT from 2005 that had just one HDMI port, so I used an HDMI switch to alternate between my cable box, my 360, and my PS3. I mean, it worked, but I don't know if it would have compromised signal strength or anything, not that I would have noticed on a 36" CRT that I think only went up to 720p, if not less.
Anyone have any comments or recommendations?
You can get a 1x2, 1x4, or 1x8 from this company. This is the HDMI 1.4 unit (retaining the bandwidth of the previous version, HDMI 1.4 added support for 4096 × 2160 at 24 Hz, 3840 × 2160 at 24, 25, and 30 Hz, and 1920 × 1080 at 120 Hz):
www.amazon.com/Splitter-Certified-Duplicate-Supports-Resolutions/dp/B00FBZ02C0/
Their older HDMI 1.3 unit averages 4.5 stars across over 8K reviews. It only supports HDMI 1.3 bandwidth (1920 × 1080 at 120 Hz or 2560 × 1440 at 60 Hz). That would be fine for everything but the PS5 and the Xbox One, assuming the latter is a One S (4K for media playback only) or a One X. If it's the original Xbox One instead, then the 1.3 unit will suffice for it, too.
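If you're wondering why the 1.4 unit tops out at 4K 30 Hz while still managing 1080p at 120 Hz, a rough bandwidth calculation makes it clear. The sketch below assumes uncompressed 8-bit RGB and HDMI 1.3/1.4's 10.2 Gbps TMDS ceiling (about 8.16 Gbps of actual video data after 8b/10b encoding), and it ignores blanking intervals, so real requirements run a bit higher:

```python
# Rough check of which video modes fit in HDMI 1.3/1.4's bandwidth budget.
# Assumes uncompressed 8-bit-per-channel RGB (24 bits/pixel); blanking
# intervals are ignored, so real-world requirements are somewhat higher.

MAX_VIDEO_GBPS = 8.16  # 10.2 Gbps TMDS limit minus 8b/10b encoding overhead

modes = [
    ("1920x1080 @ 120 Hz", 1920, 1080, 120),
    ("2560x1440 @  60 Hz", 2560, 1440, 60),
    ("3840x2160 @  30 Hz", 3840, 2160, 30),
    ("3840x2160 @  60 Hz", 3840, 2160, 60),  # why the PS5 wants HDMI 2.0+
]

for name, w, h, hz in modes:
    gbps = w * h * hz * 24 / 1e9  # raw pixel data in gigabits per second
    verdict = "fits" if gbps <= MAX_VIDEO_GBPS else "does NOT fit"
    print(f"{name}: {gbps:5.2f} Gbps -> {verdict}")
```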
If your TV has an HDMI 2.0+ port, plug your PS5 directly into that port. Get the 1x8 version of the above unit and plug it into a different/lesser HDMI port. You'll have one TV HDMI port left over, and tons of HDMI inputs for the other devices.
Expounding further, if you use a soundbar or some other digital audio system, it will want the HDMI ARC port. If your TV does have HDMI 2.0+, hopefully it doesn't only have one such port, because that one would probably be the ARC port (TVs usually only have one). You definitely want the 2.0+ port for the PS5. In that case, most soundbars can run via an analog port instead, but you lose the ability to control the speakers with the TV remote and have to juggle a second soundbar remote. Of course, if you have a universal remote, managing multiple remotes isn't an issue.
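And just to sanity-check the port math against your device list, here's a quick sketch. The device names come straight from your post (PS4 excluded, since it's moving to the other TV), and the counts match the 1x8 unit and 3-port TV described above:

```python
# Quick port-budget check for the setup above: PS5 straight into the TV's
# HDMI 2.0+ port, the 1x8 unit into a second (lesser) HDMI port.

TV_HDMI_PORTS = 3
SWITCH_INPUTS = 8  # the 1x8 unit

# Devices from the original post, minus the PS4 (moving to the other TV).
devices_on_switch = [
    "cable box/PVR/DVR", "Nintendo Switch", "Xbox One", "PS3",
    "NES Classic", "SNES Classic", "laptop",
]

tv_ports_used = 2  # one for the PS5, one for the 1x8 unit itself
print(f"TV HDMI ports left over: {TV_HDMI_PORTS - tv_ports_used}")           # 1
print(f"Spare inputs on the unit: {SWITCH_INPUTS - len(devices_on_switch)}")  # 1
```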
Isn’t a splitter the opposite of a switch? For when you have multiple TVs or monitors?
Yes, apologies, those terms are sometimes used interchangeably, and I didn't look this over too closely. You're obviously after a unit that works in the reverse and is fine only handling a single output at a time.

> tl;dr - splitters degrade the signal. a switch doesn't (well, not to a degree that should matter).

No, that's not the case with the device I posted. It doesn't split the bandwidth. It just receives the signal and then re-broadcasts it simultaneously across the chain.
> No worries. Any recommendations for switches?

This unit is the most reviewed on Amazon:
https://www.amazon.com/Switch-Awakelion-Premium-Switcher-Support/dp/B06WV5YJ6H/
Definitely opt for HDCP 2.2 support as suggested. You may not need it if you use the PS5 as your primary media device for Blu-ray playback, 4K streaming, etc., since your TV's HDMI port will support that natively, but I overlooked that you may have bought the digital-only edition. I also forgot that some people use their Xbox One as the intermediary device for their cable box, so if you do that, then you'll also need it. It doesn't look like it carries a premium, either, so there's no reason not to get it.
> Tough week to be an NVIDIA fanboy.

Pretty much F them. But I am still buying my 3080 next week. AMD is still behind in hardware, and much, much further behind in software. Also, I have a G-Sync monitor, so...