Tech Gaming Hardware discussion (& Hardware Sales) thread

Neat. Any details on what, if anything, it's improved so far?
These are the games currently supported by it, but I haven't seen any confirmation that the XSX/XSS versions of these games have been patched to enable that support on the console:
https://www.amd.com/en/technologies/radeon-software-fidelityfx-supported-games

The point of DLSS and FidelityFX is to yield much higher framerates while, in the best case, not sacrificing image quality. They do this by rendering each frame internally at a lower resolution and then intelligently reconstructing it up to the output resolution, along with other tricks to use the GPU more efficiently (e.g., while flying in space through a meteor field, spending lower polygon counts and lower-resolution textures on the smaller, less important rocks, and reserving the higher polygon counts and higher-resolution textures for the rocks that need them). It's a similar idea to the video compression in efficient formats like mp4 versus less efficient ones: if the encoder can recognize that a mostly static shot of someone's face at night is repeating the same black pixels in every frame, it can skip the redundant information and shrink the file dramatically without a visible loss in quality. DLSS/FFX apply the same principle to rendering: don't spend GPU time on detail the final image doesn't actually need.
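To make that codec analogy concrete, here's a toy back-of-the-envelope sketch (not real encoder code; the frame size and the "changed pixels" fraction are made-up numbers purely for illustration):

```python
# Toy illustration of the compression analogy: if most pixels in a dark,
# mostly static shot are identical from frame to frame, an encoder only needs
# to store the ones that actually changed. All numbers are assumptions made
# up for illustration, not measurements.

W, H = 1920, 1080
total_pixels = W * H
changed_fraction = 0.03  # assume only ~3% of pixels differ from the previous frame

stored_full_frame = total_pixels
stored_delta_frame = int(total_pixels * changed_fraction)

print(f"Storing every pixel        : {stored_full_frame:,} values per frame")
print(f"Storing only changed pixels: {stored_delta_frame:,} values per frame "
      f"({stored_delta_frame / stored_full_frame:.0%} of the work)")
```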

In theory, in the future, DLSS/FFX could even be better than native resolution, by yielding lower frametimes (i.e. lower latency) or even superior resolution/contrast/shading in parts of the screen, while simultaneously maintaining the same fps or better.

DLSS was a running joke during its first version, since it was notorious for doing nothing but blurring a bunch of games, but DLSS 2.0 changed that, and it's been quite the feather in NVIDIA's cap ever since (when implemented). Nevertheless, it's still debated whether it harms the image. DLSS has up to 4 modes that I've seen: (1) Ultra Performance, (2) Performance, (3) Balanced, and (4) Quality. The only one most PC gamers find worth the fuss is "Quality" mode, the one intended to sacrifice no image quality, or as little as possible. Across most titles it has yielded 5%-15% higher fps without a noticeable hit to image quality. However, there have been some titles where the difference is staggering. Call of Duty: Warzone is the most recent example: it notched a 35% improvement to average fps in Quality mode.
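For reference, here's a quick sketch of the internal resolutions those modes correspond to at a 4K output target. The per-axis scale factors are the commonly cited figures for DLSS 2.x, so treat the exact numbers as approximate rather than official:

```python
# Commonly cited per-axis render scales for the four DLSS 2.x modes (approximate;
# individual games can tweak them). Prints the internal resolution each mode
# would render at before being upscaled to a 4K output.

MODES = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

OUT_W, OUT_H = 3840, 2160

for mode, scale in MODES.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    print(f"{mode:<18} renders ~{w}x{h}, reconstructed to {OUT_W}x{OUT_H}")
```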

The biggest downside to DLSS 2.0 is that, like the DX12 and Vulkan APIs, or for that matter RTX ray-tracing effects, it has received support in a pitiful number of titles. To date, not counting the System Shock demo and the obscure VR game, there are only 34 confirmed DLSS 2.0 titles. It launched on March 26, 2020, which works out to roughly 7 titles added every 3 months. Bear in mind, too, that this includes games previously supported on the original DLSS version that have since been patched to 2.0. Counting all DLSS titles, not just DLSS 2.0, going back to the original February 2019 launch, that's only about 22 games added per year. And don't count on native engine support as a silver lining for optimism: each game still has to be individually integrated to get support, anyway.
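For anyone who wants to check the math, here's the back-of-the-envelope version. The launch dates and title counts are the ones quoted above; the "today" date is an assumption of roughly when this was posted:

```python
# Back-of-the-envelope check on the adoption-rate figures above. Launch dates
# and title counts are the ones quoted in the post; the "today" date is an
# assumption of roughly when this was written.
from datetime import date

dlss2_launch = date(2020, 3, 26)
dlss1_launch = date(2019, 2, 1)   # approximating "February 2019"
today = date(2021, 6, 5)          # assumed posting date

dlss2_titles = 34                 # confirmed DLSS 2.0 titles, per the post
all_dlss_titles = 51              # all DLSS titles, per the post

months_since_dlss2 = (today - dlss2_launch).days / 30.44
years_since_dlss1 = (today - dlss1_launch).days / 365.25

print(f"DLSS 2.0: ~{dlss2_titles / months_since_dlss2 * 3:.1f} titles per 3 months")
print(f"All DLSS: ~{all_dlss_titles / years_since_dlss1:.1f} titles per year")
```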

One might be tempted to predict that support on the Xbox Series X will usher in greater support for FFX, but I'm skeptical. DX12 support on the Xbox One did nothing to improve its adoption. I was optimistic that with the launch of the new consoles DX12 would finally be embraced, but adoption continues to lag. Meanwhile, the XSX/XSS are in third place and aren't market drivers for developers. So it really comes down to AMD; their hustle might make the difference. The fact that FFX became a thing a month ago and they already have 40 games speaks to that (vs. NVIDIA's 51 games for DLSS after 2 1/2 years).

What could it mean? Well, it could be huge. Assuming the same ~10% fps increase in games, or more, stacked on top of the XSX's existing processing power advantage, it would likely mean that 4K output on the Xbox, while the PS5 lags at 1440p-1800p, becomes the rule for supported next-gen multiplat games.

The console that will truly benefit, though, is the Xbox Series S. In "Performance" or "Balanced" mode, it might suddenly be able to match the PS5 instead of requiring a nerf. It still wouldn't be the PS5's equal in terms of image quality, but most critically, it could mean the lesser console won't hamstring more ambitious development, and won't suffer the struggle many have forecast for it in the years to come as games become that much more demanding.
 
Anyone know if LG NanoCell TV tech works well with a PC?

I heard that OLED and QLED show terrible AVI and MPEG quality

But this 100 Hz LG TV I found has 4K upscaling and HDMI 2.1
 
Had a 3080 Ti added to my cart on the EVGA site (despite the site running like garbage), but when I finally got logged into my ELITE account to finish the checkout, it wasn't in the cart anymore.
 



Apparently a Best Buy in Auburn, WA already has folks camping out for the 3080 Ti..... a card no one asked for.
 
Anyone know if LG NanoCell TV tech works well with a PC?

I heard that OLED and QLED show terrible AVI and MPEG quality

But this 100 Hz LG TV I found has 4K upscaling and HDMI 2.1

It's not one I've seen many people using as a monitor. Lately it's the 48-inch LG CX or C1 OLEDs that I've seen a lot of people use as a monitor (myself included, a couple of times, with my 65CX).

Rtings has said, though, that the Nano90 is a good choice to use as a PC monitor.

https://www.rtings.com/tv/reviews/lg/nano90-2021
 
For any Sherdoggers just looking to game, this is one of the best prebuilt deals I've seen in the last year. To be clear, this is only if you just want to game; it isn't a prebuilt well-suited to future expansion or upgrading.


About the deal (from the TS who flagged the sale):

About the PC:


As I've highlighted in past threads about HP/Lenovo builds, the RAM is 3200 MHz, but unlike the Intel prebuilds, it should actually run at this speed since this is a Zen 3 CPU (I didn't check the motherboard specs). Regardless, the cheapest route is to buy the 8GB option and then upgrade it yourself with a second, identically spec'd Crucial stick. It's as simple as plugging the stick in. It can be 8GB or 16GB, but not greater. This is strongly recommended so that the memory runs at dual-channel speeds.
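If you want a rough sense of why that matched second stick matters, here's the theoretical-bandwidth arithmetic (64-bit channel width and 3200 MT/s are standard DDR4 figures; real-world gains are smaller, but still very noticeable in games):

```python
# Why the matched second stick matters: theoretical DDR4 bandwidth roughly
# doubles in dual-channel mode. 64-bit channel width and 3200 MT/s are standard
# DDR4 figures; real-world gains are smaller, but still significant in games.
TRANSFER_RATE_MT_S = 3200   # DDR4-3200
BYTES_PER_TRANSFER = 8      # 64-bit memory channel

single_channel_gbs = TRANSFER_RATE_MT_S * BYTES_PER_TRANSFER / 1000
dual_channel_gbs = single_channel_gbs * 2

print(f"Single channel: {single_channel_gbs:.1f} GB/s theoretical peak")
print(f"Dual channel  : {dual_channel_gbs:.1f} GB/s theoretical peak")
```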

Otherwise, it's worth considering the CPU upgrade to the 6-core Ryzen 5 5600G for +$100. I wouldn't recommend the 5700G. Too expensive for too little gained, and it's a bit overkill for an RTX 3060 when a future GPU upgrade is off the table.

The SSD upgrade to 512GB (+$40) is also reasonable while the 1TB upgrade is exorbitant.

Finally, if you want more storage, the 1TB HDD (+$49) isn't too bad a premium. If you want more than that, though, definitely purchase an HDD separately and install it yourself: you can score a 2TB for $55 instead of the +$90 they're charging, and the best value currently is a 6TB 5400RPM HDD for $130. It's also a very simple job; you slide the drive into a bay, then connect it with a cable. Most of these are bare drives, so don't forget a SATA III cable:
https://pcpartpicker.com/products/i...ort=ppgb&page=1&A=1900000000000,8000000000000
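Here's the cost-per-terabyte math behind that "best value" call (prices as quoted above; they'll obviously drift over time):

```python
# Cost-per-terabyte comparison for the drive options mentioned above.
# Prices are the ones quoted in the post, not current market prices.
options = {
    "HP 1TB HDD upgrade (+$49)": (1, 49),
    "HP 2TB HDD upgrade (+$90)": (2, 90),
    "Retail 2TB HDD ($55)":      (2, 55),
    "Retail 6TB 5400RPM ($130)": (6, 130),
}

for name, (tb, price) in options.items():
    print(f"{name:<28} ${price / tb:,.2f} per TB")
```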
 
If anyone gives a crap



Oh, the 3070 Ti drops June 10th, apparently.

The RTX 3070 Ti, meanwhile, is coming next week on June 10th. This one will be a bit friendlier on the wallet, although with prices set to start at £529 / $599, it's still going to set you back a fair chunk of change. This puts it right in the middle of the MSRPs of the RTX 3070 and RTX 3080, with the RTX 3070 costing £100 / $100 less, and the RTX 3080 costing £100 / $100 more.
 
Funny how pretty much everyone's unhappy with NVIDIA except @PEB



I get that NVIDIA is a company and needs to make money, but I can't get over how tone-deaf they seem in this whole card shortage.
 



I get that NVIDIA is a company and needs to make money, but I can't get over how tone-deaf they seem in this whole card shortage.


It's not even just tone-deaf. The fine print for some of the giveaways/leaks showed that within a couple of days (maybe even one day) of the announcement, the MSRP was $200 less. I.e., they marked it up at the last minute, putting the 3080 Ti at a seemingly nonsensical price point vs. their MSRPs for the 3080/3090.

I.e., tone-deaf would be putting out more higher-end cards instead of lower-end ones like a 3050. This is just lazy, last-minute greed... on top of the already tone-deaf release.
 
It's not even just tone-deaf. The fine print for some of the giveaways/leaks showed that within a couple of days (maybe even one day) of the announcement, the MSRP was $200 less. I.e., they marked it up at the last minute, putting the 3080 Ti at a seemingly nonsensical price point vs. their MSRPs for the 3080/3090.

I.e., tone-deaf would be putting out more higher-end cards instead of lower-end ones like a 3050. This is just lazy, last-minute greed... on top of the already tone-deaf release.
Like I said, I get they have to make money... but holy Hell, NVIDIA, everything about this is putting you on par with the likes of EA for money-grubbing tactics.

There's a store near me that sells computer parts, and I might see if they have something like a newer 10-series or 16-series card. I have a GTX 1070 right now, and I swear the store had a 1660 Super for sale. The store is used more as a repair shop that happens to sell gaming stuff because the owner is a gamer.
 
Okay guys I want some advice.

As I've stated before, I'm in the market for a huge upgrade. I'm actually looking to do custom water cooling; I've been reading and researching for quite a while now. Going to get the Corsair Hydro X kit, as well as a GPU block and an extra 360mm rad. So this is serious stuff.

In keeping with that theme, I want serious hardware. I've got two questions, and I'm hoping all of you (but realistically probably only @Madmick) will have the desire to answer my blathering.

1) It's my understanding that the Asus ROG Strix OC RTX 3090 is THE most powerful card. Is this correct?
2) I'm hearing that next year AMD's Zen 4 is going to be monstrously better, even accounting for standard generational power creep. Same goes for their Radeon GPUs. I'm looking to play games at the highest settings around 1440p. I know you can always make a case for waiting in the PC realm, but should I wait? If a game is on XBSX or PS5, I want it to run better on my rig.

As I said before, money is not an issue, but I do want to make only ONE investment every so often (my last one was in 2015).
 
I know you can always make a case for waiting in the PC realm, but should I wait?

For the GPU market, don't expect a performance jump in the next three-plus years.

The CPU market is an unknown with AMD. They've made grand claims before, only delivering on that promise once, with their late 2020 CPU releases.
 