Tech Gaming Hardware discussion (& Hardware Sales) thread

From what I hear, both Intel's and AMD's next-generation CPUs will be a substantial jump in performance, very unlike their previous generational jumps. I believe Apple pushed AMD and Intel toward more radical changes to their architectures instead of just tweaking performance; we're talking a 30 to 50 percent performance jump versus a 10 to 15 percent jump. A lift like that requires radical and expensive revisions to the chip designs.
Intel's next gen (13th gen Raptor Lake) is just a refresh with more cache and more e-cores, so it won't be as significant a jump as 11th gen to 12th was. AMD's next gen will be a huge jump from the previous gen, which is expected since it will be at least two years between gens plus a die shrink.
 
The slides they showed today, with an 8-game sample, averaged 34% faster than a stock 1650, so that should put it somewhere in third-party RX 580/590 range performance-wise.
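A quick back-of-the-envelope check on that (the relative-performance index numbers for the RX 580/590 below are rough figures from memory, not anything from AMD's slides, so treat them as assumptions):

# Rough sanity check in Python: map a +34% average uplift over a GTX 1650
# onto a typical 1080p relative-performance ladder (GTX 1650 = 100).
# The RX 580/590 index values are approximate, not official figures.
baseline_1650 = 100
rx_580 = 130            # assumed: RX 580 roughly 30% faster than a 1650
rx_590 = 140            # assumed: RX 590 roughly 40% faster than a 1650

new_card = baseline_1650 * 1.34   # AMD's claimed 34% average uplift
print(new_card)                   # 134.0 -> lands right between the 580 and 590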

It's basically a more power-efficient version of the 5500 XT with slightly higher performance. I really wasn't expecting much different, to be honest (the market sucks, but I wasn't expecting this card to be magic).

Well, reviews are out and it's trading blows with a 5-year-old RX 580. What a joke.
 
Well, reviews are out and it's trading blows with a 5-year-old RX 580. What a joke.
Yeah, when I saw the slides along with the spec sheet, I pretty much assumed it would land somewhere around an RX 580 :(
 
Yeah, when I saw the slides along with the spec sheet, I pretty much assumed it would land somewhere around an RX 580 :(

Holy crap, I hadn't watched Hardware Unboxed's review yet. They're showing it falling behind an RX 570 4GB in games like Siege, F1 2021, and RE Village. They could have kept making RX 580s instead of wasting 6nm fab capacity on these.
yoda-sad.gif
 
I hear the new DDR5 RAM is hardly any better than the current DDR4.

Maybe that will change with time.

As of now I think I'm going to wait six or so months for the 4000 series GPUs, then make a big purchase: thinking an i9-12900K paired with something like a 4080. Hopefully, if I'm buying a whole rig, getting the video card won't be mission impossible.
 
Holy crap, I hadn't watched Hardware Unboxed's review yet. They're showing it falling behind an RX 570 4GB in games like Siege, F1 2021, and RE Village. They could have kept making RX 580s instead of wasting 6nm fab capacity on these.
yoda-sad.gif
Yeah, when I read the spec sheet I knew it wasn't going to be good. Budget cards have had an awful history for about a decade now.

It's been so bad over the last decade that I remember the Tahiti version of the 7870 XT (which sold for $135 in fall 2013) was literally faster than 90% of the sub-$200 graphics cards that came out over the next 7 years (GT 1030, GTX 1050 non-Ti, GTX 750, GTX 650/650 Ti, GTX 950, RX 460/560).
 
Holy crap, I hadn't watched Hardware Unboxed's review yet. They're showing it falling behind an RX 570 4GB in games like Siege, F1 2021, and RE Village. They could have kept making RX 580s instead of wasting 6nm fab capacity on these.
yoda-sad.gif
Yeah, when I read the spec sheet I knew it wasn't going to be good. Budget cards have had an awful history for about a decade now.

It's been so bad over the last decade that I remember the Tahiti version of the 7870 XT (which sold for $135 in fall 2013) was literally faster than 90% of the sub-$200 graphics cards that came out over the next 7 years (GT 1030, GTX 1050 non-Ti, GTX 750, GTX 650/650 Ti, GTX 950, RX 460/560).
I thought this was pretty funny, holy shit lol

291kkbqw4oc81.png
 
I thought this was pretty funny, holy shit lol

291kkbqw4oc81.png
It's stupefying. The only way this GPU wouldn't be a joke is if the MSRP were $90-$130 and it had a 75W-or-lower TDP, so it could run on slot power alone. That's what analogous cards offered over the past decade from both NVIDIA and AMD (from the GTX 1050 back to the Radeon HD 6670). There's no excuse for it to be $200 even in this market. That's pure gouging directly from AMD. They know it will inflate well above the launch MSRP they name.

This is the worst GPU launch since the also egregiously overpriced RTX 2000 series, and the $349 RTX 2060 in particular (which was similarly shortchanged on the ray-tracing cores needed for adequate ray-tracing performance given its pricing and resolution/framerate targets).

But this is far, far worse.
 
It's stupefying. The only way this GPU wouldn't be a joke is if the MSRP were $90-$130 and it had a 75W-or-lower TDP, so it could run on slot power alone. That's what analogous cards offered over the past decade from both NVIDIA and AMD (from the GTX 1050 back to the Radeon HD 6670). There's no excuse for it to be $200 even in this market. That's pure gouging directly from AMD. They know it will inflate well above the launch MSRP they name.

This is the worst GPU launch since the also egregiously overpriced RTX 2000 series, and the $349 RTX 2060 in particular (which was similarly shortchanged on the ray-tracing cores needed for adequate ray-tracing performance given its pricing and resolution/framerate targets).

But this is far, far worse.

I was watching The Full Nerd today, and they argued it's a good card for kids who want to build a Fortnite PC. But by the time you're done buying everything else for the build, you're well beyond the price of an Xbox Series S.
 
https://www.tomshardware.com/news/geforce-rtx-3050-defeats-radeon-rx-6500-xt-leaked-benchmarks

While we don't have official figures yet, the early leaks suggest NVIDIA's about to shit on what AMD just gave us with that RX 6500 XT. The leaked benchmarks include 3DMark's most popular tests: Fire Strike 1080p, Time Spy, and Time Spy Extreme.

The RTX 3050 is actually outpacing the GTX 1660 Ti by ~5% in the leaks (a $280 card from 2019, not a $200 card from 2017 like the RX 580, or a $170 card like the RX 570). Meanwhile, unlike the 1660 Ti, it also offers DLSS and ray tracing. Additionally, it carries 8GB of VRAM rather than 6GB (or the pitiful 4GB in the 6500 XT), although it appears there will be both 4GB and 8GB versions. So NVIDIA isn't bewilderingly nerfing features. Most critically of all, it doesn't have the halved memory bus width, so if the info we already have on the 3050 is accurate, it will carry a whopping 55.6% VRAM bandwidth advantage over the 6500 XT (and nearly double the FLOPS).
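For anyone wondering where that 55.6% comes from, here's the quick math, assuming the rumored 128-bit bus with 14 Gbps GDDR6 on the 3050 versus the 6500 XT's 64-bit bus with 18 Gbps GDDR6 (rumored specs, not official figures):

# Peak memory bandwidth in GB/s = bus width (bits) x data rate (Gbps) / 8
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

rtx_3050 = bandwidth_gb_s(128, 14)    # 224.0 GB/s (rumored 128-bit @ 14 Gbps)
rx_6500_xt = bandwidth_gb_s(64, 18)   # 144.0 GB/s (64-bit @ 18 Gbps)
print(f"{(rtx_3050 / rx_6500_xt - 1) * 100:.1f}%")   # 55.6%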

Even before considering those extra feature advantages, it's beating the 6500 XT by 5%-20% depending on the benchmark. All this suggests that, unlike the 6500 XT, it will actually be a respectable 1080p card heading into 2022.

The rumored TDP is 130W, so for a second generation*, we aren't getting a solid card for converting an office prebuilt from either manufacturer (those machines usually lack PCIe power connectors, so they need a card that runs on slot power alone, i.e. 75W or less).
*Excuse me, third generation, sort of... the GTX 16 refresh didn't have a slot-power-only card either.

The MSRP of the 3050 is reported to be $249. Not ideal, but if all this is true, not outrageous like the 6500 XT's pricing.
 
Alienware AW3423DW
IMG-1818.jpg


Alienware’s upcoming QD-OLED monitor has a $1,299 price tag

It’s the first of its kind to come with Samsung’s QD-OLED panel
Engadget + The Verge + Ars Technica said:
Dell's Alienware monitor that uses Samsung's quantum dot OLED (QD-OLED) tech will arrive this spring for a surprisingly reasonable $1,299, the company announced via a tweet. Dell first unveiled the curved, 34-inch gaming display at CES promising the ultra-high contrast of OLED displays with improved brightness, color range and uniformity.

That price might not seem cheap, but other OLED monitors can cost far more. LG's 32-inch UltraFine OLED model, while not exactly ideal for gaming, costs $3,999, and even its 27-inch UltraFine model is $2,999. Alienware’s 55-inch OLED gaming monitor currently sits at $2,719 on Amazon, and was first released in 2019 for $3,999 — size is obviously a factor here, but it’s still nice to see a monitor that wields new QD-OLED tech sitting well below the $2,000 mark.

The Alienware model has specs more designed for gamers than content creators, though, with 3,440 x 1,440 of resolution, a 175 Hz refresh rate, 99.3 percent DCI-P3 color gamut, 1,000,000:1 contrast ratio and 250 nits of brightness with 1,000 nits peak. It also offers HDR, conforming to the minimum DisplayHDR True Black 400 standard for OLED displays. At 250 nits, it's expected to have a lower typical brightness than many modern LED monitors.

Samsung's QD-OLED technology uses blue organic light-emitting diodes passed through quantum dots to generate red and green. That compares to standard OLED, which uses blue and yellow OLED compounds. Blue has the strongest light energy, so QD-OLED in theory offers more brightness and efficiency. Other advantages include a longer lifespan, more extreme viewing angles and less potential burn-in. Long story short, Samsung's QD-OLED panel is supposed to be like traditional OLED, which is known for its impressive contrast brought on by rich, deep blacks, but with more consistently vivid color across brightness levels.

The Verge summarizes the difference from traditional OLED here:
QD-OLED screens differ from the traditional OLED panels that’ve long been manufactured by LG Display in the way they produce an image. LG’s displays are considered WRGB OLED, because they use blue and yellow OLED compound to generate white-ish light pixels that are passed through color filters to produce red, green, and blue sub-pixels. More recent OLED TVs also have a fourth unfiltered / white sub-pixel meant to enhance brightness — especially for HDR content.

QD-OLED changes this up by emitting blue light through quantum dots to convert some of that blue into red and green without any need for the color filter. (Blue is used because it has the strongest light energy.) This leads to greater light energy efficiency; since you’re not losing any light to the color filters, QD-OLED TVs should offer brightness gains compared to past-generation OLEDs....

Screen_Shot_2022_01_03_at_1.51.06_PM.png

A simplified breakdown of QD-OLED.
Image: Samsung Display
They should also be able to maintain accurate, vivid quantum dot color reproduction even at peak brightness levels, whereas WRGB OLED can sometimes exhibit some desaturation when pushed that far. The already-superb viewing angles of OLED are claimed to be even better on QD-OLED at extreme angles since there’s more diffusion happening without the color filter in the way.

The possibility of burn-in isn’t eliminated by QD-OLED, but the hope is that these panels could exhibit a longer overall life span than existing OLED TVs since the pixels aren’t working as hard. Samsung Display is using three layers of blue OLED material for each pixel, and that could help to preserve longevity.
For a deeper dive, you can read Ars Technica's focused feature on QD-OLED:
Explaining QD-OLED, Samsung’s display tech that’s wowing CES


We'll see if this becomes the new, most desired ultrawide on the market. It won't just be the first QD-OLED monitor on the market. It will be the first OLED ultrawide aimed at gaming, period. Currently, the LG 34GP950G is arguably the ultimate eye candy 34" ultrawide in existence, but it is an IPS display, and it happens to cost the same as this Alienware will: $1296.

However, most would agree the best ultrawide you can buy for your money, currently, is the Gigabyte M34WQ, which can be had for a relatively modest $499. It might not be quite as impressive for HDR gaming as the LG, but it's not far off, and it scorches it in gaming performance metrics.
 
I thought a CompUSA had reopened close by. But then I looked it up and it closed in 2008.

<DCrying>
 
Does anyone know a reasonably priced 2TB internal hard drive for the PS5?
 
We'll see if this becomes the new, most desired ultrawide on the market. It won't just be the first QD-OLED monitor on the market. It will be the first OLED ultrawide aimed at gaming, period.

Absolutely stunning display. I currently have a 38" ultrawide and don't think I'll go down in size for this but I am very keen to see more QD-OLEDs come to market. Thanks for the article.
 
Does anyone know a reasonably priced 2TB internal hard drive for the PS5?
The PS5 doesn't have internal bays for hard drives; it only supports M.2 SSDs internally. If you want a cheap hard drive, you'll have to buy an external HDD. Unfortunately, you can't run PS5 games from an external drive. You can store them there, then copy them back to the internal drive when you want to play them.

You could get a 5TB portable external HDD for $105. No extra cords; it's powered by the USB port itself:
https://pcpartpicker.com/product/fG...e-5-tb-external-hard-drive-wdbu6y0050bbk-wesn

Find more here:
https://pcpartpicker.com/products/e...=ppgb&A=500000000000,8000000000000&i=95,4&t=2
 
I'm buying the 8TB Samsung SSD. My 4TB is more than halfway filled.
 