Tech Gaming Hardware discussion (& Hardware Sales) thread

So I came back tonight to look at the 4080 more closely, and I discovered the price hike is even more egregious than previously realized. They pushed the 4090 to the forefront in the hopes people wouldn't notice.

For the 4080, it's a +$200 price hike, and that's just for the 12GB variant (a miserly +2GB of VRAM over the launch version of the 3080, and they're both GDDR6X). Meanwhile, the 4080 is unimpressive. Its advantages over the 3080 are far less drastic than the 4090's advantages over the 3090 Ti. In fact, the 3080 actually has higher VRAM bandwidth than the 4080, LOL. The 16GB variant is absurd at $1199, and so far, they haven't declared any enhancements over the 12GB card other than the VRAM. Everything else is the same. You're expected to pay $300 for 4GB of VRAM. Holy shit. Fuck NVIDIA.
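For reference, here's the back-of-the-napkin bandwidth math behind that claim. A minimal sketch, assuming the commonly reported launch specs below (per-pin data rate in Gbps and bus width in bits; treat them as assumptions, not official quotes):

```python
# Rough VRAM bandwidth estimate: GB/s = per-pin data rate (Gbps) * bus width (bits) / 8.
# Data rates and bus widths are commonly reported launch specs (assumptions).
cards = {
    "RTX 3080 10GB": (19.0, 320),  # GDDR6X
    "RTX 4080 12GB": (21.0, 192),  # GDDR6X, much narrower bus
    "RTX 4080 16GB": (22.4, 256),  # GDDR6X
}

for name, (gbps, bus_bits) in cards.items():
    bandwidth = gbps * bus_bits / 8  # GB/s
    print(f"{name}: {bandwidth:.0f} GB/s")
# -> roughly 760 GB/s for the 3080 vs ~504/~717 GB/s for the two 4080s
```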

This is exactly what Intel needed: with that pricing, NVIDIA is opening the door for Intel to become a player if Intel aggressively undercuts them (especially if AMD doesn't). Updating my bullet lists below.

MSRP
  • RTX 4080* = $1199/$899 (October, 2022: +25 months)
  • RTX 3080* = $699 (September, 2020: +24 months)
  • RTX 2080 = $799 (September, 2018: +28 months)
  • GTX 1080 = $599 (May, 2016: +20 months)
  • GTX 980 = $549 (September, 2014: +16 months)
  • GTX 780 = $499 (May, 2013: +14 months)
  • GTX 680 = $499 (March, 2012: +16 months)
  • GTX 580 = $499 (November, 2010)

MSRP (Adjusted for Inflation; in August 2022 USD, the latest available month)
  • RTX 4080 = $1199 / $899
  • RTX 3080 = $795
  • RTX 2080 = $937
  • GTX 1080 = $738
  • GTX 980 = $683
  • GTX 780 = $634
  • GTX 680 = $644
  • GTX 580 = $675
*Unlike past generations, the RTX 3080 & RTX 4080 were not the strongest GPUs of their generation at launch, but the stronger 3090/4090 variants merely supplanted the Titan series in all but name, so it's still apples to apples.
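For anyone who wants to sanity-check the inflation-adjusted list, it's just a CPI ratio against August 2022. A minimal sketch using approximate CPI-U index values (my own rough lookups, illustrative rather than official BLS quotes, so expect ±$1 rounding differences):

```python
# Inflation adjustment: price_then * (CPI_reference / CPI_then).
# CPI-U index levels are approximate values for each launch month (assumptions).
cpi = {
    "Nov 2010": 218.8,
    "Mar 2012": 229.4,
    "May 2013": 232.9,
    "Sep 2014": 238.0,
    "May 2016": 240.2,
    "Sep 2018": 252.4,
    "Sep 2020": 260.3,
}
cpi_aug_2022 = 296.2  # reference month used in the list above

msrp = {
    "GTX 580":  ("Nov 2010", 499),
    "GTX 680":  ("Mar 2012", 499),
    "GTX 780":  ("May 2013", 499),
    "GTX 980":  ("Sep 2014", 549),
    "GTX 1080": ("May 2016", 599),
    "RTX 2080": ("Sep 2018", 799),
    "RTX 3080": ("Sep 2020", 699),
}

for card, (month, price) in msrp.items():
    adjusted = price * cpi_aug_2022 / cpi[month]
    print(f"{card}: ${price} ({month}) -> ~${adjusted:.0f} in Aug 2022 dollars")
```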
The reason is likely that they're using the slower DRAM found in the updated 12GB 3060 board. They changed the 3060 board from 8GB to 12GB because of RAM shortages, and they knew the 3060 was going to become their biggest seller. There is a lot of concern over DRAM shortages, especially in light of what is happening in China. On a side note, Apple decided to go with Chinese-made RAM for their newest iPhone because of the scale of the shortages they're dealing with. Nvidia decided to go with the slower RAM instead of relying on the stock of just one or two vendors. I'm pretty confident that, even with the performance hit, the 4080's overall performance is still going to be a jump over the 3080 Ti. It's not surprising Jensen would make a big deal about the 4090, given that it's the top dog of the line to introduce and that the 4080 could be pushed out till the end of Nov or Dec. Part shortages are problematic given the push to reduce reliance on Chinese-made parts ("seems counterproductive, but that's a war room debate").

Chunky board.

Oh, thanks to the AMD trolls, it's become a hip thing to take a dump on Nvidia and Intel on social media lately. I expect Intel will gain a number of customers given the price point, and Nvidia will still sell a truckload of $1,600 cards at launch.
It does not seem to make sense at first, but at the time there were huge chip shortages and DRAM was getting clobbered, so they went with slower DRAM in the 3060 instead of the faster DRAM found in the 3060 Ti and the older 3060 boards. This was part of Nvidia's fix for the board shortages at the time, with the crypto insanity and COVID shortages.
               RTX 3060       RTX 3060 Ti
GPU Die        GA106          GA104
Architecture   Ampere         Ampere
VRAM           12GB GDDR6     8GB GDDR6
 
The reason is likely that they're using the slower DRAM found in the updated 12GB 3060 board. They changed the 3060 board from 8GB to 12GB because of RAM shortages, and they knew the 3060 was going to become their biggest seller.


Confusion #1: The 3060 has an updated board, but it carries the same 12GB as the original. You obviously mean to reference the 3060 Ti (which has 8GB).

Confusion #2: The problem with your reasoning is the 3060 Ti was actually released before the 3060 this generation (a change from previous generations). It was launched Dec-20 while the 3060 was launched Feb-21.

Confusion #3: Furthermore, the 3060 Ti doesn't effectively have slower memory. Its VRAM is clocked lower, yes, so it has a slower per-pin transfer rate, but it sits on a wider memory bus (which is certainly a fabrication expense), and because of that it has higher total memory bandwidth (see the quick math after this list).

Confusion #4: No, the 4080 will not have the "slower" RAM of the 3060 Ti. The 4080 carries GDDR6X RAM, not GDDR6, which is what is in the 3060 Ti (and 3060).

Confusion #5: The 4080 is the card that actually carries the "faster" VRAM by your own misuse of that concept: its memory clock and transfer rate are higher than the VRAM in the 4090. So if they expected the 4080 to sell better, then by your logic they would have put the "slower" RAM in the 4080, as they did with the 3060 Ti, but they didn't.
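To put numbers on #3 and #5: bandwidth is the per-pin data rate times the bus width, so a card with a lower memory clock on a wider bus can still come out well ahead. A minimal sketch, assuming the commonly reported spec-sheet figures below:

```python
# Bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
# Data rates and bus widths are commonly reported spec-sheet figures (assumptions).
cards = {
    "RTX 3060 12GB": (15.0, 192),  # GDDR6
    "RTX 3060 Ti":   (14.0, 256),  # GDDR6: slower per pin, wider bus
    "RTX 4080 16GB": (22.4, 256),  # GDDR6X: fastest per-pin rate of the four
    "RTX 4090":      (21.0, 384),  # GDDR6X: slower per pin, much wider bus
}

for name, (gbps, bus_bits) in cards.items():
    bandwidth = gbps * bus_bits / 8  # GB/s
    print(f"{name}: {gbps} Gbps x {bus_bits}-bit = {bandwidth:.0f} GB/s")
# -> 3060 ~360, 3060 Ti ~448, 4080 16GB ~717, 4090 ~1008 GB/s
```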


Yes, the 4080 will be more powerful than the 3080 Ti, duh, that's obvious from the pipelines, I can't imagine expressing any level of uncertainty over that. Also, yes, VRAM is expensive. That's why the more expensive cards have more VRAM. The most expensive cards also carry the most VRAM because this appeals even more to editors/professionals than to gamers. Finally, yes, obviously NVIDIA is scrimping on production costs despite charging far more than ever. This is good for their profit margin. This is terrible for consumers.
 
Okay, this makes way more sense. NVIDIA's board partners confirmed the specs, and the 4080 12GB will, in fact, be nerfed in more regards than just VRAM amount:
https://videocardz.com/newz/galax-c...104-400-gpus-for-geforce-rtx-4090-4080-series
(image: GALAX RTX 40 series GPU spec sheet)


However, this raises the question: why in the hell is NVIDIA calling the 12GB version the "4080"? It makes no sense. Why not just call it the 4070?

And that flips on the light bulb: because then the whole world would realize the true nature of the price hike they just pulled. NVIDIA wants people to see the $899 card as the true entry point for a 4080 and the 16GB variant as some premium tier, but that isn't true. Call the 12GB card what it is and everyone notices the 4080 is now $1199 when the 3080 released at just $699; NVIDIA would be forced to confront the unavoidable truth that they nearly doubled the price of the xx80 line. Meanwhile, a more coherently labeled 4070 at $899 would also raise eyebrows, because the RTX 3070's launch MSRP of $499 would stand out just as starkly against it.

NVIDIA just increased prices over last gen by 70%-80%, folks. Smack dab in the middle of the crypto crash.
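That 70%-80% is just the launch MSRPs quoted above; a quick sanity check (the mapping of the 12GB card to the xx70 slot is the renaming argument from this post, not anything NVIDIA has said):

```python
# Generational launch-price jumps implied by the renaming argument above.
# The tier mapping (4080 12GB ~ "real" 4070) is the post's assumption, not NVIDIA's.
pairs = {
    "xx80 tier (3080 -> 4080 16GB)": (699, 1199),
    "xx70 tier (3070 -> 4080 12GB)": (499, 899),
}

for tier, (old, new) in pairs.items():
    print(f"{tier}: +{(new / old - 1) * 100:.0f}%")  # ~72% and ~80%
```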
 
"It's a mistake, we meant to call the 16gb 4080 the 4080 Ti. This will be rectified after launch." - Nvidia, probably

Fortunately, even r/Nvidia is clowning all over them right now. That's good to see.

 
It wouldn't surprise me if Nvidia sells the real 4080 for only a little bit more than what their AIBs will have to sell the actual 4070 at after markups. No wonder EVGA left them and hopefully it's announced on 11/3 that they're now partnered with AMD instead of leaving the GPU business outright.
 
Planned on upgrading to the 4000 series but I didn't realize Nvidia was going into the scalping business. I'll probably skip this gen.
 
Logitech announced a cloud gaming handheld today, the G Cloud. Preorder on Amazon for $300; MSRP of $350. At $350 it's going to be a tough sell: for $50 more you can get a Steam Deck that will do everything this will do, and more.

The Logitech G Cloud is a handheld console that focuses primarily on cloud gaming services, such as Xbox Cloud Gaming and Nvidia GeForce Now.
The Logitech G Cloud runs on Android and features deep integrations with Xbox Cloud Gaming, Nvidia GeForce Now, and Steam Link.
Logitech has worked with Tencent to co-develop the launcher, run usability tests and optimize how you can pause games and move them to the background. But it remains a true Android device with Google Play.
The company has opted for a mid-range Qualcomm Snapdragon 720G system-on-a-chip with 4GB of LPDDR4X RAM. It has 64GB of storage that you can expand with a microSD card.

Wi-Fi 5, Bluetooth 5.1, a USB-C port, a 3.5mm headphone jack, stereo speakers, stereo microphones, a 7-inch 60Hz 1080p 16:9 screen, a 6,000mAh battery rated for 12 hours, and a weight of 463g.
https://techcrunch.com/2022/09/21/logitech-announces-a-handheld-console-focused-on-cloud-gaming/

 
Also over $100 more than the Odin.
 
"
The CEO of Nvidia has a message to gamers complaining about the high pricing of the company’s graphics cards. Don’t blame us.

On Wednesday during a videoconference call Q&A with reporters, Nvidia (ticker: NVDA) CEO Jensen Huang was asked about the broad negative reaction from the gaming community over the elevated pricing of its chip maker’s new “Ada Lovelace” graphics cards.

“A 12-inch wafer is a lot more expensive today,” he replied, citing rising chip making costs. “Moore’s Law is dead … It’s completely over.” The executive added the expectations of twice the performance for similar cost was “a thing of the past” for the industry.

Moore’s Law is an old forecast of innovation for the semiconductor industry by Gordon E. Moore, the co-founder of Intel. He said “the number of transistors incorporated in a chip will approximately double every 24 months,” offering performance and cost benefits over time.

Over the past day, many gamers on social media and message boards expressed outrage with the pricing of Nvidia’s next generation gaming graphics chips, code-named “Ada Lovelace,” which were revealed on Tuesday at its GTC conference.
"
 
I don't really see what is gained by piling on more unreliable speculation beyond what I've already offered above based on hardware pipelines & frequencies, especially when you won't be able to buy any of these CPUs before actual reviews are out.

So far, for Cinebench20, leaks have put the Zen 4 uplift at ~23%-29% over its Zen 3 predecessor series core-for-core: meaning for either single core or multi-core. Meanwhile, that Intel Raptor Lake leak put single core uplift at 12% while multicore uplift was a whopping 46% due to the higher number of total cores. Single core scores give us more insight into probable gaming performance, and Intel's are obviously far less impressive, but Raptor Lake's predecessor also held a significant advantage in single core performance on rendering tests like Cinebench over Zen 3, and as would be expected, also in gaming performance.

Looks a touch more promising for Intel's gaming performance than I would have estimated, but it doesn't change my expectations, and the power draw is just silly. Intel's just pushing the pedal into the red line.

Looks like you were right. Even they say it looks worrying for Raptor.

What was it the Intel CEO said again? That AMD would never again be in the windshield, only in the rearview mirror?

https://www.notebookcheck.net/AMD-s...etition-in-SiSoft-Sandra-review.655196.0.html
 
"
The CEO of Nvidia has a message to gamers complaining about the high pricing of the company’s graphics cards. Don’t blame us.

On Wednesday during a videoconference call Q&A with reporters, Nvidia (ticker: NVDA) CEO Jensen Huang was asked about the broad negative reaction from the gaming community over the elevated pricing of its chip maker’s new “Ada Lovelace” graphics cards.

“A 12-inch wafer is a lot more expensive today,” he replied, citing rising chip making costs. “Moore’s Law is dead … It’s completely over.” The executive added the expectations of twice the performance for similar cost was “a thing of the past” for the industry.

Moore’s Law is an old forecast of innovation for the semiconductor industry by Gordon E. Moore, the co-founder of Intel. He said “the number of transistors incorporated in a chip will approximately double every 24 months,” offering performance and cost benefits over time.

Over the past day, many gamers on social media and message boards expressed outrage with the pricing of Nvidia’s next generation gaming graphics chips, code-named “Ada Lovelace,” which were revealed on Tuesday at its GTC conference.
"
"The real 4080 costs 100% more than the 1080 when it launched in 2016 but don't blame us!"

<mma3>
 
"The real 4080 costs 100% more than the 1080 when it launched in 2016 but don't blame us!"

<mma3>
It sucks but I kind of expected it, especially for the first 6 months. There's still a ton of 3070 Tis and 3080s in the $600-750 price range that they have to sell, so they're artificially raising the price of the 4080 so it doesn't overlap with the price of those two cards. They can get away with it because I don't believe 4000-series and RDNA3 midrange cards are due out till at least Q2 of 2023.
 
It sucks but I kind of expected it, especially for the first 6 months. There's still a ton of 3070 Tis and 3080s in the $600-750 price range that they have to sell, so they're artificially raising the price of the 4080 so it doesn't overlap with the price of those two cards. They can get away with it because I don't believe 4000-series and RDNA3 midrange cards are due out till at least Q2 of 2023.
They'll get away with it because the same suckers who paid scalpers $1k above MSRP for 3080s and 3090s during the pandemic and mining boom will be buying these cards as soon as they're available. But even the 2080 which had the ray tracing premium was only $100 more than the 1080 when it launched in 2018.
 
They'll get away with it because the same suckers who paid scalpers $1k above MSRP for 3080s and 3090s during the pandemic and mining boom will be buying these cards as soon as they're available. But even the 2080 which had the ray tracing premium was only $100 more than the 1080 when it launched in 2018.

Will people still pay it even though mining has gone down so much? I'm sure some will, but I don't think it will be at the same levels as it was.
 
Will people still pay it even though mining has gone down so much? I'm sure some will, but I don't think it will be at the same levels as it was.
It depends on what AMD does 11/3. I've been team green since 3dfx went out of business but it's getting harder to resist especially if EVGA goes to AMD.
 
It depends on what AMD does 11/3. I've been team green since 3dfx went out of business but it's getting harder to resist especially if EVGA goes to AMD.

I went AMD for my 290; never again. I would rather wait for the 5000 series to come out and get a 4000-series card at that point than go AMD.
 
I went AMD for my 290; never again. I would rather wait for the 5000 series to come out and get a 4000-series card at that point than go AMD.
I feel bad for people who bought a 1080 in 2016. The 2080 wasn't worth upgrading to in 2018 and the 3080 wasn't in 2020 because it was impossible to find at anywhere near MSRP. Now Nvidia expects them to literally shell out twice as much for the latest XX80.
 
I feel bad for people who bought a 1080 in 2016. The 2080 wasn't worth upgrading to in 2018 and the 3080 wasn't in 2020 because it was impossible to find at anywhere near MSRP. Now Nvidia expects them to literally shell out twice as much for the latest XX80.

I got a 3080 for very close to MSRP because I was able to sit at my PC all day and set up alerts on the BB website to notify me when the sold-out button changed. I think it was an EVGA card for about $50 over MSRP. Had to do the same for a PS5. I went from a 1070 Ti to a 3080 and it was great.

I planned on going for the 4000 series right away, but at these prices I'll just wait. I have a feeling these aren't going to move as well as Nvidia is expecting, but who knows. Either way, I'll wait; my 3080 is honestly working great right now.
 