
Tech Gaming Hardware discussion (& Hardware Sales) thread

I got a 3080 for very close to MSRP because I was able to sit at my PC all day and set up alerts to notify me once the sold-out button on the BB website changed. I think it was about $50 over MSRP for an EVGA card. Had to do the same for a PS5. I went from a 1070 Ti to a 3080 and it was great.
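For anyone who wants to automate that kind of stock alert, here's a minimal sketch of the polling approach described above; the URL and the "Sold Out" marker are placeholder assumptions, and a real retailer page will usually need more care (sessions, rate limits, bot checks):

```python
# Minimal stock-watch sketch: poll a product page and alert when the
# "Sold Out" marker disappears. URL and marker text are placeholders.
import time
import requests

URL = "https://www.example.com/some-gpu"  # placeholder, not a real product link
SOLD_OUT_MARKER = "Sold Out"              # text to watch for in the page HTML

while True:
    page = requests.get(URL, timeout=10, headers={"User-Agent": "stock-watch"})
    if SOLD_OUT_MARKER not in page.text:
        print("Button changed -- stock may be live!")
        break
    time.sleep(60)  # poll once a minute; be polite to the site
```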

I planned on going for the 4000 series right away, but at these prices I'll just wait. I have a feeling these aren't going to move as well as Nvidia is expecting, but who knows. Either way I'll wait; my 3080 honestly is working great right now.
At least there are used LHR 3080s on eBay now for $150 under MSRP. Still, though, it turns 2 years old next month.
 
RTX 3050 $290
RX 6600 (which is basically a 3060) $250

 
[image: Radeon RX 6000 series MSRP chart]


 
Okay, this makes way more sense. NVIDIA's board partners confirmed the specs, and the 4080 12GB will, in fact, be nerfed in more regards than just VRAM amount:
https://videocardz.com/newz/galax-c...104-400-gpus-for-geforce-rtx-4090-4080-series
[image: GALAX RTX 40 series GPU specifications table]


However, this raises the question: why in the hell is NVIDIA calling the 12GB version the "4080"? It makes no sense. Why not just call it the 4070?

And that flips on the light bulb: because then the whole world would realize the true nature of the price hike they just pulled. They'd notice the 4080 is $1199 when the 3080 released at just $699. NVIDIA wants people to see the $899 model as the true entry point for a 4080, and the higher-VRAM variant as some premium option, but that isn't the reality. Labeled honestly, NVIDIA would be forced to confront the unavoidable truth that they just nearly doubled the price of the xx80 line. Meanwhile, if the $899 card were more coherently labeled a 4070, it would raise eyebrows too, because the RTX 3070's launch MSRP of $499 would stand out just as starkly against it.

NVIDIA just increased prices on the last gen by 70%-80%, folks. Smack dab in the middle of the cryptocrash.
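For anyone checking the math on that claim, a quick back-of-the-envelope comparison using the launch MSRPs quoted above:

```python
# Launch MSRP comparison, using the figures quoted above.
msrps = {
    "xx80 tier": (699, 1199),  # RTX 3080 -> RTX 4080 16GB
    "xx70 tier": (499, 899),   # RTX 3070 -> "RTX 4080 12GB" (4070-class card)
}

for tier, (old, new) in msrps.items():
    hike = (new - old) / old * 100
    print(f"{tier}: ${old} -> ${new} (+{hike:.1f}%)")

# xx80 tier: $699 -> $1199 (+71.5%)
# xx70 tier: $499 -> $899 (+80.2%)
```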


This is so disappointing. Gonna have to play the waiting game again.

Thinking about this again, I'm wondering if the sudden surplus of 3000 series cards is just way, way bigger than we could have imagined, and if this price increase at launch is designed to help sell through those 3000 series cards, with 4000 series prices then coming down considerably after that (once the smaller early-adopter / must-have-the-best-no-matter-the-cost segment buys their 4000 series at release)?

I guess what I'm saying is: is it possible Nvidia kind of has to do this, not to price gouge per se, but just to correct for an inventory management fuck-up?
 


Passmarks are in for 13900k, waiting for 7950x

5.8 GHz single thread

https://www.notebookcheck.net/Titan...single-thread-performance-chart.656387.0.html
 
Well, that's pretty bad tbh. Expected this one to do better :/

Meh, idk what to buy anymore.

 
Passmarks are in for 13900k, waiting for 7950x

5.8 GHz single thread

https://www.notebookcheck.net/Titan...single-thread-performance-chart.656387.0.html

It may be worth waiting for 14th gen Intel; it looks like a huge jump in technology.

"Intel has confirmed additional details regarding its 14th Gen Core lineup in a Medium blog post. Shedding light on its first tiled or modular design, the chipmaker elaborated on its plans for the PC market for the next 3-5 years. Starting with the 14th, every new generation will be a collection of different product stacks featuring different process nodes, core architectures, and core counts. Meteor Lake and Arrow Lake will together form the 14th Gen family while leveraging the same socket and base tiles."


[image: Intel Meteor Lake / Arrow Lake / Lunar Lake roadmap slide from Hot Chips 34]



This looks like the Pat Gelsinger era beginning, with him bringing in his people from his past businesses such as VMware, Dell, and EMC and leveraging their hardware experience. This is the titanic shift Pat has been talking about since he took the job as CEO. It will be interesting to see if AMD follows this path.

https://www.hardwaretimes.com/intel...d-arrow-lake-on-the-same-socket-and-base-die/
 
Passmarks are in for 13900k, waiting for 7950x

5.8 GHz single thread

https://www.notebookcheck.net/Titan...single-thread-performance-chart.656387.0.html
Well, that's pretty bad tbh. Expected this one to do better :/

Meh, idk what to buy anymore.

We tried telling you not to obsess over leaks. They're going to be all over the map. They always are.

The idea is to focus on a reputable leaker who believes he has obtained specifications for frequencies and core counts, and then on whether credible-looking early leaks of retail samples seem to confirm those clocks. From there you can extrapolate a very rough estimate of the potential performance improvement based on those clocks, because what Intel/AMD tell you about IPC uplift is notoriously unreliable, and if you nitpick every leak, they will inevitably contradict each other. As I told you, it doesn't look great for Intel.
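To make that concrete, here's a minimal sketch of that kind of back-of-the-envelope extrapolation; the previous-gen clock and the IPC factor are placeholder assumptions for illustration, not real leak data:

```python
# Rough single-thread scaling estimate from a leaked boost clock.
# perf_new ~= perf_old * (clock_new / clock_old) * ipc_uplift
old_boost_ghz = 5.5      # assumed previous-gen boost clock (placeholder)
leaked_boost_ghz = 5.8   # leaked boost clock, as quoted above
ipc_uplift = 1.00        # assume zero IPC gain until reviews say otherwise

est_gain = (leaked_boost_ghz / old_boost_ghz) * ipc_uplift - 1
print(f"Estimated single-thread uplift: {est_gain:+.1%}")  # -> +5.5%
```

Treat the output as a sanity check on leaked benchmark numbers, nothing more.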

But just wait for the reviews, man.
This is so disappointing. Gonna have to play the waiting game again.

Thinking about this again, I'm wondering if the sudden surplus of 3000 series cards is just way, way bigger than we could have imagined, and if this price increase at launch is designed to help sell through those 3000 series cards, with 4000 series prices then coming down considerably after that (once the smaller early-adopter / must-have-the-best-no-matter-the-cost segment buys their 4000 series at release)?

I guess what I'm saying is: is it possible Nvidia kind of has to do this, not to price gouge per se, but just to correct for an inventory management fuck-up?
NVIDIA doesn't give a damn about overstock, or how merchants will struggle to move old inventory. Those GPUs are already sold on their end.
 
my corsair 4000D case can barely fit my monstrosity of a noctua nh-d15 cooler in it, and now i gotta worry about these new fucking gpu sizes on top of the power draw. i have an evga supernova 750 watt psu i threw in there too, expecting i wouldn't need anything more than that after i upgrade the oldest piece in my system, my beloved asus strix 2080. i can't really even call it old, i love that thing. i ditched my 970 for a 2080 and used to run it with an i5-4690k @ 4.5 ghz. i could have overclocked that cpu to the moon and back and it never would have saved me from the bottleneck lol

well after skipping the 3000 series to make a 4000 series upgrade, preferably the 16 gb 4080, it looks like im gonna need a beefier power supply to accommodate my 11700k and a new gpu, and maybe even a new case too even though my 4000D is huge. why in the fuck do i need more than a 750 watt psu? what the fuck???? despite usually running 125-150 watts while gaming or under heavy load, on the worst of days my 11700k can draw 250 watts under extreme unregulated circumstances. and the 4080/4090 can pull over 600 watts on their own. i am so fucked
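as a quick power-budget sanity check with the numbers above (the rest-of-system draw and the transient headroom are assumptions, not measurements):

```python
# Back-of-the-envelope PSU budget using the figures from the post.
psu_watts = 750

cpu_peak = 250        # 11700K worst-case draw, per the post
gpu_peak = 600        # 4080/4090-class peak draw, per the post
rest_of_system = 75   # assumed: motherboard, RAM, drives, fans
headroom = 0.20       # assumed 20% margin for transient spikes

required = (cpu_peak + gpu_peak + rest_of_system) * (1 + headroom)
print(f"Estimated need: {required:.0f} W vs. a {psu_watts} W PSU")
# -> Estimated need: 1110 W vs. a 750 W PSU; the 750 W unit falls short
```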

this is fucking bullshit. fuck off nvidia. like seriously im fucking pissed. fuck off. go fucking fuck yourself. when i built my last rig i did it with future proofing in mind, but i had no expectations of needing a 1000W+ power supply. this is fucking absurd. im going to literally have to pull my entire computer apart and buy a brand new psu just to make a simple gpu upgrade. fuck off im going AMD
 
AMD Ryzen 5 7600X Review - Affordable Zen 4 for Gaming | TechPowerUp

Strange launch. The hype (from AMD and the media) was that it was going to crush everything in sight in gaming performance but lose to Alder Lake/Raptor Lake in productivity tasks.

Ended up being backwards: it outperformed Alder Lake in productivity but couldn't best it in gaming, despite Alder Lake being almost a year old, on a larger node, and cheaper.
 
AMD Ryzen 5 7600X Review - Affordable Zen 4 for Gaming | TechPowerUp

Strange launch. The hype (from AMD and the media) was that it was going to crush everything in sight in gaming performance but lose to Alder Lake/Raptor Lake in productivity tasks.

Ended up being backwards: it outperformed Alder Lake in productivity but couldn't best it in gaming, despite Alder Lake being almost a year old, on a larger node, and cheaper.
That's what everyone expected. Hell, that's what I was expecting for its matchup against Raptor Lake, not Alder Lake. And that's not just the 7600X. The 7950X is losing to the Alder Lake i7 in gaming. It's even losing with PBO Max and overclocked RAM.
[TechPowerUp chart: relative gaming performance, 1280x720]




Frankly, I'm worried that Intel's corrupt benchmark-fixing machine has gotten to Techpowerup. I was already worried about this with the release of the 5800X3D. Nobody else had the 5800X3D losing to the 12700K in roundups. I was able to write it off when that happened because it was only a few percent, and benchmark setups/suites vary. But this...this is beyond explanation. This is looking like a UserBenchmark moment for what has long been one of my favorite resources for tech reviews.

Head over to Anandtech:
https://www.anandtech.com/show/1758...ryzen-5-7600x-review-retaking-the-high-end/17
[AnandTech gaming benchmark charts]

Techspot (aka Hardware Unboxed):
https://www.techspot.com/review/2534-amd-ryzen-7600x/
[TechSpot chart: average gaming performance]



Tom's Hardware:
AMD Ryzen 9 7950X and Ryzen 5 7600X Review: A Return to Gaming Dominance
[Tom's Hardware gaming performance chart]


Tweaktown:
https://www.tweaktown.com/reviews/1...4-cpu/index.html#Gaming-and-Power-Consumption

Eurogamer (aka Digital Foundry):
https://www.eurogamer.net/digitalfoundry-2022-amd-ryzen-9-7900x-ryzen-5-7600x-review?page=2

Gamers Nexus:



RIP Techpowerup.
 
AMD Ryzen 5 7600X Review - Affordable Zen 4 for Gaming | TechPowerUp

Strange launch. The hype (from AMD and the media) was that it was going to crush everything in sight in gaming performance but lose to Alder Lake/Raptor Lake in productivity tasks.

Ended up being backwards: it outperformed Alder Lake in productivity but couldn't best it in gaming, despite Alder Lake being almost a year old, on a larger node, and cheaper.
Weird. Tom's has them beating Intel in most games. They used a 3090 instead of a 3080 for the test. I wouldn't think that would change the results much, though.
 
Looks like the new AMD chips are 20-30% faster than Zen 3 in gaming and operate at a ridiculous 95C.

Considering how aggressively AMD has cut Zen 3 prices recently, it's obvious Zen 4 will take over Zen 3's launch price structure, allowing Intel to keep their market share. Motherboard, DDR5, and CPU prices for this upgrade again limit the targeted user base.
 