Tech Gaming Hardware discussion (& Hardware Sales) thread

wait....seriously?
The whole PC with a top-end processor, 64 GB of memory, a big M.2 SSD, and a 2080 Ti plus a 4K monitor would easily be around $4500-5000, so yeah. I just paid $4900 for plane tickets and an 18-day all-inclusive stay at a 5-star hotel (2 rooms, 3 people).
 
Hmm. I sort of want to blow money on a 2080, but I've only got a 600w psu and don't really feel like going through the trouble of cabling everything again. I guess it depends on what the 2070's 4k performance is like.
 
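On the PSU question, a quick back-of-the-envelope sum usually answers whether a 600 W unit has room. The wattage figures below are my own rough assumptions (board power varies by card and vendor), sketched in Python:

```python
# Rough PSU headroom estimate. The wattage figures are ballpark
# assumptions, not official specs for any particular build.
COMPONENT_WATTS = {
    "rtx_2080": 215,           # approx. reference board power
    "cpu": 95,
    "motherboard_ram_ssd": 60,
    "fans_misc": 25,
}

def total_draw(components=COMPONENT_WATTS):
    """Sum worst-case component draw in watts."""
    return sum(components.values())

def has_headroom(psu_watts, load_fraction=0.8):
    """Common rule of thumb: keep sustained load under ~80% of the PSU rating."""
    return total_draw() <= psu_watts * load_fraction

print(total_draw())       # 395
print(has_headroom(600))  # True: a 600 W unit has room under these assumptions
```

By this estimate a 600 W supply is fine; the real constraint is whatever your actual CPU and peripherals draw, so swap in your own numbers.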


i want the 2080 but i really want the 2080ti, but im gonna hold fast
 
is there even a reason to upgrade from a 1080 if im not running 4k? games are still running great and i cant imagine in the next two years that it will be much different
 
We don’t have benchmarks yet so it’s tough to say.
 
ANOTHER update, after weeks of fighting crashes and blue screens. I had returned every piece of hardware one at a time trying to figure out what was causing the problems. I finally swapped to a different motherboard and everything is butter smooth. I was running the ASRock Fatal1ty AM4 ITX board with a Ryzen 2600X. I found that when I disabled 3 of the 6 cores, all crashing and blue screens went away. As soon as I enabled all 6 cores, problems flared up again. I exchanged the motherboard for the exact same model and it did not solve my issues. Finally, I swapped it for the Gigabyte ITX board and all my problems went away.

No idea why, but I feel the ASRock board just was not stable with the 6+ core Ryzen chips.
 
is there even a reason to upgrade from a 1080 if im not running 4k? games are still running great and i cant imagine in the next two years that it will be much different

Unless you are jumping to 4K I see zero reason to pay the ridiculous cost of the 20xx series. We don't even know performance numbers yet, but unless we get 60+ FPS at 4K, this will go down as a failure in the PC hardware historical timeline.
 
I've used crossfired 290Xs for years and only now does my bottom card seem to have issues (it draws power and overheats when no games are running, even after I re-did the thermal paste and pads, plus software and OS troubleshooting). You don't need a new card with every new generation. The only benefit to getting one of these now would be steady or higher framerates at 4K+ resolutions or in VR applications. I am working on someone's PC that drives video walls for presentations, trade shows, and concerts, and it only uses two SLI'd 960s.
 
From Nvidia. This is for the 2080.
[Nvidia slide: "TuringVsPascal_EditorsDay_Aug22" — Turing vs. Pascal performance comparison]

 
Gonna need more information than that... Is that avg FPS? Min FPS? Max FPS? What settings were turned off?
I'm gonna guess it's average FPS with maybe ray tracing turned off
 

https://www.techspot.com/news/76073-shadow-tomb-raider-unable-maintain-60fps-geforce-rtx.html

It is very likely those numbers are with ray tracing turned off. I mean, let's think about this for a moment. Ray tracing as a technique has required so much processing power that it was not something that could be done in real time. The new Nvidia cards being able to perform it in real time is mind-blowing, a huge leap. To expect that they can take that leap and do it in 4K at 60 FPS is a bit of a stretch.
 
Maybe in 2022 or something lol
 
Don’t Buy the Ray-Traced Hype Around the Nvidia RTX 2080
On Monday, Nvidia announced a new set of GPUs in a presentation focused on ray tracing and the advent of ray tracing in today’s games. Nvidia has built an entirely new capability around ray tracing and announced an all-new SM (Streaming Multiprocessor) architecture around it. But Nvidia also debuted GPUs at substantially higher prices than its previous generation, and it showed no straight benchmark data that didn’t revolve around ray tracing.

Buying CPUs and GPUs for a first-generation feature is almost always a bad idea. If you bought an Nvidia Maxwell or Pascal video card because you thought DX12 and Vulkan were the future, do you feel like you got what you paid for as far as that feature is concerned? Probably not. AMD doesn’t get to take a bow on this either. True, DX12 has been kinder to Team Red than Team Green, but if you bought a 2013 Radeon thinking Mantle was going to take over the gaming industry, you didn’t get a lot of shipping titles before it was retired in favor of other APIs. If you bought a Radeon in 2013 thinking you were getting in on the dawn of a new age of gaming, well, you were wrong.

The list goes on. The first DX10 cards weren’t particularly fast, including models like Nvidia’s 8800 Ultra. The first AMD GPUs to support tessellation in DX11 weren’t all that good at it. If you bought a VR headset and a top-end Pascal, Maxwell, or AMD GPU to drive it, guess what? By the time VR is well-established, if it ever is, you’ll be playing it on very different and vastly improved hardware. The first strike against buying into RTX specifically is that by the time ray tracing is well-established, practically useful, and driving modern games, the RTX 2080 will be a garbage GPU. That’s not an indictment of Nvidia, it’s a consequence of the substantial lead time between when a new GPU feature is released and when enough games take advantage of that feature to make it a serious perk.

But there’s also some reason to ask just how much performance these GPUs are going to deliver, period, and Nvidia left substantial questions on the table on that point. The company showed no benchmarks that didn’t involve ray tracing.
To try and predict what we might see from this new generation, let’s take a look at what past cards delivered. We’re helped in this by [H]ardOCP, which recently published a massive generational comparison of the GTX 780 versus the GTX 980 and 1080. They tested a suite of 14 games from Crysis 3 to Far Cry 5. Let’s compare the GPUs to the rate of performance improvement and see what we can tease out:


[Chart: generational comparison of fill rate, memory bandwidth, and 14-game performance for the GTX 780, GTX 980, and GTX 1080, compiled from [H]ardOCP's data]

There’s a lot going on in this chart, so let’s break it down. When Nvidia moved from Kepler to Maxwell, we see evidence that they made the core far less dependent on raw memory bandwidth (the GTX 980 has markedly less than the 780), but that this lost Nvidia nothing in overall performance. Maxwell was a better-balanced architecture than Kepler, and Nvidia successfully delivered huge performance improvements without a node shift. But while Maxwell used less bandwidth than Kepler did, it still benefited from a huge increase in fill rate, and the overall improvement across 14 games tracks that fill-rate boost. Clock speeds also increased substantially. The percentage comparison data from [H]ardOCP reflects the 14 game improvement for the GTX 980 compared with the GTX 780, and then from the GTX 1080 compared with the GTX 980.

The move from Maxwell to Pascal duplicates this improvement. Fill rate increases a monstrous 1.6x, thanks to the increased clocks (ROPs were identical). Bandwidth surged on the adoption of GDDR5X, and the overall improvement to gaming performance is directly in line with these gains. The point here is this: While any given game may gain more or less depending on the specifics of the engine and the peculiarities of its design, the average trend shows a strong relationship between throwing more bandwidth and fill rate at games and the performance of those titles.

Now we come to RTX 2080. Its fill rate is actually slightly less than the GTX 1080. Its core count increase is smaller than either of the previous two generations. Its bandwidth increase is smaller. And those facts alone suggest that unless Nvidia managed to deliver the mother of all IPC improvements via rearchitecting its GPU core, the RTX 2080 family is unlikely to deliver a huge improvement in current games. This tentative conclusion is further strengthened by the company’s refusal to show any game data that didn’t focus on ray tracing this week.
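For what it's worth, the article's spec-sheet argument is easy to sanity-check. The fill-rate and bandwidth figures below are approximate public spec numbers I've plugged in myself (not the article's data), so treat the ratios as illustrative:

```python
# Rough generational comparison in the spirit of the chart above.
# SPECS maps card -> (pixel fill rate in GPixel/s, memory bandwidth in GB/s);
# approximate public figures, which may differ slightly by source.
SPECS = {
    "GTX 780":  (41.4, 288),
    "GTX 980":  (72.1, 224),
    "GTX 1080": (110.9, 320),
    "RTX 2080": (109.4, 448),
}

def gen_ratios(old, new):
    """Return (fill-rate ratio, bandwidth ratio) of the new card vs. the old."""
    old_fill, old_bw = SPECS[old]
    new_fill, new_bw = SPECS[new]
    return round(new_fill / old_fill, 2), round(new_bw / old_bw, 2)

for old, new in [("GTX 780", "GTX 980"),
                 ("GTX 980", "GTX 1080"),
                 ("GTX 1080", "RTX 2080")]:
    fill, bw = gen_ratios(old, new)
    print(f"{old} -> {new}: fill rate x{fill}, bandwidth x{bw}")
```

Under these numbers the pattern matches the article's reading: Maxwell traded bandwidth for a big fill-rate jump, Pascal gained both, and the 2080's fill rate actually dips slightly below the 1080's.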
Don't buy into this shitty hype marketing until you see benchmarks. Frankly, to me, just looking at disclosed pipelines, I'm siding with those whose Spidey Senses are detecting the grand bamboozle of our times. NVIDIA liked the taste of those cryptominer prices. They aren't interested in returning to previous meals.

The preliminary indications are that the $799 RTX 2080 won't provide unprecedented leaps and bounds over the GTX 1080 in terms of raw performance; in fact, it may be a smaller improvement over the GTX 1080 than the GTX 1080 was over the GTX 980, or the GTX 980 was over the GTX 780. They're just throwing the $1200 Ti card out with the launch to obscure the fact that they're perpetrating this reckless price hike.

MSRP (launch date: months since the previous release)
  • RTX 2080 = $799 (September, 2018: +28 months)
  • GTX 1080 = $599 (May, 2016: +20 months)
  • GTX 980 = $549 (September, 2014: +16 months)
  • GTX 780 = $499 (May, 2013: +14 months)
  • GTX 680 = $499 (March, 2012: +16 months)
  • GTX 580 = $499 (November, 2010)
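Those gaps and hikes are easy to recompute. The exact launch days below are my own best-guess assumptions, but the month arithmetic lines up with the list above:

```python
# Recompute the launch-gap and price-jump numbers from the MSRP list.
# Launch dates are approximate (assumed for illustration).
from datetime import date

LAUNCHES = [  # (card, MSRP in USD, launch date)
    ("GTX 580",  499, date(2010, 11, 9)),
    ("GTX 680",  499, date(2012, 3, 22)),
    ("GTX 780",  499, date(2013, 5, 23)),
    ("GTX 980",  549, date(2014, 9, 18)),
    ("GTX 1080", 599, date(2016, 5, 27)),
    ("RTX 2080", 799, date(2018, 9, 20)),
]

def months_between(a, b):
    """Whole-month gap between two dates, ignoring the day of month."""
    return (b.year - a.year) * 12 + (b.month - a.month)

for (_, p0, d0), (name, p1, d1) in zip(LAUNCHES, LAUNCHES[1:]):
    gap = months_between(d0, d1)
    hike = 100 * (p1 - p0) / p0
    print(f"{name}: +{gap} months, {hike:+.0f}% MSRP")
```

That last line works out to a 28-month gap and a roughly +33% MSRP jump for the 2080 over the 1080, versus single-digit hikes (or none) for the earlier transitions.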

They've taken nearly twice as long to release this update as any of the past four updates, with what looks likely to be a smaller improvement over its predecessor than any of those generations delivered, at a time when ever-greater raw horsepower is needed for gains the eye can actually perceive, in the middle of a PC gaming market that no longer gives a shit about pressing the boundaries of hardware, and they think NOW is the right time to levy this price hike?

Holy crapola, and to think I was sitting there and sweating for AMD's stock future. Screw ARM. Screw CPUs and APUs. AMD is poised to fucking crush these out-of-touch nincompoops when they release their next line of GPUs. If they can repeat their same strategy with the RX 480/580, I'm not sure there will be any reason to buy an NVIDIA GPU until the market responds with an absolute collapse around these MSRPs. That's assuming AMD doesn't bungle their response with more overpriced cards like the Vega GPUs that nobody can actually even find to buy. In other words, if the RTX 2080 is selling for ~$450-$500 around 6 months after launch, and NVIDIA can turn a profit on that price, then I see a future that is okay for them. Otherwise, this is deep, muddy water.

Because at these price points, with this graphical software market...who needs new hardware?
 
If you buy the 2080ti, you are paying a lot of early adopter tax.

It's new tech, the chips are huge, and they're very inefficient compared to the 1080 Ti. There is a reason Nvidia has shied away from any FPS-related benchmarks and from disclosing the in-game settings they used, and focused their pitch entirely on ray tracing. The game/driver support is still questionable. I am personally going to upgrade to a second-hand 1080 Ti if one is available when early adopters make the transition to the 2080 Ti, and upgrade to the next generation when the tech is more mature and efficient and drivers and games see better support for ray tracing.
 
