Tech Gaming Hardware discussion (& Hardware Sales) thread

I would wait for more reports because WCC is known for jumping the gun and clickbait headlines.
I don't know where you got this notion; WCCF Tech has been an utter oracle with their coverage for the past three years now. They've gotten virtually everything right with leaks, and reported these before the rest of the tech sphere. They specialize in this kind of coverage. Regardless, the price prediction wasn't a leak or a rumor. That was the writer's editorializing: a logical prediction based on a concrete fact, the ending of RTX 2000 series production. I just don't understand what the issue with that logic is.

Here's an example of a shakier rumor to distrust (via TweakTown).
AMD's next-gen RDNA 2 rumor: 40-50% faster than GeForce RTX 2080 Ti
Keep in mind that a 17.5 TFLOPS GPU has only 30% more FLOPS than the 13.45 TFLOPS RTX 2080 Ti. That means AMD would seize the per-FLOP performance advantage (at least against the previous gen, since RTX 3000 hasn't debuted). Even if only mostly true, this means the new consoles will be absolute juggernauts. After all, if true, the Xbox Series X would equate to somewhere between 13 and 14 TFLOPS of Turing hardware. Thus, it would effectively offer RTX 2080 Ti performance in terms of raw synthetic power.
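To make the back-of-the-envelope math explicit, here's a quick sketch in Python. The 12.15 TFLOPS Series X figure is Microsoft's published spec (not from the rumor itself), and the 45% speedup is just the midpoint of the rumored 40-50% range:

```python
# Rough sketch: translating the RDNA 2 rumor into Turing-equivalent TFLOPS.
rdna2_tflops = 17.5            # rumored big Navi figure
rtx_2080_ti_tflops = 13.45     # RTX 2080 Ti FP32 throughput
rumored_speedup = 1.45         # midpoint of the rumored 40-50% lead

# Raw FLOPS advantage is only ~30%...
flops_ratio = rdna2_tflops / rtx_2080_ti_tflops          # ~1.30

# ...so the rest of the rumored lead must come from per-FLOP efficiency.
per_flop_advantage = rumored_speedup / flops_ratio       # ~1.11

# Apply that efficiency to the Xbox Series X's RDNA 2 GPU:
series_x_tflops = 12.15        # Microsoft's published spec
turing_equivalent = series_x_tflops * per_flop_advantage
print(f"Series X ~= {turing_equivalent:.1f} TFLOPS of Turing")  # ~13.5
```

That ~13.5 figure is where the "13 to 14 TFLOPS of Turing" estimate comes from.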

Meanwhile, another circulating report is that NVIDIA is requiring a new 12-pin power adapter for the RTX 3000 series GPUs. This seems absurd: even if these GPUs really draw as much power as rumored (up to 350W TDPs), one should just be able to use an 8+6 or 8+8 pin configuration, which nearly all current PSUs support. I can't believe they would do this, especially with PSU pricing where it is now. If those RDNA 2 rumors are true, I can't imagine who would upgrade to an RTX 3000 GPU over the upcoming Navi. You'll save yourself over $100 on the PSU, plus the hassle of tearing down nearly your entire computer just to reassemble it around the new PSU.
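For reference, here's the arithmetic behind that claim, using the standard per-connector PCIe power budgets (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin); the 350 W number is just the rumored ceiling, not a confirmed spec:

```python
# Standard PCIe power delivery budgets, in watts.
SLOT = 75        # PCIe x16 slot
SIX_PIN = 75     # 6-pin PEG connector
EIGHT_PIN = 150  # 8-pin PEG connector

configs = {
    "8+6-pin": SLOT + EIGHT_PIN + SIX_PIN,    # 300 W total
    "8+8-pin": SLOT + EIGHT_PIN + EIGHT_PIN,  # 375 W total
}

RUMORED_TDP = 350  # rumored worst-case TDP for RTX 3000
for name, budget in configs.items():
    verdict = "covers" if budget >= RUMORED_TDP else "falls short of"
    print(f"{name}: {budget} W {verdict} a {RUMORED_TDP} W card")
```

So an 8+8 configuration already covers a 350 W card within spec, which is why the 12-pin requirement looks unnecessary.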
 
The HP Reverb G2 could be a game changer: very little in the way of god rays, and the image looks reasonable, not blinding to the eyes. It looks like a movie, with no enhancement at all.
 
Cooler Master is getting into the Raspberry Pi case game, priced at about $32.
[case photos]

Really nice case design, especially the way they cool the CPU.
 
Please don't stop LOL.
 
I hear a rumor that AMD is trying to get a Ryzen APU working like a Raspberry Pi, with the added benefit of working like an Arduino motor controller for 3D printers and small CNC machines. This will be huge business in a few years, along with small powder-coating machines and ovens. People will be able to machine their own parts. Yeah, it seems crazy, but people said that about 3D printers at one time too. I see people doing all kinds of things with their 3D printers; check out Thingiverse.

https://www.thingiverse.com/


 
That's an LGA1151 board; 10th gen Intel is LGA1200. The motherboard features I'm looking for are ATX form factor, dual NVMe slots, and decent audio.
I was looking at the ASRock B460 Steel Legend. There's an ASRock H470 for $20 more, but from what I've seen the only differences are 4 more PCIe lanes and USB 3.2 Gen 2 ports; I don't think it's worth the upgrade.
I had a bad experience with MSI, so I won't buy anything from them.


Funny, I searched Google for "i3 10th generation" and didn't look at the chipset type. I'm surprised you had a bad experience with MSI; I've built a number of MSI systems and had good experiences. The one that disappointed me was Zotac: I bought 5 ITX motherboards and ended up with 3 bad ones. ASUS is the best motherboard manufacturer I've ever dealt with, but I thought you were trying to keep the price down. I've had one of my bad Zotac boards under my desk for over 8 years, lol; kind of a reminder of how bad they are. The MSI system I only built because someone asked me to and was budget limited.

Check out this Raspberry Pi case. Nice.

 


What the F does this have to do with gaming hardware? Seriously, what the FXXK? Moving this to gaming hardware? F off. It's for video editing, you troll.
 

It's great that things are marching forward, but it's especially hard to get excited about a new interface protocol when its predecessor's predecessor's predecessor still has basically no practical market presence, and also isn't yet obsolete.

Thunderbolt 1 was introduced a decade ago with a bandwidth of 10 Gb/s. That's the same bandwidth as USB 3.2 Gen 2x1 (formerly USB 3.1 Gen 2). That's also enough throughput to handle 1080p@120Hz at 10-bit depth or 1080p@144Hz at 8-bit depth.

While there are 4K@144Hz panels now, 1440p monitors hit the sweet spot between resolution and refresh rate and are widely considered the most desirable gaming displays. The LG 27GL850 is a perfect example. At 2560x1440@144Hz with 10-bit depth it requires 19.11 Gb/s, so even older Thunderbolt 2 ports (20 Gb/s) are enough to handle it with converter cables.
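If you want to sanity-check those bandwidth numbers yourself, here's a minimal sketch. The ~20% blanking allowance is an assumption on my part (actual CVT/CVT-R2 timing varies per mode), but it reproduces the 19.11 Gb/s figure above:

```python
def display_gbps(width, height, refresh_hz, bits_per_channel,
                 blanking_overhead=1.2):
    """Uncompressed RGB bandwidth for a display mode, in Gb/s.

    blanking_overhead is an assumed ~20% allowance for blanking
    intervals; real timing standards vary per mode.
    """
    raw_bits = width * height * refresh_hz * bits_per_channel * 3
    return raw_bits * blanking_overhead / 1e9

print(display_gbps(1920, 1080, 120, 10))  # ~8.96  -> fits in 10 Gb/s
print(display_gbps(1920, 1080, 144, 8))   # ~8.60  -> fits in 10 Gb/s
print(display_gbps(2560, 1440, 144, 10))  # ~19.11 -> needs TB2's 20 Gb/s
```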

The problem is that Thunderbolt still isn't on much of anything, particularly TVs or monitors, which is where the protocol would really matter to gamers. The standards for displays remain HDMI and DisplayPort, even when a monitor is so beastly it requires a dual connection. So even if you have a Thunderbolt port, you end up buying a converter cable.

The only people who will care about Thunderbolt 4 are graphics professionals who daisy chain high resolution monitors or need to constantly transfer massive amounts of data between machines/drives.
 

The video talked about the difficulty content producers had with the previous generation: the standards weren't static, many vendors seemed to push the envelope of what Thunderbolt 3 could do, and some Thunderbolt 3 gear would not run properly. It sounded hit or miss, and it seems a sizable amount of effort went into setting up a standard that will protect consumers and perform as advertised. Instead of being an envelope-pushing monster, it was about fixing problems with the previous generation of chipsets for content producers.
 
Frankly, I don't think they should call it Thunderbolt 4. They should call it Thunderbolt 3.1. It doesn't even boost the bandwidth. Yes, I get that the focus is on setting a standard for minimum performance, but he said in the video what I reinforced in my post.

This has no meaningful relevance to gaming hardware.
 

The problem with Thunderbolt is that it may be royalty-free, but Intel isn't certifying it for anything outside its own platforms. I can only recall seeing one AMD motherboard with Thunderbolt.
When Wendell (Level1Techs) got it working on first-gen Threadripper, Intel contacted him and was pissed.
 
Tweeted then deleted



Intel's in the middle of a meltdown so bad they wish it was a dumpster fire, so someone might have jumped the gun. I've been curious about Intel's GPU project for a while, but I'm still skeptical; I'm guessing it's mostly vaporware. The rumors around Raja/Murthy are pretty bad.
 


Interesting news; if it happens, it will shake up the current situation with prices and performance. I saw a leaked photo of the GPU with Raja and Jim Keller, the former VP who left recently, wearing his mask completely wrong. But more to the point, the chip packed an insane 1,700-plus pins and had a larger component count than anything AMD or Nvidia has made. It's rumored to be running on a 10 nm process, but apparently a 7 nm version is coming soon. I guess they have the capability of building on a 7 nm process, likely through a third party?

EDIT: here is a photo of the GPU.

[photo: INTEL-Big-GPU-Tease.jpg]

Apparently it's part of a family of GPUs.
 
Ed from Sapphire is going to be on PCWorld's The Full Nerd today. Ed has been in the industry a very, very long time and is quite the character.
 
