Tech Gaming Hardware discussion (& Hardware Sales) thread

Okay, how about this test from 2016. The latest Wraith coolers weren't out yet, but the E97379-001 was, and it was tested alongside the immediate predecessor to the Wraith Prism & Wraith Max flagships for their respective Ryzen generations, as well as the lesser AMD coolers at the time. It wasn't conducted by Gamers Nexus. Just a lowly Ph.D. in Electronic and Computer Engineering from London who presently works as a high-voltage equipment engineer in Greece:
Battle of The CPU Stock Coolers! 7x Intel vs 5x AMD, plus an EVO 212
[Thermal benchmark charts from the linked review: Intel stock coolers vs. AMD stock coolers plus the Hyper 212 EVO]
Here is AMD's most common stock cooler from that time, their second-to-worst, that was stomping the Intel in those charts:
Anandtech said:
AMD’s FHSA7015B is perhaps the most widely used stock CPU cooler in existence, as the company has been supplying it alongside with tens of CPUs across nine different platforms (FM1, AM3+, AM3, AM2+, AM2, 1207, 940, 939 and 754 sockets). It is a rather simple design entirely made out of aluminum, with a square base and straight fins extending to all four sides of the cooler.

AMD FHSA7015B


Like the Intel, you can purchase this at Amazon:
($17.99) AMD FHSA7015B-1268
  • Dimensions: 2.1" x 2.75" x 3.0" (53mm x 70mm x 76mm)
  • Material: Aluminum
  • Fan: 70mm (9-blade, ball bearing)
  • Weight: 9.6 ounces (0.272kg)

The Wraith Stealth is heavier (0.317kg) and carries an aluminum base of roughly equal dimensions, but a much larger fan (100mm, 7-blade):
[Wraith Stealth product photo]



Don't expect any Intel E9-series stock cooler to compete with the Wraith Stealth. It's puny. It won't matter who conducts the test.
 
Does anyone have any recommendations on storage/backup solutions? I'm looking into docking enclosures with RAID to hook up to my PC, but I'm also thinking about just straight up building a home server. I really don't want to spend a lot of money, though.
 
Does anyone have any recommendations on storage/backup solutions? I'm looking into docking enclosures with RAID to hook up to my PC, but I'm also thinking about just straight up building a home server. I really don't want to spend a lot of money, though.
Build a home NAS and use an online provider like Backblaze. Backblaze is $60 a year and has unlimited storage.
 
NVIDIA’s GeForce RTX 3080 Flagship GPU Pictured For The First Time
[Leaked photos of the RTX 3080 cooler shroud]

WCCF Tech said:
Here are some key design considerations of the new NVIDIA RTX 3080 graphics card:
  1. The fans are bi-directional. One fan has an open intake at the bottom while the other has an open intake at the top. While it is hard to judge the exact direction of each fan, I would assume the bottom one is a static-pressure fan while the top one throws air out in an attempt to keep air flowing through the fins.
  2. The heat sink is curved. This is a very interesting design because normally it would disrupt airflow, but NVIDIA seems to be attempting a push-pull configuration, which would rely on pressure differentials to keep the air flowing and might work wonders with this design philosophy.
  3. The PCB is irregularly shaped and ends where it touches the heatsink. In the image below, only the grey plastic part of the RTX 3080 houses the PCB; the fan section does not. This confirms previous rumors that had indicated a small, irregularly shaped PCB for NVIDIA's upcoming Ampere GPUs.
  4. This is almost certainly a Founders Edition (if NVIDIA still wants to call these that) GPU. A design like this would be incredibly hard to tool and machine, and AIBs would almost certainly prefer to go with their traditional heatpipe and triple-fan designs if NVIDIA allows them. Unless we see some form of packaging with NVIDIA supplying these parts (not going to happen), what you are looking at is almost certainly a Founders Edition run in limited quantity (and at a very high MSRP).

According to old rumors the RTX 3080 will ship with a GA104 GPU* and replace the existing RTX 2080, with 48 SMs (3072 CUDA cores). This is slightly more than the RTX 2080's 46 SMs. Coupled with higher performance throughput and improved RT cores, you are looking at a significant performance increase if this turns out to be true - in fact, it is estimated to land just slightly below the current-generation flagship, the RTX 2080 Ti. The RTX 3080 will be coupled with 8GB/16GB of VRAM and a 256-bit bus width.

*Edit: New rumors have indicated that the card may actually ship with a GA102 core that is extremely powerful.
On the GA102-200 core:
https://wccftech.com/nividias-rtx-3...ght-ship-with-ga-102-200-and-4352-cuda-cores/


Since the RTX 3080 is speculated to ship with a GA102-200 core, this would mean it actually houses 4352 CUDA cores. At a conservative clock speed of just 1.75GHz, which NVIDIA has easily achieved in the past, that works out to 15 TFLOPs at the bare minimum (TFLOPs = clock rate in MHz * core count * 2 ops per clock / 1,000,000). That is an insanely high number and would make for a very powerful card - not to mention actual IPC gains on top. Initial leaks had indicated that the RTX 3080 would simply have a GA104 core, while later leaks suggested a GA103 core. All of that seems to have changed (for the better), because NVIDIA is clearly prepping multiple flavors of the GA102, and one of these could potentially trickle down to the RTX 3080 (assuming NVIDIA doesn't add more naming tiers to the lineup - SUPER, for example - which is a possibility).
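The TFLOPs arithmetic checks out; a quick sketch of the formula (the 1.75GHz figure is the rumored conservative clock, and the 2080's ~1.71GHz boost is used for comparison, so treat the clocks as assumptions):

```python
def tflops(cuda_cores: int, clock_mhz: float, ops_per_clock: int = 2) -> float:
    """Peak FP32 throughput: cores x clock (MHz) x ops/clock (FMA = 2) / 1e6."""
    return cuda_cores * clock_mhz * ops_per_clock / 1_000_000

# Rumored RTX 3080 (GA102-200, 4352 cores) at a conservative 1.75 GHz:
print(round(tflops(4352, 1750), 2))   # 15.23 -> the "15 TFLOPs" floor
# RTX 2080 (2944 cores) at its ~1.71 GHz boost clock:
print(round(tflops(2944, 1710), 2))   # 10.07 -> matches the spec below
```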

An RTX 2080 is currently a $699 MSRP card and outputs around 10 TFLOPs of graphics horsepower. If NVIDIA keeps pricing the same, this would mean the RTX 3080 becomes 50% more powerful at the same price.

Rumored specs of the GA102-400:
https://www.techpowerup.com/gpu-specs/nvidia-ga102.g930

So if the GA102-200 core leak is the right one, and these assumptions are correct, a reference comparison:

RTX 3080 (Rumored)
  • 4352 CUDA Cores
  • 15.00 TFLOP
  • 10GB GDDR6 VRAM
RTX 2080 Ti
  • 4352 CUDA Cores
  • 13.45 TFLOP
  • 11GB GDDR6 VRAM
RTX 2080 Super
  • 3072 CUDA cores
  • 11.15 TFLOP
  • 8GB GDDR6 VRAM
RTX 2080
  • 2944 CUDA Cores
  • 10.07 TFLOP
  • 8GB GDDR6 VRAM

*Edit Update*
Another snap has leaked:

[Additional leaked photo]
 
Sooo any recommendations for a monitor over 32" under $1500?

Looking for deals
 
NVIDIA’s GeForce RTX 3080 Flagship GPU Pictured For The First Time
[Leaked photos of the RTX 3080 cooler shroud]

On the GA102-200 core:
https://wccftech.com/nividias-rtx-3...ght-ship-with-ga-102-200-and-4352-cuda-cores/

Rumored specs of the GA102-400:
https://www.techpowerup.com/gpu-specs/nvidia-ga102.g930

So if the GA102-200 core leak is the right one, and these assumptions are correct, a reference comparison:

RTX 3080 (Rumored)
  • 4352 CUDA Cores
  • 15.00 TFLOP
  • 10GB GDDR6 VRAM
RTX 2080 Ti
  • 4352 CUDA Cores
  • 13.45 TFLOP
  • 11GB GDDR6 VRAM
RTX 2080 Super
  • 3072 CUDA cores
  • 11.15 TFLOP
  • 8GB GDDR6 VRAM
RTX 2080
  • 2944 CUDA Cores
  • 10.07 TFLOP
  • 8GB GDDR6 VRAM
But can it run Crysis?
 
Sooo any recommendations for a monitor over 32" under $1500?

Looking for deals
Are you talking about Ultrawides or Superwides? Because other monitors above 32" are extremely rare (mostly 36", 38", 43"), and there isn't much of a point to any of them versus TVs, because they don't go above 60Hz and so they don't require (or have) dual DisplayPort or DVI inputs. So I assume you're simply after a 32"+ display with a DisplayPort 1.2a+ port to support G-Sync or FreeSync. Is that right?
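For what it's worth, the bandwidth math is why the DisplayPort version matters so much at these sizes. A rough uncompressed data-rate estimate (this ignores blanking overhead, so real link requirements run somewhat higher):

```python
def gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Rough uncompressed video data rate in Gbit/s (ignores blanking overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

# 4K at 144 Hz, 8-bit RGB (24 bpp) - already beyond DP 1.4's ~25.9 Gbps
# payload, which is why 4K/144 monitors lean on DSC or chroma subsampling:
print(round(gbps(3840, 2160, 144, 24), 1))   # 28.7
# 4K at 60 Hz fits comfortably:
print(round(gbps(3840, 2160, 60, 24), 1))    # 11.9
```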
 
Are you talking about Ultrawides or Superwides? Because other monitors above 32" are extremely rare (mostly 36", 38", 43"), and there isn't much of a point to any of them versus TVs, because they don't go above 60Hz and so they don't require (or have) dual DisplayPort or DVI inputs. So I assume you're simply after a 32"+ display with a DisplayPort 1.2a+ port to support G-Sync or FreeSync. Is that right?

Well I'm talking something in line with this

https://www.microcenter.com/product...hdr-1000-g-sync-compatible-led-gaming-monitor

Looks like it's too good to be true so I wanted to know if there are other similar options/deals.

Don't care for Ultrawide. G Sync compatibility/DP/HDR/4K are all essentials.
 
Well I'm talking something in line with this

https://www.microcenter.com/product...hdr-1000-g-sync-compatible-led-gaming-monitor

Looks like it's too good to be true so I wanted to know if there are other similar options/deals.

Don't care for Ultrawide. G Sync compatibility/DP/HDR/4K are all essentials.
This is the only other monitor in that class, but it's $1500 even at Microcenter:
https://www.asus.com/Monitors/ROG-Swift-PG43UQ/
https://www.microcenter.com/product...ync-compatible-hdr-eyecare-led-gaming-monitor
https://pcpartpicker.com/product/2pDkcf/asus-rog-swift-pg43uq-430-3840x2160-144-hz-monitor-pg43uq

If you step down to 38" you have these two. Didn't check for HDR or resolution:
  • Acer X38 P
  • LG 38GL950G
 
Intel has just published a news release on its website stating that Jim Keller has resigned from the company, effective immediately, due to personal reasons.

Jim Keller was hired by Intel two years ago into the role of Senior Vice President of Intel’s Silicon Engineering Group, after a string of successes at Tesla, AMD, Apple, AMD (again), and PA Semiconductor. As far as we understand, Jim’s goal inside Intel was to streamline a lot of the product development process on the silicon side, as well as to provide strategic platforms through which future products can be developed and optimized for market. We also believe that Jim Keller has had a hand in looking at Intel’s manufacturing processes, as well as a number of future products.
 
Intel has just published a news release on its website stating that Jim Keller has resigned from the company, effective immediately, due to personal reasons.

Jim Keller was hired by Intel two years ago into the role of Senior Vice President of Intel’s Silicon Engineering Group, after a string of successes at Tesla, AMD, Apple, AMD (again), and PA Semiconductor. As far as we understand, Jim’s goal inside Intel was to streamline a lot of the product development process on the silicon side, as well as to provide strategic platforms through which future products can be developed and optimized for market. We also believe that Jim Keller has had a hand in looking at Intel’s manufacturing processes, as well as a number of future products.
Is there a reason you plagiarize websites without throwing a link? That's not kosher.
https://www.anandtech.com/show/15846/jim-keller-resigns-from-intel-effective-immediately
 
AMD has been reclaiming GPU market share the past year:
https://www.fool.com/investing/2020/06/13/amd-takes-nvidia-by-storm-but-can-it-keep-up-pace.aspx
Advanced Micro Devices (NASDAQ:AMD) has been chipping away at arch-rival NVIDIA's (NASDAQ:NVDA) discrete graphics card market share for a few years now, and did it once again in the first quarter of 2020, according to the latest numbers from Jon Peddie Research.

Jon Peddie Research's latest discrete graphics processing unit (GPU) report reveals that AMD was sitting on nearly 31% of the market in the first quarter of 2020. That was a nice jump from the prior-year period's market share of nearly 22.7%, which means that NVIDIA has lost substantial ground to its smaller rival...

Why AMD is winning the GPU battle
...The reason why AMD has been able to pack in a solid price-to-performance ratio compared to NVIDIA graphics cards is because of an advanced manufacturing process. The company's RDNA architecture is based on a 7nm process, while NVIDIA's current generation Turing graphics cards are based on a 12nm process.

A smaller manufacturing node means that the transistors on the chip are smaller and are packed closely together. This leads to a jump in the compute capacity of the chip, reduces power consumption, and lowers manufacturing costs because of a smaller die size.

NVIDIA's upcoming graphics cards will pose a new challenge
The Ampere architecture-based consumer graphics cards from NVIDIA could be based on a 7nm manufacturing process, though there are rumors suggesting that they may also be manufactured using a 10nm process. The Ampere-based graphics cards could hit the market close to the fourth quarter of 2020.

With this move, NVIDIA is expected to close the technology gap that AMD has enjoyed of late. It is rumored that the Ampere cards could be four times more powerful in ray-tracing performance, with the top of the line card expected to be 40% faster in terms of overall performance than its predecessor.

It remains to be seen how NVIDIA goes about the pricing of these cards, as the company has been making its offerings pricier of late. But NVIDIA may not have that luxury this time -- AMD looks all set to keep up the heat thanks to its upcoming GPUs based on the RDNA2 architecture, which is expected by the end of the year.
That's a roughly 36.5% relative gain in market share. Not surprised, really. It explains why the floor pricing I've seen on PCPP for these cards over the past few months is about as high as ever.
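Worth noting that's growth relative to AMD's old share, not points of share. A quick sanity check on the arithmetic, using the share figures from the quoted JPR numbers:

```python
old, new = 22.7, 31.0   # JPR discrete-GPU share: Q1 2019 vs Q1 2020 (%)

absolute_gain = new - old                  # percentage points of the market
relative_gain = (new - old) / old * 100    # growth relative to the old share

print(round(absolute_gain, 1))   # 8.3 points
print(round(relative_gain, 1))   # 36.6 -> the ~36.5% figure
```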
 