Tech Gaming Hardware discussion (& Hardware Sales) thread

@PEB you might want to check out GamersNexus' new video. They go in hard on MSI, and justifiably so.
 
This is actually not bad price-wise, especially considering it has a 4-core/8-thread Ryzen for 360 to 400 dollars. It would be nice if it had a larger external power brick and a half-length PCI Express slot with an external box option. Compared to previous Intel offerings, this is pretty good.



General
Brand Beelink
Model GT-R
Type Mini PC
Hardware
OS N/A
CPU AMD Ryzen™ 5 3550H, Quad-Core 8 Threads, 2.1GHz up to 3.7GHz, Cache 6MB
Instruction Set 64bit
Lithography 12nm
Graphics Radeon™ Vega 8 Graphics 1200 MHz
RAM 8GB (SO-DIMM DDR4 2400MHz), Support Up to 32GB
Storage M.2 PCIe x2 SSD slot (supports expansion)
Storage M.2 SATA3 SSD slot (supports expansion)
Storage 2.5in SATA3 HDD slot (supports expansion)
Communication
WIFI 2.4G/5G, 802.11 b/g/n/ac, MIMO
Bluetooth Bluetooth 4.2
LAN 1000M
Media
Decoder Format H.264, H.265
Video Format 4K, 4Kx2K, ASF, AVC, AVI, DAT, FLV, ISO, MP4, RM, RMVB, RV, TS, VOB, VP9, WMV
Audio Format AAC, AC3, APE, DDP, DTS, MKA, MP3, MPEG, OGA, OGG, RM, TrueHD, WAV, WMA
Photo Format BMP, GIF, JPEG, JPG, PNG
5.1 Surround Sound Output Support
Other Features
Language Multi-language
HDMI Function HDCP
Interface
Fingerprint Recognition x 1
DC JACK x 1
USB 3.0 x 6
100/1000M RJ45 x 2
HDMI (Standard HDMI Connector) x 2
DP x 1
MIC (Built-in) x 2
Audio Jack (HP&MIC) x 1
Type-C (data/video) x 1
Power
Power Input 19V/3A
Power Type External Power Adapter
Power Supply Charge Adapter
Certification CE, FCC
Dimension & Weight
Product weight 0.666 kg
Package weight 1.363 kg
Product size (L x W x H) 16.80 x 12.00 x 3.90 cm
Package size (L x W x H) 18.00 x 12.90 x 11.00 cm
 
I would just like to take this post to vent about how much I hate DisplayPort, as well as the stupid HDMI licensing standard.

I don't want a video card with one HDMI 2.1 port and four DisplayPorts; I want the exact opposite. Heck, I would even settle for some good old-fashioned DVI ports for non-4K monitors. Over the years I have, for the most part, come to terms with this, purchasing countless adapters to make my multi-monitor setups work.

However, I was reminded of how frustrating this can be when I purchased a 4K TV to use in my office. I did my homework: picked up a TCL 55" that supported 4K@60Hz with 4:4:4 chroma, and made sure my work laptop was equipped with an RTX 2060 and an i7-9750H. I plugged the TV in, but it would only output at 30Hz; attached to my desktop using the same cable, no issues. I spent a solid three hours troubleshooting, installing drivers, reading forums, etc., only to find out that the 9750H defaults to Intel integrated graphics unless in 3D mode, and the integrated GPU only supports 4K@60Hz through the Mini DisplayPort. Even though the HDMI port is fully capable of outputting the full resolution and refresh rate, there is no way to force the laptop to use the dedicated GPU in 2D (or in non-Windows environments). There isn't even a BIOS setting that lets me set a preference.

So despite having a drawer full of HDMI-to-DP, DVI-to-DP, and HDMI-to-DVI adapters, I have to add one more to the shopping list, because apparently it is too difficult to put a full-size DisplayPort on a 17" laptop.

Silver lining: the TCL 6 series is a ridiculous deal. I paid $500CAD, and it looks way better than my 2017 Samsung UHD 4K.
 
Can someone help me understand laptop network cards and download speeds?

Like, if you buy a USB WiFi adapter, would that increase download speeds? Where I get confused is that ISPs always advertise some really fast connection, let's say 800Mbps, but on a wireless network (and I'd assume wired too) it never gets that fast.

So let's say in practice you're getting 100 to 150 on your laptop (according to Speedtest) while streaming video or downloading large files. Would a "1200Mbps" USB WiFi adapter likely get you noticeably better speed, or would it likely be a waste (let's assume the advertised ISP speed is 500Mbps)? And would you need a USB 3.0 port for it to be effective? What if you only have a USB 2.0 port?

Sorry if I sound dumb, but whether it's file transfers on SSDs and memory cards or wireless data transfer, I always find the advertised numbers aren't realized in practice, so I'm trying to figure out what's best in practice.
 
The practical throughput never matches the theoretical specification quoted.

Your actual speed depends on the WiFi network adapter built into the laptop's motherboard. They're not all the same, and the reason this is confusing is that it remains one of the most obfuscated specs in laptops, despite being one of the most important in this day and age. If you Google enough, you can usually find the built-in network adapter's exact model; Intel, Killer, ASUS, and sometimes even mobile players like Broadcom (typically in netbooks, ultrabooks, and tablet hybrids) manufacture these. Then you can look up that network adapter's specs.

There you can find the exact rated download and upload speeds for the WiFi. Be aware these are the WLAN speeds, not the LAN speeds: laptops have Ethernet ports for wired connections (the LAN), and those will almost always be at least 1Gbps. Not always so for WiFi.
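To put rough numbers on the original question (everything here is an illustrative assumption, not a measurement): your real-world speed is capped by the slowest link in the chain, so a "1200Mbps" USB adapter hanging off a USB 2.0 port gets strangled by the bus, while on USB 3.0 the 500Mbps ISP plan becomes the ceiling. A quick sketch in Python:

```python
# Toy model: throughput is capped by the slowest link in the chain.
# All figures below are illustrative assumptions, not measurements.
def effective_mbps(isp_plan: float, wifi_link_rate: float, bus_limit: float) -> float:
    wifi_real = wifi_link_rate * 0.5  # assume ~half the rated PHY speed survives protocol overhead
    return min(isp_plan, wifi_real, bus_limit)

usb2_bus = 480 * 0.6   # USB 2.0: 480Mbps raw, maybe ~60% usable in practice
usb3_bus = 5000 * 0.8  # USB 3.0: 5Gbps raw

print(effective_mbps(500, 1200, usb2_bus))  # ~288 -> the USB 2.0 bus is the bottleneck
print(effective_mbps(500, 1200, usb3_bus))  # 500  -> now the ISP plan is the cap
```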

Next you have the MU-MIMO rating of the network adapter, or how many simultaneous streams it can manage. Both the router and the WiFi network adapter have a rating, written as 1x1, 2x2, 4x4; the more the better. That's because we often compete with other devices on our own networks. This has a greater impact on latency than on speed, but it can affect both. What happens is that everyone is asking the router for data packets, and it puts you all in a queue. If the router can only handle one stream at a time, you all get put into the same queue. More queues means more people being served simultaneously, just like at the post office. The router can also break up tasks all going to your laptop; if your laptop can handle more streams simultaneously, that can speed things up. The toy model below shows the effect.
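A minimal sketch of that post-office analogy in Python (purely illustrative; real MU-MIMO scheduling is far more complicated than fixed parallel queues):

```python
# Post-office model: n_clients each need one time-unit of service.
# With k parallel streams, client i starts in slot i // k and finishes one unit later.
def avg_finish_time(n_clients: int, streams: int) -> float:
    return sum(i // streams + 1 for i in range(n_clients)) / n_clients

for k in (1, 2, 4):  # loosely: 1x1, 2x2, 4x4
    print(f"{k}x{k}: average finish time for 8 clients = {avg_finish_time(8, k):.2f}")
# 1x1 -> 4.50, 2x2 -> 2.50, 4x4 -> 1.50 time units: more streams, less waiting
```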

In addition to all this, and more important to raw download speed than the above, there is real-world signal degradation with WiFi that you don't experience with wired connections. Again, this depends on both the router and the WiFi network adapter in your laptop. Cheaper routers have inferior transmission power; all signals weaken the farther you get from the router, but theirs weaken over less distance. Walls and other obstacles also play a huge role. With dual-band routers, the 2.4GHz band is slower than the 5GHz band, but it also carries farther (like AM vs. FM radio). While it's the tortoise, sometimes that means it wins the race, depending on the physical spot of reception. You can run speedtest.net on both networks from any specific spot to settle this with certainty.
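The "carries farther" point has textbook physics behind it: free-space path loss rises with frequency, so at the same distance a 5GHz signal arrives roughly 6dB weaker than a 2.4GHz one, before walls make things worse. A quick check in Python:

```python
import math

# Free-space path loss in dB: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

d = 0.010  # 10 metres from the router
print(f"2.4GHz: {fspl_db(d, 2400):.1f} dB loss")  # ~60.0 dB
print(f"5GHz:   {fspl_db(d, 5000):.1f} dB loss")  # ~66.4 dB, i.e. ~6.4dB more
```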

The better routers also have a feature called "beamforming", which gives you some directional control of the signal using the antennae, but in my experience it has had a negligible effect.

Finally, you have the concern of interference. If you live in a building with a ton of other people, since everyone has WiFi these days, you're almost certainly going to experience interference, which also causes latency and speed drop-off. Over a decade ago, when 5GHz was brand new and almost nobody had it, the easiest way to avoid this crowded radio space was to get on 5GHz, and suddenly you were sailing free; those in crowded buildings reported an astonishing difference in internet speed. Today, the latest and greatest routers are starting to utilize extra-wide 160MHz channels. This is how you avoid the crowd at the watering hole (assuming you don't have a tech-savvy neighbor). The article below explains it in depth:
160 MHz Wi-Fi Channels: Friend or Foe?
 
I'm more referring to high-end VR, so I'm not sure how well an air cooler or 125mm AIO would work with a Ryzen. I know with Intel it's a no-go, at least for my needs.

Moving the convo over here so we don't muddy up the laptop thread.

Radiators usually come in increments of 120mm or 140mm. There are 80mm and 92mm radiators, but they're super rare.

A 3900X with Wraith Prism test:
"Even when enabling PBO you won’t gain much more performance by upgrading the cooler. We're not saying you shouldn’t upgrade the cooler for lower temperatures and quieter operation, simply that by doing so you won’t squeeze much extra performance. For gamers, the bundled Wraith Prism will be even less of an issue as it’s very unlikely you’ll see all 12-cores loaded up."


Personally, I run a 3700X at 4.3GHz all-core at 1.35V with the cheap Cooler Master MasterLiquid 240mm AIO, and it doesn't go above 70°C max, usually hovering around 66-67°C. I replaced the fans with Corsair HD120s (not the best static-pressure fans) and run them at 800rpm.
 


I can only tell you my experience, which has been that the air coolers I used with the HTC Vive and the 125mm Kraken AIO I used with the Valve Index were incapable of cooling the applications I was running, and the difference in temperature between those solutions and the 360mm AIO is stunning. I'm talking a reduction in some cases from 80°C+ down to 47°C. That might be different with Ryzen CPUs, but I'm still on Intel.

Another thing is, if you don't run a high-end VR headset, you might not know that these headsets are getting more and more demanding. The Valve Index that I'm using runs dual LCD displays, one per eye, at 1440x1600 each, for a combined resolution of 2880x1600. The field of view has been upped to 130 degrees, which is even more to render, and on top of that the refresh rate in VR has to be high or you will get sick, so the Valve Index runs at 90Hz at minimum, more typically at 120Hz, and as high as what they call an experimental 144Hz. The quick math below shows how fast that adds up.
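Rough pixel-throughput arithmetic (a crude proxy for rendering load that ignores FOV, supersampling, and compositor overhead):

```python
# Pixels pushed per second: a crude proxy for rendering load
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

index = pixels_per_second(2880, 1600, 120)    # Valve Index, both eyes, at 120Hz
monitor = pixels_per_second(1920, 1080, 60)   # an ordinary 1080p/60 monitor, for scale
print(index / monitor)  # ~4.4x the pixel work, before any VR-specific overhead
```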

Then you have to realize that not all game makers are equal when it comes to optimizing their games. That's one reason I'm always skeptical of blanket benchmark numbers: they don't take into account all the variations in equipment, ambient temperature, software being run, etc. If you take a game like Half-Life: Alyx, run it on high settings on a Valve Index at 120Hz, and ask a 125mm AIO to handle it, well, good luck with that.

I fully admit though, I have no experience with AMD Ryzen so I can't speak on that matter.
 

I think we're on different levels here bud. It's good to hear you're enjoying your Valve Index.
 

I'm no expert, that's for sure. I can only speak from my experience with these different coolers, of which I've used air, a 125mm AIO, and a 360mm AIO. It's kind of like trying to explain to an engineer that all his math and drawings don't necessarily work in the real world the way he thinks they do.
 
After reading what I wrote, it came off dickish, and that's not what I was trying to do. I apologize if I offended you.
There are tons of videos, articles, and anecdotal forum posts stating that the Wraith Prism is just fine on Ryzen.
How much better is the Valve Index than the Vive? I've got a Rift, the original one, but I haven't used it in almost nine months. Personally, I feel the games aren't there yet, and most feel like tech demos to me.
 

No, I'm not offended at all. I fully understand that some of you guys on here are on another level when it comes to this stuff. As for the differences between headsets, the Valve Index is the Cadillac. Compared to the original Vive, the Index is superior in pretty much every conceivable way, from overall resolution and field of view to design and a refresh rate that goes all the way to 144Hz. It is a noticeable difference.

What I want to tell you about, though, is not the Valve Index but the state of VR games. It's taken some time, four years, for game companies to start catching up to what these headsets can do. Valve hit it out of the park with Half-Life: Alyx. It is considered by most at this point to be the greatest VR game ever made, and it is spectacular. It's a full game in the Half-Life universe, and I can say with confidence you've never seen anything like it.

Just on Steam alone there are 32,000 reviews, which is a SHIT-TON for a VR game, and it's sitting at 98% approval. I don't know what it might look like on your Oculus, but you are seriously missing out if you don't pick this one up. I'm a big Half-Life fan who played the originals when they came out and later played them again with The Orange Box, and I'm telling you, Alyx has exceeded my expectations in pretty much every way possible. It is a true showcase for what VR should be.
 
The area that really blows my mind is how quickly the price of solid-state drives has come down. I only wish the same trend applied to GPUs. :)

I remember paying $90 (Canadian) for a 960GB ADATA drive last year. To me, prices are still too damn high.
 