Computex 2019

Best of Computex 2019: Tech Advisor award winners

TechAdvisor said:
When designing its latest laptop, Asus apparently took one look at the MacBook Pro Touch Bar and then asked Apple to hold its beer.

Never mind a small LCD strip: the ZenBook Pro Duo has a full 14in 4K touchscreen built into the body of the laptop, which you can use to display up to three apps at once, set up custom touch controls, or just let your main software spill down for more vertical real estate.

Yeah, it leaves the keyboard a little squashed and pushes the touchpad to one side (though it still gets a fancy built-in LCD NumPad) but for the most part Asus has made good use of the space, and managed to pack in a few ports too, while keeping the whole thing slick and small.

It won't be for everyone, but if you're used to using multiple monitors or multitasking constantly, the Pro Duo (or its smaller ZenBook Duo sibling) could be a winner.
Asus ZenBook Pro Duo review
Not a fan, but okay.

Meanwhile, not sure if you saw this, @KaNesDeath, but it looks like it's targeting CS:GO players on the go:
Asus announces 240Hz portable gaming monitor and another with a touchscreen
 
...what? i... i don't get it. but... why?
Bright shiny lights seem to intrigue a lot of people. Ridiculous overkill.

Apple & Asus are stealing an idea that LG innovated with their LG V10 phone (which is an awesome feature for smartphones, IMO) and continue to carry through that lineup. LG themselves really sort of stole it from, or at least were inspired by, the Windows OS. Who'd have thunk it? The most popular OS in the world, which had been around for nearly 30 years at that point, might have had some nice ideas about productivity & convenience for the GUI figured out.

[Image: LG V10 second screen showing a row of app shortcuts above the main display]


See that row of icons at the top? That is a second screen, discrete from the display below. It's like a taskbar. Notifications can scroll into view there without disrupting what you're doing. You can stick shortcuts to apps there as this person has done, or even functions. It can also show the time, so that doesn't take up space at the top of the traditional screen.

Phenomenally useful for a smartphone which has limited screen space, is touchscreen-operated, and where perpetual multitasking is the primary function of the device. Not sure I'm seeing it with laptops.
 
Ye ol' "I don't agree, and anyone that doesn't agree with me is a shill"

if people are telling me to buy x because (insert bs here), then yeah.

intel is getting hammered on numerous fronts. even if they write their r&d dept a blank check, it would still likely be a couple years until whatever turnaround hail mary made it through production. meanwhile... their cpus are vulnerable to numerous exploits (the article doesn't even mention zombieload, but claims the worst is over and refers only to spectre and meltdown), with mitigations like disabling hyperthreading; they cost too much; their 5g modems failed; and their gpu department has yet to ship a single gpu.
 
Was looking through some of the other announcements. Didn't really care about PCIe 4.0 until I saw this product (I've timestamped it to when he introduces the product station at 4:46, but the product in question is shown and mentioned at 5:07):


Here's Gigabyte's 52-second feature for the product:


Gigabyte unveils 15,000MB/s AORUS Gen 4 NVMe AIC 8TB SSD


Only product I've seen that takes advantage of the extra bandwidth. They've just packed 4x 2TB NVMe SSDs running in a RAID setup into what closely resembles a single-slot, blower-style GPU. I don't see any practical benefit, unfortunately, but that is an 8TB SSD that will shit on the Samsung 970 Pro or the Intel Optane 905P in terms of performance (it's ~7x as fast). The only question is whether the speed it offers will still be practically relevant by the time Intel turns the memory/storage worlds upside down with their upcoming Optane products, which appear to bridge the gap between these historically distinct categories.
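
Back-of-the-envelope, the headline figure is just four Gen4 drives striped together. Here's a rough sanity check in Python; all the per-drive specs are numbers I'm assuming from memory of the spec sheets (970 Pro: ~3,500/2,700 MB/s sequential read/write; 905P: ~2,600/2,200 MB/s), so treat it as a sketch, not gospel:

```python
# Rough sanity check on the AORUS Gen4 AIC's headline throughput.
# Drive figures below are assumed spec-sheet numbers, not measurements.

DRIVES = 4                  # 4x 2TB NVMe drives striped together
CLAIMED_MBPS = 15_000       # Gigabyte's headline sequential figure

# Implied per-drive throughput in the stripe: ~3,750 MB/s.
print(f"Per-drive: {CLAIMED_MBPS / DRIVES:,.0f} MB/s")

# PCIe 4.0 x4 gives each drive roughly 7,880 MB/s of usable bus bandwidth,
# so the stripe is limited by the drives themselves, not the bus.
print(f"Bus ceiling per drive: {4 * 1_969:,} MB/s")

# Versus the drives named above (assumed sequential specs):
for name, read, write in [("970 Pro", 3_500, 2_700), ("Optane 905P", 2_600, 2_200)]:
    print(f"vs {name}: {CLAIMED_MBPS / read:.1f}x read, {CLAIMED_MBPS / write:.1f}x write")
```

By that math the "~7x" lines up with the 905P's sequential writes (15,000 / 2,200 ≈ 6.8x); against the 970 Pro's reads it's closer to 4x.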
 
The event has ended, obviously, but I'm chipping away at it by looking through several new articles about Computex each day. Samsung upgraded their unique Superwide monitor to 5120x1440:
Samsung Announces CRG9: A 49-Inch Curved 5K 120 Hz FreeSync 2 Monitor

Anandtech said:
Samsung has announced its second-generation 49-inch curved display supporting AMD’s FreeSync 2 technology. The new CRG9 monitor features a considerably higher resolution and brightness than the first-gen C49HG90 LCD introduced in mid-2017.

Samsung was the first display maker to launch a family of AMD FreeSync 2-supporting monitors in mid-2017, with the 49-inch curved C49HG90 LCD being its flagship offering. Without a doubt, the monitor was an impressive piece of hardware, yet at $1,499 it was not exactly perfect with its 3840×1080 resolution and 600 nits brightness. Since then, a number of display suppliers have introduced their own FreeSync 2-supporting offerings, yet none of them were quite as impressive as Samsung’s FS2 products. In the meantime, Samsung has been working on its second-gen FreeSync 2-enabled flagship monitor.



The new Samsung CRG9 ultra-wide display is based on the company’s new curved VA panel featuring a so-called 'dual QHD' resolution (5120×1440), a 1000 nits peak brightness, 178º/178º vertical/horizontal viewing angles, a 4 ms response time, and a 120 Hz refresh rate. In a bid to increase the resolution of its flagship gaming display from 3840×1080 and the brightness from 600 nits, Samsung had to lower the refresh rate from 144 Hz to 120 Hz, which seems like a fair tradeoff.

Just like in the case of the previous-gen 49-incher, the new CRG9 display has LED backlighting enhanced with quantum dots, which enables support for the DCI-P3 color space (as well as a larger-than-sRGB gamut) and AMD’s HDR-focused FreeSync 2. Speaking of HDR, the monitor fully supports HDR10 (though Samsung says nothing about Dolby Vision), and bearing in mind that we are dealing with a panel featuring a peak brightness of 1000 nits, it will likely carry VESA’s DisplayHDR 1000 badge.



Connectivity-wise, the new monitor supports two DisplayPorts, one HDMI input, a USB 3.0 hub, and a headphone connector.
Besides the resolution bump, brightness is the biggest leap over the original unit: it went from 600 nits to 1000 nits. FreeSync 2 also means you can run FreeSync over HDMI.
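
The 144 Hz → 120 Hz drop also makes sense once you run the link budget. A rough sketch, assuming DisplayPort 1.4 HBR3 (~25.9 Gbps usable after encoding) and a ~20% blanking overhead; both are round numbers, and the real monitor may additionally lean on chroma subsampling or DSC in HDR modes:

```python
# Approximate uncompressed video bandwidth vs. what DisplayPort 1.4 can carry.
DP14_USABLE_GBPS = 25.92  # HBR3: 32.4 Gbps raw, ~25.92 Gbps after 8b/10b encoding

def bandwidth_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.2):
    """Pixel rate x bit depth, padded ~20% for blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

modes = [
    ("C49HG90: 3840x1080 @ 144 Hz, 8 bpc", (3840, 1080, 144, 24)),
    ("CRG9:    5120x1440 @ 120 Hz, 8 bpc", (5120, 1440, 120, 24)),
    ("CRG9:    5120x1440 @ 120 Hz, 10 bpc", (5120, 1440, 120, 30)),
    ("CRG9:    5120x1440 @ 144 Hz, 10 bpc", (5120, 1440, 144, 30)),
]
for label, args in modes:
    gbps = bandwidth_gbps(*args)
    verdict = "fits" if gbps <= DP14_USABLE_GBPS else "exceeds DP 1.4"
    print(f"{label}: {gbps:.1f} Gbps ({verdict})")
```

By this rough math, 5120x1440 at 120 Hz and 8 bpc just squeaks under the link, while 144 Hz at 10 bpc clearly doesn't, which would explain the tradeoff.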
 
One other announcement I spotted that looks intriguing is the new semi-customizable liquid-cooling loop Corsair launched, called Hydro X. Follow the link to see the slideshow:
https://www.kitguru.net/components/...liquid-cooling-range-has-officially-launched/
Starting with its CPU blocks, Corsair has two models at launch – the XC7 and XC9. Both sport RGB lighting with 16 addressable RGB LEDs, but the primary difference is the XC7 is designed for LGA 115x/AM4 sockets, and the XC9 is for LGA 2011/2066/TR4. Because of that, the XC7 has ’60+’ cold plate fins, while the XC9 has ’70+’. Aesthetically they are very similar with the transparent flow chamber, and both come with pre-applied thermal paste.

For its GPU blocks, Corsair currently has five models of its XG7 RGB block – to fit RTX 2070, 2080 and 2080 Ti (Founders Editions), and also GTX 1080 Ti FE and Vega 56/64 (reference). These are full-cover blocks and come with included aluminium backplates, while there is also a transparent section to match the design of the XC CPU blocks. There’s more RGB lighting in the GPU blocks as well, and each comes with pre-applied thermal paste and thermal pads.

Corsair is also offering an XD5 RGB Pump/Reservoir Combo which utilises the popular Xylem D5 pump, with a 330ml reservoir that comes with an integrated fillport. On top of this, there are more RGB LEDs to illuminate the reservoir itself (ten of those), while Corsair has even integrated a temperature sensor to provide real-time coolant temperatures from within the loop itself.

There are also two radiator options – 30mm and 54mm thick – with the former available in 120/140/240/280/360/420mm sizes, while the thicker rad is available in 240/360/480mm sizes. These radiators have been made by Corsair in cooperation with HardwareLabs. Corsair is also providing both soft and hardline tubing options, a variety of fittings in different colours that are designed in collaboration with Bitspower, as well as XL5 coolant – currently available in clear, red, green, blue and purple colours – that is the result of cooperation between Corsair and Mayhems.

As this is Corsair, all of the RGB-enabled hardware is compatible and controllable with Corsair iCUE, and Corsair is also keen to point out its easy-to-use custom loop configurator that’s available on its website.
Apparently only 85% of Corsair's in-house cases, and 40% of competitors' cases, will support the system, but the idea is that the loop conforms to the case and the overall build you blueprint (case, motherboard, GPU).

Direct link to Corsair custom cooling configurator is here. Very fun to play with:
https://www.corsair.com/uk/en/custom-cooling-configurator/
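
Under the hood, a configurator like this is essentially a compatibility lookup over a parts database. A purely hypothetical sketch of the idea (the case names, mounts, and sizes here are invented for illustration; Corsair's real tool runs off its own data):

```python
# Hypothetical sketch of a loop configurator's compatibility lookup.
# All data below is invented for illustration purposes.

RADIATOR_SUPPORT = {
    # case model -> mount location -> supported radiator lengths (mm)
    "Obsidian 500D": {"front": [240, 360], "top": [240]},
    "Crystal 680X":  {"front": [280, 360], "top": [240, 280]},
}

def compatible_radiators(case: str, mount: str) -> list[int]:
    """Radiator lengths the given case/mount combination can take."""
    return RADIATOR_SUPPORT.get(case, {}).get(mount, [])

# A blueprint (case, motherboard, GPU) narrows every later choice:
print(compatible_radiators("Obsidian 500D", "front"))  # [240, 360]
print(compatible_radiators("Obsidian 500D", "side"))   # [] -> no such mount
```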
 
I was super excited to see that Noctua has thrown their hat in the fanless CPU cooler ring:
https://www.anandtech.com/show/1448...-cpu-cooler-up-to-120w-of-cooling-performance


Right now the only company that offers serious fanless CPU cooling performance is the South Korean company NoFan. Its coolers are hard to find, expensive, and massive (as an effective fanless cooler must be). Silverstone carries the Heligon HE02, which claims it can cool CPUs with a TDP of 95W, but online customer reviews report that it grossly under-performs, and can't even adequately cool a 65W CPU in a warm ambient environment.


Noctua claims a 120W TDP ceiling for this cooler, but it doesn't seem to do much better than the Heligon:
On the booth was a test system demonstrating the concept, which currently has no name, while it was cooling an Intel Core i9-9900K on an ASUS Prime Z390-A motherboard running Prime95, inside a Jonsbo UM4 chassis. While the temperatures weren't exactly helped by a warm Taiwanese climate, the Noctua fanless averaged a CPU core temperature of around 94 °C under a full Prime95 load running all day. That's quite impressive given the Core i9-9900K is a premium processor, although the high temperature did thermally throttle the processor down.
These coolers are really better suited for 65W processors or, best of all, the special 35W "T" variants of the Intel Core processors. The best 35W CPU in the world is the i9-9900T, which launched last month, though there are still no samples appearing on UserBenchmark or Passmark:
https://www.intel.com/content/www/us/en/products/processors/core/i9-processors/i9-9900t.html

This is part of a strategy for those who want a truly silent build with the highest possible level of performance:
  • i9-9900T; higher TDP processors may be possible with coolers like the NoFan CR-95C
  • Fanless CPU cooler
  • Fanless PSU
  • Asus ROG Strix 2080 Ti variant (for the Quiet BIOS mode)
  • 140mm-200mm PWM case fans set to a low RPM, or tuned with a custom fan profile (see the sketch after this list)
  • Noise-dampening case* (ex. be quiet! Dark Base Pro 900 Rev. 2)
*The last is the most controversial. Silent cases tend to sacrifice airflow for less noise, in theory, but in some noise-normalized thermal tests, high-airflow cases have been shown to achieve lower temperatures at the same decibel level. It's also ideal to give passive coolers plenty of air to breathe.
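
For the custom fan profile bullet, the profile is just a mapping from temperature to PWM duty, with interpolation between breakpoints. A hypothetical sketch of a low-noise curve (the breakpoints are illustrative, not recommendations; in practice you'd set these in the BIOS or in fan-control software):

```python
import bisect

# Hypothetical low-noise fan curve: (CPU temp in °C, PWM duty in %).
# Flat and near-silent until the CPU actually warms up, then ramps.
CURVE = [(40, 20), (60, 20), (70, 35), (80, 60), (90, 100)]

def duty_for_temp(temp_c: float) -> float:
    """Linearly interpolate PWM duty between curve breakpoints."""
    temps = [t for t, _ in CURVE]
    if temp_c <= temps[0]:
        return CURVE[0][1]
    if temp_c >= temps[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_right(temps, temp_c)
    (t0, d0), (t1, d1) = CURVE[i - 1], CURVE[i]
    return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

for t in (35, 55, 65, 75, 85, 95):
    print(f"{t} °C -> {duty_for_temp(t):.0f}% duty")
```

The idea is to hold the fans at an inaudible floor until the CPU genuinely heats up, and only ramp then.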
 
Well, it seems my 2020 PC for Cyberpunk will be AMD-based.
 
Well, it seems my 2020 PC for Cyberpunk will be AMD-based.
Are you basing this on CPU, or GPU, or both?

I'm already running, and will primarily stay running, AMD Ryzen and Threadripper CPUs on my computers.

But the GPU field is still muddy for me. I've been thinking of switching from Nvidia to AMD GPUs for my Doom machine, as many accounts and tests show Doom 2016 ran better on AMD.

What have you seen from Cyberpunk, if anything, suggesting an AMD GPU is the better choice, as this is a title I will be getting also?
 
Are you basing this on CPU, or GPU, or both?

I'm already running, and will primarily stay running, AMD Ryzen and Threadripper CPUs on my computers.

But the GPU field is still muddy for me. I've been thinking of switching from Nvidia to AMD GPUs for my Doom machine, as many accounts and tests show Doom 2016 ran better on AMD.

What have you seen from Cyberpunk, if anything, suggesting an AMD GPU is the better choice, as this is a title I will be getting also?
I will be building the PC this winter. I am almost sure the CPU will be a 12-core Ryzen, while the GPU remains to be decided on. It depends on whether AMD introduces higher-performing cards based on the new architecture. Even if not, it will most likely be a Radeon VII, since I find spending more than a grand on a video card just stupid.
The last PC I assembled was for my brother's GF, and it was a Threadripper/Nvidia combo for rendering.
 
Best of Computex 2019: Tech Advisor award winners


Not a fan, but okay.

Meanwhile, not sure if you saw this, @KaNesDeath, but it looks like it's targeting CS:GO players on the go:
Asus announces 240Hz portable gaming monitor and another with a touchscreen
So, looking into this some more, I'm finding that they have been out for a while, and the 240Hz one is really the "new" one.

I have no need for the fast one, but the "normal" one I could definitely utilize while traveling.

At work, on a daily basis, I really wish this little screen had a "wireless" connectivity option to a laptop.

This would be ideal for me: I'm at my desk, I unplug this from the USB-C on my docking station, and I walk around the office and shop with it while having full use of my PC, as this is only a screen.


I'm working on a document and hop up from my desk to go verify something. Today, I take a notebook, write shit down, and then go back to my desk to update the Excel file, etc.

With a wireless screen option I could do it right there and cut the notebook-to-Excel update step out of the equation.


Here is what I'm finding available now (this seems to be the best one), but I haven't read anything about any wireless options.

Amazon product ASIN B07J4SX1MS
 
I will be building the PC this winter. I am almost sure the CPU will be a 12-core Ryzen, while the GPU remains to be decided on. It depends on whether AMD introduces higher-performing cards based on the new architecture. Even if not, it will most likely be a Radeon VII, since I find spending more than a grand on a video card just stupid.
The last PC I assembled was for my brother's GF, and it was a Threadripper/Nvidia combo for rendering.
Yeah, I'm a cheap bastard and can't justify $1k for a card.

I always shop deals on the outgoing last gen right before the next-gen release.

I'm still sour that SLI hasn't been optimized more, but I'm sure it's due to:
A) companies wanting you to buy the newest cards
B) not that many people running SLI (because it isn't optimized, lol)

In a perfect world, you stick in a second card like the one you have (two 980 Tis or two Vega 56s, for example).

It could be done very budget-friendly if you have the PCIe slots and power for it, and it would be STRONGER than one solo "new replacement" in 99% of cases if it were optimized.

Or two older Quadros for my workstation, for example, vs a newer, better Quadro.
 
Yeah, I'm a cheap bastard and can't justify $1k for a card.

I always shop deals on the outgoing last gen right before the next-gen release.

I'm still sour that SLI hasn't been optimized more, but I'm sure it's due to:
A) companies wanting you to buy the newest cards
B) not that many people running SLI (because it isn't optimized, lol)

In a perfect world, you stick in a second card like the one you have (two 980 Tis or two Vega 56s, for example).

It could be done very budget-friendly if you have the PCIe slots and power for it, and it would be STRONGER than one solo "new replacement" in 99% of cases if it were optimized.

Or two older Quadros for my workstation, for example, vs a newer, better Quadro.
As you said yourself - many people would not buy new cards then.
 
I will be building the PC this winter. I am almost sure the CPU will be a 12-core Ryzen, while the GPU remains to be decided on. It depends on whether AMD introduces higher-performing cards based on the new architecture. Even if not, it will most likely be a Radeon VII, since I find spending more than a grand on a video card just stupid.
The last PC I assembled was for my brother's GF, and it was a Threadripper/Nvidia combo for rendering.
As you said yourself - many people would not buy new cards then.
Are you on a Freesync monitor, then?

Unless the Radeon VII craters in price by next year, it appears to be the clearly inferior buy new. The lowest price I'm seeing on PCPartPicker at the moment is $730; meanwhile, the RTX 2080 starts at $700, and there are far more sales on it (the best I saw was $590).
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-AMD-Radeon-VII/4026vs4035
NVIDIA still has better driver support across games, as well as more gimmicky bells and whistles like HairWorks, and with a +1300% market-share advantage in this matchup, expect developers to care more about catering to the RTX. Then there's the matter of ray tracing. The Radeon also runs much louder, and a bit hotter.

If not Freesync, are you favoring the 37% advantage in FLOPS and the 16GB of VRAM?
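
For what it's worth, that FLOPS gap is roughly what you get from shader count × clock. A quick sketch using nominal boost clocks (sustained clocks differ in practice, so these inputs are assumptions):

```python
# FP32 throughput ~= 2 ops per FMA x shader count x clock.
# Clocks are nominal boost figures, so treat the output as ballpark only.

def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

radeon_vii = tflops(3840, 1.75)  # 3840 stream processors, ~1750 MHz peak boost
rtx_2080   = tflops(2944, 1.71)  # 2944 CUDA cores, 1710 MHz reference boost

print(f"Radeon VII: {radeon_vii:.2f} TFLOPS")   # ~13.4
print(f"RTX 2080:   {rtx_2080:.2f} TFLOPS")     # ~10.1
print(f"Advantage:  {100 * (radeon_vii / rtx_2080 - 1):.0f}%")  # ~33%
```

Nominal numbers land around 33%; the quoted 37% presumably uses measured rather than nominal clocks, but either way raw FLOPS alone don't settle the comparison.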
 
Are you on a Freesync monitor, then?

Unless the Radeon VII craters in price by next year, it appears to be the clearly inferior buy new. The lowest price I'm seeing on PCPartPicker at the moment is $730; meanwhile, the RTX 2080 starts at $700, and there are far more sales on it (the best I saw was $590).
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-AMD-Radeon-VII/4026vs4035
NVIDIA still has better driver support across games, as well as more gimmicky bells and whistles like HairWorks, and with a +1300% market-share advantage in this matchup, expect developers to care more about catering to the RTX. Then there's the matter of ray tracing. The Radeon also runs much louder, and a bit hotter.

If not Freesync, are you favoring the 37% advantage in FLOPS and the 16GB of VRAM?
As Jay talks about, hopefully AMD will have a "ray tracing" driver release soon, as their architecture is supposedly actually better for it?

But I do agree: from a market-share perspective, very few games run BETTER on AMD when apples-to-apples cards are compared, BUT if the game you like the most does, it makes sense.

From a certain point of view.

I LOVED Quake 2, and that video has me tempted to buy an RTX card just for it.

BUT if ETERNAL runs better on AMD like 2016 does, I will have shot myself in the foot for a gimmick on an old-ass game.

Rambling I know.

Carry on.
 