Tech Gaming Hardware discussion (& Hardware Sales) thread

qualcomm-reference-headset-ces-2019-1-1-1021x580.jpg


Qualcomm's VR reference headset will have over 2K per eye and a 90Hz refresh rate, with eye tracking and HDR support, running off a 7nm mobile Snapdragon 855 processor. No word yet on who will support it, but pretty impressive at over 4K total resolution.

https://www.roadtovr.com/qualcomm-reference-headset-2x-pixels-vive-pro-ces-2018/
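The "over 4K total" claim checks out with quick arithmetic. The exact per-eye panel resolution isn't stated above, so the 2048x2160 figure below is an assumption for illustration (a plausible "over 2K" VR panel size), not a confirmed spec:

```python
# Hedged sketch: 2048x2160 per eye is an assumed figure, not a confirmed spec.
PER_EYE_W, PER_EYE_H = 2048, 2160   # assumed "over 2K" per-eye panel
UHD_W, UHD_H = 3840, 2160           # standard 4K UHD for comparison

per_eye_pixels = PER_EYE_W * PER_EYE_H   # pixels in one eye's panel
headset_total = 2 * per_eye_pixels       # both eyes combined
uhd_total = UHD_W * UHD_H                # a single 4K UHD display

print(f"headset total: {headset_total:,} px")
print(f"4K UHD total:  {uhd_total:,} px")
print(f"ratio: {headset_total / uhd_total:.2f}x")
```

Under that assumption the two eye panels together come out a bit above a single 4K UHD display's pixel count, which is all "over 4K total resolution" needs to mean.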
 
You'd suppose Intel is getting a little concerned about that other company's CPUs.
The best thing I've seen from Intel is the combination SSDs that yoke an Optane unit to the same PCB as a NAND flash storage unit. I suspect laptops that have this will come at a stiff premium, but it should become the most desirable storage configuration among laptops for the next few years.

Nobody is more confused about naming than Intel. The center couldn't hold, and it's an irrecoverable clusterfuck now. Not only does the "i9" line not make a lick of sense, but somehow they're branding the Skylake-X refresh under 9th-generation numbering:
https://en.wikipedia.org/wiki/Skylake_(microarchitecture)#High-end_desktop_processors
46682313051_f29a559ed2_b.jpg


What a joke. The i7-9800X, for example, should clearly be called something like i7-7825X; the i9-9900X should be the i9-7905X; the i9-9920X should be the i9-7925X; etc.

Meanwhile, the whole "i9" branding was mismanaged from the start, and now the i9-9900K makes zero sense (timestamped):



Every i9 chip was unique to the LGA 2066 socket. In fact, that socket was the master differentiator that made the three i9-defining criteria listed there possible in the first place (the 128GB RAM ceiling, more PCIe lanes, and support for the higher TDP inherent to 10+ cores).
 
Have you seen Craft Computing's review of the Pimax 5K?

I posted about this on the Beat Saber thread. The 8K is where you want to go: even though it gives up 10Hz of refresh rate, you get an actual 4K high-refresh headset. I don't know how they got their hands on the consumer version, because those are only going to start shipping at the end of the month. New controllers, too, not the Vive controllers; they work like Valve's unreleased Knuckles controllers. I am really excited about the Pimax headsets, but you still need an RTX 2070 or better to run them. They cost around $1,100 for the 8K version with the controllers. I know people are ripping on the price, but the nearest 200-degree-FOV headset after Pimax is around $2,500, so it's pretty cheap. The optics, which they don't mention, add a ton to the price because they're crazy expensive custom optics.



https://pimaxvr.com/collections/store/products/pimax-8k-series?variant=19912761606203
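For a sense of why the 8K needs an RTX 2070 or better, here's rough pixel-throughput arithmetic. The panel figures below (dual 3840x2160 at 80Hz for the 8K, dual 2560x1440 at 90Hz for the 5K+) are my assumptions from public spec sheets, not from the review above:

```python
def pixels_per_second(width, height, hz, panels=2):
    """Raw pixel throughput for a dual-panel VR headset."""
    return panels * width * height * hz

# Assumed panel specs, not confirmed by the linked review:
pimax_8k = pixels_per_second(3840, 2160, 80)
pimax_5k_plus = pixels_per_second(2560, 1440, 90)

print(f"Pimax 8K : {pimax_8k:,} px/s")
print(f"Pimax 5K+: {pimax_5k_plus:,} px/s")
print(f"8K pushes {pimax_8k / pimax_5k_plus:.1f}x the pixels despite 10Hz less")
```

Under those assumptions the 8K's panels draw exactly twice the pixels per second of the 5K+, even after giving up that 10Hz.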
 
Have you seen the theme park ride they built at CES?



They force you to run through one big ad for Google products. They really know how to drop some serious coin on things. I remember when New York's Fifth Avenue got what was then the biggest LCD screen in the world: it literally takes up a city block and cost like $80 to $100 million. Google rented it exclusively for four weeks for something like $3 million.

_79118052_79118051.jpg
 



Not sure why you felt the need to post coverage of this twice. Again, I'm not too exhilarated when the LG B/C/E series & Sony AF series OLED TVs exist (up to 77"). The variable refresh rate up to 120Hz over DisplayPort is the only advantage on the spec sheet, but of course the LG TVs I just named can also do 120Hz if you drop native rendering to 1080p, and variable refresh seems pointless without G-Sync/FreeSync support. There is no advantage in input lag, color gamut, or HDR brightness range over the TVs we already have.

I think the new 43" Asus ROG Strix XG438Q monitor -- a VA panel -- is a bit more relevant in the space between those monster 4K OLED TVs and the best 27"-34" 1440p IPS/TN monitors:
https://www.anandtech.com/show/1379...r-gaming-monitor-lineup-up-to-49inch-displays

Asus ROG Strix XG438Q

Anandtech said:
ASUS has announced their upcoming lineup of gaming monitors at CES under the Republic of Gamers branding, and as with everything in Las Vegas, bigger appears to be better. The Strix XG32VQR is a 32-inch 2560x1440 144Hz display, the Strix XG438Q is a 43-inch UHD HDR monitor with a 120 Hz refresh rate, and the Strix XG49VG is a massive 49-inch 32:9 3840x1080 144 Hz beast.

32813047268_356e6a6568_z.jpg
The 600-nit brightness and 4ms response times aren't as good as the LG/Sony OLED TVs, but you get FreeSync 2 (supported over HDMI), you will almost certainly enjoy lower input lag due to the panel type, you won't have to worry about ambient-light washout, and you won't have to worry about burn-in.
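On pixel density, the 43" 4K panel actually lands close to a 27" 1440p monitor, which is why it fills that gap. A quick calculation using the standard PPI formula (the 55" TV row is my own addition for comparison, not a spec quoted here):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from a display's resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

displays = {
    '27" 1440p monitor': ppi(2560, 1440, 27),
    '43" 4K XG438Q':     ppi(3840, 2160, 43),
    '55" 4K TV':         ppi(3840, 2160, 55),
}
for name, density in displays.items():
    print(f"{name}: {density:.1f} PPI")
```

The 43" 4K monitor sits only a few PPI below the classic 27" 1440p desk monitor, while a 55" 4K TV drops well below both at desk viewing distances.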

But mostly it will be that unseen spec: price. Expect this monitor to cost around $1K or less, not the $5K HP quoted for its 65" Omen X Emperium BFGD (which actually has G-Sync HDR), and not whatever Dell is too afraid to even list for its new OLED "gaming TV". This 43" monitor is a stopgap, but it's a product with a home outside a tech show floor. If you're willing to spend $5K+, would you really pick a 55" 120Hz OLED over a 77" 60Hz OLED or an 82" 60Hz VA? This is 4K:
$6,997: LG Electronics OLED77C8PUA 77" 4K Ultra HD Smart OLED TV, 2018 (OLED panel)
$4,998: Samsung QN82Q8FNBFXZA FLAT 82" QLED 4K UHD 8 Series Smart TV, 2018 (VA panel)

82" vs. 55"
82-inch-16x9-vs-55-inch-16x9.png


tenor.gif
 

You’re comparing a TV to a monitor.
lucy-disappointed.gif
 
Any clue when/if the last generation of GPUs is going to drop in price? I have Amazon gift card money (about $120) and I'm looking into the RX 570, the only piece of my new PC I still need.
 
They have already dropped in price.
Don’t delay, buy today. RX 580 4GB versions have been going on sale for around the same price as an RX 570 4GB, so keep an eye out for those as well.
 
What I don’t get is that in my area, used GPU prices are going UP even as new prices are dropping.

I’m regularly seeing used 1080s and 1070 Tis for MORE than they are NEW right now.

And older cards’ prices are creeping up as well.

A few months back, a 970 would list for $100; you could offer $70 and get it. A 980 Ti would list for $150 and go for $125.

Now I’m seeing the 970s for $150-175 and 980 Tis for $225-250. I’m like, what the fak? I found a pair of 980 Tis with the bridge listed for $300, had the guy down to $225 and dragged my feet, but there were hordes of 980 Tis listed for $150 anyway.

Not today. WTF.

I’m seeing RX 570s listed used for MORE than new constantly.
 
Am I in a tech forum?

1863xk84tlu9yjpg.jpg


Difference in 2018:
comp_table-_2018.png


Let's keep up with the times.

I'll try again.
  1. HP Omen X BFGD is dope. That's what we've been waiting for. It will gain a foothold, but only after the price comes down.
  2. Dell Alienware's "OLED Gaming TV" has a slippery feel to it, with Dell cagey about the details, especially price, when other top OLED TVs like the 55" LG E7P are $1,589 on Amazon right now. That 2017 TV has a superior 103% DCI-P3 color gamut, and as far as quoted specifications go, the only meaningful spec it gives up is 60Hz, when the biggest market for large TVs is consoles in couch environments that only run 4K@60fps.
  3. I think 16:9 monitors that fill the size void above 27" are more immediately appealing.
https://www.anandtech.com/show/1384...5inch-4k-120-hz-oled-gaming-monitor-showcased
The display can reproduce up to 95% of the DCI-P3 color space, which is oddly low for an OLED monitor, but which is explainable as the device is still in development.

In fact, while Dell says that the Alienware 55 display is set to support an adaptive refresh rate technology, the manufacturer does not disclose whether it will eventually support AMD’s FreeSync/FreeSync 2 or NVIDIA’s G-Sync/G-Sync HDR when it is finalized. As for connectivity, the monitor features DisplayPort 1.4 and HDMI 2.1 (with the latter possibly pointing not only to a new cable requirement, but also to variable refresh rate (VRR) and other HDMI 2.1 features support).
I never come into these things as anything other than a giddy nerd, so I don't want to neg, but this CES has definitely been more chicken shit than chicken salad.

NVIDIA caving on FreeSync was easily the highlight of the show for gamers. Don't fall for the tech press hyping every mediocre unveiling with a clickbait headline, "Check out this most amazing _____ ever!!!" They're part of the same industry machine. They are more than happy to sling exaggerations to seize traffic. I'm just a little disappointed. The companies are grinding along, and that is praiseworthy, but the show has resonated with me much like the last two iPhone press announcements.

I was just disheartened to realize this was probably my favorite thing covered in the whole show:
Sony GTK-PG10 beer speaker
DSC02181_2.jpg

21bc4e7302b5f0f578da9c2327976d78051c46b5444d722112bad6195fd17f0a.jpg
 
you guys don't already play suspended or upside down?
 
PC Magazine > One year later, this ongoing burn-in test shines light on OLED displays
As he says in the intro to the Test #1 one-year update video, the results are, practically speaking, very good if you don't use the televisions heavily with static images somewhere on the screen. But of course this is a concern for gamers, who often have HUDs or other logos permanently plastered on the screen if they play the same game all the time.

Test #1: Real Life OLED Burn-In Test on 6 TVs (LG C7 Models)
(initiated on January 24, 2018; updated January 11, 2019)




w4tdy2fEnCEv5L59nQmDQ4.jpg


Test #2: 20/7 Burn-In Test: OLED vs. VA vs. IPS

(initiated on August 31, 2017; updated January 11, 2019)


Static test, not real world usage, designed to showcase the pure tendencies of the different display types.



39791530253_4354c36952_c.jpg
 
Intel Core i9-9990XE : Up to 5.0 GHz, Auction Only
by Ian Cutress on January 14, 2019 5:00 PM EST

46704165472_925c06739f_b.jpg


The Ryzen 3000 launch definitely has Intel shook. They're panicked. The above graphic is probably pimping the stats a bit, just like AMD was probably pimping the stats of the upcoming 16-core Ryzen 3000 CPU (WCCF Tech's Keith May predicted a 4.6GHz stock turbo ceiling for the generation instead of 5.0GHz in an interview with Gordon at the CES show, and doesn't seem to think the 16-core flagship is probable this year at all). If the above graphic is accurate, this new CPU will easily be the king of gaming CPUs, ahead of the non-HEDT i9-9900K, but Intel is clearly in bullshit hype overdrive. They had a comfortable few years where they didn't have to do that, didn't they? Back to slinging a little slop along with their competitor. They're both going crazy trying to outmaneuver each other for a paper crown that isn't even really that important right now for gamers, whose cups already runneth over.

To remind everyone, the i9-9900K is an 8c/16t CPU with a base clock of 3.6GHz and a 5.0GHz peak turbo across the top two cores. For this reason even the monster i9-9980XE won't match it in most games, despite being an 18c/36t CPU, because it runs a 3.0GHz base clock with a 4.5GHz peak turbo on two cores. The rest of those cores can't close the clock gap.

This i9-9990XE looks like an i9-9980XE with four cores disabled, engineered (or binned?) to sustain even higher voltages, and probably leaning on an even more restrictive pre-approved CPU cooler list to support it. In a way, it reminds me of the FX-9xxx series refresh AMD launched to howls from the gaming world, and jeers from Intel fanboys, several years ago. Eyeballing those figures, it looks suspiciously like a pre-overclocked refresh, presuming those clocks can even be achieved.

This is the most telling single fact reported by Anandtech:
Motherboard vendors will have to support 420 amps on the power delivery for the chip (which at 1.3 volts would be 546 watts), and up to 30 amps per core. It will be for the socket 2066 X299 motherboards already on the market, and perhaps importantly, there is no warranty from Intel. This means that system builders will not be able to recoup costs on dead silicon, but they might give their own warranty to end users.
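The numbers in that quote are internally consistent, which you can sanity-check with Ohm's-law arithmetic. The 14-core count below is my inference from an 18-core die with four cores disabled, not something the quote states:

```python
# Figures taken directly from the Anandtech quote:
AMPS_TOTAL = 420      # total current the power delivery must support
VOLTS = 1.3           # voltage used in the quote's wattage figure
AMPS_PER_CORE = 30    # "up to 30 amps per core"

# Assumption: 18-core die with four cores disabled = 14 active cores
CORES = 18 - 4

power_w = AMPS_TOTAL * VOLTS   # P = I * V
print(f"total power: {power_w:.0f} W")                 # matches the quoted 546 W
print(f"per-core current x cores: {AMPS_PER_CORE * CORES} A")
```

30 amps per core across 14 cores lands exactly on the 420-amp total, and 420A at 1.3V gives the quoted 546W, so the per-core and total figures line up.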
<JagsKiddingMe>
 
