What GPU card would I need if I wanted to do VR via an external GPU through my laptop?

Steve-French

And would it even work? The specs on my laptop are...

Dell Inspiron 15 3000
Intel Core i5-8265U (8th Gen)
8 GB RAM

I was thinking of combining it with this...

GTX 1660 Ti

Razer Core X

I have an Oculus Quest 2 headset and apparently it can connect wirelessly to a PC for PC VR gameplay. Is this plan even feasible, or am I pissing up a rope here? It's looking like I can do all of this for around $500. I'm not really worried about gaming as I'm fine with my Xbox, but I want to see if I can get more out of my Oculus Quest 2. Any input would be appreciated.
@Madmick
 

That GPU should be fine.
I played Half-Life: Alyx on a GTX 1070 (with an Oculus Quest 1) and it looked great and ran well on medium settings. And that is pretty much the VR showcase; most other games will be less demanding.

Don't know how good that CPU is, or if there can be any issues with external GPUs, but I'm sure others here can tell you.
 
Thanks for the input. I read that as long as you pair an external GPU with an i5 or higher, you will be okay.
 
eGPUs can definitely improve performance. The greatest limiter here is the ports. If you can go into "About Computer" and find the exact model number of your laptop, that would be ideal. That would let me Google the manufacturer's page for the OEM spec sheet and learn the exact specification of your fastest port, which would give a better indication of how wise this might be for you. If you only have USB 2.0 ports, don't waste the money.

GPUs go into the PCIe 3.0 x16 slot in your motherboard specifically because that slot supports the highest bandwidth of information exchange with the CPU and the memory. These components all communicate at vastly greater speeds than is required for the information they ultimately spit out to your storage or display. In fact, even for the latest M.2 SSDs, which are much slower, this has become an issue. This is why Sony decided not to allow true next-gen PS5 games to be played from external storage (only PS4 games). They knew consumers would buy the cheapest external storage they could find, and they knew this might cause performance issues at some point in the future, possibly very long load times. Then that consumer would stomp and vent about how terrible the PS5 is. Better to protect ignorant consumers from themselves: it's the console model.

Transmission Speed Ceilings
  • 3 Gb/s = SATA II (<----- PS4's internal drive standard)
  • 6 Gb/s = SATA III (<---- PS4 Pro's internal drive standard)
  • 0.48 Gb/s = USB 2.0
  • 5 Gb/s = USB 3.0 (<--- PS4's external ports)
  • 5 Gb/s = USB 3.1 Gen 1 (<---- PS4 Pro's external ports)
  • 10 Gb/s = USB 3.1 Gen 2 (<-- PS5's external ports)
  • 5 Gb/s = USB 3.2 Gen 1(×1)
  • 10 Gb/s = USB 3.2 Gen 1(×2)
  • 10 Gb/s = USB 3.2 Gen 2(×1)
  • 20 Gb/s = USB 3.2 Gen 2(×2)
  • 10 Gb/s = Thunderbolt 1
  • 20 Gb/s = Thunderbolt 2
  • 40 Gb/s = Thunderbolt 3
Let's say you have a USB 3.1 Gen 2 port like the PS5. That runs at 10 Gb/s. In comparison, a PCIe 3.0 x16 slot runs at roughly 126 Gb/s. Now, that is way more than any current GPU requires, even the mighty RTX Titan, but that PS5-class USB port still delivers less than a tenth of it, and is roughly 40% slower than even a modest PCIe 2.0 x4 or PCIe 3.0 x2 link (~16 Gb/s). At those speeds I believe the GTX 1660 Ti would be bottlenecked significantly.
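If you want to see the gap at a glance, here's a quick back-of-the-envelope sketch using the nominal ceilings from the list above (these are link ceilings only; real-world throughput is lower once protocol overhead is factored in):

```python
# Back-of-the-envelope comparison of nominal link ceilings (Gb/s), as listed above.
# Real-world throughput is lower once protocol overhead is factored in.
ceilings_gbps = {
    "USB 2.0": 0.48,
    "USB 3.0 / 3.1 Gen 1": 5.0,
    "USB 3.1 Gen 2": 10.0,
    "Thunderbolt 3": 40.0,
    "PCIe 3.0 x16": 126.0,
}

pcie_x16 = ceilings_gbps["PCIe 3.0 x16"]
for link, gbps in ceilings_gbps.items():
    # Each link's ceiling, and what fraction of a full-size GPU slot it offers.
    print(f"{link:>20}: {gbps:6.2f} Gb/s  ({gbps / pcie_x16:5.1%} of PCIe 3.0 x16)")
```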

That's why these eGPUs were more interesting 8-10 years ago, when USB 3.0 and 3.1 ports were freshly out on cutting-edge laptops. The GPUs at the time were snails compared to today, so the ports weren't that limiting.
 
I checked in my device manager and my fastest port is USB 3.1 (version 1.10). I also read that sometimes these eGPUs will not run at the card's 100% capability; that's why I thought choosing a 1660 over a 1060 would be better.
 
Well, wouldn't it be nice if internet tech reviewers provided up-to-date info on this sort of thing that could be referenced to guide these purchase decisions?

Too bad. They'd rather upload useless videos about RGB bling or liquid overclocking (in 2021, LOL) or some other clickbaity project whose content offers zero-- or nearly zero-- informational value to real-world purchasing decisions like this one.

Unfortunately, most of what I can find is old: it uses GPUs from generations back, with only one of those old GPUs tested, port standards unspecified, and often the laptop CPU and RAM unspecified too. Benchmarks themselves are also extremely limited where testing exists.

Just based off the memory speed of the 1660 Ti, I have strong doubts about how wise this pairing would be on USB 3.1 Gen 1. We're talking about a GPU whose GDDR6 runs at 12 Gb/s per pin, feeding through a port with a 5 Gb/s bandwidth ceiling.
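For a rough sense of scale, assuming I have the card's published specs right (192-bit memory bus, 12 Gb/s per pin), here's a quick sketch. VRAM bandwidth isn't what actually crosses the external link, so this is purely illustrative of how little headroom a 5 Gb/s port leaves:

```python
# Rough sense of scale, assuming GTX 1660 Ti published specs
# (192-bit memory bus, 12 Gb/s per pin) versus a USB 3.1 Gen 1 port.
per_pin_gbps = 12       # GDDR6 effective data rate per pin
bus_width_bits = 192    # memory bus width
port_gbps = 5           # USB 3.1 Gen 1 ceiling

vram_gbps = per_pin_gbps * bus_width_bits   # 2304 Gb/s, i.e. 288 GB/s aggregate
print(f"Aggregate VRAM bandwidth: {vram_gbps} Gb/s ({vram_gbps / 8:.0f} GB/s)")
print(f"USB 3.1 Gen 1 ceiling:    {port_gbps} Gb/s "
      f"(~{vram_gbps / port_gbps:.0f}x less than the card's own memory bandwidth)")
```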

You could save money by buying a cheaper, less powerful eGPU that doesn't exceed your port's capabilities as drastically, but a 1660 Ti is already sort of scraping the bare minimum for a video card to drive an Oculus Quest 2. You can give it a whirl, but I think you'd be much better off buying a proper desktop. If you want to do these eGPUs with a laptop, I suggest looking for laptops with Thunderbolt 3 or Thunderbolt 4 ports in the future.
 
Right on... thanks for the input.
 
Look into some videos on YouTube. I think there is one specifically where they tested the Razer Core on Linus Tech Tips. If I remember right, it had some issues. It might still be a bit of a pipe dream.
 
From what I read it's a big no-go as it is, because I would need a Thunderbolt 3 port for it to even be worth a shit, and the fastest port I have is a first-gen USB 3.1. Even with Thunderbolt 3 I would probably only get 70-80% of the GPU's potential as it is.

I think I need to just sack up and build myself a PC. Like I said, I just want it for VR, and I'm fine playing games on a console, but it would still be cool to have both. Maybe I'll use my stimulus check to do a build. That way I won't feel guilty for buying more tech. :D
 
Yeah, and maybe I'm misremembering, but even with Thunderbolt it's not a smooth experience. I think it's just a bad idea for now.

Building them is fun, though. Before you get started on the build, sit through the JayzTwoCents walkthrough. He's also got a good video on common mistakes. Have it cued up when you start, as you will get stuck here and there.
 
These additional issues are matters of driver stability, not throughput. That's because drivers for GPUs running over a USB/Thunderbolt port are clearly not the priority for NVIDIA or AMD.
 
Have you seen the new eGPU from Asus? It uses a proprietary connector but gives you more bandwidth than Thunderbolt.

 
I hadn't.

Cool idea, but the problem with these proprietary ports is that they never seem to take off. Remember eSATA?

The result is that unless you specifically seek out a laptop with this port when you buy, the eGPUs that depend on that connector's speed aren't a viable option for you. The problem is that the tiny number of laptops ever made with it don't offer competitive value. So even if you do look for the port, it only makes sense if you accept that you're going to pay a large premium for a laptop that can offer more gaming power at home than on the go.

Conceptually, there might have been a niche for this in 2014 and earlier, prior to NVIDIA's introduction of the GTX 10 Mobile series GPUs. Today, why pay the premium for the port when you could simply spend that money on a laptop with a better GPU? The RTX 3080 Mobile is bested only by the RTX 2080 Ti and RTX Titan among previous desktop generations, and is better than the RTX 3060 Ti and all lesser GPUs among the current desktop generation. The RTX 3070 Max-Q carries a strict TDP of 80W and runs on par with a 5700 XT or 1080 Ti, but with ray tracing.

That's the frustrating irony. This port might actually find a market niche if they put it in lower-end laptops competing on price-to-performance, the ones with a decent screen and CPU but otherwise frugally built, but they won't.
 
I've always thought it was kind of a cool concept but yeah, it would've been much cooler earlier last decade.

Linus has a video on it.

 
When I still had my Yoga 720, Lenovo made a specific GTX 1050 Ti docking station for it (when the 1050 Ti was newish).

I wanted to buy one but never did.

That would have been the way to go back then, but you needed a Thunderbolt port. And I'm sure the drivers worked best on Lenovo laptops.
 
Well, I can see the application still. You can get an ultralight laptop and plug it into a GPU when you want to game. Not sure there is a huge market for that, but I can see it as a niche application.
 
True, but the group of buyers for whom this makes sense is incredibly small.

First, you can already get ultrathin laptops like the Dell XPS 15 that carry a GTX 1650 Ti. Second, very few ultrabooks or ultralight laptops come with a USB or Thunderbolt port capable of 20 Gb/s or faster, which makes an eGPU significantly more powerful than a 1650 Ti pointless. Third, except for a few high-end models, they rarely come with CPUs powerful enough to drive better GPUs (e.g. they carry Intel's lower-power "U" variants). Fourth, the screens in ultrabooks are awful for gaming. For some reason they've evolved to carry really high-resolution IPS displays despite the tiny screen size: terrible response times (GTG), and no G-Sync/FreeSync for FPS drops. Tear city.
 
Well umm, weird

The thumbnail is from the video of the guys who built the acetylene lightsaber for me. It's because Slob accidentally posted the YouTube playlist containing that video instead of the video itself.
 
Ahh ok. I was wondering what the heck.
 