How to use a desktop GPU with a laptop.

You can already do this via Thunderbolt (or another PCIe expansion method).
We have sufficient technology now for a truly "dockable" laptop/desktop combo; demand is just too low for it to be viable.

--Patrick
 
I was about to say, I don't see there being a huge need for such a device. The kind of people who are going to drop big bills for a high-end GPU are likely to want it in a full-fledged gaming PC, separate from their laptop.
 
A setup like this would be more for the GPGPU crowd, where the graphics card is just used as a specialized coprocessor. Go out into the field with your laptop (because it's portable!), collect your data, get your readings, take your measurements, and then bring it back to your office/home and hook up the GPU for analysis/rendering/whatever. It could be popular with the oil/gas industry or with cryptocurrency miners (though ASICs would probably be more efficient than GPGPU for mining).
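For anyone wondering what that kind of offload looks like, here's a minimal illustrative sketch using CuPy (the library choice and the FFT workload are just assumptions for the example, not anything a field crew necessarily uses):

```python
# Minimal GPGPU-style offload sketch: push field data to the GPU,
# crunch it there, pull the results back over PCIe.
# Assumes an NVIDIA card and the CuPy package; the FFT is a stand-in workload.
import numpy as np
import cupy as cp

readings = np.random.rand(1_000_000).astype(np.float32)  # stand-in field data

gpu_readings = cp.asarray(readings)            # host -> GPU transfer
spectrum = cp.abs(cp.fft.rfft(gpu_readings))   # heavy math runs on the GPU
result = cp.asnumpy(spectrum)                  # GPU -> host transfer

print(result[:5])
```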

--Patrick
 

GasBandit

The current cutting edge of GPUs won't be good for games forever (or even for six years, probably)... barring major changes to architecture, this would allow laptops to have de facto easily upgradable mainstream video cards, for gaming at home while still being portable for work.

But yeah, that proprietary connection is a major buzzkill.
 
Thunderbolt, as currently implemented, only supports up to 4 lanes of PCIe. Most high-end graphics cards want 16. I imagine their proprietary solution provides more than 4 lanes, and thus better performance than Thunderbolt could provide.
Correct, but Thunderbolt 2 provides PCIe x4 v2.0 (which can carry as much as PCIe x8 v1.x), and that's the same bandwidth as each card would get in early SLI/XFire configs. Also, Thunderbolt 3 is expected in late 2015, which will double the bandwidth yet again.
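If anybody wants to sanity-check those numbers, the per-lane math is simple enough to script (a quick Python sketch; the per-lane rates are the usual usable figures after encoding overhead, not raw line rates):

```python
# Back-of-the-envelope PCIe bandwidth calculator.
# Per-lane rates are usable throughput after encoding overhead.
PER_LANE_GBPS = {
    "1.x": 0.25,   # 2.5 GT/s with 8b/10b encoding
    "2.0": 0.50,   # 5.0 GT/s with 8b/10b encoding
    "3.0": 0.985,  # 8.0 GT/s with 128b/130b encoding
}

def pcie_bandwidth(version: str, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s."""
    return PER_LANE_GBPS[version] * lanes

for version, lanes in [("1.x", 8), ("2.0", 4), ("2.0", 16), ("3.0", 16)]:
    print(f"PCIe {version} x{lanes:<2}: ~{pcie_bandwidth(version, lanes):.1f} GB/s")

# PCIe 1.x x8 : ~2.0 GB/s   (what Thunderbolt 2's PCIe tunnel can match)
# PCIe 2.0 x4 : ~2.0 GB/s
# PCIe 2.0 x16: ~8.0 GB/s
# PCIe 3.0 x16: ~15.8 GB/s
```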

--Patrick
 

figmentPez

Thunderbolt 2.0 would only be able to provide 2 GB per second. The newest video cards, however, use PCIe 3.0, allowing nearly 16 GB per second.
But how much bandwidth do the cards actually use? Historically, the amount of bandwidth that actually benefits performance has lagged behind what the latest spec is capable of, often significantly. Here's an (admittedly outdated) article from 2013, which found that not only were PCIe 2.0 and 3.0 comparable in speed, but so were x8 and x16 link widths, which means those cards almost certainly weren't using more than 4 GB/s.

Heck, here's a video review of testing at x16, x8, and x4. It's a limited benchmark of just 3DMark (and from 2011 at that), but there was less than a 3% difference in performance between x16 and x4 PCIe links.
 
Other people have tried artificially limiting the link width for similar testing, and they discovered that most applications are not as sensitive to transfer speed as you might think.
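If you want to see what link your own card actually negotiated, on Linux you can read it straight out of sysfs (a minimal sketch; the 0000:01:00.0 address is just an example, substitute your GPU's address from lspci):

```python
# Print the negotiated vs. maximum PCIe link speed/width for one device (Linux).
# "0000:01:00.0" is an example address; find your GPU's with `lspci`.
from pathlib import Path

dev = Path("/sys/bus/pci/devices/0000:01:00.0")
for attr in ("current_link_speed", "current_link_width",
             "max_link_speed", "max_link_width"):
    node = dev / attr
    if node.exists():  # not every kernel/device exposes all four
        print(f"{attr}: {node.read_text().strip()}")
```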

EDIT: Often, once the textures are loaded into the card's local memory, the amount of PCIe traffic drops quite a bit, so a GPU can compensate for a slow/narrow PCIe link by having lots of on-board RAM.
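To put toy numbers on that one-time upload cost (the 2 GB texture set is purely an assumption for illustration):

```python
# One-time cost to fill VRAM with a hypothetical 2 GB texture set
# over links of different widths/generations.
TEXTURE_SET_GB = 2.0
for link, gb_per_s in [("PCIe 2.0 x4 (~TB2)", 2.0),
                       ("PCIe 2.0 x16", 8.0),
                       ("PCIe 3.0 x16", 15.8)]:
    print(f"{link}: {TEXTURE_SET_GB / gb_per_s:.2f} s to upload")
```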

--Patrick
 

GasBandit

Why would anyone even want this?
... so that you could have a laptop to take with you to work or job sites, but still be able to game on it when you're at home, without the GPU going obsolete in two years and forcing you to buy a whole new laptop if you want to keep gaming.
 

GasBandit

This is far too unwieldy to be anything but a desktop supplement for a laptop.
Well, yeah, you don't cart the GPU housing around with you, you leave it at home. Where you do your gaming. But you take your Alienware to the office where you... yeah, ok, this concept is kinda falling apart, isn't it.

Basically, it was an attempt at a compromise that lets you have the portability of a laptop and the upgradability/modularity of a desktop. But you gotta leave your "good" GPU at home.

If it weren't a proprietary connection, though, I'd have thought about trying to get it to work with my old Asus laptop, which is just barely powerful enough to play Borderlands 1 at low resolution on its own... but if I could tether it to a modern $200 desktop GPU, it'd be a pretty badass gaming system all over again.
 
Pretty much. NVIDIA has been doing something like this since 2006, but it hasn't really taken off except in certain vertical markets (e.g., anyone who has lots of in-house or custom-developed CUDA software).

--Patrick
 
I was thinking that the things I don't need with my laptop are legion, and I slowly came to the conclusion that if I could just pick up a touch-screen monitor and bring it home to use with the stuff I do use at home, it'd be perfect. So basically I just invented tablets, only I'd give them USB ports.

Wait, why don't tablets have USB ports already? This is another thing I have to go complain about now.
 
I'd be all for a "dockable" tablet idea if the base had "real" power in it. Something like the Asus Transformer Book series, but with the "base" having big capacity, real graphics, etc. So something I could web browse on, watch some videos with, etc., but then plug into my "main setup" and have it be my real computer. But then you run into "for the premium that costs, you may as well buy a real desktop and a tablet and synchronize your bookmarks." So that's what most people do.

I'll admit, though, that if I needed to be mobile between home and work with a LAPTOP, this concept would make sense. A laptop is a weird middle ground there, but if it's for work, THEY have control over it, which means I'd never use it as MY machine.
 
It's something that was already tried once in the mid-'90s, but it didn't go over very well.

But who knows? The Newton bombed in 1998, but people love iPads now; maybe it's time to try the idea again.

--Patrick
 