Eds Mar 9, 2018 @ 4:08pm
Guidance on Steam streaming from a VM
Hi Guys,

I spotted that a couple of people have seemingly managed to get Steam In-Home Streaming to work from a VM with a passed-through GPU, and while mine works, the quality is incredibly poor.

I was wondering if someone might be able to assist me with some tweaks when using ESXi to virtualise this workload?

Cheers
Eds
You need an incredibly powerful PC to do this.
Eds Mar 9, 2018 @ 4:45pm 
Hardware isn't really the problem at the mo, it's a config question as much as anything.

I have a 12 core Xeon, SuperMicro X9SRL-F, ESXi 6.5 with a Windows 10 VM on a Samsung 960 EVO SSD with PCIe passthrough of a GTX 760.

I have the device passed through to the VM; it shows up and the drivers have installed correctly. However, I do not think it is actually using the 760 when streaming a game from another device, as I can see no load using GPU-Z.

I have seen some mention of perhaps disabling the onboard GPU, but then ESXi has no video output.

Cheers
Eds
Gordy Freeman Mar 9, 2018
The GTX 760 is very poor at being virtualised.
xSOSxHawkens Mar 9, 2018 @ 5:35pm 
Unless you plan to use CPU software encoding (minimum 4 dedicated threads) or plan to press the iGPU into service for encoding on the fly, you will likely have performance issues.

Why are you trying this in the first place? Don't you have a proper PC to put that 760 into, instead of trying to VM a game box...

Unless you plan to toss another 3 GPUs in and make a 4-gamers-1-PC LAN rig, there seems little reason to attempt what you are trying. Let the server be a server and get a standalone machine for gaming...
RGX12 Mar 9, 2018 @ 6:58pm 
Originally posted by Eds:
Hardware isn't really the problem at the mo, it's a config question as much as anything.

I have a 12 core Xeon, SuperMicro X9SRL-F, ESXi 6.5 with a Windows 10 VM on a Samsung 960 EVO SSD with PCIe passthrough of a GTX 760.

I have the device passed through to the VM; it shows up and the drivers have installed correctly. However, I do not think it is actually using the 760 when streaming a game from another device, as I can see no load using GPU-Z.

I have seen some mention of perhaps disabling the onboard GPU, but then ESXi has no video output.

Cheers
Eds

Last time I tried this, I couldn't get it to work either. In my case, I was getting an error code on the driver; apparently you are not (although even if Device Manager in the guest shows normal I would still check the event logs). But I suspect the root cause is the same: the nasty little habit of many consumer-grade NVidia cards refusing to operate when the driver detects it's being run inside a VM. AFAIK, there is no way around this...it will not work.
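
(If you want a quick way to check those logs, here is a PowerShell sketch; it assumes the NVIDIA kernel driver logs under its usual provider name, nvlddmkm:)

    # a sketch: pull recent System-log events from the NVIDIA kernel-mode driver
    # (nvlddmkm is the usual provider name; adjust if yours logs differently)
    Get-WinEvent -LogName System -MaxEvents 500 |
        Where-Object { $_.ProviderName -match 'nvlddmkm' } |
        Select-Object TimeCreated, LevelDisplayName, Message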

However, as long as you're not married to VMware, you could try a different hypervisor. Regarding what Gordy Freeman said about the 760 being "very poor at being virtualised"--technically there is nothing inherently 'poor' about it at the hardware level; it's simply that it doesn't want to be virtualized. NVIDIA contrived it to be this way, if only so customers who needed this functionality had to fork over the cash for the pro-level cards (e.g., Quadro or Tesla).

However, you CAN, as I did with a similar setup, get your current system to run under KVM (and likely with much better performance than you would've gotten under ESXi). Now, the driver still throws an error, but with a little wrangling of the config files you can suppress the virtualization flag that causes the error (or simply causes the card not to be recognized). The little bit of magic which makes this possible is this: https://github.com/sk1080/nvidia-kvm-patcher.
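
(For reference, the config-file side of that under libvirt usually boils down to a stanza like this; a sketch per the commonly shared libvirt domain XML recipe, where the vendor_id value is just an arbitrary 12-character string:)

    <!-- a sketch: hide the hypervisor signature so the GeForce driver will load -->
    <features>
      <hyperv>
        <vendor_id state='on' value='whatever1234'/> <!-- any 12-char string -->
      </hyperv>
      <kvm>
        <hidden state='on'/> <!-- hide the KVM signature from CPUID -->
      </kvm>
    </features>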

The alternative to the above would be to pick up an AMD GPU. They are not hamstrung the way the NVIDIA cards are, and one would virtualize just fine given the rest of your setup (although I would *still* go with KVM).

Hope this helps.

P.S.: Just for kicks and giggles, make sure that mobo supports VT-d (in case you haven't already).

Originally posted by xSOSxHawkens:
Unless you plan to use CPU software encoding (minimum 4 dedicated threads) or plan to press the iGPU into service for encoding on the fly, you will likely have performance issues.

Why are you trying this in the first place? Don't you have a proper PC to put that 760 into, instead of trying to VM a game box...

Unless you plan to toss another 3 GPUs in and make a 4-gamers-1-PC LAN rig, there seems little reason to attempt what you are trying. Let the server be a server and get a standalone machine for gaming...

There are plenty of reasons why one might want to do this. Whole blogs have been written about it. And when done properly, on VT-x/VT-d-capable equipment, performance can match, or at least reach roughly 90% of, a corresponding bare-metal configuration.
Last edited by rotNdude; Mar 10, 2018 @ 7:49am
xSOSxHawkens Mar 9, 2018 @ 7:20pm 
Well, take the 90% best case, know that you *might* not get that all the time as it *is* best case, then understand that the 90% available takes a further 10-20% compounded hit due to streaming, unless you are leveraging a separate GPU or iGPU for the encoding.

So, what's 72-81% of a GTX 760?

Not much. Enough to get by, but not enough to expect great things from if you press it hard.

All of this is without talking about CPU (how many cores and threads does the VM get? What is the actual core speed of this chip? Make and model?)

Or talking about how you are working on a shared RAM subsystem, likely a shared disk (SSD or not), etc. What other things run on this server? How intensive are they?

Does the VM have a dedicated NIC or is it using a shared one? If it has its own NIC, does it have direct passthrough with native drivers in Windows, or does it use an emulation layer of some type, with something that isn't the native hardware driver installed inside the Windows OS?

You are right. It can be done, and there are plenty of reasons to do it, but for a single person it just seems like you are making life much harder on yourself than need be.

Any decent 3rd-gen Core i5 system or better with that 760 could sling great stuff around your house for near nothing. I know a few places here in Portland that would sell you a PC capable of slotting that 760 into for $100 with such i5s. You can press their iGPU into service with Quick Sync encoding and sling 1080p gameplay anywhere with ease, leaving the full CPU and GPU on the game.
Vince ✟ Mar 9, 2018 @ 7:57pm 
What's the point of this? I'm just curious
The point is to run an OS that respects your privacy, then be able to boot up a virtual Win10 that can use the GPU to game with, then shut down the creepersoft virtual Windows.
Eds Mar 10, 2018 @ 2:15am 
OK, so I will start by explaining the point of this:

I have an ESXi virtualisation host that runs several VMs and is on 24/7. It isn't heavily utilised, and has head room for pushing more services to it.

I wanted to try out home streaming over a VPN, and to me it made sense to get "nearly" baremetal performance by sticking a GPU into this server, passing it through to a Win 10 VM, and streaming from that, rather than ALSO keeping my PC on at home 24/7. I know I could do WOL to bring it out of sleep, but what's the point when I already have a machine that is guaranteed running? Saves me a step.

I appreciate that the 760 is old, and I am not going to be playing any triple-A titles on it. Given that I want to try and stream over the internet anyway, we are talking mainly casual or older games, things I can just do on my lunch break at work or hop in and out of from my family's homes.

I don't agree with the point about the 760 being "poor at virtualisation", as this comes down to passing the device through to a VM, basically giving the VM direct access to it. There is then no virtualisation layer between the VM and the 760.
I am not sure why there is reference to software encoding or the iGPU, as the idea is that the 760 takes care of that once it is working from within the VM.



Originally posted by RGX12:
Last time I tried this, I couldn't get it to work either. In my case, I was getting an error code on the driver; apparently you are not (although even if Device Manager in the guest shows normal I would still check the event logs). But I suspect the root cause is the same: the nasty little habit of many consumer-grade NVidia cards refusing to operate when the driver detects it's being run inside a VM. AFAIK, there is no way around this...it will not work.

P.S.: Just for kicks and giggles, make sure that mobo supports VT-d (in case you haven't already).

One thing you may want to look at if you were to try it again is an advanced parameter within the VM's configuration: hypervisor.cpuid.v0 = false
For me, when I set this parameter, the device shows no errors in Device Manager and is detected correctly. I believe this basically just tells the VM it isn't a VM, so it works around that problem.
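
For anyone following along, the resulting .vmx entries look like this (the second line is an optional extra that some passthrough guides suggest for cards with large memory BARs; treat it as an assumption to test, not a requirement):

    hypervisor.cpuid.v0 = "FALSE"
    pciPassthru.use64bitMMIO = "TRUE"

The first line hides the hypervisor CPUID bit from the guest so the GeForce driver will load; the second raises the 64-bit MMIO window for passthrough devices.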
Motherboard does support VT-d ;)

I can see the GPU does have load on it under some circumstances, so I think my issue comes down to Steam not using the 760, maybe because I have a console window open (it needs to be unlocked for the stream to start) which uses the VMware SVGA adapter, so it's either trying to use that, or it's doing it via software somehow.

I hope that clarifies why I am trying to do this, and where I am currently. I am able to stream something like Astroneer, for example, but I can see no load on the 760 using GPU-Z during the stream, so it feels like the card is not being used. One thing I am also going to try is disconnecting from the console session, connecting a monitor to the 760, logging in, and seeing if it then works.
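
(For checking that from inside the VM, here is a sketch using nvidia-smi, which ships with the driver; the install path varies by driver version, and some query fields may read N/A on older GeForce cards:)

    REM a sketch: poll GPU load and NVENC encoder sessions once per second while streaming
    cd "C:\Program Files\NVIDIA Corporation\NVSMI"
    nvidia-smi --query-gpu=utilization.gpu,encoder.stats.sessionCount --format=csv -l 1

If the encoder session count stays at 0 during a stream, Steam is falling back to software encoding rather than using the 760's NVENC block.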

Thanks
Eds
Omega Mar 10, 2018 @ 3:30am 
It might be worth asking the Linux guys; they are more knowledgeable about GPU passthrough.
shanqs Mar 10, 2018 @ 8:19am 
Connect a display to the GPU; if you haven't already noticed, the NVIDIA Control Panel will not open without one connected. And make sure it's set as the primary or only display in Windows display properties. After that, all input via the ESXi VM console will be passed to the connected display, rendering the console quite useless after setup unless you are in the same room. Once it's working, use VNC or RDP, which you should set up beforehand. Disabling the SVGA adapter via Device Manager afterwards is optional.
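
(If you'd rather remove the SVGA adapter at the VM level instead of via Device Manager, some passthrough guides mention a .vmx switch for it; a sketch, and note the ESXi console goes completely dark once it's off, so have RDP or VNC working first:)

    svga.present = "FALSE"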

I have no issues with a GTX 970 on ESXi 6.0, with near-baremetal performance even over an RDP session or streaming to a Steam Link. Granted, this is all over LAN, and your performance may vary streaming over WAN with a VPN.
Eds Mar 10, 2018 @ 12:34pm 
Originally posted by §hλŋqʂ:
Connect a display to the GPU; if you haven't already noticed, the NVIDIA Control Panel will not open without one connected. And make sure it's set as the primary or only display in Windows display properties. After that, all input via the ESXi VM console will be passed to the connected display, rendering the console quite useless after setup unless you are in the same room. Once it's working, use VNC or RDP, which you should set up beforehand. Disabling the SVGA adapter via Device Manager afterwards is optional.

I have no issues with a GTX 970 on ESXi 6.0, with near-baremetal performance even over an RDP session or streaming to a Steam Link. Granted, this is all over LAN, and your performance may vary streaming over WAN with a VPN.

God I hope this is literally all it comes down to, it will be so simple!

Have you got an actual monitor connected, or do you use one of those dummy display adaptors?
Do you need to set the machine to never lock, as Steam tells me occasionally it can't stream due to the machine being locked?
shanqs Mar 10, 2018 @ 12:47pm 
If you got the VM to recognize the card and successfully installed the NVIDIA drivers without error or BSOD, then that's most likely all you need to do.

I have an extra monitor connected to it, but you can make a dummy plug by jumping pins on a VGA connector. I never tried the plug since an extra monitor was never an issue.
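
(For anyone who does want to try it, the commonly shared recipe for a VGA dummy plug is three ~75-ohm resistors across the analog colour pins and their grounds:)

    pin 1 (red)   -> pin 6 (red ground)    via ~75 ohm resistor
    pin 2 (green) -> pin 7 (green ground)  via ~75 ohm resistor
    pin 3 (blue)  -> pin 8 (blue ground)   via ~75 ohm resistor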

For Steam Link, it will complain that the PC is locked if I'm logged into an RDP session. Disconnecting from RDP and opening the VM console to log in resolves this, but as I said, this is all done in-house, as the console is blacked out with the desktop displaying only on the connected monitor. Not sure how you would do it off-site unless you logged in before leaving the house.
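
(One commonly shared workaround for the off-site case, untested here: from inside the RDP session, kick it back to the local console with tscon, which leaves the machine unlocked at the console. It needs an elevated prompt; saving it as a .bat and running it just before you disconnect is the usual approach:)

    @echo off
    REM a sketch: redirect the current user's RDP session back to the local console,
    REM leaving it unlocked there; run elevated, from inside the RDP session
    for /f "skip=1 tokens=3" %%s in ('query user %USERNAME%') do (tscon %%s /dest:console)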

EDIT: I just checked, and the console isn't blacked out; it shows the desktop, but any input into the console only shows on the connected monitor.
Last edited by shanqs; Mar 10, 2018 @ 1:04pm
Eds Mar 10, 2018 @ 1:20pm 
So it definitely sounds like I need to have an unlocked session on the VM at all times in order for streaming to work.

Have you disabled the VMware SVGA adapter? My understanding was that if it is disabled, the console stops working. If the console doesn't work, I won't be able to unlock the session unless I have a physical monitor connected so I can see the login screen.

In that scenario, I guess a dummy plug will be no good as I won't be able to see.

Or are you saying that if you keep the SVGA adapter enabled, log into the console, but ensure the GPU is set as the default adapter, it works as expected?
shanqs Mar 10, 2018 @ 1:33pm 
There is no change in behavior whether I have the SVGA adapter enabled or disabled. If you have to use a plug or have issues with a locked session, forget streaming and just RDP in from a remote Windows PC and play. Obviously this isn't optimal unless you have enough storage on the ESXi server to copy your entire library over; otherwise, copy just the few games that you intend to play often in this manner.

All of this is unsupported; consumer cards and type 1 hypervisors were never intended to be used for gaming. My results are quite good on LAN, mainly streaming to the Link. I have run 3D benchmarks while in an RDP session with good results as well. Doing this over VPN from a remote location is going to be an entirely different experience, however. Input lag may make it unplayable even with a high-bandwidth connection.
Last edited by shanqs; Mar 10, 2018 @ 1:54pm