I have a 12 core Xeon, SuperMicro X9SRL-F, ESXi 6.5 with a Windows 10 VM on a Samsung 960 EVO SSD with PCIe passthrough of a GTX 760.
I have the device passed through to the VM; it shows up and the drivers installed correctly. However, when streaming a game from another device I don't think it is actually using the 760, as I can see no load in GPU-Z.
I have seen some mention of perhaps disabling the onboard GPU, but then ESXi has no video output.
Cheers
Eds
Why are you trying this in the first place? Don't you have a proper PC to put that 760 into instead of trying to VM a game box...
Unless you plan to toss another 3 GPUs in and make a 4-gamer-one-PC LAN rig, there seems little reason to attempt what you are trying. Let the server be a server and get a standalone machine for gaming...
Last time I tried this, I couldn't get it to work either. In my case, I was getting an error code on the driver; apparently you are not (although even if Device Manager in the guest shows normal, I would still check the event logs). But I suspect the root cause is the same: the nasty little habit many consumer-grade NVIDIA cards have of refusing to operate when the driver detects it is running inside a VM. AFAIK, there is no way around this...it will not work.
However, as long as you're not married to VMware, you could try a different hypervisor. Regarding what Gordy Freeman said about the 760 being "very poor at being virtualised": technically there is nothing inherently 'poor' about it at the hardware level; it's simply that it doesn't want to be virtualized. NVIDIA contrived it to be this way, if only so customers who needed this functionality had to fork over the cash for their pro-level cards (e.g., Quadro, Tesla, etc.).
However, you CAN, as I did with a similar setup, get your current system to run under KVM (and likely with much better performance than you would've gotten under ESXi). Now, the driver still throws an error, but with a little wrangling of the config files you can suppress the virtualization flag that causes the error (or simply causes the card not to be recognized). The little bit of magic which makes this possible is this: https://github.com/sk1080/nvidia-kvm-patcher.
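If you do go the KVM route, for what it's worth, recent libvirt can do most of that config wrangling declaratively. This is roughly the kind of domain-XML fragment people use to hide the hypervisor from the NVIDIA driver (the `vendor_id` value is an arbitrary string of my choosing, not anything special):

```xml
<features>
  <hyperv>
    <!-- spoof the Hyper-V vendor string so the driver doesn't flag it -->
    <vendor_id state='on' value='whatever123'/>
  </hyperv>
  <kvm>
    <!-- hide the KVM CPUID signature from the guest -->
    <hidden state='on'/>
  </kvm>
</features>
```

That goes inside the `<domain>` element of the guest's libvirt definition (`virsh edit <vmname>`); whether it is enough on its own, or you still need the patcher above, seems to depend on the driver version.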
The alternative to the above would be to pick up an AMD GPU. They are not hamstrung the way the NVIDIA cards are, and one would virtualize just fine given the rest of your setup (although I would *still* go with KVM).
Hope this helps.
P.S.: Just for ♥♥♥♥♥ and giggles make sure that mobo supports VT-d (in case you haven't already).
There are plenty of reasons why one might want to do this; whole blogs have been written about it. And when done properly, using VT-x/VT-d-capable equipment, performance can match, or come within about 90% of, a corresponding bare-metal configuration.
So, what's 72-81% of a GTX 760?
Not much. Enough to get by, but not enough to expect great things from if you press it hard.
All of this is without talking about the CPU (how many cores and threads does the VM get? What is the actual core speed of the chip? Make and model?).
Or about how you are working on a shared RAM subsystem, likely a shared disk (SSD or not), etc. What other things run on this server? How intensive are they?
Does the VM have a dedicated NIC, or is it using a shared one? If it has its own NIC, does it have direct passthrough with native drivers in Windows, or does it use an emulation layer of some type, with a driver that isn't hardware-specific installed inside the Windows OS?
You are right. It can be done, and there are plenty of reasons to do it, but for a single person it just seems like you are making life much harder on yourself than it need be.
Any decent 3rd-gen Core i5 system or better with that 760 could sling great stuff around your house for near nothing. I know a few places here in Portland that would sell you a PC with such an i5, capable of slotting that 760 in, for $100. You can press its iGPU into use with Quick Sync encoding and sling 1080p gameplay, with the full CPU and GPU left to the game, to anywhere with ease.
I have an ESXi virtualisation host that runs several VMs and is on 24/7. It isn't heavily utilised and has headroom for pushing more services to it.
I wanted to try out home streaming over a VPN, and to me it made sense to get "nearly" baremetal performance by sticking a GPU into this server, passing it through to a Win 10 VM, and streaming from that, rather than ALSO keeping my PC on at home 24/7. I know I could do WOL to bring it out of sleep, but what's the point when I already have a machine that is guaranteed running? Saves me a step.
I appreciate that the 760 is old, and I am not going to be playing any triple-A titles on it. Given that I want to try to stream over the internet anyway, we are talking mainly casual or older games: things I can just play on my lunch break at work or hop in and out of from my family's homes.
I don't agree with the point about the 760 being "poor at virtualisation", as this comes down to passing the device through to a VM, basically giving the VM direct access to it. There is then no virtualisation layer between the VM and the 760.
Not sure why there is reference to software encoding or the iGPU, as the idea is that the 760 takes care of that once working from within the VM.
One thing you may want to look at if you were to try it again, is an advanced parameter within the VM called hypervisor.cpuid.v0 = false
For me, when I set this parameter, the device shows no errors in Device Manager and is detected correctly. I believe it basically just tells the VM it isn't a VM, which works around that problem.
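For anyone reading along who wants to try it: the setting goes under the VM's advanced configuration parameters in the vSphere client, or straight into the .vmx file (with the VM powered off); as I understand it, it masks the hypervisor bit in CPUID so the guest driver stops detecting the VM:

```
hypervisor.cpuid.v0 = "FALSE"
```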
Motherboard does support VT-d ;)
I can see the GPU does have load on it under some circumstances, so I think my issue comes down to Steam not using the 760, maybe because I have a console window open (it needs to be unlocked for the stream to start), which uses the VMware SVGA adapter; so it's either trying to use that, or it's doing it in software somehow.
I hope that clarifies why I am trying to do this, and where I currently am. I am able to stream something like Astroneer, for example, but I can see no load on the 760 in GPU-Z during the stream, so it feels like the card is not being used. One thing I am also going to try is disconnecting from the console session, connecting a monitor to the 760, logging in, and seeing if it then works.
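In the meantime, a rough way to watch the card's load from inside the guest without GPU-Z, assuming the nvidia-smi tool that ships with the NVIDIA driver is available in the VM (the exact query fields below are the standard ones; tweak to taste):

```python
import subprocess

def gpu_utilization(smi_output):
    """Parse lines of 'name, utilization.gpu' CSV, as produced by
    nvidia-smi --query-gpu=name,utilization.gpu --format=csv,noheader,nounits,
    into a {gpu_name: percent_load} dict."""
    util = {}
    for line in smi_output.strip().splitlines():
        name, pct = (field.strip() for field in line.split(","))
        util[name] = int(pct)
    return util

try:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=name,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    print(gpu_utilization(out))
except (OSError, subprocess.CalledProcessError):
    print("nvidia-smi not available -- run inside the guest with the NVIDIA driver installed")
```

If the 760 sits at 0% while a stream is running, that would confirm Steam is rendering/encoding somewhere else (the SVGA adapter or software).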
Thanks
Eds
I have no issues with a GTX 970 on ESXi 6.0, with near-baremetal performance even over an RDP session or streaming to the Link. Granted, this is all over LAN, and your performance may vary streaming over WAN with a VPN.
God I hope this is literally all it comes down to, it will be so simple!
Have you got an actual monitor connected, or do you use one of those dummy display adaptors?
Do you need to set the machine to never lock, as Steam tells me occasionally it can't stream due to the machine being locked?
I have an extra monitor connected to it, but you can use a dummy plug made by jumping pins on a VGA connector (the usual trick is tying each of the R/G/B pins to its ground through a ~75 ohm resistor so the card thinks a display is attached). I've never tried the plug, since an extra monitor was never an issue.
For Steam Link, it will complain that the PC is locked if I'm logged into an RDP session. Disconnecting from RDP and opening the VM console to log in resolves this, but as I said, this is all done in-house, as the console is blacked out, with the desktop displaying only on the connected monitor. I'm not sure how you would do it off-site unless you logged in before leaving the house.
EDIT: I just checked, and the console isn't blacked out; it shows the desktop, but any input into the console only shows on the connected monitor.
Have you disabled the VMware SVGA adaptor? My understanding was that if it is disabled, the console stops working. If the console doesn't work, I won't be able to unlock the machine unless I have a physical monitor connected so I can see the login screen.
In that scenario, I guess a dummy plug will be no good as I won't be able to see.
Or, are you saying that if you keep the SVGA adapter enabled, log into the console, but ensure the GPU is set as the default adaptor, it works as expected?
All of this is unsupported with consumer cards, and type 1 hypervisors were never intended to be used for gaming. My results are quite good on LAN, mainly streaming to the Link. I have run 3D benchmarks while in an RDP session with good results as well. Doing this over a VPN from a remote location is going to be an entirely different experience, however; input lag may make it unplayable even with a high-bandwidth connection.