Steam Link

Reck Nov 1, 2015 @ 8:37
Hardware Encoding - Which is best, CPU or GPU?
If you have a Sandy Bridge Intel CPU or newer, you can encode/decode using QuickSync, or the GPU can encode/decode the stream instead.

1. Is it safe to assume that hardware encoding is always better and should be turned on rather than relying on software encoding?
2. How can I tell if my CPU or GPU is doing the encoding?
3. How can I switch between using the CPU or GPU for encoding?
4. Which is best for Steam Link encoding/decoding, CPU or GPU?

I have an i7 Sandy Bridge and an Nvidia GeForce 970.

Thanks
Greg Nov 1, 2015 @ 16:01
1. Yes - unless it doesn't work.

2. The Steam performance overlay will tell you which encoder is being used. If the encoder says it is using "threads", then it is software encoding. For your card you want either NVIFR or NVFBC (see the log-scanning sketch at the end of this post).

3. Turn off hardware encoding to have the CPU software-encode the frame. Getting QuickSync to encode alongside a discrete GPU requires quite a bit of finagling, and the result is not worth it: the display latency is just as bad as software encoding, if not worse.

If you truly want to do it, I believe you would have to uninstall your drivers completely using DDU, reinstall the Nvidia driver ONLY (do NOT install GeForce Experience), and then make sure the Intel graphics software is installed so QuickSync will work.

4. The GeForce 970 should work just fine. If you want to upgrade from there, the only thing I could recommend is the 980 Ti. If you're having issues, it sounds like your hardware encoder is not working; try installing the GeForce Experience software.
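
If you'd rather not eyeball the overlay each time, you can scan Steam's streaming log for the encoder name. A minimal Python sketch; the log location and the exact strings Steam writes are assumptions, so adjust both for your install:

    from pathlib import Path

    # Assumed log location -- adjust for your Steam install.
    LOG = Path(r"C:\Program Files (x86)\Steam\logs\streaming_log.txt")

    # Encoder names to look for; "threads" in the encoder line means software.
    NAMES = ("NVFBC", "NVIFR", "QuickSync", "threads")

    def last_encoder_line(path):
        """Return the most recent log line mentioning a known encoder, if any."""
        hit = None
        for line in path.read_text(errors="ignore").splitlines():
            if any(n.lower() in line.lower() for n in NAMES):
                hit = line.strip()
        return hit

    print(last_encoder_line(LOG) or "no encoder mentioned; check the overlay")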
Last edited by Greg; Nov 1, 2015 @ 16:08
Reck Nov 2, 2015 @ 4:01
Hi Greg,

Sounds like I'm not really understanding Intel Quick Sync, then. It sounded to me like newer CPUs (Sandy Bridge and later) had hardware encoding/decoding built into the chip.

https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video: "Intel Quick Sync Video is the name given to Intel's hardware video encoding and decoding technology integrated into some of its CPUs."

So I thought you could use either the CPU or the GPU to encode, both being hardware-based, not software. If your GPU wasn't good enough and your CPU was older than Sandy Bridge, then you would have to resort to software.


So I thought there were 3 options for encoding.

1. CPU
2. GPU
3. Software

From your post it sounds like it's either GPU or software?

Turn off hardware encoding to have the CPU software-encode the frame

"CPU software"? This is what's confusing me; I thought this would be done with the CPU's hardware.




Mobski Nov 2, 2015 @ 6:29
You are almost correct.

It is either hardware encoding or software encoding.

Hardware encoding means the encoding is hardware accelerated: a piece of the chip has been specifically designed for a single task, at which it excels.
Software encoding means the encoding is handled by software that runs on a general-purpose processing unit; in our case, the CPU.

Because we are talking PCs, it goes like this:
Software encoding: done on the CPU, which is the general-purpose processor of the computer. It runs anything and everything you throw at it.
Hardware encoding: done on a chip which is specifically designed for audio/video encoding (and/or decoding). This chip is incorporated in a lot of current GPUs; even the later models of integrated Intel GPUs have hardware encoding. Fun fact: if you have a media player (Popcorn Hour, Xtreamer, Chromecast, a smart TV), it also has hardware decoding, but I wouldn't call it a GPU.

I hope this helps clear it up some more.

tl;dr
software encoding = moderate - high CPU load
hardware encoding = very low - low GPU load
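
If you want to see the load difference for yourself, here is a rough Python sketch that times a software encode against a hardware one using ffmpeg. It assumes ffmpeg is on your PATH, was built with NVENC support, and that input.mp4 is any test clip you have; watch your CPU load while each pass runs:

    import subprocess
    import time

    def timed_encode(codec):
        """Encode input.mp4 with the given video codec, discard the output,
        and return the wall-clock time in seconds."""
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, "-f", "null", "-"],
            check=True, capture_output=True,
        )
        return time.perf_counter() - start

    for codec in ("libx264", "h264_nvenc"):  # software x264 vs Nvidia hardware
        print(codec, round(timed_encode(codec), 1), "seconds")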
Last edited by Mobski; Nov 2, 2015 @ 6:31
Greg Nov 2, 2015 @ 10:23
Originally posted by Reck:
Hi Greg,
-
So I thought there were 3 options for encoding.

1. CPU
2. GPU
3. Software

From your post it sounds like it's either GPU or software?

I was saying you should ONLY use the GPU or software. You are correct, though: it IS physically possible to use the CPU's QuickSync to encode a GPU frame. But this easily introduces an extra order of magnitude of latency. The CPU has to copy the frame to RAM, encode it with QuickSync, copy it back to RAM again, and then finally send it over the network.

With "GPU hardware encoding" the CPU pulls the encoded frame directly off the GPU and sends it over the network. The transfer of the frame over the network could potentially begin before the entirety of the frame is even copied off the GPU. This results in a massive speedup.

If you were using the Intel graphics hardware as your main GPU, it would be exactly the same thing: the Intel IGP would encode the frame in hardware, and there would be only one copy into system RAM before going over the network.

Long story short: you are right, but it is NOT something you want to do when using discrete graphics. QuickSync is GREAT for decoding on client machines, though, which is probably why you see it mentioned so much.
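
To make the copy overhead concrete, here is a toy Python model of the two paths. The millisecond costs are invented placeholders, not measurements; the point is only which steps each path has to take:

    # Hypothetical per-step costs in milliseconds -- illustration only.
    GPU_ENCODE_PATH = {                       # discrete GPU encodes its own frame
        "encode on GPU": 2.0,
        "copy encoded frame to RAM": 0.5,     # small: frame is already compressed
        "send over network": 1.0,
    }
    QUICKSYNC_PATH = {                        # frame rendered on dGPU, encoded on IGP
        "copy raw frame GPU -> RAM": 4.0,     # large: frame is uncompressed
        "encode with QuickSync": 2.0,
        "copy encoded frame back to RAM": 0.5,
        "send over network": 1.0,
    }

    for name, steps in [("GPU hardware encode", GPU_ENCODE_PATH),
                        ("QuickSync on a dGPU frame", QUICKSYNC_PATH)]:
        print(name, sum(steps.values()), "ms total")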
Last edited by Greg; Nov 2, 2015 @ 11:26
Reck Nov 4, 2015 @ 3:51
Greg, Mobski: thanks for taking the time to explain that. Sounds like GPU encoding is the way to go, then. I'd imagine I'll be OK with a GTX 970.
Galland Nov 4, 2015 @ 5:41
So my encoder is listed as "Desktop DWM NV12 + Intel QuickSync D3D11", which sounds like the undesirable one...? How do I force it to use the GPU hardware listed above, NVIFR or NVFBC? I have a GTX 970M + Haswell processor. I have hardware encoding and decoding checked on the host and client. I feel the latency is very high.

Valve needs to clearly explain this stuff, otherwise consumer-level "computer on the big screen" tech doesn't stand a chance!
Last edited by Galland; Nov 4, 2015 @ 5:43
Greg Nov 4, 2015 @ 7:46
Install the "GeForce Experience" software.

If that does not work then attempt to disable your integrated graphics card temporarily to see if that resolves it.

If all else fails, try switching to windowed mode.
Galland Nov 4, 2015 @ 15:29
OK, so I installed GeForce Experience and played around with disabling my integrated Intel 4300 chip, and I learned that Steam Link on a laptop is complicated!

I got it to use "NVFBC H264" as the hardware encoder by forcing Steam, via the Nvidia Control Panel, to use the "High Performance Nvidia Processor" (i.e. my 970M), but it just streams a black screen in the Steam interface. Upon launching a game from the host computer, it looks and plays great on the TV (client); best quality graphics of the bunch here.

If I set Steam to "auto select" the GPU it uses to run, then it streams with QuickSync with hardware encoding enabled... until I launch a game like Mad Max, which runs in fullscreen and uses the "NVIFR H264" encoder. This is the best performance I get on the Steam Link: fast, no input/display lag, and no screen flickering.

If I launch Arkham Knight with hardware encoding enabled, however, it stays with the Intel QuickSync encoder and the Link's performance is pretty bad: laggy, with lots of artifacting. This is because I have to set AK to borderless windowed mode in order to display properly (a known bug on mobile graphics chipsets, along with the thousands of other problems with this game!). In borderless windowed mode the desktop graphics properties are applied, including the QuickSync encoder.

TL;DR

When I force the Link to use NVFBC H264 as its hardware encoder, it performs at its best, but I cannot see the Steam interface on the client and must launch games from the host.

If I force the Link to use QuickSync as the hardware encoder, performance and graphics are just meh.

If I use software encoding, everything is pretty good; however, that breaks down quickly in a very CPU-demanding game.

Last edited by Galland; Nov 4, 2015 @ 15:55
Greg Nov 4, 2015 @ 16:17
When I force the Link to use NVFBC H264 as its hardware encoder, it performs at its best, but I cannot see the Steam interface on the client and must launch games from the host.

What happens if you minimize big picture mode and then maximize it again?

and played around with disabling my integrated Intel 4300 chip

I'm kind of confused, because most of your results involve it using QuickSync at one time or another. Is it still using QuickSync after you have disabled the IGP?

If I launch Arkham Knight with hardware encoding enabled, however, it stays with the Intel QuickSync encoder and the Link's performance is pretty bad.

There are a handful of games out there which, for some reason or another, don't work with Nvidia hardware encoding. XCOM is one that comes to mind right off the bat. Homeworld Remastered used to be one, but they seem to have fixed it. I've seen a few people mention that Arkham Knight isn't working while streaming for one reason or another. Could you try another game?

Mad Max which runs in Fullscreen and uses the "NVIFR H264" encoder. This is the best performance I get on the Steam Link, fast, no input/display lag, and no screen flickering.

This is how it should always work. NVIFR performs slightly better than NVFBC (game a-sync vs. desktop really matters here too), although in general the best way to make it work is to run the game in fullscreen. Here's a small bit from the Nvidia GRID FAQ that might shed some light on the differences between the two:

Originally posted by Nvidia GRID SDK FAQ:

A3) The GRID SDK consists of two component software APIs: NvFBC and NvIFR.
a) NvFBC captures (and optionally H.264 encodes) the entire visible desktop
b) NvIFR captures (and optionally H.264 encodes) from a specific render target

NvIFR is the preferred solution to capture the video output of one specific
application. NvFBC is better for remote desktop applications

So while NvFBC will be the more compatible of the two, as it captures the entire visible desktop, NvIFR should perform better because it only captures the output from the game. It's also able to do this at an earlier stage in the rendering pipeline, whereas NvFBC has to wait until the render target is copied to the system framebuffer.
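
Distilled into a rule of thumb (a purely hypothetical helper, not Steam's actual selection logic):

    def pick_capture_api(game_is_fullscreen):
        """NvIFR grabs one render target; NvFBC grabs the whole desktop."""
        if game_is_fullscreen:
            return "NvIFR"  # capture the game's render target directly
        return "NvFBC"      # windowed/desktop: capture the composited desktop

    print(pick_capture_api(True))   # -> NvIFR
    print(pick_capture_api(False))  # -> NvFBC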

It sounds like your biggest issue here really is big picture mode.


Last edited by Greg; Nov 4, 2015 @ 16:21
Galland Nov 4, 2015 @ 21:45
Originally posted by Greg:
It sounds like your biggest issue here really is big picture mode.

This was the truth. I was able to see Big Picture mode after alt-tabbing out and back into it, but I wanted a solution that didn't get me off the couch. I searched a bit for similar issues with black screens using Big Picture mode and learned that it was a known issue with Windows 8.1... so I bit the bullet and upgraded to Windows 10; I had also read some stories of Arkham Knight behaving much nicer after the OS "upgrade."

Sure enough, this seems to have solved everything... allowing Steam to "auto select" its GPU at launch lets it use my integrated Intel graphics processor to run the Steam interface, including Big Picture mode, and stream that flawlessly using QuickSync. Upon launching most games, it will now auto-switch to the Nvidia GPU and begin using the NvIFR encoder, which performs very nicely on my wireless setup.

Thanks for walking me through how this all plays together; I consider myself fortunate that I'm streaming 1080p in beautiful quality on a wirelessly connected laptop with Optimus tech!
Last edited by Galland; Nov 5, 2015 @ 4:57
dannyhefc69 Feb 25, 2016 @ 5:47
I've got an AMD R9 290X and an i7 4790, and I can't seem to get a solid 60 fps; the frame rate goes up and down the more intense the scenes are. What would be best, hardware or software? It also gives an option to pick how many threads to use. Would 8 be best, or say 2, leaving 6 threads to handle the game?
Quad Feb 25, 2016 @ 6:19
Originally posted by Reck:
If you have a Sandy Bridge Intel CPU or newer, you can encode/decode using QuickSync, or the GPU can encode/decode the stream instead.

1. Is it safe to assume that hardware encoding is always better and should be turned on rather than relying on software encoding?
2. How can I tell if my CPU or GPU is doing the encoding?
3. How can I switch between using the CPU or GPU for encoding?
4. Which is best for Steam Link encoding/decoding, CPU or GPU?

I have an i7 Sandy Bridge and an Nvidia GeForce 970.

Thanks

1. Yes. In addition to being quicker and not using 100% CPU, it also uses significantly less power; for a high-end i7 we're talking 100-200 W less (see the quick math at the end of this post).

2. Enable "Display performance information" in the streaming settings and then press F6; it will tell you which encoder it is using.

3. Launch Big Picture Mode and then go to Settings > In-home streaming settings. Uncheck the ones you don't want it to use.

4. In your case, the GPU. The encoder ASIC on your GPU is higher quality and faster than the encoder ASIC on your i7.


The 970's hardware encoding is better in every way; don't even bother to use the Sandy Bridge i7.
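
Quick math on that power claim, with assumed numbers (the midpoint of the 100-200 W range, plus a guessed daily streaming time and electricity price; plug in your own):

    watts_saved = 150        # midpoint of the quoted 100-200 W range
    hours_per_day = 3        # assumed streaming time per day
    price_per_kwh = 0.15     # assumed electricity price in USD

    kwh_per_year = watts_saved / 1000 * hours_per_day * 365
    print(round(kwh_per_year), "kWh per year, about",
          round(kwh_per_year * price_per_kwh), "USD per year")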
Last edited by Quad; Feb 25, 2016 @ 6:27