Steam In-Home Streaming @ 1080p jumps from 60 to 30 FPS [SOLVED]
Epsilons Jun 26, 2014 @ 11:11pm
First of all, here is the setup that I use for In-Home Streaming:

NETWORK :
  • Router : NETGEAR Nighthawk AC1900 (dual-core 1 GHz, 384 MB RAM)
  • Host and client are both connected over wired Ethernet

STREAMING SETTINGS FOR HOST & CLIENT :

HOST :
Hardware encoding : Yes (supported for NVIDIA cards since today's Steam patch)
Prioritize Network Traffic : Yes

CLIENT :
Options : Balanced
Bandwidth : Unlimited
Resolution : 1920x1080 (1080p)
Hardware decoding : Yes
Display performance information : Yes

STREAMING LATENCY @ 1080p
Input : 0.20ms to 0.80ms
Display : 45ms to 55ms
Ping time : 2ms to 3ms
Incoming bitrate : 60000 kbit/sec video : 60000 kbit/sec
Outgoing bitrate : 385 kbit/sec
Estimated bandwidth : 93Mbps
Packet loss : 0.00% (0.00% frame loss)
*** Jumps from 60 FPS to 30 FPS randomly while playing ***

STREAMING LATENCY @ 720p
Input : 0.20ms to 0.80ms
Display : 15ms to 25ms
Ping time : 1ms to 2ms
Incoming bitrate : 40000 kbit/sec video : 40000 kbit/sec
Outgoing bitrate : 140 kbit/sec
Estimated bandwidth : 89Mbps
Packet loss : 0.00% (0.00% frame loss)
*** Steady 60 FPS all the time ***

When I play Battlefield 4 with In-Home Streaming @ 1080p, I can play for 20 to 40 minutes with a steady 60 FPS, and then the stream suddenly drops to 30 FPS (but I can see that the game itself still runs at 60 FPS with the MSI Afterburner and RivaTuner Statistics in-game overlay). So it only affects the stream. I tried disabling VSYNC, but it still drops to 30 FPS randomly while playing.

Can I do something to play at 1080p without dropping to 30 FPS?

Also, are there any tricks to reduce the display latency when streaming at 1080p? Like changes to the router options to improve streaming, or tweaks in Windows/Steam/Linux?
Last edited by Epsilons; Jul 14, 2014 @ 3:31pm
Elettrone Jun 27, 2014 @ 6:37am 
Force the bandwidth to 10000... the more you give it, the more power it needs.
It works for me, at least.
Sir Ohsis Jun 27, 2014 @ 11:42am 
Is your decode machine getting hot? I was having this issue before I got hardware decode going on my AMD 7850K APU. After running a while in software decode with 3 threads, the CPU fan would spin up and my framerate would cap at 30 FPS. After I got hardware decoding going through VDPAU, everything stays much cooler in my super-cramped ITX box. I know you are already using hardware decode, so you may want to check the temps on your video card, although it doesn't seem like streaming should put much strain on it. I played for a good hour and a half without frame drops yesterday.
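
If you want to keep an eye on that while you stream, here is a rough Python sketch that polls the Linux thermal zones (just a starting point - the sysfs path is standard, but which zone maps to your CPU or GPU varies from board to board):

import glob
import time

def read_temps():
    # Map each thermal zone's "type" label to its temperature in Celsius.
    temps = {}
    for zone in glob.glob("/sys/class/thermal/thermal_zone*"):
        try:
            with open(zone + "/type") as f:
                name = f.read().strip()
            with open(zone + "/temp") as f:
                millideg = int(f.read().strip())  # sysfs reports millidegrees C
            temps[name] = millideg / 1000.0
        except (OSError, ValueError):
            continue  # some zones are unreadable; skip them
    return temps

while True:
    readings = read_temps()
    print("  ".join(f"{n}: {t:.1f}C" for n, t in sorted(readings.items())))
    time.sleep(5)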

Host:
i7 3770K @ 4.5 GHz - using software encode; QuickSync seems to add latency
16 GB DDR3-1600 RAM, CAS 9
Radeon 7950 - Catalyst 14.4
SB X-Fi Titanium HD
1 TB Samsung SSD
Gigabyte G1 Sniper M3
Win 8.1 64-bit

Client
AMD A10 7850K - VDPAU hardware decode - open-source radeonsi driver
8 GB DDR3-2133 RAM, CAS 10
120 GB Kingston SSD
ASRock FM2A88X-ITX+
Mint MATE 17 + xorg-edgers PPA

Steam:
Beautiful Preset
auto bandwidth
1080p

Network:
Wired GigE
Dlink DIR 655

I was getting about 37-45 ms display latency in Borderlands 2, less in stuff like SteamWorld Dig. Add 10-15 ms with QuickSync on for hardware encode - I still want to play around more with this when I have time. I'm using about 3 times the bandwidth you are; might just be the Beautiful preset.
Epsilons Jun 28, 2014 @ 12:37pm 
I found a way to fix my problem. Here are my new final settings:

HOST
Hardware encoding : Yes (supported for NVIDIA cards since the latest Steam patch)
Prioritize Network Traffic : Yes

CLIENT
Options : Balanced
Bandwidth : Automatic (** this is what fixed the frame drops for me **)
Resolution : 1920x1080 (1080p)
Hardware decoding : Yes
Display performance information : Yes

STREAMING LATENCY @ 1080p Balanced
Input : 0.20ms to 0.80ms
Display : 25ms to 30ms
Ping time : 1ms to 2ms
Incoming bitrate : 60000 kbit/sec video : 60000 kbit/sec
Outgoing bitrate : 385 kbit/sec
Estimated bandwidth : 93Mbps
Packet loss : 0.00% (0.00% frame loss)
*** Steady 60 FPS in Battlefield 4, no frame drops ***

And to push In-Home Streaming to its limit: if I switch the client options to Beautiful instead of Balanced, my display latency goes from 25-30 ms to 45-50 ms, and I sometimes get frame drops to 30 FPS in Battlefield 4...

I think Balanced graphics + Automatic bandwidth is the best setting for a 1080p-capable machine. Maybe in the future I'll be able to run the Beautiful setting if they optimize hardware encoding and decoding for NVIDIA graphics cards. Anyway, I prefer 25-30 ms of display latency; 45-50 ms is way too much to play multiplayer Battlefield.

Non-multiplayer games like Watch_Dogs or Tomb Raider run just great on the Beautiful setting, since display latency is not much of an issue in a singleplayer game, and they don't drop to 30 FPS. I think a 64-player multiplayer game like BF4 puts much more strain on the CPU/GPU.
Last edited by Epsilons; Jun 30, 2014 @ 9:45pm
Sir Ohsis Jun 28, 2014 @ 7:48pm 
You may want to double-check your streaming_log.txt in steam\logs\ on your host machine to make sure you are actually getting hardware decode on the client. It may just be BF4, but in the games I have been trying, the Beautiful preset has hardly affected my display latency. My last session of Far Cry 3 follows; note DecoderName and AvgFrameMS. At any rate, glad you found some settings that are working for you.


[2014-06-28 22:29:23] "SessionStats"
{
"TimeSubmitted" "1404008963"
"ResolutionX" "1920"
"ResolutionY" "1080"
"CaptureName" "Desktop DWM NV12 + libx264 main (4 threads)"
"DecoderName" "VDPAU hardware decoding" <---should see something like this
"BandwidthLimit" "30000"
"FramerateLimit" "0"
"SlowSeconds" "0"
"SlowGamePercent" "0"
"SlowCapturePercent" "0"
"SlowConvertPercent" "0"
"SlowEncodePercent" "0"
"SlowNetworkPercent" "0"
"SlowDecodePercent" "0"
"AvgClientBitrate" "115.09413146972656"
"StdDevClientBitrate" "42.9027099609375"
"AvgServerBitrate" "27859.912109375"
"StdDevServerBitrate" "21360.64453125"
"AvgLinkBandwidth" "273604.21875"
"AvgPingMS" "0.60825085639953613"
"StdDevPingMS" "0.088371507823467255"
"AvgCaptureMS" "8.9190597534179687"
"StdDevCaptureMS" "5.459230899810791"
"AvgConvertMS" "9.9326658248901367"
"StdDevConvertMS" "5.1059308052062988"
"AvgEncodeMS" "10.45759105682373"
"StdDevEncodeMS" "2.3086166381835938"
"AvgNetworkMS" "0.37150111794471741"
"StdDevNetworkMS" "0.26619437336921692"
"AvgDecodeMS" "2.5262541770935059"
"StdDevDecodeMS" "1.1475206613540649"
"AvgDisplayMS" "12.987998962402344"
"StdDevDisplayMS" "4.6667218208312988"
"AvgFrameMS" "37.036563873291016" <--- I think this is f6 display latancy
"StdDevFrameMS" "6.5792956352233887"
"AvgFPS" "59.641178131103516"
"StdDevFPS" "4.3443069458007812"
}
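
If you don't feel like scrolling through the log by hand, a quick Python sketch like the one below can pull those fields out of the most recent session. This assumes the "Key" "Value" layout shown above holds, and the path is the default Windows install location - adjust it for your setup:

import re

LOG_PATH = r"C:\Program Files (x86)\Steam\logs\streaming_log.txt"
FIELDS = ("CaptureName", "DecoderName", "AvgFrameMS", "AvgFPS")

with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    text = f.read()

# Each session is logged as: "SessionStats" { "Key" "Value" ... }
blocks = re.findall(r'"SessionStats"\s*\{(.*?)\}', text, re.DOTALL)
if not blocks:
    raise SystemExit("no SessionStats block found")

stats = dict(re.findall(r'"([^"]+)"\s+"([^"]*)"', blocks[-1]))
for key in FIELDS:
    print(key, "=", stats.get(key, "<missing>"))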
Epsilons Jun 28, 2014 @ 7:55pm 
How do I know if I have hardware decoding? Is VDPAU hardware decoding for NVIDIA graphics cards?
Last edited by Epsilons; Jun 30, 2014 @ 9:41pm
Sir Ohsis Jun 29, 2014 @ 5:09pm 
Yes, NVIDIA supports VDPAU on Linux, so it should say something like that in your log. If the decoder name in your log is something like libx264 with a number of threads listed, it's using software decode for some reason. I know Steam requires some 32-bit libraries to get things working on Radeon and Linux Mint, but I have no experience with SteamOS and NVIDIA.

Also, your network is not running at 1 gigabit; it's running at 100 megabit. Check your connections and make sure your Ethernet cords are gigabit-ready. My estimated bandwidth at 720p is in the 500-megabit range, never as low as 90.
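
On a Linux client you can read the negotiated link speed straight out of sysfs. Rough sketch below - "eth0" is just an example name, list your interfaces with "ip link" and substitute yours:

IFACE = "eth0"  # example name; substitute your interface

with open(f"/sys/class/net/{IFACE}/speed") as f:
    mbps = int(f.read().strip())  # negotiated speed in Mb/s

print(f"{IFACE} negotiated {mbps} Mb/s")
if mbps < 1000:
    print("Not gigabit - check the cable, the switch port,"
          " and anything the cable passes through.")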
Epsilons Jun 30, 2014 @ 9:40pm 
OK, I fixed my problem: I'm now running at 1 Gbps. My issue was that I had plugged my Ethernet cords into my TRIPP-LITE surge protector, which is limited to 100 Mbps -_- ! I'll never plug an Ethernet cable into a surge protector again lol.

My results with a 1 Gbps (1000 Mbps) LAN:

NETWORK :
  • Router : NETGEAR Nighthawk AC1900 (dual-core 1 GHz, 384 MB RAM)
  • Host and client are both connected over a 1 Gbit/s (1000 Mbps) CAT 7 wired connection


I am so happy that a little modification to my network setup has made such an improvement in my streaming. I can now handle the Beautiful setting at 1080p without lag or frame drops.

I'm getting 280 Mbps out of my 1 Gbps LAN in Battlefield 4, and something similar in Tomb Raider. But in other games I get 1500 to 2100 Mbps? Skyrim, Crysis 3, Watch Dogs, and South Park: The Stick of Truth all show 2000+ Mbps with 5 to 10 ms of display latency. It's awesome, but why do BF4 and Tomb Raider only get ~280 Mbps? If I set bandwidth to Unlimited instead of Automatic, I get 1500 Mbps in Battlefield 4, but the game is laggy, drops to 30 FPS, and display latency is higher compared to Automatic bandwidth @ 280 Mbps, which runs at 60 FPS all the time without any lag.

But if I set bandwidth to Automatic, I can run every game at 1080p / Beautiful without any lag, at 60 frames per second, with all settings maxed out (ultra settings, AA maxed out, etc.) in every game I've tested so far. But BF4 averages 25 ms of display latency at 280 Mbps with Automatic bandwidth, while the games running in the 1500-2000 Mbps range have only 5 to 10 ms of display latency! South Park: The Stick of Truth runs at 2100 Mbps with 1 to 2 ms of display latency...

Also, 1 Gbit/s is supposed to mean 1000 Mbps, I guess? So why am I seeing 2100 Mbps in some games? I guess it's because I have a Netgear Nighthawk R7000 capable of 1900 Mbps?

Originally posted by Netgear Nighthawk R7000:
Optimize your online gaming and streaming with NETGEAR Nighthawk, with speeds up to 1900 Mbps

But my CAT 5e network cords only support 1000 Mbps, which is why I'm wondering how it's capable of 2100 Mbps lol...

ALL GAMES ARE MAXED OUT (ULTRA SETTINGS, MAX AA, ETC.) WITH THE BEAUTIFUL SETTING AT 1080p

BF4, Unlimited bandwidth @ 1502 Mbps
*** Laggy, and sometimes drops to 30 FPS; Automatic bandwidth runs at 280 Mbps with better latency and no FPS drops ***
http://i.imgur.com/gw9ppIm.jpg

Crysis 3, Automatic bandwidth @ 1734 Mbps
*** Runs perfectly at 45 to 60 FPS with very low latency ***
http://i.imgur.com/srX5f9U.jpg

Skyrim, Automatic bandwidth @ 1968 Mbps
*** Runs perfectly at 60 FPS all the time with very low latency ***
http://i.imgur.com/PviNlKX.jpg

Tomb Raider, Automatic bandwidth @ 337 Mbps
*** Runs perfectly at 60 FPS all the time with low latency ***
http://i.imgur.com/GUcy6qr.jpg

South Park, Automatic bandwidth @ 2069 Mbps
*** Runs perfectly all the time with EXTRA low latency (1-2 ms) ***
http://i.imgur.com/uSRNyZW.jpg

I'm still trying to understand why some games run at 2000+ Mbps while others run at only ~300 Mbps...!

Also, every game I've played with the Unlimited bandwidth setting was laggier than with Automatic bandwidth. In my opinion, Unlimited bandwidth is a bad option that makes the In-Home Streaming experience worse.
Last edited by Epsilons; Jul 14, 2014 @ 3:31pm
bogd Nov 2, 2015 @ 5:34am 
I know this is an old thread, but I have to clarify one thing: there is absolutely no way you're getting more than 1000 Mbps out of your cable! :) Gigabit Ethernet means 1 gigabit per second, and that includes all the protocol overhead.

The actual speed in your screenshots is around 20-30 Mbps. The "estimated bandwidth" is just that - an estimate of what Steam _thinks_ it might be able to use. And it is most definitely wrong in this case...
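
For the record, even the theoretical best case on gigabit Ethernet is below 1000 Mbps of useful data once you count the framing and headers. Here is the back-of-the-envelope math, assuming a standard 1500-byte MTU and plain TCP/IPv4 (an illustration, not a benchmark):

LINE_RATE_MBPS = 1000            # gigabit Ethernet line rate
MTU = 1500                       # standard Ethernet payload size
ETH_OVERHEAD = 8 + 14 + 4 + 12   # preamble + header + FCS + inter-frame gap
TCP_IP = 20 + 20                 # IPv4 + TCP headers, no options

wire = MTU + ETH_OVERHEAD        # 1538 bytes actually on the wire per frame
data = MTU - TCP_IP              # 1460 bytes of application data per frame

print(f"max TCP goodput: {LINE_RATE_MBPS * data / wire:.0f} Mbps")  # ~949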
Epsilons Nov 2, 2015 @ 6:56am 
Meanwhile, my router is capable of 2300 Mbps. And the Skyrim screenshot I took shows 1968 Mbps estimated bandwidth, and 2069 Mbps for South Park: The Stick of Truth.

I'm using a shielded CAT 7 wired network with a NETGEAR Nighthawk R7000 ($350 router).
bogd Nov 3, 2015 @ 11:20am 
@Epsilons: no, it's not capable of that... :) That router has 5 Gigabit Ethernet ports (as in "1000 Mbps" :P ). And no matter whether you use Cat 5e, Cat 7, or anything else, that interface will never be able to carry more than 1000 Mbps.

You were probably looking at the wireless speeds, which the tech specs for the router list as "1900 Mbps (600+1300)". But you need to keep in mind that with wireless those speeds are purely theoretical, and you will never get them in your network. The same specs (from Netgear's site) have a footnote that says:

"Maximum wireless signal rate derived from IEEE standard 802.11 specifications. Actual data throughput and wireless coverage will vary. Network conditions and environmental factors, including volume of network trac, building materials and construction, and network overhead, lower actual data throughput rate and wireless coverage"
