Limit Bandwidth to: Automatic
Limit Resolution to: 1920 x 1080 (1080p)
Speaker Configurations: Auto Detect
Enable Hardware Decoding: selected
Display Performance Information: selected
Just let Steam detect your computer hardware and drivers, make sure Balanced is selected for the performance type, and you shouldn't have any issues. But you always have the option to play around with your settings to see what works best for you.
As a side note: make sure you take a screenshot of your settings before making any changes; that way you can reset them to the way they were if things don't work out. Let us know if this helps.
<1 ms input and 12-16 ms display latency is completely normal.
I've seen up to 25 ms display latency with some higher-end games and it still remained playable. If your input latency starts moving into the 1-2 ms range, or display latency goes above 30 ms, I would start worrying.
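To put those numbers in frame terms, here's a quick back-of-envelope sketch (not anything Steam computes for you; the threshold values are just the ones quoted above):

```python
# Back-of-envelope: convert display latency (ms) into frames of delay
# at a given frame rate. The latency values are the ones quoted above.
def frames_of_delay(latency_ms: float, fps: float = 60.0) -> float:
    frame_time_ms = 1000.0 / fps  # ~16.7 ms per frame at 60 fps
    return latency_ms / frame_time_ms

for latency_ms in (12, 16, 25, 30):
    print(f"{latency_ms} ms ≈ {frames_of_delay(latency_ms):.1f} frames at 60 fps")
```

So even the 30 ms "start worrying" mark is under two frames behind at 60 fps.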
@Greg I never said that my ISP had anything to do with streaming latency. I just let @Pheo know what type of service I had and what my settings were in the client app on my desktop. In fact, your response was a bit condescending, which is not appreciated, and I find it borderline offensive. All I did was respond to the user. If the information doesn't help, they would say so. Attacking someone for responding, and coming on to reply as if you know it all, is a bit ridiculous and makes you appear arrogant.
OK, thanks to you both for answering. I was asking about your data connection on the local network; I suppose Google Fiber would provide a gigabit connection on your local side. I would like to know what latency you get using In-Home Streaming.
Thanks @Dra'kharus for your data. I wanted to confirm whether my latency was OK or whether I should try to get a gigabit network.
As a side note, here is a guide to reading the statistics graph that may give you more insight.
The very first sentence of your post.
At no point did you later clarify that you were just tossing this information out there for absolutely no reason at all (Why would you even mention it to begin with?).
On top of this, all I did was say "it has nothing to do with your latency". Which...is 100% ♥♥♥♥♥♥♥ true. It has no bearing on your latency, and therefore no bearing on the question being asked. This is a lot like starting a post off with "I ate a ham sandwich this afternoon" and expecting everyone to know that we just need to ignore the extra anecdotes.
But at no point did this warrant a 100-word, 500-character reply explaining to me how much I offended you. Honestly, if you want to just ignore all of the content in my post and harp on what you were personally offended by, you should just get off the damn internet. There's no place here for you.
In fact, I'll take this a step further. Out of the 700+ posts I have on both these and the Steam Link forums trying to help uninformed users, you are the ONLY person who has managed to go off on a tirade about your hurt feelings.
I use max quality settings and no bandwidth cap, and I usually get a 20-30 ms delay, which is equivalent to 1.5-2 frames at 60 fps, so I barely notice it. I also have a fairly weak client machine (a 3rd-gen Celeron), so a faster one would probably have even less lag.
Referring to frame loss? That seems incorrect. Counting render time, encode time, and transfer time, and excluding decode for the sake of argument, your per-frame latency must be ~16 ms or less to get frame-for-frame 60 fps: if each frame takes 20 ms end to end, you can only deliver 1 s / 20 ms = 50 fps. Additionally, you may be able to get better latency figures without affecting quality by setting a bandwidth cap, or setting it to Automatic in the Steam client's IHS settings. Leaving it on Unlimited can lead to higher encode times than necessary. Even slouken (the Steam IHS dev) recommends leaving it on Automatic in most cases. If you've ever done x264 video encoding, a good analogy would be the difference between the "fast" and "placebo" presets (this may be exaggerated somewhat, but you get the idea).
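If you want to see for yourself how much encoder effort alone changes encode time, here's a minimal sketch; it assumes ffmpeg with libx264 is on your PATH and that you have a local sample.mp4 to test with (both are my assumptions, not something from this thread):

```python
# Minimal sketch: time the same x264 encode at two effort presets.
# Assumes ffmpeg (built with libx264) is installed and sample.mp4 exists.
import subprocess
import time

def encode_seconds(preset: str) -> float:
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-v", "error", "-i", "sample.mp4",
         "-c:v", "libx264", "-preset", preset, f"out_{preset}.mp4"],
        check=True,
    )
    return time.perf_counter() - start

for preset in ("fast", "placebo"):
    print(f"preset={preset}: {encode_seconds(preset):.1f} s")
```

The gap you'll see is the same trade-off the IHS encoder is making when it chooses how hard to work per frame.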
On the client I mainly use, some time ago I added a gigabit expansion card out of curiosity, as it only had 10/100 on board, and saw little to no difference in IHS performance.
Considering the OP reported he is getting 16 ms display latency on a fast Ethernet connection, and comparing that to your results, how would you say an upgrade to gigabit would benefit the OP?
Anyway, it doesn't matter; Network Manager is reporting the same as the Steam overlay.
As for the delay, I'm not talking about frame loss, that's at 0. I'm talking about the delay between when a frame is captured on the host and when it is displayed on the client. In my case it's up to 30 ms, which is the equivalent of 2 frames at 60 fps. This means that by the time I see a frame, the host is only 2 frames ahead. Like I said, it's barely noticeable.
The reason it's so "high" is that I use max quality settings, thanks to a gigabit connection. If you want minimal lag, then sure, you can cap it, but IMO that is not worth it, especially since I can really see a difference in quality between a 100 Mbit/s cap and unlimited. The colors are less washed out, the image is super sharp, and there is no blocking when there is a lot of movement on the screen. It depends, of course, on the quality of the TV and how far you sit from it, but I like to sit close.
Also, that cap is there so Steam won't try to push more than your network can handle, which would generate lag. But it becomes a non-issue if you are using gigabit, since your network will always be faster than the video bitrate.
So the upgrade is better for quality, and the latency difference (which might just be caused by my slow client) is still so low that it's meaningless.
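To put that headroom argument in rough numbers (the bitrates below are made-up illustrations, not measurements from anyone's setup here):

```python
# Rough illustration of the cap argument: a stream only backs up the
# network when its bitrate exceeds the link's capacity. Bitrates are
# made-up examples, not measurements from this thread.
EXAMPLE_STREAMS_MBPS = (30, 80, 150)
LINKS_MBPS = {"fast ethernet": 100, "gigabit": 1000}

for link_name, link_mbps in LINKS_MBPS.items():
    for stream_mbps in EXAMPLE_STREAMS_MBPS:
        verdict = "fits" if stream_mbps <= link_mbps else "will queue and lag"
        print(f"{stream_mbps} Mbit/s stream over {link_name}: {verdict}")
```

On gigabit, even an unlimited stream sits far below link capacity, which is why the cap stops mattering.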
Not that I think this is the case here, but this happens with H.264 by design. At the beginning of each GOP you send an I-frame, which takes up significantly more space than any other frame in the GOP. This means that you will see spikes in network usage every 2-10 seconds (depending on the GOP length), with every frame following it using significantly less bandwidth.
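If you're curious, you can see those I-frame spikes in any H.264 recording. A sketch, assuming ffprobe is installed and capture.mp4 is a local recording (both assumptions on my part):

```python
# Sketch: dump per-frame sizes from an H.264 file so the I-frame size
# spike at each GOP start is visible. Assumes ffprobe is installed and
# capture.mp4 is a local recording (both assumptions, for illustration).
import json
import subprocess

probe = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "frame=pict_type,pkt_size", "-of", "json",
     "capture.mp4"],
    capture_output=True, text=True, check=True,
)

for frame in json.loads(probe.stdout)["frames"][:120]:  # ~2 s at 60 fps
    note = "  <-- I-frame (GOP start)" if frame["pict_type"] == "I" else ""
    print(f'{frame["pict_type"]:>2} {int(frame["pkt_size"]):>8} bytes{note}')
```

The I lines should dwarf the P/B lines around them, which is exactly the periodic network spike described above.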
EDIT: you might be using too high a resolution. Some older TVs can only accept a 4K HDMI signal at 24 Hz, so try setting it to 1080p; or you might be using an old GPU in your client that can only output 4K@24. Also, turning on HDR might limit it to 24 Hz.