As per this post, I should find a "streaming_log.txt" under ~/.local/share/Steam/logs.
There are no streaming log files in this folder, nor anywhere else on my computer.
I am using a Gigabyte Brix (AMD A8-5545M) as a streaming client. It runs Linux Mint 17.2 (Ubuntu 14.04 base) with the latest proprietary AMD Crimson driver, and I can't figure out why it won't use hardware decoding. Streaming performance is rather poor because the CPU in this machine is not very fast, but the GPU should be fully capable of hardware decoding.
The server end runs both Linux and Windows and also uses AMD graphics (280X); hardware encoding is not an issue, and performance is the same whether I stream from Linux or Windows.
I want to see the logs... but they don't exist. I am on the Steam beta program.
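In case it helps anyone else hunting for these files, here is a quick sketch that lists anything with "streaming" in its name under the Steam directory (the ~/.local/share/Steam path and the name filter are just my assumptions about where such a log would live):

from pathlib import Path

# Default Steam location on Linux; adjust if your install lives elsewhere.
steam_root = Path.home() / ".local/share/Steam"

# Print every .txt file whose name mentions streaming.
for path in steam_root.rglob("*.txt"):
    if "streaming" in path.name.lower():
        print(path)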
EDIT: Didn't find anything likely on my SteamOS (Debian-based) machine. I think the host log is it:
[2015-12-09 16:11:07] CLIENT: Video size: 1920x1080, output size: 1920x1080
[2015-12-09 16:11:07] CLIENT: CVAAPIAccel: vaInitialize() failed: unknown libva error
[2015-12-09 16:11:07] CLIENT: VDPAU init failed: GL_NV_vdpau_interop not available on current context
[2015-12-09 16:11:07] >>> Client video decoder set to libavcodec software decoding with 4 threads
[2015-12-09 16:11:07] CLIENT: libavcodec software decoding with 4 threads
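The vaInitialize() failure above looks like the key line: if VA-API can't initialize outside of Steam either, the problem is in the driver stack rather than the client. A minimal check, assuming the vainfo tool is installed (the package name for Mint/Ubuntu is my assumption):

import subprocess

# Run vainfo and dump its output; if vaInitialize() also fails here,
# the libva/driver setup is the culprit, not Steam itself.
try:
    out = subprocess.check_output(["vainfo"], stderr=subprocess.STDOUT,
                                  universal_newlines=True)
    print(out)
except OSError:
    print("vainfo not found - try installing the 'vainfo' package")
except subprocess.CalledProcessError as err:
    print("vainfo failed:\n" + err.output)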
Further down are the session stats:
{
"GameNameID" "BioShock Infinite"
"TimeSubmitted" "1449699087"
"ResolutionX" "1920"
"ResolutionY" "1080"
"CaptureDescriptionID" "Game Delayed OpenGL NV12 + libx264 main (4 threads)"
"DecoderDescriptionID" "libavcodec software decoding with 4 threads"
"BandwidthLimit" "15000"
"FramerateLimit" "0"
"SlowGamePercent" "0"
"SlowCapturePercent" "0"
"SlowConvertPercent" "0"
"SlowEncodePercent" "0"
"SlowNetworkPercent" "0"
"SlowDecodePercent" "0"
"SlowDisplayPercent" "0"
"AvgClientBitrate" "5.95400047302246094"
"StdDevClientBitrate" "5.57687187194824219"
"AvgServerBitrate" "8260.0263671875"
"StdDevServerBitrate" "0"
"AvgLinkBandwidth" "75000"
"AvgPingMS" "0.4825592041015625"
"StdDevPingMS" "0.037737872451543808"
"AvgCaptureMS" "1.07294070720672607"
"StdDevCaptureMS" "0.414855241775512695"
"AvgConvertMS" "0.0203776545822620392"
"StdDevConvertMS" "0.314402580261230469"
"AvgEncodeMS" "23.4428138732910156"
"StdDevEncodeMS" "5.9516143798828125"
"AvgNetworkMS" "5.91558456420898438"
"StdDevNetworkMS" "3.23080778121948242"
"AvgDecodeMS" "11.6414451599121094"
"StdDevDecodeMS" "2.84054684638977051"
"AvgDisplayMS" "2.26332187652587891"
"StdDevDisplayMS" "1.2771599292755127"
"AvgFrameMS" "40.2400321960449219"
"StdDevFrameMS" "8.40722942352294922"
"AvgFPS" "32.6540756225585938"
"StdDevFPS" "10.7771787643432617"
"BigPicture" "0"
"KeyboardMouseInput" "0"
"GameControllerInput" "0"
"SteamControllerInput" "0"
}
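For anyone comparing runs, the block above is just Valve's quoted key/value format, so the timing fields are easy to pull out programmatically. A rough sketch (the "streaming_log.txt" path is a placeholder; point it at whichever file your stats ended up in):

import re

def parse_stats(block):
    # Turn the '"Key" "Value"' pairs into a dict; if the file holds
    # several sessions, the last value seen for each key wins.
    return dict(re.findall(r'"(\w+)"\s+"([^"]*)"', block))

with open("streaming_log.txt") as f:
    stats = parse_stats(f.read())

print(stats["DecoderDescriptionID"])
print("avg decode:", float(stats["AvgDecodeMS"]), "ms")
print("avg frame: ", float(stats["AvgFrameMS"]), "ms")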
I have been wondering if my Windows Steam client is really using the best decoder available.
I am using the latest 15.30 Crimson drivers.
I have an AMD 7850K with no extra GPU, which means it should use UVD 4, but the logs always report the standard "DXVA: H.264 variable-length decoder, no film grain technology".
This is obviously not software decoding, but is it UVD? I hope not, because my average decode time is around 9 ms at 1080p with automatic bandwidth in Rocket League. Yet it's slightly faster (~8 ms) without hardware decoding, even though my CPU is not very fast and I even underclocked it.
Looking at the best results, I really hope I can get this number down, because my streaming experience has really not been great so far.
I am seriously considering buying an Nvidia GT 720 solely for decoding.
http://i.imgur.com/mJbrO70.png
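Just to put those numbers in perspective, here is my back-of-the-envelope frame-budget math (the 60 FPS target is an assumption, and the decode times are the figures quoted above and in the stats below, not new measurements):

# At 60 FPS each frame has roughly 16.7 ms; compare how much of that
# budget each decode time consumes.
target_fps = 60
budget_ms = 1000.0 / target_fps

samples = [("DXVA decode (7850K)", 9.0),
           ("software decode", 8.0),
           ("hardware decode (stats below)", 2.1)]

for label, decode_ms in samples:
    share = decode_ms / budget_ms * 100
    print("%-30s %.1f ms (%.0f%% of the %.1f ms frame budget)"
          % (label, decode_ms, share, budget_ms))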
{
"GameNameID" "Tom Clancy's Splinter Cell Blacklist"
"TimeSubmitted" "1454247342"
"ResolutionX" "1920"
"ResolutionY" "1080"
"CaptureDescriptionID" "Game polled D3D11 NV12 + libx264 main (4 threads)"
"DecoderDescriptionID" "Marvell hardware decoding"
"BandwidthLimit" "30000"
"FramerateLimit" "0"
"SlowGamePercent" "0"
"SlowCapturePercent" "0"
"SlowConvertPercent" "0"
"SlowEncodePercent" "1.4137680530548096"
"SlowNetworkPercent" "0"
"SlowDecodePercent" "0"
"SlowDisplayPercent" "0"
"AvgClientBitrate" "17.000064849853516"
"StdDevClientBitrate" "16.601770401000977"
"AvgServerBitrate" "25366.1171875"
"StdDevServerBitrate" "0"
"AvgLinkBandwidth" "100000.0078125"
"AvgPingMS" "0.1310010552406311"
"StdDevPingMS" "0.17119871079921722"
"AvgCaptureMS" "1.3358078002929687"
"StdDevCaptureMS" "2.0422823429107666"
"AvgConvertMS" "0.011791355907917023"
"StdDevConvertMS" "0.3310931921005249"
"AvgEncodeMS" "25.64385986328125"
"StdDevEncodeMS" "6.7429022789001465"
"AvgNetworkMS" "15.442395210266113"
"StdDevNetworkMS" "5.7079968452453613"
"AvgDecodeMS" "2.1332027912139893"
"StdDevDecodeMS" "0.66553890705108643"
"AvgDisplayMS" "0.016240939497947693"
"StdDevDisplayMS" "0.065722011029720306"
"AvgFrameMS" "64.886558532714844"
"StdDevFrameMS" "19.524898529052734"
"AvgFPS" "33.965591430664062"
"StdDevFPS" "16.278661727905273"
"BigPicture" "1"
"KeyboardMouseInput" "1"
"GameControllerInput" "1"
"SteamControllerInput" "0"
}