I believe it's time for a better GUI on both the client and host In-Home Streaming tabs.
If we could choose the encoding method easily, it would be a nice way to perform tests.
I suggest something like OBS (Open Broadcaster Software), where we can easily select the parameters, FPS, and even the "intensity" of the encoding process.
For reference:
https://jp9000.github.io/OBS/settings/encodingsettings.html
As for OBS, in order to have NVENC produce image quality similar to x264, we need to configure NVENC recording at 30 to 40 Mbps (30000/40000 in OBS's kbps bitrate field).
How does this work for Steam? Is NVENC more bandwidth-consuming?
I think you just sold me on a new iPhone better than Apple ever has.
I honestly think this is a bit much and would probably cause more problems than it would solve. It would also take much longer to implement these things as custom parameters and build a nice GUI around the whole thing.
A simple reorderable list where we could set the encoder priority would be a HUGE leap in the right direction. To be completely honest, as much as I like NVENC, the idea of it taking precedence over Async + NVIFR scares the living crap out of me. Something like this would definitely ease the transition.
Btw, I second that we need a better GUI to enforce the encoding method per game, if for nothing else than testing!
I don't know if it's related, but I wrote a project that converts DX10/11 games to use the DXGI Flip Presentation Model for improved DWM performance. I've seen other software on my system properly use NVENC for frames captured even when the games are using my DLL to retrofit flip model, so I'm at a loss.
I do know that Steam's overlay doesn't like the flip presentation model at all and I had to add a second SwapChain::Present (...) call to get its hooks to work right. I don't think the overlay hooks Present1 (...).
Maybe something could be done about that problem while you're at it? :P It's awkward doing a no-wait Present with a zero-area region just to get Steam to draw its overlay at the end of a frame. This problem is only going to get worse with DX12 and Windows Store apps, which are required to use the Flip Presentation Model.
Very good question. I've been tempted at times to use the NDA version of NVAPI and implement this stuff myself since for months at a time in-home Streaming works great and then it all goes to hell. I'd like something that doesn't change constantly and I'd GET that if I said the hell with it all and did this myself.
You could always become a contributor to the moonlight project. I'm pretty sure it started out the same exact way.
I looked at that project in the past. There's very little I could do because it's mostly Java and I don't know the first thing about Java. I barely know C++'s standard library, my work basically covers C and the C standard library. Sort of backed myself into a corner with my career path, I think -- with all these popular and unfamiliar languages like Java, Ruby :P
Java is C++ without pointers; it's pretty simple if you can read C/C++. I actually thought the same thing as you (that the project is in Java), but it's actually just the client frontend that is written in Java. The actual capture and decode logic is in C and bound to Java via JNI.
Here's the directory for their decoder files for example -
https://github.com/moonlight-stream/moonlight-pc/tree/master/jni/nv_avc_dec
As much as I hate managed languages I've seen a steady decrease in C/C++ developer positions and a massive increase in Java positions. As a C/C++ programmer this makes me sad, as a guy who knows 8 different programming languages, this really just makes me a lot of money. It'd probably be more than worth it to try to pick Java up.
To be completely honest, I've thought about this (writing my own streamer) A LOT. Getting the capture "server" to work doesn't seem all that hard. Surprisingly, it's the client that sounds a lot harder to write, along with all the issues around window focus, mirroring input, network prediction, error handling, and dealing with ♥♥♥♥♥♥♥♥ quirks (read: bugs) in the Nvidia drivers.
This actually just wraps itself around the protocol NVIDIA already devised for Shield streaming. I was thinking a bit more grandiose in scale, wanting to handle the encoding myself with every conceivable option tunable :) I'm completely turned off by the 3 radio buttons and 2 check boxes Valve passes for in-home streaming configuration, especially when from version to version those settings have different meanings :P
@"" signifies that it's an Objective-C string object (an NSString) and not a primitive C string. Objective-C is just a layer on top of standard C, so anything you can do in C still applies.
This allows you to treat the string as a fully qualified class and do things like this:
NSString *helloWorld = [@"Hello " stringByAppendingString:@"World"];