http://www.youtube.com/watch?v=kQL-Bii_S-s
Keep in mind, though, that they said they're rolling out a new codec in a few days which will improve performance. It's compatible with the other codecs as well, but I haven't yet tested how it performs with them.
http://steamcommunity.com/sharedfiles/filedetails/?id=237783142
I'd assume RSupport, since it has less bitrate to work with, but at the same time I don't know whether that's because it uses more effective techniques than FPS1 and simply doesn't need as much. So I'm at a loss.
Using the default settings/codecs on both, this is how I see Fraps and liteCam: Fraps is faster with less real-time compression, while liteCam produces smaller files with more real-time compression. I suppose you could use a different codec with liteCam to alter this.
The default setting for liteCam's RSupport codec looks horrible. I tested it earlier, and the quality was inferior to Fraps (a lot of artifacting). The lower the setting, the worse the quality becomes; set it to anything below 6000 and it becomes more apparent. It only looks almost identical to Fraps if I set the bitrate to 12000. Considering what I told you in the other thread about YouTube and their horrible bitrate-matching practices, I want it to be as close to lossless as possible.
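For a sense of what those bitrate settings mean on disk, here's the rough back-of-the-envelope math (the function is my own sketch, not part of any codec or tool):

```python
def recording_size_mb(bitrate_kbps, seconds):
    """Approximate file size of a constant-bitrate recording.

    bitrate_kbps is in kilobits per second; the result is in megabytes.
    """
    return bitrate_kbps * 1000 * seconds / 8 / 1_000_000

# One minute at the "near-Fraps" 12000 kbps setting vs. the 6000 kbps floor:
print(recording_size_mb(12000, 60))  # 90.0 MB
print(recording_size_mb(6000, 60))   # 45.0 MB
```

So doubling the bitrate to match Fraps quality costs you about 45 MB per minute extra, which is cheap next to what truly lossless capture writes out.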
My computer is a bit old, but it's not exactly ancient either. It's a first-gen i7 with an HD 5850, and the games are running off an SSD. It should be more than suitable for recording a game running at 1280x960 (bullet hell or not) and a fighting game running at 720p, even while dealing with real-time compression.
However, even if you set the RSupport codec to something as low as 2000 (which I don't recommend, because it looks horrible) and do as little processing as possible, it still has the FPS issue. There's no slider option in the codec to lower the compression rate or leave the files uncompressed so that it takes up fewer cycles, which is a bit of a problem. With the release of the new MJPEG codec that might not be much of an issue anymore, though I have yet to test it. Most people recording games have plenty of hard drive space, but CPU cycles are a finite resource. That said, I'm not sure it really is a codec problem, because I've read on the forums that other people have the same problem when using different codecs, so make of that what you will.
Those two "tests" that I did were stress tests. I put the first game at settings that I knew would normally give recording software problems. All recording software, whether it's liteCam, Fraps, Action, Bandicam, or PlayClaw, hates recording bullet hell games running at 60 FPS (30 FPS gives little to no problems, but 60 always results in dropped frames on a computer with my specs), and it's even worse when the resolution is 1280x960. I made it like that on purpose to see which software handles that kind of stress better, although that test was made with the outdated liteCam game trial on their website, so it's not relevant anymore.
The fighting game is a bit different; that test was made after the liteCam patch. I ran that one at 720p at 60 FPS, but it's nowhere near as stressful as a bullet hell, so there should be no reason it wouldn't perform well. I could have run the game at 30 FPS and both videos would be smooth, but that defeats the purpose of a stress test and wouldn't be representative of a real-world scenario, in which most people run at 60 FPS while recording. I'll say this: they definitely did improve the program like they said they would, but there are still issues that need resolving. Both games were synced up and running off a separate SSD, so loading times were not the issue. It may look like a small issue because it's lagging by only a few frames, but keep in mind that if it were running at 1080p, or if the game were more graphically intensive, the gap in dropped frames would be much wider.
I gotta test this codec out soon now that I know this. The problem now is choosing what game to test.
Many modern 3D games with high scene complexity in constant motion already tax the CPU quite a bit, with very little left over for real-time compression. Older 3D games with lower scene complexity probably don't tax the CPU much to achieve high FPS.
Anyone ever try profiling these programs to know for sure how much CPU the game uses and how much the compressor uses?
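You can answer that without a full profiler by just sampling the process's CPU time. A sketch, assuming Linux (it reads /proc, so it won't work on Windows; the field offsets follow the proc(5) stat layout):

```python
import os
import time

CLK_TCK = os.sysconf("SC_CLK_TCK")  # kernel clock ticks per second

def cpu_seconds(pid):
    """Total user + system CPU time a process has consumed, in seconds."""
    with open(f"/proc/{pid}/stat") as f:
        # Split after the ")" that closes the comm field, so spaces in
        # the process name can't shift the field offsets.
        fields = f.read().rsplit(")", 1)[1].split()
    utime, stime = int(fields[11]), int(fields[12])
    return (utime + stime) / CLK_TCK

def cpu_percent(pid, interval=1.0):
    """Average CPU% of a process over a sampling interval."""
    before = cpu_seconds(pid)
    time.sleep(interval)
    return 100.0 * (cpu_seconds(pid) - before) / interval
```

Sample the game's PID once while playing normally and once while recording; the difference is roughly what the compressor is costing you.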
The reason I use bullet hell games (or fighting games, or even racing games) for comparisons is that those games usually have a replay feature, so you get the exact same video each and every time to test multiple codecs against. That makes it much easier and much more consistent to tell which recording software/codec performs better, by syncing the videos and finding any dropped frames or slowdowns, even ones a fraction of a second long.
That, and the added bonus that even if there were a stuttering issue caused by the GPU or a driver rather than the recording software, the fact that the replay function makes all the videos identical means it should theoretically show up on all the tests, giving much more accurate results.
Edit: You can probably also use the benchmark tests that come with games like Shogun 2, Arkham City, Resident Evil 5, etc., to get identical videos. The only issue is that those benchmark tests tend to be short, so while finding out whether frames were dropped or stuttering occurred is still possible, you won't be able to tell how severe the issue really is from a short video.
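The "sync the videos and look for gaps" check can be automated once you have per-frame timestamps (most analysis tools can dump these, e.g. ffprobe; the function below is my own sketch, not part of any recording program):

```python
def count_dropped(timestamps, fps=60):
    """Estimate dropped frames from a list of frame timestamps in seconds.

    A gap of roughly two frame intervals means one frame was dropped,
    three intervals means two were dropped, and so on.
    """
    interval = 1.0 / fps
    dropped = 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        dropped += max(0, round((cur - prev) / interval) - 1)
    return dropped

# A 60 FPS capture where the frame at t = 3/60 went missing:
stamps = [0/60, 1/60, 2/60, 4/60, 5/60]
print(count_dropped(stamps))  # 1
```

Running this over two captures of the same replay gives you a hard number to compare instead of eyeballing synced videos frame by frame.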
I've run Quake 1 through 3 at 1920x1080 windowed with the resource monitor in the bottom 120 pixels (1920x1200 screen) and discovered that these games never take more than 15% of the CPU when locked at 60 FPS. BTW, my computer was built in 2010, so it would be even less on 2014 computers. I was thinking these games would make a good test of the compressor, since the compressor would have 85% of the CPU to use without interfering with the game's framerate. Also, none of them took up more than 0.2 gigabytes of RAM, leaving plenty left over for the compressor. I know Quake 1 & 2 have replays, and probably Quake 3 as well.
Quake 3 used fewer CPU cycles than Quake 2, I believe due to SSE support in Quake 3. Quake 1 used the most, as it lacks the OpenGL hardware transform acceleration of Q2/Q3 and the SSE support that Quake 3 has.
I wonder if there's a performance monitor app that lists what % of the GPU each app is using.
It varies per program: Fraps, I know, is dependent on the CPU, while something like nVidia's ShadowPlay is GPU-dependent for recordings.
liteCam, though, is CPU-dependent.
Moreover, if the game itself uses the hard drive regularly, such as when flying over terrain in World of Warcraft or Google Earth, then you should really have three drives: one for the app, one for the recorder, and one for the operating system. However, with a fast solid state drive with a 1 gigabyte/sec transfer rate you might be able to get away with only one, since solid state drives have no moving parts and therefore no head-seek delays.
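For a sense of how much drive bandwidth capture can actually demand, here's the worst-case math (assuming 24-bit RGB frames; a real codec writes far less, but this is the ceiling the compressor works down from):

```python
def raw_capture_mb_per_sec(width, height, fps, bytes_per_pixel=3):
    """Sustained write rate (MB/s) for fully uncompressed RGB capture."""
    return width * height * bytes_per_pixel * fps / 1_000_000

# The two test cases from this thread:
print(raw_capture_mb_per_sec(1280, 960, 60))   # ~221 MB/s
print(raw_capture_mb_per_sec(1920, 1080, 60))  # ~373 MB/s
```

A 2010-era mechanical drive sustains maybe 100 MB/s, which is why recorders either compress in real time (spending CPU) or need a fast dedicated drive.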
Also, I don't think there's much lossless compression that can be done on constant motion through a 3D textured world.
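That intuition is easy to demonstrate: lossless compressors can only shrink redundancy, and a frame full of motion looks statistically close to noise. A sketch using zlib as a stand-in for a lossless video codec:

```python
import random
import zlib

random.seed(0)
frame_len = 320 * 240  # one 8-bit grayscale "frame"

# A perfectly still scene: every pixel identical.
static_frame = bytes(frame_len)
# A motion-heavy scene approximated as noise: every pixel random.
noisy_frame = bytes(random.randrange(256) for _ in range(frame_len))

static_ratio = len(zlib.compress(static_frame, 9)) / frame_len
noisy_ratio = len(zlib.compress(noisy_frame, 9)) / frame_len

print(f"static frame compresses to {static_ratio:.1%} of original size")
print(f"noisy frame compresses to {noisy_ratio:.1%} of original size")
```

The static frame collapses to a fraction of a percent, while the noisy one barely shrinks at all (it can even grow slightly), which is why lossless capture of busy 3D scenes writes out nearly the full raw data rate.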