Yes, hand tracking is possible with Live2D avatars using RealSense or the Leap Motion Controller. The movements are indeed more limited, but it also depends on how the 2D model is created.
Here is an example of a Live2D avatar being used with the Leap Motion Controller: https://www.instagram.com/p/BGMVkhRHUVn/
As for documentation, follow these guidelines: http://steamcommunity.com/app/274920/discussions/8/485624149164823989/
http://steamcommunity.com/sharedfiles/filedetails/?id=703468790
It's definitely limited; you can't track the fingers, for instance. I haven't tried anything beyond waving the arm or raising it up and down.
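To give a rough picture of what that limited tracking amounts to, here is a minimal sketch of the idea: the Leap Motion reports a palm position each frame, and only a coarse value such as palm height gets mapped onto a single normalized arm parameter. The parameter name ("ParamArmL"), the height range, and the mapping itself are all hypothetical illustrations, not FaceRig's actual Live2D configuration.

# Illustrative sketch only: maps a Leap Motion style palm height reading onto
# a single normalized avatar arm parameter. The parameter name ("ParamArmL")
# and the working range are assumptions, not FaceRig's actual setup.

def palm_height_to_arm_param(palm_y_mm,
                             min_height_mm=100.0,
                             max_height_mm=400.0):
    """Convert a palm height above the sensor (millimetres) into a 0..1 value."""
    # Clamp to the assumed working range, then normalize to 0..1.
    clamped = max(min_height_mm, min(max_height_mm, palm_y_mm))
    return (clamped - min_height_mm) / (max_height_mm - min_height_mm)

# Example: a frame where the palm is 250 mm above the sensor.
avatar_params = {"ParamArmL": palm_height_to_arm_param(250.0)}
print(avatar_params)  # {'ParamArmL': 0.5}

The point of the sketch is that only a coarse value like overall palm height survives the mapping; there is no per-finger data to feed the model, which matches the waving and arm-raising behaviour described above.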
I have a question about the documentation, though. I've read much of it in the past, and as far as I can tell, most of it seems to be the same documentation that was given out when the Live2D module was first introduced.
From what I can see, that documentation doesn't include any references to how to set up the hands in a way that FaceRig understands.
Clearly several people have done so anyway, so how did you figure out how to do it?
(Or am I missing something incredibly obvious here?)