It's already possible to derive a similar data stream from the OpenVR poses using the public API, i.e. we could emulate a 'virtual IMU' on top of the normal tracking.
Also note that the returned poses in OpenVR include both linear and angular velocity terms, if that's what you're interested in.
I can send you a demo over mail to show you what we're working on and explain the use case and the issue.
But in short, the headset assumes that the head is always moving, which may not hold for our application.
http://steamcommunity.com/app/358720/discussions/0/361787186419244897/
Thank you.
Does anyone have the source code for lighthouse_console.exe? Is there another online source that shows how to get the gyro/accelerometer data without using the "pose" methods?
I am trying to handle the case where a headset is occluded from the lighthouses for a short period. All the pose methods return the identity matrix once the result is TrackingResult_Calibrating_OutOfRange and bPoseIsValid is false. Basically, what I want to do is update our local pose information with the latest gyro values while the device has no direct line of sight to any lighthouse.
Before you say, "Don't do that, why don't you just...": don't bother. I understand the implications of what I am trying to do. This will only ever be done in a seated context and only for short periods of time (for those with safety concerns).
I know this data is available somehow: the lighthouse_console.exe example shown here does dump correct values even when there are no lighthouses to pull from. This means there is a disconnect between the pose methods and whatever is going on inside lighthouse_console.exe.
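For what it's worth, the fallback described here (carrying the orientation forward from gyro rates while line of sight is lost) is ordinary dead reckoning. Below is a minimal C++ sketch of one integration step of a body-frame angular rate into an orientation quaternion. The gyro values themselves are a stand-in, since as discussed the public API doesn't expose the raw IMU:

```cpp
#include <cmath>

// Minimal orientation quaternion (w + xi + yj + zk).
struct Quat { double w, x, y, z; };

// Hamilton product a * b.
Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

Quat normalize(const Quat& q) {
    const double n = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    return { q.w/n, q.x/n, q.y/n, q.z/n };
}

// One dead-reckoning step: integrate body-frame angular rate (rad/s) over
// dt seconds using the kinematic relation q_dot = 0.5 * q * (0, omega).
Quat integrateGyro(const Quat& q, double wx, double wy, double wz, double dt) {
    const Quat omega{0.0, wx, wy, wz};
    const Quat qdot = mul(q, omega);
    const Quat out{ q.w + 0.5 * dt * qdot.w,
                    q.x + 0.5 * dt * qdot.x,
                    q.y + 0.5 * dt * qdot.y,
                    q.z + 0.5 * dt * qdot.z };
    return normalize(out);  // re-normalize to limit drift from the Euler step
}
```

In practice you would seed the quaternion from the last valid pose when bPoseIsValid goes false, step it with each gyro sample, and hand back to the tracked pose once the lighthouses reacquire. Gyro bias will make the result drift within seconds, which is tolerable for the short occlusions described above.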
@krazcanuck
I am also running into roughly the same issue with Unity and the Vive. I want to get position/velocity data from the HMD/controllers. I followed the online tutorial and implemented some sample C# code in Unity using the SteamVR library (SteamVR_Controller.Device). Basically, I am reading the sensor data every time the Update() function is called in MonoBehaviour. However, one thing I notice is that the sampling rate of this approach is pretty low: Update() gets called roughly every 40 ms (25 Hz), and the HMD/controller data ends up recorded every 50 ms (20 Hz).
I'm not sure this is the correct way to collect data from the Vive components, since the sampling rate I need is much higher (100 Hz). Any suggestions? Thanks in advance!
I had a quick and dirty look at OSVR for Unity, and it seems you can access the data (position and rotation) directly by getting a reference to a PoseAdaptor object, which OSVR itself keeps up to date. Update() appears to be called whenever the engine wants, and it grabs the current data from the PoseAdaptor at that point. Not sure if that will get you to 100 Hz, but at least it lets you get updates asynchronously from the engine.
Here would be my starting point:
https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/PoseInterface.cs
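If the engine's Update() cadence is the bottleneck, one generic pattern (independent of Unity or OSVR) is to poll the device on a dedicated thread at a fixed period and let the engine consume the most recent sample whenever it renders. A C++ sketch with a hypothetical readPose() standing in for the real device query:

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>

// Latest-value mailbox: the sampler thread overwrites, the engine thread reads.
struct Sample { double t = 0.0; /* pose fields would go here */ };

class FixedRateSampler {
public:
    // periodMs: target sampling period, e.g. 10 ms for ~100 Hz.
    explicit FixedRateSampler(int periodMs) : period_(periodMs) {}

    void start() {
        running_ = true;
        worker_ = std::thread([this] {
            auto next = std::chrono::steady_clock::now();
            while (running_) {
                const Sample s = readPose();  // hypothetical device read
                {
                    std::lock_guard<std::mutex> lock(m_);
                    latest_ = s;
                    ++count_;
                }
                next += period_;                     // fixed cadence: schedule
                std::this_thread::sleep_until(next); // against absolute times
            }
        });
    }

    void stop() {
        running_ = false;
        if (worker_.joinable()) worker_.join();
    }

    Sample latest() {
        std::lock_guard<std::mutex> lock(m_);
        return latest_;
    }

    long samples() {
        std::lock_guard<std::mutex> lock(m_);
        return count_;
    }

private:
    // Stand-in for the real device query (an OpenVR/OSVR pose call in practice).
    Sample readPose() {
        return Sample{ std::chrono::duration<double>(
            std::chrono::steady_clock::now().time_since_epoch()).count() };
    }

    std::chrono::milliseconds period_;
    std::atomic<bool> running_{false};
    std::thread worker_;
    std::mutex m_;
    Sample latest_;
    long count_ = 0;
};
```

The engine's Update() then just calls latest() each frame, so the recording rate is set by the sampler thread, not the frame rate. Note `sleep_until` against precomputed deadlines avoids the drift you'd get from `sleep_for` in a loop.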