SteamVR Developer Hardware

Mixed Reality Videos
I was asked by a dev to provide some more details around SteamVR_ExternalCamera, so I figured I'd start a thread here to go over this, and also to discuss the topic more generally.

The new Unity plugin (v1.0.8) will automatically create this for you given two conditions:
1) Add a file called externalcamera.cfg in the root of your project.
2) Attach a third controller to your system.

By the root of your project, I mean next to the Assets folder (in the editor) or next to your executable (in builds).

Look at the config struct in SteamVR_ExternalCamera.cs for the full set of values you can set. At a minimum, you need to set the near and far clip distances, since we currently treat every value as explicit (as opposed to an override of a default).

Example:
x=0
y=0
z=0
rx=0
ry=0
rz=0
fov=60
near=0.1
far=100
//m=-0.999059,0.015577,-0.040472,-0.0127,-0.016016,-0.999816,0.010544,0.1799,-0.040301,0.011183,0.999125,-0.0846
sceneResolutionScale=0.5
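
For anyone curious how values like these get consumed, here's a rough sketch of a key=value loader matching the fields in the example above. This is not the plugin's actual code (see SteamVR_ExternalCamera.cs for that); the struct layout, defaults, and loader name here are assumptions for illustration only.

using System.Globalization;
using System.IO;

// Illustrative only: mirrors the fields shown in the example config above.
// The real struct lives in SteamVR_ExternalCamera.cs and may differ.
public struct ExternalCameraConfig
{
    public float x, y, z;                // position offset of the physical camera relative to the tracked controller (meters)
    public float rx, ry, rz;             // rotation offset (degrees)
    public float fov;                    // vertical field of view of the physical camera
    public float near, far;              // clip distances; must be set explicitly
    public float sceneResolutionScale;   // render resolution scale while capturing (0.5 in the example above)
}

public static class ExternalCameraConfigLoader
{
    // Hypothetical loader: reads "key=value" lines and skips "//" comments.
    public static ExternalCameraConfig Load(string path)
    {
        var config = new ExternalCameraConfig { fov = 60f, near = 0.1f, far = 100f, sceneResolutionScale = 1f };
        foreach (var rawLine in File.ReadAllLines(path))
        {
            var line = rawLine.Trim();
            if (line.Length == 0 || line.StartsWith("//"))
                continue;

            var parts = line.Split('=');
            if (parts.Length != 2 ||
                !float.TryParse(parts[1], NumberStyles.Float, CultureInfo.InvariantCulture, out var value))
                continue;

            switch (parts[0].Trim())
            {
                case "x": config.x = value; break;
                case "y": config.y = value; break;
                case "z": config.z = value; break;
                case "rx": config.rx = value; break;
                case "ry": config.ry = value; break;
                case "rz": config.rz = value; break;
                case "fov": config.fov = value; break;
                case "near": config.near = value; break;
                case "far": config.far = value; break;
                case "sceneResolutionScale": config.sceneResolutionScale = value; break;
            }
        }
        return config;
    }
}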

The idea is that you attach the third controller rigidly to a camera you can use to record the person playing in front of a greenscreen. Then you set the xyz offsets and rotations in the config file to match the virtual camera in game. We have a tool for automatically solving for this, but it's not ready to release publicly yet, so for now you'll have to eyeball it by adjusting the values in the editor while looking at other tracked objects until they line up closely enough.
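
To make that eyeballing step concrete, here is a minimal Unity sketch of how the position/rotation offsets relate the physical camera to the tracked controller. It assumes a camera parented under a transform that follows the third controller; the actual plugin applies the config for you, so this is illustration only.

using UnityEngine;

// Minimal sketch: apply externalcamera.cfg offsets to a camera that is parented
// under the transform tracking the third (camera-mounted) controller.
public class ExternalCameraOffset : MonoBehaviour
{
    public Camera externalCamera;   // renders the external (mixed reality) view
    public Vector3 positionOffset;  // x, y, z from externalcamera.cfg (meters)
    public Vector3 rotationOffset;  // rx, ry, rz from externalcamera.cfg (degrees)
    public float fov = 60f;
    public float near = 0.1f;
    public float far = 100f;

    void LateUpdate()
    {
        // The parent is assumed to follow the tracked third controller, so these
        // offsets express where the physical camera's lens sits relative to it.
        externalCamera.transform.localPosition = positionOffset;
        externalCamera.transform.localRotation = Quaternion.Euler(rotationOffset);
        externalCamera.fieldOfView = fov;
        externalCamera.nearClipPlane = near;
        externalCamera.farClipPlane = far;
    }
}

Tweaking positionOffset and rotationOffset while watching the tracked controllers overlap their virtual counterparts is the "eyeballing" described above.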

This script changes the output of the companion window on your main screen. It outputs four quadrants: foreground, foreground alpha (for compositing), background and normal game view.

You'll need to use another application (e.g. OBS) to perform the compositing; you can use that to stream via Twitch, etc., or you can save out the video feed and put it together in post.

Since this is a pretty intensive operation, and the primary computer is already overloaded with rendering the scene multiple times, you are better off setting up a second computer to take the video feeds from the camera and the game, composite them in real time for preview (very useful for the social aspect, e.g. a group of people on a couch watching), and take the brunt of writing the files to disk.

For creating trailers at 1080p, you'll want to set your output to 4k so each quadrant is full res.
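For instance, with the companion window set to 3840x2160 (4K UHD), each of the four quadrants is 3840/2 x 2160/2 = 1920x1080, so every cropped feed comes out at exactly full HD.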

The Vive Pre headsets only support pairing two controllers, but you can take an original Vive dev kit controller and plug its dongle into the primary machine, or use the USB cable with an adapter to avoid dealing with batteries.
aaron.leiby  [developer] Mar 8, 2016 @ 7:21pm 
We've been using DeckLink cards for video capture.
https://www.blackmagicdesign.com/products/decklink

We had issues with OBS for compositing, but I can't remember what ended up getting used instead. I've heard other folks have had good luck with OBS though.
https://obsproject.com/

We use After Effects for off-line compositing.

This is fantastic, thank you.
Are there any solutions for doing this in Unreal 4 that anyone has found or that Valve is working on?
Are there any video samples of what can be done with this?
Yes: Owlchemy Labs just did a video of Job Simulator, and I believe they used this: https://www.youtube.com/watch?v=w9dQkiI8raY ... there are also quite a few mixed reality videos showing Fantastic Contraption, like this one (an early test without a greenscreen): https://www.youtube.com/watch?v=CHE2xEesS00 and here's one of Fantastic Contraption with a greenscreen: https://www.youtube.com/watch?v=VPQipFQpceY ... I believe Northway Games (the creators of Fantastic Contraption) were the first to use this, and they started with fixed cameras.

It's definitely the way to show VR to the masses on video. /me goes to order greenscreen stuffz on Amazon ;-)
Last edited by jashan; Mar 9, 2016 @ 1:10am
Oh, and btw, I think this should be stickied :-)
Originally posted by aaron.leiby:
Since this is a pretty intensive operation, and the primary computer is already overloaded with rendering the scene multiple times, you are better off setting up a second computer to take the video feed from the camera and the game, to composite in real-time for preview (very useful for the social aspect - e.g. set up a group of people on a couch to watch) and also take the brunt of writing the files to disk.

So you'd take the output of the "VR machine" that would usually go to the monitor and feed that into the "compositing machine", plus the camera feed, right? And for that, you'd use something like a "DeckLink Studio 4K" to be able to feed two HDMI signals (one from the VR machine, one from the camera) into the "compositing machine"?

Not sure if the "DeckLink Studio 4K" would actually be the right choice, but it looks as if it has one HDMI input and one SDI video input, which should be enough: https://www.blackmagicdesign.com/products/decklink/techspecs/W-DLK-12 ... SDI, however, is apparently only supported by fairly high-end cameras, so one would probably need an additional HDMI-to-SDI converter to make such a setup work.
This is incredibly useful, thank you. I have it working here using an older Vive controller as the 3rd camera with the .cfg file set up. OBS functions correctly, no lag. Green screen has arrived, ready to test.

I have it working in Unity BUT the foreground and foreground alpha (for compositing) are having issues with clipping distances. I've tried various clipping distances for near and far. Normally I would use 0.01 and 10000 but that doesn't seem to display correctly - www.ir-ltd.net/uploads/example.jpg

Do you know what might be causing this?

EDIT:

It seems that turning off the External Camera/Controller (third)/camera object then displays the foreground and foreground alpha (for compositing) correctly? - www.ir-ltd.net/uploads/example02.jpg
Last edited by S. LaBeouf; Mar 9, 2016 @ 6:59am
Dylan Mar 9, 2016 @ 1:05pm 
Mixed-reality Audioshield video:
https://www.youtube.com/watch?v=FhSsdsu4yIk

This one was done differently, with the game doing the compositing. The video is just a grab of the window output using Fraps.
Last edited by Dylan; Mar 9, 2016 @ 1:06pm
BOLL Mar 9, 2016 @ 1:35pm 
The results of these mixed reality composites are just so great. Nothing else out there shows off better what room-scale/motion controllers are all about.

As for streaming something similar, it almost feels as if it would be beneficial to have a spectator client connect to the main game over the network and do all the extra cameras and compositing/encoding on a different machine. But that might be a pipe dream, especially for single-player games :P
Just wondering for the consumer versions, how will we be able to accomplish this if we don't have one of the legacy controllers to act as the camera? Will a Pre gen controller work as a wired controller?
aaron.leiby  [developer] Mar 9, 2016 @ 2:47pm 
Originally posted by hardlab:
Just wondering for the consumer versions, how will we be able to accomplish this if we don't have one of the legacy controllers to act as the camera? Will a Pre gen controller work as a wired controller?
The Vive headset only supports two controller connections; however, you can plug in as many additional controllers via USB as you need.

The dongles that shipped with the original Vive dev kits are the same hardware as what ships with the Steam Controllers, just with different firmware. These can be converted fairly easily, and have the added benefit of fitting perfectly into the aux port on the Vive headset.
Last edited by aaron.leiby; Mar 9, 2016 @ 2:48pm
Originally posted by S. LaBeouf:
I have it working in Unity BUT the foreground and foreground alpha (for compositing) are having issues with clipping distances. I've tried various clipping distances for near and far. Normally I would use 0.01 and 10000 but that doesn't seem to display correctly - www.ir-ltd.net/uploads/example.jpg
Unity doesn't like it if the near and far clipping planes are too far apart. I think something like far/near <= 10000 is a good idea. So I use near=0.01 and far=1000 for most of my tests. If you increase far to 10000 you should increase near to 0.1.
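
If it helps, here is a tiny Unity sketch of that rule of thumb. The 10000 ratio is just the suggestion above, not an official Unity limit, and the helper name is made up for illustration.

using UnityEngine;

// Rough illustration of the rule of thumb above: keep far/near from growing too
// large, since a huge ratio hurts depth-buffer precision in the rendered passes.
public static class ClipPlaneSanity
{
    public static void Apply(Camera cam, float near, float far, float maxRatio = 10000f)
    {
        if (far / near > maxRatio)
        {
            // Push the near plane out so the ratio stays bounded.
            near = far / maxRatio;
            Debug.LogWarning("near clip raised to " + near + " to keep far/near <= " + maxRatio);
        }
        cam.nearClipPlane = near;
        cam.farClipPlane = far;
    }
}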

In theory, in that example.jpg you posted, nothing should be showing in the foreground, right? Since the external camera is close to the HMD (just slightly behind and to the left, I think?)
Originally posted by hardlab:
Just wondering for the consumer versions, how will we be able to accomplish this if we don't have one of the legacy controllers to act as the camera? Will a Pre gen controller work as a wired controller?
Any controller will work. We have experimented with old ones from the Pre, the dev-kit editions, and older. One thing we did notice is that under really bright lights the old controllers (think the ones from GDC last year) have a hard time tracking, while the Pre ones work great.
Here's a crazy idea:

What if we could record the demo using a legacy Vive devkit while filming with the Vive Pre's HMD (which has the tracking camera)?

I would imagine this setup having the link box connect to the legacy Vive with USB + power + HDMI, and the Vive Pre HMD connected to the same computer using just USB + power.

That way you'd have a tracked/working HMD for VR and a tracked "camera" with all the known specifications/offsets/FOVs/distortion to work seamlessly into the compositing of the scene.

Would it even be possible to connect two Pre HMDs to the same computer if a developer has multiples?