SteamVR Developer Hardware

This topic has been pinned, so it's probably important.
aaron.leiby Mar 8, 2016 @ 7:13pm
Mixed Reality Videos
I was asked by a dev to provide some more details around SteamVR_ExternalCamera, so I figured I'd start a thread here to go over this, and also to discuss the topic more generally.

The new Unity plugin (v1.0.8) will automatically create this for you given two conditions:
1) Add a file called externalcamera.cfg in the root of your project.
2) Attach a third controller to your system.

By root of your project, I mean next to Assets or your executable.

Look at the config struct in SteamVR_ExternalCamera.cs for the full set of values you can set. At a minimum, you need to set the near and far clip distances, since we currently treat everything explicitly (as opposed to overrides).
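For reference, the config keys map onto a plain struct. A minimal sketch of what that looks like, using only the keys from the example below (the actual definition in SteamVR_ExternalCamera.cs has the authoritative field list):

[System.Serializable]
public struct Config
{
    public float x, y, z;              // position offset from the tracked controller, in meters
    public float rx, ry, rz;           // rotation offset, in degrees
    public float fov;                  // field of view of the physical camera, in degrees
    public float near, far;            // clip distances - no implicit defaults, so set both
    public float sceneResolutionScale; // render scale used for the external camera passes
}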

Example:
x=0
y=0
z=0
rx=0
ry=0
rz=0
fov=60
near=0.1
far=100
//m=-0.999059,0.015577,-0.040472,-0.0127,-0.016016,-0.999816,0.010544,0.1799,-0.040301,0.011183,0.999125,-0.0846
sceneResolutionScale=0.5

The idea is that you attach the third controller rigidly to a camera that you can use to record the person playing on a greenscreen. Then you set the xyz offsets and rotations in the config file to match the virtual camera in game. We have a tool for automatically solving for this, but it's not ready to release publicly yet, so you'll have to eyeball it for now by adjusting values in the editor while looking at other tracked objects until they line up closely enough.
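If it helps with the eyeballing, a throwaway tuning script along these lines can be attached to the external camera object at runtime (a hypothetical helper, not part of the plugin - it just nudges the local offset with the arrow keys and logs values you can copy into externalcamera.cfg):

using UnityEngine;

// Hypothetical tuning helper: attach to the object driving the external camera,
// nudge its local offset with the arrow keys until it lines up with the tracked
// controllers, then copy the logged values into externalcamera.cfg.
public class ExternalCameraTuner : MonoBehaviour
{
    public float moveStep = 0.005f; // meters per frame while a key is held

    void Update()
    {
        var p = transform.localPosition;
        if (Input.GetKey(KeyCode.LeftArrow))  p.x -= moveStep;
        if (Input.GetKey(KeyCode.RightArrow)) p.x += moveStep;
        if (Input.GetKey(KeyCode.UpArrow))    p.z += moveStep;
        if (Input.GetKey(KeyCode.DownArrow))  p.z -= moveStep;
        transform.localPosition = p;

        if (Input.GetKeyDown(KeyCode.Space))
        {
            var e = transform.localEulerAngles;
            Debug.Log(string.Format(
                "x={0:F4} y={1:F4} z={2:F4} rx={3:F2} ry={4:F2} rz={5:F2}",
                p.x, p.y, p.z, e.x, e.y, e.z));
        }
    }
}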

This script changes the output of the companion window on your main screen. It outputs four quadrants: foreground, foreground alpha (for compositing), background and normal game view.

You'll need to use another application to perform the compositing (e.g. OBS), which you can also use to stream via Twitch, etc., or you can save out the video feed and put it together in post.

Since this is a pretty intensive operation, and the primary computer is already overloaded with rendering the scene multiple times, you are better off setting up a second computer to take the video feed from the camera and the game, to composite in real-time for preview (very useful for the social aspect - e.g. set up a group of people on a couch to watch) and also take the brunt of writing the files to disk.

For creating trailers at 1080p, you'll want to set your output to 4K so each quadrant is full res.
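To make the arithmetic concrete: with the companion window at 3840x2160, each quadrant is exactly 1920x1080, so the crop regions in your compositing tool come out roughly like this (assuming the quadrants appear in the order listed above, left to right, top to bottom - verify against your own output):

foreground        x=0     y=0     w=1920  h=1080
foreground alpha  x=1920  y=0     w=1920  h=1080
background        x=0     y=1080  w=1920  h=1080
game view         x=1920  y=1080  w=1920  h=1080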

The Vive Pre headsets only support pairing two controllers, but you can take an original Vive dev kit controller and plug its dongle into the primary machine, or use the USB cable with adapter to avoid dealing with batteries.
aaron.leiby [developer] Mar 8, 2016 @ 7:21pm
We've been using DeckLink cards for video capture.
https://www.blackmagicdesign.com/products/decklink

We had issues with OBS for compositing, but I can't remember what ended up getting used instead. I've heard other folks have had good luck with OBS though.
https://obsproject.com/

We use After Effects for off-line compositing.

railboy Mar 8, 2016 @ 7:24pm
This is fantastic, thank you.
JERENBOR Mar 8, 2016 @ 9:36pm
Are there any solutions for doing this in Unreal 4 that anyone has found or that Valve is working on?
Bangerman Mar 9, 2016 @ 1:01am
Are there any video samples of what can be done with this?
jashan Mar 9, 2016 @ 1:08am
Yes: Owlchemy Labs just did a video of Job Simulator and I believe they've used this: https://www.youtube.com/watch?v=w9dQkiI8raY ... there's also quite a few mixed reality videos showing Fantastic Contraption, like this one (it's an early test without greenscreen): https://www.youtube.com/watch?v=CHE2xEesS00 and here's one of Fantastic Contraption with greenscreen: https://www.youtube.com/watch?v=VPQipFQpceY ... I believe Northway Games (the creators of Fantastic Contraption) were the first ones to use this, and they started with fixed cameras.

It's definitely the way to show VR to the masses on video. /me goes to order greenscreen stuffz on Amazon ;-)
Last edited by jashan; Mar 9, 2016 @ 1:10am
jashan Mar 9, 2016 @ 1:12am
Oh, and btw, I think this should be stickied :-)
jashan Mar 9, 2016 @ 1:39am
Originally posted by aaron.leiby:
Since this is a pretty intensive operation, and the primary computer is already overloaded with rendering the scene multiple times, you are better off setting up a second computer to take the video feed from the camera and the game, to composite in real-time for preview (very useful for the social aspect - e.g. set up a group of people on a couch to watch) and also take the brunt of writing the files to disk.

So you'd take the output of the "VR machine" that would usually go into the monitor and feed that into the "compositing machine", plus the camera feed, right? And for that, you'd use something like "DeckLink Studio 4K" to be able to feed two HDMI-signals (one from the VR machine, one from the camera) into the "compositing machine"?

Not sure if the "DeckLink Studio 4K" would actually be the right choice, but it looks as if it has one HDMI input and one SDI video input, which should be enough: https://www.blackmagicdesign.com/products/decklink/techspecs/W-DLK-12 ... SDI, however, is apparently only supported by fairly high end cameras, so one would probably need an additional HDMI to SDI converter to make such a setup work.
DOGÉ Mar 9, 2016 @ 6:41am
This is incredibly useful thank you. I have it working here using an older VIVE controller as the 3rd camera with the .cfg file setup. OBS functions correctly, no lag. Green screen has arrived, ready to test.

I have it working in Unity BUT the foreground and foreground alpha (for compositing) are having issues with clipping distances. I've tried various clipping distances for near and far. Normally I would use 0.01 and 10000 but that doesn't seem to display correctly - www.ir-ltd.net/uploads/example.jpg

Do you know what might be causing this?

EDIT:

It seems that turning off the External Camera/Controller (third)/camera then displays foreground and foreground alpha (for compositing) correctly? - www.ir-ltd.net/uploads/example02.jpg
Last edited by DOGÉ; Mar 9, 2016 @ 6:59am
Dylan Mar 9, 2016 @ 1:05pm
Mixed-reality Audioshield video:
https://www.youtube.com/watch?v=FhSsdsu4yIk

This one was done differently, with the game doing the compositing. The video is just a grab of the window output using Fraps.
Last edited by Dylan; Mar 9, 2016 @ 1:06pm
BOLL Mar 9, 2016 @ 1:35pm
The results of these mixed reality composites are just so great. Nothing else out there shows off better what room-scale/motion controllers are all about.

As for streaming something similar, it almost feels as if it would be beneficial to have a spectator client connect to the main game through the network and do all the extra cameras and compositing/encoding on that separate machine. But that might be a pipe dream, especially for single player games :P
hardlab Mar 9, 2016 @ 1:49pm
Just wondering for the consumer versions, how will we be able to accomplish this if we don't have one of the legacy controllers to act as the camera? Will a Pre gen controller work as a wired controller?
aaron.leiby [developer] Mar 9, 2016 @ 2:47pm
Originally posted by hardlab:
Just wondering for the consumer versions, how will we be able to accomplish this if we don't have one of the legacy controllers to act as the camera? Will a Pre gen controller work as a wired controller?
The Vive headset only supports two controller connections; however, you can plug in as many additional controllers via USB as you need.

The dongles that shipped with the original Vive dev kits are the same hardware as what ships with the Steam Controllers, just with different firmware. These can be converted fairly easily, and have the added benefit of fitting perfectly into the aux port on the Vive headset.
Last edited by aaron.leiby; Mar 9, 2016 @ 2:48pm
Shadow Mar 9, 2016 @ 2:48pm
Originally posted by S. LaBeouf:
I have it working in Unity BUT the foreground and foreground alpha (for compositing) are having issues with clipping distances. I've tried various clipping distances for near and far. Normally I would use 0.01 and 10000 but that doesn't seem to display correctly - www.ir-ltd.net/uploads/example.jpg
Unity doesn't like it if the near and far clipping planes are too far apart. I think something like far/near <= 10000 is a good idea. So I use near=0.01 and far=1000 for most of my tests. If you increase far to 10000 you should increase near to 0.1.

In theory, in that example.jpg you posted, nothing should be showing in the foreground, right? Since the external camera is close to the HMD (just slightly behind and to the left, I think?)
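For a quick sanity check on that ratio (illustrative numbers only):

near=0.1   far=1000   ->  far/near = 10,000     (around the suggested limit)
near=0.01  far=10000  ->  far/near = 1,000,000  (expect depth precision artifacts)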
Shadow Mar 9, 2016 @ 2:50pm
Originally posted by hardlab:
Just wondering for the consumer versions, how will we be able to accomplish this if we don't have one of the legacy controllers to act as the camera? Will a Pre gen controller work as a wired controller?
Any controller will work. We have experimented using old ones from the Pre, the dev-kit editions and older. One thing we did notice is with really bright lights the old controllers (think the ones from GDC last year) have a hard time tracking, while the Pre ones work great.
Miss Stabby Mar 9, 2016 @ 4:50pm
Here's a crazy idea:

What if we could record the demo while playing on a legacy Vive devkit, and film using the Vive Pre's HMD, with its tracking camera, as the external camera?

I would imagine this setup to have the linkbox connect to the legacy vive with USB+Power+HDMI and to have the Vive Pre HMD connected to the same computer using just USB+Power.

That way you'd have a tracked/working HMD for VR and a tracked "camera" with all the known specifications/offsets/fov's/distortion to seamlessly work into the compositing of the scene.

Would it even be possible to connect two Pre HMDs to the same computer, if a developer has multiples?