 

Rendering from two cameras at the same time in A-Frame

Tags:

aframe

The recent v0.3.0 blog post mentions WebVR 1.0 support allowing "us to have different content on the desktop display than the headset, opening the door for asynchronous gameplay and spectator modes." This is precisely what I'm trying to get working. I'm looking to have one camera in the scene represent the viewpoint of the HMD and a secondary camera represent a spectator of the same scene, and to render that second view to a canvas on the same webpage. 0.3.0 removes the ability to render a-scene to a specific canvas in favor of the embedded component. Any thoughts on how to accomplish two cameras rendering a single scene simultaneously?

My intention is to have the desktop display show what a user is doing from a different perspective. My end goal is to be able to build a mixed reality green screen component.

asked Sep 13 '25 by derickson

1 Answer

While there may be a better or cleaner way to do this in the future, I was able to get a second camera rendering by looking at examples of how this is done in the THREE.js world.

I add a component called spectator to a non-active camera. In the init function I set up a new renderer and attach it to a div outside the scene to create a new canvas. I then call the render method inside the tick() part of the component lifecycle.
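For reference, here is a minimal sketch of that approach, not the exact code from the gist linked below. The #spectatorDiv id, the canvas size, and the entity markup are illustrative assumptions.

// Hypothetical sketch: a 'spectator' component that renders the same scene
// to a second canvas every frame. Assumes a <div id="spectatorDiv"> exists
// somewhere outside <a-scene>.
AFRAME.registerComponent('spectator', {
  init: function () {
    // Create a second THREE.js renderer with its own canvas and attach it
    // to a container element outside the A-Frame scene.
    this.spectatorRenderer = new THREE.WebGLRenderer({antialias: true});
    this.spectatorRenderer.setSize(640, 480);
    document.querySelector('#spectatorDiv').appendChild(this.spectatorRenderer.domElement);
  },
  tick: function () {
    // Grab the THREE camera that the camera component placed on this entity.
    var camera = this.el.getObject3D('camera');
    if (!camera) { return; }
    // Render the shared scene graph from the spectator camera's point of view.
    this.spectatorRenderer.render(this.el.sceneEl.object3D, camera);
  }
});

The spectator entity would then be declared with something like <a-entity camera="active: false" spectator position="0 5 10" rotation="-25 0 0"></a-entity>, so the HMD's active camera is left untouched.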

I have not worked out how to isolate the movement of this camera yet. The default look controls of the 0.3.0 A-Frame scene still control both cameras.

[GIF of the spectator cam in action]

Source code: https://gist.github.com/derickson/334a48eb1f53f6891c59a2c137c180fa

answered Sep 17 '25 by derickson