What is a good approach to "attaching" GUI controls such as forms, buttons, checkboxes etc to objects in a three.js scene?
i.e., I'd like to show a 3D model, let the user click and pick objects in that model, and see a pop-up menu that leads to forms for setting the object's properties, performing other actions, etc.
(A rough equivalent probably would be Nifty GUI if I were to use JMonkeyEngine.)
I use jQuery UI components with the Three.js raycaster.
In my HTML:
<div id="main-canvas">
  <div id="interface">
    <!-- markup for your various modals, etc. -->
  </div>
</div>
I use the raycasting example from Mr. Doob here to handle clicks on my canvas. If the ray hits an object, I fire off the relevant jQuery UI code; for example, opening a modal when the user clicks on a planet sphere object. From inside the modal you can then trigger things to happen in your WebGL canvas.
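A minimal sketch of that flow, assuming a camera, scene, and renderer created elsewhere, plus a jQuery UI dialog already initialized on an element with id "planet-dialog" (that id and the variable names are illustrative, not part of the original answer):

// Assumes: camera, scene, renderer already set up, and
// $('#planet-dialog').dialog({ autoOpen: false }) called somewhere.
var raycaster = new THREE.Raycaster();
var mouse = new THREE.Vector2();

renderer.domElement.addEventListener('click', function (event) {
  // Convert the click position to normalized device coordinates (-1 to +1).
  // window dimensions are fine here because the canvas fills the window.
  mouse.x = (event.clientX / window.innerWidth) * 2 - 1;
  mouse.y = -(event.clientY / window.innerHeight) * 2 + 1;

  raycaster.setFromCamera(mouse, camera);

  var intersects = raycaster.intersectObjects(scene.children, true);
  if (intersects.length > 0) {
    var picked = intersects[0].object;
    // Remember which object was picked, then open the jQuery UI modal;
    // the modal's own handlers can read it back and change the scene.
    $('#planet-dialog').data('target', picked).dialog('open');
  }
});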
Since my application takes up the entire window, I had to do some CSS to make sure the nested interface div didn't cause any scrollbars.
body {
  background-color: black;
  margin: 0px;
}

div#interface {
  position: absolute;
  width: 100%;
}
This has been working really well for me.
dat.GUI is a popular library among Three.js users for such things: http://code.google.com/p/dat-gui/ It's even included in the Three.js distribution, under /examples/js/libs/.
Here's one example of it in use: http://jabtunes.com/labs/3d/dof/webgl_postprocessing_dof2.html
The only problem I've found is that it's hard to create custom controls/widgets if you're not happy with the built-in ones. It's still pretty good, though.
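For reference, a minimal sketch of hooking dat.GUI up to a three.js object; the mesh and parameter names here are illustrative assumptions:

// Assumes mesh, scene, camera, renderer exist; names are illustrative.
var params = {
  rotationSpeed: 0.01,
  wireframe: false
};

var gui = new dat.GUI();
gui.add(params, 'rotationSpeed', 0, 0.1);          // slider control
gui.add(params, 'wireframe').onChange(function (value) {
  mesh.material.wireframe = value;                 // checkbox toggles the material
});

// The render loop reads the current slider value every frame.
function animate() {
  requestAnimationFrame(animate);
  mesh.rotation.y += params.rotationSpeed;
  renderer.render(scene, camera);
}
animate();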
For selecting/activating objects with the mouse there's plenty of information; just google "three.js picking" or similar.