 

Weird phenomenon with three.js plane

Tags:

math

three.js

This is the first question I've ever asked on here! Apologies in advance if I've done it wrong somehow.

I have written a program which stacks up spheres in three.js.

Each sphere starts with randomly generated (within certain bounds) x and z co-ordinates, and a y co-ordinate high above the ground plane. The program casts rays from each of the sphere's vertices to see how far down the sphere can fall before it intersects with an existing mesh.

For each sphere, I test it in 80 different random xz positions, see where it can fall the furthest, and then 'drop' it into that position.
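A minimal sketch of that selection loop in plain JavaScript (`dropDistance` is a hypothetical stand-in for the raycasting test described above, not code from the question):

```javascript
// Pick the candidate xz position where a sphere can fall the furthest.
// dropDistance(x, z) stands in for the raycast test: it should return
// how far a sphere dropped at (x, z) can fall before hitting an
// existing mesh or the floor.
function bestDropPosition(dropDistance, samples, bounds) {
  let best = null;
  for (let i = 0; i < samples; i++) {
    const x = (Math.random() * 2 - 1) * bounds;
    const z = (Math.random() * 2 - 1) * bounds;
    const d = dropDistance(x, z);
    if (best === null || d > best.d) best = { x, z, d };
  }
  return best; // position with the largest fall distance
}
```

Note that the winner is chosen by a strict `>` comparison, so even differences at the 11th or 12th decimal place are enough to tip the choice.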

This is intended to create bubble towers like this one:

image of regular bubble tower

However, I have noticed that when I make the bubble radius very small and the base dimensions of the tower large, this happens:

image of bubbles clustered towards the edges of square

If I reduce the number of sampled positions from 80, this effect is less apparent. For some reason, three.js seems to think that the spheres can fall further at the corners of the base square. The origin is exactly at the center of the base square - perhaps this is relevant.

When I console log all the fall-distances I'm receiving from the raycaster, they are indeed larger the further away you get from the center of the square... but only at the 11th or 12th decimal place.
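As context for why differences might show up only at the 11th or 12th decimal place (an aside, with made-up values, not an explanation of this specific raycaster): JavaScript numbers are IEEE-754 doubles, and the gap between adjacent representable doubles grows with magnitude, so coordinates far from the origin carry slightly larger rounding error. A minimal demo:

```javascript
// The gap between adjacent representable doubles (the "ulp") grows
// with magnitude, so the same tiny offset survives near the origin
// but vanishes entirely further out.
const tiny = 1e-15;

const nearOrigin = 1 + tiny;      // ulp(1) is about 2.2e-16, so the offset survives
const farFromOrigin = 100 + tiny; // ulp(100) is about 1.4e-14, so it rounds away

console.log(nearOrigin !== 1);      // true: offset representable
console.log(farFromOrigin === 100); // true: offset lost to rounding
```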

This is not so much a problem I am trying to solve (I could just round fall distances to the nearest 10th decimal place before I pick the largest one), but something I am very curious about. Does anyone know why this is happening? Has anybody come across something similar to this before?
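The rounding workaround mentioned above could look like this (a sketch, not code from the question):

```javascript
// Quantize a fall distance to 10 decimal places so that differences
// at the 11th or 12th decimal place no longer influence which
// candidate position is chosen as "furthest".
function roundDistance(d, decimals = 10) {
  const scale = Math.pow(10, decimals);
  return Math.round(d * scale) / scale;
}
```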

EDIT:

I edited my code to shift everything so that the origin is no longer at the center of the base square:

image of edited code

So am I correct in thinking that this phenomenon has something to do with distance from the origin, rather than anything relating to the surface onto which the balls are falling?

asked Oct 06 '15 by Martha



1 Answer

Indeed, the pattern you are seeing arises exactly because the corners and edges of the bottom of your tower are furthest from the origin where you are dropping the balls. You are creating a right triangle (see image below) in which the vertical leg runs from the drop origin straight down to the point directly below it on the mesh floor (meeting the floor at a right angle - thus the name, right triangle). The hypotenuse is always the longest side of a right triangle, and the further out your rays land from the point just below the origin, the longer the hypotenuse will be, and the more your algorithm will favor that longer distance (no matter how fractional).
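A quick check of that geometry (a sketch; the height and offsets are made-up numbers): for a fixed drop height h, the distance from the drop origin to a floor point at horizontal offset r is the hypotenuse sqrt(h² + r²), which grows monotonically with r.

```javascript
// Distance from the drop origin to a floor point at horizontal offset r,
// for a fixed drop height h: the hypotenuse of a right triangle.
const h = 100; // drop height (made-up)
function hypotenuse(r) {
  return Math.hypot(h, r); // sqrt(h*h + r*r)
}

console.log(hypotenuse(0));  // 100: straight down
console.log(hypotenuse(50)); // ~111.8: longer toward an edge
console.log(hypotenuse(70)); // ~122.1: longest toward a corner
```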

Increasing the size of the tower base exaggerates this effect, because the hypotenuse measurements can grow even larger. Reducing the size of the balls also favors the pattern you are seeing: each ball now takes up less space, so the distant spots near the corners don't fill in as quickly as they would with larger balls, and more balls congregate at the edges before the rest of the space fills in.

Moving your drop origin to one side or another creates longer distances (hypotenuses) to the opposite sides and corners, so the balls fill in those distant locations first.

The reason you see less of the effect when you reduce the sample size from 80 to, say, 20 is that there are simply fewer chances to detect the more distant locations to which the balls could fall (an odds game).

A right triangle:

image of a right triangle

A back-of-the-napkin sketch:

image of a back-of-the-napkin sketch

answered Nov 15 '22 by gromiczek