I'm curious about how computers handle numbers with infinitely many digits, pi in particular.
For a computer to render a circle, it would have to understand pi. But how can it, if pi's digits go on forever?
Am I reading too much into this? Would it just use a rounded value?
Mathematically, computers are both finite and discrete, and therefore can neither know pi completely nor render a circle exactly.
However, infinite precision and true continuity don't exist in the digital realm anyway, so it is sufficient to approximate pi and use that approximation to render an approximate circle, producing exactly the same pixels that an exact value of pi would have produced.
Either way, the resulting pixels aren't really a circle, because they are a finite collection of discrete points, whereas a circle is a curve made up of infinitely many points, most of which have irrational coordinates.
(It has been pointed out to me that pi is not normally used to plot a circle. That is true; however, the methods used to plot a circle are related to the formulas used to express or calculate pi, and they run into the same issues.)
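As an illustration (my own sketch, not something from the answer above), the following Python snippet assumes a naive parametric plot onto an integer pixel grid. It shows that the 64-bit approximation in `math.pi` and an even coarser 10-digit approximation produce the same pixels, because rounding to integer pixel coordinates swallows the tiny error:

```python
import math

def circle_pixels(cx, cy, r, pi_value, samples=1000):
    """Sample the parametric circle (cx + r*cos(t), cy + r*sin(t)) and
    return the set of integer pixel coordinates it touches."""
    pixels = set()
    for i in range(samples):
        t = 2.0 * pi_value * i / samples
        pixels.add((round(cx + r * math.cos(t)),
                    round(cy + r * math.sin(t))))
    return pixels

# math.pi (a 64-bit approximation) versus an even coarser 10-digit
# approximation: the difference is far smaller than a pixel, so the
# rendered pixels come out identical.
print(circle_pixels(50, 50, 20, math.pi) ==
      circle_pixels(50, 50, 20, 3.141592654))   # expected: True
```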
An approximation is generally sufficient. To "render" a circle, the computer only needs to know pi well enough to render accurately at whatever finite resolution is required.
Edit: as others have pointed out, you don't even need pi to render a circle. Still, the gist of the question was "how do computers deal with numbers like pi?" They use approximations, and whoever is using those approximations must decide whether they are precise enough for the given purpose.
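For completeness, here is a sketch of one such pi-free approach: the classic midpoint (Bresenham-style) circle algorithm, which rasterises x² + y² = r² using only integer arithmetic. The function name and structure are my own illustration, not part of the answer above:

```python
def midpoint_circle(cx, cy, r):
    """Rasterise a circle of radius r centred at (cx, cy) using only
    integer arithmetic derived from x**2 + y**2 == r**2 (no pi needed)."""
    pixels = set()
    x, y = r, 0
    err = 1 - r                      # decision variable for the midpoint test
    while x >= y:
        # Mirror each computed point into all eight octants.
        for dx, dy in ((x, y), (y, x), (-y, x), (-x, y),
                       (-x, -y), (-y, -x), (y, -x), (x, -y)):
            pixels.add((cx + dx, cy + dy))
        y += 1
        if err < 0:
            err += 2 * y + 1
        else:
            x -= 1
            err += 2 * (y - x) + 1
    return pixels

print(sorted(midpoint_circle(0, 0, 3)))   # the 16 pixels of a radius-3 circle
```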