Does anyone know of an irrational number representation type/object/class/whatever in any programming language?
All suggestions welcome.
Simply put, if I have two irrational objects, both representing the square root of five, and I multiply those objects, I want to get back the integer five, not a float like 4.9999….
Specifically, I need the representation to be able to collect terms, not just resolve every operation to an integer/float. For instance, if I add the square root of five to one, I don't want some float approximation back; I want an object that I can keep adding and multiplying with other irrational objects, so that I can tell it to resolve to a number at the latest possible time and minimize floating-point approximation error.
Thanks much!
An irrational number is a real number that cannot be expressed as a fraction p/q where p and q are integers and q is not zero. Because irrational numbers are defined negatively, the set is sometimes denoted P: the real numbers (R) that are not rational (Q). Equivalently, an irrational number's decimal expansion goes on forever without ever settling into a repeating pattern, so it can never be written exactly as a fraction or a finite decimal. (A repeating decimal such as 0.66666… is rational, by contrast: it equals 2/3.)
What you are looking for is called symbolic mathematics. You might want to try a computer algebra system such as Maxima, Maple, or Mathematica. There are also libraries for this purpose, for example the SymPy library for Python.
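As a quick sketch of how this looks in SymPy (the specific expressions below are just illustrations of the behavior asked for in the question):

```python
import sympy as sp

# sp.sqrt(5) is an exact symbolic object, not a float approximation.
r = sp.sqrt(5)

# Multiplying it by itself simplifies to the exact integer 5,
# not 4.999... as it would with floats.
product = r * r  # -> 5 (a SymPy Integer)

# Adding 1 keeps the result symbolic; terms are collected, not evaluated.
expr = r + 1  # -> sqrt(5) + 1

# You can keep combining symbolic expressions exactly:
# (sqrt(5) + 1)(sqrt(5) - 1) expands to 5 - 1 = 4.
combined = sp.expand((r + 1) * (r - 1))  # -> 4

# Only when you explicitly ask for a numeric value does any
# floating-point approximation happen, at the last possible moment.
approx = sp.N(expr, 30)  # sqrt(5) + 1 to 30 significant digits

print(product, expr, combined, approx)
```

Calling `sp.N` (or `.evalf()`) is the "resolve as late as possible" step: everything before it is exact, so no rounding error accumulates along the way.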