 

Running time of computing mathematical functions

Tags:

algorithm

Where can I turn for information regarding computing times of mathematical functions? Has any (general) study with any amount of rigor been made?

For instance, the computing time of

constant + constant

is generally O(1).

Suppose I want to start using math like integrals, and I'd like to get an asymptotic estimate of the time needed to compute various integrals. Has there been a standard study of this, or must I take the information I have and work out my own approximation? I'd be very interested in a standard approach to this, and I'd like to know whether one already exists.
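
To make the kind of cost I mean concrete, here is a rough sketch (the function and parameter names are just illustrative, not part of a standard reference): approximating a definite integral with a composite trapezoidal rule takes O(N) evaluations of the integrand for N subintervals.

    # Composite trapezoidal rule: O(N) integrand evaluations for N subintervals.
    # f, a, b, n are illustrative parameters.
    def trapezoid(f, a, b, n):
        h = (b - a) / n
        total = 0.5 * (f(a) + f(b))
        for i in range(1, n):          # n - 1 interior points -> O(n) work
            total += f(a + i * h)
        return total * h

    # Example: integrate x**2 over [0, 1]; exact value is 1/3.
    approx = trapezoid(lambda x: x * x, 0.0, 1.0, 1000)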

Here's my motivation: I'm in the middle of writing a paper that points out the equivalence between NP-hard problems and certain types of mathematical equations. It seems there might be use for a study of the computing times of mathematics, generalized like a new science.

EDIT: I guess I'm wondering whether any given piece of mathematics has a standard, unavoidable computational complexity. I'm wondering if anyone has studied this question; I'd love to see what others have tried.

EDIT 2: Wikipedia has an article on "Computational Complexity Theory", which I think may fit the bill. I'm still wondering whether someone who has studied this could confirm that.

asked Jul 12 '10 by Matt Groff


People also ask

How do you find the running time of a function?

To estimate the running time, find the deepest nesting of loops that iterate over a significant portion of the input. Some algorithms use nested loops where the outer loop goes through an input of size n while the inner loop goes through a different input of size m; the time complexity in such cases is O(nm), as in the sketch below.
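
A small illustration (hypothetical code, not from the answer above): the inner body runs once for every pair drawn from the two inputs, so the total work is n * m.

    # Nested loops over two inputs: the inner body runs len(a) * len(b) times,
    # so the time complexity is O(n * m).
    def count_common_pairs(a, b):
        count = 0
        for x in a:            # runs n times
            for y in b:        # runs m times for each x
                if x == y:
                    count += 1
        return count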

How do you calculate runtime complexity?

For any loop, work out the runtime of the block inside it and multiply that by the number of times the program repeats the loop. Any loop that runs a number of times proportional to the input size contributes linear time complexity, O(n).
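
For example (an illustrative sketch): a single loop that does constant work per element and runs once per element grows linearly with the input.

    # One pass over the input with constant work per element: O(n).
    def total(values):
        s = 0
        for v in values:       # executes len(values) times
            s += v
        return s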

What is running time in algorithm?

The running time of an algorithm is the time it takes to execute, expressed as a function of the input size. An algorithm's running time for a given input depends on the number of operations executed; an algorithm that performs more operations takes longer.

What is the running time of the program?

In computer science, runtime, run time, or execution time is the final phase of a computer program's life cycle, in which the code is being executed on the computer's central processing unit (CPU) as machine code.


1 Answer

"Standard" math has no notion of algorithmic complexity. That's reserved for computer algorithms.

There are ways to analyze the dynamic behavior of solutions of equations. Things like convergence matter a great deal to mathematicians.

You can ask what the algorithmic complexity of Euler integration is versus a fifth-order Runge-Kutta method. They would be compared based on the number of function evaluations required per step and on time-step stability.
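
To make that comparison concrete, here is a rough sketch (using forward Euler and the classical fourth-order Runge-Kutta step as a stand-in for the methods named above): per time step, Euler needs one evaluation of the derivative function while RK4 needs four, so both are O(steps) overall and differ in the constant factor and in how large a step each allows for a given accuracy.

    # Cost model per step (a sketch): forward Euler uses 1 evaluation of f,
    # classical fourth-order Runge-Kutta (RK4) uses 4.
    def euler_step(f, t, y, h):
        return y + h * f(t, y)                          # 1 function evaluation

    def rk4_step(f, t, y, h):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)  # 4 function evaluations

    # Example: dy/dt = -y, y(0) = 1, integrated over [0, 1] with fixed step h.
    f = lambda t, y: -y
    y_euler, y_rk4, t, h = 1.0, 1.0, 0.0, 0.01
    for _ in range(100):
        y_euler = euler_step(f, t, y_euler, h)
        y_rk4 = rk4_step(f, t, y_rk4, h)
        t += h
    # Both approximate y(1) = exp(-1) ~ 0.3679; RK4 is far more accurate per step.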

But what's the "running time" of the solution to Fermat's Last Theorem? What about the last of David Hilbert's challenge problems? Is the "running time" for those a century and counting? What's your running time for solving a partial differential equation using separation of variables?

When you think about it that way, do you have a better understanding of why people would be put off by your question?

answered Sep 25 '22 by duffymo