Can somebody give me a realistic example in which an O(N*N) algorithm is faster than an O(N) algorithm for some N>10?
EDIT: I see this question was put on hold for being too generic, but my question is generic by nature; I don't see a different way it could have been asked.
It could be that someone tried to make an O(N*N) algorithm faster (e.g., by introducing some preconditioning of the data) and ended up with something like this:
O(N):
for (int i = 0; i < N; i++) {
    // do some expensive preconditioning of the data
    // to enable using the O(N) algorithm
}
for (int i = 0; i < N; i++) {
    // run the actual O(N) algorithm
}
O(N*N):
for (int i = 0; i < N; i++) {
    for (int j = 0; j < N; j++) {
        // run the O(N*N) algorithm
    }
}
Big-O notation only describes the limiting behavior for large N; the constant factors and lower-order terms can differ a lot between two algorithms. For example, the actual running times might be

T_linear(N) = N + 10000
T_quadratic(N) = N^2 + 0.5*N + 10

Here the large constant term of the "fast" algorithm (e.g., the preconditioning cost) dominates for small N, so the O(N*N) algorithm wins until N grows past the crossover point.