I was solving a problem where I used a recursive function to build a segment tree. For larger inputs it started giving a segmentation fault. At first I thought it might be because of an array index going out of bounds, but later I suspected it was because the program stack was growing too large. I wrote this code to count the maximum number of recursive calls allowed before the system gives a seg-fault.
#include <iostream>
using namespace std;

void recur(long long int);

int main()
{
    recur(0);
    return 0;
}

void recur(long long int v)
{
    v++;
    cout << v << endl;
    recur(v);
}
After running the above code I got values of v of 261926, 261893, and 261816 before getting a segmentation fault, and all runs gave values close to these.
Now I know that this will vary from machine to machine and depend on the stack usage of the function being called, but can someone explain the basics of how to stay safe from seg-faults and what soft limit one can keep in mind?
There is no general number of recursions that will cause a stack overflow. Removing local variables lets the function recurse further, because each call's frame takes less memory, but it will still eventually overflow the stack.
A seg fault of this kind occurs when the call stack grows past its limit, i.e. there are too many levels of recursion (a stack overflow).
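As a back-of-the-envelope check (assuming a typical 8 MiB default stack on Linux): 8 MiB is 8,388,608 bytes, and 8,388,608 / 262,000 ≈ 32 bytes per call, which is roughly what you would expect for a frame holding one long long argument plus a saved return address and frame pointer. That lines up well with the ~261,000-262,000 values observed in the question.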
The maximal number of nested calls (including the first one) is called the recursion depth. The maximum recursion depth is limited by the JavaScript engine. We can rely on it being about 10000; some engines allow more, but 100000 is probably beyond the limit for the majority of them.
In Python, the default maximum recursion depth is 1000. You can change the limit by calling sys.setrecursionlimit().
The number of recursion levels you can reach depends on the call-stack size combined with the size of the local variables and arguments that are placed on that stack. Aside from how the code is written, like many other memory-related things this is very much dependent on the system you're running on, the compiler you are using, the optimisation level [1], and so on. On some embedded systems I've worked on the stack would be a few hundred bytes, my first home computer had 256 bytes of stack, whereas modern desktops have megabytes of stack (and you can adjust it, but eventually you will run out).
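As a rough illustration of inspecting that limit, here is a sketch assuming a POSIX system (on the shell, ulimit -s reports the same thing; on Windows the stack size is instead fixed at link time):

#include <iostream>
#include <sys/resource.h>   // POSIX; not available on Windows

int main()
{
    // Ask the OS for the current (soft) stack size limit of this process.
    rlimit rl{};
    if (getrlimit(RLIMIT_STACK, &rl) != 0) {
        std::cerr << "getrlimit failed\n";
        return 1;
    }
    if (rl.rlim_cur == RLIM_INFINITY)
        std::cout << "stack limit: unlimited\n";
    else
        std::cout << "stack limit: " << rl.rlim_cur << " bytes\n";
    return 0;
}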
Doing recursion at unlimited depth is not a good idea, and you should look at changing your code so that "it doesn't do that". You need to understand the algorithm, know to what depth it will recurse, and decide whether that is acceptable on your system; a simple depth guard, as sketched below, can at least turn a crash into a clean error. There is unfortunately nothing anyone can do at the time the stack runs out (at best your program crashes; at worst it doesn't, but instead causes something ELSE to go wrong, such as the stack or heap of some other application getting messed up!).
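A minimal sketch of such a guard applied to the posted function (the 100000 budget is an assumption; pick whatever depth your analysis says is safe on your target):

#include <iostream>
#include <stdexcept>

const long long kMaxDepth = 100000;   // assumed budget - tune it to your stack size

void recur(long long v)
{
    // Fail cleanly instead of letting the call stack overflow.
    if (v >= kMaxDepth)
        throw std::runtime_error("recursion depth budget exceeded");
    recur(v + 1);
}

int main()
{
    try {
        recur(0);
    } catch (const std::exception& e) {
        std::cerr << e.what() << '\n';
    }
    return 0;
}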
On a desktop machine, I'd think it's acceptable to have a recursion depth of a few hundred to a few thousand, but not much more than this - and that is assuming small stack usage in each call. If each call uses up kilobytes of stack, you should limit the call depth even further, or reduce the need for stack space.
If you need more recursion depth than that, you need to re-arrange the code - for example by using a software stack to store the state and a loop in the code itself, as sketched below.
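Here is a minimal sketch of that idea applied to a sum segment tree like the one in the question (the layout with node 1 as the root and children 2*i and 2*i+1 is an assumption about your tree; the point is that the pending work lives in a std::stack on the heap rather than in call frames):

#include <cstddef>
#include <stack>
#include <vector>

struct Frame { std::size_t node, lo, hi; bool children_done; };

std::vector<long long> build_segment_tree(const std::vector<long long>& a)
{
    if (a.empty()) return {};
    std::vector<long long> tree(4 * a.size(), 0);
    std::stack<Frame> work;                  // explicit "software stack"
    work.push({1, 0, a.size() - 1, false});
    while (!work.empty()) {
        Frame f = work.top(); work.pop();
        if (f.lo == f.hi) {                  // leaf node
            tree[f.node] = a[f.lo];
            continue;
        }
        std::size_t mid = (f.lo + f.hi) / 2;
        if (!f.children_done) {
            // Revisit this node after both children have been built.
            work.push({f.node, f.lo, f.hi, true});
            work.push({2 * f.node,     f.lo,    mid,  false});
            work.push({2 * f.node + 1, mid + 1, f.hi, false});
        } else {
            tree[f.node] = tree[2 * f.node] + tree[2 * f.node + 1];
        }
    }
    return tree;
}

Note that a balanced segment tree only recurses about log2(n) levels deep anyway, so the depth itself is rarely the problem there; the explicit-stack technique matters most for unbalanced or genuinely deep recursions.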
[1] Using g++ -O2 on your posted code, I got to 50 million and counting, and I expect that if I leave it long enough it will wrap around to zero, because it keeps going forever - g++ detects that this recursion can be converted into a loop, and does that. The same program compiled with -O0 or -O1 does indeed stop at a little over 200000. With clang++ -O1 it also just keeps going; the clang-compiled code was still running, at 185 million "recursions", as I finished writing the rest of this answer.
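For reference, the transformation the optimiser applies here is tail-call elimination; the recursion collapses into roughly this loop (a sketch of the idea, not actual compiler output):

#include <iostream>

int main()
{
    long long v = 0;
    for (;;) {          // no call frames are pushed, so no stack growth
        ++v;
        std::cout << v << std::endl;
    }
}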