Consider the following code in C:

```c
int n;
scanf("%d", n);
```

It gives the error "Segmentation fault (core dumped)" with the GCC compiler on Linux Mandriva, but the following code

```c
int *p = NULL;
*p = 8;
```

gives only "Segmentation fault". Why is that so?
A core dump is a file containing a dump of the state and memory of a program at the time it crashed. Since core dumps can take non-trivial amounts of disk space, there is a configurable limit on how large they can be; you can see it with `ulimit -c`.
Now, when you get a segmentation fault, the default action is to terminate the process and dump core. Your shell reports what happened: if a process terminated with a segmentation fault signal, it prints `Segmentation fault`, and if that process additionally dumped core (which happens when the `ulimit` setting and the permissions on the directory where the core dump is to be written allow it), it tells you so.
Assuming you're running both of these on the same system, with the same `ulimit -c` setting (which would be my first guess as to the cause of the difference you're seeing), it's possible the optimizer is "noticing" the clearly undefined behavior in the second example and generating its own exit. You could check with `objdump -x`.