 

Core dump file not generated on segmentation fault

I am trying to debug a segmentation fault in my C program using gdb. A core dump file is not generated automatically when I run the program, and I have to run the command

ulimit -c unlimited

for a core file to be generated on the next run.

Why is a core dump file not generated automatically, and why do I have to run the ulimit command every time to get a core file on the next run of my program?

The operating system I use is Ubuntu 10.10.

Tim asked Oct 16 '10


People also ask

Is segmentation fault a core dump?

A core dump (segmentation fault) in C/C++ is a specific kind of error caused by accessing memory that "does not belong to you." When a piece of code tries to read or write a read-only location in memory or a freed block of memory, the program crashes with a segmentation fault, and the operating system may write a core dump of the process.

Where is the core dump file located?

By default, core dumps are sent to systemd-coredump, which can be configured in /etc/systemd/coredump.conf. By default, all core dumps are stored in /var/lib/systemd/coredump (due to Storage=external) and they are compressed with zstd (due to Compress=yes).


2 Answers

You need to place the command

ulimit -c unlimited

in your environment settings.

If you are using bash as your shell, you need to place the above command in your ~/.bashrc.
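A minimal sketch, assuming bash: append the setting to ~/.bashrc so every new interactive shell raises the soft core-file limit, then apply and verify it in the current shell.

```shell
# Persist the setting for future interactive bash shells
echo 'ulimit -c unlimited' >> ~/.bashrc

# Apply and verify it in the current shell
ulimit -c unlimited
ulimit -c        # prints: unlimited
```

Note that `ulimit` only affects the current shell and its children, which is why the setting does not survive across sessions unless it lives in a startup file.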

Michalis Giannakidis answered Oct 11 '22


You might also want to edit the /etc/security/limits.conf file instead of adding ulimit -c unlimited to ~/.bashrc.

limits.conf is the "correct" place to specify core dump limits in most Linux distros, since it applies system-wide rather than per shell.
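As a sketch, an entry in /etc/security/limits.conf that removes the core-file size limit for all users might look like this (the `*` wildcard and values here are illustrative; consult the limits.conf man page for the exact fields on your system):

```
# <domain>  <type>  <item>  <value>
*           soft    core    unlimited
```

Changes in limits.conf take effect for new login sessions, not for shells that are already running.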

user389238 answered Oct 11 '22