 

Using shared libraries vs a single executable

My colleague claims that we should split our C++ application (running on Linux) into shared libraries to improve code modularity, testability and reuse.

From my point of view it's a burden: the code we write does not need to be shared between applications on the same machine, nor does it need to be dynamically loaded or unloaded, so we can simply link a monolithic executable.

Furthermore, wrapping C++ classes with C-function interfaces IMHO makes the code uglier.

I also think a single-file application will be much easier to upgrade remotely at a customer's site.

Should dynamic libraries be used when there is no need to share binary code between applications and no need for dynamic code loading?

asked Sep 16 '09 by jackhab

People also ask

Do shared libraries need an executable?

As shared libraries cannot be directly executed, they need to be linked into a system executable or callable shared object. Hence, shared libraries are searched for by the system linker during the link process. This means that a shared library name must always start with the prefix lib and have the extension .so.
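
A minimal sketch of that convention (the point.hpp/libpoint.so names and the g++ invocations are illustrative, not taken from the question):

    // point.hpp -- public interface of the library
    #pragma once
    struct Point { double x, y; };
    double distance(const Point& a, const Point& b);

    // point.cpp -- implementation, compiled into libpoint.so
    #include "point.hpp"
    #include <cmath>
    double distance(const Point& a, const Point& b) {
        return std::hypot(a.x - b.x, a.y - b.y);
    }

    // main.cpp -- the executable the library is linked into
    #include <iostream>
    #include "point.hpp"
    int main() {
        std::cout << distance({0, 0}, {3, 4}) << '\n';  // prints 5
    }

    // Typical build commands (illustrative):
    //   g++ -fPIC -shared point.cpp -o libpoint.so
    //   g++ main.cpp -L. -lpoint -o app      // -lpoint resolves to libpoint.so via the lib/.so convention
    //   LD_LIBRARY_PATH=. ./app              // so the loader finds libpoint.so at run time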

What is the purpose of using shared libraries?

Shared libraries are the most common way to manage dependencies on Linux systems. These shared resources are loaded into memory before the application starts, and when several processes require the same library, it will be loaded only once on the system. This feature saves on memory usage by the application.
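
One way to see this from a running process is to look at which shared objects the dynamic loader has mapped in. A small Linux-only sketch (reading /proc/self/maps is my illustration, not something from the answer above):

    // maps.cpp -- list the shared objects mapped into this process
    #include <fstream>
    #include <iostream>
    #include <string>

    int main() {
        std::ifstream maps("/proc/self/maps");           // one memory mapping per line
        std::string line;
        while (std::getline(maps, line)) {
            if (line.find(".so") != std::string::npos)   // keep only shared-object mappings
                std::cout << line << '\n';
        }
    }

Running several programs built this way shows the same libstdc++.so file backing mappings in each of them; the read-only code pages are shared between the processes.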

Why would you prefer to use shared libraries instead of static libraries for compiled binaries?

The most significant advantage of shared libraries is that there is only one copy of code loaded in memory, no matter how many processes are using the library. For static libraries each process gets its own copy of the code. This can lead to significant memory wastage.

Are shared libraries slower?

Programs that use shared libraries are usually slower than those that use statically-linked libraries. A more subtle effect is a reduction in "locality of reference." You may be interested in only a few of the routines in a library, and these routines may be scattered widely in the virtual address space of the library.


2 Answers

I'd say that splitting code into shared libraries just to "improve" things, without any immediate goal in mind, is a sign of a buzzword-infested development environment. It is better to write code that can easily be split at some point.

But why would you need to wrap C++ classes in C-function interfaces at all, except, maybe, for object creation?
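
The one place a thin C layer usually shows up is a dlopen()-style plugin boundary, and even there only creation and destruction need it; everything else stays an ordinary C++ class behind an abstract interface. A hedged sketch (Plugin, create_plugin and destroy_plugin are made-up names for illustration):

    // plugin.hpp -- abstract C++ interface shared by the application and the plugin
    #pragma once
    struct Plugin {
        virtual void run() = 0;
        virtual ~Plugin() = default;
    };

    // plugin.cpp -- compiled into the shared library that gets dlopen()ed
    #include "plugin.hpp"
    #include <iostream>

    namespace {
    struct HelloPlugin : Plugin {
        void run() override { std::cout << "hello from the plugin\n"; }
    };
    }

    // Only these two factory functions cross the C boundary.
    extern "C" Plugin* create_plugin()           { return new HelloPlugin; }
    extern "C" void    destroy_plugin(Plugin* p) { delete p; }

The application resolves create_plugin with dlsym() and then talks to the object through the virtual interface. If the library is linked at build time rather than dlopen()ed, even this thin C layer is unnecessary.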

Also, splitting into shared libraries here sounds like an interpreted-language mindset. In compiled languages you try not to postpone until runtime what you can do at compile time, and unnecessary dynamic linking is exactly that.

answered Nov 04 '22 by Michael Krelin - hacker


Enforcing shared libraries ensures that the libraries don't have circular dependencies. Using shared libraries often leads to faster link times, and link errors are discovered at an earlier stage than when nothing is linked until the final application. If you want to avoid shipping multiple files to customers, you can consider linking the application dynamically in your development environment and statically when creating release builds.

EDIT: I don't really see a reason why you would need to wrap your C++ classes in C interfaces - this is handled behind the scenes. On Linux you can export C++ classes from shared libraries without any special handling. On Windows, however, you would need __declspec(dllexport) and __declspec(dllimport).
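
The usual way to paper over that platform difference is a single export macro in a header. A sketch under made-up names (MYLIB_API, MYLIB_BUILDING and Logger are illustrative):

    // mylib_export.hpp -- dllexport/dllimport on Windows, a no-op on Linux
    #pragma once
    #if defined(_WIN32)
    #  if defined(MYLIB_BUILDING)          // defined only while compiling the library itself
    #    define MYLIB_API __declspec(dllexport)
    #  else                                // consumers of the library import the symbols
    #    define MYLIB_API __declspec(dllimport)
    #  endif
    #else
    #  define MYLIB_API                    // GCC/Clang export symbols by default
    #endif

    // usage in one of the library's headers:
    class MYLIB_API Logger {
    public:
        void log(const char* message);
    };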

answered Nov 04 '22 by larsmoa