
Why can't Python find shared objects that are in directories in sys.path?


sys.path is only searched for Python modules. For dynamically linked libraries, the directories searched must be listed in LD_LIBRARY_PATH. Check whether your LD_LIBRARY_PATH includes /usr/local/lib, and if it doesn't, add it and try again.
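
For example, a quick way to confirm that it is the dynamic loader (not Python) that cannot find the library (assuming the failing extension is pycurl; the site-packages path below is a placeholder):

ldd /path/to/site-packages/pycurl.so | grep "not found"   # any line here is a library the loader cannot locate
echo $LD_LIBRARY_PATH                                     # check whether /usr/local/lib is listed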

Some more information (source):

In Linux, the environment variable LD_LIBRARY_PATH is a colon-separated set of directories where libraries should be searched for first, before the standard set of directories; this is useful when debugging a new library or using a nonstandard library for special purposes. The environment variable LD_PRELOAD lists shared libraries with functions that override the standard set, just as /etc/ld.so.preload does. These are implemented by the loader /lib/ld-linux.so. I should note that, while LD_LIBRARY_PATH works on many Unix-like systems, it doesn't work on all; for example, this functionality is available on HP-UX but as the environment variable SHLIB_PATH, and on AIX this functionality is through the variable LIBPATH (with the same syntax, a colon-separated list).

Update: to set LD_LIBRARY_PATH, use one of the following, ideally in your ~/.bashrc or equivalent file:

export LD_LIBRARY_PATH=/usr/local/lib

or

export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH

Use the first form if LD_LIBRARY_PATH is currently empty (equivalent to the empty string, or not present at all), and the second form if it already has a value. Note the use of export.
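
A quick way to tell which form applies, and to verify the fix in the same shell (pycurl is just the example module from this thread):

echo "${LD_LIBRARY_PATH:-(empty or unset)}"        # decides which form above to use
python -c "import pycurl; print(pycurl.version)"   # after exporting, the import should succeed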


Ensure your libcurl.so module is in the system library search path, which is distinct and separate from the Python library path (sys.path).

A "quick fix" is to add this path to a LD_LIBRARY_PATH variable. However, setting that system wide (or even account wide) is a BAD IDEA, as it is possible to set it in such a way that some programs will find a library it shouldn't, or even worse, open up security holes.

If your "locally installed libraries" are installed in, for example, /usr/local/lib, add this directory to /etc/ld.so.conf (it's a text file) and run "ldconfig"

The command rebuilds the loader's cache, and will also create all the necessary "symbolic links" required for the loader system to function. It is surprising that the "make install" for libcurl did not do this already, but it's possible it could not if /usr/local/lib is not in /etc/ld.so.conf already.

PS: it's possible that your /etc/ld.so.conf contains nothing but "include ld.so.conf.d/*.conf". You can still add a directory path after it, or just create a new file inside the directory it's being included from. Don't forget to run "ldconfig" afterwards.
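
For example, a sketch of the "new file" variant (the .conf file name here is arbitrary):

echo "/usr/local/lib" | sudo tee /etc/ld.so.conf.d/usr-local-lib.conf
sudo ldconfig
ldconfig -p | grep libcurl    # the library should now show up in the loader cache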

Be careful. Getting this wrong can screw up your system.

Additionally: make sure your Python module is compiled against THAT version of libcurl. If you just copied some files over from another system, this won't always work. If in doubt, compile your modules on the system you intend to run them on.


You can also set LD_RUN_PATH to /usr/local/lib in your user environment when you compile pycurl in the first place. This will embed /usr/local/lib in the RPATH attribute of the C extension module .so, so that it automatically knows where to find the library at run time without needing LD_LIBRARY_PATH to be set.
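
Roughly, for a setup.py-based build of pycurl (the build output path below is a guess; adjust to wherever your .so ends up):

LD_RUN_PATH=/usr/local/lib python setup.py build_ext
python setup.py install
readelf -d build/lib*/pycurl*.so | grep -iE 'rpath|runpath'   # confirms the path was embedded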


Had the exact same issue. I installed curl 7.19 to /opt/curl/ to make sure I would not affect the existing curl on our production servers. Once I had linked libcurl.so.4 into /usr/lib:

sudo ln -s /opt/curl/lib/libcurl.so /usr/lib/libcurl.so.4

I still got the same error! Durf.

But running ldconfig made the link for me, and that worked. No need to set LD_RUN_PATH or LD_LIBRARY_PATH at all. Just needed to run ldconfig.
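
In other words, something like this (pycurl again being the example module from this thread):

sudo ldconfig
python -c "import pycurl"    # the import should now resolve libcurl.so.4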


As a supplement to the above answers - I just bumped into a similar problem, working entirely off the default installed Python.

When I run the example script for the shared object library I'm looking for with LD_LIBRARY_PATH set, I get something like this:

$ LD_LIBRARY_PATH=/path/to/mysodir:$LD_LIBRARY_PATH python example-so-user.py
python: can't open file 'example-so-user.py': [Errno 2] No such file or directory

Notably, it doesn't even complain about the import - it complains about the source file!

But if I force loading of the object using LD_PRELOAD:

$ LD_PRELOAD=/path/to/mysodir/mypyobj.so python example-so-user.py
python: error while loading shared libraries: libtiff.so.5: cannot open shared object file: No such file or directory

... I immediately get a more meaningful error message - about a missing dependency!

Just thought I'd jot this down here - cheers!
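
A related trick: running ldd on the extension module reports the same missing dependencies without involving Python at all (paths as above are placeholders):

ldd /path/to/mysodir/mypyobj.so | grep "not found"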


I use

python setup.py build_ext -R/usr/local/lib -I/usr/local/include/libcalg-1.0

and the compiled .so file ends up under the build folder. You can type python setup.py --help build_ext to see the explanations of -R (runtime library search path) and -I (include directories).


For me, what works here is using a version manager such as pyenv, which I strongly recommend for keeping your project environments and package versions well managed and separate from those of the operating system.

I had this same error after an OS update, but it was easily fixed with pyenv install 3.7-dev (the version I use).
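
In case it helps, the rough sequence (the version shown is just the one from this answer; pick whatever your project uses, and pycurl is only the example module from this thread):

pyenv install 3.7-dev
pyenv local 3.7-dev          # use this interpreter for the current project directory
python -c "import pycurl"    # re-test the import against the freshly built interpreter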