pip install lxml
The last few lines of my error, logged by pip:
cl : Command line warning D9025 : overriding '/W3' with '/w'
lxml.etree.c
C:\Users\NATHAN~1\AppData\Local\Temp\pip_build_nathanielanderson\lxml\src\lxml\includes\etree_defs.h(9) : fatal error C1083: Cannot open include file: 'libxml/xmlversion.h': No such file or directory
C:\Python34\lib\distutils\dist.py:260: UserWarning: Unknown distribution option: 'bugtrack_url'
warnings.warn(msg)
error: command 'C:\Program Files\Microsoft Visual Studio 10.0\VC\BIN\cl.exe' failed with exit status 2
So I can't install from the .egg or by compiling...
I also can't find a Windows installer (exe or msi or whatever) for this version of Python.
Update 10/16/2019

As a commenter notes, the executable links are no longer available. PyPI now hosts whl (wheel) files for the lxml library, and you can use pip to install from those whl files on pypi.org; the executables only covered Python 2.6, 2.7, 3.2, 3.3, and 3.4.

Looks like Chris does provide a direct exe here:
http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml
Thanks, Chris! Any ideas why I cannot compile using pip?
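When picking a wheel from PyPI or Gohlke's page, the filename has to match your interpreter's version and bitness. A quick stdlib sketch to work out which Windows wheel tags to look for (the example lxml filename in the comment is hypothetical):

```python
import struct
import sys

# Work out which wheel build fits this interpreter, e.g. a 32-bit
# CPython 3.4 wants a wheel tagged "cp34 ... win32".
py_tag = "cp{}{}".format(sys.version_info[0], sys.version_info[1])
bits = struct.calcsize("P") * 8          # 32- vs 64-bit interpreter
plat = "win_amd64" if bits == 64 else "win32"

print("Look for a Windows wheel tagged:", py_tag, "/", plat)
# Then install the downloaded file with pip, e.g. (hypothetical name):
#   pip install lxml-3.4.4-cp34-none-win_amd64.whl
```

On a 64-bit CPython 3.4 this prints `cp34 / win_amd64`; the same tags appear in the wheel filenames on both download pages.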
I also got this problem, and the workarounds provided above did not work for me either.

Here is my system configuration:

I tried the method from a related question, but it failed. That method creates a system variable for VS2010 to use, with the value copied from my original Visual Studio 2013 configuration. However, the command line still failed with "libxml/xmlversion.h": no such file or directory.

Then I searched further on the internet and found a method that works in my case.
Download the precompiled lxml installer:

- Precompiled lxml 3.3.5: https://pypi.python.org/pypi/lxml/3.3.5#downloads
- If your system is 64-bit, you can get an unofficial x64 build here: http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml (this is what I use)
- Install it from the command line with easy_install lxml-3.2.1.win32-py3.3.exe

Reference: https://pytools.codeplex.com/workitem/1520
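After installing a prebuilt package this way, a quick sanity check is to import lxml and report which library versions it was compiled against (a sketch; it degrades gracefully if the install didn't work):

```python
def lxml_build_info():
    """Return the versions lxml was compiled against, or None if
    lxml is not importable (i.e. the install did not work)."""
    try:
        from lxml import etree
    except ImportError:
        return None
    return {
        "lxml": etree.LXML_VERSION,        # version tuples, e.g. (3, 3, 5, 0)
        "libxml2": etree.LIBXML_VERSION,
        "libxslt": etree.LIBXSLT_VERSION,
    }

info = lxml_build_info()
print(info if info else "lxml is not installed")
```

If this prints version tuples rather than the "not installed" message, the binary install succeeded and no local libxml2/libxslt is needed.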
If you are using python 3.4 this is the download link :
Download Here
If you have any other configuration find it HERE according to your need. ;-)
The short version is: you need to have the C library libxml2 (and also libxslt) before you can build lxml.
As the lxml installation docs say:

Unless you are using a static binary distribution (e.g. from a Windows binary installer), you need to install libxml2 and libxslt, in particular:

libxml2 2.6.21 or later. It can be found here: http://xmlsoft.org/downloads.html

- We recommend libxml2 2.7.8 or a later version.
- If you want to use XPath, do not use libxml2 2.6.27.
- If you want to use the feed parser interface, especially when parsing from unicode strings, do not use libxml2 2.7.4 through 2.7.6.

libxslt 1.1.15 or later. It can be found here: http://xmlsoft.org/XSLT/downloads.html

- We recommend libxslt 1.1.26 or later.
The build from source docs similarly start off with:
To build lxml from source, you need libxml2 and libxslt properly installed, including the header files.
Windows (unlike most other platforms) doesn't come with these libraries. You don't mention anything in your "Facts" about having them.
And the error message that you showed is:
C:\Users\NATHAN~1\AppData\Local\Temp\pip_build_nathanielanderson\lxml\src\lxml\includes\etree_defs.h(9) :
fatal error C1083: Cannot open include file: 'libxml/xmlversion.h':
No such file or directory
That 'libxml/xmlversion.h' that it can't find is part of libxml2.
It's also worth noting that the same installation docs explicitly say:
consider using the binary builds from PyPI or the unofficial Windows binaries that Christoph Gohlke generously provides.
So, the fact that you thought Christoph Gohlke didn't provide binaries for lxml implies that you hadn't found these docs.

So, it's possible that you did install libxml2, but not in a way that lxml's setup script can find it. But all the evidence implies it's a lot more likely that you just don't have it.