I'm having difficulty installing lxml with easy_install on Ubuntu 11.
When I type $ easy_install lxml I get:
Searching for lxml
Reading http://pypi.python.org/simple/lxml/
Reading http://codespeak.net/lxml
Best match: lxml 2.3
Downloading http://lxml.de/files/lxml-2.3.tgz
Processing lxml-2.3.tgz
Running lxml-2.3/setup.py -q bdist_egg --dist-dir /tmp/easy_install-7UdQOZ/lxml-2.3/egg-dist-tmp-GacQGy
Building lxml version 2.3.
Building without Cython.
ERROR: /bin/sh: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
Using build configuration of libxslt
In file included from src/lxml/lxml.etree.c:227:0:
src/lxml/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
compilation terminated.
It seems that libxslt or libxml2 is not installed. I've tried following the instructions at http://www.techsww.com/tutorials/libraries/libxslt/installation/installing_libxslt_on_ubuntu_linux.php and http://www.techsww.com/tutorials/libraries/libxml/installation/installing_libxml_on_ubuntu_linux.php with no success.
If I try wget ftp://xmlsoft.org/libxml2/libxml2-sources-2.6.27.tar.gz I get
<successful connection info>
==> SYST ... done. ==> PWD ... done.
==> TYPE I ... done. ==> CWD (1) /libxml2 ... done.
==> SIZE libxml2-sources-2.6.27.tar.gz ... done.
==> PASV ... done. ==> RETR libxml2-sources-2.6.27.tar.gz ...
No such file `libxml2-sources-2.6.27.tar.gz'.
If I try building libxslt first instead, I get as far as ./configure --prefix=/usr/local/libxslt --with-libxml-prefix=/usr/local/libxml2, and that eventually fails with:
checking for libxml libraries >= 2.6.27... configure: error: Could not find libxml2 anywhere, check ftp://xmlsoft.org/.
I've tried both versions 2.6.27 and 2.6.29 of libxml2 with no difference.
Leaving no stone unturned, I have successfully done sudo apt-get install libxml2-dev, but this changes nothing.
Since you're on Ubuntu, don't bother with those source packages. Just install those development packages using apt-get.
apt-get install libxml2-dev libxslt1-dev python-dev
If you're happy with a possibly older version of lxml altogether though, you could try
apt-get install python-lxml
and be done with it. :)
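Either way, a quick sanity check that the module actually imports (this assumes the lxml you installed is for the default python on your PATH):
python -c "import lxml.etree; print(lxml.etree.__version__)"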
I also had to install lib32z1-dev before lxml would compile (Ubuntu 13.04 x64).
sudo apt-get install lib32z1-dev
Or all the required packages together:
sudo apt-get install libxml2-dev libxslt-dev python-dev lib32z1-dev
As @Pepijn commented on @Druska's answer, on Ubuntu 13.04 x64 there is no need to use lib32z1-dev; zlib1g-dev is enough:
sudo apt-get install libxml2-dev libxslt-dev python-dev zlib1g-dev
I installed lxml with pip in Vagrant, using Ubuntu 14.04, and had the same problem. Even though all requirements were installed, I got the same error again and again. It turned out my VM had too little memory by default. With 1024 MB everything works fine.
Add this to your Vagrantfile and lxml should compile / install properly:
config.vm.provider "virtualbox" do |vb|
vb.memory = 1024
end
Thanks to sixhobbit for the hint (see: can't installing lxml on Ubuntu 12.04).
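If the VM is already running, something like this should apply the new memory setting and retry the install (assuming a standard Vagrant setup where pip inside the guest can write to its install location; otherwise prefix with sudo):
vagrant reload
vagrant ssh -c "pip install lxml"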
Step 1
Install the Python development headers using this command.
sudo apt-get install python-dev
Step 2
Install the first dependency, libxml2 version 2.7.0 or later:
sudo apt-get install libxml2-dev
Step 3
Install the second dependency, libxslt version 1.1.23 or later:
sudo apt-get install libxslt1-dev
Step 4
Install the pip package management tool first, then run this command:
pip install lxml
For Ubuntu 14.04
sudo apt-get install python-lxml
worked for me.
After installing the packages mentioned by AKX I still had the same problem. Solved it with
apt-get install python-dev
For Ubuntu 12.04.3 LTS (Precise Pangolin) I had to do:
apt-get install libxml2-dev libxslt1-dev
(Note the "1" in libxslt1-dev)
Then I just installed lxml with pip/easy_install.
From Ubuntu 18.04 (Bionic Beaver) onwards it is advisable to use apt instead of apt-get, since it has a much nicer interface.
sudo apt install libxml2-dev libxslt1-dev python-dev
If you're happy with a possibly older version of lxml altogether though, you could try
sudo apt install python-lxml
First install Ubuntu's python-lxml package and its dependencies:
sudo apt-get install python-lxml
Then use pip to upgrade to the latest version of lxml for Python:
pip install lxml
Many answers here are rather old. Thanks to the pointer from @Simplans (https://stackoverflow.com/a/37759871/417747) and the home page, this is what worked for me (Ubuntu bionic):
sudo apt-get install python3-lxml
(I also ran sudo apt-get install libxml2-dev libxslt1-dev before it, but I'm not sure that is still required.)
I am getting the error below when I try to install the dependencies:
./psycopg/psycopg.h:35:10: fatal error: libpq-fe.h: No such file or
directory
35 | #include <libpq-fe.h>
Depends: libpq5 (= 12.9-0ubuntu0.20.04.1) but 14.1-2.pgdg20.04+1 is
to be installed
This error comes from the fact that you do not have the libpq-dev package installed on your Ubuntu system.
You can solve this by either installing that package, or by using the psycopg2-binary package from pip instead of the psycopg2 package. The psycopg2-binary package contains a pre-compiled binary which means that you don't have to build the C extension when installing the dependencies of your app.
So, plan of action:
Either, you make sure to install the dependent packages on Ubuntu according to the psycopg2 documentation:
sudo apt install python3-dev libpq-dev
And then you should be able to run your requirements using pip install -r requirements.txt.
The other option is to change the psycopg2 line in your requirements.txt file so that it says psycopg2-binary instead, and then you shouldn't have to install the libpq-dev package.
You can read more about the differences between psycopg2 and psycopg2-binary in their slightly longer installation documentation.
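For illustration, the second option is just a one-line change in requirements.txt (the version pin below is only an example):
# before
psycopg2==2.8.4
# after
psycopg2-binary==2.8.4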
For a specific version of Python, try:
sudo apt install python-dev libpq-dev
For example, for Python 3.9:
sudo apt install python3.9-dev libpq-dev
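With the matching headers in place, the source build of psycopg2 should then go through; a quick way to confirm (assuming pip is available for that interpreter):
python3.9 -m pip install psycopg2
python3.9 -c "import psycopg2; print(psycopg2.__version__)"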
I wanted to install eventlet on my system in order to have "Herd" for software deployment, but the terminal is showing a gcc error:
root@agrover-OptiPlex-780:~# easy_install -U eventlet
Searching for eventlet
Reading http://pypi.python.org/simple/eventlet/
Reading http://wiki.secondlife.com/wiki/Eventlet
Reading http://eventlet.net
Best match: eventlet 0.9.16
Processing eventlet-0.9.16-py2.7.egg
eventlet 0.9.16 is already the active version in easy-install.pth
Using /usr/local/lib/python2.7/dist-packages/eventlet-0.9.16-py2.7.egg
Processing dependencies for eventlet
Searching for greenlet>=0.3
Reading http://pypi.python.org/simple/greenlet/
Reading https://github.com/python-greenlet/greenlet
Reading http://bitbucket.org/ambroff/greenlet
Best match: greenlet 0.3.4
Downloading http://pypi.python.org/packages/source/g/greenlet/greenlet-0.3.4.zip#md5=530a69acebbb0d66eb5abd83523d8272
Processing greenlet-0.3.4.zip
Writing /tmp/easy_install-_aeHYm/greenlet-0.3.4/setup.cfg
Running greenlet-0.3.4/setup.py -q bdist_egg --dist-dir /tmp/easy_install-_aeHYm/greenlet-0.3.4/egg-dist-tmp-t9_gbW
In file included from greenlet.c:5:0:
greenlet.h:8:20: fatal error: Python.h: No such file or directory
compilation terminated.
error: Setup script exited with error: command 'gcc' failed with exit status 1
Why can't Python.h be found?
Your install is failing because you don't have the python development headers installed. You can do this through apt on ubuntu/debian with:
sudo apt-get install python-dev
for python3 use:
sudo apt-get install python3-dev
For eventlet you might also need the libevent libraries installed, so if you get an error mentioning that, you can install libevent with:
sudo apt-get install libevent-dev
For Fedora:
sudo yum install python-devel
sudo yum install libevent-devel
and finally:
sudo easy_install gevent
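Either way, once the development headers are installed you can check the directory where your interpreter expects to find Python.h; this uses only the standard library, so it should work the same on Ubuntu and Fedora (Python 2.7+/3.2+):
python -c "import sysconfig; print(sysconfig.get_paths()['include'])"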
What worked for me on CentOS was:
sudo yum -y install gcc
sudo yum install python-devel
For Red Hat versions (CentOS 7), use the commands below to install the Python development package.
Python 2.7
sudo yum install python-devel
Python 3.4
sudo yum install python34-devel
Python 3.6
sudo yum install python36-devel
If the issue is still not resolved, try installing the packages below:
sudo yum install python-devel
sudo yum install openssl-devel
sudo yum install libffi-devel
On MacOS I had trouble installing fbprophet which requires pystan which requires gcc to compile. I would consistently get the same error: command 'gcc' failed with exit status 1
I think I fixed the problem for myself thus:
I used brew install gcc to install the newest version, which ended up being gcc-8
Then I made sure that when gcc ran it would use gcc-8 instead.
It either worked because I added alias gcc='gcc-8' to my .zshrc (same as .bashrc, but for zsh), or because I ran export PATH=/usr/local/bin:$PATH (see comment).
Also: all my attempts were inside a virtual environment, and I only succeeded by installing fbprophet globally (with pip); it still did not work inside a venv.
This is an old post, but I just ran into the same problem on AWS EC2 while installing regex. This worked perfectly for me:
sudo yum -y install gcc
and then:
sudo yum -y install gcc-c++
If it is still not working, you can try this
sudo apt-get install build-essential
in my case, it solved the problem.
Try this:
sudo apt-get install libblas-dev libatlas-base-dev
I had a similar issue on Ubuntu 14.04; for me, those Ubuntu packages did the trick.
On macOS I also had problems trying to install fbprophet, which had gcc as one of its dependencies.
After trying several steps as recommended by @Boris, the command below from the Facebook Prophet project page worked for me in the end.
conda install -c conda-forge fbprophet
It installed all the needed dependencies for fbprophet. Make sure you have anaconda installed.
This is going to save your life for all the further lib issues that are forthcoming.
For Alpine (>=3.6), use:
apk --update --upgrade add gcc musl-dev jpeg-dev zlib-dev libffi-dev cairo-dev pango-dev gdk-pixbuf-dev
For CentOS 7.2:
LSB Version: :core-4.1-amd64:core-4.1-noarch
Distributor ID: CentOS
Description: CentOS Linux release 7.2.1511 (Core)
Release: 7.2.1511
Codename: Core
Install eventlet:
sudo yum install python-devel
sudo easy_install -ZU eventlet
Terminal info:
[root@localhost ~]# easy_install -ZU eventlet
Searching for eventlet
Reading http://pypi.python.org/simple/eventlet/
Best match: eventlet 0.19.0
Downloading https://pypi.python.org/packages/5a/e8/ac80f330a80c18113df0f4f872fb741974ad2179f8c2a5e3e45f40214cef/eventlet-0.19.0.tar.gz#md5=fde857181347d5b7b921541367a99204
Processing eventlet-0.19.0.tar.gz
Running eventlet-0.19.0/setup.py -q bdist_egg --dist-dir /tmp/easy_install-Hh9GQY/eventlet-0.19.0/egg-dist-tmp-rBFoAx
Adding eventlet 0.19.0 to easy-install.pth file
Installed /usr/lib/python2.6/site-packages/eventlet-0.19.0-py2.6.egg
Processing dependencies for eventlet
Finished processing dependencies for eventlet
For openSUSE 42.1 Leap Linux use this
sudo zypper install python3-devel
I am using macOS Catalina 10.15.4. None of the posted solutions worked for me. What worked for me is:
>> xcode-select --install
xcode-select: error: command line tools are already installed, use "Software Update" to install updates
>> env LDFLAGS="-I/usr/local/opt/openssl/include -L/usr/local/opt/openssl/lib" pip install psycopg2==2.8.4
Collecting psycopg2==2.8.4
Using cached psycopg2-2.8.4.tar.gz (377 kB)
Installing collected packages: psycopg2
Attempting uninstall: psycopg2
Found existing installation: psycopg2 2.7.7
Uninstalling psycopg2-2.7.7:
Successfully uninstalled psycopg2-2.7.7
Running setup.py install for psycopg2 ... done
Successfully installed psycopg2-2.8.4
Use pip3 for Python 3.
If you are on a Mac, as I am, try this in your terminal: xcode-select --install
Then accept the installation request, and it works afterwards as described in this issue.
Build from source and install; this is fixed in the latest release (0.10.3+):
mkdir -p /tmp/install/netifaces/
cd /tmp/install/netifaces && wget -O "netifaces-0.10.4.tar.gz" "https://pypi.python.org/packages/source/n/netifaces/netifaces-0.10.4.tar.gz#md5=36da76e2cfadd24cc7510c2c0012eb1e"
tar xvzf netifaces-0.10.4.tar.gz
cd netifaces-0.10.4 && python setup.py install
Similarly I fixed it like this (notice python34):
sudo yum install python34-devel
sudo apt install gcc
It works for PyCharm on Ubuntu 20.10.
If you are migrating to a more modern version of python3, e.g. from python3.5 to python3.8, you may want to check/upgrade the versions of the libraries that are failing, assuming you have already installed the recommended packages (python3-dev and the others suggested above) to handle gcc builds.
It depends on the package. Some versions of the packages may not be supported on later versions of python3.
I need to install Python packages like pip, numpy, and cv2 on an Amazon EC2 instance running Ubuntu. I tried using sudo apt-get install python-pip but got the error below:
ubuntu#ip-172-31-35-131:~$ sudo apt-get install python-pip
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package python-pip
First try sudo apt-get update, then:
sudo apt-get install python-pip
Have you tried the instructions here?
You can install pip from PyPa directly:
curl -O https://bootstrap.pypa.io/get-pip.py
python get-pip.py --user
Your system may have a concurrent python3.x under the name python3; you can install pip for it with python3 get-pip.py --user as well. (Or, conversely, a python2.x under the name python2.)
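Afterwards, a quick check that pip is installed for the interpreter you expect:
python -m pip --version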
I am using Ubuntu and have installed Python 2.7.5 and 3.4.0. In Python 2.7.5 I am able to successfully assign a variable x = Value('i', 2), but not in 3.4.0. I am getting:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.4/multiprocessing/context.py", line 132, in Value
from .sharedctypes import Value
File "/usr/local/lib/python3.4/multiprocessing/sharedctypes.py", line 10, in <
module>
import ctypes
File "/usr/local/lib/python3.4/ctypes/__init__.py", line 7, in <module>
from _ctypes import Union, Structure, Array
ImportError: No module named '_ctypes'
I just updated from 3.3.2 by installing 3.4.0 from source. It installed in /usr/local/lib/python3.4.
Did I update to Python 3.4 correctly?
One thing I noticed is that Python 3.4 is installed in /usr/local/lib, while Python 3.3.2 is still installed in /usr/lib, so it was not overwritten.
Installing libffi-dev and re-installing python3.7 fixed the problem for me.
To cleanly build Python 3.7, libffi-dev is required, or else later steps will fail.
If using RHEL/Fedora:
yum install libffi-devel
or
sudo dnf install libffi-devel
If using Debian/Ubuntu:
sudo apt-get install libffi-dev
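Note that libffi-dev only provides the headers; the _ctypes extension is compiled when Python itself is built, so after installing it you still need to rebuild/reinstall your Python (as mentioned above). You can then verify with something like:
python3.7 -c "import ctypes; print(ctypes.__file__)"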
On a fresh Debian image, cloning https://github.com/python/cpython and running:
sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
sudo apt-get install build-essential python-dev python-setuptools python-pip python-smbus
sudo apt-get install libncursesw5-dev libgdbm-dev libc6-dev
sudo apt-get install zlib1g-dev libsqlite3-dev tk-dev
sudo apt-get install libssl-dev openssl
sudo apt-get install libffi-dev
Now run the configure script from the cloned source:
./configure
make # alternatively `make -j 4` will utilize 4 threads
sudo make altinstall
Got 3.7 installed and working for me.
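To confirm the optional extension modules were actually built against those -dev packages, a one-liner like this should run without an ImportError (the modules listed just correspond to the libraries installed above):
python3.7 -c "import ctypes, ssl, sqlite3, zlib; print('extension modules OK')"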
SLIGHT UPDATE
Looks like I said I would update this answer with some more explanation and two years later I don't have much to add.
This SO post explains why certain libraries like python-dev might be necessary.
This SO post explains why one might use altinstall as opposed to install as the make target.
Aside from that, I guess one option would be to read through the cpython codebase looking for #include directives that need to be met, but what I usually do is keep trying to install the package and keep reading through the output, installing the required packages until it succeeds.
Reminds me of the story of the Engineer, the Manager and the Programmer whose car rolls down a hill.
If you use pyenv and get the error "No module named '_ctypes'" (like I did) on Debian/Raspbian/Ubuntu, you need to run these commands:
sudo apt-get install libffi-dev
pyenv uninstall 3.7.6
pyenv install 3.7.6
Substitute your version of Python for 3.7.6.
Detailed steps to install Python 3.7 on CentOS or any Red Hat Linux machine:
Download Python from https://www.python.org/ftp/python/3.7.0/Python-3.7.0.tar.xz
Extract the contents into a new folder
Open a terminal in the same directory
Run the commands below step by step:
sudo yum -y install gcc gcc-c++
sudo yum -y install zlib zlib-devel
sudo yum -y install libffi-devel
./configure
make
make install
Thought I'd add the CentOS installs:
sudo yum -y install gcc gcc-c++
sudo yum -y install zlib zlib-devel
sudo yum -y install libffi-devel
Check python version:
python3 -V
Create virtualenv:
virtualenv -p python3 venv
On my Ubuntu 18.04 machine, I had the common problem of Python not finding _ctypes with the pyenv-installed Python.
In my case libffi-dev was already installed. Installing cpython from source, as suggested by @MikeiLL, didn't help either.
It turned out to be a Homebrew issue.
ajkerrigan's suggested solution on pyenv's GitHub issues solved this problem for me.
Solution summary: Tell pyenv to build Python using the Homebrew-managed GCC, with a command like:
CC="$(brew --prefix gcc)/bin/gcc-11" \
pyenv install --verbose 3.10.0
This assumes that any build dependencies have also been installed via Homebrew as specified in the pyenv wiki. As of this writing, that looks like this for Homebrew on Linux:
brew install bzip2 libffi libxml2 libxmlsec1 openssl readline sqlite xz zlib
This solved the same error for me on Debian:
sudo apt-get install libffi-dev
and compile again
Reference: issue31652
None of the solutions worked for me. You have to recompile your Python once all the required packages are completely installed.
Follow this:
Install required packages
Run ./configure --enable-optimizations
https://gist.github.com/jerblack/798718c1910ccdd4ede92481229043be
I ran into this error when I tried to install Python 3.7.3 on Ubuntu 18.04 with the command $ pyenv install 3.7.3.
Installation succeeded after running $ sudo apt-get update && sudo apt-get install libffi-dev (as suggested here).
That solved the issue.
Based on this answer, just copy-paste into the terminal.
First run:
sudo apt-get -y update
then:
sudo apt-get -y upgrade
sudo apt-get -y dist-upgrade
sudo apt-get -y install build-essential python-dev python-setuptools python-pip python-smbus
sudo apt-get -y install libncursesw5-dev libgdbm-dev libc6-dev
sudo apt-get -y install zlib1g-dev libsqlite3-dev tk-dev
sudo apt-get -y install libssl-dev openssl
sudo apt-get -y install libffi-dev
PS: You can just copy-paste the whole chunk into the terminal in one go.
In my case what was causing all sorts of Python installation issues including the one having to do with _ctypes and libffi was Homebrew on Linux / Linuxbrew. pyenv was happy again once brew was no longer in the $PATH.
Refer to this thread or this thread. For a customized installation of libffi, it is difficult for Python 3.7 to find the library location of libffi. An alternative method is to set the CONFIGURE_LDFLAGS variable in the Makefile, for example CONFIGURE_LDFLAGS="-L/path/to/libffi-3.2.1/lib64".
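Equivalently, instead of editing the Makefile you can pass the flag when configuring CPython, since CONFIGURE_LDFLAGS just records the LDFLAGS given at configure time (the libffi path is the same placeholder as above):
./configure LDFLAGS="-L/path/to/libffi-3.2.1/lib64"
make && make install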
My solution:
Installing libffi-dev with apt-get didn't help.
But this helped: Installing libffi from source and then installing Python 3.8 from source.
My configuration:
Ubuntu 16.04 LTS
Python 3.8.2
Step by step:
I got the error message "ModuleNotFoundError: No module named '_ctypes'" when starting the debugger from Visual Studio Code, and when running python3 -c "import sklearn; sklearn.show_versions()".
download libffi v3.3 from https://github.com/libffi/libffi/releases
install libtool: sudo apt-get install libtool
The file README.md from libffi mentions that autoconf and automake are also necessary. They were already installed on my system.
configure libffi without docs:
./configure --disable-docs
make check
sudo make install
download python 3.8 from https://www.python.org/downloads/
./configure
make
make test
make install
After that my python installation could find _ctypes.
CentOS without root
Install libffi-3.2 (Do NOT use libffi-3.3)
wget ftp://sourceware.org/pub/libffi/libffi-3.2.tar.gz
tar -xzf libffi-3.2.tar.gz
cd libffi-3.2/
./configure --prefix=$YOUR_LIBFFI_DIR
make && make install
Install Python3
./configure --prefix=$YOUR_PATH/python/3.7.10 LDFLAGS=-L${YOUR_LIBFFI_DIR}/lib64 PKG_CONFIG_PATH=${YOUR_LIBFFI_DIR}/lib/pkgconfig --enable-shared
make && make install
Thanks to JohnWSteill.
I was having the same problem. None of the above solutions worked for me. The key challenge was that I didn't have root access. So I first downloaded the source of libffi. Then I compiled it with the usual commands:
./configure --prefix=desired_installation_path_to_libffi
make
Then I recompiled python using
./configure --prefix=/home/user123/Softwares/Python/installation3/ LDFLAGS='-L/home/user123/Softwares/library/libffi/installation/lib64'
make
make install
In my case, /home/user123/Softwares/library/libffi/installation/lib64 is the path to the libffi installation directory where libffi.so is located, and /home/user123/Softwares/Python/installation3/ is the path to the Python installation directory. Modify them as appropriate for your setup.
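If the rebuilt interpreter later fails at runtime because it cannot find libffi.so (it lives outside the default linker search path in this setup), exporting the library path usually helps; the path below is just the example one from above:
export LD_LIBRARY_PATH=/home/user123/Softwares/library/libffi/installation/lib64:$LD_LIBRARY_PATH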
If you don't mind using Miniconda, the necessary external libraries and _ctypes are installed by default. It does take more space and may require using a moderately older version of Python (e.g. 3.7.6 instead of 3.8.2 as of this writing).
You have to install the missing Python 3 modules from the package manager.
If you have Ubuntu I recommend the Synaptic Package Manager:
sudo apt-get install synaptic
There you can simply search for the missing modules. Search for ctypes and install all the packages. Then go to your Python directory and run:
./configure
make install
This should solve your problem.
How to install Python from source without libffi in /usr/local?
Download libffi from github and install to /path/to/local
Download python source code and compile with the following configuration:
export PKG_CONFIG_PATH=/path/to/local/lib/pkgconfig
./configure --prefix=/path/to/python \
LDFLAGS='-L/path/to/local/lib -Wl,-R/path/to/local/lib' \
--enable-optimizations
make
make install
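A quick check that the resulting interpreter picked up _ctypes (using the placeholder prefix from above):
/path/to/python/bin/python3 -c "import _ctypes; print(_ctypes.__file__)"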
I am using a Mac M1 and I had this error:
... __boot__.py", line 30, in <module> import ctypes
and something was said about the file libffi.8.dylib
I downloaded this thing on Anaconda and now everything works:
https://anaconda.org/wakari/libffi
For the record, since much of the above is either not for Mac or is outdated: my Python is on Anaconda, version 3.10.4.
The application file created with py2app works now!
If your issue is with the VS Code debugger, check your currently selected Python interpreter. I had both python3.10.9 and python3.10.6 installed; however, the former was probably missing some dependencies, so I switched to the latter (my OS default interpreter), which solved the issue.
To change your Python interpreter in VS Code:
Hold Ctrl+Shift+P.
Search for Python: Select Interpreter and try your OS default version (the version you get when you run python3 --version).
If the issue is still not resolved, run sudo apt-get install libffi-dev.
If you are doing something nobody here will listen to you about because "you're doing it the wrong way", but you have to do it "the wrong way" for reasons too asinine to explain and also beyond your ability to control, you can try this:
Get libffi and install it into your user install area the usual way.
git clone https://github.com/libffi/libffi.git
cd libffi
./configure --prefix=path/to/your/install/root
make
make install
Then go back to your Python 3 source and find this part of the code in setup.py at the top level of the python source directory
ffi_inc = [sysconfig.get_config_var("LIBFFI_INCLUDEDIR")]
if not ffi_inc or ffi_inc[0] == '':
    ffi_inc = find_file('ffi.h', [], inc_dirs)
if ffi_inc is not None:
    ffi_h = ffi_inc[0] + '/ffi.h'
    if not os.path.exists(ffi_h):
        ffi_inc = None
        print('Header file {} does not exist'.format(ffi_h))
ffi_lib = None
if ffi_inc is not None:
    for lib_name in ('ffi', 'ffi_pic'):
        if (self.compiler.find_library_file(lib_dirs, lib_name)):
            ffi_lib = lib_name
            break

ffi_lib = "ffi"  # --- AND INSERT THIS LINE HERE THAT DOES NOT APPEAR ---

if ffi_inc and ffi_lib:
    ext.include_dirs.extend(ffi_inc)
    ext.libraries.append(ffi_lib)
    self.use_system_libffi = True
and add the line I have marked above with the comment. Why it is necessary, and why there is no way to get configure to respect '--without-system-ffi' on Linux platforms, I don't know; perhaps I will find out why that is "unsupported" in the next couple of hours, but everything has worked ever since. Otherwise, best of luck... YMMV.
WHAT IT DOES: just overrides the logic there and causes the compiler linking command to add "-lffi" which is all that it really needs. If you have the library user-installed, it is probably detecting the headers fine as long as your PKG_CONFIG_PATH includes path/to/your/install/root/lib/pkgconfig.
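One way to confirm that the patched build really linked _ctypes against your user-installed libffi is to inspect the built extension with ldd (the exact path under build/ varies by platform and Python version):
ldd build/lib.linux-*/_ctypes*.so | grep -i ffi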
I'm compiling psycopg2 and get the following error:
Python.h: No such file or directory
How do I compile it? This is on Ubuntu 12 x64.
Python 2:
sudo apt-get install python-dev
Python 3:
sudo apt-get install python3-dev
This is a dependency issue.
I resolved this issue on Ubuntu using apt-get. Substitute it with a package manager appropriate to your system.
For any current Python version:
sudo apt-get install python-dev
For alternative Python version:
sudo apt-get install python<version>-dev
For example 3.5 as alternative:
sudo apt-get install python3.5-dev
If you take a look at psycopg2's FAQ page (http://initd.org/psycopg/docs/faq.html), you'll see that they recommend installing Python's development package, which is usually called python-dev. You can install it via
sudo apt-get install python-dev
As mentioned in the psycopg documentation (http://initd.org/psycopg/docs/install.html):
Psycopg is a C wrapper around the libpq PostgreSQL client library. To install it from sources you will need:
C compiler
Python header files
They are usually installed in a package such as python-dev. An error message such as Python.h: No such file or directory indicates that you are missing the Python headers.
How can you fix it? First of all, check which Python version is installed in your virtual environment, or in the system itself if you don't use a virtual environment. You can check your Python version with:
python --version
After that, install the python-dev package for the same version that is installed in your virtual env or system. For example, if you use Python 3.7, install:
apt-get install python3.7-dev
Hope my answer helps someone.
Based on the Python version your Pipfile requires, you need to install the corresponding dev package.
I was getting this error and my default Python version was 3.8, but the Pipfile required Python 3.9, so I installed the python3.9 dev package:
$ sudo apt install python3.9-dev
While the answers here are correct, they may not keep working:
- sudo apt-get install python3-dev
- sudo apt-get install python3.5-dev
- etc.
won't apply when you are using python3.8, python3.9, or future versions.
I recommend using a version-agnostic command instead:
sudo apt install python3-all-dev
On Fedora, Red Hat, or CentOS:
Python 2:
sudo yum install python-devel
Python 3:
sudo yum install python3-devel
If none of the answers suggested above is working, try this; it worked for me:
sudo apt-get install libpq-dev