I'm compiling psycopg2 and get the following error:
Python.h: No such file or directory
How can I compile it on Ubuntu 12.04 x64?
Python 2:
sudo apt-get install python-dev
Python 3:
sudo apt-get install python3-dev
This is a dependency issue.
I resolved this issue on Ubuntu using apt-get. Substitute it with a package manager appropriate to your system.
For the default Python version:
sudo apt-get install python-dev
For an alternative Python version:
sudo apt-get install python<version>-dev
For example, for Python 3.5:
sudo apt-get install python3.5-dev
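To confirm the headers landed where the interpreter expects them, here is a minimal check using only the standard library (run it with whichever interpreter you installed the -dev package for):

import os.path
import sysconfig

# Directory where this interpreter expects Python.h to live
include_dir = sysconfig.get_paths()["include"]
print(include_dir)
print(os.path.exists(os.path.join(include_dir, "Python.h")))   # True once the -dev package is installed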
If you take a look at psycopg's FAQ page (http://initd.org/psycopg/docs/faq.html), you'll see that it recommends installing Python's development package, which is usually called python-dev. You can install it via
sudo apt-get install python-dev
As mentioned in the psycopg documentation (http://initd.org/psycopg/docs/install.html):
Psycopg is a C wrapper around the libpq PostgreSQL client library. To install it from sources you will need:
C compiler
Python header files
They are usually installed in a package such as python-dev. An error message such as Python.h: No such file or directory indicates that you are missing those Python headers.
How can you fix it? First of all, you need to check which Python version is installed in your virtual environment, or in the system itself if you aren't using a virtual environment. You can check your Python version with:
python --version
After that, you should install the python-dev package matching the version installed in your virtual environment or system. For example, if you use Python 3.7, you should install:
apt-get install python3.7-dev
Hope this answer helps someone.
Based on the Python version your Pipfile requires, you need to install the corresponding dev package.
I was getting this error because my default Python version was 3.8 but the Pipfile required Python 3.9, so I installed python3.9-dev.
$ sudo apt install python3.9-dev
While all the answers here are correct, commands like:
- sudo apt-get install python3-dev
- sudo apt-get install python3.5-dev
- etc.
won't apply when you are using Python 3.8, Python 3.9, or a future version.
I recommend using a more deterministic approach instead:
sudo apt install python3-all-dev
On Fedora, Red Hat, or CentOS:
Python 2:
sudo yum install python-devel
Python 3:
sudo yum install python3-devel
If none of the above-suggested answers works, try this; it worked for me:
sudo apt-get install libpq-dev
I'm trying to install fontforge --with-python on Ubuntu 14.04 LTS.
I followed the steps listed here (https://github.com/fontforge/fontforge/blob/master/INSTALL-git.md), but it still doesn't work.
Does anyone know another way to make it work?
Thanks!
You don't need to compile FontForge, but the Python bindings must be specifically installed:
sudo add-apt-repository ppa:fontforge/fontforge
sudo apt-get update
sudo apt-get install fontforge python-fontforge
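If the bindings installed correctly, they should be importable from the system Python; a quick sanity check (assuming the python-fontforge package puts the module on the default path):

# Verify the FontForge Python bindings are importable
import fontforge
print(fontforge.__file__)   # where the bindings module was loaded from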
The best solution I found was:
INSTALL LINUXBREW:
https://github.com/Linuxbrew/linuxbrew
Then I did:
$ brew install fontforge --with-python
$ brew install eot-utils
$ gem install fontcustom
Then I did the rest of my process up to Gulp:
$ npm install
$ bundle install
$ gulp fontcustom
$ gulp
That's it! Thanks!
I am using Ubuntu and have installed Python 2.7.5 and 3.4.0. In Python 2.7.5 I am able to successfully assign a variable x = Value('i', 2), but not in 3.4.0. I am getting:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.4/multiprocessing/context.py", line 132, in Value
from .sharedctypes import Value
File "/usr/local/lib/python3.4/multiprocessing/sharedctypes.py", line 10, in <
module>
import ctypes
File "/usr/local/lib/python3.4/ctypes/__init__.py", line 7, in <module>
from _ctypes import Union, Structure, Array
ImportError: No module named '_ctypes'
I just updated from 3.3.2 by installing 3.4.0 from source. It installed in /usr/local/lib/python3.4.
Did I update to Python 3.4 correctly?
One thing I noticed is that Python 3.4 is installed in /usr/local/lib, while Python 3.3.2 is still installed in /usr/lib, so it was not overwritten.
Installing libffi-dev and re-installing python3.7 fixed the problem for me.
To cleanly build Python 3.7, libffi-dev is required, or else later steps will fail.
If using RHEL/Fedora:
yum install libffi-devel
or
sudo dnf install libffi-devel
If using Debian/Ubuntu:
sudo apt-get install libffi-dev
On a fresh Debian image, cloning https://github.com/python/cpython and running:
sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
sudo apt-get install build-essential python-dev python-setuptools python-pip python-smbus
sudo apt-get install libncursesw5-dev libgdbm-dev libc6-dev
sudo apt-get install zlib1g-dev libsqlite3-dev tk-dev
sudo apt-get install libssl-dev openssl
sudo apt-get install libffi-dev
Now run the configure script from the cloned repository:
./configure
make # alternatively `make -j 4` will utilize 4 threads
sudo make altinstall
Got 3.7 installed and working for me.
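To confirm the fresh build actually picked up libffi (and the other -dev packages above), a quick check run with the newly installed interpreter is enough; python3.7 is the name make altinstall creates for a 3.7 source tree:

# Run with the freshly installed interpreter, e.g.: python3.7 check_build.py
import ctypes          # raises ImportError: No module named '_ctypes' if libffi was missing at build time
import ssl, sqlite3    # also exercises the OpenSSL and SQLite support installed above
print("ctypes", ctypes.__version__, "- build looks good")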
SLIGHT UPDATE
Looks like I said I would update this answer with some more explanation and two years later I don't have much to add.
This SO post explains why certain libraries like python-dev might be necessary.
This SO post explains why one might use the altinstall argument as opposed to install in the make command.
Aside from that, I guess the choice is either to read through the cpython codebase looking for #include directives that need to be satisfied, or to do what I usually do: keep retrying the build and reading through the output, installing the required packages, until it succeeds.
Reminds me of the story of the Engineer, the Manager and the Programmer whose car rolls down a hill.
If you use pyenv and get the error "No module named '_ctypes'" (like I did) on Debian/Raspbian/Ubuntu, you need to run these commands:
sudo apt-get install libffi-dev
pyenv uninstall 3.7.6
pyenv install 3.7.6
Substitute your version of Python for 3.7.6.
Detailed steps to install Python 3.7 on CentOS or any Red Hat Linux machine:
Download Python from https://www.python.org/ftp/python/3.7.0/Python-3.7.0.tar.xz
Extract the contents into a new folder
Open a terminal in the same directory
Run the commands below step by step:
sudo yum -y install gcc gcc-c++
sudo yum -y install zlib zlib-devel
sudo yum -y install libffi-devel
./configure
make
make install
Thought I'd add the CentOS installs:
sudo yum -y install gcc gcc-c++
sudo yum -y install zlib zlib-devel
sudo yum -y install libffi-devel
Check python version:
python3 -V
Create virtualenv:
virtualenv -p python3 venv
On my Ubuntu 18.04 machine, I had the common problem of Python not finding _ctypes with the pyenv-installed Python.
In my case libffi-dev was already installed. Installing cpython from source, as suggested by @MikeiLL, didn't help either.
It turned out to be a Homebrew issue.
ajkerrigan's suggested solution on pyenv's GitHub issues solved this problem for me.
Solution summary: Tell pyenv to build Python using the Homebrew-managed GCC, with a command like:
CC="$(brew --prefix gcc)/bin/gcc-11" \
pyenv install --verbose 3.10.0
This assumes that any build dependencies have also been installed via Homebrew as specified in the pyenv wiki. As of this writing, that looks like this for Homebrew on Linux:
brew install bzip2 libffi libxml2 libxmlsec1 openssl readline sqlite xz zlib
This solved the same error for me on Debian:
sudo apt-get install libffi-dev
and compile again
Reference: issue31652
None of the solutions worked for me. You have to recompile Python once all the required packages have been installed.
Follow this:
Install required packages
Run ./configure --enable-optimizations
https://gist.github.com/jerblack/798718c1910ccdd4ede92481229043be
I ran into this error when I tried to install Python 3.7.3 on Ubuntu 18.04 with the following command: $ pyenv install 3.7.3.
Installation succeeded after running $ sudo apt-get update && sudo apt-get install libffi-dev (as suggested here).
That solved the issue for me.
Based on this answer, just copy-paste into the terminal.
First run:
sudo apt-get -y update
then:
sudo apt-get -y upgrade
sudo apt-get -y dist-upgrade
sudo apt-get -y install build-essential python-dev python-setuptools python-pip python-smbus
sudo apt-get -y install libncursesw5-dev libgdbm-dev libc6-dev
sudo apt-get -y install zlib1g-dev libsqlite3-dev tk-dev
sudo apt-get -y install libssl-dev openssl
sudo apt-get -y install libffi-dev
PS: You can just copy-paste the whole chunk into the terminal in one go.
In my case what was causing all sorts of Python installation issues including the one having to do with _ctypes and libffi was Homebrew on Linux / Linuxbrew. pyenv was happy again once brew was no longer in the $PATH.
Refer to this thread or this thread. For a customized installation of libffi, it is difficult for Python 3.7 to find the library location of libffi. An alternative method is to set the CONFIGURE_LDFLAGS variable in the Makefile, for example CONFIGURE_LDFLAGS="-L/path/to/libffi-3.2.1/lib64".
My solution:
Installing libffi-dev with apt-get didn't help.
But this helped: Installing libffi from source and then installing Python 3.8 from source.
My configuration:
Ubuntu 16.04 LTS
Python 3.8.2
Step by step:
I got the error message "ModuleNotFoundError: No module named '_ctypes'" when starting the debugger from Visual Studio Code, and when running python3 -c "import sklearn; sklearn.show_versions()".
Download libffi v3.3 from https://github.com/libffi/libffi/releases
Install libtool: sudo apt-get install libtool
The file README.md from libffi mentions that autoconf and automake are also necessary. They were already installed on my system.
Configure libffi without docs:
./configure --disable-docs
make check
sudo make install
Download Python 3.8 from https://www.python.org/downloads/
./configure
make
make test
make install
After that my python installation could find _ctypes.
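A short way to double-check which _ctypes the new build picked up (standard library only, run with the newly built python3.8):

import _ctypes
import ctypes.util
print(_ctypes.__file__)                   # the compiled extension module the interpreter found
print(ctypes.util.find_library("ffi"))    # which libffi the loader can see, if any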
CentOS without root
Install libffi-3.2 (Do NOT use libffi-3.3)
wget ftp://sourceware.org/pub/libffi/libffi-3.2.tar.gz
tar -xzf libffi-3.2.tar.gz
cd libffi-3.2/
./configure --prefix=$YOUR_LIBFFI_DIR
make && make install
Install Python3
./configure --prefix=$YOUR_PATH/python/3.7.10 LDFLAGS=-L${YOUR_LIBFFI_DIR}/lib64 PKG_CONFIG_PATH=${YOUR_LIBFFI_DIR}/lib/pkgconfig --enable-shared
make && make install
Thanks to JohnWSteill.
I was having the same problem. None of the above solutions worked for me. The key challenge was that I didn't have root access, so I first downloaded the source of libffi and compiled it with the usual commands:
./configure --prefix=desired_installation_path_to_libffi
make
Then I recompiled python using
./configure --prefix=/home/user123/Softwares/Python/installation3/ LDFLAGS='-L/home/user123/Softwares/library/libffi/installation/lib64'
make
make install
In my case, /home/user123/Softwares/library/libffi/installation/lib64 is the path to the libffi installation directory where libffi.so is located, and /home/user123/Softwares/Python/installation3/ is the path to the Python installation directory. Modify them as appropriate for your setup.
If you don't mind using Miniconda, the necessary external libraries and _ctypes are installed by default. It does take more space and may require using a moderately older version of Python (e.g. 3.7.6 instead of 3.8.2 as of this writing).
You have to install the missing Python 3 modules from the package manager.
If you have Ubuntu, I recommend the Synaptic Package Manager:
sudo apt-get install synaptic
There you can simply search for the missing modules. Search for ctypes and install all the packages. Then go to your Python directory and run:
./configure
make install
This should solve your problem.
How to install Python from source without libffi in /usr/local?
Download libffi from GitHub and install it to /path/to/local
Download python source code and compile with the following configuration:
export PKG_CONFIG_PATH=/path/to/local/lib/pkgconfig
./configure --prefix=/path/to/python \
LDFLAGS='-L/path/to/local/lib -Wl,-R/path/to/local/lib' \
--enable-optimizations
make
make install
I am using a Mac M1 and I had this error:
... __boot__.py", line 30, in <module> import ctypes
and something was said about the file libffi.8.dylib
I downloaded this thing on Anaconda and now everything works:
https://anaconda.org/wakari/libffi
For context, since much of the above is either not for Mac or outdated: my Python is on Anaconda, version 3.10.4.
The application file created with py2app works now!
If your issue is with the VS Code debugger, check your currently selected Python interpreter. I had both python3.10.9 and python3.10.6 installed; however, the former was probably missing some dependencies, so I switched to the latter (my OS default interpreter), which solved the issue.
To change your Python interpreter in VS Code:
Press Ctrl+Shift+P
Search for Python: Select Interpreter and try your OS default version (the version you get when you run python3 --version).
If the issue is still not resolved, run sudo apt-get install libffi-dev.
If you are doing something nobody here will listen to you about because "you're doing it the wrong way", but you have to do it "the wrong way" for reasons too asinine to explain and also beyond your ability to control, you can try this:
Get libffi and install it into your user install area the usual way.
git clone https://github.com/libffi/libffi.git
cd libffi
./configure --prefix=path/to/your/install/root
make
make install
Then go back to your Python 3 source and find this part of the code in setup.py at the top level of the Python source directory:
ffi_inc = [sysconfig.get_config_var("LIBFFI_INCLUDEDIR")]
if not ffi_inc or ffi_inc[0] == '':
    ffi_inc = find_file('ffi.h', [], inc_dirs)
if ffi_inc is not None:
    ffi_h = ffi_inc[0] + '/ffi.h'
    if not os.path.exists(ffi_h):
        ffi_inc = None
        print('Header file {} does not exist'.format(ffi_h))
ffi_lib = None
if ffi_inc is not None:
    for lib_name in ('ffi', 'ffi_pic'):
        if (self.compiler.find_library_file(lib_dirs, lib_name)):
            ffi_lib = lib_name
            break
ffi_lib = "ffi"  # --- AND INSERT THIS LINE HERE THAT DOES NOT APPEAR ---
if ffi_inc and ffi_lib:
    ext.include_dirs.extend(ffi_inc)
    ext.libraries.append(ffi_lib)
    self.use_system_libffi = True
and add the line I have marked above with the comment. Why it is necessary, and why there is no way to get configure to respect --without-system-ffi on Linux platforms (perhaps I will eventually find out why that is "unsupported"), I don't know, but everything has worked ever since. Otherwise, best of luck... YMMV.
WHAT IT DOES: it just overrides the logic there and causes the compiler's linking command to add "-lffi", which is all that it really needs. If you have the library user-installed, it is probably detecting the headers fine as long as your PKG_CONFIG_PATH includes path/to/your/install/root/lib/pkgconfig.
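After rebuilding with that patch, exercising a real foreign-function call is a quick way to prove -lffi was linked in; this sketch only assumes libm is present, which it is on any normal Linux box:

# Run with the patched, rebuilt python3
import ctypes, ctypes.util
libm = ctypes.CDLL(ctypes.util.find_library("m"))   # load the C math library through libffi-backed ctypes
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(2.0))   # 1.4142135623730951 if the FFI machinery works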
I am very new to Ubuntu and Python as well. I want to install Django, but I don't have easy_install, and I tried the command below to install pip:
sudo apt-get install python-pip
I got the error below:
Unable to locate package python-pip
I tried the command below as well:
sudo apt-get install python-setuptools
and I got the error below:
E: Package 'python-setuptools' has no installation candidate
I am very confused about installing Django. How can I install it successfully?
First, update the repositories:
sudo apt-get update
Then try:
sudo apt-get install python-pip python-dev build-essential python-setuptools
If that doesn't work, you can install the pip and setuptools packages manually. Download them from PyPI.
Install pip
To install or upgrade pip, securely download get-pip.py.
Then run the following (which may require administrator access):
sudo python get-pip.py
If setuptools (or distribute) is not already installed, get-pip.py will install setuptools for you.
To upgrade an existing setuptools (or distribute), run pip install -U setuptools
Upgrade pip
On Linux or OS X:
sudo pip install -U pip
Then you can install Django using pip:
sudo pip install django
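Once pip finishes, a quick import check confirms Django is visible to that interpreter (standard Django API, nothing project-specific assumed):

# Verify the Django installation
import django
print(django.get_version())   # prints the installed Django version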
Install
First you need to make sure you have Python installed; here I take 2.7.6 as an example. For how to install Python, you can check this link:
https://askubuntu.com/questions/443048/python-2-7-6-on-ubuntu-12-04-how-to
Then you can install Django and set up the database as follows:
sudo apt-get install python-django
sudo apt-get install mysql-server
sudo apt-get install python-mysqldb
You can find more configuration details at this link:
http://yuwenqing.org/?p=108
Also, for less pain in future development, you should deploy your Django application in a Python virtualenv. Here are some details on why you need virtualenv:
http://yuwenqing.org/?p=126
During the installation process of OpenERP 6, I want to generate a config file with these commands:
cd /home/openerp/openerp-server/bin/
./openerp-server.py -s --stop-after-init -c /home/openerp/openerp-server.cfg
But it always showed the message: ImportError: No module named psycopg2
When I checked for the psycopg2 package, it's already installed: "Package python-psycopg2-2.4.5-1.rhel5.x86_64 is already installed to its latest version. Nothing to do." What's wrong here? My server is CentOS, and I've installed Python 2.6.7.
Step 1: Install the dependencies
sudo apt-get build-dep python-psycopg2
Step 2: Run this command in your virtualenv
pip install psycopg2-binary
Ref: Fernando Munoz
Use psycopg2-binary instead of psycopg2.
pip install psycopg2-binary
Or you will get the warning below:
UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in order to keep installing from binary please use "pip install psycopg2-binary" instead. For details see: http://initd.org/psycopg/docs/install.html#binary-install-from-pypi.
Reference: Psycopg 2.7.4 released | Psycopg
I faced the same issue and resolved it with the following commands:
sudo apt-get install libpq-dev
pip install psycopg2
Try installing
psycopg2-binary
with
pip install psycopg2-binary --user
Please try running import psycopg2 in the Python console. If you get the error, check sys.path, which is where Python looks for installed modules, and see whether the parent directory of python-psycopg2-2.4.5-1.rhel5.x86_64 is in sys.path or not. If it's not, run export PYTHONPATH=<parent directory of python-psycopg2-2.4.5-1.rhel5.x86_64> before running the OpenERP server.
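For reference, a small snippet along those lines, using only standard attributes, shows both where Python searches and where psycopg2 was actually picked up from:

# Inspect the module search path and the file that gets imported
import sys
print("\n".join(sys.path))    # directories searched for modules

import psycopg2               # fails with ImportError if none of the above contains it
print(psycopg2.__file__)      # the exact file that was imported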
Import Error on Mac OS
If psycopg2 is getting installed but you are unable to import it in your .py file, then the problem is libpq, its linkage, and the openssl library, on which libpq depends. The overall steps are reproduced below. You can work through them step by step to find the source of the error and then troubleshoot from there.
Check the installation of openssl and make sure it's working.
Check for the installation of libpq on your system; it may not have been installed, or may not be linked. If it is not installed, install it using the command brew install libpq. This installs the libpq library. As per the documentation:
libpq is the C application programmer's interface to PostgreSQL. libpq is a set of library functions that allow client programs to pass queries to the PostgreSQL backend server and to receive the results of these queries.
Link libpq using brew link libpq; if this doesn't work, use the command brew link libpq --force.
Also add the following to your .zshrc file: export PATH="/usr/local/opt/libpq/bin:$PATH". This creates all the necessary linkages for the libpq library.
Now restart the terminal or run source ~/.zshrc.
Now use the command pip install psycopg2. It will work.
This works even when you are working in a conda environment.
N.B. pip install psycopg2-binary should be avoided because, as per the developers of the psycopg2 library:
The use of the -binary packages in production is discouraged because in the past they proved unreliable in multithread environments. This might have been fixed in more recent versions but I have never managed to reproduce the failure.
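Once the import works, a minimal connection sketch verifies the libpq linkage end to end; the host, database, user, and password below are placeholders to replace with your own server details:

import psycopg2

# Placeholder credentials -- substitute your own
conn = psycopg2.connect(host="localhost", dbname="mydb", user="myuser", password="secret")
with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])   # the PostgreSQL server version string
conn.close()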
Try with these:
virtualenv -p /usr/bin/python3 test_env
source test_env/bin/activate
pip install psycopg2
Run python and try to import psycopg2. If you insist on installing it into your system's Python, try:
pip3 install psycopg2
I recently faced this issue on my production server. I had installed psycopg2 using
sudo pip install psycopg2
It worked beautifully on my local machine, but gave me the runaround on my EC2 server.
sudo python -m pip install psycopg2
The above command worked for me there. Posting it here just in case it helps someone in the future.
sudo pip install psycopg2-binary
You need to install the psycopg2 module.
On CentOS:
Make sure Python 2.7+ is installed. If not, follow these instructions: http://toomuchdata.com/2014/02/16/how-to-install-python-on-centos/
# Python 2.7.6:
$ wget http://python.org/ftp/python/2.7.6/Python-2.7.6.tar.xz
$ tar xf Python-2.7.6.tar.xz
$ cd Python-2.7.6
$ ./configure --prefix=/usr/local --enable-unicode=ucs4 --enable-shared LDFLAGS="-Wl,-rpath /usr/local/lib"
$ make && make altinstall
$ yum install postgresql-libs
# First get the setup script for Setuptools:
$ wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py
# Then install it for Python 2.7 and/or Python 3.3:
$ python2.7 ez_setup.py
$ easy_install-2.7 psycopg2
Even though this is a CentOS question, here are the instructions for Ubuntu:
$ sudo apt-get install python3-pip python-distribute python-dev
$ easy_install psycopg2
Cite: http://initd.org/psycopg/install/
For Python 3 on Ubuntu, this worked for me:
$ sudo apt-get update
$ sudo apt-get install libpq-dev
$ sudo pip3 install psycopg2-binary
I had the same problem, and this snippet alone solved it for me:
pip install psycopg2
I ran into the same issue when I switched to Ubuntu from Windows 10. The following worked for me, after googling and trying numerous suggestions for 2 hours:
sudo apt-get install libpq-dev
then
pip3 install psycopg2
I hope this helps someone who has encountered the same problem, especially when switching from Windows to Linux (Ubuntu).
I did two things to solve this issue:
Use Python 3.6 instead of 3.8.
Change the Django version to 2.2 (it may work with something higher, but I changed to 2.2).
For Python3
Step 1: Install Dependencies
sudo apt-get install python3 python-dev python3-dev
Step 2: Install
pip install psycopg2
Check whether the virtual env of your project is switched ON; if it's OFF, switch it ON. Execute the following commands:
workon <your_env_name>
python manage.py runserver
It works for me.
It's very simple; I'm not sure why nobody mentioned this for Mac before.
brew install postgresql
pip3 install psycopg2
In simple terms, psycopg2 wants us to install postgres first.
Solved the issue with the solution below:
Basically the issue is due to the _bz2.cpython-36m-x86_64-linux-gnu.so Linux package file. Try to find its location.
Check the installed Python location (which python3). Example: /usr/local/bin/python3
Copy the file under INSTALL_LOCATION/lib/python3.6:
cp -rvp /usr/lib64/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so /usr/local/lib/python3.6
try:
pip install psycopg2 --force-reinstall --no-cache-dir
Python 2: ImportError: No module named psycopg2, even though
pip install psycopg2-binary
reports "Requirement already satisfied..."
Solved with the following steps:
sudo curl https://bootstrap.pypa.io/pip/2.7/get-pip.py -o get-pip.py
sudo python get-pip.py
sudo python -m pip install psycopg2-binary
pip install psycopg-binary
The line above helped me
For Python3 use this:
sudo apt-get install -y python3-psycopg2
I am on a school computer, so I can't install anything.
I am trying to create C code which can be run in Python. It seems all the articles I am finding on it require you to use
#include <Python.h>
I do this, but when I compile, it complains that there is no such file or directory.
The computer has Python (at least it has the python command in the terminal, and we can run whatever Python code we want).
I typed in locate Python.h in the terminal, but it found nothing.
I have two questions:
Can I write C code that I can call in Python without Python.h?
Am I missing something, and the computer actually has Python.h?
You need the python-dev package, which contains Python.h.
On Ubuntu, you would need to install a package called python-dev. Since this package doesn't seem to be installed (locate Python.h didn't find anything) and you can't install it system-wide yourself, we need a different solution.
You can install Python in your home directory -- you don't need any special permissions to do this. If you are allowed to use a web browser and run a gcc, this should work for you. To this end
Download the source tarball.
Unzip with
tar xjf Python-2.7.2.tar.bz2
Build and install with
cd Python-2.7.2
./configure --prefix=/home/username/python --enable-unicode=ucs4
make
make install
Now you have a complete Python installation in your home directory. Pass -I /home/username/python/include/python2.7 to gcc when compiling to make it aware of Python.h. Pass -L /home/username/python/lib and -lpython2.7 when linking.
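If you are unsure of the exact paths under that prefix, the home-installed interpreter can report them itself; a small sketch, assuming the /home/username/python prefix used above:

# Run with the home-installed interpreter: /home/username/python/bin/python2.7
import sysconfig
print(sysconfig.get_paths()["include"])                  # directory containing Python.h (for -I)
print(sysconfig.get_config_var("LIBDIR"))                # directory to pass to gcc with -L
print("python" + sysconfig.get_config_var("VERSION"))    # library name to link with -l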
You have to use #include "python2.7/Python.h" instead of #include "Python.h".
For Ubuntu 15.10 and Python 3, for those coming to this question because they don't have Python.h but do have administrative rights, the following might solve it:
sudo apt-get install python-dev
sudo apt-get install python3-dev
sudo apt-get install libpython3-dev
sudo apt-get install libpython3.4-dev
sudo apt-get install libpython3.5-dev
Generally, on Ubuntu, you can install the python-dev package to resolve this.
Type the following in the terminal to install it:
sudo apt-get install python-dev -y
I found the answer on ubuntuforums: you can just add $(python-config --includes) to your gcc command:
gcc $(python-config --includes) urfile.c
The header files are now provided by libpython2.7-dev.
You can use the search form at packages.ubuntu.com to find out what package provides Python.h.
You need python-dev installed.
For Ubuntu :
sudo apt-get install python-dev # for python2.x installs
sudo apt-get install python3-dev # for python3.x installs
For more distros, refer -
https://stackoverflow.com/a/21530768/6841045
I ran into the same issue while trying to build a very old copy of omniORB on a CentOS 7 machine. Resolved the issue by installing the python development libraries:
# yum install python-devel
This installed the Python.h into:
/usr/include/python2.7/Python.h
It happens because Python.h is not located in the default include folder (/usr/include/).
Installing Python-dev might help:
$ sudo apt-get install python-dev
But often the problem will persist, because the development headers live in a separate folder inside the include folder itself (/usr/include/python2.7 or python3.x).
So you should either specify the header folder using the -I option in gcc, or create soft links from everything inside those folders to just outside them (I'd prefer the former option).
Using the -I option in gcc:
$ gcc -o hello -I /usr/include/python2.7 helloworld.c
Creating soft-links :
$ sudo ln -sv /usr/include/python2.7/* /usr/include/
locate Python.h
If the output is empty, then find your Python version:
python --version
Let's say it is X.x, i.e. 2.7 or 3.6, 3.7, 3.8.
Then install the header files and static libraries for that same Python version:
sudo apt-get install pythonX.x-dev
Go to Synaptic package manager. Reload -> Search for python -> select the python package you want -> Submit -> Install
Works for me ;)
Exactly, the package you need to install is python-dev.
That means you have not installed the development libraries for Python.
If you are on Linux, you can solve this issue with the commands below for your distribution:
Ubuntu (Debian):
sudo apt-get install python-dev (Py2) or sudo apt-get install python3-dev (Py3)
Red Hat (CentOS):
yum install python-devel
None of the answers worked for me. If you are running on Ubuntu, you can try:
With Python 3:
sudo apt-get install python3 python-dev python3-dev \
build-essential libssl-dev libffi-dev \
libxml2-dev libxslt1-dev zlib1g-dev \
python-pip
With Python 2:
sudo apt-get install python-dev \
build-essential libssl-dev libffi-dev \
libxml2-dev libxslt1-dev zlib1g-dev \
python-pip
I think the correct way is python3-config --includes, and if you look at it (cat $(which python3-config)), you'll see that it uses the sysconfig module under the hood. Thus, I think the best solution is to use:
>>> import sysconfig
>>> sysconfig.get_path('include')
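That value can then be dropped straight into a compile command; a brief sketch (helloworld.c is just a placeholder for your own source file):

# Print a ready-to-use gcc invocation based on this interpreter's own paths
import sysconfig
inc = sysconfig.get_path("include")
print("gcc -I {} -c helloworld.c".format(inc))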