Building Python and more on missing modules - python

I have another thread asking for help with a "missing zlib" error. With the kind help there, the problem has been (almost) resolved.
Now I am interested in building Python myself (on Ubuntu 10.10).
A few important questions have caught my attention:
After building Python (say 2.7.1), do I need to rebuild Python if I have missing modules?
Is there a way to find out what modules will be missing prior to building Python? Say sqlite3. I have sqlite3 installed for the system default (Python 2.6.6), and I can import it in the Python 2.6.6 shell. Now I use pythonbrew to build 2.7.1, and in its shell I cannot import sqlite3 because _sqlite3 is not available. I am sure a few more important ones are missing which I will need for future development (such as for Django...).
I am willing to learn how to build without using pythonbrew.
Please share your experience of building another version of Python: how would you address the problem of missing modules? Is there a practical approach to building Python?
I have never bothered to build Python myself, so please bear with me. I am beginning to realize the importance of learning to build it myself! Thank you very much!
EDIT
First, thank you all for your input. It meant a lot. I did the build.
Python build finished, but the necessary bits to build these modules were not found:
_bsddb _curses _curses_panel
_tkinter bsddb185 bz2
dbm gdbm readline
sunaudiodev _sqlite3
To find the necessary bits, look in setup.py in detect_modules() for the module's name.
I tried to take care of sqlite3 and readline by running
sudo apt-get install libreadline6 libreadline6-dev
sudo apt-get install libsqlite3-dev
I tried to import them, but I still get "No module named xxxx".
Over at Ask Ubuntu I actually asked how to recall previous commands, because that feature didn't work in the Python 2.7.1 shell. I believe it's due to the missing readline.
Readline
I installed the Python-2.7.1 under this directory: /home/jwxie518/python27/
I looked into setup.py, I found the following lines:
# The sqlite interface
sqlite_setup_debug = False  # verbose debug prints from this script?
# We hunt for #define SQLITE_VERSION "n.n.n"
# We need to find >= sqlite version 3.0.8
sqlite_incdir = sqlite_libdir = None
sqlite_inc_paths = ['/usr/include',
                    '/usr/include/sqlite',
                    '/usr/include/sqlite3',
                    '/usr/local/include',
                    '/usr/local/include/sqlite',
                    '/usr/local/include/sqlite3',
                    ]
None of the paths listed above exist.
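To double-check which of those candidate directories actually contain the sqlite3.h header, a small throwaway script like this can help (just a sketch; the list mirrors the one in setup.py):
import os
candidates = ['/usr/include',
              '/usr/include/sqlite',
              '/usr/include/sqlite3',
              '/usr/local/include',
              '/usr/local/include/sqlite',
              '/usr/local/include/sqlite3']
for d in candidates:
    header = os.path.join(d, 'sqlite3.h')
    print(d + ': ' + ('sqlite3.h found' if os.path.isfile(header) else 'missing'))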
So I guess I have to install sqlite3 manually? I found another reference (it's in Chinese, however):
# Download the latest and extract
# Go into the extracted directory
./configure --prefix=/home/jwxie518/python27/python
make && make install
# Then edit Python 2.7's setup.py before rebuilding it
# Sample (add these two lines to the end....)
'~/share/software/python/sqlite-3.6.20/include',
'~/share/software/python/sqlite-3.6.20/include/sqlite3',
# Then rebuild python like how we did before
I went into the directory where I installed sqlite3 and found only include/sqlite3.h. So I went back and checked /usr/include/; I can only find sqlite3.h there too.
So what is going on here? Readline is also non-importable.
3RD EDIT
I started everything over, except I didn't reinstall sqlite3.
# Extract Python-2.7.1
# cd into Python-2.7.1
# ./configure
make >make.out 2>&1
less make.out
make.out is here: http://pastebin.com/raw.php?i=7k3BfxZQ
I still couldn't import sqlite3. So I went into setup.py and made changes:
# We hunt for #define SQLITE_VERSION "n.n.n"
# We need to find >= sqlite version 3.0.8
sqlite_incdir = sqlite_libdir = None
sqlite_inc_paths = ['/usr/include',
                    '/usr/include/sqlite',
                    '/usr/include/sqlite3',
                    '/usr/local/include',
                    '/usr/local/include/sqlite',
                    '/usr/local/include/sqlite3',
                    '/home/jwxie518/python-mod/include/sqlite',
                    '/home/jwxie518/python-mod/include/sqlite3',
                    ]
Then I ran everything again (this time I also did make clean).
Output is here: http://pastebin.com/raw.php?i=8ZKgAcWn
According to the output, I don't think the custom paths were included... (for the complete output, please go to the link above and search for sqlite):
build/temp.linux-i686-2.7/home/jwxie518/Python-2.7.1/Modules/_sqlite/util.o
-L/usr/lib -L/usr/local/lib -Wl,-R/usr/lib -lsqlite3 -o build/lib.linux-i686-2.7/_sqlite3.so
I still cannot import sqlite3.
Thanks!
Thank you very much, Michael Dillon, for helping me out. Your tutorial was neat and clear.
I solved the problem as soon as I realized that whenever I ran Python 2.7.1, I was actually using the one installed by Pythonbrew.
The moral of the story is: read all the errors. I neglected the errors generated by importing sqlite3. The Python installed by Pythonbrew didn't have sqlite3 built, because the development package for sqlite3 was only installed after Pythonbrew had already built Python 2.7.1.
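In hindsight, a two-line check in whichever shell I thought was the new 2.7.1 would have exposed the mix-up immediately (a rough sketch):
import sys
print(sys.executable)         # shows which python binary is actually running
import sqlite3                # raises ImportError if _sqlite3 was not built for THIS interpreter
print(sqlite3.sqlite_version)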
Thanks.

Here is how to build Python and fix any dependencies. I am assuming that you want this Python to be entirely separate from the Ubuntu release Python, so I am specifying the --prefix option to install it all in /home/python27 using the standard Python layout, i.e. site-packages instead of dist-packages.
1. Get the .tar.gz file into your own home directory.
2. tar zxvf Py*.tar.gz
3. cd Py*1
4. ./configure --prefix=/home/python27
5. make
6. make install
Step 5 is the important one. At the end, it will display a list of any modules that could not be built properly. Often you can fix this by installing an Ubuntu package, and rerunning make.
a. sudo apt-get install something-dev
b. make
It is pretty common to have a problem because you are missing the -dev addon to some module or other. But sometimes you should start over like this:
a. make clean
b. ./configure --prefix=/home/python27
c. make
Starting over never hurts if you are unsure. An important note about step 6: I am not using sudo on this command, which means you will need to have the /home/python27 directory already created with the appropriate ownership.
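Once make install finishes, a quick way to sanity-check the result is to ask the new interpreter to import the optional modules directly. This is only a sketch (adjust the module list to whatever you care about) and assumes the interpreter ended up in /home/python27/bin/python:
# try_imports.py -- run as: /home/python27/bin/python try_imports.py
modules = ['zlib', 'bz2', 'readline', 'sqlite3', 'curses', 'gdbm', 'dbm', 'Tkinter']
for name in modules:
    try:
        __import__(name)
        print(name + ': OK')
    except ImportError as e:
        print(name + ': MISSING (' + str(e) + ')')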
Don't hesitate to try out ./configure --help |less before building something because there may be interesting options that you could use. One time on a minimal distro I had to do --with-dbmliborder=gdbm:bdb in order to get gdbm working. When you run ./configure, the last few lines will tell you where it put the information that it learned. In the case of Python, Modules/Setup has been useful to figure out how to get a module to build.
Another useful thing is to make clean and then run make >make.out 2>&1 to capture all the output from the full make process. Then, after it is complete, use less or an editor to look for the details on a problem module such as _sqlite. For instance, check all the -I options that are passed to gcc. If the correct include directory is not on the list that would cause a problem. You can edit setup.py to change the list of include directories.
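A few lines of Python (a rough sketch, assuming the make output was saved to make.out as above) can pull out exactly those -I directories for the _sqlite sources:
# show_sqlite_includes.py -- sketch: list the -I options gcc was given for _sqlite
for line in open('make.out'):
    if '_sqlite' in line and 'gcc' in line:
        for token in line.split():
            if token.startswith('-I'):
                print(token)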
In the past it was more common to have library problems that would be fixed by logging out, logging in again, and running "sudo ldconfig" before doing a complete rebuild.

Related

Nest simulator: python says “no module named nest”

After installing the Nest Neural Simulator, I keep getting the following error when trying to run any of the example python files that came in the installation. I've tried re-installing Nest, Python, and using Anaconda, but no go.
Python error:
ImportError: No module named nest
Suggestions?
At https://nest-simulator.org/documentation you will now find many different install instructions, and how to solve the "ImportError: No module named nest" depends on the way you installed NEST.
System Python
The problem of the nest Python module not being found is usually that NEST is installed for a specific Python version and you cannot load it from another. So while many operating systems still default to Python 2.7, you may need to explicitly run
$ python3
>>> import nest
Additionally, if you have multiple Python 3.x versions installed, modules may still be installed for a different version and you have to explicitly start python with python3.6 or python3.8, etc.
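If you are unsure which interpreter you are actually in, a quick check like this (just a sketch) shows which binary is running and where, if anywhere, it finds nest:
import sys
print(sys.executable)        # the interpreter you are actually running
try:
    import nest
    print(nest.__file__)     # where the nest module was found
except ImportError:
    print('nest is not importable from this interpreter')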
Conda package
As @nosratullah-mohammadi already mentioned, if you have a Conda flavour installed, using the pre-built package is a very quick solution. The link in his post is unfortunately broken; this one should work; then go to "Installation" in the sidebar.
$ conda create --name nest -c conda-forge python3 nest-simulator
$ conda activate nest
$ python # this should load the Python from the conda env
>>> import nest # this loads nest which is installed explicitly for that Python
From Source
For every install from source, be sure to have Python and other prerequisites installed before building NEST. Then you can create your temporary build directory (can be deleted afterwards) and configure with the flags you need.
cd somewhere
mkdir nest-build
cd nest-build
cmake -DCMAKE_INSTALL_PREFIX:PATH=/install/path -Dwith-python=3 .../sources/of/nest-simulator
Replace somewhere, /install/path and .../sources/of/nest-simulator with the correct paths for your setup. (A popular choice when compiling from source in conjunction with Conda environments, for example, is to use -DCMAKE_INSTALL_PREFIX=$CONDA_PREFIX, which installs NEST directly into the active environment. Conda is, however, in no way necessary for NEST.)
Add more -D... flags as you prefer. You can see the possible flags with cmake -LA .../sources/of/nest-simulator, as pointed out here. You are probably interested in many of the with-xyz options at the end. Check the aforementioned documentation for details.
Check that the paths and libraries reported in the Configuration Summary make sense (you may need to scroll up a bit to see it). It could, for example, look something like this:
--------------------------------------------------------------------------------
NEST Configuration Summary
--------------------------------------------------------------------------------
[...]
Python bindings : Yes (Python 3.6.8: /home/yourname/miniconda3/envs/nest/bin/python3)
Includes : /home/yourname/miniconda3/envs/nest/include/python3.6m
Libraries : /home/yourname/miniconda3/envs/nest/lib/libpython3.6m.so
Cython bindings : Yes (Cython 0.27.3: /home/yourname/miniconda3/envs/nest/bin/cython)
[...]
--------------------------------------------------------------------------------
[...]
PyNEST will be installed to:
/home/yourname/miniconda3/envs/nest/lib/python3.6/site-packages
--------------------------------------------------------------------------------
In this example CMake configured everything for Python3.6 from my conda environment.
If you are satisfied with your settings and all the found Python versions match, run the usual
$ make # optionally with -j$(nproc)
$ make install
$ make installcheck
If that works out fine, you are done and can delete the build directory to free up the space. Congratulations!
Also if things get too mixed up and it doesn't seem to do what you expect, it is sometimes useful to delete the build directory and start off clean.
There is a new installation method in addition to the others, which is installing NEST with the Conda package. It is still in beta, but it works and it's really simple.
You can find the installation instructions here!
Simply, after installing the Miniconda package, open your terminal and type this:
conda create --name ENVNAME -c conda-forge nest-simulator python
Then type:
conda activate ENVNAME
And you're good to go!
NEST now provides a solution to that problem and similar ones: a script that automatically sets the relevant environment variables:
If your operating system does not find the nest executable or Python
does not find the nest module, your path variables may not be set
correctly. This may also be the case if Python cannot load the nest
module due to missing or incompatible libraries. In this case, please
run
source </path/to/nest_install_dir>/bin/nest_vars.sh
to set the necessary environment variables. You may want to include
this line in your .bashrc file, so that the environment variables are
set automatically.
https://nest-simulator.readthedocs.io/en/latest/installation/linux_install.html
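If sourcing nest_vars.sh still does not seem to take effect, a quick look from inside Python (just a sketch) shows whether the PyNEST site-packages directory actually made it onto the search path:
import os, sys
print(os.environ.get('PYTHONPATH', '(PYTHONPATH is not set)'))
for p in sys.path:
    print(p)    # the PyNEST install's site-packages directory should appear here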
It turns out I needed to move the directory where I installed NEST (/Users/name/opt/nest) into a nest folder inside the following directory in Anaconda. Specifically, I moved the contents of this folder (from the NEST installation):
/Users/name/opt/nest/lib/python2.7/site-packages/nest
Into this one:
/anaconda/lib/python2.7/site-packages/nest
Disclaimer: I could very well run into problems for not having copied all the contents of the Nest installation, but this little hack is helping me run example files now.

Python installation, failing to find bz2 module

OK, I have an old Debian VM. Package managers are useless. No, I'm not going to update the OS.
I have the bzip2 lib and development headers installed correctly on my system (those actually came from a package).
I start with absolutely NO Python on the system. I removed everything manually. I downloaded the Python 2.7.5 source, and configured with ./configure --prefix=/usr. It configures fine. I run make, and it compiles fine. I try ./python -c "import bz2; print bz2.__doc__", it works, and says:
The python bz2 module provides a comprehensive interface for
the bz2 compression library. It implements a complete file
interface, one shot (de)compression functions, and types for
sequential (de)compression.
I then run make test and the whole test suite progresses fine, and notably the "test_bz2" test passes.
I then run make install, which installs my new Python binary into /usr/bin/ like I wanted.
I try /usr/bin/python -c "import bz2; print bz2.__doc__", and it fails with:
Traceback (most recent call last):
File "", line 1, in
ImportError: No module named bz2
I've tried a bunch of different things, including building Python as --enable-shared and not, no luck. I've tried at least 10 times (each time totally cleaning out everything, running make distclean, etc.). No luck.
I tried: PYTHONPATH="/usr/lib/python2.7"; export PYTHONPATH. Still no luck.
HOWEVER, if I delete the symlink that make install creates for /usr/bin/python, and instead do: ln -s /path/to/my/python/compile/python python, NOW it magically works.
So, what the heck? Why is this Python binary I'm getting created only able to find stuff when the binary exists in the compile directory, and not when it's put into normal production install location? What am I missing?
I am root during the entire process, from configure to make to make install to trying to test the Python import call.
I have started from scratch again (this time compiling with --enable-shared btw), and verified that not only in the compile directory is there build/lib.linux-x86_64-2.7/bz2.so, but once I run make install, that file is put into /usr/lib/python2.7/lib-dynload/bz2.so.
I've tried to do some reading on lib-dynload, but haven't been able to determine if there's something else a Python program (like default configuration for the CLI or whatever) would need to be able to tell it to pull module imports from lib-dynload, or if there's some other place or option to tell the make install where it should be putting it instead of dynload.
Still I have no explanation why the /path/to/compilation/python binary can find and load bz2.so fine, but the /usr/bin/python binary can't find (or load) /usr/lib/python2.7/lib-dynload/bz2.so.
I thought maybe it was something to do with the fact that the installation doesn't create like a /usr/lib/python symlink to point at /usr/lib/python2.7 directory. But I created the symlink and still no go.
I am still lost here.
It would appear that a sort of non-answer answer was arrived at accidentally via a long string of Twitter conversation(s).
I've filed another Stack Overflow question here to ask WHY what we found was the solution to this problem: https://stackoverflow.com/questions/17662091/python-installation-prefix-not-being-persisted-in-config
For posterity's sake, right now the solution is that I have to set the PYTHONHOME environment variable to /usr, and everything starts working. The puzzling part is that the documentation says PYTHONHOME should default to {prefix}, which I clearly set to /usr during configure. So why should I have to set it manually?
Running python-config --prefix reveals that the {prefix} default is in fact /usr/bin, NOT /usr as I specified, which means I bizarrely have to override the default back to what should have been the default.
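A quick way to see what the installed binary itself believes (a sketch; run it with /usr/bin/python) is to print the prefixes and the lib-dynload entries on the search path:
import sys
print(sys.prefix)        # should be /usr if the configured --prefix took effect
print(sys.exec_prefix)   # C extensions such as bz2.so are looked up relative to this
print('\n'.join(p for p in sys.path if 'lib-dynload' in p))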

py2app built app displays `ERROR: pygame.macosx import FAILED` on other machines

Trying to build an app on the Mac using py2app. Got everything working fine on my machine, but when I move the app to another machine, it crashes and the console displays this error.
ERROR: pygame.macosx import FAILED
Anybody have a solution to this?
Found the problem and solution after many hours. Turns out other people have experienced similar problems and their articles were quite helpful:
http://b.atcg.us/blog/2010/04/13/py2app-hell-the-first.html
http://www.vijayp.ca/blog/?p=62
In case someone else runs into the issue, this particular problem was caused by the Python framework not being bundled into the application. You can confirm this by right-clicking your app to view the package contents, then navigating to Contents/Frameworks/. If Python.framework is not there, it should be.
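A tiny script can do the same check without clicking around (just a sketch; "MyGame.app" is a placeholder for your bundle name and location):
import os
app = 'dist/MyGame.app'   # hypothetical bundle name -- adjust to yours
framework = os.path.join(app, 'Contents', 'Frameworks', 'Python.framework')
print(framework + (' exists' if os.path.isdir(framework) else ' is MISSING'))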
Be sure to download Python -
My first issue was reliance on Apple's built-in Python package. Don't use this. You need to install your own version of Python. Go to http://www.python.org/download/releases/, find a version (I stuck with 2.6), download the gzip (not the Mac package), and install with the following if you are running Snow Leopard:
./configure --enable-framework MACOSX_DEPLOYMENT_TARGET=10.6 --with-universal-archs=intel --enable-universalsdk=/
make
sudo make install
Adjust Paths, Install Packages - From here you need to adjust your paths to ensure you are using your custom-installed version. I then reinstalled the following packages - this turned out to be a dependency nightmare, so I'm including the version numbers as well:
py2app 0.5.2
macholib 1.3
modulegraph .8.0
If these packages actually worked, you should be able to build and run your app now. Unfortunately, they don't. I'll go into the errors and my hacked solutions in a bit, but there are some settings in the build file that need to be made first.
First, the setup.py file should look a little somethin' like this:
setup.py
from setuptools import setup

APP = ['Game.py']
DATA_FILES = ['data']
OPTIONS = {
    "argv_emulation": False,
    "compressed": True,
    "optimize": 2,
    "iconfile": 'data/game.icns',
}

setup(
    app=APP,
    data_files=DATA_FILES,
    options={'py2app': OPTIONS},
)
Then, to be extra safe, I use a shell script to call this.
build.sh
## Remove previous builds. Start with clean slate.
rm -rf build dist
## Force python into 32 bit mode.
export VERSIONER_PYTHON_PREFER_32_BIT=yes
## Force build with custom installed python
/Library/Frameworks/Python.framework/Versions/2.6/bin/python setup.py py2app
Running build.sh should compile the app. If it does not compile, I have good news -- it's not your fault. Due to glitches in the libraries, you may run into some (or all) of the following:
Potential Problems
If the build script fails, scan the traceback for some of the following keywords:
pygame not found - basic path problem in py2app. Add...
sys.path.insert(0, os.path.join(os.getcwd(), 'lib', 'python2.6','lib-dynload')) ## Added to fix dynlib bug
after the import statements in boot_app.py in the py2app lib.
pythonNone - This appears to be a bug in the macho package where it cannot determine the version number of your python build. To solve this, I added the following lines to build_app.py in py2app.
## Add these two lines...
if not info["version"]:
    info["version"] = "2.6"
## Before this line. (line 941 in method copy_python_framework() at time of writing)
pydir = 'python%s' % (info['version'])
No such file or directory...Python.framework/[lib|include] - py2app is simply looking for directories that exist deeper in the file system tree. Go to the Python.framework directory and symlink up the place...
cd /Library/Frameworks/Python.framework
sudo ln -s Versions/Current/include/ include
sudo ln -s Versions/Current/lib lib
That should do it! - These steps created a compiled app that worked on other intel machines.
Thank you for posting what you found!
I had a similar problem. I tried various combinations of what you suggested, and isolated the single issue for me to be the bug in boot_app.py which you identify above.
Once I added the one-line fix to boot_app.py which you identify above, everything worked, even using the pre-installed Apple build of python (version 2.6.1).
I should note that when I say "everything worked," I really mean building a py2app app for actual distribution, i.e. using the normal command:
python setup.py py2app
The "alias" mode. i.e.
python setup.py py2app -A
which the py2app documentation suggests for use during development, still does not work for me (with the same module-not-found error). But better to have the actual distribution build working than nothing at all! Again, thanks.

How do you correct Module already loaded UserWarnings in Python?

Getting the following kinds of warnings when running most python scripts in the command line:
/Library/Python/2.6/site-packages/virtualenvwrapper/hook_loader.py:16: UserWarning: Module
pkg_resources was already imported from /System/Library/Frameworks/Python.framework/Versions/2.6/Extras/lib/python/pkg_resources.pyc, but /Library/Python/2.6/site-packages is being added to sys.path
import pkg_resources
/Library/Python/2.6/site-packages/virtualenvwrapper/hook_loader.py:16: UserWarning: Module site was already imported from /System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site.pyc, but /Library/Python/2.6/site-packages is being added to sys.path
import pkg_resources
I think it has to do with a combination of using distribute and virtualenv, but wanted to check if anyone else has run in to this or would know how to go about fixing it.
Perhaps use the virtualenv option --no-site-packages so you won't see any system site-packages within your virtual environment. Having items installed both in your virtualenv and on the system root may be the cause of this issue.
Using --no-site-packages when creating your virtualenv prevents conflicts with system packages. I almost always use that option when creating a new virtualenv. Though I may end up with several copies of libraries, at least they don't mess with each other.
The Python equivalent of putting a bit of electrical tape over the check-engine light would be to use the -W command line flag or to add a warnings filter.
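For completeness, that looks something like the following (a sketch; the filter has to be installed before the import that triggers the warning, and from the command line you can use python -W "ignore::UserWarning" instead):
import warnings
# silence just these UserWarnings instead of fixing the underlying packaging issue
warnings.filterwarnings('ignore', message='Module .* was already imported')
import pkg_resources   # matching warnings are now suppressed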
In my case, reinstalling anything did not help. There were some orphaned .pyc files (specifically pkg_resources.pyc) left in /System/Library/Frameworks/Python.framework/Versions/2.6/Extras/lib/python
sudo find . -type f -name "*.pyc" -delete
made it work. This link helped me to track down the problem.
I had this sort of Python packaging hell visit me today too.
Running Python 2.7.3 on Ubuntu, using namespace packages and using zc.buildout.
Finally, updating the system-wide Distribute from the older version 0.6.30 to the latest version 0.6.35 resolved the problem.
If the warning shows up in a program you are modifying, try it this way (example with pytz):
try:
    import pytz
except ImportError:
    from pkg_resources import require
    require('pytz')

Unable to install python-setuptools: ./configure: No such file or directory

The question is related to the answer to "Unable to install Python without sudo access".
I need to install python-setuptools to install python modules.
I have extracted the installation package.
I get the following error when configuring
[~/wepapps/pythonModules/setuptools-0.6c9]# ./configure --prefix=/home/masi/.local
-bash: ./configure: No such file or directory
I did not find the solution at the program's homepage.
How can I resolve this error?
As Noah states, setuptools isn't an automake package so doesn't use ‘./configure’. Instead it's a pure-Python-style ‘setup.py’ (distutils) script.
You shouldn't normally need to play with .pydistutils.cfg, as long as you run it with the right version of Python. So if you haven't added the .local/bin folder to PATH, you'd have to say explicitly:
/home/masi/.local/bin/python setup.py install
AIUI this should Just Work.
I did not find the solution at the program's homepage.
Yeah, they want you to install it from a shell script egg which uses the default version of Python. Which you don't want.
(Another approach if you can't get setuptools to work is to skip it and install each module and dependency manually. Personally I have a bit of an aversion to setuptools/egg, as it contains far too much “clever” magic for my tastes and makes a mess of my filesystem. But I'm an old curmudgeon like that. Most Python modules can be obtained as simple Python files or plain old distutils scripts, but unfortunately there are some that demand eggs.)
You might want to check http://peak.telecommunity.com/DevCenter/EasyInstall#custom-installation-locations.
EasyInstall is a python module with some shell scripts (or some shell scripts with a python module?) and does not use the unix make tool that gets configured with the "./configure" command. It looks like your best bet is to try editing ~/.pydistutils.cfg to include:
[install]
install_lib = /home/masi/.local/lib/python/site-packages/
install_scripts = /home/masi/.local/bin
You'll also presumably have made the ~/.local/bin/ folder part of your PATH so you can run the easy_install script. (I'm not sure exactly where the site-packages directory will be under .local, but it shouldn't be hard to find.)
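Once it is installed, you can confirm that the custom directory is actually being searched (a sketch; the path matches the install_lib setting above):
import sys
target = '/home/masi/.local/lib/python/site-packages'
found = any(p.rstrip('/') == target for p in sys.path)
print(target + (' is on sys.path' if found else ' is NOT on sys.path -- set PYTHONPATH accordingly'))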
Hope this helps.
