Problems with setting up a configuration file and running code - Python

I'm fairly new to linux and trying to get some code to run. After trying for a few days without success, I hope someone can help me. My Linux version is
Description: Ubuntu 16.04.1 LTS
Release: 16.04
Codename: xenial
The code is found here: https://bitbucket.org/dsign/gecmi/wiki/Home
My problems lie in modifying the configuration file site_config.py, of which the developers say:
Check this file and edit it in such a way that it matches your build environment, the targets you want to compile and where do you want to install them.
First, I tried to install the boost, tbb, scons and numpy/scipy libraries via apt-get:
sudo apt-get install libboost-dev
sudo apt-get install scons
sudo apt-get install python-scipy
sudo apt-get install python-numpy
sudo apt-get install libtbb-dev
I'm not sure if those are the correct versions, but that's what I could find after researching a bit.
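(One quick way to double-check which versions apt actually installed is to query the package database; a sketch, nothing here changes the system:)
apt-cache policy libboost-dev libtbb-dev scons python-numpy python-scipy
dpkg -l | grep -E 'boost|tbb|scons|numpy|scipy'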
The next step is to adapt the entries in the configuration file, but I find it very difficult (actually, I don't have much of a clue) to figure out WHERE all those directories are after installing the above-mentioned libraries, or how to adapt the configuration file accordingly:
# Use 'release' for speed, 'debug' for debugging.
COMPILE_MODE='release'
# Use one or both of the following words: 'standalone' for
# building the standalone executable, 'python' for building
# the python module
COMPILE_TARGETS='standalone python'
# Where the boost library is installed. I expect an 'include'
# and a 'lib' dir below.
BOOST_ROOT='/opt/boost_1_47_0/'
# The prefix of the python installation. This is used for deducing
# include directories and lib path of python
PYTHON_PREFIX="/usr/"
# The python's version. Used for deducing include directories and
# library name of python.
PYTHON_VERSION ="2.7"
# Where the Intel threading building blocks is installed. I expect
# an 'include' and a 'lib' directory below this one.
TBB_PREFIX="/home/alcides/programming/projects/sci_python/prx"
# The executable for the c++ compiler to use.
COMPILER_NAME="g++-4.6"
# The place where numpy is installed. I expect the directories
# 'core/include' and 'numarray/include' below.
NUMPY_PREFIX="/usr/local/lib/python2.7/dist-packages/numpy/"
# Where you would like the standalone program to be installed
GECMI_PROGRAM_INSTALL_AT='/usr/local/bin'
# Where you would like the shared library to be installed. This
# library is used by both the python module and the standalone
# program.
GECMI_LIB_INSTALL_AT='/usr/local/lib'
# Where would you like to have the module installed, for the
# target 'InstallPythonModule'. A dll file called gecmi.{dllext}
# is installed there.
PYTHON_MODULE_INSTALL_AT='/usr/lib/python2.7/dist-packages'
The authors' tip points to modifying the LD_LIBRARY_PATH variable, but it didn't help me much either:
You can get messages of the kind error while loading shared libraries if the dependencies are not correctly installed. In that case, you might want to fiddle with the commands locate and the environment variable LD_LIBRARY_PATH, or the equivalents in your operating system of choice.
I tried to use whereis and locate, but I'm not sure if that works. For example, when I use whereis boost or locate boost I find a lot of directories, but no directory like boost/include or boost/lib, which the config file seems to expect. For numpy there is no numarray/include folder, and so forth.
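(For reference, dpkg -L lists every file an installed package put on the system, which is usually the quickest way to see where headers and libraries actually ended up; the sketch below also applies the authors' LD_LIBRARY_PATH tip. The paths involved are typical Ubuntu locations, not guaranteed ones.)
# Show where the apt packages put their headers and libraries:
dpkg -L libboost-dev | grep -E '/include/|/lib/'
dpkg -L libtbb-dev | grep -E '/include/|/lib/'
# Ask Python where numpy and its headers live:
python -c "import numpy; print(numpy.__file__); print(numpy.get_include())"
# Per the authors' tip, make sure the installed shared library can be found at runtime:
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH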
So basically I was wondering how a Linux power user would find all the necessary paths and directories and figure out how to get this code to run (or whether some of you could run the code). I'd also be thankful for further information/tutorials/advice on how to get more familiar with solving such problems.
thanks!

In the terminal, run the following commands:
sudo apt-get update
sudo apt-get -y upgrade
python3 -V (output is Python 3.5.2, or possibly another version)
sudo apt-get install -y python3-pip
pip3 install package_name
sudo apt-get install build-essential libssl-dev libffi-dev python-dev
sudo apt-get install -y python3-venv
For more help, refer to this site:
https://www.digitalocean.com/community/tutorials/how-to-install-python-3-and-set-up-a-local-programming-environment-on-ubuntu-16-04
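As a rough sketch of how the python3-venv package from the list above is typically used (the environment path ~/myenv is just an example name):
python3 -m venv ~/myenv
source ~/myenv/bin/activate
pip install numpy scipy
Packages installed while the environment is active stay inside ~/myenv and don't need root.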

Related

How do I link an extracted python-dev config to a virtualenv python installation?

I am trying to install a python project on a server where I don't have root access. I've managed to install virtualenv and pip locally, but the pip install is now failing when trying to install uwsgi (specifically when building its wheel) with several "error: Python.h: No such file or directory" errors.
From what I've gathered I need to install python-dev, but as I don't have sudo access I've had to download and extract the python-dev package.
Where I'm stuck now is how to link python-dev to the python I've installed in my virtualenv.
Server details:
$ uname -a
4.9.0-12-amd64 #1 SMP Debian 4.9.210-1 (2020-01-20) x86_64 GNU/Linux
$ python --version
Python 3.5.3
First, could you ask your admin for help?
OK, let's summarize: Python.h: No such file or directory means the C compiler could not find the necessary header files. Those files are contained in the python-dev package, but you are not permitted to install it. You have downloaded the package and extracted its files somewhere.
Now you should set the xx_PATH environment variables that gcc uses to find additional library and header file locations (see the docs and the example here for details) and try the installation again.
Anyway this fixes only the first compilation error.
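For instance, assuming the python-dev package contents were extracted somewhere under your home directory (the path below is hypothetical, and the exact include directory name depends on the package), the gcc search-path variables could be set like this before retrying the install:
# Hypothetical location where the .deb was unpacked (e.g. with dpkg -x):
export PYDEV=$HOME/python-dev-extracted
# gcc reads CPATH and LIBRARY_PATH for extra header and library search paths:
export CPATH=$PYDEV/usr/include/python3.5m:$CPATH
export LIBRARY_PATH=$PYDEV/usr/lib/x86_64-linux-gnu:$LIBRARY_PATH
pip install uwsgi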

pip install custom include path

On an Ubuntu system on which I don't have sudo privileges, I wish to install a package via pip (matplotlib to be precise), but some source packages are not installed on the system (however, the binaries are installed).
I have created a virtual environment to install into, and have downloaded the required source code, but I can't place it in the default /usr/include/ etc. When pip runs matplotlib's setup.py script, the source files are reported as missing.
Is there a way to instruct pip or setup.py where to look for the source?
ps: setting CFLAGS or CPPFLAGS adds the locations of the downloaded source to compile instructions, but setup.py didn't find the source, so didn't attempt to compile some components (graphic backends).
pps: this is similar to, but more specific than this question
I would suggest doing:
Rebuild whatever binaries you need in your own home directory (this also avoids an issue if the apps get upgraded on the system or are otherwise different versions from your source). Assuming the programs use the standard configure scripts, you can do
mkdir ~/dev
cd app_src
./configure --prefix=$HOME/dev   # use $HOME here; ~ may not be expanded after --prefix=
make; make install
Then when you want to do your pip install, do
export PATH=~/dev/bin:$PATH
export LD_LIBRARY_PATH=~/dev/lib
(Note, what I should be suggesting is pointing it to your virtualenv but I haven't had the issue you're having)
Do the pip install; if memory serves, pkg-config should pick up the info you want (this assumes matplotlib uses pkg-config to figure out where packages are stored)
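If the libraries rebuilt above did install pkg-config files, something along these lines might also be needed before the pip install (the pkgconfig path is an assumption based on the --prefix used earlier):
export PKG_CONFIG_PATH=$HOME/dev/lib/pkgconfig:$PKG_CONFIG_PATH
pip install matplotlib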

Install tkinter and python locally

I work with Linux on a server and I don't have root privileges. I installed Python 3.2.3 locally to "/home/sam/install_sam". When I import the tkinter module, I get the following error:
ImportError: No module named _tkinter, please install the python-tk package
I know I need to install the Tkinter module. Because I don't have root privileges, I can't use commands like the following:
apt-get install python-tk
sudo apt-get install python-tk
I searched on Google and got tcl/tk from here. I installed them using the following commands:
cd ~/Downloads/tcl8.5.11/unix
./configure --prefix=/home/sam/install_sam/tcl
make
make install
cd ~/Downloads/tk8.5.11/unix
./configure --prefix=/home/sam/install_sam/tk \
    --with-tcl=/home/sam/Downloads/tcl8.5.11/unix
make
make install
cd ~/Downloads/Python3.2.3/
export LD_LIBRARY_PATH=/home/sam/install_sam/tcl/lib:/home/sam/install_sam/tk/lib
export LD_RUN_PATH=/home/sam/install_sam/tcl/lib:/home/sam/install_sam/tk/lib
./configure --prefix=/home/sam/install_sam/python
make
make install
I still got the error: INFO: Can't locate Tcl/Tk libs and/or headers. How should I configure Tcl/Tk for Python?
Use the CPPFLAGS environment variable to set the include directories for tcl and tk before building Python 3. This has worked for me:
export CPPFLAGS="-I/home/sam/install_sam/tcl/include -I/home/sam/install_sam/tk/include"
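Presumably the Python build then has to be re-run with that variable in the environment; a sketch using the paths from the question (the LDFLAGS line is my own assumption, so that the linker can also find the Tcl/Tk libraries):
export LDFLAGS="-L/home/sam/install_sam/tcl/lib -L/home/sam/install_sam/tk/lib"
cd ~/Downloads/Python3.2.3/
./configure --prefix=/home/sam/install_sam/python
make
make install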
Finally, I installed tcl/tk and Python under the same prefix. It works now. The commands are as follows:
cd ~/Downloads/tcl8.5.11/unix
./configure --prefix=/home/sam/install_sam/python3
make
make install
cd ~/Downloads/tk8.5.11/unix
./configure --prefix=/home/sam/install_sam/python3 \
    --with-tcl=/home/sam/Downloads/tcl8.5.11/unix
make
make install
export LD_LIBRARY_PATH=/home/sam/install_sam/python3/lib
cd ~/Downloads/Python3.2.3/
./configure --prefix=/home/sam/install_sam/python3
make
make install
If someone can tell me how to configure tcl/tk for Python in the first way (mentioned in the question), I'll appreciate it.
sudo apt-get install tcl-dev tk-dev
worked for me, although I ended up pulling a docker image and using that instead.
In my case I had import tkinter properly working on my Python3 environment, but I had to use a pre-compiled Python with its own environment (Blender fyi) that didn't include the dependencies (I needed tkinter to run matplotlib).
The fix in my case was very simple:
In the working python, import tkinter and check where it is installed with tkinter.__file__. This will be something like path/to/site-packages/tkinter
Copy the tkinter folder into the site-packages of your target installation
At this point import _tkinter still won't work. Again using the __file__ trick, locate the missing .so file; on my Ubuntu it was something like path/to/python3.7/lib-dynload/_tkinter.cpython-37m-x86_64-linux-gnu.so
Again, copy the .so file into the corresponding lib-dynload of your target installation. Make sure that both origin and target Python versions are compatible.
To make sure that your target python finds the copied files, make sure that the destination paths are listed under sys.path.
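A minimal sketch of the __file__ trick described above, assuming python3 is the working interpreter and /path/to/target/python is the embedded one (both paths are placeholders):
# In the working Python, print where tkinter and _tkinter live:
python3 -c "import tkinter, _tkinter; print(tkinter.__file__); print(_tkinter.__file__)"
# In the target Python, confirm that the directories you copied into are searched:
/path/to/target/python -c "import sys; print(sys.path)"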
Hope this helps!
Cheers,
Andres
For CentOS, this is:
yum install -y tcl-devel tk-devel
Worked on CentOS 7.
In general, I find that where RHEL has *-dev, CentOS has *-devel

How to compile pygtk from source for python2.6 on ubuntu 12.04

I have an application which is stuck at python2.6. I cannot port it to python2.7 due to specific and complicated extensions.
The problem is that 12.04 removes pygtk for python2.6, as python2.7 becomes the default Python version.
I then need to build pygtk for python2.6 from source. I have followed the README, but I am doing something wrong (the doc is quite succinct).
The build looks ok, as I can import gtk if I am in the decompressed archive folder (I do a python -c 'import gtk').
But the make install doesn't work properly.
AFAICT, I have export'ed PYTHON & PYTHONPATH variables to the proper path.
PYTHONPATH=/usr/lib/python2.6/dist-packages
PYTHON=/usr/bin/python2.6
Any idea on what's wrong with this config ?
I don't know if I'm getting farther than you are, but here's what I'm doing so far. Maybe we can figure this out together.
$ sudo su
# pip install pygtk
This generates a bunch of errors, including "To build PyGTK in a supported way, read the INSTALL file." After reading that and other things, I tried this:
# cd build/pygtk
# chmod 755 configure
# PYTHON=/usr/bin/python2.6 ./configure --prefix=/usr
This finds the right version of Python, but now can't find GLIB. Errors include, "This usually means GLIB is incorrectly installed." When I look in config.log I find this error, "fatal error: glib.h: No such file or directory". I found a help page that suggested you might get this error if you haven't installed a development version of GLIB.
# apt-get install libglib2.0-dev
# PYTHON=/usr/bin/python2.6 ./configure --prefix=/usr
Progress! I now see a new error, "No package 'pygobject-2.0' found". That error appears in a forum post with a suggestion to install python-gobject-dev.
# apt-get install python-gobject-dev
# PYTHON=/usr/bin/python2.6 ./configure --prefix=/usr
No errors, so I try running make and make install. The first one works, but the install fails with an error, "/bin/bash: line 16: ../py-compile: Permission denied". Permission denied is weird when running as root. After flailing for a while, I go back to the output of the configure script and see a message, "checking for PYCAIRO... no", followed by another, "not checking for gtk due to missing pycairo". A little guesswork leads me to install another module.
# apt-get install python-cairo-dev
# PYTHON=/usr/bin/python2.6 ./configure --prefix=/usr
That solves the pycairo complaint, but there are a bunch more, including GTK.
# apt-get install python-gtk2-dev
# PYTHON=/usr/bin/python2.6 ./configure --prefix=/usr
That solved most of the complaints, just LIBGLADE is missing.
# apt-get install libglade2-dev
# PYTHON=/usr/bin/python2.6 ./configure --prefix=/usr
OK, all the modules will be built, but it says, "Numpy support: no".
# make
# make install
This fails with the same error I saw earlier, "/bin/bash: line 16: ../py-compile: Permission denied".
I'm going to leave it here for now and come back to it later.
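(One untested guess, by analogy with the chmod on configure at the start of this walkthrough: the py-compile helper in the unpacked tree may also have lost its execute bit.)
# chmod 755 py-compile
# make install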
Try using easy_install for 2.6. Supposing your Ubuntu has both 2.6 and 2.7 installed, you will have easy_install (by default for 2.7) and easy_install-2.6 to install the dedicated packages for 2.6.

Ubuntu packages needed to compile Python 2.7

I've tried to compile Python 2.7 on Ubuntu 10.04, but got the following error message after running make:
Python build finished, but the necessary bits to build these modules were not found:
_bsddb bsddb185 sunaudiodev
To find the necessary bits, look in setup.py in detect_modules() for the module's name.
What packages do I need? (setup.py was not helpful)
Assuming that you have all the dependencies installed (on Ubuntu that would be a bunch of things like sudo apt-get install libdb4.8-dev and various other -dev packages), this is how I build Python:
tar zxvf Python-2.7.1.tgz
cd Python-2.7.1
# 64 bit self-contained build in /opt
export TARG=/opt/python272
export CC="gcc -m64"
export LDFLAGS='-Wl,-rpath,\$${ORIGIN}/../lib -Wl,-rpath-link,\$${ORIGIN}/../lib -Wl,--enable-new-dtags'
./configure --prefix=$TARG --with-dbmliborder=bdb:gdbm --enable-shared --enable-ipv6
make
make install
The only modules that don't build during make are:
_tkinter - I don't do GUI apps and would use wxWindows if I did
bsddb185 - horribly obsolete version of bdb
dl - deprecated in 2.6
imageop - deprecated in 2.6
sunaudiodev - obsolete interface to some SparcStation device I think
Next I collect any .so files that are not already in the Python install directories and copy them over:
# collect binary libraries ##REDO THIS IF YOU ADD ANY ADDITIONAL MODULES##
cd /opt/python272
find . -name '*.so' | sed 's/^/ldd -v /' >elffiles
echo "ldd -v bin/python" >>elffiles
chmod +x elffiles
./elffiles | sed 's/.*=> //;s/ .*//;/:$/d;s/^ *//' | sort -u | sed 's/.*/cp -L & lib/' >lddinfo
# mkdir lib
chmod +x lddinfo
./lddinfo
And then add setuptools for good measure
#set the path
export PATH=/opt/python272/bin:$PATH
#install setuptools
./setuptools-0.6c11-py2.7.egg
At this point I can make a tarball of /opt/python272 and run it on any 64-bit Linux distro, even a stripped down one that has none of the dependencies installed, or a older distro that has old obsolete versions of the dependencies.
I also get pip installed but at this point there is a gap in my notes due to some failed struggles with virtualenv. Basically virtualenv does not support this scenario. Presumably I did easy_install pip and then:
export LD_RUN_PATH=\$${ORIGIN}/../lib
pip install cython
pip install {a whole bunch of other libraries that I expect to use}
After I'm done installing modules, I go back and rerun the commands to collect .so files, and make a new tarball. There were a couple of packages where I had to muck around with LDFLAGS to get them to install correctly, and I haven't done enough thorough testing yet, but so far it works and I'm using this Python build to run production applications on machines that don't have all the support libraries preinstalled.
Those are older (mostly deprecated) modules that you probably won't use. You should be able to safely ignore the warnings.
The one that you may want to worry about trying to fix is _bsddb, whose warning should go away once you install Berkeley DB 4.8... I'm not sure if it's in the Ubuntu repos or not. (edit: apparently it's the db package)
bsddb185 is an older version of the Oracle Berkeley Database module. You can safely ignore it as far as I know.
sunaudiodev is deprecated and undocumented; I doubt you'd ever need to use it anyway. You should be able to safely ignore it.
Hope that helps a bit, anyway...
sudo apt-get build-dep python2.6 python-gdbm python-bsddb3 (Use python2.7 on maverick).
For more information, see this answer. Also look at this page, which applies equally for building on Lucid.
