Completely lost: Python configure script fails with errors

I downloaded the tarball of Python 2.7.2 to install on a SUSE Linux server, which ships with 2.6 and 3.1.
I untarred it (I know, wrong lingo, sorry) to a directory.
When I try to run ./configure, which should create a valid Makefile, I can't get past step one: the script reports that it can't find a compiler on the path.
But when I open a shell in the same directory and type "make", make runs.
I am really unfamiliar with Linux, but this just seems so basic that I can't even begin to see what's wrong.
I also downloaded what appears to be an RPM file for Python 2.7.2 for SUSE Linux, but I can't for the life of me figure out how to "import" this package into YaST2 or "Install Software". These two tools seem impenetrable and hostile to packages saved in the file system rather than fetched from specific distribution web sites.
Really, this should be trivial, but it is not.
SUSE uses GNOME, and GNOME seems to have its own view of what the directory structure should be for desktop, end-user-y kinds of files. That is where I put my downloaded tar file. Might I do better if I put it somewhere under /usr?
Sorry to be so much more clueless than most Stack Overflow participants, but I am just not a Linux guy.

Sounds like you simply don't have a compiler installed. Do:
sudo zypper install gcc
If ./configure fails, there's no point in running make.
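Once the compiler is in place, the whole sequence should look something like this (a sketch, assuming the source was untarred into a Python-2.7.2 directory; --prefix is optional and defaults to /usr/local):
cd Python-2.7.2
./configure
make
sudo make install
Note that a freshly untarred Python source tree has no Makefile until ./configure succeeds, so if "make" appears to run before configure has finished cleanly, it isn't doing a real build.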

SUSE has a package manager called YaST. It would do your installation with no fuss.
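For the RPM you downloaded, you don't have to fight the GUI tools at all: zypper can install straight from a local file (the path below is hypothetical), resolving dependencies from the configured repositories as it goes:
sudo zypper install /path/to/python-2.7.2.rpm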


any known way to create a device-to-device portable self-contained python runtime?

I want to distribute a Python program on, say, Windows and/or Mac, but I don't want to give the user the headache of ensuring there is an appropriate Python runtime installed on their machine. And I don't want to interfere with their machine's configuration by, say, requesting root privileges and installing a system-wide Python runtime tailored to my program, because that's too invasive and might cause compatibility collisions with other installed versions of the runtime.
I would much rather have a self-contained executable that could be stored on, for example, a USB flash drive, inserted into the system, and then, perhaps via a stepping-stone binary that just invokes the device-portable runtime on a Python script I provide, run the program as if it were a self-contained binary executable (with only standard-library dependencies).
A link to this binary could be published in main-menu program lists, docks, or desktops, and it could be invoked by shell scripts or other executed-by-proxy mechanisms. Such a no-install, self-contained Python program could potentially be a first-class, user-invokable application. This is what I want to achieve.
I googled around for projects that provide a device-portable/mobile Python installation, and so far I've only found portablepython.com. Unfortunately, it says the project is discontinued and no download link is provided. It listed some similar projects, but they all seemed defunct or had a very different focus.
Does anyone know of an active project that is, or includes, such an independent/portable/mobile/no-install distribution of Python?
Or is there some way I could configure Python's build system to produce a no-install-friendly product?
Any ideas welcome. Thanks for your input!
After more searching, I found that Python.org publishes its own standalone Python distribution for Windows, called the embeddable zip file.
This is exactly what I was searching for: a basic standalone Python runtime that requires relatively few megabytes of storage.
I started with this embeddable distro and then cajoled a standalone copy of pip into working with it. Problem solved.
Improving upon @oreus2020's answer: you can download the embeddable zip file from here. Then:

1. Unzip the compressed file to a folder of your choice.
2. Go to the root of your install, find the python._pth file, and open it in a text editor. Remove the "#" before import site. (This file manages the environment of the portable install; if you want anything to be recognized by the portable Python interpreter, just add its path here and that's it.)
3. If you want pip, go to this page, save get-pip.py in the root of your portable install, and run it with the portable interpreter, i.e. ./python get-pip.py from a command line opened at the root of your install. Pip installed!
4. To use pip, run ./python -m pip <commands> from the command line opened at the root of your install, then open the python._pth file and insert ./Lib/site-packages and ./Scripts below the ".".

Voila, you've got yourself a portable Python install!
My python._pth file looks like:
python39.zip
.
# Uncomment to run site.main() automatically
./Repo
./Repo/Code
./Repo/Code/cogs
./Lib/site-packages
./Scripts
import site
If you are still wondering, here is the link to the one I made for myself.
P.S. Pardon my bad English

OSX: Just ran sudo make install ... where did the compiled Phoenix2 app go?

I'm a Windows guy who just compiled my first Python application; can anyone tell me where the compiled output would end up?
I just ran sudo make install to install PyOpenCL, which is a dependency.
Now I'm trying to install and run Phoenix2, so I ran the following:
sudo python ./setup.py install
and now I'm not sure where to look for and execute the file as described here. Any assistance would be appreciated (I'm a bit of a n00b here, overwhelmed by all the documentation).
When you run make install, the make program looks in the project's Makefile to find out where it should put executables, as well as any other files the application needs to run. This, of course, assumes the project even has executables (a library might not, for example).
Look in the project's base directory (the one you ran make install from) for a file named Makefile. It should have a variable called BIN_DIR or similar that tells you where it wants the final binaries to go.
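A quick way to check both halves of this (Makefile variable names vary from project to project, so the grep pattern below is only a guess; --record, however, is a standard distutils option):
grep -E '^(prefix|bindir|BIN_DIR)' Makefile
sudo python ./setup.py install --record installed_files.txt
cat installed_files.txt
The second command re-runs your setup.py install and writes out the path of every file it installed, which answers "where did it go?" directly.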

Creating a Portable Python (local install) for Linux

I'm looking to create the following:
A portable version of Python that can be run on any system (with any previous version of Python, or no Python at all, installed) and that comes pre-configured with various Python packages (e.g. Django, lxml, pysqlite, etc.)
The closest I've found to the above is virtualenv, but it only goes so far.
If I package up a nice virtualenv for Python on one machine, it contains symlinks to a lot of the libraries it needs. I can convert those symlinks into the actual files, but if I try to move the entire directory to another machine, I get seg fault after seg fault.
To launch python on a different machine, I'm using:
LD_LIBRARY_PATH=lib/ ./bin/python
and in lib/ I have all of the shared libraries I copied from the original machine. The problem is that these shared libraries may themselves rely on other shared libraries that I'm not including, so this fails on other Linux distros, probably because it falls back on older shared libraries installed on the system that are incompatible with what I copied over.
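For what it's worth, ldd shows which shared libraries the bundled interpreter actually resolves, and prints "not found" for the ones that are missing, so it's a quick way to spot dependencies I failed to copy:
LD_LIBRARY_PATH=lib/ ldd ./bin/python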
Anyone have an idea on how to get this working? Is this even possible?
EDIT:
To clarify, the desired outcome is a tar.gz of a Python binary and associated packages (Django, lxml, pysqlite, etc.) that can be extracted and run on any Linux-based system (e.g. Ubuntu 8.04, Red Hat 5, SUSE 11; all 32-bit distros), where the locally installed version of Python doesn't affect what's in the tar.gz.
I just tested this and it works great.
First, get the copy of Python you want to install, untar it, and cd into the untarred folder.
Also get a copy of setuptools and untar that.
/opt/portapy used below is of course just the name I came up with for this post; it could be any path. The full path should be tarred up, and the same path should be used on any system you put this on, due to absolute-path linking.
mkdir /opt/portapy
cd <python source dir>
./configure --prefix=/opt/portapy && make && make install
cd <setuptools source dir>
/opt/portapy/bin/python ./setup.py install
Install virtualenv into the portable Python if it isn't there already (the setuptools installed above gives you /opt/portapy/bin/easy_install for that), then make the virtual env folder inside the portapy folder.
mkdir /opt/portapy/virtenv
/opt/portapy/bin/virtualenv /opt/portapy/virtenv
cd /opt/portapy/virtenv
source bin/activate
Done. You are ready to install all of your libraries here and have the option of creating multiple virtual envs this way.
You can then tar up the whole /opt/portapy folder and transport it to any Linux system of the same arch, within reason I suspect.
I compiled 2.7.5 on CentOS 5.8 64-bit and moved the folder to a CentOS 6.9 system, and it runs perfectly.
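Packaging and moving the tree is then just (a sketch, using the paths from above):
tar -czf portapy.tar.gz -C / opt/portapy
and on the target machine:
sudo tar -xzf portapy.tar.gz -C /
The -C / preserves the absolute /opt/portapy path, which matters because of the absolute-path linking mentioned above.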
I don't know how this would even be possible. If it were, they wouldn't need to distribute binary packages of Python for different platforms. You can't simply distribute a Python that will run on any platform; it has to be built from source for that arch. Virtualenv will expect you to tell it which system Python to use (using links).
This pretty much goes for almost any binary package that links against system libs. Again, if it were possible, we wouldn't need any platform-specific binary distributions.
You can, however, achieve part of what you want: running Python on another machine that doesn't have Python installed, as long as it's the same arch. This is the same concept behind freezing, or py2exe/py2app/PyInstaller: an interpreter is bundled into a standalone environment, so the app can run on any similar platform.
Edit
I just realized that while your question speaks about "system" agnostically, your title says "Linux". There are different flavors of Linux, so for this to work you would have to build it fat for multiple archs and also completely contain the standalone links. You might try building a package with PyInstaller and including that in your project.
You can try just building python from source, in your virtualenv:
$ ./configure --prefix=/path/to/virtualenv && make && make install
If you still have problems with the links to libs, you can also investigate building it statically.
I'm not sure that working solely in Python is the way to go here. You might have better luck with Puppet or Chef, configuration tools that can be used to create a local environment. There is plenty of code out there to install virtualenv and Python on just about any Linux, plus OS X (probably not Windows, though).
Your workflow would be to install Chef or Puppet (your choice), run a script to install the Python you want, then enter a virtualenv and pip install any packages you might need.
Sorry this isn't as easy as virtualenv alone, but it is much more robust.
Well, since I rarely accept "can't be done", there is a way to do it. Warning: it isn't pretty, and you should probably look into a different scenario.
What you will need to do is determine a standard location for this top-level directory. Then, using that directory as your root, compile Python on each Linux distribution you want to run this on. For this you would configure and compile Python to live in something like "/usr/local/myappname/platform/", substituting "platform" with the name of the platform, e.g. "/usr/local/myappname/rhel/". If memory serves, the configure option you are looking for here is --prefix.
Once you have each distribution compiled, you will need a script to determine which one to use, and either set environment variables or have it create symlinks to the appropriate "installation" of Python, as sketched below. I would then use virtualenv and bootstrap in that tree to keep the "in-use" Python libraries even more specific.
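Such a selection script might look roughly like this (a sketch only: the directory layout is the hypothetical one above, and /etc/os-release is how modern distros identify themselves; older systems need a different probe):
#!/bin/sh
# pick the prebuilt Python matching this distro and expose it as "current"
. /etc/os-release
case "$ID" in
  rhel|centos) PLATFORM=rhel ;;
  sles|opensuse*) PLATFORM=suse ;;
  *) PLATFORM=generic ;;
esac
ln -sfn "/usr/local/myappname/$PLATFORM" /usr/local/myappname/current
export PATH=/usr/local/myappname/current/bin:$PATH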
I can't think of a common Linux distribution that doesn't have Python by default, so you could use setup.py and/or basic Python scripts to drive this, since you should be able to rely on Python being present, even if it's ye olde version, as in RHEL installs. Personally I find the above method overly complicated, but it would meet your stated requirements, with the allowance for a final script. Of course, you could also use shar (SHell ARchive) to bundle all of this into a runnable shell script that does the installation, avoiding the need for secondary scripts; gzip the resulting shell archive, decompress it on the target systems, and execute it to set everything up.
All that said, I would not recommend this. I would recommend determining the minimum Python version you can run on, ensuring it's installed by the distribution whenever possible, and if need be pulling it down from a repo and installing it. Then use virtualenv and bootstrap with a requirements.txt to install the necessary Python libraries and apps into the virtualenv. For that, see this documentation.
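That recommended flow is short in practice (a sketch; the env path is arbitrary and the pinned version is just an example):
virtualenv /opt/myapp/env
/opt/myapp/env/bin/pip install -r requirements.txt
with requirements.txt pinning exact versions (e.g. Django==1.4) so every machine ends up with identical libraries.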
I faced the same problem, so I created PortableVirtualenv. Your question is pretty much its definition.
I use it as the base for a commercial multiplatform app I develop. (But PortableVirtualenv is public domain; use it freely.)
If needed, you can pip-install any package and zip the whole directory to distribute also packages you need.
One nice option is to make a "snap" portable Linux application. Snaps have a Python mode which lets you specify exactly what modules you need. From https://snapcraft.io/first-snap#python :
Snaps let you distribute a dependency-isolated Python app in an app store experience for end users.
Another option is to containerize your application with something like docker. Then instead of executing your script directly, the user is actually running a small OS with just your application and its dependencies. https://www.infoq.com/articles/docker-executable-images/ has more about executable containers.
Container images can also be used for short lived processes: a containerized executable meant to be run on your computer. These containers execute a single task, are short lived and can generally be removed after use. We call these executable images. Examples are compilers (Golang) or build tools (Maven), presentation software (I love to hack a simple presentation in Markdown format and let a RevealJS Docker image serve that) and browsers (a fresh contained browser to follow that fishy link). A real evangelist for executable images is Docker's own Jessie Frazelle. To get some great inspiration be sure to read her blog about them or check out this presentation at DockerCon 2015.
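For a concrete taste of the executable-image idea, you can already run a Python script through a throwaway container without installing anything but Docker (the image tag and script name below are just examples):
docker run --rm -v "$PWD":/work -w /work python:3.9-slim python app.py
The container supplies the interpreter and its libraries; the host only needs Docker.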

Finding Tools/scripts/ subdirectory

Does anyone know where I could find this file on Ubuntu?
On my machine it's in:
/usr/share/doc/python2.7/examples/Tools/scripts/diff.py
However, it'll vary a little depending on your distribution and how up to date you are. But there's a really handy little tool called locate that you can use to quickly find stuff on your machine.
locate diff.py | grep Tools
Gives me
/usr/share/doc/python2.7/examples/Tools/scripts/diff.py
/usr/share/doc/python2.7/examples/Tools/scripts/ndiff.py
/usr/share/doc/python3.1/examples/Tools/scripts/diff.py
/usr/share/doc/python3.1/examples/Tools/scripts/ndiff.py
There are two sets of results because I have two versions of Python installed. I piped the output through grep, since locate on its own can sometimes match quite a lot.
If you're sure that you have a file on your machine but locate isn't finding it, you may need to update its database, which is done with the updatedb command, run as root. So just run
sudo updatedb
and get a coffee (or two, if you have a slow machine or a very full drive), then try again.
This command
$ locate "Tools/scripts/diff.py"
will find the location of the file if it's installed. This depends on the database generated regularly by the updatedb command (which usually runs as a cron job, but can also be invoked manually).
FWIW, I just checked my Ubuntu installation (10.04 LTS) and didn't find it. Perhaps only Python versions 2.7+ ship it (the default version that came with this install is still 2.6.5).
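If locate comes up empty, you can also ask the package system directly. On Ubuntu the examples live in a separate package (the name below is how it's packaged on recent releases and may differ on yours):
dpkg -L python2.7-examples | grep Tools/scripts
apt-file search Tools/scripts/diff.py
The first lists every file the package installed; the second searches all packages, installed or not, for the file.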
I needed to know this for a different environment, namely Linux running on an Amazon Machine Image (AMI). I installed Python 3.5 via yum but couldn't find the Tools/scripts directory using find, locate, etc., or any yum package that included it.
So in the end I cloned the Python source tree mirror:
git clone https://github.com/python-git/python python
This downloads the Tools/scripts folder, which I then moved to a standard location. I needed the 2to3 program from scripts, and this worked. There is also a "Clone or download" link at that URL where the package can be downloaded in the usual way if git is not available.
I couldn't find an easy way to install Tools/scripts via yum on AMI Linux, which would still be my preference.
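With the tree cloned, the scripts run straight from the checkout; for instance, 2to3 can be invoked like this (the target file name is hypothetical):
python python/Tools/scripts/2to3 -w myscript.py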

Run post-install script in a Python Egg (setuptools)

I have created a little Python egg (with setuptools) that I want to install on other machines in my LAN. I have even set up a server for the eggs and all (and the egg is properly downloaded and installed with easy_install -f http://myserver/eggrepository) :-)
I would like to know if there's a way of running a script (bash or Python) when installing it with easy_install (version 0.6c11, Python 2.6).
I have added a bash script to the package, and I'd like to be able to run it automatically (mainly to register some functionality at the rcX.d levels, start it at boot, etc.) when the egg is installed. Right now I have to go to /usr/local/lib/python2.6/dist-packages, find the folder where my egg was installed, and run the bash script inside it by hand. That solution is fragile, and I'm sure it will give me problems if I change versions, paths, etc.
I've been reading around and found some posts saying it isn't possible, but they are a bit old and maybe there's a way now. I also found others saying it is possible with distutils (which means setuptools can probably do it too), but I haven't been able to find a suitable solution using setuptools.
Thank you in advance
Related:
How can I add post install scripts...
How to extend distutils with a simple post install script
OK... I found a workaround:
python-packaging-custom-scripts
It's not as straightforward as I would have liked, but well...
I can put the installation process in an .sh file and then, since the install puts a Python script on the user's path, call that script from the bash script that installs the package.
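For reference, the usual setuptools-side approach is to subclass the install command in setup.py; a minimal sketch (package name and script path are hypothetical, and note this hook fires for source installs run through setup.py or pip, not when easy_install unpacks a pre-built egg, which is exactly why the question is awkward):
import subprocess
from setuptools import setup
from setuptools.command.install import install

class PostInstall(install):
    # run the normal install steps, then our post-install script
    def run(self):
        install.run(self)
        subprocess.check_call(["/bin/bash", "scripts/post_install.sh"])  # hypothetical path

setup(
    name="myegg",  # hypothetical package name
    version="0.1",
    packages=["myegg"],
    cmdclass={"install": PostInstall},
)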
