I'm looking into releasing a Python package which includes an existing Fortran or C program. The Fortran/C program is compiled by running
./configure
make
The Python code calls the resulting binary through subprocess calls (i.e. the code is not really wrapped as such). What I would like is that when the user types
python setup.py install
the Fortran/C program is first compiled using the ./configure and make commands, then I want the Python module to be installed, and the binary to be installed in the Python bin/ directory alongside executables that are usually installed via the scripts= option in distutils.core.setup.
First, are there any problems with doing this? And if not, what is the best way to do it via setup.py? Are there existing functions to automate the ./configure and make, since this is pretty standard? Or should I just use os.system calls? And either way, where should those commands go in setup.py? Then should I have make output the binary to e.g. scripts/ and then have scripts=['scripts/mybinary'] in the setup() function?
Don't make this too complex.
Just provide them as separate items with a README that says -- basically -- what you said in the question.
Build the Fortran/C with ./configure; make; make install.
Setup Python with python setup.py install.
It doesn't appear to be rocket science. Trying to over-simplify the installation means that you must account for every OS vagary and oddness.
It's easier to trust the users to do "standard" installations so that the Fortran/C binaries end up on the system PATH, and to configure your Python script to find them there.
People who want to use your software are then free to reconfigure it to their own unique needs. They will anyway. Don't overpackage and force them to fight against you to reconfigure things.
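If you do decide to automate the ./configure and make step from setup.py anyway, a minimal sketch might look like the following; the package name, the src/myprog directory and the mybinary name are all hypothetical, and it assumes make leaves the executable where the scripts= entry points:

import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py


class BuildConfigureMake(build_py):
    """Build the bundled C/Fortran program before the normal Python build."""

    def run(self):
        # Hypothetical location of the ./configure && make project
        subprocess.check_call(['./configure'], cwd='src/myprog')
        subprocess.check_call(['make'], cwd='src/myprog')
        build_py.run(self)


setup(
    name='mypkg',                       # hypothetical package name
    version='0.1',
    packages=['mypkg'],
    cmdclass={'build_py': BuildConfigureMake},
    # assumes make drops the executable here; it is then copied into the
    # same bin/ directory as other scripts= entries
    scripts=['src/myprog/mybinary'],
)

Because build_py runs before build_scripts, the binary exists by the time the scripts= entry is copied, and a non-Python file listed in scripts= is copied as-is rather than having a shebang rewritten.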
Consider writing a Python C extension as a wrapper for your C code, and an f2py extension as a wrapper for your Fortran code. Then you can call them directly from your Python code as fast calls instead of going through subprocess.
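For the f2py route specifically, numpy.distutils can build the Fortran wrapper for you; here is a rough sketch in which the mypkg._flib module name and the src/flib.f90 source file are made up for illustration:

# setup.py -- rough sketch using numpy.distutils / f2py
# ('mypkg._flib' and 'src/flib.f90' are hypothetical names)
from numpy.distutils.core import setup, Extension

flib = Extension(name='mypkg._flib', sources=['src/flib.f90'])

setup(
    name='mypkg',
    version='0.1',
    packages=['mypkg'],
    ext_modules=[flib],
)

After installation the wrapped routines can then be called directly, e.g. from mypkg import _flib, with no subprocess round trip.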
I'm trying to write a simple plugin with gimpfu in Python and I tried following these instructions:
1.2. Installation
Gimp-python consists of a Python module written in C and some native python support modules. You can build pygimp with the commands:
./configure
make
make install
This will build and install gimpmodule and its supporting modules, and install the sample plugins in gimp's plugin directory.
Where do I have to execute those commands?
I tried adding my script to the plugins folder, but it seems like there is no Python module called gimpfu. I believe I have to enable or install it in some way, but I can't find a solution.
EDIT: It seems like gimpfu is available in the Python-Fu console inside GIMP. It just doesn't seem to be available to my plugin scripts.
No need to install anything. In the Windows versions Python support is built-in, and the gimpfu import is available when your code is executed by Gimp.
If you don't see the plugin in the menu it is likely a syntax error that doesn't let it run its registration code. See here for some debugging techniques.
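For reference, a bare-bones Python-Fu plugin that registers cleanly looks roughly like the following (the procedure name, menu location and message are placeholders); if a file like this never shows up in the menu, a syntax error before register() is the usual cause:

#!/usr/bin/env python
# Minimal sketch of a gimpfu plugin; drop it in the plug-ins folder and
# make it executable. All names below are placeholders.
from gimpfu import *

def hello_sketch(image, drawable):
    pdb.gimp_message("Hello from Python-Fu")

register(
    "python-fu-hello-sketch",                    # procedure name (placeholder)
    "Minimal example plugin",                    # blurb
    "Shows that gimpfu registration works",      # help
    "Example Author", "Example Author", "2016",  # author, copyright, date
    "Hello sketch...",                           # menu label
    "*",                                         # image types
    [
        (PF_IMAGE, "image", "Input image", None),
        (PF_DRAWABLE, "drawable", "Input drawable", None),
    ],
    [],                                          # no return values
    hello_sketch,
    menu="<Image>/Filters",
)

main()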
However, since you mention PyCharm, you may have another Python interpreter installed and this makes things complicated because there can be conflicts depending on order of installation (and remember, Gimp uses Python 2.7)
Now it all depends if you are really doing a plugin (called from the Gimp menu) or a batch (where Gimp is called from a shell script), which is somewhat different. If you are writing a batch, see this answer for an example.
You don't need to install anything; on Windows, GIMP comes with a Python interpreter and the libraries bundled inside it.
If you want to run your script from inside GIMP then you should check this answer. You should add the path to GIMP to your system PATH environment variable (which is C:\Program Files\GIMP 2\bin on my system), and instead of calling gimp-console.exe you should use whatever gimp-console executable is currently available in that folder; the one on my system is gimp-console-2.10.exe.
So I've put together some Python bindings to some legacy Fortran code. They work, I can install this with the usual python setup.py install, but I would like to write some nosetests to check things using e.g. travis-ci.
I'm not sure how one goes about doing this. If one installs the package, and the bindings are available, then obviously you can just import the different methods and classes and use them in tests, but what if the package isn't installed? I guess that a pointer to some examples would be worthwhile.
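One common approach on CI (this is just a sketch, not something from the question) is to build the extension in place first, e.g. with python setup.py build_ext --inplace, so the bindings are importable from the source checkout without installing; the module and routine names below are hypothetical:

# tests/test_bindings.py -- hypothetical sketch; assumes the f2py extension
# was built in place (python setup.py build_ext --inplace) so that `mypkg`
# imports from the source tree.
import numpy as np
from numpy.testing import assert_allclose

from mypkg import _flib  # hypothetical f2py-generated module


def test_square():
    x = np.array([1.0, 2.0, 3.0])
    # assumes a Fortran routine `square` returning element-wise squares
    assert_allclose(_flib.square(x), x ** 2)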
Background
I write small python packages for a system that uses modules (https://luarocks.org/) to manage packages. For those of you who don't know it, you can run module load x and a small script is run that modifies various environmental variables to make software 'x' work, you can then undo this with module unload x.
This method of software management is nearly ubiquitous in scientific computing and has a lot of value in that arena: you can run ancient unmaintained software alongside packages that that software would interfere with; you can run multiple versions of software, which allows you to reproduce your data exactly (you can go back to old versions); and you can run frankly poorly written, never-updated software with outdated dependencies.
These features are great, but they create an issue with the python 2/3 split:
What if you want to write a package that works with both python 2 and 3 and use it alongside software that requires either python 2 or 3?
The way you make old python2 dependent software work on these large systems is that you make a python/2.7.x module and a python/3.5 module. When you want to run a script that uses python 2, you load that module, etc.
However, I want to write a single python package that can work in either environment, because I want that software to be active regardless of which python interpreter is being used.
This is fundamentally extremely easy: just use a #!/usr/bin/env python shebang line, done. That works. I write all my software to work with either, so no problem.
Question
The issue is: I want to use setuptools to distribute my package to other scientists in the same situation, and setuptools mangles the shebang line.
I don't want to get into a debate about whether mangling the shebang line is a good idea or not; I am sure it is, since it has existed for years now in the same state. I honestly don't care: it doesn't work for me. The default setuptools install causes the software not to run, because when a Python interpreter's module is not loaded, that interpreter does not function; the PYTHONPATH is totally wrong for it.
If all of my users had root access, I could use the data_files option to just copy the scripts to /usr/bin, but this is a bad idea for compatibility, and my users don't have root access anyway so it is a moot point.
Things I tried so far:
I tried setting the sys.executable to /usr/bin/env python in the setup.py file, but that doesn't work, because then the shebang is: #!"/usr/bin/env python", which obviously doesn't work.
I tried the Don't touch my shebang class idea in this question: Don't touch my shebang! (it is the bottom answer with 0 votes). That didn't work either, probably because it is written for distutils and not setuptools. Plus that question is 6 years old.
I also looked at these questions:
Setuptools entry_points/console_scripts have specific Python version in shebang
Changing console_script entry point interpreter for packaging
The methods described there do not work, the shebang line is still altered.
Creating a setup.cfg file with the contents::
[build]
executable = /usr/bin/env python
also does not change the shebang line mangling behavior.
There is an open issue on the setuptools github page that discusses something similar:
https://github.com/pypa/setuptools/issues/494
So I assume this isn't possible to do natively, but I wonder if there is a workaround?
Finally, I don't like any solution that involves asking the user to modify their install flags, e.g. with -e.
Is there any way to modify this behavior, or is there another distribution system I can use instead? Or is this too much of an edge case and I just need to write some kind of custom installation script?
Thanks all.
Update
I think I was not clear enough in my original question, what I want the user to be able to do is:
Install the package in both python2 and python3 (the modules will go into lib/pythonX/site-packages).
Be able to run the scripts irrespective of which python environment is active.
If there is a way to accomplish this without preventing shebang munging, that would be great.
All my code is already compatible with python 2.7 and python 3.3+ out of the box, the main thing is just making the scripts run irrespective of active python environment.
I accidentally stumbled onto a workaround while trying to write a custom install script.
import os
from setuptools import setup
from setuptools.command.install import install

here = os.path.abspath(os.path.dirname(__file__))

# Generate a list of python scripts
scpts = []
scpt_dir = os.listdir(os.path.join(here, 'bin'))
for scpt in scpt_dir:
    scpts.append(os.path.join(here, 'bin', scpt))


# Providing a cmdclass, even one that only calls the parent, is what
# triggers the direct-copy behavior described below.
class ScriptInstaller(install):

    """Install scripts directly."""

    def run(self):
        """Wrapper for parent run."""
        super(ScriptInstaller, self).run()


setup(
    cmdclass={'install': ScriptInstaller},
    scripts=scpts,
    ...
)
This code doesn't do exactly what I wanted (alter just the shebang line); it actually just copies the whole script to ~/.local/bin, instead of wrapping it in::
__import__('pkg_resources').run_script()
Additionally, and more concerningly, this method makes setuptools create a root module directory plus an egg-info directory like this::
.local/lib/python3.5/site-packages/cluster
.local/lib/python3.5/site-packages/python_cluster-0.6.1-py3.5.egg-info
Instead of a single egg, which is the usual behavior::
.local/lib/python3.5/site-packages/python_cluster-0.6.1-py3.5.egg
As far as I am aware this is the behavior of the old distutils, which makes me worry that this install would fail on some systems or have other unexpected behavior (although please correct me if I am wrong, I really am no expert on this).
However, given that my code is going to be used almost entirely on linux and OS X, this isn't the end of the world. I am more concerned that this behavior will just disappear sometime very soon.
I posted a comment on an open feature request on the setuptools github page:
https://github.com/pypa/setuptools/issues/494
The ideal solution would be if I could add an executable=/usr/bin/env python statement to setup.cfg, hopefully that is reimplemented soon.
This workaround will work for me for now though. Thanks all.
I don't even know the right way to put my question, but I will try my best.
I downloaded the source code of the Python interpreter (Python-3.4.2.tar) from www.python.org.
I extracted the files (using 7-zip).
Now let's say I later want to use the unzipped/extracted files to create an installer, i.e. put it in a form that I can double-click so that Python-3.4.2 will be installed on my computer.
I guess this is called creating a build distribution.
I know I can just download Python-3.4.2.exe from the site and install right away, but I want to know how it goes from being source code to becoming something one can install.
Here's a starting point for compiling:
https://github.com/python/cpython/blob/2.7/PCbuild/readme.txt
I was able to compile 2.7.13 with:
cd PCBuild
cmd /c get_externals.bat
cmd /c build.bat -e --no-tkinter "/p:PlatformToolset=v100"
3.4 may require a later compiler.
Here's how to build an installer:
https://github.com/python/cpython/blob/2.7/Tools/msi/README.txt
For building the CPython python interpreter from source, you'll want to have a look at the instructions at https://docs.python.org/devguide/
These instructions probably also (somewhere) contain the steps to produce a Windows installer, which seems to be done with some tool called PCbuild.
Basically, I think what you are asking is how you can distribute the Python runtime along with your program. The process is pretty simple. First, you may want to take a look at Python's distutils. Secondly, you will need to distribute the Python runtime. Python is currently released as installation binaries for pretty much every major operating system. You also have the option of compiling it for the target system(s) yourself in order to make Python install silently.
You will then need to decide where you want the files to go. Most people don't like this sort of behavior, so I recommend putting the files in your programs directory. Shortcuts, system variables, et cetera will need to be looked into.
As another option you can consider porting your script(s) to Jython, as most people tend to have at least the Java runtime. In the process of porting your code, there is a way of constructing Java .class files with Jython when the code is written a certain way. These can then be placed into an executable Jar file. Easy peasy, but I haven't tried this myself, to be honest.
If you want to stick with pure Python, the last hurdle is putting everything in one file. There are plenty of tools for that. I would try GitHub for starters. For bonus points you can always code your own self-extracting binary.
Post Script: With the title changed, I think you are looking for Creating Build Distributions. It is the second link within the Google results for the query "how to compile python installer". Further down the same page I found this gem.
This is probably a question that has a very easy and straightforward answer; however, despite having a few years of programming experience, for some reason I still don't quite get the exact concepts of what it means to "build" and then to "install". I know how to use them and have used them a lot, but have no idea about the exact processes which happen in the background...
I have looked across the web, wikipedia, etc... but there is no one simple answer to it, neither can I find one here.
A good example, which I tried to understand, is adding new modules to python:
http://docs.python.org/2/install/index.html#how-installation-works
It says that "the build command is responsible for putting the files to install into a build directory"
And then for the install command: "After the build command runs (whether you run it explicitly, or the install command does it for you), the work of the install command is relatively simple: all it has to do is copy everything under build/lib (or build/lib.plat) to your chosen installation directory."
So essentially what this is saying is:
1. Copy everything to the build directory and then...
2. Copy everything to the installation directory
There must be a process missing somewhere in the explanation... compilation?
Would appreciate some straightforward not too techy answer but in as much detail as possible :)
Hopefully I am not the only one who doesn't know the detailed answer to this...
Thanks!
Aivoric
Building means compiling the source code to binary in a sandbox location where it won't affect your system if something goes wrong, like a build subdirectory inside the source code directory.
Install means copying the built binaries from the build subdirectory to a place in your system path, where they become easily accessible. This is rarely done by a straight copy command, and it's often done by some package manager that can track the files created and easily uninstall them later.
Usually, a build command does all the compiling and linking needed, but Python is an interpreted language, so if there are only pure Python files in the library, there's no compiling step in the build. Indeed, everything is copied to a build directory, and then copied again to a final location. You'll only have a compiling step if the library depends on code written in other languages that needs to be compiled.
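For example, a setup.py that declares a C extension is the case where build really does compile something; the names here are made up:

# Sketch: with ext_modules, `python setup.py build` compiles the C source
# into build/lib.<platform>/ before install copies it onward.
# 'spam_speedups' and its .c file are hypothetical.
from distutils.core import setup, Extension

setup(
    name='spam',
    version='0.1',
    ext_modules=[Extension('spam_speedups', sources=['spam_speedups.c'])],
)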
You want a new chair for your living-room and you want to make it yourself. You browse through a catalog and order a pile of parts. When they arrive at your door, you can't immediately use them. You have to build the chair at your workshop. After a bit of elbow grease, you can sit down in it. Afterwards, you install the chair in your living-room, in a convenient place to sit down.
The chair is a program you want to use. It arrives at your house as source code. You build it by compiling it into a runnable program. You install it by making it easier to use.
The build and install commands you are referring to come from the setup.py file, right?
Setup.py (http://docs.python.org/2/distutils/setupscript.html)
This file is created by 3rd party applications / extensions of Python. They are not part of:
Python source code (bunch of c files, etc)
Python libraries that come bundled with Python
When a developer makes a library for Python that he wants to share with the world, he creates a setup.py file so the library can be installed on any computer that has Python. Maybe this is the MISSING STEP.
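A bare-bones setup.py for a pure-Python library might look like this (the name matches the hypothetical import example used further down):

# setup.py -- minimal sketch for a pure-Python library
from distutils.core import setup

setup(
    name='mycustomlibrary',
    version='0.1',
    packages=['mycustomlibrary'],
)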
Setup.py sdist
This creates a source distribution (the tar.gz file). What this does is copy all the files used by the Python library into a folder, include the setup.py file for the module, and archive everything so the library can be built somewhere else.
Setup.py build
This builds the python module back into a library (SPECIFICALLY FOR THIS OS).
As you may know, the computer that the Python library originally came from will be different from the computer that you are installing it on.
It might have a different version of python
It might have a different operating system
It might have a different processor / motherboard / etc
For all the reasons listed above, the built code will not work on another computer. So setup.py sdist creates a source package with only the source files needed to rebuild the library on another computer.
What setup.py does exactly is similar to what a makefile would do. It compiles sources / creates libraries all that stuff.
Now we have a copy of all the files we need in the library and they will work on our computer / operating system.
Setup.py install
Great, we have all the files needed. But they won't work. Why? Well, they have to be added to Python, that's why. This is where install comes in. Now that we have a local copy of the library, we need to install it into Python so you can use it like so:
import mycustomlibrary
In order to do this we need to do several things including:
Copy files to their library folders in our version of Python.
Make sure the library can be imported using the import command.
Run any special install instructions for this library (setting up paths, etc.).
This is the most complicated part of the task. What if our library uses BeautifulSoup? This is not part of the Python standard library. We'd have to install it in a way such that our library and any others can use BeautifulSoup without interfering with each other.
Also what if python was installed someplace else? What if it was installed on a server with many users?
Install handles all these problems transparently. What it does is make the library that we just built able to run. All you have to do is use the import command; install handles the rest.