Friends,
I was trying to create a Python distribution. I executed the commands:
python setup.py sdist followed by python setup.py install
but the created distribution contained build and dist folders without any .pyc files.
I then searched for the files using Windows Find and found that they are present in
C:\Python27\Lib\site-packages
Could anybody tell me what mistake I made in the setup, or what I missed?
Thanks in advance,
Saurabh
The sdist command creates a source distribution, which should not and does not contain .pyc files.
The install command installs the package into your local environment. Creating Python distributions is not its purpose.
If your package is pure Python, then the source distribution is enough to install it anywhere you like; you don't need .pyc files, which depend on the Python version and are therefore less general. bdist or bdist_egg (setuptools) generate, among other things, .pyc files.
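For example (assuming a setup.py in the current directory), the two kinds of distribution are built with:
python setup.py sdist        # source distribution: .py files only, no .pyc
python setup.py bdist        # built distribution: includes compiled .pyc files
python setup.py bdist_egg    # same idea, as a setuptools egg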
You won't get any .pyc files until the first time you run the module. A source distribution is just that: source. Distributing .pyc files is not usually useful anyway; they are not portable. If you intended to distribute only .pyc files, you are in for a lovely set of problems when different Python versions and different operating systems are used. Always distribute the source.
For most modules, the time taken to generate them the first time they are used is trivial; it is not worth worrying about.
By the way, when you move to Python 3, the .pyc files are stored in a directory called __pycache__ (as of 3.2).
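If you do want to pre-compile the .pyc files yourself (say, at install time), the standard library's compileall module will do it for a whole directory tree; the path here is a placeholder:
python -m compileall path/to/your/package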
Python setuptools can create a source distribution:
python setup.py sdist # create a source distribution (tarball, zip file, etc.)
Or a binary distribution:
python setup.py bdist # create a built (binary) distribution
As far as I understand, there should not be any performance difference:
bdist installs the already-compiled .pyc files from the binary package.
Installing from an sdist compiles the .py files to .pyc files and installs them.
When executed, it should not matter how the .pyc files were compiled; they should have the same performance.
Is there any performance difference between bdist and sdist Python packages?
If you have pure Python code, the difference in deployment time will be slim. Note that there is no difference in performance between .py and .pyc, except that the latter is read slightly faster the first time. The so-called optimized .pyo files only strip asserts and, optionally, docstrings, so they are not very optimized.
The big difference comes when you have C files. sdist will include them if properly referenced, but the user will need a working and appropriate compiler, the Python header files, and so on. Also, you will have to take the time to build them on every client. The same distribution will be valid for any platform you deploy to.
On the other hand, bdist compiles the code once. Installing on the client is immediate, as they don't need to build anything, and easier, as they don't require a compiler to be installed. The downside is that you have to build for that platform. Setuptools is capable of cross-compilation, provided you have installed and configured the right tools.
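As a rough sketch of that difference, here is a minimal setup.py with one pure-Python module and one C extension (all names are invented for illustration):
from distutils.core import setup, Extension

setup(
    name="mypackage",
    version="1.0",
    py_modules=["mymodule"],       # pure Python: portable in an sdist
    ext_modules=[
        # C source: an sdist ships myfastmodule.c and the client compiles it;
        # a bdist ships the compiled binary for one specific platform.
        Extension("myfastmodule", sources=["myfastmodule.c"]),
    ],
)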
This is probably a question with a very easy and straightforward answer; however, despite having a few years of programming experience, for some reason I still don't quite get the exact concepts of what it means to "build" and then to "install". I know how to use them and have used them a lot, but I have no idea about the exact processes that happen in the background...
I have looked across the web, Wikipedia, etc., but there is no single simple answer to it, nor can I find one here.
A good example, which I tried to understand, is adding new modules to python:
http://docs.python.org/2/install/index.html#how-installation-works
It says that "the build command is responsible for putting the files to install into a build directory"
And then for the install command: "After the build command runs (whether you run it explicitly, or the install command does it for you), the work of the install command is relatively simple: all it has to do is copy everything under build/lib (or build/lib.plat) to your chosen installation directory."
So essentially what this is saying is:
1. Copy everything to the build directory, and then...
2. Copy everything to the installation directory.
There must be a process missing somewhere in the explanation... compilation?
I would appreciate a straightforward, not-too-techy answer, but in as much detail as possible :)
Hopefully I am not the only one who doesn't know the detailed answer to this...
Thanks!
Aivoric
Building means compiling the source code into binaries in a sandbox location where it won't affect your system if something goes wrong, such as a build subdirectory inside the source code directory.
Installing means copying the built binaries from the build subdirectory to a place on your system path, where they become easily accessible. This is rarely done with a straight copy command; it is usually done by a package manager that can track the files created and easily uninstall them later.
Usually, a build command does all the compiling and linking needed, but Python is an interpreted language, so if there are only pure Python files in the library, there is no compiling step in the build. Indeed, everything is copied to a build directory and then copied again to its final location. Only if the library depends on code written in other languages that needs to be compiled will you have a compiling step.
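In other words, for a pure-Python library the two steps reduce to two copies (a sketch, using distutils' default directories):
python setup.py build      # copies the sources into build/lib (compiling C code here, if any)
python setup.py install    # copies build/lib into site-packages, running build first if needed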
You want a new chair for your living room, and you want to make it yourself. You browse through a catalog and order a pile of parts. When they arrive at your door, you can't immediately use them. You have to build the chair at your workshop. After a bit of elbow grease, you can sit down in it. Afterwards, you install the chair in your living room, in a convenient place to sit.
The chair is a program you want to use. It arrives at your house as source code. You build it by compiling it into a runnable program. You install it by making it easier to use.
The build and install commands you are referring to come from the setup.py file, right?
Setup.py (http://docs.python.org/2/distutils/setupscript.html)
This file is created by 3rd-party applications / extensions of Python. They are not part of:
the Python source code (a bunch of C files, etc.)
the Python libraries that come bundled with Python
When a developer makes a library for Python that he wants to share with the world, he creates a setup.py file so the library can be installed on any computer that has Python. Maybe this is the MISSING STEP.
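Such a setup.py can be very small. A minimal sketch (the library name is invented, matching the import example further down):
from distutils.core import setup

setup(
    name="mycustomlibrary",
    version="0.1",
    packages=["mycustomlibrary"],   # the directory containing __init__.py
)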
setup.py sdist
This creates a source distribution (the .tar.gz file). What this does is copy all the files used by the Python library into a folder, include the setup.py file for the module, and archive everything so the library can be built somewhere else.
setup.py build
This builds the Python module back into a library (SPECIFICALLY FOR THIS OS).
As you may know, the computer that the Python library originally came from will be different from the computer that you are installing it on.
It might have a different version of python
It might have a different operating system
It might have a different processor / motherboard / etc
For all the reasons listed above, the code may not work on another computer. So setup.py sdist creates a module with only the source files needed to rebuild the library on another computer.
What setup.py does is similar to what a makefile would do: it compiles sources, creates libraries, all that stuff.
Now we have a copy of all the files we need in the library, and they will work on our computer / operating system.
setup.py install
Great, we have all the files needed. But they won't work yet. Why? Well, they have to be added to Python, that's why. This is where install comes in. Now that we have a local copy of the library, we need to install it into Python so you can use it like so:
import mycustomlibrary
In order to do this we need to do several things, including:
Copy the files to their library folders in our version of Python.
Make sure the library can be imported using the import command.
Run any special install instructions for this library (setting up paths, etc.).
This is the most complicated part of the task. What if our library uses BeautifulSoup? That is not part of the Python standard library. We'd have to install it in such a way that our library and any others can use BeautifulSoup without interfering with each other.
Also, what if Python was installed someplace else? What if it was installed on a server with many users?
Install handles all these problems transparently. What it does is make the library that we just built able to run. All you have to do is use the import command; install handles the rest.
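Putting the three steps together, a sketch of the whole round trip (file names follow distutils' defaults for the invented example above):
python setup.py sdist                  # developer: creates dist/mycustomlibrary-0.1.tar.gz
tar xzf mycustomlibrary-0.1.tar.gz     # user: unpack the source distribution
cd mycustomlibrary-0.1
python setup.py build                  # rebuild for this OS / Python version
python setup.py install                # copy into site-packages
After this, import mycustomlibrary works anywhere on that machine.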
I am writing a program in Python to be sent to other people, who are running the same Python version; however, there are some 3rd-party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say .pyc because it's a Python compiled file) that has all the dependent modules inside it as well?
So they can run the program without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_freeze, etc.; what I'm trying to get is just a single Python file.
So they can just type "python myapp.py" and it will run. No installation of anything, as if all the module code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils route, you can provide a zip file with the .pycs for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (if the dependencies are just lying around at the root of the zip, this is not necessary). Then just add the zip location to your path and it should work just as if the dependency files had been installed.
If your dependencies include .pyds or other binary dependencies, you'll probably have to fall back on distutils.
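As a sketch of the zip approach (the file and module names are hypothetical): Python's zipimport support means a zip file on sys.path behaves like a directory of installed modules:
import sys

# deps.zip contains the pure-Python dependencies (.py or .pyc) at its root
sys.path.insert(0, "deps.zip")

import somedependency   # now resolved from inside deps.zip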
You can simply include the .pyc files for the libraries required, but no, a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd-party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change which versions of 3rd-party libraries your code runs against (see this section of the above-linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
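A sketch of declaring such a dependency, here using setuptools' install_requires (the package name and version pin are just examples):
from setuptools import setup

setup(
    name="mypackage",
    version="1.0",
    packages=["mypackage"],
    # fetched and installed automatically alongside your package
    install_requires=["BeautifulSoup>=3.2"],
)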
Two options:
Build a package that will install the dependencies for them (I don't recommend this if the only dependencies are Python packages that can be installed with pip).
Use virtual environments. You use the existing Python on their system, but the Python modules are installed into the virtualenv (see the sketch after this list).
Or, I suppose, you could just punt, create a shell script that installs them, and tell them to run it once before they run your stuff.
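For the virtualenv option, the steps for your users might look like this (the requirements file is an assumption; list your 3rd-party modules in it, one per line):
virtualenv myenv
source myenv/bin/activate           # on Windows: myenv\Scripts\activate
pip install -r requirements.txt    # installs the 3rd-party modules into the venv
python myapp.py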
I have a personal python library consisting of several modules of scientific programs that I use. These live on a directory with the structure:
root/__init__.py
root/module1/__init__.py
root/module1/someprog.py
root/module1/ (...)
root/module2/__init__.py
root/module2/someprog2.py
root/module2/somecython.pyx
root/module2/somecython.so
root/module2/somefortran.f
root/module2/somefortran.so
(...)
I am constantly making changes to these programs and adding new files. With my current setup at work, I share the same directory between several machines of different architectures. What I want is a way to use these packages from Python on the different architectures. If the packages were all pure Python, this would be no problem. But the issue is that I have several compiled binaries (as shown in the example), from Cython and from f2py.
Is there a clever way to repackage these binaries so that Python on the different systems imports only the relevant binaries? I'd like to keep the code organized in the same directory.
Obviously the simplest way would be to duplicate the directory or create another directory of symlinks. But this would mean that whenever new files are created, I'd have to update the symlinks manually.
Has anyone bumped into a similar problem, or can suggest a more pythonic approach to this organisation problem?
You should probably use setuptools/distribute. You can then define a setup.py that compiles all the files according to your current platform, copies them to an adequate directory, and makes sure they are available on your sys.path.
You would do the following when compiling the source code of Python itself:
Pass the --exec-prefix flag, with the directory, to ./configure.
For more info, ./configure --help will give you the following:
Installation directories:
--prefix=PREFIX install architecture-independent files in PREFIX
[/usr/local]
--exec-prefix=EPREFIX install architecture-dependent files in EPREFIX
[PREFIX]
Hope this helps :)
There is unfortunately no way to do this. A Python package must reside entirely in one directory. PEP 382 proposed support for namespace packages that could be split across different directories, but it was rejected. (And in any case, those would be special packages.)
Given that Python packages have to be in a single directory, it is not possible to mix compiled extension modules for different architectures. There are two ways to mitigate this problem:
Keep the binary extensions in a separate directory, and have all the Python packages in a common directory that can be shared between architectures. The separate directory for binary extensions can then be selected for each architecture with PYTHONPATH (see the sketch after these options).
Keep a common directory with all the Python files and the extensions for the different architectures. For each architecture, create a new directory with the package name, then symlink all the Python files and binaries into each of these directories. This still allows a single place where the code lives, at the expense of having to create new symlinks for each new file.
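A sketch of the first option (all paths are invented): keep the pure-Python tree shared, and pick the binary directory per machine:
# shared pure-Python packages:        /shared/pylib/root/...
# per-architecture extension builds:  /shared/pybin/<arch>/...
export PYTHONPATH=/shared/pylib:/shared/pybin/$(uname -m)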
The option suggested by Thorsten Krans is unfortunately not viable for this problem. Using distutils/setuptools/distribute still requires all the Python source files to be installed in a directory for each architecture, negating the advantage of having them in a single directory. (This is not a finished package, but always a work in progress.)
If I place my project in /usr/bin/,
will the Python interpreter generate bytecode? If so, where does it put it, given that the files' folder is not writable? Does it cache it in a temp file?
If not, is there a performance loss from my putting the project there?
I have packaged this up as a .deb file that is installed from my Ubuntu PPA, so the obvious place to install the project is in /usr/bin/,
but if putting it there means no bytecode is generated, what should I do? Can I give the project write permission if it installs on another person's machine? That would seem to be a security risk.
There are surely lots of Python projects installed in Ubuntu (and obviously other distros); how do they deal with this?
Thanks
Regarding the script in /usr/bin: if you execute your script as a user that doesn't have permission to write to /usr/bin, then the .pyc files won't be created and, as far as I know, there isn't any other caching mechanism.
This means that your file will be byte-compiled by the interpreter every time, so yes, there will be a performance loss. However, that loss is probably not noticeable. Note that when a source file is updated, the compiled file is updated automatically without the user noticing it (at least most of the time).
What I've seen as common practice in Ubuntu is to use small scripts in /usr/bin, without even the .py extension. Those scripts are byte-compiled very quickly, so you don't need to worry about that. They just import a library and call some kind of library.main.Application().run() method, and that's all.
Note that the library is installed in a different path, and that all the library files are byte-compiled for the different Python versions. If that's not the case in your package, then you have to review your setup.py and your Debian files, since that's not the way it should be.
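For illustration, such a /usr/bin launcher is typically no more than this (the library name is invented):
#!/usr/bin/python
# /usr/bin/myapp -- thin launcher; the real code lives in site-packages,
# where it was byte-compiled when the .deb was installed
from myapplib import main

main.Application().run()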
.pyc/.pyo files are not generated for scripts that are run directly. Python modules that are placed where Python modules are normally expected, and packaged up accordingly, have their .pyc/.pyo files generated at either build time or install time, and so aren't the end user's problem.