Central path for Python modules

I am starting to convert a lot of C code to Python 3.
In C I had a directory called "Toolbox", where I put all the functions I needed in different programs, i.e. my libraries.
To use a specific library I just had to add the line
#include "/home/User/Toolbox/VectorFunctions.h"
to my source. That way I was able to use the same library in different sources.
In Python I wrote some toolbox functions and tried to pull them into a source file with import VectorFunctions, which works as long as the file VectorFunctions.py is in the same directory as the source.
Is there a way (I think there must be one...) to tell Python that VectorFunctions.py is located in a different directory, e.g. /home/User/Python_Toolbox?
Thanks for any comment!

What I would do is organize these toolbox functions into one installable Python package, bruno_toolbox, with its own setup.py, install it in development mode into the system site-packages using python setup.py develop, and then use bruno_toolbox like any other package on the system, everywhere. Then, if that package feels useful, I'd publish it to PyPI for the benefit of everyone.
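A minimal sketch of such a setup.py, assuming the toolbox functions live in a bruno_toolbox/ directory (with an __init__.py) next to it:

# setup.py -- minimal sketch for a toolbox package
from setuptools import setup, find_packages

setup(
    name='bruno_toolbox',
    version='0.1.0',
    packages=find_packages(),  # finds bruno_toolbox/ and any subpackages
)

After running python setup.py develop (or the equivalent pip install -e .), from bruno_toolbox import VectorFunctions works from any directory, and edits to the toolbox files are picked up without reinstalling.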

You can use the Python path. Write this code at the beginning of your program:
import sys
sys.path.append('/home/User/Python_Toolbox')
If you have VectorFunctions.py in this folder you can import it:
import VectorFunctions
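If you'd rather not hard-code the absolute path, the same idea works with the path built from the home directory (so the snippet is not tied to one user):

import os
import sys

# Build the toolbox path from the current user's home directory
sys.path.append(os.path.join(os.path.expanduser('~'), 'Python_Toolbox'))

import VectorFunctions

Setting the PYTHONPATH environment variable to /home/User/Python_Toolbox before starting Python achieves the same thing without touching the code.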

Related

Manually adding libraries

Given that every library is Python code itself, I think it's logical that instead of using the import statement, we could copy the whole code of that library and paste it at the top of our main.py.
I'm working on a remote PC where I cannot install libraries; can I use a library by just doing this?
Forgive me if this a very silly question.
Thanks
In some cases yes, you can. But there are (a lot of) libraries that have some of their functionality written in C and compiled to binary (e.g. the famous numpy). You can't just paste those.
Another thing that pasting might introduce is naming collisions. If you use
import module
then any name in the module module can be safely used in the importing module as module.name, even if the name name is already defined somewhere in the importing module. If you just paste the code, this won't work.
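A tiny illustration of that difference (the module and function names here are made up):

# helpers.py -- the "library"
def size():
    return 42

# main.py -- the importing module
import helpers

def size():              # a local name that happens to clash
    return 'large'

print(helpers.size())    # 42 -- the module namespace keeps the two apart
print(size())            # 'large'
# If helpers.py were pasted at the top of main.py instead, the local
# definition of size() would silently replace the library's.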
While pasting the entire library at the top of your main file can work, I don't think it's the best way to go about solving your problem.
One option is to move the library into the same folder as your main.py file; the import system checks the directory containing the script being run before looking elsewhere.
Another option is to use a virtual environment (virtualenv) and install all the required libraries inside it. I'm not sure this will work for you, since you said you cannot install libraries and virtualenv requires pip. If you are interested in learning more about Python virtual environments, take a look here.
Many modules are at least partly written in C, Pygame for example, and CPython itself is implemented in C. You can't jump to conclusions, but if the library is pure Python, I'd suggest copying the package into your project directory and importing it, rather than copying and pasting code snippets.

How to import some set of Python modules globally?

In my particular setting, I have a set of Python modules that contain auxiliary functions used by many other modules. I put them into a LIBS folder, and I have other folders at the same path level containing modules that do certain jobs with the help of these LIBS modules. Presently, in every such module I import the LIBS modules like this:
import sys
sys.path.insert(0, '../LIBS')
import lib_module1
import lib_module2
....
As the project gets larger, this starts to be a pain in the neck. I need to write out a large set of import statements for these auxiliary LIBS modules in each new module.
Is there any way to automatically import all these LIBS modules for the other modules in the folders living at the same path level as the LIBS folder?
For this, you can use
__init__.py
Kindly refer to the Modules documentation and Stack Overflow.
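A minimal sketch of that idea, assuming LIBS contains lib_module1.py and lib_module2.py as in the question:

# LIBS/__init__.py -- makes LIBS a package and re-exports its modules,
# so callers can write "from LIBS import lib_module1" or just "import LIBS"
from . import lib_module1
from . import lib_module2

The parent directory of LIBS still has to be on sys.path (or the package has to be installed), which is what the next answer covers.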
Indeed there is! Start treating your LIBS modules as "real" modules (or packages) that are installed into the system like any other.
This means you will have to write a setup.py script to install your code. Generally this is done inside your development directory, then your module is installed with:
$ sudo python setup.py install
This will install your module under the site-packages subdirectory of wherever Python libraries are stored on your system.
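If you want to see where site-packages actually is on a given machine, the interpreter can tell you (just a quick check, not part of the install itself):

# Print the site-packages directories used by this interpreter
import site
print(site.getsitepackages())      # system-wide locations
print(site.getusersitepackages())  # per-user location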
I suggest starting by copying someone else's working setup.py and supporting files, then modifying to suit your packages. For example, here is my quoter module.
Fair warning: This is a pretty big step. Not only will you learn to deploy your module locally, you can also publish it on PyPI if you wish. The step of moving to true packages will encourage you to write more and more standard documentation, to develop and run more tests, to adopt more rigorous version specifications, to more clearly identify and define code dependencies, and take many other "professionalization" steps. These all pay dividends in better, more reliable, more portable, more easily deployed code--but I'd be lying if I didn't admit the learning curve can be steep at times.

What is Building and Installing?

This is probably a question that has a very easy and straightforward answer, however, despite having a few years programming experience, for some reason I still don't quite get the exact concepts of what it means to "build" and then to "install". I know how to use them and have used them a lot, but have no idea about the exact processes which happen in the background...
I have looked across the web, Wikipedia, etc., but there is no one simple answer to it, nor can I find one here.
A good example, which I tried to understand, is adding new modules to python:
http://docs.python.org/2/install/index.html#how-installation-works
It says that "the build command is responsible for putting the files to install into a build directory"
And then for the install command: "After the build command runs (whether you run it explicitly, or the install command does it for you), the work of the install command is relatively simple: all it has to do is copy everything under build/lib (or build/lib.plat) to your chosen installation directory."
So essentially what this is saying is:
1. Copy everything to the build directory and then...
2. Copy everything to the installation directory
There must be a process missing somewhere in the explanation... compilation?
I would appreciate a straightforward, not-too-techy answer, but in as much detail as possible :)
Hopefully I am not the only one who doesn't know the detailed answer to this...
Thanks!
Aivoric
Building means compiling the source code to binary in a sandbox location where it won't affect your system if something goes wrong, like a build subdirectory inside the source code directory.
Install means copying the built binaries from the build subdirectory to a place in your system path, where they become easily accessible. This is rarely done by a straight copy command, and it's often done by some package manager that can track the files created and easily uninstall them later.
Usually, a build command does all the compiling and linking needed, but Python is an interpreted language, so if there are only pure Python files in the library, there's no compiling step in the build. Indeed, everything is copied to a build directory, and then copied again to a final location. There is a compilation step only if the library depends on code written in other languages that needs to be compiled.
You want a new chair for your living room and you want to make it yourself. You browse through a catalog and order a pile of parts. When they arrive at your door, you can't immediately use them. You have to build the chair at your workshop. After a bit of elbow grease, you can sit down in it. Afterwards, you install the chair in your living room, in a convenient place to sit.
The chair is a program you want to use. It arrives at your house as source code. You build it by compiling it into a runnable program. You install it by making it easier to use.
The build and install commands you are referring to come from a setup.py file, right?
Setup.py (http://docs.python.org/2/distutils/setupscript.html)
This file is created by third-party applications / extensions of Python. They are not part of:
the Python source code (a bunch of C files, etc.)
the Python libraries that come bundled with Python
When a developer makes a library for Python that he wants to share with the world, he creates a setup.py file so the library can be installed on any computer that has Python. Maybe this is the MISSING STEP.
Setup.py sdist
This creates a source distribution (the tar.gz file). What it does is copy all the files used by the Python library into a folder, include the setup.py file for the module, and archive everything so the library can be built somewhere else.
Setup.py build
This builds the Python module back into a library (SPECIFICALLY FOR THIS OS).
As you may know, the computer that the Python library originally came from may be different from the computer you are installing it on.
It might have a different version of python
It might have a different operating system
It might have a different processor / motherboard / etc
For all the reasons listed above, the built code may not work on another computer. So setup.py sdist creates an archive with only the source files needed to rebuild the library on another computer.
What setup.py build does is similar to what a makefile would do: it compiles sources, creates libraries, all that stuff.
Now we have a copy of all the files we need in the library and they will work on our computer / operating system.
Setup.py install
Great, we have all the files needed. But they won't work yet. Why? Well, they have to be added to Python, that's why. This is where install comes in. Now that we have a local copy of the library, we need to install it into Python so you can use it like so:
import mycustomlibrary
In order to do this we need to do several things, including:
Copy the files to the library folders of our version of Python.
Make sure the library can be imported using the import statement.
Run any special install instructions for this library (setting up paths, etc.).
This is the most complicated part of the task. What if our library uses BeautifulSoup? That is not part of the Python standard library. We'd have to install it in such a way that our library and any others can use BeautifulSoup without interfering with each other.
Also, what if Python was installed someplace else? What if it was installed on a server with many users?
Install handles all these problems transparently. What it does is make the library that we just built able to run. All you have to do is use the import statement; install handles the rest.
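A quick way to confirm what install did, using the hypothetical mycustomlibrary name from above:

# Works from any directory once the library is installed; __file__ shows
# where the copy ended up (typically a site-packages directory)
import mycustomlibrary
print(mycustomlibrary.__file__)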

Some way to create a cross-platform, self-contained, cloud-synchronized python library of modules for personal use? [duplicate]

I need to ship a collection of Python programs that use multiple packages stored in a local Library directory: the goal is to avoid having users install packages before using my programs (the packages are shipped in the Library directory). What is the best way of importing the packages contained in Library?
I tried three methods, but none of them appears perfect: is there a simpler and robust method? or is one of these methods the best one can do?
In the first method, the Library folder is simply added to the library path:
import sys
import os
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'Library'))
import package_from_Library
The Library folder is put at the beginning so that the packages shipped with my programs have priority over the same modules installed by the user (this way I am sure that they have the correct version to work with my programs). This method also works when the Library folder is not in the current directory, which is good. However, this approach has drawbacks. Each and every one of my programs adds a copy of the same path to sys.path, which is a waste. In addition, all programs must contain the same three path-modifying lines, which goes against the Don't Repeat Yourself principle.
An improvement over the above problems consists in trying to add the Library path only once, by doing it in an imported module:
# In module add_Library_path:
import os
import sys
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'Library'))
and then to use, in each of my programs:
import add_Library_path
import package_from_Library
This way, thanks to the caching mechanism of CPython, the module add_Library_path is only run once, and the Library path is added only once to sys.path. However, a drawback of this approach is that import add_Library_path has an invisible side effect, and that the order of the imports matters: this makes the code less legible, and more fragile. Also, this forces my distribution of programs to include an add_Library_path.py program that users will not use.
Python modules from Library can also be imported by making it a package (empty __init__.py file stored inside), which allows one to do:
from Library import module_from_Library
However, this breaks for packages in Library, as they might do something like from xlutils.filter import …, which breaks because xlutils is not found in sys.path. So, this method works, but only when including modules in Library, not packages.
All these methods have some drawback.
Is there a better way of shipping programs with a collection of packages (that they use) stored in a local Library directory? or is one of the methods above (method 1?) the best one can do?
PS: In my case, all the packages from Library are pure Python packages, but a more general solution that works for any operating system is best.
PPS: The goal is that the user be able to use my programs without having to install anything (beyond copying the directory I ship them regularly), like in the examples above.
PPPS: More precisely, the goal is to have the flexibility of easily updating both my collection of programs and their associated third-party packages from Library by having my users do a simple copy of a directory containing my programs and the Library folder of "hidden" third-party packages. (I do frequent updates, so I prefer not forcing the users to update their Python distribution too.)
Messing around with sys.path leads to pain... The modern package template and Distribute contain a vast array of information and were in part set up to solve your problem.
What I would do is set up setup.py to install all your packages to a specific site-packages location or, if you can, to the system's site-packages. In the former case, the local site-packages directory would then be added to the PYTHONPATH of the system/user. In the latter case, nothing needs to change.
You could use a batch file to set the Python path as well, or change the Python executable to point to a shell script that sets a modified PYTHONPATH and then executes the Python interpreter. The latter, of course, means that you have to have access to the user's machine, which you do not. However, if your users only run scripts and do not import your libraries directly, you could use your own wrapper for scripts:
#!/path/to/my/python
And the /path/to/my/python script would be something like:
#!/bin/sh
PYTHONPATH=/whatever/lib/path:$PYTHONPATH /usr/bin/python "$@"
I think you should have a look at path import hooks, which allow you to modify the behaviour of Python when it searches for modules.
For example, you could try to do something like KDE's script engine does for Python plugins [1].
It adds a special token to sys.path (like "<plasmaXXXXXX>", with XXXXXX being a random number just to avoid name collisions), and then, when Python tries to import modules and can't find them in the other paths, it calls your importer, which can deal with it.
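A minimal sketch of the idea using a meta path finder (a close relative of the path hooks mentioned above); the Library location is the one from the question, resolved relative to the program's own directory:

import importlib.abc
import importlib.util
import os
import sys

LIBRARY_DIR = os.path.join(os.path.dirname(__file__), 'Library')

class LibraryFinder(importlib.abc.MetaPathFinder):
    # Fallback finder: looks for top-level modules and packages in LIBRARY_DIR
    def find_spec(self, fullname, path=None, target=None):
        if path is not None:          # only handle top-level imports
            return None
        name = fullname.partition('.')[0]
        module_file = os.path.join(LIBRARY_DIR, name + '.py')
        package_dir = os.path.join(LIBRARY_DIR, name)
        if os.path.isfile(module_file):
            return importlib.util.spec_from_file_location(fullname, module_file)
        if os.path.isfile(os.path.join(package_dir, '__init__.py')):
            return importlib.util.spec_from_file_location(
                fullname, os.path.join(package_dir, '__init__.py'),
                submodule_search_locations=[package_dir])
        return None

# Appending means regular sys.path entries still win; use insert(0, ...)
# instead to give the bundled Library priority.
sys.meta_path.append(LibraryFinder())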
A simpler alternative is to have a main script used as a launcher, which simply adds the path to sys.path and executes the target file (so that you can safely avoid putting the sys.path.append(...) line in every file).
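As a sketch, such a launcher (call it launcher.py and run it as python launcher.py some_program.py args...) could look like:

# launcher.py -- puts the bundled Library first on sys.path, then runs the
# requested program as if it had been started directly
import os
import runpy
import sys

sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'Library'))

target = sys.argv[1]
sys.argv = sys.argv[1:]   # hide the launcher from the target's argv
runpy.run_path(target, run_name='__main__')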
Yet another alternative, which works on Python 2.6+, would be to install the library under the per-user site-packages directory.
[1] You can find the source code under /usr/share/kde4/apps/plasma_scriptengine_python in a Linux installation with KDE.

How to version a Python program

I'm writing a Python program (for use in a Linux environment) and I want it to support version upgrades of the program. I read somewhere that I should add a line like "__version__", but I'm not really sure where and how to put it. What I want to achieve is not having to erase my whole program every time I want to install a new version of it.
thanks!
I highly recommend you use setuptools instead of versioning manually. It is the de facto standard now and very simple to use. All you need to do is create a setup.py in your project's root directory:
from setuptools import setup, find_packages

setup(
    name='your_programs_name',
    version='major.minor.patch',
    packages=find_packages(),
)
and then just run:
python setup.py sdist
and then there will be a source archive (a .tar.gz) under your dist folder.
What you actually want to do is make your Python program into a package using distutils.
You would create a setup.py. A simple example would look something like this:
from distutils.core import setup

setup(
    name='foo',
    version='1.0',
    py_modules=['foo'],
)
This is the most standard way to version and distribute Python code. If your code is open source you can even register it with the Cheese Shop (PyPI) and install it on any machine in the world with a simple pip install mypackage.
It depends on what you want to be versioned: single modules with independent versions, or the whole program/package.
You could theoretically add a __version__ string to every class and do dynamic imports by testing these variables.
If you want to version the main, or whole, program, you could add a __version__ string somewhere at the top of your __init__.py file and import this string into your setup.py when generating packages. That way you wouldn't have to manually edit multiple files, and setup.py could be left mostly untouched.
Also consider using a string, not a number or tuple. See PEP 396.
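A minimal sketch of that single-source approach (the package name mypackage is made up; it assumes the package can be imported at build time, which is fine for pure-Python code):

# mypackage/__init__.py
__version__ = '1.2.0'

# setup.py
from setuptools import setup, find_packages
from mypackage import __version__   # single source of truth for the version

setup(
    name='mypackage',
    version=__version__,
    packages=find_packages(),
)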
If you are only concerned with versions on your local machine, I would suggest becoming familiar with Git. Information about version control in Git can be found here.
If you are concerned with version control on a module that others would use, information can be found here.
