I am looking for a solution that will allow me to use a .py file as a little library.
Short description:
I have a Python script that contains a lot of common functions.
Instead of hauling around one big file every time, I want to move all the common functions into a separate .py file, put it into a Tools folder, and use it whenever I need it.
My problem is that I cannot import this file from Tools, because my script does not see it.
My folder structure:
C:\some\folders\here\my\folder\script.py
C:\some\folders\here\Tools\Library\library.py
Also, using an __init__.py is not a good fit for me, because I don't have a Python project; it is just one file without anything else.
Are there any normal solutions?
The Python interpreter searches for modules in three places:
the current directory, meaning the directory from which you run the script containing your import statement
the list of directories in PYTHONPATH
an installation-dependent list of directories, configured when Python is installed
You can also modify sys.path at runtime and include a directory where your module is located, but this is the worst solution and it's usually discouraged to do so.
Setting PYTHONPATH will most likely be a solution in your situation.
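For completeness, here is a minimal sketch of the runtime sys.path route (the discouraged one). To stay self-contained it creates a throwaway library.py in a temporary folder standing in for Tools\Library\library.py and its greet() function is invented for the demo; with PYTHONPATH you would skip the sys.path line and set the variable before starting Python.

```python
import os
import sys
import tempfile

# Stand-in for C:\some\folders\here\Tools\Library (path from the question):
lib_dir = tempfile.mkdtemp()
with open(os.path.join(lib_dir, "library.py"), "w") as f:
    f.write("def greet():\n    return 'hello from library'\n")

# Runtime alternative to PYTHONPATH: prepend the folder to the search path.
sys.path.insert(0, lib_dir)

import library  # now resolves to <lib_dir>/library.py
print(library.greet())
```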
I have been reading a lot about how to set up projects, with an __init__.py in each folder of the structure and also one in the root folder where the project sits.
I have the folder structure below, and running it in PyCharm works, since PyCharm adds the project path to the environment variables when it starts.
C:\test_folder\folder_struture\project
C:\test_folder\folder_struture\project\pak1
C:\test_folder\folder_struture\project\pak1\pak_num1.py
C:\test_folder\folder_struture\project\pak1\__init__.py
C:\test_folder\folder_struture\project\program
C:\test_folder\folder_struture\project\program\program.py
C:\test_folder\folder_struture\project\program\__init__.py
C:\test_folder\folder_struture\project\__init__.py
C:\test_folder\folder_struture\__init__.py
When I try to run program.py where I have:
from project.pak1 import pak_num1
I get an error that the module doesn't exist. When I add the project folder to the PYTHONPATH variable (or the PATH variable) on my Windows machine, everything works fine. Is it possible that the tutorials are missing the part about adding the project root to the environment variables because they assume you have already done it?
Do I need to put the project root into the environment variables for every project, or is there a way for Python to recognize that the file sits inside a package structure?
Adding it to the environment lets me use absolute imports,
but if I try a relative import with
from ..pak1 import pak_num1
I get:
ImportError: attempted relative import with no known parent package
If I run program.py, does Python look for an __init__.py in the same folder, and if it finds one, go one level up to look for another __init__.py, and so on, to build the package structure?
If you are writing lots of little scripts and start wanting to organize some of them into utility packages, mostly to be used by other scripts outside those packages, put all the scripts that are meant to be called from the command line (your entry points for execution, or main scripts, or whatever you want to call the scripts that run as commands) side by side, in the same folder as the root folders of all your packages.
Then you can import anything from any package from the top-level scripts. Starting execution from the top-level scripts not only gives access to every package, but also lets all of the packages' internal imports work, provided an __init__.py file sits in every package and sub-package folder.
For code inside a package to import from a sibling package (a sibling at the top-level folder), you need either to append the sibling package to sys.path, or, for example, to wrap everything in yet another folder with an __init__.py file in it. Then, again, you should start execution from outside this overall package, not from the scripts that were top-level before you did this.
Think of packages as something to be imported for use, not to start running from a random inner entry point.
Often a better alternative is to configure a virtualenv: every package you install in the environment becomes importable in that environment. This also isolates the dependencies from project to project, including the Python version in use if you need that. Note that this solves a different, and by the way quite hairy, problem as well.
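As a sketch of the side-by-side layout described above (the folder and function names are invented for the demo), the entry-point script sits next to the package root, and the import works because Python prepends the script's own directory to sys.path:

```python
import os
import subprocess
import sys
import tempfile

# Build the layout in a temp dir:
#   root/
#     main.py          <- entry point, run from the command line
#     utils/           <- package root, side by side with main.py
#       __init__.py
#       helpers.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "utils")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "helpers.py"), "w") as f:
    f.write("def double(x):\n    return 2 * x\n")
with open(os.path.join(root, "main.py"), "w") as f:
    f.write("from utils.helpers import double\nprint(double(21))\n")

# Running the top-level script makes the sibling package importable,
# because main.py's directory becomes sys.path[0] in the child process.
out = subprocess.check_output(
    [sys.executable, os.path.join(root, "main.py")], text=True)
print(out.strip())
```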
Let's say I have a script I've written:
~/workspace/myscript/script.py
If I have, for example, a ~/bin which I have added to my $PATH, then I could create a symbolic link
~/bin/script -> ~/workspace/myscript/script.py
And everything works fine, I can call my script from anywhere.
Then, say my script starts to grow, and I separate it out
~/workspace/myscript/
script.py
mylib.py
I now run into a problem, as described here: if I am calling my Python script directly (as opposed to importing it as a module), then I cannot do a relative import.
The only solution I have seen is to package up the whole program into a fully fledged python package with a setup.py and installing it system-wide (or managing a home directory python library folder).
This seems like a lot of extra work for the sake of breaking my code into multiple python files.
Is there some way I can:
Call the script from anywhere (have it callable on the path),
Have the code separated into multiple files,
Not have to manage a full Python package and installation.
All at once?
You can add the root directory of your module to the Python path (use $HOME rather than ~, since the tilde does not expand inside double quotes):
export PYTHONPATH="$PYTHONPATH:$HOME/workspace/myscript/"
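If you would rather not manage environment variables, a common alternative is to have script.py resolve its own real location at startup: when a script is invoked through a symlink, the directory Python puts on sys.path is the symlink's directory, not the target's, and os.path.realpath undoes that. A self-contained sketch (the file contents and message are invented for the demo):

```python
import os
import subprocess
import sys
import tempfile

ws = tempfile.mkdtemp()      # stands in for ~/workspace/myscript
bindir = tempfile.mkdtemp()  # stands in for ~/bin

with open(os.path.join(ws, "mylib.py"), "w") as f:
    f.write("MESSAGE = 'imported next to the real script'\n")
with open(os.path.join(ws, "script.py"), "w") as f:
    f.write(
        "import os, sys\n"
        "# realpath follows the symlink back to the real script's folder\n"
        "sys.path.insert(0, os.path.dirname(os.path.realpath(__file__)))\n"
        "import mylib\n"
        "print(mylib.MESSAGE)\n"
    )

# ~/bin/script -> ~/workspace/myscript/script.py, as in the question.
os.symlink(os.path.join(ws, "script.py"), os.path.join(bindir, "script"))

out = subprocess.check_output(
    [sys.executable, os.path.join(bindir, "script")], text=True)
print(out.strip())
```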
I am tinkering with some pet projects with Python in Linux (Mint 13) and I plan to do the following:
Create a Dropbox subfolder named "pybin" where I put all my home-made python modules;
Put a symlink to this folder somewhere in the system (first candidate: /usr/lib/python2.7/dist-packages, which is in sys.path, or some similar path);
Then I just do import mymodule from any python session, and the module is imported.
I tried it and it didn't work. I suspect this has to do with the differences between modules and packages, and with __init__.py files, but I confess that every time I read about this stuff I get pretty confused. Besides learning a bit more about it, all I really want is a way to import my modules as described. It is crucial that the actual folder is inside Dropbox (or any other file-syncing folder), not in a system folder.
Thanks for any help!
Why not simply set the PYTHONPATH environment variable in your .bash_profile? That way, every time you start a bash shell (which normally happens on login), this variable will point at wherever you keep your user-defined modules. The Python interpreter uses this variable to determine where to search for module imports:
PYTHONPATH="${PYTHONPATH}:/path/to/some/cool/python/package/:/path/to/another/cool/python/package/"
export PYTHONPATH
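To see the effect, you can check that colon-separated PYTHONPATH entries land on sys.path of a freshly started interpreter (the directory names below are placeholders and do not need to exist):

```python
import os
import subprocess
import sys

# Two placeholder entries, colon-separated as in the export line above.
env = dict(os.environ, PYTHONPATH="/tmp/cool_pkg_one:/tmp/cool_pkg_two")
out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.path)"],
    env=env, text=True)
print("/tmp/cool_pkg_one" in out and "/tmp/cool_pkg_two" in out)
```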
I am using Python 2.5 on CentOS 5.5
I have got a file called MultipartPostHandler.py, I am supposed to use it like this:
import MultipartPostHandler
But I don't know where I should put the file MultipartPostHandler.py so that I can use it. Thanks a lot!
http://docs.python.org/tutorial/modules.html#the-module-search-path
When a module named spam is imported, the interpreter searches for a file named spam.py in the current directory, and then in the list of directories specified by the environment variable PYTHONPATH. This has the same syntax as the shell variable PATH, that is, a list of directory names. When PYTHONPATH is not set, or when the file is not found there, the search continues in an installation-dependent default path; on Unix, this is usually .:/usr/local/lib/python.

Actually, modules are searched in the list of directories given by the variable sys.path, which is initialized from the directory containing the input script (or the current directory), PYTHONPATH, and the installation-dependent default. This allows Python programs that know what they're doing to modify or replace the module search path.
So, you have to put it into the current directory, or use sys.path to tell Python where to find the module.
The easiest is to put it in the same folder as the code you're writing.
If you got it from somebody who provides an installer, then to be safe, just run the installer.
If it's just something you grabbed specifically for this project, put it in the same folder as the .py files that you're writing.
Otherwise (if you're planning to use it for a few projects and don't want to copy/paste the file), the safest place to put it is probably in the /lib/site-packages sub-directory of the directory where Python is installed.
The easiest option, yes, is to put it in the same directory as the Python code that imports it. You can also export a PYTHONPATH environment variable pointing to the directory (or directories, separated by :) where you keep MultipartPostHandler.py. Another option: make it distributable. That will be useful if you are going to publish your code.
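On a modern Python 3, you can also ask the interpreter, before importing, whether a module is findable on the current search path and where it would come from; a quick sketch (the question's Python 2.5 predates importlib.util):

```python
import importlib.util

# "os" ships with Python; "MultipartPostHandler" is the third-party
# file from the question and will only be found once it is placed
# somewhere on the search path.
for name in ("os", "MultipartPostHandler"):
    spec = importlib.util.find_spec(name)
    print(name, "->", spec.origin if spec else "not found on sys.path")
```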
I have written a small DB access module that is extensively reused in many programs.
My code is stored in a single directory tree /projects for backup and versioning reasons, and so the module should be placed within this directory tree, say at /projects/my_py_lib/dbconn.py.
I want to easily configure Python to automatically search for modules under the /projects/my_py_lib directory structure (of course, an __init__.py should be placed in every subdirectory).
What's the best way to do this under Ubuntu?
Thanks,
Adam
You can add a PYTHONPATH environment variable to your .bashrc file, e.g.
export PYTHONPATH=/projects/my_py_lib
On Linux, this directory is added to your sys.path automatically for Python N.M:
~/.local/lib/pythonN.M/site-packages/
So you can put your packages in there for each version of python you are using.
You need a copy for each version of Python; otherwise the .pyc file will be recompiled every time you import the module with a different Python version.
This also gives you fine-grained control when a module only works with some of the Python versions you have installed.
If you create this file
~/.local/lib/pythonN.M/site-packages/usercustomize.py
it will be imported each time you start the python interpreter
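Rather than constructing the N.M path by hand, you can ask Python for its own per-user site-packages directory; a small sketch:

```python
import site

# The per-user site-packages directory for this interpreter, e.g.
# ~/.local/lib/python3.X/site-packages on Linux.
print(site.getusersitepackages())

# Whether per-user site-packages are enabled for this interpreter
# (it can be False inside some virtual environments):
print(site.ENABLE_USER_SITE)
```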
Another option is to create a soft link in /usr/lib*/python*/site-packages/:
ln -s /projects/my_py_lib /usr/lib*/python*/site-packages/
That will make the project visible to all Python programs, and any changes will be visible immediately, too.
The main drawback is that you will eventually have *.pyc files owned by root or another user, unless you make sure you compile the files yourself before starting Python as another user.