I am tinkering with some pet projects with Python in Linux (Mint 13) and I plan to do the following:
Create a Dropbox subfolder named "pybin" where I put all my home-made python modules;
Put a symlink to this folder somewhere in the system (first candidate: /usr/lib/python2.7/dist-packages, which is in sys.path, or some similar path);
Then I just do import mymodule from any python session, and the module is imported.
I tried it and it didn't work. I suspect this has to do with the differences between modules and packages, and with __init__.py files, but I confess that every time I read something about this stuff I get pretty confused. Besides learning a bit more about this, all I really want is a way to import my modules as described. It is crucial that the actual folder lives inside Dropbox (or any other file-syncing folder), not in a system folder.
Thanks for any help!
Why not simply set the PYTHONPATH environment variable in your .bash_profile? That way, every time you start a bash shell (which normally happens upon login), this environment variable is set to wherever you keep your user-defined modules. The Python interpreter uses this variable to determine where to search for imported modules:
PYTHONPATH="${PYTHONPATH}:/path/to/some/cool/python/package/:/path/to/another/cool/python/package/"
export PYTHONPATH
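Once a new shell has picked up the variable (or after you source ~/.bash_profile), a quick sanity check from Python might look like this; mymodule stands in for one of the home-made modules from the question:

import sys
print(sys.path)           # the Dropbox "pybin" folder should now appear in this list

import mymodule           # assumed name of one of the home-made modules
print(mymodule.__file__)  # should point into the Dropbox folder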
I am looking for a solution that will allow me to use a py file as a little library.
Short description:
I have a py script and there are a lot of common functions in it.
Instead of using this big file every time, I want to move all the common functions into a separate py file, put it into a Tools folder, and use it whenever I need it.
My problem is that I cannot import this file from Tools, because my script does not see it.
My folder structure:
C:\some\folders\here\my\folder\script.py
C:\some\folders\here\Tools\Library\library.py
Also, it is not good for me to use __init__.py, because I don't have a Python project as such; it is just one file without any other things.
Are there any normal solutions?
The Python interpreter searches for modules in three places:
the directory containing the script being run (or the current directory, for an interactive session), which is where your import statement executes from
the list of directories in the PYTHONPATH environment variable
an installation-dependent list of directories, configured when Python is installed
You can also modify sys.path at runtime and include the directory where your module is located, but this is generally considered the worst option and is usually discouraged.
Setting PYTHONPATH will most likely be a solution in your situation.
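If you do end up going the sys.path route despite the caveat above, a minimal sketch at the top of script.py could look like this; the directory names are taken from the question's layout, and library.py is the file holding the common functions:

# at the top of script.py, before importing the shared code
import os.path
import sys

# assumed layout (from the question):
#   C:\some\folders\here\my\folder\script.py
#   C:\some\folders\here\Tools\Library\library.py
_script_dir = os.path.dirname(os.path.abspath(__file__))
_library_dir = os.path.normpath(os.path.join(_script_dir, '..', '..', 'Tools', 'Library'))
sys.path.insert(0, _library_dir)

import library  # now resolvable via the extra sys.path entry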
There are many similar questions about PYTHONPATH and imports, but I didn't find exactly what I needed.
I have a git repository that contains a few python helper scripts. The scripts are naturally organized in a few packages. Something like:
scripts/main.py
scripts/other_main.py
scripts/__init__.py
a/foo.py
a/bar.py
a/__init__.py
b/foo.py
b/bar.py
b/__init__.py
__init__.py
scripts depends on a and b. I'm using absolute imports in all modules. I run python3 scripts/main.py. Everything works as long as I set PYTHONPATH to the root of my project.
However, I'd like to spare users the hassle of setting up an environment variable.
What would be the right way to go? I expected this to work like in Java, where the current directory is on the classpath by default, but that doesn't seem to be the case. I've also tried relative imports, without success.
EDIT: it seems to work if I remove the top-level __init__.py
Firstly, you're right in that I don't think you need the top-level __init__.py. Removing it doesn't solve any import error for me though.
You won't need to set PYTHONPATH and there are a few alternatives that I can think of:
Use a virtual environment (https://virtualenv.pypa.io/en/latest/). This would also require you to package up your code into an installable package (https://packaging.python.org/). I won't explain this option further since it's not directly related to your question.
Move your modules under your scripts directory. Python automatically adds the script's directory into the Python path.
Modify the sys.path variable in your scripts so they can find your local modules.
The second option is the most straightforward.
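For the second option, the layout would end up looking something like this (same package names as in the question); python3 scripts/main.py then finds a and b because scripts/ is the script's directory and is added to sys.path automatically:

scripts/main.py
scripts/other_main.py
scripts/a/__init__.py
scripts/a/foo.py
scripts/a/bar.py
scripts/b/__init__.py
scripts/b/foo.py
scripts/b/bar.py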
The third option would require you to add some python code to the top of your scripts, above your normal imports. In your main.py it would look like:
#!/usr/bin/env python
import os.path, sys
sys.path.insert(0, os.path.dirname(os.path.dirname(__file__)))
import a
import b
What this does is:
Take the filename of the script
Calculate the parent of the script's directory
Prepend that directory to sys.path
Then do normal imports of your modules
On Windows 7, I currently don't have a PYTHONPATH set. Can I safely create one? If so, how do I do it?
Upon making this variable, I can no longer load Spyder (IDE) without it crashing. Does anyone know why?
I would like to edit my existing Python path if possible, but I just don't know why it isn't already there in the environment variables.
I would ultimately like to be able to run "python myscript.py" and have myscript be in a different directory from the call directory.
PYTHONPATH adds new paths to the ones Python uses by default. The path in total determines where Python will look for modules when you import them.
Look at sys.path to see the combination of the defaults with your PYTHONPATH environment variable. It's likely that Spyder is loading a module that exists in two different places and the wrong one comes first.
When you import a module in Python, Python searches for it in the directories listed in PYTHONPATH, in addition to some other default directories.
In order to run your script as > myscript.py, you want to put the script somewhere on PATH (there are instructions elsewhere for viewing or updating PATH); that is where the OS looks for scripts and programs when you give it a command. I believe that on Windows the .py extension must also be associated with Python for Windows to know that myscript.py should be run using Python. This should happen automatically when Python is installed, but maybe someone with more Windows knowledge can comment on this.
PYTHONPATH has a role similar to PATH. This variable tells the Python interpreter where to
locate the module files imported into a program. It should include the Python source library directory and the directories containing your own Python source code.
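To actually set PYTHONPATH on Windows, something along these lines in a Command Prompt should work; the path is only a placeholder:

:: for the current Command Prompt session only
set PYTHONPATH=C:\path\to\my\modules

:: to persist it for future sessions (takes effect in newly opened windows)
setx PYTHONPATH "C:\path\to\my\modules"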
I have a Django app and I'm getting an error whenever I try to run my code:
Error: No module named django_openid
Let me step back a bit and tell you how I came about this:
I formatted my computer and completely re-installed everything -- including virtualenv and all the packages (in addition to Django) required for my project, based on my requirements.txt file
I tried doing python manage.py syncdb and got the error
I googled the issue, and many people say it could be a path problem.
I'm confused as to how I go about changing the path variables though, and what exactly they mean. I found some documentation, but being somewhat of a hack-ish noob, it kind of goes over my head.
So my questions are:
What exactly is their purpose -- and are they set at the system level, per version of Python, or are they project-dependent?
How can I see what mine are set to currently?
How can I change them? (i.e. where is this .profile file they talk of, and can I just use a text editor?)
Any input you would have would be great as this one is stumping me and I just want to get back to writing code :-)
The path is just the locations in your filesystem in which python will search for the modules you are trying to import. For example, when you run import somemodule, Python will perform a search for somemodule in all the locations contained in the path (sys.path variable).
You should check the path attribute of the sys module:
import sys
print sys.path
It is just a regular list, so you can append/remove elements from it:
sys.path.append('/path/to/some/module/folder/')
If you want to change your path for every interactive Python session you start, you can create a file to be loaded at startup, like so:
Create a PYTHONSTARTUP environment variable and set it to your startup file. E.g.: PYTHONSTARTUP=/home/user/.pythonrc (in a unix shell);
Edit the startup file so it contains the commands you want auto-executed whenever the interpreter starts;
An example of a .pythonrc could be:
import sys
sys.path.append('/path/to/some/folder/')
Do you really need to alter the path? It's always best to actually think about your reasons first. If you're only going to be running a single application on the server or you just don't care about polluting the system packages directory with potentially unnecessary packages, then put everything in the main system site-packages or dist-packages directory. Otherwise, use virtualenv.
The system-level package directory is always on the path. Virtualenv will add its site-packages directory to the path when activated, and Django will add the project directory to the path when activated. There shouldn't be a need to add anything else to the path, and really it's something you should never really have to worry about in practice.
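For reference, if you do go with virtualenv, the usual workflow is roughly the following; the environment name env is just an example:

virtualenv env
source env/bin/activate
pip install -r requirements.txt
python manage.py syncdb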
I have written a small DB access module that is extensively reused in many programs.
My code is stored in a single directory tree /projects for backup and versioning reasons, and so the module should be placed within this directory tree, say at /projects/my_py_lib/dbconn.py.
I want to easily configure Python to automatically search for modules at the /projects/my_py_lib directory structure (of course, __init__.py should be placed within any subdirectory).
What's the best way to do this under Ubuntu?
Thanks,
Adam
You can add a PYTHONPATH environment variable to your .bashrc file, e.g.:
export PYTHONPATH=/projects/my_py_lib
On Linux, this directory will be added to your sys.path automatically for Python N.M:
~/.local/lib/pythonN.M/site-packages/
So you can put your packages in there for each version of Python you are using.
You need a copy for each version of Python; otherwise the .pyc files will be recompiled every time you import the module with a different Python version.
This also allows fine-grained control if the module only works with some of the Python versions you have installed.
If you create this file
~/.local/lib/pythonN.M/site-packages/usercustomize.py
it will be imported each time you start the python interpreter
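As a sketch, a usercustomize.py that pulls in the /projects/my_py_lib tree from the question could be as small as this:

# ~/.local/lib/pythonN.M/site-packages/usercustomize.py
import sys
sys.path.append('/projects/my_py_lib')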
Another option is to create a soft link in /usr/lib*/python*/site-packages/:
ln -s /projects/my_py_lib /usr/lib*/python*/site-packages/
That will make the project visible to all Python programs, and any changes will be visible immediately as well.
The main drawback is that you will eventually have *.pyc files owned by root or another user unless you make sure you compile the files yourself before you start python as another user.
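One way to avoid that, assuming the path from the question, is to byte-compile the tree yourself ahead of time:

python -m compileall /projects/my_py_lib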