Copying modules into Django, "No module named [moduleName]" - python

I run into this problem pretty consistently... keep in mind I am quite new to Django and a total Python amateur.
It seems that, for example, whenever I check out my Django project on a new computer after a clean install of Python and Django, it can never find the project/apps I create or copy in.
So right now I have an app that is working; I downloaded a third-party Django module, installed it into my app directory, and included it in my settings, but the web server quits because it cannot find the module.
This is the first time I've imported a third-party module. In the past, when it couldn't find modules I created, I would just rename the folder, run "manage.py startapp appname", delete the folder it created, and rename my original folder back. Boom, problem solved...
But that's obviously a hack, so I am wondering if anyone can explain what the heck is going on here and how best to approach it.
I can't be the only one who has run into this, but I couldn't find any other questions on this site that seemed to match my issue.
Happens on both OS X and Windows 7.

The way Django works is pretty much how Python works. By default, the folder you create when you run django-admin.py startproject name is added to your Python path. That means that anything you put in there can be imported. But you have to keep that in mind when you write the app into the installed apps list: if you have an app at project/apps/appname, you would have to write 'apps.appname' in the installed apps list.
Now there are some ways to go about adding 3rd-party apps located somewhere else to your project. You can either install them onto your Python path, or make a link to them from a directory that is already on your Python path. However, you can also add a sys.path.insert(...) call in your manage.py file, where you add the folder of your liking to your Python path. Doing this will add folders to your Python path for that project only, and will keep your Python path cleaner. A sketch of what that might look like is shown below.
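For illustration only, here is a minimal sketch of the sys.path.insert(...) approach in manage.py; the "apps" and "third_party" folder names are assumptions, not something from the original answer:
import os
import sys

# Directory containing manage.py, i.e. the project root.
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
# Make project_root/apps and project_root/third_party importable for this project only.
sys.path.insert(0, os.path.join(PROJECT_ROOT, 'apps'))
sys.path.insert(0, os.path.join(PROJECT_ROOT, 'third_party'))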

Your third-party Django module should be findable via PYTHONPATH, because a Django module is nothing other than a Python module. Now there are two ways to do this:
Create a folder (anywhere you want) and put your third-party Django module under it. Then add that directory to the $PYTHONPATH environment variable,
e.g. (on a Linux box):
export PYTHONPATH=/home/me/pythonmodules/
Create a folder (anywhere you want) and put the third-party Django module under it. Then, if you are on a Unix box, create a symlink to that directory inside your Python site-packages. Use this command to find out where your Python site-packages is:
python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()"

Related

Adding Path in GIT for KIVY

Create a file named 'py.ini' and place it either in your user's application data directory, or in 'C:\Windows'. It will contain the path used to start Kivy.
I put my Kivy installation at 'C:\utils\kivy'
so my copy says:
[commands]
kivy="c:\utils\kivy\kivy.bat"
(You could also add commands to start other script interpreters, such as jython or IronPython.)
so my question is:
What commands are supposed to be used in Git Bash to add this path into a variable "kivy"?
Or is it even supposed to be a variable?
And in Git Bash, to get the script working, it uses "source /c/.../Kivy-1.8.0-py2.7-win32/kivyenv.sh"
But to add the path, they said to use "C:...\kivy.bat".
Why does "/c/" change to "C:",
and why is it 'kivy.bat' and not 'kivyenv.sh'?
Thank you.
This value is not typically something you would include in a git repo because it is specific to your system. Other people using your repo may have Kivy installed somewhere else, or be on an entirely different OS where that path does not exist.

Simple way to import python modules in Linux using symlinks

I am tinkering with some pet projects with Python in Linux (Mint 13) and I plan to do the following:
Create a Dropbox subfolder named "pybin" where I put all my home-made python modules;
Put a symlink to this folder somewhere in the system (first candidate: /usr/lib/python2.7/dist-packages, which is in sys.path, or some similar path);
Then I just do import mymodule from any python session, and the module is imported.
I tried it and it didn't work. I suspect this has to do with differences between modules and packages, and __init__.py files, but I confess that every time I read something about this stuff I get pretty confused. Besides learning a bit more about this, all I really want to do is find a way to import my modules in the way described above. It is crucial that the actual folder is inside Dropbox (or any other file-syncing folder), not in a system folder.
Thanks for any help!
Why not simply set the PYTHONPATH environment variable in your .bash_profile? That way, every time you start a bash shell (which normally happens upon login), this environment variable will be set to wherever you place your user-defined modules. The Python interpreter uses this variable to determine where to search for module imports:
PYTHONPATH="${PYTHONPATH}:/path/to/some/cool/python/package/:/path/to/another/cool/python/package/"
export PYTHONPATH
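On the module-vs-package confusion: if the pybin folder itself is on the path, a single file such as pybin/mymodule.py is importable as a plain module with no __init__.py at all; __init__.py is only needed to treat a subfolder of pybin as a package. A quick check from a new shell (the Dropbox path and "mymodule" name below are just examples based on the question):
import sys
print [p for p in sys.path if 'pybin' in p]   # the Dropbox "pybin" folder should appear here
import mymodule                                # works once pybin/mymodule.py exists and PYTHONPATH is set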

Setting python path

I have a Django app and I'm getting an error whenever I try to run my code:
Error: No module named django_openid
Let me step back a bit and tell you how I came about this:
I formatted my computer and completely re-installed everything -- including virtualenv, and all dependent packages (in addition to Django) required for my project based on my requirements.txt file
I tried doing python manage.py syncdb and got the error
I googled the issue, and many people say it could be a path problem.
I'm confused as to how I go about changing the path variables though, and what exactly they mean. I found some documentation, but being somewhat of a hack-ish noob, it kind of goes over my head.
So my questions are:
What exactly is their purpose -- and are they set at the system level based on the version of Python, or are they project-dependent?
How can I see what mine are set to currently?
How can I change them (i.e. where is this .profile file they talk of, and can I just use a text editor)?
Any input you would have would be great as this one is stumping me and I just want to get back to writing code :-)
The path is just the locations in your filesystem in which python will search for the modules you are trying to import. For example, when you run import somemodule, Python will perform a search for somemodule in all the locations contained in the path (sys.path variable).
You should check the path attribute in the sys module:
import sys
print sys.path
It is just a regular list, so you can append/remove elements from it:
sys.path.append('/path/to/some/module/folder/')
If you want to change your path for every Python session you start, you should create a file to be loaded at startup, like so:
Create a PYTHONSTARTUP environment variable and set it to your startup file. E.g.: PYTHONSTARTUP=/home/user/.pythonrc (in a unix shell);
Edit the startup file so it contains the commands you want to be auto-executed when python is loaded;
An example of a .pythonrc could be:
import sys
sys.path.append('/path/to/some/folder/')
Do you really need to alter the path? It's always best to actually think about your reasons first. If you're only going to be running a single application on the server or you just don't care about polluting the system packages directory with potentially unnecessary packages, then put everything in the main system site-packages or dist-packages directory. Otherwise, use virtualenv.
The system-level package directory is always on the path. Virtualenv will add its site-packages directory to the path when activated, and Django will add the project directory to the path when activated. There shouldn't be a need to add anything else to the path, and really it's something you should never really have to worry about in practice.

Python path: Reusing Python module

I have written a small DB access module that is extensively reused in many programs.
My code is stored in a single directory tree /projects for backup and versioning reasons, and so the module should be placed within this directory tree, say at /projects/my_py_lib/dbconn.py.
I want to easily configure Python to automatically search for modules at the /projects/my_py_lib directory structure (of course, __init__.py should be placed within any subdirectory).
What's the best way to do this under Ubuntu?
Thanks,
Adam
You can add a PYTHONPATH environment variable to your .bashrc file, e.g.
export PYTHONPATH=/projects/my_py_lib
On Linux, this directory will be added to your sys.path automatically for pythonN.M:
~/.local/lib/pythonN.M/site-packages/
So you can put your packages in there for each version of python you are using.
You need a copy for each version of Python; otherwise the .pyc file will be recompiled every time you import the module with a different Python version. This also allows fine-grained control if the module only works for some of the versions of Python you have installed.
If you create this file
~/.local/lib/pythonN.M/site-packages/usercustomize.py
it will be imported each time you start the python interpreter
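As a hedged example, a usercustomize.py that extends the search path might look like this (the /projects/my_py_lib path is the one from the question, and user site-packages must be enabled, which is the default):
# ~/.local/lib/pythonN.M/site-packages/usercustomize.py
# Imported automatically by the site module at interpreter startup.
import sys
sys.path.append('/projects/my_py_lib')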
Another option is to create a soft link in /usr/lib*/python*/site-packages/:
ln -s /projects/my_py_lib /usr/lib*/python*/site-packages/
That will make the project visible to all Python programs plus any changes will be visible immediately, too.
The main drawback is that you will eventually have *.pyc files owned by root or another user unless you make sure you compile the files yourself before you start python as another user.
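One way to sidestep that, sketched here under the assumption that you run it as the user who owns the files, is to byte-compile the tree yourself after each change:
# Pre-compile the shared modules so the .pyc files are created with the right ownership,
# rather than by root (or whoever happens to import the modules first).
import compileall
compileall.compile_dir('/projects/my_py_lib')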

Managing Python Path When Moving Code from Development Computer to Target

I have a python project with this directory structure and these files:
/home/project_root
|---__init__.py
|---setup
|   |---__init__.py
|   |---configs.py
|---test_code
|   |---__init__.py
|   |---tester.py
The tester script imports from setup/configs.py with the reference "setup.configs", and it runs fine on my development (Linux) computer. When I move this to another (Linux) computer, I set the PYTHONPATH with
PYTHONPATH = "/home/project_root"
But when I run tester.py, it can't find the configs module. And when I run the interactive Python interpreter, sys.path doesn't include the /home/project_root directory. But /home/project_root does appear when I echo $PYTHONPATH.
What am I doing wrong here?
(I don't want to rely on the .bashrc file to set the PYTHONPATH for the target machine -- the code is for a Django application, and will eventually be run by www-data. And, I know that the apache configuration for Django includes a specification of the PYTHONPATH, but I don't want to use that here as I'm first trying to make sure the code passes its unit tests in the target machine environment.)
CURIOUSER AND CURIOUSER
This seems to be a userid and permissions problem.
- When launched by a command from an ordinary user, the interpreter can import modules as expected.
- When launched by sudo (I'm running Ubuntu here), the interpreter cannot import modules as expected.
- I've been calling the test script with sudo, as the files are owned by www-data (b/c they'll be called by the user running apache as part of the Django application).
- After changing the files' ownership to that of an ordinary user, the test script does run without import errors (albeit, into all sorts of userid related walls).
Sorry to waste your time. This question should be closed.
Stick this in the tester script right before the import setup.configs
import sys
import os
sys.path.insert(0, os.path.join(os.path.dirname(__file__), os.path.pardir))
sys.path is a list of all the directories the python interpreter looks for when importing a python module.
This will add the parent directory which contains setup module to the beginning of that list which means that the local directory will be checked first. That is important if you have your module installed system wide. More info on that here: sys doc.
EDIT: You could also put a .pth file in /usr/local/lib/python2.X/site-packages/. A .pth file is simply a text file with a directory path on each line that the Python interpreter will add to its search path. So just add a file with this line in it:
/home/project_root
Try explicitly setting your python path in your scripts. If you don't want to have to change it, you could always add something like "../" to the path in tester. That is to say:
sys.path.append("../")
(I don't want to rely on the .bashrc file to set the PYTHONPATH for the target machine -- the code is for a Django application, and will eventually be run by www-data. And, I know that the apache configuration for Django includes a specification of the PYTHONPATH, but I don't want to use that here as I'm first trying to make sure the code passes its unit tests in the target machine environment.)
If the code is for a Django application, is there a reason you're not testing it in the context of a Django project? Testing it in the context of a Django project gives a couple benefits:
Django's manage.py will set up your Python environment for you. It'll add the appropriate project paths to sys.path, and it'll set the environment variable DJANGO_SETTINGS_MODULE correctly.
Django's libraries include ample unit testing facilities, and you can easily extend that functionality to include your own testing facilities. Executing tests in a Django project is as easy as executing a single command via manage.py.
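For instance, a minimal sketch of a test living inside a Django app, which could then be run with python manage.py test (the "myapp" name is made up purely for illustration):
# myapp/tests.py -- "myapp" is a hypothetical app name, shown only to illustrate the workflow.
from django.test import TestCase
from setup import configs   # resolvable because manage.py puts the project root on sys.path

class ConfigsImportTest(TestCase):
    def test_configs_is_importable(self):
        # If this test runs at all, setup.configs was found on the path.
        self.assertTrue(hasattr(configs, '__file__'))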
