virtualenv module name conflict - python

I have a very basic virtualenv setup on OS X with two packages, py3dns and dnspython. Both packages provide a dns module (actually one is DNS and the other dns, but on OS X's case-insensitive filesystem they are the same), and both modules contain a file named opcode.py.
Virtualenv is installing both of these modules directly into lib/python3.5/site-packages/, and not into lib/python3.5/site-packages/#{package}, so the two opcode.py files overwrite each other.
Is this a bug (feature?) of virtualenv? Is there some way around this? Unfortunately I don't have the option of using only one of these dns packages, as they are both imported indirectly via other packages that I need in my project. Any advice would be appreciated.

You can specify a different target path: https://pip.pypa.io/en/latest/reference/pip_install/#cmdoption-t
You can install the second package like this:
pip3 install dnspython --target /Users/green/dns
and then create a file dns.pth in site-packages containing the path to the package:
/Users/green/dns
and in your code:
from dns.opcode import from_text
from DNS.Opcode import opcodemap
print("ok")

Related

Python Module Installation from Folder

I downloaded a project and am running through and installing all dependencies. At first I had errors regarding No module named utm and No module named paho. I solved these issues by going to C:\Users\me and using pip install paho-mqtt and pip install utm. Easy enough.
I then have this line "from mfa_msgs import Mission, WaypointList, MissionControl, ControlCommand, Status" and am getting a No module named error here. mfa_msgs is a folder in the project I downloaded that contains the Mission, WaypointList, etc. files. Where do I need to put the mfa_msgs folder in order to be able to access them?
Appreciate your time!
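A minimal sketch of one common approach, assuming mfa_msgs sits in the downloaded project's root next to the script doing the import, is to put that directory on sys.path first:
import os
import sys

# Assumes mfa_msgs is a folder in the same directory as this script;
# adjust the path if the project keeps it elsewhere. On Python 2 the
# folder would also need an __init__.py to be importable.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

from mfa_msgs import Mission, WaypointList, MissionControl, ControlCommand, Status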

Jupyter-notebook returns "module not found" for every single import, despite them being downloaded

For example, import trackpy returns the module not found error. I have already confirmed that trackpy has been downloaded somewhere on my computer, because attempting to install it again via conda install -c soft-matter trackpy eventually returns something to the effect of "all files already installed". This seems to occur for every "external import" (numpy, scipy, matplotlib), i.e. one that was downloaded from the internet. It does not happen for "internal imports" (sys, os). I believe this is just a matter of jupyter not looking for the files in the correct place, but I don't know how to fix something like this.
Edit: Relevant info: I ran
import sys
sys.executable
which returns 'c:\\users\\reese\\miniconda3\\python.exe'. In the pkgs folder for miniconda3, there are none of the imports that I want. However in 'c:\\users\\reese\\Anaconda\\pkgs' are all the imports, trackpy and all else. Is there an easy way to make jupyter check here for imports? I already tried straight up copying the entire pkgs folder and pasting it in miniconda3's pkgs folder, but it did not work.
I would propose two solutions.
Okay Solution:
Yes, you can add the path to your other packages with sys.path:
import sys
sys.path.insert(0,'PATH_TO_YOUR_OTHER_PACKAGES')
import Packages_of_another_path
By inserting it at index zero, you ensure that your other packages get first priority in case there is another package with the same name.
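Applied to the paths from the question, that would look something like this (a sketch; the directory that actually holds importable packages is the environment's Lib\site-packages, not the pkgs folder, which is only conda's download cache):
import sys

# Assumed location of the Anaconda install's importable packages.
sys.path.insert(0, r'c:\users\reese\Anaconda\Lib\site-packages')

import trackpy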
Better Solution (recommended):
Always use environments. E.g.
conda create --name your_env python=3.6 pip
conda activate your_env
conda install packages1 packages2
pip install package3
In this environment you can keep all your things together.
Whenever you want to use your packages, activate the environment and start hacking ;)
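If the notebook still launches with the miniconda interpreter after that, registering the new environment as a Jupyter kernel usually makes it selectable from the notebook's Kernel menu (a sketch, assuming ipykernel is installed into the environment; your_env is the example name from above):
conda activate your_env
conda install ipykernel
python -m ipykernel install --user --name your_env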

Python Installing data files and then finding them again

I have a python package targeted at linux machines that needs to install its locale files to an accessible location. Right now, I have them being installed to sys.prefix + "/share/locale/".
However, I found a small caveat with Ubuntu and pip. Under default conditions, Ubuntu installs packages installed with pip to /usr/local and sets sys.prefix to that during installation. However, after installation, when the package is run, the prefix is /usr, meaning my code can't find the locale files installed at /usr/local.
I could simply hardcode the location, but I would prefer not to do this, as it makes the package less portable and would require the user to install it as root. These are added as data_files in setup.py and won't be discoverable as a python package.
How else can I ensure my package can find the locale files after installation?
I thought about adding a line to the package's __init__.py during installation that would create a variable pointing to the locale dir's location. However, it did not seem trivial to edit files being installed without changing the source files.
This is a python 3 only package.
Maybe use the resource functions available in pkg_resources to find the files?
from pkg_resources import resource_stream, resource_filename

# Read the file as a stream ...
with resource_stream('my_package', 'locale/foo.dat') as infp:
    data = infp.read()

# ... or get a filesystem path to it.
foo_location = resource_filename('my_package', 'locale/foo.dat')
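Note that resource_filename can only resolve files that ship inside the package itself, so the locale files would have to move from data_files to package data; a minimal setup.py sketch under that assumption (my_package and the locale/ subdirectory are the example names from above):
from setuptools import setup, find_packages

setup(
    name='my_package',
    packages=find_packages(),
    # Bundle the locale files inside the package instead of using data_files,
    # so pkg_resources can resolve them after installation.
    # Adjust the glob if the locale directory has nested subdirectories.
    package_data={'my_package': ['locale/*']},
)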

How to package a module with my python script

I just wrote a python script to fix some database issues and it uses the psycopg2 module. For some reason the person that needs to run it (on the server) is claiming that they can't install psycopg2 on their server machine... is there a way that I can package that module within my script such that they don't have to have psycopg2 installed? Something similar to adding a lib folder to the classpath in Java?
Thx in advance,
Andre
Make a directory, put your script into it.
cd into that directory, then run:
$ easy_install -Z --prefix . psycopg2
That should result in a copy of the python module in the same directory.
Zip it all up and send the whole thing off.
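If easy_install is not available, pip has a comparable option (a sketch; run it from the same directory):
pip install --target . psycopg2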
As to being unable to install psycopg2, I suspect it relates to security policies about who can have root access on database servers. If you can't find someone with the necessary permissions, there are two ways to skin this particular cat:
1) You could build an exe using py2exe; that way you know that all of the dependencies for your scripts are there and available. This will also allow you to avoid dealing with any python incompatibilities. You made no mention of your python version (2.5, 2.6, 2.7, 3.1?) nor the version on the server.
2) Or you could place the psycopg2 module into its own directory (c:\mypymods) and add the path to sys.path before importing.
import sys
sys.path.append("c:/mypymods")  # append modifies sys.path in place; don't reassign it
import psycopg2
# ...

How do you correct Module already loaded UserWarnings in Python?

Getting the following kinds of warnings when running most python scripts in the command line:
/Library/Python/2.6/site-packages/virtualenvwrapper/hook_loader.py:16: UserWarning: Module
pkg_resources was already imported from /System/Library/Frameworks/Python.framework/Versions/2.6/Extras/lib/python/pkg_resources.pyc, but /Library/Python/2.6/site-packages is being added to sys.path
import pkg_resources
/Library/Python/2.6/site-packages/virtualenvwrapper/hook_loader.py:16: UserWarning: Module site was already imported from /System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site.pyc, but /Library/Python/2.6/site-packages is being added to sys.path
import pkg_resources
I think it has to do with a combination of using distribute and virtualenv, but I wanted to check if anyone else has run into this or knows how to go about fixing it.
Perhaps use the virtualenv option --no-site-packages so you won't see any system site-packages within your virtual environment. Having items installed both in your virtualenv and on the system root may be the cause of this issue.
Using --no-site-packages when creating your virtualenv prevents any conflicts with system packages. I almost always use that option when creating a new virtualenv to avoid such conflicts. Though I may have several copies of libraries, at least they don't mess with each other.
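For reference, the flag goes on the creation command (a sketch; myenv is just a placeholder name, and in recent virtualenv releases this isolation is the default):
virtualenv --no-site-packages myenv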
The python equivalent of putting a bit of electrical tape over the check engine light would be to use the -W command line flag or to add a warning filter.
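A sketch of the warning-filter approach, assuming you only want to silence these particular UserWarnings rather than all warnings:
import warnings

# Ignore only the "Module ... was already imported" messages.
warnings.filterwarnings(
    "ignore",
    message="Module .* was already imported",
    category=UserWarning,
)
The command-line equivalent, python -W ignore::UserWarning script.py, silences all UserWarnings rather than just these.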
In my case, reinstalling things did not help. There were some orphaned .pyc files (specifically pkg_resources.pyc) left in /System/Library/Frameworks/Python.framework/Versions/2.6/Extras/lib/python
sudo find . -type f -name "*.pyc" -delete
made it work. This link helped me to track down the problem.
I had this sort of Python packaging hell visit today too.
I was running Python 2.7.3 on Ubuntu, using namespace packages and zc.buildout.
Finally, updating the system-wide Distribute from the older version 0.6.30 to the latest version 0.6.35 resolved the problem.
If the warning shows in a program you are modifying, try it this way (example with pytz):
try:
    import pytz
except ImportError:
    from pkg_resources import require
    require('pytz')
    import pytz  # the distribution is now activated on sys.path, so this import can succeed
