Pythonic Ways of Importing Custom Modules?

I've needed to deal with this for some time, but never really figured out what the most pythonic way of importing/setting up PYTHONPATH for custom modules is. I know I can use virtualenv to manage it, I know I can set it inside of scripts, or through pth files, but none of these seem very clean and pythonic to me, so I'm guessing I'm missing something.
Almost always, all custom modules I'm interested in are contained in the git directory I've cloned down that has whatever script I'm running, if that simplifies things.
I'm guessing virtualenv is the answer, but figured I'd ask in case I'm missing anything.
EDIT: To clarify, this is only a question about custom modules. I'm already using pip for modules from PyPI.

You can use pip to install packages that are not on PyPI as well. You just need a URI endpoint and a valid Python package:
Examples:
$ pip install https://github.com/pypa/pip/archive/develop.zip#egg=pip
$ pip install git+https://github.com/pypa/pip.git#egg=pip
$ pip install git+git://github.com/pypa/pip.git#egg=pip
$ pip install /path/to/pip.tar.gz
$ pip install .
Read more at https://pip-installer.org/en/latest/usage.html#pip-install
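These URL forms also work inside a requirements.txt file, so a custom module hosted in a git repository can be pinned right next to your PyPI dependencies (the repository URL, tag, and egg name below are placeholders):
# requirements.txt
requests==2.0.0
git+https://github.com/you/yourmodule.git@v1.0#egg=yourmodule
Then a single pip install -r requirements.txt installs both.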

virtualenv is a good start.
There are also package managers like pip and easy_install that manage third-party modules.
In code you can use:
import sys
sys.path.append('/path/to/customModule')
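Since the question says the custom modules usually live in the same cloned git directory as the script, a path computed relative to the script file is more portable than a hard-coded one. A minimal sketch (the libs subdirectory name is a placeholder):
import os
import sys

# resolve the directory containing this script, then a sibling folder
# in the same checkout that holds the custom modules
script_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(script_dir, 'libs'))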

Virtualenv is the way to go with this.
pip install virtualenv
Then make a folder to setup your environments. Inside that folder:
virtualenv <new_env_name>
That'll create a new folder in that directory; inside it there's a bin folder. Run source on the activate script in that bin folder. You can then use pip install, and it will only install packages for that environment.
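Concretely, the whole workflow looks something like this (environment and package names are placeholders):
$ cd ~/environments              # the folder you made for your environments
$ virtualenv myenv               # creates the myenv folder
$ source myenv/bin/activate      # activates it
(myenv)$ pip install somepackage # installs only into myenv
(myenv)$ deactivate              # leave the environment when done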
If you're cloning a git repo whose code you also want to peruse easily (for example, if you're also working on that repo), clone it into your work dir and then symlink or alias the package folder into the site-packages directory inside that virtualenv's lib directory. Otherwise, if it's packaged correctly, python setup.py install should install it into that virtualenv.

Related

virtualenv and Pycharm: folders, and packages: how does it work?

I created a project in PyCharm. Then I went down to the terminal and typed pip install commands to install my required packages.
The strange thing is that my code only works if the .py file is in the project root directory; it does not work if it is in the 'venv' directory.
I don't know yet how exactly this works, and it seems too specific to just "google it".
Thank you
Your code is not supposed to go there; the venv directory is for your environment only. Move any source files to the root of your project.
First of all, you can read these docs to better understand what virtualenv is:
what is virtualenv
The virtualenv you built can use a different Python version than the one you use in the PyCharm project.
Furthermore, each virtualenv has its own set of libraries. When you ran pip install <lib>, you didn't install the lib into the virtualenv, so it is probably missing there.
Solution:
Activate the virtualenv in a terminal:
source path_to_virtualenv/bin/activate
Install the lib again, this time inside the virtualenv:
pip install <lib>
This should help.
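To verify you're actually inside the virtualenv before installing, a quick check (assuming a Unix-like shell):
$ which python   # should print a path inside path_to_virtualenv/bin
$ pip show <lib> # after installing, should find the library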

What's the standard way to package a python project with dependencies?

I have a python project that has a few dependencies (defined under install_requires in setup.py). My ops people require a package to be self-contained and only depend on a Python installation. The litmus test is that they're able to get a zip file, then unzip and run it without an internet connection.
Is there an easy way to package an install including dependencies? It is acceptable if I have to build on the OS/architecture that it will eventually be run on.
For what it's worth, I've tried both setup.py build and setup.py sdist, but they don't seem to fit the bill since they do not include dependencies. I've also considered virtualenv (which could be installed if absolutely necessary), but that has hard coded paths which makes it less than ideal.
There are a few nuances to how pip works. Unfortunately, using --prefix vendor to store all of the project's dependencies doesn't work if any of those dependencies, or dependencies of dependencies, are already installed somewhere pip can find them: it will skip those and only install the rest into your vendor folder.
In the past I've used virtualenv's --no-site-packages option to solve this issue. At one company we would ship the whole virtualenv, which includes the Python binary. In the interest of shipping only the dependencies, you can combine a virtualenv with pip's --prefix switch to get a clean environment that installs to the right place.
I'll provide an example script that creates a temporary virtualenv, activates it, then installs the dependencies to a local vendor folder. This is handy if you are running in CI.
#!/bin/bash
tempdir=$(mktemp -d -t project.XXX) # create a temporary directory
trap "rm -rf $tempdir" EXIT # ensure it is cleaned up
# create the virtualenv and exclude packages outside of it
virtualenv --python=$(which python2.7) --no-site-packages $tempdir/venv
# activate the virtualenv
source $tempdir/venv/bin/activate
# install the dependencies as above
pip install -r requirements.txt --prefix=vendor
In most cases you should be able to "vendor" all the dependencies. It's basically a crude version of virtualenv.
For example look at how the requests package includes chardet and urllib3 in its own source tree. Here's an example script that should do the initial downloading and copying for you: https://gist.github.com/proppy/1136723
Once you have the dependencies installed, you can reference them with from .some.namespace import dependency_name to make sure that you're using your local versions.
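As a sketch of what that looks like in practice, suppose you copied a dependency into a _vendor subpackage of your project (all names here are hypothetical):
# myapp/core.py
# pull in the vendored copy instead of whatever happens to be on sys.path
from ._vendor import useful_lib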
It's possible to do this with recent versions of pip (I'm using 8.1.2). On the build machine:
pip install -r requirements.txt --prefix vendor
Then run it:
PYTHONPATH=vendor/lib/python2.7/site-packages python yourapp.py
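That also satisfies the litmus test from the question: on the build machine, zip up the app together with the vendor directory, and on the offline target just unzip and run (a sketch; file names are placeholders):
$ zip -r bundle.zip yourapp.py vendor/
# on the target machine, no internet needed:
$ unzip bundle.zip
$ PYTHONPATH=vendor/lib/python2.7/site-packages python yourapp.py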
(This is basically an expansion of @valentjedi's comment. Thanks!)
Let's say you have a Python module app.py with dependencies listed in a requirements.txt file.
First, install all your dependencies into an appdeps folder:
python -m pip install -r requirements.txt --target=./appdeps
Then, in your app.py module, add this dependency folder to the Python path:
# app.py
import sys
sys.path.append('appdeps')
# rest of your module normally
#...
This will work the same way as if you were running the script from a venv with all the dependencies installed inside ;>

Install local dist package into virtualenv

I have a pytest test, let's call it test.py. I used to run this test outside of a virtualenv; now I'm trying to run it inside a virtualenv sandbox.
The project is structured like this:
~/project/test # where test.py and all virtualenv files live
~/project/mylibrary
test.py imports from mylibrary. In the past, this worked because I had the code in ~/project/mylibrary installed into /usr/lib/python2.7/dist-packages/mylibrary.
I can't run virtualenv with the --system-site-packages flag. I also can't move the code from ~/project/mylibrary into the ~/project/test folder. How can I get access to the code in mylibrary inside my virtualenv?
You don't need to do anything special - as long as you are working inside a virtualenv, python setup.py install will automatically install packages into
$VIRTUAL_ENV/lib/python2.7/site-packages
rather than your system-wide
/usr/lib/python2.7/dist-packages
directory.
In general it's better to use pip install mylibrary/, since this way you can neatly uninstall the package using pip uninstall mylibrary.
If you're installing a working copy of some code that you're developing, it might be a good idea to install it in "editable" mode using pip install -e mylibrary/, which creates a link to your source directory so that your installed module gets updated as you edit the code.
The easiest way would be to add the directory containing the library to your sys.path
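With the layout from the question, that would look something like this at the top of test.py (a sketch; if mylibrary is itself the package directory, add ~/project instead):
import os
import sys

# make the library importable without installing it into the virtualenv
sys.path.insert(0, os.path.expanduser('~/project/mylibrary'))

import mylibrary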

How to install python modules in a local directory? --user and exporting pythonpath isn't working

I don't know very much about Python but would like to install some Python modules in a local directory on a server on which I don't have sudo access.
I start by going into my desired directory (not root) and creating the directory tree needed to store my custom modules:
cd /root/example/sub-example
mkdir -p local/lib/python2.7/site-packages
I then add this local path to PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/root/example/sub-example/local/lib/python2.7/site-packages
I then make a new sub-directory to hold the Python package while extracting it:
mkdir example-python-directory
cd example-python-directory
wget http://example-python-package
tar -xvf example-python-package.tar.gz
cd example-python-package
Last, I run the setup.py script with the --user flag to try to get it to install into my specified local directory:
python setup.py install --user
The problem is, nothing is installed in my /root/example/sub-example/local/lib/python2.7/site-packages directory, and instead I find that I now have a new directory at root: /root/.local/lib/python2.7/site-packages
Is there a way to prevent this? I feel like my lack of Python knowledge is causing me to make some silly error that is probably obvious to others. Thanks for the help!
Create a folder called "lib".
For Python 3
pip3 install <your_python_module_name> -t lib/
For Python 2
pip install <your_python_module_name> -t lib/
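Python won't search lib/ on its own, so point the interpreter at it when you run your code (your_script.py is a placeholder):
$ PYTHONPATH=lib/ python your_script.py
Alternatively, add the folder to sys.path at the top of the script, as shown in other answers here.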
I found that on servers where you don't have root access, you can usually install a Python module into your .brew/lib using this:
CFLAGS=-I$(brew --prefix)/include LDFLAGS=-L$(brew --prefix)/lib pip install <package>
virtualenv is what I would recommend for this case (and pretty much any other case). I use it for pretty much everything I do in Python.
It allows you to essentially create a sandbox containing a Python environment that is bootstrapped from a Python install on your machine, and to install any modules you want into it.
It should not, in general, require the use of sudo, since you're not touching the system install. You can generally pip install a module right into the virtualenv, and then you run your scripts out of that virtualenv. You would basically just need a location you can read/write/execute from, say a directory you create in your user's home directory.
You can keep track of what's installed by doing a pip freeze > requirements and checking that into an SCM, and then a new virtualenv can be recreated using that file, ready to run your scripts.
The link I provided above has more details about how to use virtualenv.
Edit in response to comment from OP:
You can still use pip install outside of a virtualenv, and I would recommend that. However, pip can only operate on the Python installs that are actually on the box (invoke pip from the bin directory of the install you want to use).
However, that won't work for installation into arbitrary directories. For that, you could try to unzip the egg file (they are supposed to be zip files) into the directory you want, and then make sure that directory is on the PYTHONPATH. Some egg files are available for direct download off of PyPI, although some are source only.
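For example (filenames and paths are placeholders; an egg is an ordinary zip archive):
$ unzip some_module-1.0-py2.7.egg -d /dir/of/your/choice
$ export PYTHONPATH=$PYTHONPATH:/dir/of/your/choice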
I think this approach is much more complex and prone to problems than virtualenv would be, though.

How to export virtualenv?

I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries in a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. You export the list of all the installed packages, like so:
pip freeze > requirements.txt
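The resulting file is just a list of pinned packages, one per line; hypothetical contents might look like:
Django==1.5.1
South==0.8.1
gunicorn==17.5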
Then push the requirements.txt file to wherever you want to deploy the code, and just do what you did on the dev machine -
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact versions.
You can also look into Fabric to automate this task, with a function like this:
from fabric.api import cd, env, prefix, run

def pip_install():
    # enter the deploy directory and activate its virtualenv before installing
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
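Assuming that function lives in a fabfile.py and env.path points at your deploy directory, you would invoke it with fab pip_install (Fabric 1.x syntax).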
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning: Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
If it is going to live at the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries etc. are available on the target machine, it will work.
