I created a project in PyCharm. Then I went down and typed pip install commands into the terminal to install my required packages.
The strange thing is that my code only works if the .py file is in the project root directory, and it does not work if it is in the 'venv' directory.
I don't know yet how exactly this works, and it seems too specific to just "google it".
Thank you
Your code is not supposed to go there; the venv directory is for your environment only. Move any source files to the root of your project.
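For illustration, a typical layout looks like this (the names are illustrative):
project_root/
    venv/       <- the virtual environment; no source files here
    main.py     <- your .py files belong here, at the project root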
First of all, you can read these docs to better understand what a virtualenv is:
what is virtualenv
The virtualenv you built can use a different Python version than the one your PyCharm project uses.
Furthermore, a virtualenv comes with its own set of libraries. When you ran pip install <lib> in the terminal, you didn't install the lib into the virtualenv, so it is probably missing there.
Solution:
Activate the virtualenv in the terminal:
source path_to_virtualenv/bin/activate
Install the lib again, this time inside the virtualenv:
pip install <lib>
This should help.
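Before installing, you can sanity-check that the virtualenv is actually active (a quick sketch, assuming a POSIX shell; the expected paths depend on where your virtualenv lives):
which python
# should print a path inside path_to_virtualenv
python -c "import sys; print(sys.prefix)"
# should also point at the virtualenv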
I'm writing a plugin for an app, and found that packages I install with pip are apparently being ignored, so I'm trying to install them within my plugin's directory using
pip install --ignore-installed --install-option="--prefix=[plugins-folder]" [package-name]
This creates a folder structure lib/python2.7/site-packages/[dependencies], which is fine by me, except I can't figure out how to then import those dependencies. Even if I manage to import the main package, it will break because it can't find its own dependencies, which are also in that same directory structure.
I suggest that you use a virtual Python environment: virtualenv works for Python 2 and Python 3, while python3 -m venv is Python 3 only.
Your Python environment seems cluttered; you should isolate a separate environment for each Python app.
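A minimal sketch of that setup (the environment name pluginenv is illustrative):
python3 -m venv pluginenv        # or, for Python 2: virtualenv pluginenv
source pluginenv/bin/activate
pip install [package-name]       # now installs into pluginenv's site-packages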
I have a pytest test, let's call it test.py. I used to run this test outside of virtualenv; now I'm trying to run it inside a virtualenv sandbox.
The project is structured like this:
~/project/test # where test.py and all virtualenv files live
~/project/mylibrary
test.py imports from mylibrary. In the past, this worked because I had the code in ~/project/mylibrary installed into /usr/lib/python2.7/dist-packages/mylibrary.
I can't run virtualenv with the --system-site-packages flag. I also can't move the code from ~/project/mylibrary into the ~/project/test folder. How can I get access to the code in mylibrary inside my virtualenv?
You don't need to do anything special - as long as you are working inside a virtualenv, python setup.py install will automatically install packages into
$VIRTUAL_ENV/lib/python2.7/site-packages
rather than your system-wide
/usr/lib/python2.7/dist-packages
directory.
In general it's better to use pip install mylibrary/, since this way you can neatly uninstall the package using pip uninstall mylibrary.
If you're installing a working copy of some code that you're developing, it might be a good idea to install it in "editable" mode using pip install -e mylibrary/, which creates a link to your source directory so that your installed module gets updated as you edit the code.
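A minimal sketch of that workflow, assuming the layout from the question, that the virtualenv lives in ~/project/test, and that mylibrary has a setup.py:
source ~/project/test/bin/activate      # activate the virtualenv
pip install -e ~/project/mylibrary      # editable install; code edits take effect immediately
python -c "import mylibrary"            # should now resolve inside the virtualenv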
The easiest way would be to add the directory containing the library to your sys.path.
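For example, at the top of test.py (a minimal sketch, assuming ~/project/mylibrary is itself the importable package):
import os
import sys
# make the directory that contains the mylibrary package importable
sys.path.insert(0, os.path.expanduser('~/project'))
import mylibrary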
I've needed to deal with this for some time, but never really figured out what the most pythonic way of importing/setting up PYTHONPATH for custom modules is. I know I can use virtualenv to manage it, I know I can set it inside of scripts, or through pth files, but none of these seem very clean and pythonic to me, so I'm guessing I'm missing something.
Almost always, all custom modules I'm interested in are contained in the git directory I've cloned down that has whatever script I'm running, if that simplifies things.
I'm guessing virtualenv is the answer, but figured I'd ask in case I'm missing anything.
EDIT: To clarify, this is only a question about custom modules. I'm already using pip for modules from PyPI.
You can also use pip to install packages that are not on PyPI. You just need a URI endpoint and a valid Python package:
Examples:
$ pip install https://github.com/pypa/pip/archive/develop.zip#egg=pip
$ pip install git+https://github.com/pypa/pip.git#egg=pip
$ pip install git+git://github.com/pypa/pip.git#egg=pip
$ pip install /path/to/pip.tar.gz
$ pip install .
Read more at https://pip-installer.org/en/latest/usage.html#pip-install
virtualenv is a good start.
There are also package managers like pip and easy_install that manage third party modules.
In code you can use:
import sys
sys.path.append('/path/to/customModule')  # make modules in this directory importable
Virtualenv is the way to go with this.
pip install virtualenv
Then make a folder to setup your environments. Inside that folder:
virtualenv <new_env_name>
That will create a new folder in that directory. Inside it there is a bin folder; run source <new_env_name>/bin/activate from there to activate the environment. You can then run pip install, and it will only install packages into that environment.
If you're cloning a git repo whose code you also want to browse easily (for example, if you're also working on that repo), clone it into your working directory and then symlink or alias the package folder into the site-packages directory inside that virtualenv's lib directory. Otherwise, if the repo is packaged correctly, running python setup.py install should install it properly for that virtualenv.
I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries in a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. Instead, you export the list of all installed packages:
pip freeze > requirements.txt
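requirements.txt is just a plain list of pinned packages, one per line, for example (the contents here are illustrative):
Django==1.5.1
psycopg2==2.5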
Then push the requirements.txt file to wherever you want to deploy the code, and do what you did on the dev machine:
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with their exact versions.
You can also look into Fabric to automate this task, with a function like this:
from fabric.api import cd, env, prefix, run

def pip_install():
    # assumes env.path points at the project directory on the remote host
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
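You would then run it with fab pip_install (Fabric 1.x syntax; this assumes env.path is set in your fabfile and that the virtualenv on the server is named venv).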
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning
Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
If it is going to live at the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries, etc. are available on the target machine, it will work.
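A minimal sketch of that approach (paths are illustrative; the extracted location must match the original path exactly):
tar czf venv.tar.gz path_to_virtualenv
# copy venv.tar.gz to the target machine, then extract it at the same parent path:
tar xzf venv.tar.gz -C /same/parent/directory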
For a number of reasons, such as when a package takes a long time to compile (lxml, for example), it is often recommended to symlink such packages from the system site-packages directory into a virtualenv.
Some example questions:
Use a single site package (as exception) for a virtualenv
How to install lxml into virtualenv from the local system?
But such packages are not recognized by pip, which will happily try to reinstall them. How can I deal with this?
Okay, it seems the trick is to also symlink the package's egg-info directory.
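A minimal sketch for lxml (the paths and the version in the egg-info name are illustrative; adjust them for your system):
cd $VIRTUAL_ENV/lib/python2.7/site-packages
ln -s /usr/lib/python2.7/dist-packages/lxml .
ln -s /usr/lib/python2.7/dist-packages/lxml-3.3.5.egg-info .
# pip reads the egg-info directory, so it now sees lxml as installed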