Creating "virtualenv" for an existing project - python

I have a Python project which I've been working on. Now I've realized that I need a virtual environment for it. How can I create one for an existing project? If I do this:
virtualenv venv
will it work fine? Or do I have to re-create my project, create virtualenv and then copy the existing files to it?

You can just create a virtual environment with virtualenv venv and activate it with source venv/bin/activate.
You will need to reinstall all dependencies using pip, but the rest should just work fine.
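A minimal sketch of that workflow, run from the project root (the package names at the end are placeholders for whatever your project actually imports):
cd /path/to/existing/project
virtualenv venv
source venv/bin/activate    # on Windows: venv\Scripts\activate
pip install requests flask  # placeholders; reinstall whatever your project needs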

The key thing is creating requirements.txt.
Create a virtualenv as normal. Do not activate it yet.
Now you need to work out the required packages. If you do not readily remember them, ask pip:
pip freeze > requirements.txt
Now edit requirements.txt so that only the packages you know you installed are included. Note that the list will include all dependencies for all installed packages. Remove them, unless you want to explicitly pin their versions, and know what you're doing.
Now activate the virtualenv (the normal source path/to/virtualenv/bin/activate).
Install the dependencies you've collected:
pip install -r requirements.txt
The dependencies will be installed into your virtualenv.
The same way you'll be able to re-create the same env on your deployment target.
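Putting those steps together, a rough end-to-end sequence could look like this (the venv path is a placeholder):
virtualenv /path/to/venv            # create it, but do not activate yet
pip freeze > requirements.txt       # still the system pip, so it lists the globally installed packages
# edit requirements.txt, keeping only the packages you know you installed
source /path/to/venv/bin/activate
pip install -r requirements.txt     # now installs into the virtualenv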

If you are using Windows, follow this procedure:
Step 1: Go to the root directory of your existing Python project
Step 2: Create a virtual environment with virtualenv venv
Step 3: Go to venv\Scripts and run the activate command
Step 4: If you want to install all the required libraries, run pip3 install -r requirements.txt
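As a rough sketch, the whole procedure in a Windows command prompt (the project path is a placeholder):
cd C:\path\to\your\project
virtualenv venv
venv\Scripts\activate
pip3 install -r requirements.txt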

There is something I would like to add here, because newbies often run into this problem, and even I make this kind of mistake once in a while.
If you do not have a requirements.txt for an already existing Python project, you are in for some pain. Set aside at least 2-3 hours of your day to reconstruct it.
The best way to see it through, if you are lucky, is to remember the most important package from that project. By most important, I mean the package with the heaviest dependencies.
Once you have located this highest-dependency package, install it via pip. This will install all of its dependencies as well.
Now see if the project works. If it does, voila!
If it doesn't, you will have to find where the conflicts are and resolve them one by one.
I was in such a situation recently, where there was no requirements.txt. But I knew the highest dependency was a deep-learning package called Sentence-Transformers, so I installed it and, with minor conflicts, resolved everything.
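For example, the recovery in that case looked roughly like this (sentence-transformers was my project's heaviest dependency, and run_app.py is a stand-in for your own entry point):
pip install sentence-transformers
python run_app.py                  # see if the project runs; fix remaining ImportErrors one by one
pip freeze > requirements.txt      # once it runs, record what you ended up with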
Best of luck! Let me know if this ever helps anyone!

Related

Can virtualenv include necessary project packages from site-packages

Running command line:
virtualenv --system-site-packages venv
I'm expecting the venv folder venv\Lib\site-packages to contain all the necessary libraries for the project, which are located in:
C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
But that's not the case; only a few are installed.
For example, my program currently uses pdfminer, which is in
C:\Users\XXXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
I want it to be included in venv\Lib\site-packages but it is not copied.
Any advice?
--system-site-packages doesn't copy packages, it just allows python from the virtualenv to access packages in C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\.
There is no way to copy packages because they could depend on their installation directory. If you want these packages in the virtualenv don't use --system-site-packages and install all packages in the virtualenv.
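A sketch of that approach, using pdfminer from the question as the example package (add whatever else your project imports):
virtualenv venv
venv\Scripts\activate
pip install pdfminer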
A virtualenv environment is the same as if you have just installed a new version of Python. It has no packages other than the standard packages provided with Python. If you want other packages, you have to install them with 'pip' or however you'd do it with the native Python version that you are using.
So in general, just do pip install <packagename>.
If you find yourself often wanting to create virtualenvs with a standard set of base packages, then put together a requirements.txt file listing all of the packages you want to install as a base, and do pip install -r requirements.txt inside a new virtualenv, right after you create it.
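For instance, a hypothetical base requirements.txt could list something like (these packages are just examples):
requests
numpy
pytest
and then, right after creating a new virtualenv:
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt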
One nice thing about a virtualenv is that it's all yours. Your user owns 100% of it, unlike the base Python version that is owned by the system. To install new packages into the base Python version, you often have to have root access (sudo privileges). With virtualenvs, you don't need special permissions (in fact, you'll get all screwed up if you use sudo in a virtualenv) to install all the packages you want. Everything you do lives within your own home directory. Another neat thing is that when you are done with a virtualenv, you just throw away the root directory that contains it.
If it's not mandatory to use virtualenv, I would suggest going with Anaconda. That should pretty much cover your concern.
Conda as a package manager helps you find and install packages. By default quite a few packages are already installed, to set you up quickly for your project. To see the list of packages installed with conda, type conda list in a terminal.
If you need a package that requires a different version of Python, you do not need to switch to a different environment manager, because conda is also an environment manager.
With just a few commands, you can set up a totally separate environment to run that different version of Python, while continuing to run your usual version of Python in your normal environment.
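For example, setting up a separate environment with a different Python version might look roughly like this (the environment name and version are arbitrary):
conda create -n py36env python=3.6
conda activate py36env              # on older conda versions: source activate py36env
conda install numpy                 # installs packages into just this environment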

Is there any way to output requirements.txt automatically?

I want to output the requirements.txt for my Python 3 project in PyCharm. Any ideas?
Try the following command:
pip freeze > requirements.txt
Pigar works quite well; I just tested it:
https://github.com/damnever/pigar
The answer above with pip freeze will only work properly if you have set up a virtualenv before you started installing stuff with pip. Otherwise you will end up with requirements which are "surplus to requirements". It seems that pigar goes and looks at all your import statements and also looks up the versions currently being used. Anyway, if you have the virtualenv set up before you start it will all be cleaner, otherwise pigar can save you. It looks in subdirectories too.
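A rough usage sketch; note that the exact subcommand depends on the pigar version you install (newer releases use a generate subcommand, older ones generate requirements.txt when run with no arguments):
pip install pigar
pigar generate      # run in the project root; on older pigar versions just run `pigar`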
Open the terminal in PyCharm and type in this command:
pip freeze > requirements.txt
and the requirements.txt file will be created automatically.
This post is a bit old, but I'll still contribute what I've learned. As far as I know, there are three ways to generate requirements.txt:
using FREEZE
pip freeze > requirements.txt
Before running the command, be sure the virtual environment is activated and run it from the project folder. A requirements.txt file with the Python dependencies will be generated in the same folder where the command is executed.
If you use this command, all requirements are collected from the virtual environment. If you need only the requirements used in the current project's scope, check the next options.
using DEPHELL
pip install --user dephell
using PIPREQS
pip install pipreqs
pipreqs /path/to/project
Note
Why use pipreqs? Because pip freeze collects all dependencies from the environment, while pipreqs collects only the requirements used in the current project.
Also, pip freeze only records packages that were installed with pip install, and it records every package in the environment.
If you'd like to generate requirements.txt without installing the modules, use pipreqs.
If there are other ways to do it, I'm always grateful to keep learning :)
You can do it in PyCharm by going to Settings and Project Interpreter. Select all the packages with their versions, then copy all this data into an MS Word document; Word will treat it as a table. Delete the middle column of this table. Now copy the data into Notepad++, search for double spaces or a tab, and replace them with '=='. Save this file as requirements.txt. It will work.

How to handle python dependencies throughout the project?

Let's say a developer is working on a project when he realizes he needs to use some package.
He uses pip to install it. Now, after installing it, would the developer write it down as a dependency in the requirements file / setup.py?
What does that same dev do if he forgot to write down all the dependencies of the project (or if he didn't know better, since he hasn't been doing it long)?
What I'm asking is: what's the workflow when working with external packages from PyPI?
The command:
pip freeze > requirements.txt
will copy all of the dependencies currently in your python environment into requirements.txt. http://pip.readthedocs.org/en/latest/reference/pip_freeze.html
It depends on the project.
If you're working on a library, you'll want to put your dependencies in setup.py so that if you're putting the library on PyPi, people will be able to install it, and its dependencies automatically.
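As a rough illustration, a library's setup.py might declare its dependencies like this (the project name and the pinned packages are made up for the example):
from setuptools import setup, find_packages

setup(
    name='mylibrary',            # hypothetical project name
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        'requests>=2.0',         # example dependencies; list what your library actually imports
        'click',
    ],
)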
If you're working on an application in Python (possibly web application), a requirements.txt file will be easier for deploying. You can copy all your code to where you need it, set up a virtual environment with virtualenv or pyvenv, and then do pip install -r requirements.txt. (You should be doing this for development as well so that you don't have a mess of libraries globally).
It's certainly easier to write the packages you're installing to your requirements.txt as soon as you've installed them than trying to figure out which ones you need at the end. What I do so that I never forget is I write the packages to the file first and then install with pip install -r.
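For example (requests is just a stand-in for whatever package you're about to add):
echo "requests" >> requirements.txt
pip install -r requirements.txt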
pip freeze helps if you've forgotten what you've installed, but you should always read the file it created to make sure that you actually need everything that's in there. If you're using virtualenv it'll give better results than if you're installing all packages globally.

Pythonic Ways of Importing Custom Modules?

I've needed to deal with this for some time, but never really figured out what the most pythonic way of importing/setting up PYTHONPATH for custom modules is. I know I can use virtualenv to manage it, I know I can set it inside of scripts, or through pth files, but none of these seem very clean and pythonic to me, so I'm guessing I'm missing something.
Almost always, all custom modules I'm interested in are contained in the git directory I've cloned down that has whatever script I'm running, if that simplifies things.
I'm guessing virtualenv is the answer, but figured I'd ask in case I'm missing anything.
EDIT: To clarify, this is only a question about custom modules. I'm already using pip for modules from PyPI.
You can use pip to install packages that are not on PyPI as well. You just need a URI endpoint and a valid Python package:
Examples:
$ pip install https://github.com/pypa/pip/archive/develop.zip#egg=pip
$ pip install git+https://github.com/pypa/pip.git#egg=pip
$ pip install git+git://github.com/pypa/pip.git#egg=pip
$ pip install /path/to/pip.tar.gz
$ pip install .
Read more at https://pip-installer.org/en/latest/usage.html#pip-install
virtualenv is a good start.
There are also package managers like pip and easy_install that manage third party modules.
In code you can use:
import sys
# make the directory containing your custom modules importable
sys.path.append('/path/to/customModule')
Virtualenv is the way to go with this.
pip install virtualenv
Then make a folder to setup your environments. Inside that folder:
virtualenv <new_env_name>
That'll create a new folder in that directory. Inside it there's a bin folder; run source on the activate script in that bin folder. You can then do pip install and it will only install packages for that environment.
If you're cloning a git repo whose code you also want to be able to peruse easily (for example if you're also working on that repo), clone it into your work dir and then symlink or alias the package folder into the site-packages directory inside that virtualenv's lib directory. Otherwise, if it's packaged correctly, running python setup.py install should install it correctly for that virtualenv.
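As an alternative to the symlink approach above, a rough sketch of an editable install with pip, assuming the repo has a setup.py (the paths and repo URL are placeholders):
cd /path/to/work_dir
git clone https://github.com/yourname/yourrepo.git
source /path/to/venv/bin/activate
pip install -e ./yourrepo    # editable install: the venv imports the code straight from your checkout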

How to export virtualenv?

I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want now is to run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries in a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. You export the list of all the installed packages like this:
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and just do what you did on the dev machine:
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact version.
You can also look into Fabric to automate this task, with a function like this:
from fabric.api import cd, env, prefix, run

def pip_install():
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning
Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
If it is going to live at the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries, etc. are available on the target machine, it will work.
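A rough sketch of that, assuming the virtualenv lives at the same absolute path on both machines:
tar -czf venv.tar.gz venv      # on the source machine, from the directory containing venv
# copy venv.tar.gz to the target machine, then, in the same parent directory:
tar -xzf venv.tar.gz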
