Why shouldn't I push a virtualenv to Heroku?

Tutorials online tell me to put venv in my .gitignore file. Why wouldn't I want to push my virtual environment, so that I or other developers could easily pull the project to their local machines and conveniently have all the dependencies?

On top of what Othman said, virtualenvs are simply not portable. Moving one breaks it, and it's easier to create a new environment than to fix a broken one. So even on deployment platforms that do use virtual environments, checking them into git is not going to work.
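You can see why a moved virtualenv breaks by looking at the scripts it generates: each one hard-codes the absolute path of the interpreter it was created with (the path below is purely illustrative):
$ head -1 venv/bin/pip
#!/home/alice/myproject/venv/bin/python
Move or rename the project directory, or check it out on another machine, and that shebang points at a path that no longer exists, so every console script in venv/bin stops working.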

virtualenv is a tool to create isolated Python environments.
Heroku gives you one environment, and you install your packages by listing them in requirements.txt, which Heroku requires for Django applications.
If you want to share these dependencies with other developers, push your requirements.txt to a shared remote (for example on GitHub)
and tell your developers to install the packages from that file.
Example
requirements.txt
Django==1.3
Fabric==1.2.0
Jinja2==2.5.5
PyYAML==3.09
To install all of these packages at once, use:
pip install -r /path/to/requirements.txt
Moreover, when you run the application on your local machine, files inside the virtual environment change, so committing it means you would keep pushing useless churn to your repo.
Note: if you want to know which packages are installed in your virtual environment, use pip freeze.
If you want to export the installed packages to requirements.txt, run:
pip freeze > requirements.txt
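For the original question, the practical takeaway is: ignore the environment directory and commit only requirements.txt. Assuming your environment lives in a folder called venv, the entry is just:
.gitignore
venv/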

Related

How to create a common environment for teamwork in Python

I would like to create a virtual environment for my team. My team works in different places and everyone has their own environment, which causes a lot of problems: everyone has a different version of the libraries (Python, RobotFramework).
I thought about:
Creating one common environment; I used virtualenv.
Installing the prepared libraries (Python and RobotFramework) with one command, pip install ...
Keeping the prepared libraries in a git repository so that everyone can modify them and change library versions.
I have the first and third parts done, but I have a problem with the second: how do I create such a package of libraries so that it can be installed with one pip install command?
Should I create an environment locally, install all the libraries in it, and push them to git? Or should I package the project via setuptools (as a tar.gz)?
Unfortunately, I cannot find the answer to this question; it seems to me that none of the above solutions is optimal.
The easiest way is to create a text file listing all the libraries you are using, with the command:
pip freeze > requirements.txt
This will create a file listing all the packages, with their versions, that are being used. To install them, ask every team member to place that requirements file in their project and run:
pip install -r requirements.txt
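As a concrete illustration for the stack mentioned in the question, such a file might look like this (the version numbers here are hypothetical; pin whatever your team has agreed on):
requirements.txt
robotframework==6.1
requests==2.31.0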
With pip, you can also download your dependencies. These will be .tar.gz, .whl or .zip files. Note that this can get complicated if your team uses multiple OSes.
Here is an example which downloads the dependencies into a directory named "dependencies"; you can push this to git along with the requirements file.
pip freeze > req.txt
pip download -r req.txt -d dependencies
When someone clones your repository, they can install the dependencies offline with the following command.
pip install --no-index --find-links=dependencies -r req.txt
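One caveat for teams on multiple OSes: wheels containing compiled code downloaded on one platform will not install on another. pip download can fetch wheels for a specific target platform, though only as prebuilt binaries; a sketch (the platform tag below is just an example):
pip download -r req.txt -d dependencies --platform manylinux2014_x86_64 --only-binary=:all: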

Is there any way to package python code so that other machine does not need to install all the dependencies using pip

I have installed Python on my system and wrote a simple script that uses a GET REST API to fetch Jenkins data.
I have installed all the required modules using pip. Now I want to package this script with all its dependencies and run it on another machine, without performing all the pip installation steps there.
I know we can list all the modules in requirements.txt and use pip install -r requirements.txt. But is there any way to avoid installing each dependency with pip, so that I only install Python and all the other dependencies come along when I run the zip file?
You can install pip dependencies to a certain directory using -t (target).
pip install -r requirements.txt -t .
That will install your pip modules into the current directory. You can then zip the whole thing and deploy it. Make sure that the environment you install the dependencies in matches your intended deployment environment; for consistency you can run the command in a Docker container, for example.
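If you want a single runnable artifact rather than a zip you unpack by hand, the standard library's zipapp module can bundle such a directory into one .pyz file. A sketch, assuming your entry point is a main() function in a file myscript.py at the top of the myapp/ directory (both names are hypothetical):
pip install -r requirements.txt -t myapp/
python -m zipapp myapp -m "myscript:main" -o myapp.pyz
python myapp.pyz
The target machine still needs a compatible Python interpreter, and note that packages with compiled C extensions generally cannot be imported from inside a zip archive.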
I think you should use the virtualenv module, which makes your project easy to deploy.
A virtual environment should be used whenever you work on any Python-based project. It is generally good to have one new virtual environment for every Python-based project you work on, so that the dependencies of every project are isolated from the system and from each other.
I came across a link which can help: Virtual Env explained.
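For reference, a minimal virtualenv workflow looks like this (on Python 3 the built-in venv module works the same way, via python -m venv):
pip install virtualenv
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt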

Python: Packages that are not relevant for project

When I create a virtualenv for a Python project, it gets "polluted" by packages that I install for my convenience (like IPython, or packages that my editor "VS Code" depends on, like "pylint").
But these packages are not relevant for my project. So if I do pip freeze > requirements.txt, I see that only a few packages are relevant for my project.
What is the best way to clean up?
Install those packages in a global context so that I can use them in every project I begin? Or
do a pip freeze > requirements.txt, then edit the requirements file and remove the packages that aren't needed?
What we do here:
First we have the project's requirements file - the one used for deployments. This is not built using pip freeze but edited manually, so it only contains the relevant packages.
Then we have the "dev" requirements file with packages that are only useful for development but are required to work on the project (linters, additional testing stuff, etc.).
And finally, each developer is free to maintain their own personal additional requirements (editor-related packages, etc.).
Note that with virtualenvwrapper (which really helps with development installs) you can define hooks that will install packages when you create a new virtual env.
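A common layout is to have the dev file include the deployment file, so that one install command pulls in both. The file names below are conventional rather than required by pip, and the dev packages are just the ones mentioned in the question:
requirements.txt
Django==1.3

requirements-dev.txt
-r requirements.txt
pylint
ipython
Developers run pip install -r requirements-dev.txt; deployments use only requirements.txt.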
Here is an alternative to preparing requirements.txt by hand. The pipreqs project generates a requirements.txt for your project based on the imports in your project's Python files.
Assuming all of your Python files are in myproject, running the following in your terminal:
$ pip install pipreqs
$ pipreqs myproject
will generate a requirements.txt file for you.
This way, you can just pip install -r requirements.txt in your virtual environment instead of using pip freeze > requirements.txt, since you will have only the packages which are related to your project.

Create environment from existing folder/app with pip / conda

While developing my app I didn't use an environment. Now I want to use one, and export all of my app's dependencies into an environment.yml / requirements.txt file that I can then use to build a Docker image.
The issue is, if I create an environment and then export it with:
conda env export > environment.yml
I get no dependencies in that file.
Or if I use:
pip freeze --local > requirements.txt
I see all the system modules that have nothing to do with my project.
I would imagine conda or pip has something that would just go through all the files in the directory I am in and place all the imports and their dependencies inside the environment.yml / requirements.txt file.
I can't find the command to do that.
You can use virtualenv to isolate the pip environment of your application from the rest of your system. Use:
virtualenv <your_project_path>/venv
This will create a virtual environment for your app. Then use:
source venv/bin/activate
This will isolate your pip environment. Reinstall all your dependencies, then run pip freeze and you will see only project-related dependencies.
By default, pip freeze lists all the pip modules installed on the system. If you use virtualenv and then install your dependencies, your pip modules will reside in your application folder.
Edit:
Based on your comments, I would recommend a good IDE such as PyCharm. You can follow the tutorial here for setting up a venv and handling all your dependencies. Once done, you can run pip freeze to produce your requirements.txt.
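Putting the steps together, a sketch of the whole round trip (<your_dependencies> is a placeholder for whatever your project actually needs):
virtualenv venv
source venv/bin/activate
pip install <your_dependencies>
pip freeze --local > requirements.txt
Because the new environment starts out empty, the freeze now lists only what you explicitly installed.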

Do I need to install Django in the virtualenv as well if I don't use system packages?

I have installed virtualenv with the --no-site-packages option.
I have a few doubts:
If I use Django with virtualenv, does it mean that my Django site is completely cut off from system packages? I mean, won't any of the packages installed in the system site-packages be available?
I have installed all packages in the virtualenv, but not Django. Do I also need to install Django in the virtualenv?
Suppose I have some package which is not in the virtualenv but is available in the main environment; can I access it from the virtualenv, or can only one environment be active at a time?
Yes you do. You can do that via pip, or download Django and run its setup script. In both cases you need to make sure that you have the virtualenv active, i.e. source ENV/bin/activate.
The point of virtualenv is to keep your main system separate; you want to do that.
Yes.
You should just install everything in your virtualenv; it is better practice.
A really nice thing about virtualenv is that you can create a nice, complete environment for your project. Then, once things are working and stable, you can pip freeze the packages and put your code in git, and you know that if you share your project or move systems the whole thing is going to be easy to recreate and will just work :)
--- update to comment ---
At the command line, assuming a Linux-type environment:
$ cd
$ virtualenv --no-site-packages --distribute ENV
$ source ENV/bin/activate
$ pip install django
$ pip install all_the_packages_you_need
Now you can go into your Django project and run python commands as normal, and it will use your virtualenv "ENV" Python and site-packages.
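You can verify that the environment is active by checking which interpreter is on your PATH; with ENV activated it should point inside it (the path below is illustrative):
$ which python
/home/you/ENV/bin/python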
