I am working on a Python project on both my work computer and my home computer. GitHub has made the experience pretty seamless.
But I'm having a problem with the pyvenv.cfg file in my venv folder. Because my Python SDK has a different file path on my work computer than on my home computer, every time I pull the updated version of my project from my other computer I have to manually edit the home = C:\Users\myName\... path in pyvenv.cfg, or else the interpreter doesn't work.
Does anyone know a solution to this problem?
As confirmed in the comments, you've added the virtual environment folder to your project and included it in the files that you put on GitHub.
That's generally a bad idea, since it defeats part of the purpose of having a virtual environment in the first place. Your virtual environment will contain packages specific to the platform and configuration of the machine it is on - you could be developing on Linux on one machine and Windows on another and you'd have bigger problems than just one line in a configuration file.
What you should do:
create a virtual environment in a folder outside your project / source folder.
assuming you're using pip, you can run pip freeze > requirements.txt to create a requirements.txt file, which you can then use on the other system with pip install -r requirements.txt to recreate the exact same virtual environment.
All you have to do is keep that requirements.txt up to date and update the virtual environments on either computer whenever it changes, instead of pulling it through GitHub.
In more detail, a simple example (for Windows, very similar for Linux), with the whole sequence collected into one script after the steps:
create a project folder, e.g. C:\projects\my_project
create a virtual environment for the project, e.g. python -m venv C:\projects\venv\my_project
activate the environment, i.e. C:\projects\venv\my_project\Scripts\activate.bat
install packages, e.g. pip install numpy
save the installed packages to a file called requirements.txt in C:\projects\my_project, with pip freeze > requirements.txt
store the project in a Git repo, including that file
on another development machine, clone or pull the project, e.g. git clone https://github.com/my_project D:\workwork\projects\my_project
on that machine, create a new virtual environment, e.g. python -m venv D:\workwork\venv\my_project
activate the environment, i.e. D:\workwork\venv\my_project\Scripts\activate.bat
install the packages that are required with pip install -r D:\workwork\projects\my_project\requirements.txt
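Put together, the whole round trip looks like this (a sketch using the example paths above; adjust them to your own layout):
:: on the first machine
python -m venv C:\projects\venv\my_project
C:\projects\venv\my_project\Scripts\activate.bat
pip install numpy
pip freeze > C:\projects\my_project\requirements.txt
:: commit requirements.txt and push it together with the code
:: on the second machine
git clone https://github.com/my_project D:\workwork\projects\my_project
python -m venv D:\workwork\venv\my_project
D:\workwork\venv\my_project\Scripts\activate.bat
pip install -r D:\workwork\projects\my_project\requirements.txt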
Since you say you're using PyCharm, it's a lot easier still: just make sure that the environment created by PyCharm sits outside your project folder. I like to keep all my virtual environments in one folder, with venv names that match the project names.
You can still create a requirements.txt in your project, and when you pull the project to another PC with PyCharm, do the same: create a venv outside the project folder. PyCharm will recognise that packages from the requirements file are missing and offer to install them for you.
You shouldn't keep the full virtualenv in source control, since more often than not it's much larger than your code, and there may be platform-and-interpreter-version specific bits and bobs in there.
Instead, you should save the packages required -- the tried and tested way is a requirements.txt file (but there are plenty of alternatives such as Pipenv, Poetry, Dephell, ...) -- and recreate the virtualenv on each machine you need to run the project on.
To save a requirements file for a pre-existing project, you can
pip freeze > requirements.txt
when the virtualenv is active.
Then, you can use
pip install -r requirements.txt
to install exactly those packages.
In general, I like to use pip-tools, so I only manage a requirements.in file with package requirement names, and the pip-compile utility then locks those requirements with exact versions into requirements.txt.
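A minimal pip-tools round trip looks like this (the requests entry is just an example):
pip install pip-tools              # provides pip-compile and pip-sync
echo "requests" > requirements.in  # loose, human-edited requirements
pip-compile requirements.in        # writes requirements.txt with exact pins
pip-sync requirements.txt          # make the active venv match it exactly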
Related
I would like to easily export one Python project from one PC to another. When I created the project, I used a virtual environment in order to avoid problems with different package versions.
What I did was to just copy the project folder and paste it onto the destination PC. Once I opened the project with PyCharm, I activated the virtual environment with project_path/venv/Scripts/activate, but when I tried to execute any script, it said it didn't find the modules.
Which workflow should I follow in order to create projects and be able to run them from multiple PCs without needing to install all the dependencies?
Since you did not specify your Python version I will provide a solution working for both Python 2.x and 3.x.
My suggestion is to create a requirements.txt file containing all your requirements.
This file can be easily prepared using the output from the command:
pip freeze
Then you can paste the output into your requirements.txt file, and when you are going to install your Python code on another PC you can simply run:
pip install -r requirements.txt
to install your requirements again.
Depending on your project it could also be possible, for example, to create a single EXE file (if you are using Windows machines), but more detail is needed if this is the case.
In case you are using Python 3, the method that is at the moment arguably the most popular in the Python community is Pipenv.
Its documentation covers the details and includes a simple example of the workflow.
If you are using Python 3, use Pipenv. It will automatically create a Pipfile and Pipfile.lock, which ensure that reinstalling the dependencies on a different machine gives you the same packages.
Basic and helpful commands:
pipenv shell            # activate the virtualenv
pipenv install          # install the dependencies listed in the Pipfile
pipenv install requests # install the requests lib and auto-update Pipfile and Pipfile.lock
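For reference, the Pipfile that Pipenv maintains looks roughly like this (the requests entry and Python version are just examples):
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]

[requires]
python_version = "3.8"
On the other machine, pipenv sync installs exactly what is pinned in Pipfile.lock.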
Whenever we download any Python dependencies, e.g. Django or MySQL, using pip, they get installed in the root folder, which would be the C drive on Windows (where we have installed Python).
How can I install these dependencies in my project folder itself, so that when I start my application it will read the dependencies from the project folder rather than from the C drive?
My purpose is that whenever I give my application to a user or third person, they should be able to start the application directly instead of downloading all the dependencies.
Go install virtualenvwrapper first. After that, create a new virtualenv, activate it, and run pip freeze. You should see nothing in there, because nothing is installed. This env will store all your pip installs and will not use any of the packages you have installed on your C drive. Anything you pip install while this env is activated will be in this env and this env alone. You can tie a project to this env as well by using commands from virtualenvwrapper. Your env and project directories should be separate. Deactivate the env to go back to your base environment and run pip freeze again: you will see all the installs you have done on your C drive.
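A sketch of that workflow, using the command names provided by the virtualenvwrapper-win package on Windows (package and path names are just examples):
pip install virtualenvwrapper-win      # plain virtualenvwrapper on Linux/macOS
mkvirtualenv my_project                # create and activate a fresh, empty env
pip freeze                             # prints nothing - the env is isolated
pip install django                     # goes into this env and this env alone
setprojectdir C:\projects\my_project   # tie the env to your project directory
deactivate                             # back to the base environment
pip freeze                             # now shows the C-drive installs again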
A best practice is to create a requirements.txt file and version control it so everyone can use the same versions of the same packages. If you don't want to do this, simply activate your new virtual env and pip install everything you want.
I have a project I built in Python 2.7 on one PC. It uses Django 1.4.5 and some other modules that are kept in site-packages. I tried to copy the contents of Lib\site-packages to the Python install on the new PC, but I get missing-module errors when I try to run manage.py runserver. Do I have to install everything again on the new PC and just transfer the project files?
Usually, you keep a list of your project requirements in requirements.txt and keep the file at the root of your project. Example requirements.txt content:
Django==1.6.5
lxml==3.3
On the new computer, you clone the repository containing the project source code (or get it a different way), then install the requirements via pip, the Python package manager:
pip install -r requirements.txt
Also, having a separate virtual environment for every project is basically a must-have.
In order to create the list of requirements from your current python environment (virtual or system-wide), run:
pip freeze > requirements.txt
Is there any easy way to export the libs my script needs, so that I can put all of the files into a Git repo and run the script from Jenkins without needing to install anything?
context:
a remote Jenkins missing some Python libs (read-only - no access to a terminal)
I need to run my script, which needs external libs such as paramiko, requests, etc.
I have tried freeze.py, but it fails at the make stage
I have found some articles here regarding freeze.py, py2exe and py2app, but none of those helped me.
You can use a virtual environment to install your required python dependencies in the workspace. In short, this sets up a local version of python and pip for which you can install packages without affecting the system installation. Using virtual environments is also a great way to ensure dependencies from one job do not impact other jobs. This solution does require pip and virtualenv to be installed on the build machine.
Your build step should do something like:
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt
# ... perform build, tests ...
If you separate your build into several steps, the environment variables set in the activate script will not be available in subsequent steps. You will need to either source the activate script in each step, or adjust the PATH (e.g. via EnvInject) so that the virtualenv python is run.
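For example, with two separate shell build steps it would look something like this (a sketch; run_tests.py is a hypothetical entry point):
# step 1: create the env and install dependencies
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt
# step 2: runs in a fresh shell, so activate again (or call venv/bin/python directly)
. venv/bin/activate
python run_tests.py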
I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working on my local virtualenv, where I installed all the required libraries. What I want to do now is to run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries on a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. Instead, you export the list of all the installed packages, like so -
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and do what you did on the dev machine -
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact versions.
You can also look into Fabric to automate this task, with a function like this -
from fabric.api import cd, env, prefix, run

def pip_install():
    # env.path is assumed to point at the project directory on the remote host
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
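With Fabric 1.x you would then run the task against your target host, along these lines (host name hypothetical):
fab -H user@yourhost pip_install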
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning: Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
If it is going to be on the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries, etc. are available on the target machine, it will work.
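For completeness, that would be something along these lines (paths are examples; they must be identical on both machines):
# on the source machine
tar czf venv.tar.gz -C / home/me/venvs/myproject
# on the target machine, extract back to the same absolute path
tar xzf venv.tar.gz -C /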