I would like to easily export a Python project from one PC to another. When I created the project, I used a virtual environment in order to avoid problems with different package versions.
What I did was to just copy the project folder and paste it on the destination PC. Once I opened the project with PyCharm, I activated the virtual environment with project_path/venv/Scripts/activate, but when I tried to execute any script, it said it didn't find the modules.
What workflow should I follow in order to create projects and be able to run them on multiple PCs without needing to install all the dependencies?
Since you did not specify your Python version, I will provide a solution that works for both Python 2.x and 3.x.
My suggestion is to create a requirements.txt file containing all your requirements.
This file can be easily prepared using the output from the command:
pip freeze
Then you can paste the output into your requirements.txt file, and when you are going to install your Python code on another PC you can simply run:
pip install -r requirements.txt
to install your requirements again.
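For illustration, a pip freeze-style requirements.txt simply pins one package per line to an exact version; the packages and versions below are only examples:
numpy==1.16.4
requests==2.22.0
paramiko==2.6.0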
Depending on your project it could also be possible, for example, to create a single EXE file (if you are using Windows machines), but more detail is needed if that is the case.
If you are using Python 3, the method that is arguably the most popular in the Python community at the moment is Pipenv.
Here's its relevant documentation.
And here you can read a simple example of a workflow.
If you are using Python 3, then use Pipenv. It will automatically create a Pipfile and a Pipfile.lock, which ensure that reinstalling the dependencies on a different machine installs the same packages.
Basic and helpful commands:
pipenv shell # activate the virtualenv
pipenv install # install the dependencies listed in the Pipfile
pipenv install requests # install the requests library and automatically update Pipfile and Pipfile.lock
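For reference, the Pipfile that Pipenv generates is a small TOML file; it might look roughly like this (the requests entry and the Python version are just examples):
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[requires]
python_version = "3.7"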
I am working on a Python project on both my work computer and my home computer. GitHub has made the experience pretty seamless.
But I'm having a problem with the pyvenv.cfg file in my venv folder. Because my Python SDK has a different file path on my work computer than on my home computer, I have to manually edit pyvenv.cfg to change the home = C:\Users\myName\... file path each time I pull the updated version of my project from my other computer, or else the interpreter doesn't work.
Does anyone know a solution to this problem?
As confirmed in the comments, you've added the virtual environment folder to your project and included it in the files that you put on GitHub.
That's generally a bad idea, since it defeats part of the purpose of having a virtual environment in the first place. Your virtual environment will contain packages specific to the platform and configuration of the machine it is on - you could be developing on Linux on one machine and Windows on another and you'd have bigger problems than just one line in a configuration file.
What you should do:
create a virtual environment in a folder outside your project / source folder.
assuming you're using pip, you can run pip freeze > requirements.txt to create a requirements.txt file which you can then use on the other system with pip install -r requirements.txt to recreate the exact same virtual environment.
All you have to do is keep that requirements.txt up to date and update the virtual environments on either computer whenever it changes, instead of pulling it through GitHub.
In more detail, a simple example (for Windows, very similar for Linux):
create a project folder, e.g. C:\projects\my_project
create a virtual environment for the project, e.g. python -m venv C:\projects\venv\my_project
activate the environment, i.e. C:\projects\venv\my_project\Scripts\activate.bat
install packages, e.g. pip install numpy
save what packages were installed to a file in C:\projects\my_project, called requirements.txt with pip freeze > requirements.txt
store the project in a Git repo, including that file
on another development machine, clone or pull the project, e.g. git clone https://github.com/my_project D:\workwork\projects\my_project
on that machine, create a new virtual environment, e.g. python -m venv D:\workwork\venv\my_project
activate the environment, i.e. D:\workwork\venv\my_project\Scripts\activate.bat
install the packages that are required with pip install -r D:\workwork\projects\my_project\requirements.txt
Since you say you're using PyCharm, it's a lot easier still, just make sure that the environment created by PyCharm sits outside your project folder. I like to keep all my virtual environments in one folder, with venv-names that match the project names.
You can still create a requirements.txt in your project and when you pull the project to another PC with PyCharm, just do the same: create a venv outside the project folder. PyCharm will recognise that it's missing packages from the requirements file and offer to install them for you.
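If you do end up keeping the venv inside the project folder anyway, a simple safeguard is a .gitignore entry so it never reaches the repo (assuming the environment folder is named venv):
venv/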
You shouldn't keep the full virtualenv in source control, since more often than not it's much larger than your code, and there may be platform-and-interpreter-version specific bits and bobs in there.
Instead, you should save the packages required -- the tried and tested way is a requirements.txt file (but there are plenty of alternatives such as Pyenv, Poetry, Dephell, ...) -- and recreate the virtualenv on each machine you need to run the project on.
To save a requirements file for a pre-existing project, you can
pip freeze > requirements.txt
when the virtualenv is active.
Then, you can use
pip install -r requirements.txt
to install exactly those packages.
In general, I like to use pip-tools, so I only manage a requirements.in file with package requirement names, and the pip-compile utility then locks those requirements with exact versions into requirements.txt.
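As a rough sketch of that pip-tools flow (package names are illustrative), requirements.in lists only the top-level names, and pip-compile pins them:
# requirements.in
requests
numpy

pip-compile requirements.in   # writes requirements.txt with exact versions
pip-sync requirements.txt     # makes the active virtualenv match it exactly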
I'm working on a script in python that relies on several different packages and libraries. When this script is transferred to another machine, the packages it needs in order to run are sometimes not present or are older versions that do not have the same functionality and cause the script to fail.
I was considering using a virtual environment, but I can't find a way to have the script use the specific environment I design as its default, and in order to use the environment a user must manually activate it from the command line.
I've also looked into trying to check the versions of the packages installed on the machine, and if they are not sufficient then updating them from the script as described here:
Installing python module within code
Is there any easier/surefire way to make sure that the needed packages will always be available regardless of where it's run?
The normal approach is to create an installation script and have that manage your dependencies. Then when you move your project to a new environment your installer will check that all dependencies are present.
I recommend you check out setuptools: https://setuptools.readthedocs.io/en/latest/
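A minimal setup.py along those lines might look like this; the project name and the dependency pins are placeholders:
from setuptools import setup, find_packages

setup(
    name="my_project",  # placeholder project name
    version="0.1.0",
    packages=find_packages(),
    # dependencies that pip checks and installs when you run `pip install .`
    install_requires=[
        "requests>=2.20",
        "paramiko>=2.4",
    ],
)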
If you don't want to install dependencies whenever you need to use your script somewhere new, then you could package your script into a Docker container.
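As a sketch, a Dockerfile for such a script could be as simple as this; the base image tag and script name are assumptions:
FROM python:3.7-slim
WORKDIR /app
# install pinned dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "my_script.py"]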
If the problem is ensuring the required packages are available in a new environment or virtual environment, you could use pip and generate a requirements.txt and check it in version control or use a tool to do that for you, like pipenv.
If you would prefer to generate a requirements.txt by hand, you should:
Install your dependencies using pip
Type pip freeze > requirements.txt to generate a requirements.txt file
Check requirements.txt into your source management software
When you need to set up a new environment, use pip install -r requirements.txt
The solution that I've been using has been to include a custom library (folder with all of my desired packages) in the folder with my script, and I simply import them from there:
from Customlib import pkg1, pkg2,...
As long as the custom library and script stay together in the same folder, it will always have access to the right packages and the correct versions of those packages.
I'm not sure how robust this solution actually is or what possible bugs may arise from this if it is passed from machine to machine, but for now this seems to work.
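For what it's worth, the layout that approach implies looks something like this; Customlib needs an __init__.py for the import above to treat it as a package:
my_project/
    my_script.py          # does: from Customlib import pkg1, pkg2
    Customlib/
        __init__.py
        pkg1/
        pkg2/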
I will format my PC and I would like to somehow collect all the Python modules that I currently have and package them (zip, rar, etc.) or create an index file of them, so that when I'm done formatting the PC I can reinstall them all in one go, either from the package or by using the index to pip install them all in a batch.
Is there any Python module that allows doing that?
Use pip
pip freeze > requirements.txt
This will save the names of all your installed Python modules to a file called requirements.txt.
Then when you want to install them again run the following command.
pip install -r requirements.txt
Using a package manager like this is good practice to get into, especially if you use a code repository, so you don't upload all the dependencies to the repo.
If you are not already doing so, it's a good idea to use a virtual environment for your Python projects.
This will create a unique python environment for each of your projects, keeping each project self contained.
Yes - pip. Using pip freeze will give you the list of all the installed modules. Next, all you need to do is install all those modules by running pip install -r your_file_with_modules_list.
Is there any easy way to export the libs my script needs so that I can put all of the files into a git repo and run the script from Jenkins without the need of installing anything?
context:
remote Jenkins without some Python libs (RO, no access to a terminal)
need to run my script, which needs external libs such as paramiko, requests, etc.
I have tried freeze.py, but it fails at the make stage
I have found some articles here regarding freeze.py, py2exe, py2app, but none of those helped me.
You can use a virtual environment to install your required Python dependencies in the workspace. In short, this sets up a local version of Python and pip for which you can install packages without affecting the system installation. Using virtual environments is also a great way to ensure that dependencies from one job do not impact other jobs. This solution does require pip and virtualenv to be installed on the build machine.
Your build step should do something like:
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt
# ... perform build, tests ...
If you separate your build into several steps, the environment variables set in the activate script will not be available in subsequent steps. You will need to either source the activate script in each step, or adjust the PATH (e.g. via EnvInject) so that the virtualenv python is run.
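For example, a second build step would have to re-activate the environment before using it (the paths and script name here are assumptions):
# each build step runs in a fresh shell, so the earlier activation is gone;
# source the activate script again before using the venv
. venv/bin/activate
python run_tests.py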
I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries in a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. You export the list of all the installed packages, like so:
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and then just do what you did on the dev machine:
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact version.
You can also look into Fabric to automate this task, with a function like this:
from fabric.api import cd, env, prefix, run

def pip_install():
    # assumes env.path is set to the project directory on the remote host
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
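You would then invoke it with something like fab pip_install, assuming env.path is configured in your fabfile to point at the project directory on the remote host.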
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning
Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
If it is going to be on the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries, etc. are available on the target machine, it will work.
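Concretely, that would be something along these lines; the folder name venv is an assumption:
tar -czf venv.tar.gz venv/
# copy venv.tar.gz to the target machine, then extract it
# at the exact same path so the hard-coded paths still resolve:
tar -xzf venv.tar.gz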