How to export a virtualenv? - python

I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working in my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all of these libraries in a fresh virtualenv on another machine. How can this be done? Please help.

You don't copy-paste your virtualenv. You export the list of all the installed packages, like so -
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and then just do what you did on the dev machine -
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact version.
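For reference, the requirements.txt produced by pip freeze pins exact versions; it looks roughly like this (the package names and versions here are only an illustration):
Django==1.8.4
requests==2.7.0
gunicorn==19.3.0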
You can also look into Fabric to automate this task, with a function like this -
from fabric.api import cd, env, prefix, run

def pip_install():
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
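Assuming the function above lives in a fabfile.py and env.path (plus your hosts) are already configured, you would then trigger it from the command line with:
$ fab pip_install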

You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning
Copying virtual environments is not well supported. Each virtualenv has path information hard-coded into it, and there may be cases where the copy code does not know it needs to update a particular file. Use with caution.
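For completeness, a hedged example of the virtualenvwrapper command mentioned above (both environment names are placeholders):
cpvirtualenv my_old_env my_new_env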

If it is going to live at the same path, you can tar it and extract it on another machine. If all the same dependencies, libraries etc. are available on the target machine, it will work.
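A rough sketch of that approach, assuming the environment is a folder called venv that will end up at the identical absolute path on the target machine:
$ tar czf venv.tar.gz venv/
# copy venv.tar.gz to the target machine, then, from the same parent directory:
$ tar xzf venv.tar.gz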

Related

How to install all my python modules in a single shot

I am a beginner in Python. I have built a website using Django, Flask, XML and WTForms, and I have also used some Python API modules. The website was created successfully and works well on my local machine.
But if I want to run it on another machine that has Python available, I need to install all of the above-mentioned modules manually.
Do we have something similar to Gradle, Maven or Ant which will download/install the required modules during the first run?
Kindly help me.
One way is to freeze your currently installed local Python packages into a requirements.txt file and then install everything in one go on the other machine.
$ pip freeze > requirements.txt
Copy the requirements file to the other machine, install Python, and then ...
$ pip install -r requirements.txt

Export a Python project from one PC to another

I would like to easily export a Python project from one PC to another. When I created the project, I used a virtual environment in order to avoid problems with different package versions.
What I did was just copy the project folder and paste it on the destination PC. Once I opened the project with PyCharm, I activated the virtual environment with project_path/venv/Scripts/activate, but when I tried to execute any script, it said it didn't find the modules.
Which workflow should I follow in order to create projects and be able to run them on multiple PCs without needing to install all the dependencies?
Since you did not specify your Python version I will provide a solution working for both Python 2.x and 3.x.
My suggestion is to create a requirements.txt file containing all your requirements.
This file can be easily prepared using the output from the command:
pip freeze
Then you can paste the output into your requirements.txt file, and when you are going to install your Python code on another PC you can simply run:
pip install -r requirements.txt
to install your requirements again.
Depending on your project it could also be possible, for example, to create a single EXE file (if you are using Windows machines), but more details are needed if this is the case.
In case you are using Python 3 the method that is at the moment arguably more popular in the Python community is Pipenv.
Here's its relevant documentation.
And here you can read a simple example of a workflow.
If you are using Python 3, then use Pipenv. It will automatically create a Pipfile and Pipfile.lock, which ensure that reinstalling the dependencies on a different machine gives you the same packages.
Basic and helpful commands:
pipenv shell # activate the virtualenv
pipenv install # install the dependencies listed in the Pipfile
pipenv install requests # install the requests lib and automatically update Pipfile and Pipfile.lock
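You can also run code inside the managed environment without activating a shell; the script name below is just a placeholder:
pipenv run python your_script.py # run a script against the Pipenv-managed environment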

Creating "virtualenv" for an existing project

I have a Python project which I've been working on. Now I've realized that I need a virtual environment for it. How can I create one for an existing project? If I do this:
virtualenv venv
will it work fine? Or do I have to re-create my project, create the virtualenv and then copy the existing files into it?
You can just create a virtual environment with virtualenv venv and activate it with source venv/bin/activate.
You will need to reinstall all dependencies using pip, but the rest should just work fine.
The key thing is creating requirements.txt.
Create a virtualenv as normal. Do not activate it yet.
Now you need to install the required packages. If you do not readily remember them, ask pip:
pip freeze > requirements.txt
Now edit requirements.txt so that only the packages you know you installed are included. Note that the list will include all dependencies of all installed packages. Remove them, unless you want to explicitly pin their versions and know what you're doing.
Now activate the virtualenv (the normal source path/to/virtualenv/bin/activate).
Install the dependencies you've collected:
pip install -r requirements.txt
The dependencies will be installed into your virtualenv.
In the same way you'll be able to re-create the same env on your deployment target.
If you are using Windows, then follow this procedure:
Step 1: Go to the root directory of your existing Python project
Step 2: Create a virtual environment with virtualenv venv
Step 3: Go to venv/Scripts and run activate
Step 4: If you would like to install all the required libraries, run pip3 install -r requirements.txt
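Put together, the steps look roughly like this in a Windows command prompt (the project path is only a placeholder):
cd C:\path\to\your\project
virtualenv venv
venv\Scripts\activate
pip3 install -r requirements.txt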
There is something that I would like to add to this question, because newbies often run into this problem, and even I make a mistake like that once in a while.
If you do not have a requirements.txt for an already existing Python project, you are doomed. Set aside at least 2-3 hours of your day to recover the requirements.txt for an existing Python project.
The best way to see it through, and only if you are lucky, is to remember the most important package from that Python project. By most important, I mean the package that has the highest number of dependencies.
Once you have located this highest-dependency package, install it via pip. This will install all the dependencies of that package.
Now, see if it works. If it does, voila!
If it doesn't, you will have to find where the conflicts are and start resolving them one by one.
I was working in such a situation recently, where there was no requirements.txt. But I knew the highest dependency was the deep-learning package Sentence-Transformers, so I installed it and, with minor conflicts, resolved everything.
Best of luck! Let me know if this ever helped anyone!
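A hedged sketch of that recovery process, reusing the package named in the answer above (the entry-point script is hypothetical):
$ pip install sentence-transformers # install the highest-dependency package first
$ python your_script.py # run the project and fix any remaining missing imports one by one
$ pip freeze > requirements.txt # once it runs, capture the working set so this never happens again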

Install a Python module only in the home folder on a server

I'm developing my master's thesis on a university server, so I have my own account and I can log in and do all the stuff I want as long as I remain inside /home/myname/.
I'm developing some Python scripts and now I want to integrate Python with the Octave module, which is not currently installed on the system, and, of course, I cannot do anything with sudo apt-get install.
How can I overcome this problem without asking my teacher?
thank you all,
Fabio
Please don't copy python and pip. You should use a virtualenv to install project-specific packages. This is particularly useful in your use-case where you can't install things at the system level. Even if you could, virtualenvs are recommended so the dependencies of each project are isolated.
Here is a quick primer that should get you going.
Create the virtualenv
virtualenv ~/project/env
Activate the virtualenv
source ~/project/env/bin/activate
This will modify your bash prompt by placing the name of your virtualenv in parentheses to indicate that your virtualenv is activated.
(env) hostname:current_folder user$
Install Packages into the virtualenv
pip install -r requirements.txt
Use the virtualenv
python script.py
Use virtualenv by default in a script
script.py
#!/home/myname/project/env/bin/python
# note: the shebang must be an absolute path; ~ is not expanded here
print('hello world!')
Then from the command line
chmod ugo+x script.py
./script.py
hello world!
Deactivate the virtualenv
deactivate
Make yourself a local copy of python and pip, then you can install whatever modules you want and not have to worry about getting a sysadmin to help you.
There are some good instructions here
Go here to get the link to the version of python you need and substitute it in the instructions above.
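As a rough sketch, those instructions usually boil down to something like the following (the version directory and prefix are placeholders; adjust them for your setup):
# download and unpack the source tarball for your chosen Python version from python.org, then:
$ cd Python-3.X.Y
$ ./configure --prefix=$HOME
$ make
$ make install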
In your .bashrc add alias and path to your local copy - you may need to modify this for your own situation:
alias python="~/bin/python"
PATH=~/.local/bin:~/bin:$PATH
For the PATH - when you install local copies of modules through pip they by default go to ~/.local - change this if you prefer.
Begin your scripts with:
#!/usr/bin/env python
so that they use your preferred Python version.

Python, export dependencies

Is there an easy way to export the libraries my script needs so that I can put all of the files into a git repo and run the script from Jenkins without needing to install anything?
context:
remote Jenkins without some Python libs (RO - no access to a terminal)
I need to run my script, which needs external libs such as paramiko, requests, etc.
I have tried freeze.py but it fails at the make stage
I have found some articles here regarding freeze.py, py2exe and py2app, but none of those helped me.
You can use a virtual environment to install your required python dependencies in the workspace. In short, this sets up a local version of python and pip for which you can install packages without affecting the system installation. Using virtual environments is also a great way to ensure dependencies from one job do not impact other jobs. This solution does require pip and virtualenv to be installed on the build machine.
Your build step should do something like:
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt
# ... perform build, tests ...
If you separate your build into several steps, the environment variables set by the activate script will not be available in subsequent steps. You will need to either source the activate script in each step, or adjust the PATH (e.g. via EnvInject) so that the virtualenv's python is run.
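Another option that avoids activation entirely is to call the virtualenv's executables by their path in each build step (the test command below is only a placeholder):
venv/bin/pip install -r requirements.txt
venv/bin/python run_tests.py # hypothetical test entry point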
