I want to send my Python script to another computer, but that computer doesn't have the same packages installed. Is there any way to send the whole project as a folder that also includes all those packages? (I have tried creating a virtual environment, but the problem lies in the fact that many of the files in a virtual environment are aliases (symlinks) whose targets might not exist on the other computer.)
Thank you very much in advance for your help.
You can make a requirements.txt file listing all the packages from your project. If you already have a virtualenv, for example, create your requirements.txt by executing:
pip freeze > requirements.txt
Example of a requirements.txt:
absl-py==0.2.1
amqp==2.3.1
asn1crypto==0.24.0
After this, copy your whole project (without the virtualenv) to the other computer, create a new virtualenv there, activate it, and run:
pip install -r requirements.txt
and you will have all your packages installed.
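Putting the steps on the target machine together (a minimal sketch, assuming a Unix-like shell; on Windows the activation command is venv\Scripts\activate):

$ python3 -m venv venv                      # create a fresh virtualenv
$ source venv/bin/activate                  # activate it
(venv) $ pip install -r requirements.txt    # install the project's packages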
Short of the longer process of creating an installable package with a setup.py file, you can place your script in its own folder and add a pip requirements file. While your virtualenv is active, run the bash/terminal command:
pip freeze > requirements.txt
Ensure that you send the requirements file along with the script; the recipient can then simply run the bash/terminal command (ideally in their own virtualenv)
pip install -r requirements.txt
before running the script.
Related
I have installed Python on my system and wrote a simple script that fetches Jenkins data through its REST API (GET requests).
I have installed all the required modules using pip. Now I want to package this script with all its dependencies and run it on another machine, without repeating all the pip installation steps there.
I know we can list all the modules in requirements.txt and use pip install -r requirements.txt. But is there any way to avoid installing modules with pip for each dependency, so that I only need to install Python and all the other dependencies come along when I run the zip file?
You can install pip dependencies to a certain directory using -t (target).
pip install -r requirements.txt -t .
That will install your pip modules into the current directory. You can then zip the whole thing and deploy it. Make sure that the environment you install the dependencies in matches your intended deployment environment; for consistency you can run the command in a Docker container, for example.
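As a sketch of how this can fit together: Python can run a zip archive directly if the archive contains a __main__.py at its top level. Assuming your entry-point script is hypothetically named jenkins_report.py:

$ mkdir build
$ pip install -r requirements.txt -t build   # vendor dependencies into build/
$ cp jenkins_report.py build/__main__.py     # the archive's entry point
$ (cd build && zip -qr ../app.pyz .)
$ python app.pyz                             # on the target machine, only Python itself is needed

One caveat: packages with compiled C extensions generally cannot be imported from inside a zip, so this works best with pure-Python dependencies.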
I think you should use the virtualenv module, which makes your project easily deployable.
A virtual environment should be used whenever you work on any Python-based project. It is generally good to have one new virtual environment for every Python-based project you work on, so that the dependencies of every project are isolated from the system and from each other.
I came across a link that may help: Virtual Env explained
When I create a virtualenv for a Python project, it gets "polluted" by packages that I install for my own convenience (like IPython, or packages that my editor "VS Code" depends on, like "pylint").
But these packages are not relevant for my project. So if I do pip freeze > requirements.txt, I see that only a few packages are relevant for my project.
What is the best way to clean up?
Install those packages in a global context so that I can use them in every project I begin? or
Do a pip freeze > requirements.txt, then edit the requirements file and remove not needed packages?
What we do here:
First we have the project's requirement file - the one used for deployments. This is not built using pip freeze but manually edited so it only contains relevant packages.
Then we have the "dev" requirements file with packages that are only useful for development but are required to work on the project (linters, additional testing stuff, etc.).
And finally, everyone is free to maintain their own personal additional requirements (editor-related packages, etc.).
Note that with virtualenvwrapper (which really helps for development installs) you can define hooks that will install packages whenever you create a new virtualenv.
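A minimal sketch of the deployment/dev split above (package names and versions are purely illustrative; pip's -r directive lets one requirements file include another):

# requirements.txt - deployment requirements, manually curated
Flask==0.12.2

# requirements-dev.txt - what developers install
-r requirements.txt
pylint==1.7.2
pytest==3.2.1

Developers then run pip install -r requirements-dev.txt, while deployments use pip install -r requirements.txt.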
Here is an alternative to preparing requirements.txt manually: the pipreqs project generates a requirements.txt for your project based on the imports in your project's Python files.
Assuming all of your Python files are in myproject, run these commands in your terminal:
$ pip install pipreqs
$ pipreqs myproject
will generate a requirements.txt file for you.
This way you can just run pip install -r requirements.txt in your virtual environment, and since requirements.txt was generated from your imports rather than from pip freeze, it will contain only the packages that are actually related to your project.
How do you check which packages and versions are installed inside a virtualenv?
My thought was to create a requirements.txt file. But is there another way to do this from the CLI?
Once you activate your virtual environment, you should be able to list the installed packages with pip list and check your Python version with python --version.
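For example (package names and versions here are purely illustrative):

$ pip list
Package    Version
---------- -------
pip        21.1.1
requests   2.25.1
setuptools 56.0.0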
pip list will show you all the packages that are installed in the virtualenv. Since you want to create the requirements.txt file from the CLI, you can run this command:
pip freeze > requirements.txt
This will give you a requirements.txt file inside your current directory, for ALL the packages and libraries that are installed for that virtualenv.
Personally, I like using pigar. All you need to do after installing it is run it: it will search through the Python files in the current directory and create a requirements.txt file for you based on all the imports in those files.
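Roughly (note: the exact command depends on the pigar version; newer releases use a generate subcommand, older ones accept a bare pigar):

$ pip install pigar
$ pigar      # or: pigar generate — writes requirements.txt from your imports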
I created a project in PyCharm that uses Flask (among a few other modules), installed in a PyCharm-created virtual environment running Python 3.6.1, and the app itself works great in PyCharm. When I went to set up a requirements.txt file so others could recreate the environment, however, I noticed that only virtualenv was listed in the output file. (To create the file, I went to the console and ran pip freeze > requirements.txt.)
After testing some things out, I noticed that pip list only reported three installed modules to begin with. I know this can't be right, because in my interpreter's settings PyCharm says there are a lot of other modules installed, yet it seems they aren't actually installed.
How can I create a requirements file in PyCharm effectively? Do I just have to list all the modules I installed in my readme and have my users figure out how to install them all themselves?
(Screenshots of the Project Settings dialog and the pip list output omitted.)
Use pipreqs
$ pip install pipreqs
$ pipreqs /home/project/location
Successfully saved requirements file in /home/project/location/requirements.txt
This will export the packages used in your current project directory into requirements.txt
See pipreqs
Why not pip freeze?
pip freeze only saves the packages that were installed with pip install in your environment.
pip freeze saves all packages in the environment, including those you don't use in your current project (if you don't have a virtualenv).
And sometimes you just need to create a requirements.txt for a new project without installing the modules.
It's certainly strange; the only thing I can think of is that it's a problem with virtualenv on Windows.
Anyway, it wouldn't be best practice to create your requirements.txt from pip freeze, because you have more packages installed than the ones your project requires.
E.g., let's say that your project only requires Flask:
$ pip install Flask
$ pip freeze
click==6.7
Flask==0.12.2
itsdangerous==0.24
Jinja2==2.9.6
MarkupSafe==1.0
Werkzeug==0.12.2
As you can see, installing Flask pulled in many more packages, but you don't have to list those in your requirements.txt: they are not your project's requirements, they are Flask's requirements.
Therefore you should construct your requirements.txt manually. What I usually do is pip install Flask; pip freeze | grep Flask, then copy the line Flask==0.12.2 into my requirements.txt, doing this every time I install something with pip.
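In command form (using Flask only as an example; the grep pattern is whatever package you just installed):

$ pip install Flask
$ pip freeze | grep Flask
Flask==0.12.2
$ pip freeze | grep '^Flask==' >> requirements.txt   # append just that line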
Tutorials online are telling me to put venv in my .gitignore file. Why wouldn't I want to push my virtual environment, so that I or other developers could easily pull the project to their local machines and conveniently have all the dependencies?
On top of what Othman said below: virtualenvs are simply not portable. Trying to move one will break it, and it's easier to create a new environment than to fix a broken one. So even on deployment platforms that do use virtual environments, checking them into git is not going to work.
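For instance (paths purely illustrative), the console scripts inside a virtualenv hardcode the absolute path of the environment's interpreter, so the environment breaks as soon as that path changes:

$ head -1 venv/bin/pip
#!/home/alice/project/venv/bin/python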
virtualenv is a tool to create isolated Python environments.
Heroku gives you one environment, and you can install your packages using a requirements.txt file, which Heroku requires for Django applications.
If you want to share these dependencies with other developers, push your requirements.txt to your GitHub remote, then tell your developers to install the packages using this file.
Example
requirements.txt
Django==1.3
Fabric==1.2.0
Jinja2==2.5.5
PyYAML==3.09
To install all of these packages at once, use:
pip install -r /path/to/requirements.txt
Moreover, when you run the application on your local machine, the virtual environment files may change, which would make you push useless files to your repo.
Note: if you want to know which packages are installed in your virtual environment, use pip freeze.
If you want to export those packages to requirements.txt, run
pip freeze > requirements.txt