I want to output the requirements.txt for my Python 3 project in PyCharm. Any ideas?
Try the following command:
pip freeze > requirements.txt
Pigar works quite well; I just tested it:
https://github.com/damnever/pigar
The pip freeze answer above will only work properly if you set up a virtualenv before you started installing things with pip. Otherwise you will end up with requirements that are "surplus to requirements". Pigar, by contrast, looks at all your import statements (in subdirectories too) and looks up the versions currently being used. Anyway, if you have the virtualenv set up before you start, it will all be cleaner; otherwise pigar can save you.
Open the terminal in PyCharm and type in this command:
pip freeze > requirements.txt
and requirements.txt will be created automatically.
This post is a bit old, but I'll contribute what I learned all the same. As far as I know, there are three ways to generate requirements.txt:
using FREEZE
pip freeze > requirements.txt
Before running the command, be sure that the virtual environment is activated, because the command is executed in the project folder. A requirements.txt file with the Python dependencies will be generated in that same folder.
If you use this command all requirements will be collected from the virtual environment. If you need to get requirements used only in the current project scope then you need to check next options.
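The freeze output is just "name==version" lines, one per installed package. A minimal Python sketch of that format (the sample package names below are illustrative, not from any real environment):

```python
def parse_freeze_line(line):
    """Split one 'name==version' line from pip freeze into (name, version)."""
    name, _, version = line.strip().partition("==")
    return name, version

sample = """\
requests==2.25.1
numpy==1.20.3
"""

# Build a {name: version} mapping from the sample freeze output.
pinned = dict(parse_freeze_line(l) for l in sample.splitlines() if l)
print(pinned)  # {'requests': '2.25.1', 'numpy': '1.20.3'}
```

This is why the file round-trips cleanly through pip install -r: each line is already a valid requirement specifier.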
using DEPHELL
pip install --user dephell
using PIPREQS
pip install pipreqs
pipreqs /path/to/project
Note
Why use pipreqs? Because pip freeze collects all dependencies from the environment, while pipreqs collects only the requirements actually used in the current project.
Also, freeze only records packages that were installed with pip install, and it records every package in the environment.
If you want to generate requirements.txt without installing the modules, use pipreqs.
If there are other ways to do it, I'm always grateful to keep learning :)
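Conceptually, pipreqs works by scanning your source files for import statements. Here is a rough sketch of that idea (not pipreqs' actual code; the real tool also maps import names to PyPI names and looks up versions, which this does not):

```python
import ast

def top_level_imports(source):
    """Collect top-level package names imported by a piece of Python source."""
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            # level == 0 excludes relative imports like "from . import x"
            names.add(node.module.split(".")[0])
    return sorted(names)

code = "import numpy as np\nfrom requests.adapters import HTTPAdapter\n"
print(top_level_imports(code))  # ['numpy', 'requests']
```

Because it looks at what the code actually imports, this approach ignores packages that merely happen to be installed in the environment.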
You can do it in PyCharm by going to Settings and then Project Interpreter. Select all the packages with their versions, then copy all this data into an MS Word document; Word will treat it as a table. Delete the middle column of the table, then copy the remaining data into Notepad++. Search for double spaces or a tab and replace them with '=='. Save the file as requirements.txt. It will work.
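The Word/Notepad++ round trip above can also be done in a few lines of Python. This sketch assumes the text you copied from the interpreter page has the package name first and the version second on each line, separated by spaces or tabs:

```python
def to_requirements(text):
    """Turn 'name   version' lines (as copied from a package list)
    into 'name==version' requirements lines."""
    out = []
    for line in text.splitlines():
        parts = line.split()          # handles runs of spaces or tabs
        if len(parts) >= 2:           # skip blank or malformed lines
            out.append(f"{parts[0]}=={parts[1]}")
    return "\n".join(out)

copied = "requests\t2.25.1\nnumpy   1.20.3"
print(to_requirements(copied))
# requests==2.25.1
# numpy==1.20.3
```

Any extra columns (such as "latest version") would end up ignored, since only the first two fields per line are used.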
Related
I am working on a GitHub project that will eventually be released on both conda and PyPI. Because of PyPI (and current development needs) I need to make the installation work using this command:
python -m pip install -e . (from inside the github cloned folder - pip 21)
However, my project has a dependency on a forked GitHub repo. The problem is that I cannot find a way to install the package from that fork using the command above, but I have to. Every trick I could find either got me nothing or the prod version of the repo. The only two things that worked are (1) a manual or scripted launch of pip install [cannot do this, I need setup.py to install things] and (2) using conda [cannot rely on conda alone, I need regular pip to work].
Can you please help me or point me in the right direction?
I use a requirements.txt that contains this line at the bottom:
...
-e git+https://github.com/jojurgens/pyqode.qt.git#master#egg=pyqode.qt
and a setup.py that contains this
with open(here / "requirements.txt", encoding="utf-8") as f:
    requirements = f.read().splitlines()

setup(
    name="qiskit_metal",
    version="0.0.2",
    install_requires=requirements,
)
On execution I get this error:
...error in setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Parse error at "'-e git+h'": Expected W:(abcd...)
Actually, I was finally able to find an answer. For whoever may need this info, here is what I did:
In requirements.txt I changed the line to this:
pyqode.qt @ git+https://github.com/jojurgens/pyqode.qt.git#egg=pyqode.qt
that alone took care of the error.
That also solved my other problem (not described here) about "updating" the conda environment without corrupting it. Now I update the environment with the same install command (as opposed to the update command, which does not work): python -m pip install -e .
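One detail worth knowing with the setup.py snippet above: read().splitlines() hands every line to install_requires, including blanks and comment lines, which can cause exactly this kind of parse error. A slightly more defensive sketch (the sample requirements text below is illustrative):

```python
def load_requirements(text):
    """Return requirement lines, skipping blanks and '#' comment lines
    so stray comments never reach install_requires."""
    reqs = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            reqs.append(line)
    return reqs

sample = (
    "# pinned deps\n"
    "numpy==1.20.3\n"
    "\n"
    "pyqode.qt @ git+https://github.com/jojurgens/pyqode.qt.git#egg=pyqode.qt\n"
)
print(load_requirements(sample))
```

Note that a '#' inside a URL fragment (like #egg=) is fine here, since only whole-line comments are dropped.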
I think what you should specify is the version of Shapely in the requirements, for example Shapely==<version>. If you want to install it from a different source (and not from pip), do it outside of requirements.txt; for that you can invoke a shell command from your Python code.
I would like to easily export one Python project from one PC to other. When I created the project, I used a virtual environment in order to avoid problems with different package versions.
What I did was just copy the project folder and paste it on the destination PC. Once I opened the project with PyCharm, I activated the virtual environment with project_path/venv/Scripts/activate, but when I tried to execute any script, it said it didn't find the modules.
What workflow should I follow to create projects and be able to run them on multiple PCs without needing to install all the dependencies by hand?
Since you did not specify your Python version I will provide a solution working for both Python 2.x and 3.x.
My suggestion is to create a requirements.txt file containing all your requirements.
This file can be easily prepared using the output from the command:
pip freeze
Then you can paste the output into your requirements.txt file, and when you are going to install your Python code on another PC you can simply run:
pip install -r requirements.txt
to install your requirements again.
Depending on your project it could also be possible, for example, to create a single EXE file (if you are using Windows machines), but more detail would be needed if that is the case.
In case you are using Python 3 the method that is at the moment arguably more popular in the Python community is Pipenv.
Here's its relevant documentation.
And here you can read a simple example of a workflow.
If you are using Python 3, then use Pipenv. It will automatically create a Pipfile and Pipfile.lock, which ensure that reinstalling the dependencies on a different machine yields the same packages.
Basic and helpful commands:
pipenv shell # activate the virtualenv
pipenv install # install the dependencies listed in the Pipfile
pipenv install requests # install the requests lib and auto-update Pipfile and Pipfile.lock
I have a Python project which I've been working on. Now I've realized that I need a virtual environment for it. How can I create one for an existing project? If I do this:
virtualenv venv
will it work fine? Or do I have to re-create my project, create the virtualenv, and then copy the existing files into it?
You can just create a virtual environment with virtualenv venv and activate it with venv/bin/activate.
You will need to reinstall all dependencies using pip, but the rest should work just fine.
The key thing is creating requirements.txt.
Create a virtualenv as normal. Do not activate it yet.
Now you need to install the required packages. If you do not readily remember it, ask pip:
pip freeze > requirements.txt
Now edit requirements.txt so that only the packages you know you installed are included. Note that the list will include all dependencies for all installed packages. Remove them, unless you want to explicitly pin their versions, and know what you're doing.
Now activate the virtualenv (the normal source path/to/virtualenv/bin/activate).
Install the dependencies you've collected:
pip install -r requirements.txt
The dependencies will be installed into your virtualenv.
In the same way you'll be able to re-create the same env on your deployment target.
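On Python 3 the same environment creation can also be done with the stdlib venv module instead of the virtualenv package. A small sketch (with_pip=False keeps it fast for illustration; real use would normally pass with_pip=True so pip install -r works inside the new env):

```python
import tempfile
import venv
from pathlib import Path

# Create a virtual environment programmatically in a temporary directory.
target = Path(tempfile.mkdtemp()) / "venv"
venv.create(target, with_pip=False)

# Every venv gets a pyvenv.cfg marker file at its root.
print((target / "pyvenv.cfg").exists())  # True
```

Activation then works exactly as described above (source path/to/venv/bin/activate, or Scripts\activate on Windows).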
If you are using Windows, follow this procedure:
Step 1: Go to the root directory of your existing Python project
Step 2: Create a virtual environment with virtualenv venv
Step 3: Go to /Scripts and run activate
Then, if you would like to install all the required libraries: pip3 install -r requirements.txt
There is something that I would like to add to this question, because newbies always have this problem, and even I make a mistake like this once in a while.
If you do not have a requirements.txt for an already existing Python project, you are in trouble. Set aside 2-3 hours of the day to recover the requirements.txt for it.
The best way to see it through, and only if you are lucky, is to remember the most important package in that project. By most important, I mean the package with the highest number of dependencies.
Once you have located this highest-dependency package, install it via pip. This will install all of its dependencies.
Now see if it works. If it does, voila!
If it doesn't, you will have to find where the conflicts are and start resolving them one by one.
I was working in such a situation recently, where there was no requirements.txt. But I knew the highest dependency was the deep-learning package Sentence-Transformers; I installed it and, with minor conflicts, resolved everything.
Best of luck! Let me know if this ever helps anyone!
I will format my PC and I would like to somehow collect all the Python modules that I currently have and package them (zip, rar, etc.) or create an index file of them, so that when I'm done formatting the PC I can reinstall them all in one go, either by using the package or by using the index to pip install them all in a batch.
Is there any python module that allows to do that?
Use pip
pip freeze > requirements.txt
This will save the names of all your installed python modules to a file called requirements.txt.
Then when you want to install them again run the following command.
pip install -r requirements.txt
Using a package manager like this is a good practice to get into, especially if you use a code repository, so that you don't upload all the dependencies to the repo.
If you are not already doing so, it's a good idea to use a virtual environment for your Python projects.
This creates a unique Python environment for each of your projects, keeping each project self-contained.
Yes: pip. Running pip freeze will give you the list of all the installed modules. Then all you need to do is install those modules by running pip install -r your_file_with_modules_list.
Let's say a developer is working on a project when he realizes he needs to use some package.
He uses pip to install it. Now, after installing it, should the developer write it down as a dependency in the requirements file / setup.py?
What does that same dev do if he forgot to write down all the dependencies of the project (or didn't know better, since he hasn't been doing this long)?
What I'm asking is: what's the workflow when working with external packages from PyPI?
The command:
pip freeze > requirements.txt
will copy all of the dependencies currently in your python environment into requirements.txt. http://pip.readthedocs.org/en/latest/reference/pip_freeze.html
It depends on the project.
If you're working on a library, you'll want to put your dependencies in setup.py, so that if you put the library on PyPI, people will be able to install it and its dependencies automatically.
If you're working on an application in Python (possibly web application), a requirements.txt file will be easier for deploying. You can copy all your code to where you need it, set up a virtual environment with virtualenv or pyvenv, and then do pip install -r requirements.txt. (You should be doing this for development as well so that you don't have a mess of libraries globally).
It's certainly easier to write the packages you're installing into your requirements.txt as soon as you've installed them than to try to figure out which ones you need at the end. What I do so that I never forget is write the packages to the file first and then install with pip install -r.
pip freeze helps if you've forgotten what you've installed, but you should always read the file it created to make sure that you actually need everything that's in there. If you're using virtualenv it'll give better results than if you're installing all packages globally.
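As a last sanity check on that workflow, you can diff what the requirements file claims against what you believe is installed. A small sketch with a pure helper function (the package names below are illustrative; no real environment is consulted):

```python
def missing_requirements(req_lines, installed):
    """Return requirement names from req_lines that are not in the
    installed set (case-insensitive, pinned 'name==version' lines)."""
    installed_lower = {p.lower() for p in installed}
    missing = []
    for line in req_lines:
        name = line.split("==")[0].strip().lower()
        if name and name not in installed_lower:
            missing.append(name)
    return missing

reqs = ["requests==2.25.1", "leftpad==0.1"]
print(missing_requirements(reqs, {"requests", "numpy"}))  # ['leftpad']
```

Running something like this before a deploy catches the "wrote it down but never installed it" case early.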