A standard way of installing Poetry projects globally - python

I often want to use my own personal Python projects for daily use. In Go, I can just run go install project.go and the compiled binary is placed in ~/go/bin (which is on my PATH), so I can then run my program from anywhere on my system. Is there an equivalent for this in Python and Poetry?
I can probably work around this by writing a script that runs something along the lines of poetry run $project and putting that script in a directory on my PATH. But what I want to know is whether there is a standard way of doing this in Python and Poetry.
What I want is something along the lines of pip install program but for my own projects.
Right now, I need to cd into my project folder and run poetry run $project every time I want to run my program, which is really cumbersome.

Pip works with Poetry projects.
Try:
pip install /path/to/your/project
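For the installed project to give you a command you can run from anywhere, your pyproject.toml needs a scripts entry pointing at an entry-point function. A minimal sketch, assuming a package named myproject with a main() function in myproject/cli.py (all names here are illustrative):
[tool.poetry]
name = "myproject"
version = "0.1.0"
description = "Example project"
authors = ["You <you@example.com>"]

[tool.poetry.scripts]
myproject = "myproject.cli:main"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
After pip install /path/to/your/project, a myproject command should appear on your PATH; pip install --user puts it in ~/.local/bin if you prefer a per-user install.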

Related

How can I run python code with the packages I use without an IDE?

I used PyCharm to write a program that mutes me on discord when I say “mute myself”.
It works fine when I run it with PyCharm, but when I try to run it with idle or cmd, it won't find the modules I use.
It sounds like you may want to look into something like Poetry to manage the virtual environment for your program. Poetry will help you bundle together the modules your program needs in a structured way.
After you've packaged your program using Poetry, check out the "run" command.
If you don't like Poetry, you can find other alternatives over on the PyPA documentation.
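As a rough sketch of that workflow (the dependency name below is just a placeholder for whatever modules your program imports):
poetry init # answer the prompts to create a pyproject.toml
poetry add some-package # placeholder; add the modules your program needs
poetry run python main.py # run your script inside the managed virtual environment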
PyCharm, by default, creates a virtual environment where it installs the modules (the venv folder).
I am assuming you are on Windows?
To use that environment's packages, you can activate it in your command line:
venv\Scripts\activate.bat
and run your script like
python main.py
Another alternative would be to package your project into an exe using a tool like PyInstaller:
# while in your virtual environment
pip install pyinstaller
pyinstaller --onefile main.py

How to ensure that needed packages are always installed/available

I'm working on a script in python that relies on several different packages and libraries. When this script is transferred to another machine, the packages it needs in order to run are sometimes not present or are older versions that do not have the same functionality and cause the script to fail.
I was considering using a virtual environment, but I can't find a way to have the script use the specific environment I design as its default, and in order to use the environment a user must manually activate it from the command line.
I've also looked into trying to check the versions of the packages installed on the machine, and if they are not sufficient then updating them from the script as described here:
Installing python module within code
Is there any easier/surefire way to make sure that the needed packages will always be available regardless of where it's run?
The normal approach is to create an installation script and have that manage your dependencies. Then when you move your project to a new environment your installer will check that all dependencies are present.
I recommend you check out setuptools: https://setuptools.readthedocs.io/en/latest/
If you don't want to install dependencies whenever you need to use your script somewhere new, then you could package your script into a Docker container.
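As a minimal sketch of the setuptools route (the project name and pinned dependencies are placeholders; list whatever your script actually imports):
# setup.py
from setuptools import setup, find_packages

setup(
    name="myscript",  # placeholder project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "requests>=2.20",  # illustrative dependencies with minimum versions
        "numpy>=1.16",
    ],
)
Running pip install . in the project directory will then install any missing or too-old dependencies along with the script.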
If the problem is ensuring the required packages are available in a new environment or virtual environment, you could use pip to generate a requirements.txt and check it into version control, or use a tool that does that for you, like pipenv.
If you would prefer to generate a requirements.txt by hand, you should:
Install your dependencies using pip
Run pip freeze > requirements.txt to generate a requirements.txt file
Check requirements.txt into your source management software
When you need to set up a new environment, use pip install -r requirements.txt
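The generated requirements.txt is just a flat list of pinned packages, one per line; for example (the versions shown are illustrative):
requests==2.28.1
numpy==1.24.0
pandas==1.5.2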
The solution I've been using is to include a custom library (a folder with all of my desired packages) in the folder with my script, and I simply import them from there:
from Customlib import pkg1, pkg2,...
As long as the custom library and script stay together in the same folder, it will always have access to the right packages and the correct versions of those packages.
I'm not sure how robust this solution actually is or what possible bugs may arise from this if it is passed from machine to machine, but for now this seems to work.

Run python script from another computer without installing packages/setting up environment?

I have a Jupyter notebook script that will be used to teach others how to use Python.
Instead of asking each participant to install the required packages, I would like to provide a folder with the environment ready from the start.
How can I do this?
What is the easiest way to teach Python without running into technical problems with packages/environments, etc.?
If you just need to install Python dependencies, you can use #Aero Blue's solution. However, the users would probably need to create a virtual environment, so they don't mess with other environments, versions, etc.
However, if they also need some Linux packages, this would not be enough. Therefore, I would suggest using Docker. You would provide them with a Dockerfile, which you set up to install any dependencies (whether for Python or Linux), and they would just need to use the docker build and docker run commands.
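A minimal Dockerfile for a notebook-based course could look like the following sketch (it assumes the Python dependencies are listed in a requirements.txt next to the notebooks):
# Dockerfile
FROM python:3.10-slim

WORKDIR /course
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt jupyter

COPY . .
EXPOSE 8888
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--no-browser", "--allow-root"]
Participants would then only need docker build -t course . followed by docker run -p 8888:8888 course.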
The easiest way I have found to package Python files is to use PyInstaller, which packages your Python file into an executable.
If it's a single file I usually run pyinstaller main.py --onefile
Another option is to have a requirements file.
This reduces installing all the packages to one command: pip install -r requirements.txt
You would need to use a program such as py2exe, PyInstaller, or cx_Freeze to package the file, the modules, and a lightweight interpreter together. The result will be an executable which does not require the user to have any modules, or even Python, installed to run it; however, because of the bundled interpreter, it can get quite large (which is why Python is not commonly used to make executables).
Have you considered using Azure Notebooks or another Jupyter hosting service? Most of these have a special syntax you can use to perform pip installs. For Azure it is !pip install.
https://notebooks.azure.com
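For example, a single cell at the top of the notebook can pull in whatever the lesson needs (pandas is just an illustrative package):
!pip install pandas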

Export Python project from one PC to another

I would like to easily export a Python project from one PC to another. When I created the project, I used a virtual environment in order to avoid problems with different package versions.
What I did was to just copy the project folder and paste it on the destination PC. Once I opened the project with PyCharm, I activated the virtual environment with project_path/venv/Scripts/activate, but when I tried to execute any script, it said it didn't find the modules.
Which workflow should I follow in order to create projects and be able to run them on multiple PCs without needing to install all the dependencies?
Since you did not specify your Python version, I will provide a solution that works for both Python 2.x and 3.x.
My suggestion is to create a requirements.txt file containing all your requirements.
This file can be easily prepared using the output from the command:
pip freeze
Then you can paste the output into your requirements.txt file, and when you are going to install your Python code on another PC you can simply run:
pip install -r requirements.txt
to install your requirements again.
Depending on your project it could be possible, for example, to create a single EXE file (if you are using Windows machines), but more detail is needed if this is the case.
In case you are using Python 3, the method that is arguably the most popular in the Python community at the moment is Pipenv.
Here's its relevant documentation.
And here you can read a simple example of a workflow.
If you are using Python 3, then use Pipenv. It will automatically create a Pipfile and a Pipfile.lock. That will ensure that reinstalling dependencies on a different machine gives you the same packages.
Basic and helpful commands:
pipenv shell # activate the virtualenv
pipenv install # install the dependencies listed in the Pipfile
pipenv install requests # install the requests lib and automatically update Pipfile and Pipfile.lock
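The resulting Pipfile is human-readable; after the commands above it might look roughly like this (the Python version shown is illustrative):
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[requires]
python_version = "3.8"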

How to add uninstall script to debian package

I have created a simple Debian package for my Python program using this post.
I am also using a postinst script to set up and populate MySQL tables. The package gets installed with the following command:
sudo apt install mypackage.deb
I now want to add an uninstall script so that if the package is removed, uninstall script gets called to cleanup the environment.
How can I incorporate uninstall script with the debian package?
You probably need to write a postrm script too, in the same way as you wrote the postinst script. See the maintainer scripts flowcharts to understand how these scripts work.
A quote from the same article:
"It is possible to supply scripts as part of a package which the package management system will run for you when your package is installed, upgraded or removed.
These scripts are the control information files preinst, postinst, prerm and postrm. They must be proper executable files; if they are scripts (which is recommended), they must start with the usual #! convention. They should be readable and executable by anyone, and must not be world-writable."
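As a sketch of what such a postrm could look like for your case, where the database name and cleanup statement are placeholders for whatever your postinst actually created:
#!/bin/sh
# postrm -- dpkg calls this with the reason ("remove", "purge", ...) as $1
set -e

case "$1" in
    purge)
        # Placeholder cleanup: undo whatever the postinst set up
        mysql -e "DROP DATABASE IF EXISTS mypackage_db" || true
        ;;
    remove|upgrade|failed-upgrade|abort-install|abort-upgrade|disappear)
        # Conventionally, data and configuration are kept on plain remove
        ;;
esac

exit 0
Gating the cleanup on purge follows the Debian convention that remove keeps user data in place while purge deletes it.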
