I was curious about the process of sending someone a simple Python script, with some imported modules, and how that works on the receiving end.
I guess my overall question is the following. Does the receiver of the script have to do anything at all, or can they just run the file and get the intended result? Do they have to install Python, the modules used in the script, etc.?
Thanks for any answers; I am sure there are plenty of “well, it depends…” examples, which are fine. I am still learning, so any answer is great.
If you are sending it to someone who has everything needed to develop with Python, what you need to do is work in a virtual environment:
Virtual environments (venv), in a nutshell, are Python's way of handling package versioning and ensuring that if someone else tries to run your script, they can replicate your dependencies. To start, run
python -m venv your_venv_name
# if on Linux/macOS:
source your_venv_name/bin/activate
# if on Windows (cmd):
your_venv_name\Scripts\activate
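Once activated, a quick sanity check (assuming a Unix-like shell) is to confirm that the interpreter on your PATH now points inside the venv:
which python
# should print something like your_venv_name/bin/python, not /usr/bin/python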
Then you will have a fresh environment based on the Python version you were using, with no dependencies installed, so you can start installing packages with pip.
After you install everything, run
pip freeze > requirements.txt
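The resulting requirements.txt is just a plain text file with one pinned package per line, for example (the package names and versions here are only illustrative):
flask==2.3.2
requests==2.31.0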
Now you can share your project, and the other devs just have to create their own venv and run
pip install -r requirements.txt
If, on the other hand, you are sending the script to someone who doesn't have Python installed on their machine, you will have to generate an executable file: https://realpython.com/pyinstaller-python/
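For example, a minimal PyInstaller run might look like this (your_script.py is a placeholder for your entry-point file):
pip install pyinstaller
pyinstaller --onefile your_script.py
# the bundled executable ends up in the dist/ folder
Keep in mind that PyInstaller bundles for the OS it runs on, so you need to build on (or for) the receiver's platform.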
Related
I am a beginner in Python. I have built a website using Django, Flask, XML, and WTForms, and I have also used some Python API modules. The website was created successfully and works well on my local machine.
But if I want to run it on another machine that has Python available, I need to install all of the above-mentioned modules manually.
Do we have something similar to Gradle, Maven, or Ant which will download/install the required modules during the first run?
Kindly help me.
One way is to freeze your current local Python package installations into a requirements.txt file and then install everything in one go on another machine.
$ pip freeze > requirements.txt
Copy the requirements file to the other machine, install Python, and then ...
$ pip install -r requirements.txt
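Putting it together on the new machine, the whole sequence might look like this (ideally inside a fresh virtual environment):
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt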
If I create a virtualenv on my local machine, activate it, and try to run python3, it works fine (with the imported modules). However, after I send it to the live server (using scp and FileZilla) it gives this error:
-bash: /<path>/venv4/bin/python3: cannot execute binary file: Exec format error
This also happens with the python and python3.8 binaries in the same venv.
I have tried reinstalling virtualenv and pipx, recreating the virtualenv and reuploading a few times.
It seems that it can't find the venv's interpreter: when I activate the virtualenv on the live server and type "which python3", it shows me the system python3:
/usr/bin/python3
It also does not work if I try to execute the venv's python3 directly, using the full path.
The reason I'm doing this is that the old virtualenv I was using has stopped working; it can't seem to find the installed modules anymore, and I'm not sure why.
Any help would be much appreciated.
I believe some pip packages contain more than just Python code and must be compiled. If your host OS differs from your server OS, or you have different libraries installed, the host-compiled code will not be compatible with your server.
Common practice is to create a file with a list of required packages, using something like
pip freeze > requirements.txt
and rebuild the environment on the server, using something like
pip install -r requirements.txt
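In practice that means copying only your source code and requirements.txt to the server (not the venv directory itself) and rebuilding there, e.g. (paths are illustrative):
# on the server, inside your project directory
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt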
I only have Python files which require some packages to be installed. I have conda installed on my computer as well. I just have the code; the project is not running anywhere at the moment. I checked this question, but as far as I understand, an assumption of that solution is that you run the export command from a running environment. I don't have the environment, only the code, which has import statements inside. Is there a way to automatically install all the packages needed?
I have created a simple debian package for my python program using this post.
I am also using a postinst script to set up and populate MySQL tables. The package gets installed with the following command.
sudo apt install mypackage.deb
I now want to add an uninstall script so that if the package is removed, uninstall script gets called to cleanup the environment.
How can I incorporate uninstall script with the debian package?
You probably need to write a postrm script too, the same way you wrote the postinst script. See the maintainer scripts flowcharts to understand how these scripts work.
A quote from the same article:
"It is possible to supply scripts as part of a package which the package management system will run for you when your package is installed, upgraded or removed.
These scripts are the control information files preinst, postinst, prerm and postrm. They must be proper executable files; if they are scripts (which is recommended), they must start with the usual #! convention. They should be readable and executable by anyone, and must not be world-writable."
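For example, a minimal postrm sketch that undoes the database setup might look like the following; the database name mypackage_db is a placeholder, and whether you clean up on remove or only on purge depends on your package's policy:
#!/bin/sh
set -e

case "$1" in
    purge)
        # drop what the postinst created; mypackage_db is a placeholder name
        mysql -e "DROP DATABASE IF EXISTS mypackage_db;" || true
        ;;
esac

exit 0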
I'm trying to allow other computers to run a simple CLI application that I've put in a virtual environment. After searching for a while, I ran pip freeze and generated a requirements.txt. When I attempt to install dependencies on a remote computer via pip install --editable, the terminal outputs a Python OS error.
The project is available at https://github.com/JonW27/calc
Screenshots of the error are provided:
I have a strong feeling that I've made a beginner's mistake or did something dumb. If the post needs clarification, please specify; I honestly don't know what's wrong with it. I made the venv relocatable and ran into no errors.
Based on the error message, it appears you are attempting to install the package to Python's system-wide location. Try re-creating your virtualenv.
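For example, something along these lines from inside the project directory (assuming the setup.py from the linked repo is there):
deactivate                 # if a broken venv is still active
rm -rf venv                # remove the old environment
python3 -m venv venv
source venv/bin/activate
pip install --editable .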