I would like to be able to run a Python script from any folder. However, this Python script, named statistics.py, calls upon two other files, colorbar.py and language_bytes.py. It also reads data from a folder located beneath it, in data/languages.yml.
Because of these dependencies, along with the .yml file, I have been unable to make it executable from anywhere. Even if I add #!/usr/bin/env python3 as the first line of statistics.py and drag it, along with the other two files and the data folder, into /usr/local/bin, this doesn't work.
With a little bit of digging I found out that this was because the data folder was being deleted as soon as it went into /usr/local/bin. This means that the Python script is unable to access the data that it needs, and so cannot be run.
Is there any other way to run a Python script globally?
Edit: Because this script is meant to be modifiable by users and will be open source, it would be optimal if it could be run without having to be bundled into a PyPI package. Besides, it doesn't make sense that anyone who wants to make a globally runnable script with resources should have to upload it to PyPI, even if it were just for personal use.
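One pattern worth noting, sketched below with the file names from the question, is to resolve data/languages.yml relative to statistics.py itself rather than the current working directory, so the script can find its data no matter which folder it is invoked from:

    import os

    # Locate data/ next to this script, not relative to wherever the user runs it.
    SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
    LANGUAGES_YML = os.path.join(SCRIPT_DIR, "data", "languages.yml")

    with open(LANGUAGES_YML) as f:
        raw_yaml = f.read()  # parse with a YAML library of your choice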
Related
I have a Tkinter app that uses images included in the same folder as the .py file. Running pyinstaller script.py produces an executable that runs but does not open any windows, because it looks for images that don't exist in the same subdirectory. When I copy the important images to the dist folder PyInstaller creates, the application runs correctly.
However, I would like to have a single executable that I can share with other users without also requiring them to store the images separately. The images should be bundled with the software somehow, the way commercial software (usually) doesn't require you to download assets separately from the program itself.
Is there a way to bundle Python programs and the assets they use into single-click applications?
Note that I am using Python 3 on Linux Mint. I am also something of a novice, so don't be surprised if I'm missing something obvious here.
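For reference, the usual PyInstaller approach to bundled assets (different from the solution below) is to add the images at build time with something like pyinstaller --onefile --add-data "images:images" script.py and then resolve them at run time through sys._MEIPASS; a minimal sketch, with hypothetical file names:

    import os
    import sys

    def asset_path(relative):
        # In a one-file PyInstaller build, bundled data is unpacked to a
        # temporary directory exposed as sys._MEIPASS; otherwise fall back
        # to the directory containing this script.
        base = getattr(sys, "_MEIPASS", os.path.dirname(os.path.abspath(__file__)))
        return os.path.join(base, relative)

    # e.g. tkinter.PhotoImage(file=asset_path(os.path.join("images", "logo.png")))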
It appears I've solved my own problem.
Instead of keeping the images in the same folder as main.py and using a short relative filepath to reach them, install the images in an appropriate place in the system directory tree (I used /home/$USER$/.$PROGRAMNAME$/) and have the program access the files using the absolute path to that directory. This allows you to copy the program anywhere on your computer and have it run without a problem.
However, if you want to share it with someone else, you'll need to also include an installation script that places the assets in the correct directory on their computer.
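A minimal sketch of that installation step, assuming the assets ship in a flat images/ folder next to the installer and using a hypothetical .myprogram directory in place of the $PROGRAMNAME$ placeholder:

    import os
    import shutil

    # install.py: copy bundled images into a fixed per-user directory.
    here = os.path.dirname(os.path.abspath(__file__))
    target = os.path.join(os.path.expanduser("~"), ".myprogram")
    os.makedirs(target, exist_ok=True)

    for name in os.listdir(os.path.join(here, "images")):
        shutil.copy(os.path.join(here, "images", name), target)

    # The application then loads assets by absolute path, e.g.
    # os.path.join(os.path.expanduser("~"), ".myprogram", "icon.png")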
I've built a project with Python in which a module of functions can be changed by the user. More specifically, functions can be added to or deleted from this module by other processes in the application. Now I have converted the whole project into an executable file using auto-py-to-exe, so it runs in a console window instead of through VS Code, for instance. I can't change the module if it was not added as an additional file in auto-py-to-exe, nor can the application use the module if I do add it as an additional file.
My question is: how can I turn this project into an executable with the possibility of changing this module by the program itself?
This may be tricky, but it is possible with tools like pyinstaller when you bundle your app as a directory, rather than a single file executable. The source files (albeit compiled) will be present in the directory.
In principle, you could edit files in a single-file executable, but it's probably more trouble than it's worth and may be blocked by permissions and other issues.
You could also design your software to read a file from a predefined location (or the bundle directory -- anywhere accessible by the user) and simply exec the string of the code to 'load' it. It could look more or less like an extension/scripting system for your program. One example comes to mind: the iterm2 software Python API.
It should go without saying that you must trust your users/inputs if you give them the ability to arbitrarily change the software code.
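A rough sketch of that exec-based loading, with hypothetical file and directory names and with the warning above very much in mind:

    import os

    # Load user-editable extension code from a known location and exec it
    # into a namespace the application controls.
    EXT_PATH = os.path.join(os.path.expanduser("~"), ".myapp", "extensions.py")

    def load_extensions():
        namespace = {}
        if os.path.exists(EXT_PATH):
            with open(EXT_PATH) as f:
                exec(f.read(), namespace)
        return namespace

    hooks = load_extensions()
    # e.g. hooks.get("on_startup", lambda: None)()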
Let's say I have a script I've written:
~/workspace/myscript/script.py
If I have, for example, a ~/bin which I have added to my $PATH, then I could create a symbolic link
~/bin/script -> ~/workspace/myscript/script.py
And everything works fine, I can call my script from anywhere.
Then, say my script starts to grow, and I separate it out
~/workspace/myscript/
script.py
mylib.py
I now run into a problem, as described here, that if I am calling my python script directly (as opposed to importing it as a module) then I cannot do a relative import.
The only solution I have seen is to package the whole program into a fully fledged Python package with a setup.py and install it system-wide (or manage a Python library folder in my home directory).
This seems like a lot of extra work for the sake of breaking my code into multiple python files.
Is there some way I can:
Call the script from anywhere (have it callable on my $PATH),
Have the code separated into multiple files,
Not have to manage a full Python package and installation.
All at once?
You can add the root directory of your module to the Python path:
export PYTHONPATH="$PYTHONPATH:$HOME/workspace/myscript/"
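Alternatively, a sketch (not part of the original answer): the entry-point script can put its own real directory on sys.path before importing its siblings, so the symlink in ~/bin keeps working without touching PYTHONPATH:

    #!/usr/bin/env python3
    # script.py
    import os
    import sys

    # Follow the ~/bin symlink back to the real file, then make the
    # script's own directory importable so "import mylib" works.
    sys.path.insert(0, os.path.dirname(os.path.realpath(__file__)))

    import mylib  # noqa: E402 (import after the path tweak is intentional)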
I'm relatively new to Python and used to C, and I'm trying to find a way to compile several .py files in a directory together as one file. Some of the methods I've seen are to make an egg or to bundle them up as a zip file and run it with 'python foo.zip', but I'm pretty restricted on those options.
The .zip method is closest to what I'm after but what I need is more along the lines of importing that file in the main script and not as an argument to the interpreter.
I have to run this code on several machines and would rather not have to copy a whole folder of modules with it, and would also rather not have to paste all of my code into one file.
Caveats: I'm running a pretty old version of python (2.4.3) on machines that are cut off from the Internet and that I don't have physical access to, so I can't install other modules. I would have to be able to pull it off with old vanilla python.
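For what it's worth, putting a zip archive itself on sys.path has been supported since Python 2.3 via zipimport, so something along these lines should work even on 2.4.3; the archive and module names here are made up:

    import os
    import sys

    # Put the zip itself on the import path (zipimport, Python 2.3+),
    # then import modules stored inside it like normal modules.
    here = os.path.dirname(os.path.abspath(__file__))
    sys.path.insert(0, os.path.join(here, "mylibs.zip"))

    import helpers  # resolved from inside mylibs.zip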
I want to distribute my Python application to co-workers for them to use. The application will only be run on Linux systems, but the users do not have admin privileges and so cannot install my application's module dependencies. I would like the users to be able to untar my application and then run my main.py script. Running another one-time 'install'-type script is okay, but not much else.
PyInstaller is close to what I want, except that I would like to distribute the source code of my application as well. So the application should be stand-alone and self-contained (with or without the Python interpreter is fine, preferably with), but users should be able to make small changes to the code and rerun the application. My ideal solution is to create some sort of compressed/compiled archive of all my application's module dependencies and distribute that with my application. It doesn't have to include all dependencies, but at least the non-standard packages. The application would then import modules from this archive instead of the user's PYTHONPATH.
I tried virtualenv, but having the users source the activate script was a little too much. I've been looking into numerous other solutions, but I can't find one that works for me.
Why don't you create a directory with the interpreter you want to use, and add in any modules, etc.? Then drop in a bash script, say run.sh, which calls the program. It can launch your chosen interpreter with your Python files, arguments, etc.
Any source files can remain this way and be edited in place. You could tar and distribute the whole directory, or put it in something like git.
One approach is to use virtualenv. It is designed to create isolated Python environments and does a good job of it. It should be possible (link) to package the virtualenv with your app with some effort. However, virtualenv is not designed for that, so it's not as easy as it could be.
The package-virtualenv GitHub project might also help.