I have a few simple scripts in a single folder, something like this:
project_root/moduleA.py
project_root/moduleB.py
project_root/moduleC.py
project_root/moduleD.py
project_root/config.py
project_root/run.py
So run.py imports them all, while each module imports the config file (and can be run on its own). I want to create an executable that mimics calling python run.py and holds all the data of all the files imported by run.py. How would I do that with PyInstaller or a similar system? The binary will be executed under Linux (Debian).
If you want a binary that can be run even if the user doesn't have Python, and you don't mind your binary being relatively large, you could use the freeze.py program. freeze.py should come with your Python installation, so locate it on your system, and then run:
python /your/path/to/freeze.py /your/path/to/project_root/run.py
This will package up all of your code, and the parts of Python necessary to run your code, into an executable, run, that you can then use and distribute.
A more detailed description of freeze can be found here.
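Since the question asks about PyInstaller specifically: a minimal sketch of the equivalent PyInstaller invocation, using its documented programmatic entry point (the path is taken from the layout above; build on the target OS, since PyInstaller does not cross-compile):
# Requires: pip install pyinstaller
import PyInstaller.__main__

PyInstaller.__main__.run([
    "project_root/run.py",  # entry script; its imports are collected automatically
    "--onefile",            # emit a single self-contained executable in dist/
])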
I am new to compiling Python and am trying to get it working for compiled scripts under Python 3.9, with Miniconda installed.
Here is my main.py in the root folder of my project.
from say_hello import just_say_hello
print('Hi')
just_say_hello()
Here is say_hello.py
def just_say_hello():
    print('Hello')
The files are in the same folder. In that folder, I run the following command-line statement to compile:
python -m compileall .
Afterwards, I run the following commandline-statement to run:
cd __pycache__
python .\main.cpython-39.pyc
This results in the following error:
ModuleNotFoundError: No module named 'say_hello'
Why can this module not be found when running the script from the __pycache__ folder? Both compiled script files are in there, so it should not be such a big problem. When running the normal original script, everything works fine; it is only when using compileall that things suddenly break. However, I really need to compile.
How can I just compile this python script in such a way I can run it without such issues?
I tried several things. For example, I renamed the .pyc files to normal .py files. After running
python main.py
I received the following error:
ValueError: source code string cannot contain null bytes
So I really need a solution to run my compiled multi file script.
Python is an interpreted language; this means you don't have to compile your scripts to run them (even though they are technically still compiled). When you run a Python file, it is compiled to bytecode (not machine code) and then executed. Modules imported by the file you are running are compiled too, and the result is cached so they don't have to be recompiled the next time they are imported; this is what the .pyc files are. I would not recommend deleting them. I hope this helps.
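For illustration, a minimal sketch of producing such a cache file explicitly with the standard library (the filename comes from the question above):
import py_compile

# Writes the cached bytecode to __pycache__/say_hello.cpython-39.pyc
# (the exact tag depends on your interpreter version).
py_compile.compile("say_hello.py")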
Module import is one of many things that are wrong with Python, and just like significant whitespace, Python seems proud of it.
Having said that, I usually solve my import problems by adding everything to the PYTHONPATH: either via the environment variable (how depends on the OS) or in the code itself with sys.path.append('..'), as sketched below.
Expect new problems when you reboot or move your application to a different machine or even a different directory. Or when you want to make your application cross-platform.
Test carefully for these scenarios.
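As a sketch of the in-code variant, anchored on the script's own location rather than the current working directory (config is an assumed sibling module, as in the first question's layout):
import os
import sys

# Prepend the directory containing this file, so sibling modules are
# importable no matter where the script is launched from.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

import config  # hypothetical sibling module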
PS. There is also a somewhat older answer that explains different ways of importing here: How can I import other Python files?
When you compile it, Python produces bytecode, not machine code, and the cached files keep interpreter-tagged names like say_hello.cpython-39.pyc, which the import system only resolves from a __pycache__ directory sitting next to the source files.
All you need to do to make the compiled files work is move them into the same directory as the uncompiled versions (or rename them to plain module names, as sketched below).
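A sketch of that rename variant for a sourceless layout (assuming the Python 3.9 cache tag from the question):
import shutil

# Copy the cached bytecode out of __pycache__ and strip the interpreter
# tag, so `import say_hello` can resolve the plain module name.
shutil.copy("__pycache__/main.cpython-39.pyc", "main.pyc")
shutil.copy("__pycache__/say_hello.cpython-39.pyc", "say_hello.pyc")

# `python main.pyc` now works; if you want to ship bytecode only,
# delete the .py sources afterwards and Python will import from the
# .pyc files directly.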
I have a Python project which takes in a bunch of PDF files from a directory, scrapes data from them, and then matches that scraped data with some data in a CSV file.
The whole project has 2-3 Python scripts, used as modules, and also depends on pdftotext, pandas, NumPy, etc.
Now I can pip freeze my conda env, which gives me a requirements.txt file with all the packages to install.
However, I want the main Python script (which calls the other modules and runs the whole project) to be runnable by a less technical person who doesn't work with pandas and other such Python stuff.
So is there a way I can make this whole project into an executable file that encapsulates all the dependencies, packages, and scripts, so that just running that executable in the terminal runs the whole project, without the other person having to install all the dependencies themselves from a requirements.txt file?
I can't use Docker, unfortunately, as that is not permitted right now for my work.
I was thinking of Buck build, if that works?
https://buck.build/
Or is there an easier way?
Thanks!
One approach is to package the Python application directory as a .zip file and execute that: zip files that contain a __main__.py entry point can be run this way.
This has been possible since version 2.6, and dedicated zipapp support was added to the standard library in 3.5.
The main challenge has to do with compatibility for non-pure-Python libraries: what you zip up needs to be compatible with the machine where it will be run.
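A minimal sketch with the stdlib zipapp API (the directory name and the main:run entry point are assumptions):
import zipapp

zipapp.create_archive(
    "myproject",                         # directory containing your scripts/modules
    target="myproject.pyz",              # single runnable archive
    interpreter="/usr/bin/env python3",  # shebang line, so it can be chmod +x'd
    main="main:run",                     # hypothetical module:function entry point
)
Note the caveat above in practice: pure-Python dependencies can be vendored into the directory before zipping, but compiled extensions (pandas, NumPy) cannot be imported from inside a zip archive.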
pip install cx_freeze
cxfreeze main.py --target-name your_exe_name
Replace your_exe_name with the name you want. It will generate a build folder with your executable in it.
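Alternatively, a sketch of the setup.py route for cx_Freeze (the name and version are placeholders):
# setup.py
from cx_Freeze import setup, Executable

setup(
    name="your_app",
    version="0.1",
    executables=[Executable("main.py")],  # the entry script to freeze
)
Running python setup.py build then produces the same kind of build folder.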
Let's say I have a script I've written:
~/workspace/myscript/script.py
If I have, for example, a ~/bin which I have added to my $PATH, then I could create a symbolic link
~/bin/script -> ~/workspace/myscript/script.py
And everything works fine, I can call my script from anywhere.
Then, say my script starts to grow, and I separate it out
~/workspace/myscript/
script.py
mylib.py
I now run into a problem, as described here: if I am calling my Python script directly (as opposed to importing it as a module), then I cannot do a relative import.
The only solution I have seen is to package up the whole program into a fully fledged python package with a setup.py and installing it system-wide (or managing a home directory python library folder).
This seems like a lot of extra work for the sake of breaking my code into multiple python files.
Is there some way I can:
Call the script from anywhere (have it callable on path),
Have the code separated into multiple files,
Not have to manage a full python package and installation.
All at once?
You can add the root directory of your module to the Python path (note that ~ is not expanded inside double quotes, so use $HOME instead):
export PYTHONPATH="$PYTHONPATH:$HOME/workspace/myscript/"
I have written a Python script that I need to share with folks who may or may not have Python installed on their machine. As a dirty hack I figured I could copy my local Python 3.6 install into the same folder as the script I made, and then create a .bat file that runs Python from the copied install, i.e.
Python36\python.exe script.py %*
In this way I could just send them the folder, and all they have to do is double-click the .bat file. This does work, but it takes about 2-5 minutes for script.py to begin executing. How could I configure the copied Python install so that it runs like it "should"?
In terms of speed there is little you can do. You could convert your Python script into a compiled extension, which increases the speed of a Python script greatly; Cython can do this, and once compiled you then proceed as you already have.
Honestly, though, you will notice little difference if you do this, and it is about the best you can do with that method. A better method is to turn the script into an executable directly.
What you are doing currently is:
The batch command starts and executes (this is slow by itself), which then starts the Python interpreter.
The Python interpreter loads the file and then starts running it.
You should use a tool such as cx_Freeze or PyInstaller to convert your script into an executable; then it can be run just like any other application. You could also use Cython to achieve this.
You can use installers as well.
Are you using any libraries? A quick solution would be converting the Python script to an executable using py2exe. More details are also in this post.
# setup.py
from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

setup(console=['sample.py'])  # build a console executable from sample.py
And then run the command
C:\Tutorial>python setup.py py2exe
I have a simple python script, which imports various other modules I've written (and so on). Due to my environment, my PYTHONPATH is quite long. I'm also using Python 2.4.
What I need to do is somehow package up my script and all the dependencies that aren't part of standard Python, so that I can email a single file to another system where I want to execute it. I know the target version of Python is the same, but it's on Linux whereas I'm on Windows; otherwise I'd just use py2exe.
Ideally I'd like to send a .py file that somehow embeds all the required modules, but I'd settle for automatically building a zip I can just unzip, with the required modules all in a single directory.
I've had a look at various packaging solutions, but I can't seem to find a suitable way of doing this. Have I missed something?
[edit] It seems I was quite unclear about what I'm after. I'm basically looking for something like py2exe that will produce a single file (or two files) from a given Python script, automatically including all the imported modules.
For example, if I have the following two files:
[\foo\module.py]
def example():
    print "Hello"
[\bar\program.py]
import module
module.example()
And I run:
cd \bar
set PYTHONPATH=\foo
program.py
Then it will work. What I want is to be able to say:
magic program.py
and end up with a single file, or possibly a file and a zip, that I can then copy to Linux and run. I don't want to be installing my modules on the target Linux system.
I found this useful:
http://blog.ablepear.com/2012/10/bundling-python-files-into-stand-alone.html
In short, you can zip your modules and include a __main__.py file inside, which enables you to run the archive like so:
python3 app.zip
Since my app is small I made a link from my main script to __main__.py.
Addendum:
You can also make the zip self-executable on UNIX-like systems by adding a single line at the top of the file. This may be important for scripts using Python3.
echo '#!/usr/bin/env python3' | cat - app.zip > app
chmod a+x app
The result can now be executed without specifying python:
./app
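To make the layout concrete, a sketch of building such an archive with the stdlib zipfile module (file names adapted from the earlier two-file example; adjust paths as needed):
import zipfile

with zipfile.ZipFile("app.zip", "w") as zf:
    zf.write("bar/program.py", arcname="__main__.py")  # entry point at the zip root
    zf.write("foo/module.py", arcname="module.py")     # dependency alongside it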
Use the stickytape module:
stickytape scripts/blah --add-python-path . > /tmp/blah-standalone
This results in a functioning script, but one that is not necessarily human-readable.
You can try converting the script into an executable file.
First, use:
pip install pyinstaller
After installation, type the following (be sure you are in the directory of the file of interest):
pyinstaller --onefile --windowed filename.py
This will create an executable version of your script containing all the necessary modules. You can then transfer (copy and paste) this executable to the machine where you want to run your script.
I hope this helps.
You should create an egg file. This is an archive of Python files.
See this question for guidance: How to create Python egg file
Update: Consider wheels in 2019
The only way to send a single .py file is to move the code from all of the various modules into that one script, and then you'd have to redo everything to reference the new locations.
A better way of doing it is to move the modules in question into subdirectories under the same directory as your command. You can then make sure that the subdirectory containing each module has an __init__.py that imports the primary module file. At that point you can reference things through it.
For example:
App Directory: /test
Module Directory: /test/hello
/test/hello/__init__.py contents:
import sayhello
/test/hello/sayhello.py contents:
def print_hello():
    print 'hello!'
/test/test.py contents:
#!/usr/bin/python2.7
import hello
hello.sayhello.print_hello()
If you run /test/test.py you will see that it runs the print_hello function from the module directory under the existing directory, no changes to your PYTHONPATH required.
If you want to package your script with all its dependencies into a single file (it won't be a .py file) you should look into virtualenv. This is a tool that lets you build a sandbox environment to install Python packages into, and manages all the PATH, PYTHONPATH, and LD_LIBRARY_PATH issues to make sure that the sandbox is completely self-contained.
If you start with a virgin Python with no additional libraries installed, then easy_install your dependencies into the virtual environment, you will end up with a built project in the virtualenv that requires only Python to run.
The sandbox is a directory tree, not a single file, but for distribution you can tar/zip it. I have never tried distributing the env, so there may be path dependencies; I'm not sure.
If so, you may instead need to distribute a build script that builds out a virtual environment on the target machine. zc.buildout is a tool that helps automate that process, sort of like a "make install" that is tightly integrated with the Python package system and PyPI.
I've come up with a solution involving modulefinder, the compiler, and the zip function that works well. Unfortunately I can't paste a working program here as it's intermingled with other irrelevant code, but here are some snippets:
from modulefinder import ModuleFinder
from zipfile import ZipFile, ZIP_DEFLATED
import os
import subprocess
import sys

zipfile = ZipFile(os.path.join(dest_dir, zip_name), 'w', ZIP_DEFLATED)
sys.path.insert(0, '.')
finder = ModuleFinder()
finder.run_script(source_name)
for name, mod in finder.modules.iteritems():
    filename = mod.__file__
    if filename is None:
        continue  # built-in module, nothing to bundle
    if "python" in filename.lower():
        continue  # skip modules that ship with the Python installation
    subprocess.call('"%s" -OO -m py_compile "%s"' % (python_exe, filename))
    zipfile.write(filename, dest_path)
Have you taken into consideration the automatic script creation of distribute, the official packaging solution?
What you do is create a setup.py for your program and provide entry points that will be turned into executables that you will be able to run. This way you don't have to change your source layout, while still being able to easily distribute and run your program.
You will find an example of this system in a real app in gunicorn's setup.py.
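For concreteness, a sketch of such a setup.py (distribute has since been merged back into setuptools; the package name myscript and the cli:main entry point are placeholders):
# setup.py
from setuptools import setup, find_packages

setup(
    name="myscript",
    version="0.1",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # installing this puts a `myscript` command on PATH
            "myscript = myscript.cli:main",
        ],
    },
)
After pip install ., the myscript command calls main() from myscript/cli.py, with no manual PATH or PYTHONPATH management.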