Is there a way to freeze python code only partially? - python

I would like to distribute my Python application to the end-user as a single-file executable. However, the end-user should have the possibility to add functionality to the application. For this he can write functions in a Python file which is then imported at runtime.
However, if I freeze the application for distribution, any changes the user makes to that Python file afterwards will have no effect.
How can I distribute my application as a more or less single-file executable while still letting the end-user add/remove functions?
Or is there another way to give the user the possibility to add functionality than by importing his Python functions at runtime?
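For illustration, a minimal sketch of the runtime import the question describes, assuming the user drops a file called user_plugins.py next to the executable (the file name and location are assumptions):

import importlib.util
import pathlib
import sys

def load_user_plugins():
    # Look for the user's plugin file next to the (possibly frozen) executable.
    base = pathlib.Path(sys.executable if getattr(sys, "frozen", False) else __file__).resolve().parent
    plugin_path = base / "user_plugins.py"
    if not plugin_path.exists():
        return None
    spec = importlib.util.spec_from_file_location("user_plugins", plugin_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # runs the user's code
    return module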

I think there is a misunderstanding of how a program written in Python is actually executed. If we want to "import Python code at runtime" into a program, this imported Python code would have to be compiled at runtime. If the end-user does not have a Python compiler installed, how should that be possible? Unless your program effectively contains a Python compiler, I think what you want to achieve is impossible by design.

Related

Distribute a standalone Python software with Julia dependencies

I have a piece of software that is mostly written in Python and, for now, I'm using PyInstaller to bundle and distribute it in a user-friendly way (it's part of my CI pipeline, for Linux and Windows).
However, its performance is terrible and I want to rewrite some heavy parts in Julia while keeping the front-end in Python. I can use PyJulia to do this, but it means that the user has to install Julia manually in order to use my program.
Julia does have the equivalent of PyInstaller, which is PackageCompiler.jl, but I don't know how to call something compiled with PackageCompiler.jl from the Python side.
How can I make this work, so I can bundle and distribute an executable that has Python, Julia and everything it needs to run?
A little more detail
My end users are people (chemists and pharmacists) who have no idea what programming is. They don't have Python, Julia or Docker (and they don't even want to install them).
In my current approach, the software bundled with PyInstaller consists of a single executable with everything inside it (Python and everything it needs). What I really want is to keep the same user experience, but with Julia also running behind the scenes.
I'll implement several functions on the Julia side, and I want (almost) the same level of integration as I get with PyJulia.
Maybe I'll go to Rust and just use the C interface, but I really would like to use Julia.
Thank y'all for your time.
Per the PackageCompiler.jl docs here: https://julialang.github.io/PackageCompiler.jl/stable/apps.html#Creating-an-app-1 you can get what is basically an executable file, and then you can just follow this post: Python Script execute commands in Terminal, to execute the file you create.
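A rough sketch of that approach from the Python side (the paths and the JSON-over-stdout convention are assumptions, not anything prescribed by PackageCompiler):

import json
import subprocess
from pathlib import Path

# Hypothetical location of the bundle built with PackageCompiler.jl's create_app;
# it contains a bin/ directory with the compiled executable.
APP = Path(__file__).parent / "MyJuliaApp" / "bin" / "MyJuliaApp"

def run_julia_job(payload):
    # Hand the input over on the command line and read the result from stdout.
    result = subprocess.run([str(APP), json.dumps(payload)],
                            capture_output=True, text=True, check=True)
    return json.loads(result.stdout)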
You could try using JuliaWin. It provides a standalone Julia runtime that is fully self-contained.
We deployed a tool using PyInstaller with JuliaWin included as one of the datas, as PyInstaller calls them. IIRC it was not PyJulia but JuliaCall+PythonCall doing the interfacing in our case.
Unfortunately the startup time was ridiculous (on the order of minutes). This is in part due to Julia's startup time, but largely due to the generated .exe first having to unpack the JuliaWin. So we are currently investigating using PackageCompiler, possibly also switching to providing the user with two files instead of one.
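For reference, adding such a runtime as data in a PyInstaller .spec file looks roughly like this excerpt (the juliawin paths are hypothetical):

# Excerpt of a .spec file; datas entries are (source, destination-inside-bundle) pairs.
a = Analysis(
    ["main.py"],
    datas=[("vendor/juliawin", "juliawin")],  # ship the JuliaWin runtime alongside the app
)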

How to conceive of a cross-platform Python script

I have made a python interactive script project containing a few directories with project files and a main python script.
The script does the work of batch processing scientific images for biological systematics.
I wrote OS-agnostic code, but I was thinking about trying to package/freeze the script as a CLI utility that a (more or less) lay person could download and use.
I have been reading about packaging and freezing techniques in Python, and the more I read the more confused I feel. (I'm a Linux user.)
Am I conceiving of this script as a utility correctly? Is it worth it/possible to pass command-line args to an .exe, and should I package/freeze the files for this kind of interactive CLI script?
I don't have a lot of experience with Windows. I'm looking for advice/pointers on where to look next.
You are on the correct path, but you need to understand a little more about Python packaging to know how to proceed.
You are right that you need to package your code with setuptools. This will make it a piece of cake for others to install the package and will let Python decide where to properly put things, including script utilities and non-code files.
Setuptools has special support built in just for scripts that will let you pass arguments, etc.
However, in packaging the code, you must understand that what you are really doing is making it a library that can be installed with pip. Therefore, the Python script utility that you wrote will be treated as a library by setuptools. Setuptools will have you make a separate script file that imports your new library and calls its main() function.
I know that you were already using this manual, and that's fantastic. Everything you need to first set up and understand a python package is in there. RTFM up a storm.
As I understand it, you can then look more into freezing the package (and registering it with PyPI) for other systems once you have a handle on packaging.
Note: use the scripts keyword in setuptools to specify your script.
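A minimal setup.py sketch along those lines (the package and script names are placeholders):

from setuptools import setup, find_packages

setup(
    name="imgbatch",
    version="0.1.0",
    packages=find_packages(),
    # The scripts keyword installs this file onto the user's PATH; it should
    # just import your package and call its main() function.
    scripts=["scripts/imgbatch"],
)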

Making a standalone Python script that can be run by another piece of software

I am currently interning at a place where they've asked me to make a standalone python program to do something (say X).
Now, that program is to be run by some commands sent by their proprietary software which is written in their proprietary language. Now the reason I'm saying proprietary so many times is because they aren't ready to take me anywhere near their code. I am just supposed to make a Python code that does X based on the input given by their software.
So is there a way I can make an API and wrap it around my code so as to let the software control it? Also I need to make the whole thing standalone (maybe an installer of some kind) so that they don't have to install Python and the accompanying modules (like opencv) just to run my script?
All I could get out of them was "there are dll files that will be calling your app and we want an executable"
Any program can execute any other program (if it has the appropriate rights), so there is no real distinction between "Python file" and "Python executable". That is because Python is an interpreted language: the Python source files and the "final Python program" are identical (assuming CPython), in contrast to e.g. a C program, where the source files and the executable are vastly different.
If you are on Windows there is the additional problem that the user must have Python installed to execute .py files. There are some ways to mitigate that problem: there are Python libraries that "freeze" the Python interpreter and your code into a single .exe file (from the comment by Bakuriu, see e.g. freeze). You could bundle the Python interpreter with your code. Or you could just tell your users to install Python (if the number of users is low, that might be the best way).
"API" is just a fancy way of saying "this is how you communicate with my program". This might be how you call a library (e.g. what functions a Python module exports), an HTTP API, which command-line arguments are passed, or which protocol is spoken over a TCP socket. Without knowing which API you are supposed to implement, you cannot do your job.
Without further specifications (what input does the other program give to yours, and how does it call your program?) it's very hard to say anything more helpful.
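If the API turns out to be "pass arguments on the command line", the entry point could be sketched roughly like this (the argument names are made up for illustration):

import argparse
import sys

def main():
    parser = argparse.ArgumentParser(description="Do X on the input handed over by the caller")
    parser.add_argument("input_path", help="file given by the calling software")
    parser.add_argument("--output", default="result.json", help="where to write the result")
    args = parser.parse_args()
    # ... do X with args.input_path and write args.output ...
    return 0

if __name__ == "__main__":
    sys.exit(main())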

Why does Python get installed in the Frameworks directory?

I've been wondering why Python gets installed in a directory named Frameworks (even though it isn't a framework).
$ which python
/Library/Frameworks/Python.framework/Versions/2.7/bin/python
Somebody please explain! Thanks!
That's the way it is in OS X.
The Mac/README file in the Python source tree goes into some more detail on the advantages of a framework build versus a traditional UNIX shared-library build, which will also work on OS X. The main points:
"The main reason is because you want
to create GUI programs in Python.
With the exception of
X11/XDarwin-based GUI toolkits all
GUI programs need to be run from a
fullblown MacOSX application (a
".app" bundle).
While it is technically possible to
create a .app without using
frameworks you will have to do the
work yourself if you really want
this.
A second reason for using frameworks
is that they put Python-related items
in only two places:
"/Library/Framework/Python.framework"
and "/Applications/MacPython 2.6".
This simplifies matters for users
installing Python from a binary
distribution if they want to get rid
of it again. Moreover, due to the way
frameworks work a user without admin
privileges can install a binary
distribution in his or her home
directory without recompilation."

Python package with compiled code

I'm looking into releasing a Python package which includes an existing Fortran or C program. The Fortran/C program is compiled by running
./configure
make
The Python code calls the resulting binary through subprocess calls (i.e. the code is not really wrapped as such). What I would like is that when the user types
python setup.py install
the Fortran/C program is first compiled using the ./configure and make commands, then I want the Python module to be installed, and the binary to be installed in the Python bin/ directory alongside executables that are usually installed via the scripts= option in distutils.core.setup.
First, are there any problems with doing this? And if not, what is the best way to do it via setup.py? Are there existing functions to automate the ./configure and make, since this is pretty standard? Or should I just use os.system calls? And either way, where should those commands go in setup.py? Then should I have make output the binary to e.g. scripts/ and then have scripts=['scripts/mybinary'] in the setup() function?
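Purely as a sketch of the mechanism the question asks about (the package names and paths are made up), a custom setuptools build step that shells out to configure/make could look like this:

import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py

class BuildWithMake(build_py):
    def run(self):
        # Compile the Fortran/C program before the Python package is built.
        subprocess.check_call(["./configure"], cwd="src")
        subprocess.check_call(["make"], cwd="src")
        super().run()

setup(
    name="mypkg",
    version="0.1.0",
    packages=["mypkg"],
    cmdclass={"build_py": BuildWithMake},
    scripts=["scripts/mybinary"],  # the binary produced by make
)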
Don't make this too complex.
Just provide them as separate items with a README that says -- basically -- what you said in the question.
Build the Fortran/C with ./configure; make; make install.
Set up Python with python setup.py install.
It doesn't appear to be rocket science. Trying to over-simplify the installation means that you must account for every OS vagary and oddness.
It's easier to trust the users to do "standard" installations so that the Fortran/C binary is on the system PATH; your Python script should then be configured to find it there.
People who want to use your software are then free to reconfigure it to their own unique needs. They will anyway. Don't overpackage and force them to fight against you to reconfigure things.
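A small sketch of that find-it-on-the-PATH approach, assuming the compiled program is installed as mybinary (the name is hypothetical):

import shutil
import subprocess

exe = shutil.which("mybinary")
if exe is None:
    raise RuntimeError("mybinary not found on PATH; install the Fortran/C part first")
subprocess.run([exe, "input.dat"], check=True)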
Consider writing a Python C extension as a wrapper for your C code, and an f2py extension as a wrapper for your Fortran code. Then you can call them directly from your Python code as fast calls instead of using subprocess.
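As a rough illustration of the f2py route (the routine and module names are made up), NumPy can compile a small Fortran snippet into an importable module:

import numpy.f2py

fortran_src = """
subroutine add(a, b, c)
    real(8), intent(in) :: a, b
    real(8), intent(out) :: c
    c = a + b
end subroutine add
"""

# Build an extension module called fastadd from the Fortran source above.
numpy.f2py.compile(fortran_src, modulename="fastadd", extension=".f90", verbose=False)

import fastadd
print(fastadd.add(1.0, 2.0))  # intent(out) arguments become return values -> 3.0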
