How to compile Python scripts for use in FORTRAN?

Although I found many answers and discussions about this question, I am unable to find a solution particular to my situation. Here it is:
I have a main program written in FORTRAN.
I have been given a set of python scripts that are very useful.
My goal is to access these python scripts from my main FORTRAN program. Currently, I simply call the scripts from FORTRAN as such:
CALL SYSTEM ('python pyexample.py')
Data is read from .dat files and written to .dat files. This is how the python scripts and the main FORTRAN program communicate with each other.
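To give an idea, the Python side of that exchange looks roughly like this (file names and the computation are just placeholders, not my actual scripts):

    # pyexample.py (sketch): the .dat round trip; file names are examples only
    import numpy as np

    data = np.loadtxt('input.dat')      # values written by the FORTRAN program
    result = data * 2.0                 # placeholder for the actual computation
    np.savetxt('output.dat', result)    # written back for FORTRAN to read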
I am currently running my code on my local machine. I have python installed with numpy, scipy, etc.
My problem:
The code needs to run on a remote server. For strictly FORTRAN code, I compile the code locally and send the executable to the server where it waits in a queue. However, the server does not have python installed. The server is being used as a number crunching station between universities and industry. Installing python along with the necessary modules on the server is not an option. This means that my “CALL SYSTEM ('python pyexample.py')” strategy no longer works.
Solution?:
I found some information on a couple of things in the thread Is it feasible to compile Python to machine code?
Shed Skin, Psyco, Cython, PyPy, CPython API
These “modules” (I'm not sure that's the right name for them) seem to compile Python scripts to C or C++ code. Apparently not all Python features can be translated to C, and some of these projects appear to be experimental. Is it possible to compile my python scripts together with my FORTRAN code? There is f2py, which makes FORTRAN code callable from Python, but it doesn't work the other way around.
Any help would be greatly appreciated. Thank you for your time.
Vincent
PS: I'm using python 2.6 on Ubuntu

One way or another, you'll need to get the Python runtime on your server, otherwise it won't be possible to execute Python bytecode. Ignacio is on the right track with suggesting invoking libpython directly, but due to Fortran's parameter-passing conventions, it will be a lot easier for you to write a C wrapper to handle the interface between Fortran and the CPython embedding API.
Unfortunately, you're doing this the hard way -- it's a lot easier to write a Python program that can call Fortran subroutines than the other way around.
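For comparison, the easier direction looks roughly like this, assuming your Fortran is compiled into a shared library with a C-compatible interface (the library name and routine below are made up for illustration):

    # sketch: calling a Fortran routine from Python with ctypes
    # assumes the Fortran side was built as a shared library, e.g.
    #   gfortran -shared -fPIC mysub.f90 -o libmysub.so
    # with a C-compatible interface such as
    #   subroutine mysub(x, n) bind(C, name="mysub")   ! n passed by value
    import ctypes
    import numpy as np

    lib = ctypes.CDLL('./libmysub.so')              # hypothetical library name
    lib.mysub.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_int]
    lib.mysub.restype = None

    x = np.zeros(10, dtype=np.float64)
    lib.mysub(x.ctypes.data_as(ctypes.POINTER(ctypes.c_double)), x.size)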

You don't want any of those. What you should do is use FORTRAN's FFI to talk with libpython and frob its API.

Related

How to import Python scripts and dependencies in C# as a .dll

I have been searching for an answer for quite some time and I cannot figure out a robust solution: I have a C# application (in Unity) which needs to call some python functions and libraries. In particular, the python code needs to read a very large tabular file from C# (the equivalent of a large pandas dataframe), avoiding writing and reading through a file if possible, because given its size that would hurt performance.
I am a total beginner with C#/Unity, and I have looked at the following workarounds:
IronPython: it is not an option because the last stable version is for python 2.7. My requirements are python 3.6 and higher, moving towards 3.8.
PyInstaller: it creates .exe files and not .dll. Moreover, it is not clear to me how you could call the executable from C# and pass a dataframe/2D array without writing it to disk. I am also not sure whether it can be called with an entire input file as an argument, if you really want to communicate via files.
Communication via sockets: I am a bit lost about this one. If it is really feasible on both the python and Unity sides, I am happy to see your tutorials (especially for Unity); a rough Python-side sketch of what I have in mind is shown after this list.
Unity Python script editor: it works fine in dev mode. However, the python scripts cannot be included in the build. This (https://forum.unity.com/threads/python-for-unity-editor-only.914843/#post-8330904) seems very suboptimal for a stable build.
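To make the socket idea concrete, this is roughly what I imagine on the Python side (host, port and the framing of the data are placeholders; the Unity/C# side would need a matching client):

    # py_server.py (sketch): Python side of a socket exchange with Unity/C#
    import socket

    HOST, PORT = '127.0.0.1', 5005       # placeholder address

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            chunks = []
            while True:
                chunk = conn.recv(4096)  # the C# client should shut down its
                if not chunk:            # send half once the table is fully sent
                    break
                chunks.append(chunk)
            table_bytes = b''.join(chunks)
            # parse table_bytes into rows/columns here (format is up to you)
            conn.sendall(b'ok')          # minimal acknowledgement back to C#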
Thank you for any hint!

Is it possible to bind a C executable with Python?

Recently, the company I work for has been modernizing their codebases to mostly Python. Last week, I was tasked with looking into converting a utility program we rely on to manage some USB devices. I was quick to discover this program is highly specialized and is likely impossible to convert to Python (nor do I want to attempt it, if I'm being honest). I have access to the source code for this program.
This program runs on Linux and currently compiles to an executable, not a shared object (.so).
Is binding Python to this program something that can reasonably be done? If so, I've already looked into ctypes, cffi, cython, pybind11, and writing raw C bindings, but I am lost as to which would be the best approach. If not, there is no major loss; I'll just have to call the program via subprocess and parse its output.
Thank you for your time.
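For concreteness, the kind of binding I have in mind would look roughly like this if the program could be rebuilt as a shared object with its core routines exposed as plain C functions (the library and function names below are invented for illustration):

    # sketch: binding a rebuilt shared object with ctypes
    # assumes something like:  gcc -shared -fPIC usbtool.c -o libusbtool.so
    import ctypes

    lib = ctypes.CDLL('./libusbtool.so')          # hypothetical library name
    lib.list_devices.argtypes = [ctypes.c_char_p, ctypes.c_int]
    lib.list_devices.restype = ctypes.c_int

    buf = ctypes.create_string_buffer(4096)
    rc = lib.list_devices(buf, len(buf))          # hypothetical C function
    if rc == 0:
        print(buf.value.decode())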

Distribute a standalone Python software with Julia dependencies

I have a software that is mostly written in Python and, for now, I'm using PyInstaller to bundle and distribute the software in a user-friendly way (it's part of my CI pipeline, for Linux and Windows).
However, my performance is terrible and I want to rewrite some heavy parts in Julia while keeping the front-end in Python. I can use PyJulia to do this, but it means that the user has to install Julia manually in order to use my program.
Julia does have the equivalent of PyInstaller, which is PackageCompiler.jl, but I don't know how to call something compiled with PackageCompiler.jl from the Python side.
How can I make this work, so I can bundle and distribute an executable that has Python, Julia and everything it needs to run?
A little more detail
My end users are chemists and pharmacists who have no idea what programming is. They don't have Python, Julia or Docker (and they don't even want to install them).
In my current approach, the software bundled with PyInstaller is a single executable with everything inside it (Python and everything it needs). What I really want is to keep the same user experience, but with Julia also running behind the scenes.
I'll implement several functions in the Julia side, and I want (almost) the same level of integration as I get with PyJulia.
Maybe I'll go to Rust and just use the C interface, but I really would like to use Julia.
Thank y'all for your time.
Per the docs here: https://julialang.github.io/PackageCompiler.jl/stable/apps.html#Creating-an-app-1 you can get what is basically an executable file, and then you can just follow this post: Python Script execute commands in Terminal to execute the file you create from Python.
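In other words, once the app is built, the Python side only needs a plain subprocess call along these lines (the executable path and arguments are placeholders for whatever your build produces):

    # sketch: calling a PackageCompiler-built app from Python
    import subprocess

    result = subprocess.run(
        ['./MyJuliaApp/bin/MyJuliaApp', 'input.csv'],   # hypothetical app path
        capture_output=True, text=True, check=True)     # capture_output needs Python 3.7+
    print(result.stdout)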
You could try using JuliaWin. This provides a standalone Julia runtime that works fully self-contained.
We deployed a tool using PyInstaller with JuliaWin included as one of the 'datas', as PyInstaller calls them. If I recall correctly, it was not PyJulia but JuliaCall/PythonCall doing the interfacing in our case.
Unfortunately the startup was ridiculously slow (on the order of minutes). This is partly due to Julia's startup time, but largely due to the generated .exe first having to unpack JuliaWin. So we are currently investigating PackageCompiler, possibly also switching to providing the user with two files instead of one.
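For reference, the JuliaCall side of that interfacing looks roughly like this (the Julia snippet is just an example; the first call pays Julia's startup cost):

    # sketch: driving Julia from Python via juliacall (pip install juliacall)
    from juliacall import Main as jl

    jl.seval("using Statistics")                   # load a Julia standard library
    m = jl.seval("mean([1.0, 2.0, 3.0, 4.0])")     # run Julia code, get the result back
    print(m)                                       # 2.5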

Running python script without installed libraries

I have a working Python script that uses scipy and numpy functions, and I need to run it on a computer that has Python installed but not the scipy and numpy modules. How should I do that? Is .pyc the answer, or should I do something more complex?
Notes:
I don't want to use py2exe. I am aware of it, but it doesn't fit the problem.
I have read these questions (What is the difference between .py and .pyc files?, Python pyc files (main file not compiled?)), which are obviously connected to this problem, but since I am a physicist, not a programmer, I got totally lost.
It is not possible.
A .pyc file is nothing more than a Python file compiled to bytecode. It does not contain any of the modules that the file imports!
Additionally, the numpy module is an extension written in C (and some Python). A substantial part of it consists of shared libraries that are loaded into Python at runtime. You need those for numpy to work!
Python first "compiles" a program into bytecode, and then runs this bytecode through an interpreter.
So if your code is pure Python, you could generate the bytecode once and have the Python runtime use that. In fact, I've seen projects that do exactly this, where the developer just read through the bytecode spec and implemented a bytecode parsing engine. It's very lightweight, which makes it useful for things like "Python on a chip".
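Generating that bytecode up front is a couple of lines with the standard library (the paths are just examples):

    # sketch: pre-compiling pure-Python sources to .pyc bytecode ahead of time
    import compileall
    import py_compile

    py_compile.compile('myscript.py')       # writes the .pyc for a single file
    compileall.compile_dir('mypackage/')    # compiles every .py under a directory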
The problem comes with external libraries that are not written entirely in Python (e.g. numpy, scipy).
Python provides a C API that lets you create, using C/C++ code, objects that appear to Python as Python objects. This is useful for speeding things up, interacting with hardware, and making use of C/C++ libraries.
Take a look at Nuitka. If you are able to compile your code (not necessarily a possible or easy task), you'll get what you want.

Compiling Python to native code? [duplicate]

This question already has answers here: Is it feasible to compile Python to machine code? (Closed 11 years ago.)
Is it possible to compile Python code (plus its dependencies, plus the interpreter library) into a single, native Windows executable (with nothing else bundled along with it) from a Python file? (Kind of like how the GNU compiler for Java compiles Java into a native (humongous) executable, which contains everything in true machine code.)
If so, how would I go about doing this?
(Specifically, py2exe does not do what I want -- it includes the libraries inside a separate ZIP file, and it includes the interpreter as a separate DLL.)
Note 1:
To emphasize, I'm not asking for a "self-extracting archive", an "executable packer", or some other way of 'cheating' by bundling the files inside an exe -- I'm looking for something that genuinely converts Python into a native executable, like what GCJ does for Java.
Note 2:
Only if the above isn't possible:
Is it possible to at least generate a single executable from Python code, containing the interpreter bundled along with all the library dependencies, such that the resulting executable does not need to self-extract onto the target disk before running?
In this scenario, the 'compilation' requirement is relaxed: it doesn't matter if the code is actually compiled into machine code (it could simply be embedded as a text resource into the target executable), but the result must nevertheless be a single exe file [and nothing else] that can run standalone, specifically without needing to unpack/install anything onto the target disk before running.
Shed Skin can compile Python to C++, but only a restricted subset of it. Some aspects of Python are very difficult to compile to native code.
The short answer is no, and that is going to go for almost any language: any program you write is going to depend on some external libraries even if just the Windows system DLLs.
If you wrote a C program and compiled it with Microsoft's compiler you would still need the C runtime libraries to be installed. Chances are they already will be on most systems but it isn't guaranteed. Likewise even if you managed to compile a C Python interpreter statically linked to its libraries you still have to get the C runtime from somewhere.
What I suspect you are really asking is whether you can compile to a single .exe that depends only on libraries you can reasonably expect to be installed already. So it all depends on what you are willing to consider part of the base system. Can you assume the .NET Framework 4 or Silverlight is installed? If so, you might want to look at IronPython.
Likewise, PyPy can be built with either the Visual Studio toolchain or MinGW, but I'm pretty sure in both cases you'll still need some external libraries at runtime.
