Use Cython as a Python-to-C Converter

I have huge Python modules (8000+ lines). They basically contain tons of functions for interacting with a hardware platform over a serial port by reading and writing hardware registers.
They are not numerical algorithms; the application just reads from and writes to hardware registers/memory. I use these libraries to write custom scripts. Eventually, I need to move all of this to run on an embedded processor on my hardware to have finer control, so that I just kick off an event from the PC and the rest happens in hardware.
So I need to convert them to C. If an automatic tool could convert my scripts to C, that would save me a huge amount of time, which is why I got attracted to Cython. Efficiency is not important; my code is not a number cruncher. But the generated code should be relatively small to fit in my limited memory (a few hundred kilobytes).
Can I use Cython as a converter from my custom Python scripts to C? My guess is yes. In that case, can these .c files be run on my hardware? My guess is no, since I would need Cython on my hardware as well for them to run. But if it just creates some .c files, I can go through them and make them standalone, since the code doesn't use many Python features; it just uses Python as a quick way to write scripts.

Yes, at its core this is what Cython does. But ...
You won't need Cython itself on the target; however, you do need libpython. You may feel like your code doesn't use that many Python features, but I think if you try this you'll find it's not true -- you won't be able to separate your program from its dependence on libpython while still using the Python language.
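To see why, here is a minimal sketch of the usual way that C gets generated; the module name hwlib.pyx is hypothetical:

    # setup.py -- minimal sketch; "hwlib.pyx" is a hypothetical register-access module.
    from setuptools import setup
    from Cython.Build import cythonize

    setup(
        name="hwlib",
        ext_modules=cythonize(
            "hwlib.pyx",
            compiler_directives={"language_level": "3"},
        ),
    )

Running python setup.py build_ext --inplace (or just cython hwlib.pyx) leaves the generated hwlib.c on disk; it starts with #include "Python.h" and consists almost entirely of CPython C-API calls, which is exactly the libpython dependence described above.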
Another option is PyPy, specifically its translation toolchain, NOT the PyPy Python interpreter. It lets you translate RPython, a subset of the Python language, into C. If you really aren't using many Python language features or libraries, this may work.
PyPy is mostly known as an alternative Python implementation, but it is also a set of tools for compiling dynamic languages into various forms. This is what allows the PyPy implementation of Python, written in (R)Python, to be compiled to machine code.
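As a rough illustration of what the translation toolchain consumes, here is the classic minimal RPython target (a sketch; the file name and exact translation command vary by PyPy version, and RPython is Python-2-based):

    # target_hello.py -- minimal RPython translation target (a sketch).
    import os

    def entry_point(argv):
        # Only RPython-safe constructs are allowed here: os.write instead of
        # print, static types, no runtime reflection or dynamic tricks.
        os.write(1, "hello from translated RPython\n")
        return 0

    def target(*args):
        # The RPython translation driver calls target() to find the entry point.
        return entry_point, None

Translating it with the rpython script from a PyPy source checkout (something like python rpython/bin/rpython target_hello.py) produces a standalone native executable with no libpython dependence -- which is the trade-off: you give up most of Python's dynamism to get that.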
If C++ is available, Nuitka is a Python-to-C++ compiler that works for regular Python, not just a restricted subset such as RPython (which is what PyPy's toolchain requires) or the implicitly typed subset that Shed Skin handles.

If C++ is available for that embedded platform, there is Shed Skin, which converts Python into C++.

Related

AoT Compiler for Python

I want to get my Python script working on a bare-metal device like a microcontroller WITHOUT the need for an interpreter. I know there are already JIT compilers for Python like PyPy, and interpreters like CPython.
However, the existing interpreters I've seen (such as CPython) take up a lot of memory (in the MB range).
Is there an AOT compiler for Python (i.e. one that compiles directly to native hardware through an intermediary like LLVM)?
I assume such a compiler would enable Python to run much faster than existing implementations AND with a lower memory footprint. If there is one, I wonder why that solution hasn't been popularized.
As you already mentioned, Cython is an option (however, it is true that the result is big, since the C runtime that implements the Python functionality gets linked in together with your program).
With regard to LLVM, there was a project by Google named Unladen Swallow. However, that project has been mostly abandoned. You can find some information about it here.
Basically, it was an attempt to bring LLVM optimizations into the CPython runtime, e.g. JIT-compiling Python code.
Another older alternative was Shed Skin, which compiles Python to C++. Some information about it can be found here.
Yet another option, similar in spirit to Shed Skin, is to restrict yourself to a subset of the Python language and use MicroPython.
An alternative would be to use GraalVM with Truffle AOT support for Python.
It's basically Python running on an AOT-optimized JVM.
The project looks promising. You can check this link here:
https://www.graalvm.org/22.2/graalvm-as-a-platform/language-implementation-framework/AOT/
I recently came across Codon:
Codon is a high-performance Python compiler that compiles Python code to native machine code without any runtime overhead. Typical speedups over Python are on the order of 10-100x or more, on a single thread. Codon's performance is typically on par with (and sometimes better than) that of C/C++.
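As a sketch of what that looks like, Codon consumes ordinary-looking, statically typeable Python source; per its documentation, a command along the lines of codon build -release fib.py then compiles it ahead of time to a native executable:

    # fib.py -- plain, statically typeable Python that Codon can compile AOT (a sketch).
    def fib(n: int) -> int:
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    print(fib(30))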

Does WebAssembly run faster if written in C as opposed to Python?

There's a long list of languages that can be compiled into Wasm. Is there any performance gain from writing in something like C or Rust over Python? Or is it all the same since it is being compiled to Wasm?
Short answer: yes, because it is not Python, the language itself, that gets compiled to Wasm, but its interpreter.
Saying that Python supports Wasm does not always mean the same thing. Firstly, Python is NOT a compiled language; it's a scripting language. Don't expect a scripting language to be compiled to native (or Wasm) code, because it is not meant to work that way.
Then how does Python support Wasm? Python interpreters/runtimes such as CPython, which is written in C, are themselves compiled to Wasm. Two popular Python runtimes on Wasm are Pyodide and the Wasm port of MicroPython (and there are many other efforts to run Python in a browser besides those two). Both of them are interpreters that translate Python to their own bytecode and then execute that bytecode in Wasm. Of course there are huge performance penalties, just as with CPython in a native environment.
Compiling to WebAssembly basically means targeting a special form of assembly for virtual hardware. When you read that language X "can be compiled" to Wasm, it doesn't always mean the language literally compiles directly to Wasm. In the case of Python, to my knowledge, it means "they compiled Python interpreters to Wasm" (e.g. CPython, PyPy), so the whole Python interpreter is Wasm, but it still interprets Python source files normally; it doesn't convert them to special Wasm modules or anything. That means all the overhead of the Python interpreter is still there, on top of the overhead of the Wasm engine, etc.
So yes, C and Rust (which can target Wasm directly by swapping out the compiler backend) will still run faster than Python code running on CPython compiled to Wasm, for the same reasons. Tools that speed up Python when run natively (e.g. Cython, raw CPython C extensions, etc.) may also work in Wasm and give the same speed-ups, but there is no free "compile a slow interpreted language to Wasm and get a fast compiled language"; computers aren't that smart yet.

Execute Python script from Assembly

I am planning on making an OS from scratch using Python. However, I only know how to do it by writing in assembly.
Is it possible for me to still write the kernel in assembly, convert it to a binary, and execute the Python script during boot?
I hope this makes sense.
I think you would be interested in this project:
https://github.com/Maratyszcza/PeachPy
A comment from LtU:
PeachPy is a Python framework for writing high-performance assembly kernels.
PeachPy aims to simplify writing optimized assembly kernels while preserving all optimization opportunities of traditional assembly.
You can use the same code to generate assembly for Windows, Unix, and Golang assembly. The library handles the various ABIs automatically. I haven't seen this cool project before.
Among the cool features is the ability to invoke the generated assembly as regular Python functions. Nice.
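To make that last point concrete, here is a sketch adapted from the PeachPy README (exact API details may differ between versions): the assembly function is described in Python and then loaded back as an ordinary Python callable.

    # Sketch adapted from the PeachPy README; details may vary across versions.
    from peachpy import *
    from peachpy.x86_64 import *

    x = Argument(int32_t)
    y = Argument(int32_t)

    with Function("Add", (x, y), int32_t) as asm_function:
        reg_x = GeneralPurposeRegister32()
        reg_y = GeneralPurposeRegister32()

        LOAD.ARGUMENT(reg_x, x)   # ABI-aware argument loads
        LOAD.ARGUMENT(reg_y, y)

        ADD(reg_x, reg_y)         # plain x86-64 ADD
        RETURN(reg_x)

    # Finalize for the host ABI, encode to machine code, and load it
    # back as a regular Python function.
    add = asm_function.finalize(abi.detect()).encode().load()
    print(add(2, 4))  # -> 6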

When using the Python Interpreter, is the compiler used at all?

In Google's Python Class, it reads:
Python is a dynamic, interpreted (bytecode-compiled) language
I know what an interpreter is and what bytecode is, but the two together seem not to fit. After doing some reading, it became a bit clearer that Python source code is basically compiled automatically before it is interpreted; but some new questions emerged.
When using the Python interpreter, does no compilation happen? If it does, when? For example, if you're just typing code at the command line and it gets run each time you hit Enter, when does a compiler have the opportunity to do its work?
Also, in the question linked above, delnan gives a pretty broad definition of a compiler:
A compiler is, more generally, a program that converts a program in one programming language into a program in another programming language... JIT compilers compile to native machine code at runtime
I guess my question is: what's the difference between an interpreter and an automatic compiler? To refine the question a bit: if Python is compiled, why not compile all the way down to machine code (or assembly, since I know writing compilers that produce pure machine code is difficult)?
Perhaps it is best to forget semantics and just look at what CPython is actually doing. When you invoke the CPython binary, it does a number of things. Generally speaking, you can expect it to translate the code you've written into a sequence of bytecode instructions. This is the "compiling" stage that people sometimes reference for Python code. Bytecode is a more compact and efficient way to tell the interpreter what to do than your hand-written source. Frequently, Python will cache the result for reuse in .pyc files (only regenerating them if the associated .py file is newer).

You can think of Python bytecode as the set of instructions that the Python virtual machine can run -- in a lot of ways, it's not all that different from what you get for Java. When people speak of compiled languages (e.g. C), the compiler's job is to translate your code into a set of instructions that will run directly on your computer's hardware. Implementations like CPython and the JVM add an extra level of indirection (the virtual machine): the virtual machine runs directly on the computer's hardware and is responsible for interpreting the bytecode.

Compared to standard "compiled" languages (e.g. C, Fortran), this compilation stage is really lightweight -- Python doesn't do a lot of the checking that "traditional" compilers do (e.g. type checking). It pretty much only checks the syntax and performs a few very simple optimizations using the peephole optimizer.
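A quick way to see that bytecode-compilation stage in action is the standard-library dis module, which disassembles the compiled code object of a function:

    import dis

    def add(a, b):
        return a + b

    dis.dis(add)
    # Typical output (exact opcodes vary across CPython versions):
    #   LOAD_FAST    a
    #   LOAD_FAST    b
    #   BINARY_ADD   (BINARY_OP on CPython 3.11 and later)
    #   RETURN_VALUE

Even code typed at the interactive prompt goes through this step: each statement is compiled to a code object and handed straight to the interpreter loop, which is why there never appears to be a separate compile step.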

Game Library with Support for RPython

Are there any Python game libraries (Pygame, Pyglet, etc.) with support for RPython? Or game libraries specifically made for RPython? Or bindings for a game library for RPython?
Yes, check out PyGirl, the Game Boy interpreter written in RPython: https://bitbucket.org/pypy/lang-gameboy
RPython is not Python. They are different languages even though it just so happens that you can execute RPython code with a Python interpreter and get similar effects (much more slowly) to running the compiled RPython program. It is extremely unlikely for there to ever be any reasonable Python library (game library or any other kind) that also works in RPython. It would have to be specifically written for RPython, and if it worked in RPython it would be so inflexible when considered as a regular Python library that nobody would use it in that context.
If you want to program in a lower level compiled language with game libraries, use C# or Java. RPython is not a good competitor (in large part because it has very few libraries for anything, not even much of a standard library).
If you want to program in Python, use Python (possibly use the PyPy implementation of Python, which can be faster than the standard Python interpreter if it supports all the libraries you're using). RPython will not help you write Python code that goes faster.
If your goal is not specifically to produce a game, but rather to do some project in RPython because you want to learn RPython specifically, then that's a perfectly reasonable aim. But writing a game in it will probably not be the most helpful project for learning to write RPython; you should probably consider writing some sort of interpreter instead, since that's what RPython is designed for.
