Python Shared Libraries

As I understand it, there are two types of modules in Python (CPython):
- the .so (C extension)
- the .py
The .so modules are only loaded into memory once, even when different processes/interpreters import them.
The .py modules are loaded once for each process/interpreter (unless explicitly reloaded).
Is there a way .py modules can be shared by multiple processes/interpreters?
One would still need some layer where one could store modifications done to the module.
I'm thinking one could embed the interpreter in a .so as the first step. Is there an already developed solution?
I acknowledge I may be very far off in terms of feasible ideas about this. Please excuse my ignorance.

The reason .so (or .pyd) files take up memory space only once (except for their writable data segments) is that they are recognized by the OS kernel as object code. .py files are only recognized as text files/data; it's the Python interpreter that grants them "code" status. Embedding the Python interpreter in a shared library won't resolve this.
Loading .py files only once despite their use in multiple processes would require changes deep inside CPython.
Your best option, if you want to save memory space, is to compile Python modules to .so files using Cython. That may require some changes to the modules.
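For illustration, a minimal setup.py for that approach might look like the following; mymodule.py is a placeholder name, and this assumes Cython and a C compiler are installed:
# setup.py -- compile mymodule.py (a hypothetical module) into a C extension
from setuptools import setup
from Cython.Build import cythonize

setup(
    name="mymodule",
    ext_modules=cythonize("mymodule.py", language_level=3),
)
Building with python setup.py build_ext --inplace then produces a mymodule .so (or .pyd on Windows) that the OS can share between processes like any other extension module.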

No, there is no way. Python is so highly dynamic that I'm not sure it would make any sense anyway: each process could monkey-patch the modules, for example. Perhaps there would be a way to share the code anyway, but the benefit would be very small for something that is likely to be a lot of work.

The best answer I can give you is "not impossible, but I don't know if it happens".
You have to think about what is actually happening. When you encounter a .py file, Python has to read the file, compile it, and then execute the resulting byte code. Compilation takes place inside of the process, and so can't be shared.
When you encounter a .so file, the operating system maps in memory that has been reserved for that library. All processes share the same memory region, and so you save memory.
Python already has a third way of loading modules. If it can, upon loading a .py file, it creates a pre-compiled .pyc file that is faster to load (you avoid compilation). The next time, it loads the .pyc file instead. It could conceivably load the .pyc file by just mmapping it into memory. (Using MAP_PRIVATE in case other things mess with that byte code later.) If CPython did that, then shared modules would by default wind up in shared memory.
I have no idea whether it has actually been implemented in this way.
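To make the idea concrete, here is a rough sketch (not how CPython actually loads modules) of executing a module's byte code out of an mmapped .pyc. It assumes Python 3.7+, where the .pyc header is 16 bytes, and a hypothetical demo.py next to the script:
import marshal
import mmap
import py_compile

# Byte-compile the (hypothetical) demo.py; returns the path of the .pyc it wrote.
pyc_path = py_compile.compile("demo.py")

with open(pyc_path, "rb") as f:
    buf = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Skip the 16-byte header (magic, flags, mtime, source size) and unmarshal the code object.
    code = marshal.loads(buf[16:])

exec(code, {"__name__": "demo"})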


How to compile single python scripts (not to exe)?

I know there is a lot of debate around this topic.
I did some research and looked into some of the questions here, but none was exactly what I needed.
I'm developing my app in Django, using Python 3.7, and I'm not looking to convert my app into a single .exe file; it actually wouldn't be reasonable to do so, if it's even possible.
However, I have seen some apps developed in JavaScript that use bytenode to compile code to .jsc.
Is there such a thing for Python? I know there is .pyc, but for all I know those are just runtime-compiled files, not actually precompiled bytecode scripts.
I want to protect the source code in some files that could compromise the security of the app. After all, deploying my app means deploying a fully fledged Python installation with a web port open and an app that works on it.
What do you think: is there a way to do it, and does it even make sense to you?
Thank you
The precompiled (.pyc) files are what you are looking for. They contain pre-optimized bytecode that can be run by the interpreter even when the original .py file is absent.
You can build the .pyc files directly using python -m py_compile <filename>. Passing -OO additionally strips docstrings and assert statements, which further reduces the file size (in Python 2 this produced .pyo files; since Python 3.5 the result is still a .pyc, tagged .opt-2).
Note that it might still be possible to decompile the generated bytecode with enough effort, so don't use it as a security measure.
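For illustration, the same compilation can be done programmatically; the module name below is made up:
import importlib.util
import py_compile

# Compile a (hypothetical) module; optimize=2 is the equivalent of -OO.
pyc_path = py_compile.compile("sensitive_logic.py", optimize=2)
print(pyc_path)  # e.g. __pycache__/sensitive_logic.cpython-37.opt-2.pyc

# Where the import system itself would look for the cached byte code:
print(importlib.util.cache_from_source("sensitive_logic.py", optimization=2))
The resulting .pyc can be shipped and imported without the .py file being present.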

Statically linking Python, but still supporting external .pyd modules

I'm looking at statically linking Python with my application. The reason for this is that in some test cases I've seen a 10% speed increase. My application uses the Python C API heavily, and it seems that Whole Program Optimization is able to do some good optimizations. I expect Profile Guided Optimization will gain a little more too. This is all being done in MSVC 2015.
So far I've recompiled the pythoncore project (python35.dll) into a static library and linked that with my application (let's call it myapp.exe). FYI other than changing the project type to static, the only other thing that needs doing is setting the define Py_NO_ENABLE_SHARED during the static lib compile, and when compiling myapp.exe. This works fine and it's how I was able to obtain the 10% speed improvement test result.
So the next step is continuing to support external python modules that have .pyd files (which are .dll files renamed to .pyd). These modules will have been compiled expecting to dynamically link with python35.dll, so I need to provide a workaround for that requirement, since all of the python functions are now embedded into myapp.exe.
First I use a .def file to export all of the public Python functions from myapp.exe. This works fine.
The missing piece is how do I create a python35.dll which redirects all the calls to the functions exported from myapp.exe.
My first attempt is using DLL forwarding. I made a custom python35.dll which has a .def file with lines such as:
PyArg_Parse=myapp.PyArg_Parse
In theory, this works. If I use Dependency Walker on socket.pyd, it correctly opens my python35.dll and shows that all the calls are being forwarded to myapp.exe.
However, when actually running myapp.exe and trying to import socket, it fails to load the required entry points from myapp.exe. 'import socket' in Python will cause a LoadLibrary("socket.pyd") to occur. This will load my custom python35.dll implicitly. The failure occurs while trying to load python35.dll: it's unable to find the entry points for its forwards. It seems like the reason for this is that myapp.exe won't be part of the library search path. I seem to be able to verify this by copying myapp.exe to myapp.dll. If I do that, then the python35.dll load works, however this isn't a solution since it would result in two copies of the Python environment (one in myapp.exe, one in myapp.dll).
Possible other avenues I've also looked into but haven't found the right solution for:
- Somehow getting .exe files to be part of the library search path
- Using a Windows manifest/configuration to redirect the library somehow
- Manually using declspec(naked) and jmp statements to more explicitly wrap the .dll. I'm working in x64, so I don't think that's possible anymore?
- Manually doing the whole Python API and wrapping each function by hand. This is doable if I can find a way to generate the function definitions for all the exports, so it's not an insane amount of manual work.
In summary, is there a way to redirect/forward calls to a .dll to functions/data exported from an .exe. Thanks!
I ended up going with the solution that @martineau suggested in the comments, which was to put all of my application, including Python, into a single .dll instead of an .exe. The .exe is then just a simple stub that calls into the .dll and does nothing else.

Would it be a good idea to make Python store compiled code in a file stream instead of .pyc files?

I'm wondering if it wouldn't be better if Python stored the compiled code in a file stream of the original source file. This would work on file systems supporting forks/data streams, and fall back to the current behaviour where this is not possible:
- On Windows, using ADS (Alternate Data Streams)
- On OS X, using resource forks
- On Linux, using extended file attributes, if the compiled file is under 32k
Doing this would solve the problem of polluting the source tree, and of issues like a stale .pyc remaining after its .py was removed and still being loaded and used.
What do you think about this? Does it sound like a good idea or not? What issues do you see?
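For what it's worth, here is a minimal sketch of the Linux variant of the idea, assuming a filesystem with extended-attribute support; Python exposes these as os.setxattr/os.getxattr on Linux only, and the file name and attribute key below are made up:
import marshal
import os

source_path = "demo.py"  # hypothetical source file

# Compile the source and stash the marshalled code object in an xattr of the .py itself.
with open(source_path) as f:
    code = compile(f.read(), source_path, "exec")
os.setxattr(source_path, "user.python.bytecode", marshal.dumps(code))

# Later: read the cached byte code back out of the attribute and execute it.
cached = marshal.loads(os.getxattr(source_path, "user.python.bytecode"))
exec(cached, {"__name__": "demo"})
Note that filesystems impose their own size limits on extended attributes, which is why the question restricts this fallback to small compiled files.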
You sure do sacrifice an awful lot of portability this way -- right now .pyc files are uncommonly portable (often used by heterogeneous systems on a LAN through some kind of network file system arrangement, for example, though I've never been a fan of the performance characteristics of that approach), while your approach would only work on very specific filesystems and (I suspect) never across a network mount on heterogeneous machines.
So, it would be a dire mistake to make the behavior you want the default one -- but it would surely be neat to have it as an option available for specific request if your deployment environment doesn't care about all of the above issues and does care about some of those you mention. Another "cool option to have", that I would actually use about 100 times more often, is to put the .pyc "files" in a database instead of having them in filesystems.
The cool thing is that this is (relatively) easily accomplished as an add-on "import hack" one way or another (depending on Python versions) -- most easily in recent-enough versions with importlib, Brett Cannon's masterpiece (but that might make backporting to older Python versions harder than other ways... too much depends on exactly what versions you need to support, a detail which I don't see in your Q, so I won't go into the implementation details, but the general idea doesn't change much across implementations).
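As a sketch of what such an import hook can look like with the importlib machinery (Python 3.4+), here is a meta-path finder that serves modules from an in-memory dict standing in for a database of byte code; the module name hello and its contents are invented for the example:
import importlib.abc
import importlib.util
import marshal
import sys

class StoreLoader(importlib.abc.Loader):
    """Executes modules whose marshalled code objects live in a dict (a stand-in for a DB)."""
    def __init__(self, store):
        self.store = store
    def create_module(self, spec):
        return None  # use the default module object
    def exec_module(self, module):
        code = marshal.loads(self.store[module.__name__])
        exec(code, module.__dict__)

class StoreFinder(importlib.abc.MetaPathFinder):
    def __init__(self, store):
        self.store = store
        self.loader = StoreLoader(store)
    def find_spec(self, fullname, path, target=None):
        if fullname in self.store:
            return importlib.util.spec_from_loader(fullname, self.loader)
        return None

# "Database" holding one made-up module.
store = {"hello": marshal.dumps(compile("GREETING = 'hi'", "<store>", "exec"))}
sys.meta_path.insert(0, StoreFinder(store))

import hello
print(hello.GREETING)  # -> hi
A real version would replace the dict lookups with database queries, but the finder/loader structure stays the same.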
One problem I foresee is that it then means each platform has different behaviour.
The next is that not every filesystem OS X supports also supports resource forks (and the way it stores them on non-HFS filesystems is universally hated by everyone else: ._ )
Having said that, I have often been bitten by a .pyc file being used by apache because the apache process can't read the .py file I have replaced. But I think that this is not the solution: a better deployment process is ;)

Can I use zipimport to ship an embedded Python?

Currently, I'm deploying a full Python distribution (the original Python 2.7 MSI) with my app, which is an embedded web server made with Delphi.
Reading this, I wonder whether it is possible to embed only the necessary Python files with my app, to reduce the number of files to load and avoid conflicts with other installed Python versions.
I have previous experience with Python for Delphi, so I only need to know whether shipping just the Python DLL + a zip of the distro + my own scripts will work (and whether there are any caveats I must know about, or a sample I can look at).
zipimport should work just fine for you -- I'm not familiar with Python for Delphi, but I doubt it disables that functionality (an embedding application can do that, but it's an unusual choice). Just remember that what you can zip up and import directly are the Python-coded modules (or just their corresponding .pyc or .pyo byte codes) -- DLLs (even if renamed as .pyds;-) need to be on disk to be loaded (so if you have a zipfile with them it will need to be unzipped at the start of the app, e.g. into a temporary directory).
Moreover, you don't even need to zip up all modules, just those you actually need (by transitive closure) -- and you can easily find out exactly which modules those are, with the modulefinder module of the standard Python library. The example on the documentation page I just pointed to should clarify things. Happy zipping!
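A minimal way to get that transitive closure with modulefinder might look like this; the entry-point script name is a placeholder:
from modulefinder import ModuleFinder

finder = ModuleFinder()
finder.run_script("myscript.py")  # hypothetical entry point of the app

# Every module the script pulls in, and where it currently lives on disk.
for name, mod in sorted(finder.modules.items()):
    print(name, mod.__file__)
The printed list tells you which .py/.pyc files need to go into the zip (and which extension modules must stay on disk).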
Yes, it is possible.
I'm actually writing automation scripts in Python using zipimport. I included every .py file in my zip, as well as the configuration and XML files needed by those scripts.
Then I call a .command file targeting a __main__.py that redirects to the desired script according to my sys.argv parameters, which is really useful!
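A toy version of that layout (contents invented for the example) can be built like this:
import zipfile

# Build app.zip with a __main__.py dispatcher at its root.
with zipfile.ZipFile("app.zip", "w") as z:
    z.writestr("__main__.py",
               "import sys\nprint('dispatching to:', sys.argv[1:])\n")

# Then run it with:  python app.zip some-subcommand --flag
Python executes the zip's __main__.py directly, with the extra arguments available in sys.argv for dispatching.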

Speeding up the python "import" loader

I'm getting seriously frustrated at how slow Python startup is. Just importing more or less basic modules takes a second, since Python runs down sys.path looking for matching files (and generating 4 stat() calls - ["foo", "foo.py", "foo.pyc", "foo.so"] - for each check). For a complicated project environment, with tons of different directories, this can take around 5 seconds -- all to run a script that might fail instantly.
Do folks have suggestions for how to speed up this process? For instance, one hack I've seen is to set the LD_PRELOAD_32 environment variable to a library that caches the result of ENOENT calls (e.g. failed stat() calls) between runs. Of course, this has all sorts of problems (potentially confusing non-python programs, negative caching, etc.).
Zipping up as many .pyc files as feasible (with a proper directory structure for packages), and putting that zipfile as the very first entry in sys.path (on the best available local disk, ideally), can speed up startup times a lot.
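A rough sketch of building such a zip (directory and file names are placeholders), using the legacy .pyc layout so the byte code files sit next to where the sources are rather than in __pycache__:
import compileall
import pathlib
import sys
import zipfile

# Byte-compile a (hypothetical) package tree; legacy=True puts foo.pyc next to foo.py.
compileall.compile_dir("mylib", legacy=True, quiet=1)

with zipfile.ZipFile("mylib.zip", "w") as z:
    for pyc in pathlib.Path("mylib").rglob("*.pyc"):
        z.write(pyc, arcname=str(pyc))

# Put the zip first on sys.path so imports hit it before any directory scanning.
sys.path.insert(0, "mylib.zip")
Imports served from the zip avoid the repeated per-directory stat() probing described in the question.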
The first things that come to mind are:
- Try a smaller path
- Make sure your modules are .pyc files so they'll load faster
- Make sure you don't double-import, or import too much
Other than that, are you sure that the disk operations are what's bogging you down? Is your disk/operating system really busy or old and slow?
Maybe a defrag is in order?
When trying to speed things up, profiling is key. Otherwise, how will you know which parts of your code are really the slow ones?
A while ago, I created the runtime and import profile visualizer tuna, and I think it may be useful here. Simply create an import profile (with Python 3.7+) and run tuna on it:
python3.7 -X importtime -c "import scipy" 2> scipy.log
tuna scipy.log
If you run out of options, you can create a ramdisk to store your Python packages. A ramdisk appears as a directory in your file system, but is actually mapped directly to your computer's RAM. Here are some instructions for Linux/Red Hat.
Beware: A ramdisk is volatile, so you'll also need to keep a backup of your files on your regular hard drive, otherwise you'll lose your data when your computer shuts down.
Something's missing from your premise--I've never seen some "more-or-less" basic modules take over a second to import, and I'm not running Python on what I would call cutting-edge hardware. Either you're running on some seriously old hardware, or you're running on an overloaded machine, or either your OS or Python installation is broken in some way. Or you're not really importing "basic" modules.
If it's any of the first three issues, you need to look at the root problem for a solution. If it's the last, we really need to know what the specific packages are to be of any help.
