We were using Python 2.4 in our application. Now we are migrating to 2.7.8.
In our code we used the Sax2 and xpath functionality from the "_xmlplus" library.
Approach 1:
We started off by rewriting the parsing logic in all the files where Sax2 and xpath were used, but this is a tedious job.
Approach 2:
Use _xmlplus with the 2.7 version. For this we need the source code of "_xmlplus", so that we can build the library ourselves. (We were not able to find the source code on the web.)
Can anyone please suggest which approach we should take?
It appears that _xmlplus is a package from the PyXML library, so the source is available on SourceForge. Note, however, that PyXML is no longer maintained (the last files are from 2004).
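If you do go with Approach 1, note that most of what PyXML's Sax2 and xpath modules provided has standard-library equivalents in 2.7: xml.sax for event-driven parsing and xml.etree.ElementTree, whose findall supports a limited XPath subset. A minimal sketch, with made-up element and attribute names:

```python
# Sketch of Approach 1: standard-library replacements for PyXML's
# Sax2 and xpath modules. Element/attribute names are hypothetical.
import xml.sax
from xml.etree import ElementTree

DOC = b"<root><item name='a'/><item name='b'/></root>"

# XPath-style queries: ElementTree.findall supports a limited XPath subset.
tree = ElementTree.fromstring(DOC)
names = [el.get("name") for el in tree.findall(".//item")]
print(names)  # ['a', 'b']

# Event-driven parsing: xml.sax replaces PyXML's Sax2 driver.
class ItemHandler(xml.sax.ContentHandler):
    def __init__(self):
        xml.sax.ContentHandler.__init__(self)
        self.names = []

    def startElement(self, tag, attrs):
        if tag == "item":
            self.names.append(attrs.get("name"))

handler = ItemHandler()
xml.sax.parseString(DOC, handler)
print(handler.names)  # ['a', 'b']
```

How much rewriting this needs depends on how deeply the old code relied on PyXML-specific APIs.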
Related
I am trying to use Cython to create .so binary files from our .py files and share them with our team.
However, even if we all use Python 3, most of the time it has to be exactly the same revision (say 3.7.8); otherwise we get an error when importing them.
Is this behavior expected?
Some revisions are compatible. For example, if we build the .so with Python 3.5.2, it imports fine in 3.6.8, but it does not work in 3.7.8.
Where does this mess come from, and what is the safest way to do this?
To follow up on my comments:
On the same platform, extension modules should work within a "minor version" (i.e. modules built with 3.7.2 and 3.7.3 should be compatible). I'm struggling to find a source for this, though. Beyond that, some effort has been made in the past to ensure compatibility between releases, but not so much any more, so it's possible you may be lucky and things will work.
distutils/setuptools and other similar build mechanisms tag extension modules with a suffix indicating the version and some other details. For example, an extension would be called foo.cpython-37m.so instead of just foo.so. These tags prevent the module from being used with other Python versions and are a good thing. If you are removing these tags, then this mess is entirely on you.
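You can see the tag your own interpreter expects; a quick sketch (the exact value varies by platform and Python version):

```python
# Show the ABI-tagged filename suffix this interpreter expects for
# extension modules. A .so built for another version carries a
# different tag, which is why the import fails.
import sysconfig

suffix = sysconfig.get_config_var("EXT_SUFFIX")
print(suffix)  # e.g. '.cpython-37m-x86_64-linux-gnu.so' on Linux/CPython 3.7
```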
Python now defines a more limited stable ABI that should be compatible across Python versions. Cython is working towards supporting that but at the moment it is not in a usable state. In a year or so it should be a good solution.
In summary, .so files are not expected to be portable between different Python versions. You should either standardise on a Python version or build the .so files locally.
LibreOffice 3.5 includes a grammar checker, called (or maybe based on) LightProof. From what I have read, LightProof seems to be a Python library that can be used to check for custom grammar rules. But I can not for the life of me find a project page for LightProof.
The closest I got was http://cgit.freedesktop.org/libreoffice/lightproof/tree/, which seems to be the code for the LibreOffice extension, not LightProof itself.
So is LightProof actually a library that can be implemented in other applications, or is it just a code word for a LibreOffice feature?
LightProof has been superseded by LanguageTool, the source of which is available on GitHub.
It's not possible yet, but: "we will be able to make a stand-alone version of Lightproof/Grammalecte as it won’t be necessary to use Hunspell anymore" - see Olivier R.'s post to the LibreOffice mailing list.
What's the appropriate way of constructing a Python package via distutils when that package relies on a large system library?
I found this similar question, but it refers to an installable Python app, not a generic package.
I've written a package that relies on OpenCV. I'm only concerned with supporting Linux distros, but most distros either don't provide OpenCV or provide a version that's too old to use. Unfortunately, OpenCV is too large and cumbersome (and depends on several other system libraries) to include in the package and compile during the build step.
My current approach is to simply do nothing special in my setup.py and just import its Python modules in a try/except, showing a detailed error message if the import fails. Is there a better way?
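For reference, the try/except approach can be factored into a small helper; a sketch (the `require` function and the hint text are my own hypothetical names, not a standard API):

```python
# Guarded import with an actionable error message for a system-level
# dependency that pip cannot install. "require" is a hypothetical helper.
import importlib
import sys

def require(module_name, hint):
    """Import module_name, or exit with a detailed error message."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        sys.exit("Missing dependency %r. %s" % (module_name, hint))

# Usage at package import time, e.g.:
# cv2 = require("cv2", "Install your distro's OpenCV Python bindings, "
#               "e.g. 'apt-get install python-opencv' on Debian/Ubuntu.")
```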
You could use zc.buildout: http://www.buildout.org/
You should be able to extend the buildout config for your project from this one: https://github.com/cpsaltis/opencv-buildout
I have built a Python application using an external library (the lxml module). It runs fine on my system. Is there any way to compile or package this code so that I can run it on another system that does not have lxml installed?
If possible, could you also give me a brief pointer on *.pyd files?
PyInstaller would be a good way to go to package your code.
It works in a configure/make/build workflow (before which you set up a small spec file with various options). The external package will be shipped along with your application.
lxml is supported in PyInstaller: http://www.pyinstaller.org/wiki/SupportedPackages.
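A minimal spec file might look roughly like this; treat it as an illustrative sketch rather than an exact recipe, since the names are examples and the precise Analysis/EXE arguments vary between PyInstaller versions:

```python
# myapp.spec -- illustrative PyInstaller spec (names are examples; the
# exact Analysis/EXE signatures depend on your PyInstaller version).
a = Analysis(['myapp.py'],
             # lxml has historically needed hidden imports like these:
             hiddenimports=['lxml.etree', 'lxml._elementpath'])
pyz = PYZ(a.pure)
exe = EXE(pyz,
          a.scripts,
          a.binaries,
          a.zipfiles,
          a.datas,
          name='myapp')
```

Building from the spec then produces a bundle in dist/ that includes lxml, so the target machine needs nothing preinstalled.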
As for being able to compile your code on another machine, Marcin had a good suggestion.
You can always copy the module to your local path and import it from there, kind of like what Django did with json until it was included in the standard library.
This could do the trick:
try:
    import lxml
except ImportError:
    # fall back to a copy of the module bundled with the application
    import myownmodules.lxml as lxml
I know this is the less "high-tech" approach, but if the problem is simple enough, this is what I would do without a blink.
Besides, our buddy Tim seems to agree: "If the implementation is easy to explain, it may be a good idea."
For Windows, package it up using py2exe; for OS X, use py2app; and for Linux, possibly cx_Freeze.
The "final" release of Python for .NET (link) isn't pre-compiled for Python 2.6. I don't have a problem changing the compilation symbol to PYTHON26 as specified in the docs, the solution rebuilds just fine, but when attempting to import the CLR, I get an error indicating that python26.dll is missing. Anyone know how to build this file? It doesn't seem to be part of the solution, but I might be missing something obvious.
I managed it by following these instructions by one Feihong Hsu.
I notice now that the author has made a follow-up post that may be important too if you're using SP1.