Here's the situation. The company I work for has quite a bit of existing Tcl code, but some of the team want to start using Python. It would be nice to be able to reuse some of the existing Tcl code, because that's money already spent. Besides, some of the test equipment only has Tcl APIs.
So, one of the ways I thought of was using the subprocess module to call into some Tcl scripts.
Is subprocess my best bet?
Has anyone used this fairly new piece of code: Plumage? If so what is your experience (not just for Tk)?
Any other possible ways that I have not considered?
I hope you're ready for this. Standard Python
import Tkinter
tclsh = Tkinter.Tcl()
tclsh.eval("""
proc unknown args {puts "Hello World!"}
}"!dlroW olleH" stup{ sgra nwonknu corp
""")
Edit, in reply to the comment: Python's Tcl interpreter is not aware of other installed Tcl components. You can deal with that by adding extensions, in the usual way, to the Tcl that Python actually uses. Here's a link with some detail:
How Tkinter can exploit Tcl/Tk extensions
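A sketch of how that looks in practice; the path and package name below are placeholders, and the extension has to be installed for the Tcl that Python actually links against:
import Tkinter
tclsh = Tkinter.Tcl()
# make the extension visible to the embedded interpreter, then load it as usual
tclsh.eval("lappend auto_path /path/to/your/tcl/packages")
tclsh.eval("package require YourTestEquipmentApi")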
This can be done.
http://wiki.tcl.tk/13312
Specifically, look at the typcl extension.
Typcl is a bit weird... It's an extension for using Tcl from Python.
It doesn't really require CriTcl and could have been done in standard C.
This code demonstrates using Tcl as shared library, and hooking into it
at run time (Tcl's stubs architecture makes this delightfully simple).
Furthermore, Typcl avoids string conversions where possible (both ways).
I've not used it myself, but SWIG might help you out:
http://www.swig.org/Doc1.1/HTML/Tcl.html
I have been programming with Python for some time and noticed that it's possible to interact with, for example, MS Excel files through the xlwt library.
Now I would like to know if it's possible to use Python to control other applications, such as the Calculator.exe which is on the standard Windows path C:\Windows\system32.
Is there any way to write a script with Python to, let's say, make the calculator open and compute 9 + 3? I usually like to write some code myself first and ask for help later, but here I have no clue whether it's even possible, and my research on the Internet has yielded only this script to launch the program:
import subprocess
subprocess.call(r"C:\Windows\system32\calc.exe")  # raw string so the backslashes are not treated as escape sequences
Any help, suggestion or even just "no, it's not possible" would be highly appreciated.
It will always depend on the cooperation of the other program. If it allows itself to be driven from outside, it will offer an API for this (and, hopefully, documentation telling you how to use it).
This is not really a Python question, because it rather depends on how that API is written. If the API comes as a C library, you will have to write at least a bit of C code to access it from Python. If it is a way of calling the program (special options, etc.), then Python will have no more trouble providing these than any other programming language.
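As a sketch of the second case (the tool name and options below are hypothetical placeholders; calc.exe itself does not accept arguments like these), driving a program through its command line and capturing its output looks like this:
import subprocess
# "some_tool.exe" and its arguments stand in for a program that actually
# accepts command-line options and writes its answer to stdout.
p = subprocess.Popen(["some_tool.exe", "--add", "9", "3"],
                     stdout=subprocess.PIPE)
output, _ = p.communicate()
print output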
Long story short, a piece of code that I'm working with at work has the line:
from System import System
with a later bit of code of:
desc_ = System()
xmlParser = Parser(desc_.getDocument())
# xmlParser.setEntityBase(self.dtdBase)
for featureXMLfile in featureXmlList.split(","):
    print featureXMLfile
    xmlParser.parse(featureXMLfile)
feat = desc_.get(featureName)
return feat
Parser is an XML parser in Java (it's included in a different import), but I don't get what the desc_ bit is doing. I mean obviously, it somehow holds the feature that we're trying to pull out, but I don't entirely see where. Is System a standard library in Python or Java, or am I looking at something custom?
Unfortunately, everyone else in my group is out for Christmas Eve vacation, so I can't ask them directly. Thank you for your help. I'm still not horribly familiar with Python.
This isn't from the standard library, so you'll need to check your system (Python has plenty of introspection to help you with that).
You can tell because modules in the Python standard library use lowercase names, as per PEP 8, or by searching the library reference.
Note as well that Python has its own XML parsing tools, which will be much nicer to work with in Python than Java's.
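For instance, a minimal sketch with the standard library's xml.etree.ElementTree (the filename is just a placeholder):
import xml.etree.ElementTree as ET
tree = ET.parse("feature.xml")   # placeholder file name
root = tree.getroot()
for child in root:
    print child.tag, child.attrib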
Edit: As you have noted in the comments that you are using Jython, it seems likely this is Java's System package.
millimoose indicated the correct answer in his comment, but neglected to submit it as an answer, so I'm posting to indicate the correct answer. It was indeed a custom module built by my company. I was able to determine this by typing import System; print(System) into the interpreter.
When I first started reading about Python, all of the tutorials have you use Python's Interactive Mode. It is difficult to save, write long programs, or edit your existing lines (for me at least). It seems like a far more difficult way of writing Python code than opening up a code.py file and running the interpreter on that file.
python code.py
I am coming from a Java background, so I have ingrained expectations of writing and compiling files for programs. I also know that a feature would not be so prominent in Python documentation if it were not somehow useful. So what am I missing?
Let's see:
If you want to know how something works, you can just try it. There is no need to write up a file. I almost always sketch out my programs in the interpreter before writing them properly. It's not just for things you don't yet know how to do in the language. I never remember what the correct arguments to range are to create, for example, [-2, -1, 0, 1]. I don't need to: I just fire up the interpreter and try stuff until I figure out it is range(-2, 2) (did that just now, actually).
You can use it as a calculator.
Python is a very introspective programming language. If you want to know anything about an object, you can just do dir(object). If you use IPython, you can even do object.<TAB> and it will tab-complete the methods and attributes of that object. That's way faster than looking stuff up in documentation or even in code.
help(anything) for documentation. It's way faster than any web interface.
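For example, a quick sketch of that kind of session (output omitted):
>>> s = "hello world"
>>> dir(s)          # lists every method and attribute of the string
>>> help(s.split)   # shows the docstring for one of those methods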
Again, you have to use IPython (highly recommended), but you can time stuff. %timeit func1() and %timeit func2() is a common idiom to determine what is faster.
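A sketch of that idiom in an IPython session (the two functions are just placeholders):
In [1]: def f1(): return sum(range(1000))
In [2]: def f2(): return sum(xrange(1000))
In [3]: %timeit f1()
In [4]: %timeit f2()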
How often have you wanted to write a program that you'll use once and then never again? The fastest way to do this is to just do it in the Python interpreter. Sure, you have to be careful writing loops or functions (they must have the correct syntax the first time), but most stuff is just line by line, and you can play around with it.
Debugging. You don't need to put selective print statements in code to see what variables are when you write it in the interpreter. You just have to type >>> a, and it will show what a is. Nice, again, to see if you constructed something correctly. The built-in Python debugger, pdb, also uses the interpreter functionality, so you can not only see what a variable is when debugging, but you can also manipulate or even change it without halting the debugger.
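For example, a minimal sketch of dropping into the debugger from a script (the function is made up):
import pdb

def average(values):
    pdb.set_trace()   # execution pauses here with a full interactive prompt
    return sum(values) / len(values)

average([2, 4, 6])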
When people say that Python is faster to develop in, I guarantee that this is a big part of what they are talking about.
Commenters: anything I am forgetting?
REPLs (read-eval-print loops, like Python's interactive mode) provide immediate feedback to the programmer. As such, you can rapidly write and test small pieces of code and assemble those pieces into a larger program.
You're talking about running Python in the console by simply typing "python"? That's just for little tests and for practicing with the language. It's very useful when learning the language and testing out other modules.
Of course any real software project is written in .py files and later executed by the interpreter!
The Python interpreter is a least common denominator: you can run it on multiple platforms, and it acts the same way (modulo platform-specific modules), so it's pretty easy to get a newbie going with.
It's a lot easier to tell a newbie to launch the interpreter and "do this" than to have them open a file, type in some code, save it, make it executable, make sure python is in your PATH, or use a #! line, etc etc. Scrap all of that and just launch the interpreter. For simple examples, you can't beat it. It was never meant for long programs, so if you were using it for that, you probably missed the part of the tutorial that told you "longer scripts go in a file". :)
You use the interactive interpreter to test snippets of your code before you put them into your script.
As already mentioned, the Python interactive interpreter gives a quick and dirty way to test simple Python functions and/or code snippets.
I personally use the Python shell as a very quick way to perform simple numerical operations (provided by the math module). I have my environment set up so that the math module is automatically imported whenever I start a Python shell. In fact, it's a good way to "market" Python to non-Pythonistas: show them how they can use Python as a neat scientific calculator and for simple mathematical prototyping.
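One way to set that up (a sketch; the file path is just my choice) is the standard PYTHONSTARTUP environment variable, which names a file that is executed at the start of every interactive session:
# ~/.pythonstartup, with  export PYTHONSTARTUP=~/.pythonstartup  in your shell profile
# This file runs only for interactive sessions, never for scripts.
from math import *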
One thing I use interactive mode for that others haven't mentioned: to see if a module is installed. Just fire up Python and try to import the module; if it fails, then either your PYTHONPATH is broken or the module is not installed.
This is a great first step for "Hey, it's not working on my machine" or "Which Python did that get installed in, anyway" bugs.
I find the interactive interpreter very, very good for testing quick code, or to show others the Power of Python. Sometimes I use the interpreter as a handy calculator, too. It's amazing what you can do in a very short amount of time.
Aside from the built-in console, I also have to recommend Pyshell. It has auto-completion and decent syntax highlighting, and you can edit multiple lines of code at once. Of course, it's not perfect, but it's certainly better than the default Python console.
When coding in Java, you almost always have the API docs open in a browser window. With the Python interpreter, however, you can simply import any module you are thinking about using and check what it offers. You can also test the behavior of new methods you are unsure of, to eliminate the "Oh! so THAT's how it works" class of bugs.
Interactive mode makes it easy to test code snippets before incorporating them into a larger program. If you use IDLE there's syntax highlighting and argument pop-ups to help you out. It's also a quick way of checking that you've figured out how to use a module without having to write a test program.
I just discovered http://code.google.com/p/re2, a promising library that uses a long-neglected approach (Thompson NFA) to implement a regular expression engine that can be orders of magnitude faster than the engines available in awk, Perl, or Python.
So I downloaded the code and did the usual sudo make install thing. However, that seemingly did little more than add /usr/local/include/re2/re2.h to my system. There also seemed to be some *.a file, but then what is this *.a extension about?
I would like to use re2 from Python (preferably Python 3.1) and was excited to see files like make_unicode_groups.py in the distro (maybe just used during the build process?). Those, however, were not deployed on my machine.
How can I use re2 from Python?
Update: two friendly people have pointed out that I could try to build DLL/*.so files from the sources and then use Python's ctypes library to access those. Can anyone give useful pointers on how to do just that? I'm pretty much clueless here, especially with the first part (building the *.so files).
Update: I have also posted this question (earlier) to the re2 developers' group, without reply so far (it is a small group), and today to the (somewhat more populous) comp.lang.py group [—thread here—]. The hope is that people from various corners can contact each other. My guess is that a skilled person could do this in a few hours of their 20% your-free-time-belongs-to-Google-too timeslice; it would tie me up for weeks. Is there a tool to automatically dumb down C++ to whatever flavor of C that Python needs in order to connect? Then maybe getting a viable result could be reduced to clever tool chaining.
(rant) Why is this so difficult? To think that in 2010 we still cannot have our abundant pieces of software just talk to each other. This is such a roadblock: whenever you want to address some C code from Python, you must always cruft up these linking bits. That requires a lot of work, but only delivers an extension module specific to the version of the C code and the version of Python, so it ages fast. (/rant) Would it be possible to run such things in separate processes (say, if I had an re2 executable that could produce results for data coming in on, say, subprocess/Popen/communicate())? This should not be a pure command-line tool that necessitates opening a new process each time it is needed, but a single process that runs continuously; maybe there exist wrappers that sort of "daemonize" such C code.
David Reiss has put together a Python wrapper for re2. It doesn't have all of the functionality of Python's re module, but it's a start. It's available here: http://github.com/facebook/pyre2.
Possible yes, easy no. Looking at the re2.h, this is a C++ library exposed as a class. There are two ways you could use it from Python.
1.) As Tuomas says, compile it as a DLL/.so and use ctypes. In order to use it from Python, though, you would need to wrap the object constructor and methods in C-style extern functions. I've done this in the past with ctypes by externing functions that pass a pointer to the object around: the "init" function returns a void pointer to the object, which then gets passed on each subsequent method call. Very messy indeed.
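A hedged sketch of that pattern from the Python side, assuming you have written and compiled a small C shim around re2 that exports plain C functions (all the names below, including libre2shim.so, are hypothetical):
import ctypes

# assumed C exports from the hand-written shim:
#   void* re2_new(const char* pattern);
#   int   re2_full_match(void* obj, const char* text);
#   void  re2_delete(void* obj);
lib = ctypes.CDLL("./libre2shim.so")
lib.re2_new.restype = ctypes.c_void_p
lib.re2_new.argtypes = [ctypes.c_char_p]
lib.re2_full_match.restype = ctypes.c_int
lib.re2_full_match.argtypes = [ctypes.c_void_p, ctypes.c_char_p]
lib.re2_delete.argtypes = [ctypes.c_void_p]

rx = lib.re2_new(b"h.*o")                  # the "init" returns an opaque handle
print(lib.re2_full_match(rx, b"hello"))    # the handle is passed to every call
lib.re2_delete(rx)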
2.) Wrap it into a true Python extension module. Again, the functions exposed to Python would need to be extern "C". One option is to use Boost.Python, which would ease this work.
SWIG handles C++ (unlike ctypes), so it may be more straightforward to use it.
You could try to build re2 into its own DLL/so and use ctypes to call functions from that DLL/so. You will probably need to define your own entry points in the DLL/so.
You can use the Python package https://pypi.org/project/google-re2/. Do look at the bottom of that page, though: there are a few requirements to install yourself before installing the Python package.
In Python, under what circumstances is SWIG a better choice than ctypes for calling entry points in shared libraries? Let's assume you don't already have the SWIG interface file(s). And how do the two compare in performance?
I have a lot of experience using SWIG. SWIG claims that it is a rapid solution for wrapping things, but in real life...
Cons:
SWIG is developed to be general, for everyone and for 20+ languages. That generality leads to drawbacks:
- it needs configuration (SWIG .i templates), which is sometimes tricky,
- it lacks treatment of some special cases (see the note on Python properties below),
- it lacks performance for some languages.
Python cons:
1) Code style inconsistency. C++ and Python have very different code styles (that is obvious, certainly), and SWIG's ability to make the generated code more Pythonic is very limited. As an example, it is painful to create properties from getters and setters. See this Q&A.
2) Lack of a broad community. SWIG has some good documentation, but if you hit something that is not in the documentation, there is no information at all; neither blogs nor googling help. So you have to dig deep into SWIG-generated code in such cases... That is terrible, I would say...
Pros:
In simple cases, it is really rapid, easy, and straightforward.
Once you have produced the SWIG interface files, you can wrap that C++ code for ANY of the other 20+ supported languages (!!!).
One big concern about SWIG is performance. Since version 2.0.4, SWIG includes the '-builtin' flag, which makes SWIG even faster than other automated ways of wrapping; at least some benchmarks show this.
When to USE SWIG?
So I concluded for myself that there are two cases when SWIG is good to use:
1) If one needs to rapidly wrap just several functions from some C++ library for end use.
2) If one needs to wrap C++ code for several languages, or if there could potentially come a time when one needs to distribute the code for several languages. Using SWIG is reliable in this case.
Live experience
Update:
A year and a half has passed since we converted our library using SWIG.
First, we made a Python version. There were several moments when we ran into trouble with SWIG, it is true. But right now we have expanded our library to Java and .NET, so we have 3 languages with 1 SWIG, and I can say that SWIG rocks in terms of saving a LOT of time.
Update 2:
We have now been using SWIG for this library for two years. SWIG is integrated into our build system. Recently we had a major API change in the C++ library, and SWIG worked perfectly. The only thing we needed to do was add several %rename directives to the .i files so our CppCamelStyleFunctions() now look_more_pythonish in Python. At first I was concerned about problems that could arise, but nothing went wrong. It was amazing: just a few edits and everything was distributed in 3 languages. Now I am confident that using SWIG was a good solution in our case.
Update 3:
We have now used SWIG for our library for 3+ years. Major change: the Python part was totally rewritten in pure Python. The reason is that Python is now used for the majority of our library's applications. Even though the pure Python version is slower than the C++ wrapping, it is more convenient for users to work with pure Python and not struggle with native libraries.
SWIG is still used for .NET and Java versions.
The main question here is: "Would we use SWIG for Python if we started the project from the beginning?" We would! SWIG allowed us to rapidly distribute our product to many languages. It worked for a period of time, which gave us the opportunity to better understand our users' requirements.
SWIG generates (rather ugly) C or C++ code. It is straightforward to use for simple functions (things that can be translated directly) and reasonably easy to use for more complex functions (such as functions with output parameters that need an extra translation step to represent in Python.) For more powerful interfacing you often need to write bits of C as part of the interface file. For anything but simple use you will need to know about CPython and how it represents objects -- not hard, but something to keep in mind.
ctypes allows you to directly access C functions, structures and other data, and load arbitrary shared libraries. You do not need to write any C for this, but you do need to understand how C works. It is, you could argue, the flip side of SWIG: it doesn't generate code and it doesn't require a compiler at runtime, but for anything but simple use it does require that you understand how things like C datatypes, casting, memory management and alignment work. You also need to manually or automatically translate C structs, unions and arrays into the equivalent ctypes datastructure, including the right memory layout.
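For example, mirroring a C struct in ctypes looks like this (a sketch; the library and function are hypothetical):
import ctypes

# assumed C side, exported by a hypothetical libpoint.so:
#   struct point { int x; int y; };
#   int point_norm2(struct point *p);
class Point(ctypes.Structure):
    _fields_ = [("x", ctypes.c_int),
                ("y", ctypes.c_int)]

lib = ctypes.CDLL("./libpoint.so")
lib.point_norm2.argtypes = [ctypes.POINTER(Point)]
lib.point_norm2.restype = ctypes.c_int

p = Point(3, 4)
print(lib.point_norm2(ctypes.byref(p)))   # expected 25, i.e. x*x + y*y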
It is likely that in pure execution, SWIG is faster than ctypes -- because the management around the actual work is done in C at compiletime rather than in Python at runtime. However, unless you interface a lot of different C functions but each only a few times, it's unlikely the overhead will be really noticeable.
In development time, ctypes has a much lower startup cost: you don't have to learn about interface files, you don't have to generate .c files and compile them, you don't have to check for and silence warnings. You can just jump in and start using a single C function with minimal effort, then expand to more. And you get to test and try things out directly in the Python interpreter. Wrapping lots of code is somewhat tedious, although there are attempts to make that simpler (like ctypes-configure).
SWIG, on the other hand, can be used to generate wrappers for multiple languages (barring language-specific details that need filling in, like the custom C code I mentioned above.) When wrapping lots and lots of code that SWIG can handle with little help, the code generation can also be a lot simpler to set up than the ctypes equivalents.
ctypes is very cool and much easier than SWIG, but it has the drawback that poorly or malevolently written Python code can actually crash the Python process. You should also consider Boost.Python. IMHO it's actually easier than SWIG while giving you more control over the final Python interface. If you are using C++ anyway, you also don't add any other languages to your mix.
In my experience, ctypes does have a big disadvantage: when something goes wrong (and it invariably will for any complex interface), it's hell to debug.
The problem is that a big part of your stack is obscured by ctypes/ffi magic, and there is no easy way to determine how you got to a particular point and why parameter values are what they are.
You can also use Pyrex, which can act as glue between high-level Python code and low-level C code. lxml is written in Pyrex, for instance.
ctypes is great, but does not handle C++ classes. I've also found ctypes is about 10% slower than a direct C binding, but that will highly depend on what you are calling.
If you are going to go with ctypes, definitely check out the Pyglet and PyOpenGL projects, which have extensive examples of ctypes bindings.
I'm going to be contrarian and suggest that, if you can, you should write your extension library using the standard Python API. It's really well-integrated from both a C and Python perspective... if you have any experience with the Perl API, you will find it a very pleasant surprise.
ctypes is nice too, but as others have said, it doesn't do C++.
How big is the library you're trying to wrap? How quickly does the codebase change? Any other maintenance issues? These will all probably affect the choice of the best way to write the Python bindings.
Just wanted to add a few more considerations that I didn't see mentioned yet.
[EDIT: Oops, didn't see Mike Steder's answer]
If you want to try using a non-CPython implementation (like PyPy, IronPython, or Jython), then ctypes is about the only way to go. PyPy doesn't allow writing C extensions, so that rules out Pyrex/Cython and Boost.Python. For the same reason, ctypes is the only mechanism that will work for IronPython and (eventually, once they get it all working) Jython.
As someone else mentioned, no compilation is required. This means that if a new version of the .dll or .so comes out, you can just drop it in and load that new version. As long as none of the interfaces changed, it's a drop-in replacement.
Something to keep in mind is that SWIG targets only the CPython implementation. Since ctypes is also supported by the PyPy and IronPython implementations it may be worth writing your modules with ctypes for compatibility with the wider Python ecosystem.
I have found SWIG to be a little bloated in its approach (in general, not just for Python) and difficult to implement without having to cross the sore point of writing code with an explicit mindset of being SWIG-friendly, rather than writing clean, well-written code. It is, IMHO, a much more straightforward process to write C bindings to C++ (if using C++) and then use ctypes to interface to that C layer.
If the library you are interfacing to has a C interface as part of the library, another advantage of ctypes is that you don't have to compile a separate Python binding library to access third-party libraries. This is particularly nice for formulating a pure-Python solution that avoids cross-platform compilation issues (for third-party libs offered on disparate platforms). Having to embed compiled code in a package you wish to deploy somewhere like PyPI in a cross-platform-friendly way is a pain; one of my most irritating points about Python packages using SWIG or explicit underlying C code is their general unavailability across platforms. So consider this if you are working with cross-platform-available third-party libraries and developing a Python solution around them.
As a real-world example, consider PyGTK. This (I believe) uses SWIG to generate C code to interface with the GTK C calls. I used it for only the briefest time and found it a real pain to set up and use, with quirky, odd errors if you didn't do things in the correct order on setup and just in general. It was such a frustrating experience that when I looked at the interface definitions provided by GTK on the web, I realized what a simple exercise it would be to write a translator from those interfaces to Python ctypes interfaces. A project called PyGGI was born, and in ONE day I was able to rewrite PyGTK as a much more functional and useful product that maps cleanly onto the GTK C object-oriented interfaces. And it required no compilation of C code, which makes it cross-platform friendly. (I was actually after interfacing with webkitgtk, which isn't so cross-platform.) I can also easily deploy PyGGI to any platform supporting GTK.