In Python, using Twisted's LoopingCall, multiprocessing.Process, and multiprocessing.Queue: is it possible to create a zombie process? And, if so, how?
A zombie is a process which has completed but whose completion has not yet been noticed by the process which started it. It's the Twisted process's responsibility to reap its children.
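This is not specific to Twisted; a minimal sketch (assuming a Unix-like system, with the child's work elided) shows how multiprocessing alone can leave a zombie behind when the parent never joins its child:

```python
import multiprocessing
import os
import time

def worker():
    pass  # the child does its work and exits immediately

if __name__ == '__main__':
    p = multiprocessing.Process(target=worker)
    p.start()
    time.sleep(1)  # the child has exited by now, but we never joined it
    # On Linux, this listing shows the child as <defunct> (a zombie);
    # it stays that way until the parent reaps it.
    os.system('ps --ppid %d -o pid,stat,comm' % os.getpid())
    p.join()  # waiting on the child reaps it and removes the zombie
```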
If you start the process with spawnProcess, everything should always work as expected. However, as described in bug #733 in Twisted (which has long been fixed), there are a plethora of nasty edge cases when you want to use Twisted with other functions that spawn processes, since Python's APIs historically made it difficult for different signal handlers to cooperate.
This is all fixed in recent versions of the code, but I believe you may still encounter this bug under the following conditions:
You are using a version of Twisted earlier than 10.1.
You are using a version of Python earlier than 2.6.
You are not building Twisted's native extension modules (if you're working from a development checkout or unpacked tarball rather than an installed version, you can fix this with python setup.py build_ext -i).
You are using a module like os.popen or subprocess.
Hopefully upgrading Twisted or running the appropriate command will fix your immediate issue, but you should still consider using spawnProcess, since that lets you treat process output as a normal event in the reactor event loop.
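For reference, here is a minimal sketch of the spawnProcess approach (the /bin/echo command is just a placeholder):

```python
from twisted.internet import reactor, protocol

class EchoProtocol(protocol.ProcessProtocol):
    def outReceived(self, data):
        # the child's stdout arrives as an ordinary reactor event
        print('child said: %r' % data)

    def processEnded(self, reason):
        # the reactor reaps the child for us, so no zombie is left behind
        print('child exited: %s' % reason.value)
        reactor.stop()

reactor.spawnProcess(EchoProtocol(), '/bin/echo',
                     args=['echo', 'hello'], env={})
reactor.run()
```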
Does anybody know if I can safely compile multiple extensions concurrently with threading?
I realise this might not speed things up (although compilers are run in subprocesses, so maybe!), but I'm in a situation where GUI actions can start a simulation which may involve a compilation step, so I'd like to know whether I need to prevent multiple simulations from compiling at the same time, or whether this is fine.
Not only are they not thread-safe, they're not process-safe: you cannot call setup() multiple times in one process. See an example of the errors this causes.
To work around the limitation you have to run python setup.py or pip in a subprocess, and then the question of thread-safety no longer arises.
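A minimal sketch of that workaround (the helper name and directory layout are hypothetical):

```python
import subprocess
import sys

def build_extension(source_dir):
    # Each build runs setup() in a fresh interpreter, so the
    # once-per-process limitation never comes into play.
    subprocess.check_call(
        [sys.executable, 'setup.py', 'build_ext', '--inplace'],
        cwd=source_dir)
```

Several such builds can then be launched from threads, provided they write to distinct directories.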
I have a Kivy application in Python which uses some threads.
As Python is not able to run these threads on different cores due to the Global Interpreter Lock, I would like to try PyPy and see if I can make the threads run faster on different cores, since PyPy is different and offers stackless (whatever that is? :).
Does somebody have some information to share on how to make a simple Python program, which launches some threads with the threading module, run with the PyPy interpreter so that it uses this stackless feature?
PyPy won't solve Python's problem of running only a single thread at a time, since it also makes use of the GIL: http://doc.pypy.org/en/latest/faq.html#does-pypy-have-a-gil-why
Besides that, Kivy is a complex project embedding Python itself; although I don't know it very well, I doubt it is possible to swap the Python it uses for PyPy.
Depending on what you are doing, you may want to use the multiprocessing module instead of threading. It is a near drop-in replacement that makes transparent inter-process calls to Python functions, and it can therefore take advantage of multiple cores.
https://docs.python.org/3/library/multiprocessing.html
This is standard in CPython and can likely be used from within Kivy, if (and only if) the code in the subprocesses just takes care of number-crunching and so on, and all user interaction and display updates are done in the main process.
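A minimal sketch of that split, with a made-up crunch function standing in for the simulation work:

```python
import multiprocessing

def crunch(n):
    # pure number-crunching: no GUI or display calls in the worker
    return sum(i * i for i in range(n))

if __name__ == '__main__':
    # each worker is a separate OS process with its own interpreter
    # (and its own GIL), so tasks can run on different cores
    pool = multiprocessing.Pool()
    print(pool.map(crunch, [10**6, 2 * 10**6, 3 * 10**6]))
    pool.close()
    pool.join()
```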
I am thinking about making a program that will need to send input to and take output from the various aircrack-ng suite tools. I know of a couple of Python modules, like subprocess, envoy, sarge and pexpect, that would provide the necessary functionality. Can anyone advise on what I should or shouldn't use, especially as I'm new to Python?
Thanks
As the maintainer of sarge, I can tell you that its goals are broadly similar to envoy's (in terms of ease of use over subprocess), and there is (IMO) more functionality in sarge with respect to:
Cross-platform support for bash-like syntax (e.g. use of &&, ||, & in command lines)
Better support for capturing subprocess output streams and working with them asynchronously (see the sketch after this list)
More documentation, especially about the internals and peripheral issues like threading+forking in the context of using subprocess
Support for prevention of shell injection attacks
Of course YMMV, but you can check out the docs; they're reasonably comprehensive.
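For instance, here is a minimal sketch of that asynchronous capture, assuming sarge's documented run/Capture API (the ping command and the timeout are just placeholders):

```python
from sarge import run, Capture

# start the command without blocking; its stdout is captured as it appears
p = run('ping -c 3 localhost', stdout=Capture(), async_=True)

# consume output incrementally, e.g. to drive a progress display
while True:
    line = p.stdout.readline(timeout=2.0)  # bytes; empty on timeout/EOF
    if not line:
        break
    print('got: %r' % line)

p.wait()
```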
pexpect
As of 2015, pexpect does not work on Windows. It is rumored that "experimental" support will be added in the next version, but this has been a rumor for a long time (I'm not holding my breath).
Having written many applications using pexpect (and loving it), I am now sorry, because one of the things I love about Python (that it is cross-platform) is not true for my applications.
If you ever plan to add Windows support, avoid pexpect for the moment.
envoy
Not much activity in the last year, and few commits (12 total) since 2012. Not very promising for its future.
Internally it uses shlex in a way that is not compatible with Windows paths (commands must use '/', not '\', as directory separators). A workaround (when using pathlib) is to call as_posix() on Path objects before passing them as commands. See this answer.
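A minimal illustration of that workaround, with a hypothetical tool path:

```python
from pathlib import Path

exe = Path(r'C:\Tools\aircrack-ng\bin\aircrack-ng.exe')
cmd = '%s --help' % exe.as_posix()
# cmd is now 'C:/Tools/aircrack-ng/bin/aircrack-ng.exe --help',
# which envoy's shlex-based parsing can handle
```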
Getting access to the internal streams (i.e. I want to parse the output to drive some updating scrollbars) seems possible, but is not documented.
sarge
Works on Windows out of the box and has an expect() method that should provide functionality similar to pexpect's (allowing me to update a scrollbar). It has seen recent activity, but it is hosted on both GitLab and Bitbucket (very confusing).
Personal Conclusion
I'm moving from pexpect to sarge for future development. It seems to provide a feature set similar to pexpect's, and it supports Windows.
subprocess - a standard library module, so it will be available with any Python installation. But it has a reputation for being hard to use, since its API is non-intuitive.
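Even so, simple cases are manageable; a minimal sketch (the command is just an example):

```python
import subprocess

# run a command and capture its stdout; raises CalledProcessError
# if the command exits with a non-zero status
output = subprocess.check_output(['iwconfig'], stderr=subprocess.STDOUT)
print(output.decode())
```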
envoy - a third-party module that wraps subprocess. It was written to be an easy-to-use alternative to subprocess. The author of envoy, Kenneth Reitz, is famous for his Python for Humans philosophy.
I'm not familiar with the other two.
I have to perform some task whenever certain applications are launched. Is there any way to list all the processes running in an operating system, or to detect whenever a new process is created in the operating system?
Yeah, there is a fair bit of the standard Python library that Jython doesn't implement, probably because it's platform-dependent and too much work for the small Jython community. Try looking for a Java solution; it's trivial to invoke Java code from Jython.
You could use psutil. psutil provides information on running processes and system utilization.
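psutil has no push-style notification for process creation, so a common approach is to poll the process table; a minimal sketch (the target name is just an example):

```python
import time
import psutil

def watch_for(name, interval=1.0):
    # remember the processes that already exist
    seen = {p.pid for p in psutil.process_iter()}
    while True:
        for proc in psutil.process_iter(['pid', 'name']):
            if proc.pid not in seen and proc.info['name'] == name:
                print('%s launched with pid %d' % (name, proc.pid))
            seen.add(proc.pid)
        time.sleep(interval)

watch_for('notepad.exe')
```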
Given the absence of a Windows fork() call, how is the multiprocessing package in Python 2.6 implemented under Windows? On top of Win32 threads, some sort of fake fork, or just compatibility on top of the existing multithreading?
It's done using a subprocess call to sys.executable (i.e. starting a new Python process), followed by serializing all of the globals and sending those over the pipe: a poor man's cloning of the current process. This is the cause of the extra restrictions found when using multiprocessing on the Windows platform.
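That design is why, on Windows, everything passed to a child must be picklable and the spawning code must be guarded; a minimal sketch of the required idiom:

```python
import multiprocessing

def work(x):
    return x * x

# With no fork(), each worker starts as a fresh interpreter that
# re-imports this module; without the __main__ guard, that import
# would try to spawn workers recursively.
if __name__ == '__main__':
    pool = multiprocessing.Pool(2)
    print(pool.map(work, range(5)))
    pool.close()
    pool.join()
```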
You may also be interested in Jesse Noller's PyCon talk about multiprocessing, in which he discusses its use.