launch lamson as a process - python

Can I programmatically launch the lamson SMTP server from inside my python code?
It can be launched from the command line as:

    lamson start

but I want to launch it from code using the multiprocessing package:
    from multiprocessing import Process

    def f(name):
        # what to put here?
        pass

    if __name__ == '__main__':
        p = Process(target=f, args=('bob',))
        p.start()
        p.join()
What should go in the f function to launch a lamson server? How can this be achieved?
thanks
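
One minimal sketch (not lamson's own Python API, whose exact entry points depend on the lamson version): have the worker function shell out to the same CLI command you already use, assuming the lamson executable is on PATH and the current working directory is the lamson project where `lamson start` normally works:

    from multiprocessing import Process
    import subprocess

    def f(name):
        # Run the same command used on the command line; this blocks until
        # the `lamson start` command itself returns.
        subprocess.run(['lamson', 'start'], check=True)

    if __name__ == '__main__':
        p = Process(target=f, args=('bob',))
        p.start()
        p.join()

If `lamson start` daemonizes itself, the subprocess call may return almost immediately even though the SMTP server keeps running in the background.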

Running a small python program in background inside Jupyter notebook without blocking the main process

Let's suppose I have this simple function:

    from time import sleep

    def fun():
        for i in range(5):
            print(i)
            sleep(2)

and I want to run it in the background without interrupting the main code flow. Is this achievable?
I tried saving the code in test.py and did:
    from IPython.lib.backgroundjobs import BackgroundJobFunc

    with open('test.py') as code:
        job = BackgroundJobFunc(exec, code.read())
    result = job.run()
It printed 0 and exited.
I also tried:
    from subprocess import Popen, PIPE

    process = Popen(['python', 'test.py'], stdout=PIPE, stderr=PIPE)
    stdout, stderr = process.communicate()
    print(stdout)
and
    from threading import Thread

    thread = Thread(target=fun)
    thread.start()
    thread.join()
    print("thread finished...exiting")
Both blocked the main process; I could not do anything before it finished its execution (communicate() waits for the subprocess to exit, and thread.join() waits for the thread to finish). Is there a different way?
Creating a daemon thread solved the problem (with one caveat: the output is printed into whichever cell you are currently working in):
    import threading

    t1 = threading.Thread(target=fun)
    t1.setDaemon(True)
    t1.start()
Any corrections / suggestions to this?
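
One small suggestion, offered as a sketch rather than a definitive fix: setDaemon() is the legacy spelling, and the same flag can be passed straight to the Thread constructor. Keep in mind that a daemon thread is killed abruptly when the interpreter shuts down, so it should not hold resources that need a clean shutdown:

    import threading
    from time import sleep

    def fun():
        for i in range(5):
            print(i)
            sleep(2)

    # daemon=True means the thread will not keep the process alive; the cell
    # returns immediately while fun() keeps printing in the background.
    t1 = threading.Thread(target=fun, daemon=True)
    t1.start()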

How do I use multiprocessing.Queue from a process with a pre-existing Pipe?

I am trying to use multiprocessing from inside another process that was spawned with Popen. I want to be able to communicate between this process and a new child process, but this "middle" process has a polling read on the pipe with its parent, which seems to block execution of its child process.
Here is my file structure:
entry.py
    import subprocess, threading, time, sys

    def start():
        # Create process 2
        worker = subprocess.Popen([sys.executable, "-u", "mproc.py"],
                                  # When creating the subprocess with an open pipe to stdin and
                                  # subsequently polling that pipe, it blocks further communication
                                  # between subprocesses
                                  stdin=subprocess.PIPE,
                                  close_fds=False,)
        t = threading.Thread(args=(worker))
        t.start()
        time.sleep(4)

    if __name__ == '__main__':
        start()
mproc.py
    import multiprocessing as mp
    import time, sys, threading

    def exit_on_stdin_close():
        try:
            while sys.stdin.read():
                pass
        except:
            pass

    def put_hello(q):
        # We never reach this line if exit_poll.start() is uncommented
        q.put("hello")
        time.sleep(2.4)

    def start():
        exit_poll = threading.Thread(target=exit_on_stdin_close, name="exit-poll")
        exit_poll.daemon = True
        # This daemon thread polling stdin blocks execution of subprocesses
        # But ONLY if running in another process with stdin connected
        # to its parent by PIPE
        exit_poll.start()

        ctx = mp.get_context('spawn')
        q = ctx.Queue()
        p = ctx.Process(target=put_hello, args=(q,))
        # Create process 3
        p.start()
        p.join()
        print(f"result: {q.get()}")

    if __name__ == '__main__':
        start()
My desired behavior is that when running entry.py, mproc.py should run as a subprocess and be able to communicate with its own subprocess to get the Queue output. This does happen if I don't start the exit-poll daemon thread:

    $ python -u entry.py
    result: hello
but if exit-poll is running, then process 3 blocks as soon as it's started. The put_hello method isn't even entered until the exit-poll thread ends.
Is there a way to create a process 3 from process 2 and communicate between the two, even while the pipe between processes 1 and 2 is being used?
Edit: I can only consistently reproduce this problem on Windows. On Linux (Ubuntu 20.04 WSL) the Queues are able to communicate even with exit-poll running, but only if I'm using the spawn multiprocessing context. If I change it to fork then I get the same behavior that I see on Windows.
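
Not a definitive answer, but the description itself suggests one way to sidestep the problem: since it is the blocking sys.stdin.read() in the exit-poll thread that interferes with spawning process 3, the "exit when the parent goes away" behaviour can be implemented without touching stdin at all, for example by polling whether the parent process is still alive. A minimal sketch, assuming the third-party psutil package is acceptable and that process 1 passes its own pid to process 2 (for instance as a command-line argument; that plumbing is not shown here):

    # mproc.py (sketch): watch the parent pid instead of blocking on stdin
    import os, time, threading

    import psutil  # third-party: pip install psutil

    def exit_when_parent_dies(parent_pid, interval=1.0):
        # Poll the parent pid instead of doing a blocking sys.stdin.read().
        while psutil.pid_exists(parent_pid):
            time.sleep(interval)
        os._exit(0)  # the parent is gone, so terminate this process

    def start(parent_pid):
        watcher = threading.Thread(target=exit_when_parent_dies,
                                   args=(parent_pid,), daemon=True)
        watcher.start()
        # ... create the spawn-context Queue and Process exactly as before ...

Because nothing reads the stdin pipe anymore, the child created with the spawn context should no longer be affected by it, though this only addresses the symptom described in the question rather than explaining the underlying Windows handle-inheritance behaviour.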

Google colab: printing in child processes

I can't see any output in Google Colab when I use a Python Process. I tried the print function and the logging module, but neither works.
This simple example produces output on my machine (Jupyter notebook, Python 3.6.9) but doesn't work in Colab:
    from multiprocessing import Process
    import time

    def simple_fun(proc_id):
        while True:
            time.sleep(1)
            print(proc_id)

    N_PROCESS = 2
    processes = []
    for i in range(N_PROCESS):
        p = Process(target=simple_fun, args=(i,))
        p.start()
        processes.append(p)
Is there anything I can do? Am I missing something? Maybe the code above is platform-dependent?
This Process example works on Google Colab (Python 3.6.9):
    from multiprocessing import Process
    import os

    def info(title):
        print(title)
        print('module name:', __name__)
        if hasattr(os, 'getppid'):  # only available on Unix
            print('parent process:', os.getppid())
        print('process id:', os.getpid())

    def f(name):
        info('function f')
        print('hello', name)

    if __name__ == '__main__':
        info('main line')
        p = Process(target=f, args=('bob',))
        p.start()
        p.join()
I found that it doesn't work only when I don't wait for a process with the join function. I can't block on all the processes at once, but if the work takes approximately the same amount of time for every process, I can block on just one of them:
    from multiprocessing import Process
    import time
    import logging

    def simple_fun(proc_id):
        while True:
            time.sleep(1)
            # logging.info(proc_id)
            print(proc_id)

    N_PROCESS = 2
    processes = []
    for i in range(N_PROCESS):
        p = Process(target=simple_fun, args=(i,))
        p.start()
        processes.append(p)

    processes[0].join()  # <- wait for one of the processes
This one works.
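
If the children do not take roughly the same time, a slightly more general variant of the same idea (a sketch under the same setup as above) is to join every child, so the cell stays blocked, and the output visible, for as long as any of them is still running:

    # Same setup as above: join each child rather than only the first one.
    for p in processes:
        p.join()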

create threads in python in setDaemon mode

I'm on Ubuntu 16.04.6 LTS with Python 2.7.12. I'm not an expert in Python, but I have to maintain some code. Here is a snippet:
    from threading import Thread
    ...

    class Shell(cmd.Cmd):
        ...
        def do_start(self, line):
            threads = []
            t = Thread(target=traffic(line, arg1, arg2, arg3))
            threads.append(t)
            t.start()
            t.join()
        ...

    if __name__ == '__main__':
        global config
        global args
        args = parse_args()
        config = configparser.ConfigParser()
        config.read(args.FILE)
        s = Shell()
        ...
So it starts a small command-line shell where I can execute some commands. It does work; however, it blocks the CLI as the thread starts, so I googled and thought that adding t.setDaemon(True) would help. I tried it before t.start() and after, and it didn't have any effect. Is it not supported in this version, or am I doing something wrong?
Thanks.
The t.join() makes the main thread wait for the one created, so the CLI is blocked.
If you want to run your CLI without blocking the terminal, you need to run it in the background.
If you run on Linux you can simply use the & sign.
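
A second thing worth checking, offered as a sketch rather than a definitive fix: Thread(target=traffic(line, arg1, arg2, arg3)) calls traffic immediately in the main thread and hands its return value to the Thread, so the shell is blocked before the new thread even starts. Passing the callable and its arguments separately, and dropping the join() if the shell should not wait, keeps do_start responsive (traffic, arg1, arg2 and arg3 are the names from the question):

    def do_start(self, line):
        # Pass the function object plus its arguments; do not call it here.
        t = Thread(target=traffic, args=(line, arg1, arg2, arg3))
        t.setDaemon(True)   # equivalent to t.daemon = True
        t.start()
        # No t.join(): the shell prompt returns while traffic() runs.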

How to run 10 python programs simultaneously?

I have a_1.py ~ a_10.py and I want to run these 10 Python programs in parallel.
I tried:
    from multiprocessing import Process
    import subprocess
    import os

    def info(title):
        # I want to execute the python programs here
        print(title)

    def f(name):
        for i in range(1, 11):
            subprocess.Popen(['python3', f'a_{i}.py'])

    if __name__ == '__main__':
        info('main line')
        p = Process(target=f)
        p.start()
        p.join()
but it doesn't work
How do I solve this?
I would suggest using the subprocess module instead of multiprocessing:
    import os
    import subprocess
    import sys

    MAX_SUB_PROCESSES = 10

    def info(title):
        print(title, flush=True)

    if __name__ == '__main__':
        info('main line')

        # Create a list of subprocesses.
        processes = []
        for i in range(1, MAX_SUB_PROCESSES+1):
            pgm_path = f'a_{i}.py'  # Path to Python program.
            command = f'"{sys.executable}" "{pgm_path}" "{os.path.basename(pgm_path)}"'
            process = subprocess.Popen(command, bufsize=0)
            processes.append(process)

        # Wait for all of them to finish.
        for process in processes:
            process.wait()

        print('Done')
If you just need to call 10 external .py scripts (a_1.py ~ a_10.py) as separate processes, use the subprocess.Popen class:
    import subprocess, sys

    for i in range(1, 11):
        subprocess.Popen(['python3', f'a_{i}.py'])
    # sys.exit()  # optional
It's worth looking at the rich subprocess.Popen signature (you may find some useful params/options).
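
For illustration only (the specific keyword arguments below are common examples, not something required by the answer above): Popen accepts options such as cwd to set the working directory and stdout/stderr to capture or silence each script's output:

    import subprocess, sys

    procs = []
    for i in range(1, 11):
        p = subprocess.Popen(
            [sys.executable, f'a_{i}.py'],  # sys.executable = current interpreter
            cwd='.',                         # working directory for the script
            stdout=subprocess.DEVNULL,       # silence the script's stdout
            stderr=subprocess.PIPE,          # keep stderr around for inspection
        )
        procs.append(p)

    for p in procs:
        p.wait()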
You can use a multiprocessing pool to run them concurrently.
    import multiprocessing as mp

    def worker(module_name):
        """Executes a module externally with python"""
        __import__(module_name)
        return

    if __name__ == "__main__":
        max_processes = 5
        module_names = [f"a_{i}" for i in range(1, 11)]
        print(module_names)
        with mp.Pool(max_processes) as pool:
            pool.map(worker, module_names)
The max_processes variable is the maximum number of workers working at any given time; in other words, it's the number of processes spawned by your program. pool.map(worker, module_names) uses the available processes and calls worker on each item in your module_names list. We don't include the .py because we're running each module by importing it.
Note: this might not work if the code you want to run in your modules is contained inside if __name__ == "__main__" blocks. If that is the case, my recommendation would be to move all the code in the if __name__ == "__main__" blocks of the a_{} modules into a main function. Additionally, you would have to change worker to something like:
    def worker(module_name):
        module = __import__(module_name)  # Kind of like 'import module_name as module'
        module.main()
        return
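
For concreteness, each a_{i}.py would then look roughly like this (a sketch; the body of main() is a placeholder):

    # a_1.py (and likewise a_2.py ... a_10.py)
    def main():
        print("a_1 doing its work")  # placeholder for the real work

    if __name__ == "__main__":
        main()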
