I'm trying to use a new thread or multiprocessing to run a function.
The function is called like this:
Actualize_files_and_folders(self)
I've read a lot about multiprocessing and threads, and looked through the questions on StackOverflow, but I can't make it work... It would be great to have some help >.<
I'm calling the function from a button:
def on_button_act_clicked(self, menuitem, data=None):
    self.window_waiting.show()
    Actualize_files_and_folders(self)
    self.window_waiting.hide()
In window_waiting I have a button called 'cancel'; it would be great to have a command/function that kills the thread.
I've tried a lot of things, for example:
self.window_waiting.show()
from multiprocessing import Process
a = Process(target=Actualize_files_and_folders(self))
a.start()
a.join()
self.window_waiting.hide()
But the window still freezes, and window_waiting is only displayed after Actualize_files_and_folders(self) finishes, as if I had called a normal function.
Thanks so much for any help!!
It looks like the worker function is being called rather than used as a callback for the process target:
process = Process(target=actualize_files_and_folders(self))
This is essentially equivalent to:
tmp = actualize_files_and_folders(self)
process = Process(target=tmp)
So the worker function is called in the main thread, blocking it. The result of that call is then passed to Process as the target, which does nothing if the result is None. You need to pass the function itself, not the result of calling it:
process = Process(target=actualize_files_and_folders, args=[self])
process.start()
See: https://docs.python.org/2/library/multiprocessing.html
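If you also want the GUI to stay responsive and the cancel button to work, here is a minimal sketch of one way to do it, assuming a PyGObject/GTK app; the names check_process and on_button_cancel_clicked are hypothetical, not from the original code:

from multiprocessing import Process
from gi.repository import GLib

def on_button_act_clicked(self, menuitem, data=None):
    self.window_waiting.show()
    # Note: self must be picklable for multiprocessing to pass it to the child.
    self.process = Process(target=Actualize_files_and_folders, args=[self])
    self.process.start()
    GLib.timeout_add(500, self.check_process)  # poll every 500 ms

def check_process(self):
    if self.process.is_alive():
        return True   # keep polling
    self.window_waiting.hide()  # worker finished
    return False      # stop polling

def on_button_cancel_clicked(self, button):
    self.process.terminate()  # a Process, unlike a thread, can be killed
    self.window_waiting.hide()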
So I have a Python application with which I'm attempting to emulate a queue line-up system. It imports a library and makes calls to it, specifically psycopg2. Example code is below:
import threading, queue, psycopg2

q = queue.Queue()

def workerChecker():
    while True:  # Keeps the thread always checking for new things in queue
        item = q.get()
        addItemToDb(item)  # <------ This part
        q.task_done()

threading.Thread(target=workerChecker, daemon=True).start()

def addItemToDb(item):
    # Do something and use psycopg2 to insert item to db here
    pass
I can't seem to find a clear answer online about which thread the code in addItemToDb will run in.
More specifically, will all the code in the function addItemToDb be restricted to execution within the same thread as workerChecker, given that it uses an imported library?
Any assistance or help will be really appreciated...
The code inside addItemToDb will be restricted to the thread that runs workerChecker as long as it is called only by that function; the function itself, however, can be called from anywhere, in the main thread or any other thread that you create.
If you want to make sure that only workerChecker has access to that function, you can define addItemToDb inside workerChecker:
def workerChecker():
    def addItemToDb(item):
        # addItemToDb definition
        pass

    while True:
        item = q.get()
        addItemToDb(item)
        q.task_done()
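To verify this yourself, a quick sketch (assuming the setup above) is to print the current thread from inside addItemToDb:

import threading

def addItemToDb(item):
    # Prints the name of whichever thread executes this call; when called
    # from workerChecker it will be the worker thread's name, not MainThread.
    print(threading.current_thread().name)
    # ... psycopg2 insert would go here ...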
I know there is something called a thread, but I am confused by all the complex information on Google. myFunc() takes a little time (it's not computationally expensive; say it plays a short mp3 file).
What I want to do is call myFunc() without having to wait for it to return before running the following lines of code. Furthermore, I don't need to keep anything related to myFunc(arg); I only need it to be executed.
while True:
    ......
    myFunc(arg)
    ###some
    ###lines
    ###of
    ###code
Sorry for my bad English. Cheers!
from threading import Thread

def myFunc(arg):
    # run code here
    pass

while True:
    thread = Thread(target=myFunc, args=(arg,))
    thread.start()  # starts the thread, executes the function
    ###some
    ###lines
    ###of
    ###code
    thread.join()  # wait for myFunc to finish
You can do the same with processes instead of threads.
You might want to take a look at pools if you want to run the same function over a list of arguments: you can call imap, iterate over the results, and run the rest of the code as each one arrives, as sketched below.
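A minimal sketch of the pool idea, assuming the myFunc above; the argument list is a made-up example:

from multiprocessing.pool import ThreadPool

args_list = ['a.mp3', 'b.mp3', 'c.mp3']  # hypothetical arguments

with ThreadPool(4) as pool:
    for result in pool.imap(myFunc, args_list):
        # the rest of the code runs here as each call finishes
        pass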
I am implementing a Flask application, and I call a function A based on a request. Inside function A, another function B is called. But I don't need to wait until function B finishes executing to return the output from function A. I have done it with the following code:
from threading import Thread

def functionA():
    result = doSomething1()
    Thread(target=functionB).start()
    return result

def functionB():
    # Do something after the execution of doSomething1()
    pass
Here, I start a new thread to do what I need to do, but I do not terminate the newly started thread. Do I need to terminate that thread myself? If so, what is the best way to do that?
A thread will terminate on its own when its target function returns. To see whether it is still alive, you can use thread.is_alive().
There is no way to force the termination of a thread; thread.join() is the closest option, as it blocks the calling thread until the thread in question has finished.
Also refer to the docs for more info on the threading module:
https://docs.python.org/3/library/threading.html
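For example, a small sketch (the variable name worker is an assumption, not from the original code):

from threading import Thread

worker = Thread(target=functionB)
worker.start()

print(worker.is_alive())  # True while functionB is still running
worker.join()             # blocks until functionB returns
print(worker.is_alive())  # False: the thread has terminated on its own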
I am trying to execute a time-consuming back-end job that is triggered by a front-end call. This back-end job should execute a callback method when it completes, which will release a semaphore. The front end shouldn't have to wait for the long process to finish in order to get a response from the call that kicks off the job.
I'm trying to use the Pool class from the multiprocessing library to solve this, but I'm running into some issues. Namely, it seems like the only way to actually execute the function passed into apply_async is to call the .get() method on the ApplyResult object returned by the apply_async call.
To work around this, I thought to create a Process object with the target being apply_result.get, but this doesn't seem to work either.
Is there a basic understanding that I'm missing here? What would you folks suggest to solve this issue?
Here is a snippet example of what I have right now:
p = Pool(1)
result = p.apply_async(long_process, args=(config, requester), callback=complete_long_process)
Process(target=result.get).start()
response = {'status': 'success', 'message': 'Job started for {0}'.format(requester)}
return jsonify(response)
Thanks for the help in advance!
I don't quite understand why you would need a Process object here. Look at this snippet:
#!/usr/bin/python
from multiprocessing import Pool
from time import sleep

def complete_long_process(foo):
    print "completed", foo

def long_process(a, b):
    print a, b
    sleep(10)

p = Pool(1)
result = p.apply_async(long_process, args=(1, 42),
                       callback=complete_long_process)
print "submitted"
sleep(20)
If I understand what you are trying to achieve, this does exactly that. As soon as you call apply_async, it launches the long_process function and execution of the main program continues. As soon as long_process completes, complete_long_process is called. There is no need to use the get method to execute long_process, and the code does not block or wait for anything.
If your long_process does not appear to run, I assume your problem is somewhere within long_process.
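For reference, a Python 3 version of the same idea (with the __main__ guard that the spawn start method requires):

from multiprocessing import Pool
from time import sleep

def complete_long_process(result):
    print("completed", result)

def long_process(a, b):
    print(a, b)
    sleep(10)

if __name__ == '__main__':
    with Pool(1) as p:
        p.apply_async(long_process, args=(1, 42), callback=complete_long_process)
        print("submitted")
        sleep(20)  # keep the pool alive long enough for the callback to fire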
Hannu
I have a program with a GUI that needs to do some multiprocessing. The point is to avoid freezing the GUI, and to allow the user to use some other buttons while the process is running.
I'd like then to define a method like the following:
def start_waiting(self, parent, func, args):
    self.window_waiting.show()
    task = Process(func(*args))
    task.start()
    task.join()
    # Some code to execute at the end of the process
    #
    # ...
The problem is that join() is not working, and I need it because the code executed after join() indicates that the process has ended; I'll use that code to update a cancel button on window_waiting.
I then came up with another solution to avoid using join(): I replaced it with:
while task.is_alive():
    time.sleep(0.5)
But it didn't work either, so I tried plan C, which was to create a queue:
def worker(input, output):
    for func, args in iter(input.get, 'STOP'):
        result = func(*args)
        output.put(result)

task_queue = Queue()
done_queue = Queue()
task_queue.put(task)
Process(target=worker, args=(task_queue, done_queue)).start()
done_queue.get()
The last code gave me the error:
TypeError: Pickling an AuthenticationString object is disallowed for security reasons
which led me to multiprocessing.Process subclass using shared queue, but I haven't managed to solve the problem :/
Your first example should look like this:
def start_waiting(self, parent, func, args):
    # ...not relevant code...
    self.window_waiting.show()
    task = Process(target=func, args=args)  # This line is different.
    task.start()
    task.join()
The way you had it, it wasn't actually executing func in a child process; it was executing it in the parent, and then passing its return value to Process. When you called task.start(), it would probably fail instantly, because you were passing it whatever func returned, rather than a function object.
Note that because you're calling task.join() inside start_waiting, it's probably going to block your GUI, because start_waiting won't return until func completes, even though it's running in a child process. The only way it wouldn't block is if you're running start_waiting in a separate thread from the GUI's event loop. You probably want something more like this:
def start_waiting(self, parent, func, args):
    # ...not relevant code...
    self.window_waiting.show()  # You should interact with the GUI in the main thread only.
    self.task = Process(target=func, args=args)  # This line is different.
    self.thrd = threading.Thread(target=self.start_and_wait_for_task)
    self.thrd.start()

def start_and_wait_for_task(self):
    """This runs in its own thread, so it won't block the GUI."""
    self.task.start()
    self.task.join()
    # If you need to update the GUI at all here, use GLib.idle_add (see https://wiki.gnome.org/Projects/PyGObject/Threading)
    # This is required to safely update the GUI from outside the main thread.
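A short sketch of that last step, assuming a hypothetical on_task_done method that performs the GUI update:

from gi.repository import GLib

def start_and_wait_for_task(self):
    self.task.start()
    self.task.join()
    GLib.idle_add(self.on_task_done)  # schedule the GUI update on the main thread

def on_task_done(self):
    self.window_waiting.hide()  # safe here: this runs inside the GTK main loop
    return False  # run once; returning True would re-schedule the callback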