Running two scripts in Python - python

Dear people of the internet, I need some help...
I have a Python script, main.py, in which a user can enter two variables: ReminderTime, an int, and ReminderText, a string. I want to send the two variables to another Python script, reminder.py.
The reminder script will do some work with the two objects and then use ToastNotifier to show a Windows notification before shutting down.
The main script should keep running in the foreground during that time. Is this possible?

You can use multiprocessing
Simple example:
from multiprocessing import Process

p = Process(target=f, args=(ReminderTime, ReminderText))
p.start()
f is the function in your reminder.py that does the work (like main in main.py by default).
args is the tuple of arguments you need to send to this function.
Also, you need to import your function: from reminder import function_name.
This code will start your f function in the background, so the main code will keep running.
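Putting that together, a minimal sketch of main.py (assuming reminder.py exposes a function remind(reminder_time, reminder_text) that does the waiting and fires the ToastNotifier notification) could look like this:

# main.py -- minimal sketch; remind() in reminder.py is an assumed name
from multiprocessing import Process
from reminder import remind

if __name__ == '__main__':  # required on Windows, where child processes are spawned
    ReminderTime = int(input('Minutes until reminder: '))
    ReminderText = input('Reminder text: ')

    p = Process(target=remind, args=(ReminderTime, ReminderText))
    p.start()  # reminder.py runs in its own process

    # main.py keeps running in the foreground here
    print('Reminder scheduled, main script continues...')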

Related

(Multithreading-Python) How can I create a script that runs two scripts which I usually run from two different terminals?

I have two scripts, a.py and b.py; they send data to each other via localhost (MQTT), and they both depend on a configuration file conf.json. I usually execute them in two different terminals,
a.py in one terminal,
b.py in another,
and everything is OK. I am trying right now to create another script c.py which should do the following:
for parameter in parameters:
    update config.json
    execute a.py and b.py "in two different terminals"
    close a.py and b.py, then start again with the new parameters
Now, I am very much a noob about this, so I tried to use Thread from threading:
from threading import Thread

for parameter in parameters:
    # update config.json
    class exp(Thread):
        def __init__(self, name):
            Thread.__init__(self)
            self.name = name
        def run(self):
            if self.name == 0:
                a.runs()
            else:
                b.runs()
    thread1 = exp(0)
    thread1.start()
    thread2 = exp(1)
    thread2.start()
Both a.py and b.py end with:
def runs():
    # whatever runs does
    ...

if __name__ == '__main__':
    runs()
It runs without errors, but it does not work. I am quite sure there should be a nice and standard solution to this problem. Any ideas? Thanks!
So I eventually found this (dirty) solution... any advice for improvements?
import subprocess

p = subprocess.Popen(['python', 'random.py'])  # start a dummy process just so the first terminate call in the for loop makes sense
for parameter in parameters:
    p.terminate()
    # random code
    p = subprocess.Popen(['python', 'a.py'])
    p = subprocess.call(['python', 'b.py'])
    # here I would like to do p.terminate(), but it does not work,
    # so I put the terminate call at the start of the for loop
I do not totally understand what I wrote but it works fine. Thanks everybody for previous tips, and I hope for further explanations.
You probably want multiprocessing, not the threading library (look up multiprocessing.Process). Another roughly equivalent option is to use subprocess.run to launch the two scripts via the shell.
Regarding threads: keep in mind they are limited by the Global Interpreter Lock in CPython, which is the prevalent Python implementation and the one you are probably using.
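As a rough sketch of the subprocess route (script names taken from the question; the sleep/terminate handling is an assumption about how long each run should last and how a.py and b.py should be stopped):

# c.py -- sketch only; assumes a.py and b.py run until they are killed
import subprocess
import sys
import time

parameters = ['setting1', 'setting2']  # placeholder for whatever c.py iterates over

for parameter in parameters:
    # update conf.json here before launching the workers

    a = subprocess.Popen([sys.executable, 'a.py'])
    b = subprocess.Popen([sys.executable, 'b.py'])

    time.sleep(60)  # let this parameter's run go on for as long as it needs

    # stop both workers before moving on to the next parameter
    a.terminate()
    b.terminate()
    a.wait()
    b.wait()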
You can use Qt threading. Qt has a very powerful library exactly for this purpose.

Have a thread constantly run in the background in Python

I have a function to check for sniffing tools that I want to run constantly in the background of my Python script:
def check():
    unwanted_programmes = []  # to be added to
    for p in psutil.process_iter(attrs=['pid', 'name']):
        for item in unwanted_programmes:
            if item in str(p.info['name']).lower():
                webhook_hackers(str(p.info['name']))  # send the programme to a webhook, which is in another function
                sys.exit()
    time.sleep(1)
I want this to run right from the start, and then the rest of the script.
I have written code like so:
if __name__ == "__main__":
    check = threading.Thread(target=get_hackers())
    check.start()
    check.join()
    threading.Thread(target=startup).start()
    # the startup function just does some prints and inputs before running other functions
However, this code only runs check once and then startup, but I want check to keep running in the background while startup runs just once. How would I do this?
Your check function does what you want it to, but it only does it once, and that's the behavior you're seeing: the thread finishes running the function and then cleanly exits. If you place everything inside the function in a while True: block at the top of the function, the function will loop forever and the thread will never exit, which sounds like what you want.
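A sketch of that change, keeping the names from the question (webhook_hackers is assumed to be defined elsewhere in the real script, and startup is stubbed out here just so the example runs):

import sys
import time
import threading

import psutil  # third-party: pip install psutil

def check():
    unwanted_programmes = []  # to be added to
    while True:  # loop forever instead of returning after a single pass
        for p in psutil.process_iter(attrs=['pid', 'name']):
            for item in unwanted_programmes:
                if item in str(p.info['name']).lower():
                    webhook_hackers(str(p.info['name']))  # assumed to be defined elsewhere
                    sys.exit()
        time.sleep(1)

def startup():
    # stand-in for the real startup(): prints and inputs before running other functions
    print('starting up...')

if __name__ == '__main__':
    threading.Thread(target=check, daemon=True).start()  # pass the function itself, no parentheses
    startup()  # runs once in the main thread while check keeps polling in the background

With daemon=True the background thread is killed when the main thread finishes; drop it if check alone should keep the script alive.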

Python: run several pieces of code at the same time

Is there any way I can run Python code in this way:
the main code runs all the time,
and once every 5 minutes another function runs alongside the main code.
My code reads a GPS signal and sends it to my server every 5 seconds.
I have to run another piece of code that checks the device CPU/files/temperature every 5 minutes (this part takes around 30 seconds).
Can both of them run at the same time while still getting GPS?
I have the GPS code ready and also the check code ready; how do I combine them (if it's possible)?
Thanks.
This might answer your question: Running Two Script at Once Using Bash
Based on the answer here, all you'd have to run is:
python script1.py &
python script2.py &
You can use the threading module for this task: it allows you to run functions in separate threads alongside the main program. Threading Documentation
I think you should get what you want with the following code:
import threading

def gpsJob():
    threading.Timer(300.0, gpsJob).start()
    # Here goes your GPS code

gpsJob()

if __name__ == '__main__':
    pass  # main code
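Alternatively, a small sketch that keeps the GPS loop in the main thread and runs the check in a background thread every five minutes (check_device and send_gps are placeholders for the code you already have):

import threading
import time

def check_device():
    # placeholder for your CPU/file/temperature check (~30 s)
    print('running device check')

def send_gps():
    # placeholder for your existing GPS read-and-send code
    print('sending GPS data')

def check_loop():
    while True:
        check_device()
        time.sleep(300)  # wait 5 minutes before the next check

if __name__ == '__main__':
    threading.Thread(target=check_loop, daemon=True).start()  # 5-minute check in the background
    while True:
        send_gps()
        time.sleep(5)  # GPS every 5 seconds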

multiprocessing Pool example does not work and freezes the kernel

I'm trying to parallelize a script, but for an unknown reason the kernel just freezes without any error thrown.
minimal working example:
from multiprocessing import Pool

def f(x):
    return x*x

p = Pool(6)
print(p.map(f, range(10)))
Interestingly, everything works fine if I define my function in another file and then import it. How can I make it work without needing another file?
I work with Spyder (Anaconda), and I get the same result if I run my code from the Windows command line.
This happens because you didn't protect the "procedural" part of your code from being re-executed when your child processes import f.
They need to import f because Windows doesn't support forking as the start method for new processes (only spawning). A new Python process has to be started from scratch and f imported, and that import will also trigger another Pool to be created in every child process ... and their child processes, and their child processes...
To prevent this recursion, you have to insert an if __name__ == '__main__': line between the upper part, which should run on import, and the lower part, which should only run when your script is executed as the main script (which is only the case for the parent).
from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':  # protect your program's entry point
    p = Pool(6)
    print(p.map(f, range(10)))
Separating your code like that is mandatory for multiprocessing on Windows, and on Unix-like systems whenever the 'spawn' or 'forkserver' start method is used instead of the default 'fork'. In general, the start method can be changed with multiprocessing.set_start_method(method).
Since Python 3.8, macOS also uses 'spawn' instead of 'fork' as the default.
It's generally good practice to separate any script into an upper "definitions" part and a lower "execute as main" part, so the code can be imported without unnecessarily executing parts that are only relevant when it runs as the top-level script. Last but not least, it is easier to follow the control flow of your program when definitions and execution are not intermixed.
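For illustration, forcing 'spawn' explicitly looks like this (the __main__ guard stays necessary either way):

import multiprocessing
from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':
    multiprocessing.set_start_method('spawn')  # make the start method explicit and portable
    with Pool(6) as p:
        print(p.map(f, range(10)))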

How to write a script to run multiple processes on a multi-core machine effectively

Hi everyone. I have a Python file (for example named run.py). This program takes some parameters (python run.py param1 param2 ...), and each tuple of parameters is a setting. Now, I have to run many settings simultaneously to finish them all as soon as possible. I wrote a file run.sh as follows:
python run.py setting1 &
python run.py setting2 &
# more settings
...
wait
This file will execute all processes simultaneously, right? And I am running on a 64-core machine. I have some questions here:
Will each process run on its own core or not?
If not, how can I make it do so?
If I can run one process per core, will the running time of setting1 equal the running time when I just run it as an individual process: python run.py setting1?
Did you try to use the multiprocessing module?
Assuming you want to execute some function work(arg1, arg2) multiple times in parallel, you'd end up with something like this:
import multiprocessing

p = multiprocessing.Pool(multiprocessing.cpu_count())
results = p.starmap(work, [(arg11, arg12), (arg21, arg22), ...])
# do something with the list of results
If your functions all look very different from each other, then you can get away with writing a function wrapper, like so:
def wrapper(dict_args, inner_function):
    return inner_function(dict_args)

# then launch the multiprocessing mapping
p.starmap(wrapper, [({'arg1': v1, 'arg2': v2}, job1), ({'foo': bar}, job2), ...])
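A self-contained version of the same idea, filled in with made-up jobs so it actually runs (here the wrapper unpacks the dict as keyword arguments, which is one way to read the snippet above):

import multiprocessing

def job1(arg1, arg2):
    return arg1 + arg2

def job2(foo):
    return foo * 2

def wrapper(kwargs, inner_function):
    # unpack the dict of keyword arguments for whichever job was passed in
    return inner_function(**kwargs)

if __name__ == '__main__':
    with multiprocessing.Pool(multiprocessing.cpu_count()) as p:
        results = p.starmap(wrapper, [({'arg1': 1, 'arg2': 2}, job1),
                                      ({'foo': 21}, job2)])
    print(results)  # [3, 42]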
