I have a python script that does some network stuff with the requests library. My issue is that for some reason the python script doesn't seem to exit immediately after it finishes.
I have measured the execution time of the function I use, and the total execution time of the script, with time.perf_counter().
--ipTest completed in 134.34 sec
--Program completed in 134.34 sec
But the script only stops (writes the output to the file and exits) after 260 seconds. This doesn't happen every time, but it happens regularly.
Is there a way to check what the script is doing the rest of the time?
Thanks!
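A delayed exit often means non-daemon threads (for example, connection-pool threads left behind by a requests session) are still alive when the main script finishes. A minimal sketch for checking this, using an atexit hook to list whatever threads are still running (the helper names are mine, not from the script above):

```python
import atexit
import threading

def lingering_threads():
    # Every thread other than the main one; non-daemon entries here
    # are what keep the interpreter alive after your code finishes.
    return [t for t in threading.enumerate() if t is not threading.main_thread()]

def report():
    alive = lingering_threads()
    if alive:
        print("still alive at exit:", [t.name for t in alive])

# Run the report when the main script finishes.
atexit.register(report)
```

If the report names threads from the networking library, closing the relevant Session objects before exiting is usually the fix.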
I am trying to write a pytest function that enforces a running-time limit on Python subprocesses.
You cannot find out anything about the running time of a subprocess unless you actually run that subprocess, so I do not think this is a case for mocking.
Your test needs to
run the subprocess,
wait for the allowed time, and
fail if the subprocess has not finished by then.
You can speed up the test by making sure it is notified (and then terminates successfully) as soon as the subprocess finishes.
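The steps above can be sketched with subprocess.Popen.wait(timeout=...), which returns the moment the child exits, so the test only sleeps the full limit when the process actually overruns it (the command and limit below are placeholders, not from the question):

```python
import subprocess
import sys

import pytest

TIME_LIMIT = 5  # seconds; placeholder limit

def test_subprocess_finishes_in_time():
    # Placeholder command; substitute the real subprocess under test.
    proc = subprocess.Popen([sys.executable, "-c", "print('done')"])
    try:
        # wait() returns as soon as the process exits, so a fast
        # subprocess makes the test finish early.
        proc.wait(timeout=TIME_LIMIT)
    except subprocess.TimeoutExpired:
        proc.kill()
        pytest.fail(f"subprocess still running after {TIME_LIMIT} s")
```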
Similarly to this question, I don't understand how the django-background-tasks module works. What I want is a button that starts a task in the background so it doesn't block the front end. The task has to run as soon as the click occurs.
Reading the docs, I thought I only had to define my task as a function preceded by the decorator @background(schedule=1); calling the function would then start the task one second after the click handler was run.
Calling notify_user as normal will schedule the original function to be run 60 seconds from now:
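The documented pattern looks roughly like this (a sketch following the django-background-tasks README; notify_user is the docs' example name, and it only runs inside a Django project with a process_tasks worker active):

```python
from background_task import background

# schedule=60: run 60 seconds after the call, per the docs quote above.
@background(schedule=60)
def notify_user(user_id):
    # This body runs later, in the process started by
    # `python manage.py process_tasks`, not in the request handler.
    ...

# In the click handler: this only *creates* a task row in the database;
# a separate process_tasks worker must be running to execute it.
# notify_user(some_user.id)
```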
However this is not what happens. What happens is that a new task is created after each click (I could understand that by requesting the background_task database thanks to this answer) but none ever runs.
The command python manage.py process_tasks seems to run one task defined previously before stopping. If no task has been defined, then it waits for a task to be defined for up to duration seconds (specified when running the command). This behavior is not what I expected while reading the documentation.
duration - Run task for this many seconds (0 or less to run forever) - default is 0
I don't think creating a crontab that calls python manage.py process_tasks every second is the right approach. Does this mean I have to call the command manually whenever I want tasks to run? Isn't the command supposed to run continuously so that it handles every task at its scheduled time?
The fact that the command only runs one task per invocation troubles me: if I scheduled 1000 small background tasks within a minute, would the command ever catch up?
I wrote a Python 3 script to measure the running time of a process, since it keeps dying and I'm interested in how long it will keep actively running. But it's running on my laptop, and I realized the statistics will be skewed by periods when I've put it to sleep.
It's on Linux, so I'm sure there's a log file I could parse (it used to be pm-suspend.log before systemd), but I'm wondering if there's a more general/direct way.
Can a process ask to be notified of suspend events?
I should note that I'm interested in the wallclock time the process was running, not the actual CPU execution time. So time.process_time() won't work.
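On Linux the kernel exposes exactly this distinction through its clocks: CLOCK_MONOTONIC stops ticking while the machine is suspended, whereas CLOCK_BOOTTIME keeps counting. Comparing the two deltas therefore yields the suspended time directly, without parsing any logs. A sketch, assuming Linux and Python 3.7+ (CLOCK_BOOTTIME is Linux-only):

```python
import time

start_mono = time.monotonic()                         # stops during suspend
start_boot = time.clock_gettime(time.CLOCK_BOOTTIME)  # keeps counting

# ... long-lived work being measured goes here ...

awake = time.monotonic() - start_mono
total = time.clock_gettime(time.CLOCK_BOOTTIME) - start_boot
print(f"wallclock {total:.1f}s = awake {awake:.1f}s + suspended {total - awake:.1f}s")
```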
This includes the suspended periods, but as far as I know it is the only simple option for finding the process's running time. You can also use the same approach to time individual sections of code.
from time import time

t0 = time()
# your code here
print(time() - t0)
I have a python script which basically calls a few functions, but takes quite a while to complete. I want to be able to run this script many times on a loop.
I need the script to wait until its previous iteration is done.
so for example;
for i in range(times_to_run):
    do_things(thing_variable)
How can I make it so that the script waits until do_things() is done, before calling it again, and iterating through the loop. Thanks!
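A plain call already behaves that way: Python does not start the next loop iteration until do_things() returns. The same holds if each iteration launches an external script, as long as you use a blocking call such as subprocess.run(). A sketch, with a placeholder command standing in for the real work:

```python
import subprocess
import sys

def do_things(i):
    # Placeholder for the real work; run() blocks until the child exits.
    subprocess.run([sys.executable, "-c", f"print('iteration {i} done')"],
                   check=True)

for i in range(3):
    do_things(i)  # the next iteration starts only after this one finishes
```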
How is it possible to get a compiled .exe program written in Python to kill itself after a period of time after it is launched?
If I have some code and I compile it into an .exe, then launch it and it stays in a 'running' or 'waiting' state, how can I get it to terminate after a few mins regardless of what the program is doing?
The reason is that the launched exe invokes a URL using PAMIE and automates some clicks. What I have noticed is that if the browser is closed, the process remains in memory and does not clean itself up. I wanted to find a way to auto-clean the process after, say, 5 minutes, which is more than enough time. I've tried using psutil to detect the process, but that does not work in my case. Any suggestions are greatly appreciated.
import os
import threading
import time

def one(timeout):
    time.sleep(timeout)
    # sys.exit() inside a worker thread only ends that thread;
    # os._exit() terminates the whole process.
    os._exit(0)

def two():
    # 10-minute countdown in a daemon thread, running alongside your code.
    threading.Thread(target=one, args=(600,), daemon=True).start()
    # Your long-running code here; when the countdown ends,
    # the process exits regardless of what it is doing.

two()
Create a thread when your process starts.
Make that thread sleep for the required duration.
When that sleep is over, kill the process.
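These three steps map directly onto threading.Timer, which packages the sleep-then-call pattern in one object. A sketch (os._exit is used because sys.exit from a worker thread only ends that thread; the 5-minute value follows the question):

```python
import os
import threading

TIMEOUT = 300  # 5 minutes

# The timer thread sleeps for TIMEOUT seconds, then runs the callback,
# which terminates the whole process regardless of what it is doing.
watchdog = threading.Timer(TIMEOUT, os._exit, args=(1,))
watchdog.daemon = True   # the watchdog must not keep the process alive itself
watchdog.start()

# ... main work here ...
watchdog.cancel()  # cancel if the work finishes before the deadline
```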