Create a new context manager instance every X iterations - python

Say that I have a context manager, CM, that creates a new API session. How can I create a new context manager every 10 iterations of using it? The issue is that the API can only handle so many requests.
counter = 0
# <some loop: make new session when counter reaches 10>
with new_session() as session:
    # Do stuff
    counter += 1
My first thought was to do something like if counter % 10 == 0: make a new session, but that doesn't preserve the session for the whole 10 iterations.
I can’t figure out how to set up the loop.

Use a loop for 10 iterations inside the context manager.
Then you can put the whole thing in another loop to keep going for multiples of 10 iterations.
keep_going = True
while keep_going:
    with new_session() as session:
        for _ in range(10):
            # do stuff
To stop, set keep_going to False at some point.
If you need more fine-grained control over when to stop, you could put everything inside a function and use a return statement to exit (see How can I break out of multiple loops?).
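For example, here is a minimal sketch of that idea; handle() and the items iterable are placeholders for your own per-item work, not part of the question:
def process_all(items):
    it = iter(items)
    while True:
        with new_session() as session:
            for _ in range(10):
                try:
                    item = next(it)
                except StopIteration:
                    return  # exits both loops (and closes the session) at once
                handle(session, item)  # hypothetical per-item work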

Related

Shorten a repeating exit flag for functions

I would like to exit a function by pressing a button, using a return statement inside an if statement. Writing these lines again and again and again is not really what I like, so I'm basically looking for a function that tells the parent function to return.
Obviously I can't just put the return statement in a helper function and call that function wherever I want to check the variable, although that would be the nicest way I could imagine.
I'll demonstrate it with a loop, but please keep in mind that's not where I want to use it. The usage is for automated processes which should have many exit points.
import keyboard, time

RedFlag = False
def set_RedFlag():
    global RedFlag
    RedFlag = True
keyboard.add_hotkey("end", set_RedFlag)

PauseFlag = False
def set_PauseFlag():
    global PauseFlag
    print(PauseFlag)
    if PauseFlag == False:
        PauseFlag = True
    else:
        PauseFlag = False
keyboard.add_hotkey("space", set_PauseFlag)

def task():
    for i in range(30):
        print("test", i)
        time.sleep(1)
        # Flags
        if RedFlag == True:  # exit point
            return
        while PauseFlag == True:  # pause the script
            time.sleep(1/6)

task()
The really relevant part is the if statement after # Flags. Especially with the while statement I would have to repeat multiple lines. I would love to have this as one single word, as if it were a function.
Edit:
More about the "what for?":
I would like to run a macro, and later maybe other tasks; for example, writing the odd numbers from 1 to 511. And I want to be able to stop or pause it at any given time.
I forgot to mention that I tried using the multiprocessing module, since I could terminate the task there. Sadly it is not an option for me, not only because the starting time of a process is a bit too long, but also because I am learning a bit about kivy and I know it gets complicated when I want to start another process while using kivy.
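One way to get that single-word check (a sketch only, reusing the RedFlag/PauseFlag globals and hotkeys from the snippet above) is to move the flag handling into a helper that raises an exception:
class StopTask(Exception):
    """Raised by check_flags() to abort the surrounding task."""

def check_flags():
    # Pause while the space hotkey has toggled PauseFlag on
    while PauseFlag:
        time.sleep(1/6)
    # Abort the current task when the end hotkey has set RedFlag
    if RedFlag:
        raise StopTask()

def task():
    try:
        for i in range(30):
            print("test", i)
            time.sleep(1)
            check_flags()  # one call instead of the repeated if/while block
    except StopTask:
        return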

Python: how to run multiple functions together when one uses a while True loop

def function():
    while True:
        ...  # omission... (this function is repeated permanently)

i = 0
while i < 4:
    driver.execute_script("""window.open("URL")""")
    driver.switch_to.window(driver.window_handles[-1])
    time.sleep(1)
    function()
    time.sleep(1)
    i += 1  # open new tab and run function
It doesn't work because the while True loop repeats forever. Is there any way to run multiple functions together?
https://imgur.com/a/4SIVekS This picture shows what I want
According to your picture, what you want is to launch the function a set number of times (4?) and run those in parallel.
On a single core, as is the normal behavior, straight-up parallel processing is impossible. You need to access other cores and manage decentralized processing; while is useless there. I'm worried the level of difficulty is over your current skills, but here we go.
The overall flow that you (probably, depending on the actual memory safety of your functions) need is:
- create a thread pool with the set number of threads for the number of runs you want
- indicate the function you need to run
- start them, making sure the start itself is non-blocking
- ensure one function's processing doesn't impact another's results; race conditions are a common problem
- gather results, again in a non-blocking way
You can use several methods. I highly recommend you read up a lot on the following documentation.
Threading:
https://docs.python.org/3/library/threading.html
Multiprocessing:
https://docs.python.org/3/library/multiprocessing.html
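As a rough illustration of the threading approach, here is a sketch only; it assumes the function and driver objects from the question, and it does not address Selenium's own thread-safety caveats:
import threading
import time

# Launch function() in four background threads so the main loop is not
# blocked by its internal while True loop.
threads = []
for i in range(4):
    driver.execute_script("""window.open("URL")""")
    driver.switch_to.window(driver.window_handles[-1])
    time.sleep(1)
    t = threading.Thread(target=function, daemon=True)  # non-blocking start
    t.start()
    threads.append(t)
    time.sleep(1)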
I don't understand your question because I don't understand what your function is supposed to do.
    while True:
will always create an infinite loop. while is a command that tells Python to loop through the following block so long as the expression following it evaluates to True, and True always evaluates to True.
It seems like you want to use a conditional, like you do in while x < 4.
    x < 4
...is an expression that evaluates to true when x is less than 4, and false when x is not less than 4. Everything below the line
    while x < 4:
will then run if x is less than 4, and when it's done running that code, it will go back and evaluate whether x is less than 4 again, and if it is, run the code again. To include another while loop inside of that loop, that new loop also needs an expression to evaluate. If you want to evaluate the same expression, write it out:
while x < 4:
    # do something
    while x < 4:
        # do more things
    # do even more things
    # change x at some point, or else you're in an infinite loop.
However, there's no reason to do that specifically, because you're already doing it. All of the code is only running when x < 4, so checking that condition again right there is redundant, and doing it in another loop doesn't make sense. If the inside loop is also incrementing x, then the outside loop won't loop and doesn't need to increment x.
Also, if you want a function to check a condition based on a variable outside the function, you'll want to pass things to that function.
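For instance, a small sketch of passing the loop variable into a function (the names are illustrative only):
def report(x):
    # x arrives as an argument instead of being read from outside the function
    if x < 4:
        print("still below 4:", x)

x = 0
while x < 4:
    report(x)
    x += 1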

Python Execute a Function after timeout

I want to start a function after a timeout in a while True loop, but the code doesn't execute anything and jumps out of the loop, and I don't know why :/
Here is my code:
import requests
from threading import Timer

def timeout(flag):
    print("New Request")
    statuscode = requests.get("http://adslkfhdsjf.de").status_code
    if statuscode == 200 and flag == 0:
        print("Service available")
        # For testing purposes
        print("Flag: ", flag)
        flag = 0
        # Post result to backend
    elif statuscode == 200 and flag == 1:
        print("Service is available now")
        print("Flag: ", flag)
        flag = 0
        # Email to user
        # Post request
    elif statuscode != 200 and flag == 0:
        print("Service is not available")
        # For testing purposes
        print("Flag: ", flag)
        flag = 1
        # Email to user
        # Post request
    else:
        print("Service is not available")
        # For testing purposes
        print("Flag: ", flag)
        # Post request
    Timer(10, timeout, flag)

timeout(0)
I want timeout to be executed every 10 seconds, for example, so that every 10 seconds one branch of the timeout() function runs. But it's not working so far; the console output is nothing :/
Your first problem is just that you're not calling main(). And normally, I'd just add a comment to tell you that and close the question as a typo, but you don't want to fix that until you first fix your bigger problem.
Your code tries to create and call a new timeout function over and over, as fast as possible. And the first thing that timeout function does is to create a new Timer object. Which is a new thread.
So you're spawning new threads as fast as Python will let you, which means in a very short time you're going to have more threads than your OS can handle. If you're lucky, that will mean you get an exception and your program quits. If you're unlucky, that will mean your system slows to a crawl as the kernel starts swapping thread stacks out to disk, and, even after you manage to kill the program, it may still take minutes to recover.
And really, there's no reason for the while loop here. Each Timer schedules the next Timer, so it will keep running forever. And there's only ever 2 threads alive at a time that way.
But there's not even a reason for a Timer in the first place. You don't want to do anything while waiting 10 seconds between requests, so why not just sleep?
import time
import requests

def main():
    flag = 0
    while True:
        print("New Request")
        statuscode = requests.get("http://google.de").status_code
        if statuscode == 200 and flag == 0:
            print("Service available")
            # etc.
        time.sleep(10)

main()
Your code had another problem: you're defining a local variable named flag in timeout, but then you're trying to use it, in that flag == 0 check, before you ever assign to it. That would raise an UnboundLocalError. The fact that you happen to also have a local variable named flag in main doesn't make a difference. To fix this, you'd have to do one of these:
- Pass flag in as an argument for Timer to pass to each timeout call as a parameter. (Probably best.)
- Add a nonlocal flag declaration to timeout, so it becomes a closure cell shared by all of the timeout functions you define. (Not bad, but not the most idiomatic solution.)
- Add a global flag declaration to both functions, so it becomes a global variable shared by everyone in the universe. (Probably fine for a program this simple, but at the very least not a good habit to get into.)
But, once we've gotten rid of the thread, we've also gotten rid of the function, so there's just the one local flag, so the problem doesn't come up in the first place.
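If you do want to keep the Timer-based version, here is a minimal sketch of the first option, passing flag as an argument; note that Timer wants an args tuple and an explicit .start() call (the flag logic is reduced to a placeholder):
from threading import Timer
import requests

def timeout(flag):
    statuscode = requests.get("http://google.de").status_code
    # ... decide the new flag value exactly as in the question's if/elif chain ...
    new_flag = 0 if statuscode == 200 else 1
    # Schedule the next run in 10 seconds, handing the updated flag along
    Timer(10, timeout, args=(new_flag,)).start()

timeout(0)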

Python: Initiate a variable once in a program running every few minutes

I have a program running every minute. I want it so that the first time I execute it, it does one thing, and afterwards it does something else, like this:
def alarm_function(alarm):
    first_time = 0
    if first_time == 0:
        send_on_website(message)
        first_time += 1
        alarm = 0
    else:
        send_on_website(a_different_message)
    if alarm == 0:
        pass  # do nothing
    if alarm == 1:
        alarm += 1
        # do something
So basically, after I've executed it once, I want to erase the first line first_time = 0 because I don't want to initialize it again. Also, I want to make a counter on the alarm variable, which is initialized somewhere in the program. How can I do that?
You would have to define a new function if you don't want the first_time part any more. For the counter, you can put import time at the top of your program, and wherever you want the counter to be, use time.sleep(n), changing n to any number of seconds.
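If the script actually stays inside one long-running program, another option (just a sketch; send_on_website, message and a_different_message are placeholders from the question) is to keep the "first time" state at module level instead of inside the function:
first_time = True  # initialized once, when the module is loaded

def alarm_function(alarm):
    global first_time
    if first_time:
        send_on_website(message)  # runs only on the very first call
        first_time = False
    else:
        send_on_website(a_different_message)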

Python 2.7 Multiprocessing Pool for a list of Strings?

I'm new to Python (disclaimer: I'm new to programming and I've been reading python online for two weeks) and I've written a simple multi-processing script that should allow me to use four subprocesses at once. I was using a global variable (YES, I KNOW BETTER NOW) to keep track of how many processes were running at once. Start a new process, increment by one; end a process, decrement by one. This was messy but I was only focused on getting the multi-processes working, which it does.
So far I've been doing the equivalent of:
processes = 0

def function(value):
    global processes
    # do stuff to value
    processes -= 1

while read line:
    if processes < 4:
        processes += 1
        # create a new subprocess: function(line)
1: I need to keep track of processes in a better way than a global. I saw some use of a 'pool' in python to have 4 workers, but I failed hard at it. I like the idea of a pool but I don't know how to pass each line of a list to the next worker. Thoughts?
2: On general principles, why is my global var decrement not working? I know it's ugly, but I at least expected it to be ugly and successful.
3: I know I'm not locking the var before editing, I was going to add that once the decrementation was working properly.
Sorry if that's horrible pseudo-code, but I think you can see the gist. Here is the real code if you want to dive in:
MAX_THREADS = 4
CURRENT_THREADS = 0
MAX_LOAD = 8

# Iterate through all users in the userlist and call the funWork function on each user
def funReader(filename):
    # I defined the logger in detail above, I skipped about 200 lines of code to get it slimmed down
    logger.info("Starting 'move' function for file \"{0}\"...".format(filename))
    # Read in the entire user list file
    file = open(filename, 'r')
    lines = file.read()
    file.close()
    for line in lines:
        user = line.rstrip()
        funControl(user)

# Accept a username and query system load and current funWork thread count; decide when to start the next thread
def funControl(user):
    # Global variables that control whether a new thread starts
    global MAX_THREADS
    global CURRENT_THREADS
    global MAX_LOAD
    # Decide whether to start a new subprocess of funWork for the current user
    print
    logger.info("Trying to start a new thread for user {0}".format(user))
    sysLoad = os.getloadavg()[1]
    logger.info("The current threads before starting a new loop are: {0}.".format(CURRENT_THREADS))
    if CURRENT_THREADS < MAX_THREADS:
        if sysLoad < MAX_LOAD:
            CURRENT_THREADS += 1
            logger.info("Starting a new thread for user {0}.".format(user))
            p = Process(target=funWork, args=(user,))
            p.start()
        else:
            print "Max Load is {0}".format(MAX_LOAD)
            logger.info("System load is too high ({0}), process delayed for four minutes.".format(sysLoad))
            time.sleep(240)
            funControl(user)
    else:
        logger.info("There are already {0} threads running, user {1} delayed for ten minutes.".format(CURRENT_THREADS, user))
        time.sleep(600)
        funControl(user)

# Actually do the work for one user
def funWork(user):
    global CURRENT_THREADS
    for x in range(0, 10):
        logger.info("Processing user {0}.".format(user))
        time.sleep(1)
    CURRENT_THREADS -= 1
Lastly: any errors you see are likely to be transcription mistakes because the code executes without bugs on a server at work. However, any horrible coding practices you see are completely mine.
Thanks in advance!
how about this: (not tested)
import multiprocessing
import time

MAX_PROCS = 4

# Actually do the work for one user
def funWork(user):
    for x in range(0, 10):
        logger.info("Processing user {0}.".format(user))
        time.sleep(1)
    return

# Iterate through all users in the userlist and call the funWork function on each user
def funReader(filename):
    # I defined the logger in detail above, I skipped about 200 lines of code to get it slimmed down
    logger.info("Starting 'move' function for file \"{0}\"...".format(filename))
    # Read in the entire user list file
    file = open(filename, 'r')
    lines = file.readlines()  # readlines() so the loop below sees lines, not single characters
    file.close()
    work = []
    for line in lines:
        user = line.rstrip()
        work.append(user)
    pool = multiprocessing.Pool(processes=MAX_PROCS)  # threads are different from processes...
    return pool.map(funWork, work)
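A hedged usage note: with multiprocessing on Python 2.7 the call should sit behind the usual main guard, so invoking it would look roughly like this (the filename is illustrative):
if __name__ == '__main__':
    # pool.map returns a list with one entry per user, in input order
    results = funReader("userlist.txt")
    print(results)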
