def return_total(self):
    dE_total = 0
    for num in range(len(self.layer)):
        dE_total += self.layer[num].backprop(delta[num])
    return dE_total
I have the above method inside a class. I need to call the backprop() method using multithreading. Usually the length of self.layer is small. I was planning to try ThreadPoolExecutor's map() method. As far as I know, it takes a function and an iterable of argument values, but here each thread should call a different bound method with its own input parameter. Is there any way to go about doing this?
with ThreadPoolExecutor() as executor:
    dE_total += executor.map(self.layer.backprop, delta)
I am aware the above code does not make any sense. I'm looking for something similar to the above idea.
Thanks for any help in advance
If I'm interpreting this correctly, you could write a method which takes the function as argument. This can then be passed to executor.map, e.g.:
def func_caller(func_params):
    func, params = func_params
    return func(*params)

dE_total += sum(executor.map(func_caller, funcs_params))
or similar, with funcs_params some appropriate list of tuples of functions and parameters. The argument unpacking might need to be adjusted.
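A self-contained sketch of that idea, using a placeholder Layer class and made-up delta values since the original class isn't shown:

```python
from concurrent.futures import ThreadPoolExecutor

class Layer:
    # hypothetical stand-in for the real layer class
    def backprop(self, delta):
        return delta * 2  # placeholder computation

layers = [Layer(), Layer(), Layer()]
deltas = [1, 2, 3]

def call_backprop(layer_delta):
    # unpack the (layer, delta) pair and call the bound method
    layer, delta = layer_delta
    return layer.backprop(delta)

with ThreadPoolExecutor() as executor:
    dE_total = sum(executor.map(call_backprop, zip(layers, deltas)))
# dE_total is 12 with these placeholder values
```

Note that for a small self.layer and a CPU-bound backprop, threads may not actually speed anything up under the GIL; the pattern is the same either way.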
Related
Why is it a practice to define a new function inside a decorator as the wrapper function below:
def not_during_the_night(func):
    def wrapper():
        if 7 <= datetime.now().hour < 22:
            return func
        else:
            pass  # Hush, the neighbours are asleep
    return wrapper
Instead of just doing something like?:
def not_during_the_night(func):
    if 7 <= datetime.now().hour < 22:
        return func
    else:
        pass  # Hush, the neighbours are asleep
Isn't the final result the same? Also, if we need to add some functionality we can do that as well so I really don't get why decorators are written like this but there must be a good reason for it. :)
EDIT: I accidentally left brackets inside the first case, the question was supposed to be like this
The idea of a decorator is to return a function you can call as needed, with enhanced functionality according to the decorator's purpose.
What you suggest will fail in use. First of all, if I instantiate this during the night, your proposal will return None, and my calling program will crash, even if I call the function at noon.
More generally, your proposal freezes the functionality based on time of instantiation, rather than when my use case calls the function.
It won't be the same result: in the first example, with a wrapper, the time condition is checked each time you call the decorated function. In the second example, the time condition is checked just once, at the moment you apply the decorator to the function.
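The timing difference is easy to see if the time check is replaced with something that records when it runs. A minimal sketch:

```python
checks = []

def checked_at_call_time(func):
    def wrapper():
        checks.append("call")  # runs on every call of the decorated function
        return func()
    return wrapper

def checked_at_decoration_time(func):
    checks.append("decoration")  # runs once, when the decorator is applied
    return func

@checked_at_call_time
def f():
    return 1

@checked_at_decoration_time
def g():
    return 2

f(); f()
g(); g()
# checks is now ["decoration", "call", "call"]:
# g's check fired once at definition time, f's fires on every call
```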
I have function like this one:
def get_list_of_movies(table):
    # some code here
    print(a_list)
    return a_list
The reason I want to use both print and return is that I'm using this function in many places. After calling it from the menu, I want the list of content printed.
I also use this same function inside another function, just to get the list. The problem is that when I call it there, it prints the list as well.
Question: how do I prevent the function from executing the print line when it's used in another function just to get the list?
This is part of an exercise, so I can't define more functions or split this up; I'm limited to this one function.
Edit: thank you for all the answers! I'm just a beginner, but you showed me approaches in Python (and programming in general) that I never thought of! Using a second boolean parameter is very clever. I'm learning a lot here!
Add a separate boolean argument with a default value of False to control the printing, and pass True when you want the list printed:
def get_list_of_movies(table, printIt=False):
    ...
    if printIt:
        print(a_list)
    return a_list

...
movies = get_list_of_movies(table, printIt=True)
Another approach is to pass print itself as the argument, where the default value is a no-op:
def get_list_of_movies(table, printer=lambda *args: None):
    ...
    printer(a_list)
    return a_list

...
movies = get_list_of_movies(table, printer=print)
This opens the door to customizing exactly how the result is printed; you are effectively adding an arbitrary callback to be performed on the return value, which admittedly can be handled with a custom pass-through function as well:
def print_it_first(x):
    print(x)
    return x

movies = print_it_first(get_list_of_movies(table))
This doesn't require any special treatment of get_list_of_movies itself, so is probably preferable from a design standpoint.
A completely different approach is to always print the list, but control where it gets printed to:
import os
import sys

def get_list_of_movies(table, print_to=None):
    ...
    if print_to is None:
        print_to = open(os.devnull, 'w')  # discard output by default
    print(a_list, file=print_to)
    return a_list

movies = get_list_of_movies(table, print_to=sys.stdout)
The print_to argument can be any writable file-like object; the default falls back to os.devnull, ensuring no output is visible anywhere. (Note that os.devnull itself is just a path string, so it has to be opened before it can be passed to print's file argument.)
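Putting the flag-based version together as a runnable sketch, with a hypothetical in-memory table standing in for the real lookup code:

```python
movies_db = {"sci-fi": ["Alien", "Dune"]}  # hypothetical stand-in data

def get_list_of_movies(table, printIt=False):
    a_list = movies_db[table]  # stand-in for the real "some code here"
    if printIt:
        print(a_list)
    return a_list

# called from the menu: prints and returns
movies = get_list_of_movies("sci-fi", printIt=True)

# called from another function: returns silently
movies = get_list_of_movies("sci-fi")
```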
Python allows you to declare a function like
def print_all(*arguments):
    for a in arguments:
        print(a)

print_all(1, 2, 3)
Which allows one to pass in a variable amount of data. This seems much less readable to me than building a list or a dictionary, and passing those in as arguments like so.
def print_all2(things_to_print):
    for thing in things_to_print:
        print(thing)

things_to_print = [1, 2, 3]
print_all2(things_to_print)
The second option allows you to give the argument a proper name. When would it be preferable to use the *arguments technique? Is there a time when using *arguments is more Pythonic?
Is there a time when using *arguments is more Pythonic?
Not only is it "more Pythonic"; it's often necessary.
You need to use *args whenever you don't know how many arguments a function will receive.
Think, for example, about decorators:
def deco(fun):
    def wrapper(*args, **kwargs):
        do_stuff()
        return fun(*args, **kwargs)
    return wrapper
Very opinion-based, but sometimes you want to call a function providing the arguments inline. It just looks a bit clearer:
function("please", 0, "work this time", 2.3)
than:
function(["please", 0, "work this time", 2.3])
In fact, there is a good example, which you even mention in your question: print! Imagine you'd have to create a list each time you wanted to print something:
print(["please print my variable", x, " and another:", y])
print([x])
Tedious.
I have written several functions that run sequentially, each one taking as its input the output of the previous function, so to run them I have to write this line of code:
make_list(cleanup(get_text(get_page(URL))))
and I just find that ugly and inefficient. Is there a better way to do sequential function calls?
Really, this is the same as any case where you want to refactor commonly-used complex expressions or statements: just turn the expression or statement into a function. The fact that your expression happens to be a composition of function calls doesn't make any difference (but see below).
So, the obvious thing to do is to write a wrapper function that composes the functions together in one place, so everywhere else you can make a simple call to the wrapper:
def get_page_list(url):
    return make_list(cleanup(get_text(get_page(url))))

things = get_page_list(url)
stuff = get_page_list(another_url)
spam = get_page_list(eggs)
If you don't always call the exact same chain of functions, you can always factor out into the pieces that you frequently call. For example:
def get_clean_text(page):
    return cleanup(get_text(page))

def get_clean_page(url):
    return get_clean_text(get_page(url))
This refactoring also opens the door to making the code a bit more verbose but a lot easier to debug, since the chain only appears once instead of multiple times:
def get_page_list(url):
    page = get_page(url)
    text = get_text(page)
    cleantext = cleanup(text)
    return make_list(cleantext)
If you find yourself needing to do exactly this kind of refactoring of composed functions very often, you can always write a helper that generates the refactored functions. For example:
from functools import wraps

def compose1(*funcs):
    @wraps(funcs[0])
    def composed(arg):
        for func in reversed(funcs):
            arg = func(arg)
        return arg
    return composed

get_page_list = compose1(make_list, cleanup, get_text, get_page)
If you want a more complicated compose function (that, e.g., allows passing multiple args/return values around), it can get a bit complicated to design, so you might want to look around on PyPI and ActiveState for the various existing implementations.
You could try something like this. I always like separating train wrecks (the book "Clean Code" calls those nested function calls train wrecks). This is easier to read and debug. Remember, you probably spend twice as long reading your code as writing it, so make it easy to read. You will thank yourself later.
page = get_page(URL)
page_text = get_text(page)
make_list(cleanup(page_text))

# you can also encapsulate that into its own function
def build_page_list_from_url(url):
    page = get_page(url)
    page_text = get_text(page)
    return make_list(cleanup(page_text))
Options:
Refactor: implement this series of function calls as one, aptly-named method.
Look into decorators. They're syntactic sugar for 'chaining' functions in this way. E.g. implement cleanup and make_list as decorators, then decorate get_text with them.
Compose the functions. See code in this answer.
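A sketch of the decorator option from the list above. The applies helper here is hypothetical (not a standard library function): it builds a decorator that applies a post-processing function to whatever the decorated function returns.

```python
from functools import wraps

def applies(post):
    # hypothetical helper: returns a decorator that applies `post`
    # to the decorated function's return value
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            return post(func(*args, **kwargs))
        return wrapper
    return decorator

def cleanup(text):
    return text.strip()

def make_list(text):
    return text.split()

@applies(make_list)   # applied second (outermost)
@applies(cleanup)     # applied first (innermost)
def get_text(page):
    return "  hello world  "  # stand-in for the real extraction

# get_text("page") now returns ["hello", "world"]
```

Whether this is clearer than a plain wrapper function is debatable; it couples the post-processing to get_text's definition rather than to the call site.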
You could shorten constructs like that with something like the following:
class ChainCalls(object):
    def __init__(self, *funcs):
        self.funcs = funcs

    def __call__(self, *args, **kwargs):
        result = self.funcs[-1](*args, **kwargs)
        for func in self.funcs[-2::-1]:
            result = func(result)
        return result

def make_list(arg): return 'make_list(%s)' % arg
def cleanup(arg): return 'cleanup(%s)' % arg
def get_text(arg): return 'get_text(%s)' % arg
def get_page(arg): return 'get_page(%r)' % arg

mychain = ChainCalls(make_list, cleanup, get_text, get_page)
print(mychain('http://is.gd'))
Output:
make_list(cleanup(get_text(get_page('http://is.gd'))))
In Python, what do you do if you are using multiprocessing and you need to give the function an extra argument?
Example:
if value == "Y":
    pool = multiprocessing.Pool(processes=8)
    pool.map(verify_headers, url_list)  # <- need to pass a password parameter here
    pool.close()
    pool.join()
    print "Done..."
and the function would be something like:
def verify_headers(url, password):
    pass
Pool.map takes a function of one argument and an iterable to produce that argument. We can turn your function of two arguments into a function of one argument by wrapping it in another function body:
def verify_headers_with_password(url):
    return verify_headers(url, 'secret_password')
And pass that to pool.map instead:
pool.map(verify_headers_with_password, url_list)
So long as verify_headers can take password as a keyword argument, we can shorten that a little with functools.partial:
pool.map(functools.partial(verify_headers, password='secret_password'), url_list)
Edit: as Bakuriu points out, multiprocessing passes data around by pickling, so the following doesn't work:
pool.map(lambda url: verify_headers(url, 'secret_password'), url_list)
since lambdas are functions without a name, and pickle serializes functions by name.
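As an illustration of how partial binds the extra argument, here is a sketch with a hypothetical verify_headers that just reports its inputs, using the builtin map in place of pool.map (the mechanics are the same):

```python
from functools import partial

def verify_headers(url, password):
    # hypothetical stand-in for the real check: just report what it received
    return "%s checked with %s" % (url, password)

# bind the password once; the result is a callable of one argument
check = partial(verify_headers, password="secret")

results = list(map(check, ["http://a.example", "http://b.example"]))
# each url is paired with the same bound password
```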
I believe
from functools import partial
and
pool.map(partial(verify_headers, password=password), url_list)
should work.
Edit: fixed based on recommendations below.
You define a function, right after the original, that accepts a 2-element tuple as its argument:
def verify_headers_tuple(url_passwd):
    return verify_headers(*url_passwd)
Then you can zip the original url_list with itertools.repeat(password):
import itertools as it

pool.map(verify_headers_tuple, it.izip(url_list, it.repeat(password)))
Note that the function passed to Pool.map must be defined at the top level of a module (due to pickling restrictions), which means you cannot use partial or lambda to create a "curried" function.