Will Python automatically detect that the function was never called but defined? - python

True or False
If a function is defined but never called, then Python automatically detects that and issues a warning

One of the issues with this is that functions in Python are first class objects. So their name can be reassigned. For example:
def myfunc():
    pass

a = myfunc
myfunc = 42
a()
We also have closures, where a function is returned by another function and the original name goes out of scope.
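For instance (a minimal sketch; make_adder and add5 are just illustrative names), the inner function keeps working even though the name it was defined under is long gone:
def make_adder(n):
    def adder(x):      # 'adder' only exists inside make_adder...
        return x + n
    return adder

add5 = make_adder(5)   # ...but the function object lives on under a new name
print(add5(2))         # 7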
Unfortunately it is also perfectly legal to define a function with the same name as an existing one. For example:
def myfunc(): # <<< This code is never called
    pass

def myfunc():
    pass

myfunc()
So any tracking must include the function's id, not just its name - although that won't help with closures, since the id could get reused. It also won't help if the __name__ attribute of the function is reassigned.
You could track function calls using a decorator. Here I have used the name and the id - the id on its own would not be readable.
import functools

globalDict = {}

def tracecall(f):
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        global globalDict
        key = "%s (%d)" % (f.__name__, id(f))
        # Count the number of calls
        if key in globalDict:
            globalDict[key] += 1
        else:
            globalDict[key] = 1
        return f(*args, **kwargs)
    return wrapper
@tracecall
def myfunc1():
    pass

myfunc1()
myfunc1()

@tracecall
def myfunc1():
    pass

a = myfunc1
myfunc1 = 42
a()

print(globalDict)
Gives:
{'myfunc1 (4339565296)': 2, 'myfunc1 (4339565704)': 1}
But that only gives the functions that have been called, not those that have not!
So where to go from here? I hope you can see that the task is quite difficult given the dynamic nature of python. But I hope the decorator I show above could at least allow you to diagnose the way the code is used.
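One possible extension (a sketch of my own, not guaranteed to cover every case) is to register every decorated function at definition time and compare that registry with the set of functions actually called when the program exits:
import atexit

registered = {}   # key -> function object, filled in at definition time
called = set()    # keys of functions that were actually invoked

def tracecall2(f):
    key = "%s (%d)" % (f.__name__, id(f))
    registered[key] = f
    def wrapper(*args, **kwargs):
        called.add(key)
        return f(*args, **kwargs)
    return wrapper

@atexit.register
def report_never_called():
    for key in registered:
        if key not in called:
            print("Defined but never called:", key)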

No, it is not. Python does not detect this. If you want to detect which functions are called during runtime, you can use a global set in your program. Inside each function, add the function's name to the set. Later you can print the set's contents and check whether a particular function was called.
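A minimal sketch of that idea (the function names are just placeholders):
called_functions = set()

def my_task():
    called_functions.add('my_task')
    # ... real work goes here ...

def unused_task():
    called_functions.add('unused_task')
    # ... real work goes here ...

my_task()
print(called_functions)   # {'my_task'} -- so unused_task was never called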

False. Ignoring the difficulty and overhead of doing this, there's no reason why it would be useful.
A function that is defined in a module (i.e. a Python file) but not called elsewhere in that module might be called from a different module, so that doesn't deserve a warning.
If Python were to analyse all modules that get run over the course of a program and print a warning about functions that were not called, it may be that a function was not called only because of the input in this particular run, e.g. perhaps in a calculator program there is a "multiply" function but the user only asked to sum some numbers.
If Python were to analyse all modules that make up a program and print a warning about functions that could not possibly be called (this is impossible, but stay with me here), then it would warn about functions that were intended for use in other programs. E.g. if you have two calculator programs, a simple one and an advanced one, maybe you have a central calc.py with utility functions; advanced functions like exp and log could not possibly be called when calc.py is used as part of the simple program, but that shouldn't cause a warning because they're needed for the advanced program.

Does python allow me to pass dynamic variables to a decorator at runtime?

I am attempting to integrate a very old system and a newer system at work. The best I can do is to use an RSS firehose-type feed that the old system provides. The goal is to use this RSS feed to make the other system perform certain actions when certain people do things.
My idea is to wrap a decorator around certain functions to check if the user (a user ID provided in the RSS feed) has permissions in the new system.
My current solution has a lot of functions that look like this, which are called based on an action field in the feed:
actions_dict = {
    ...
    'action1': function1
}

actions_dict[RSSFEED['action_taken']](RSSFEED['user_id'])

def function1(user_id):
    if has_permissions(user_id):
        # Do this function
I want to create a has_permissions decorator that takes the user_id so that I can remove this redundant has_permissions check in each of my functions.
@has_permissions(user_id)
def function1():
    # Do this function
Unfortunately, I am not sure how to write such a decorator. All the tutorials I see have the @has_permissions() line with a hardcoded value, but in my case it needs to be passed at runtime and will be different each time the function is called.
How can I achieve this functionality?
In your question, you've named both, the check of the user_id, as well as the wanted decorator has_permissions, so I'm going with an example where names are more clear: Let's make a decorator that calls the underlying (decorated) function when the color (a string) is 'green'.
Python decorators are function factories
The decorator itself (if_green in my example below) is a function. It takes a function to be decorated as argument (named function in my example) and returns a function (run_function_if_green in the example). Usually, the returned function calls the passed function at some point, thereby "decorating" it with other actions it might run before or after it, or both.
Of course, it might only conditionally run it, as you seem to need:
def if_green(function):
    def run_function_if_green(color, *args, **kwargs):
        if color == 'green':
            return function(*args, **kwargs)
    return run_function_if_green
@if_green
def print_if_green():
    print('what a nice color!')

print_if_green('red')    # nothing happens
print_if_green('green')  # => what a nice color!
What happens when you decorate a function with the decorator (as I did with print_if_green, here), is that the decorator (the function factory, if_green in my example) gets called with the original function (print_if_green as you see it in the code above). As is its nature, it returns a different function. Python then replaces the original function with the one returned by the decorator.
So in the subsequent calls, it's the returned function (run_function_if_green with the original print_if_green as function) that gets called as print_if_green and which conditionally calls further to that original print_if_green.
Function factories can produce functions that take arguments
The call to the decorator (if_green) only happens once for each decorated function, not every time the decorated functions are called. But as the function returned by the decorator that one time permanently replaces the original function, it gets called instead of the original function every time that original function is invoked. And it can take arguments, if we allow it.
I've given it an argument color, which it uses itself to decide whether to call the decorated function. Further, I've given it the idiomatic vararg arguments, which it uses to call the wrapped function (if it calls it), so that I'm allowed to decorate functions taking an arbitrary number of positional and keyword arguments:
@if_green
def exclaim_if_green(exclamation):
    print(exclamation, 'that IS a nice color!')

exclaim_if_green('red', 'Yay')    # again, nothing
exclaim_if_green('green', 'Wow')  # => Wow that IS a nice color!
The result of decorating a function with if_green is that a new first argument gets prepended to its signature, which will be invisible to the original function (as run_function_if_green doesn't forward it). As you are free in how you implement the function returned by the decorator, it could also call the original function with less, more or different arguments, do any required transformation on them before passing them to the original function or do other crazy stuff.
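To illustrate that last point with a made-up example (shout and greet are hypothetical names), a wrapper can transform an argument before forwarding it:
def shout(function):
    def wrapper(text):
        return function(text.upper())   # transform the argument, then forward it
    return wrapper

@shout
def greet(name):
    print('hello,', name)

greet('world')   # => hello, WORLD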
Concepts, concepts, concepts
Understanding decorators requires knowledge and understanding of various other concepts of the Python language. (Most of which aren't specific to Python, but one might still not be aware of them.)
For brevity's sake (this answer is long enough as it is), I've skipped or glossed over most of them. For a more comprehensive speedrun through (I think) all relevant ones, consult e.g. Understanding Python Decorators in 12 Easy Steps!.
The inputs to decorators (arguments, wrapped function) are rather static in Python. There is no way to dynamically pass an argument like you're asking. If the user id can be extracted from somewhere at runtime inside the decorator function, however, you can achieve what you want.
In Django, for example, things like @login_required expect that the function they're wrapping has request as the first argument, and Request objects have a user attribute that they can utilize. Another, uglier option is to have some sort of global object you can get the current user from (see thread-local storage).
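A rough sketch of that thread-local approach (set_current_user and requires_permission are names I made up; has_permissions is the check from the question):
import threading

_state = threading.local()          # per-thread storage for the "current" user

def set_current_user(user_id):
    _state.user_id = user_id

def requires_permission(func):
    def wrapper(*args, **kwargs):
        user_id = getattr(_state, 'user_id', None)
        if user_id is not None and has_permissions(user_id):
            return func(*args, **kwargs)
        raise PermissionError('user %r lacks permission' % (user_id,))
    return wrapper

# set_current_user(RSSFEED['user_id']) would be called before dispatching the action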
The short answer is no: you cannot pass dynamic parameters to decorators.
But... you can certainly invoke them programmatically:
First let's create a decorator that can perform a permission check before executing a function:
import functools

def check_permissions(user_id):
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kw):
            if has_permissions(user_id):
                return f(*args, **kw)
            else:
                # what do you want to do if there aren't permissions?
                ...
        return wrapper
    return decorator
Now, when extracting an action from your dictionary, wrap it using the decorator to create a new callable that does an automatic permission check:
checked_action = check_permissions(RSSFEED['user_id'])(
    actions_dict[RSSFEED['action_taken']])
Now, when you call checked_action it will first check the permissions corresponding to the user_id before executing the underlying action.
You can easily work around it. For example:
from functools import wraps

def some_function():
    print("some_function executed")

def some_decorator(decorator_arg1, decorator_arg2):
    def decorate(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(decorator_arg1)
            ret = func(*args, **kwargs)
            print(decorator_arg2)
            return ret
        return wrapper
    return decorate

arg1 = "pre"
arg2 = "post"
decorated = some_decorator(arg1, arg2)(some_function)
In [4]: decorated()
pre
some_function executed
post

unbound method must be called with instance as first argument. - Python

I have a relatively simple class which just changes the values of variables depending on the state.
class SetStates:
    def LM_State1():
        global p_LM1, p_LM2, p_LM3, p_RR1, p_RR2, p_RR3, p_RF1, p_RF2, p_RF3
        p_LM1 = Ra_L*P_j1_s1
        p_LM2 = P_j2_s1
        p_LM3 = P_j3_s1
        p_RR1 = Ra_R*(-1)*P_j1_s1
        p_RR2 = (-1)*P_j2_s1
        p_RR3 = (-1)*P_j3_s1
        p_RF1 = Ra_R*(-1)*P_j1_s1
        p_RF2 = (-1)*P_j2_s1
        p_RF3 = (-1)*P_j3_s1
Initially I was calling the function within the class like so:
if LM_state == 1:
    SetStates.LM_State1()
After realizing I need to initialize it now looks like this.
s = SetStates()
if LM_state == 1:
    s.LM_State1()
But I am now receiving an error specifying that it was given 1 argument but expected 0. I am almost certain I am missing something very trivial. If someone could clear this up it would be great, thanks.
Class methods (that is to say: any def block defined inside a class definition) automatically get passed the instance caller as their first argument (unless it's defined as a staticmethod but let's not muddy the waters). Since your function definition for LM_State1() doesn't include any arguments, Python complains that you gave it an argument (s) that it doesn't know what to do with.
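A minimal sketch of the usual fixes (either accept the instance explicitly, or mark the method static so no instance is expected):
class SetStates:
    def LM_State1(self):       # accept the instance Python passes automatically
        pass                   # ... set the state variables here ...

    @staticmethod
    def LM_State2():           # or opt out of the implicit instance argument
        pass

s = SetStates()
s.LM_State1()   # works: s is passed in as self
s.LM_State2()   # works: no implicit argument expected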
As @BrenBarn mentions in the comments, your class doesn't make a whole lot of sense from a design perspective if it's just modifying global state, but that's the reason for the error anyway. If you really need this (hint: you don't) you should consider wrapping it in a module, importing the module, and defining all your set_state functions at the top level of that module.
# stateful.py
def set_state_1():
    ...
# main.py
import stateful
stateful.set_state_1() # set the state!

Python custom function

I have been working at learning Python over the last week and it has been going really well, however I have now been introduced to custom functions and I sort of hit a wall. While I understand the basics of it, such as:
def helloworld():
    print("Hello World!")

helloworld()
I know this will print "Hello World!".
However, when it comes to getting information from one function to another, I find that confusing, i.e. when function1 and function2 have to work together to perform a task. Also, when to use the return command.
Lastly, when I have a list or a dictionary inside of a function. I'll make something up just as an example.
def my_function():
    my_dict = {"Key1": Value1,
               "Key2": Value2,
               "Key3": Value3,
               "Key4": Value4,}
How would I access the key/value and be able to change them from outside of the function? ie: If I had a program that let you input/output player stats or a character attributes in a video game.
I understand bits and pieces of this, it just confuses me when they have different functions calling on each other.
Also, since this was my first encounter with the custom functions. Is this really ambitious to pursue and this could be the reason for all of my confusion? Since this is the most complex program I have seen yet.
Functions in Python can be both a regular procedure and a function with a return value. Actually, every Python function returns a value, which might be None.
If a return statement is not present, then your function will be executed completely and leave normally following the code flow, yielding None as a return value.
def foo():
    pass

foo() == None
>>> True
If you have a return statement inside your function, the return value will be the value of the expression following it. For example, you may have return None and you'll be explicitly returning None. You can also have a bare return and you'll be implicitly returning None, or you can have return 3 and you'll be returning the value 3. This may grow in complexity.
def foo():
    print('hello')
    return
    print('world')

foo()
>>> hello

def add(a, b):
    return a + b

add(3, 4)
>>> 7
If you want a dictionary (or any object) you created inside a function, just return it:
def my_function():
    my_dict = {"Key1": Value1,
               "Key2": Value2,
               "Key3": Value3,
               "Key4": Value4,}
    return my_dict
d = my_function()
d['Key1']
>>> Value1
Those are the basics of function calling. There's even more: there are functions that return functions (also treated as decorators). You can even return multiple values (not really, you'll just be returning a tuple) and do a lot of fun stuff :)
def two_values():
    return 3, 4

a, b = two_values()
print(a)
>>> 3
print(b)
>>> 4
Hope this helps!
The primary way to pass information between functions is with arguments and return values. Functions can't see each other's variables. You might think that after
def my_function():
    my_dict = {"Key1": Value1,
               "Key2": Value2,
               "Key3": Value3,
               "Key4": Value4,}

my_function()
my_dict would have a value that other functions would be able to see, but it turns out that's a really brittle way to design a language. Every time you call my_function, my_dict would lose its old value, even if you were still using it. Also, you'd have to know all the names used by every function in the system when picking the names to use when writing a new function, and the whole thing would rapidly become unmanageable. Python doesn't work that way; I can't think of any languages that do.
Instead, if a function needs to make information available to its caller, return the thing its caller needs to see:
def my_function():
    return {"Key1": "Value1",
            "Key2": "Value2",
            "Key3": "Value3",
            "Key4": "Value4",}

print(my_function()['Key1'])  # Prints Value1
Note that a function ends when its execution hits a return statement (even if it's in the middle of a loop); you can't execute one return now, one return later, keep going, and return two things when you hit the end of the function. If you want to do that, keep a list of things you want to return and return the list when you're done.
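A tiny sketch of that pattern (made-up names), collecting results in a list and returning the list once at the end:
def collect_squares(numbers):
    results = []                 # gather everything here...
    for n in numbers:
        results.append(n * n)
    return results               # ...and hand it all back in one go

print(collect_squares([1, 2, 3]))   # [1, 4, 9]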
You send information into and out of functions with arguments and return values, respectively. This function, for example:
def square(number):
    """Return the square of a number."""
    return number * number
... receives information through the number argument, and sends information back with the return statement. You can use it like this:
>>> x = square(7)
>>> print(x)
49
As you can see, we passed the value 7 to the function, and it returned the value 49 (which we stored in the variable x).
Now, lets say we have another function:
def halve(number):
    """Return half of a number."""
    return number / 2.0
We can send information between two functions in a couple of different ways.
Use a temporary variable:
>>> tmp = square(6)
>>> halve(tmp)
18.0
use the first function directly as an argument to the second:
>>> halve(square(8))
32.0
Which of those you use will depend partly on personal taste, and partly on how complicated the thing you're trying to do is.
Even though they have the same name, the number variables inside square() and halve() are completely separate from each other, and they're invisible outside those functions:
>>> number
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'number' is not defined
So, it's actually impossible to "see" the variable my_dict in your example function. What you would normally do is something like this:
def my_function(my_dict):
    # do something with my_dict
    return my_dict
... and define my_dict outside the function.
(It's actually a little bit more complicated than that - dict objects are mutable (which just means they can change), so often you don't actually need to return them. However, for the time being it's probably best to get used to returning everything, just to be safe).
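For the curious, a small sketch of what that mutability means in practice (player and add_score are made-up names):
def add_score(stats):
    stats["score"] = 100      # changes the caller's dict in place

player = {"name": "Alex"}
add_score(player)             # nothing returned...
print(player)                 # ...yet player is now {'name': 'Alex', 'score': 100}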

Seeking general advice on how to prevent relentless "NameErrors" in Python

I have a question that I am sure has been on the mind of every intermediate-level Python programmer at some point: that is, how to fix/prevent/avoid/work around those ever-so-persistent and equally frustrating NameErrors. I'm not talking about actual errors (like typos, etc.), but a bizarre problem that basically says a global name was not defined, when in reality it was defined further down. For whatever reason, Python seems to be extremely needy in this area: every single variable absolutely positively has to be defined above and only above anything that refers to it (or so it seems).
For example:
condition = True

if condition == True:
    doStuff()

def doStuff():
    it_worked = True
Causes Python to give me this:
Traceback (most recent call last):
File "C:\Users\Owner\Desktop\Python projects\test7.py", line 4, in <module>
doStuff()
NameError: name 'doStuff' is not defined
However, the name WAS defined, just not where Python apparently wanted it. So for a cheesy little function like doStuff() it's no big deal; just cut and paste the function into an area that satisfies the system's requirement for a certain order. But when you try to actually design something with it, it makes organizing code practically impossible (I've had to "un-organize" tons of code to accommodate this bug). I have never encountered this problem with any of the other languages I've written in, so it seems to be specific to Python... but anyway I've researched this in the docs and haven't found any solutions (or even potential leads to a possible solution) so I'd appreciate any tips, tricks, workarounds or other suggestions.
It may be as simple as learning a specific organizational structure (like some kind of "Pythonic" and very strategic approach to working around the bug), or maybe just use a lot of import statements so it'll be easier to organize those in a specific order that will keep the system from acting up...
Avoid writing code (other than declarations) at top-level, use a main() function in files meant to be executed directly:
def main():
    condition = True
    if condition:
        do_stuff()

def do_stuff():
    it_worked = True

if __name__ == '__main__':
    main()
This way you only need to make sure that the if..main construct follows the main() function (e.g. place it at the end of the file), the rest can be in any order. The file will be fully parsed (and thus all the names defined in the module can be resolved) by the time main() is executed.
As a rule of thumb: For most cases define all your functions first and then use them later in your code.
It is just the way it is: every name has to be defined at the time it is used.
This is especially true at code being executed at top level:
func()

def func():
    func2()

def func2():
    print "OK"

func()
The first func() will fail, because it is not defined yet.
But if I call func() at the end, everything will be OK, although func2() is defined after func().
Why? Because at the time of calling, func2() exists.
In short, the code of func() says "Call whatever is defined as func2 at the time of calling".
In Python defining a function is an act which happens at runtime, not at compile time. During that act, the code compiled at compile time is assigned to the name of the function. This name then is a variable in the current scope. It can be overwritten later as any other variable can:
def f():
    print 42

f() # will print 42

def f():
    print 23

f() # will print 23
You can even assign functions like other values to variables:
def f():
    print 42

g = 23

f() # will print 42
g # will print 23

f, g = g, f

f # will print 23
g() # will print 42
When you say that you didn't come across this in other languages, it's because the other languages you are referring to aren't interpreted as a script. Try similar things in bash for instance and you will find that things can be as in Python in other languages as well.
There are a few things to say about this:
If your code is so complex that you can't organize it in one file, think about using many files and import them into one smaller main file
If you put your function in a class it will work. Example:
class test():
    def __init__(self):
        self.do_something()

    def do_something(self):
        print 'test'
As said in the comment from Volatility, that is a characteristic of interpreted languages.

Is there a high-level profiling module for Python?

I want to profile my Python code. I am well-aware of cProfile, and I use it, but it's too low-level. (For example, there isn't even a straightforward way to catch the return value from the function you're profiling.)
One of the things I would like to do: I want to take a function in my program and set it to be profiled on the fly while running the program.
For example, let's say I have a function heavy_func in my program. I want to start the program and have the heavy_func function not profile itself. But sometime during the runtime of my program, I want to change heavy_func to profile itself while it's running. (If you're wondering how I can manipulate stuff while the program is running: I can do it either from the debug probe or from the shell that's integrated into my GUI app.)
Is there a module already written which does stuff like this? I can write it myself but I just wanted to ask before so I won't be reinventing the wheel.
It may be a little mind-bending, but this technique should help you find the "bottlenecks", if that's what you want to do.
You're pretty sure of what routine you want to focus on.
If that's the routine you need to focus on, it will prove you right.
If the real problem(s) are somewhere else, it will show you where they are.
If you want a tedious list of reasons why, look here.
I wrote my own module for it. I called it cute_profile. Here is the code. Here are the tests.
Here is the blog post explaining how to use it.
It's part of GarlicSim, so if you want to use it you can install garlicsim and do from garlicsim.general_misc import cute_profile.
If you want to use it on Python 3 code, just install the Python 3 fork of garlicsim.
Here's an outdated excerpt from the code:
import functools

from garlicsim.general_misc import decorator_tools

from . import base_profile


def profile_ready(condition=None, off_after=True, sort=2):
    '''
    Decorator for setting a function to be ready for profiling.

    For example:

        @profile_ready()
        def f(x, y):
            do_something_long_and_complicated()

    The advantages of this over regular `cProfile` are:

    1. It doesn't interfere with the function's return value.

    2. You can set the function to be profiled *when* you want, on the fly.

    How can you set the function to be profiled? There are a few ways:

    You can set `f.profiling_on=True` for the function to be profiled on the
    next call. It will only be profiled once, unless you set
    `f.off_after=False`, and then it will be profiled every time until you set
    `f.profiling_on=False`.

    You can also set `f.condition`. You set it to a condition function taking
    as arguments the decorated function and any arguments (positional and
    keyword) that were given to the decorated function. If the condition
    function returns `True`, profiling will be on for this function call,
    `f.condition` will be reset to `None` afterwards, and profiling will be
    turned off afterwards as well. (Unless, again, `f.off_after` is set to
    `False`.)

    `sort` is an `int` specifying which column the results will be sorted by.
    '''
    def decorator(function):
        def inner(function_, *args, **kwargs):
            if decorated_function.condition is not None:
                if decorated_function.condition is True or \
                   decorated_function.condition(
                       decorated_function.original_function,
                       *args,
                       **kwargs
                   ):
                    decorated_function.profiling_on = True
            if decorated_function.profiling_on:
                if decorated_function.off_after:
                    decorated_function.profiling_on = False
                    decorated_function.condition = None
                # This line puts it in locals, weird:
                decorated_function.original_function
                base_profile.runctx(
                    'result = '
                    'decorated_function.original_function(*args, **kwargs)',
                    globals(), locals(), sort=decorated_function.sort
                )
                return locals()['result']
            else:  # decorated_function.profiling_on is False
                return decorated_function.original_function(*args, **kwargs)
        decorated_function = decorator_tools.decorator(inner, function)
        decorated_function.original_function = function
        decorated_function.profiling_on = None
        decorated_function.condition = condition
        decorated_function.off_after = off_after
        decorated_function.sort = sort
        return decorated_function
    return decorator
