When is a default object created in Python?

I have a Python 3 project with the following structure:
main_script.py
util_script.py
AccessClass.py
The main script calls a function in util_script.py with the following signature:
def migrate_entity(project, name, access=AccessClass.AccessClass()):
The call itself in the main script is:
migrate_entity(project_from_file, name_from_args, access=access_object)
All objects have values when the call is made.
However, as soon as the main script is executed, the AccessClass in the function's parameter defaults is initialized, even though it is never used. For example, this __main__ block in the main script will create the default object from the function signature:
if __name__ == "__main__":
    argparser = argparse.ArgumentParser(description='Migrate support data')
    argparser.add_argument('--name', dest='p_name', type=str, help='The entity name to migrate')
    load_dotenv()
    fileConfig('logging.ini')
    # Just for the sake of it
    quit()
    # The rest of the code...
    # ...and then
    migrate_entity(project_from_file, name_from_args, access=access_object)
Even with the quit() added, the AccessClass is created. If I run the script with ./main_script.py -h, the AccessClass in the function signature is still created. And even though the only real call to the function passes an access object, I can see that AccessClass.__init__ is invoked.
If I replace the default with None and instead check the parameter inside the function and create the object there, everything works as expected, i.e. the AccessClass is not created unless it is needed.
Can someone please enlighten me why this is happening and how defaults are expected to work?
Are parameter defaults always created in advance in Python?

Basically, default values are evaluated once, at the moment you define the function, not each time you invoke it. That's why it's widely discouraged to use mutable types as defaults. You can use None, as you mentioned, and inside the body check whether the argument is None and then initialize it properly.
def foo_bad(x=[]): pass  # This is bad
foo_bad()        # the list created at definition time is used
foo_bad([1, 2])  # the provided list is used
foo_bad()        # the same list created at definition time is used again

def foo_good(x=None):
    if x is None:
        x = []
    ...  # further logic
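You can see the definition-time object directly on the function itself; it exists before any call is made and is shared across calls (a minimal sketch; the printed reprs assume CPython):
def foo_bad(x=[]):
    x.append(1)
    return x

print(foo_bad.__defaults__)  # ([],) - the default list already exists before any call
print(foo_bad())             # [1]
print(foo_bad())             # [1, 1] - the very same list object was reused
print(foo_bad.__defaults__)  # ([1, 1],)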

AccessClass is being created because you've used it as a default parameter value, so it is evaluated when the function definition runs, i.e. when the file is first imported. This is also why it's not recommended to use lists or dicts as default parameters.
This is a much safer way of defining a default value if nothing is provided:
def migrate_entity(project, name, access=None):
    if access is None:
        access = AccessClass.AccessClass()
You could also use type hinting to show what type access should be:
from typing import Optional

def migrate_entity(project, name, access: Optional[AccessClass.AccessClass] = None): ...

Related

Define the variable of a function call indirectly

I have two files file1.py and file2.py.
I execute a function from file2.py which executes a function from file1.py.
# tree/file1.py
def transfer(.., token="Y"):
    ...

# tree/file2.py
from .file1 import transfer

def secure(..):
    ...
    transfer(..)  # the `token` argument does not need to be passed here

def main(..):
    ...
    secure(..)
The transfer function from file1.py is used in several other files in the tree.
Could I, from outside the tree, set the token variable for the transfer function in file1.py? It would then apply to all of these executions.
# outside the tree folder
from tree.file2 import main
token = ?
# choose the token variable and apply it to the transfer function
main(..)
This would save me from cluttering the code by threading a token argument through every function. I want the user to be able to choose their own token, while avoiding this:
from tree.file2 import main
token = "X"
# choose the token variable and apply it to the transfer function
main(.., token=token)
# and be forced to put the token argument to all functions..
My suggestion would be to leave file1.py alone, because if you ever want to use a different token, the function is already defined to take one: def transfer(.., token):
One possibility is to make an intermediary:
# file1a.py
from .file1 import transfer as tr

def transfer(*args, **kwargs):
    kwargs.setdefault("token", "X")  # substitute the desired token
    return tr(*args, **kwargs)
Now all other modules need to import this instead: from .file1a import transfer
(if you have hundreds of files and you don't want to change the import, you could swap the contents of file1.py for file1a.py)
You can also effectively mock out the actual function and make all modules automatically call your own function, which supplies the required token:
from tree.file1 import transfer as tr

def transfer(*args, **kwargs):
    kwargs["token"] = "X"       # this function substitutes a different token
    return tr(*args, **kwargs)  # call to the real function

import tree.file1 as file1
file1.transfer = transfer  # this sets up the mock from here on

from tree.file2 import main
main(...)
What follows is a whirlwind tour of maintaining mutable state between function calls, of which the default argument is but one example.
If you want a "configurable" default value, use a sentinel to detect when no argument is passed, then assign a module global to the parameter.
transfer_default = "Y"

def transfer(.., token=None):
    if token is None:
        token = transfer_default
    ...
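The caller can then configure that default once, from outside the tree (a sketch; it assumes the tree.file1/tree.file2 layout from the question):
import tree.file1 as file1
from tree.file2 import main

file1.transfer_default = "X"  # every call that omits token now uses "X"
main(..)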
You can technically alter a preset default value (by replacing all the defaults at once, since they are stored in an immutable tuple):
# Assuming token is the first/only parameter with a default
transfer.__defaults__ = ("N",)
but I don't recommend it.
You can also use a closure to allow the caller to create their own version of transfer with whatever default they want:
def make_transfer(default):
    def _(..., token=default):
        ...
    return _

transfer1 = make_transfer("Y")
transfer2 = make_transfer("N")
While you could always use make_transfer("Y")(...) on demand, instead of reusing transfer1(...), keep in mind that make_transfer has to define a new function every time it is called.
Using a closure leads to the dual of a closure, a class.
class Transferrer:
    def __init__(self, token="Y"):
        self.token = token

    def transfer(self):
        ...

Transferrer().transfer()     # with Y
Transferrer("N").transfer()  # with N
As with closures, you probably want to reuse the same Transferrer object multiple times, rather than always creating a new one each time you want to call its transfer method.
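For example (reusing the Transferrer class above):
t = Transferrer("N")
t.transfer()  # with N
t.transfer()  # same configured object, nothing is re-created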

Only pass function parameters if they exist

I have a Python 3 class method defined like below:
class class1:
    def funcOne(self, reqvar1, reqVar2, optVar1=default1, optVar2=default2, optVar3="server.domain", optVar4="defaultUser", optVar5="<default_Flags>"):
It gets called in the main program like this (or rather, this is how I want to call it):
argsIn = argparser.parse_args()
classInst = class1()
classInst.funcOne(5, 12, argsIn.inVal1, argsIn.inVal2, argsIn.inVal3, argsIn.inVal4, argsIn.inVal5)
argsIn.inVal[1-5] are optional on the command line. If they are not supplied, I want the class function to use its defaults; if they are supplied, the supplied values should be used.
Currently, if they are not supplied on the command line, inVal[1-5] are passed as None, which overwrites the actual default values.
The class function is maintained separately and they manage the defaults. Putting them into my script (for example in the argparser options) is not appropriate.
Is there a way to easily work with this situation that doesn't resort to:
if args.inVal1 and not args.inVal2...
if not args.inVal1 and args.inVal2 and not args.inVal3...
as the number of combinations gets large.
It seems like it should be simple, but I am not connecting something here.
Thank you for the help.
If you create a dictionary that contains the optional variable names, you can pass that dictionary to the function call. I have commented out the argparser lines only for testing.
class class1:
    def funcOne(self, reqvar1, reqVar2, optVar1='default1', optVar2='default2', optVar3="server.domain", optVar4="defaultUser", optVar5="<default_Flags>"):
        print(optVar1)
        print(optVar2)
        print(optVar3)
        print(optVar4)
        print(optVar5)

#argsIn = argparser.parse_args()
optionalArgs = {'optVar1': 'TestingVar1',  #args.inVal1,
                'optVar2': None,           #args.inVal2,
                'optVar3': 'TestingVar3',  #args.inVal3,
                'optVar4': None,           #args.inVal4,
                'optVar5': None}           #args.inVal5
optionalArgsClean = {k: v for k, v in optionalArgs.items() if v is not None}

classInst = class1()
classInst.funcOne(5, 12, **optionalArgsClean)
Running the above code produces:
TestingVar1
default2
TestingVar3
defaultUser
<default_Flags>
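If you make the argparse dest names match the parameter names (an assumption; the question's dests are inVal1..inVal5, so they would need renaming to optVar1..optVar5), you can build the same dictionary straight from the parsed namespace with vars():
argsIn = argparser.parse_args()
optNames = ('optVar1', 'optVar2', 'optVar3', 'optVar4', 'optVar5')
optionalArgsClean = {k: v for k, v in vars(argsIn).items()
                     if k in optNames and v is not None}
classInst.funcOne(5, 12, **optionalArgsClean)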

Will Python automatically detect that the function was never called but defined?

True or False
If a function is defined but never called, then Python automatically detects that and issues a warning
One of the issues with this is that functions in Python are first-class objects, so their names can be reassigned. For example:
def myfunc():
    pass

a = myfunc
myfunc = 42
a()
We also have closures, where a function is returned by another function and the original name goes out of scope.
Unfortunately it is also perfectly legal to define a function with the same name as an existing one. For example:
def myfunc():  # <<< This code is never called
    pass

def myfunc():
    pass

myfunc()
So any tracking must include the function's id, not just its name - although that won't help with closures, since the id could get reused. It also won't help if the __name__ attribute of the function is reassigned.
You could track function calls using a decorator. Here I have used the name and the id - the id on its own would not be readable.
import functools

globalDict = {}

def tracecall(f):
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        global globalDict
        key = "%s (%d)" % (f.__name__, id(f))
        # Count the number of calls
        if key in globalDict:
            globalDict[key] += 1
        else:
            globalDict[key] = 1
        return f(*args, **kwargs)
    return wrapper
@tracecall
def myfunc1():
    pass

myfunc1()
myfunc1()

@tracecall
def myfunc1():
    pass

a = myfunc1
myfunc1 = 42
a()

print(globalDict)
Gives:
{'myfunc1 (4339565296)': 2, 'myfunc1 (4339565704)': 1}
But that only gives the functions that have been called, not those that have not!
So where to go from here? I hope you can see that the task is quite difficult given the dynamic nature of python. But I hope the decorator I show above could at least allow you to diagnose the way the code is used.
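If you do want the never-called functions, one option is to register every decorated function at decoration time and report the difference at interpreter exit (a sketch building on the decorator above; registered and report_uncalled are names invented here for illustration):
import atexit
import functools

registered = {}  # key -> call count, populated at decoration time

def tracecall(f):
    key = "%s (%d)" % (f.__name__, id(f))
    registered[key] = 0
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        registered[key] += 1
        return f(*args, **kwargs)
    return wrapper

@atexit.register
def report_uncalled():
    print("Never called:", [k for k, v in registered.items() if v == 0])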
No, it is not. Python does not detect this. If you want to detect which functions are (or are not) called at run time, you can keep a global set in your program: inside each function, add the function's name to the set. Later you can print the set's contents to check whether a given function was called.
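A minimal sketch of that idea (the function names are just for illustration):
called = set()

def myfunc():
    called.add("myfunc")

def unused():
    called.add("unused")

myfunc()

defined = {"myfunc", "unused"}
print("never called:", defined - called)  # {'unused'}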
False. Ignoring the difficulty and overhead of doing this, there's no reason why it would be useful.
A function that is defined in a module (i.e. a Python file) but not called elsewhere in that module might be called from a different module, so that doesn't deserve a warning.
If Python were to analyse all modules that get run over the course of a program, and print a warning about functions that were not called, it may be that a function was not called because of the input in this particular run e.g. perhaps in a calculator program there is a "multiply" function but the user only asked to sum some numbers.
If Python were to analyse all modules that make up a program and print a warning about functions that could not possibly be called (this is impossible, but stay with me here), then it would warn about functions that were intended for use in other programs. E.g. if you have two calculator programs, a simple one and an advanced one, maybe you have a central calc.py with utility functions; advanced functions like exp and log could not possibly be called when calc.py is used as part of the simple program, but that shouldn't cause a warning because they're needed for the advanced program.

unbound method must be called with instance as first argument

I have a relatively simple class which just changes the values of variables depending on the state.
class SetStates:
    def LM_State1():
        global p_LM1, p_LM2, p_LM3, p_RR1, p_RR2, p_RR3, p_RF1, p_RF2, p_RF3
        p_LM1 = Ra_L*P_j1_s1
        p_LM2 = P_j2_s1
        p_LM3 = P_j3_s1
        p_RR1 = Ra_R*(-1)*P_j1_s1
        p_RR2 = (-1)*P_j2_s1
        p_RR3 = (-1)*P_j3_s1
        p_RF1 = Ra_R*(-1)*P_j1_s1
        p_RF2 = (-1)*P_j2_s1
        p_RF3 = (-1)*P_j3_s1
Initially I was calling the function within the class like so:
if LM_state == 1:
    SetStates.LM_State1()
After realizing I need to instantiate the class, it now looks like this:
s = SetStates()
if LM_state == 1:
    s.LM_State1()
But I am now receiving an error saying that it was given 1 argument but expected 0. I am almost certain I am missing something very trivial. If someone could clear this up, that would be great. Thanks.
Methods (that is to say, any def block defined inside a class body) automatically get passed the calling instance as their first argument (unless defined as a staticmethod, but let's not muddy the waters). Since your definition of LM_State1() doesn't take any arguments, Python complains that it was given an argument (the instance s) that it doesn't know what to do with.
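Concretely, either accept the instance or mark the method as static (a minimal sketch):
class SetStates:
    def LM_State1(self):  # instance method: receives the caller as self
        ...

    @staticmethod
    def LM_State2():      # static method: no automatic first argument
        ...

s = SetStates()
s.LM_State1()
s.LM_State2()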
As @BrenBarn mentions in the comments, your class doesn't make a whole lot of sense from a design perspective if it's just modifying global state, but that's the reason for the error anyway. If you really need this (hint: you don't), you should consider wrapping it in a module, importing the module, and defining all your set_state functions at the top level of that module.
# stateful.py
def set_state_1():
    ...

# main.py
import stateful
stateful.set_state_1()  # set the state!

python decorator losing argument definitions

I am using a block like this:
import functools
import xmlrpclib

def served(fn):
    def wrapper(*args, **kwargs):
        p = xmlrpclib.ServerProxy(SERVER, allow_none=True)  # SERVER is defined elsewhere
        return getattr(p, fn.__name__)(*args, **kwargs)  # do the remote call
    return functools.update_wrapper(wrapper, fn)

@served
def remote_function(a, b):
    pass
to wrap a series of XML-RPC calls into a python module. The "served" decorator gets called on stub functions to expose operations on a remote server.
I'm creating stubs like this with the intention of being able to inspect them later for information about the function, specifically its arguments.
As listed, the code above does not transfer argument information from the original function to the wrapper. If I inspect with inspect.getargspec(remote_function), I get an essentially empty argument list, instead of the args=['a', 'b'] I was expecting.
I'm guessing I need to give additional direction to the functools.update_wrapper() call via the optional assigned parameter, but I'm not sure exactly what to add to that tuple to get the effect I want.
The name and the docstring are correctly transferred to the new function object, but can someone advise me on how to transfer argument definitions?
Thanks.
Previous questions here and here suggest that the decorator module can do this.
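For what it's worth, on Python 3 functools.wraps and functools.update_wrapper also set __wrapped__ on the wrapper, and the modern inspect.signature follows it, so the parameter list can be recovered without a third-party module (a minimal sketch; the stub body here just stands in for the XML-RPC call):
import functools
import inspect

def served(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)  # stand-in for the remote call
    return wrapper

@served
def remote_function(a, b):
    pass

print(inspect.signature(remote_function))  # (a, b)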
