I am quite new to Python and am working on a fairly large Python module in IntelliJ. Suppose I have a static method that is called from multiple places, and I change its signature to accept a different number of arguments. I might fix the calls in some places but miss them in others. In Java I would get a compile-time error, but since Python is interpreted I will only find out at runtime (probably in production). So far I have been using 'python -m compileall', but I was wondering: as in Java, is there any way to have IntelliJ flag these errors?
Unit tests and static code analysis tools such as pylint will help; pylint is able to detect an incorrect number of arguments being passed to a function.
If you're using Python 3, function annotations might be useful, and mypy can type-check annotated function calls (apparently; I've not used it myself).
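For example, here is a minimal sketch of the kind of mismatch mypy flags without running the program (the error wording in the comments is paraphrased):

def add(a: int, b: int) -> int:
    return a + b

add(1)         # mypy: missing positional argument "b"
add(1, "two")  # mypy: argument 2 has an incompatible type (str, expected int)
add(1, 2)      # passes the check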
In general, the strategy for changing an existing function signature without breaking dependent code is to use keyword arguments. For example, if you want to add a new argument to a function, add it as a keyword argument:
# def f(a):
#     """Original function"""
#     print(a)

def f(a, b=None):
    """New and improved function"""
    print(a)
    if b is not None:
        print(b)
Now calls with and without the new argument will work:
>>> f('blah')
blah
>>> f('blah', 'cough')
blah
cough
Of course this will not always work, e.g. if arguments are removed, or if the semantics of the function change in a way that breaks existing code.
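If an argument does have to go away, one common softening step (a sketch, not a universal recipe) is to keep accepting it for a release and warn callers that still pass it:

import warnings

def f(a, b=None):
    """f no longer uses b; warn callers that still pass it."""
    if b is not None:
        warnings.warn("'b' is deprecated and ignored", DeprecationWarning, stacklevel=2)
    print(a)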
Related
I would like to copy an existing function from an existing module in the following way:
def foo(a, b, c=1, d=3, *arg):
    return True

myClass.foo = lambda b, c, d, *arg: foo(my_value_of_a, b, c, d, *arg)
However, there are several problems with this approach, namely:
I am doing this in a loop and I don't know the arguments of most functions
I am losing the default values, which I absolutely cannot afford to do
The __doc__ and other attributes would be nice to keep too
I tried to do something like this:
handler = getattr(mod,'foo')
handler.__defaults__ = tuple([my_value_of_a] + list(handler.__defaults__))
myClass.foo = handler
which is almost enough for my use case (only because I always modify just the first argument). The problem is that if I call mod.foo(), it now also has my_value_of_a as the default value for a!
I tried using the copy module to do handler = deepcopy(handler), but even that didn't work: modifying the default values of handler also modifies the default values of the module function itself.
Any suggestions on how to do this in a "pythonic" way? I probably cannot use decorators either, since I'm looping over functions from external modules (several, actually).
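For what it's worth, functools.partial is the usual non-mutating way to bind the first argument: it forwards the remaining parameters (and their defaults) untouched and leaves the module's own function alone. A sketch reusing the names from the question (note that a partial object does not turn into a bound method when looked up on an instance):

import functools

def foo(a, b, c=1, d=3, *arg):
    """Original docstring."""
    return True

my_value_of_a = 42        # placeholder for whatever value you want to bind

class myClass:            # stand-in for the class from the question
    pass

bound = functools.partial(foo, my_value_of_a)  # binds only 'a'; c=1 and d=3 survive
functools.update_wrapper(bound, foo)           # copies __doc__ and other attributes
myClass.foo = bound

foo(0, 0)                 # the original function is unchanged
myClass.foo(0)            # same as foo(my_value_of_a, 0)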
I learned from here that one of the characteristics of a first-class object is that it can be created at run time.
It is not clear to me when a function is actually created at run time. Can someone tell me how to identify a function that is created at run time versus one that is not? Are all functions in Python created at run time?
That post isn't great. Everything in Python is "created at runtime"; you would need to compare with a compiled language (e.g. C) to find a difference between compile time and run time. You could also compare with languages like PHP (which you seem to have used), where you have to call magic functions like call_user_func and pass a string, instead of just passing a function object around. I don't know PHP very well, but it would seem to struggle with idioms like:
def foo(a):
    def bar(b):
        return a * b
    return bar

baz = foo(3)
print(baz(5))
where baz is a closure that has "bound" a to the value 3, i.e. it keeps a reference to that value around so it can be used later. I think you would need to create a class and an object to bundle this functionality up in PHP.
A talk like https://www.youtube.com/watch?v=_AEJHKGk9ns might help you understand how names and values work in Python. A more advanced tool is the dis module, which is good for understanding how things like this work under the hood.
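For instance (a quick sketch; the exact opcodes vary across Python versions):

import dis

def foo(a):
    def bar(b):
        return a * b
    return bar

dis.dis(foo)  # the output shows MAKE_FUNCTION plus the closure handling for 'a'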
Could someone tell me whether this idea is feasible in Python?
I want to have a method whose argument types are not fixed in the signature.
For example:
Foo(data1, data2) <-- method definition in code
Foo(2, 3) <---- example of what would be executed at runtime
Foo(s, t) <---- example of what would be executed at runtime
I know the code would work if I changed Foo(s, t) to Foo("s", "t"), but I am trying to make the code smart enough to recognize the command without the quotes.
singledispatch might be an answer: it transforms a function into a generic function, which can behave differently depending on the type of its first argument.
You can see a concrete example at the link above. Note that you need to do some extra work if you want to do generic dispatch on more than one argument.
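Here is a minimal sketch of how that could look for the Foo above (registering handlers via type annotations requires Python 3.7+; the int and str behaviors are just illustrative):

from functools import singledispatch

@singledispatch
def foo(data1, data2):
    raise NotImplementedError("unsupported argument type")

@foo.register
def _(data1: int, data2):
    # chosen when the first argument is an int
    return data1 + data2

@foo.register
def _(data1: str, data2):
    # chosen when the first argument is a str
    return data1 + str(data2)

print(foo(2, 3))      # 5
print(foo("s", "t"))  # st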
I've been developing a Sudoku solver in Python, and the following question came up while trying to improve performance:
Does Python remember the result of a calculation if the same calculation has to be performed multiple times throughout the code? For example, compare the following two pieces of code:
# First version: get_single is called twice
if get_single(foo, bar) is not None:
    position = get_single(foo, bar)

# Second version: get_single is called once
single = get_single(foo, bar)
if single is not None:
    position = single
Are these two pieces of code equal in performance, or does the second one run faster because the calculation is performed only once?
No, Python does not remember function calls or other calculations automatically. In general, it would be very bad if it did: imagine if every call to, say, random.randrange(6) returned the same value as the first call.
However, it's not hard to explicitly make it remember calls for specific functions where it's useful. This is usually called "memoization".
See the lru_cache decorator in the docs for a nice example built into the stdlib.* All you have to do to make it remember every call to get_single(foo, bar) is change the definition of get_single like this:
import functools

@functools.lru_cache(maxsize=None)
def get_single(foo, bar):
    # etc.
Or, if get_single is someone else's code that you're importing and can't touch, you can just wrap it:
get_single = functools.lru_cache(maxsize=None)(othermod.get_single)
… and then call your wrapper instead of the module's version.
* Note that lru_cache was added in Python 3.2. If you're using 2.7 (or, for some reason, 3.0-3.1), you can install the backport from PyPI, or find any of dozens of other memoizing caches on PyPI or ActiveState. You could even copy the source into your own project, since the functools docs link to it (like many other stdlib modules, it is meant to double as example code). Although, IIRC, the 3.2 code needs a small change to work with 2.7, because it relies on nonlocal to hide its internals.
That being said, even if you know get_single is memoized, it's still not very good style to call it twice. If you only need to do this once, just write the three lines of code. If you need to do it repeatedly, write a wrapper function that wraps up those three lines of code; calling that function will then be shorter than even the two-line version.
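For instance, a sketch of such a wrapper (find_position is a hypothetical name; pick a fallback that fits your solver):

def find_position(foo, bar, default=None):
    # Call get_single exactly once and keep the result only when it is not None.
    single = get_single(foo, bar)
    return single if single is not None else default

position = find_position(foo, bar)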
I'm working on a project where I'm batch-generating XML files that can be imported into the IDE of an industrial touchscreen.
Each XML file represents a screen. Most screens require the same functions, and the process for dealing with them is the same, except that each screen type has a unique configuration function.
I'm using a ScreenType class to hold attributes specific to a screen type, so I decided to write a unique configuration function for each type and pass it as a parameter to the __init__() of this class. This way, when I pass my ScreenType around as needed, its configuration function stays bundled with it and can be used whenever needed.
But I'm not sure what will happen if my configuration function itself has a dependency. For example:
def configure_inputdiag(a, b, c):
    numerical_formatting = get_numerics(a)
    # ...
    return configured_object
Then, when it comes time to create an instance of a ScreenType
myscreentype = ScreenType(foo, man, shoe, configure_inputdiag)
get_numerics is a module-scoped function, but myscreentype could (and does) get passed around to other modules.
Does this create a problem with dependencies? I'd try to test it myself, but it seems I don't have a fundamental understanding of what's going on when I pass a function as a parameter, and I don't want to draw incorrect conclusions.
What I've tried: Googling and searching SO; I didn't find anything specific to Python.
Thanks in advance.
There's no problem.
The function configure_inputdiag will always refer to get_numerics in the context where it was defined. So, even if you call configure_inputdiag from some other module which knows nothing about get_numerics, it will work fine.
Passing a function as a parameter produces a reference to that function. Through that reference, you can call the function as if you had called it by name, without actually knowing the name (or the module from which it came). The reference is valid for the lifetime of the program, and will always refer to the same function. If you store the function reference, it basically becomes a different name for the same function.
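A tiny illustration of that last point (the names here are just placeholders):

def greet(name):
    return "Hello, " + name

say_hello = greet            # storing the reference: a second name for the same function
assert say_hello is greet    # both names refer to one function object
print(say_hello("world"))    # Hello, world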
What you are trying to do works in a very natural form in Python.
In the example above, you don't need to have the get_numerics function imported in the namespace (module) where configure_inputdiag lives; you just pass it in as a normal parameter (say, call it function), like this:
Module A:

import module_B

def get_numerics(parm):
    ...

input_diag = module_B.configure_inputdiag(get_numerics, a)

Module B:

def configure_inputdiag(function, parm):
    result = function(parm)
    ...

Oh, I see your doubt was the other way around. Anyway, there is no problem: in Python, functions are first-class objects, just like ints and strings, and they can be passed as parameters to other functions in other modules as you wish. I think the example above clarifies that.
get_numerics is resolved in the scope of the function body, so it does not also need to be in the scope of the caller.