I have created the following constructor:
class Analysis:
    def __init__(self, file_list, tot_col, tot_rows):
        self.file_list = file_list
        self.tot_col = tot_col
        self.tot_rows = tot_rows
I then have the method full_analysis() call calc_total_rows() from the same file:
    def full_analysis(self):
        """Currently runs all the analysis methods"""
        print('Analysing file...\n' +
              '----------------------------\n')
        calc_total_rows()
From another file I call full_analysis(); however, an error occurs saying that calc_total_rows() is not defined, even though that method is defined just below it.
I'm inexperienced with Python; I tried rearranging the code and adding 'self' in various places, to no avail.
The other file does satisfy the constructor's requirements, and if I remove the calc_total_rows() call, the print line runs. However, I do not want to call each method individually; I would like to call a single method which runs them all.
If calc_total_rows is an instance method as your question implies, then you need to call self.calc_total_rows() from within full_analysis. Unlike some other languages, Python does not have implicit instance references within method scope; you have to explicitly retrieve the member method from self.
I wish I had found this sooner.
In order to solve this, I had to use self in front of the method.
In my example:
    def full_analysis(self):
        """Currently runs all the analysis methods"""
        print('Analysing file...\n' +
              '----------------------------\n')
        self.calc_total_rows()
This works.
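For completeness, here is a minimal, self-contained sketch of the whole class with both methods reached through self. The body of calc_total_rows and the file names are made up for illustration, since the original wasn't shown:

class Analysis:
    def __init__(self, file_list, tot_col, tot_rows):
        self.file_list = file_list
        self.tot_col = tot_col
        self.tot_rows = tot_rows

    def full_analysis(self):
        """Currently runs all the analysis methods"""
        print('Analysing file...\n' +
              '----------------------------\n')
        self.calc_total_rows()  # instance methods must be reached via self

    def calc_total_rows(self):
        """Hypothetical implementation: count the rows across all files."""
        self.tot_rows = 0
        for path in self.file_list:
            with open(path) as f:
                self.tot_rows += sum(1 for _ in f)
        print('Total rows:', self.tot_rows)

# usage
analysis = Analysis(['data1.csv', 'data2.csv'], tot_col=0, tot_rows=0)
analysis.full_analysis()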
Related
I'm currently writing code in Python 2.7, which involves creating an object, in which I have two class methods and other regular methods. I need to use this specific combination of methods because of the larger context of the code I am writing- it's not relevant to this question, so I won't go into depth.
Within my __init__ function, I am creating a Pool (a multiprocessing object). In the creation of that, I call a setup function. This setup function is a @classmethod. I define a few variables in this setup function by using the cls.variablename syntax. As I mentioned, I call this setup function within my __init__ function (inside the Pool creation), so these variables should be getting created, based on what I understand.
Later in my code, I call a few other functions, which eventually leads to me calling another @classmethod within the same object I was talking about earlier (the same object as the first @classmethod). Within this @classmethod, I try to access the cls.variables I created in the first @classmethod. However, Python is telling me that my object doesn't have an attribute "cls.variable" (using general names here; obviously my actual names are specific to my code).
ANYWAYS...I realize that's probably pretty confusing. Here's some (very) generalized code example to illustrate the same idea:
from multiprocessing import Pool

class General(object):
    def __init__(self, A):
        # this is correct syntax based on the resources I'm using,
        # so the format of the argument isn't the issue, in case anyone
        # initially thinks that's the issue
        self.pool = Pool(processes=4, initializer=self._setup, initargs=(A, ))

    @classmethod
    def _setup(cls, A):
        cls.A = A

    # leaving out other functions here that are NOT class methods, just regular methods

    @classmethod
    def get_results(cls):
        print cls.A
The error I'm getting when I get to the equivalent of the print cls.A line is this:
AttributeError: type object 'General' has no attribute 'A'
Edit to show usage of this code:
The way I'm calling it in my code is as follows:
G = General(5)
G.get_results()
So, I'm creating an instance of the object (in which I create the Pool, which calls the setup function), and then calling get_results.
What am I doing wrong?
The reason General.A does not get defined in the main process is that multiprocessing.Pool only runs General._setup in the worker subprocesses; it is never called in the main process (where you create the Pool).
You end up with 4 worker processes, in each of which General.A is defined, but it is never set in the main process. That is also not really what a Pool initializer is meant for (see this answer to the question How to use initializer to set up my multiprocess pool?).
What you may actually want is an object pool, which is not natively implemented in Python. There is a Python Implementation of the Object Pool Design Pattern question here on Stack Overflow, but you can find plenty more by searching online.
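If all you need is for General.A to be visible in the parent process as well, one hedged sketch (not your original code; it simply calls the same classmethod in the main process before handing it to the Pool) would be:

from multiprocessing import Pool

class General(object):
    def __init__(self, A):
        # Set the class attribute in the main process too; the Pool's
        # initializer only runs inside the worker processes.
        General._setup(A)
        self.pool = Pool(processes=4, initializer=self._setup, initargs=(A, ))

    @classmethod
    def _setup(cls, A):
        cls.A = A

    @classmethod
    def get_results(cls):
        print cls.A  # Python 2.7 print statement, as in the question

if __name__ == '__main__':
    G = General(5)
    G.get_results()  # now prints 5 in the main process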
Let me explain:
I would like to know how I can pass a method or a function as an argument.
For example, in Python it would be:
from MyFile import MyClass
MyClass().my_method_click(function) # without parentheses
In this example, in Python you pass the function or method without
parentheses. If instead I do:
from MyFile import MyClass
MyClass().my_method_click(function()) # with parentheses
I call the function rather than passing it.
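A minimal, self-contained Python sketch of that difference (MyClass and my_method_click here are hypothetical stand-ins, since the real ones aren't shown):

class MyClass:
    def my_method_click(self, callback):
        # invoke the callable that was passed in
        callback()

def function():
    print("button clicked")

MyClass().my_method_click(function)      # passes the function itself; prints "button clicked"
# MyClass().my_method_click(function())  # would call function() immediately and pass its result (None)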
In Ruby, when you call a method or function, you can do it with or
without parentheses.
If I do this in Ruby:
require_relative "MyClass"
MyClass.new.my_method_click(function) # without parentheses
it just calls the method instead of passing it.
Of course, this is for a button: when I click it, it should run this operation.
How can I do this in Ruby?
Thanks!
Basically, you want to pass a runnable block of code. I haven't looked into Python yet, but I am sure it supports closures as well.
Anyhow, in Ruby, a "general" way of passing runnable code is to use blocks (lambdas and procs).
function = lambda { puts "your code here" }
# pass the lambda to the method as its block with `&`
MyClass.new.my_method_click(&function)

# or a shorter way
MyClass.new.my_method_click(&-> { puts "your code here" })

# to run the block inside the receiving method
def my_method_click(&block)
  # you can either `yield` from your receiving method
  yield
  # or call the `call` method on your lambda/proc instance
  block.call
end
You can also get hold of one of your class's methods as a Method object, or create a new one using Method.new. But you'd end up dealing with bindings and binding to the correct instance types, etc., so it's much easier with lambdas and procs.
I'm just starting to learn Python and I have the following problem.
Using a package with method "bind", the following code works:
def callback(data):
    print data

channel.bind(callback)
but when I try to wrap this inside a class:
class myclass:
    def callback(data):
        print data
    def register_callback:
        channel.bind(self.callback)
the callback method is never called. I tried both "self.callback" and just "callback". Any ideas?
It is not clear to me how your code works, as (1) you did not post the implementation of channel.bind, and (2) your second example is incorrect in the definition of register_callback (it is using a self argument that is not part of the list of parameters of the method, and it lacks parentheses).
Nevertheless, remember that methods usually require a "self" parameter, which is implicitly passed every time you run self.function(), as this is converted internally to a function call with self as its first parameter: function(self, ...). Since your callback has just one argument data, this is probably the problem.
You cannot declare a method bind that is able to accept either a function or a class method (the same problem happens with every OOP language I know: C++, Pascal...).
There are many ways to do this, but, again, without a self-contained example that can be compiled, it is difficult to give suggestions.
You need to pass the self object as well:
def register_callback(self):
    channel.bind(self.callback)
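Since the real channel object isn't shown, here is a hypothetical stub just to illustrate how bind typically invokes whatever callable it is given, and why the bound method supplies self automatically while data comes from the caller:

class Channel(object):
    """Hypothetical stand-in for the package's channel object."""
    def __init__(self):
        self._callback = None
    def bind(self, callback):
        self._callback = callback
    def fire(self, data):
        self._callback(data)

channel = Channel()

class MyClass(object):
    def callback(self, data):        # self is filled in automatically for a bound method
        print 'got:', data
    def register_callback(self):
        channel.bind(self.callback)  # pass the bound method, not the bare function

MyClass().register_callback()
channel.fire('my_data')              # prints "got: my_data"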
What you're doing is entirely possible, but I'm not sure exactly what your issue is, because your sample code as posted is not even syntactically valid. (The second method has no argument list whatsoever.)
Regardless, you might find the following sample code helpful:
def send_data(callback):
    callback('my_data')

def callback(data):
    print 'Free function callback called with data:', data

# The following prints "Free function callback called with data: my_data"
send_data(callback)

class ClassWithCallback(object):
    def callback(self, data):
        print 'Object method callback called with data:', data
    def apply_callback(self):
        send_data(self.callback)

# The following prints "Object method callback called with data: my_data"
ClassWithCallback().apply_callback()
# Indeed, the following does the same
send_data(ClassWithCallback().callback)
In Python it is possible to use free functions (callback in the example above) or bound methods (self.callback in the example above) in more or less the same situations, at least for simple tasks like the one you've outlined.
I want to invoke a method in a superclass that does some subclass-specific processing before continuing on. I came to Python recently from C#; there, I'd probably use an interface. Here's the gist of it (as I picture it, but it's not working):
class superClass:
    def do_specific_stuff(self): # To be implemented entirely by the subclass,
                                 # but called from the superclass
        pass
    def do_general_stuff1(self):
        pass # do misc
    def do_general_stuff2(self):
        pass # do more misc
    def main_general_stuff(self):
        do_general_stuff1()
        do_specific_stuff()
        do_general_stuff2()
I have a rather complicated implementation of this; this example is exactly what I need and far less painful to understand for a first-time viewer. Calling do_specific_stuff() at the moment gives me the error
global name 'do_specific_stuff' is not defined
and when I add 'self', as in self.do_specific_stuff(), I get the error
TypeError: do_specific_stuff() takes 0 positional arguments but 1 was given
Any takers? Thanks in advance...
It needs to be
def main_general_stuff(self):
    self.do_general_stuff1()
    self.do_specific_stuff()
    ...
The problem is that you are missing the explicit reference to self: Python thinks you mean a global function without it. Note that there is no implicit this like in Java: You need to specify it.
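As a hedged sketch of how this usually comes together (the subclass name and the print statements are made up for illustration), the superclass calls self.do_specific_stuff() and the subclass overrides it:

class SuperClass:
    def do_specific_stuff(self):
        # to be implemented entirely by the subclass
        raise NotImplementedError

    def do_general_stuff1(self):
        print('general stuff 1')

    def do_general_stuff2(self):
        print('general stuff 2')

    def main_general_stuff(self):
        self.do_general_stuff1()
        self.do_specific_stuff()   # dispatches to the subclass override
        self.do_general_stuff2()


class SubClass(SuperClass):
    def do_specific_stuff(self):
        print('subclass-specific stuff')


SubClass().main_general_stuff()
# general stuff 1
# subclass-specific stuff
# general stuff 2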
I am using a block like this:
import functools
import xmlrpclib

def served(fn):
    def wrapper(*args, **kwargs):
        p = xmlrpclib.ServerProxy(SERVER, allow_none=True)
        return p.__getattr__(fn.__name__)(*args, **kwargs)  # do the function call
    return functools.update_wrapper(wrapper, fn)

@served
def remote_function(a, b):
    pass
to wrap a series of XML-RPC calls into a python module. The "served" decorator gets called on stub functions to expose operations on a remote server.
I'm creating stubs like this with the intention of being able to inspect them later for information about the function, specifically its arguments.
As listed, the code above does not transfer argument information from the original function to the wrapper. If I inspect with inspect.getargspec( remote_function ) then I get essentially an empty list, instead of args=['a','b'] that I was expecting.
I'm guessing I need to give additional direction to the functools.update_wrapper() call via the optional assigned parameter, but I'm not sure exactly what to add to that tuple to get the effect I want.
The name and the docstring are correctly transferred to the new function object, but can someone advise me on how to transfer argument definitions?
Thanks.
Previous questions here and here suggest that the decorator module can do this.
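For example, here is a sketch of the same stub written with the third-party decorator package (this assumes that package is installed, and SERVER is whatever constant your module already defines); the package rebuilds the wrapper with the original signature, so the argspec survives:

import inspect
import xmlrpclib

from decorator import decorator

@decorator
def served(fn, *args, **kwargs):
    # forward the call to the remote server, as in the original stub
    p = xmlrpclib.ServerProxy(SERVER, allow_none=True)
    return getattr(p, fn.__name__)(*args, **kwargs)

@served
def remote_function(a, b):
    pass

print inspect.getargspec(remote_function).args  # ['a', 'b']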