I have two different files, containing two different classes (Let's call them Foo and Bar).
In the file (Foo.py) with class Foo, I have:
class Foo:
    def __init__(self):
        ...
And in the file (Bar.py) with class Bar, I have:
from Foo import Foo

class Bar(Foo.Foo):
    def __init__(self):
        ...
When I run my code, I get this TypeError:
Traceback (most recent call last):
  File "Bar.py", line 2, in <module>
    class Bar(Foo.Foo):
TypeError: Error when calling the metaclass bases
    __init__() takes exactly 1 argument (4 given)
Why is it telling me that __init__() was given 4 arguments when the only argument in my code is self?
It looks to me like your problem is that you are importing the class directly, but then trying to access it via the module name.
If you have a class Foo inside a module source file foo.py then you can use it like this:
import foo
new_instance = foo.Foo()
You can also do this:
from foo import Foo
new_instance = Foo()
But you are trying to do this:
from Foo import Foo
new_instance = Foo.Foo()
In the expression Foo.Foo(), the first Foo is your class; after the ., Python then looks for a class member named Foo inside it.
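A minimal sketch of the fix, assuming you keep the from-import style:

# Bar.py - sketch; since Foo already names the class, inherit from it directly
from Foo import Foo

class Bar(Foo):
    def __init__(self):
        Foo.__init__(self)  # chain to the parent initializer
        ...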
Note: I suggest you follow the PEP 8 guidelines; module names should be lower-case, so foo.py rather than Foo.py.
http://legacy.python.org/dev/peps/pep-0008/
I want to monkey patch a method of a library class to define a different default for a param. This fails:
from functools import partial

class A(object):
    def meth(self, foo=1):
        print(foo)

A.meth = partial(A.meth, foo=2)

a = A()
a.meth()
with:
Traceback (most recent call last):
  File "...", line 10, in <module>
    a.meth()
TypeError: meth() missing 1 required positional argument: 'self'
What is the correct way of doing this?
(The original code uses getattr on the method names in a loop.)
The answers in the linked question involve defining a new module-level function; I would like to avoid a new function definition.
Use partialmethod:
In [32]: from functools import partialmethod
In [33]: A.meth = partialmethod(A.meth, foo=2)
In [34]: a = A()
In [35]: a.meth()
2
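A plain partial object does not implement the descriptor protocol, so accessing it through an instance never binds self; partialmethod exists precisely to add that binding step. A minimal self-contained sketch, using the names from the question:

from functools import partialmethod

class A(object):
    def meth(self, foo=1):
        print(foo)

# partialmethod supports the descriptor protocol, so a.meth still binds
# self before filling in foo=2; a plain partial object would not.
A.meth = partialmethod(A.meth, foo=2)

a = A()
a.meth()       # prints 2
a.meth(foo=5)  # the patched default can still be overridden; prints 5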
Given a trivial Python package with an __init__.py:
$ ls -R foo/
foo/:
__init__.py bar.py
$ cat foo/bar.py
def do_stuff(): pass
$ cat foo/__init__.py
from .bar import *
I'm surprised that foo.bar is defined:
>>> import foo
>>> foo.bar
<module 'foo.bar' from 'foo/bar.pyc'>
My understanding of from x import * is that it doesn't define x in the current scope. For example:
>>> from abc import *
>>> abc
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'abc' is not defined
Why is foo.bar defined in my first example, even though I don't have import bar inside __init__.py?
When you reference foo.bar, you are not referencing the bar name bound by the import statement in __init__.py; you are referencing the bar submodule itself. Importing a submodule (which from .bar import * does as its first step) binds that submodule as an attribute of its parent package. Even with an empty __init__.py, foo.bar would still work once the submodule had been imported somewhere, e.g. via import foo.bar.
If that weren't the case, you wouldn't be able to do something like this:
import foo.bar
Since foo is a package (it contains an __init__.py file), its submodules can be referenced this way.
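A quick way to see the binding, as a sketch assuming the foo/bar layout above:

import sys
import foo  # runs __init__.py, whose 'from .bar import *' imports foo.bar first

# Importing a submodule registers it in sys.modules and also binds it as an
# attribute on the parent package object.
print(sys.modules['foo.bar'])             # <module 'foo.bar' ...>
print(foo.bar is sys.modules['foo.bar'])  # True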
I have a Python file file1 which has the code below:
class myClass():
    def __init__(self):
        self._variable1 = 2
Now how can I access _variable1 from another Python file?
from file1 import myClass
class = myClass()
class._variable1 ??
How can I access these variables?
Everything there is fine except using the name class for the variable; class is a reserved keyword in Python.
Try this:
from file1 import myClass
cl = myClass()
cl._variable1
Never use Python keywords as variable names; you can list them all like this:
import keyword
keyword.kwlist
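You can also test a single name with keyword.iskeyword, for example:

import keyword

print(keyword.iskeyword('class'))  # True - reserved, not usable as a name
print(keyword.iskeyword('cl'))     # False - fine as a variable name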
For more details, see lexical_analysis in the Python docs.
I am developing on a larger project which has a class with docstring tests for each method. Each docstring contains a few examples/tests. The docstrings frequently use a function from another module, and I would like to import it once and have it available in every docstring test.
For example, if this is tests.py
class A(object):
    def test1(self):
        """
        >>> myfunc()
        1
        """
        pass

    def test2(self):
        """
        >>> myfunc()
        1
        """
        pass
And this is funcs.py
from tests import A
# Do stuff with A

def myfunc():
    return 1
I would like to avoid modifying the above code into this:
class A(object):
    def test1(self):
        """
        >>> from funcs import myfunc
        >>> myfunc()
        1
        """
        pass

    def test2(self):
        """
        >>> from funcs import myfunc
        >>> myfunc()
        1
        """
        pass
And instead do something like a class-level module import for the docstrings. I also can't simply import the function at the top of the module, because in my case that would create a circular dependency.
The doctests are invoked with python -m doctest tests.py, which produces this error output:
File "tests.py", line 4, in tests.A.test
Failed example:
myfunc()
Exception raised:
Traceback (most recent call last):
File "/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/doctest.py", line 1315, in __run
compileflags, 1) in test.globs
File "<doctest tests.A.test[0]>", line 1, in <module>
myfunc()
NameError: name 'myfunc' is not defined
********************************************
1 items had failures:
1 of 1 in tests.A.test
***Test Failed*** 1 failures.
It succeeds with the version of the test code that includes the imports.
For anyone wondering why I might want to do this: my real-world code is at https://github.com/EntilZha/ScalaFunctional/blob/master/functional/pipeline.py. The function I want to import is seq, since it is an entrypoint alias to the class Sequence. I would prefer to use seq in the docstrings because they double as documentation examples, seq has additional behavior, and I want to start running them as my test suite to make sure they stay up to date.
From the doctest documentation: "By default, each time doctest finds a docstring to test, it uses a shallow copy of M's globals".
All you have to do is make sure that myfunc is present in the module's globals, perhaps by adding
from funcs import myfunc
to the top of the file.
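If a module-level import really would be circular in your project, one alternative is to run the doctests programmatically and inject the name through the extraglobs parameter of doctest.testmod. A sketch, in a hypothetical run_doctests.py, using the names from the question:

# run_doctests.py - assumes tests.py and funcs.py as shown above
import doctest
import funcs
import tests

# extraglobs merges extra names into the globals of every doctest in the
# module, so myfunc is visible without an import line in each docstring.
doctest.testmod(tests, extraglobs={'myfunc': funcs.myfunc})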
I want to implement a plugin based file uploader which can upload files to different services. It loads all the python modules from a directory and then calls them based on the service to upload to.
I have a simple BaseHandler which is just an abstract base class for all plugins
import abc

class BaseHandler():
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def start(self, startString):
        return
I have a simple plugin which inherits from BaseHandler
from BaseHandler import BaseHandler

class Cloud(BaseHandler):
    def start(self, startString):
        return
And the actual code which loads plugins and calls them
import logging
import os
import sys
from BaseHandler import BaseHandler

all_plugins = {}

def load_plugins():
    plugin_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), "Handlers")
    plugin_files = [x[:-3] for x in os.listdir(plugin_dir) if x.endswith(".py")]
    sys.path.insert(0, plugin_dir)
    for plugin in plugin_files:
        mod = __import__(plugin)
    logging.info('Plugins have been loaded from the directory ' + plugin_dir)
    for plugin in BaseHandler.__subclasses__():
        logging.info('Plugin:' + plugin.__name__)
    return BaseHandler.__subclasses__()

logging.basicConfig(level=logging.DEBUG)
loadedPlugins = load_plugins()
for plugin in loadedPlugins:
    all_plugins[plugin.__name__] = plugin.__class__
    handle = all_plugins[plugin.__name__]()
When I try to create the actual object of the plugin in the last line of the script
handle = all_plugins[plugin.__name__]()
I get an error TypeError: __new__() takes exactly 4 arguments (1 given).
Edit: added the full traceback:
Traceback (most recent call last):
  File "C:\TestCopy\Test.py", line 24, in <module>
    handle = all_plugins[plugin.__name__]()
TypeError: __new__() takes exactly 4 arguments (1 given)
You are registering the metaclass, not the plugin class itself;
>>> BaseHandler()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class BaseHandler with abstract methods start
I think you meant to store the plugin itself:
all_plugins[plugin.__name__] = plugin
The __class__ attribute of a plugin class is its metaclass (abc.ABCMeta here), which is why calling it with no arguments fails with the __new__() error; the plugin objects returned by __subclasses__() are themselves classes, not instances.
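A minimal corrected version of the registration loop, for illustration:

for plugin in loadedPlugins:
    # plugin is already the class object returned by __subclasses__(),
    # so store it directly and call it to create an instance.
    all_plugins[plugin.__name__] = plugin
    handle = all_plugins[plugin.__name__]()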