/projects/mymath$ ls
__init__.py __init__.pyc mymath.py mymath.pyc tests
and under the directory tests I have
/projects/mymath/tests/features$ ls
steps.py steps.pyc zero.feature
I tried to import my factorial function
import sys
sys.path.insert(0, "../../")
#import mymath
from mymath.MyMath import factorial
But it said No module named MyMath.
Here is my dummy MyMath class.
class MyMath(object):
    def factorial(self, n):
        if n <= 1:
            return 1
        else:
            return n * factorial(n - 1)
So what's wrong? And is editing sys.path like this even good practice? Thanks.
This will work: import mymath
You can't import a function out of a class. You want to either import the class itself (from mymath.mymath import MyMath) or put the function at module level and do from mymath.mymath import factorial.
From what I can tell, the error is right. There is no module named mymath.MyMath. There is a module named mymath.mymath, though...
To be explicit, when you create a folder and put an __init__.py file in it, you've created a package. If your __init__.py file is empty, then you still have to explicitly import the modules in the package. So you have to do import mymath.mymath to import the mymath module into your namespace. Then you can access the things you want via mymath.mymath.MyMath, and so on. If you want to import the class directly, you have to do this:
from mymath.mymath import MyMath
And as someone else has already explained, you can't import a method from a class. You have to import the whole class.
One problem is that your import is wrong. You have a package called mymath, and a module in it called mymath. In that module is a class. That class is the most you can import:
>>> from mymath.mymath import MyMath
>>> myMathObject = MyMath()
>>> myMathObject.factorial(5)
120
The other problem is that your second call to factorial should call factorial on self; otherwise it will try to look it up as a free function in the module, which won't work. Try it!
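A corrected version of the method might look like this (a minimal sketch, keeping the asker's class; the import fix above still applies):

class MyMath(object):
    def factorial(self, n):
        if n <= 1:
            return 1
        else:
            return n * self.factorial(n - 1)  # recurse via self

MyMath().factorial(5) then returns 120, matching the session above.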
Here is my dummy MyMath class.
Python is not Java; the class is not any kind of organizational unit here, but simply a blueprint for data types. There is no reason to define a class here (which would then have to be instantiated anyway); just define the factorial function as a function.
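A minimal sketch of that suggestion, using the same package layout as above:

# mymath/mymath.py
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)

# e.g. in the test code
from mymath.mymath import factorial
assert factorial(5) == 120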
Related
I was facing an import error (ImportError: cannot import name 'ClassB') in the following code:
dir structure:
main.py
test_pkg/
    __init__.py
    a.py
    b.py
main.py:
from test_pkg import ClassA, ClassB
__init__.py:
from .a import ClassA
from .b import ClassB
a.py:
from test_pkg import ClassB
class ClassA:
    pass
b.py:
class ClassB:
    pass
In the past I fixed it with a quick 'experiment': using the full dotted name in the import in a.py:
from test_pkg.b import ClassB
class ClassA:
    pass
I have read about the import machinery, and according to the docs:
This name will be used in various phases of the import search, and it may be the dotted path to a submodule, e.g. foo.bar.baz. In this case, Python first tries to import foo, then foo.bar, and finally foo.bar.baz. (from the import system documentation)
I was expecting it to fail again, because it will try to import test_pkg during the test_pkg import, but it works. My question is why?
Also, two additional questions:
Is it a proper approach to have cross-module dependencies?
Is it OK to have modules imported in the package's __init__.py?
My analysis:
Based on my reading, I suspect the issue is that, because of
__init__.py:
from .a import ClassA
from .b import ClassB
the ClassA and ClassB imports are executed as part of the test_pkg import, which then hits the import statement in a.py:
a.py:
from test_pkg import ClassB
class ClassA:
    pass
and it fails because a circular dependency occurs.
But it works when ClassB is imported using:
from test_pkg.b import ClassB
and according to my understanding it shouldn't, because:
This name will be used in various phases of the import search, and it
may be the dotted path to a submodule, e.g. foo.bar.baz. In this case,
Python first tries to import foo, then foo.bar, and finally
foo.bar.baz. If any of the intermediate imports fail, a
ModuleNotFoundError is raised.
So I was expecting the same behavior for both imports.
It looks like the import with the full dotted path does not trigger the problematic test_pkg import process:
from test_pkg.b import ClassB
The import that actually fails is the one in a.py: from test_pkg import ClassB. When main.py imports test_pkg, Python starts running test_pkg/__init__.py, which begins with from .a import ClassA; at that point the package object exists, but its __init__.py hasn't reached from .b import ClassB yet, so ClassB is not an attribute of test_pkg. The symptom is exactly what you're seeing: ImportError: cannot import name 'ClassB'.
The fix is to import from the submodule instead: from test_pkg.b import ClassB (or, better, the relative from .b import ClassB).
Note that in a failed from … import statement, Python can't tell whether you were failing to import a module from a package or a global name from a module, which is why the message says cannot import name 'ClassB' instead of No module named 'ClassB'.
So, why does this work?
from test_pkg.b import ClassB
I was expecting it to fail again, because it will try to import test_pkg during the test_pkg import, but it works. My question is why?
Because importing test_pkg isn't the problem in the first place; importing the name ClassB out of the half-initialized test_pkg is. And you solved that problem by importing test_pkg.b instead.
test_pkg.b is successfully found in test_pkg/b.py.
And then, test_pkg.b.ClassB is found within that module and imported into your module, because of course there's a class ClassB: statement in b.py.
For your followup questions:
is it proper approach to have cross modules dependencies?
There's nothing wrong with cross-module dependencies as long as they aren't circular, and yours aren't.
It's perfectly fine for test_pkg.a to import test_pkg.b with an absolute import, like from test_pkg import b.
However, it's usually better to use a relative import, like from . import b (unless you need dual-version code that works the same on Python 2.x and 3.x). PEP 328 explains the reasons why relative imports are usually better for intra-package dependencies (and why it's only "usually" rather than "always").
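For example, test_pkg/a.py could use the relative form like this (a sketch, assuming Python 3; like the test_pkg.b fix, it imports the submodule directly instead of relying on the package's __init__.py having finished):

# test_pkg/a.py
from . import b

class ClassA:
    def make_b(self):  # hypothetical helper, just for illustration
        return b.ClassB()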
is it ok to have modules imported in package __init__.py?
Yes. In fact, this is a pretty common idiom, used for multiple purposes.
For example, see asyncio.__init__.py, which imports all of the public exports from each of its submodules and re-exports them. There are a handful of rarely-used names in the submodules, which don't start with _ but aren't included in __all__, and if you want to use those you need to import the submodule explicitly. But everything you're likely to need in a typical program is included in __all__ and re-exported by the package, so you can just write, e.g., asyncio.Lock instead of asyncio.locks.Lock.
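A quick way to see that re-export in action (Python 3):

import asyncio

# the package __init__ re-exports the class, so both names refer to the same object
assert asyncio.Lock is asyncio.locks.Lock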
The code I would prefer to write is probably:
b.py:
class ClassB:
    pass
a.py:
from .b import ClassB

class ClassA:
    pass
__init__.py:
from .a import ClassA
from .b import ClassB
main.py:
from test_pkg import ClassA, ClassB
It is proper to have cross-module dependencies, as long as they are not circular. You should figure out the hierarchy of your project and arrange your dependency graph as a DAG (directed acyclic graph).
What you put in the package's __init__.py is what you can access through the package. You can also refer to this question for the use of __all__ in __init__.py.
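For example, a package __init__.py along these lines (module and class names here are made up for illustration):

# mypkg/__init__.py
from .engine import Engine
from .wheels import Wheel

# what "from mypkg import *" exports
__all__ = ['Engine', 'Wheel']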
Can someone explain this to me?
When you import Tkinter.Messagebox, what does that actually mean (dot notation)?
I know that you can import Tkinter, but when you import Tkinter.Messagebox, what actually is this? Is it a class inside a class?
I am new to Python and dot notation confuses me sometimes.
When you're putting that dot in your imports, you're referring to something inside the package/file you're importing from.
What you import can be a class, a package, or a module (a file); each time you put a dot, you are asking for something that lives inside the thing before it.
parent/
    __init__.py
    file.py
    one/
        __init__.py
        anotherfile.py
    two/
        __init__.py
    three/
        __init__.py
For example, given this layout, when you write import parent.file you're actually importing another Python module that may contain classes and variables, so to refer to a specific variable or class inside that file you do from parent.file import SomeClass, for example.
This can go further: importing a package inside another package, or a class inside a file inside a package, and so on (like import parent.one.anotherfile).
For more info, read the Python documentation on the import system.
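For instance, with the layout above (some_function is a hypothetical name defined in anotherfile.py), both of these reach the same object:

import parent.one.anotherfile
parent.one.anotherfile.some_function()

# or bind the name directly into the current namespace
from parent.one.anotherfile import some_function
some_function()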
import a.b imports b into the namespace a; you can access it as a.b. Be aware that this only works if b is a module or package (e.g. import urllib.request in Python 3).
from a import b, however, imports b into the current namespace, accessible as b. This works for classes, functions, etc.
Be careful when using from ... import:
from math import sqrt
from cmath import sqrt
Both statements import the function sqrt into the current namespace; however, the second import statement overrides the first one.
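If you need both, keep the module prefix or rename one of them with as, for example:

import math
import cmath

print(math.sqrt(4))    # 2.0
print(cmath.sqrt(-1))  # 1j

# or:
from math import sqrt
from cmath import sqrt as csqrt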
My question is very similar to this one, but in my case the accepted answer does not decorate all the functions in the package when they are used within the package, and I'm not sure why.
For example, I have a project set up like this:
project/
    package/
        __init__.py
        module_a.py
        module_b.py
    main.py
__init__.py
from .module_a import *
from .module_b import *
import types
# This is the decorator that will be used
from functools import lru_cache
for name, obj in list(globals().items()):
    if isinstance(obj, types.FunctionType):
        globals()[name] = lru_cache(maxsize=100)(obj)
module_a.py
from .module_b import b
def a(arg):
    return b
module_b.py
def b(arg):
    return arg
main.py
import package
print(package.a.cache_info())
print(package.a(None).cache_info())
print(package.b.cache_info())
When the package is imported, __init__.py decorates the functions in globals just fine when stepping through the code. However, if I execute main.py I get the following error:
C:\Users\pbreach\Anaconda3\python.exe C:/Users/pbreach/PycharmProjects/project/main.py
CacheInfo(hits=0, misses=0, maxsize=100, currsize=0)
Traceback (most recent call last):
  File "C:/Users/pbreach/PycharmProjects/project/main.py", line 4, in <module>
    print(package.a(None).cache_info())
AttributeError: 'function' object has no attribute 'cache_info'
This would mean that b is not decorated when it is imported from module_b into module_a.
Why does this only happen in the second line? What might be a way to accomplish this?
I am fine with doing the decorating either during import in __init__.py or in main.py, but would rather not have to apply the decorator inside each module in package as in my case there are quite a few of them.
EDIT:
I think the issue is that globals() in __init__.py is a different namespace from the one b is imported into in module_a, meaning module_a keeps its own reference to the original, undecorated function. Is there a way around this?
You are importing b from module_b in module_a before you get a chance to decorate it with functools.lru_cache.
The only viable alternative I see is to first explicitly decorate the functions that get imported and used in other submodules, and then apply the decorator to all other functions.
Using your example, first decorate b from module_b and then decorate the rest:
from package import module_b
import types
# This is the decorator that will be used
from functools import lru_cache
module_b.b = lru_cache(maxsize=100)(module_b.b)
from .module_a import *
from .module_b import *
for name, obj in list(globals().items()):
    if isinstance(obj, types.FunctionType):
        globals()[name] = lru_cache(maxsize=100)(obj)
Another option, as I stated in a comment, would be to use an if clause inside the modules that contain functions which get imported into other modules, so that the wrapping occurs there.
So in module_b.py you could perform something like this:
if __name__ != '__main__':
    from functools import lru_cache
    b = lru_cache(maxsize=100)(b)
This only skips the case where the module is run as __main__. Now, when another module imports this module and its body gets executed, the wrapping is performed here instead of in __init__.py.
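Putting that together, module_b.py as a whole might look like this sketch:

# package/module_b.py
def b(arg):
    return arg

if __name__ != '__main__':
    # wrap b only when the module is imported, not when it is run directly
    from functools import lru_cache
    b = lru_cache(maxsize=100)(b)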
Quick demonstration of the problem:
Define a module named functions.py containing the following (this is just an example):
from __future__ import division
from scipy import *
def a():
    return sin(1)
Now try to use it in a separate file as follows:
import functions as f
If you type f. the autocomplete pop-up list shows all the scipy contents, while it should show only a!
How to solve this?
This problem makes it difficult to see what actually exists in the functions module.
Note:
1) Assume that the functions module will have numerous user-defined functions.
2) There is no problem with the scipy functions being available inside the module; that's OK. The problem is: why do they appear as members of f?
Perhaps it is a bug in PyScripter; I am not sure!
Your functions module will contain references to everything imported at its top level, as well as everything declared inside its scope.
As you're doing from scipy import * inside the module it will import everything, as it well should, and you will be able to access all the scipy functions from your functions module.
If you only wanted to see a() after importing functions, change it to this
# functions.py
def a():
    from scipy import sin
    return sin(1)
which will ensure that no references to your import will exist for the person importing your module.
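A quick check of what the module exposes after that change (dunder names filtered out):

import functions as f

print([name for name in dir(f) if not name.startswith('_')])
# ['a']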
Here's some reading on imports you can go through.
First of all, you should avoid import *; your code should look like this:
from __future__ import division
from scipy import sin
def a():
    return sin(1)
or
from __future__ import division
import scipy
def a():
    return scipy.sin(1)
Another alternative is to add this to your module:
__all__ = ['a']
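For example (a sketch): __all__ controls what from functions import * exports, and completion tools that respect it should list only those names:

# functions.py
from __future__ import division
from scipy import *

__all__ = ['a']

def a():
    return sin(1)

Note that dir(functions) will still show everything that was star-imported; __all__ only affects from functions import * and tools that honor it.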
I have a package in my project containing many *.py source files (each consisting of one class in most cases, and named by the class). I would like it so that when this package is imported, all of the files in the package are also imported, so that I do not have to write
import Package.SomeClass.SomeClass
import Package.SomeOtherClass.SomeOtherClass
import ...
just to import every class in the package. Instead I can just write
import Package
and every class is available in Package, so that later code in the file can be:
my_object = Package.SomeClass()
What's the easiest way of doing this in __init__.py? If different, what is the most general way of doing this?
The usual method is, inside package/__init__.py:
from .SomeClass import SomeClass
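With one such line per module, each class is then reachable directly on the package, e.g.:

import Package
my_object = Package.SomeClass()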
As Winston put it, the thing to do is to have an __init__.py file where all your classes are available in the module (global) namespace.
One way to do it is to have a
from .myclasfile import MyClass
line for each class in your package, and that is not bad.
But, of course, this being Python, you can "automagic" this by doing something like this in __init__.py
import glob
import importlib
import os

# import each module next to this __init__.py and pull out its same-named class
for _path in glob.glob(os.path.join(os.path.dirname(__file__), "*.py")):
    class_name = os.path.splitext(os.path.basename(_path))[0]
    if class_name != "__init__":
        mod = importlib.import_module("." + class_name, __name__)
        globals()[class_name] = getattr(mod, class_name)
del glob, importlib, os
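An alternative sketch uses pkgutil instead of glob, which avoids building filesystem paths by hand; it still assumes each module defines a class with the same name as the module:

# package/__init__.py
import importlib
import pkgutil

for _finder, _name, _is_pkg in pkgutil.iter_modules(__path__):
    _mod = importlib.import_module("." + _name, __name__)
    globals()[_name] = getattr(_mod, _name)

del importlib, pkgutil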