Web.py NameError When Importing Module in Module - python

I am creating a web app using web.py on python 2.7.3.
I have the following folder structure:
start_app.py
/app
    __init__.py
    /models
        __init__.py
        ActionModel.py
        AreaModel.py
    /controllers
        __init__.py
        world.py
    /views
Whenever I freshly start the app using python start_app.py and visit world/surrounding, I get the following error:
<type 'exceptions.ImportError'> at /world/surrounding
cannot import name AreaModel
Python /home/dev/app/models/ActionModel.py in <module>, line 13
Web GET http://localhost:5000/world/surrounding
Line 13 is simply from app.models import AreaModel, but I don't see why Python is complaining here.
If I comment out this import line, it runs fine. However, if I call a different URL, e.g. world/view, I get an error that AreaModel is not defined. Once I uncomment the line, it works fine again for all cases (i.e. /surrounding and /view).
I suspect this has something to do with the fact that I am "importing in circles", i.e. world.py imports AreaModel, AreaModel imports ActionModel, and ActionModel imports AreaModel.
I doubt that this is 'the pythonic way' to do things, or even the 'MVC way', so I would very much appreciate your enlightening me on how to do this properly.
Note: app is not in my PYTHONPATH, but I don't think it is needed here, since start_app.py is in the top-level directory and according to this all modules should be available.
Basically, what it comes down to is:
I need the models' functionalities in both the controllers and the models. Is it good practice to "import in circles"? Or is there a nicer approach to do this?
Also, is this problem related to python in general or just web.py?
Update:
Added __init__.py files. I had them, but did not include them in the original question. Sorry for that.
Update:
ActionModel.py includes (among others) a class named BaseAction and a few functions, which return instances or subclasses of BaseAction depending on what type of Action we are dealing with. They are called using e.g. ActionModel.get_by_id()
@matthew-trevor: Are you suggesting in a) that I should move those functions, such as get_by_id(), into a class ActionModel?
# actionmodel.py
class ActionModel(object):
    def __init__(self, arg1, arg2, area_class):
        self.area = area_class()

    def get_by_id(self, id):
        return BaseAction(id)

class BaseAction(object):
    def __init__(self, id):
        pass
I don't see how this should remedy my import problems though.

The Immediate Problem
You cannot import from folders, but you can import from packages. You can turn any folder into a package by adding an __init__.py file to it:
start_app.py
/app
    __init__.py
    /models
        __init__.py
        ActionModel.py
        AreaModel.py
    /controllers
        __init__.py
        world.py
    /views
        __init__.py
I'm guessing that ActionModel.py includes a class of the same name. If so, I recommend renaming the file to actionmodel.py to distinguish it from the class.
Circular imports
Is it good practice to "import in circles"? Or is there a nicer
approach to do this?
It's not only bad practice, it just won't work. There are a couple of ways to get around this, which will mostly depend on what you're trying to do:
a. In AreaModel, import the ActionModel module and then reference anything you want to use in it via attribute lookup and vice versa:
# areamodel.py
import actionmodel
def foo():
    action = actionmodel.ActionModel(...)
As long as the references are inside class or function definitions, the lookup only happens at run time and not during importing, so the circular reference is avoided.
b. Turn models into a module and put both ActionModel and AreaModel code inside it.
c. Move the shared code/functionality for ActionModel and AreaModel into a base module they both import from.
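For instance, a minimal sketch of option (c); the basemodel module name and its contents are illustrative, not taken from the original code:
# basemodel.py (hypothetical shared module)
class BaseModel(object):
    def get_by_id(self, id):
        raise NotImplementedError

# actionmodel.py
from basemodel import BaseModel

class ActionModel(BaseModel):
    pass

# areamodel.py
from basemodel import BaseModel

class AreaModel(BaseModel):
    pass
Neither actionmodel nor areamodel imports the other, so no cycle can arise.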
d. Make your ActionModel class (or whatever) accept a class as an input, then pass AreaModel into it in world.py (ditto for AreaModel). This way, ActionModel doesn't need to contain a reference to AreaModel; it just has to know what to do with it:
# actionmodel.py
class ActionModel(object):
    def __init__(self, arg1, arg2, area_class):
        self.area = area_class()

# areamodel.py
class AreaModel(object):
    def __init__(self, action_class):
        self.action = action_class()

# world.py
from actionmodel import ActionModel
from areamodel import AreaModel

action = ActionModel('foo', 'bar', AreaModel)
area = AreaModel(ActionModel)
This is known as object composition.

Related

Import method from Python submodule in __init__, but not submodule itself

I have a Python module with the following structure:
mymod/
    __init__.py
    tools.py
# __init__.py
from .tools import foo
# tools.py
def foo():
    return 42
Now, when I import mymod, I see that it has the following members:
mymod.foo()
mymod.tools.foo()
I don't want the latter though; it just pollutes the namespace.
Funnily enough, if tools.py is called foo.py you get what you want:
mymod.foo()
(Obviously, this only works if there is just one function per file.)
How do I avoid importing tools? Note that putting foo() into __init__.py is not an option. (In reality, there are many functions like foo which would absolutely clutter the file.)
The existence of the mymod.tools attribute is crucial to maintaining proper function of the import system. One of the normal invariants of Python imports is that if a module x.y is registered in sys.modules, then the x module has a y attribute referring to the x.y module. Otherwise, things like
import x.y
x.y.y_function()
break, and depending on the Python version, even
from x import y
can break. Even if you don't think you're doing any of the things that would break, other tools and modules rely on these invariants, and trying to remove the attribute causes a slew of compatibility problems that are nowhere near worth it.
Trying to make tools not show up in your mymod module's namespace is kind of like trying to not make "private" (leading-underscore) attributes show up in your objects' namespaces. It's not how Python is designed to work, and trying to force it to work that way causes more problems than it solves.
The leading-underscore convention isn't just for instance variables. You could mark your tools module with a leading underscore, renaming it to _tools. This would prevent it from getting picked up by from mymod import * imports (unless you explicitly put it in an __all__ list), and it'd change how IDEs and linters treat attempts to access it directly.
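A minimal sketch of that convention, assuming tools.py is renamed to _tools.py (the names here are illustrative):
# mymod/__init__.py
from ._tools import foo

__all__ = ['foo']  # star imports expose foo, not the _tools module

# mymod/_tools.py
def foo():
    return 42
mymod._tools is still reachable, but the leading underscore signals that it is an implementation detail.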
You are not importing the tools module yourself; it just becomes available when you import the package, as you are doing:
import mymod
You will have access to everything defined in the __init__ file and all the modules of this package:
import mymod
# Reference a module
mymod.tools
# Reference a member of a module
mymod.tools.foo
# And any other modules from this package
mymod.tools.subtools.func
When you import foo inside __init__, you are just making foo available there, as if you had defined it there; of course, you actually defined it in tools, which is a way to organize your package. Now, since you imported it inside __init__, you can:
import mymod
mymod.foo()
Or you can import foo alone:
from mymod import foo
foo()
But you can also import foo without it being made available inside __init__; you can do the following, which is exactly the same as the example above:
from mymod.tools import foo
foo()
You can use either approach; both are correct. In all these examples you are not "cluttering the file": as you can see, accessing foo as mymod.tools.foo is namespaced, so you can have multiple foos defined in other modules.
Try putting this in your __init__.py file:
from .tools import foo
del tools

How do I spell Python package/module imports for this situation?

I have these Python files
project/packages/foo/job.py
project/packages/foo/models.py
project/packages/foo/stuff/Thing.py
models.py contains class Thing and Thing.py contains functions related to Thing
job.py tries to do this:
from . import models
from . import stuff

def job():
    x = models.Thing(123)
    stuff.Thing.related_function(x)
This yields an error:
AttributeError: module 'foo.stuff' has no attribute 'Thing'
I've tried variations on the import spellings but can't get it to work. I don't want to bring Thing into the namespace; I want to always refer to it as models.Thing or stuff.Thing.
How do I do this?
Add the __init__.py files:
project/packages/foo/__init__.py
project/packages/foo/job.py
project/packages/foo/models.py
project/packages/foo/stuff/__init__.py
project/packages/foo/stuff/Thing.py
In the stuff package, initialise the submodule by adding this line:
# in project/packages/foo/stuff/__init__.py file
from foo.stuff import Thing
The other __init__.py can be empty.
Now in your job.py code, the models.Thing attribute should resolve (it's a class) and the stuff.Thing attribute should resolve (it's a submodule).
Note: It is not a good naming convention to have a module name matching the class, within the same project - that's unnecessarily confusing. I recommend to rename the Thing submodule for "functions related to Thing" to something else, maybe thing_helpers.
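As an aside (not part of the original answer), a relative import inside stuff/__init__.py works just as well and avoids hard-coding the top-level package name:
# project/packages/foo/stuff/__init__.py (equivalent alternative)
from . import Thing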

Python import function from package

(Python 3.6)
I have this folder structure:
package/
    start.py
    subpackage/
        __init__.py
        submodule.py
submodule.py:
def subfunc():
    print("This is submodule")
__init__.py:
from subpackage.submodule import subfunc
start.py:
import subpackage
subpackage.subfunc()
subpackage.submodule.subfunc()
I understand how and why
subpackage.subfunc()
works.
But I don't understand why:
subpackage.submodule.subfunc()
also works, if I have not done:
from subpackage import submodule
Nor:
import subpackage.submodule
Neither in __init__.py nor in start.py.
Thank you very much if anyone can clear up my doubt.
When you issue from subpackage.submodule import subfunc, Python does two things for you: first, it searches for and evaluates the module named subpackage.submodule and puts it into the sys.modules cache; second, it looks up the subpackage.submodule.subfunc object and binds the name "subfunc" in the namespace of the current module:
The import statement combines two operations; it searches for the named module, then it binds the results of that search to a name in the local scope.
When importing subpackage.submodule, the parent of submodule also gets imported:
While certain side-effects may occur, such as the importing of parent packages, and the updating of various caches (including sys.modules) ...
In the last stage of importing subpackage.submodule, Python sets the module as an attribute of its parent package subpackage; this behavior is documented:
When a submodule is loaded using any mechanism (e.g. importlib APIs, the import or import-from statements, or built-in __import__()) a binding is placed in the parent module’s namespace to the submodule object.
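A quick way to observe this from start.py, using the layout above (a sketch; the prints are only illustrative):
# start.py
import sys
import subpackage

# __init__.py imported from subpackage.submodule, so the submodule is now
# cached in sys.modules and bound as an attribute of its parent package.
print("subpackage.submodule" in sys.modules)                        # True
print(subpackage.submodule is sys.modules["subpackage.submodule"])  # True
subpackage.submodule.subfunc()                                      # This is submodule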
If I'm getting this right, you have a folder called "package" in which there are 2 things: a .py file and another folder called "subpackage".
Inside "subpackage" you have __init__.py and submodule.py which the latter contains a function that just prints "This is submodule".
Now, when you call import subpackage, you "pull" in everything that's inside "subpackage", including submodule and therefore the subfunc() function.
When you write subpackage.submodule.subfunc() there's really nothing amazing going on there: you just name the main folder/container (subpackage), then the .py file (submodule), and finally the function itself (subfunc()).

Python: Packages, Modules, and Classes - is this approach any good?

After reading a decent amount in this area, I've come up with the following for myself.
__init__.py imports all submodules intended for outside use, and defines an __all__ variable listing them.
Classes are typically one to a module, though if it makes sense to bundle a few classes into single module, then that's allowed.
Modules not meant for public use are not imported in __init__.py and are not listed in its __all__ member. This deters, but does not prevent, user code from accessing these non-public modules.
Any 'runnable' submodules in the package must do explicit, non-relative imports of any required submodules.
Running these runnable submodules should be done with -m, e.g. python3 -m mypackage.mySubModule
Any submodule MAY import the entire package via from mypackage import *. It seems fair to give submodule authors an easy way to get at all the (public) submodules, and fair to trust them to handle collision issues intelligently, to recognize the names of submodules, etc. It also puts the burden on __init__.py's __all__ being correct. This is the only use of from foo import * I tolerate :)
The biggest drawbacks I can think of are ...
Non-public submodules are still accessible.
Submodules explicitly use the package name, so if the package name changes, modules break
I'm allowing from foo import * just this one time, which is about as 'famous last words' as it gets.
I'm happy to hear other issues as well.
Below is a simplified but thorough example. Thanks very much to all.
File & Directory Layout
mypackage/
    __init__.py
    A.py         # class A
    B.py         # class B
    _Helper.py   # Helper class used by A and B
    api.py       # public API for mypackage
    runMe.py     # A runnable class inside mypackage (possibly for testing)
userCode.py      # Code that uses mypackage; could be located anywhere
__init__.py
from mypackage.A import A
from mypackage.B import B
from mypackage import api
__all__ = ['A', 'B', 'api']
A.py (and B.py as well)
from mypackage import _Helper

class A:
    def __init__(self): pass
    def test(self): _Helper.printMessage()
_Helper.py
def printMessage():
    print("*** _Helper.printMessage() is reachable ***")
api.py
from mypackage import *
def createA(): return A()
def createB(): return B()
runMe.py
from mypackage import api
a = api.createA()
a.test()
b = api.createB()
b.test()
userCode.py
import mypackage
a = mypackage.api.createA()
a.test()
b = mypackage.api.createB()
b.test()
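As noted under the drawbacks, nothing prevents user code from reaching the non-public module; a hypothetical check, assuming mypackage is importable:
import mypackage._Helper            # works despite the leading underscore
mypackage._Helper.printMessage()    # *** _Helper.printMessage() is reachable ***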

from <module> import ... in __init__.py makes module name visible?

Take the following code example:
File package1/__init__.py:
from moduleB import foo
print moduleB.__name__
File package1/moduleB.py:
def foo(): pass
Then from the current directory:
>>> import package1
package1.moduleB
This code works in CPython. What surprises me about it is that the from ... import statement in __init__.py makes the moduleB name visible. According to the Python documentation, this should not be the case:
The from form does not bind the module name
Could someone please explain why CPython works that way? Is there any documentation describing this in detail?
The documentation misled you as it is written to describe the more common case of importing a module from outside of the parent package containing it.
For example, using "from example import submodule" in my own code, where "example" is some third party library completely unconnected to my own code, does not bind the name "example". It does still import both the example/__init__.py and example/submodule.py modules, create two module objects, and assign example.submodule to the second module object.
But, "from..import" of names from a submodule must set the submodule attribute on the parent package object. Consider if it didn't:
1. package/__init__.py executes when package is imported.
2. That __init__ does "from submodule import name".
3. At some point later, other completely different code does "import package.submodule".
At step 3, either sys.modules["package.submodule"] doesn't exist, in which case loading it again will give you two different module objects in different scopes; or sys.modules["package.submodule"] will exist but "submodule" won't be an attribute of the parent package object (sys.modules["package"]), and "import package.submodule" will do nothing. However, if it does nothing, the code using the import cannot access submodule as an attribute of package!
Theoretically, how importing a submodule works could be changed if the rest of the import machinery was changed to match.
If you just need to know what importing a submodule S from package P will do, then in a nutshell:
1. Ensure P is imported; import it if it is not already. (This step recurses to handle "import A.B.C.D".)
2. Execute S.py to get a module object. (Skipping details of .pyc files, etc.)
3. Store the module object in sys.modules["P.S"].
4. setattr(sys.modules["P"], "S", sys.modules["P.S"])
5. If that import was of the form "import P.S", bind "P" in the local scope.
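Using the package1/moduleB layout from the question, steps 3 and 4 can be checked directly (a sketch; note that the __init__.py above relies on Python 2's implicit relative import):
# check.py, placed next to the package1/ directory
import sys
import package1   # runs package1/__init__.py, which imports moduleB

print("package1.moduleB" in sys.modules)                    # True
print(package1.moduleB is sys.modules["package1.moduleB"])  # True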
This is because __init__.py represents the package1 module object itself at runtime, so every .py file in the folder is defined as a submodule of it, and rewriting __all__ will not change that. If you make another file, e.g. example.py, fill it with the same code as __init__.py, and run it, it will raise a NameError.
I think the CPython runtime uses a special algorithm when __init__.py looks up variables, different from other Python files, maybe like this:
# looking for a variable named "moduleB"
if not found:
    if __file__ == '__init__.py':  # don't raise NameError; look for a file named moduleB.py
        if current dir contains file named "moduleB.py":
            import moduleB
        else:
            raise NameError
