I am confused about the different ways of importing modules in Python and stumbled onto the following:
import pandas as pd
from pandas import *
How are these two commands different from one another?
In practice, the quick answer concerns what happens when you import several modules that define functions with the same name (say, addition()). If you import them in the manner of:
import module1 as md1
import module2 as md2
you can call md1.addition() and md2.addition() without trouble.
But if you import them as
from module1 import *
from module2 import *
you will get confused when you call addition(): the later import silently shadows the earlier one, so only one of them survives.
Besides, if you are not familiar with a module, you could accidentally define a function with the same name as one in the imported module and overwrite it. So from module import * should be avoided unless you have a good reason for it.
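A concrete way to see this shadowing, using two standard-library modules that happen to export the same name (math.sqrt and cmath.sqrt):

from math import *
from cmath import *

print(sqrt(-1))  # 1j -- cmath.sqrt has silently replaced math.sqrt

# With qualified imports there is no collision:
import math
import cmath

print(cmath.sqrt(-1))  # prints 1j
# math.sqrt(-1) raises ValueError, as expected for the real-valued version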
Based on some answers, I will try to be more specific.
I want to import print as well as models and code in my main.py.
I know this question gets asked a lot, but I still could not figure out what's wrong with my code!
I have a project directory like this
-project
--__init__.py
--main.py
--print.py
--requests
---__init__.py
---models.py
---code.py
I want to import from print.py and * from requests.
Therefore I tried adding these lines to main.py:
from . import print
#or
import print
#for requests I tried
import os.path
import sys
sys.path.append('./requests')
from requests import *
All of those lines cause the same error: ImportError: attempted relative import with no known parent package.
I am using Python 3.9.
Does anyone have an idea where the problem is?
I am very confused that this does not seem to work. Was it possible in older versions?
You should definitely not be doing anything with sys.path. If you are using a correct Python package structure, the import system will handle all of this for you.
From the directory structure you described, project would be the name of your package. So when using your package in some external code you would do
import project
or to use a submodule/subpackage
import project.print
import project.requests
and so on.
For modules inside the package you can use relative imports. When you write
I want to import from print.py and * from requests. Therefore I tried
it's not clear where you want to import them from, and this matters for relative imports.
For example, in project/main.py to import the print module you could use:
from . import print
But if it's from project/requests/code.py you would use
from .. import print
As an aside, "print" is probably not a good name for a module, since if you import the print module it will shadow the print() built-in function.
Your main file should be outside the project directory in order to use project as a package.
Then, from your main file, you can import using from project.print import ....
Within the project package, relative imports are possible.
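For illustration, assuming a hypothetical entry-point script run.py placed next to the package, the layout would be:

run.py                 # entry point, outside the package
project/
    __init__.py
    main.py
    print.py
    requests/
        __init__.py
        models.py
        code.py

Inside run.py, absolute imports such as from project.requests import models then work, and the relative imports inside the package (for example from . import print in project/main.py) resolve, because project is now imported as a package rather than run as a loose script.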
In order to simplify my imports, I added the following code to my __init__.py:
from varro.algo.models.model import *
from varro.algo.models.fpga import *
from varro.algo.models.nn import *
so that I can do from varro.algo.models import ModelNN
If I want to import a different kind of model, one that doesn't require Tensorflow (as ModelNN does), I don't want Tensorflow to be imported, since it takes a long time to load and may not be installed on every system I work on.
However, running from varro.algo.models import ModelFPGA loads Tensorflow, even though I never import ModelNN.
Is there a way I can simplify the imports without having to import ModelNN every time? (I figured I could just put the import statement for Tensorflow in the class itself, but I want a more robust solution.)
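One option worth sketching (an assumption on my part, requiring Python 3.7+): PEP 562 allows a module-level __getattr__ in the package's __init__.py, so the Tensorflow-backed class is only imported on first access.

# varro/algo/models/__init__.py
from varro.algo.models.model import *
from varro.algo.models.fpga import *

def __getattr__(name):
    # Defer the Tensorflow-heavy import until ModelNN is actually requested.
    if name == "ModelNN":
        from varro.algo.models.nn import ModelNN
        return ModelNN
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

With this, from varro.algo.models import ModelFPGA no longer touches Tensorflow, while from varro.algo.models import ModelNN still works on demand.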
I want to write a Python module that automatically imports all the good stuff for me (about 50 other modules) so I don't have to copy and paste them every time I start a new script. I attempted this by defining the following method in my module, only to realize that when I import my module and call this method, the imports are local to the function.
def auto_import():
    import os
    import sys
    # plus 50 other modules...
How can I accomplish this automation using modular programming? (I am using Python 3.6 on Ubuntu.)
You don't need a function to do that, you can simply make a file like commonimports.py which looks like this:
import os
import numpy as np
import sys
#and so on...
Then add this import statement in your other files:
from commonimports import *
And you'll have all the modules ready to use within that namespace.
Just make the names of your imported modules global:
def auto_import():
    # The global declaration must come before the imports; otherwise Python 3
    # raises: SyntaxError: name 'os' is assigned to before global declaration
    global os, sys
    import os
    import sys
Note that with this approach you still have to call auto_import() once before those modules become usable.
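One caveat worth adding: the global statement binds the names in the module where auto_import is defined, not in the caller's namespace, so this trick only helps within that same file. A minimal usage sketch under that assumption:

auto_import()          # must run before the modules are used
print(os.getcwd())     # works: os is now a module-level name in this file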
I have been wondering for a long time how to import properly in Python, in the situation I'm going to explain now:
Imagine I've got two modules built by myself, modulea and moduleb, and a third one, modulec, which uses both modulea and moduleb.
Furthermore, modulea uses moduleb.
The modules are each a single file, and together they form part of a bigger package which has its own __init__.py file.
+ My project
-> modulea.py
-> moduleb.py
-> modulec.py
-> __init__.py # Here I import modulea, b, c.
-> main_module.py # This is where I do all the hard work.
I've created a single file for importing all the modules I need, instead of importing them one by one in each script; otherwise, to know whether something had already been imported, I would have to check file by file.
I could simply import, as you have been saying below in the comment box, the required modules in each module, but I think that's not practical and it's easier to import them all in one file.
Tell me if this is a good practice or not.
If I import
import modulea
import moduleb
import modulec
It won't work (ImportError) because modulea does not itself import moduleb inside its code, and it is imported before the module it requires (moduleb).
If I did
import moduleb
import modulea
import modulec
I guess this will work, because since the running script already imports moduleb before modulea, moduleb should also be accessible to modulea and not only to the running script; and I guess the same happens with modulec.
Supposing this is true, which I believe it is, will it work if the import is done in only one line? Are the modules imported into the script one by one, or are they imported all together and only made available once the one-line import statement finishes?
import moduleb, modulea, modulec
And if it does work, does it meaningfully shorten the processing time of the code, or is it basically the same apart from aesthetics?
You are overthinking this way too much.
Each module must be "self-contained" in the sense that it satisfies its own dependencies. If modulea needs moduleb, you do
# modulea.py
import moduleb
Then moduleb has to import its dependencies, if any:
# moduleb.py
import json # for instance
And finally, since modulec needs modulea and moduleb, you just import them:
# modulec.py
import modulea
import moduleb # <-- this import does not re-execute moduleb, since it was already loaded by modulea and cached
There is nothing more to it.
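You can see the caching at work with any module; a quick self-contained sketch using the standard library:

import sys
import json        # the first import executes the module body
import json as j2  # later imports are just lookups in the cache

print(sys.modules['json'] is json)  # True: the loaded module lives in sys.modules
print(j2 is json)                   # True: every import returns the same cached object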
And if you just need a function from a module, you can import it into the namespace like this:
from moduleb import my_cool_function
Just as a style convention, you should put each import on a new line:
import json
from moduleb import my_cool_function
from moduleb import another_not_so_cool_function
According to the Python style guide (PEP 8) you should use a separate import statement for each individual module. It will also tell you everything else you need to know about the conventions surrounding importing modules.
import moduleb
import modulea
import modulec
Also, as some of the commenters have indicated, you don't seem to be importing these modules properly within each script, which is what creates the errors in this one.
You can put all of your imports into one place if you'd like.
myproject/internal.py
from moduleb import *
from modulea import *
from modulec import *
Now your other modules can just import it:
myproject/modulec.py
from internal import *
Although it's usually preferable to import everything in the module that uses it, there are situations where that is not desirable. I have a project with many small scripts that test product functionality, and it would be a bother to have a long list of imports in each of them.
Consider the following python package:
/pkg
/pkg/__init__.py
/pkg/a.py
/pkg/b.py
The a and b modules both depend on each other, and so each must import the other -- a circular dependency. There appear to be 4 ways to perform this import statement.
Inside a.py:
import b
Inside b.py (I'm only using one of these import statements at a time):
import a
import pkg.a
from pkg import a
from . import a
These import statements are all syntactically different versions of the same import, and yet they don't all behave the same: importing pkg.a produces different results depending on which one is used.
import a
The first import style works without error, though it relies on Python 2's implicit relative imports, which fail in Python 3 or when absolute imports are enabled, so I would rather not use a deprecated style.
import pkg.a
This also works, though it makes the code a bit more verbose, especially when there are additional subpackages.
from pkg import a
This fails with ImportError: cannot import name a.
from . import a
This also fails with the same error.
Why do these import statements result in different behavior? It seems odd that the last two styles should fail, especially since they appear to be the preferred styles going forward. I'm using Python 2.7.
EDIT:
To test this, create a test.py with the following code:
import pkg.a
EDIT:
Also, if you use the as syntax to shorten the imported symbol on an import statement that otherwise works, it will still fail, though with an AttributeError:
import pkg.a as a
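For what it's worth, the usual way to break such a cycle is to defer one of the imports to function scope, so it only runs once both modules have finished initializing. A minimal sketch against the pkg layout above (use_a and a.something are hypothetical names):

# pkg/b.py
def use_a():
    # Imported at call time rather than at module load time; by the time
    # use_a() runs, pkg.a has finished initializing, so this succeeds.
    from pkg import a
    return a.something

This sidesteps the partially-initialized module that makes from pkg import a fail at load time.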