Python - import module once only

I have 2 python scripts.
satellite_utils.py
apply_errata.py
The script that I run is:
python3.4 apply_errata.py
apply_errata.py calls functions defined in satellite_utils.py.
Now I am using the logging module to log my messages. I want to import it only once rather than having to declare it in every script.
If I define logging in apply_errata.py and a reference is made to it in satellite_utils.py, I get:
Traceback (most recent call last):
  File "apply_errata.py", line 20, in <module>
    satellite_utils.applyErrata(args.release, args.all, args.rollback)
  File "/root/config-3.1.21/automated-os-patching/satellite_utils.py", line 34, in applyErrata
    applyErrataOnSystem(system, release, automaticRollback, [erratum])
  File "/root/config-3.1.21/automated-os-patching/satellite_utils.py", line 39, in applyErrataOnSystem
    logging.warning('is when this event was logged.')
NameError: name 'logging' is not defined
Any way I can avoid an import statement in every file?

You can use the logger instance defined in satellite_utils like this:
# in apply_errata.py
import satellite_utils
satellite_utils.logger.info('hello there')
or
# in apply_errata.py
from satellite_utils import logger
logger.info('hi there')
This works because any name defined in a Python file is attached to the global scope of that file (in Python, files are modules, so file == module) and is accessible to anyone who imports the module.
It's important to point out that this is, shall we say, not the canonical and preferred way to do things. You're a grown-up, you can decide for yourself.
Why it's not bad to import modules multiple times: they get cached by Python in the sys.modules dict, so the next time you import you'll just get that cached copy (see the Python docs on sys.modules).
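For completeness, a minimal sketch of what satellite_utils.py could look like for that to work (the getLogger call and the function signature are my assumptions; the answer only refers to satellite_utils.logger):
# satellite_utils.py -- sketch; logger name and signature are assumptions
import logging

logger = logging.getLogger('satellite_utils')  # module-level name, visible to anyone who imports this module

def applyErrata(release, apply_all, rollback):
    logger.warning('is when this event was logged.')
apply_errata.py can then use either of the imports shown above; the module object, and the logger attached to it, is only created once.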

You can do it by importing the necessary libraries in one script and then importing everything from that script into another script using a wildcard import. By doing this you are not importing them all again; instead you are referencing them, and they can be used in the second script as if you were using them in the first script.
For example:
Script1.py
import logging
import something
...
log_i = logging.info
log_d = logging.debug
Script2.py
from Script1 import *  # import everything from Script1
log_i("this is info log")
log_d("this is debug log")  # use the imported names
Here logging is imported in Script1 and I'm importing everything from Script1 into Script2, which means all the libraries, variables and function definitions used in Script1 are accessible and modifiable from Script2. Hence I'm using logging directly without any declaration/assignment in Script2.
As per #anugrah's comment, you can use __init__.py to initialise the modules in a directory so that they can be imported and used as above. If you choose this method, it will look like this:
abc/__init__.py
abc/modules.py
import logging, os, sys
log_i = logging.info
log_d = logging.debug
Script1.py
from abc.modules import log_i, log_d  # now I'm importing only log_i, log_d
log_i("this is info log")
log_d("this is debug log")

Related

import a dictionary built at run-time

My code is structured as follows:
main.py
utils.py
blah.py
The main module uses argparse to read in the location of a configuration YAML file, which is then loaded as a dictionary. Is there a way for utils and blah to import this built-up dictionary?
Edit: I tried using from main import config (config being the dictionary I built) but I get ImportError: cannot import name 'config' from 'main'
Edit2: Main imports the other 2 modules - apologies for leaving out this very important detail
I would recommend making another file, say, globals.py. Import this in main, utils, and blah, and set properties in it to be recalled by the other modules. For example:
globals.py
configs = {}
main.py
import yaml
import globals
...
with open('user/entered/path.yml') as f:
    user_configs = yaml.safe_load(f)
globals.configs.update(user_configs)  # modifies the shared `configs` dict in globals
utils.py
import globals
...
# need to use one of the configs for something:
try:
    relevant_config = globals.configs['relevant_config']
except KeyError:
    print("User did not input the config field 'relevant_config'")
All modules will be able to see the same globals instance, thus allowing you to use what are effectively global variables across your program.
You could simply save configs as a global variable in main.py and have utils.py and blah.py import main, but having a designated module for this is cleaner and clearer than having other modules import the main module.
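As a small illustration (my own sketch, not from the answer), blah.py would consume the shared module the same way; the key point is that the lookup happens after main.py has populated globals.configs, e.g. inside a function that is only called later:
# blah.py -- sketch; do_work and 'some_key' are made-up names
import globals

def do_work():
    # looked up at call time, after main.py has run globals.configs.update(...)
    return globals.configs.get('some_key', 'default')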
Just do
import main
and use it as
main.dictionary
That should do it!

A file import is working when I run the file from inside the module, but not when I run the file by importing the module from outside

My directory structure:
test.py
module/
    importer.py
    importee.py
    __init__.py
So in my directory, I have test.py, then another directory which has been initialized as a module. Within that module, there is a file importer.py which imports a file importee.py. In order to test whether the import works, I made a simple function in importee.py and tried using it in importer.py (i.e. I ran importer.py directly); it worked just fine.
But when I go into test.py and have the import statement from module import * and try to run that (without any other code), it gives an error which traces back to the import statement in importer.py, saying No module named 'importee'
If it matters, the __init__.py in the module directory has the __all__ list specified properly.
I like to think this isn't a duplicate despite there being similarly titled posts, such as this or this or this or this; I've been searching for hours and still have no idea what could be causing this.
Any ideas? Any help is greatly appreciated.
Edit: content of the four files:
__init__.py
__all__ = ["importee", "importer"]
importee.py
def example():
    print("hey")
importer.py
from importee import *
example()
test.py
from module import *
When I run importer.py I get no errors, but when I run test.py I get an error which traces back to the first line of importer.py saying No module named 'importee', even though I don't get that error when running importer.py directly...
The following runs and prints "hey" regardless of whether you run python test.py from the root or python importer.py from module.
You can learn more about relative imports from PEP 328. The important thing I want you to take away here is that __init__.py runs when you import from module.
Furthermore, __all__ only controls which names a wildcard import pulls in (by default, names that begin with _ are skipped); since you aren't relying on that here, I don't think it actually has any effect.
# test.py
from module import *

# module/__init__.py
from .importer import *

# module/importee.py
def example():
    print("hey")

# module/importer.py
from .importee import *
example()
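An alternative sketch (my own addition, not part of the answer above): importer.py could use an absolute import of the package module instead of a relative one. This assumes the project root is on sys.path, i.e. you always run from the directory containing test.py, so it would not keep working when running importer.py directly from inside module/:
# module/importer.py -- alternative sketch, absolute import
from module.importee import example
example()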

Import same python module if I call a function of different file

I was wondering what happens if I call a function in a different file which imports the same Python module that is already imported in the main script; is it imported twice? If yes, how can I prevent it? What is the recommended way to handle this?
In the following example, the time module is imported in both files. As an alternative solution, I passed the time module as an argument to the function that is located in the other file.
Example:
hello.py
from module import module
import time
time.sleep(1)
module()
module.py
import time  # Already imported in hello.py
def module():
    time.sleep(1)
    print('hello')
Alternative: I am passing the time module as an argument into the module() function that is located in module.py.
hello.py
from module import module
import time
time.sleep(1)
module(time)
module.py
def module(time):
    time.sleep(1)
    print('hello')
A module is only located and executed once, no matter how many times it is imported. It's stored in the sys.modules dict, so subsequent imports are just a dictionary lookup. There's no reason to try to avoid multiple imports of the same module.
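A quick sketch (not from the answer) that shows the caching in action; the second import is just a lookup in sys.modules, and both names end up bound to the same module object:
# sketch: repeated imports reuse the cached module object
import sys
import time

first = sys.modules['time']  # module object created by the first import
import time                  # second import: dictionary lookup, no re-execution
assert sys.modules['time'] is first
assert time is first         # both names refer to the same object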

How to set the libraries global for all the functions?

I have a main file from which I call some functions, for example:
import numpy as np
import sys
sys.path.insert(0, '/myFolder/')
from myFunction import myFun1, muFun2, myFun3
However, when I run the function I receive the following error:
tmp = myFun1(x,y)
NameError: global name 'np' is not defined
You could kind of achieve what you want, but you'd have to bypass the normal import system in python and basically just "source" either the central import file or the function files.
a.py
def func():
    np.something()
b.py
import numpy as np
execfile('/path/to/a.py')
func()
execfile allows you to basically take the contents of one python file and run them within the current python file, as if the functions had been declared in b.py.
You could also do the reverse -- put all your imports in a single file and then execfile that file at the top of all your other python files.
That being said, you probably shouldn't do this, but python is flexible enough to allow you to dig yourself into a hole if you want to.
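One note, as my own addition: execfile only exists in Python 2. On Python 3, a rough equivalent of the same "source the file" trick is exec on the file contents, something like:
# b.py -- Python 3 sketch; a.py is the file from the answer above
import numpy as np

with open('/path/to/a.py') as f:
    exec(f.read())  # runs a.py's code in this module's namespace, so func() sees np

func()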

Why is Python more strict with circular imports when using from-imports?

I know that Python discourages any situation which can get you into a circular import. But I wanted to understand the Python internals of why from-imports are seemingly arbitrarily less forgiving than normal imports in circular import situations.
For example, this code compiles:
# main.py
import CommonUtil

# commonutil.py
import util

class CommonUtil:
    # some code that uses util.Util
    pass

# util.py
import commonutil

class Util:
    # some code that uses commonutil.CommonUtil
    pass
But this code does not:
# main.py
import CommonUtil

# commonutil.py
import util

class CommonUtil:
    # some code that uses util.Util
    pass

# util.py
from commonutil import CommonUtil

class Util:
    # some code that uses CommonUtil
    pass
Traceback (most recent call last):
  File "main.py", line 1, in <module>
    import CommonUtil
  File "commonutil.py", line 1, in <module>
    import util
  File "util.py", line 1, in <module>
    from commonutil import CommonUtil
ImportError: cannot import name CommonUtil
You don't hit compiler errors as long as you don't try to use the relevant classes before all the imports have completed. But when you try to do some aliasing, then it fails. Can someone explain what's going on internally in Python that causes this error to rear its head only when from-import is used? And secondarily, is there any easy way around this? (Besides the obvious "pull shared code out into a third module," which I'll likely do anyway.)
Modules are executed from top to bottom. When an import is seen for the first time, execution of the current module is suspended so that the other module can be imported. When the other module attempts to import the first module, it gets a reference to the currently partially-executed module. Since code located after the import of the other module hasn't yet been executed, any names contained within it cannot yet exist.
main.py
import a
a.py
var1 = 'foo'
import b
var2 = 'bar'
b.py
import a
print a.var1 # works
print a.var2 # fails
The way around it is to not access the names in the imported module until its execution has completed.
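A common way to follow that advice (my own sketch, not part of the answer): keep the plain import at the top and defer the attribute access into a function or method, which only runs after both modules have finished executing:
# util.py -- sketch: plain import at the top, attribute access deferred
import commonutil

class Util:
    def make_common(self):
        # looked up at call time, after commonutil has fully loaded
        return commonutil.CommonUtil()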
see http://effbot.org/zone/import-confusion.htm#circular-imports for an explanation of what is happening.
I assume you launch the main.py file. Python will first try to load commonutil. It will create a module object and start filling it with classes, functions and global variables as it encounters their definitions. The first statement is an import, so Python now creates the util module and starts filling it. The commonutil module exists, but is still empty. In the first version, you do not access any commonutil object at load time, so everything is fine. In the second one, you try to fetch a specific name from commonutil which does not exist at that moment. If you had used something like f(commonutil.CommonUtil) at load time in the first version, it would also have crashed.
