I found quite a few answers on how to re-import a module (e.g. after changing it during development), but I want to re-import it under an alias. In other words, I would like to repeat
import main.mydir.mymodule as mymod
and have my changes incorporated into my console without restarting the console.
Currently, my attempt at reloading looks like this. I might run
import main.warp.optimisation as opt
res = opt.combiascend(par)
then I make some changes, for example adding a print('Yes, this worked.') at the end of the method combiascend, and then I run
import importlib
import main
importlib.reload(main)
importlib.reload(main.warp.optimisation)
opt = main.warp.optimisation
res = opt.combiascend(par)
This does not work: I do not get any error, but the changes I made in the module optimisation are simply not applied. In my example, I do not get the expected output.
After employing one of those other answers to "refresh" main.mydir.mymodule, simply do:
mymod = main.mydir.mymodule
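For instance, a minimal sketch of the whole cycle, using the package path from the question (assuming main.mydir.mymodule is importable):

import sys
import importlib

import main.mydir.mymodule as mymod                   # original import, under an alias
# ... edit main/mydir/mymodule.py ...
importlib.reload(sys.modules['main.mydir.mymodule'])  # re-execute the submodule's code
mymod = sys.modules['main.mydir.mymodule']            # rebind the alias to the refreshed module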
It looks like importlib.reload re-executes the module in place (keeping the same module object), so if the original import used an alias, you can simply reload through the alias. Given empty foo/__init__.py and foo/bar/__init__.py, and a foo/bar/test.py containing this:
def func():
    print("a")
Then I get this:
>>> import foo.bar.test as mod
>>> mod.func()
a
>>> import importlib
>>> # (Updating the file now to print b instead)
>>> importlib.reload(mod)
<module 'foo.bar.test' from '/home/aasmund/foo/bar/test.py'>
>>> mod.func()
b
Related
I have a strange problem that I don't understand. I have a file date_import.py with several functions in it. (I don't want to show these functions here because they are all quite long.) I would like to import these functions in Jupyter. So I write 'from date_import import func1' and it works. But if I write 'from date_import import func1, func2' I get the error "cannot import name func2". I also get the same error if I write 'from date_import import func2'. At first I thought that Python somehow cannot see the changes in my file. But if I change the code in func1 and then use inspect.getsource, I can see that Python picks up the changes. Yet I still cannot import the other functions, only func1.
Did somebody see such behavior and knows some way around?
Thanks in advance.
PS: Here is func2:
def func2(stichtag_sql):
    sql = """(select distinct .....
    )"""
    tab = sqlContext.read.jdbc(url=jdbcURL, table=sql, properties=prop).cache()
    totale_wbs = tab.toPandas()
    totale_wbs.columns = map(str.lower, totale_wbs.columns)
    totale_wbs.kdnr = totale_wbs.kdnr.astype(str)
    return totale_wbs
If you created your function func2 after you had already imported your module date_import, you have to reload the module:
import importlib
import date_import

importlib.reload(date_import)
and then try:
from date_import import func1, func2
Is there a way to do this in python 3.6+?
import -force mymodule
I just want a single python command that both:
(1) loads the module for the first time, and
(2) forces a reload of the module if it is already loaded, without barfing.
(This is not a duplicate question because I'm asking for something different. What I want is a single function call that does items (1) and (2) above in the same call. I don't want to manually decide whether to issue "import" or "imp.reload". I just want a single function "def" that detects which case applies and automatically makes the decision for me about how to import it, (1) or (2).)
I'm thinking it's something like this:
def import_force(m):
    import sys
    if m not in sys.modules:
        import m
    else:
        import importlib
        importlib.reload(m)
Except that I can't figure out how to pass a module name as a parameter. It just gives me an error that there is no module named 'm'.
There is one missing step that you semi-corrected in your new answer, which is that you need to assign the new module in every scope that uses it. The easiest way is to return the module object and bind it to the name you want outside your function. Your original implementation was 90% correct:
import sys, importlib

def import_force(m):
    if m not in sys.modules:
        return __import__(m)
    else:
        return importlib.reload(sys.modules[m])
Now you can use this function from the command line to replace import, e.g.:
my_module = import_force('my_module')
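One small caveat with __import__: for a dotted name like main.warp.optimisation (from the first question), __import__('main.warp.optimisation') returns the top-level package main rather than the leaf module. A sketch of the same function using importlib.import_module, which returns the leaf module, avoids that:

import sys
import importlib

def import_force(m):
    if m not in sys.modules:
        return importlib.import_module(m)   # handles dotted names, returns the leaf module
    else:
        return importlib.reload(sys.modules[m])

opt = import_force('main.warp.optimisation')   # package path borrowed from the first question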
Any time you find yourself using exec to perform a task for which there is so much well-defined machinery already available, you have a code smell. There is also no reason to re-import sys and importlib every time.
This function should do what you want:
import sys
import importlib

def import_force(name):
    needs_reload = name in sys.modules
    module = importlib.import_module(name)
    if needs_reload:
        module = importlib.reload(module)
    return module
# Usage example:
os = import_force('os')
An alternative approach is to write your own import hooks, which I won't describe.
However please note that this is an anti-pattern and I would discourage the practice of reloading modules at every import.
If this is for debugging purposes, then I would suggest using one of the many auto-reloader solutions available online: they watch your Python files for changes, and when you make modifications they automatically re-import the modules.
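For example, in IPython and Jupyter the bundled autoreload extension does exactly this:

%load_ext autoreload
%autoreload 2   # re-import any modified module automatically before running your code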
There are two reasons why your function didn't work:
The import statement does not resolve variables, so import m does not mean "import the module whose name is in the variable m"; it means "import the module literally named m".
importlib.reload wants a module object, not a module name.
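Both points are easy to see in a small sketch (using collections purely as a stand-in module name):

import importlib

name = "collections"            # the module name lives in a variable

# import name                   # would look for a module literally called "name" -> ModuleNotFoundError
# importlib.reload(name)        # would fail: reload() expects a module object, not a string

module = importlib.import_module(name)   # resolve the string to a module object
module = importlib.reload(module)        # now reload() accepts it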
import sys
import importlib

# importing with a sledgehammer... simple, effective, and it always works
def import_force(name):
    module = importlib.import_module(name)
    module = importlib.reload(module)
    return module

# assuming mymodule.py is in the current directory
mymodule = import_force("mymodule")
It's possible, but a little bit tricky to code correctly the first time...
import sys
import importlib

def import_force(modstr):
    if modstr not in sys.modules:
        print("IMPORT " + modstr)
        cmd = "globals()['%s'] = importlib.import_module('%s')" % (modstr, modstr)
        exec(cmd)
    else:
        print("RELOAD " + modstr)
        cmd = "globals()['%s'] = importlib.reload(%s)" % (modstr, modstr)
        exec(cmd)
If you have a module file in your current directory called "mymodule.py", then use it like this:
Py> import_force("mymodule")
Version 2.0:
def import_force(modstr):
    if modstr not in sys.modules:
        print("IMPORT " + modstr)
        globals()[modstr] = importlib.import_module(modstr)
    else:
        print("RELOAD " + modstr)
        globals()[modstr] = importlib.reload(sys.modules[modstr])
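One caveat with both versions: globals() refers to the globals of the module in which import_force is defined, so the name lands in your interactive namespace only if the function itself was defined there (e.g. pasted into the REPL). A usage sketch, assuming that and a mymodule.py in the current directory:

import_force("mymodule")    # first call prints "IMPORT mymodule" and binds the name
import_force("mymodule")    # later calls print "RELOAD mymodule"
print(mymodule.__name__)    # the name mymodule is now available in globals()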
I might be completely wrong here, but I can't find a proper google source for the dilemma that I have:
Let's say we are using python, and we have files
foo.py and bar.py, which have the following pseudocode:
Code in foo.py:
# Code in foo.py
import sys

def foo():
    # Some blah code for foo function
    pass
And code in bar.py is:
# Code in bar.py
import sys
import foo

def bar():
    # Some blah code for bar function
    pass
Now, what I am wondering is: will this not cause code bloat, since we have imported sys twice in bar.py, once via import sys and another time because we are doing import foo?
Additionally, what will be the correct thing to do when you have to include libraries in multiple files, which in turn will be included in other files?
Importing a module twice in Python does not introduce "bloat". A second import is a mere name lookup in a cached module dictionary (sys.modules, to be precise), which in the case of sys makes this even less relevant, as there is actually nothing that doesn't implicitly trigger an import of sys (although it's obviously not exposed in the namespace).
And what happens if you import some parts of a module x in foo.py, and need them and possibly others in bar.py? Having to carefully groom your imports, and then use foo.something_from_x or x.something_else_from_x in bar.py would be extremely cumbersome to write and maintain.
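A quick sketch makes the sharing visible (assuming a foo.py that does import sys, as in the question):

import sys
import foo

print(foo.sys is sys)              # True: both names refer to the very same module object
print(sys.modules["foo"] is foo)   # True: foo itself is cached the same way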
TLDR: don't worry. Really. Don't.
This would not cause any kind of code bloat. When you import the same library multiple times, Python only actually imports it once (the first time) and caches it in sys.modules; every later import sys simply returns the module object from sys.modules.
A very simple example to show this. Lets say I have an a.py -
print("In A")
This would print In A whenever the module's code is actually executed. Now let's try importing it multiple times -
>>> import a
In A
>>> import a
>>> import a
>>> import a
As you can see, the code was actually executed only once; all the other times the cached object was returned. To check sys.modules -
>>> import sys
>>> sys.modules['a']
<module 'a' from '\\path\\to\\a.py'>
When you import a module, Python imports the code, creates a module object, and then creates a name in the local namespace (the module's own name if no as keyword was given, otherwise the name provided after as) and assigns the module object to it.
The same thing happens when importing from within other modules. Another example -
b.py -
import a
print("In B")
c.py -
import b
import a
print("In C")
Result of running c.py -
In A
In B
In C
As you can see, a.py was only imported once.
I'm trying to temporarily remove a python module from sys.modules so that I can import it as part of a test case (with various system functions mocked out) and then put it back again. (Yes, that's a bit crazy and I'm probably going to end up restructuring the code instead but now I'm curious...)
I can remove the module and reimport it just fine, but I can't seem to put the original module back once I'm finished. (Maybe that's just not possible?) Here's a test case that I wrote to test out the idea:
import sys
import unittest

class Test(unittest.TestCase):
    def test_assumptions(self):
        import meta.common.fileutils as fu1
        del(sys.modules["meta.common.fileutils"])
        import meta.common.fileutils
        del(sys.modules["meta.common.fileutils"])
        sys.modules["meta.common.fileutils"] = fu1  # I hoped this would set the module back
        import meta.common.fileutils as fu2
        self.assertEqual(fu1, fu2)  # assert fails, fu2 is a new copy of module :-(
Can anyone suggest why it might be failing?
Edit: using pop() as suggested by one of the answers also fails:
class Test(unittest.TestCase):
    def test_assumptions(self):
        import meta.common.fileutils as fu1
        orig = sys.modules.pop("meta.common.fileutils")
        import meta.common.fileutils
        del(sys.modules["meta.common.fileutils"])
        sys.modules["meta.common.fileutils"] = orig
        import meta.common.fileutils as fu2
        self.assertEqual(fu1, orig)  # passes
        self.assertEqual(fu2, orig)  # fails
        self.assertEqual(fu1, fu2)  # fails
It looks to me like the issue here has to do with packages. In particular, for a module that lives in a package (e.g. meta.common), there are two ways to access it: via sys.modules, and via the parent package's dictionary (i.e., meta.common.__dict__). It looks to me like the import meta.common.fileutils as fu2 line is getting fu2's value from meta.common.__dict__, and not from sys.modules.
So the solution: in addition to monkey-patching sys.modules, you should also monkey-patch the parent package. I.e., add something like this:
>>> import meta.common
>>> meta.common.fileutils = fu1
right before the sys.modules["meta.common.fileutils"] = fu1 line.
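Putting both fixes together, a sketch of the save/patch/restore cycle (assuming the package layout from the question):

import sys
import unittest
import meta.common                                   # needed so the parent package can be patched
import meta.common.fileutils as fu1

class Test(unittest.TestCase):
    def test_assumptions(self):
        orig = sys.modules.pop("meta.common.fileutils")  # save and remove the cached module

        import meta.common.fileutils                     # fresh import (mock things out here)

        sys.modules["meta.common.fileutils"] = orig      # restore the cache entry
        meta.common.fileutils = orig                     # restore the parent package attribute

        import meta.common.fileutils as fu2
        self.assertIs(fu1, orig)                         # all three names now refer
        self.assertIs(fu2, orig)                         # to the same module again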
The sys.modules structure is really just a Python dict. You can remove modules from it, and you can also put them back in.
Store the original module object in a local variable, using dict.pop() to both remove the module and return it:
orig = sys.modules.pop('meta.common.fileutils')
then, when it comes to restoring it, just put that object back into sys.modules:
sys.modules['meta.common.fileutils'] = orig
In interactive python I'd like to import a module that is in, say,
C:\Modules\Module1\module.py
What I've been able to do is to create an empty
C:\Modules\Module1\__init__.py
and then do:
>>> import sys
>>> sys.path.append(r'C:\Modules\Module1')
>>> import module
And that works, but I'm having to append to sys.path, and if there were another file called module.py somewhere else on sys.path, how would I unambiguously resolve to the one that I really want to import?
Is there another way to import that doesn't involve appending to sys.path?
EDIT: Here's something I'd forgotten about: Is this correct way to import python scripts residing in arbitrary folders? I'll leave the rest of my answer here for reference.
There is, but you'd basically wind up writing your own importer which manually creates a new module object and uses execfile to run the module's code in that object's "namespace". If you want to do that, take a look at the mod_python importer for an example.
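On Python 3 the "write your own importer" route boils down to a few lines of importlib.util and leaves sys.path untouched; a sketch using the path from the question:

import importlib.util

spec = importlib.util.spec_from_file_location("module", r"C:\Modules\Module1\module.py")
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)   # runs module.py inside the freshly created module object
# optionally: sys.modules["module"] = module, if other code should be able to import it by name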
For a simpler solution, you could just add the directory of the file you want to import to the beginning of sys.path, not the end, like so:
>>> import sys
>>> sys.path.insert(0, r'C:\Modules\Module1')
>>> import module
You shouldn't need to create the __init__.py file unless you're importing from within a package (so if you were doing import package.module, then you'd need __init__.py).
Inserting into sys.path (at the very first position) works better:
>>> import sys
>>> sys.path.insert(0, 'C:/Modules/Module1')
>>> import module
>>> del sys.path[0] # if you don't want that directory in the path
Appending to a list puts the item in the last place, so it's quite possible that earlier entries in the path take precedence; putting the directory in the first place is therefore a sounder approach.