I have three scripts: params.py (which defines a configuration class), foo.py (which uses that configuration), and main.py (which initializes the configuration and calls foo).
params.py:
class Config:
    def __init__(self, x=0):
        self.val = x

global config
config = Config()
foo.py:
from params import config

def foo():
    return config.val + 5
main.py:
from params import Config, config
from foo import foo

config = Config(10)
print(foo())
But instead of printing 15, it prints 5. How can I fix it? It happens because when foo.py does its import, config is initialized with 0. What can I do so that main modifies the config value and all the other scripts read the new value?
Thank you!
Conceptually, you need to separate an object like Config() from the variables that may be referencing it at any given time. When params.py does config = Config(), it creates a Config object and assigns it to a variable in the params module namespace: it is params.config.
When main.py does from params import config, it adds a reference to this Config object to its own namespace. Now there are two references to the same object, one in params.config and another in main.config. So far, so good: from X import Y adds a binding to X.Y in the current namespace. Since params.config is a mutable class instance, main could change the values inside that single Config object and the change would be seen by every other referrer to the same object; config.val = 10 would be seen by all.
Now things go off the rails. When main does config = Config(10), it creates a new Config object and rebinds the config variable in the main namespace. Now params.config references the first object and main.config references the second, so changes made to the second object are not seen through the first.
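To see the distinction in isolation, here is a minimal, self-contained sketch (the names are illustrative, not from the scripts above):
shared = {'val': 0}
alias = shared           # two names, one dict object
alias['val'] = 10        # mutation: visible through shared as well
alias = {'val': 99}      # rebinding: shared is untouched
print(shared['val'])     # prints 10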
If you want everyone to see the same object, you need to keep the namespace qualification. The scripts would change to:
foo.py:
import params

def foo():
    return params.config.val + 5
main.py:
import params
from foo import foo

params.config = params.Config(10)
print(foo())
Now all of the scripts are using the one variable params.config and see any changes made through it. This is somewhat fragile, as you've seen: if anybody does from params import config, reassigning params.config doesn't reach them.
global only marks a name in a local scope as being global; it has no effect in a global scope, where every name is already global.
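For example, in this minimal sketch the declaration matters only inside the function:
counter = 0            # module level: already global, no declaration needed

def bump():
    global counter     # required: assignment would otherwise create a local
    counter += 1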
What you want isn't really possible, as global namespaces are specific to an individual module, not the process as a whole.
If the value is defined in params.py, you will need to access it via params from all other modules, including the __main__ module created by your script.
params.py:
class Config:
    def __init__(self, x=0):
        self.val = x

config = Config()
foo.py:
import params

def foo():
    return params.config.val + 5
main.py:
import params
from foo import foo

params.config = params.Config(10)
print(foo())
If you simply modify the existing configuration object rather than replacing it, you can use
params.py (same as above):
class Config:
    def __init__(self, x=0):
        self.val = x

config = Config()
foo.py (same as your original foo.py):
from params import config

def foo():
    return config.val + 5
main.py:
from params import config
from foo import foo

config.val = 10
print(foo())
In general, I don't think this is a good idea, as you're essentially creating global state that can be changed from any file that imports the configuration module. This is known as action at a distance.
The best answer is to avoid this pattern altogether. For example, come up with a way to use the configuration file in a read-only manner.
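One hedged sketch of what read-only could look like, using a frozen dataclass (an illustration, not the only option):
from dataclasses import dataclass

@dataclass(frozen=True)
class Config:
    val: int = 0

config = Config()      # importers can read config.val
# config.val = 10      # raises dataclasses.FrozenInstanceError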
That being said, if you really want to do this, make the variable class-level rather than instance-level, so that only one val exists, shared across the entire program.
class Config:
    val = 0

    def __init__(self, x=0):
        Config.val = x

config = Config()
Then, running main.py will print 15.
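It prints 15 because config.val now resolves to the class attribute Config.val, which main.py's Config(10) rebinds; every instance and every module holding any reference sees the same value. A quick check using the class above:
a = Config()
b = Config(10)
print(a.val, b.val, Config.val)   # 10 10 10, a single shared value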
from module import _dict, then _dict['new_key'] = 1, and now the same import in another file imports _dict with the previously non-existent key. This is a problem with pytest, which runs several test*.py files, each mutating _dict, so e.g. test_b imports a _dict already modified by test_a.
A workaround is _dict = copy.deepcopy(_dict) before mutating, but suppose that isn't desired. importlib.reload(module) will not reload _dict. That said, is there any way to ensure the original module._dict is always imported?
Example (also runnable code, without pytest).
# configs.py
_dict = {'a': 1, 'b': 2}
# non_test.py
from configs import _dict

class SomeClass():
    def __init__(self, a=None, b=None):
        self.a = a or _dict['a']
        self.b = b or _dict['b']
        del _dict['a']
# test_a.py
def test_class():
    SomeClass()

# test_b.py
def test_class():
    SomeClass()
Each test*.py has the following 'header' & 'footer':
import pytest
from non_test import SomeClass

# test_*()

if __name__ == '__main__':
    pytest.main([__file__, "-s"])
Note: the example isn't reflective of the actual context, within which I have a lot less flexibility. I'm not asking for a solution to 'the problem' itself; what I ask is right in the question's title. If it's "impossible" or there's nothing close to it, then that is the answer.
You can use the importlib.reload function to reload a module. Since _dict from configs is imported into module non_test, which is in turn imported into test_b, you should reload both configs and non_test, in that order, so that configs._dict is re-created first and then re-imported into non_test:
# test_b.py
import importlib
import configs
import non_test

importlib.reload(configs)     # re-create the original configs._dict
importlib.reload(non_test)    # re-run "from configs import _dict" against it
from non_test import SomeClass

def test_class():
    SomeClass()
Demo: https://repl.it/#blhsing/SaneDesertedMode
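As a hedged sanity check (attribute names taken from the question), after the two reloads both modules share one freshly built dict again:
import importlib
import configs, non_test

importlib.reload(configs)
importlib.reload(non_test)
assert non_test._dict is configs._dict   # one object, rebuilt from source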
Basic Setup
Suppose I want to create a class named Foo. I may create a file like so:
foo.py:
class Foo:
    def __init__(self):
        self.data = "world"

    def print(self):
        print("Hello, " + self.data)
To utilize this class in my main script:
main.py:
import foo

test = foo.Foo()
test.print()
Having to type foo.Foo() every time I instantiate the class already feels ridiculous enough, but it gets worse when I want to organize my code by separating my classes into a package:
classes/__init__.py:
# Empty
classes/foo.py:
# Copy of foo.py, above
main.py:
import classes.foo

test = classes.foo.Foo()
test.print()
Simple Answer
I know I can clean this up somewhat by using from X import Y like so:
from classes.foo import Foo
test = Foo()
Preferred Answer
Because the file foo.py contains only one member whose name matches the file, I would prefer if I could do something like the following:
from classes import Foo
# Or:
import classes.Foo as Foo
test = Foo()
Is there a way to do this? Maybe with some code in my __init__.py?
In classes/__init__.py, put:
from .foo import Foo
Now you can write from classes import Foo.
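With that single line in place, main.py reduces to the preferred form from the question:
from classes import Foo

test = Foo()
test.print()   # prints: Hello, world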
I have a global object which uses a class, and that class uses the global object. How can I arrange the imports correctly in a clean way?
I now have:
run.py (the file I run):
from global_class import Global_class

global_object = Global_class()
global_object.create_some_object()
global_class.py:
from some_class import Some_class

class Global_class:
    def __init__(self):
        self.name = 'my_name'

    def create_some_object(self):
        self.some_object = Some_class()
some_class.py:
class Some_class:
    def __init__(self):
        print(global_object.name)
How can I now access global_object in Some_class? If I put
from run import global_object
in some_class.py, it creates a circular dependency and crashes. A possible method I thought of was putting the some_class import inside the Global_class.create_some_object() method, but that seems like unclean code to me. Is there a better way?
Any Python import module or from module import Class statement runs the corresponding module line by line and loads all of the objects in that module's namespace into memory, but the names in each module reside separately (that is the purpose of modules, after all). So a global_object in some_class.py is completely separate from the global_object in run.py. When the interpreter sees this name in some_class.py, it looks it up using the LEGB rule (local, enclosing, global, built-in, in that order), and no binding for global_object exists anywhere along that chain; it exists only in the calling module. Your suggestion of putting the some_class import statement inside the create_some_object() method will not work either, for the same reason. And as you have found, you cannot import global_object from run inside some_class, as that would need to run run.py again, creating a loop.
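A minimal two-file sketch of that separation (illustrative names, not from the question):
# a.py
x = 1

# b.py
import a
print(a.x)   # fine: qualified access through the module
print(x)     # NameError: b's global namespace has no 'x'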
If you want to maintain this setup, one option is to explicitly pass the global object (which is self inside Global_class) to the Some_class() constructor, like below:
# run.py
from global_class import Global_class

global_object = Global_class()
global_object.create_some_object()

# global_class.py
from some_class import Some_class

class Global_class:
    def __init__(self):
        self.name = 'my_name'

    def create_some_object(self):
        self.some_object = Some_class(self)  # self -> global_object

# some_class.py
class Some_class:
    def __init__(self, global_object):
        print(global_object.name)
$ python run.py
my_name
I have three modules:
in_mod.py:
class IN(object):
    def __init__(self):
        print("i am the original IN")
module1.py:
from in_mod import IN

class C(object):
    def __init__(self):
        cl = IN()
and module2.py:
from module1 import C

class IN2(object):
    def __init__(self):
        print("I am the new IN")

C()

import in_mod
in_mod.IN = IN2
C()

import module1
module1.IN = IN2
C()
I get the desired behaviour of monkey-patching out the IN class and replacing it with the IN2 class when I use module1.IN = IN2.
I would like to understand what the underlying difference between in_mod.IN = IN2 and module1.IN = IN2 is in this context.
I am referencing a related post.
from module import IN creates a new name in the importing module's namespace that references module.IN; module.IN and IN are two separate references to the same class. IN = IN2 rebinds only the local reference, while module.IN (which is what module.C actually uses) continues to reference the original class.
UPDATE
In your edit, module1.IN is a global variable in the module1 namespace that initially refers to a class defined in in_mod, but it is distinct from the global IN in in_mod's own namespace. Changing its value doesn't affect in_mod at all. There's no way to directly access in_mod's namespace via module1, because module1 doesn't import the entire module, just one value from it.
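In short, a hedged recap (run inside module2.py, where IN2 is defined):
import in_mod
import module1

in_mod.IN = IN2    # rebinds in_mod's name; module1.IN still points at the old IN
module1.IN = IN2   # rebinds the name C actually looks up, so C() now uses IN2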
I'm trying to figure out how I can override default values of functions that are defined inside some module. Consider this code (program.py):
# import the default & user defined settings
from default_settings import *
from my_settings import *
# import some functions, which might be dependent on the settings above
from functions import *
# call dummy_function from 'functions' - prints '1'
dummy_function()
# output SOME_DEFAULT - will be '1'
print SOME_DEFAULT
# re-import 'my_settings' - SOME_DEFAULT will now be '2'
from my_settings import *
print SOME_DEFAULT
Here is default_settings.py:
DO_DEBUG = False
SOME_DEFAULT = 1
Here is my_settings.py, whose values I'd like to use inside functions.py:
DO_DEBUG = True
SOME_DEFAULT = 2
This is functions.py, where I need to import default_settings, otherwise I get a NameError. I don't want to import my_settings here, because functions.py should be more like a generic library.
# if I leave this line out, then I get a
# "NameError: name 'SOME_DEFAULT' is not defined"
from default_settings import *

# I don't want to add "from my_settings import *" here, because 'functions.py'
# is supposed to be a generic library of functions.

# dummy decorator
def do_profile(cond):
    def resdec(f):
        if cond:
            print "profiling!"
        return f
    return resdec

# dummy function depending on both 'DO_DEBUG' and 'SOME_DEFAULT'
@do_profile(DO_DEBUG)
def dummy_function(bla=SOME_DEFAULT):
    print bla
If I run python program.py I get the following output:
1
1
2
This is expected. The first 1 comes from dummy_function, the second 1 comes from the import of default_settings inside functions, and the 2 is a result of me re-importing my_settings.
Is there a way to override the default settings needed by dummy_function simply by using my_settings? I thought about leaving out the from default_settings import * line in functions, but then I run into the NameError. Is there a way to import from functions and at the same time pass all of these variables into functions?
You need to encapsulate your settings differently. Right now, you're using two different modules as containers for two different sets of settings. Then you import all the names from those modules, counting on from my_settings import * to overwrite the names imported by from default_settings import *. That's an abuse of import.
In general, I'd say that the names defined when you import a module should not be redefined implicitly. from module import * is already bad because it implicitly defines a bunch of names in the global namespace; using another * import to implicitly redefine those names is just scary.
My suggestion would be to either use a dictionary to store settings, or use a settings class. In the first case, you could do something like this:
# settings.py
default_settings = {'foo': True, 'bar': False}
my_settings = {'foo': False}
current_settings = default_settings.copy()
current_settings.update(my_settings)
Now any module can import settings and access them like this:
import settings

foo = settings.default_settings['foo']
bar = settings.current_settings['bar']
settings.current_settings['bar'] = True
Any changes to these settings are visible to all modules that have imported settings.
A more complex approach might be to use a Settings class. Settings would define some defaults:
class Settings(object):
    def __init__(self, foo=None, bar=None):
        self.foo = foo if foo is not None else True
        self.bar = bar if bar is not None else False
Now you can create various custom settings:
# settings.py
import copy

default_settings = Settings()
my_settings = Settings(foo=False)

current_settings = copy.copy(my_settings)  # Settings defines no copy() method itself
current_settings.foo = False               # pointless example
And again, as above, we import settings to access them or make changes:
# foo.py
import settings
bar = settings.current_settings.bar
settings.current_settings.foo = True
You can even inherit from Settings to create new defaults:
class LocalSettings(Settings):
    def __init__(self, foo=None, bar=None):            # in Python 3,
        super(LocalSettings, self).__init__(foo, bar)  # super().__init__(foo, bar) works
        self.foo = foo if foo is not None else True
And so on.
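A brief hedged usage of the classes above:
local = LocalSettings(bar=True)
print(local.foo, local.bar)   # True True: child default for foo, explicit bar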
Your functions are defined in functions.py at import time. So, if (in program.py) you write
# start of file -- do NOT "import functions" yet!
import default_settings
import my_settings

default_settings.DO_DEBUG = my_settings.DO_DEBUG
default_settings.SOME_DEFAULT = my_settings.SOME_DEFAULT

import functions
then the settings from my_settings should take over. I don't know if this is the solution you are looking for (it doesn't sit right with me), but I can't see any other option with this code structure.
edit
To alleviate the pain of resetting all the settings by hand, you could probably use the inspect module:
# start of file -- do NOT "import functions" yet!
import default_settings
import my_settings
import inspect

# pull out all of "my_settings" and apply them to "default_settings",
# but only if they don't start with an underscore (reserved)
members = inspect.getmembers(my_settings)
for k, v in members:
    if not k.startswith('_'):
        setattr(default_settings, k, v)

import functions
However, this still doesn't sit right with me. The thing I don't like is that the behavior of functions depends on when you import it, which is not something you typically see in Python. I think your code could benefit from some restructuring.
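One hedged sketch of such a restructuring, keeping the question's names: have functions.py look settings up at call time instead of baking them in as default argument values at import time.
# functions.py (sketch)
import default_settings as settings

def dummy_function(bla=None):
    if bla is None:
        bla = settings.SOME_DEFAULT   # resolved when called, not at import
    print(bla)                        # so later overrides are picked up
Then program.py can apply its overrides with setattr(default_settings, ...) as above, in any order relative to importing functions, and dummy_function() will still see the overridden values.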