I'm trying to figure out how I can override default values of functions that are defined inside some module. Consider this code (program.py):
# import the default & user defined settings
from default_settings import *
from my_settings import *
# import some functions, which might be dependent on the settings above
from functions import *
# call dummy_function from 'functions' - prints '1'
dummy_function()
# output SOME_DEFAULT - will be '1'
print SOME_DEFAULT
# re-import 'my_settings' - SOME_DEFAULT will now be '2'
from my_settings import *
print SOME_DEFAULT
Here is default_settings.py:
DO_DEBUG = False
SOME_DEFAULT = 1
Here is my_settings.py, whose values I'd like to use inside functions.py:
DO_DEBUG = True
SOME_DEFAULT = 2
This is functions.py, where I need to import default_settings, otherwise I get a NameError. I don't want to import my_settings here, because functions.py should be more like a generic library.
# if I leave this line out, then I get a
# "NameError: name 'SOME_DEFAULT' is not defined"
from default_settings import *
# I don't want to add "from my_settings import *" here, because 'functions.py'
# is supposed to be a generic library of functions.
# dummy decorator.
def do_profile(cond):
    def resdec(f):
        if cond:
            print "profiling!"
        return f
    return resdec

# dummy function depending on both 'DO_DEBUG' and 'SOME_DEFAULT'
@do_profile(DO_DEBUG)
def dummy_function(bla=SOME_DEFAULT):
    print bla
If I run python program.py I get the following output:
1
1
2
This is expected. The first 1 comes from dummy_function, the second 1 comes from the import of default_settings inside functions, and the 2 is a result of re-importing my_settings.
Is there a way to override the default settings needed by dummy_function simply by using my_settings? I thought about leaving out the from default_settings import * line in functions, but then I run into the NameError. Is there a way to import from functions and, at the same time, pass all of these variables into functions?
You need to encapsulate your settings differently. Right now, you're using two different modules as containers for two different sets of settings. Then you import all the names from those modules, counting on from my_settings import * to overwrite the names imported by from default_settings import *. That's an abuse of import.
In general, I'd say that the names defined when you import a module should not be redefined implicitly. from module import * is already bad because it implicitly defines a bunch of names in the global namespace; using another * import to implicitly redefine those names is just scary.
My suggestion would be to either use a dictionary to store settings, or use a settings class. In the first case, you could do something like this:
# settings.py
default_settings = {'foo': True, 'bar': False}
my_settings = {'foo': False}
current_settings = default_settings.copy()
current_settings.update(my_settings)
Now any module can import settings and access them like this:
import settings

foo = settings.default_settings['foo']
bar = settings.current_settings['bar']
settings.current_settings['bar'] = True
Any changes to these settings are visible to all modules that have imported settings.
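Tying this back to your functions.py: if dummy_function looks the value up in the dictionary at call time instead of freezing it into a default argument, later changes to current_settings are picked up automatically. A minimal sketch against the settings.py above (reusing its 'foo' key):
# functions.py -- a sketch, assuming the dict-based settings.py above
import settings

def dummy_function(bla=None):
    # Resolve the setting when the function is called, not when it is
    # defined, so changes to settings.current_settings are honored.
    if bla is None:
        bla = settings.current_settings['foo']
    print(bla)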
A more complex approach might be to use a Settings class. Settings would define some defaults:
class Settings(object):
    def __init__(self, foo=None, bar=None):
        self.foo = foo if foo is not None else True
        self.bar = bar if bar is not None else False
Now you can create various custom settings:
# settings.py
import copy

default_settings = Settings()
my_settings = Settings(foo=False)
current_settings = copy.copy(my_settings)  # Settings has no copy() method of its own
current_settings.foo = False  # pointless example
And again, as above, we import settings to access them or make changes:
# foo.py
import settings
bar = settings.current_settings.bar
settings.current_settings.foo = True
You can even inherit from Settings to create new defaults:
class LocalSettings(Settings):
    def __init__(self, foo=None, bar=None):           # in Python 3,
        super(LocalSettings, self).__init__(foo, bar) # super().__init__ works
        self.foo = foo if foo is not None else False  # new default for foo
And so on.
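A quick, hypothetical check of the subclass defaults defined above:
local = LocalSettings()
print(local.foo)   # False -- the subclass's new default
print(local.bar)   # False -- the default inherited from Settings

custom = LocalSettings(bar=True)
print(custom.bar)  # True -- explicit arguments still win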
Your functions are defined in functions.py at import time -- so, if (in program.py) you
#start of file DO NOT "import functions" YET!!!
import default_settings
import my_settings
default_settings.DO_DEBUG=my_settings.DO_DEBUG
default_settings.SOME_DEFAULT=my_settings.SOME_DEFAULT
import functions
then the settings from my_settings should take over. I don't know if this is the solution you are looking for (it doesn't sit right with me), but I can't see any other option with this code structure.
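For completeness, program.py could then continue like this (a sketch; note that because DO_DEBUG was patched to True before the import, the do_profile decorator also prints 'profiling!' during import functions):
functions.dummy_function()           # prints 2, the patched default
print default_settings.SOME_DEFAULT  # prints 2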
edit
To alleviate the pain of resetting all the settings by hand, you could probably use the inspect module:
#start of file DO NOT "import functions" YET!!!
import default_settings
import my_settings
import inspect
#pull out all of "my_settings" and apply them to "default_settings",
# but only if they don't start with an underscore (reserved)
members = inspect.getmembers(my_settings)
for k, v in members:
    if not k.startswith('_'):
        setattr(default_settings, k, v)
import functions
However, this still doesn't sit right with me -- the thing I don't like is that the behavior of functions depends on when you import it, which is not something you typically see in Python. I think your code could benefit from some restructuring.
Related
I have 3 scripts: params.py (it defines a configuration class), foo.py (it uses that configuration) and main.py (it initializes the configuration and calls foo).
params.py:
class Config:
    def __init__(self, x=0):
        self.val = x

global config
config = Config()
foo.py:
from params import config

def foo():
    return config.val + 5
main.py:
from params import config, Config
from foo import foo

config = Config(10)
print(foo())
But instead of printing 15, it prints 5. How can I fix it? It happens because when foo.py does the import, it initializes config with 0. But what can I do to modify the config value from main and have the new value read by all the other scripts?
Thank you!
Conceptually, you need to separate an object like Config() from the variables that may be referencing it at any given time. When params.py does config = Config(), it creates a Config object and assigns it to a variable in the params module namespace. It is params.config.
When main.py does from params import config, it adds a reference to this Config object to its own namespace. Now there are two references to the same object, one in params.config and another in main.config. So far, so good. from X import Y adds a binding to X.Y into the current namespace. Since params.config is a mutable class instance, main could change the values in that single Config object and it would be seen by all other referrers to that same object. config.val = 10 would be seen by all.
Now things go off the rails. When main does config = Config(10), it creates a new Config object and rebinds the config variable in the main namespace. Now params.config references the first object and main.config references the second. That means changes made to the second object are not seen by the first.
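The distinction in isolation, using a plain dict:
a = {'val': 0}
b = a              # two names, one object
b['val'] = 10      # mutation: visible through both names
print(a['val'])    # 10
b = {'val': 99}    # rebinding: b now references a brand-new object
print(a['val'])    # still 10 -- a is untouched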
If you want everyone to see the same object, you need to keep the namespace qualification. The scripts would change to
foo.py:
import params

def foo():
    return params.config.val + 5
main.py:
import params
from foo import foo

params.config = params.Config(10)
print(foo())
Now all of the scripts are using the one variable params.config and see any changes made to that object. This is still somewhat fragile, as you've seen: if anybody does from params import config, reassigning params.config doesn't work for them.
global only marks a name in a local scope as being global; it has no effect in a global scope, where the name is already global.
What you want isn't really possible, as global namespaces are specific to an individual module, not the process as a whole.
If the value is defined in params.py, you will need to access it via params from all other modules, including the __main__ module created by your script.
params.py:
class Config:
    def __init__(self, x=0):
        self.val = x

config = Config()
foo.py:
import params

def foo():
    return params.config.val + 5
main.py:
import params
from foo import foo
params.config = params.Config(10)
print(foo())
If you simply modify the existing configuration instead of replacing it, you can use
params.py (same as above):
class Config:
    def __init__(self, x=0):
        self.val = x

config = Config()
foo.py (same as your original foo.py):
from params import config

def foo():
    return config.val + 5
main.py:
from params import config
from foo import foo
config.val = 10
print(foo())
In general, I don't think this is a good idea, as you're essentially creating a global state that can change from any file that imports the configuration file. This is known as action at a distance.
The best answer is to avoid this pattern altogether. For example, come up with a way to use the configuration file in a read-only manner.
That being said, if you really want to do this, make the variable class-level rather than instance-level, so that there exists only one val shared across the entire program.
class Config:
    val = 0
    def __init__(self, x=0):
        Config.val = x

global config
config = Config()
Then, running main.py will print 15.
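A quick trace of why this works:
c1 = Config()    # __init__ sets Config.val = 0
c2 = Config(10)  # __init__ sets Config.val = 10
print(c1.val)    # 10 -- c1 has no instance attribute 'val', so the lookup
                 # falls through to the class attribute shared by everyone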
I am trying to dynamically import modules and make them available as global variables.
I am using maya 2020 python interpreter (Python 2.7)
I have a test module called "trigger_test_script.py" under "/home/arda.kutlu/Downloads/" folder.
When I don't import any custom class and run this:
###########################################################################[START]
import sys
import imp

class TestClass(object):
    def __init__(self):
        self.filePath = None
        self.asName = None

    def action(self):
        exec("global %s" % self.asName, globals())
        foo = "imp.load_source('%s', '/home/arda.kutlu/Downloads/trigger_test_script.py')" % self.asName
        cmd = "{0}={1}".format(self.asName, foo)
        exec(cmd, globals())
###########################################################################[END]
test = TestClass()
test.filePath = "/home/arda.kutlu/Downloads/trigger_test_script.py"
test.asName = "supposed_to_be_global"
test.action()
print(supposed_to_be_global)
I get the exact result that I want:
<module 'trigger_test_script' from '/home/arda.kutlu/Downloads/trigger_test_script.pyc'>
However, when I save TestClass (the part between hashes) into a file and import it like this:
import testClass
test = testClass.TestClass()
test.filePath = "/home/arda.kutlu/Downloads/trigger_test_script.py"
test.asName = "supposed_to_be_global"
test.action()
print(supposed_to_be_global)
the variable 'supposed_to_be_global' does not become global and I get a NameError.
I always assumed that these two usages should return the same result but clearly I am missing something.
I appreciate any help, thanks.
I don't quite understand your last comment about why having several modules with different action() methods is a problem, so I'll ignore that. Here's how to make what's in your question work; the part between the hashes works both inline and when put in a separate module and imported. (For what it's worth, your version fails when imported because exec(cmd, globals()) uses the globals of the module where TestClass is defined, so the name lands in testClass's namespace rather than your script's; returning the module and binding it at the call site sidesteps that.)
###########################################################################[START]
import imp

class TestClass(object):
    def __init__(self):
        self.filePath = None
        self.asName = None

    def action(self):
        foo = imp.load_source(self.asName, self.filePath)
        return foo
###########################################################################[END]
#from testclass import TestClass
test = TestClass()
test.filePath = "/home/arda.kutlu/Downloads/trigger_test_script.py"
test.asName = "supposed_to_be_global"
supposed_to_be_global = test.action()
print(supposed_to_be_global)
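For reference, on Python 3 (where imp is deprecated) the same load could be done with importlib; a sketch:
import importlib.util

def load_module(name, path):
    # Rough equivalent of imp.load_source
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

supposed_to_be_global = load_module('supposed_to_be_global',
                                    '/home/arda.kutlu/Downloads/trigger_test_script.py')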
Basic Setup
Suppose I want to create a class named Foo. I may create a file like so:
foo.py:
class Foo:
    def __init__(self):
        self.data = "world"

    def print(self):
        print("Hello, " + self.data)
To utilize this class in my main script:
main.py:
import foo
test = foo.Foo()
test.print()
Having to type foo.Foo() every time I instantiate the class already feels ridiculous enough, but it gets worse when I want to organize my code by separating my classes into a package:
classes/__init__.py:
# Empty
classes/foo.py:
# Copy of foo.py, above
main.py:
import classes.foo
test = classes.foo.Foo()
test.print()
Simple Answer
I know I can clean this up somewhat by using from X import Y like so:
from classes.foo import Foo
test = Foo()
Preferred Answer
Because the file foo.py contains only one member whose name matches the file, I would prefer if I could do something like the following:
from classes import Foo
# Or:
import classes.Foo as Foo
test = Foo()
Is there a way to do this? Maybe with some code in my __init__.py?
In classes/__init__.py, put:
from .foo import Foo
Now you can write from classes import Foo.
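main.py then reads exactly the way you wanted:
from classes import Foo

test = Foo()
test.print()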
In order to avoid a circular import, I've been forced to define a function that looks like:
# do_something.py
def do_it():
    from .helpers import do_it_helper
    # do stuff
Now I'd like to be able to test this function, with do_it_helper patched over. If the import were a top level import,
class Test_do_it(unittest.TestCase):
    def test_do_it(self):
        with patch('do_something.do_it_helper') as helper_mock:
            helper_mock.return_value = 12
            # test things
would work fine. However, the code above gives me:
AttributeError: <module 'do_something'> does not have the attribute 'do_it_helper'
On a whim, I also tried changing the patch statement to:
with patch('do_something.do_it.do_it_helper') as helper_mock:
But that produced a similar error. Is there any way to mock this function, given the fact that I'm forced into importing it within the function where it's used?
You should mock out helpers.do_it_helper:
class Test_do_it(unittest.TestCase):
    def test_do_it(self):
        with patch('helpers.do_it_helper') as helper_mock:
            helper_mock.return_value = 12
            # test things
Here's an example using mock on os.getcwd():
import unittest
from mock import patch

def get_cwd():
    from os import getcwd
    return getcwd()

class MyTestCase(unittest.TestCase):
    @patch('os.getcwd')
    def test_mocked(self, mock_function):
        mock_function.return_value = 'test'
        self.assertEqual(get_cwd(), 'test')
If I have two files, helper.py and main.py, I want to be able to accomplish something like this.
helper.py
def configurestuff(dblocationstring):
    # Stuff that sets name and location
    generic_connection_variable = connectto(dblocationstring)

def dostuff():
    # does stuff with the generic_connection_variable
    pass
In my main.py, I want to be able to do something like
import helper

helper.configurestuff("customlocationofdb")
helper.dostuff()
# or even
helper.generic_connection_variable.someApplicableMethod()
My goal is for main.py to be able to use the helper, passing variables to set up a connection, and to reuse that connection variable within main.py after importing the helper. What is the best way to organize my code to accomplish this? (I'm not sure how to access generic_connection_variable in my main.py, as it is inside a function, or what the best way to do this is.)
Implementing this as a class allows for greater flexibility:
class Config(object):
    DB_STRING = 'some default value'
    ANOTHER_SETTING = 'another default'
    DEBUG = True

    def dostuff(self):
        print 'I did stuff to ', self.DEBUG

class ProductionConfig(Config):
    DEBUG = False  # only turn off debugging

class DevelopmentConfig(Config):
    DB_STRING = 'localhost'

    def dostuff(self):
        print 'Warning! Development system ', self.DEBUG
Store this in any file, for example settings.py. In your code:
from settings import Config as settings
# or from settings import ProductionConfig as settings
print settings.DEBUG # for example
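Switching environments is then a one-line change at the import site (Python 2, matching the code above):
from settings import DevelopmentConfig as settings
print settings.DB_STRING  # 'localhost', overridden in DevelopmentConfig
print settings.DEBUG      # True, inherited from Config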
You can define generic_connection_variable to be a module level variable.
So in your helper.py you would have:
generic_connection_variable = None  # or whatever default you want

def configurestuff(dblocationstring):
    global generic_connection_variable
    # Stuff that sets name and location
    generic_connection_variable = connectto(dblocationstring)

def dostuff():
    global generic_connection_variable
    # does stuff with the generic_connection_variable
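main.py can then drive it through the module's attributes (someApplicableMethod is the placeholder from your question):
import helper

helper.configurestuff("customlocationofdb")
helper.dostuff()
helper.generic_connection_variable.someApplicableMethod()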
It's a bit hard to tell what you are asking, but have you tried making generic_connection_variable an instance variable of helper? (with the self keyword)
# helper.py:
def configurestuff(dblocationstring):
    # Stuff that sets name and location
    self.generic_connection_variable = connectto(dblocationstring)
Now that generic_connection_variable belongs to an instance of helper instead of being local-scoped to configurestuff, you will be able to use it in main as follows:
import helper
helper.configure("customlocationofdb")
helper.generic_connection_variable.someApplicableMethod()
But you probably need to define a class for generic_connection_variable so it has a method called someApplicableMethod().
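A sketch of that class-based arrangement (connectto and someApplicableMethod are placeholders carried over from the question):
# helper.py -- hypothetical class-based variant
class Helper(object):
    def __init__(self):
        self.generic_connection_variable = None

    def configurestuff(self, dblocationstring):
        # Stuff that sets name and location
        self.generic_connection_variable = connectto(dblocationstring)

    def dostuff(self):
        # does stuff with self.generic_connection_variable
        pass

# main.py
# from helper import Helper
# h = Helper()
# h.configurestuff("customlocationofdb")
# h.generic_connection_variable.someApplicableMethod()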