How to initialize a helper method? - python

If I have two files helper.py and main.py, I want to be able to accomplish something like this.
helper.py
def configurestuff(dblocationstring):
    # Stuff that sets name and location
    generic_connection_variable = connectto(dblocationstring)

def dostuff():
    # does stuff with the generic_connection_variable
    pass
In my main.py, I want to be able to do something like
import helper
helper.configurestuff("customlocationofdb")
helper.dostuff()
#or even
helper.generic_connection_variable.someApplicableMethod()
My goal is for main.py to be able to use the helper, passing variables to set up a connection, and then reuse that connection variable within main.py after importing the helper. What is the best way to organize my code to accomplish this? (I'm not sure how to access generic_connection_variable in my main.py, since it lives inside a function, or what the best way to do this is.)

Implementing this as a class allows for greater flexibility:
class Config(object):
    DB_STRING = 'some default value'
    ANOTHER_SETTING = 'another default'
    DEBUG = True

    def dostuff(self):
        print('I did stuff to', self.DEBUG)

class ProductionConfig(Config):
    DEBUG = False  # only turn off debugging

class DevelopmentConfig(Config):
    DB_STRING = 'localhost'

    def dostuff(self):
        print('Warning! Development system', self.DEBUG)
Store this in any file, for example settings.py. In your code:
from settings import Config as settings
# or: from settings import ProductionConfig as settings

print(settings.DEBUG)  # for example
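For completeness, a short sketch of how these configs get used (dostuff is an instance method, so it needs an instance first):
from settings import DevelopmentConfig as settings

print(settings.DB_STRING)  # class attribute, no instance needed: 'localhost'
settings().dostuff()       # prints: Warning! Development system True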

You can define generic_connection_variable to be a module-level variable.
So in your helper.py you would have:
generic_connection_variable = None  # or whatever default you want

def configurestuff(dblocationstring):
    global generic_connection_variable
    # Stuff that sets name and location
    generic_connection_variable = connectto(dblocationstring)

def dostuff():
    # does stuff with the generic_connection_variable;
    # note: global is only needed when reassigning, not for reading
    pass
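With that in place, main.py can configure the module once and then reuse the shared variable (a sketch; connectto and someApplicableMethod are names from the question and assumed to exist):
import helper

helper.configurestuff("customlocationofdb")
helper.dostuff()
helper.generic_connection_variable.someApplicableMethod()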

It's a bit hard to tell what you are asking, but have you tried making generic_connection_variable an instance variable (set with the self keyword on a class in helper)?
# helper.py:
class Helper:
    def configurestuff(self, dblocationstring):
        # Stuff that sets name and location
        self.generic_connection_variable = connectto(dblocationstring)
Now that generic_connection_variable belongs to an instance of Helper instead of being local to configurestuff, you will be able to use it in main as follows:
import helper

h = helper.Helper()
h.configurestuff("customlocationofdb")
h.generic_connection_variable.someApplicableMethod()
But you probably need to define a class for generic_connection_variable so it has a method called someApplicableMethod().
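For illustration, a hypothetical sketch of such a class (only connectto and someApplicableMethod come from the question; everything else here is made up):
# hypothetical stand-in for whatever connectto() returns
class Connection:
    def __init__(self, dblocationstring):
        self.dblocationstring = dblocationstring

    def someApplicableMethod(self):
        print("connected to", self.dblocationstring)

def connectto(dblocationstring):
    return Connection(dblocationstring)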


Share global variables between Python scripts

I have 3 scripts: params.py (it defines a configuration class), foo.py (it uses that configuration) and main.py (it initializes the configuration and calls foo).
params.py:
class Config:
    def __init__(self, x=0):
        self.val = x

global config
config = Config()
foo.py:
from params import config

def foo():
    return config.val + 5
main.py:
from params import Config, config
from foo import foo

config = Config(10)
print(foo())
But instead of printing 15, it prints 5. How can I fix it? It occurs because when foo.py does the import, it initializes config with 0. But what can I do to modify the config value from main and read the new value from all other scripts?
Thank you!
Conceptually, you need to separate an object like Config() from the variables that may be referencing it at any given time. When params.py does config = Config(), it creates a Config object and assigns it to a variable in the params module namespace. It is params.config.
When main.py does from params import config, it adds a reference to this Config object to its own namespace. Now there are two references to the same object, one in params.config and another in main.config. So far, so good. from X import Y adds a binding to X.Y into the current namespace. Since params.config is a mutable class instance, main could change the values in that single Config object and it would be seen by all other referrers to that same object. config.val = 10 would be seen by all.
Now things go off the rails. When main does config = Config(10), it creates a new Config object and reassigns that variable to the main namespace. Now params.config references the first object and main references the second. That means changes made to the second object are not seen by the first.
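A quick illustration of the difference between mutation and rebinding, using a plain dict in place of Config:
a = {'val': 0}
b = a            # two names, one object
b['val'] = 10    # mutation: visible through both names
print(a['val'])  # 10
b = {'val': 99}  # rebinding: b now points at a brand-new object
print(a['val'])  # still 10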
If you want everyone to see the same object, you need to keep the namespace qualification. The scripts would change to
foo.py:
import params

def foo():
    return params.config.val + 5
main.py:
import params
from foo import foo

params.config = params.Config(10)
print(foo())
Now, all of the scripts are using the one variable params.config and see any changes made to that object. This is kind of fragile, as you've seen: if anybody does from params import config, reassigning params.config doesn't work.
global only marks a name in a local scope as being global; it has no effect in a global scope, since names there are already global.
What you want isn't really possible, as global namespaces are specific to an individual module, not the process as a whole.
If the value is defined in params.py, you will need to access it via params from all other modules, including the __main__ module created by your script.
params.py:
class Config:
    def __init__(self, x=0):
        self.val = x

config = Config()
foo.py:
import params

def foo():
    return params.config.val + 5
main.py:
import params
from foo import foo

params.config = params.Config(10)
print(foo())
If you simply modified the existing configuration, you could use
params.py (same as above):
class Config:
    def __init__(self, x=0):
        self.val = x

config = Config()
foo.py (same as your original foo.py):
from params import config

def foo():
    return config.val + 5
main.py:
from params import config
from foo import foo

config.val = 10
print(foo())
In general, I don't think this is a good idea, as you're essentially creating a global state that can change from any file that imports the configuration file. This is known as action at a distance.
The best answer is to avoid this pattern altogether. For example, come up with a way to use the configuration file in a read-only manner.
That being said, if you really want to do this, make the variable class-level rather than instance-level, so that there exists only one val shared across the entire program.
class Config:
    val = 0

    def __init__(self, x=0):
        Config.val = x

global config  # note: global is redundant at module level
config = Config()
Then, running main.py will print 15.
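A quick check of the sharing behavior, run alongside the class definition above:
c1 = Config()    # sets Config.val back to 0
c2 = Config(10)  # sets Config.val to 10 for everyone
print(c1.val, c2.val)  # 10 10 -- both instances read the single class attribute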

How to inject different environment variable in dev, test, prod in Python

I used to work with Flask which offers an easy way to configure the application running in different modes. (dev, test, prod, ...)
class BaseConfig:
    MY_PATH = "Something"

class DevelopmentConfig(BaseConfig):
    MY_PATH = "Something else"

# ...
I am trying to build something similar but without using Flask. Here is the structure of the simplest code I could find:
- src
  - main.py
  - zip2h5
    - __init__.py
    - foo.py
- test
  - __init__.py
  - test_foo.py
The class Foo in foo.py has a method path which outputs "path/in/dev" when in dev mode and "path/in/test" when in test mode. Writing if statements in the code would be messy and hard to test properly. Using an environment variable seems much better. How and where do I set the configuration the way Flask does?
# foo.py
class Foo():
    def __init__(self, name):
        self.name = name

    def path(self):
        return "path/in/dev"

# test_foo.py
import unittest
from zip2h5.foo import Foo  # import path inferred from the layout above

class TestFoo(unittest.TestCase):
    def test_path(self):
        boo = Foo("Boo")
        expected = "path/in/test"
        self.assertEqual(boo.path(), expected)
Please, do not tell me I can patch the method. As I have said, this is just an example.
The environment for your process is available via the os module.
You can simply inject different environment variables for the path in your dev and test cases. I'm not sure how you're running your tests, but usually you can do something like MY_PATH='path/in/test' tests.sh to accomplish what you need.
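A minimal sketch of what foo.py could look like, assuming an environment variable named MY_PATH (the name is borrowed from the question's Flask example; adapt as needed):
# foo.py
import os

class Foo:
    def __init__(self, name):
        self.name = name

    def path(self):
        # fall back to the dev path when MY_PATH is not set
        return os.environ.get("MY_PATH", "path/in/dev")
Then something like MY_PATH='path/in/test' python -m unittest discover test exercises the test path without touching the code.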
I use python-dotenv and keep .env files in my project root to manage this. I have a base test class that loads .env.test instead of .env for testing configuration.
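A sketch of that base class, assuming the python-dotenv package and a .env.test file in the project root (both are assumptions about this setup):
# base_test.py -- hypothetical module name
import unittest
from dotenv import load_dotenv  # pip install python-dotenv

class BaseTestCase(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # load test settings, overriding anything already in the environment
        load_dotenv(".env.test", override=True)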
Do it the same way Flask does it: have multiple Config classes, then pass env as a parameter, e.g.
class Foo():
    def __init__(self, name, env):
        self.name = name
        self.env = env

    def path(self):
        if self.env == 'TEST':
            # initialize the TestConfig class here
            return TestConfigPath
test_foo.py:
class TestFoo(unittest.TestCase):
    def test_path(self):
        boo = Foo("Boo", env="TEST")
        expected = "path/in/test"
        self.assertEqual(boo.path(), expected)

web.py: avoid global instances?

For the last few weeks, I have been playing a little bit with the web.py framework. As my application is now getting bigger and bigger, I want to restructure the source code and put code fragments into different classes. Now, I don't really know where I should create my object instances if I need them in several web.py classes. Let us assume my source code looks like this:
import web
import myclass

urls = (
    '/', 'index',
    '/test', 'test'
)

# should I make my instance global...
my = myclass.myClass()

class test:
    def __init__(self):
        # ...or should I make my instance local: my = myclass.myClass()
        pass

    def GET(self):
        item = my.getItem()
        return item

    def POST(self):
        pass

class index:
    def __init__(self):
        # ...or should I make my instance local: my = myclass.myClass()
        pass

    def GET(self):
        date = my.getDate()
        return date

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()
Now, I want to access the methods getItem() and getDate() (which belong to the instance my) when the appropriate pages are requested in my web browser. My question is: should I make the instance global, or is it better to make it local? I really don't like global instances, but I don't see any other way than making it global. Sure, it would be possible to create a local instance, but then a new instance would be created every time the page loads, right? Normally this wouldn't be a problem, but myclass accesses a serial port, so I need to make sure that only one instance is created.
Am I missing something or is a global instance the only possible solution to accomplish this?
After some research, I came to the conclusion that global instances are the way to go here. However, one must be careful with global instances if they are used together with web.py's auto-reload mode. In auto-reload mode, the module is re-executed on every page load, so a global instance would be re-created each time. If you want to avoid that, you have to use something like this:
import web
import serial

urls = (
    "/(.*)", "test"
)

web.config.debug = True

# in debug (auto-reload) mode the module is re-executed on every request,
# so reuse the serial instance stashed on the web module instead of
# opening the port again; it is only created once, at start-up
if web.config.debug and hasattr(web, "_serObj"):
    serObj = web._serObj
else:
    serObj = serial.Serial(0, 9600, parity=serial.PARITY_NONE)
    web._serObj = serObj

class test:
    def GET(self):
        return "Test"

    def POST(self):
        pass

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()

how to change a variable from a module inside a module in a module... (confused)

I have a variable in my main module which is changed using another module, but I want to change that variable from my main module through yet another module. I'm new to programming, so I don't really know how to explain this stuff - sorry if I'm asking a stupid question.
The program's hierarchy looks a bit like this:
Main
---Features
---Pygame_handling
------Features
I use the "Features" module to change a variable in the "Main". I do this simply by getting the defined variable from "Features". But when I change the variable through "Pygame_handling", it is not changed in the "Features" object created in the "Main" module.
Main.py
import Features
import Pygame_handling  # needed for Pygame_handling.Window() below

class Simulator:
    def __init__(self):
        self.Features = Features.Methods()
        self.variables = self.Features.dictionary
        self.PyObject = Pygame_handling.Window()
Pygame_handling.py
import Features

class Window:
    def __init__(self):
        self.Features = Features.Methods()
        dict = {"some": "dict"}
        self.Features.monitor_changes(dict)
How are you initializing those classes?
Usually when I need to do something like this, I code it like:
py1.py:
class Test(object):
    def test_print(self):
        print('Hi!')

TEST = Test()
py2.py:
from py1 import TEST

TEST.test_print()

# Adding new attributes to the initialized TEST instance
TEST.new_var = 50
print(TEST.new_var)
# output: 50
So now you can just use the one instance initialized in that module.

how to override default values needed by function in module?

I'm trying to figure out how I can override default values of functions that are defined inside some module. Consider this code (program.py):
# import the default & user-defined settings
from default_settings import *
from my_settings import *

# import some functions, which might depend on the settings above
from functions import *

# call dummy_function from 'functions' - prints '1'
dummy_function()

# output SOME_DEFAULT - will be '1'
print(SOME_DEFAULT)

# re-import 'my_settings' - SOME_DEFAULT will now be '2'
from my_settings import *
print(SOME_DEFAULT)
here is default_settings.py:
DO_DEBUG = False
SOME_DEFAULT = 1
here is my_settings.py, whose values I'd like to use inside functions.py:
DO_DEBUG = True
SOME_DEFAULT = 2
This is functions.py, where I need to import default_settings, otherwise I get a NameError. I don't want to import my_settings here, because functions.py should be more like a generic library.
# if I leave this line out, then I get a
# "NameError: name 'SOME_DEFAULT' is not defined"
from default_settings import *

# I don't want to add "from my_settings import *" here, because 'functions.py'
# is supposed to be a generic library of functions.

# dummy decorator
def do_profile(cond):
    def resdec(f):
        if cond:
            print("profiling!")
        return f
    return resdec

# dummy function depending on both 'DO_DEBUG' and 'SOME_DEFAULT'
@do_profile(DO_DEBUG)
def dummy_function(bla=SOME_DEFAULT):
    print(bla)
If I run python program.py I get the following output:
1
1
2
This is expected. The first 1 comes from dummy_function, the second 1 comes from the import of default_settings inside functions, and the 2 is the result of re-importing my_settings.
Is there a way that I can override the default settings needed by dummy_function simply by using my_settings? I thought about leaving out the from default_settings import * line in functions, but then I run into the NameError. Is there a way to import from functions and at the same time pass all these variables into functions?
You need to encapsulate your settings differently. Right now, you're using two different modules as containers for two different sets of settings. Then you import all the names from those modules, counting on from my_settings import * to overwrite the names imported by from default_settings import *. That's an abuse of import.
In general, I'd say that the names defined when you import a module should not be redefined implicitly. from module import * is already bad because it implicitly defines a bunch of names in the global namespace; using another * import to implicitly redefine those names is just scary.
My suggestion would be to either use a dictionary to store settings, or use a settings class. In the first case, you could do something like this:
# settings.py
default_settings = {'foo': True, 'bar': False}
my_settings = {'foo': False}
current_settings = default_settings.copy()
current_settings.update(my_settings)
Now any module can import settings and access them like this:
foo = settings.default_settings['foo']
bar = settings.current_settings['bar']
settings.current_settings['bar'] = True
Any changes to these settings are visible to all modules that have imported settings.
A more complex approach might be to use a Settings class. Settings would define some defaults:
class Settings(object):
    def __init__(self, foo=None, bar=None):
        self.foo = foo if foo is not None else True
        self.bar = bar if bar is not None else False
Now you can create various custom settings:
# settings.py
import copy

default_settings = Settings()
my_settings = Settings(foo=False)

# Settings has no copy() method of its own, so use the copy module
current_settings = copy.copy(my_settings)
current_settings.foo = False  # pointless example
And again, as above, we import settings to access them or make changes:
# foo.py
import settings

bar = settings.current_settings.bar
settings.current_settings.foo = True
You can even inherit from Settings to create new defaults:
class LocalSettings(Settings):
    def __init__(self, foo=None, bar=None):
        # in Python 3, plain super().__init__(foo, bar) works too
        super(LocalSettings, self).__init__(foo, bar)
        self.foo = foo if foo is not None else True
And so on.
Your functions are defined in functions.py at import time -- so if, in program.py, you do
# start of file -- DO NOT "import functions" YET!
import default_settings
import my_settings

default_settings.DO_DEBUG = my_settings.DO_DEBUG
default_settings.SOME_DEFAULT = my_settings.SOME_DEFAULT

import functions
then the settings from my_settings should take over. I don't know if this is the solution you are looking for (it doesn't sit right with me), but I can't see any other option with this code structure.
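The reason the import order matters is that default argument values are evaluated once, when the def statement runs at import time; a quick illustration:
SOME_DEFAULT = 1

def dummy_function(bla=SOME_DEFAULT):  # default captured here, at def time
    print(bla)

SOME_DEFAULT = 2
dummy_function()  # still prints 1: rebinding the name later doesn't change the default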
edit
To alleviate the pain of resetting all the settings by hand, you could probably use the inspect module:
# start of file -- DO NOT "import functions" YET!
import default_settings
import my_settings
import inspect

# pull out all of "my_settings" and apply them to "default_settings",
# but only if they don't start with an underscore (reserved)
members = inspect.getmembers(my_settings)
for k, v in members:
    if not k.startswith('_'):
        setattr(default_settings, k, v)

import functions
However, this still doesn't sit right with me -- the behavior of functions depends on when you import it, which is not something you typically see in Python. I think your code could benefit from some restructuring.
