Django Middleware context handling issue related to variable scope - python

I'm running into a weird variable scope issue when trying to define a Django middleware that keeps the request in the thread context. The first code section raises an error when I try to access the "get" method from the views file; the second code example works fine. Why?
Example 1 (does not work):

class ContextHandler(object):
    # _LOCALS = threading.local()
    def process_request(self, request):
        self._LOCALS = threading.local()
        self._LOCALS.x = "alon"
        return None
Example 2 (works):

class ContextHandler(object):
    _LOCALS = threading.local()
    def process_request(self, request):
        self._LOCALS.x = "alon"
        return None
Common get method:

    @classmethod
    def get(cls):
        return getattr(cls._LOCALS, 'x', None)
Thanks!

In the first example _LOCALS is not a class attribute, it is an instance attribute created in process_request(). So ContextHandler itself has no _LOCALS, and since get() is a classmethod, cls is ContextHandler and cls._LOCALS raises AttributeError.
If you want thread-safe code, don't stick with @classmethod and

class ContextHandler(object):
    _LOCALS = threading.local()

as far as I know the class definition is processed only once (most likely in the main thread). I'd rather initialize _LOCALS in process_request() and make get() an instance method:
class ContextHandler(object):
    def process_request(self, request):
        self._LOCALS = threading.local()
        self._LOCALS.x = "alon"
        return None

    def get(self):
        return getattr(self._LOCALS, 'x', None)

Turns out that reading self.someproperty falls back to the class-level someproperty when the instance doesn't define its own, so in the second example self._LOCALS resolves to the shared class attribute.
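For illustration, a minimal sketch (names made up, not from the question) of the lookup behaviour described above: reading an attribute through self falls back to the class attribute when the instance has none, while assigning through self creates an instance attribute that shadows it.

import threading

class Handler(object):
    _LOCALS = threading.local()              # class attribute, shared by all instances

    def touch_class_attr(self):
        self._LOCALS.x = "alon"              # lookup finds the class attribute, sets x on it

    def shadow_class_attr(self):
        self._LOCALS = threading.local()     # assignment creates a new instance attribute

h = Handler()
h.touch_class_attr()
print(Handler._LOCALS.x)                     # "alon" -- the shared class attribute was used
h.shadow_class_attr()
print(h._LOCALS is Handler._LOCALS)          # False -- the instance attribute now shadows it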


How to access the variables created within a `with` statement

I have defined a Python context class and a Test class in a file:

class Test(object):
    pass

class MyContext(object):
    def __init__(self):
        self._vars = []

    def __enter__(self):
        pass

    def __exit__(self, exc_type, exc_value, traceback):
        pass
In another file using that context:
from somewhere import Test, MyContext

with MyContext() as ctx:
    mytest = Test()
So what I want to achieve is that when I exit the context, I am aware of the mytest instance that was created and it has been added to ctx._vars (i.e. ctx._vars == [<instance of Test>]).
I don't want to have a ctx.add_var(mytest) method; I want those Test instances to be added to the ctx instance automatically.
That can be done using Python's introspection capabilities, but you have to be aware that this is not what the with block was created for.
I agree it is a useful syntactic construction that can be "diverted" to do things like what you want: record the objects created inside a code block in a "registry".
Before showing how to do that with a context manager, consider whether a class body would suffice. Using a class body this way also deviates from its primary purpose, but you get your "registry" for free:
from somewhere import Test, MyContext

class ctx:
    mytest = Test()

vars = ctx.__dict__.values()
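Note that ctx.__dict__ also contains entries such as __module__ and __doc__, so if you only want the Test instances a small filter helps; a minimal sketch:

vars = [v for v in ctx.__dict__.values() if isinstance(v, Test)]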
In order to do that with a context manager, you have to inspect the local variables at the start and at the end of the with block. While that is not hard to do, it would not cover all instances of Test created, because if the code is like this:

mytests = []
with MyContext() as ctx:
    mytests.append(Test())

no new variable is created, so code tracking the local variables would not find anything. Code could be written to look recursively into variables that are containers, such as dictionaries and lists, but then Test instances could be added to a container referenced by a global variable, or by a variable in another module.
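Just to illustrate that limitation, a naive frame-introspection sketch (hypothetical, not part of the original answer): it only notices new names bound in the caller's frame, so the mytests.append(Test()) case above goes completely unseen.

import sys

class NaiveContext(object):
    def __enter__(self):
        self._frame = sys._getframe(1)                  # the caller's frame
        self._before = set(self._frame.f_locals)
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        new_names = set(self._frame.f_locals) - self._before
        self.vars = [self._frame.f_locals[name] for name in new_names]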
It turns out that a reliable way to track Test instances is to instrument the Test class itself to record new instances in a registry. That is far easier and less dependent on "local variable introspection" tricks.
The code for that is somewhat like:
class Test(object):
    pass

class MyContext(object):
    def __init__(self, *args):
        self.vars = []
        self.track = args
        self.original_new = {}

    def patch(self, cls_to_patch):
        cls_new = getattr(cls_to_patch, "__new__")
        if "__new__" in cls_to_patch.__dict__:
            self.original_new[cls_to_patch] = cls_new

        def patched_new(cls, *args, **kwargs):
            instance = cls_new(cls, *args, **kwargs)
            self.vars.append(instance)
            return instance

        cls_to_patch.__new__ = patched_new

    def restore(self, cls):
        if cls in self.original_new:
            # the class had its own __new__ prior to patching
            cls.__new__ = self.original_new[cls]
        else:
            # just remove the wrapping __new__; this restores access to the superclass __new__
            del cls.__new__

    def __enter__(self):
        for cls in self.track:
            self.patch(cls)
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        for cls in self.track:
            self.restore(cls)
...
from somewhere import Test, MyContext

with MyContext(Test) as ctx:
    mytest = Test()
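After the block exits, ctx.vars should hold every Test instance created inside it, whether or not a name was bound to it (assuming the patched MyContext sketch above):

with MyContext(Test) as ctx:
    mytest = Test()
    others = [Test() for _ in range(3)]

print(len(ctx.vars))   # 4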

Flask-Admin different forms and column_list for different roles

Following up on the question Flask-Admin Role Based Access - Modify access based on role, I don't understand how to implement role-based views, especially regarding form_columns and column_list.
Say I want MyModelView to show different columns depending on whether the user is a regular user or a superuser.
Overriding is_accessible in MyModelView has no effect at all:
from flask_security import Security, SQLAlchemyUserDatastore, current_user

class MyModelView(SafeModelView):
    # ...
    def is_accessible(self):
        if current_user.has_role('superuser'):
            self.column_list = superuser_column_list
            self.form_columns = superuser_form_columns
        else:
            self.column_list = user_column_list
            self.form_columns = user_form_columns
        return super(MyModelView, self).is_accessible()

    # Has the same effect as
    def is_accessible(self):
        return super(MyModelView, self).is_accessible()
Defining conditional class attributes does not work either, as current_user is not defined (a NoneType error, as in AttributeError on current_user.is_authenticated()). Doing the same in the ModelView's __init__ is equivalent: current_user is still not defined.
class MyModelView(SafeModelView):
    # [stuff]
    if current_user.has_role('superuser'):
        column_list = superuser_column_list
        form_columns = superuser_form_columns
    else:
        column_list = user_column_list
        form_columns = user_form_columns
    # [other stuff]
FYI, SafeModelView can be any class inheriting from dgBaseView in the previously mentioned question.
I usually define view class attributes such as column_list as properties. It allows you to add some dynamic logic to them:
from flask import has_app_context
from flask_security import current_user

class MyModelView(SafeModelView):
    @property
    def column_list(self):
        if has_app_context() and current_user.has_role('superuser'):
            return superuser_column_list
        return user_column_list

    @property
    def _list_columns(self):
        return self.get_list_columns()

    @_list_columns.setter
    def _list_columns(self, value):
        pass
The problem with this approach (and why reassigning the column_list values in your is_accessible function had no effect) is that many view attributes are cached at application launch and stored in private attributes. column_list, for example, is cached in the _list_columns attribute, so you need to redefine that as well. You can see how this caching works in the flask_admin.model.base.BaseModelView._refresh_cache method.
Flask's has_app_context function is needed here because the first read of column_list happens at application launch, when the current_user variable has no meaningful value yet.
The same can be done with the form_columns attribute. The properties will look like this:
@property
def form_columns(self):
    if has_app_context() and current_user.has_role('superuser'):
        return superuser_form_columns
    return user_form_columns

@property
def _create_form_class(self):
    return self.get_create_form()

@_create_form_class.setter
def _create_form_class(self, value):
    pass

@property
def _edit_form_class(self):
    return self.get_edit_form()

@_edit_form_class.setter
def _edit_form_class(self, value):
    pass
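For completeness, a minimal sketch of how such a view might be registered (the Flask app, User model and db session below are assumptions, not part of the original question):

from flask_admin import Admin

admin = Admin(app, name='admin', template_mode='bootstrap3')
admin.add_view(MyModelView(User, db.session))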

Flask routing to view functions via inheritance

I am currently in the process of writing a Flask application that routes endpoints to a variety of "actions". These actions all implement a parent method called run().
In code:
import abc

class Action(object):
    __metaclass__ = abc.ABCMeta

    @classmethod
    def authenticated(self):
        print("bypassing action authentication")
        return True

    @classmethod
    def authorized(self):
        print("bypassing action authorization")
        return True

    @classmethod
    @abc.abstractmethod
    def execute(self):
        raise NotImplementedError("must override execute!")

    @classmethod
    def response(self, executeResult):
        return executeResult

    @classmethod
    def run(self):
        result = ""
        if self.authenticated() & self.authorized():
            result = self.execute()
        return self.response(result)
The intent is that every concrete action is a subclass of this Action class that, at a bare minimum, implements an execute() function that differentiates it. Unfortunately, when I attempt to add routes for these:
app.add_url_rule('/endone/', methods=['GET'], view_func=CoreActions.ActionOne.run)
app.add_url_rule('/endtwo/', methods=['GET'], view_func=CoreActions.ActionTwo.run)
I receive the following error:
AssertionError: View function mapping is overwriting an existing endpoint function: run
Does anyone know a possible solution to this issue? Thanks!
The common approach to generating view functions from classes is to use Flask's pluggable views. Subclass your Action class from flask.views.View; the dispatch_request method is used instead of run:
import abc
from flask.views import View

class Action(View):
    __metaclass__ = abc.ABCMeta

    def authenticated(self):
        print("bypassing action authentication")
        return True

    def authorized(self):
        print("bypassing action authorization")
        return True

    @abc.abstractmethod
    def execute(self):
        raise NotImplementedError("must override execute!")

    def response(self, executeResult):
        return executeResult

    def dispatch_request(self):
        result = ""
        if self.authenticated() & self.authorized():
            result = self.execute()
        return self.response(result)
And you can add routes using the View.as_view() method, which converts your class into a view function:
app.add_url_rule(
    '/endone/',
    methods=['GET'],
    view_func=CoreActions.ActionOne.as_view('endone')
)
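For illustration, a minimal sketch of what the concrete actions could look like (ActionOne/ActionTwo and their return values are made up); note that each as_view() call needs its own unique endpoint name, which is exactly what avoids the "overwriting an existing endpoint function" error:

class ActionOne(Action):
    def execute(self):
        return "action one result"

class ActionTwo(Action):
    def execute(self):
        return "action two result"

app.add_url_rule('/endone/', methods=['GET'], view_func=ActionOne.as_view('endone'))
app.add_url_rule('/endtwo/', methods=['GET'], view_func=ActionTwo.as_view('endtwo'))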

How to create a Python Pyramid view class without needing to specify 'name' for each method

I have several view classes in my Python Pyramid project added via add_handler:
config.add_handler('export_index', '/export', handler=ExportViews, action='index')

class ExportViews(ConfigViewBase):
    @action(request_method='POST', name='index',
            request_param='ftp_export.form.submitted')
    @action(request_method='POST', name='index', xhr=True, renderer='json',
            request_param='ftp_export.form.submitted')
    def ftp_export(self):
        # process form
        return {}

    @action(request_method='GET')
    def index(self):
        return {}
Is it possible to do the same having:
config.add_handler('export_index', '/export', handler=ExportViews)

class ExportViews(ConfigViewBase):
    @action(request_method='POST',
            request_param='ftp_export.form.submitted')
    @action(request_method='POST', xhr=True, renderer='json',
            request_param='ftp_export.form.submitted')
    def ftp_export(self):
        # process form
        return {}

    @action(request_method='GET')
    def __call__(self):
        return {}
So __call__ would be called when the browser GETs the page, and ftp_export would be called when I POST the form on the same page. Right now I get a "page not found" error.
Thank you.
You can do this with traversal. Traversal rocks :)
from pyramid.view import view_config
from zope.interface import implements

# IViewable and IExportable are assumed to be zope.interface Interfaces defined elsewhere

class Root(object):
    def __getitem__(self, name):
        if name == "export":
            return ExportSomething(self)
        if name == "export_something_else":
            return ExportSomethingElse(self)

class ExportSomething(object):
    implements(IViewable, IExportable)

    def __init__(self, parent):
        self.__parent__ = parent

    def view(self, request):
        return "Hi"

    def export(self, request):
        return "something else"

@view_config(context=IViewable, request_method="GET")
def view_viewable(context, request):
    return context.view(request)

@view_config(context=IExportable, request_method="POST")
def export_exportable(context, request):
    return context.export(request)
Then you can implement a bunch of ExportThis and ExportThat classes, make them implement the IViewable and IExportable interfaces, have them returned from Root.__getitem__, and everything magically works. Or, if you don't need multiple exporters, you can omit the interfaces and bind the views directly to the ExportSomething class. Or you can instantiate different instances of ExportSomething in __getitem__ and make them view/export different files/reports.
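A minimal sketch of the pieces the answer assumes but does not show: the marker interfaces and the root-factory wiring (the Configurator setup is an assumption, not part of the original answer):

from pyramid.config import Configurator
from zope.interface import Interface

class IViewable(Interface):
    """Marker interface for resources that can be viewed."""

class IExportable(Interface):
    """Marker interface for resources that can be exported."""

config = Configurator(root_factory=lambda request: Root())
config.scan()   # picks up the @view_config-decorated view functions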

How do I refer to a class method outside a function body in Python?

I want to do a one-time callback registration within Observer. I don't want to do the registration inside __init__ or another function. I don't know if there is a class-level equivalent of __init__.
class Observer:
    @classmethod
    def on_new_user_registration(new_user):
        # body of handler...
        pass

    # first I try
    NewUserRegistered().subscribe(Observer.on_new_user_registration)  # gives NameError for Observer
    # so I try
    NewUserRegistered().subscribe(on_new_user_registration)  # says not callable
    # neither does this work
    NewUserRegistered().subscribe(__metaclass__.on_new_user_registration)
class BaseEvent(object):
    _subscriptions = {}

    def __init__(self, event_info=None):
        self.info = event_info

    def fire(self):
        for callback in self._subscriptions[self.__class__]:
            callback(self.info)

    def subscribe(self, callback):
        if not callable(callback):
            raise Exception(str(callback) + ' is not callable')
        existing = self._subscriptions.get(self.__class__, None)
        if not existing:
            existing = set()
            self._subscriptions[self.__class__] = existing
        existing.add(callback)

class NewUserRegistered(BaseEvent):
    pass
I suggest cutting down on the number of classes -- remember that Python isn't Java. Every time you use @classmethod or @staticmethod you should stop and think about it, since these decorators are quite rare in Python.
Doing it like this works:
class BaseEvent(object):
    def __init__(self, event_info=None):
        self._subscriptions = set()
        self.info = event_info

    def fire(self, data):
        for callback in self._subscriptions:
            callback(self.info, data)

    def subscribe(self, callback):
        if not callable(callback):
            raise ValueError("%r is not callable" % callback)
        self._subscriptions.add(callback)
        return callback

new_user = BaseEvent()

@new_user.subscribe
def on_new_user_registration(info, username):
    print "new user: %s" % username

new_user.fire("Martin")
If you want an Observer class, then you can do it like this:
class Observer:
    @staticmethod
    @new_user.subscribe
    def on_new_user_registration(info, username):
        print "new user: %s" % username
But note that the static method does not have access to the protocol instance, so this is probably not very useful. You cannot subscribe a method bound to an object instance like this, since the object won't exist when the class definition is executed.
But you can of course do this:
class Observer:
    def on_new_user_registration(self, info, username):
        print "new user: %s" % username

o = Observer()
new_user.subscribe(o.on_new_user_registration)
where we use the bound method o.on_new_user_registration as the argument to subscribe.
I've come to accept that Python isn't very intuitive when it comes to functional programming within class definitions. See this question. The problem with the first method is that Observer doesn't exist as a namespace until the class has been built. The problem with the second is that you've made a classmethod that doesn't really do what it's supposed to until after the namespace has been created. (I have no idea why you're trying the third.) In either case, none of this happens until after the class body of Observer has been executed.
This might sound like a sad constraint, but it's really not so bad. Just register after the class definition. Once you realize that it's not bad style to perform certain initialization routines on classes in the body of the module but outside the body of the class, Python becomes a lot friendlier. Try:
# Define the other classes first

class Observer:
    @classmethod
    def on_new_user_registration(new_user):
        # body of handler...
        pass

NewUserRegistered().subscribe(Observer.on_new_user_registration)
Because of the way modules work in Python, you are guaranteed that this registration will be performed once and only once (barring process forking and maybe some other irrelevant boundary cases), wherever Observer is imported.
Oops, sorry about that.
All I had to do was move the subscription outside the class definition:
class Observer:
    @classmethod
    def on_new_user_registration(new_user):
        # body of handler...
        pass

# after end of class
NewUserRegistered().subscribe(Observer.on_new_user_registration)
Guess it is a side-effect of too much Java that one doesn't immediately think of this.
What you're doing should work:
>>> class foo:
...     @classmethod
...     def func(cls):
...         print 'func called!'
...
>>> foo.func()
func called!
>>> class foo:
...     @classmethod
...     def func(cls):
...         print 'func called!'
...     foo.func()
...
func called!
One thing to note, though: class methods take a cls argument instead of a self argument. Thus, your class definition should look like this:
class Observer:
    @classmethod
    def on_new_user_registration(cls, new_user):
        # body of handler...
        pass
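Putting the two corrections together (the cls parameter and the module-level registration), a minimal end-to-end sketch reusing the BaseEvent/NewUserRegistered classes from the question as fixed above; the "alice" value is made up:

class Observer:
    @classmethod
    def on_new_user_registration(cls, new_user):
        print "registered new user: %s" % new_user

# registration happens once, at import time, after the class body has run
NewUserRegistered().subscribe(Observer.on_new_user_registration)

NewUserRegistered("alice").fire()   # prints: registered new user: alice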
