django lambda and django-activity-stream - python

I am not familiar with the lambda function itself, and don't really know how to debug this issue.
Django-1.1.2
I am using django-activity-stream in order to render activity streams for my users.
The documentation says that you need to pass two lambda functions to integrate with existing networks, such as django-friends (the one I am using).
Here are the functions that need to be pasted into your settings.py file.
ACTIVITY_GET_PEOPLE_I_FOLLOW = lambda user: get_people_i_follow(user)
ACTIVITY_GET_MY_FOLLOWERS = lambda user: get_my_followers(user)
I have done so, but now every time I try to render the page that makes use of this, I get the following traceback:
Caught NameError while rendering: global name 'get_people_i_follow' is not defined
Although this has been set in my settings...
Your help is much appreciated!

Somewhere above where these lambdas are defined, you need to import the names get_people_i_follow and get_my_followers. I'm not familiar with django-activity-stream, but it's probably something like from activity_stream import get_people_i_follow, get_my_followers.
Lambda is a keyword for creating anonymous functions on the fly, so the meaning of your code is basically the same as if you had written the following.
def ACTIVITY_GET_PEOPLE_I_FOLLOW(user):
    return get_people_i_follow(user)

def ACTIVITY_GET_MY_FOLLOWERS(user):
    return get_my_followers(user)

You need to make sure that the functions get_people_i_follow and get_my_followers are imported into your settings file.
e.g.:
from activity_stream.models import get_people_i_follow, get_my_followers
Lambda is just shorthand for defining a function, so:
ACTIVITY_GET_PEOPLE_I_FOLLOW = lambda user: get_people_i_follow(user)
Is equivalent to:
def activity_get_people_i_follow(user):
    return get_people_i_follow(user)

ACTIVITY_GET_PEOPLE_I_FOLLOW = activity_get_people_i_follow
Which, upon reflection, means you don't gain a lot in this case. However, if you needed to avoid importing those functions too early in your settings file (e.g. due to a circular import), then you could do:
def activity_get_people_i_follow(user):
    from activity_stream.models import get_people_i_follow
    return get_people_i_follow(user)

ACTIVITY_GET_PEOPLE_I_FOLLOW = activity_get_people_i_follow
and just import the activity stream function as you need it.
UPDATE: it looks like defining these settings is a red herring:
https://github.com/philippWassibauer/django-activity-stream/blob/master/activity_stream/models.py#L133
As you can see these settings are only needed if you are not using the default activity streams. So simply remove them from your settings file.
The segfault is probably due to infinite recursion: get_people_i_follow calls whatever function is defined by ACTIVITY_GET_PEOPLE_I_FOLLOW, which in this case calls get_people_i_follow again...

If you are integrating with a pre-existing network I don't believe you're actually supposed to write verbatim:
ACTIVITY_GET_PEOPLE_I_FOLLOW = lambda user: get_people_i_follow(user)
ACTIVITY_GET_MY_FOLLOWERS = lambda user: get_my_followers(user)
I believe the author was just showing an example: ACTIVITY_GET_PEOPLE_I_FOLLOW and ACTIVITY_GET_MY_FOLLOWERS need to be set to a lambda or function that accepts one user argument and returns a list of users. You should probably be looking for something like friends_for_user in django-friends, or writing your own functions to implement this functionality.
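A rough sketch of what that could look like in settings.py; the django-friends import and the structure it returns below are assumptions to adapt to your installed version, not a verbatim recipe:
# settings.py -- illustrative only; the django-friends model/manager call is an assumption
def _people_i_follow(user):
    from friends.models import Friendship  # deferred import avoids circular-import trouble
    return [item["friend"] for item in Friendship.objects.friends_for_user(user)]

ACTIVITY_GET_PEOPLE_I_FOLLOW = _people_i_follow
ACTIVITY_GET_MY_FOLLOWERS = _people_i_follow  # a symmetric network can reuse the same lookup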
get_people_i_follow is indeed defined in activity_stream.models but it is just importing what's defined in settings.py. So if settings.py has ACTIVITY_GET_PEOPLE_I_FOLLOW = lambda user: get_people_i_follow(user) you're going to get a wild and crazy circular import / infinite recursion.

Avoid global variable in a python util file

I have a utilities.py file for my python project. It contains only util functions, for example is_float(string), is_empty(file), etc.
Now I want to have a function is_valid(number), which has to:
read from a file, valid.txt, which contains all numbers which are valid, and load them into a map/set.
check the map for the presence of number and return True or False.
This function is called often, and running time should be as small as possible. I don't want to open and read valid.txt every time the function is called. The only solution I have come up with is to use a global variable, valid_dict, which is loaded once from valid.txt when utilities.py is imported. The loading code is written as main in utilities.py.
My question is: how do I do this without using a global variable, since that is considered bad practice? What is a good design pattern for doing such a task without using globals? Also note again that this is a util file, so there should ideally be no main as such, just functions.
The following is a simple example of a closure. The dictionary, cache, is encapsulated within the outer function (_load_func) but remains in scope for the inner function, even after the outer function returns. Notice that _load_func returns the inner function as an object; it does not call it.
In utilities.py:
def _load_func(filename):
    cache = {}
    with open(filename) as fn:
        for line in fn:
            key, value = line.split()
            cache[int(key)] = value

    def inner(number):
        return number in cache

    return inner

is_valid = _load_func('valid.txt')
In __main__:
from utilities import is_valid  # or something similar

if is_valid(42):
    print(42, 'is valid')
else:
    print(42, 'is not valid')
The dictionary (cache) creation could have been done using a dictionary comprehension, but I wanted you to concentrate on the closure.
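For reference, a version of the loader that builds cache with a comprehension instead of the explicit loop; the behaviour is identical and the same two-fields-per-line format is assumed:
def _load_func(filename):
    with open(filename) as fn:
        # {int_key: value} built in one expression
        cache = {int(key): value for key, value in (line.split() for line in fn)}

    def inner(number):
        return number in cache

    return inner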
The variable valid_dict would not be global but local to the utilities.py module. It would only become global if you did something like from utilities import *, and that is considered bad practice when you're developing a package.
However, I have used a trick in cases like this that essentially gives you a static variable: add an argument valid_dict={} to is_valid(). This dictionary will be instantiated only once, and each time the function is called the same dict is available in valid_dict.
def is_valid(number, valid_dict={}):
    if not valid_dict:
        # first call to is_valid: load valid.txt into valid_dict
        with open('valid.txt') as fn:
            valid_dict.update((int(line), True) for line in fn)
    # do your check
    return number in valid_dict
Do NOT assign to valid_dict in the if-clause but only modify it: e.g., by setting keys valid_dict[x] = y or using something like valid_dict.update(z).
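To see why: rebinding the parameter name creates a new local object on every call, so the shared default stays empty and the load is repeated, while in-place mutation fills the single default dict exactly once. A small illustration:
def broken(number, valid_dict={}):
    if not valid_dict:
        valid_dict = {1: True, 2: True}        # rebinds a local; the shared default stays empty
    return number in valid_dict                # so the "load" runs again on every call

def working(number, valid_dict={}):
    if not valid_dict:
        valid_dict.update({1: True, 2: True})  # mutates the shared default; loaded only once
    return number in valid_dict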
(PS: Let me know if this is considered "dirty" or "un-pythonic".)

Sphinx revealing my (mailgun) password

I have a simple function
import config

def send_message(mailgunkey=config.MAILGUNKEY):
    """
    send an email
    """
It relies on a variable defined in my config.py file. I read the variables from local files on all my machines, as I don't want to have my keys etc. in any repository. However, I recently got into the habit of using Sphinx. When generating the HTML docs, the expression config.MAILGUNKEY is evaluated and the actual key is revealed in the HTML file. Is there an option to prevent this?
Consider using this approach:
import config

def send_message(mailgunkey=None):
    """
    send an email
    """
    if mailgunkey is None:
        mailgunkey = config.MAILGUNKEY
In general, this approach gives you some important advantages:
lets your users pass None to explicitly request the default;
allows changes to config.MAILGUNKEY even if your module has already been imported;
solves the problem of mutable default arguments (not your case, but still it's something to be aware of).
The second point is, in my opinion, something very important, as I would be very surprised to see that changes to a configuration variable at runtime have no effects.
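The third point refers to the well-known mutable-default pitfall, shown here only for completeness:
def append_to(item, bucket=[]):  # the default list is created once, at function definition
    bucket.append(item)
    return bucket

print(append_to(1))  # [1]
print(append_to(2))  # [1, 2] -- the same list object is reused across calls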
Another option would be to mock the module that has your secrets.
This avoids needing to change your code to generate documentation.
Assuming you are using autodoc, add the below to your conf.py:
autodoc_mock_imports = ["config", "secrets"]
https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html?highlight=autodoc_mock_imports%20#confval-autodoc_mock_imports
The autodoc_preserve_defaults configuration option was added in Sphinx 4.0.
The problem is solved by setting this option to True in conf.py. Default argument values of functions will then not be evaluated; the expression as written in the source is shown in the generated output instead.
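For example (the option exists in Sphinx 4.0 and later):
# conf.py
extensions = ["sphinx.ext.autodoc"]
autodoc_preserve_defaults = True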

Can I replace an attribute with a lambda WITHOUT changing access syntax?

I have a boolean that is used all over a system I've just joined (it's Django's settings.DEBUG, but that's not really important, since this would be handy for testing, as well.)
I want to prevent non-django use of this attribute. Developers using my system should get an exception, telling them to use a different attribute:
settings.DEBUG = lambda: return_bool_or_throw_exception_if_caller_forbidden()
The trouble is, switching to a lambda requires that accessors change:
# instead of:
if settings.DEBUG:
    ...

# now:
if settings.DEBUG():
    ...
But this would require changing all the reads of DEBUG in Django code, which is unacceptable in my situation. Can I deliver a lambda or a function with no arguments in such a way that consumers can access the thing without function call semantics?
You could monkey patch the settings class and set DEBUG to a descriptor:
def DEBUG(self):
    return return_bool_or_throw_exception_if_caller_forbidden()

settings.__class__.DEBUG = property(DEBUG)
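A self-contained sketch of the same idea, with stand-in names rather than Django's real settings object:
class Settings:
    DEBUG = True

settings = Settings()

def _caller_is_allowed():
    # hypothetical policy check -- e.g. inspect the call stack
    return False

def DEBUG(self):
    if not _caller_is_allowed():
        raise RuntimeError("access the approved attribute instead of settings.DEBUG")
    return True

settings.__class__.DEBUG = property(DEBUG)

try:
    if settings.DEBUG:          # attribute syntax unchanged; the property runs on every read
        print("debugging")
except RuntimeError as exc:
    print(exc)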

How to organize database connections in larger Python/Flask applications?

I am currently trying to write a little web application using Python, Flask and SQLite, and I'm not sure how to handle the database connections.
Basically I've been following the "official" tutorial (and http://flask.pocoo.org/docs/patterns/sqlite3/#sqlite3), leaving me with code like this in my main Flask/app module (drastically shortened):
@vs_app.before_request
def before_request():
    g.db = sqlite3.connect("somedb.db")

def execute_db(command):
    return g.db.cursor().execute(command).fetchall()

@app.route("/givemeallusers")
def foo():
    return execute_db("SELECT * FROM users")
So this creates a DB connection for each request, which my application can use to execute SQL commands, and it works fine for smaller applications.
But for a larger project I'd like to put my database code (with some utility methods) and my app code in different modules, so I could (for example) write:
from my_database_module import user_auth

@app.route("/login")
def foo():
    if user_auth(request.form["name"], request.form["pw"]):
        pass
But for that I would need access to g (or g.db) from my database module, and the only way I could get that to work was by "manually" handing the DB connection to each method, leaving me with code like this:
#app.route("/login")
def foo():
if user_auth(g.db, request.form["name"], request.form["pw"]):
pass
and in my database module:
def user_auth(database, name, pw):
    pass
I don't think that's the best approach, but other ideas I had (like importing the g object into my database module) don't work. I also don't know whether my approach is safe with regard to concurrent DB access, so any help with this would be appreciated.
tl;dr How to split Flask-App and Database/Database-Utility the right way?
Just in case someone stumbles upon this question:
The correct answer is to not bother with the approaches described in the question, and instead just start working with Flask-SQLAlchemy.
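A minimal sketch of how that splits across modules; the model, table and route here are placeholders, and exact API details depend on your Flask-SQLAlchemy version:
# models.py
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True)

# app.py
from flask import Flask
from models import db, User

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///somedb.db"
db.init_app(app)

@app.route("/givemeallusers")
def all_users():
    return {"users": [u.name for u in User.query.all()]}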

Is there a high-level profiling module for Python?

I want to profile my Python code. I am well aware of cProfile, and I use it, but it's too low-level. (For example, there isn't even a straightforward way to catch the return value from the function you're profiling.)
One of the things I would like to do: I want to take a function in my program and set it to be profiled on the fly while running the program.
For example, let's say I have a function heavy_func in my program. I want to start the program and have the heavy_func function not profile itself. But sometime during the runtime of my program, I want to change heavy_func to profile itself while it's running. (If you're wondering how I can manipulate stuff while the program is running: I can do it either from the debug probe or from the shell that's integrated into my GUI app.)
Is there a module already written which does stuff like this? I can write it myself but I just wanted to ask before so I won't be reinventing the wheel.
It may be a little mind-bending, but this technique should help you find the "bottlenecks", if that's what you want to do.
You're pretty sure of what routine you want to focus on.
If that's the routine you need to focus on, it will prove you right.
If the real problem(s) are somewhere else, it will show you where they are.
If you want a tedious list of reasons why, look here.
I wrote my own module for it. I called it cute_profile. Here is the code. Here are the tests.
Here is the blog post explaining how to use it.
It's part of GarlicSim, so if you want to use it you can install garlicsim and do from garlicsim.general_misc import cute_profile.
If you want to use it on Python 3 code, just install the Python 3 fork of garlicsim.
Here's an outdated excerpt from the code:
import functools

from garlicsim.general_misc import decorator_tools

from . import base_profile


def profile_ready(condition=None, off_after=True, sort=2):
    '''
    Decorator for setting a function to be ready for profiling.

    For example:

        @profile_ready()
        def f(x, y):
            do_something_long_and_complicated()

    The advantages of this over regular `cProfile` are:

    1. It doesn't interfere with the function's return value.

    2. You can set the function to be profiled *when* you want, on the fly.

    How can you set the function to be profiled? There are a few ways:

    You can set `f.profiling_on=True` for the function to be profiled on the
    next call. It will only be profiled once, unless you set
    `f.off_after=False`, and then it will be profiled every time until you set
    `f.profiling_on=False`.

    You can also set `f.condition`. You set it to a condition function taking
    as arguments the decorated function and any arguments (positional and
    keyword) that were given to the decorated function. If the condition
    function returns `True`, profiling will be on for this function call,
    `f.condition` will be reset to `None` afterwards, and profiling will be
    turned off afterwards as well. (Unless, again, `f.off_after` is set to
    `False`.)

    `sort` is an `int` specifying which column the results will be sorted by.
    '''
    def decorator(function):

        def inner(function_, *args, **kwargs):

            if decorated_function.condition is not None:

                if decorated_function.condition is True or \
                   decorated_function.condition(
                       decorated_function.original_function,
                       *args,
                       **kwargs
                   ):
                    decorated_function.profiling_on = True

            if decorated_function.profiling_on:

                if decorated_function.off_after:
                    decorated_function.profiling_on = False
                    decorated_function.condition = None

                # This line puts it in locals, weird:
                decorated_function.original_function

                base_profile.runctx(
                    'result = '
                    'decorated_function.original_function(*args, **kwargs)',
                    globals(), locals(), sort=decorated_function.sort
                )
                return locals()['result']

            else:  # decorated_function.profiling_on is False
                return decorated_function.original_function(*args, **kwargs)

        decorated_function = decorator_tools.decorator(inner, function)

        decorated_function.original_function = function
        decorated_function.profiling_on = None
        decorated_function.condition = condition
        decorated_function.off_after = off_after
        decorated_function.sort = sort

        return decorated_function

    return decorator
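A usage sketch based on the description above, assuming garlicsim is installed (heavy_func is just a stand-in workload):
from garlicsim.general_misc import cute_profile

@cute_profile.profile_ready()
def heavy_func(n):
    return sum(i * i for i in range(n))

heavy_func(100000)               # runs normally, no profiling

heavy_func.profiling_on = True   # arm profiling for the next call
heavy_func(100000)               # this call is profiled, then profiling switches off again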
