Access APScheduler cron trigger field values in Python

Similar to this question, I want to extract the info of a cron job trigger from APScheduler.
However, I only need the "day_of_week" field, not everything. Using
for job in scheduler.get_jobs():
    for f in job.trigger.fields:
        print(f.name + " " + str(f))
I can see all the fields, e.g. week, hour, day_of_week, but
job.trigger.day_of_week is seemingly 'not an attribute' of the "CronTrigger" object. I'm confused as to what kind of object this job.trigger is and how its fields are packed. I tried to read the code on GitHub, but it is even more puzzling.
How do I extract only the one field day_of_week, and how is this trigger class structured?
Diving deeper, I found
apscheduler.triggers.cron.fields.DayOfWeekField
which I can get at by indexing job.trigger.fields[4]. That seems like really bad style, since it depends on the position of the field. What I get is this DayOfWeekField, from which, comically, I am not able to retrieve its value either:
a.get_value
<bound method DayOfWeekField.get_value of DayOfWeekField('day_of_week', '1,2,3,4')>
The structure of the fields is coded here, but I don't know what to do with dateval, the argument of get_value().
Eventually, after hopefully understanding the concept, I want to do
if job-day_of_week contains mon
if job-day_of_week == '*'
print ( job-day_of_week )
I am grateful for any suggestions/hints!

Looking at the code, you should be able to get the day_of_week field without hardcoding the index by using the CronTrigger class's FIELD_NAMES attribute, e.g.
dow_index = CronTrigger.FIELD_NAMES.index('day_of_week')
dow = job.trigger.fields[dow_index]
Getting the value of the field is a bit more complicated, but it appears that BaseField implements __str__, which should give you the expression that created the field as a string you can parse to find what you want:
dow_value_as_string = str(dow)
if 'mon' in dow_value_as_string:
    # do something
if dow_value_as_string == "*":
    # do something else
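Putting it together, here is a minimal sketch of the whole lookup (it assumes scheduler is your already-configured scheduler instance and that the jobs you care about use CronTrigger):
from apscheduler.triggers.cron import CronTrigger

dow_index = CronTrigger.FIELD_NAMES.index('day_of_week')

for job in scheduler.get_jobs():
    # only cron triggers have a day_of_week field
    if not isinstance(job.trigger, CronTrigger):
        continue
    dow = job.trigger.fields[dow_index]
    dow_value_as_string = str(dow)  # e.g. '1,2,3,4', 'mon-fri' or '*'
    print(job.id, dow_value_as_string)
    if 'mon' in dow_value_as_string:
        print("runs on Mondays")
    if dow_value_as_string == '*':
        print("runs on every day of the week")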

Related

Custom path parameter parsing drf-yasg and Django

I'm trying to force drf-yasg to properly parse parameters from the path. Let's say we have
path('users/<int:user_id>/', whatever.as_view(...))
In the Swagger docs it is not treated as an int, but as a string instead.
I have used
swagger_auto_schema(manual_parameters=[
    openapi.Parameter(
        name,
        openapi.IN_PATH,
        description=desc,
        type=openapi.TYPE_INTEGER,
        required=True
    )
])
but it's pretty annoying. I could not find the function/method/class responsible for parsing that. Is there a simple way to change the behaviour of this parser based on the path, so that when <int:...> occurs, openapi.TYPE_INTEGER is returned instead of a string?
drf-yasg determines the parameter type automatically in some situations, and falls back on string if detection fails.
queryset = get_queryset_from_view(view_cls)

for variable in sorted(uritemplate.variables(path)):
    model, model_field = get_queryset_field(queryset, variable)
    attrs = get_basic_type_info(model_field) or {'type': openapi.TYPE_STRING}
As you can see, it tries to get the type from the column type of the view's queryset. If your parameter name doesn't match anything in the queryset, though, you get a string. So your first choice should be to try to use a name it can autodetect.
If that doesn't work, however, you will need to subclass EndpointEnumerator and override get_path_parameters(); it is probably easiest to call super().get_path_parameters(), go through each parameter, and replace its type based on the variable name.
To use this class you will need your own OpenAPISchemaGenerator: use a custom OpenAPISchemaGenerator and override its endpoint_enumerator_class with your own EndpointEnumerator.
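As a rough sketch of the wiring (note: in the drf-yasg source, the get_path_parameters() quoted above actually lives on OpenAPISchemaGenerator itself, so this sketch overrides it there; the class name and the pk/_id heuristic are made up):
from drf_yasg import openapi
from drf_yasg.generators import OpenAPISchemaGenerator

class IntAwareSchemaGenerator(OpenAPISchemaGenerator):
    def get_path_parameters(self, path, view_cls):
        parameters = super().get_path_parameters(path, view_cls)
        for param in parameters:
            # if drf-yasg fell back to string, guess integer for pk-like names
            if param.type == openapi.TYPE_STRING and (
                    param.name == 'pk' or param.name.endswith('_id')):
                param.type = openapi.TYPE_INTEGER
        return parameters

# then hand it to your schema view, e.g.
# schema_view = get_schema_view(..., generator_class=IntAwareSchemaGenerator)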

How do I pass an object instead of a string via pythons' input() function?

I am working on a machine learning modelling problem where an object is created to store training and validation data, but the validation set is optional, and if it is not included when creating the object the default value is None.
If we find out later on that the user wants to add a validation pandas dataframe, we were hoping to let them supply the name of the dataframe with input(). With a function defined right in the notebook we're running, we can then do an eval(<input>) to turn the string into the object we need. If we define the object outside of our notebook, though, it seems like the scope doesn't include that variable.
I realize this probably isn't the best way to do this, so what is a more pythonic way to let a user supply a dataframe by name after an object has already been instantiated? We can pass the objects fine as arguments to functions. Is there a way to pass an object like that but with input() or some other user-friendly way to prompt the user?
It may be possible to use locals() or globals() as a dict for grabbing an initialized variable by its name.
the_variable = {'key_one': 'val_one'}
selected_input = input("Please input a variable name")
selected_var = locals()[selected_input]
print("selected_var continence -> {0}".format(selected_var))
Should output, well assuming the_variable was passed to input()
selected_var continence -> {'key_one': 'val_one'}
This is an adaptation of an answer to Calling a function of a module by using its name (a string), but it seems to work in this instance too.
Update
I can't remember where I picked up the following perversion (I did look about, though), and I'm not suggesting its use in production. But...
import json

questionable_response = lambda message: input("{message}: ".format(message=message))
this_response = json.loads(questionable_response("Input some JSON please"))
# <- '{"Bill": {"person": true}, "Ted": {"person": "Who is asking?"}}'
... does allow for object like inputting.
And getting data from an inputted json string could look like...
this_response['Bill']
# -> {u'person': True}
this_response['Ted'].get('person')
# -> u'Who is asking?'
... however, you'll likely see some issues with using the above alongside other scripted components.
For the Unicode conversion there are some previously posted answers on the subject. And checking help(json.loads) shows that there are toggles for parsing floats, ints, and constants (parse_float, parse_int, parse_constant).
Even with that it may not be worth it, because there are still some oddities you'll run into if you try to implement this funkiness.
Just to list a few:
contractions are a no-go; let's say you get a clever Clara who inputs something like '{"Clara": {"person": "I'll not be labelled!"}}'. That would cause an error unless the ' was escaped, e.g. \'
the above is also quote-fragile; perhaps someone at the keyboard hasn't had enough to drink and tries "{'Jain': {'person': True}}". That would first barf on the single quotes, then heave because True is not valid JSON (it should be true)
So, as I prefaced at the start of this update, I'll not recommend this in production; you could waste a lot of time chasing edge cases. I only share it because maybe you've not found any other option for getting from input() to something that can be interrogated like an object.
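Building on the globals() idea above, a slightly more guarded sketch for the dataframe case could look like this (ModelData, prompt_for_validation_df and the namespace argument are made-up names for illustration):
import pandas as pd

class ModelData:
    def __init__(self, train_df, validation_df=None):
        self.train_df = train_df
        self.validation_df = validation_df

    def prompt_for_validation_df(self, namespace):
        # namespace is typically globals() passed in from the notebook,
        # so the lookup happens in the caller's scope, not this module's
        name = input("Name of the validation dataframe: ").strip()
        candidate = namespace.get(name)
        if not isinstance(candidate, pd.DataFrame):
            raise ValueError("{0!r} is not a DataFrame in the given namespace".format(name))
        self.validation_df = candidate

# in the notebook:
# data = ModelData(train_df)
# data.prompt_for_validation_df(globals())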

How can I safely override the django.contrib.admin.utils quote() method?

It appears that the quote() and unquote() methods inside django.contrib.admin.utils do not effectively handle underscores in primary keys. Specifically, I have some string-type primary keys that look like cus_C2xVQnht, and when I use the Django admin interface to edit them via the small pencil icon, the popup window displays an error like Customer with ID "cusÂxVQnht" doesn't exist. Perhaps it was deleted? (it converts the C2 to the codepoint 00C2, aka Â; this is true for other valid codepoints as well: 00C7, 00C6, 001B, etc.)
If I manually go to the customers model and find the ID, I can pull it up and edit it just fine, but it seems the URL encoding doesn't work right when the primary key has an underscore in it.
After quite a lot of digging I managed to find these two functions buried deep inside django.contrib.admin.utils:
def quote(s):
    """
    Ensure that primary key values do not confuse the admin URLs by escaping
    any '/', '_' and ':' and similarly problematic characters.
    Similar to urllib.parse.quote(), except that the quoting is slightly
    different so that it doesn't get automatically unquoted by the Web browser.
    """
    if not isinstance(s, str):
        return s
    res = list(s)
    for i in range(len(res)):
        c = res[i]
        if c in """:/_#?;@&=+$,"[]<>%\n\\""":
            res[i] = '_%02X' % ord(c)
    return ''.join(res)

def unquote(s):
    """Undo the effects of quote(). Based heavily on urllib.parse.unquote()."""
    mychr = chr
    myatoi = int
    list = s.split('_')
    res = [list[0]]
    myappend = res.append
    del list[0]
    for item in list:
        if item[1:2]:
            try:
                myappend(mychr(myatoi(item[:2], 16)) + item[2:])
            except ValueError:
                myappend('_' + item)
        else:
            myappend('_' + item)
    return "".join(res)
They appear to be called somewhere in the admin template rendering process, but I couldn't figure out where, how often, or all the locations, so I decided to do a quick monkey patch to decide whether it was worth pursuing as a solution: I changed all the underscores in quote() and unquote() to dots, except for the one in the list of problem characters in quote(). For example:
'_%02X' in quote() becomes '.%02X'
split('_') in unquote() becomes split('.')
myappend('_' + item) in unquote() becomes myappend('.' + item)
Upon doing this, the admin works correctly and it appears that the links attached to the edit icons on related fields are to the correct model instances, so I can edit them by clicking the pencil icons and don't get the error message noted above.
All that said, I can't seem to find a way to safely override these two methods. I really would rather not change the primary keys to eliminate the underscores because there are a lot of linked models in my database and it just seems like it will become a huge pain. This fix seems much easier and more reliable, and given that it worked properly on previous versions of Django I don't see how it's a bad idea to implement.
So, how can I override those methods? Or, as a related question, is there something I can do in the __str__ methods of my models to alleviate this problem? I'd do that much sooner than start writing custom classes that override Django admin internals. If there is no other solution, I would need some help in properly restructuring my database to adjust the primary keys, but I can say that these keys work perfectly on the "old" site I'm working on, which runs Django 1.11.6 and Python 2.7.9 (vs the current Django 2.1.1 and Python 3.6.5)
Please let me know if I can provide any more info. Thank you!!
This is fixed in Django 2.2. See https://github.com/django/django/commit/e4df8e6dc021fa472fa77f9b835db74810184748

Pass kwargs into Django Filter

When viewing model entries from within the Django admin, you can specify filters. How can I mimic this behavior? I'm not too familiar with kwargs, but I want something similar to this:
foo = Model.objects.filter(**__exact='**')
where the first set of ** would be a field in the model and the second set would be an entry. Basically, I want to make the queries variable, based on what the user chooses on the front end. How would I send that variable sort option to the view, and then return it back to the webpage? What about using a dictionary?
This SO question has proven to be a little helpful, but still cannot grasp it completely.
You can unpack a Python dict as your filter parameters using **:
your_filters = {
    'field_1__exact': value_1,
    'field_2__gte': value_2,
}
Model.objects.filter(**your_filters)
That said, you can build your query filters (a Python dict) dynamically based on user input.
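For example, here is a sketch of turning the user's front-end choices into such a dict (the model, the field whitelist, and the lookup whitelist are placeholders for your own):
ALLOWED_FIELDS = {'name', 'created_at'}
ALLOWED_LOOKUPS = {'exact', 'icontains', 'gte', 'lte'}

def filter_entries(field, lookup, value):
    # whitelist what the user may filter on, to avoid arbitrary lookups
    if field not in ALLOWED_FIELDS or lookup not in ALLOWED_LOOKUPS:
        raise ValueError("unsupported filter: {0}__{1}".format(field, lookup))
    your_filters = {"{0}__{1}".format(field, lookup): value}
    return Model.objects.filter(**your_filters)

# e.g. the user picked field "name", lookup "exact", value "foo":
# qs = filter_entries('name', 'exact', 'foo')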

How can I specify a default time for a ndb.TimeProperty()?

I find myself stuck on this problem, and repeated Googling, checking SO, and reading numerous docs has not helped me get the right answer, so I hope this isn't a bad question.
One entity I want to create is an event taking place during a convention. I'm giving it the property start_time = ndb.TimeProperty(). I also have a property date = messages.DateProperty(), and I'd like to keep the two discrete (in other words, not using DateTimeProperty).
When a user enters information to create an event, I want to specify defaults for any fields they do not enter at creation and I'd like to set the default time as midnight, but I can't seem to format it correctly so the service accepts it (constant 503 Service Unavailable response when I try it using the API explorer).
Right now I've set things up like this (some unnecessary details removed):
event_defaults = {
    ...
    "start_time": 0000,
    ...
}
and then I try looping over my default values to enter them into a dictionary which I'll use to .put() the info on the server.
data = {field.name: getattr(request, field.name) for field in request.all_fields()}
for default in event_defaults:
    if data[default] in (None, []):
        data[default] = event_defaults[default]
        setattr(request, default, event_defaults[default])
In the logs, I see the error Encountered unexpected error from ProtoRPC method implementation: BadValueError (Expected time, got 0). I have also tried using the time and datetime modules, but I must be using them incorrectly, because I still receive errors.
I suppose I could work around this problem by using ndb.StringProperty() instead, and just deal with strings, but then I'd feel like I would be missing out on a chance to learn more about how GAE and NDB work (all of this is for a project on udacity.com, so learning is certainly the point).
So, how can I structure my default time properly for midnight? Sorry for the wall of text.
Link to code on github. The conference.py file contains the code I'm having the trouble with, and models.py contains my definitions for the entities I'm working with.
Update: I'm a dummy. I had my model class using a TimeProperty() and the corresponding message class using a StringField(), but I was never making the proper conversion between expected types. That's why I could never seem to give it the right thing, but it expected two different things at different points in the code. Issue resolved.
TimeProperty expects a datetime.time value:
import datetime

event_defaults = {
    ...
    "start_time": datetime.time(),
    ...
}
More in the docs: https://cloud.google.com/appengine/docs/python/ndb/entity-property-reference#Date_and_Time
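For the specific "default to midnight" requirement, note that datetime.time() with no arguments already is midnight, so no extra formatting is needed:
import datetime

midnight = datetime.time()   # same as datetime.time(0, 0, 0)
print(midnight)              # 00:00:00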
Use the datetime module to convert it into a valid ndb time property value (this sketch assumes the incoming value is an "HH:MM" string, hence the [:5] slice):
from datetime import datetime

if data['time']:
    data['time'] = datetime.strptime(data['time'][:5], "%H:%M").time()
else:
    data['time'] = datetime.now().time()
P.S.: Don't forget to replace data['time'] with your field name.
