Syntax to access InterSystems Caché 2017.xx class properties with Python

I have a Python (3.7) process that accesses a Caché 2017.xx instance. I want to look at the properties of the existing databases for the instance, but I cannot seem to find the proper syntax. This is my code snippet:
import codecs
import os
import sys
import intersys.pythonbind3 as pyb
url = "localhost[1972]:%SYS"
user = "xxxx"
password = "zzzz"
conn = pyb.connection()
conn.connect_now(url,user,password,None)
db = pyb.database(conn)
qry = pyb.query(db)
obj = pyb.object(db)
# I get general database information with:
qry.prepare_class("SYS.Database",'CompactLocalList')
As I loop through the list of databases I want to get the property "MaxSize". The docs I have read mention creating a connection to the current database (openid?) and then accessing properties with something like .get().
That is a rational approach, but I cannot figure out the proper syntax to get a handle (if that is the proper term) to the current database.

After days of research I figured out how to get a handle to each of the Caché databases. The method to use is:
_db = db.openid(class_name, directory, -1, -1)
In my case class_name is 'SYS.Database', and directory is the directory of the current database (e.g. 'c:\intersystems\mgr\db').
From there I use the object getters/setters to read a property value:
maxSize = _db.get("MaxSize")
The getter can be used to read the value of any of the properties detailed in the 'SYS.Database' documentation (or that of any other Caché class).

How to handle cross referencing between Peewee model and application controller?

I would like to use an object-oriented programming style when coding with Peewee. Unfortunately, the docs only give hints using what amount to global variables for handling the DB connection. When I try to take advantage of Model and Controller objects (the View isn't important at this moment), I get an error, probably because they cross-reference each other:
ImportError: cannot import name 'Application' from 'Application'
(C:\ [... src ...] )
Peewee requires you to put the database handle in an abstract model class definition, like this:
class BaseModel(Model):
    class Meta:
        database = SqliteDatabase('../res/db.db', pragmas={'foreign_keys': 1})
Well, the problem is, I cannot keep the DB handle like that. I'm preparing my app as a standalone Windows application with a service module. For this reason, I guess I need to store the absolute path to the db file in a config file. Consequently, before the model starts loading the database, I need to load the configuration files from the controller.
What I did was push the DB handle into a static field in the controller:
from Application import Application
from peewee import *

class BaseModel(Model):
    class Meta:
        database = Application.database
As you see, the DB handle is taken from the Application abstract controller. Application is a base controller from which GuiApp and ServiceApp derive. Both descendants use the same DB, so keeping the handle as a static field looks convenient to me.
Now, please take a look at my Application class:
import logging.handlers
from peewee import SqliteDatabase
import datetime
import threading
from Windows import *

class Application:
    database = SqliteDatabase(None)

    def __init__(self, appname):
        # (...)
        from Entities import RouterSettings, BalanceEntry
        Application.database.init(
            config_app_src + 'db.db',
            pragmas={'foreign_keys': 1})
        Application.database.connect()
        # !!!
        from Entities import RouterSettings, BalanceEntry
        # !!!
        Application.database.create_tables([RouterSettings, BalanceEntry], safe=True)
The problem is, when I put the Peewee entity import right before the place I start using the entities, i.e. inside the __init__ method, I somehow lose access to them from other parts of my app. It forces me to put these import statements in every controller method in order to get proper access to the entity models.
On the other hand, when I put the entity import at the top of the controller module, I get the cross-referencing error quoted above.
To sum up, I'm looking for an OOP way to manage an app with Peewee models. Do you know any way to do that? Or do I have to use a global database variable in the app init?
Thanks to @coleifer, I decided to make another class for database handling:
class DbHandler:
    database = SqliteDatabase(None)

    @staticmethod
    def start(dbSrc):
        DbHandler.database.init(
            dbSrc + '\\res\\SIMail.db',
            pragmas={'foreign_keys': 1})
        DbHandler.database.connect()
        DbHandler.database.create_tables([RouterSettings, BalanceEntry],
                                         safe=True)
Well, eventually it looks quite similar to global variables, but I think this solution fits my needs. Most importantly, I managed to get rid of the cross-referencing.
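Since the core of the problem is deferring database initialization until the config is loaded, the same pattern can be sketched with the stdlib sqlite3 module (so it runs without Peewee; the class, table, and function names here are illustrative stand-ins, not the asker's real code):

```python
import sqlite3

class DbHandler:
    """Holds one lazily initialized connection shared by the whole app."""
    connection = None

    @staticmethod
    def start(db_path):
        # Called once the config (the db path) is known, before any model use.
        DbHandler.connection = sqlite3.connect(db_path)
        DbHandler.connection.execute(
            "CREATE TABLE IF NOT EXISTS balance_entry (id INTEGER PRIMARY KEY)")

# Later, from any controller:
DbHandler.start(":memory:")
DbHandler.connection.execute("INSERT INTO balance_entry DEFAULT VALUES")
rows = DbHandler.connection.execute(
    "SELECT COUNT(*) FROM balance_entry").fetchone()
print(rows[0])  # 1
```

Peewee's `SqliteDatabase(None)` followed by `init()` is the framework's own version of this deferred setup.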

How to automatically reload a MongoDB document in pymongo or mongoengine when accessing its attributes?

I am using MongoDB for application settings, and in my Python code I read the settings document from the db using mongoengine, a high-level wrapper for pymongo.
I'm able to reload the document each time I access its attributes, like this (assuming that there is a document in the Setting collection in MongoDB):
import time
import mongoengine

class Setting(mongoengine.Document):
    log_level = mongoengine.StringField(default='info')

setting = Setting.objects[0]
while True:
    time.sleep(1)
    setting.reload()
    print(setting.log_level)
But I'm interested in a way to make the Setting document reload automatically each time the log_level attribute is accessed. Is there a clean way to do this in mongoengine? I would prefer code like this, with my Setting always in sync with the db:
...
while True:
    time.sleep(1)
    print(setting.log_level)
I've read the mongoengine docs a bit more looking for a nicer way, and I found the no_cache() method, which can force the Setting class to return a non-caching queryset. With this, I'm a bit closer:
def get_setting():
    return Setting.objects.no_cache()[0]

while True:
    time.sleep(1)
    print(get_setting().log_level)
Any idea?
Thank you
You can write a custom method which does that.
Pseudocode:
def get_reloaded_attr(obj, attr):
    obj.reload()
    return getattr(obj, attr)
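Going one step further than the pseudocode, the reload-on-access idea can be wrapped in `__getattr__` so every attribute read refreshes the document first. A pure-Python sketch, with an in-memory dict standing in for the MongoDB document (the class and variable names are illustrative, not mongoengine API):

```python
class SettingProxy:
    """Wraps a settings document and reloads it before every attribute read."""

    def __init__(self, store):
        self._store = store   # stand-in for the MongoDB collection/document
        self._data = {}

    def reload(self):
        # In mongoengine this would be Document.reload(); here we just re-copy.
        self._data = dict(self._store)

    def __getattr__(self, name):
        # Only called for attributes not found normally, i.e. setting fields.
        self.reload()
        try:
            return self._data[name]
        except KeyError:
            raise AttributeError(name)

store = {"log_level": "info"}
setting = SettingProxy(store)
print(setting.log_level)      # "info"
store["log_level"] = "debug"  # someone updates the db behind our back
print(setting.log_level)      # "debug", picked up without an explicit reload()
```

Note this issues one reload per attribute access, which may be expensive against a real database; a time-based cache inside `reload()` would bound the query rate.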

LDAP gidNumber like Auto Integer

Still working with LDAP...
The problem I submit today is this: I'm creating a posixGroup on an LDAP server using a custom method developed in Python with the Django framework. I attach the method code below. The main issue is that the gidNumber attribute is mandatory for the posixGroup class, but it is usually not required when using a graphical LDAP client like phpLDAPadmin, since those clients fill in this field automatically, like an auto-increment integer.
Here is the question: is gidNumber an auto-increment attribute by default, or only when using clients like the one quoted above? Must I specify it during posixGroup entry creation?
def ldap_cn_entry(self):
    import ldap.modlist as modlist
    dn = u"cn=myGroupName,ou=plant,dc=ldap,dc=dem2m,dc=it"
    # A dict to help build the "body" of the object
    attrs = {}
    attrs['objectclass'] = ['posixGroup']
    attrs['cn'] = 'myGroupName'
    attrs['gidNumber'] = '508'
    # Convert our dict to nice syntax for the add-function using the modlist module
    ldif = modlist.addModlist(attrs)
    # Do the actual synchronous add-operation to the ldapserver
    self.connector.add_s(dn, ldif)
connector is instantiated in the constructor of the class this method belongs to. The constructor also performs the LDAP initialization and binding; the connection is then closed by the destructor.
To use the method, I begin by instantiating the class it belongs to, which also connects to the LDAP server. Then I call the method, and finally I destroy the object to close the connection. Everything works, indeed, if I use this procedure to create a different kind of entry, or if I specify the gidNumber manually.
The fact is, I CAN'T specify the gidNumber every time I want to create a group. I would rather have it filled in automatically (if that's possible), or find another way to complete it.
I'm not posting more of the class code, so as not to clutter the page.
I'll provide more information if needed. Thank you.
The LDAP protocol has no notion of an auto-incrementing integer; you need to specify the value when creating the entry.
You can do some tricks to help: we often keep the last used value on an OU (we add an AUX class with a custom attribute to the OU) in LDAP, and then read, increment, and use that value for the gidNumber.
Found this described.
-jim
Following @jeemster's suggestion, I found a way to manage the gidNumber.
First of all, I created a new entry in my LDAP called "gidNumber", and added the optional description attribute to hold the last gidNumber I used (class: organizationalUnit, ou: gidNumber, description: 500).
Then I created the following functions:
def ldap_gid_finder(self):
    # Locates the support entry with a simple query
    self.baseDN = "ou=impianti,dc=ldap,dc=dem2m,dc=it"
    self.searchScope = ldap.SCOPE_SUBTREE
    self.retrieveAttributes = None
    self.searchFilter = "ou=*gidNumber*"
    # Results are put in a list
    self.ldap_result = self.connector.search(
        self.baseDN, self.searchScope, self.searchFilter, self.retrieveAttributes)
    result_set = []
    while 1:
        result_type, result_data = self.connector.result(self.ldap_result, 0)
        if result_data == []:
            break
        elif result_type == ldap.RES_SEARCH_ENTRY:
            result_set.append(result_data)
    # The attribute containing the gidNumber is stored in an instance variable
    self.actual_gid_number = int(result_set[0][0][1]['description'][0])

# Increments the gidNumber
def ldap_gid_increment(self):
    dn = "ou=gidNumber,ou=impianti,dc=ldap,dc=dem2m,dc=it"
    old = {'description': str(self.actual_gid_number)}
    new = {'description': str(self.actual_gid_number + 1)}
    ldif = modlist.modifyModlist(old, new)
    self.connector.modify_s(dn, ldif)
As I said above, these methods are defined in a class whose constructor and destructor I overrode, so that it binds/unbinds to the LDAP server automatically when I create or delete an instance.
Then I query LDAP for the object called gidNumber (the ou I created before) and fill a list with the resulting information. In that result I find the value representing the gidNumber, and cast it to an integer so I can increment it. And that's all.
This procedure is really efficient because if the server reboots you don't lose the gidNumber information! Thank you again, jeemster.
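The read-increment-write bookkeeping above can be sketched without an LDAP server; here a plain dict stands in for the counter entry (all names are illustrative, and a real deployment would also need to guard against two clients incrementing concurrently):

```python
# Stand-in for the "ou=gidNumber" entry whose description holds the counter.
directory = {"ou=gidNumber": {"description": "500"}}

def allocate_gid(store):
    """Read the last used gidNumber, increment it, persist it, return it."""
    entry = store["ou=gidNumber"]
    next_gid = int(entry["description"]) + 1
    entry["description"] = str(next_gid)  # in LDAP this would be a modify_s call
    return next_gid

print(allocate_gid(directory))  # 501
print(allocate_gid(directory))  # 502
```

Against a real directory, the modify step should be made atomic (e.g. an LDAP modify that deletes the old value and adds the new one, retried on failure), so two writers can't allocate the same gid.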

Can't delete row from SQLAlchemy due to wrong session

I am trying to delete an entry from my table. This is my code for the delete function.
@app.route("/delete_link/<link_id>", methods=['GET', 'POST'])
def delete_link(link_id):
    link = models.Link.query.filter(models.Link.l_id == link_id).first()
    db.session.delete(link)
    db.session.commit()
    return flask.redirect(flask.url_for('links'))
The line db.session.delete(link) gives me this error:
InvalidRequestError: Object '' is already attached to session '1' (this is '2')
I've tried this code as well:
@app.route("/delete_link/<link_id>", methods=['GET', 'POST'])
def delete_link(link_id):
    link = models.Link.query.filter(models.Link.l_id == link_id)
    link.delete()
    db.session.commit()
    return flask.redirect(flask.url_for('links'))
which does not update the database. The link must not be in the session, I guess, but I don't know how to check that, nor how to fix it.
I am new to SQLAlchemy.
EDIT:
I use this to create my db variable, which probably creates the session at this stage (it is at the top of the code, and comes from the Flask documentation):
from yourapplication import db
You are creating 2 instances of the db object, inherently creating 2 different sessions.
In models.py:
...
from config import app

db = SQLAlchemy(app)
In erika.py:
...
from config import app
...
db = SQLAlchemy(app)
Then, when you try to delete the element:
link = models.Link.query.filter(models.Link.l_id == link_id).first()
db.session.delete(link)
db.session.commit()
the following happens: models.Link.query uses the database session created by models.py to get the record, while db.session.delete uses the session created by erika.py. link is attached to the models.py session, and you can't use another session (erika.py's) to delete it. Hence:
InvalidRequestError: Object '' is already attached to session '1' (this is '2')
Solution
The solution is simple: keep only one instance of the db object at any time, and reuse that instance whenever you need db operations.
erika.py
from models import db
This way you are always using the same session that was used to fetch your records.
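A note on why this works: Python caches imported modules in sys.modules, so every `from models import db` returns the very same object. A small stand-in demonstration (no SQLAlchemy needed; the "models" module here is built by hand purely for illustration):

```python
import sys
import types

# Simulate a "models" module that creates the shared db object once.
models = types.ModuleType("models")
models.db = object()          # stand-in for db = SQLAlchemy(app)
sys.modules["models"] = models

# Every subsequent import hits the sys.modules cache: same module, same db.
import models as first_import
import models as second_import
print(first_import.db is second_import.db)  # True
```

Creating a second `SQLAlchemy(app)` in another file sidesteps this cache, which is exactly how the two conflicting sessions arose.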
It appears to be a similar problem to the one described at http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xvi-debugging-testing-and-profiling
That post gives a good in-depth description of the problem and how the author solved it; his fix is available as a fork.
The Fix
To address this problem we need to find an alternative way of attaching Flask-WhooshAlchemy's query object to the model.
The documentation for Flask-SQLAlchemy mentions there is a model.query_class attribute that contains the class to use for queries. This is actually a much cleaner way to make Flask-SQLAlchemy use a custom query class than what Flask-WhooshAlchemy is doing. If we configure Flask-SQLAlchemy to create queries using the Whoosh enabled query class (which is already a subclass of Flask-SQLAlchemy's BaseQuery), then we should have the same result as before, but without the bug.
I have created a fork of the Flask-WhooshAlchemy project on github where I have implemented these changes. If you want to see the changes you can see the github diff for my commit, or you can also download the fixed extension and install it in place of your original flask_whooshalchemy.py file.

KindError in Google App Engine

I defined a simple class in GAE for keeping user profile data, like this:
class User(db.Model):
    email = db.EmailProperty()
    role = db.StringProperty(default=roles.USER)
    first_name = db.StringProperty()
    last_name = db.StringProperty()
    ...
I use memcache to keep session information. The memcache data looks like this: { 'key': 'agpjYW5kaXJhdGVzcgoLEgRVc2VyGCMM' }. I get the session_id value from the cookie. When I try to get the user info linked to that cookie like this:
session_id = request['session_id']
data = memcache.get(session_id)
user = User.get(data['key'])
I get a KindError exception:
KindError: Kind 'User' is not a subclass of kind 'User'
I know this user exists and the memcache entry exists, and the User class is defined only once in my project. Why does this error occur, and how can I make it work?
UPDATE: I tried to use db.get() instead of User.get() and it worked. So what can the problem be?
Model.get() does check whether the supplied key is of the correct kind, as described in the documentation; if it is not, it throws a KindError.
db.get() does not do any type checking, and will therefore succeed if the supplied key exists in the data store, but it will not necessarily return a User entity.
So you need to check whether the key in your memcache is actually of the User kind. Are you sure it's not overwritten with the key of a different model at some point?
The App Engine framework defines a class called 'User' as part of the Users API. In addition, you have your own class by the same name. When the exception occurs, you're trying to use one, but getting the other.
To avoid this, rename your model. You should also be careful how you import modules in Python. Instead of:
from google.appengine.api.users import User
or worse:
from google.appengine.api.users import *
you should use:
from google.appengine.api import users
And then refer to users.User, which is unambiguous.
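The self-contradictory-looking message is easier to understand once you see that two unrelated classes can share the name User. A pure-Python illustration (the nested class merely stands in for the User class from the Users API):

```python
# Two distinct classes can share the name "User"; isinstance tells them apart.
class appengine_users:          # stand-in for the google.appengine.api.users module
    class User:
        pass

class User:                     # your own model class, defined in your project
    pass

u = appengine_users.User()
print(isinstance(u, User))                # False: same name, different class
print(type(u).__name__ == User.__name__)  # True: both are named "User", which is
                                          # why the error reads "Kind 'User' is
                                          # not a subclass of kind 'User'"
```

This is why qualified access like users.User is safer than wildcard or direct-name imports: the module prefix makes clear which User is meant.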
The problem, it seems to me, is more subtle than that. I was getting the error with this call to Model.get() (I'm retrieving a top-level singleton object that is always there):
datastore = GDSDatastore.get(gds.Key.from_path(*path))
so I investigated with this code:
datastore = gds.get(gds.Key.from_path(*path))
if not (datastore is None or isinstance(datastore, GDSDatastore)):
    logger.error("KindError isinstance(GDSDatastore)=%s class=%s"
                 % (isinstance(datastore, GDSDatastore), datastore.__class__.__name__))
    raise gds.KindError('Kind %r is not a GDSDatastore instance'
                        % (datastore.kind()))
The vast majority of the time I get no error, but today I got this interesting log:
KindError isinstance(GDSDatastore)=False class=GDSDatastore
Now, that strikes me as rather peculiar.
(Note: GDSDatastore is defined locally: class GDSDatastore(gds.Model))
