I tried to update a Transaction model queryset in the pre_delete signal below, which is connected to the Wallet model, but it didn't work.
The signal triggers, and all lines work correctly, except the last line :(
from django.core.exceptions import ValidationError
from django.db.models.signals import pre_delete
from django.dispatch import receiver

@receiver(pre_delete, sender=Wallet)
def delete_wallet(sender, instance, **kwargs):
    if instance.is_default:
        raise ValidationError({"is_default": "You can't delete the default wallet"})
    q = {"company": instance.company} if instance.company else {"user": instance.user}
    try:
        default_wallet = Wallet.objects.get(**q, is_default=True)
    except Wallet.DoesNotExist:
        default_wallet = Wallet.objects.filter(**q).exclude(pk=instance.pk).first()
    if not default_wallet:
        raise ValidationError({"is_default": "You can't delete the last wallet"})
    instance.transactions.all().update(wallet=default_wallet)
I moved the logic to the delete() function of the Wallet model instead of using the pre_delete() signal, and it's fixed now.
Maybe it can't find the default wallet, so it falls back to the first wallet it finds, which is the one you want to delete (the signal runs before the delete, so that wallet still exists).
Try excluding pk=instance.pk in your default_wallet query.
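For example, both lookups could exclude the wallet being deleted (a sketch using the names from the question):
default_wallet = (
    Wallet.objects.filter(**q, is_default=True)
    .exclude(pk=instance.pk)
    .first()
) or Wallet.objects.filter(**q).exclude(pk=instance.pk).first()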
First, I'd like to check if the file exists, and I've used os.path for this:
import os
from os.path import exists

def check_db_exist():
    try:
        file_exists = exists('games.db')
        if file_exists:
            file_size = os.path.getsize('games.db')
            if file_size > 3000:
                return True, file_size
            else:
                return False, 'too small'
        else:
            return False, 'does not exist'
    except OSError:
        return False, 'error'
I have a separate file for my models and for creating the database. My concern is that if I import the database class, it instantiates the SQLite file.
Moreover, when pywebview displays my HTML, it wipes all variables.
If I were to run this check as I load my page, I couldn't access the True/False variable that says whether the SQLite file exists.
from peewee import CharField, IntegerField, Model, SqliteDatabase

db = SqliteDatabase('games.db')

class Game(Model):
    game = CharField()
    exe = CharField()
    path = CharField()
    longpath = CharField()
    i_d = IntegerField()

    class Meta:
        database = db
This creates the table, so checking if the file exists is useless.
Then, if I uncomment the first line in this file, the database gets created; otherwise all of my db. variables are unusable. I must be missing a really obvious function to solve my problems.
# db = SqliteDatabase('games.db')

def add_game(game, exe, path, longpath, i_d):
    try:
        Game.create(game=game, exe=exe, path=path, longpath=longpath, i_d=i_d)
    except:
        pass

def loop_insert(lib):
    db.connect()
    for i in lib[0]:
        add_game(i.name, i.exe, i.path, i.longpath, i.id)
    db.close()

def initial_retrieve():
    db.connect()
    vals = ''
    for games in Game.select():
        val = js.Import.javascript(str(games.game), str(games.exe), str(games.path), games.i_d)
        vals = vals + val
    storage = vals
    db.close()
    return storage
Should I just import the file at a different point in the file, whenever I feel comfortable? I haven't seen that done often, so I didn't want to be improper in formatting.
Edit: Maybe more like this?
def db():
    db = SqliteDatabase('games.db')
    return db

class Game(Model):
    game = CharField()
    exe = CharField()
    path = CharField()
file 2:
from sqlmodel import db, Game

def add_game(game, exe, path, longpath, i_d):
    try:
        Game.create(game=game, exe=exe, path=path, longpath=longpath, i_d=i_d)
    except:
        pass

def loop_insert(lib):
    db.connect()
    for i in lib[0]:
        add_game(i.name, i.exe, i.path, i.longpath, i.id)
    db.close()
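For comparison with the refactor above, Peewee also supports deferred initialization via DatabaseProxy (called Proxy in older releases), so importing the models module never opens the file; a rough sketch, reusing the same model:
from peewee import CharField, DatabaseProxy, IntegerField, Model, SqliteDatabase

database_proxy = DatabaseProxy()  # placeholder; no file is touched at import time

class Game(Model):
    game = CharField()
    exe = CharField()
    path = CharField()
    longpath = CharField()
    i_d = IntegerField()

    class Meta:
        database = database_proxy

# later, once check_db_exist() has run and you know how to proceed:
database_proxy.initialize(SqliteDatabase('games.db'))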
I am not sure if this answers your question, since it seems to involve multiple processes and/or processors, but in order to check for the existence of a database file, I have used the following:
import os

DATABASE = 'dbfile.db'

if not os.path.isfile(DATABASE):
    # Create the database file here
    pass
else:
    # connect to database here
    db.connect()
I would suggest using SQLite's user_version pragma:
db = SqliteDatabase('/path/to/db.db')
version = db.pragma('user_version')
if not version:  # Assume does not exist/newly-created.
    # do whatever.
    db.pragma('user_version', 1)  # Set user version.
From Reddit:
me: To the original challenge, there's a reason I want to know whether the file exists. Maybe it's flawed at the premises; I'll explain and you can fill in there.
This script will run on multiple machines I don't have access to. At the entry point of a first-time use case, I will be porting data from a remote location; if it's the first time the script runs on that machine, it goes down a different workflow than a repeated opening.
Akin to grabbing all PC programs vs. appending and reading from the last session. How would you suggest quickly understanding whether that process has started and finished in a previous session?
Checking whether the SQLite file exists made the most intuitive sense, and then adjusting by byte size. Let me know.
them:
This is a good question!
How would you suggest quickly understanding if that process has started and finished from a previous session?
If the first thing your program does on a new system is download some kind of fixture data, then the way I would approach it is to load the DB file as normal, have Peewee ensure the tables exist, and then do a no-clause SELECT on one of them (either through the model, or directly on the database through the connection if you want.) If it's empty (you get no results) then you know you're on a fresh system and you need to make the remote call. If you get results (you don't need to know what they are) then you know you're not on a fresh system.
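A rough sketch of that flow with Peewee, using the Game model from the question (create_tables is safe to call repeatedly):
db = SqliteDatabase('games.db')
db.connect()
db.create_tables([Game])  # ensures the table exists; no-op if it already does

if not Game.select().exists():
    # fresh machine: fetch the remote fixture data and insert it here
    ...
else:
    # data already present: continue with the normal workflow
    ...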
I use SQLFORM.smartgrid to show a list of records from a table (service_types). In each row of the smartgrid there is a delete link/button to delete the record. I want to execute some code before smartgrid/web2py actually deletes the record; for example, I want to know if there are child records (services table) referencing this record, and if so, flash a message telling the user that the record cannot be deleted. How is this done?
db.py
db.define_table('service_types',
    Field('type_name', requires=[IS_NOT_EMPTY(), IS_ALPHANUMERIC()]),
    format='%(type_name)s',
)

db.define_table('services',
    Field('service_name', requires=[IS_NOT_EMPTY(), IS_NOT_IN_DB(db, 'services.service_name')]),
    Field('service_type', 'reference service_types',
          requires=IS_IN_DB(db, db.service_types.id, '%(type_name)s',
                            error_message='not in table', zero=None),
          ondelete='RESTRICT',
          ),
    Field('interest_rate', 'decimal(15,2)', requires=IS_DECIMAL_IN_RANGE(0, 100)),
    Field('max_term', 'integer'),
    auth.signature,
    format='%(service_name)s',
)

db.services._plural = 'Services'
db.services._singular = 'Service'

if db(db.service_types).count() < 1:
    db.service_types.insert(type_name='Loan')
    db.service_types.insert(type_name='Contribution')
    db.service_types.insert(type_name='Other')
controller
def list_services():
    grid = SQLFORM.smartgrid(db.services,
                             fields=[db.services.service_name, db.services.service_type],
                             )
    return locals()
view
{{extend 'layout.html'}}
{{=grid}}
There are two options. First, the deletable argument can be a function that takes the Row object of a given record and returns True or False to indicate whether the record is deletable. If it returns False, the "Delete" button will not be shown for that record, nor will the delete operation be allowed on the server.
def can_delete(row):
    return True if [some condition involving row] else False

grid = SQLFORM.smartgrid(..., deletable=can_delete)
Second, there is an ondelete argument that takes the db Table object and the record ID. It is called right before the delete operation, so to prevent the delete, you can do a redirect within that function:
def ondelete(table, record_id):
    record = table(record_id)
    if [some condition]:
        session.flash = 'Cannot delete this record'
        redirect(URL())

grid = SQLFORM.smartgrid(..., ondelete=ondelete)
Note, if the grid is loaded via an Ajax component and its actions are therefore performed via Ajax, using redirect within the ondelete method as shown above will not work well, as the redirect will have no effect and the table row will still be deleted from the grid in the browser (even though the database record was not deleted). In that case, an alternative approach is to return a non-200 HTTP response to the browser, which will prevent the client-side Javascript from deleting the row from the table (the delete happens only on success of the Ajax request). We should also set response.flash instead of session.flash (because we are not redirecting/reloading the whole page):
def ondelete(table, record_id):
    record = table(record_id)
    if [some condition]:
        response.flash = 'Cannot delete this record'
        raise HTTP(403)
Note, both the deletable and ondelete arguments can be dictionaries with table names as keys, so you can specify different values for different tables that might be linked from the smartgrid.
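For example, a sketch using the question's tables:
grid = SQLFORM.smartgrid(db.service_types,
                         deletable=dict(service_types=can_delete),
                         ondelete=dict(service_types=ondelete))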
Finally, notice the delete URLs look like /appname/list_services/services/delete/services/[record ID]. So, in the controller, you can determine if a delete is being requested by checking if 'delete' in request.args. In that case, request.args[-2:] represents the table name and record ID, which you can use to do any checks.
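In the controller, that check might look like this (a sketch):
def list_services():
    if 'delete' in request.args:
        tablename, record_id = request.args[-2:]
        # run any extra checks on record_id here, before the grid processes the delete
    grid = SQLFORM.smartgrid(db.services)
    return locals()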
From Anthony's answer I chose the second option and came up with the following:
def ondelete_service_type(service_type_table, service_type_id):
    count = db(db.services.service_type == service_type_id).count()
    if count > 0:
        session.flash = T("Cant delete")
        #redirect(URL('default','list_service_types#'))
    else:
        pass
    return locals()

def list_service_types():
    grid = SQLFORM.smartgrid(db.service_types,
                             fields=[db.service_types.type_name, db.services.service_name],
                             ondelete=ondelete_service_type,
                             )
    return locals()
But, if I do this...
    if count > 0:
        session.flash = T("Cant delete")
    else:
        pass
    return locals()
I get this error:
And if I do this:
    if count > 0:
        session.flash = T("Cant delete")
        redirect(URL('default','list_service_types#'))  # <== please take note
    else:
        pass
    return locals()
I get the flash error message "Cant delete", but the record appears deleted from the list and reappears after a page refresh with F5 (apparently because the delete was not allowed in the database, which is intended).
Which one should I fix and how?
Note
If either of these issues is resolved, I can accept Anthony's answer.
I have the "reviewer" field available in my task, and I want to switch the reviewer with the task assignee automatically when the task is moved from the 'In progress' stage to the 'Review' stage. I have the following Python code in my server action:
picture of the code in context
def assignrev(self):
    for record in self:
        if record['project.task.type.stage_id.name'] == 'Review':
            a = self.res.users.reviewer_id.name
            b = self.res.users.user_id.name
            record['res.users.user_id.name'] = a
            record['res.users.reviewer_id.name'] = b
and below are links to pictures of my automated action settings:
Server action to run
"When to run" settings
Unfortunately, changing the task stage to 'Review' does not give the expected results. Any suggestions, please?
Kazu
OK, I finally got the answer to this. Below is a picture of the code in context for Odoo 10:
No "def" or "for record" is needed; with them, the code will not run.
I just hope this will be helpful to someone else...
Kazu
My guess is that you are incorrectly calling the fields you're trying to get.
# Instead of this
a = self.res.users.reviewer_id.name
b = self.res.users.user_id.name
record['res.users.user_id.name']=a
record['res.users.reviewer_id.name']=b
# Try this
# You don't need to update the name, you need to update the database ID reference
record['user_id'] = record.reviewer_id.id
record['reviewer_id'] = record.user_id.id
Furthermore, why don't you try using an onchange method instead?
@api.multi
def onchange_state(self):
    for record in self:
        if record.stage_id.name == 'Review':
            record.update({
                'user_id': record.reviewer_id.id,
                'reviewer_id': record.user_id.id,
            })
If you're still having problems, you can use ipdb to debug your code more easily by triggering set_trace in your method.
def assignrev(self):
    # Triggers a break in code so that you can debug
    import ipdb; ipdb.set_trace()
    for record in self:
        # Test line by line with the terminal to see where your problem is
        pass
I am using SQLAlchemy events, which are very good; they let me avoid a lot of repeated work.
Now I need to delete some items and do some post-processing in an after_delete listener. But inside the after_delete listener I need to use information about the deleted items, which can no longer be accessed once the session has been committed. Here's a piece of code.
def shop_delete_category(category_id):
    item_category = ItemCategory.query.get(category_id)
    if item_category is None:
        result['code'] = 100
    else:
        deleted_category_id_items = (ItemCategory.query
                                     .filter(ItemCategory.id == category_id)
                                     .with_entities(ItemCategory.id).all())
        # some other business logic
        db.session.delete(item_category)
        db.session.commit()
My event listener code:
@event.listens_for(ItemCategory, 'after_delete')
def after_delete_shop_category(mapper, connect, target):
    # business logic
    shop_category_delete_signal.send(target)
    # here I will need deleted_category_id_items
    # how can I pass it in?
    # the logic will be like this
    redis_cache_delete_category_ids(deleted_category_id_items)
I want to do some post-processing after the delete action, and in that post-processing I will use deleted_category_id_items. My question is:
how can I pass deleted_category_id_items to the event? Or is there a better way to solve the problem?
Edit: my own answer
I found that a before_delete listener can solve the problem.
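A minimal sketch of that, reusing the names from the question (the ids are read straight off target, which is still fully loaded at this point):
from sqlalchemy import event

@event.listens_for(ItemCategory, 'before_delete')
def before_delete_shop_category(mapper, connection, target):
    # the row has not been deleted yet, so its attributes are still available
    shop_category_delete_signal.send(target)
    redis_cache_delete_category_ids([target.id])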
Edit:
Another question: how can I pass my own parameters to a listener, like this?
@event.listens_for(ItemCategory, 'after_delete')
def after_delete_shop_category(mapper, connect, target, *my_own_parameters):
    # business logic
    shop_category_delete_signal.send(target)
    # here I will need deleted_category_id_items
    # how can I pass it in?
    # the logic will be like this
    redis_cache_delete_category_ids(deleted_category_id_items)
    # can use my_own_parameters to do some work
Thanks very much !
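One pattern that can work here, sketched with the names from the question: SQLAlchemy always calls the listener as (mapper, connection, target), so per-delete data can be stashed on the instance before session.delete() and read back off target in the listener (the attribute name _pending_cache_ids is made up for this example):
def shop_delete_category(category_id):
    item_category = ItemCategory.query.get(category_id)
    if item_category is not None:
        # stash the extra data on the instance itself
        item_category._pending_cache_ids = [item_category.id]
        db.session.delete(item_category)
        db.session.commit()

@event.listens_for(ItemCategory, 'after_delete')
def after_delete_shop_category(mapper, connection, target):
    shop_category_delete_signal.send(target)
    # read the stashed data back off the deleted instance
    redis_cache_delete_category_ids(getattr(target, '_pending_cache_ids', []))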
I have a problem with the session in SQLAlchemy: when I add a row to the DB it's OK, but if I want to add another row without closing my app, it doesn't add.
This is the function in my Model:
def add(self, name):
    self.slot_name = name
    our_slot = self.session_.query(Slot).filter_by(slot_name=str(self.slot_name)).first()
    if our_slot:
        return 0
    else:
        self.session_.add(self)
        self.session_.commit()
        return 1
The problem is that you commit your session. After committing a session, it is closed. Either you commit after you are done adding, or you open a new session after each commit. Also take a look at Session.commit(). You should probably read something about sessions in SQLAlchemy's documentation.
Furthermore, I suggest you do this:
from sqlalchemy.orm.exc import NoResultFound

def add(self, name):
    self.slot_name = name
    try:
        # .one() raises NoResultFound when no slot with this name exists yet
        self.session_.query(Slot)\
            .filter_by(slot_name=str(self.slot_name)).one()
        return 0
    except NoResultFound:
        self.session_.add(self)
        self.session_.commit()
        return 1
Of course, this only works if you expect exactly one result. It is considered good practice to raise exceptions and catch them instead of making up conditions.