Web2py Query Legacy Database - python

I have a legacy database called my_legacy_db which is separate from the normal db.
my_legacy_db
users
- email
- username
- name
So cliff, your first part would work to generate field names and put everything in a dict to build the queries. The problem is when I do this query:
db().select(my_legacy_db.users)
I get this error:
In [20] : db().select(my_legacy_db.users)
Traceback (most recent call last):
File "/opt/web-apps/web2py/gluon/contrib/shell.py", line 233, in run
exec compiled in statement_module.__dict__
File "<string>", line 1, in <module>
File "/opt/web-apps/web2py/gluon/dal.py", line 7578, in select
return adapter.select(self.query,fields,attributes)
File "/opt/web-apps/web2py/gluon/dal.py", line 1307, in select
sql = self._select(query, fields, attributes)
File "/opt/web-apps/web2py/gluon/dal.py", line 1196, in _select
raise SyntaxError, 'Set: no tables selected'
SyntaxError: Set: no tables selected
In [21] : print (flickr_db.users)
users
In [22] : print flickr_db
<DAL {'_migrate_enabled': True, '_lastsql': "SET sql_mode='NO_BACKSLASH_ESCAPES';", '_db_codec': 'UTF-8', '_timings': [('SET FOREIGN_KEY_CHECKS=1;', 0.0002460479736328125), ("SET sql_mode='NO_BACKSLASH_ESCAPES';", 0.00025606155395507812)], '_fake_migrate': False, '_dbname': 'mysql', '_request_tenant': 'request_tenant', '_adapter': <gluon.dal.MySQLAdapter object at 0x91375ac>, '_tables': ['users'], '_pending_references': {}, '_fake_migrate_all': False, 'check_reserved': None, '_uri': 'mysql://CENSORED', 'users': <Table 'username': <gluon.dal.Field object at 0x9137b6c>, '_db': <DAL {...}>, 'cycled': <gluon.dal.Field object at 0x94d0b8c>, 'id': <gluon.dal.Field object at 0x95054ac>, 'ALL': <gluon.dal.SQLALL object at 0x969a7ac>, '_sequence_name': 'users_sequence', 'name': <gluon.dal.Field object at 0x9137ecc>, '_referenced_by': [], '_singular': 'Users', '_common_filter': None, '_id': <gluon.dal.Field object at 0x95054ac>}>, '_referee_name': '%(table)s', '_migrate': True, '_pool_size': 0, '_common_fields': [], '_uri_hash': 'dfb3272fc537e3339819a1549180722e'}>
Am I doing something wrong here? Is the legacy db not being built correctly in /databases? Thanks in advance for any help.
UPDATE: I tried what Anthony suggested in the model shell:
In [3] : db(my_legacy_db.users).select()
Traceback (most recent call last):
File "/opt/web-apps/web2py/gluon/contrib/shell.py", line 233, in run
exec compiled in statement_module.__dict__
File "<string>", line 1, in <module>
File "/opt/web-apps/web2py/gluon/dal.py", line 7577, in select
fields = adapter.expand_all(fields, adapter.tables(self.query))
File "/opt/web-apps/web2py/gluon/dal.py", line 1172, in expand_all
for field in self.db[table]:
File "/opt/web-apps/web2py/gluon/dal.py", line 6337, in __getitem__
return dict.__getitem__(self, str(key))
KeyError: 'users'
Now I know that users is defined in my_legacy_db, and all the syntax is correct. Is this error occurring because the db files aren't being generated correctly, or am I still doing something wrong with the select syntax?

If "users" is the name of a table and you want to select all records and all fields, you would do:
db(my_legacy_db.users).select()
The query goes inside db(), not inside select() (select() is where you list the fields you want returned, or leave it empty if you want all fields). Note that in the above line, my_legacy_db.users is not actually a query but just a table -- that's a shortcut to tell web2py you want all records in the table.
You could also do:
db().select(my_legacy_db.users.ALL)
That indicates you want all fields, and by excluding the query, it assumes you want all records in the table.
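Also note (this is my reading of the KeyError in your update, not something shown above): a DAL instance only knows about its own tables, so if users is defined on my_legacy_db rather than on db, the select has to go through that same DAL object:
# query the legacy table through the DAL instance it was defined on
rows = my_legacy_db(my_legacy_db.users).select()
for row in rows:
    print row.username, row.email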
See the book for more details.

Related

Saving model without writing to db in Django

I have models with a many-to-many relationship. I use a script to populate the database. However, I want to print the objects and save them only if I want them to be saved, using a y/n style input. The problem is that I can't create the objects without saving them, as you can see below.
>>> mov = Movie(name="completenothing")
>>> direc = Director(name="Someone")
>>> direc.movie_name.add(mov)
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/home/username/Code/virtualenvironments/matrix/local/lib/python2.7/site-packages/django/db/models/fields/related_descriptors.py", line 513, in __get__
return self.related_manager_cls(instance)
File "/home/username/Code/virtualenvironments/matrix/local/lib/python2.7/site-packages/django/db/models/fields/related_descriptors.py", line 830, in __init__
(instance, self.pk_field_names[self.source_field_name]))
ValueError: "<Director: Someone>" needs to have a value for field "id" before this many-to-many relationship can be used.
>>> direc.save()
>>> direc.movie_name.add(mov)
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/home/username/Code/virtualenvironments/matrix/local/lib/python2.7/site-packages/django/db/models/fields/related_descriptors.py", line 934, in add
self._add_items(self.source_field_name, self.target_field_name, *objs)
File "/home/username/Code/virtualenvironments/matrix/local/lib/python2.7/site-packages/django/db/models/fields/related_descriptors.py", line 1060, in _add_items
(obj, self.instance._state.db, obj._state.db)
ValueError: Cannot add "<Movie: completenothing N/A>": instance is on database "default", value is on database "None"
>>> mov.save()
>>> direc.movie_name.add(mov)
Director and Movie are in a many-to-many relation and I want their information displayed before saving. Is there some mechanism to allow this?
If you use a pre_save signal, you can try this.
from django.db.models.signals import pre_save

def confirm_save(sender, instance, **kwargs):
    # do something with your instance (display information)
    ans = raw_input("Do you want to save? (y/n) ")  # on Python 3, use input()
    if ans == 'y':
        print("Your instance saved successfully")
    else:
        raise Exception("Not saved")

pre_save.connect(confirm_save, sender=MyModel)
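For the models in the question, you would presumably connect the handler to each model you want to confirm before saving (a sketch, assuming the Movie and Director models above):
pre_save.connect(confirm_save, sender=Movie)
pre_save.connect(confirm_save, sender=Director)
Keep in mind that pre_save only fires on .save(); the .add() call on the many-to-many relation is reported through the separate m2m_changed signal, not pre_save.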

can't use pony orm on sqlite3 blob fields

Just trying some basic exercises with Pony ORM (and Python 3.5, sqlite3).
To start with, I just want to print the result of a select query over some data I have, without any further processing. Pony ORM does not seem to like that at all.
The sqlite db dump
PRAGMA foreign_keys=OFF;
BEGIN TRANSACTION;
CREATE TABLE sums (t text, path BLOB, name BLOB, sum text, primary key (path,name));
INSERT INTO "sums" VALUES('directory','','','');
INSERT INTO "sums" VALUES('file','','sums-backup-f.db','6859b35f9f026317c5df48932f9f2a91');
INSERT INTO "sums" VALUES('file','','md5-tree.py','c7af81d4aad9d00e88db7af950c264c2');
INSERT INTO "sums" VALUES('file','','test.db','a403e9b46e54d6ece851881a895b1953');
INSERT INTO "sums" VALUES('file','','sirius-alexa.db','22a20434cec550a83c675acd849002fa');
INSERT INTO "sums" VALUES('file','','sums-reseau-y.db','1021614f692b5d7bdeef2a45b6b1af5b');
INSERT INTO "sums" VALUES('file','','.md5-tree.py.swp','1c3c195b679e99ef18b3d46044f6e6c5');
INSERT INTO "sums" VALUES('file','','compare-md5.py','cfb4a5b3c7c4e62346aa5e1affef210a');
INSERT INTO "sums" VALUES('file','','charles.local.db','9c50689e8185e5a79fd9077c14636405');
COMMIT;
Here is the code I try to run in the Python 3.5 interactive shell:
from pony.orm import *

db = Database()

class File(db.Entity):
    _table_ = 'sums'
    t = Required(str)
    path = Required(bytes)
    name = Required(bytes)
    sum = Required(str)
    PrimaryKey(path, name)

db.bind('sqlite', '/some/edited/path/test.db')
db.generate_mapping()
File.select().show()
And it fails like this :
Traceback (most recent call last):
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 5149, in _fetch
try: result = cache.query_results[query_key]
KeyError: (('f', 0, ()), (<pony.orm.ormtypes.SetType object at 0x7fd2d2701708>,), False, None, None, None, False, False, False, ())
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 2, in show
File "/usr/lib/python3.5/site-packages/pony/utils/utils.py", line 75, in cut_traceback
raise exc # Set "pony.options.CUT_TRACEBACK = False" to see full traceback
File "/usr/lib/python3.5/site-packages/pony/utils/utils.py", line 60, in cut_traceback
try: return func(*args, **kwargs)
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 5256, in show
query._fetch().show(width)
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 5155, in _fetch
used_attrs=translator.get_used_attrs())
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 3859, in _fetch_objects
real_entity_subclass, pkval, avdict = entity._parse_row_(row, attr_offsets)
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 3889, in _parse_row_
avdict[attr] = attr.parse_value(row, offsets)
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 1922, in parse_value
val = attr.validate(row[offset], None, attr.entity, from_db=True)
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 2218, in validate
val = Attribute.validate(attr, val, obj, entity, from_db)
File "/usr/lib/python3.5/site-packages/pony/orm/core.py", line 1894, in validate
if from_db: return converter.sql2py(val)
File "/usr/lib/python3.5/site-packages/pony/orm/dbapiprovider.py", line 619, in sql2py
if not isinstance(val, buffer): val = buffer(val)
TypeError: string argument without an encoding
Am I using this wrong, or is this a bug? I don't mind filing a bug, but it's the first time I'm using this ORM, so I thought it might be better to check first...
SQLite has a (mis)feature, which allows a column to store an arbitrary value disregarding the column type. Instead of rigid data type, each SQLite column has an affinity, while each value has a storage class which can be different within the same column. For example, you can store text value inside an integer column, and vice versa. See Datatypes In SQLite Version 3 for more information.
The reason for the error is that the table contains values of "wrong" type in its BLOB columns. Correct SQLite binary literal looks like x'abcdef'. The INSERT commands that you use insert UTF8 strings instead.
This problem was somewhat fixed in the latest version of Pony, which you can take from GitHub. Now if Pony receives a string value from a BLOB column, it just keeps that value without throwing an exception.
If you populate the table with Pony, it will write BLOB data as correct binary values, so it can read them back later without any problem.
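For example, a minimal sketch of populating the table through Pony (the values below are made up), so that the BLOB columns end up holding real binary data that sql2py can read back:
with db_session:
    # path and name are passed as bytes, so SQLite stores them with a BLOB storage class
    File(t='file', path=b'/backups', name=b'test.db',
         sum='d41d8cd98f00b204e9800998ecf8427e')
    File.select().show()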

Can't figure out Web2py ticket error, it's odd

I'm new to Web2py and am unable to understand the error that the ticket is throwing up. Can someone explain the error and why it is occurring?
Ticket ID
127.0.0.1.2016-05-28.15-45-10.493c5f3c-e5f2-4034-8e82-69637b1fcc35
<type 'exceptions.SyntaxError'> invalid table/column name "size" is a "ALL" reserved SQL/NOSQL keyword
Version
web2py™ Version 2.12.1-stable+timestamp.2015.08.07.07.22.06
Traceback (most recent call last):
File "C:\Users\sharankumar\Desktop\New\gluon\restricted.py", line 227, in restricted
exec ccode in environment
File "C:/Users/sharankumar/Desktop/New/applications/MyLogin/models/db.py", line 232, in <module>
format='%(name)s')
File "C:\Users\sharankumar\Desktop\New\gluon\packages\dal\pydal\base.py", line 817, in define_table
table = self.lazy_define_table(tablename,*fields,**args)
File "C:\Users\sharankumar\Desktop\New\gluon\packages\dal\pydal\base.py", line 834, in lazy_define_table
table = table_class(self, tablename, *fields, **args)
File "C:\Users\sharankumar\Desktop\New\gluon\packages\dal\pydal\objects.py", line 351, in __init__
check_reserved(field_name)
File "C:\Users\sharankumar\Desktop\New\gluon\packages\dal\pydal\base.py", line 519, in check_reserved_keyword
'invalid table/column name "%s" is a "%s" reserved SQL/NOSQL keyword' % (name, backend.upper()))
SyntaxError: invalid table/column name "size" is a "ALL" reserved SQL/NOSQL keyword
In db.define_table(), it appears you have attempted to create a field named "size", which is not allowed because it is a SQL reserved word. You should either change the field name or use the "rname" argument to specify a different name for the database to use:
Field('size', rname='object_size', ...)
With the above, you can use the name "size" in all of your Python code, but the database will actually create a field with the name "object_size".
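For example, a sketch of how that suggestion might look in a model (the table name and the other field are made up for illustration):
db.define_table('uploads',
    Field('name'),
    Field('size', 'integer', rname='object_size'),
    format='%(name)s')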

Syntax error while running mongo-connector between MongoDB and Neo4J

I am using mongo-connector to do the initial bulk_upsert operation between MongoDB and Neo4j. At some point while querying with py2neo, an InvalidSyntax exception occurs, and as a result nothing is inserted into the graph database. I believe the issue lies somewhere in the DocManager during syntax translation. I am running py2neo v2.0.8 and Neo4j v2.3.1.
Here is the detailed stack trace:
Exception in thread Thread-2:
Traceback (most recent call last):
File "//anaconda/lib/python2.7/threading.py", line 810, in __bootstrap_inner
self.run()
File "//anaconda/lib/python2.7/site-packages/mongo_connector/util.py", line 85, in wrapped
func(*args, **kwargs)
File "//anaconda/lib/python2.7/site-packages/mongo_connector/oplog_manager.py", line 256, in run
docman.upsert(doc, ns, timestamp)
File "//anaconda/lib/python2.7/site-packages/mongo_connector/doc_managers/neo4j_doc_manager.py", line 66, in upsert
tx.commit()
File "//anaconda/lib/python2.7/site-packages/py2neo/cypher/core.py", line 333, in commit
return self.post(self.__commit or self.__begin_commit)
File "//anaconda/lib/python2.7/site-packages/py2neo/cypher/core.py", line 288, in post
raise self.error_class.hydrate(error)
InvalidSyntax: Invalid input '{': expected whitespace, comment or a label name (line 1, column 20 (offset: 19))
"MERGE (d:Document: { _id: {parameters}._id})"
What could be happening here?
Thanks for reporting this.
Neo4j Doc Manager uses a key naming convention of xxx_id to identify relationships, where the value of a property with key xxx_id is assumed to be an id referencing a document in collection xxx. This convention allows us to define relationships from the document data model. I'm assuming that the error here is caused by Neo4j Doc Manager treating the nested document's _id field as a relationship, but not checking for a null collection name (since nothing appears before "_id" in the key).
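To illustrate the convention with a made-up document (my own example, not taken from the question): a key such as director_id is read as a reference to a document in the director collection, while a bare _id inside a nested document leaves nothing in front of "_id", so the collection name comes out empty:
# hypothetical MongoDB document, only to illustrate the xxx_id convention
doc = {
    "_id": "doc1",                     # id of this document
    "director_id": "dir1",             # becomes a relationship to the "director" collection
    "metadata": {"_id": "nested1"}     # bare "_id": the collection prefix is empty
}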
This is a bug and we'll add a check for this to avoid the Cypher syntax error. Those interested can track the issue here: https://github.com/neo4j-contrib/neo4j_doc_manager/issues/56

Select a single column from MySQL DB using SQLAlchemy

How do I get values from a single column using SQLAlchemy?
In MySQL
select id from request r where r.product_id = 1;
In Python
request = meta.tables['request']
request.select(request.c.product_id==1).execute().rowcount
27L
>>> request.select([request.c.id]).where(request.c.product_id==1).execute()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "build/bdist.freebsd-6.3-RELEASE-i386/egg/sqlalchemy/sql/expression.py", line 2616, in select
File "build/bdist.freebsd-6.3-RELEASE-i386/egg/sqlalchemy/sql/expression.py", line 305, in select
File "build/bdist.freebsd-6.3-RELEASE-i386/egg/sqlalchemy/sql/expression.py", line 5196, in __init__
File "build/bdist.freebsd-6.3-RELEASE-i386/egg/sqlalchemy/sql/expression.py", line 1517, in _literal_as_text
sqlalchemy.exc.ArgumentError: SQL expression object or string expected.
I found the answer: I have to use the generic select() rather than the table's select() method.
Leaving this here in case more folks find it useful.
from sqlalchemy import select

conn = engine.connect()
stmt = select([request.c.id]).where(request.c.product_id==1)
conn.execute(stmt).rowcount
27L
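If you want the actual id values rather than just the row count, you can iterate the result of the same statement (a small sketch):
# each returned row has a single column, the id
ids = [row[0] for row in conn.execute(stmt)]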
