I've got a database created using Django models, which I'm now accessing using SQLAlchemy and Elixir. Querying works and I can pull items out of the database perfectly happily, but when I edit an item and try to save it, the following exception is thrown:
>>> p = Problem.query.first()
>>> p
<Problem('Test Problem', 'This is a test problem so the database has something in it', 'SBMT')>
>>> p.name
u'Test Problem'
>>> p.name = "Test_Problem"
>>> p.save()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/elixir/entity.py", line 1116, in save
return self._global_session.save(self, *args, **kwargs)
AttributeError: 'Session' object has no attribute 'save'
What am I doing wrong? Have I missed a crucial part of the setup that is preventing me from saving things to the database, or is it a problem with my versions of Elixir and SQLAlchemy?
I've already run setup_all() and the metadata.bind is all set, hence I can query the database.
I'm no Elixir expert, but from the documentation it looks like it uses something called a global session.
To save changes to the database, you call session.commit(); there's no save() method like in Django.
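For example, a minimal sketch of that workflow, assuming the default global session that Elixir exposes as elixir.session:

from elixir import session

p = Problem.query.first()
p.name = "Test_Problem"
session.commit()  # flushes pending changes and persists them; no per-object save()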
I want to connect a QComboBox with SQL Server.
I'm creating an employee form where the user can add the data for each employee. I want to use 3 tables (already made and populated in SQL Server) for the QComboBoxes: one for the Building, one for the Department, and one for the Job. The problem is that I can't figure out how to do it.
I've used some code that I found online, which is the following:
self.deptcombbox = QComboBox()
self.deptcombbox.addItems(self.deptcomb)
and
def deptcomb(self):
    conn = pyodbc.connect("DRIVER={SQL Server};SERVER=DESKTOP-QFV3HFM\SQLEXPRESS;DATABASE=Tests;USERNAME=ggs;PASSWORD=2332")
    q = conn.cursor()
    q.execute('SELECT DISTINCT dept_name FROM departments')
    pmList = q.fetchall()
    pmList = list(pmList)
    conn.commit()
    conn.close()
    return pmList
but I get this error:
Traceback (most recent call last):
File "c:\Users\gnsta\Documents\Programming tests\EmployeeForm.py", line 89, in <module>
window = Window()
File "c:\Users\gnsta\Documents\Programming tests\EmployeeForm.py", line 24, in __init__
self.buildingcombobox.addItems(self.buildcomb)
TypeError: addItems(self, Iterable[str]): argument 1 has unexpected type 'method'
I understand that I need to use a list for the combobox, but list() isn't working.
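For what it's worth, here is a minimal sketch of one likely fix, assuming deptcomb() returns single-column rows as in the code above: call the method (note the parentheses, which the error message is complaining about) and flatten the pyodbc rows, which behave like tuples, into plain strings before passing them to addItems():

dept_names = [row[0] for row in self.deptcomb()]  # row[0] is the dept_name column
self.deptcombbox.addItems(dept_names)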
I'm setting up Superset using Python 3.7.7 + Debian 10 in a Docker container and am getting an error when running superset init. Expected result: Superset loads example data and initial configuration into my PostgreSQL 11 database.
The problem looks to be related to Superset's attempt at loading information about its example database. Specifically, some of the values it's attempting to insert are being read as NoneType, which sqlalchemy is rejecting because they're not bytes-like. Here are the most relevant portions of the trace:
2020-05-20 22:07:50,651:INFO:root:Creating database reference for examples
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/cryptography/utils.py", line 36, in _check_byteslike
memoryview(value)
TypeError: memoryview: a bytes-like object is required, not 'NoneType'
...
File "/usr/local/lib/python3.7/site-packages/superset/cli.py", line 51, in init
utils.get_example_database()
File "/usr/local/lib/python3.7/site-packages/superset/utils/core.py", line 976, in get_example_database
return get_or_create_db("examples", db_uri)
File "/usr/local/lib/python3.7/site-packages/superset/utils/core.py", line 968, in get_or_create_db
db.session.commit()
...
sqlalchemy.exc.StatementError: (builtins.TypeError) data must be bytes-like
[SQL: INSERT INTO dbs (created_on, changed_on, verbose_name, database_name, sqlalchemy_uri, password, cache_timeout, select_as_create_table_as, expose_in_sqllab, allow_run_async, allow_csv_upload, allow_ctas, allow_dml, force_ctas_schema, allow_multi_schema_metadata_fetch, extra, perm, impersonate_user, created_by_fk, changed_by_fk) VALUES (%(created_on)s, %(changed_on)s, %(verbose_name)s, %(database_name)s, %(sqlalchemy_uri)s, %(password)s, %(cache_timeout)s, %(select_as_create_table_as)s, %(expose_in_sqllab)s, %(allow_run_async)s, %(allow_csv_upload)s, %(allow_ctas)s, %(allow_dml)s, %(force_ctas_schema)s, %(allow_multi_schema_metadata_fetch)s, %(extra)s, %(perm)s, %(impersonate_user)s, %(created_by_fk)s, %(changed_by_fk)s) RETURNING dbs.id]
[parameters: [{'database_name': 'examples', 'sqlalchemy_uri': 'postgresql://superset:[redacted]@[redacted]:5432/superset', 'password': '[redacted]', 'perm': None, 'verbose_name': None, 'cache_timeout': None, 'force_ctas_schema': None}]]
Full trace here. The funny business starts here, when Superset attempts to record the metadata about its example database but apparently doesn't pass parameters for all the required fields. It attempts to instantiate the Database model here.
I get the same error with superset load_examples. I don't think it's a database connection issue, as I'm able to connect via psql and can see that data has been populated in the user table.
Obviously this command typically works fine in other environments, so I'm wondering if perhaps there's some kind of incompatibility in my setup I'm not aware of. Package versions: apache-superset 0.35.2, cryptography 2.7, sqlalchemy 1.3.16, sqlalchemy-utils 0.36.1. My superset_config.py is here, and the Dockerfile is a duplicate of this one.
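For reference, the bottom of the trace comes down to cryptography's bytes-like check receiving None, which suggests one of the None parameters is reaching an encryption step that expects bytes. The failure reproduces in isolation:

memoryview(None)  # TypeError: memoryview: a bytes-like object is required, not 'NoneType'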
I am pretty new to Python and trying to use a very simple one-table database.
I'm using Python 3, peewee, and pymysql. The database is MySQL, local on my Windows 10 PC (WampServer), and the code is:
import pymysql
pymysql.install_as_MySQLdb()
import MySQLdb
import peewee
from peewee import *
db = MySQLdb.connect(user='xxxxx', passwd='xxxxx',
                     host='localhost', port=3306, db='xxxxx')

class SearchUrl(peewee.Model):
    id = peewee.AutoField()
    url = peewee.TextField()
    page_title = peewee.TextField()
    date_found = peewee.DateField()
    date_last_changed = peewee.DateField()
    article_status = peewee.CharField()
    description = peewee.TextField()

    class Meta:
        database = db

newUrl = SearchUrl.create(url='http://cropnosis.co.uk', page_title="Cropnosis Ltd.",
                          date_found='2018-04-13', status='new', description='Cropnosis website')
newUrl.save()

for url in SearchUrl.filter(url='http://cropnosis.co.uk'):
    print(url.description)

db.close()
Whenever I run this I get the following error on the SearchUrl.create line. I have searched for the returning_clause attribute but have found little or no mention of it, especially with regard to MySQL.
Any help or relevant links would be greatly appreciated.
Traceback (most recent call last):
File "researcherDb.py", line 26, in <module>
date_found='2018-04-13', status='new', description='Cropnosis website')
File "C:\.dev\Client\Cropnosis\Researcher\lib\site-packages\peewee.py", line 5161, in create
inst.save(force_insert=True)
File "C:\.dev\Client\Cropnosis\Researcher\lib\site-packages\peewee.py", line 5284, in save
pk_from_cursor = self.insert(**field_dict).execute()
File "C:\.dev\Client\Cropnosis\Researcher\lib\site-packages\peewee.py", line 5128, in insert
return ModelInsert(cls, cls._normalize_data(__data, insert))
File "C:\.dev\Client\Cropnosis\Researcher\lib\site-packages\peewee.py", line 5868, in __init__
if self.model._meta.database.returning_clause:
AttributeError: 'Connection' object has no attribute 'returning_clause'
You want this:
from peewee import *
# this is peewee.MySQLDatabase, not pymysql's connection class.
db = MySQLDatabase('the_db_name', user='xxxxx', passwd='xxxxx')
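A slightly fuller sketch of how the question's model would then bind to the peewee database object (field names copied from the question; connect() and create_tables() are standard peewee calls):

from peewee import *

db = MySQLDatabase('the_db_name', user='xxxxx', passwd='xxxxx',
                   host='localhost', port=3306)

class SearchUrl(Model):
    url = TextField()
    page_title = TextField()
    date_found = DateField()
    article_status = CharField()
    description = TextField()

    class Meta:
        database = db  # the peewee database object, not a raw driver connection

db.connect()
db.create_tables([SearchUrl])  # peewee adds an implicit auto-incrementing id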
When trying to connect to MongoDB, I'm experiencing an error. How can I solve this?
Traceback (most recent call last):
File "D:/MongoDB-U/Python/Codes/Try.py", line 17, in
print (item['name'])
TypeError: 'NoneType' object has no attribute '__getitem__'
Code:
import pymongo
from pymongo import MongoClient
connection = MongoClient('localhost',27017)
db = connection.test
names = db.things
item = things.find_one()
print (item['name'])
You're creating a names collection variable but then using a things collection variable in your find_one call. It should be:
db = connection.test
things = db.things
item = things.find_one()
print (item['name'])
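Also worth noting: the TypeError in the traceback means find_one() returned None, which is what it returns when the collection has no matching document, so a guard doesn't hurt:

item = things.find_one()
if item is not None:
    print (item['name'])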
I am running define_table in the recommended way:
db = DAL('postgres://user:XXXX@localhost:5432/mydb', migrate_enabled=False, auto_import=False, lazy_tables=True)
db.define_table('auth_user',
    Field('email', unique=True),
    Field('password', length=512, type='password', readable=False, label='Password'),
    ...)
This gets executed without errors, but no table is created in the database. Whenever I try to insert a new user:
relation "auth_user" does not exist
What could be going on? Once the tables are created (manually, for example), the application works fine. I am using a postgres backend, and this happens no matter what value I give to lazy_tables.
EDIT
This is the full test script:
from gluon import DAL
from gluon import Field
db = DAL('postgres://user:pass@localhost:5432/mydb', migrate_enabled=False)
db.define_table(
    'auth_user',
    Field('email', type='string', unique=True),
    Field('password', type='password'),
    Field('registration_key', type='string', length=512, writable=False, readable=False, default=''),
    Field('reset_password_key', type='string', length=512, writable=False, readable=False, default=''),
    Field('registration_id', type='string', length=512, writable=False, readable=False, default=''),
)
db.commit()
print db.tables
db.auth_user.insert(email='g@b.c')
And I get the following output:
['auth_user']
Traceback (most recent call last):
File "xxx.py", line 19, in <module>
db.auth_user.insert(email='g@b.c')
File "/tmp/web2py/gluon/dal.py", line 9293, in insert
ret = self._db._adapter.insert(self, self._listify(fields))
File "/tmp/web2py/gluon/dal.py", line 1361, in insert
raise e
psycopg2.ProgrammingError: relation "auth_user" does not exist
LINE 1: INSERT INTO auth_user(reset_password_key,registration_id,reg...
The table is somehow "created" (in memory?), but it is not really in the postgres database. What does this mean?
Simply remove migrate_enabled=False; that setting turns off migrations and therefore prevents the creation or modification of database tables. There is also no need to explicitly set auto_import=False, as that is already the default.
If the above doesn't help, it is possible that web2py did successfully create such a table previously and it was removed without web2py knowing about it. If the application's /databases folder includes a file with a name like *_auth_user.table, delete that file and try again.
If that's not the issue, check the /databases/sql.log file and confirm that web2py attempted to create the table. Most likely, something in your system configuration is preventing the table from being created.
UPDATE: From your edit, it appears you are using the DAL outside of a web2py application. Because you have not specified the folder argument to the DAL() constructor, it will save the *.table migration files in the current working directory, and it will not create a sql.log file. In this case, it is best to create a separate folder for the migration and log files:
DAL('postgres://user:pass#localhost:5432/mydb', folder='/path/to/folder')
In that case, it will save all of the *.table migration files and the sql.log file in the specified folder.
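Putting both points together, a minimal sketch of the corrected standalone script ('/path/to/folder' is a placeholder for whatever directory you choose):

from gluon import DAL, Field

# migrations stay enabled (the default), and the folder argument gives the
# DAL a home for its *.table migration files and sql.log
db = DAL('postgres://user:pass@localhost:5432/mydb', folder='/path/to/folder')
db.define_table('auth_user', Field('email', unique=True))
db.commit()
db.auth_user.insert(email='g@b.c')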