I've been running a Django application on my local machine and am trying to push it to App Engine. One of the queries I was making before, which never caused any trouble, was:
ALTER TABLE Records ADD COLUMN Id
but when I try to execute this query on Cloud SQL, I get this error:
Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1
What am I doing wrong?
The syntax for Alter table is
ALTER TABLE table_name ADD column_name datatype
You forgot to specify the datatype for Id.
Something like:
ALTER TABLE Records ADD COLUMN Id INT
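For illustration, the corrected statement runs cleanly. The sketch below uses an in-memory SQLite database purely to demonstrate the syntax; MySQL on Cloud SQL accepts the same form:

```python
import sqlite3

# In-memory database just to demonstrate the corrected syntax;
# the same statement works on MySQL / Cloud SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Records (Name TEXT)")
conn.execute("ALTER TABLE Records ADD COLUMN Id INT")  # datatype supplied

cols = [row[1] for row in conn.execute("PRAGMA table_info(Records)")]
print(cols)  # ['Name', 'Id']
```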
I am working on my first heroku web-app using python and flask. I have it connected to a SQLite database locally and postgresql database through heroku. When running SQL commands in Heroku, I am able to query all of the data from another table, but when I try to access data from the "user" table, it says there is no column named "username", even though in the explorer it shows a column with the same name:
Does this mean my table is empty and the users aren't being added? I'm not getting any errors when adding the users in the app. Any help is appreciated.
"user" is a reserved word in postgresql, so when you're querying that user table, your query isn't going exactly where you expect. Specifying the full table name with its schema should work, which in your case is probably public.user.
So your query would look like this:
SELECT "username"
FROM public.user;
Another way of dealing with this is to surround "user" with quotation marks, like this:
SELECT "username"
FROM "user";
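The double-quote escaping works for any reserved word. As a sketch, the example below uses SQLite (where "order" is reserved, just as "user" is in PostgreSQL) only to illustrate the quoting rule:

```python
import sqlite3

# SQLite used only to illustrate the quoting rule: "order" is a reserved
# word here, just as "user" is in PostgreSQL, so it must be double-quoted.
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "order" (username TEXT)')
conn.execute('INSERT INTO "order" (username) VALUES (?)', ("alice",))

rows = conn.execute('SELECT username FROM "order"').fetchall()
print(rows)  # [('alice',)]
```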
I cannot access one of the tables in my database. I can't even delete the table by myself, so I am not sure what to do. Here is the error:
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'match' at line 1
So this error appears every time I am doing an operation on that table (describe, select, drop, etc.)
I have deleted the model from my code, but that doesn't remove the table from the db.
This is a bit of speculation, but the error refers to match. It might not be obvious, but match is a reserved word in MySQL, used for full-text searches.
If you have a column or table named match and it is being referred to without escape characters (backticks), then you would likely get an error like this.
The thing to do is to fix the name of the table/column so it does not conflict with a reserved word, or escape it with backticks everywhere it appears.
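Backtick escaping is enough to describe, select from, and drop the table. As a sketch, SQLite also accepts MySQL-style backtick identifiers, so it can illustrate the technique without a MySQL server:

```python
import sqlite3

# SQLite accepts MySQL-style backticks for compatibility, so it can
# illustrate the escaping; on MySQL the backticks are what make the
# reserved word `match` usable as a table name.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE `match` (id INTEGER)")
conn.execute("INSERT INTO `match` (id) VALUES (1)")
rows = conn.execute("SELECT id FROM `match`").fetchall()
print(rows)  # [(1,)]
conn.execute("DROP TABLE `match`")  # escaping also lets you drop the table
```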
I'm running a Python db migration script (Flask-Migrate) and have added the alembic.ddl.impl DefaultImpl import to get around the first set of errors, but now I'm getting the following. I'm trying to use this script to set up my tables and database in Snowflake. What am I missing? Everything else seems to be working, and I can't find any help on this particular error in the Snowflake documentation. I would have assumed that the Snowflake SQLAlchemy connector would handle the creation of a unique index.
The script so far does create several of the tables, but when it gets to this part it throws the error.
> sqlalchemy.exc.ProgrammingError:
> (snowflake.connector.errors.ProgrammingError) 001003 (42000): SQL
> compilation error: syntax error line 1 at position 7 unexpected
> 'UNIQUE'. [SQL: CREATE UNIQUE INDEX ix_flicket_users_token ON
> flicket_users (token)] (Background on this error at:
> http://sqlalche.me/e/f405)
Snowflake does not have INDEX objects, so any CREATE ... INDEX statement will fail.
With Snowflake, you have to trust the database to organize your data with micro partitions and build a good access plan for your queries.
You will feel uneasy at first, but eventually stop worrying.
Bleeding-edge workloads will still require monitoring and tuning performance via the query log, however.
Nothing new here.
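One pragmatic workaround is to drop any index DDL before it reaches Snowflake. The helper below is hypothetical (in practice you would instead delete the op.create_index(...) calls from the generated migration script), but it shows the idea:

```python
def strip_index_ddl(statements):
    """Remove CREATE [UNIQUE] INDEX statements, which Snowflake rejects.

    Hypothetical helper for illustration: in practice, delete the
    op.create_index(...) calls from the generated Alembic migration.
    """
    skip = ("CREATE INDEX", "CREATE UNIQUE INDEX")
    return [s for s in statements if not s.lstrip().upper().startswith(skip)]

ddl = [
    "CREATE TABLE flicket_users (id INT, token VARCHAR)",
    "CREATE UNIQUE INDEX ix_flicket_users_token ON flicket_users (token)",
]
print(strip_index_ddl(ddl))  # only the CREATE TABLE statement survives
```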
I'm new to sqlalchemy and have been trying to figure this out for days!
I have some python code which is executing the following line:
mdb_session.query(PendingConfig).filter(PendingConfig.id == config.id).delete()
It's deleting all rows in a table called PendingConfig that have an id equal to a given config.id.
I want to log the underlying SQL query that SQLAlchemy generates, but I don't know how to do that, since delete() returns only the number of rows deleted.
I tried setting up a logger, but that had its own issues, as I explained in this post.
Need help on this!
If you really want to get the SQL that was actually run by the MySQL server, then you can enable the MySQL query log or slow query log, and read it from the database server.
See https://dev.mysql.com/doc/refman/5.7/en/slow-query-log.html
The MySQL Server doesn't know anything about Python, it just knows that a client sent it a query to execute.
If it's a parameterized query, the SQL text may contain ? placeholders, but drivers such as MySQLdb interpolate the parameter values into the query on the client side before sending it, so the server-side log shows the final statement as executed.
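If server-side logging isn't an option, SQLAlchemy itself can log every statement it emits: pass echo=True to create_engine, or raise the "sqlalchemy.engine" logger to INFO. A minimal stdlib-only sketch:

```python
import logging

# SQLAlchemy emits every statement (and its parameters) through the
# "sqlalchemy.engine" logger; INFO shows the SQL, DEBUG adds result rows.
logging.basicConfig()
logging.getLogger("sqlalchemy.engine").setLevel(logging.INFO)

# Equivalent shortcut when creating the engine (assumes SQLAlchemy):
# engine = create_engine("mysql://...", echo=True)
```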
I am running into MySQL behavior on Google Cloud SQL I have never seen before.
Every MySQL command we try is working from a Python script except INSERT. We can create the table and show tables, but when we insert data - nothing appears in the table. Yet, if we copy that exact same insert statement to the MySQL command line and run it, the insert works fine.
BUT here is the unusual part. Even though the Python script fails to insert data, the UID AUTO_INCREMENT field has incremented for every empty and failed insert. For example, if the Python script fails to insert a row, the next time we run an insert from the MySQL command line, we see that the UID field has incremented by one.
It is as if MySQL started to insert the data, auto-incremented the UID field, but then the data never arrived.
We are using MySQL on Google Cloud SQL. The insert is a simple test:
insert into queue (filename, text) VALUES ('test', 'test')
Any ideas what this is or how to debug it?
It turns out AUTOCOMMIT is set to OFF on Google Cloud SQL.
All SQL inserts must be followed by a commit statement.
For example:
import MySQLdb as mdb

db = mdb.connect(ip, login, pword)
cursor = db.cursor()
query = "insert into tbname (entity, attribute) VALUES ('foo', 'bar')"
cursor.execute(query)
db.commit()  # required because autocommit is off
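The same before/after-commit behavior can be reproduced with any transactional database. A sketch using a temporary SQLite file with two connections (one writer, one reader):

```python
import os
import sqlite3
import tempfile

# Temporary on-disk database so a second connection can observe it.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
writer = sqlite3.connect(path)
writer.execute("CREATE TABLE queue (filename TEXT, text TEXT)")
writer.commit()

writer.execute("INSERT INTO queue (filename, text) VALUES ('test', 'test')")

reader = sqlite3.connect(path)
before = reader.execute("SELECT COUNT(*) FROM queue").fetchone()[0]

writer.commit()  # without this, the reader never sees the row
after = reader.execute("SELECT COUNT(*) FROM queue").fetchone()[0]
print(before, after)  # 0 1
```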