MySQL data is being cached with Google Cloud SQL and SQLAlchemy [duplicate] - python

This question already has answers here:
Why are some mysql connections selecting old data from the mysql database after a delete + insert?
(2 answers)
Closed 8 months ago.
This is my first question on Stack Overflow, so please correct me if I do something wrong :).
My data from a database hosted on Google Cloud SQL is being cached when accessed through Flask-SQLAlchemy. When I add a new record and then try to fetch it, it doesn't exist.
I use one script to add records and another to fetch them.
I first tried this with SQLite, which worked perfectly, but with MySQL on Google Cloud SQL it doesn't.
Every time I add or change something in the database I call db.session.commit()
I use the pymysql module with this: pymysql.install_as_MySQLdb()
And my connection URI looks like this: mysql://...
Edit:
This is what I use to add a new record (my script is for adding jokes):
new_joke = Jokes(joke, user["username"], user["id"], avatar_url, "0")
db.session.add(new_joke)
db.session.commit()
And this is what I use to get a record (random):
jokes = Jokes.query.all()
randint = random.randint(0, len(jokes) - 1)
joke = jokes[randint]

I found an answer!
Calling db.session.commit() before making a query ends the session's current transaction, so the next query reads fresh data instead of the old snapshot.
My code now looks like this:
db.session.commit()
jokes = Jokes.query.all()
randint = random.randint(0, len(jokes) - 1)
joke = jokes[randint]
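For context: MySQL's InnoDB engine defaults to REPEATABLE READ isolation, so a long-lived session keeps reading the same snapshot until its transaction ends, and committing is what ends it. As an alternative to committing before every read, the isolation level can be relaxed. A minimal sketch, assuming a standard Flask-SQLAlchemy setup (SQLALCHEMY_ENGINE_OPTIONS is forwarded to SQLAlchemy's create_engine):
# Make each new transaction see freshly committed rows.
app.config["SQLALCHEMY_ENGINE_OPTIONS"] = {
    "isolation_level": "READ COMMITTED",
}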

Related

My data doesn't show up on DB Browser for SQLite [duplicate]

This question already has answers here:
Sqlite insert query not working with python?
(2 answers)
Closed 1 year ago.
I am currently learning SQL for one of my projects, and the site I'm learning from advised me to use DB Browser to inspect my database content. However, I can't see the data inside the database. This is what my code looks like: I create a table and then try to write some values into it. It creates the DB successfully, but the data doesn't show up.
import sqlite3 as sql
connection = sql.connect("points.db")
cursor = connection.cursor()
cursor.execute("CREATE TABLE IF NOT EXISTS servers (server_id TEXT, name TEXT, exp INTEGER)")
cursor.execute("INSERT INTO servers VALUES ('848117357214040104', 'brknarsy', 20)")
Can you check that your data is inserted?
Something like this at the end:
cursor.execute("SELECT * FROM servers")
r = cursor.fetchall()
for row in r:
    print(row)
Perhaps the SQLite browser just needs a refresh.
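That said, the most likely cause is that the INSERT was never committed: sqlite3 wraps statements in an implicit transaction, and other connections (including DB Browser) won't see the rows until it is committed. A minimal sketch continuing the code above:
connection.commit()  # persist the INSERT so other connections can see it
connection.close()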

How to insert Python logs into a PostgreSQL table?

I need to insert the logs from my test case into a table in a PostgreSQL database.
I was able to connect to the DB, but I can't figure out how to insert this line's result into the table. I have tried the below, but it doesn't work:
import logging
import psycopg2
from io import StringIO
from config import config

params = config()
conn = psycopg2.connect(**params)
print(conn)
curr = conn.cursor()
try:
    if not hw.has_connection():
        logging.error('Failure: Unable to reach website! ==> ' + str(driver.find_element_by_xpath('//span[@jsselect="heading" and @jsvalues=".innerHTML:msg"]').text))
        return
    elif hw.is_websiteReachable():
        logging.info("Success: website is reachable!")
        curr.execute("""INSERT INTO db_name(logs) VALUES (%s)""", ("Success: website is reachable!"))
        conn.commit()
except:
    logging.error("Failure: Unable to reach website!")
    return
I am a total beginner at this. I have searched but I couldn't find a clear example or guide about it. The above code throws the exception even though the website is reachable. Sorry if I sound dumb.
It looks like you're incorrectly constructing your SQL statement. Instead of INSERT INTO db_name(table_name) ... it should be INSERT INTO table_name(column_name) .... If you've correctly connected to the appropriate database in your connection settings, you usually don't have to specify the database name each time you write your SQL.
Therefore I would recommend the following modification (assuming your table is called logs and it has a column named message):
# ...
sql = 'INSERT INTO logs(message) VALUES (%s);'
msg = 'Success: website is reachable!'
curr.execute(sql, (msg,))  # note the trailing comma: the parameters must be a sequence
conn.commit()
You can read the psycopg2 docs here for more information as well, if that would help with passing named parameters to your SQL queries in Python.
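If the goal is to capture everything that goes through logging, another option is a custom logging.Handler that writes each record to the table. A hedged sketch, reusing the logs(message) table from above (the PostgresLogHandler class is illustrative, not part of psycopg2):
import logging
import psycopg2

class PostgresLogHandler(logging.Handler):
    # Illustrative handler: inserts each formatted log record into logs(message).
    def __init__(self, conn):
        super().__init__()
        self.conn = conn

    def emit(self, record):
        try:
            with self.conn.cursor() as cur:
                cur.execute('INSERT INTO logs(message) VALUES (%s)',
                            (self.format(record),))
            self.conn.commit()
        except Exception:
            self.handleError(record)

Attach it once with logging.getLogger().addHandler(PostgresLogHandler(conn)), and every logging.info(...) call will also land in the table.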
You can check out a good solution that I personally use in my server-side projects: you just need to pass a connection string to the CRUD object and everything is handled for you. For Postgres you can use:
'postgresql+psycopg2://username:password@host:port/database'
or
'postgresql+pg8000://username:password@host:port/database'
For more details, check the SQLAlchemy Engine Configuration docs.
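A minimal sketch of that SQLAlchemy route, assuming the logs table with a message column from the answer above (names are carried over, not prescribed):
from sqlalchemy import create_engine, text

engine = create_engine('postgresql+psycopg2://username:password@host:5432/database')
with engine.begin() as conn:  # begin() opens a transaction and commits on success
    conn.execute(text('INSERT INTO logs(message) VALUES (:msg)'),
                 {'msg': 'Success: website is reachable!'})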

How to create a database in Mongo using Python?

I have written the below Python program to create a MongoDB database.
import pymongo
myclient = pymongo.MongoClient('mongodb://localhost:27017/')
mydb = myclient['mydatabase']
dblist = myclient.list_database_names()
if "mydatabase" in dblist:
print("The database exists.")
else :
print("The database does not exists.")
But while I execute the program , I get the result as
The database does not exists.
Why so? Is there something that I am missing ?
I am following the code mentioned in W3Schools Tutorial
In MongoDB the database is not actually created until you put some data into it; try putting a document inside and check again.
Important: In MongoDB, a database is not created until it gets content!
Databases are lazily created. Insert something into a collection in that database and they (database/collection) will spring into existence.
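A quick sketch of that, continuing the code above (the customers collection name is just an example):
# Inserting one document materializes both the database and the collection.
mydb["customers"].insert_one({"name": "test"})
print("mydatabase" in myclient.list_database_names())  # now prints True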

How to Python PostgreSQL INSERT IF NOT EXIST? [duplicate]

This question already has answers here:
How to UPSERT (MERGE, INSERT ... ON DUPLICATE UPDATE) in PostgreSQL?
(7 answers)
Closed 7 years ago.
I have a Python script that uses the Psycopg adapter; I am parsing a JSON array and inserting the records into my PostgreSQL database.
import psycopg2

for item in data["SchoolJSONData"]:
    mId = item.get("Id")
    mNumofRooms = item.get("NumofRooms")
    mFloors = item.get("Floors")
    con = None
    con = psycopg2.connect("dbname='database' user='dbuser'")
    cur = con.cursor()
    cur.execute('INSERT INTO Schools(Id, NumofRooms, Floors) VALUES (%s, %s, %s)', (mId, mNumofRooms, mFloors))
    con.commit()
Every time I run the script again, I get the following:
psycopg2.IntegrityError: duplicate key value violates unique constraint "schools_pkey"
How can I run the insert script so that it will ignore existing entries in the database?
EDIT: Thanks for the replies, all... I am trying NOT to overwrite any data, only to ADD (if the PK is not already in the table) and ignore any errors. In my case I will only be adding new entries, never updating data.
There is no single way to solve this problem, and it has little to do with Python: it is a valid exception generated by the database (not just Postgres; all databases will do the same).
But you can try/except this exception and continue smoothly afterwards.
OR
You can run something like SELECT count(*) FROM Schools WHERE Id = %s first, to ensure the key does not already exist.

PySide/PyQt: insert data/table from database to another database [duplicate]

This question already has an answer here:
Copy table from remote sqlite database?
(1 answer)
Closed 8 years ago.
I am working with PySide.
I have an SQLite database as the default database:
db1 = QtSql.QSqlDatabase.addDatabase("QSQLITE")
db1.setDatabaseName(path_of_db)
db1.open()
and a second database with connection name "second_db":
db2 = QtSql.QSqlDatabase.addDatabase("QSQLITE", "second_db")
db2.setDatabaseName(path_of_db)
db2.open()
query = QtSql.QSqlQuery(db2)  # pass the connection object, not a query string
query.exec_("SELECT * FROM table_name")
Now I want to insert records from a table in db1 into a table in db2. I have a model for db1. I know that I can insert records one by one through the models. I also thought about writing the records from db1 to a file or variable and then inserting them into db2.
Is there a simpler or quicker solution, maybe an SQL query? How can I solve this problem?
Thank you for your help ;-)
I'd probably try row-by-row insertion first and see how fast it is; it might be all you need. A few tips on bulk insertions are here; basically, if you use BEGIN and END TRANSACTION and an in-memory journal, you greatly speed up inserts. Oh, and create indexes after you're done, not before.
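Since both databases are SQLite files, another option is to let SQLite copy the table itself via ATTACH DATABASE, with no Python-side iteration at all. A sketch using the connections from the question (the attached file's path and table_name are placeholders for your actual values):
query = QtSql.QSqlQuery(db1)
query.exec_("ATTACH DATABASE 'path/to/second.db' AS second")
query.exec_("INSERT INTO second.table_name SELECT * FROM main.table_name")
query.exec_("DETACH DATABASE second")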
