I've got a Twisted-based network application. Now I would like to implement a new database design, but I'm getting stuck with the Deferred object.
I write sessions into my database using these two functions:
def createSession(self, peerIP, peerPort, hostIP, hostPort):
    SessionUUID = uuid.uuid1().hex
    yield self.writeSession(SessionUUID, peerIP, hostIP)
    sid = self.db.runQuery("SELECT id FROM sessions WHERE sid = %s",
                           (SessionUUID,))
    yield sid
@defer.inlineCallbacks
def writeSession(self, SessionUUID, peerIP, hostIP):
    sensorname = self.getSensor() or hostIP
    r = yield self.db.runQuery("SELECT id FROM sensors WHERE ip = %s",
                               (sensorname,))
    if r:
        id = r[0][0]
    else:
        yield self.db.runQuery("INSERT INTO sensors (ip) VALUES (%s)",
                               (sensorname,))
        r = yield self.db.runQuery("""SELECT LAST_INSERT_ID()""")
        id = int(r[0][0])
    self.simpleQuery(
        """
        INSERT INTO sessions (sid, starttime, sensor, ip)
        VALUES (%s, FROM_UNIXTIME(%s), %s, %s)
        """,
        (SessionUUID, self.nowUnix(), id, peerIP))
In short:
createSession creates a UUID for the session and calls writeSession to write it into my db. After it is written, I select the id of the newly inserted row by using the UUID in the WHERE clause and return the result.
Now to my problem: to update the session information I call this function:
def handleConnectionLost(self, sid, args):
    self.simpleQuery("UPDATE sessions SET endtime = now() WHERE sid = %s",
                     (sid,))
As you can see, I try to use the sid from createSession, which is a Deferred object and not an integer. If I understand this correctly, adding a callback to handleConnectionLost would run the query at that point, so I could use the value there. But this is not the only function where I need the sid, so adding a callback every time I need it would be a lot of overhead.
Is there a way to pass the sid to my functions as a plain integer, so that the query runs only once? What would that look like?
And when I use a deferred query containing a now() statement, will now() be evaluated when I add the query to my callbacks, or when the query is actually fired?
You can get the ID immediately after inserting a new row and keep it for later use; a similar question was answered here: The equivalent of SQLServer function SCOPE_IDENTITY() in mySQL?
It will use now() when the query is fired.
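By way of illustration, here is a minimal sketch of how createSession could hand back a plain integer, assuming self.db is a twisted.enterprise.adbapi.ConnectionPool and that writeSession has finished its work before the SELECT runs: decorate createSession with defer.inlineCallbacks and return the value with defer.returnValue. Callers still receive a Deferred, but they yield it once and work with an int from then on, and the lookup query runs only a single time.

from twisted.internet import defer

@defer.inlineCallbacks
def createSession(self, peerIP, peerPort, hostIP, hostPort):
    SessionUUID = uuid.uuid1().hex
    yield self.writeSession(SessionUUID, peerIP, hostIP)
    # runQuery returns a Deferred; yielding it inside an inlineCallbacks
    # generator hands back the actual result rows
    rows = yield self.db.runQuery("SELECT id FROM sessions WHERE sid = %s",
                                  (SessionUUID,))
    defer.returnValue(int(rows[0][0]))  # the caller's yield produces this int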
Related
I have a Pulumi (python) script that needs to query a database to get a list of customers. The rest of the setup that it does is based on that list.
I've tried to store the username/password for that list in a pulumi secret with pulumi config set --secret db_user $USER and pulumi config set --secret db_password $PASSWORD so that they are encrypted in the pulumi stack file. The problem is that when I try to retrieve them, they are Output objects. I think that pulumi does this so that it can track the value and the resource that created it together, but I just need the string values so I can connect to a database and run a query, as in this simplified example:
db_host = pulumi_config.require("db_host")
db_name = pulumi_config.require("db_name")
db_user = pulumi_config.require_secret("db_user")
db_password = pulumi_config.require_secret("db_password")
# psycopg2.connect fails with an error:
# TypeError: <pulumi.output.Output object at 0x10feb3df0> has type Output, but expected one of: bytes, unicode
connection = psycopg2.connect(
    host = db_host,
    database = db_name,
    user = db_user,
    password = db_password)
cursor = connection.cursor()
query = """
    SELECT id
    FROM customers
    WHERE ready = true
    ORDER BY id DESC
"""
cursor.execute(query)
customer_ids = []
for record in cursor:
    customer_ids.append(record[0])
The code above fails when I try to connect with psycopg2 because it requires a string.
I know that when I use Pulumi libraries that take Pulumi Inputs/Outputs as parameters, the secrets are decrypted just fine. So how can I decrypt these secrets for use with non-Pulumi code?
I think that pulumi does this so that it can track the value and the resource that created it together
The actual reason is that Pulumi needs to resolve the value it retrieves from config, and that's an eventual operation: Pulumi decrypts the value using the key first, and only once that's done can it resolve it.
You're dealing with an Output, and like any other Output, you need to resolve the value using an apply if you want to interpolate it into a string.
connection = Output.all(db_user, db_password) \
    .apply(lambda args: psycopg2.connect(
        host = db_host,
        database = db_name,
        user = args[0],
        password = args[1]))
# perform your SQL query here
Note: all of the logic you're talking about needs to happen inside the apply.
As a reference for anyone else who tries to do something like this, the complete solution looked like this:
# Takes a connection object, uses it to perform a query, and then returns a list of customer IDs
def do_query(connection):
    query = """
        SELECT id
        FROM customers
        WHERE ready = true
        ORDER BY id DESC
    """
    cursor = connection.cursor()
    cursor.execute(query)
    customer_ids = []
    for record in cursor:
        customer_ids.append(record[0])
    return customer_ids

# gets a list of customer IDs, wrapped in an Output object.
def get_customer_ids():
    customer_ids = Output.all(db_user, db_password) \
        .apply(lambda args: do_query(
            psycopg2.connect(
                host = db_host,
                database = db_name,
                user = args[0],
                password = args[1])))
    return customer_ids
NOTE: The list of customer IDs will still be wrapped in an Output object, so when you want to use it, you will need to do something like this:
def create_connector_for_customers(customer_ids):
    for customer in customer_ids:
        connector_config = ConnectorConfigArgs(
            # Use customer_id to set up connector
        )
        destination_schema = ConnectorDestinationSchemaArgs(
            # Use customer_id to set up connector
        )
# The customer ID list is wrapped in an Output, it can only be accessed within an `apply`
customer_list = get_customer_ids()
customer_list.apply(lambda customer_ids: create_connector_for_customers(customer_ids))
I have written a Flask app which accesses a MySQL database running in a Docker container, with the following schema:
CREATE TABLE tasksdb (
    id INTEGER AUTO_INCREMENT PRIMARY KEY,
    task VARCHAR(20),
    is_completed BOOLEAN,
    notify VARCHAR(100)
);
I need to return the id of the task that was last created, so I would like to retrieve the id from the database. I have tried several variants, but none of them worked:
cursor.execute('SELECT id FROM tasksdb WHERE task="%s"', [task])
print(cursor.fetchall())
Output: ()
cursor.execute("SELECT COUNT(*) FROM tasksdb")
Output: ((1,),), ((2,),) etc for each time running the command.
However, I am confused, since the database shows this when I run the following in the Docker bash shell:
SELECT * from tasksdb;
Empty set (0.00 sec)
Nonetheless, I try to get the 1, 2, etc. from the output so that my function can return it. The problem is that:
string_id = str(cursor.fetchall()) leads to the output (), so I have no idea how to access the middle part.
res = int(''.join(map(str, cursor.fetchall()))) leads to ValueError: invalid literal for int() with base 10: ''
res = cursor.fetchall()[0] leads to IndexError: tuple index out of range
Could you help me figure out how to get the id?
Additionally, I am wondering why the id always starts again at 1 when I restart the Flask application and rerun the test without restarting the MySQL container. I would expect the container and database to keep creating tasks with new ids even after a restart.
Here is my full code:
from flask import Flask, request, Response
import json
import MySQLdb

app = Flask(__name__)

cursor = None

def get_db_connection():
    global cursor
    if not cursor:
        db = MySQLdb.connect("127.0.0.1", "root", "DockerPasswort!", "demo", port=3306)
        cursor = db.cursor()
    return cursor

# Create a new task
@app.route('/v1/tasks', methods=['POST'])
def post():
    cursor = get_db_connection()
    data = request.get_json()
    if "title" not in data:
        return bulkadd(data)
    is_completed = False
    if "is_completed" in data:
        is_completed = data["is_completed"]
    notify = ""
    if "notify" in data:
        notify = data["notify"]
    task = data["title"]
    sql = 'INSERT INTO tasksdb (task, is_completed, notify) VALUES (%s, %s, %s)'
    cursor.execute(sql, [task, is_completed, notify])
    # Insert any of the attempts to obtain the id here and return it.
    return "True"
Thank you for your help!
EDIT
I just added a task directly in mysql and executed this:
SELECT * from tasksdb;
+----+---------------+--------------+-----------------------------+
| id | task | is_completed | notify |
+----+---------------+--------------+-----------------------------+
| 22 | My First Task | 0 | test#berkeley.edu |
+----+---------------+--------------+-----------------------------+
1 row in set
The fact that the id is 22 convinces me that the earlier requests were stored at some point. However, I do not understand why only this one row is returned. Shouldn't all the former requests have been saved as well?
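For what it's worth, here is a minimal sketch of both missing pieces, assuming the MySQLdb setup from the code above and placeholder task values: the id generated by the INSERT is available on the cursor as lastrowid, and the row only becomes visible to other sessions (such as the Docker bash shell) after a commit on the connection, which would also explain the Empty set result.

# Hedged sketch, not the original handler: read the new id and persist the row.
import MySQLdb

db = MySQLdb.connect("127.0.0.1", "root", "DockerPasswort!", "demo", port=3306)
cursor = db.cursor()
sql = 'INSERT INTO tasksdb (task, is_completed, notify) VALUES (%s, %s, %s)'
cursor.execute(sql, ["My First Task", False, "someone@example.com"])
new_id = cursor.lastrowid   # AUTO_INCREMENT id generated by this INSERT
db.commit()                 # without the commit the row is rolled back when the
                            # connection closes and never shows up elsewhere
print(new_id)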
After executing a query like
resp = engine.execute('UPDATE my_table SET name = NULL WHERE id = 10')
I want to display the execution response such as
Query OK, 2 rows affected (0.16 sec)
Is this response accessible through the ResultProxy object (i.e. resp)?
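The affected-row count, though not the full "Query OK ... (0.16 sec)" string, should be available on the result as rowcount; a minimal sketch, assuming the same engine and table:

resp = engine.execute('UPDATE my_table SET name = NULL WHERE id = 10')
print("%d rows affected" % resp.rowcount)  # e.g. "2 rows affected"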
I want to save an API response into a table of my database; I'm using Postgres along with psycopg2.
This is my code:
import json
import requests
import psycopg2

def my_func():
    response = requests.get("https://path/to/api/")
    data = response.json()
    while data['next'] is not None:
        response = requests.get(data['next'])
        data = response.json()
        for item in data['results']:
            try:
                connection = psycopg2.connect(user="user",
                                              password="user",
                                              host="127.0.0.1",
                                              port="5432",
                                              database="mydb")
                cursor = connection.cursor()
                postgres_insert_query = """ INSERT INTO table_items (NAME VALUES (%s)"""
                record_to_insert = print(item['name'])
                cursor.execute(postgres_insert_query, record_to_insert)
                connection.commit()
                count = cursor.rowcount
                print(count, "success")
            except (Exception, psycopg2.Error) as error:
                if connection:
                    print("error", error)
            finally:
                if connection:
                    cursor.close()
                    connection.close()

my_func()
I just wanted to sort of "print" all the resulting data from my request into the db; is there a way to accomplish this?
I'm a bit confused, as you can see; what could be a "print" equivalent to achieve this?
I just want to save the name field from the API response into the database table, or rather INSERT it; I guess psycopg2 has some sort of function for this circumstance?
Any example you could provide?
EDIT
Sorry, I forgot: if I run this code it throws this:
PostgreSQL connection is closed
A particular name
Failed to insert record into table_items table syntax error at or near "VALUES"
LINE 1: INSERT INTO table_items (NAME VALUES (%s)
There are a few issues here. I'm not sure what the API is or what it is returning, but I will make some assumptions and suggestions based on those.
There is a syntax error in your query; it is missing a ). It should be:
postgres_insert_query = 'INSERT INTO table_items (NAME) VALUES (%s)'
(I'm also assuming that NAME is a real column in your database).
Even with this correction, you will have a problem since:
record_to_insert = print(item['name']) will set record_to_insert to None. The return value of the print function is always None. The line should instead be:
record_to_insert = item['name']
(assuming the key name in the dict item is actually the field you're looking for)
I believe calls to execute must pass replacements as a tuple, so the line cursor.execute(postgres_insert_query, record_to_insert) should be:
cursor.execute(postgres_insert_query, (record_to_insert,))
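Putting those three fixes together, a corrected version of the loop might look roughly like this (still assuming the same table_items table, a name key in each result, and an API that paginates via data['next']); it also opens one connection up front instead of one per item:

import requests
import psycopg2

def my_func():
    # One connection and cursor for the whole run
    connection = psycopg2.connect(user="user",
                                  password="user",
                                  host="127.0.0.1",
                                  port="5432",
                                  database="mydb")
    cursor = connection.cursor()
    postgres_insert_query = "INSERT INTO table_items (NAME) VALUES (%s)"
    response = requests.get("https://path/to/api/")
    data = response.json()
    while data['next'] is not None:
        response = requests.get(data['next'])
        data = response.json()
        for item in data['results']:
            # Parameters go in a tuple; psycopg2 handles quoting/escaping
            cursor.execute(postgres_insert_query, (item['name'],))
    connection.commit()
    cursor.close()
    connection.close()

my_func()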
Currently my code is
client = boto3.client('sdb')
query = 'SELECT * FROM `%s` WHERE "%s" = "%s"' % (domain, key, value)
response = client.select(SelectExpression = query)
The variables key and value are input by the user; what is the best way to escape them in the code above?
Edit: What I'm concerned about is how to escape the fields to prevent SQL injection, as we would in the past, only now in SimpleDB.
Subselects and destructive operations can't be performed using SimpleDB.
Amazon provides quoting rules: http://docs.aws.amazon.com/AmazonSimpleDB/latest/DeveloperGuide/QuotingRulesSelect.html
You can apply this behavior in Python using this function:
def quote(string):
    return string.replace("'", "''").replace('"', '""').replace('`', '``')
client = boto3.client('sdb')
query = 'SELECT * FROM `%s` WHERE "%s" = "%s"' % (quote(domain), quote(key), quote(value))
response = client.select(SelectExpression = query)
If the side effect of SQL injection you're worried about is deletion or destruction, SimpleDB only supports querying data. If you want to protect against exposing data that you don't want exposed, check the AWS docs here.
Note: since the guide covers it well, I thought the link was enough.