postgres SQL Python Pandas

I have an SQL query that works through a Grafana dashboard. Is it possible to recreate this in Pandas?
In Grafana:
SELECT
"time" AS "time",
metric AS metric,
value
FROM volttron
WHERE
$__timeFilter("time") AND
kv_tags->'equip_name' = '["35201"]' AND
'power' = any(m_tags)
ORDER BY 1,2
Trying to recreate it in Pandas over a Postgres connection with psycopg2:
eGauge35201 = pd.read_sql('SELECT "time" AS "time", metric AS metric, value FROM volttron WHERE $__timeFilter("time") AND kv_tags->equip_name = ["35201"] AND power = any(m_tags) ORDER BY 1,2', dbconn)
This throws a lot of errors:
DatabaseError: Execution failed on sql 'SELECT "time" AS "time", metric AS metric, value FROM volttron WHERE $__timeFilter("time") AND kv_tags->equip_name = ["35201"] AND power = any(m_tags) ORDER BY 1,2': syntax error at or near "$"
LINE 1: ...c AS metric, value FROM slipstream_volttron WHERE $__timeFil...
I'm trying to build a DataFrame directly... Sorry, still learning databases; any tips greatly appreciated...

I'm not using Grafana, but I'm pretty sure that $__timeFilter and m_tags are things that come from Grafana's end and are replaced by proper PostgreSQL expressions when the query is actually sent to the database, after you have defined it.
The query also uses several reserved SQL words: "time", which is correctly escaped, and value, which isn't. This can lead to some unwanted behaviour.
I would rewrite this query for Pandas so that you understand it properly. But without knowing what e.g. $__timeFilter is replaced with, we cannot know what it should be. You could, for example, monitor Grafana's or the database's logs.
From a quick Google search, this looks promising.
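For reference, here is a hedged sketch of how the same query could be reproduced with psycopg2 and pandas, assuming $__timeFilter("time") simply expands to a time-range filter on "time" for the dashboard's selected range; the connection arguments and the time range below are placeholders:

import pandas as pd
import psycopg2

# Assumed connection parameters -- replace with your own.
dbconn = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")

# $__timeFilter("time") is a Grafana macro, so an explicit time range is
# substituted here; all values are passed as bound parameters rather than
# being concatenated into the SQL string.
query = """
    SELECT "time" AS "time", metric AS metric, value
    FROM volttron
    WHERE "time" BETWEEN %(start)s AND %(end)s
      AND kv_tags->'equip_name' = %(equip)s
      AND %(tag)s = ANY(m_tags)
    ORDER BY 1, 2
"""

params = {
    "start": "2021-01-01 00:00:00",
    "end": "2021-01-02 00:00:00",
    "equip": '["35201"]',
    "tag": "power",
}

eGauge35201 = pd.read_sql(query, dbconn, params=params)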

Related

Single Quote in Cursor Execute for Snowflake Connector

I am trying to update a Snowflake table from Python using the Snowflake Connector:
update TABLE set X = 100 where Y = 'Applebee's'
To escape the single quote in the Snowflake UI, I would format the "where" clause to be
where Y = 'Applebee\'s'
Also tried:
where Y = 'Applebee''s'
However, no fix that I have tried succeeds in working around this error in Python. Is there a way to do this from Python, in a one-step process, without any steps in Snowflake? I only care about workarounds from Python.
In all SQL, the only true way to avoid weird characters, without having to account for each possibility, is to parametrize your SQL calls, either as some sort of stored procedure call or as a parametrized call from your client - in this case Python.
In general this means you should never concatenate a SQL string with the value of a WHERE clause like you have above.
Here's an example of how to parametrize a statement in Python:
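This is a minimal sketch, assuming the connector's default pyformat parameter style; the connection arguments and the table and column names (my_table, X, Y) are placeholders:

import snowflake.connector

# Assumed connection parameters -- replace with your own.
conn = snowflake.connector.connect(user="me", password="secret", account="my_account")
cur = conn.cursor()

# The driver substitutes the bound values safely, so the apostrophe in
# "Applebee's" needs no manual escaping.
cur.execute(
    "UPDATE my_table SET X = %s WHERE Y = %s",
    (100, "Applebee's"),
)
conn.commit()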

Syntax error when inserting strings into MySQL using PyMySQL

I frequently use pymysql to insert data into a MySQL server.
When inserting strings, I usually (but not every time) receive pymysql.err.ProgrammingError: (1064, ...) when I insert a string using the code below (where the column is a VARCHAR):
cursor.execute("Insert into table (column) values (%s)", (stringVar))
Typically I have to do something like:
cursor.execute("Insert into table (column) values ('"+stringVar+"')"))
However, sometimes that throws the same error and I have to do something like:
stringVar="'"+stringVar
stringVar=stringVar+"'"
cursor.execute("Insert into table (column) values ("+stringVar+")")
This just isn't a feasible way to program this operation.
I assume I am messing up something simple but I cannot figure out what this is. I use pymysql a lot and this error is really starting to wear on me. Any help would be much appreciated!
cursor.execute("INSERT INTO table (column) VALUES (%s)", (stringVar,))
Whenever you're trying to format a string directly into a query like that, it's basically always a sign you're doing something wrong. Every Python database interface I'm aware of has a way to pass parameters to queries, as above. Note that PyMySQL uses %s placeholders and that stringVar must be wrapped in an iterable (here, a one-element tuple).
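For context, a minimal runnable sketch with PyMySQL; the connection arguments, table name, and column name are placeholders:

import pymysql

# Assumed connection parameters -- replace with your own.
conn = pymysql.connect(host="localhost", user="me", password="secret", database="mydb")

stringVar = "O'Reilly & Sons"  # the driver escapes the quote itself

with conn.cursor() as cursor:
    # PyMySQL uses %s placeholders; the value is passed separately as a
    # one-element tuple, so no manual quoting is needed.
    cursor.execute("INSERT INTO my_table (my_column) VALUES (%s)", (stringVar,))

conn.commit()
conn.close()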

Django: Using named parameters on a raw SQL query

I'm trying to execute a raw query that is built dynamically.
To assure that the parameters are inserted in the valid position I'm using named parameters.
This seems to work for SQLite without any problems (all my tests succeed).
But when I'm running the same code against MariaDB it fails...
A simple example query:
SELECT u.*
FROM users_gigyauser AS u
WHERE u.email like :u_email
GROUP BY u.id
ORDER BY u.last_login DESC
LIMIT 60 OFFSET 0
Parameters are:
{'u_email': '%test%'}
The error I get is a default syntax error as the parameter is not replaced.
I tried using '%' as an indicator, but this resulted in SQL trying to parse
%u[_email]
and that returned a type error.
I'm executing the query like this:
raw_queryset = GigyaUser.objects.raw(
    self.sql_fetch, self._query_object['params']
)
Or when counting:
cursor.execute(self.sql_count, self._query_object['params'])
Both give the same error on MariaDB but work on Sqlite (using the ':' indicator)
Now, what am I missing?
Edit:
The format needs an s suffix, as follows:
%(u_email)s
If you are using SQLite3, for some reason the %(name)s syntax will not work.
You have to use the :name syntax instead if you want to pass your params as a {"name": "value"} dictionary.
This is contrary to the documentation, which states that the first syntax should work with all DB engines.
Here's the source of the issue:
https://code.djangoproject.com/ticket/10070#comment:18
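Putting the fix together, here is a hedged sketch of the raw query with %(name)s placeholders for MariaDB/MySQL; GigyaUser and the field names follow the question, and the import path is a placeholder:

from myapp.models import GigyaUser  # assumed import path

sql = """
    SELECT u.*
    FROM users_gigyauser AS u
    WHERE u.email LIKE %(u_email)s
    GROUP BY u.id
    ORDER BY u.last_login DESC
    LIMIT 60 OFFSET 0
"""

# Named parameters are passed as a dictionary and bound by the driver.
raw_queryset = GigyaUser.objects.raw(sql, {'u_email': '%test%'})

for user in raw_queryset:
    print(user.email)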

Getting an error when describing a table in Vectorwise using the Python ingresdbi module

I am using the Python ingresdbi module for connectivity with a Vectorwise database.
To describe a table I am using the code below:
import ingresdbi
local_db = ingresdbi.connect(database='x', uid='y', driver='z', pwd='p')
local_db_cursor = local_db.cursor()
local_db_cursor.execute('help tran_applog;')
I am getting this error:
Syntax error. Last symbol read was: 'help'.
Solutions will be appreciated. Thanks
The problem you've got is that 'help' isn't a real SQL statement that's understood by the DBMS server. It's really a terminal monitor command that gets converted into some queries against the system catalogs under the covers.
The alternative depends a little on what you're trying to get from the "describe table". The system catalogs relating to table and column information are iitables and iicolumns and you can do a select against them. Check the documentation or experiment.
Alternatively there appears to be a row descriptor you can get from ingresdbi, see the example here http://community.actian.com/wiki/Python_Row_Description
HTH
I believe you should do it like in any other shell script: echo "help tran_applog;" | sql mydatabase
Reason: "HELP" is not a standard SQL statement.
As suggested by PaulM, your best option to get metadata about tables is to query the system catalogs (iitables, iicolumns, iirelation, etc).
Start with something like:
SELECT C.column_name, C.column_datatype
FROM iitables T, iicolumns C
WHERE T.table_name = C.table_name
AND T.table_name = 'tran_applog';\g
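Putting this together with the connection from the question, a hedged sketch (the connection arguments are placeholders, and \g is a terminal-monitor directive, so it is omitted here):

import ingresdbi

# Assumed connection parameters -- replace with your own.
local_db = ingresdbi.connect(database='x', uid='y', driver='z', pwd='p')
local_db_cursor = local_db.cursor()

# Query the system catalogs instead of the terminal-monitor 'help' command.
local_db_cursor.execute(
    "SELECT C.column_name, C.column_datatype "
    "FROM iitables T, iicolumns C "
    "WHERE T.table_name = C.table_name "
    "AND T.table_name = 'tran_applog'"
)

for column_name, column_datatype in local_db_cursor.fetchall():
    print(column_name, column_datatype)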

Python+MySQLConnector: Substitution in query results in an error

I used MySQL Connector/Python API, NOT MySQLdb.
I need to dynamically insert values into a sparse table so I wrote the Python code like this:
cur.executemany("UPDATE myTABLE SET %s=%s WHERE id=%s" % data)
where
data=[('Depth', '17.5cm', Decimal('3003')), ('Input_Voltage', '110 V AC', Decimal('3004'))]
But it resulted an error:
TypeError: not enough arguments for format string
Is there any solution for this problem? Is it possible to use executemany when there is a substitution of a field name in the query?
Thanks.
Let's start with the original method:
As the error message suggests, you have a problem with your SQL syntax (not Python). If you insert your values, you are effectively trying to execute
UPDATE myTABLE SET 'Depth'='17.5cm' WHERE id='3003'
You should notice that you are trying to assign a value to a string 'Depth', not a database field. The reason for this is that the %s substitution of the mysql module is only possible for values, not for tables/fields or other object identifiers.
In the second try you are not using the substitution anymore. Instead you use generic Python string interpolation, which however looks similar. This does not work for you because you have an extra comma and an extra pair of brackets in your code. It should read:
cur.execute("UPDATE myTABLE SET %s=%s WHERE id=%s" % data)
I also replaced executemany with execute, because execute only handles a single row. However, your example only has one row, so there is no need to use executemany anyway.
The second method has some drawbacks, however. The substitution is not guaranteed to be quoted or formatted correctly for the SQL query, which might cause unexpected behaviour for certain inputs and may be a security concern.
I would rather ask, why it is necessary to provide the field name dynamically in the first place. This should not be necessary and might cause some trouble.
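If the field name really does have to be dynamic, one common pattern is to whitelist the column names and interpolate only those, while still binding the value and id as parameters. A hedged sketch with MySQL Connector/Python follows; the connection arguments are placeholders and the whitelist is an assumption based on the data shown:

import mysql.connector
from decimal import Decimal

# Assumed connection parameters -- replace with your own.
cnx = mysql.connector.connect(user="me", password="secret", database="mydb")
cur = cnx.cursor()

data = [('Depth', '17.5cm', Decimal('3003')), ('Input_Voltage', '110 V AC', Decimal('3004'))]

# Only column names from this whitelist may be interpolated into the query.
allowed_columns = {'Depth', 'Input_Voltage'}

for column, value, row_id in data:
    if column not in allowed_columns:
        raise ValueError("unexpected column name: %r" % column)
    # The column name is interpolated after whitelisting; the value and id
    # are bound parameters, so the driver quotes them correctly.
    cur.execute(
        "UPDATE myTABLE SET `%s` = %%s WHERE id = %%s" % column,
        (value, row_id),
    )

cnx.commit()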
