Psycopg2 converting date string to previous day - python

I'm using psycopg2 to insert some data into a PostgreSQL database.
What I'm doing:
import psycopg2
conn = psycopg2.connect(host="127.0.0.1", port='5432', database="mydatabase", user="root", password="root")
cursor = conn.cursor()
# MY-SQL-FILE has a bunch of INSERT statements, such as:
# INSERT INTO mytable (mydate) values ('1991-18-03');
cursor.execute(open("/MY-SQL-FILE.sql").read())
conn.commit()
conn.close()
What I end up with
Executing the above script creates a row in my database where the date is 1991-03-17 21:00:00.
Note: the field type of mydate is timestamptz
What I expected
I expected to get a row where the date was 1991-03-18 00:00:00
The funny thing is, if I run the same insert statement in DBeaver, I get 1991-03-18 00:00:00
What I want to know
Why is this happening? (It will most likely work if I convert the date from a string to a datetime.)
But I want to know why this happens: where does this 21:00:00 come from? If it were a DB time-zone problem, the time would not be the same every time, I think...
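A note on where the offset can come from: a timestamptz column stores an absolute instant; PostgreSQL parses an ambiguous literal like '1991-18-03' according to the session's DateStyle setting and renders it back in the session's TimeZone. A constant 3-hour shift is therefore consistent with psycopg2 and DBeaver using different session time zones (an assumption; compare SHOW timezone in both clients). One way to sidestep the string ambiguity entirely is to parse the date in Python and bind it as a parameter; a minimal sketch, assuming the source strings really are year-day-month:

```python
from datetime import datetime

# PostgreSQL resolves an ambiguous literal like '1991-18-03' via the
# session's DateStyle; parsing it in Python first removes the guesswork.
# The year-day-month format is an assumption about the source data.
raw = "1991-18-03"
parsed = datetime.strptime(raw, "%Y-%d-%m")
print(parsed.isoformat())  # 1991-03-18T00:00:00
```

With a live connection you would then pass it as a bound parameter, e.g. cursor.execute("INSERT INTO mytable (mydate) VALUES (%s)", (parsed,)), and psycopg2 adapts the datetime for you.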

Related

Django: Performing a raw sql query with count()

I want to make a query to get the number of rows in the model MyTable whose expire field (a datetime) falls in the current week.
In MySQL I execute the following query:
select count(expire) from MyTable where yearweek(expire) = yearweek(now());
This lets me get the rows that belong to the current week starting from Sunday. I tried to replicate that with a queryset in my view like this:
now = datetime.now()
myquery = MyTable.objects.filter(expire__year=now.year).filter(expire__week=now.isocalendar()[1])
But Django's week lookup starts the week on Monday, and I want Sunday. So I thought of doing a raw SQL query like this:
myquery = MyTable.objects.raw('select count(expire) from MyTable where yearweek(expire) = yearweek(now());')
But it doesn't work; it doesn't output a value. How can I make this query?
I'm using MySQL 5.7, Python 3.7 and Django 2.1.11. I know these are all old versions, but I can't update.
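One workaround, sketched here rather than prescribed: raw() is meant to return model instances and needs the primary key in the result set, which is why a bare COUNT query yields nothing usable. Instead of matching MySQL's YEARWEEK() in SQL, you can compute the Sunday-based week boundaries in Python and use an ordinary range filter, which works on any backend. The helper below assumes MySQL's default week mode (weeks start on Sunday); the model and field names are the question's:

```python
from datetime import date, datetime, timedelta

def sunday_week_range(today=None):
    """Return [start, end) datetimes for the week containing `today`,
    where weeks start on Sunday (MySQL's default YEARWEEK() behaviour)."""
    d = today or date.today()
    # Python's weekday() runs Monday=0 .. Sunday=6, so the preceding
    # (or same) Sunday is (weekday + 1) % 7 days back.
    start = d - timedelta(days=(d.weekday() + 1) % 7)
    return (
        datetime.combine(start, datetime.min.time()),
        datetime.combine(start + timedelta(days=7), datetime.min.time()),
    )

# Usage with the ORM (not run here):
# start, end = sunday_week_range()
# count = MyTable.objects.filter(expire__gte=start, expire__lt=end).count()
```

A range filter also lets the database use an index on expire, which yearweek(expire) = yearweek(now()) cannot.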

pyodbc: can't execute very long sql statement

I am trying to execute a very long statement from Python, around 1.3 million characters, using the following code:
import pyodbc
conn_str = ('Driver={SQL Server};'
            'SERVER=MYSERVER;DATABASE=MyDatabase;Trusted_Connection=yes')
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
try:
    cursor.execute("A SQL statement with 1.3m characters")
    cursor.commit()
except Exception as e:
    print(e)
finally:
    conn.close()
It's basically a long list of insert statements.
I am watching the SQL Profiler as this runs against my SQL Server, and each run executes a different number of INSERT statements. It inserts data up to around 40k characters, then suddenly stops. I considered a maximum number of characters that a SQL statement can hold, but since a different number of statements executes each time, that doesn't sound like the issue here.
Anyone have any ideas what's happening here and how I could get around it?
Thanks,
Joe
Edit:
here is the query:
SET XACT_ABORT ON;
SET QUOTED_IDENTIFIER ON;
IF (select max(id)
from Table1) = 87648
BEGIN
BEGIN TRY
BEGIN TRANSACTION
INSERT INTO Table1 VALUES (87649, 'G4KG72HF6','87649');
INSERT INTO Table1 VALUES (87650, 'G4KG72HF6','87650');
INSERT INTO Table1 VALUES (87651, 'GDGVFKVW6','87651');
INSERT INTO Table1 VALUES (87652, 'GYAPWLNU1','87652');
INSERT INTO Table1 VALUES (87653, 'GYAPWLNU1','87653');
INSERT INTO Table1 VALUES (87654, 'H884542A2','87654');
INSERT INTO Table1 VALUES (87655, 'HT2XM4U83','87655');
INSERT INTO Table1 VALUES (87656, 'GPD9P39C7','87656');
INSERT INTO Table1 VALUES (87657, 'J2ZBUN7Q7','87657');
INSERT INTO Table1 VALUES (87658, 'JBWS35M69','87658');
INSERT INTO Table1 VALUES (87659, 'JMU6ANZN7','87659');
INSERT INTO Table1 VALUES (87660, 'JWRLK6D48','87660');
INSERT INTO Table1 VALUES (87661, 'K6NZSPSL2','87661');
--- a lot more inserts happening here
COMMIT
END TRY
BEGIN CATCH
PRINT N'ERROR: ' + ERROR_MESSAGE()
IF @@TRANCOUNT > 0
BEGIN
ROLLBACK
PRINT N'Transaction rolled back'
END
END CATCH
END
ELSE
PRINT 'Max id in Table1 != 87648, has this script been run already?'
It's basically a long list of insert statements
Since your SQL text does not begin with SET NOCOUNT ON;, each INSERT statement is generating an update count that gets queued so it can be returned to your Python app, and there is a limit as to how long that queue can be.
So, just prepend SET NOCOUNT ON; to your SQL text to avoid the problem.
(See this GitHub issue for a more thorough discussion.)
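A small sketch of that fix, with the prepend wrapped in a helper so it is safe to apply twice (the helper name is mine, not part of pyodbc):

```python
def with_nocount(sql_text):
    """Prepend SET NOCOUNT ON; unless the script already sets it.
    Suppressing the per-INSERT update counts keeps them from queuing
    up on the connection."""
    if "SET NOCOUNT ON" in sql_text.upper():
        return sql_text
    return "SET NOCOUNT ON;\n" + sql_text

# Usage with the cursor from the question (not executed here;
# long_sql is a hypothetical variable holding the 1.3m-character script):
# cursor.execute(with_nocount(long_sql))
# cursor.commit()
```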

How to insert a datetime properly in a Sqlite database? (UTC vs with-timezone time)

I need to insert a datetime into a SQLite database with Python. (I already read this question; it's not a duplicate because here I'm dealing with timezone issues.)
import sqlite3, datetime
dbconn = sqlite3.connect(':memory:')
c = dbconn.cursor()
c.execute('create table mytable(title text, t timestamp)')
#1
c.execute('insert into mytable (title, t) values (?, ?)', ("hello2", datetime.datetime(2018,3,10,12,12,00)))
#2
c.execute('insert into mytable (title, t) values (?, ?)', ("hello", "Sat Mar 10 2018 12:12:00 GMT+0100 (Paris, Madrid)"))
c.execute("select * from mytable")
for a in c.fetchall():
    print a
# (u'hello', u'Sat Mar 10 2018 12:12:00 GMT+0100 (Paris, Madrid)')
# (u'hello2', u'2018-03-10 12:12:00')
Method #1 seems to be the natural way to insert a datetime object into a SQLite database, but it doesn't save the timezone.
I receive user input in the form Sat Mar 10 2018 12:12:00 GMT+0100 (Paris, Madrid). Method #2 saves it into the DB as a string only, and this is not very good: we can't easily query, for example, all rows with a date between day1 and day2.
Question: should I use another method and convert Sat Mar 10 2018 12:12:00 GMT+0100 (Paris, Madrid) to UTC before inserting it into the SQLite DB?
Or even should I only insert the UNIX timestamp as an integer in the DB?
Note: I'm ok to discard the original timezone, if it's properly converted to UTC: when I'll query from DB and display for output on a web page, I'll format with the user browser timezone.
Note 2: according to https://www.sqlite.org/datatype3.html#date_and_time_datatype, SQLite doesn't have a datetime type; values are stored as TEXT, REAL, or INT. But how does it internally know which of these to use?
The column t is not "really" a datetime one. According to https://docs.python.org/2/library/sqlite3.html#default-adapters-and-converters, Python adds the "timestamp" type to allow easy insertion of datetime objects into SQLite; these objects are converted into strings when they are stored. SQLite only supports the datatypes listed at https://www.sqlite.org/datatype3.html
According to https://www.sqlite.org/lang_datefunc.html, SQLite uses UTC time internally. I think the easiest way to store your times is to convert them to UTC or Unix time before inserting them.
If your times are in local time, you can convert them this way: How do I convert local time to UTC in Python?
If your times are not in local time, converting them is a bit more complicated, as Python does not define any timezones in the standard library. Your best bet here is probably to use a library such as the one described here: https://stackoverflow.com/a/4771733/6180687
I would recommend using Unix time if you already have it, to make your life easier.
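Putting the answer's advice together, here is a minimal Python 3 sketch (the question's code is Python 2): strip the trailing "(Paris, Madrid)" label, parse the explicit GMT offset, and store the instant as a Unix timestamp integer. The parsing format is an assumption based on the sample input shown in the question:

```python
import sqlite3
from datetime import datetime, timezone

# Drop the trailing "(Paris, Madrid)" label, then parse the explicit
# GMT offset with %z (format assumed from the question's sample input).
raw = "Sat Mar 10 2018 12:12:00 GMT+0100 (Paris, Madrid)"
dt = datetime.strptime(raw.split(" (")[0], "%a %b %d %Y %H:%M:%S GMT%z")

# Store the absolute instant as a Unix timestamp (an INTEGER in SQLite):
ts = int(dt.timestamp())

db = sqlite3.connect(':memory:')
db.execute('create table mytable(title text, t integer)')
db.execute('insert into mytable (title, t) values (?, ?)', ("hello", ts))

# Integer timestamps make range queries trivial, and rendering in any
# timezone becomes a display-time decision:
(stored,) = db.execute('select t from mytable').fetchone()
print(datetime.fromtimestamp(stored, tz=timezone.utc).isoformat())
# 2018-03-10T11:12:00+00:00
```

A "between day1 and day2" query then reduces to WHERE t >= ? AND t < ? with two integer bounds.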

Python date string to postgres usable date

Issue:
I stored some dates as text in my Postgres table, and I want to convert them to actual dates, again in Postgres.
I'm not sure if there is a better way to do this or what I'm doing wrong. I pulled a bunch of data into a PostgreSQL database as plain text, so now I need to go back through, clean it up, and convert the dates into a format PostgreSQL can use. My approach was to pull the data back into Python, convert it, and push it back. Is this the best way to do it? I'm also having an issue with datetime.strptime: I believe I've got the directives correct, but no go. :/
import psycopg2
from datetime import datetime
# connect to the PostgreSQL database
conn = psycopg2.connect(
"dbname='postgres' user='postgres' host=10.0.75.1 password='mysecretpassword'")
# create a new cursor
cur = conn.cursor()
cur.execute("""SELECT "Hash","Date" FROM nas """)
# fetch the rows to convert
myDate = cur.fetchall()
for rows in myDate:
    target = rows[1]
    datetime.strptime(target, '%B %d, %Y, %H:%M:%S %p %Z')
Here is a Postgres query which can convert your strings into actual timestamps:
select
ts_col,
to_timestamp(ts_col, 'Month DD, YYYY HH:MI:SS PM')::timestamp with time zone
from your_table;
For a full solution, you might take the following steps:
create a new timestamp column ts_col_new in your table
update that column using the logic from the above query
then delete the old column containing text
The update might look something like this:
update your_table
set ts_col_new = to_timestamp(ts_col, 'Month DD, YYYY HH:MI:SS PM')::timestamp with time zone;
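On the strptime side of the question: %H is the 24-hour directive and conflicts with %p, so for 12-hour text like the format in the answer you need %I, and %Z parsing is unreliable for arbitrary zone names. A sketch, using a hypothetical sample string in the answer's format:

```python
from datetime import datetime

# Hypothetical sample in the format the answer's to_timestamp() targets:
raw = "March 18, 1991 05:30:00 PM"
# %I is the 12-hour directive that pairs with %p; the question's %H
# (24-hour) conflicts with it, and %Z rarely matches free-form text.
parsed = datetime.strptime(raw, "%B %d, %Y %I:%M:%S %p")
print(parsed.isoformat())  # 1991-03-18T17:30:00
```

That said, doing the conversion in a single Postgres UPDATE, as above, avoids round-tripping every row through Python.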

How to insert values of variables using python 2.7 into mysql?

I want to insert the values of my variables (temp, hum) into my table sensors in a MySQL database (db name: projet). I am using Ubuntu 14.04, and the connection between Python and MySQL works, but I can't get the variables' values in. Any help is welcome, and thanks.
This is my Python script:
from random import *
import socket
import sys
import mysql.connector
from datetime import datetime
temp = randrange(10, 31, 2)
hum = randrange(300, 701, 2)
print temp
print hum
conn=mysql.connector.connect(user='root',password='12345',host='localhost',database='projet');
mycursor=conn.cursor();
mycursor.execute("""INSERT INTO sensors VALUES('$datetime','$temp','$hum')""");
conn.commit()
mycursor.execute("SELECT * FROM sensors")
This is my table, where you can see the variable names temp and hum instead of their values:
That's because you are not passing the variable values into the query. Pass them with the query into the execute() method:
mycursor.execute("""
INSERT INTO
sensors
VALUES
('$datetime', %s, %s)""", (temp, hum))
where %s are placeholders for the query parameters. Note that the database driver would handle the parameter type conversion and escaping automatically.
As for the $datetime, if you want to have a current date/datetime in this column, look into using NOW() or CURDATE(), see:
MySQL: Curdate() vs Now()
but i can't put the value of variable
Assuming the date-time is a string and the rest are integers, try
"INSERT INTO sensors VALUES('%s', '%d', '%d')" % (date, temp, hum)
in the execute method (note the % goes between the format string and the tuple; this builds the SQL with Python string formatting rather than query parameters, so the parameterized approach above is safer).
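Pulling the pieces together, a minimal end-to-end sketch, assuming the table's three columns are a datetime plus the two integers, as the question implies (the execute/commit lines are commented out because they need the live connection):

```python
from datetime import datetime
from random import randrange

temp = randrange(10, 31, 2)   # even value in 10..30
hum = randrange(300, 701, 2)  # even value in 300..700

# One %s placeholder per column; the driver converts and escapes the
# values, including the datetime.
sql = "INSERT INTO sensors VALUES (%s, %s, %s)"
params = (datetime.now(), temp, hum)

# With the connection from the question (not executed here):
# mycursor.execute(sql, params)
# conn.commit()
```

With mysql.connector, %s is used for every parameter regardless of its Python type.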
