In Python I've connected to a Postgres database using the following code:
conn = psycopg2.connect(
    host="localhost",
    port="5432",
    database="postgres",
    user="postgres",
    password="123"
)
cur = conn.cursor()
I have created a table called departments and want to insert data into the database from a CSV file. I read the CSV in as follows (read_csv already returns a DataFrame, so the extra pd.DataFrame wrapper is unnecessary):
departments = pd.read_csv('departments.csv')
And I am trying to insert this data into the table with the following code:
for row in departments.itertuples():
    cur.execute('''
        INSERT INTO departments VALUES (?,?,?)
    ''',
    row.id, row.department_name, row.annual_budget)
    conn.commit()
which I've seen done in various articles but I keep getting the error:
TypeError: function takes at most 2 arguments (4 given)
How can I correct this, or is there another way to insert the CSV?
psycopg2 uses %s placeholders rather than ?, and you have to pass the row information as a single tuple. Try this instead:
for row in departments.itertuples():
    cur.execute('''
        INSERT INTO departments VALUES (%s, %s, %s)
    ''',
    (row.id, row.department_name, row.annual_budget))
    conn.commit()
See the docs for more info: https://www.psycopg.org/docs/usage.html
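As a side note, when loading a whole CSV, one execute() call plus one commit per row is slow; cursor.executemany (or psycopg2.extras.execute_values) sends the rows as a batch followed by a single commit. Here is a minimal runnable sketch of that batching pattern, using the standard library's sqlite3 purely as a stand-in so no Postgres server is needed; note that sqlite3 uses ? placeholders where psycopg2 uses %s, and the sample rows are hypothetical values mirroring the question's columns:

```python
import sqlite3

# SQLite stands in for Postgres here so the sketch runs without a server.
# NOTE: sqlite3 uses ? placeholders; with psycopg2 they would be %s.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE departments (id INTEGER, department_name TEXT, annual_budget INTEGER)")

# Hypothetical rows standing in for the CSV contents
rows = [(1, "Engineering", 500000), (2, "Marketing", 250000)]

# One batched call instead of one execute() per row, then a single commit
cur.executemany("INSERT INTO departments VALUES (?, ?, ?)", rows)
conn.commit()

cur.execute("SELECT COUNT(*) FROM departments")
print(cur.fetchone()[0])  # 2
```

With a real DataFrame you would pass list(departments.itertuples(index=False)) as the rows.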
I am trying to test this function, which inserts data into Postgres via a GCP Cloud Function, and I am having trouble: the error says some JSON is required. What exactly is required here for it to be called?
def hello_world(self):
    # establishing the connection
    conn = psycopg2.connect(
        database='somedb',
        user='someuser',
        password='somepwd',
        host='XX.XXX.XXX.XXX',
        port='5432'
    )
    # creating a cursor object
    cursor = conn.cursor()
    # list that contains the records to be inserted into the table
    data = [('Babita', 'Bihar'), ('Anushka', 'Hyderabad'),
            ('Anamika', 'Banglore'), ('Sanaya', 'Pune'),
            ('Radha', 'Chandigarh')]
    # inserting records into the employee table
    for d in data:
        cursor.execute("INSERT INTO employee(name, state) VALUES (%s, %s)", d)
    # print("List has been inserted to employee table successfully...")
    # commit your changes in the database
    conn.commit()
    # closing the connection
    conn.close()
I wrote a Python script to read data from a CSV file into a SQL Server database, and everything worked while all the data went into a single table.
Now I have normalized the database to 3NF and I don't know exactly how to fill the tables correctly.
Python Code:
def connection_string(driver, server_name, database_name):
    conn_string = f"""
        DRIVER={{{driver}}};
        SERVER={server_name};
        DATABASE={database_name};
        Trust_Connection=yes;
    """
    return conn_string
sql_insert = '''
    INSERT INTO Land(ISOCode, LandName)
    VALUES (?, ?)
    INSERT INTO Infections(InfactionsAtThisDay)
    VALUES (?)
    INSERT INTO CurrentDate(DateDay)
    VALUES (?)
    INSERT INTO CovidData(LandId, CurrentDateId, InfectionsId)
    VALUES ((SELECT LandId FROM Land), (SELECT CurrentDateId FROM CurrentDate), (SELECT InfectionsId FROM Infections));
'''
CSV Data: https://github.com/owid/covid-19-data/blob/master/public/data/owid-covid-data.csv
All tables are filled with the data except for "CovidData".
My question is what do I have to change so that "CovidData" is filled correctly?
I don't run into any error messages anymore, but when I refresh my database nothing is actually inserted. I am using psycopg2 and pgAdmin 4.
import psycopg2 as p

con = p.connect("dbname=Feedbacklamp user=postgres password=fillpw host=localhost port=5432")
cur = con.cursor()
sql = "INSERT INTO audiolevels(lokaalnummer, audiolevel, tijdstip) VALUES (%s, %s, %s)"
val = "100"       # the values to be inserted into my PostgreSQL database
val1 = 100
val2 = "tijdstip"
cur.execute(sql, (val, val1, val2))
con.commit
cur.close
con.close
con.commit() should be a function call; I think that is your problem. You are missing the parentheses, so the line is plain attribute access instead of a call, and the transaction is never committed. The same goes for the other methods, cur.close() and con.close().
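To see the difference, here is a minimal runnable sketch using the standard library's sqlite3 as a stand-in for psycopg2, so no database server is needed (sqlite3 uses ? placeholders where psycopg2 uses %s). Referencing the method without parentheses does nothing; calling it is what persists the transaction:

```python
import sqlite3

# sqlite3 stands in for psycopg2 so this runs without a Postgres server
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE audiolevels (lokaalnummer TEXT, audiolevel INTEGER, tijdstip TEXT)")
cur.execute("INSERT INTO audiolevels VALUES (?, ?, ?)", ("100", 100, "tijdstip"))

# Without parentheses this is only a reference to the bound method;
# nothing runs and nothing is committed:
print(con.commit)  # <built-in method commit of sqlite3.Connection object at 0x...>

# The actual call is what persists the transaction:
con.commit()
cur.execute("SELECT COUNT(*) FROM audiolevels")
count = cur.fetchone()[0]
print(count)  # 1

cur.close()
con.close()
```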
I am creating a Python app that will store my homework in a database (using phpMyAdmin). Here comes my problem:
At the moment I am storing every entry with an ID (1, 2, 3, 4...), a date (23/06/2018...), and a task (read one chapter of a book). Now I would like to sort the entries by date, because when I check what I have to do, I want to see first the task that is due soonest. For example:
If I have two tasks, one for 25/07/2018 and the other for 11/07/2018, I would like to show the 11/07/2018 one first, no matter if it was added later than the 25/07/2018 one. I am using Python (3.6), pymysql and phpMyAdmin to manage the database.
One idea I had was to run a Python script every 2 hours that sorts all the elements in the database, but I have no clue how to do that.
Here is the code that inserts the values into the database and then shows them all.
def dba():
    connection = pymysql.connect(host='localhost',
                                 user='root',
                                 password='Adminhost123..',
                                 db='deuresc',
                                 charset='utf8mb4',
                                 cursorclass=pymysql.cursors.DictCursor)
    try:
        with connection.cursor() as cursor:
            # Create a new record
            sql = "INSERT INTO `deures` (`data`, `tasca`) VALUES (%s, %s)"
            cursor.execute(sql, (data, tasca))
        # connection is not autocommit by default, so you must commit to save
        # your changes.
        connection.commit()
        with connection.cursor() as cursor:
            # Read a single record
            sql = "SELECT * FROM `deures` WHERE `data`=%s"
            cursor.execute(sql, (data,))
            resultat = cursor.fetchone()
            print('Has introduït: ' + str(resultat))
    finally:
        connection.close()
def dbb():
    connection = pymysql.connect(host='localhost',
                                 user='root',
                                 password='Adminhost123..',
                                 db='deuresc',
                                 charset='utf8mb4',
                                 cursorclass=pymysql.cursors.DictCursor)
    try:
        with connection.cursor() as cursor:
            # Read all records
            sql = "SELECT * FROM `deures`"
            cursor.execute(sql)
            resultat = cursor.fetchall()
            for i in resultat:
                print(i)
    finally:
        connection.close()
Can someone help?
You don't sort the database. You sort the results of the query when you ask for data. So in your dbb function you should do:
SELECT * FROM `deures` ORDER BY `data`
assuming that data is the field with the date.
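Here is a runnable sketch of that fix, using the standard library's sqlite3 in place of pymysql so it works without a MySQL server (the query itself is the same). It assumes the data column is a real DATE type, which sorts chronologically; if the dates were stored as dd/mm/yyyy strings, they would sort as text and ORDER BY would give the wrong order:

```python
import sqlite3

# sqlite3 stands in for pymysql so the sketch runs without a MySQL server;
# `data` is assumed to be a DATE column, stored here as ISO dates.
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("CREATE TABLE deures (data DATE, tasca TEXT)")
cursor.executemany("INSERT INTO deures VALUES (?, ?)",
                   [("2018-07-25", "later task"), ("2018-07-11", "earlier task")])

# ORDER BY sorts the result set at query time; the stored rows are never reordered,
# so no periodic "sorting script" is needed.
cursor.execute("SELECT * FROM deures ORDER BY data")
for row in cursor.fetchall():
    print(row)
# ('2018-07-11', 'earlier task')
# ('2018-07-25', 'later task')
```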
I have an SQL database and am wondering what command you use to just get a list of the table names within that database.
To be a bit more complete:
import MySQLdb

connection = MySQLdb.connect(
    host='localhost',
    user='myself',
    passwd='mysecret')                 # create the connection
cursor = connection.cursor()           # get the cursor
cursor.execute("USE mydatabase")       # select the database
cursor.execute("SHOW TABLES")          # execute 'SHOW TABLES' (but data is not returned)
now there are two options:
tables = cursor.fetchall() # return data from last query
or iterate over the cursor:
for (table_name,) in cursor:
    print(table_name)
SHOW tables
show tables will help. Here is the documentation.
It is also possible to obtain the tables of a specific schema by executing a single query with the driver below.
python3 -m pip install PyMySQL
import pymysql

# Connect to the database
conn = pymysql.connect(host='127.0.0.1', user='root', passwd='root', db='my_database')
# Create a Cursor object
cur = conn.cursor()
# Execute the query to get the table names of a specific database:
# replace my_database with the name of your database
cur.execute("SELECT table_name FROM information_schema.tables WHERE table_schema = 'my_database'")
# Read and print the table names
for table in [tables[0] for tables in cur.fetchall()]:
    print(table)
output:
my_table_name_1
my_table_name_2
my_table_name_3
...
my_table_name_x