Error Exporting .csv from PostgreSQL using Python 3 - python

I have a simple PostgreSQL COPY statement that copies a table from a database on a network host (NETWORK-HOST) to a .csv file on a shared network folder. It works fine in pgAdmin, but when I transfer it to a psycopg2 script, it tells me permission denied. I have double-checked that full control is granted to my username on the network share, but that has not made a difference. I am running Windows 10, Python 3 (32-bit), and PostgreSQL 9.5.1.
Here is the script in PGAdmin that runs successfully:
copy "Schema".county_check_audit to '\\NETWORK-HOST\NetworkFolder\county_check.csv' delimiter ',' CSV HEADER;
Here is the script where I get the permission error:
import psycopg2
connection = psycopg2.connect(database="db", user="postgres", password="password", host="NETWORK-HOST")
cursor = connection.cursor()
cursor.execute("copy \"Schema\".area_check_audit to '\\\\NETWORK-HOST\\NetworkFolder\\area_check.csv' delimiter ',' CSV HEADER;")
connection.commit()
This is the error:
psycopg2.ProgrammingError: could not open file "\\NETWORK-HOST\NetworkFolder\area_check.csv" for writing: Permission denied
Any insights are greatly appreciated.

According to the error message, the file cannot be opened for writing, so write access has to be granted.
To change a file's security access on Windows, see: Permission denied when trying to import a CSV file from PGAdmin
I'd suggest testing your code first by writing to a file on the same host as the database; once you are sure the code is fine, you can debug the access rights to the file on the other host.
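Keep in mind that a server-side COPY ... TO <file> is written by the PostgreSQL server process on NETWORK-HOST, so it is the account the PostgreSQL service runs under, not your Windows user, that needs write access to the share. A hedged sketch of that two-step test, where C:\temp is a placeholder for a local directory on the database host:

import psycopg2

connection = psycopg2.connect(database="db", user="postgres",
                              password="password", host="NETWORK-HOST")
cursor = connection.cursor()

# Step 1: write to a path that is local to the database host; if this works,
# the Python code and the COPY statement are fine.
cursor.execute("copy \"Schema\".area_check_audit to 'C:\\temp\\area_check.csv' "
               "delimiter ',' CSV HEADER;")

# Step 2: point the same statement at the share; if it still fails, grant write
# access on the share to the account the PostgreSQL service runs under.
cursor.execute("copy \"Schema\".area_check_audit to "
               "'\\\\NETWORK-HOST\\NetworkFolder\\area_check.csv' delimiter ',' CSV HEADER;")

connection.commit()
connection.close()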

Related

python 3, NamedTemporaryFile, permission denied

I need to create a file using NamedTemporaryFile and download some data into it:
f = NamedTemporaryFile(suffix='.csv', delete=False)
s3.download_fileobj(bucket, f"{files_dir}/metadata/{table_name}.csv.gz", f)
After that I need to load the data into a MySQL table using LOAD DATA INFILE.
But I get an error:
File '/tmp/tmp00f3funu.csv' not found (OS errno 13 - Permission denied)
I suppose the MySQL root user doesn't have enough permissions to read the temp file.
The temp file is readable only by its owner.
What should I do to resolve this issue?
Here you either need to run your script as the MySQL server's user, which might not be the best idea, or simply change the file's permissions after it is created, using os.chmod.
See the detailed answer here: https://stackoverflow.com/a/16249655/7453765
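A minimal sketch of the os.chmod approach, assuming the MySQL server runs on the same machine and that making the temp file world-readable is acceptable (s3, bucket, files_dir and table_name are the objects from the question):

import os
import stat
from tempfile import NamedTemporaryFile

f = NamedTemporaryFile(suffix='.csv', delete=False)
s3.download_fileobj(bucket, f"{files_dir}/metadata/{table_name}.csv.gz", f)
f.flush()
f.close()

# NamedTemporaryFile creates the file with mode 0600; widen it so the mysqld
# process (running as a different OS user) can read it for LOAD DATA INFILE.
os.chmod(f.name, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH)  # 0644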

How to give my python program permission for saving a file on a server?

I am really new to Python.
I have made a Python program for merging PDFs. The problem is that when the program tries to save the completed merged PDF on the server (at work), it doesn't work: I get a Permission denied error. But when I tell the program to save the merged PDF in a local folder (on my PC), it works perfectly. How can I fix this?
An example of pathpdflocation_compleet (local on my pc) :
C:\Users\myname\Documents\TF\Compleet\pdfcompleet.pdf
An example of pathpdflocation_compleet (at work) :
W:\xxx\xxx\xxx\xx\pdfcompleet.pdf
Can I give my Python script permission to save this PDF file on the server?
I have searched the web, but I haven't found anything about this.
I have already tried double backslashes as well as / and \.
I have already tried giving the full name of the server.
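For reference, the full-server-name form with double backslashes that I tried looks roughly like this (SERVERNAME and Share are placeholders):

# both lines describe the same UNC path, once with escaped backslashes and once as a raw string
pathpdflocation_compleet = '\\\\SERVERNAME\\Share\\xxx\\pdfcompleet.pdf'
pathpdflocation_compleet = r'\\SERVERNAME\Share\xxx\pdfcompleet.pdf'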
Here is the script:
from PyPDF2 import PdfFileMerger
from openpyxl import load_workbook

filepath = input('Path of the Merge Excel: ')
wb = load_workbook(filepath, data_only=True)
sheet = wb.active  # worksheet holding the PDF paths; not defined in the original snippet

fiche1 = "A1"
file1 = sheet[fiche1].value
fiche2 = "A2"
file2 = sheet[fiche2].value

pdf2merge = [file1, file2]  # the "..." in the original stands for further files read the same way

merger = PdfFileMerger()
for pdf in pdf2merge:
    merger.append(pdf)

# pathpdflocation_compleet is set elsewhere in the full script; it holds the
# local or server output path shown above.
with open(pathpdflocation_compleet, 'wb') as fout:
    merger.write(fout)
The error is this:
IOError: [Errno 13] Permission denied
I want the file to be saved on the server.

Django-dbbackup: how to change backup file extension?

I am trying to use the django-dbbackup module to back up my database. After running dbbackup, it saves a db dump with a name something like this:
default-user-2018-03-22-220805.psql
Then I delete the last row in one of my db's tables. Next I run dbrestore and get the following:
Finding latest backup
Restoring backup for database 'default' and server 'None'
Restoring: default-user-2018-03-22-220805.psql
Restore tempfile created: 407.0 KiB
Are you sure you want to continue? [Y/n] y
Following files were affected
But after that, nothing happens. The deleted row is not restored in my db.
I have read somewhere (unfortunately I have already lost that page) that to restore my db the dump file must be in .tar format.
I have also tried to use that .psql file with pgAdmin 4 to restore the db via the UI tool, but got an error that the input file is not a valid archive.
And lastly, I tried to use that file from the Windows cmd running pg_restore and got:
pg_restore: [archiver] input file does not appear to be a valid archive
So the question is: how do I use django-dbbackup's dbrestore with its generated file, or, if it is not possible to restore from .psql files, how do I change the extension/format of its output files?
P.S. I also found the line extension = 'psql' in the source code and tried changing it. That worked in the sense that I got a .tar file as output, but dbrestore then said:
Finding latest backup
CommandError: There's no backup file available.
You can change the backup file extension with this variable:
DBBACKUP_FILENAME_TEMPLATE = '{databasename}-{servername}-{datetime}.sql'
You define the DBBACKUP_FILENAME_TEMPLATE variable in settings.py.
This setting changes the backup file extension from the default dump extension to .sql, and you can replace .sql with any other extension.
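For context, a hedged settings.py sketch; the storage lines below are typical django-dbbackup settings assumed for illustration, and only the last line is the actual change:

# settings.py
DBBACKUP_STORAGE = 'django.core.files.storage.FileSystemStorage'  # assumed file-system storage
DBBACKUP_STORAGE_OPTIONS = {'location': '/var/backups/'}          # assumed backup directory
# produces e.g. default-user-2018-03-22-220805.sql instead of ...psql
DBBACKUP_FILENAME_TEMPLATE = '{databasename}-{servername}-{datetime}.sql'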

Generating Model code with pwiz Error

I am trying to generate some model code using pwiz. The database in question is an SQLite database, which contains some Django tables in addition to some regular tables generated using a Python script. However, when I run the following in a Linux terminal:
python -m pwiz -e sqlite -t mapping_table db.sqlite3
I get the following error:
File "/python2.7/site-packages/peewee.py", line 3001, in get_indexes
    for _, name, is_unique in cursor.fetchall():
ValueError: too many values to unpack
The table I am trying to retrieve was generated using another Python script. It only has a couple of columns and rows in it. I'm not sure how exactly to proceed here.
python -m pwiz -e sqlite db.sqlite3 > db_map.py
pwiz reads the database and creates a file with the database mapping.
db.sqlite3 is the name of your database (use your database's filename).
db_map.py is the name of the output file (name it as you like, but keep the .py extension).
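For reference, the generated db_map.py typically looks roughly like this; the column names below are hypothetical, and older peewee releases emit db_table instead of table_name in Meta:

from peewee import *

database = SqliteDatabase('db.sqlite3')

class BaseModel(Model):
    class Meta:
        database = database

class MappingTable(BaseModel):
    # hypothetical columns; pwiz derives the real ones from the table schema
    source_id = IntegerField(null=True)
    label = TextField(null=True)

    class Meta:
        table_name = 'mapping_table'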

mysql load data infile for a list of files

I am using the Ubuntu 12.04 operating system. I have a folder full of .csv files that I need to import into a MySQL database on the local machine. Currently, I have been using this syntax from the mysql command line to load the csv files into the database one by one:
load data local infile 'file_name.csv' into table table_name fields terminated by ',' optionally enclosed by '"' lines terminated by '\r\n';
This works really well. I want to know if there is a way to load all these files at once. My first idea was to make a Python script to handle it:
import MySQLdb as mysql
import os
import string

db = mysql.connect(host="localhost", user="XXXX", passwd="XXXX", db="test")
l = os.listdir(".")
for file_name in l:
    print file_name
    c = db.cursor()
    # only load the files whose names end in DIV.csv
    if file_name.find("DIV.csv") > -1:
        c.execute("""load data local infile '%s' into table cef_div_table fields terminated by ',' optionally enclosed by '"' lines terminated by '\r\n';""" % file_name)
With this solution, I am running into the problem that load data local infile will not work with newer versions of the MySQL client unless I start the mysql client from the command line with the --local-infile option. That is really a drag...
I found a solution that seemed to work: I use the local_infile = 1 option when establishing the connection in Python (as suggested here: MySQL LOAD DATA LOCAL INFILE Python). This way, the code appears to complete without any errors, but nothing is ever uploaded to the database.
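Concretely, the connection with that option looks roughly like this (credentials are placeholders):

import MySQLdb as mysql

# local_infile=1 asks the client library to allow LOAD DATA LOCAL INFILE
db = mysql.connect(host="localhost", user="XXXX", passwd="XXXX",
                   db="test", local_infile=1)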
It is strange; just to make sure, I tried uploading a single file from the mysql command line, and it worked fine.
I am willing to try another solution to this problem of uploading multiple csv files into mysql all at once. Any help is greatly appreciated!
