ftplib -- deleting many files from ftp folder - python

I am pretty new to Python, but would like to use it to do some tasks on an FTP server. It feels like it should be fairly easy, but I am having some issues when trying to delete multiple files (hundreds) from an FTP folder. I have the file names I want to delete as strings from a SQL table, which I can copy and paste if needed.
My code so far:
import os
import ftplib
ftpHost = 'ftp.myhost.com'
ftpPort = 21
ftpUsername = 'myuser'
ftpPassword = 'mypassword'
ftp = ftplib.FTP(timeout=30)
ftp.connect(ftpHost, ftpPort)
ftp.login(ftpUsername, ftpPassword)
ftp.cwd("/myftpfolder/January2023")
ftp.delete("1234myfile.mp4")
ftp.quit()
print("Execution complete...")
As above, I can delete the files one at a time, but is there a practical way to delete about 800 files from the folder above if I paste the names somewhere, or put them in a text file and have Python read through it and execute the deletes? I suppose this isn't necessarily an FTP- or ftplib-specific question, but it could help me get a better general understanding of lists, tuples, etc. I'm using Python 3.10, by the way.
Thanks!
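One way to approach this (a minimal sketch building on the code above, not a tested answer): put the 800 names in a plain text file, one per line (the file name files_to_delete.txt below is just a placeholder), read them into a list, then loop over the list calling ftp.delete() for each entry, catching ftplib.error_perm so one missing file does not abort the whole run.
import ftplib

ftpHost = 'ftp.myhost.com'
ftpPort = 21
ftpUsername = 'myuser'
ftpPassword = 'mypassword'

ftp = ftplib.FTP(timeout=30)
ftp.connect(ftpHost, ftpPort)
ftp.login(ftpUsername, ftpPassword)
ftp.cwd("/myftpfolder/January2023")

# Read one file name per line from a text file (hypothetical name)
with open("files_to_delete.txt") as f:
    filenames = [line.strip() for line in f if line.strip()]

for name in filenames:
    try:
        ftp.delete(name)
        print("Deleted", name)
    except ftplib.error_perm as err:
        # e.g. the file is already gone or permission is denied; log and continue
        print("Could not delete", name, "-", err)

ftp.quit()
print("Execution complete...")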

Related

Zip list of files python

I have created some csv files in my code and I would like to zip them into a single archive to be sent by e-mail. I already have the e-mail function; the problem is the zipping.
I tried to use the approach here, but in my case I am not extracting or finding files in a directory: my program creates the csv files itself and keeps them in a list.
My list of files is like this:
lista_files = [12.csv,13.csv,14.csv]
It seems to be easy for developers, but as a beginner I find it hard. I would really appreciate it if someone could help me.
I believe you're looking for the zipfile library. And given that you're looking at a list of filenames, I'd just iterate using a for loop. If you have directories listed as well, you could use os.walk.
import zipfile

lista_files = ["12.csv", "13.csv", "14.csv"]
with zipfile.ZipFile('out.zip', 'w') as zipMe:
    for file in lista_files:
        zipMe.write(file, compress_type=zipfile.ZIP_DEFLATED)

Python FTP how to ask for folders and delete/remove and create new one

So I'm looking for a script to connect to my FTP server, list its folders, delete/remove those folders, and create new ones after the delete/remove. Any ideas?
ftplib is built into Python and should do exactly what you need. It is well documented here: https://docs.python.org/3.6/library/ftplib.html
You'll need to use ftp.rmd(dir) to remove a directory and ftp.mkd(dir) to create one. Here are some commands you might use:
from ftplib import FTP
ftp = FTP('ftp.example.com')
ftp.login()
# List contents
ftp.dir()
# Or get file/folder names
items = ftp.nlst()
# Delete directory 'mydir'
ftp.rmd('mydir')
# Create directory 'mydir'
ftp.mkd('mydir')
# Disconnect
ftp.quit()
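Note that RMD on most servers only removes empty directories, so if 'mydir' still contains files you will usually need to delete those first. A rough sketch of that, continuing from the ftp object above and assuming the folder holds only plain files (no nested sub-folders):
from ftplib import error_perm

folder = 'mydir'            # hypothetical folder name
ftp.cwd(folder)
for name in ftp.nlst():
    try:
        ftp.delete(name)    # removes plain files
    except error_perm:
        pass                # skip anything that cannot be deleted as a file
ftp.cwd('..')
ftp.rmd(folder)             # now the directory is empty and can be removed
ftp.mkd(folder)             # recreate it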
Try using ftplib, something like this:
from ftplib import FTP
ftp = FTP(your_host)
ftp.login(user=your_username, passwd=your_password)
To get a single file from your ftp instance, use
ftp.retrbinary('RETR {}'.format(file_name), callback)
where callback is called with each chunk of data, typically the write method of a local file opened in binary mode. A delete just uses ftp.delete(file_name), uploads use
ftp.storbinary('STOR {}'.format(file_name), source)
where source is a local file opened for binary reading, and listings can use ftp.nlst().
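Putting those pieces together, here is a minimal sketch; the host, credentials and file names are placeholders:
from ftplib import FTP

ftp = FTP('ftp.example.com')
ftp.login(user='myuser', passwd='mypassword')

# List the names in the current remote directory
names = ftp.nlst()

# Download a remote file to a local copy
with open('local_copy.mp4', 'wb') as fh:
    ftp.retrbinary('RETR 1234myfile.mp4', fh.write)

# Upload a local file to the remote directory
with open('local_copy.mp4', 'rb') as fh:
    ftp.storbinary('STOR 1234myfile.mp4', fh)

# Delete a remote file
ftp.delete('1234myfile.mp4')

ftp.quit()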

Python FTP: parseable directory listing

I'm using the Python FTP lib for the first time. My goal is simply to connect to an FTP site, get a directory listing, and then download all files which are newer than a certain date (e.g. all files created or modified within the last 5 days).
This turned out to be a bit more complicated than I expected for a few reasons. Firstly, I've discovered that there is no real "standard" FTP file list format. Most FTP sites conventionally use the UNIX ls format, but this isn't guaranteed.
So, my initial thought was to simply parse the UNIX ls format: it's not so bad after all, and it seems most mainstream FTP servers will use it in response to the LIST command.
This was easy enough to code with Python's ftplib:
import ftplib

def callback(line):
    print(line)

ftp = ftplib.FTP("ftp.example.com")
result = ftp.login(user="myusername", passwd="XXXXXXXX")
dirlist = ftp.retrlines("LIST", callback)
This works, except the problem is that the date given in the UNIX list format returned by the FTP server I'm dealing with doesn't have a year. A typical entry is:
-rw-rw-r-- 1 user user 1505581 Dec 9 21:53 somefile.txt
So the problem here is that I'd have to code in extra logic to sort of "guess" if the date refers to the current year or not. Except really, I'd much rather not code some complex logic like that when it seems so unnecessary - there's no reason the FTP server shouldn't be able to give me the year.
Okay, so after Googling around for some alternative ways to get LIST information, I've found that many FTP servers support the MLST and MLSD command, which apparently provides a directory listing in a "machine-readable" format, i.e. a list format which is much more amenable to automatic processing. Great. So, I try the following:
dirlist = ftp.sendcmd("MLST")
print(dirlist)
This produces a single line response, giving me data about the current working directory, but NOT a list of files.
250-Start of list for /
modify=20151210094445;perm=flcdmpe;type=cdir;unique=808U6EC0051;UNIX.group=1003;UNIX.mode=0775;UNIX.owner=1229; /
250 End of list
So this looks great, and easy to parse, and it also has a modify date with the year. Except it seems the MLST command is showing information about the directory itself, rather than a listing of files.
So, I've Googled around and read the relevant RFCs, but can't seem to figure out how to get a listing of files in "MLST" format. It seems the MLSD command is what I want, but I get a 425 error when I try that:
File "temp8.py", line 8, in <module>
dirlist = ftp.sendcmd("MLSD")
File "/usr/lib/python3.2/ftplib.py", line 255, in sendcmd
return self.getresp()
File "/usr/lib/python3.2/ftplib.py", line 227, in getresp
raise error_temp(resp)
ftplib.error_temp: 425 Unable to build data connection: Invalid argument
So how can I get a full directory listing in MLST/MLSD format here?
There is another module, ftputil, which is built on top of ftplib and has many features emulating os, os.path and shutil. I found it pretty easy to use and robust for this kind of operation. Maybe you could give it a try.
As for your purpose, the introductory example in its documentation covers it almost exactly.
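A rough sketch of the "files newer than 5 days" part with ftputil (a third-party package, pip install ftputil; the host, credentials and directory below are placeholders):
import time
import ftputil

with ftputil.FTPHost('ftp.example.com', 'myusername', 'XXXXXXXX') as host:
    host.chdir('/some/remote/dir')
    cutoff = time.time() - 5 * 24 * 3600   # five days ago
    for name in host.listdir(host.curdir):
        # getmtime() returns a full timestamp, so no year-guessing is needed
        if host.path.isfile(name) and host.path.getmtime(name) > cutoff:
            host.download(name, name)       # remote name -> same local name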
You could try ftplib's mlsd() method (available since Python 3.3), which sends MLSD over a data connection and yields (name, facts) pairs, and see if you can get what you need.
for name, facts in ftp.mlsd('directory'):
    print(name, facts)
I am working on something similar, where I need to parse the contents of a directory and all of its sub-directories. However, the server I am working with did not allow the MLST command, so I accomplished what I needed by:
parsing the main directory contents,
looping through the main directory contents, and
appending each loop's output to a pandas DataFrame.
import pandas as pd

# Names in the top-level target directory
test = pd.Series(ftp.nlst('/target directory/'))
df_server_content = pd.DataFrame()
for i in test:
    data_dir = '/target directory/' + i
    server_series = pd.Series(ftp.nlst(data_dir))
    # DataFrame.append has been removed from recent pandas; use concat instead
    df_server_content = pd.concat([df_server_content, server_series.to_frame()], ignore_index=True)

Compare archiwum.rar content and extracted data from .rar in the folder on Windows 7

Does anyone know how to compare the number of files and the size of the files in archiwum.rar with its extracted contents in a folder?
The reason I want to do this is that the server I'm working on was restarted a couple of times during extraction, and I am not sure whether all of the files were extracted correctly.
The .rar files are more than 100 GB each and the server is not that fast.
Any ideas?
PS: if the solution could be some code instead of a standalone program, my preference is Python.
Thanks
In Python you can use the rarfile module. Its usage is similar to the built-in zipfile module.
import rarfile
import os.path

extracted_dir_name = "samples/sample"  # Directory with the extracted files
archive = rarfile.RarFile("samples/sample.rar", "r")

# List file information and compare each entry with its extracted counterpart
for info in archive.infolist():
    print(info.filename, info.date_time, info.file_size)
    extracted_file = os.path.join(extracted_dir_name, info.filename)
    if info.file_size != os.path.getsize(extracted_file):
        print("Different size!")

Python ftplib - uploading multiple files?

I've googled, but I could only find how to upload one file... and I'm trying to upload all files from a local directory to a remote FTP directory. Any ideas how to achieve this?
With a loop?
Edit: in the general case, uploading only the files would look like this:
import os

for root, dirs, files in os.walk('path/to/local/dir'):
    for fname in files:
        full_fname = os.path.join(root, fname)
        with open(full_fname, 'rb') as fh:
            ftp.storbinary('STOR remote/dir/' + fname, fh)
Obviously, you need to look out for name collisions if you're just preserving file names like this.
Look at "Python-scriptlines required to make upload-files from JSON-Call" and the follow-up "FTPlib-operation: why some uploads, but others not?".
Although the starting point there differs from your question, the answer at the first link shows an example construction for uploading a JSON file plus an XML file with ftplib: look at script line 024 and onwards.
The second link covers some further aspects of uploading multiple files.
The same approach works for file types other than JSON and XML, obviously with a different 'entry' before the two final sections that define and call the FTP_Upload function.
Create an FTP batch file (with a list of the files you need to transfer). Use Python to execute ftp.exe with the "-s" option and pass in the batch file.
This is kludgy, but apparently ftplib's STOR command does not accept multiple files.
Here is a sample ftp batch file.
OPEN inetxxx
myuser mypasswd
binary
prompt off
cd ~/my_reg/cronjobs/k_load/incoming
mput *.csv
bye
If the above contents were in a file called "abc.ftp" - then my ftp command would be
ftp -s abc.ftp
Hope that helps.
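For the "use Python to execute ftp.exe" part, a minimal sketch (assuming the batch file above has been saved as abc.ftp; note that the Windows ftp.exe client expects the script file in the form -s:filename):
import subprocess

# Run the Windows ftp client with the command script created above
subprocess.run(['ftp', '-s:abc.ftp'], check=True)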
