Using the "net use" command in a loop - python

I am writing a simple Python script that connects to several remote Windows machines, reads the contents of a remote folder on each machine, and then zips and copies all files that have been modified after a given date.
The problem is that after it connects to the first computer, it does not connect to the second one: the "net use" command fails.
If I do it manually through the Windows command line on my computer it works, but not through my Python script.
I have not found any topic that could help me, and I am kind of stuck now... Do you have any idea what I could be doing wrong?
Below is my code (I apologize if it does not look very neat; I am just starting with Python).
import os, subprocess, datetime, shutil, zipfile

# the local destination of log files on my computer
ROOT_folder = 'C:\\Logs'

for i, IP in enumerate(list_of_IPs):
    # list of names corresponding to the IPs
    location = list_of_locs[i]
    # create a local repository in my ROOT folder for storing the logs of this remote station
    try:
        os.chdir(ROOT_folder + '\\' + location)
    except OSError:
        os.mkdir(ROOT_folder + '\\' + location)
    # the path to the logs that are stored on the remote Windows machines
    remote_path_to_logs = '_Temporary\\Logs'
    print location
    # mapping a drive m: >>> here I get the error at the 2nd iteration
    subprocess.call(r'net use m: \\' + IP + r'\c$ Password /user:Username', shell=True)
    # the modification date of the most recent file I downloaded is stored on my
    # local computer in a txt file - here I read the date
    try:
        with open(ROOT_folder + '\\' + location + '\\last_mod.txt', 'r') as myFile:
            last_file_downloaded = datetime.datetime.strptime(myFile.read(), '%Y-%m-%d %H:%M:%S')
    except IOError:
        last_file_downloaded = datetime.datetime(1970, 1, 1)
    os.chdir('M:\\' + remote_path_to_logs)
    # I sort the list of files from oldest to newest
    list_files = os.listdir('M:\\' + remote_path_to_logs)
    list_sorted = sorted([(fl, os.path.getmtime(fl)) for fl in list_files], key=lambda x: x[1])
    for log, logtime in list_sorted:
        date_file = datetime.datetime(1970, 1, 1) + datetime.timedelta(seconds=logtime)
        # I zip and move the file to my computer if it was modified after the date stored on my computer
        if date_file > last_file_downloaded:
            print log + ': zipping and moving to local directory... '
            with zipfile.ZipFile(date_file.strftime('%Y-%m-%d_%H-%M-%S') + '.zip', 'w', zipfile.ZIP_DEFLATED) as z:
                z.write(log)
            shutil.move(date_file.strftime('%Y-%m-%d_%H-%M-%S') + '.zip', ROOT_folder + '\\' + location)
            # I overwrite the modification date in my file
            with open(ROOT_folder + '\\' + location + '\\last_mod.txt', 'w') as myFile:
                myFile.write(date_file.strftime('%Y-%m-%d %H:%M:%S'))
    # disconnecting the drive m:
    subprocess.call('net use m: /delete /yes', shell=True)
    # I tried to put a time.sleep(5) here but it does not help

Actually, if I don't use a drive letter at all for the mapped share, the script works:
I use the command 'net use \\' + IP + '\c$ Password /user:Username' instead of 'net use m: \\' + IP + '\c$ Password /user:Username'.
It works, although I still can't explain why it fails when reusing the same drive letter.
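Building on that workaround, the deviceless session can also be paired with direct UNC paths, so no drive letter is ever consumed. This is a minimal sketch, not the poster's exact script; the IP address and credentials are placeholders, and the `net use` calls (commented out) only run on Windows:

```python
import os
import subprocess


def unc_commands(ip, user, password):
    """Build the deviceless net use connect/disconnect commands for \\ip\c$."""
    share = r'\\%s\c$' % ip
    connect = 'net use %s %s /user:%s' % (share, password, user)
    disconnect = 'net use %s /delete /yes' % share
    return share, connect, disconnect


share, connect, disconnect = unc_commands('192.168.0.10', 'Username', 'Password')
# subprocess.call(connect, shell=True)             # authenticate (Windows only)
# logs = os.listdir(share + r'\_Temporary\Logs')   # read the share via its UNC path
# subprocess.call(disconnect, shell=True)          # drop the session when done
print(connect)
```

Because no letter is mapped, nothing is left "in use" between iterations, which sidesteps the reuse problem entirely.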

Related

Why doesn't the famous backup Python script work when the target folder name contains spaces?

Hello everyone: this is the famous code for backing up files. It runs fine and gives me the message ("successful backup"), and the target directory is created, but it is empty.
When I remove the spaces from the target folder name, it works fine. So what is the problem, and how can I use target folders with spaces?
import os
import time

source = [r'D:\python35']
target_dir = r"D:\Backup to harddisk 2016"
target = target_dir + os.sep + time.strftime('%Y%m%d%H%M%S') + '.zip'
if not os.path.exists(target_dir):
    os.mkdir(target_dir)
zip_command = "zip -r {0} {1}".format(target, ' '.join(source))
print('zip command is:')
print(zip_command)
print('Running:')
if os.system(zip_command) == 0:
    print('Successful backup to', target)
else:
    print('Backup FAILED')
The problem is that Windows doesn't know that those spaces are part of the file name. Use subprocess.call, which takes a list of parameters instead; on Windows, it quotes arguments containing spaces for you before calling CreateProcess.
import subprocess as subp

zip_command = ["zip", "-r", target] + source
if subp.call(zip_command) == 0:
    print('Successful backup to', target)
It looks like it builds the command line with "zip -r {0} {1}".format(target, ' '.join(source)).
Spaces separate arguments on the command line, so if there is a space within a name, the shell treats it as the start of another argument.
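The quoting that `subprocess` applies can be inspected directly with `subprocess.list2cmdline`, which builds the exact Windows command string from an argument list. A small illustration (the paths are the example paths from the question):

```python
import subprocess

target = r"D:\Backup to harddisk 2016\20160101.zip"  # path containing spaces
source = r"D:\python35"

# list2cmdline shows how subprocess would quote the argument list before
# handing it to CreateProcess: the spaced path gets wrapped in double quotes
print(subprocess.list2cmdline(["zip", "-r", target, source]))
# → zip -r "D:\Backup to harddisk 2016\20160101.zip" D:\python35
```

This is why passing a list to `subprocess.call` works where the hand-built `os.system` string does not.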

python script windows path

I have a python script that doesn't seem to be opening the files.
The folder in the script is defined like this:
logdir = "C:\\Programs\\CommuniGate Files\\SystemLogs\\"
submitdir = "C:\\Programs\\CommuniGate Files\\Submitted\\"
This is how the paths are being used:
filenames = os.listdir(logdir)
fnamewithpath = logdir + fname
I'm running this script in Windows 7 sp1
Does this look correct?
Is there something I can put into the code to debug it to make sure the files are opening?
Thank you,
Docfxit
Edited to provide more clarification:
The actual code to open and close the file is here:
# read all the log parts
for fname in logfilenames:
    fnamewithpath = logdir + fname
    try:
        inputFile = open(fnamewithpath, "r")
    except IOError as reason:
        print("Error: " + str(reason))
        return
    if testing:
        print("Reading file '%s'" % (fname))
    reporter.munchFile(inputFile)
    inputFile.close()

# open output file
if testing:
    outfilename = fullLognameWithPath + ".summary"
    fullOutfilename = outfilename
else:
    outfilename = submitdir + "ls" + str(time.time()) + "-" + str(os.getpid())
    fullOutfilename = outfilename + ".tmp"
try:
    outfile = open(fullOutfilename, "w")
except IOError:
    print("Can't open output file " + fullOutfilename)
    return
if not testing:
    # add the mail headers first
    outfile.write("To: " + reportToAddress + "\n")
    outfile.write("From: " + reportFromAddress + "\n")
    outfile.write("Subject: CGP Log Summary new for " + logname + "\n")
    if useBase64:
        outfile.write("Content-Transfer-Encoding: base64\n")
    outfile.write("\n")

# save all this as a string so that we can base64 encode it
outstring = ""
outstring += "Summary of file: " + fullLogname + partAddendum + "\n"
outstring += "Generated " + time.asctime() + "\n"
outstring += reporter.generateReport()
if useBase64:
    outstring = base64.encodestring(outstring)
outfile.write(outstring)
outfile.close()

if not testing:
    # rename output file to submit it
    try:
        os.rename(outfilename + ".tmp", outfilename + ".sub")
    except OSError:
        print("Can't rename mail file to " + outfilename + ".sub")
I was originally wondering whether the double backslashes included in the path were correct.
I can't figure out why it isn't producing the output correctly.
Just in case someone would like to see the entire script I posted it:
The first half is at:
http://pastebin.ws/7ipf3
The second half is at:
http://pastebin.ws/2fuu3n
It was too large to post all in one.
This is being run in Python 3.2.2
Thank you very much for looking at it,
Docfxit
The code as written above does not actually open either file.
os.listdir returns a list of the files (technically entries, as non-files like . and .. are also included) in the specified path, but does not actually open them. For that you would need to call the open function on one of the paths.
If you wanted to open all the files in filenames for write, you could do something like this:
fileList = []
for f in filenames:
    fullPath = os.path.join(logdir, f)
    if os.path.isfile(fullPath):
        fileList.append(open(fullPath, 'w'))
After this, the list fileList would contain the open file handles for all of the files, which could then be iterated over (and, for example, all written to if you wanted to multiplex the output).
Note that when done, you should loop through the list and explicitly close them all (the with syntax that automatically closes them has additional complexities/limitations when it comes to dynamically sized lists, and is best avoided here, IMO).
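The explicit-close pattern described above can be sketched as follows. This is a self-contained toy (the directory and file names here are stand-ins created just for the example, not the CommuniGate paths from the question):

```python
import os
import tempfile

logdir = tempfile.mkdtemp()  # stand-in log directory for the example

# open a handle for each file we want to multiplex output to
fileList = [open(os.path.join(logdir, name), 'w') for name in ('a.log', 'b.log')]
for fh in fileList:
    fh.write('example line\n')

# when finished, loop through the list and close every handle explicitly
for fh in fileList:
    fh.close()

print(all(fh.closed for fh in fileList))
# → True
```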
For more info on files, see:
Reading and Writing Files
Also, it's best to use os.path.join to combine components of a path. That way it can be portable across supported platforms, and will automatically use the correct path separators, and such.
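For instance, the `logdir + fname` concatenation from the question could be rewritten with `os.path.join`, which inserts the correct separator for whatever platform the script runs on (the file name below is a made-up example):

```python
import os

# os.path.join picks the platform's separator, so no trailing "\\" is needed
# on the directory and the code stays portable
logdir = os.path.join("C:" + os.sep, "Programs", "CommuniGate Files", "SystemLogs")
fnamewithpath = os.path.join(logdir, "2016-01-01.log")
print(fnamewithpath)
```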
Edit in response to comment from OP:
I would recommend you step through the code using a debugger to see exactly what's going wrong. I use pudb, which is an enhanced command-line debugger, and find it invaluable. You can install it via pip into your system/virtualenv Python environment.

Python CIM_DataFile search for file by full path

So, I am trying to write a script that will be able to connect to remote systems and query the CIM_DataFile among other things.
For the sake of testing, I wrote the following code to test on my local machine. I have two files (ns.txt and dns.txt) in the root of my C: drive, however, the queries are not working correctly for Name= (which is the full path).
import wmi

wmiService = wmi.WMI()
for f in wmiService.CIM_DataFile(Name="c:\ns.txt"):
    print "NAME '" + f.Name + "'"
for f in wmiService.CIM_DataFile(Name="c:\dns.txt"):
    print "NAME '" + f.Name + "'"
for f in wmiService.CIM_DataFile(FileName="ns", Extension="txt", Drive="c:"):
    print "FILENAME '" + f.Name + "'"
for f in wmiService.CIM_DataFile(FileName="dns", Extension="txt", Drive="c:"):
    print "FILENAME '" + f.Name + "'"
The output of the above code is:
NAME 'c:\ns.txt'
FILENAME 'c:\ns.txt'
FILENAME 'c:\dns.txt'
Why is it not showing c:\dns.txt for the Name= query? I have also tested on other files located in different places on my system and most of them do not show up for the Name= query.
The cause was the file wmi.py inside the path Python27\Lib\site-packages.
I changed this file and my problem was resolved.
In fact, the problem was with the installed library itself.
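Separately from the library fix above, one pitfall worth knowing when building Windows paths for WMI queries: in an ordinary Python string literal, backslash sequences like `\n` are escape sequences, so `"c:\ns.txt"` does not contain the characters you might expect. Raw string literals avoid this (this note is a general caution, not a claim that it explains the exact symptom in the question):

```python
plain = "c:\ns.txt"   # "\n" is an escape sequence: this string embeds a newline
raw = r"c:\ns.txt"    # raw string literal: the backslash stays literal

print("\n" in plain, "\\" in raw)
# → True True
```

Using raw strings (or doubled backslashes) for every Windows path removes one source of silently wrong queries.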

How to list all the folders and files in the directory after connecting through SFTP in Python

I am using Python and trying to connect to SFTP and want to retrieve an XML file from there and need to place it in my local system. Below is the code:
import paramiko
sftpURL = 'sftp.somewebsite.com'
sftpUser = 'user_name'
sftpPass = 'password'
ssh = paramiko.SSHClient()
# automatically add keys without requiring human intervention
ssh.set_missing_host_key_policy( paramiko.AutoAddPolicy() )
ssh.connect(sftpURL, username=sftpUser, password=sftpPass)
ftp = ssh.open_sftp()
files = ftp.listdir()
print files
The connection succeeds. Now I want to see all the folders and all the files, and enter the required folder to retrieve the XML file from there.
Finally, my intention is to view all the folders and files after connecting to the SFTP server.
In the above code I used ftp.listdir(), which gave output something like this:
['.bash_logout', '.bash_profile', '.bashrc', '.mozilla', 'testfile_248.xml']
I want to know whether these are the only files present.
Is the command I used above right for viewing the folders too?
What is the command to view all the folders and files?
SFTPClient.listdir returns everything, files and folders alike.
If there were folders, to tell them apart from the files you would use SFTPClient.listdir_attr instead. It returns a collection of SFTPAttributes.
from stat import S_ISDIR, S_ISREG

sftp = ssh.open_sftp()
for entry in sftp.listdir_attr(remotedir):
    mode = entry.st_mode
    if S_ISDIR(mode):
        print(entry.filename + " is folder")
    elif S_ISREG(mode):
        print(entry.filename + " is file")
The accepted answer by @Oz123 is inefficient. SFTPClient.listdir internally calls SFTPClient.listdir_attr and throws most of the information away, returning file and folder names only. That answer then laboriously re-retrieves all of that data by calling SFTPClient.lstat for each file.
See also How to fetch sizes of all SFTP files in a directory through Paramiko.
Obligatory warning: do not use AutoAddPolicy – you lose protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
One quick solution is to examine the output of lstat of each object in ftp.listdir().
Here is how you can list all the directories.
>>> for i in ftp.listdir():
...     lstatout = str(ftp.lstat(i)).split()[0]
...     if 'd' in lstatout: print i, 'is a directory'
...
Files are the opposite search:
>>> for i in ftp.listdir():
...     lstatout = str(ftp.lstat(i)).split()[0]
...     if 'd' not in lstatout: print i, 'is a file'
...
Here is a solution I came up with, based on https://stackoverflow.com/a/59109706. It gives a pretty output.
Update: I have modified it slightly to incorporate Martin's suggestions. It is now considerably faster than my initial version using isdir and listdir.
from pathlib import Path
from stat import S_ISDIR

# prefix components:
space = '    '
branch = '│   '
# pointers:
tee = '├── '
last = '└── '

def stringpath(path):
    # just a helper to get the string of a PosixPath
    return str(path)

def tree_sftp(sftp, path='.', parent='/', prefix=''):
    """
    Loop through files to print them out:
        for file in tree_sftp(sftp):
            print(file)
    """
    fullpath = Path(parent, path)
    strpath = stringpath(fullpath)
    dirs = sftp.listdir_attr(strpath)
    pointers = [tee] * (len(dirs) - 1) + [last]
    pdirs = [Path(fullpath, d.filename) for d in dirs]
    sdirs = [stringpath(path) for path in pdirs]
    for pointer, sd, d in zip(pointers, sdirs, dirs):
        yield prefix + pointer + d.filename
        if S_ISDIR(d.st_mode):
            extension = branch if pointer == tee else space
            yield from tree_sftp(sftp, sd, prefix=prefix + extension)
You can try it out like this, using pysftp:
import pysftp

with pysftp.Connection(HOSTNAME, USERNAME, PASSWORD) as sftp:
    for file in tree_sftp(sftp):
        print(file)
Let me know if it works for you.

copying logs in python using the command line function

I have a question about my code that I think I can verify here. My requirement is to copy the Apache log and the error log from two different servers. I've written a Python program using a for loop.
My code:
import os, time

def copylogs(Appache, Errorlog, folder_prefix):
    root_path = '/home/tza/Desktop/LOGS/'
    folders = ['Appache', 'Errorlog']
    for folder in folders:
        folder_name = folder_prefix + "_" + folder + str(int(time.time()))
        mkdircmd = "mkdir -p " + root_path + "/" + folder_name
        os.system(mkdircmd)
        filePath = root_path + folder_name
        serverPath = "/var/log/apache/*"
        cmd = "scp " + "symentic@60.62.1.164:" + serverPath + " " + filePath
        cmd = cmd.replace("60.62.1.164", myip1)
        cmd = os.system(cmd)
        print "Logs are at:", root_path + folder_name
        time.sleep(10)
        filePath = root_path + folder
        serverPath = "/var/log/errorlog/*"
        cmd = "scp " + "symentic@10.95.21.129:" + serverPath + " " + filePath
        cmd = cmd.replace("10.95.21.129", myip2)
        cmd = os.system(cmd)
        print "Logs are at:", root_path + folder_name
Now I am calling the function at the end of my program:
folder_prefix = "Fail Case-1"
copylogs(Appache, Errorlog, folder_prefix)
I have an issue here. The program executes successfully, but the logs get overwritten. What I mean is that first the Appache folder gets created and the logs are copied, and then it gets overwritten.
What I require is: create an Appache logs folder (with the timestamp as defined), copy the logs from machine one, next copy the error logs from machine two, and continue the program.
How can this be achieved?
scp by default overwrites a file if one with the same name exists on the target computer.
I would suggest naming the error logs with a combination of the file name and a timestamp. It is always a good convention for logs to carry a timestamp in the name, and it also prevents the overwriting problem you are experiencing.
Consider using rsync instead of scp
Do your logs have the same filenames on both machines? scp will overwrite them if they do.
Personally, I would use two directories, one for each machine, or use a timestamp as suggested by Sylar in his answer.
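The per-machine-directory suggestion can be sketched like this. It builds a destination directory name from the host and a timestamp, so copies from different machines can never collide (the root path stands in for /home/tza/Desktop/LOGS, and the host/log names are placeholders):

```python
import os
import tempfile
import time


def dest_dir(root, host, kind):
    """Build a destination directory unique per host and timestamp, so a later
    scp from another machine cannot overwrite an earlier copy."""
    stamp = time.strftime('%Y%m%d%H%M%S')
    return os.path.join(root, "%s_%s_%s" % (kind, host.replace('.', '-'), stamp))


root = tempfile.mkdtemp()  # stand-in for /home/tza/Desktop/LOGS
d = dest_dir(root, '10.95.21.129', 'errorlog')
os.makedirs(d)             # then: scp user@host:/var/log/errorlog/* into d
print(os.path.isdir(d))
# → True
```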
