Advanced Python FTP - can I control how ftplib talks to a server? - python

I need to send a very specific (non-standard) string to an FTP server:
dir "SYS:\IC.ICAMA."
The case is critical, as are the style of quotes and their content.
Unfortunately, ftplib.dir() seems to use the 'LIST' command rather than 'dir' (and it uses the wrong case for this application).
The FTP server is actually a telephone switch and it's a very non-standard implementation.
I tried using ftplib.sendcmd(), but it also sends 'pasv' as part of the command sequence.
Is there an easy way of issuing specific commands to an FTP server?

Try the following. It is a modification of the original FTP.dir command which uses "dir" instead of "LIST". It gives a "DIR not understood" error with the ftp server I tested it on, but it does send the command you're after. (You will want to remove the print command I used to check that.)
import ftplib

class FTP(ftplib.FTP):
    def shim_dir(self, *args):
        '''List a directory in long form.
        By default list current directory to stdout.
        Optional last argument is callback function; all
        non-empty arguments before it are concatenated to the
        LIST command. (This *should* only be used for a pathname.)'''
        cmd = 'dir'
        func = None
        if args[-1:] and type(args[-1]) != type(''):
            args, func = args[:-1], args[-1]
        for arg in args:
            if arg:
                cmd = cmd + (' ' + arg)
        print(cmd)
        self.retrlines(cmd, func)

if __name__ == '__main__':
    f = FTP('ftp.ncbi.nih.gov')
    f.login()
    f.shim_dir('"blast"')
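If the switch's reply to that command comes back on the control connection (no directory data to transfer), ftplib's sendcmd() may be enough on its own: it transmits exactly the string you give it and returns the server's reply, without the PASV/PORT exchange that retrlines() performs to set up a data connection. A minimal, untested sketch against a hypothetical host:
import ftplib

ftp = ftplib.FTP('switch.example.com')   # hypothetical hostname
ftp.login('user', 'password')            # placeholder credentials
# sendcmd() sends the command verbatim and returns the reply string
reply = ftp.sendcmd(r'dir "SYS:\IC.ICAMA."')
print(reply)
ftp.quit()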

Related

SyntaxError: missing ; before statement - Python

Let's say I have this snippet
list_command = 'mongo --host {host} --port {port} ' \
               '--username {username} --password {password} --authenticationDatabase {database} < {path}'

def shell_exec(cmd: str):
    import subprocess
    p = subprocess.call(cmd, shell=True)
    return p
Let's say these are the commands I'm trying to run on mongo
use users
show collections
db.base.find().pretty()
If I format the string list_command with the appropriate values and pass it to the function with shell=True, it works fine. But I'm trying to avoid shell=True for security purposes.
If I call it with shell=False, I get the following error:
2020-08-31T14:08:49.291+0100 E QUERY [thread1] SyntaxError: missing ; before statement #./mongo/user-01-09-2020:1:4
failed to load: ./mongo/user-01-09-2020
Your list_command is a shell command: in particular, it includes input redirection (via < {path}), which is a syntactic feature of the shell. To use it you need shell=True.
If you don’t want to use shell=True, you need to change the way you construct the argument (separate arguments need to be passed as separate items of a list rather than as a single string), and you need to pass the script into the standard input via an explicit pipe, by setting its input parameter:
cmd = ['mongo', '--host', '{host}', '--port', …]
subprocess.run(cmd, input=mongodb_script)
Using input raised the following error: TypeError: __init__() got an unexpected keyword argument 'input'.
I ended up doing the following:
import subprocess

def shell_exec(cmd: str, stdin=None):
    with open(stdin, 'rb') as f:
        return subprocess.call(cmd.split(), stdin=f)
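For reference, here is a fuller, hedged sketch of the subprocess.run approach suggested above; it needs Python 3.5 or newer (text= needs 3.7, use universal_newlines= on older versions), and the host, port and credentials are placeholders:
import subprocess

cmd = [
    'mongo',
    '--host', 'localhost',          # placeholder values
    '--port', '27017',
    '--username', 'user',
    '--password', 'password',
    '--authenticationDatabase', 'users',
]

# the mongo script is fed through stdin instead of shell redirection
mongodb_script = 'use users\nshow collections\ndb.base.find().pretty()\n'

result = subprocess.run(cmd, input=mongodb_script, text=True)
print(result.returncode)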

How to obtain a python command line argument only if it's a string

I'm making my own Python CLI and I want to accept only String arguments
import sys
import urllib, json
# from .classmodule import MyClass
# from .funcmodule import my_function

def main():
    args = sys.argv[1:]
    #print('count of args :: {}'.format(len(args)))
    #for arg in args:
    #    print('passed argument :: {}'.format(arg))
    # always returns true even if I don't pass the argument as a "String"
    if isinstance(args[0], str):
        print('JSON Body:')
        url = args[0]
        response = urllib.urlopen(url)
        data = json.loads(response.read())
        print(data)
    # my_function('hello world')
    # my_object = MyClass('Thomas')
    # my_object.say_name()

if __name__ == '__main__':
    main()
I execute it as api "url" and this gives the correct output. But when I try to execute api url without passing the URL as a quoted "String", the output is a little odd.
How can I accept only String arguments?
What I've tried so far:
I found this solution here, but it didn't work for me (it couldn't recognize the join() function).
The problem isn't a Python issue. It's just that your URL contains a &, and on a Linux/Unix shell this asks to run your command in the background (and everything after the & is dropped from the command). That explains the [1]+ Done output, with your truncated command line.
So you have to quote your argument to keep the & from being interpreted (or escape it as \&). There's no way around this from a Un*x shell (it would work unquoted from a Windows shell, for instance).
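On the Python side you can at least fail loudly when the shell has mangled the argument; a small defensive sketch (Python 3, standard library only, not part of the original answer):
import sys
from urllib.parse import urlparse

def main():
    # Note: every element of sys.argv is already a str, which is why the
    # isinstance() check above always returns True.
    if len(sys.argv) != 2:
        sys.exit('usage: api "<url>"  (quote the URL so the shell does not split it on &)')
    url = sys.argv[1]
    if urlparse(url).scheme not in ('http', 'https'):
        sys.exit('expected an http(s) URL, got: {!r}'.format(url))
    print('Fetching:', url)

if __name__ == '__main__':
    main()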

Python Fabric won't pass in variable

I had a script that was working. I made one small change and now it stopped working. The top version works, while the bottom one fails.
def makelocalconfig(file="TEXT"):
    host = env.host_string
    filename = file
    conf = open('/home/myuser/verify_yslog_conf/%s/%s' % (host, filename), 'r')
    comment = open('/home/myuser/verify_yslog_conf/%s/localconfig.txt' % host, 'w')
    for line in conf:
        comment.write(line)
    comment.close()
    conf.close()

def makelocalconfig(file="TEXT"):
    host = env.host_string
    filename = file
    path = host + "/" + filename
    pwd = local("pwd")
    conf = open('%s/%s' % (pwd, path), 'r')
    comment = open('%s/%s/localconfig.txt' % (pwd, host), 'w')
    for line in conf:
        comment.write(line)
    comment.close()
    conf.close()
For troubleshooting purposes I added print pwd and print path lines to make sure the variables were getting filled correctly. pwd comes up empty. Why isn't this variable being set correctly? I use this same pattern of
var = sudo("cmd")
all the time. Is local different from sudo and run?
In short, you may need to add capture=True:
pwd = local("pwd", capture=True)
local runs a command locally:
a convenience wrapper around the use of the builtin Python subprocess
module with shell=True activated.
run runs a command on a remote server and sudo runs a remote command as super-user.
There is also a note in the documentation:
local is not currently capable of simultaneously printing and capturing output, as run/sudo do. The capture kwarg allows you to switch between printing and capturing as necessary, and defaults to False.
When capture=False, the local subprocess’ stdout and stderr streams are hooked up directly to your terminal, though you may use the global output controls output.stdout and output.stderr to hide one or both if desired. In this mode, the return value’s stdout/stderr values are always empty.
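Applying that to the failing function, a minimal sketch (assuming Fabric 1.x, where env and local come from fabric.api):
from fabric.api import env, local

def makelocalconfig(file="TEXT"):
    host = env.host_string
    filename = file
    path = host + "/" + filename
    # capture=True makes local() return the command's stdout instead of
    # printing it and returning an empty string
    pwd = local("pwd", capture=True)
    conf = open('%s/%s' % (pwd, path), 'r')
    comment = open('%s/%s/localconfig.txt' % (pwd, host), 'w')
    for line in conf:
        comment.write(line)
    comment.close()
    conf.close()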

running command before other finished in python

I asked already and a few people gave good advice, but there were too many unknowns for me as I am a beginner. Therefore I decided to ask for help again without giving bad code.
I need a script which will copy files to a directory while another command is still running.
Basically I run the first command, it generates files (until the user presses Enter) and then those files are gone (automatically removed).
What I would like is to copy those files (and without having to press "Enter" myself).
I made it in bash, however I would like to achieve this in Python. Please see below:
while kill -0 $! 2>/dev/null; do
    cp -v /tmp/directory/* /tmp/
done
If the first script is purely a command-line one, it should be fully manageable with a Python script.
General architecture:
the Python script starts the first one with the subprocess module
it reads output from the first script until it gets the message asking to press Enter
it copies all files from the source directory to the destination directory
it sends \r to the first script's input
it waits for the first script to terminate
it exits
General requirements:
the first script must be a purely CLI one
the first script must write to standard output/error and read from standard input - if it reads/writes to a physical terminal (/dev/tty on Unix/Linux or con: on DOS/Windows), it won't work
the end of processing must be identifiable in standard output/error
if the two requirements above are not met, the only way would be to wait a defined amount of time
Optional operation:
if there are other interactions in the first script (reads and/or writes), it will be necessary to add the redirections in the script; it is certainly feasible, but a little harder
Configuration:
the command to be run
the string (from the command's output) that indicates the first program has finished processing
the source directory
the destination directory
a pattern for the file names to be copied
if a delay is used instead of an identifiable string in the output: the time to wait before copying
A script like that should be simple to write and test, and able to manage the first script as you want.
Edit: here is an example of such a script, still without timeout management.
import subprocess
import os
import shutil
import re

# default values for command execution - to be configured at installation
defCommand = "test.bat"
defEnd = "Appuyez"
defSource = "."
defDest = ".."
# BEWARE : pattern is in regex format !
defPattern = r"x.*\.txt"

class Launcher(object):
    '''
    Helper to launch a command, wait for a defined string from stderr or stdout
    of the command, copy files from a source folder to a destination folder,
    and write a newline to the stdin of the command.
    Limits : use blocking IO without timeout'''

    def __init__(self, command=defCommand, end=defEnd, source=defSource,
                 dest=defDest, pattern=defPattern):
        self.command = command
        self.end = end
        self.source = source
        self.dest = dest
        self.pattern = pattern

    def start(self):
        'Actually starts the command and copies the files'
        found = False
        pipes = os.pipe()  # use explicit pipes to mix stdout and stderr
        rx = re.compile(self.pattern)
        cmd = subprocess.Popen(self.command, shell=True, stdin=subprocess.PIPE,
                               stdout=pipes[1], stderr=pipes[1])
        os.close(pipes[1])
        while True:
            txt = os.read(pipes[0], 1024)
            # print(txt)  # for debug
            # str(txt) works on the bytes repr, which still contains the end marker
            if str(txt).find(self.end) != -1:
                found = True
                break
        # only try to copy files if end string found
        if found:
            for file in os.listdir(self.source):
                if rx.match(file):
                    shutil.copy(os.path.join(self.source, file), self.dest)
                    print("Copied : %s" % (file,))
        # copy done : write the newline to command input
        cmd.stdin.write(b"\n")
        cmd.stdin.close()
        try:
            cmd.wait()
            print("Command terminated with %d status" % (cmd.returncode,))
        except:
            print("Calling terminate ...")
            cmd.terminate()
        os.close(pipes[0])

# allows to use the file either as an imported module or directly as a script
if __name__ == '__main__':
    # parse optional parameters
    import argparse
    parser = argparse.ArgumentParser(description='Launch a command and copy files')
    parser.add_argument('--command', '-c', nargs=1, default=defCommand,
                        help="full text of the command to launch")
    parser.add_argument('--endString', '-e', nargs=1, default=defEnd,
                        dest="end",
                        help="string that denotes that command has finished processing")
    parser.add_argument('--source', '-s', nargs=1, default=defSource,
                        help="source folder")
    parser.add_argument('--dest', '-d', nargs=1, default=defDest,
                        help="destination folder")
    parser.add_argument('--pattern', '-p', nargs=1, default=defPattern,
                        help="pattern (regex format) for files to be copied")
    args = parser.parse_args()

    # create and start a Launcher ...
    launcher = Launcher(args.command, args.end, args.source, args.dest,
                        args.pattern)
    launcher.start()
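The class can also be driven directly from Python instead of through argparse; a hypothetical usage with placeholder values:
# placeholder command, marker string, folders and pattern
launcher = Launcher(command="generate_files.sh",
                    end="Press Enter",
                    source="/tmp/directory",
                    dest="/tmp",
                    pattern=r"x.*\.txt")
launcher.start()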

Passing a password to KLOG from within a script, using `subprocess.POPEN`

A series of applications I'm writing require that the user be able to read from a filesystem with KLOG authentication. Some functions require the user to have KLOG tokens (i.e., be authenticated) and others don't. I wrote a small Python decorator so that I can refactor the "you must be KLOGed" functionality within my modules:
# this decorator is defined in ``mymodule.utils.decorators``
import subprocess

def require_klog(method):
    def require_klog_wrapper(*args, **kwargs):
        # run the ``tokens`` program to see if we have KLOG tokens
        out = subprocess.Popen('tokens', stdout=subprocess.PIPE)
        # the tokens (if any) are located in lines 4:n-1
        tokens_list = out.stdout.readlines()[3:-1]
        if tokens_list == []:
            # this is where I launch KLOG (if the user is not authenticated)
            subprocess.Popen('klog')
        out = method(*args, **kwargs)
        return out
    return require_klog_wrapper

# once the decorator is defined, any function can use it as follows:
from mymodule.utils.decorators import require_klog

@require_klog
def my_function():
    # do something (if not KLOGed, it SHOULD ask for the password... but it does not!)
    pass
It is all very simple. Except when I tried to apply the following logic: "if the user is not KLOGed, run KLOG and ask for the password".
I do this using subprocess.Popen('klog'), and the password: prompt does come up in the terminal. However, when I type the password it is echoed back to the terminal and, worse, nothing happens when I hit return.
Edit:
After Alex's fast and correct response I solved the problem as follows:
I erased all the *.pyc files from my module's directory (yes - this made a difference)
I used getpass.getpass() to store the password in a local variable
I called the KLOG command with the -pipe option
I passed the locally-stored password to the pipe via the pipe's write method
Below is the corrected decorator:
import getpass
import subprocess

def require_klog(method):
    def require_klog_wrapper(*args, **kwargs):
        # run the ``tokens`` program to see if we have KLOG tokens
        out = subprocess.Popen('tokens', stdout=subprocess.PIPE)
        # the tokens (if any) are located in lines 4:n-1
        tokens_list = out.stdout.readlines()[3:-1]
        if tokens_list == []:
            # use a separate name so the wrapped function's *args are not
            # overwritten by the klog command line
            klog_args = ['klog', '-pipe']
            # this is the custom pwd prompt
            pwd = getpass.getpass('Type in your AFS password: ')
            pipe = subprocess.Popen(klog_args, stdin=subprocess.PIPE)
            # here is where the password is sent to the program
            pipe.stdin.write(pwd)
        return method(*args, **kwargs)
    return require_klog_wrapper
Apparently, your script (or something else it's spawning later) and the subprocess running klog are "competing" for the /dev/tty -- and the subprocess is losing (after all, you're not calling the wait method of the object returned from subprocess.Popen, to ensure you wait until it terminates before continuing, so a race condition of some kind would be hardly surprising).
If a wait does not suffice, I would work around this by putting
pwd = getpass.getpass('password:')
in the Python code (with an import getpass at the top of course), then running klog with the -pipe argument and stdin=subprocess.PIPE, and writing pwd to that pipe (with a call to the communicate method of the object returned from subprocess.Popen).
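Sketched out, that suggestion looks roughly like this (not the author's exact code; it assumes klog -pipe reads the password from standard input as described above):
import getpass
import subprocess

pwd = getpass.getpass('password:')
proc = subprocess.Popen(['klog', '-pipe'], stdin=subprocess.PIPE)
# communicate() writes the password, closes stdin and waits for klog to
# exit, avoiding the race on /dev/tty described above
proc.communicate(pwd.encode())
if proc.returncode != 0:
    print('klog exited with status %d' % proc.returncode)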
