Python Fabric won't pass in variable

I had a script that was working. I made one small change and now it stopped working. The top version works, while the bottom one fails.
def makelocalconfig(file="TEXT"):
    host = env.host_string
    filename = file
    conf = open('/home/myuser/verify_yslog_conf/%s/%s' % (host, filename), 'r')
    comment = open('/home/myuser/verify_yslog_conf/%s/localconfig.txt' % host, 'w')
    for line in conf:
        comment.write(line)
    comment.close()
    conf.close()
def makelocalconfig(file="TEXT"):
    host = env.host_string
    filename = file
    path = host + "/" + filename
    pwd = local("pwd")
    conf = open('%s/%s' % (pwd, path), 'r')
    comment = open('%s/%s/localconfig.txt' % (pwd, host), 'w')
    for line in conf:
        comment.write(line)
    comment.close()
    conf.close()
For troubleshooting purposes I added print pwd and print path lines to make sure the variables were getting filled correctly. pwd comes up empty. Why isn't this variable being set correctly? I use this same format of
var = sudo("cmd")
all the time. Is local different from sudo and run?

In short, you may need to add capture=True:
pwd = local("pwd", capture=True)
local runs a command locally:
a convenience wrapper around the use of the builtin Python subprocess
module with shell=True activated.
run runs a command on a remote server and sudo runs a remote command as super-user.
There is also a note in the documentation:
local is not currently capable of simultaneously printing and capturing output, as run/sudo do. The capture kwarg allows you to switch between printing and capturing as necessary, and defaults to False.
When capture=False, the local subprocess’ stdout and stderr streams are hooked up directly to your terminal, though you may use the global output controls output.stdout and output.stderr to hide one or both if desired. In this mode, the return value’s stdout/stderr values are always empty.
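Applied to the failing function above, a minimal sketch of the fix (assuming the usual from fabric.api import env, local) would be:

from fabric.api import env, local

def makelocalconfig(file="TEXT"):
    host = env.host_string
    path = host + "/" + file
    pwd = local("pwd", capture=True)  # capture=True makes local() return the command's output
    conf = open('%s/%s' % (pwd, path), 'r')
    comment = open('%s/%s/localconfig.txt' % (pwd, host), 'w')
    for line in conf:
        comment.write(line)
    comment.close()
    conf.close()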

Related

Redirecting Output From a Program to a File with Python: Specific Bug

I've been trying to run a Java program and capture its STDOUT output to a file from a Python script. The idea is to run test files through my program and check whether the output matches the answers.
Per this and this SO question, using subprocess.call is the way to go. In the code below, I am doing subprocess.call(command, stdout=f) where f is the file I opened.
The resulting file is empty and I can't quite understand why.
import glob
import subprocess

test_path = '/path/to/my/testfiles/'
class_path = '/path/to/classfiles/'
jar_path = '/path/to/external_jar/'
test_pattern = 'test_case*'
temp_file = 'res'

tests = glob.glob(test_path + test_pattern)  # find all test files
for i, tc in enumerate(tests):
    with open(test_path + temp_file, 'w') as f:
        # cd into the directory where the class files are and run the program
        command = ('cd {p} ; java -cp {cp} package.MyProgram {tc_p}'
                   .format(p=class_path, cp=jar_path, tc_p=test_path + tc))
        # execute the command and direct all STDOUT to the file
        subprocess.call(command.split(), stdout=f, stderr=subprocess.STDOUT)
    # diff is just a lambda func that uses os.system('diff')
    exec_code = diff(answers[i], test_path + temp_file)
    if exec_code == BAD:
        scream(':(')
I checked the docs for subprocess and they recommended using subprocess.run (added in Python 3.5). The run method returns an instance of CompletedProcess, which has a stdout field. I inspected it and the stdout was an empty string. This explained why the file f I tried to create was empty.
Even though the exit code from subprocess.call was 0 (success), it didn't mean that my Java program had actually been executed. I ended up fixing this bug by breaking command down into two parts.
If you notice, I initially tried to cd into the correct directory and then execute the Java file, all in one command. I ended up removing cd from command and doing os.chdir(class_path) instead. The command now contained only the string to run the Java program. This did the trick.
So, the code looked like this:
import os
import subprocess

good_code = 0
# Assume the same variables defined as in the original question
os.chdir(class_path)  # get into the class files directory first
for i, tc in enumerate(tests):
    with open(test_path + temp_file, 'w') as f:
        # run the program
        command = ('java -cp {cp} package.MyProgram {tc_p}'
                   .format(cp=jar_path, tc_p=test_path + tc))
        # runs the command, redirects its output into the file f,
        # and stores the resulting CompletedProcess instance
        out = subprocess.run(command.split(), stdout=f)
    # you can access useful info now
    assert out.returncode == good_code
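As a side note, subprocess.run also accepts a cwd argument, so the same effect can be had without changing the interpreter's global working directory. A sketch of that variant, reusing the variables above:

# inside the same `with open(...) as f:` block as above
out = subprocess.run(
    ['java', '-cp', jar_path, 'package.MyProgram', test_path + tc],
    stdout=f,
    cwd=class_path,  # the subprocess runs here; the parent process is unaffected
)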

running command before other finished in python

I asked this already and a few people gave good advice, but there were too many unknowns for me as a beginner. Therefore I decided to ask for help again without giving bad code.
I need a script that copies files to a directory while another command is still running.
Basically, I run the first command; it generates files (until the user presses Enter) and then those files are gone (automatically removed).
What I would like is to have those files copied (without having to press "Enter" as well).
I did this in bash, but I would like to achieve it in Python. Please see below:
while kill -0 $! 2>/dev/null;do
cp -v /tmp/directory/* /tmp/
done
If the first script is purely a command-line one, it should be fully manageable with a Python script.
General architecture:
the Python script starts the first one with the subprocess module
it reads output from the first script until it gets the message asking to press Enter
it copies all files from the source directory to the destination directory
it sends \r to the first script's input
it waits until the first script terminates
it exits
General requirements:
the first script must be a purely CLI one
the first script must write to standard output/error and read from standard input; if it reads from or writes to a physical terminal (/dev/tty on Unix/Linux or con: on DOS/Windows), it won't work
the end of processing must be identifiable in standard output/error
if the two requirements above are not met, the only way would be to wait a defined amount of time
Optional operation:
if there are other interactions in the first script (reads and/or writes), it will be necessary to add the redirections in the script; it is certainly feasible, but will be a little harder
Configuration:
the command to be run
the string (from the command's output) that indicates the first program has finished processing
the source directory
the destination directory
a pattern for the file names to be copied
if a delay is defined and there is no identifiable string in the output: the delay to wait before copying
A script like that should be simple to write and test, and able to manage the first script as you want.
Edit: here is an example of such a script, still without timeout management.
import subprocess
import os
import shutil
import re

# default values for command execution - to be configured at installation
defCommand = "test.bat"
defEnd = "Appuyez"
defSource = "."
defDest = ".."
# BEWARE : pattern is in regex format !
defPattern = r"x.*\.txt"

class Launcher(object):
    '''
    Helper to launch a command, wait for a defined string from stderr or stdout
    of the command, copy files from a source folder to a destination folder,
    and write a newline to the stdin of the command.
    Limits : uses blocking IO without timeout'''

    def __init__(self, command=defCommand, end=defEnd, source=defSource,
                 dest=defDest, pattern=defPattern):
        self.command = command
        self.end = end
        self.source = source
        self.dest = dest
        self.pattern = pattern

    def start(self):
        'Actually starts the command and copies the files'
        found = False
        pipes = os.pipe()  # use explicit pipes to mix stdout and stderr
        rx = re.compile(self.pattern)
        cmd = subprocess.Popen(self.command, shell=True, stdin=subprocess.PIPE,
                               stdout=pipes[1], stderr=pipes[1])
        os.close(pipes[1])
        while True:
            txt = os.read(pipes[0], 1024)
            if not txt:  # EOF: the command ended without printing the end string
                break
            # print(txt)  # for debug
            if str(txt).find(self.end) != -1:
                found = True
                break
        # only try to copy files if the end string was found
        if found:
            for file in os.listdir(self.source):
                if rx.match(file):
                    shutil.copy(os.path.join(self.source, file), self.dest)
                    print("Copied : %s" % (file,))
        # copy done : write the newline to the command's input
        cmd.stdin.write(b"\n")
        cmd.stdin.close()
        try:
            cmd.wait()
            print("Command terminated with %d status" % (cmd.returncode,))
        except:
            print("Calling terminate ...")
            cmd.terminate()
        os.close(pipes[0])
# allows the file to be used either as an imported module or directly as a script
if __name__ == '__main__':
    # parse optional parameters
    import argparse
    parser = argparse.ArgumentParser(description='Launch a command and copy files')
    parser.add_argument('--command', '-c', default=defCommand,
                        help="full text of the command to launch")
    parser.add_argument('--endString', '-e', default=defEnd,
                        dest="end",
                        help="string that denotes that the command has finished processing")
    parser.add_argument('--source', '-s', default=defSource,
                        help="source folder")
    parser.add_argument('--dest', '-d', default=defDest,
                        help="destination folder")
    parser.add_argument('--pattern', '-p', default=defPattern,
                        help="pattern (regex format) for files to be copied")
    args = parser.parse_args()
    # create and start a Launcher ...
    launcher = Launcher(args.command, args.end, args.source, args.dest,
                        args.pattern)
    launcher.start()
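Used as a module instead of a script, the same class can be driven directly. A hypothetical example (the module name launcher and the command strings are assumptions, not part of the original setup):

from launcher import Launcher  # assuming the script above was saved as launcher.py

# copy any x*.txt file from /tmp/directory to /tmp while first_prog runs
Launcher(command="./first_prog",
         end="Press Enter",
         source="/tmp/directory",
         dest="/tmp",
         pattern=r"x.*\.txt").start()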

python 3 open terminal and run program

I made a small script in Sublime that extracts commands from a JSON file on the user's computer and then opens the terminal and runs the settings/command. This works, except that it doesn't really open up the terminal. It only runs the command (and it works, as in my case it will run gcc to compile a simple C file) and pipes to STDOUT without opening up the terminal.
import json
import subprocess
import sublime_plugin

class CompilerCommand(sublime_plugin.TextCommand):
    def get_dir(self, fullpath):
        path = fullpath.split("\\")
        path.pop()
        path = "\\".join(path)
        return path

    def get_settings(self, path):
        _settings_path = path + "\\compiler_settings.json"
        return json.loads(open(_settings_path).read())

    def run(self, edit):
        _path = self.get_dir(self.view.file_name())
        _settings = self.get_settings(_path)
        _driver = _path.split("\\")[0]
        _command = _driver + " && cd " + _path + " && " + _settings["compile"] + " && " + _settings["exec"]
        proc = subprocess.Popen(_command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
I'm not sure if using subprocess.Popen is the right way to go about it as I'm new to Python.
So to reiterate: I want it to open up the terminal, run the command, and have the terminal stay open until the user presses ENTER or something. I'm running Windows 7 and Python 3, if that matters.
subprocess.Popen simply creates a subprocess with the given command. It is in no way related to opening a terminal window or any other windows for that matter.
You'll have to look into your platform specific UI automation solutions in order to achieve what you want. Or see if maybe the Sublime plugins mechanism can already do that.
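That said, on Windows you can get close by letting cmd.exe itself open a new console window and keep it alive afterwards. A rough sketch (the compile-and-run string is a stand-in for the generated _command):

import subprocess

command = 'gcc main.c -o main.exe && main.exe'  # hypothetical compile-and-run command
# "start" opens a new console window; "cmd /k" runs the command there and
# keeps the window open afterwards so the user can read the output
subprocess.Popen('start cmd /k "{}"'.format(command), shell=True)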
NOTES:
Also, you should be using os.path.join/os.path.split/os.path.sep etc. for your path operations; Sublime also runs on OS X, for example, and OS X does not use backslashes. In addition, file handles need to be closed, so use:
with open(...) as f:
    return json.load(f)  # note that there is no need for f.read() + json.loads()
                         # when you can just json.load() the file handle
Furthermore, strings should usually be built using string interpolation:
_command = "{} && cd {} && {} && {}".format(_driver, _path, _settings["compile"], _settings["exec"])
...and you should not be prefixing your local variables with _; it doesn't look nice and serves no purpose in Python either. While we're at it, I might as well use the chance to recommend reading PEP 8: http://www.python.org/dev/peps/pep-0008/.

In a pre-commit hook - how to access/compare current and previous versions of files

I'm trying to add to our existing pre-commit SVN hook so that it will check for and block an increase in file size for files in specific directories.
I've written a Python script to compare two file sizes, which takes two files as arguments and uses sys.exit(0) or (1) to return the result; this part seems to work fine.
My problem is in calling the Python script from the batch file: how do I reference the newly committed and previous versions of each file? The existing code is new to me and a mess of %REPOS%, %TXN%s, etc., and I'm not sure how to go about using them. Is there a simple, standard way of doing this?
It also already contains code to loop through the changed files using svnlook changed, so that part shouldn't be an issue.
Thanks very much
If comparing file sizes is all you need to do, look no further than the svnlook filesize command. The default invocation, svnlook filesize repo path, will give you the size of the HEAD revision of path. To get the size of the path in the incoming commit, use svnlook filesize repo path -t txn, where txn is the transaction name the hook receives as its second argument (argv[2]).
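For instance, a minimal size check along those lines might look like this (the path is hypothetical; repo and txn are the two arguments every pre-commit hook receives):

import sys
from subprocess import check_output

repo, txn = sys.argv[1], sys.argv[2]
path = 'trunk/data/big_file.bin'  # hypothetical path to guard

# size in HEAD vs. size in the incoming transaction
head_size = int(check_output(['svnlook', 'filesize', repo, path]))
new_size = int(check_output(['svnlook', 'filesize', repo, path, '-t', txn]))
if new_size > head_size:
    sys.stderr.write('%s grew from %d to %d bytes\n' % (path, head_size, new_size))
    sys.exit(1)  # a non-zero exit status blocks the commit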
Still, here is an example of listing all revisions of a versioned path (except the incoming one, since this is a pre-commit hook).
#!/usr/bin/env python
from sys import argv, stderr, exit
from subprocess import check_output

repo = argv[1]
transaction = argv[2]

def path_history(path, limit=5):
    path = '/%s' % path
    cmd = ('svnlook', 'history', '-l', str(limit), repo, path)
    out = check_output(cmd).splitlines()[2:]
    for rev, _path in (i.split() for i in out):
        if _path == path:
            yield rev

def commit_changes():
    cmd = ('svnlook', 'changed', repo, '-t', transaction)
    out = check_output(cmd).splitlines()
    for line in out:
        yield line.split()

def filesize(path, rev=None, trans=None):
    cmd = ['svnlook', 'filesize', repo, path]
    if rev:
        cmd.extend(('-r', str(rev)))
    elif trans:
        cmd.extend(('-t', str(trans)))
    out = check_output(cmd)
    return out.rstrip()

def filesize_catwc(path, rev=None, trans=None):
    '''A `svnlook filesize` substitute for older versions of svn.
    Uses `svnlook cat ... | wc -c` and should be very inefficient
    for large files.'''
    arg = '-r %s' % rev if rev else '-t %s' % trans
    cmd = 'svnlook cat %s %s %s | wc -c' % (arg, repo, path)
    out = check_output(cmd, shell=True)
    return out.rstrip()

for status, path in commit_changes():
    if status in ('A', 'M', 'U'):
        # get the last 5 revisions of the added/modified path
        revisions = list(path_history(path))
        headrev = revisions[0]
        oldsize = filesize(path, rev=headrev)
        newsize = filesize(path, trans=transaction)
It is probably easier to write the whole pre-commit script in Python. According to the Subversion handbook, there are three inputs to pre-commit:
Two command line arguments
repository path
commit transaction name
lock-token info on standard input
If you want to know which files have changed, I suggest you use the subprocess.check_output() function to call svnlook changed. For the files whose contents have changed, you should call svnlook filesize to get the size of the file as it is in the last revision in the repository. The size of the equivalent file in the working directory you'd have to query with os.stat(), as shown in the function getsizes().
import subprocess
import sys
import os

repo = sys.argv[1]
commit_name = sys.argv[2]

def getsizes(rname, rfile):
    '''Get the size of the file in rfile from the repository rname.
    Derive the filename in the working directory from rfile, and use
    os.stat to get the filesize. Return the two sizes.
    '''
    localname = rfile[10:].strip()  # 'U   trunk/foo/bar.txt' -> 'foo/bar.txt'
    reponame = rfile[4:].strip()    # 'U   trunk/foo/bar.txt' -> 'trunk/foo/bar.txt'
    reposize = subprocess.check_output(['svnlook', 'filesize', rname, reponame],
                                       universal_newlines=True)
    reposize = int(reposize)
    statinfo = os.stat(localname)
    return (reposize, statinfo.st_size)

# use -t to look at the incoming transaction rather than the latest revision
lines = subprocess.check_output(['svnlook', 'changed', repo, '-t', commit_name],
                                universal_newlines=True).splitlines()
for line in lines:
    if line.startswith('U ') or line.startswith('UU'):
        # file contents have changed
        reposize, wdsize = getsizes(repo, line)
        # do something with the sizes here...
    elif line.startswith('_U'):
        # properties have changed
        pass

Advanced Python FTP - can I control how ftplib talks to a server?

I need to send a very specific (non-standard) string to an FTP server:
dir "SYS:\IC.ICAMA."
The case is critical, as are the style of quotes and their content.
Unfortunately, ftplib.dir() seems to use the 'LIST' command rather than 'dir' (and it uses the wrong case for this application).
The FTP server is actually a telephone switch and it's a very non-standard implementation.
I tried using ftplib.sendcmd(), but it also sends 'pasv' as part of the command sequence.
Is there an easy way of issuing specific commands to an FTP server?
Try the following. It is a modification of the original FTP.dir method that sends "dir" instead of "LIST". It gives a "DIR not understood" error with the FTP server I tested it on, but it does send the command you're after. (You will want to remove the print call I used to check that.)
import ftplib

class FTP(ftplib.FTP):
    def shim_dir(self, *args):
        '''List a directory in long form.
        By default list current directory to stdout.
        Optional last argument is callback function; all
        non-empty arguments before it are concatenated to the
        LIST command. (This *should* only be used for a pathname.)'''
        cmd = 'dir'
        func = None
        if args[-1:] and not isinstance(args[-1], str):
            args, func = args[:-1], args[-1]
        for arg in args:
            if arg:
                cmd = cmd + (' ' + arg)
        print(cmd)
        self.retrlines(cmd, func)

if __name__ == '__main__':
    f = FTP('ftp.ncbi.nih.gov')
    f.login()
    f.shim_dir('"blast"')
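If the switch happens to return its reply over the control connection rather than a data connection, plain FTP.sendcmd may be all that is needed. A hypothetical probe (host and credentials are placeholders):

import ftplib

f = ftplib.FTP('switch.example.com')  # placeholder host
f.login('user', 'password')           # placeholder credentials
# sendcmd sends a single raw command on the control connection and
# returns the server's response; no PASV/data connection is involved
print(f.sendcmd('dir "SYS:\\IC.ICAMA."'))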

Categories

Resources