Python subprocess not returning

I want to call a Python script from Jenkins and have it build my app, FTP it to the target, and run it.
I am trying to do the build step, and the subprocess command fails. I have tried this with both subprocess.call() and subprocess.Popen(), with the same result.
When I evaluate shellCommand and run it from the command line, the build succeeds.
Note that I have 3 shell commands: 1) remove the work directory, 2) create a fresh, empty work directory, then 3) build. The first two commands return from subprocess, but the third hangs (although it completes when run from the command line).
What am I doing wrong? Or, what alternatives do I have for executing that command?
# +=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=
def ExcecuteShellCommandAndGetReturnCode(arguments, shellCommand):
    try:
        process = subprocess.call(shellCommand, shell=True, stdout=subprocess.PIPE)
        #process.wait()
        return process #.returncode
    except KeyboardInterrupt, e: # Ctrl-C
        raise e
    except SystemExit, e: # sys.exit()
        raise e
    except Exception, e:
        print 'Exception while executing shell command : ' + shellCommand
        print str(e)
        traceback.print_exc()
        os._exit(1)
# +=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=
def BuildApplciation(arguments):
    # See http://gnuarmeclipse.github.io/advanced/headless-builds/
    jenkinsWorkspaceDirectory = arguments.eclipseworkspace + '/jenkins'

    shellCommand = 'rm -r ' + jenkinsWorkspaceDirectory
    ExcecuteShellCommandAndGetReturnCode(arguments, shellCommand)

    shellCommand = 'mkdir ' + jenkinsWorkspaceDirectory
    if not ExcecuteShellCommandAndGetReturnCode(arguments, shellCommand) == 0:
        print "Error making Jenkins work directory in Eclipse workspace : " + jenkinsWorkspaceDirectory
        return False

    application = 'org.eclipse.cdt.managedbuilder.core.headlessbuild'
    shellCommand = 'eclipse -nosplash -application ' + application + ' -import ' + arguments.buildRoot + '/../Project/ -build myAppApp/TargetRelease -cleanBuild myAppApp/TargetRelease -data ' + jenkinsWorkspaceDirectory + ' -D DO_APPTEST'
    if not ExcecuteShellCommandAndGetReturnCode(arguments, shellCommand) == 0:
        print "Error in build"
        return False

I Googled further and found this page, which says (section 1.2):
One way of gaining access to the output of the executed command would
be to use PIPE in the arguments stdout or stderr, but the child
process will block if it generates enough output to a pipe to fill up
the OS pipe buffer as the pipes are not being read from.
Sure enough, when I deleted the stdout=subprocess.PIPE argument from the code above, it worked as expected.
As I only want the exit code from the subprocess, the above code is enough for me. Read the linked page if you want the output of the command.
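For completeness: if you do need the command's output, the safe pattern is to let communicate() drain the pipes for you, so the OS pipe buffer can never fill up. A minimal sketch (not from the original post; the ls -l command is just a placeholder):

import subprocess

def run_and_capture(shell_command):
    # communicate() reads stdout/stderr while the child runs,
    # so the pipe buffer cannot fill up and block the child
    process = subprocess.Popen(shell_command, shell=True,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE,
                               universal_newlines=True)
    stdout, stderr = process.communicate()
    return process.returncode, stdout, stderr

rc, out, err = run_and_capture('ls -l')   # placeholder command
print(rc)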

Related

How to execute a non-blocking script in python and get its return code?

I am trying to execute a non-blocking bash script from python and to get its return code. Here is my function so far:
def run_bash_script(script_fullname, logfile):
    my_cmd = ". " + script_fullname + " >" + logfile + " 2>&1"
    p = subprocess.Popen(my_cmd, shell=True)
    os.waitpid(p.pid, 0)
    print(p.returncode)
As you can see, all the output is redirected into a log file, which I can monitor while the bash process is running.
However, the last line just prints 'None' instead of a useful exit code.
What am I doing wrong here?
You should use p.wait() rather than os.waitpid(). os.waitpid() is a low-level API that knows nothing about the Popen object, so it cannot update p; in particular, p.returncode stays None.
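For reference, the question's function rewritten with p.wait() (a minimal sketch, keeping the shell redirection as posted):

import subprocess

def run_bash_script(script_fullname, logfile):
    my_cmd = ". " + script_fullname + " >" + logfile + " 2>&1"
    p = subprocess.Popen(my_cmd, shell=True)
    p.wait()                 # reaps the child and sets p.returncode
    print(p.returncode)
    return p.returncode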

Running vulture from a python script

I'm trying to find a way to run vulture (which finds unused code in python projects) inside a python script.
vulture documentation can be found here:
https://pypi.org/project/vulture/
Does anyone know how to do it?
The only way I know to use vulture is by shell commands.
I tried to run the shell command from the script using the subprocess module, something like this:
process = subprocess.run(['vulture', '.'], check=True,
                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                         universal_newlines=True)
which I thought would have the same effect as running the shell command "vulture .",
but it doesn't work.
Can anyone help?
Thanks
Vulture dev here.
The Vulture package exposes an API, called scavenge - which it uses internally for running the analysis after parsing command line arguments (here in vulture.main).
It takes in a list of Python files/directories. For each directory, Vulture analyzes all contained *.py files.
To analyze the current directory:
import vulture
v = vulture.Vulture()
v.scavenge(['.'])
If you just want to print the results to stdout, you can call:
v.report()
However, it's also possible to perform custom analysis/filters over Vulture's results. The method vulture.get_unused_code returns a list of vulture.Item objects - which hold the name, type and location of unused code.
For the sake of this answer, I'm just gonna print the name of all unused objects:
for item in v.get_unused_code():
print(item.name)
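Putting the pieces above together into one minimal runnable sketch:

import vulture

v = vulture.Vulture()
v.scavenge(['.'])            # analyze all *.py files under the current directory
for item in v.get_unused_code():
    print(item.name)         # name of each unused object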
For more info, see - https://github.com/jendrikseipp/vulture
I see you want to capture the output shown on the console. The code below might help:
import tempfile
import subprocess

def run_command(args):
    # open the temp file in text mode so t.read() returns str, not bytes
    with tempfile.TemporaryFile(mode='w+') as t:
        try:
            out = subprocess.check_output(args, shell=True, stderr=t,
                                          universal_newlines=True)
            t.seek(0)
            console_output = '--- Provided Command: --- ' + '\n' + args + '\n' + t.read() + out + '\n'
            return_code = 0
        except subprocess.CalledProcessError as e:
            t.seek(0)
            console_output = '--- Provided Command: --- ' + '\n' + args + '\n' + t.read() + e.output + '\n'
            return_code = e.returncode
    return return_code, console_output
The expected console output will be returned in console_output.
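For example, a hypothetical call that runs Vulture through this helper (assuming vulture is on your PATH):

return_code, console_output = run_command('vulture .')
print(console_output)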
Link:
https://docs.python.org/3/library/subprocess.html

How to avoid displaying errors caused after running subprocess.call

When I run subprocess.call in Python, if the shell command produces error messages, I would like them not to be shown to the user.
So for instance,
for i in range(len(official_links)):
    if(subprocess.call('pacman ' + '-Qi ' + official_links[i].replace('https://www.archlinux.org/packages/?q=', ''), shell=True, stdout=subprocess.PIPE) == 0):
        print(official_links[i].replace('https://www.archlinux.org/packages/?q=', '') + ' installed')
    else:
        print(official_links[i].replace('https://www.archlinux.org/packages/?q=', '') + ' not installed')
The command pacman -Qi packagename checks whether packagename is already installed. When I run my script, if the package is installed, I get no extra messages from the shell, only what I print. But if the package does not exist and an error occurs, both the error and my print get printed on the screen.
Is there a way to avoid printing command errors too?
Thanks.
Redirect the stderr as well:
if(subprocess.call('pacman ' + '-Qi ' + official_links[i].replace('https://www.archlinux.org/packages/?q=', ''),shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE) == 0):
That's where the error is displayed.
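On Python 3.3+, subprocess.DEVNULL is arguably cleaner than PIPE for discarding output you never read. A sketch, with 'somepackage' as a placeholder name:

import subprocess

# discard both output streams instead of buffering them in pipes
rc = subprocess.call('pacman -Qi somepackage', shell=True,
                     stdout=subprocess.DEVNULL,
                     stderr=subprocess.DEVNULL)
print('installed' if rc == 0 else 'not installed')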

Python subprocess call rsync

I am trying to run rsync for each folder inside a folder.
__author__ = 'Alexander'

import os
import subprocess

root = '/data/shares'
arguments = ["--verbose", "--recursive", "--dry-run", "--human-readable", "--remove-source-files"]
remote_host = 'TL-AS203'

for folder in os.listdir(root):
    print 'Sync Team ' + folder.__str__()
    path = os.path.join(root, folder, 'in')
    if os.path.exists(path):
        folder_arguments = list(arguments)
        print (type(folder_arguments))
        folder_arguments.append("--log-file=" + path + "/rsync.log")
        folder_arguments.append(path)
        folder_arguments.append("transfer@" + remote_host + ":/data/shares/" + folder + "/out")
        print "running rsync with " + str(folder_arguments)
        returncode = subprocess.call(["rsync", str(folder_arguments)])
        if returncode == 0:
            print "pull successful"
        else:
            print "error during rsync pull"
    else:
        print "not a valid team folder, in not found"
If I run this I get the following output:
Sync Team IT-Systemberatung
<type 'list'>
running rsync with ['--verbose', '--recursive', '--dry-run', '--human-readable', '--remove-source-files', '--log-file=/data/shares/IT-Systemberatung/in/rsync.log', '/data/shares/IT-Systemberatung/in', 'transfer@TL-AS203:/data/shares/IT-Systemberatung/out']
rsync: change_dir "/data/shares/IT-Systemberatung/['--verbose', '--recursive', '--dry-run', '--human-readable', '--remove-source-files', '--log-file=/data/shares/IT-Systemberatung/in/rsync.log', '/data/shares/IT-Systemberatung/in', 'transfer@TL-AS203:/data/shares/IT-Systemberatung" failed: No such file or directory (2)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1040) [sender=3.0.4]
error during rsync pull
Sync Team IT-Applikationsbetrieb
not a valid team folder, in not found
transfer@INT-AS238:/data/shares/IT-Systemberatung
If I manually start rsync from bash with these arguments, everything works fine. I also tried it with shell=True, but with the same result.
You need to do:
returncode = subprocess.call(["rsync"] + folder_arguments)
Calling str() on a list returns the string representation of the Python list, which is not what you want to pass as an argument to rsync.
You do an os.chdir(os.path.join(root, folder)) but never go back.
In order to properly resume operation on the next folder, you should either remember the previous os.getcwd() and return to it, or just do os.chdir('..') at the end of each loop run.
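A sketch of the save-and-restore pattern (root and folder as in the question's loop; the rsync call itself is elided):

import os

previous_dir = os.getcwd()              # remember where we started
os.chdir(os.path.join(root, folder))
try:
    pass                                # ... run rsync here ...
finally:
    os.chdir(previous_dir)              # always return for the next iteration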

subprocess ssh command fails for some commands but not others (command works in terminal)

As part of a python script, I am hoping to capture the output of a shell command executed via ssh, namely
ssh User@999 screen -list
If I execute the above command directly in terminal, I get the results I need. However, when executing through subprocess.check_output as below, I get a non-zero exit status 1 error.
I am able to execute other commands via ssh and capture the output without problem.
Is there something specific about screen -list that does not like being called in this fashion?
import subprocess
srvr = 'User@999.99.999.9'
print("CMD 1: ===============")
cmd1 = "ssh " + srvr + " ls -l"
print ("COMMAND IS ..... " + cmd1 + "\n")
out1 = subprocess.check_output(cmd1, shell=True)
print(out1 + "\n")
print("CMD 2: ===============")
cmd2 = "ssh " + srvr + " screen -list"
print ("COMMAND IS ..... " + cmd2 + "\n")
out2 = subprocess.check_output(cmd2, shell=True)
print(out2 + "\n")
Error:
subprocess.CalledProcessError: Command '['ssh User@999.99.999.9 screen', '-list']' returned non-zero exit status 1
subprocess.check_output() checks the exit code of the subprocess and raises an exception if the exit code is not zero.
If you don't care about the exit code, use subprocess.Popen.communicate():
out1, err1 = subprocess.Popen(cmd1, shell=True,
                              stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE).communicate()
That's how subprocess.check_output() is supposed to work. See: http://docs.python.org/2/library/subprocess.html
The command on your server is returning a non-zero exit code, and check_output therefore raises the documented CalledProcessError.
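If you want the command's output even when it exits non-zero, CalledProcessError carries it. A sketch reusing cmd2 from the question:

import subprocess

try:
    out2 = subprocess.check_output(cmd2, shell=True,
                                   stderr=subprocess.STDOUT,
                                   universal_newlines=True)
except subprocess.CalledProcessError as e:
    out2 = e.output                     # the captured output survives on the exception
    print("exit status: " + str(e.returncode))
print(out2)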
