Getting git fetch output to a file through Python

I am trying to save the output of git fetch to a file through Python, using:
subprocess.check_output(["git", "fetch", "origin", ">>", "C:/bitbucket_backup/backup.log", "2>&1"], cwd='C:/bitbucket_backup/loopx')
but I believe something is missing in the subprocess.check_output args, because when I add >> C:/bitbucket_backup/backup.log 2>&1 I receive this error:
Traceback (most recent call last):
File "<pyshell#28>", line 1, in <module>
subprocess.check_output(["git", "fetch", "origin", ">>", "C://bitbucket_backup//backup.log", "2>&1"], cwd='C://bitbucket_backup//loopx')
File "C:\Users\fabio\AppData\Local\Programs\Python\Python36-32\lib\subprocess.py", line 336, in check_output
**kwargs).stdout
File "C:\Users\fabio\AppData\Local\Programs\Python\Python36-32\lib\subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['git', 'fetch', 'origin', '>>', 'C://bitbucket_backup//backup.log', '2>&1']' returned non-zero exit status 128.

Quick fix: enable shell features so that the redirection arguments are interpreted:
subprocess.check_output(["git", "fetch", "origin", ">>", "C:/bitbucket_backup/backup.log", "2>&1"], cwd='C:/bitbucket_backup/loopx', shell=True)
But that's dirty, as Python can do the same thing cleanly:
output = subprocess.check_output(["git", "fetch", "origin"], stderr=subprocess.STDOUT, cwd='C:/bitbucket_backup/loopx')
with open("C:/bitbucket_backup/backup.log","ab") as f: # append to file
f.write(output)
That said, if you're going to wrap many git commands in Python, you should probably use a Git API for Python, such as GitPython.
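The append-to-log pattern above can be wrapped in a small helper. This is only a sketch; run_and_log is a hypothetical name, and the stand-in command below just prints a line, but any argument list such as ["git", "fetch", "origin"] works the same way:

```python
import subprocess
import sys

def run_and_log(cmd, log_path, cwd=None):
    # Capture stdout and stderr together, like ">> file 2>&1" would.
    output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, cwd=cwd)
    with open(log_path, "ab") as f:  # append, keeping earlier runs
        f.write(output)
    return output

# Stand-in command; replace with ["git", "fetch", "origin"] and a real cwd.
out = run_and_log([sys.executable, "-c", "print('fetched')"], "backup.log")
```

check_output raises CalledProcessError on a non-zero exit status, so a failed fetch still surfaces as an exception rather than being silently logged.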

Related

Call 7zip from Python on Windows

I'm trying to figure out how to invoke 7zip, on Windows, from a Python program.
I'm trying:
stdout = subprocess.run(['C:\\Program Files\\7-Zip\\7z.exe', "a -t7z -mx0", "C:\\Users\\IanWo\\test.7z", "C:\\Users\\IanWo\\test.txt", "C:\\Users\\IanWo\\test2.txt"], shell=True, check=True, capture_output=True, text=True).stdout
print(stdout)
but am getting:
Traceback (most recent call last):
File "D:\Normal Backup\Code\ProcessRetrospectBackups\process.py", line 93, in <module>
stdout = subprocess.run(['C:\\Program Files\\7-Zip\\7z.exe', "a -t7z -mx0", "C:\\Users\\IanWo\\test.7z", "C:\\Users\\IanWo\\test.txt", "C:\\Users\\IanWo\\test2.txt"], shell=True, check=True, capture_output=True, text=True).stdout
File "C:\Users\IanWo\AppData\Local\Programs\Python\Python39\lib\subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['C:\\Program Files\\7-Zip\\7z.exe', 'a -t7z -mx0', 'C:\\Users\\IanWo\\test.7z', 'C:\\Users\\IanWo\\test.txt', 'C:\\Users\\IanWo\\test2.txt']' returned non-zero exit status 7.
It invokes 7z.exe fine as long as I don't have any arguments. I've tried with and without shell=True with no change.
@ThiefMaster is right: each argument must be its own list element. Here's the correct call:
stdout = subprocess.run(['C:\\Program Files\\7-Zip\\7z.exe', "a", "-t7z", "-mx0", "C:\\Users\\IanWo\\test.7z", "C:\\Users\\IanWo\\test.txt", "C:\\Users\\IanWo\\test2.txt"], shell=True, check=True, capture_output=True, text=True).stdout
print(stdout)
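The failure mode is worth spelling out: when a command is given as a list, each list element reaches the program as exactly one argument, so "a -t7z -mx0" arrived at 7z as a single unknown token (7-Zip reports exit status 7 for a command-line error). A small sketch of the difference, using shlex.split to build the list from a shell-style string:

```python
import shlex

# Wrong: one element containing spaces arrives at 7z as ONE argument.
bad = ["7z.exe", "a -t7z -mx0", "test.7z", "test.txt"]

# Right: one list element per argument.
good = ["7z.exe", "a", "-t7z", "-mx0", "test.7z", "test.txt"]

# shlex.split tokenizes a shell-style string the same way a shell would:
tokens = shlex.split("a -t7z -mx0 test.7z test.txt")
```

A list built this way can be passed straight to subprocess.run without shell=True, which also avoids quoting problems in paths like C:\Program Files\7-Zip.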

pyquibase error: subprocess.CalledProcessError

I am using Python and want to create a database source controller like Liquibase.
I found the Python wrapper for Liquibase, called pyquibase,
but I get a subprocess.CalledProcessError.
This is my simple code:
from pyquibase.pyquibase import Pyquibase
if __name__ == '__main__':
pyquibase = Pyquibase.sqlite('test.sqlite', 'db-changelog-1.xml')
pyquibase.update()
and I get this error:
Traceback (most recent call last):
File "/home/ali/dev/project/python/DatabaseSourceContoller/DatabaseSourceContoller/main.py", line 5, in <module>
pyquibase.update()
File "/home/ali/dev/project/python/DatabaseSourceContoller/venv/lib/python3.5/site-packages/pyquibase/pyquibase.py", line 69, in update
output = self.liquibase.execute(self.change_log_file, "update")
File "/home/ali/dev/project/python/DatabaseSourceContoller/venv/lib/python3.5/site-packages/pyquibase/liquibase_executor.py", line 103, in execute
shell = True
File "/usr/lib/python3.5/subprocess.py", line 316, in check_output
**kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 398, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command 'java -jar /home/ali/dev/project/python/DatabaseSourceContoller/venv/lib/python3.5/site-packages/pyquibase/liquibase/liquibase.jar --driver=org.sqlite.JDBC --classpath=/home/ali/dev/project/python/DatabaseSourceContoller/venv/lib/python3.5/site-packages/pyquibase/db-connectors/sqlite-jdbc-3.18.0.jar --changeLogFile=db-changelog-1.xml --url="jdbc:sqlite:test.sqlite" update' returned non-zero exit status 255
pyquibase forks a child process to execute the Liquibase changelog update, and the subprocess.CalledProcessError means that the update failed.
To find out why it failed, run the Liquibase command manually and read the actual error messages:
java -jar /home/ali/dev/project/python/DatabaseSourceContoller/venv/lib/python3.5/site-packages/pyquibase/liquibase/liquibase.jar --driver=org.sqlite.JDBC --classpath=/home/ali/dev/project/python/DatabaseSourceContoller/venv/lib/python3.5/site-packages/pyquibase/db-connectors/sqlite-jdbc-3.18.0.jar --changeLogFile=db-changelog-1.xml --url="jdbc:sqlite:test.sqlite" update
pyquibase doesn't print the actual error messages for you yet. The next version upgrade should have that feature.
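In the meantime you can recover the error text from Python itself: when stderr is merged into stdout, subprocess.CalledProcessError carries the child's captured output on its .output attribute. A sketch with a hypothetical failing command standing in for the Liquibase invocation:

```python
import subprocess
import sys

def run_with_diagnostics(cmd):
    # Merge stderr into stdout so the exception carries the error text.
    try:
        out = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
        return 0, out.decode()
    except subprocess.CalledProcessError as e:
        return e.returncode, e.output.decode()

# Hypothetical stand-in that fails like the liquibase jar (status 255).
rc, out = run_with_diagnostics(
    [sys.executable, "-c",
     "import sys; sys.stderr.write('changelog parse error\\n'); sys.exit(255)"])
```

Applied to the java -jar ... command above, this would print Liquibase's own complaint (e.g. a changelog parse problem) instead of just the exit status.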

Reading output from terminal using subprocess.run

I'm writing a Python script to parse a value from a JSON file, run a tool called Googler with a couple of arguments including the value from the JSON file, and then save the tool's output to a file (CSV preferred, but that's for another day).
So far the code is:
import json
import os
import subprocess
import time
with open("test.json") as json_file:
json_data = json.load(json_file)
test = (json_data["search_term1"]["apparel"]["biba"])
#os.system("googler -N -t d1 "+test) shows the output, but can't write to a file.
result= subprocess.run(["googler", "-N","-t","d1",test], stdout=subprocess.PIPE, universal_newlines=True)
print(result.stdout)
When I run the above script, nothing happens; the terminal just sits blank until I send a keyboard interrupt, at which point I get this error:
Traceback (most recent call last):
File "script.py", line 12, in <module>
result= subprocess.run(["googler", "-N","-t","d1",test], stdout=subprocess.PIPE, universal_newlines=True)
File "/usr/lib/python3.5/subprocess.py", line 695, in run
stdout, stderr = process.communicate(input, timeout=timeout)
File "/usr/lib/python3.5/subprocess.py", line 1059, in communicate
stdout = self.stdout.read()
KeyboardInterrupt
I tried replacing the test variable with a literal string; same error. The same line works with a command like ["ls", "-l", "/dev/null"].
How do I extract the output of this tool and write it to a file?
Your googler command works in interactive mode. It never exits, so your program is stuck.
You want googler to run the search, print the output and then exit.
From the docs, I think --np (or --noprompt) is the right parameter for that. I didn't test.
result = subprocess.run(["googler", "-N", "-t", "d1", "--np", test], stdout=subprocess.PIPE, universal_newlines=True)
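Once the process actually exits, writing the captured text to a file is plain Python. A sketch of the save step; the stand-in command below just prints two lines, where the real script would use the googler argument list, and the one-column CSV layout is an assumption to adapt to googler's actual output format:

```python
import csv
import subprocess
import sys

# Stand-in for the googler call; any command that prints lines fits here.
result = subprocess.run(
    [sys.executable, "-c", "print('result one'); print('result two')"],
    stdout=subprocess.PIPE, universal_newlines=True, check=True)

# One CSV row per output line (adapt the parsing to the tool's format).
with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for line in result.stdout.splitlines():
        writer.writerow([line])
```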

How does Jenkins deal with python scripts with Popen

We use Jenkins to run our cronjobs. We run CentOS 6.8 on our server. Jenkins is version 1.651.
I'm running into a funny problem. When I run my script from the terminal, it works fine. I don't get any errors.
When I run the same script in Jenkins, it fails and says there's no such file or directory.
The error message from the Jenkins output I get is this:
Traceback (most recent call last):
File "runMTTRScript.py", line 256, in <module>
main()
File "runMTTRScript.py", line 252, in main
startTest(start, end, impalaHost)
File "runMTTRScript.py", line 72, in startTest
getResults(start, end)
File "runMTTRScript.py", line 111, in getResults
proc1 = subprocess.Popen(cmd, stdout=processlistOut)
File "/glide/lib64/python2.7/subprocess.py", line 710, in __init__
errread, errwrite)
File "/glide/lib64/python2.7/subprocess.py", line 1335, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
Here's the code that the error above is complaining about:
with open(JAVAOUT + "_processList" + outfileDate, 'w') as processlistOut, \
open(JAVAOUT + "_innodb" + outfileDate, 'w') as innodbOut:
cmd = ["java", "-cp", "MTTR4hrs-1.0.5-SNAPSHOT-allinone.jar", "com.servicenow.bigdata.MTTR4hrs", "-c", "config.properties", "-m", DBIFILE, "-d", start, end, "-f", "processlist", "-ds", "dbi"]
proc1 = subprocess.Popen(cmd, stdout=processlistOut)
cmd = ["java", "-cp", "MTTR4hrs-1.0.5-SNAPSHOT-allinone.jar", "com.servicenow.bigdata.MTTR4hrs", "-c", "config.properties", "-m", DBIFILE, "-d", start, end, "-f", "engineinnodbstatus", "-ds", "dbi"]
proc2 = subprocess.Popen(cmd, stdout=innodbOut)
Why would it complain that a file is not there under Jenkins but be fine when I run it from the command line? Could it be some race condition in Python that I'm not aware of, i.e. the with ... open not opening the file fast enough for Popen to use it? I'm also open to the possibility that it's an OS problem (too many open files, something similarly mundane, etc.).
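This is usually not a race condition: when Popen itself raises OSError: [Errno 2] No such file or directory, it generally means the executable ("java" here) could not be found, and Jenkins jobs often run with a much smaller PATH than an interactive shell. One hedged way to harden the script is to resolve the executable up front. resolve_executable is a hypothetical helper and the fallback path is an assumption to adjust for your host; shutil.which needs Python 3.3+, while on the Python 2.7 shown in the traceback distutils.spawn.find_executable plays the same role:

```python
import shutil

def resolve_executable(name, fallback):
    # shutil.which searches PATH the way a shell would (Python 3.3+).
    path = shutil.which(name)
    if path is None:
        # Assumption: adjust this fallback for your host's java install.
        path = fallback
    return path

java = resolve_executable("java", "/usr/java/default/bin/java")
cmd = [java, "-cp", "MTTR4hrs-1.0.5-SNAPSHOT-allinone.jar",
       "com.servicenow.bigdata.MTTR4hrs"]
```

Printing os.environ["PATH"] from inside the Jenkins job is a quick way to confirm the difference against your interactive shell.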

Paramiko SSH python

I'm trying the simplest way to make an SSH connection and execute a command with Paramiko:
import paramiko, base64
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('10.50.0.150', username='XXXXX', password='XXXXXX')
stdin, stdout, stderr = client.exec_command('show performance -type host-io')
for line in stdout:
print '... ' + line.strip('\n')
client.close()
------------ERROR-----------------------
Traceback (most recent call last):
File "a.py", line 5, in <module>
stdin, stdout, stderr = client.exec_command('show performance -type host-io')
File "/usr/lib/python2.6/site-packages/paramiko-1.10.1-py2.6.egg/paramiko/client.py", line 374, in exec_command
chan.exec_command(command)
File "/usr/lib/python2.6/site-packages/paramiko-1.10.1-py2.6.egg/paramiko/channel.py", line 218, in exec_command
self._wait_for_event()
File "/usr/lib/python2.6/site-packages/paramiko-1.10.1-py2.6.egg/paramiko/channel.py", line 1122, in _wait_for_event
raise e
EOFError
If I run this code with a different command it works, and against another computer this command works fine via an interactive SSH shell.
Any ideas?
After client.connect(. . .) you need to open a session channel on the transport:
session = client.get_transport().open_session()
then call session.exec_command(. . .) on that session instead.
