Python os.system invalid syntax

Hi, I would like to execute the following command via the shell:
curl -g -d '{ "action": "block_count" }' [::1]:7076
However, when I insert it into an os.system call, I get a SyntaxError. What would be the right syntax?
#!/usr/bin/env python
import os
import json
aba = os.system('curl -g -d '{ "action": "block_count" }' [::1]:7076')
baba = json.loads(aba)

You could simply use a triple-quoted string literal, like:
os.system('''curl -g -d '{"action": "block_count"}' [::1]:7076''')
but even better, use the right tool for the job, i.e. requests:
import requests
data = requests.post('http://[::1]:7076', json={"action": "block_count"}).json()
If you insist on invoking curl directly, please use the subprocess module instead of the old and inflexible os.system (which is also unsafe for inputs that aren't strictly checked). You can use subprocess.check_output as a replacement in your case. There's no need to execute your curl command in a subshell, so you can split curl's arguments, like:
import json
import subprocess

output = subprocess.check_output(['curl', '-g', '-d', '{"action": "block_count"}', '-s', '[::1]:7076'])
data = json.loads(output)
Note that check_output returns the standard output of the executed command (unlike os.system, which returns the command's exit status), but it raises a CalledProcessError exception if the command fails with a non-zero status, or an OSError exception if the command is not found.
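As a quick sketch of those two failure modes, using `false` (which always exits non-zero) and a deliberately bogus command name as stand-ins for a failing or missing curl:

```python
import subprocess

# `false` always exits with status 1, so check_output raises CalledProcessError.
try:
    subprocess.check_output(["false"])
except subprocess.CalledProcessError as exc:
    print("command failed with status", exc.returncode)

# A nonexistent command raises OSError (FileNotFoundError on Python 3).
try:
    subprocess.check_output(["definitely-not-a-real-command"])
except OSError as exc:
    print("command not found:", type(exc).__name__)
```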

You need to escape the single quotes. So change this:
aba = os.system('curl -g -d '{ "action": "block_count" }' [::1]:7076')
to this:
aba = os.system('curl -g -d \'{ "action": "block_count" }\' [::1]:7076')
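Both the backslash-escaped form and the triple-quoted form from the first answer build exactly the same command string; only the Python source spelling differs, which you can verify before handing it to os.system:

```python
# Identical strings, two spellings: escaped single quotes vs. triple quotes.
escaped = 'curl -g -d \'{ "action": "block_count" }\' [::1]:7076'
triple = '''curl -g -d '{ "action": "block_count" }' [::1]:7076'''
print(escaped == triple)  # True
```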


How to run this tsduck shell command containing quotes with subprocess.run in Python

Here is the command as written in the tsduck manual:
tsp -I dvb -a 1 #ts1028.txt \
-P svremove -s AlJazeeraEnglish \
-P merge "tsp -I dvb -a 0 #ts1022.txt -P zap TV5MondeEurope" \
-P analyze -i 30 -o merged.txt \
-O dektec #modulation.txt
Here is my version:
import sys
import subprocess
mod_values = {"bandwidth": "8-mhz",
              "convolutional_rate": "7/8",
              "frequency": "578000000",
              "guard_interval": "1/4",
              "dmb_constellation": "64-QAM",
              "modulation": "DVB-T"}
tsterinfo_rate = subprocess.run(['tsterinfo',
                                 "-h", mod_values["convolutional_rate"],
                                 "-g", mod_values["guard_interval"],
                                 "-c", mod_values["dmb_constellation"],
                                 "-s"], stdout=subprocess.PIPE, universal_newlines=True)
mod_values["dvb_bitrate"] = tsterinfo_rate.stdout
infile = sys.argv[1]
run_tsp = subprocess.run(['tsp',
                          '--verbose',
                          '-b', mod_values["dvb_bitrate"],
                          '-I', 'null',
                          '-P', 'merge',
                          f'"tsp -I File {infile} --infinite"',
                          '-P', 'pcrbitrate',
                          '-P', 'regulate',
                          '-O', 'dektec',
                          '--frequency', mod_values["frequency"],
                          '--modulation', mod_values["modulation"],
                          '--guard-interval', mod_values["guard_interval"],
                          '--convolutional-rate', mod_values["convolutional_rate"],
                          '--dmb-constellation', mod_values["dmb_constellation"],
                          '-s'])
The quoted part of the command returns this error if I try keeping it as one full string, with double quotes surrounding my single quotes:
/bin/sh: 1: tsp -I File ../Videos/myts.ts --infinite: not found
Without the quotes at all, it errors saying there are too many inputs, the same as it does when typed straight into the terminal without quotes.
python 3.8.5, ubuntu 20.04
I found a few things wrong with my tsp command while working through this. The question of passing quotes through to the subprocess seems to be solved by using
shell=True
in the subprocess options. Then you can pass the command line as one big string rather than as a list.
My final script for taking a transport stream as an argument and creating a CBR output ready for modulating through a Dektec DTU-215 is this:
import sys
import subprocess
import json
# set modulation parameters in dict in order to reference once for bitrate calc and use again for modulator setup
mod_values = {"bandwidth": "8-mhz",
              "convolutional_rate": "7/8",
              "frequency": "578000000",
              "guard_interval": "1/4",
              "dmb_constellation": "64-QAM",
              "modulation": "DVB-T"}
# calculate modulated bitrate and add to dict
tsterinfo_rate = subprocess.run(['tsterinfo',
                                 "-h", mod_values["convolutional_rate"],
                                 "-g", mod_values["guard_interval"],
                                 "-c", mod_values["dmb_constellation"],
                                 "-s"], stdout=subprocess.PIPE, universal_newlines=True)
mod_values["dvb_bitrate"] = tsterinfo_rate.stdout.strip()
# first argument is input file transport stream
infile = sys.argv[1]
# use mediainfo to calculate bitrate of input ts (must be CBR)
infile_mediainfo = subprocess.run(["mediainfo",
                                   "--Output=JSON",
                                   infile],
                                  capture_output=True)
print(infile_mediainfo)
media_data = json.loads(infile_mediainfo.stdout)
ts_bitrate = int(media_data["media"]["track"][0]["OverallBitRate"])
print(f'ts_bitrate is: {ts_bitrate}')
# without -t option we don't have a PAT to even merge our stream with
# packet burst seems to make big difference to how smooth final playback is, default (16 according to docs) was jerky but 14 seems smooth
run_tsp = subprocess.run(f'tsp \
--verbose \
-d \
-b {mod_values["dvb_bitrate"]} \
-I null \
-P regulate --packet-burst 14 \
-P merge \
-t \
"tsp -I file {infile} \
--infinite \
-P regulate -b {ts_bitrate}" \
-O dektec \
-f {mod_values["frequency"]} \
-m {mod_values["modulation"]} \
-g {mod_values["guard_interval"]} \
-r {mod_values["convolutional_rate"]} \
--dmb-constellation {mod_values["dmb_constellation"]} \
-s', shell=True)
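One caveat with the shell=True approach: every value interpolated into that big f-string is re-parsed by the shell, so a filename containing a space or a shell metacharacter breaks the command. shlex.quote can protect each interpolated value, including the whole nested inner tsp command. A minimal sketch (the filename, bitrate, and output plugin below are illustrative, not from the script above):

```python
import shlex

infile = "input file.ts"   # a filename with a space, for demonstration
# Quote the filename inside the inner command, then quote the whole
# inner command when embedding it in the outer one.
inner = f"tsp -I file {shlex.quote(infile)} --infinite"
cmd = f"tsp --verbose -I null -P merge {shlex.quote(inner)} -O drop"
# The shell would split each level back into exactly the intended tokens:
print(shlex.split(inner))
```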

How to call jq written in shell with python subprocess?

I have two following shell scripts.
nodes.sh:
#!/bin/bash
NODE_IDs=$(docker node ls --format "{{.ID}}")
for NODE_ID in ${NODE_IDs}
do
docker node inspect $NODE_ID | jq -r '.[] | {node:.ID, ip:.Status.Addr}'
done | jq -s
nodes.sh gives the following output (run as ./nodes.sh or cat ./nodes.sh | bash):
[
{
"node": "b2d9g6i9yp5uj5k25h1ehp26e",
"ip": "192.168.1.123"
},
{
"node": "iy25xmeln0ns7onzg4jaofiwo",
"ip": "192.168.1.125"
}
]
node_detail.sh:
#!/bin/bash
docker node inspect b2d | jq '.[] | {node: .ID, ip: .Status.Addr}'
whereas node_detail.sh gives (./node_detail.sh or cat ./node_detail.sh | bash):
{
"node": "b2d9g6i9yp5uj5k25h1ehp26e",
"ip": "192.168.1.123"
}
Problem: I would like to run both scripts from Python's subprocess.
I can run node_detail.sh and get its output with the following code:
>>> import subprocess
>>> proc = subprocess.Popen('./node_detail.sh', stdout=subprocess.PIPE, shell=True)
>>> proc.stdout.read()
'{\n "node": "b2d9g6i9yp5uj5k25h1ehp26e",\n "ip": "192.168.1.123"\n}\n'
I wrote the following code to get output from nodes.sh:
>>> import subprocess
>>> proc = subprocess.Popen('./nodes.sh', stdout=subprocess.PIPE, shell=True)
Now I am getting the following error:
jq - commandline JSON processor [version 1.5-1-a5b5cbe]
Usage: jq [options] <jq filter> [file...]
jq is a tool for processing JSON inputs, applying the
given filter to its JSON text inputs and producing the
filter's results as JSON on standard output.
The simplest filter is ., which is the identity filter,
copying jq's input to its output unmodified (except for
formatting).
For more advanced filters see the jq(1) manpage ("man jq")
and/or https://stedolan.github.io/jq
Some of the options include:
-c compact instead of pretty-printed output;
-n use `null` as the single input value;
-e set the exit status code based on the output;
-s read (slurp) all inputs into an array; apply filter to it;
-r output raw strings, not JSON texts;
-R read raw strings, not JSON texts;
-C colorize JSON;
-M monochrome (don't colorize JSON);
-S sort keys of objects on output;
--tab use tabs for indentation;
--arg a v set variable $a to value <v>;
--argjson a v set variable $a to JSON value <v>;
--slurpfile a f set variable $a to an array of JSON texts read from <f>;
See the manpage for more options.
Error: writing output failed: Broken pipe
Error: writing output failed: Broken pipe
Why am I getting Error: writing output failed: Broken pipe?
In nodes.sh, rather than invoking jq -s without a filter, invoke it as jq -s . (the identity filter). jq 1.5 requires a filter argument, so the bare jq -s prints its usage message and exits immediately; the jq processes inside the loop then get a broken pipe because their reader is gone.
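On the Python side nothing else needs to change. A minimal sketch of running the script and parsing its JSON output; the docker/jq pipeline is stubbed out here with an echo of sample data so the sketch runs anywhere, so point it at ./nodes.sh in real use:

```python
import json
import subprocess

# Stub standing in for ["./nodes.sh"]; replace with the real script path.
sample = '[{"node": "b2d9g6i9yp5uj5k25h1ehp26e", "ip": "192.168.1.123"}]'
proc = subprocess.run(["echo", sample], stdout=subprocess.PIPE,
                      universal_newlines=True)
nodes = json.loads(proc.stdout)
for entry in nodes:
    print(entry["node"], entry["ip"])
```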

Why does this valid shell command throw an error in python via subprocess?

The line awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master) in bash returns my system's current volume (e.g. "97%").
I tried to incorporate this in Python 3
#!/usr/bin/env python3
import subprocess
command = "awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master)"
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()
print(output)
However, the shell returns
/bin/sh: 1: Syntax error: "(" unexpected
b''
Why does this fail and how do I fix my code?
As already pointed out, the syntax you are using is bash syntax (a so-called bashism). The default shell used by subprocess.Popen is /bin/sh, and it does not support process substitution.
You can specify the shell to be used via the executable argument.
Try this:
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, executable="/bin/bash").stdout.read()
Because you are using a bashism in the form of a process substitution, and your /bin/sh doesn't support it:
<(...)
Changing this to a pipe should solve your problem:
command = "amixer sget Master | awk -F'[][]' '/dB/ { print $2 }'"
Alternative you can start bash from within sh:
command = "bash -c 'amixer sget Master | awk -F'\\''[][]'\\'' '\\''/dB/ { print $2 }'\\'''"
But as you will soon realize, quoting and escaping will quickly become a nightmare.
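A third option avoids the shell entirely: build the pipe in Python by connecting two Popen objects. This sketch replaces amixer with an echo of one sample amixer output line so it runs anywhere; swap in ["amixer", "sget", "Master"] on a real system:

```python
import subprocess

# Stand-in for ["amixer", "sget", "Master"]: emit one sample amixer line.
p1 = subprocess.Popen(
    ["echo", "Front Left: Playback 63409 [97%] [-0.50dB] [on]"],
    stdout=subprocess.PIPE)
# awk reads p1's stdout directly: no shell, no quoting headaches.
p2 = subprocess.Popen(["awk", "-F[][]", "/dB/ { print $2 }"],
                      stdin=p1.stdout, stdout=subprocess.PIPE,
                      universal_newlines=True)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early
volume = p2.communicate()[0].strip()
print(volume)  # 97%
```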

freebsd pw adduser command and python file descriptors

I'm trying to write a system-management script in Python 2.7 on FreeBSD, and I'm stuck trying to programmatically set a user's password when adding them. I'm using the FreeBSD pw command, whose -h flag accepts a file descriptor as an argument.
The route I was taking uses Python's subprocess module, but I seem to be getting stuck: Python treats everything as strings, while the pw -h option expects a file descriptor.
The command I'm trying to run is:
/usr/sbin/pw useradd foobar2 -C /usr/local/etc/bsdmanage/etc/pw.conf -m -c "BSDmanage foobar2 user" -G foobar2-www -h
I'm doing this via:
objTempPassFile = open(strTempDir + 'foobar.txt', 'w+')
objTempPassFile.write(strTempPass)
objTempPassFile.seek(0)
listCmdArgs = shlex.split(strPwUserCmd)
processUser = subprocess.Popen(listCmdArgs,stdin=objTempPassFile.fileno(),stdout=subprocess.PIPE,stderr=subprocess.PIPE)
strOutPut, strErrorValue = processUser.communicate()
where strPwUserCmd is the above pw command and strTempPass is just a string.
I also tried passing the password string as an argument to Popen.communicate() while changing stdin to stdin=subprocess.PIPE.
I also tried using a StringIO object. However, passing that either raises errors about it not being a valid I/O object, or the pw command fails and doesn't see any argument passed to the -h switch.
FreeBSD pw manpage
Any ideas? Thanks.
So: if you pass the -h 0 flag to pw, it reads the password from file descriptor 0 (stdin), and then you can just use process.communicate(string) to pass the password in.
So, like this:
/usr/sbin/pw useradd foobar2 -C /usr/local/etc/bsdmanage/etc/pw.conf -m -c "BSDmanage foobar2 user" -G foobar2-www -h 0
as the command string. Then call that via:
import shlex
import subprocess

listCmdArgs = shlex.split(strPwUserCmd)
processUser = subprocess.Popen(listCmdArgs, stdin=subprocess.PIPE,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE)
strOutPut, strErrorValue = processUser.communicate(strTempPass)
and have strTempPass be the password string. strPwUserCmd is the above 'pw' command string.
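The same stdin-feeding pattern can be sketched in a form that runs anywhere, with cat standing in for the pw command (cat echoes its stdin back, where pw would consume the password silently):

```python
import shlex
import subprocess

# `cat` stands in for "/usr/sbin/pw useradd ... -h 0" here.
listCmdArgs = shlex.split("cat")
proc = subprocess.Popen(listCmdArgs, stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True)
out, err = proc.communicate("s3cret\n")
print(out)
```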

python sub-process

I usually execute a Fortran file in Linux (manually) as follows:
1. Connect to the server
2. Go to the specific folder
3. ifort xxx.for -o xxx && ./xxx (where xxx.for is my Fortran file and xxx is the Fortran executable)
But I need to call my Fortran file (xxx.for) from Python (I'm a beginner), so I used subprocess with the following command:
cmd = ["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
But I get an error, and I'm not sure what's wrong. Here's the full code:
import string
import subprocess as subProc
from subprocess import Popen as ProcOpen
from subprocess import PIPE
import numpy
import subprocess
userID = "pear"
serverName = "say4"
workDir = "/home/pear/2/W/fortran/"
Fortrancmd = "ifort"
jobname = "rad.for"
exeFilename = "rad"
sshConnect=userID+"#"+servername
cmd=["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
#command to execute fortran files in Linux
#ifort <filename>.for -o <filename> && ./<filename> (press enter)
#example: ifort xxx.for -o xxx && ./xxx (press enter)
print cmd
How can I write a python program that performs all 3 steps described above and avoids the error I'm getting?
There are some syntax errors...
original:
cmd=["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
I think you mean:
cmd = [
"ssh",
sshConnect,
"cd %s;" % (workDir,),
"%s %s -o %s && ./%s" % (Fortrancmd, jobname, exeFilename, exeFilename)
]
A few notes:
a tuple with one element requires a trailing comma, as in (workDir,), to be interpreted as a tuple (vs. simple order-of-operations parens)
it is probably easier to construct your Fortran command with a single string-format operation
PS - For readability it is often a good idea to break long lists into multiple lines :)
my advice
I would recommend looking at this stackoverflow thread for ssh instead of using subprocess
For the manual part you may want to look into pexpect or for windows wexpect. These allow you to perform subprocesses and pass input under interactive conditions.
However most of what you're doing sounds like it would work well in a shell script. For simplicity, you could make a shell script on the server side for your server side operations, and then plug in the path in the ssh statement:
ssh user@host "/path/to/script.sh"
one error:
you have an unquoted %s in your list of args, so your string formatting will fail.
Here is a complete example of using the subprocess module to run a remote command via ssh (a simple echo in this case) and grab the results, hope it helps:
>>> import subprocess
>>> proc = subprocess.Popen(("ssh", "remoteuser@host", "echo", "1"), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> stdout, stderr = proc.communicate()
Which in this case returns: ('1\n', '')
Note that to get this to work without requiring a password you will likely have to add your local user's public key to ~remoteuser/.ssh/authorized_keys on the remote machine.
You could use fabric for steps 1 and 2.
This is the basic idea:
from fabric.api import *
env.hosts = ['host']
dir = '/home/...'
def compile(file):
    with cd(dir):
        run("ifort %s.for -o %s" % (file, file))
        run("./%s > stdout.txt" % file)
Save this as fabfile.py, then run it with fab compile:filename.
Do you have to use Python?
ssh user@host "command"
