How to call jq written in shell with python subprocess?

I have the following two shell scripts.
nodes.sh:
#!/bin/bash
NODE_IDs=$(docker node ls --format "{{.ID}}")
for NODE_ID in ${NODE_IDs}
do
docker node inspect $NODE_ID | jq -r '.[] | {node:.ID, ip:.Status.Addr}'
done | jq -s
nodes.sh gives the following output (run with ./nodes.sh or cat ./nodes.sh | bash):
[
  {
    "node": "b2d9g6i9yp5uj5k25h1ehp26e",
    "ip": "192.168.1.123"
  },
  {
    "node": "iy25xmeln0ns7onzg4jaofiwo",
    "ip": "192.168.1.125"
  }
]
node_detail.sh:
#!/bin/bash
docker node inspect b2d | jq '.[] | {node: .ID, ip: .Status.Addr}'
whereas node_detail.sh gives (run with ./node_detail.sh or cat ./node_detail.sh | bash):
{
  "node": "b2d9g6i9yp5uj5k25h1ehp26e",
  "ip": "192.168.1.123"
}
Problem: I would like to run both scripts from Python subprocess.
I can run and get output for node_detail.sh with the following code:
>>> import subprocess
>>> proc = subprocess.Popen('./node_detail.sh', stdout=subprocess.PIPE, shell=True)
>>> proc.stdout.read()
'{\n "node": "b2d9g6i9yp5uj5k25h1ehp26e",\n "ip": "192.168.1.123"\n}\n'
I wrote the following code to get output from nodes.sh:
>>> import subprocess
>>> proc = subprocess.Popen('./nodes.sh', stdout=subprocess.PIPE, shell=True)
Now I am getting the following error:
>>> jq - commandline JSON processor [version 1.5-1-a5b5cbe]
Usage: jq [options] <jq filter> [file...]
jq is a tool for processing JSON inputs, applying the
given filter to its JSON text inputs and producing the
filter's results as JSON on standard output.
The simplest filter is ., which is the identity filter,
copying jq's input to its output unmodified (except for
formatting).
For more advanced filters see the jq(1) manpage ("man jq")
and/or https://stedolan.github.io/jq
Some of the options include:
-c compact instead of pretty-printed output;
-n use `null` as the single input value;
-e set the exit status code based on the output;
-s read (slurp) all inputs into an array; apply filter to it;
-r output raw strings, not JSON texts;
-R read raw strings, not JSON texts;
-C colorize JSON;
-M monochrome (don't colorize JSON);
-S sort keys of objects on output;
--tab use tabs for indentation;
--arg a v set variable $a to value <v>;
--argjson a v set variable $a to JSON value <v>;
--slurpfile a f set variable $a to an array of JSON texts read from <f>;
See the manpage for more options.
Error: writing output failed: Broken pipe
Error: writing output failed: Broken pipe
Why am I getting Error: writing output failed: Broken pipe?

In nodes.sh, rather than invoking jq -s without a filter, invoke it with an explicit identity filter: jq -s .
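With that change (the last line of nodes.sh reading done | jq -s .), the script can be driven from Python just like node_detail.sh. A minimal sketch, assuming the script sits in the current directory and Python 3.6+ is used (json.loads accepts the bytes Popen returns):
import json, subprocess
# Run the corrected nodes.sh and capture its stdout; communicate() waits
# for the script to finish and drains the pipe.
proc = subprocess.Popen('./nodes.sh', stdout=subprocess.PIPE, shell=True)
out, _ = proc.communicate()
# The script prints a JSON array of {node, ip} objects.
nodes = json.loads(out)
for node in nodes:
    print(node['node'], node['ip'])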

Related

Execute chained bash commands including multiple pipes and grep in Python3

I have to use the below bash command, which includes multiple pipes and grep commands, in a Python script.
grep name | cut -d':' -f2 | tr -d '"'| tr -d ','
I tried to do the same using the subprocess module but didn't succeed.
Can anyone help me run the above command in a Python 3 script?
I have to get the below output from a file file.txt.
Tom
Jack
file.txt contains:
"name": "Tom",
"Age": 10
"name": "Jack",
"Age": 15
Actually I want to know how I can run the below bash command using Python.
cat file.txt | grep name | cut -d':' -f2 | tr -d '"'| tr -d ','
This works without having to use the subprocess library or any other OS-command-related library, only Python.
my_file = open("./file.txt")
line = True
while line:
    line = my_file.readline()
    line_array = line.split()
    try:
        if line_array[0] == '"name":':
            print(line_array[1].replace('"', '').replace(',', ''))
    except IndexError:
        pass
my_file.close()
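If you do want to run the original shell pipeline through subprocess, as the question asks, a minimal sketch (assuming Python 3.7+ for capture_output, and that file.txt is in the current directory):
import subprocess
# shell=True hands the whole string to /bin/sh, so the pipes behave as in bash;
# grep can read file.txt directly, so the leading cat is unnecessary.
cmd = """grep name file.txt | cut -d':' -f2 | tr -d '"' | tr -d ','"""
result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
print(result.stdout)  # Tom and Jack, one per line (each with a leading space)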
If you are not trying to parse a JSON file or some other structured file (for which using a parser would be the best approach), just change your command to:
grep -oP '(?<="name":[[:blank:]]").*(?=",)' file.txt
You do not need any pipe at all.
This will give you the output:
Tom
Jack
Explanations:
-P activates Perl-compatible regex, which is needed for the lookahead/lookbehind
-o outputs just the matching string, not the whole line
Regex used: (?<="name":[[:blank:]]").*(?=",)
(?<="name":[[:blank:]]") is a positive lookbehind: it requires "name": followed by a blank character and an opening double quote. The name itself is matched by .*, up to the closing double quote and comma required by the positive lookahead (?=",).
demo: https://regex101.com/r/JvLCkO/1
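For comparison, the same lookbehind/lookahead idea written with Python's re module (a sketch; [[:blank:]] becomes [ \t] in Python regexes):
import re
with open('file.txt') as f:
    text = f.read()
# The lookbehind requires "name": plus one blank and an opening quote; the
# lookahead requires the closing quote and comma. findall returns only the
# .* part, i.e. the names themselves.
print(re.findall(r'(?<="name":[ \t]").*(?=",)', text))  # ['Tom', 'Jack']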

How to write this code into a one lined command

I have a curl command which returns some JSON:
{
  "all":[
    {
      "id":"1",
      "actions":[
        "power",
        "reboot"
      ]
    },
    {
      "id":"2",
      "actions":[
        "shutdown"
      ]
    },
    {
      "id":"3",
      "actions":[
        "backup"
      ]
    }
  ]
}
I retrieve the actions data with this command:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json, re; print [ i['actions'] for i in json.load(sys.stdin)['all']]"
But I would like to use this code in Python on the command line:
for i in json.load(sys.stdin)['all']:
    if i['id'] == '1':
        print(i['actions'])
I tried this:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json, re; print [ if i['id'] == '1': i['actions'] for i in json.load(sys.stdin)['all']]"
But it returns a syntax error
File "<string>", line 1
import sys, json, re; for i in json.load(sys.stdin)['all']:\nif i['id'] == '1':\nprint(i['actions'])
^
SyntaxError: invalid syntax
You want to print this expression:
[i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1']
This filters the sub-dictionaries where id == '1' and builds a list with their actions data.
So, adapted to the curl command line:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json, re; print([i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1'])"
Feeding the sample JSON to this command line gives:
[['power', 'reboot']]
id seems unique, so maybe it's better to avoid returning a 1-element list:
next((i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1'),None)
With that expression it yields ['power', 'reboot'], or None if nothing is found.
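Adapted to the same curl command line (an untested sketch, with the same DOMAIN and TOKEN placeholders as above):
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json; print(next((i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1'), None))"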
Give jq a try. It's a lightweight and flexible command-line JSON parser, and it's a standard package in Ubuntu (apt install jq) and Red Hat (yum install jq).
$ curl ... | jq -c '.all[] | select(.id=="1").actions'
["power","reboot"]
It plays nice with both JSON and standard UNIX tools. If you want to feed the output to regular tools use -r:
$ curl ... | jq -r '.all[] | select(.id=="1").actions[]'
power
reboot

How to save all keys value into an array variable

I have a cURL command which returns some JSON results.
I have:
{
  "all":[
    {
      "id":"1"
    },
    {
      "id":"2"
    },
    {
      "id":"3"
    }
  ]
}
My goal is to retrieve all the ID values into an array in Bash. I know how to retrieve a particular ID when I know its position.
Here is what I tried:
#!/bin/bash
CURL_COMM=$(curl https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json; print json.load(sys.stdin)['all'][0]['id']")
echo "$CURL_COMM"
This will output 1 as expected, but I need to retrieve the other IDs without knowing the number of elements. Is that possible?
And is it possible to retrieve values contained in an array, like:
{
  "all":[
    {
      "id":"1",
      "actions":[
        "power",
        "reboot"
      ]
    },
    {
      "id":"2"
    },
    {
      "id":"3"
    }
  ]
}
Is it possible to retrieve the actions list?
You can use a list comprehension:
python -c "import sys, json; print [i['id'] for i in json.load(sys.stdin)['all']]"
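If the goal is a Bash array rather than a printed Python list, a variant of the same one-liner (a sketch) prints one id per line, which the shell can then collect, for example with mapfile -t IDS < <(curl ... | python -c ...):
python -c "import sys, json; print('\n'.join(i['id'] for i in json.load(sys.stdin)['all']))"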
As always, jq makes working with JSON from a command line easy.
First one as a string holding a JSON array:
$ CURL_COMM=$(curl blah | jq -c '[ .all[].id | tonumber ]')
$ echo $CURL_COMM
[1,2,3]
First one as a bash array:
$ CURL_COMM=($(curl blah | jq '.all[].id | tonumber'))
$ echo ${CURL_COMM[1]}
2
Second one:
$ jq -c '.all[0].actions' example.json
["power","reboot"]

Python os.system invalid syntax

Hi, I would like to execute the following command via the shell.
curl -g -d '{ "action": "block_count" }' [::1]:7076
However, when it is inserted in an os.system call, I get an invalid syntax error. What would be the right syntax?
#!/usr/bin/env python
import os
import json
aba = os.system('curl -g -d '{ "action": "block_count" }' [::1]:7076')
baba = json.loads(aba)
You could simply use a triple-quoted string literal, like:
os.system('''curl -g -d '{"action": "block_count"}' [::1]:7076''')
but even better, use the right tool for the job, i.e. requests:
import requests
data = requests.post('http://[::1]:7076', json={"action": "block_count"}).json()
If you insist on invoking the curl command directly, please use the subprocess module instead of the old and inflexible os.system (which is also unsafe for inputs that are not strictly checked). You can use subprocess.check_output as a replacement in your case. There's no need to execute your curl command in a subshell, so you can split curl's arguments, like:
import subprocess, json
output = subprocess.check_output(['curl', '-g', '-d', '{"action": "block_count"}', '-s', '[::1]:7076'])
data = json.loads(output)
Note that check_output will return the standard output of the command executed (unlike os.system, which only returns the exit status), but it'll raise a CalledProcessError exception in case the command fails with a non-zero status, or an OSError if the command is not found.
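For example, a sketch of that error handling around the same call:
import subprocess, json
try:
    output = subprocess.check_output(
        ['curl', '-g', '-d', '{"action": "block_count"}', '-s', '[::1]:7076'])
    data = json.loads(output)
except subprocess.CalledProcessError as exc:
    # curl ran but exited with a non-zero status
    print('curl failed with exit status', exc.returncode)
except OSError:
    # curl itself was not found on PATH
    print('curl is not installed')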
You need to escape the single quotes. So change this:
aba = os.system('curl -g -d '{ "action": "block_count" }' [::1]:7076')
to this:
aba = os.system('curl -g -d \'{ "action": "block_count" }\' [::1]:7076')
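Note that even with the quoting fixed, os.system only returns the command's exit status, not its output, so json.loads(aba) will still fail. Staying close to this style, os.popen can capture the output instead (a sketch; the subprocess approach above is generally preferable):
import os, json
# os.popen returns a file-like object wrapping the command's stdout.
output = os.popen('curl -g -d \'{ "action": "block_count" }\' [::1]:7076').read()
data = json.loads(output)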

Why does this valid shell command throw an error in python via subprocess?

The line awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master) in bash returns my system's current volume (e.g. "97%").
I tried to incorporate this in Python 3
#!/usr/bin/env python3
import subprocess
command = "awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master)"
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()
print(output)
However, the output from the shell is:
/bin/sh: 1: Syntax error: "(" unexpected
b''
Why does this fail and how do I fix my code?
As already pointed out, the syntax you are using is bash syntax (a bashism). The default shell used by subprocess.Popen is /bin/sh, and it does not support process substitution.
You can specify the shell to be used via the executable argument.
Try this:
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, executable="/bin/bash").stdout.read()
Because you are using a bashism in the form of a process substitution, and your /bin/sh doesn't support that:
<(...)
Changing this to a pipe should solve your problem:
command = "amixer sget Master | awk -F'[][]' '/dB/ { print $2 }'"
Alternatively, you can start bash from within sh:
command = "bash -c 'amixer sget Master | awk -F'\\''[][]'\\'' '\\''/dB/ { print $2 }'\\'''"
But as you will soon realize, quoting and escaping quickly become a nightmare.
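For completeness, here is the pipe-based version dropped back into the original script (a sketch, assuming amixer and awk are installed; a plain pipe works fine under /bin/sh, so no executable="/bin/bash" is needed):
#!/usr/bin/env python3
import subprocess
# Same awk filter, but fed through a pipe instead of a process substitution.
command = "amixer sget Master | awk -F'[][]' '/dB/ { print $2 }'"
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()
print(output)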
