How to save all key values into an array variable - python

I have a cURL command which returns some JSON results.
I have:
{
  "all":[
    {
      "id":"1"
    },
    {
      "id":"2"
    },
    {
      "id":"3"
    }
  ]
}
My goal is to retrieve all the ID values into an array in Bash. I know how to retrieve a particular ID when I know its position.
Here is what I tried:
#!/bin/bash
CURL_COMM=$(curl https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json; print json.load(sys.stdin)['all'][0]['id']")
echo "$CURL_COMM"
This will output 1 as expected, but I need to retrieve the other IDs without knowing the number of elements. Is that possible?
And is it possible to retrieve values contained in an array, like:
{
  "all":[
    {
      "id":"1",
      "actions":[
        "power",
        "reboot"
      ]
    },
    {
      "id":"2"
    },
    {
      "id":"3"
    }
  ]
}
Is it possible to retrieve the actions list?

You can use a list comprehension:
python -c "import sys, json; print [i['id'] for i in json.load(sys.stdin)['all']]"

As always, jq makes working with JSON from a command line easy.
The first one (the list of IDs) as a string holding a JSON array:
$ CURL_COMM=$(curl blah | jq -c '[ .all[].id | tonumber ]')
$ echo $CURL_COMM
[1,2,3]
The first one as a bash array:
$ CURL_COMM=($(curl blah | jq '.all[].id | tonumber'))
$ echo ${CURL_COMM[1]}
2
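If your bash is version 4 or newer, a slightly safer sketch is to let mapfile build the array instead of relying on word splitting of the unquoted command substitution (mapfile/readarray being available is the assumption here):
$ mapfile -t CURL_COMM < <(curl blah | jq -r '.all[].id')
$ echo "${CURL_COMM[1]}"
2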
The second one (the actions list):
$ jq -c '.all[0].actions' example.json
["power","reboot"]

Related

How to call a shell script that uses jq from a Python subprocess?

I have the following two shell scripts.
nodes.sh:
#!/bin/bash
NODE_IDs=$(docker node ls --format "{{.ID}}")
for NODE_ID in ${NODE_IDs}
do
  docker node inspect $NODE_ID | jq -r '.[] | {node:.ID, ip:.Status.Addr}'
done | jq -s
nodes.sh gives the following output (with ./nodes.sh or cat ./nodes.sh | bash):
[
  {
    "node": "b2d9g6i9yp5uj5k25h1ehp26e",
    "ip": "192.168.1.123"
  },
  {
    "node": "iy25xmeln0ns7onzg4jaofiwo",
    "ip": "192.168.1.125"
  }
]
node_detail.sh:
#!/bin/bash
docker node inspect b2d | jq '.[] | {node: .ID, ip: .Status.Addr}'
whereas node_detail.sh gives (./node_detail.sh or cat ./node_detail.sh):
{
  "node": "b2d9g6i9yp5uj5k25h1ehp26e",
  "ip": "192.168.1.123"
}
Problem: I would like to run both scripts from a Python subprocess.
I can run node_detail.sh and get its output with the following code:
>>> import subprocess
>>> proc = subprocess.Popen('./node_detail.sh', stdout=subprocess.PIPE, shell=True)
>>> proc.stdout.read()
'{\n "node": "b2d9g6i9yp5uj5k25h1ehp26e",\n "ip": "192.168.1.123"\n}\n'
I wrote the following code to get the output from nodes.sh:
>>> import subprocess
>>> proc = subprocess.Popen('./nodes.sh', stdout=subprocess.PIPE, shell=True)
Now I am getting the following error:
>>> jq - commandline JSON processor [version 1.5-1-a5b5cbe]
Usage: jq [options] <jq filter> [file...]
jq is a tool for processing JSON inputs, applying the
given filter to its JSON text inputs and producing the
filter's results as JSON on standard output.
The simplest filter is ., which is the identity filter,
copying jq's input to its output unmodified (except for
formatting).
For more advanced filters see the jq(1) manpage ("man jq")
and/or https://stedolan.github.io/jq
Some of the options include:
-c compact instead of pretty-printed output;
-n use `null` as the single input value;
-e set the exit status code based on the output;
-s read (slurp) all inputs into an array; apply filter to it;
-r output raw strings, not JSON texts;
-R read raw strings, not JSON texts;
-C colorize JSON;
-M monochrome (don't colorize JSON);
-S sort keys of objects on output;
--tab use tabs for indentation;
--arg a v set variable $a to value <v>;
--argjson a v set variable $a to JSON value <v>;
--slurpfile a f set variable $a to an array of JSON texts read from <f>;
See the manpage for more options.
Error: writing output failed: Broken pipe
Error: writing output failed: Broken pipe
Why am I getting Error: writing output failed: Broken pipe?
In nodes.sh, rather than invoking jq without a filter, invoke it with an explicit identity filter: jq -s '.'
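For example, the last line of nodes.sh would become (a sketch of just that fix; the rest of the script is unchanged):
done | jq -s '.'
With an explicit filter, jq no longer falls back to printing its usage text, and the subprocess call from the question should return the aggregated JSON array.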

How to write this code as a one-line command

I have a curl command which returns some JSON results.
{
  "all":[
    {
      "id":"1",
      "actions":[
        "power",
        "reboot"
      ]
    },
    {
      "id":"2",
      "actions":[
        "shutdown"
      ]
    },
    {
      "id":"3",
      "actions":[
        "backup"
      ]
    }
  ]
}
I retrieve the actions data with this command:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json, re; print [ i['actions'] for i in json.load(sys.stdin)['all']]"
But I would like to use this code in Python on the command line:
for i in json.load(sys.stdin)['all']:
    if i['id'] == '1':
        print(i['actions'])
I tried this:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json, re; print [ if i['id'] == '1': i['actions'] for i in json.load(sys.stdin)['all']]"
But it returns a syntax error:
File "<string>", line 1
import sys, json, re; for i in json.load(sys.stdin)['all']:\nif i['id'] == '1':\nprint(i['actions'])
^
SyntaxError: invalid syntax
You want to print this expression:
[i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1']
This filters the sub-dictionaries where id == 1 and builds a list with the actions data.
So, adapted to the curl command line:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json, re; print([i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1'])"
Feeding it to a simple Python command line works:
[['power', 'reboot']]
id seems unique, so maybe it's better to avoid returning a 1-element list:
next((i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1'),None)
With that expression it yields ['power', 'reboot'], or None if not found.
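Plugged back into the curl pipeline from the question, a sketch would look like:
curl -s https://DOMAIN/API -H "X-Auth-Token: TOKEN" | python -c "import sys, json; print(next((i['actions'] for i in json.load(sys.stdin)['all'] if i['id'] == '1'), None))"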
Give jq a try. It's a lightweight and flexible command-line JSON parser, and it's a standard package in Ubuntu (apt install jq) and Red Hat (yum install jq).
$ curl ... | jq -c '.all[] | select(.id=="1").actions'
["power","reboot"]
It plays nicely with both JSON and standard UNIX tools. If you want to feed the output to regular tools, use -r:
$ curl ... | jq -r '.all[] | select(.id=="1").actions[]'
power
reboot
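And if you want those values in a bash array rather than just printed, a sketch using mapfile (bash 4 or newer assumed) would be:
$ mapfile -t ACTIONS < <(curl ... | jq -r '.all[] | select(.id=="1").actions[]')
$ echo "${ACTIONS[0]}"
power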

How, or can, I get python -mjson.tool to maintain the order of the attributes

I know little of Python other than this simple invocation: python -m json.tool {someSourceOfJSON}
Note how the source document is ordered "id", "z", "a", but the resulting JSON document presents the attributes as "a", "id", "z".
$ echo '{ "id": "hello", "z": "obj", "a": 1 }' | python -m json.tool
{
    "a": 1,
    "id": "hello",
    "z": "obj"
}
How, or can, I make the json.tool thing maintain the order of the attributes from the original JSON document?
The Python version is whatever comes with this MacBook Pro:
$ python --version
Python 2.7.15
I'm not sure if it's possible with python -m json.tool, but it is with a one-liner (which I'm guessing is the actual X/Y root problem):
echo '{ "id": "hello", "z": "obj", "a": 1 }' | python -c "import json, sys, collections; print(json.dumps(json.loads(sys.stdin.read(), object_pairs_hook=collections.OrderedDict), indent=4))"
Result:
{
    "id": "hello",
    "z": "obj",
    "a": 1
}
This is essentially the following code, but without the intermediate objects and with some readability compromises, such as one-line imports.
import json
import sys
import collections
# Read from stdin / pipe as a str
text = sys.stdin.read()
# Deserialise text to a Python object.
# It's most likely to be a dict, depending on the input
# Use `OrderedDict` type to maintain order of dicts.
my_obj = json.loads(text, object_pairs_hook=collections.OrderedDict)
# Serialise the object back to text
text_indented = json.dumps(my_obj, indent=4)
# Write it out again
print(text_indented)
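As a side note, on Python 3.5 and later json.tool preserves the input order by default (as far as I know) and only sorts when you pass --sort-keys, so if a python3 interpreter is available the plain invocation should already behave as wanted:
$ echo '{ "id": "hello", "z": "obj", "a": 1 }' | python3 -m json.tool
{
    "id": "hello",
    "z": "obj",
    "a": 1
}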

How to include \n with python -m json.tool

I'm trying to create a JSON file via the shell; however, newlines are not allowed and throw an error:
Invalid control character at: line 5 column 26 (char 87), which points to the \n.
echo '{
"param1": "asdfasf",
"param2": "asffad",
"param3": "asdfsaf",
"param4": "asdfasf\nasfasfas"
}' | python -m json.tool > test.json
Assuming I'd like to preserve newlines, how can I get this to output a JSON file?
UPDATE:
I'm thinking it has something to do with strict mode for python's json encoder/decoder.
If strict is False (True is the default), then control characters will
be allowed inside strings. Control characters in this context are
those with character codes in the 0-31 range, including '\t' (tab),
'\n', '\r' and '\0'.
https://docs.python.org/2/library/json.html
How can strict mode be set to False from within python -m json.tool?
Escaping the \ seems to do the trick:
echo '{
"param1": "asdfasf",
"param2": "asffad",
"param3": "asdfsaf",
"param4": "asdfasf\\nasfasfas"
}' | python -m json.tool > test.json
It creates valid JSON:
import json

with open('/home/test.json', 'rU') as f:
    js = json.load(f)

print(js)
print(js["param4"])
Output:
{'param1': 'asdfasf', 'param3': 'asdfsaf', 'param2': 'asffad', 'param4': 'asdfasf\nasfasfas'}
asdfasf
asfasfas
zsh is replacing \n with an actual newline character. You could escape it or use a heredoc instead:
python -m json.tool > test.json << EOF
{
"param1": "asdfasf",
"param2": "asffad",
"param3": "asdfsaf",
"param4": "asdfasf\nasfasfas"
}
EOF
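If the goal really is to accept the literal newline the shell produces (rather than escaping it), a hedged alternative is to skip json.tool and pass strict=False to json.loads in a one-liner; this sketch assumes your shell, like zsh here, has already turned \n into a real newline:
echo '{
"param4": "asdfasf\nasfasfas"
}' | python -c "import sys, json; print(json.dumps(json.loads(sys.stdin.read(), strict=False), indent=4))"
The re-serialised output escapes the newline back to \n, so the file written this way is still valid JSON.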

How to import a Python variable into bash

I have two scripts:
Python:
import os

if canceled == True:
    os.environ["c"] = "0"
if canceled == False:
    os.environ["c"] = "1"
Bash:
kill_jobs()
{
    pkill -TERM -P "$BASHPID"
    echo $c
    if [ $c == "0" ]
    then
        echo -e "OPERATIONS CANCELED"
    elif [ $c == "1" ]
    then
        echo -e "OPERATIONS FINISHED"
    fi
}
trap kill_jobs EXIT
How can I pass a Python variable to the bash script?
(My level in bash is near to 0 ...)
Thanks
Edit: Now I have this error (in French; it translates to "unary operator expected"):
[: == : opérateur unaire attendu
Or you can try:
os.environ["c"] = "value"
You can set it this way, I guess
The python script should end with:
print c
Then you use it in bash with:
c=$(python_script)
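Put together, a minimal sketch of that pattern (the script name cancel_check.py is hypothetical):
# cancel_check.py: print the flag instead of exporting it;
# os.environ changes made here never reach the parent shell
canceled = True
print("0" if canceled else "1")
and on the bash side:
c=$(python cancel_check.py)
if [ "$c" == "0" ]; then
    echo "OPERATIONS CANCELED"
elif [ "$c" == "1" ]; then
    echo "OPERATIONS FINISHED"
fi
Quoting "$c" also avoids the "opérateur unaire attendu" (unary operator expected) error, which appears when $c is empty and the unquoted test collapses to [ == "0" ].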
