How to parse JSON passed on the command line - python

I am trying to pass JSON parameters to a Python script on the command line:
automation.py {"cmd":"sel_media","value":"5X7_photo_paper.p"}
How can I extract the values sel_media and 5X7_photo_paper.p?
I used the following code, but it is not working:
cmdargs = str(sys.argv[1])
print cmdargs

Provided you pass actual, valid JSON on the command line and quote it correctly, you can parse the value with the json module.
You need to quote the value properly, otherwise your shell will interpret it before your script ever sees it:
automation.py '{"cmd":"sel_media","value":"5X7_photo_paper.p"}'
should be enough for a bash shell.
In Python, decode with json.loads():
import sys
import json
cmdargs = json.loads(sys.argv[1])
print(cmdargs['cmd'], cmdargs['value'])
Demo:
$ cat demo.py
import sys
import json
cmdargs = json.loads(sys.argv[1])
print(cmdargs['cmd'], cmdargs['value'])
$ bin/python demo.py '{"cmd":"sel_media","value":"5X7_photo_paper.p"}'
sel_media 5X7_photo_paper.p
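If the argument may come from untrusted input, it is worth failing with a readable message instead of a traceback. A minimal sketch of the same idea with explicit error handling; `parse_cmdline_json` is a hypothetical helper name, and the argument is inlined here so the example is self-contained:

```python
import json
import sys

def parse_cmdline_json(raw):
    """Parse a JSON string, exiting with a readable error on bad input."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError as exc:
        sys.exit("Invalid JSON argument: %s" % exc)

# In a real script this would be sys.argv[1].
args = parse_cmdline_json('{"cmd": "sel_media", "value": "5X7_photo_paper.p"}')
print(args["cmd"], args["value"])
```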

The above is generally correct, but I ran into issues with it when running my own Python script:
python myscript.py '{"a":"1"}'
did not work directly in my terminal, so I escaped the inner quotes:
python myscript.py '{\"a\":\"1\"}'
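Another way to sidestep shell quoting entirely (a sketch, not from the answers above) is to pipe the JSON to the script's stdin instead of passing it in argv; the shell then never re-interprets the quotes. `read_json_payload` is a hypothetical helper, demonstrated with an in-memory stream so the example is self-contained:

```python
import io
import json

def read_json_payload(stream):
    """Read one JSON document from an open text stream."""
    return json.loads(stream.read())

# In the real script this would be read_json_payload(sys.stdin), invoked as:
#   echo {"a":"1"} | python myscript.py
# Here we simulate the piped input with an in-memory stream.
data = read_json_payload(io.StringIO('{"a": "1"}'))
print(data["a"])
```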

Related

Error importing a bash script into a Python script

I am trying to create a Python script, script.py, that runs another Python program from inside it.
#!/usr/bin/env python
import os
import glob
from fnmatch import fnmatch
# importing a software
python_package = os.system("""#!/path_to_bin/bin/python \
from __future__ import print_function, division \
from python_toolbox.toolbox.some_toolbox import run \
if __name__ == '__main__': \
run()"""
# testing
greeting = "Hello world!"
print(greeting)
Running script.py with python3:
$python3 script.py
File "script.py", line 15
greeting = "Hello world!"
SyntaxError: invalid syntax
Nominally the problem is that you are missing the closing paren on the os.system call. But there is a better way to run a python program than trying to write it all on the command line. Instead, you can pass a full script, including newlines, to python's stdin.
#!/usr/bin/env python
import sys
import subprocess as subp

# importing a software
def run_script():
    subp.run([sys.executable, "-"], input=b"""
print("I am a called python script")
""")

# testing
run_script()
greeting = "Hello world!"
print(greeting)
In this script, the second Python script is run whenever you call run_script. Notice that the script in the string has to follow the normal Python indentation rules. So, there is indentation inside run_script, but the string holding the second script starts its indentation all the way to the left again.
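The same stdin trick also lets the parent capture what the child prints, so its output can be used rather than just echoed to the terminal. A sketch assuming Python 3.7+ (for `capture_output`); the child script here is a stand-in, not from the question:

```python
import subprocess as subp
import sys

# Run an inline child script via the interpreter's stdin and capture
# whatever it writes to stdout.
result = subp.run(
    [sys.executable, "-"],
    input=b'print("hello from the child")',
    capture_output=True,
)
print(result.stdout.decode().strip())
```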

How to run an executable with Python and JSON

I have an executable binary file that can be run from the terminal, and from Python by running
$ python3 shell_c.py
where the Python file contains
import subprocess

def executable_shell():
    x = subprocess.run('cd build && ./COSMO/methane_c0.outmol.cosmo', shell=True, capture_output=True)
    print(x)

executable_shell()
Here COSMO is my executable name, and "methane_c0.outmol" is the dynamic value that should change, along with the ".cosmo" extension.
To supply these values, I created a JSON file with this content:
{
    "root_directory": "C:\\Users\\15182\\cosmo theory\\COSMO\\UDbase8",
    "file_name": "methane_c0",
    "file_format": ".cosmo",
    "output1": "N_atoms",
    "output2": "total number of segments"
}
Now all that is left is to pass the values of file_name and file_format to the subprocess call, but I am not sure how to go about it.
The code I have written so far is basic:
import json

with open("parameters.json") as file:
    data = json.load(file)
print(type(data))
print(data)
How can I pass these values on to the subprocess call?
Something like this?
import json
import subprocess

with open("parameters.json") as file:
    data = json.load(file)

dynamic_file_name = data['file_name'] + '.outmol' + data['file_format']

def executable_shell():
    x = subprocess.run('cd build && ./COSMO/' + dynamic_file_name, shell=True, capture_output=True)
    print(x)

executable_shell()
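A variant worth considering is to drop shell=True: pass an argument list and let cwd= replace the "cd build &&" prefix, so spaces in the file name cannot break the command. A sketch with the JSON inlined (so it needs no external file); the actual run is left as a comment because the COSMO binary only exists on the asker's machine:

```python
import json

# Parameters as they would come from parameters.json, inlined here so the
# sketch is self-contained.
data = json.loads('{"file_name": "methane_c0", "file_format": ".cosmo"}')
dynamic_file_name = data["file_name"] + ".outmol" + data["file_format"]

# With an argument list there is no shell parsing of the command string,
# and cwd= replaces the "cd build &&" prefix.
cmd = ["./COSMO/" + dynamic_file_name]
print(cmd[0])

# The real call would then be:
# subprocess.run(cmd, cwd="build", capture_output=True)
```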

How do I pass python variables to subprocess call to sed?

I'm trying to send a call to sed, but it seems like it is not processing the variables. The command looks correct when I print it, but it is not correct in the call:
#!/usr/bin/python -tt
import json
from pprint import pprint
from subprocess import call

with open('admin_list.json') as data_file:
    admins = json.load(data_file)

#pprint(data[0]["key"])
for admin in admins:
    #print(admin["name"])
    #print(" sudo sed, 1 a ${"+admin['key']+"} /home/"+admin['name']+"/.ssh/authorized_keys")
    call(["sudo sed", "1 a ${"+admin['key']+"} /home/"+admin['name']+"/.ssh/authorized_keys"])
OSError: [Errno 2] No such file or directory
I've updated my code; I no longer get errors, but the file is still not updated:
#!/usr/bin/python -tt
import json
import os
from pprint import pprint
from subprocess import call

with open('admin_list.json') as data_file:
    admins = json.load(data_file)

#pprint(data[0]["key"])
for admin in admins:
    call(["sudo", "sed", "1 a "+admin['key'], "/home/"+admin['name']+"/.ssh/authorized_keys"])
    call(['cat', '/home/'+admin["name"]+'/.ssh/authorized_keys'])
You have to split your arguments properly, otherwise the spaces will be interpreted literally.
Here you have 4 arguments:
sudo
sed
the sed expression
the file to edit
And don't rely on environment variables; evaluate them beforehand (otherwise you'd need shell=True).
So pass a 4-item list to call:
call(["sudo","sed", "1 a "+os.getenv(admin['key']),"/home/"+admin['name']+"/.ssh/authorized_keys"])
Note that to modify your file in place you need to add the -i option to sed:
call(["sudo","sed", "-i", "1 a "+os.getenv(admin['key']),"/home/"+admin['name']+"/.ssh/authorized_keys"])
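Since the goal is just to insert a line after line 1 of each authorized_keys file, a pure-Python version avoids the sudo/sed quoting problems entirely (though you then need to run the script itself with sufficient privileges). A sketch with a hypothetical helper, demonstrated against a temporary file standing in for authorized_keys; the key string is a placeholder:

```python
import tempfile

def insert_after_first_line(path, new_line):
    """Insert new_line after the first line of the file at path."""
    with open(path) as f:
        lines = f.readlines()
    lines.insert(1, new_line + "\n")
    with open(path, "w") as f:
        f.writelines(lines)

# Demonstrate against a temporary stand-in for an authorized_keys file.
with tempfile.NamedTemporaryFile("w", suffix="_authorized_keys",
                                 delete=False) as tmp:
    tmp.write("# header\nexisting-key\n")

insert_after_first_line(tmp.name, "ssh-rsa PLACEHOLDER admin@example")
print(open(tmp.name).read())
```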

Curl command in Python 3

I want to execute a curl command in Python.
Usually, I just enter the command in the terminal and press the return key.
The command is shown below:
curl -H "`oauth2l header --json key.json mobileinsights`" https://mobileinsights.googleapis.com/v2/networks
The result is in json format.
Use the subprocess module to run your shell command.
import subprocess

result = subprocess.check_output('curl -H "`oauth2l header --json key.json mobileinsights`" https://mobileinsights.googleapis.com/v2/networks', shell=True)
Then, use the json module to parse the JSON data returned by the server.
import json
result_json = json.loads(result)
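If you would rather not shell out to curl at all, the same request can be made from Python with urllib. A sketch: the oauth2l invocation and URL come from the question, but the header value is a placeholder and the request is only constructed, not sent, so the example has no network dependency:

```python
import urllib.request

# The header value as oauth2l would print it, e.g. "Authorization: Bearer ...".
# In the real script you would obtain it with something like:
#   header = subprocess.check_output(
#       ["oauth2l", "header", "--json", "key.json", "mobileinsights"]).decode().strip()
header = "Authorization: Bearer DUMMY_TOKEN"  # placeholder for the sketch
name, _, value = header.partition(": ")

req = urllib.request.Request(
    "https://mobileinsights.googleapis.com/v2/networks",
    headers={name: value},
)
print(req.get_header("Authorization"))

# Sending it would then be:
#   result_json = json.loads(urllib.request.urlopen(req).read())
```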
Alternatively, you can import the os and json modules, run the curl command with os.popen(), store its output in a variable, and convert it to a Python object with json.loads().

How to execute a Python script from another Python script and send its output to a text file

I have a script test.py which is used for a server automation task, and another script server.py which lists all server names.
server.py writes all the server names to a text.log file, and this log file is used by test.py as input.
I want the single script test.py to execute server.py from inside it and also redirect server.py's output to the text.log file.
So far I have tried execfile("server.py") > text.log in test.py, which didn't work.
Use subprocess.call with stdout argument:
import subprocess
import sys

with open('text.log', 'w') as f:
    subprocess.call([sys.executable, 'server.py'], stdout=f)
    # add stderr=subprocess.STDOUT if you also want to catch standard error output
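The newer subprocess.run interface adds check=True, which raises CalledProcessError if server.py exits non-zero instead of silently writing a partial log. A sketch with an inline child script standing in for server.py, and a temporary file standing in for text.log, so it is self-contained:

```python
import subprocess
import sys
import tempfile

# Inline stand-in for server.py so the sketch is self-contained.
child = 'import sys; print("server-1"); print("warning", file=sys.stderr)'

log = tempfile.NamedTemporaryFile(suffix=".log", delete=False)
log.close()

with open(log.name, "w") as f:
    subprocess.run(
        [sys.executable, "-c", child],
        stdout=f,
        stderr=subprocess.STDOUT,  # fold stderr into the same log file
        check=True,                # raise CalledProcessError on non-zero exit
    )

print(open(log.name).read())
```

Note that because the child's stdout is block-buffered when redirected to a file while stderr is unbuffered, the two streams may interleave in either order.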
