Handling python output data in a bash script

I have a bash script which runs some DNS queries and returns an IP address, which I assign to a variable, $external.
I want to take this variable from the bash script, feed it into Python, and do some subnet intelligence on it, but I'm not sure how to handle the data after that and pass it back to bash.
I can see that the $external variable is being passed from bash into Python OK, but this is where I'm not sure what to do next. (Thanks Farhan.K for assisting me with what I have already.)
python3 <<END
import ipaddress
ipsub = {"10.10.10.0/24": "Firewall-Denver", "10.10.20.0/25": "FirewallNewYork"}
iplist = [$external]
ipfirewall = []
for i in ipsub:
    for j in iplist:
        if ipaddress.ip_address(j) in ipaddress.ip_network(i):
            ipfirewall.append([j, ipsub[i]])
END
The following would write it to a file:
with open('output.txt', 'w') as file:
    file.writelines('\t'.join(i) + '\n' for i in ipfirewall)
But how do I pass it back to bash in the same format?
Thanks in advance for your advice and assistance.

You can do this by writing the intended result to Python's standard output and capturing it in a bash variable.
#!/bin/bash
external="'10.10.10.1','10.10.20.1'"
result="$(python3 <<END
import ipaddress
ipsub = {"10.10.10.0/24": "Firewall-Denver", "10.10.20.0/25": "FirewallNewYork"}
iplist = [$external]
ipfirewall = []
for i in ipsub:
    for j in iplist:
        if ipaddress.ip_address(j) in ipaddress.ip_network(i):
            ipfirewall.append([j, ipsub[i]])
print(ipfirewall)
END
)"
echo "$result"
In the above script, I pass $external into the Python heredoc, run your process, and print the result in Python. The $() construct captures the standard output of the Python script into a Bash variable.
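If you want $result in the same tab-separated format as your file version, you could swap the final print for something like this (a sketch; each entry of ipfirewall is an [address, firewall] pair, as in your loop):
print('\n'.join('\t'.join(pair) for pair in ipfirewall))
Each matching address then comes back to bash as one address<TAB>firewall line in $result.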
For the more generic version of this question, see How to assign a heredoc value to a variable in Bash?

Related

Named variables from Python for use in Bash source without temp file

I'm trying to get multiple variables from Python to be usable as variables within Bash. Although I have Python as the source, it could be any programme/script, but not a file.
I can get source to do the job with a temp file, so I guess the question is: how do I pipe a script's output to source?
The Python source could output any number of key=value pairs, so simply x=$(python script) does not work.
Python snip example
print("STARTDATE=" + STARTDATE)
print("STARTTIME=" + STARTTIME)
Output
STARTDATE=06/15/2021
STARTTIME=15:46:21.00
Bash working example with temp file
python script > temp.file
source temp.file
echo $STARTDATE
Echo is just for debugging here; I would be doing work on the variables.
Failed attempts:
source python script
source <(python script)
eval $(python script)
You will need quotes around the strings and will also need to ensure that there are no spaces before or after the "=", so:
print("STARTDATE=\"" + STARTDATE + "\"")
print("STARTTIME=\"" + STARTTIME + "\"")
Making sure that you are using a bash shell and not sh, you should then be able to source the script and set the variables with:
source <(./script.py)
Or in a script:
#!/bin/bash
source <(./script.py)
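For completeness, a minimal sketch of the Python side (the values are hard-coded here just to mirror your example output):
#!/usr/bin/env python3
# script.py - emit shell-sourceable key=value pairs on standard output
STARTDATE = "06/15/2021"
STARTTIME = "15:46:21.00"

# Quote each value and leave no spaces around "=" so bash can source it
print('STARTDATE="{0}"'.format(STARTDATE))
print('STARTTIME="{0}"'.format(STARTTIME))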

Python: how do I read (not run) a shell script, inserting arguments along the way

I have a simple shell script script.sh:
echo "ubuntu:$1" | sudo chpasswd
Using Python, I need to open the script, read it, insert the argument, and save the result as a string like so: 'echo "ubuntu:arg_passed_when_opening" | sudo chpasswd'.
All the options suggested here actually execute the script, which is not what I want.
Any suggestions?
You would do this the same way that you read any text file, and we can use sys.argv to get the argument passed when running the python script.
Ex:
import sys
with open('script.sh', 'r') as sfile:
    modified_file_contents = sfile.read().replace('$1', sys.argv[1])
With this method, modified_file_contents is a string containing the text of the file, but with the specified variable replaced with the argument passed to the python script when it was run.
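As a complete standalone sketch (read_script.py is a hypothetical name):
# read_script.py - substitute the first argument into script.sh without running it
import sys

with open('script.sh', 'r') as sfile:
    modified_file_contents = sfile.read().replace('$1', sys.argv[1])

# python read_script.py new_password
# prints: echo "ubuntu:new_password" | sudo chpasswd
print(modified_file_contents)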

Import multiple files output from bash script into Python lists

I have a bash script that connects to multiple compute nodes and pulls data from each one, depending on some arguments entered after the bash script is called. For simplicity's sake, I'm essentially doing this:
for h in node{0..7}; do ssh $h 'fold -w 80 /program.headers | grep "RA" | head -600 | tr -d "RA =" > '$h'filename'; done
I'm trying to take the 8 files that come out of this (each have 600 pieces of information) and save them each as a list in Python. I then need to manipulate them in Python (split and convert to float) to be able to plot the data with Matplotlib.
For a bash script that only outputs one file, I can easily capture check_output in a variable and manipulate it from there:
test = subprocess.check_output("./bashscript")
test_list = test.split()
test = [float(a) for a in test_list]
I am also able to read a saved file from my bash script by using:
test = subprocess.check_output(['cat', '/path/filename'])
test_list = test.split()
test = [float(a) for a in test_list]
The problem is, I'm working with over 80 files after I get all that I need. Is there some way in Python to say, "for every file made store the contents of that as a list"?
Instead of capturing data with subprocess, you can use os.popen() to execute scripts. The benefit is that you can read the output of a command/script the same way you read a file, so you can use read(), readline(), or readlines() as you wish; readlines() gives you a list of lines. Using that, you can execute the script and capture its output like this:
import os

# output now holds the bash script's output, one line per list item
output = os.popen("./bashscript").readlines()
See the os.popen() documentation for more info, and for the differences between read(), readlines(), readline(), and xreadlines().
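To mirror your earlier check_output pattern, where the whole output is split and converted in one go, a sketch:
import os

# read() returns the entire output as one string, much like check_output
test = os.popen("./bashscript").read()
test = [float(a) for a in test.split()]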
Define a simple interface between your bash script and your python script
It looks like the simple interface used to be a printout of the file, but this solution did not scale to multiple files. Now I recommend that the interface be printing out the names of the files created. It would look something like this:
filenames = subprocess.check_output("./bashscript").split()
for filename in filenames:
    with open(filename) as file_obj:
        file_data = [float(a) for a in file_obj.readlines()]
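Alternatively, if you would rather leave the bash script untouched, globbing for the files it writes gives a similar interface. A sketch, assuming the node{0..7} naming from your loop and that the files land in the current directory:
import glob

# Collect every per-node output file written by the bash loop
node_data = {}
for filename in glob.glob('node*filename'):
    with open(filename) as file_obj:
        node_data[filename] = [float(a) for a in file_obj]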
It looks like you are unfamiliar with Python but familiar with bash. As a result, you are programming hobbled on bash crutches; instead, you should embrace Python and use it throughout your application. You probably do not need the bash script at all.

Cannot capture single quote string in Python from Perl

I have a Perl script from which I am calling a Python script.
I am using:
system "python script.py '".$var1."' '".$var2."' '".$var3."' '".$var4."' '".$var5."'";
Where, $var1 = "'Nostoc azollae' 0708", which has single quotes in the string.
In the script.py script, I am using:
var1 = sys.argv[1]
And if I print var1, it only prints Nostoc; the rest of the string is lost, though everything else works fine.
So, clearly the Python script is not receiving the string with the ' included.
What can be a solution to this?
Avoid shell invocation when using system() by passing separate parameters instead of joining them all together:
system("python", "script.py", $var1, $var2, $var3, $var4, $var5);

Calling configuration file ID into Linux Command with Date Time from Python

I'm trying to write a script to produce the following output files in a folder (YYYYMMDDHHMMSS being the current date and time), using a Linux command run from Python, with the IDs coming from a configuration file:
1234_YYYYMMDDHHMMSS.txt
12345_YYYYMMDDHHMMSS.txt
12346_YYYYMMDDHHMMSS.txt
I have a config file with the list of IDs:
id1 = 1234
id2 = 12345
id3 = 123456
I want to be able to loop through these in Python and incorporate them into a Linux command.
Currently, my Linux commands are hardcoded in Python like this:
import subprocess
import datetime
now = datetime.datetime.now()
subprocess.call('autorep -J 1234* -q > /home/test/output/1234.txt', shell=True)
subprocess.call('autorep -J 12345* -q > /home/test/output/12345.txt', shell=True)
subprocess.call('autorep -J 123456* -q > /home/test/output/123456.txt', shell=True)
print now.strftime("%Y%m%d%H%M%S")
The datetime is defined but currently does nothing except print to the console, whereas I want to incorporate it into the output .txt filename. I want to be able to write a loop that does something like this:
subprocess.call('autorep -J id1* -q > /home/test/output/123456._now.strftime("%Y%m%d%H%M%S").txt', shell=True)
subprocess.call('autorep -J id2* -q > /home/test/output/123456._now.strftime("%Y%m%d%H%M%S").txt', shell=True)
subprocess.call('autorep -J id3* -q > /home/test/output/123456._now.strftime("%Y%m%d%H%M%S").txt', shell=True)
I know that I need to use ConfigParser, and I currently have this piece written, which simply prints the IDs from the configuration file to the console:
from ConfigParser import SafeConfigParser
import os

parser = SafeConfigParser()
parser.read("/home/test/input/ReportConfig.txt")

def getSystemID():
    for section_name in parser.sections():
        print
        for key, value in parser.items(section_name):
            print '%s = %s' % (key, value)
        print

getSystemID()
But as mentioned at the beginning of the post, my goal is to loop through the IDs and incorporate them into my Linux command, adding the datetime format to the end of the filename. I'm thinking all I need is some kind of loop in the above function to get the output I want, but I'm not sure how to get the IDs and the datetime into a Linux command.
So far you have most of what you need; you are just missing a few things.
First, I think using ConfigParser is overkill for this, but it's simple enough, so let's continue with it. Let's change getSystemID to a generator that yields your IDs instead of printing them out; it's a small change.
parser = SafeConfigParser()
parser.read('mycfg.txt')

def getSystemID():
    for section_name in parser.sections():
        for key, value in parser.items(section_name):
            yield key, value
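One caveat worth flagging: SafeConfigParser requires at least one [section] header in the file it reads, so ReportConfig.txt would need to look something like this (the section name is arbitrary):
[ids]
id1 = 1234
id2 = 12345
id3 = 123456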
With a generator, we can use getSystemID in a loop directly; now we need to pass the results on to the subprocess call.
# This is the string of the current time, what we add to the filename
now = datetime.datetime.now().strftime('%Y%m%d%H%M%S')

# Notice we can iterate over the (name, number) pairs directly
for name, number in getSystemID():
    print name, number
Now we need to build the subprocess call. The bulk of your problem above was knowing how to format strings; the syntax is described in the str.format() documentation.
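For example, the filename pattern you are after drops straight out of str.format (hypothetical values shown):
outfile = '/home/test/output/{0}_{1}.txt'.format('1234', '20210615154621')
# outfile is now '/home/test/output/1234_20210615154621.txt'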
I'm also going to make two notes on how you use subprocess.call. First, pass a list of arguments instead of one long string; this lets Python handle the quoting so you don't have to worry about it. You can read about it in the subprocess and shlex documentation. Second, you redirect the output using > in the command and (as you noticed) need shell=True for that to work. Python can do the redirection for you, and you should let it.
To pick up where I left off above, in the for loop:
for name, number in getSystemID():
    # Make the filename to write to
    outfile = '/home/test/output/{0}_{1}.txt'.format(number, now)
    # Open the file for writing
    with open(outfile, 'w') as f:
        # Notice the arguments are in a list;
        # stdout=f redirects the command's output to the file named outfile
        subprocess.call(['autorep', '-J', number + '*', '-q'], stdout=f)
You can insert the datetime using Python's format instruction.
For example, you could create a new file with the 1234 prefix and a YYYYMMDDHHMMSS stamp like this:
new_file = open("1234_{0}.txt".format(datetime.datetime.now().strftime("%Y%m%d%H%M%S")), 'w+')
I am not sure if I understood what you are looking for, but I hope this helps.
