I am very new to shell scripting, so I'm still figuring things out. Here is my problem:
I have an executable Python .py file that creates multiple files and saves them to a directory. I need to run that file from a shell script. For some reason, the shell script executes the Python script, but no new files appear in my directory. When I run the .py file directly, everything works fine.
Here's what my shell script looks like:
#!/bin/bash
cd /home/usr/directory
python myfile.py
Within my Python script, the files that get saved are pickled object instances, so each one looks something like this:
f = file('/home/usr/anotherdirectory/myfile.p','w')
pickle.dump(myObject,f)
f.close()
This line:
f = file('/home/usr/anotherdirectory/myfile.p','w')
should be:
f = open('/home/usr/anotherdirectory/myfile.p','wb+')
As a best practice, use a context manager so the file is closed even if an error occurs:
with open('/home/usr/anotherdirectory/myfile.p','wb+') as fs:
    pickle.dump(myObject, fs)
The documentation for the file constructor states:
When opening a file, it’s preferable to use open() instead of invoking this constructor directly.
Problems like this may be one of the reasons why. Try changing
f = file('/home/usr/anotherdirectory/myfile.p','w')
to
f = open('/home/usr/anotherdirectory/myfile.p','w')
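For completeness, a minimal self-contained version of the save step might look like this; the object value is a stand-in, and binary mode ('wb') is the safe choice because pickle writes bytes:
import pickle

myObject = {"example": 42}  # stand-in for the real object

# The with-block guarantees the file is flushed and closed,
# even if pickle.dump() raises.
with open('/home/usr/anotherdirectory/myfile.p', 'wb') as f:
    pickle.dump(myObject, f)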
I have a Python script that starts by taking a string variable, say "element_name", from the user, builds some sub-folders based on that string, and moves some output files created by the script into those folders.
On the other hand, I have a bash script in which some commands should run inside the sub-folders created by the Python code.
Any help on how to refer to those folders in bash? How do I pass "element_name" from Python to bash?
In the Python code "a.py" I tried:
import subprocess

first = subprocess.Popen(['/bin/echo', element_name], stdout=subprocess.PIPE)
second = subprocess.Popen(['bash', 'path/to/script', '--args'], stdin=first.stdout)
and then in bash:
source a.py
echo $element_name
but it doesn't work.
It's not clear from your question what is in your scripts, but I would guess that
subprocess.run(['/bin/bash', 'path/to/script', '--args', element_name])
does what you intend: it passes the value of element_name to the script as a command-line argument.
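To make that concrete, here is a minimal sketch; the value is hypothetical, and inside the bash script it shows up as a positional parameter:
import subprocess

element_name = "Fe"  # hypothetical value read from the user

# Passed this way, the value is available inside the script
# as "$1" (with '--args' in front, it would be "$2").
subprocess.run(['/bin/bash', 'path/to/script', element_name])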
I found a way: from Python I write the argument into a small bash file, then source that file from my main bash script. Now everything works well.
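A minimal sketch of that approach, with a hypothetical helper file named vars.sh:
# a.py: write the value into a small bash file
element_name = "Fe"  # hypothetical value
with open('vars.sh', 'w') as f:
    f.write('element_name="{}"\n'.format(element_name))

# The main bash script then loads it with:
#   source vars.sh
#   echo "$element_name"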
I have three Python scripts. One gathers data from a database (data_for_report.py), another generates a report from that data and creates an .xlsx file (report_gen.py), and the last one modifies the style of that Excel file (excel_style.py).
Right now all three files are in the same directory, and I simply execute the scripts one after another in the interpreter to get the report. I want to make everything work with one click so the people who need this report can run it themselves. I thought of creating an exe with pyinstaller, but I cannot think of a way to link my scripts together so that when data_for_report.py finishes, report_gen.py starts, and so on.
I tried to put
subprocess.call("report_gen.py", shell=True)
at the end of the first script, but nothing happens, I just get this:
Out[2]: 1
How could I do this?
This problem can be solved with batch programming: your Python files will run one after the other from a single batch file. I am assuming all three Python files reside in a folder ReportGenerator with the path C:\ReportGenerator, so adjust the path for your system (and mind the \ vs / in the folder path).
The files that need to be executed are:
data_for_report.py
report_gen.py
excel_style.py
Now open Notepad and write the lines below:
cd C:\ReportGenerator
python data_for_report.py
python report_gen.py
python excel_style.py
PAUSE
Now save this file as file_name.bat anywhere you want on your system and remember where it is; after saving, it will get the batch-file icon.
Now open a Command Prompt window and simply drag this batch file into it.
Why not encapsulate the logic of each script in a function, make a new file that imports all three functions, and then run that script?
So if the scripts are
data_for_report.py
def f1():
    ...
report_gen.py
def f2():
    ...
excel_style.py
def f3():
    ...
Then the final script which you will run is:
from data_for_report import f1
from report_gen import f2
from excel_style import f3
f1()
f2()
f3()
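If you then bundle this with pyinstaller, it may help to keep the calls behind a main guard so the frozen exe has a single entry point; a minimal sketch using the same function names:
from data_for_report import f1
from report_gen import f2
from excel_style import f3

def main():
    # Run the three stages strictly in order; each function
    # returns before the next one starts.
    f1()
    f2()
    f3()

if __name__ == '__main__':
    main()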
My file structure looks like this:
runner.py
scripts/
something_a/
main.py
other_file.py
something_b/
main.py
anythingelse.py
something_c/
main.py
...
runner.py should look at all the folders in scripts/ and run the main.py located there.
Right now I'm achieving this through subprocess.check_output. It works, but some of these scripts take a long time to run and I don't get to see any progress; it prints everything after the process has finished.
I'm hoping to find a solution that allows for 2 things to be done somewhat easily:
1) Stream the output instead of getting it all at the end
2) Doesn't prohibit running multiple scripts at once
Is this possible? A lot of the solutions I've seen for running a Python script from another require knowledge of the other script's name/location. I can also enforce that all the main.py's have a specific function if that helps.
You could use Popen in a loop to launch each script and redirect its output to a log file. Then you can read from these files in real time, while each one is being populated. :)
Translating the output into a more readable format is a little trickier. You could create another script that reads these log files as they grow and decides how the information should be presented in an understandable manner.
""" Use the same command as you would do for check_output """
cmd = ''
for filename in scriptList:
log = filename + ".log"
with io.open(filename, mode=log) as out:
subprocess.Popen(cmd, stdout=out, stderr=out)
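If you would rather stream the output directly to the console instead of going through log files, a minimal sketch (directory layout as in the question; 'python' assumed to be on PATH) is to read each process's pipe line by line:
import os
import subprocess

scripts_dir = 'scripts'
for name in sorted(os.listdir(scripts_dir)):
    main_py = os.path.join(scripts_dir, name, 'main.py')
    if not os.path.isfile(main_py):
        continue
    proc = subprocess.Popen(
        ['python', main_py],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        universal_newlines=True,
    )
    # Print each line as soon as the child writes it. Note this runs
    # the scripts one at a time; to run several at once, start all
    # the processes first and drain their pipes from threads.
    for line in proc.stdout:
        print('[{}] {}'.format(name, line), end='')
    proc.wait()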
I want to run some command-line scripts from within my Python program. These scripts generate some output files. I want to grab those output files from the subprocess call as objects in my Python program, while preventing them from being written to disk. The problem is I don't know how to do it, or whether it is even possible.
A simple example would look like this:
#foo.py
fout1 = open("temp1.txt","w")
fout2 = open("temp2.txt","w")
fout1.write("fout1")
fout2.write("fout2")
fout1.close()
fout2.close()
#test.py
import subprocess
process = subprocess.Popen(["python","foo.py"], ????????) #what arguments to use to grab temp1.txt and temp2.txt
print(process.??????) #how to access those files
I am familiar with subprocess.Popen so that is what the example code uses, but I am open to the use of other modules too if they could do it.
Idea
Basically, what my script does is check C:/SOURCE for .txt files and add a timestamp to each one. To replicate it, you can simply create that folder and put some .txt files in it. Then the script is supposed to run a .vbs file, which in turn runs a .bat file with some rclone commands that don't matter here. I did it like this because no CMD window opens when the rclone command is run through the .vbs file.
Python code
import time, os, subprocess

while True:
    print("Beginning checkup")
    print("=================")
    timestamp = time.strftime('%d_%m_%H_%M')  # only underscores: no naming issues
    the_dir = "C:/SOURCE"
    for fname in os.listdir(the_dir):
        if fname.lower().endswith(".txt"):
            print("found " + fname)
            time.sleep(0.1)
            new_name = "{}-{}.txt".format(os.path.splitext(fname)[0], timestamp)
            os.rename(os.path.join(the_dir, fname), os.path.join(the_dir, new_name))
            time.sleep(0.5)
    else:
        subprocess.call(['cscript.exe', "copy.vbs"])
    time.sleep(60)
VBScript code
Set WshShell = CreateObject("WScript.Shell" )
WshShell.Run Chr(34) & "copy.bat" & Chr(34), 0
Set WshShell = Nothing
The only important part for the Python script is below the very last else, where the subprocess.call() is supposed to run the .vbs file. When I run the script, it shows the first two lines that always come up when running CMD, but then nothing happens.
How could I fix that? I tried:
subprocess.call("cscript copy.vbs")
subprocess.call("cmd /c copy.vbs")
both with the same outcome: it doesn't do anything.
Anyone have an idea?
Why are you invoking a VBScript to invoke a batch script from Python? You should be able to simply run whatever the batch script does directly from your Python code. But even if you want to keep the batch script, something like this should do just fine without VBScript as an intermediary:
subprocess.call(['cmd', '/c', 'copy.bat'])
You may want to give the full path of the batch file, though, to avoid issues like the working directory not being what you think it is.
If your batch script resides in the same directory as the Python script, you can build the path with something like this:
import os
import subprocess
scriptdir = os.path.dirname(__file__)
batchfile = os.path.join(scriptdir, 'copy.bat')
subprocess.call(['cmd', '/c', os.path.realpath(batchfile)])
It seems there is no operation here that could not be done in plain Python. Scanning a directory, copying a file: Python has it all in the standard library; see the os.path and shutil modules.
Adding VB scripts and launching subprocesses makes your code complex and difficult to debug.
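For illustration, a minimal pure-Python sketch of the same idea; the destination folder is hypothetical, standing in for whatever the rclone/batch step does:
import os
import shutil
import time

src_dir = "C:/SOURCE"
dst_dir = "C:/DEST"  # hypothetical stand-in for the rclone target

timestamp = time.strftime('%d_%m_%H_%M')
for fname in os.listdir(src_dir):
    if fname.lower().endswith(".txt"):
        stamped = "{}-{}.txt".format(os.path.splitext(fname)[0], timestamp)
        os.rename(os.path.join(src_dir, fname), os.path.join(src_dir, stamped))
        # shutil.copy2 also copies file metadata; no console window appears.
        shutil.copy2(os.path.join(src_dir, stamped), os.path.join(dst_dir, stamped))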