Running a Perl script and entering commands into it from Python - python

I'm a week into learning Python and am trying to write a piece of code that allows me to run a text-based Perl script in LXTerminal automatically. I have a couple of questions regarding some specifics.
I need my code to start the Perl script with a user-inputted environment file, enter a few specific settings into the Perl script, and then read in many .txt files, one at a time, into the Perl script. It also needs to restart the process for every single .txt file and capture each individual output (it would help if every output could be written to a single .csv file).
To call the Perl script, I'm starting with the following:
import subprocess
alphamelts = "/home/melts/Desktop/alphamelts"
pipe = subprocess.Popen(
    ["perl", alphamelts + "/run_alphamelts.command",
     "-f", raw_input("Enter an environment file: ")],
    stdout=subprocess.PIPE)
Assuming that's correct, I now need it to read in a .txt file, enter number-based commands, have my code wait for the Perl script to finish its calculations, and then write the output to a .csv file. If it helps, the Perl script I'm running automatically generates a space-delimited file containing the results of its calculations once the program exits, but it would be super helpful if only a few of its outputs were written to a single separate .csv file for each .txt file processed.
No idea where to go from here but I absolutely have to get this working. Sorry for the complexity.
Thank you!

You can do some really cool stuff in IPython. Check out this notebook for some specific examples. As far as waiting for a subprocess to finish, I think you need to put a pause in your script. Also, for data handling and export to CSV and Excel, I'd recommend pandas.
Just something to get you started.

Related

Safe way to view json currently being written by Python code

I have a script I'm running a bunch of times that generates and logs data in JSON files. These take days to run and I need to run several dozen test cases. I log progress in JSON files for post-processing. I'd like to check in occasionally to see how long it has left. This is all single-threaded, but I've dealt with multiprocessing enough to be scared of opening the file while it's being written, for fear that viewing it will place a temporary lock on the file.
Is it safe to view the json in a linux terminal using nano log_file.json while my Python scripts are running and could attempt to write to the log at any time?
If it is not safe, are there any alternatives?
I'm worried if Python tries to record an entry that it could be lost or throw an error while I'm viewing progress. Viewing only, no saving obviously. I'd love to check in on progress to switch between test cases faster, but I really don't want to raise an error that loses days of progress if it's unable to write to the json.
Sorry if this is a duplicate, I tried searching but I'm not sure what to even search for this question.
You can use the tail command in a terminal to view the logs. The full command is:
tail -F <path_to_file>
It will show the last few lines of the file and keep displaying new data as it is written to the file.
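If you want belt-and-braces safety on the writing side too, one option (a sketch, assuming you can modify the logging code) is to write each update to a temporary file and atomically rename it over the log, so a reader never sees a half-written JSON document:

```python
import json
import os
import tempfile

def atomic_write_json(path, data):
    """Write JSON to a temp file, then rename it over the target.

    os.replace is atomic on POSIX, so a reader opening `path` sees
    either the old complete file or the new complete file, never a
    partially written one.
    """
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
        os.replace(tmp, path)
    except Exception:
        os.unlink(tmp)  # clean up the temp file on failure
        raise
```

With this in place, viewing the file with tail or nano is safe in both directions: the reader never locks the file, and the writer never exposes a truncated document.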

How to automatically run the python script when a new csv file appears

I need a way to run an existing Python script every time a new CSV file appears in a folder. The script uses this CSV file, which will always have the same name; the file is updated from time to time, and the script should then run automatically.
I am not an advanced Python user, so I can't work this out myself, and a solution would make my work much easier. I'd appreciate tips on how to do it, or a pointer to a tutorial or some code.
Have a nice day everyone :)

How to send command in separate python window

Searching isn't pulling up anything useful so perhaps my verbiage is wrong.
I have a Python application that I didn't write, which takes user input and performs tasks based on that input. The other script, which I did write, watches serial traffic for a specific match condition. Both scripts run in different windows. What I want is: when my script detects a match condition, send a command to the other script. Is there a way to do this with Python? I am working in Windows and want to send the output to a different window.
Since you can start the script within your script, you can just follow the instructions in this link: Read from the terminal in Python
old answer:
I assume you can modify the code in the application you didn't write. If so, you can tell the code to "print" what it's putting on the window to a file, and your other code could constantly monitor that file.

How to run multiple files successively in a single process using python

I am working in Windows and just learning to use Python (Python 2.7).
I have a bunch of script files ("file1.script", "file2.script", "file3.script"....) that are executed in TheProgram.exe. Python has already given me the ability to automatically create these script files, but now I want to successively run each of these script files, back-to-back, in TheProgram.exe.
So far I have figured out how to use the subprocess module in python to start "TheProgram.exe" in a new process (child process?) and load the first script file as follows:
my_process = subprocess.Popen(["Path to TheProgram.exe", "Path to File1.script"])
As seen, simply "opening" the script file in TheProgram.exe (passing it as an argument, in this case) will execute it. Once File1.script is done, TheProgram.exe generates an output file and then just sits there; it does not terminate. This is what I want, because now I would like to load File2.script in the same process without terminating (File2.script depends on File1.script completing successfully), then File3.script, etc.
Is this possible? And if so how? I cannot seem to find any documentation or anyone else who has had this problem. If I can provide other information please let me know, I am also new to posting to these forums. Thanks so much for any assistance.

Back end process in windows

I need to run a Python program in the background. I give the script one input file, and the code processes that file and creates a new output file. If I change the input file's contents, I don't want to run the code again by hand; it should run continuously in the background and regenerate the output file. Please let me know if you know the answer to this.
Thank you!
Basically, you have to set up a so-called file watcher, i.e. some mechanism that watches for changes to a file.
There are several techniques for watching file/directory changes in python. Have a look at this question: Monitoring contents of files/directories?. Another link is here, this is about directory changes but file changes are handled in a similar way. You could also google for "watch file changes python" in order to get a lot of answers :)
Note: if you're programming on Windows, you should probably implement your program as a Windows service; look here for how to do that.
