Run a Python script in Perl

I have two scripts, a Python script and a Perl script.
How can I make the Perl script run the Python script and then continue running itself?

Something like this should work:
system("python", "/my/script.py") == 0 or die "Python script returned error $?";
If you need to capture the output of the Python script, open it as a pipe for reading:
open(my $py, "-|", "python2 /my/script.py") or die "Cannot run Python script: $!";
while (<$py>) {
    # do something with each line of output
}
close($py);
Opening with "|-" instead gives you a handle you can print to, if you want to provide input to the subprocess.

The best way is to execute the Python script at the system level using IPC::Open3. This keeps things safer and more readable in your code than using system().
You can easily execute system commands, and read from and write to them, with IPC::Open3 like so:
use strict;
use IPC::Open3 ();
use IO::Handle ();  # not required, but good for portability

my $write_handle = IO::Handle->new();
my $read_handle  = IO::Handle->new();

my $pid = IPC::Open3::open3($write_handle, $read_handle, '>&STDERR',
                            $python_binary . ' ' . $python_file_path);
if (!$pid) { function_that_records_errors("Error"); }

# read multi-line data from the process:
local $/;  # slurp mode: read all remaining output at once
my $read_data = readline($read_handle);

# write to the python process:
print $write_handle 'Something to write to python process';

waitpid($pid, 0);  # wait for the child process to exit before continuing
This forks a separate process to run the Python code, so if the Python code fails you can recover and continue with your Perl program.

It may be simpler to run both scripts from a shell script and use pipes (assuming you're in a Unix environment) if you need to pass the results of one program to the other.
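If you would rather stay in one process, the same pipeline can be built from Python with subprocess. A sketch: the two one-liners below are stand-ins for the real scripts, so swap in e.g. ["perl", "script.pl"] and ["python", "script.py"] as needed:

```python
import subprocess
import sys

# stand-ins for the two scripts; replace with the real commands
producer = [sys.executable, "-c", "print('hello from producer')"]
consumer = [sys.executable, "-c", "import sys; print(sys.stdin.read().upper(), end='')"]

# connect producer's stdout to consumer's stdin, like "producer | consumer"
p1 = subprocess.Popen(producer, stdout=subprocess.PIPE)
p2 = subprocess.Popen(consumer, stdin=p1.stdout, stdout=subprocess.PIPE, text=True)
p1.stdout.close()          # so p1 sees a broken pipe if p2 exits early
out, _ = p2.communicate()
print(out, end="")         # HELLO FROM PRODUCER
```

This mirrors what the shell's `|` does, while keeping each process's exit status available in Python.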

Related

How to read flags on jenkins jobs

We have two scripts (one for Katalon and one for Python) that we want to launch from Jenkins.
First we want to launch Katalon and, at a certain point in the script, have Jenkins launch the Python script. Then, once the Python script has finished, Jenkins should tell Katalon that it can continue.
Current Jenkins pipeline code:
pipeline {
    agent any
    stages {
        stage('Unit Test') {
            steps {
                echo 'Hello Example'
                bat """./katalon -noSplash -runMode=console projectPath="/Users/mypc/project/proyect1/example.prj" -retry=0 -testSuitePath="Test Suites/IOS/TestSuiteAccount" -executionProfile="default" -deviceId="example" -browserType="iOS" """
                sleep 5
            }
        }
        stage('Unit Test2') {
            steps {
                echo 'Start second test'
                bat """python C:\\Users\\myPC\\Documents\\project\\project-katalon-code\\try_python.py"""
                sleep 5
            }
        }
    }
}
In pseudocode it would be the following:
Katalon script:
my_job()
call_jenkins_to_start_python()
if jenkins.python_flag == True
my_job_continue()
Pipeline Jenkins script:
Katalon.start()
if katalon_sent_signal_to_start_python == True
start_python_job()
if python_finished_job_signal == True
send_katalon_signal_to_continue()
Would reading/writing an external file be a good solution? I didn't find anything similar.
Thank you!
AFAIK, Jenkins starts a separate process for the bat() step and waits for it to finish. Communication between Jenkins and the bat process is not possible beyond that: you trigger the script, and you read the return value and, if needed, stdout.
Additionally, I do not know whether what you want is possible, because I do not know Katalon at all. It requires Katalon to wait for a result from the Python script and then, when that result reaches Katalon, resume its execution.
I recommend you first try the process without Jenkins: create a Windows script that does exactly what you want. If you can do that, you can then call that new script from Jenkins, giving input or reading output as needed (even, as you suggested, using files for that).
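If you do end up coordinating the two jobs through an external file, as the question suggests, each side can stay very small. A sketch in Python (the flag-file name, timeout, and poll interval are all made up for the example; the Katalon side would need the equivalent wait loop in its own scripting language):

```python
import os
import time

FLAG = "python_done.flag"   # hypothetical flag file both jobs agree on

def signal_done():
    # the Python job drops the flag file when it has finished its work
    with open(FLAG, "w") as f:
        f.write("done")

def wait_for_done(timeout=10.0, poll=0.2):
    # the other side polls until the flag appears, or gives up
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(FLAG):
            os.remove(FLAG)  # consume the signal so it fires only once
            return True
        time.sleep(poll)
    return False

signal_done()
print(wait_for_done())  # True
```

Polling a file is crude but robust across processes started independently by Jenkins; just make sure both jobs run with the same working directory or use an absolute path.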

Run script for send in-game Terraria server commands

This past week I installed a Terraria 1.3.5.3 server on Ubuntu 18.04, to play online with friends. The server should be powered on 24/7, without any GUI, accessed only over SSH on the internal LAN.
My friends asked me if there is a way for them to control the server, e.g. send a message, via the internal in-game chat, so I thought of using a special character ($) in front of the desired command ('$say something' or '$save', for instance) and a Python program that reads the terminal output via a pipe, interprets the command and sends it back with a bash command.
I followed these instructions to install the server:
https://www.linode.com/docs/game-servers/host-a-terraria-server-on-your-linode
And configured my router to forward a dedicated port to the Terraria server.
All is working fine, but I am really struggling to make Python send a command via the "terrariad" bash script described in the link above.
Here is the code I use to send a command, in Python:
import subprocess
subprocess.Popen("terrariad save", shell=True)
This works fine, but if I try to input a string with a space in it:
import subprocess
subprocess.Popen("terrariad \"say something\"", shell=True)
the command stops at the space character and outputs this on the terminal:
: say
Instead of the desired:
: say something
<Server>something
What could I do to solve this problem?
I have tried so many things, but I always get the same result.
P.S. If I send the command manually in the SSH PuTTY terminal, it works!
Edit 1:
I abandoned the Python solution; for now I'll try it with bash instead, which seems the more logical way to do this.
Edit 2:
I found that the "terrariad" script expects just one argument, but Popen is splitting my argument into two no matter which method I use, since my input string has a space character in the middle. Like this:
Expected:
terrariad "say\ something"
$1 = "say something"
But I get this from Python's Popen:
subprocess.Popen("terrariad \"say something\"", shell=True)
$1 = "say
$2 = something"
No matter whether I pass it as a list:
subprocess.Popen(["terrariad", "say something"])
$1 = "say
$2 = something"
Or use \ to escape the space character, it always splits the variable at a space.
Edit 3:
Looking at the bash script, I could understand what is going on when I send a command... Basically it uses the "stuff" command from the screen program to send characters to the terraria screen session:
screen -S terraria -X stuff $send
$send is a printf command:
send="`printf \"$*\r\"`"
And it seems that running the bash file from Python gives a different result than running it from the command line. How is this possible? Is this a bug or a bad implementation of the function?
Thanks!
I finally came up with a solution to this, using pipes instead of Popen.
It seems to me that Popen isn't the best way to run bash scripts, as described in "How to do multiple arguments with Python Popen?", the link SiHa sent in the comments (thanks!):
"However, using Python as a wrapper for many system commands is not really a good idea. At the very least, you should be breaking up your commands into separate Popens, so that non-zero exits can be handled adequately. In reality, this script seems like it'd be much better suited as a shell script."
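For what it's worth, when Python alone is involved, the list form does deliver a space-containing argument intact; the splitting here most likely happened later, inside the script's unquoted screen -X stuff $send. A quick check, with a Python one-liner standing in for terrariad:

```python
import subprocess
import sys

# the child reports how many arguments it received, then echoes the first one
child = [sys.executable, "-c",
         "import sys; print(len(sys.argv) - 1); print(sys.argv[1])"]

result = subprocess.run(child + ["say something"], capture_output=True, text=True)
print(result.stdout, end="")   # 1
                               # say something
```

So the argument reaches the child process as one string; anything that re-splits it happens inside the receiving script, which is why quoting "$send" there (or switching to a pipe, as below) fixes it.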
So I came up with a solution using a fifo file:
First, create a fifo to be used as a pipe, in the desired directory (for instance, /samba/terraria/config):
mkfifo cmdOutput
*/samba/terraria is a directory I created so I can easily edit the scripts and save/load maps on the server from another computer; it is shared with samba (https://linuxize.com/post/how-to-install-and-configure-samba-on-ubuntu-18-04/)
Then I created a Python script that reads the screen output and writes commands to the pipe file (I know, there are probably other ways to do this):
import os

outputFile = os.open("/samba/terraria/config/cmdOutput", os.O_WRONLY)
print("python script has started!")
while 1:
    line = input()
    print(line)
    cmdPosition = line.find("&")
    if cmdPosition != -1:
        cmd = slice(cmdPosition + 1, len(line))
        cmdText = line[cmd]
        os.write(outputFile, bytes(cmdText + "\r\r", 'utf-8'))
        os.write(outputFile, bytes("say Command executed!!!\r\r", 'utf-8'))
Then I edited the terraria.service file to run the server with its output piped into this script and its input redirected from the fifo, with errors redirected to another file:
ExecStart=/usr/bin/screen -dmS terraria /bin/bash -c "/opt/terraria/TerrariaServer.bin.x86_64 -config /samba/terraria/config/serverconfig.txt < /samba/terraria/config/cmdOutput 2>/samba/terraria/config/errorLog.txt | python3 /samba/terraria/scripts/allowCommands.py"
*/samba/terraria/scripts/allowCommands.py is where my script lives.
**/samba/terraria/config/errorLog.txt saves a log of errors to a file.
Now I can send commands like 'noon' or 'dawn' to change the in-game time, save the world and back it up with the samba server before boss fights, do other stuff when I have some time XD, and keep the terminal showing what is going on with the server.

How do I terminate a shell program through python

Let's say I do something like this:
import os
os.system('java some_program.jar')
Is there a way to stop the execution of that program through python?
My situation:
I have a Java program that does some stuff and inserts data into a .csv file, and I need to run it through Python (because I'm using Python to handle the data in the .csv file). But the program doesn't stop by itself, so I need a way to stop it manually once it has inserted the data into the .csv file.
Don't use os.system.
Instead, use p = subprocess.Popen(...). Then simply call p.kill().
Also, your Java program should be updated to exit when it sees EOF on stdin.
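A minimal runnable sketch of that Popen/kill pattern (a sleeping Python child stands in here for java some_program.jar, so the example doesn't need Java installed):

```python
import subprocess
import sys

# stand-in for the Java program: a child that would otherwise run forever
p = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(3600)"])

# ... once you've detected that the .csv has the data you need:
p.kill()    # forceful stop; p.terminate() asks politely first
p.wait()    # reap the child so it doesn't linger as a zombie
```

On POSIX systems kill() sends SIGKILL and terminate() sends SIGTERM; on Windows both call TerminateProcess.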
You could have the Java program print a line to the console when it has finished writing to the CSV file, watch for that using the subprocess library's check_output function, and when it appears kill the program with something like os.system("taskkill /im java.exe") (taskkill matches the executable's image name, which is java.exe, not the jar).

Running two different Python scripts with different paths using a batch script

I need to execute two Python scripts, script1.py (path: dir1) and script2.py (path: dir2), in a loop. In order to run these two scripts I need to supply the Python path. Earlier I used to set the path manually and execute each script. Since I need to execute the scripts in a loop, how can I create a batch file that executes one script and, after it is done, executes the other?
I am a newbie with batch scripts.
Thanks
You can use a Python script that calls the subprocess.check_call() method.
It takes a list of strings making up the command to be called.
You then wrap each call in a try/except to check whether the first script executes without errors (check_call raises CalledProcessError on a non-zero exit) and, if so, call the next (optionally with a try as well).
# batch.py
import subprocess
import sys

try:
    subprocess.check_call(['python3', 'path/to/script1.py'])
except Exception as e:
    print('Error: ', e)  # alternatively add your logger here
    sys.exit(1)

try:
    subprocess.check_call(['python3', 'path/to/script2.py'])
except Exception as e:
    print('Error: ', e)  # alternatively add your logger here
    sys.exit(1)
You would then execute this using python3 path/to/batch.py from the CLI.
Note that if you're using Python 2.7 (which you shouldn't unless absolutely necessary), you have to alter the code to call python instead of python3.
For more references and information on the subprocess module check out the python documentation and this answer
You can create a shell script for this (which runs the pair 10 times):
#!/bin/bash
for i in `seq 1 10`;
do
    echo "Running Script 1"
    python Script1.py <path as argument>
    echo "script 1 completed"
    echo "Running Script 2"
    python Script2.py <path as argument>
    echo "script 2 completed"
done
Please clarify if your need is something else.

Cannot Launch Interactive Program While Piping to Script in Python

I have a Python script that needs to call the defined $EDITOR or $VISUAL. When the Python script is called alone, I am able to launch the $EDITOR without a hitch, but the moment I pipe something to the Python script, the $EDITOR fails to launch. Right now I am using nano, which shows
Received SIGHUP or SIGTERM
every time. It appears to be the same issue described here.
sinister:Programming [1313]$ echo "import os;os.system('nano')" > "sample.py"
sinister:Programming [1314]$ python sample.py
# nano is successfully launched here.
sinister:Programming [1315]$ echo "It dies here." | python sample.py
Received SIGHUP or SIGTERM
Buffer written to nano.save.1
EDIT: Clarification; inside the program, I am not piping to the editor. The code is as follows:
editorprocess = subprocess.Popen([editor or "vi", temppath])
editorreturncode = os.waitpid(editorprocess.pid, 0)[1]
When you pipe something to a process, the pipe is connected to that process's standard input, which means your terminal input won't be connected to the editor. Most editors also check whether their standard input is a terminal (isatty), which a pipe isn't; if it isn't a terminal, they refuse to start. In the case of nano, this appears to cause it to exit with the message you included:
% echo | nano
Received SIGHUP or SIGTERM
You'll need to provide the input to your Python script in another way, such as via a file, if you want to be able to pass its standard input to a terminal-based editor.
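You can observe the check the editors perform yourself: sys.stdin.isatty() is what changes when input is piped. A small demonstration (a Python one-liner plays the role of the editor's startup check):

```python
import subprocess
import sys

# child process that reports whether its stdin is a terminal
check = [sys.executable, "-c", "import sys; print(sys.stdin.isatty())"]

# with stdin connected to a pipe rather than the terminal, isatty() is False;
# this is exactly the condition nano notices before bailing out
result = subprocess.run(check, input="", capture_output=True, text=True)
print(result.stdout, end="")  # False
```

Run the same one-liner directly from an interactive shell (no pipe, no redirect) and it prints True instead.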
Now you've clarified your question, that you don't want the Python process's stdin attached to the editor, you can modify your code as follows:
editorprocess = subprocess.Popen([editor or "vi", temppath],
                                 stdin=open('/dev/tty', 'r'))
The specific case of find -type f | vidir - is handled in vidir's Perl source like this:
foreach my $item (@ARGV) {
    if ($item eq "-") {
        push @dir, map { chomp; $_ } <STDIN>;
        close STDIN;
        open(STDIN, "/dev/tty") || die "reopen: $!\n";
    }
}
You can re-create this behavior in Python, as well:
#!/usr/bin/python
import os
import sys
sys.stdin.close()
o = os.open("/dev/tty", os.O_RDONLY)
os.dup2(o, 0)
os.system('vim')
Of course, it closes the standard input file descriptor, so if you intend on reading from it again after starting the editor, you should probably duplicate its file descriptor before closing it.
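That duplicate-then-restore dance looks like this; an os.pipe() stands in for /dev/tty here so the sketch doesn't need a real terminal attached:

```python
import os

saved = os.dup(0)            # keep a copy of the original stdin descriptor

r, w = os.pipe()             # stand-in for os.open("/dev/tty", os.O_RDONLY)
os.dup2(r, 0)                # fd 0 now reads from the pipe
os.close(r)

os.write(w, b"typed at the terminal\n")
os.close(w)
line = os.read(0, 1024)      # the editor would be reading fd 0 here

os.dup2(saved, 0)            # restore the original stdin afterwards
os.close(saved)
print(line.decode(), end="") # typed at the terminal
```

After the final dup2, anything still buffered on the original standard input can be read as before the swap.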
