Command line parameters for an inline python script? - python

I have a set of command-line tools that I implement as bash functions, e.g.:
function sf
{
sftp $(svr $1)
}
where svr is another function that translates a short name to a fully qualified domain name. I got the idea that I wanted to convert this function:
function irpt
{
~/tools/icinga_report $*
}
to something like:
function irpt
{
python <<!
import requests
...lots of python stuff...
!
}
This works perfectly, except for one thing: I need to add the parameters somewhere, but I can't see where. I have tried to enclose the whole python block in { }, but it doesn't work:
function irpt
{
python <<!
import requests
...lots of python stuff...
!
} $*
The shell doesn't accept the definition:
-bash: /etc/profile: line 152: syntax error near unexpected token `$*'
-bash: /etc/profile: line 152: `} $*'
Is there any way to implement this?
===EDIT===
Inspired by the answer I accepted, this is what I did, maybe it is useful for others:
function irpt
{
python <<!
import requests
param="$*".split(' ')
...lots of python stuff...
!
}
This works beautifully.
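For example, with two placeholder arguments the shell expands $* before Python ever runs, so the line inside the heredoc becomes an ordinary string split (a sketch of what Python ends up executing):

$ irpt host1 host2
# Python effectively runs: param = "host1 host2".split(' ')
# i.e. param == ['host1', 'host2']

One caveat: an argument that itself contains a space or a double quote is spliced straight into the Python source, so it would still be mangled.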

One way:
function irpt
{
python <<!
import requests
v1='$1'
print(v1)
!
}
Running the function:
$ irpt hello
hello
$
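The same substitution extends to further positional parameters; a sketch (the function name and the second parameter are just for illustration):

function irpt2
{
python <<!
# both values below were substituted by the shell before Python parsed this script
v1='$1'
v2='$2'
print(v1)
print(v2)
!
}

$ irpt2 hello world
hello
world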

It looks a little weird, but you can use the bash <(command) syntax to provide a script file dynamically (in reality, a named pipe); the rest follows.
function demo {
python <(echo 'import sys; print(sys.argv)') "$@"
}
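Calling it shows the arguments arriving in sys.argv; argv[0] is the named pipe created by the process substitution (the exact /dev/fd number varies):

$ demo one two
['/dev/fd/63', 'one', 'two']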

You can use something like this
foo() {
cmd=$(cat <<EOF
print("$1")
EOF
)
python -c "$cmd"
}
Alternatively,
foo() {
python -c "$(cat <<EOF
print("$1")
EOF
)"
}
And then use the function like
foo test
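One caveat with both variants: the argument is pasted into the Python source, so a $1 containing a double quote breaks the print("$1") line. A sketch of a variant that keeps the script text fixed and hands the argument over via sys.argv instead (with -c, sys.argv[0] is '-c' and the remaining shell arguments follow it):

foo() {
    # the argument never touches the Python source text, so quotes,
    # spaces and backslashes in it pass through untouched
    python -c 'import sys; print(sys.argv[1])' "$1"
}

foo 'he said "test"'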

Related

How to run a Python script from PHP

I need to run a Python script from PHP. Here is my try:
<?php
if (isset($_POST['action'])) {
switch ($_POST['action']) {
case 'Run_Python':
Run_Python();
break;
case 'Run_Python2':
Run_Python2();
break;
}
}
function Run_Python() {
$output = shell_exec("/root/anaconda3/envs/py36/bin/python3 /home/scripts/test.py");
echo $output;
exit;
}
}
This action is triggered by a button, but I am not getting any output from this.
Note:
Here is a separate bash script that is used to run a set of Python scripts. This works fine. Am I doing anything wrong with the libraries?
#!/bin/bash
cd /home/scripts/
export PATH="/root/anaconda3/envs/py36/bin:/usr/share/Modules/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin"
export TEMP=/home/svradmin/tmp
python3 test.py
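shell_exec() returns only the command's stdout, so if the Anaconda interpreter fails to start under the web server's user, the traceback goes to stderr and $output stays empty. One way to narrow this down (a sketch; the wrapper name is made up): reuse the environment setup from the working bash script above, merge stderr into stdout, and point shell_exec() at the wrapper instead of calling python3 directly.

#!/bin/bash
# run_test.sh - hypothetical wrapper invoked from PHP via shell_exec()
cd /home/scripts/
export PATH="/root/anaconda3/envs/py36/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin"
export TEMP=/home/svradmin/tmp
# 2>&1 makes any Python error visible to shell_exec(), which captures stdout only
exec python3 test.py 2>&1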

Return result of python script in Jenkins pipeline

I have a Jenkins pipeline where I have 2 stages:
pipeline {
agent { label 'master' }
stages {
stage ('stage 1') {
steps {
sh "python3 script.py" //this script returns value {"status": "ok"}
}
}
stage ('stage 2') {
// Do something with the response JSON from script.py
}
}
}
The issue is that I cannot do it. What I have tried:
Pass the result to an environment variable, but for some reason Jenkins didn't recognize it. Maybe I did something wrong here?
Playing with and parsing the stdout of script.py is not an option, because this script prints a lot of logs
The only option which is left is to create a file to store that JSON and then to read that file in the next step, but it's ugly.
Any ideas?
In the end we chose to create files and store the messages there.
We pass the file name as an argument to python script and log everything.
We then remove all the workspace after the job succeeds.
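A minimal sketch of that hand-off at the shell level (the result file name and the way script.py accepts it are assumptions; in the pipeline each half sits inside the sh step of its stage):

# stage 1: have script.py write its JSON result to the file passed as an argument
python3 script.py result.json

# stage 2: read the status back from the file instead of parsing the noisy stdout
status=$(python3 -c 'import json, sys; print(json.load(open(sys.argv[1]))["status"])' result.json)
echo "status reported by script.py: ${status}"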

Run python from laravel

I'm trying to run a Python script from Laravel, with the Symfony Process component, like this:
My controller :
public function access(Request $request)
{
$process = new Process(['test.py', 'helloworld']);
$process->run();
dd($process->getOutput());
}
Python script :
import sys
x = sys.argv[1]
print(x)
But all I get in dd is ""
What do you think is the problem?
Not an answer, just a tip but too long to fit as a comment:
Instead of dumping only the standard output, you can dump as well other useful information:
$process = new Process(['/usr/bin/python', '/my/full/path/test.py', 'helloworld']);
$process->run();
echo "Output:\n";
dump($process->getOutput());
echo "Error:\n";
dump($process->getErrorOutput());
echo "Exit code: " . $process->getExitCode() . "\n";
die;
Symfony Process kept throwing a Fatal Python error: _Py_HashRandomization_Init, so I used shell_exec() instead.
<?php
$var = "Hello World";
$command = escapeshellcmd('path_to_python_script.py '.$var);
$output = shell_exec($command);
echo $output;
?>
You can also send a file instead of a variable and have it handled from Python. If you want to get runtime inputs, check out Brython or Skulpt.

How to pass pipeline credential parameter to python script as env variable

I have a pipeline job with a credential parameter (user name and password), and also a Groovy file that runs a shell script which triggers a Python file.
How can I pass those parameters into the environment so the Python script can read them with os.getenv?
Groovy file code:
def call() {
final fileContent = libraryResource('com/amdocs/python_distribution_util/main.py')
writeFile file: 'main.py', text: fileContent
sh "python main.py"}
I know the pipeline syntax should look something like this:
withCredentials([usernamePassword(credentialsId: '*****', passwordVariable: 'ARTIFACTORY_SERVICE_ID_PW', usernameVariable: 'ARTIFACTORY_SERVICE_ID_UN')]) {
// some block
}
What is the correct way of doing it?
Correct syntax:
def call() {
def DISTRIBUTION_CREDENTIAL_ID = params.DISTRIBUTION_CREDENTIAL_ID
withCredentials([usernamePassword(credentialsId: "${DISTRIBUTION_CREDENTIAL_ID}", passwordVariable: 'ARTIFACTORY_SERVICE_ID_PW', usernameVariable: 'ARTIFACTORY_SERVICE_ID_UN')]) {
sh '''
export ARTIFACTORY_SERVICE_ID_UN=${ARTIFACTORY_SERVICE_ID_UN}
export ARTIFACTORY_SERVICE_ID_PW=${ARTIFACTORY_SERVICE_ID_PW}
python main.py
'''
}
}
and then you can use a config.py file to pull the values using:
import os
ARTIFACTORY_SERVICE_ID_UN = os.getenv('ARTIFACTORY_SERVICE_ID_UN')
ARTIFACTORY_SERVICE_ID_PW = os.getenv('ARTIFACTORY_SERVICE_ID_PW')

Passing arguments to an embedded python script in bash

I'm having difficulties passing arguments to an embedded Python script.
#!/bin/bash
function my_function() {
MYPARSER="$1" python - <<END
<<Some Python Code>>
class MyParser(OptionParser):
def format_epilog(self, formatter):
return self.epilog
parser=MyParser(version=VER, usage=USAGE, epilog=DESC)
parser.add_option("-s", "--Startdir", dest="StartDir",
metavar="StartDir"
)
parser.add_option("-r", "--report", dest="ReportDir",
metavar="ReportDir"
)
<<More Python Code>>
END
}
foo="-s /mnt/folder -r /storagefolder/"
my_function "$foo"
I've read Steve's Blog: Embedding python in bash scripts, which helped, but I'm still unable to pass the arguments. I've tried both parser and myparser as environment variables.
Is it as simple as defining $2 and passing them individually?
Thanks
You're overcomplicating this rather a lot. Why mess with a parser when you can simply do this?
value="hello" python -c 'import os; print os.environ["value"]'
Or, for a longer script:
value="hello" python <<'EOF'
import os
print(os.environ["value"])
EOF
If you need to set sys.argv for compatibility with existing code:
python - first second <<<'import sys; print(sys.argv)'
Thus:
args=( -s /mnt/folder -r /storagefolder/ )
python - "${args[#]}" <<'EOF'
import sys
print(sys.argv)  # this is what an OptionParser will be looking at by default.
EOF
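Putting this together for the original function: forward the shell arguments after the -, and let OptionParser read them from sys.argv (a sketch; the option definitions are trimmed to the two from the question, and the arguments are passed as separate words rather than as one quoted string):

#!/bin/bash
function my_function() {
    python - "$@" <<'END'
import sys
from optparse import OptionParser

# parse_args() reads sys.argv[1:] by default, which now holds the
# arguments forwarded after the "-"
parser = OptionParser()
parser.add_option("-s", "--Startdir", dest="StartDir", metavar="StartDir")
parser.add_option("-r", "--report", dest="ReportDir", metavar="ReportDir")
options, args = parser.parse_args()
print(options.StartDir)
print(options.ReportDir)
END
}

my_function -s /mnt/folder -r /storagefolder/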
