I have a pipeline job with a credential parameter (user name and password), and also a Groovy file that runs a shell script, which in turn triggers a Python file.
How can I pass those credentials into the environment so the Python script can read them with os.getenv?
Groovy file code:
def call() {
    final fileContent = libraryResource('com/amdocs/python_distribution_util/main.py')
    writeFile file: 'main.py', text: fileContent
    sh "python main.py"
}
I know the pipeline syntax should look something like this:
withCredentials([usernamePassword(credentialsId: '*****', passwordVariable: 'ARTIFACTORY_SERVICE_ID_PW', usernameVariable: 'ARTIFACTORY_SERVICE_ID_UN')]) {
// some block
}
What is the correct way of doing it?
Correct syntax:
def call() {
    def DISTRIBUTION_CREDENTIAL_ID = params.DISTRIBUTION_CREDENTIAL_ID
    // Pass the credentials ID as a Groovy expression; a single-quoted '${...}' string is not interpolated by Groovy.
    withCredentials([usernamePassword(credentialsId: DISTRIBUTION_CREDENTIAL_ID, passwordVariable: 'ARTIFACTORY_SERVICE_ID_PW', usernameVariable: 'ARTIFACTORY_SERVICE_ID_UN')]) {
        sh '''
            export ARTIFACTORY_SERVICE_ID_UN=${ARTIFACTORY_SERVICE_ID_UN}
            export ARTIFACTORY_SERVICE_ID_PW=${ARTIFACTORY_SERVICE_ID_PW}
            python main.py
        '''
    }
}
Note that the Python script has to run inside the same sh step: the exported variables disappear when that shell exits. In a single-quoted sh ''' block the ${...} references are expanded by the shell, not by Groovy, which is what you want for secrets.
and then you can pull the values in a config.py file:
import os
ARTIFACTORY_SERVICE_ID_UN = os.getenv('ARTIFACTORY_SERVICE_ID_UN')
ARTIFACTORY_SERVICE_ID_PW = os.getenv('ARTIFACTORY_SERVICE_ID_PW')
Related
I have a Jenkins pipeline with 2 stages:
pipeline {
    agent { label 'master' }
    stages {
        stage('stage 1') {
            steps {
                sh "python3 script.py" // this script returns the value {"status": "ok"}
            }
        }
        stage('stage 2') {
            steps {
                // Do something with the response JSON from script.py
            }
        }
    }
}
The issue is that I cannot do it. What I have tried:
Passing the result to an environment variable, but for some reason Jenkins didn't recognize it. Maybe I did something wrong here?
Parsing the stdout of script.py is not an option, because the script prints a lot of logs.
The only option left is to create a file to store that JSON and then read that file in the next stage, but it's ugly.
Any ideas?
In the end we chose to create files and store the messages there.
We pass the file name as an argument to the Python script and log everything there.
We then remove the whole workspace after the job succeeds.
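A minimal sketch of that file-based handoff on the Python side, assuming script.py receives the result-file path as its first argument (the argument convention and file name are illustrative, not the actual script):

```python
import json
import sys

def main(result_path: str) -> None:
    # Noisy progress logs can go to stdout freely;
    # only the file carries the machine-readable result.
    print("starting work...")
    result = {"status": "ok"}
    with open(result_path, "w") as f:
        json.dump(result, f)
    print("done")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "result.json")
```

The next stage then reads and parses that file (for example with the Pipeline Utility Steps readJSON step) instead of scraping stdout.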
I have a script file (run_edr.py) on my local machine, and when I run it from "cmd" with the following command, the script works perfectly. The script takes a few parameters: the first is an input document folder path and the second is the output folder path to store the output documents.
My python command:
python run_edr.py -input_path "C:\Users\aslamm5165\Downloads\EDRCODE_ArgParser\files\EDR" -output_path "C:\Users\aslamm5165\Downloads\test" -site_name "a" -site_address "b" -site_city "c" -site_county "d" -site_state "e" -site_type "1"
I have tried like below, but it's not working. Where did I go wrong?
ScriptRuntimeSetup setup = Python.CreateRuntimeSetup(null);
ScriptRuntime runtime = new ScriptRuntime(setup);
ScriptEngine engine = Python.GetEngine(runtime);
ScriptSource source = engine.CreateScriptSourceFromFile(@"C:\Users\aslamm5165\Downloads\EDRCODE_ArgParser\run_edr.py");
ScriptScope scope = engine.CreateScope();
List<String> argv = new List<String>();
//Do some stuff and fill argv
argv.Add("python" + @" C:\Users\aslamm5165\Downloads\EDRCODE_ArgParser\run_edr.py -input_path" + @"C:\Users\aslamm5165\Downloads\EDRCODE_ArgParser\files\EDR");
argv.Add("-output_path" + @"C:\Users\aslamm5165\Downloads\test");
argv.Add("-site_name 'a' -site_address 'b' -site_city 'c' -site_county 'd' -site_state 'e' -site_type '1'");
engine.GetSysModule().SetVariable("argv", argv);
source.Execute(scope);
I have tried with a system Process as well, as shown below: there is no error in the code, but the script is not getting executed. I don't know the correct way of doing this, but I want to start my script from my .NET Core application.
ProcessStartInfo start = new ProcessStartInfo();
start.FileName = @"cmd.exe";
start.Arguments = string.Format("python run_edr.py -input_path {0} -output_path {1} -site_name 'a' -site_address 'b' -site_city 'c' -site_county 'd' -site_state 'e' -site_type '1'", @"C:\Users\aslamm5165\Downloads\EDRCODE_ArgParser\files\EDR", @"C:\Users\aslamm5165\Downloads\test");
start.UseShellExecute = true;
start.RedirectStandardOutput = false;
start.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden;
Process.Start(start);
I solved a similar problem (running Python scripts from .NET Core 3.1) by changing the executable from cmd.exe on Windows (or /bin/bash on Linux) to a batch script (Windows) or shell script (Linux) file. Here's my approach:
1. For Windows, create a run.bat file which includes python.exe and %* to pass all arguments through to it:
C:\YOUR_PYTHON_PATH\python.exe %*
2. For Linux, create a run.sh file to execute python with the arguments:
#!/bin/bash
/usr/bin/python3 "$@"
3. Use Process and ProcessStartInfo (your second approach):
string fileName = null;
if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
{
    fileName = "path_to_bat/run.bat";
}
else
{
    fileName = "path_to_bat/run.sh";
}
ProcessStartInfo start = new ProcessStartInfo
{
    FileName = fileName,
    Arguments = string.Format("\"{0}\" \"{1}\"", script, args),
    UseShellExecute = false,
    CreateNoWindow = true,
    RedirectStandardOutput = true,
    RedirectStandardError = true
};
using Process process = Process.Start(start);
On Linux the .NET code is the same as on Windows, except that FileName should be the shell script's name with its path.
I have a web interface built with Spring, and I want to execute the command "python file.py" from it.
The main problem is that file.py contains a Pyomo model that is supposed to produce some output. I can execute a Python script if it's a simple print or something similar, but the Pyomo model is completely ignored.
What could be the reason?
Here is the code I wrote in the controller to execute the call:
@PostMapping("/execute")
public void execute(@ModelAttribute("component") @Valid Component component, BindingResult result, Model model) {
    Process process = null;
    //System.out.println("starting!");
    try {
        process = Runtime.getRuntime().exec("python /home/chiara/Documents/GitHub/Pyomo/Solver/test/sample.py");
        //System.out.println("here!");
    } catch (Exception e) {
        System.out.println("Exception Raised" + e.toString());
    }
    InputStream stdout = process.getInputStream();
    BufferedReader reader = new BufferedReader(new InputStreamReader(stdout, StandardCharsets.UTF_8));
    String line;
    try {
        while ((line = reader.readLine()) != null) {
            System.out.println("stdout: " + line);
        }
    } catch (IOException e) {
        System.out.println("Exception in reading output" + e.toString());
    }
}
Update: I found that what I was missing was checking the directory the code runs in. Be sure to do so, and if necessary move the input files (if you have any) into the directory where Python is executing; otherwise the script can't find and process them.
You can use
import os
cwd = os.getcwd()
to check the current working directory of a process.
Another possibility is to redirect stderr to the terminal or to a log file, because from the server terminal you won't see anything, even when there are errors.
The code posted in the question is the correct way to invoke an external command from Java.
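The same two checks, sketched in Python for illustration (the working directory and the child command here are just examples):

```python
import subprocess
import sys
import tempfile

# Launch a child process with an explicit working directory and capture
# its stderr, so errors surface instead of silently disappearing.
workdir = tempfile.gettempdir()
result = subprocess.run(
    [sys.executable, "-c",
     "import os, sys; print(os.getcwd()); print('oops', file=sys.stderr)"],
    cwd=workdir,
    capture_output=True,
    text=True,
)
print("stdout:", result.stdout.strip())
print("stderr:", result.stderr.strip())
```

In Java the equivalent would be setting the working directory on ProcessBuilder and reading the process's error stream alongside its output stream.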
I have a set of command-line tools that I implement as bash functions, e.g.:
function sf
{
    sftp $(svr $1)
}
where svr is another function that translates a short name to a fully qualified domain name. I got the idea to convert this function:
function irpt
{
    ~/tools/icinga_report $*
}
to something like:
function irpt
{
    python <<!
import requests
...lots of python stuff...
!
}
This works perfectly, except for one thing: I need to add the parameters somewhere, but I can't see where. I have tried to enclose the whole python block in { }, but it doesn't work:
function irpt
{
    python <<!
import requests
...lots of python stuff...
!
} $*
The shell doesn't accept the definition:
-bash: /etc/profile: line 152: syntax error near unexpected token `$*'
-bash: /etc/profile: line 152: `} $*'
Is there any way to implement this?
===EDIT===
Inspired by the answer I accepted, this is what I did, maybe it is useful for others:
function irpt
{
    python <<!
import requests
param="$*".split(' ')
...lots of python stuff...
!
}
This works beautifully.
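For illustration, after bash substitutes "$*" into the here-document, the code Python actually receives for a call like irpt a b is effectively:

```python
# What Python sees once bash has expanded "$*" for `irpt a b`:
# the shell pastes the joined arguments into the source text.
param = "a b".split(' ')
print(param)  # prints ['a', 'b']
```

Note that this simple splitting breaks down if an argument itself contains spaces or quote characters, which is fine for the short flag-style arguments used here.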
One way:
function irpt
{
    python <<!
import requests
v1='$1'
print(v1)
!
}
Running the function:
$ irpt hello
hello
$
It looks a little weird, but you can use bash's process substitution syntax, <(command), to provide a script file dynamically (in reality, a named pipe or /dev/fd entry); the rest follows.
function demo {
    python <(echo 'import sys; print(sys.argv)') "$@"
}
You can use something like this:
foo() {
    cmd=$(cat <<EOF
print("$1")
EOF
)
    python -c "$cmd"
}
Alternatively (quoting the command substitution so the generated code survives word splitting):
foo() {
    python -c "$(cat <<EOF
print("$1")
EOF
)"
}
And then use the function like
foo test
I have the following build system for Postgres:
{
    "cmd": ["psql", "-U", "postgres", "-d", "test", "-o", "c:/app/sql/result.txt", "-f", "$file"]
}
It works fine: it executes the current file and sends the results to the file c:/app/sql/result.txt.
I want to modify it to automatically save the current selection to a file and run psql on that file. Can this be done in a build system?
As my learning has borne fruit, let me answer my own question. This simple plugin saves the selected text to a file and calls the build system:
import sublime, sublime_plugin

class ExecuteSelectedSqlCommand(sublime_plugin.TextCommand):
    def run(self, edit):
        for region in self.view.sel():
            if not region.empty():
                with open('/a/temporary/file', 'w') as f:
                    f.write(self.view.substr(region))
                self.view.window().run_command('build')
                break