I am trying to use subprocess in my python script to open Julia and then run a script.
To run on my machine, I enter this in terminal:
$ julia
$ include(test.jl); func("in.csv", "out.csv")
How do I replicate this process and chain both of these commands so that I can run from subprocess in a single call?
I've tried julia; include(test.jl); func("in.csv", "out.csv") and julia && include(test.jl) && func("in.csv", "out.csv")
but both result in
-bash: syntax error near unexpected token `"test.jl"`
The key here is that you're not really chaining two commands from the standpoint of Python's subprocess. There's just one command: julia. You want to pass a somewhat complicated argument to Julia that will execute multiple Julia expressions.
In short, you just want to do:
subprocess.run(['julia','-e','include("test.jl"); func("in.csv", "out.csv")'])
What's happening here is that you're executing just one subprocess, julia, started with the -e command-line flag, which evaluates the string that follows as Julia code. You can optionally use the capitalized -E flag instead, which will also print out whatever func (your last expression there) returns.
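For example, a minimal sketch along those lines (assuming Python 3.7+ for capture_output) that reads back whatever func returns as text:
import subprocess

# -E evaluates the given code and prints the value of the last expression;
# capture_output/text collect that printed value as a Python string.
result = subprocess.run(
    ['julia', '-E', 'include("test.jl"); func("in.csv", "out.csv")'],
    capture_output=True, text=True, check=True)
print(result.stdout)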
It's worth pointing out, though, that there are better ways of getting Julia and Python interoperating — especially if you need to transfer data back and forth.
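One such option, sketched below under the assumption that the third-party PyJulia package is installed and configured, is to call the Julia function directly and get its return value back as a Python object:
# Sketch assuming PyJulia ("pip install julia", then julia.install()) is already set up.
from julia import Main

Main.include("test.jl")                  # load the Julia file into an embedded Julia session
result = Main.func("in.csv", "out.csv")  # call the Julia function; its return value comes back to Python
print(result)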
I am writing a Python program which runs some bash scripts in sub-processes.
One might say that I am using bash as a scripting language for my Python program.
Is there a way to inject Python functions into my bash script without having to reload the Python code from bash?
Strawman example: if this were Python running JavaScript, then I could bind my Python functions into the js2py VM and call them from within JavaScript.
I thought of calling python some_file_of_mine.py from bash, but this would launch a new Python process without access to my Python program's data.
I also thought of calling python -c "$SOME_INJECTED_PYTHON_CODE". This could use Python's inspect.getsource() to pre-inject some simple Python code, along with some bound data from the parent process, into the child bash shell. However, this would be very sensitive to quoting (`/"), and would cause some problems with imports.
What I would really like is a simple way of calling back into the parent process from the bash subprocess and getting some data back (short of using a Flask + curl combination).
You can send the result of your functions to the standard output by asking the Python interpreter to print the result:
python -c 'import test; print(test.get_foo())'
The -c option simply asks Python to execute some Python commands.
In order to store the result in a variable, you can therefore do:
RESULT_FOO=`python -c 'import test; print(test.get_foo())'`
or, equivalently
RESULT=$(python -c 'import test; print(test.get_foo())')
since backticks and $(…) evaluate a command and replace it by its output.
PS: Getting the result of each function requires parsing the configuration file each time, with this approach. This can be optimized by returning all the results in one go, with something like:
ALL_RESULTS=$(python -c 'import test; print(test.get_foo(), test.get_bar())')
The results can then be split and put in different variables with
RESULT_BAR=$(echo $ALL_RESULTS | cut -d' ' -f2)
which takes the second result and puts it in RESULT_BAR for example (and similarly: -fn for result #n).
PS2: It would probably be easier to do everything in a single interpreter (Python, but maybe also the shell), if possible, instead of calculating variables in one program and using them in another one.
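As a rough sketch of that last suggestion (the step.sh name is hypothetical, and test is the same placeholder module as above), the Python side can compute the values once and hand them to the shell step as arguments, so nothing has to be parsed twice:
import subprocess
import test  # placeholder module providing get_foo() and get_bar(), as above

foo = test.get_foo()
bar = test.get_bar()

# Pass the already-computed values to the shell step as arguments,
# so the configuration is parsed only once, inside this Python process.
subprocess.run(["bash", "step.sh", str(foo), str(bar)], check=True)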
I would like to execute Common Lisp (SBCL) code from Python, e.g. via the shell. I also need to load a Lisp library called Shop3 to run my Lisp code. I tried:
os.system('sbcl && (asdf:load-system "shop3") && (in-package:SHOP-USER) && (load "/Users/kiliankramer/Desktop/Shop-Planer/planner-new")')
But it's not working; it only starts sbcl and then stops before loading the asdf library "shop3".
Can you tell me how to execute my Lisp code, or what alternatives I have for running an external Lisp program (including the Lisp library)?
Thanks in advance. :)
&& chains shell commands. I.e., it starts sbcl and waits for it to terminate, and if the termination was successful, then it will try to execute (asdf:load-system "shop3") as a shell command (not what you want!)
You need to use sbcl command line arguments:
os.system("sbcl --eval '(asdf:load-system \"shop3\")' --eval '(in-package :SHOP-USER)' --load /Users/kiliankramer/Desktop/Shop-Planer/planner-new")
However, you might want to use the more modern interface instead of os.system.
It will also avoid the need for escaping quotes &c:
subprocess.run(["sbcl", "--eval", '(asdf:load-system "shop3")',
                "--eval", '(in-package :SHOP-USER)',
                "--load", "/Users/kiliankramer/Desktop/Shop-Planer/planner-new"])
I've been looking for a while, but I haven't found anything in Ruby like python's -i flag.
Common behaviour for me if I'm testing something is to run the unfinished python script with a -i flag so that I can see and play around with the values in each variable.
If I try irb <file>, it still terminates at EOF, and obviously ruby <file> doesn't work either. Is there a command-line flag that I'm missing, or some other way this functionality can be achieved?
Edit: Added an explanation of what kind of functionality I'm talking about.
Current Behaviour in Python
file.py
a = 1
Command Prompt
$ python -i file.py
>>> a
1
As you can see, the value of the variable a is available in the console too.
You can use irb -r ./filename.rb (-r for "require"), which should basically do the same as python -i ./filename.py.
Edit to better answer the refined question:
Actually, irb -r ./filename.rb does the equivalent of running irb and subsequently running
irb(main):001:0> require './filename.rb'. Thus, local variables from filename.rb do not end up in scope for inspection.
python -i ./filename.py seems to do the equivalent of adding binding.irb to the last line of the file and then running it with ruby ./filename.rb. There seems to be no one-liner equivalent to achieve this exact behaviour for ruby.
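For comparison, here is roughly what python -i file.py gives you, written out as plain Python (a sketch; CPython implements -i natively rather than this way):
import code

a = 1                            # the contents of file.py from the question
code.interact(local=globals())   # then drop into an interactive prompt where a is in scope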
Is there a command-line flag that I'm missing, or some other way this functionality can be achieved?
Yes, there are both. I'll cover an "other way".
Starting with ruby 2.5, you can put a binding.irb in some place of your code and then the program will go into an interactive console at that point.
% cat stop.rb
puts 'hello'
binding.irb
Then
% ruby stop.rb
hello
From: stop.rb @ line 3 :
1: puts 'hello'
2:
=> 3: binding.irb
irb(main):001:0>
This was possible for a long time before with pry, but now it's part of the standard library.
You can use the command irb. When that has started, you can load and execute any Ruby file with load './filename.rb'.
I just created a rickroll prank to play on friends and family. I want to be able to download the file from GitHub using a curl command, which works. My issue is that when I use a pipe and try to execute the script, it runs right after curl is executed and before the file is downloaded.
This is the command I am trying to run:
curl -L -O https://raw.githubusercontent.com/krish-penumarty/RickRollPrank/master/rickroll.py | python rickroll.py
I have tried to run it using the sleep command as well, but haven't had any luck.
(curl -L -O https://raw.githubusercontent.com/krish-penumarty/RickRollPrank/master/rickroll.py; sleep 10) | python rickroll.py
Expanding on my comment.
There are several ways to chain commands using most shell languages (here I assume sh / bash dialect).
The most basic, ;, will just run each command sequentially, starting the next one as the previous one completes.
Conditional chaining with && works like ; but aborts the chain as soon as a command returns an error (any non-zero return code).
Conditional chaining with || works like && but aborts the chain as soon as a command succeeds (returns 0).
What you tried to do here is neither of those; it's piping. Triggered by |, it causes the commands on its sides to be run at once, with the standard output of the left-hand one being fed into the standard input of the right-hand one.
Your second example doesn't work either, because it causes two sequences to be run in parallel:
The first sequence is the curl, followed by a sleep once it finishes.
The second sequence is the python command, run simultaneously, with anything written by the first sequence redirected to its input.
So to fix it, chain the two commands with &&: curl -L -O https://raw.githubusercontent.com/krish-penumarty/RickRollPrank/master/rickroll.py && python rickroll.py will run curl, wait for it to complete, and only run python if curl succeeded.
And again, you can use your example to show how harmful it can be to run commands one doesn't fully understand. Have your script write “All your files have been deleted” in red; it can be good for educating people on that subject.
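For what it's worth, the same two-step chain can also be driven from a single Python call with subprocess (a sketch that reuses the URL and filename from the question):
import subprocess

url = "https://raw.githubusercontent.com/krish-penumarty/RickRollPrank/master/rickroll.py"

# check=True plays the role of &&: if curl fails, an exception is raised
# and the downloaded script is never run.
subprocess.run(["curl", "-L", "-O", url], check=True)
subprocess.run(["python", "rickroll.py"], check=True)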
I have been able to use subprocess to embed bash scripts in Python. I happened to navigate through some Python code today and stumbled across the lines below, which also embed a bash script in Python, using a construct analogous to a docstring.
#!/bin/bash -
''''echo -n
if [[ $0 == "file" ]]; then
..
fi
'''
Can someone shed light on this approach? What is this approach called, and what are the benefits associated with it? I can obviously see the simplicity, but I think there's more to this than that.
This is a somewhat clever way to make the file both a valid Python script and a valid bash script. Note that it does not cause a subprocess to magically be spawned. Rather, if the file is evaluated by bash, the bash script will be run, and if it is evaluated by Python, the bash script will be ignored.
It's clever, but probably not a good software engineering practice in general. It usually makes more sense to have separate scripts.
To give a more concrete example (say this file is called "polyglot"):
''''echo hello from bash
exit
'''
print('hello from python')
As you note, bash will ignore the initial quotes, and print "hello from bash", and then exit before reaching the triple quote. And Python will treat the bash script as a string, and ignore it, running the Python script below.
$ python polyglot
hello from python
$ bash polyglot
hello from bash
But naturally, this can usually (and more clearly) be refactored into two scripts, one in each language.
No, that's not embedded into Python; the shebang says it's a bash script.
The '''' is '' twice, which is just an empty string, so it doesn't have any effect.
The ''' is invalid, as the last ' is not closed.