set -e at the beginning of a bash script instructs bash to abort the whole script on the first failure of any command inside it.
Is there any equivalent to use with an IPython script that invokes shell commands through !command?
As noted in check the exit status of last command in ipython, there is an _exit_code variable. What you want to do is thus equivalent to adding an assert _exit_code==0 after each shell command. I have not found a feature to do the check automatically, but I'm not that familiar with ipython.
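For example, in an IPython script the manual check could look like this (a minimal sketch; the shell commands are placeholders, and _exit_code is the variable IPython sets after each ! command):

!make build
assert _exit_code == 0, "make build failed"

!./run_tests.sh
assert _exit_code == 0, "run_tests.sh failed"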
Related
Question
My default Python is 2.7, but I have a script that requires Python 3.4. I am trying to create a function in R that will:
Switch to Python 3.4
Run this script
Switch back to Python 2.7
Import results into R
To switch between Python versions, I use my cluster's "dotkit" system, like this:
use Python-2.7
use Python-3.4
"use" is a bash function that is imported in my .bashrc file. It sets all of my path variables (PATH, LIBRARY_PATH, LD_LIBRARY_PATH, CPATH, C_INCLUDE_PATH, etc). The problem is that when I try to call this function in R, I get the following error:
system('use Python-3.4')
sh: use: command not found
It seems like this is a problem with my PATH. I am using the correct shell:
system('echo $SHELL')
/bin/bash
My $PATH variable also looks good. However, when I create a script that essentially does the same thing:
load_py34.sh:
#!/bin/bash
source ~/.bashrc
use Python-3.4
and call this script through R, then it actually runs, but for some reason, it doesn't change my python version within R. (I have verified that this script works from the command line.)
> R
> system('python --version')
Python 2.7.1
> system('sh load_py34.sh')
Prepending: R-3.4 (ok)
> system('python --version')
Python 2.7.1
So I'm a little confused, but if anyone can help, I would really appreciate it.
Suggested fixes
When I combine them into a single command, I still have the same problem:
> system("sh load_py34.sh; python --version")
Prepending: Python-3.4 (already loaded)
Python 2.7.1
When I try calling bash directly, I still have a problem with the PATH:
> system("bash -c 'use Python-3.4; python --version'")
bash: use: command not found
Python 2.7.1
.bashrc is only loaded for interactive bash sessions.
"use" is a bash function that is imported in my .bashrc file. It sets
all of my path variables.
If set via export, the environment of the calling process will not be altered.
export [-fn] [name[=word]] ... The supplied names are marked for automatic export to the environment of subsequently executed commands. (https://man7.org/linux/man-pages/man1/bash.1.html)
A child process cannot alter its parent's environment; it only inherits a copy of it. (This poses a problem because system() creates a sub-process.)
The source and . built-ins execute the commands in the current shell environment, hence why your script works.
Other commands (executables, non-shell-builtins) are executed by the fork-and-exec mechanism, whereby the executing shell process forks, creating a child process with an identical environment and state. This new child process is the process in which the command is executed. Changes to the environment of that process are not replicated to the parent's environment.
This means that you will not be able to rely on system('...') to modify the environment of the R process, or that of processes spawned by subsequent system() invocations.
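The same behaviour is easy to demonstrate from Python, whose os.system() likewise hands the command to a forked shell (a minimal sketch; DEMO_VAR is just a made-up name):

import os

os.environ.pop('DEMO_VAR', None)    # make sure the parent process doesn't have it
os.system('export DEMO_VAR=hello')  # the export happens in a child shell
print(os.environ.get('DEMO_VAR'))   # prints None: the parent's environment is unchanged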
In a single invocation to system(), you can construct a command-line that changes the environment of the spawned shell like so:
bash -c 'source ~/.bashrc; use Python-3.4; python --version'
Mind you, ~/.bashrc is not really the best place to put this functionality (might be subjective).
When you call system() it uses /bin/sh, not /bin/bash. sh doesn't read your .bashrc file when it starts up, so it does not know any of the functions you've defined there.
To use the function from your .bashrc, you must get bash to run it instead:
system("bash -c 'use Python-3.4; python --version'")
I'd like to be able both to start an interactive session after completing a script (which I know can be done with ipython -i myscript.py) and to have the shell execute the whos command immediately afterward. This will help my workflow by allowing me to try out a script and, if there are errors, pick out suspect variables in the namespace to see what their deal is (it's often hard to remember exactly which variable is called what).
Is this even possible? I tried ipython -c "whos" -i myscript.py, but it seems ipython will only run the -i part or -c part -- whichever comes first.
For what it's worth, this can be done using the %run magic and a literal Enter character:
ipython -i -c "%run myfile^M%whos"
where myfile is the script name without the .py extension, and ^M is obtained in the shell by typing Ctrl+V followed by Enter.
I have the virtualenv created and installed. I have also installed the jsnapy tool inside my virtual env.
This is the script that we are using:
Filename : venv.py
import os
os.system('/bin/bash --rcfile ~/TestAutomation/End2EndAutomation/bin/activate')
os.system('End2EndAutomation/bin/jsnapy')
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@sdno-server:~/TestAutomation$ ^C
What we need to know is how we can get into the virtualenv, run a command, and deactivate it using a Python script.
[EDIT1]
I used the code given in the comment. It just enters the virtual env. When I issue exit, it runs the jsnapy command.
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@server:~/TestAutomation$ exit
exit
usage:
This tool enables you to capture and audit runtime environment of
networked devices running the Junos operating system (Junos OS)
Tool to capture snapshots and compare them
It supports four subcommands:
--snap, --check, --snapcheck, --diff
1. Take snapshot:
jsnapy --snap pre_snapfile -f main_configfil
Each call to os.system() starts a new shell; by the time the next call runs, the previous shell has already exited and its state is gone. To run all the commands in one shell instance, you could put them in a single bash script and call that from os.system():
run.sh
source ~/TestAutomation/End2EndAutomation/bin/activate
End2EndAutomation/bin/jsnapy
deactivate
Python
os.system('bash run.sh')  # run via bash; 'source' is not guaranteed to exist in /bin/sh, which os.system uses
Alternatively, you could write a multiline bash command, as long as it's all in one os.system() call.
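For instance, something along these lines (a sketch using the paths from the question; note that os.system() runs the command through /bin/sh, so the portable . is used instead of the bash-only source):

import os

# Activate and run jsnapy inside the same shell; && stops if activation fails.
os.system('. ~/TestAutomation/End2EndAutomation/bin/activate && '
          'End2EndAutomation/bin/jsnapy')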
Two successive calls to os.system() will create two independent processes, one after the other. The second will run when the first finishes. Any effects of commands executed in the first process will have been forgotten and flushed when the second runs.
You want to run the activation and the command which needs to be run in the virtualenv in the same process, i.e. the same single shell instance.
To do that, you can use bash -c '...' to run a sequence of commands. See below.
However, a better solution is to simply activate the virtual environment from within Python itself.
import os
import subprocess
p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
execfile(p, dict(__file__=p))  # Python 2; on Python 3 use exec(open(p).read(), dict(__file__=p))
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])
For completeness, here is the Bash solution, with comments.
import subprocess
subprocess.check_call(['bash', '-c', """
. ~/TestAutomation/End2EndAutomation/bin/activate
./End2EndAutomation/bin/jsnapy"""])
Preferring subprocess over os.system is recommended even in the os.system documentation.
There is no need to explicitly deactivate; when the bash command finishes, that will implicitly also deactivate the virtual environment.
The --rcfile trick is a nice idea, but it doesn't work when the shell you are calling isn't interactive.
I have a use case where I need to execute a Python script I'm working on from a login shell, due to a dependency on values sourced from /etc/profile.d/. There are several combinations I've tried, but I seem to run into one issue or another.
This is nested in another automated process that's running in a non-login shell, so I'm looking for a one-line command.
Does anyone know of a one-liner that will get this to work?
For example, I tried the following, but it failed with the "cannot execute binary file" error:
bash --login python my_python_script.py
/usr/bin/python: /usr/bin/python: cannot execute binary file
I've also tried a few other combinations with and without the shebang. Any help is appreciated.
You need -c there before the command and then you need to quote the command as a single argument.
bash --login -c 'python my_python_script.py'
If the shell also needs to be an interactive shell then the -i option needs to be used there also.
bash --login -i -c 'python my_python_script.py'
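If the surrounding automated process happens to be driven from Python, the same login-shell invocation could be wrapped like this (a hypothetical sketch; the script name is taken from the question):

import subprocess

# A login shell sources /etc/profile (and typically /etc/profile.d/) before python starts.
subprocess.check_call(['bash', '--login', '-c', 'python my_python_script.py'])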
I'd like a shortcut key for GEdit that will run the currently open .py file when I press, say, F5. I have a script that does this via an external Terminal window, but I'm having more trouble creating a version that uses the internal output window (Shell Output, I guess), since I can't find a good way to grab the pyenv details from the ~/.bashrc file. Working with pyenv is mandatory.
Here's what I have via GEdit's External Tools plugin:
UNSOLVED: Internal Shell Output method:
I wanted to get access to the pyenv settings in ~/.bashrc, so I tried this External Tools script:
#!/bin/bash
set +m
bash -i python $GEDIT_DOCUMENTS_PATH
This works (thanks to -i), but it gives me the "bash: no job control in this shell" warning. Running set +m should get rid of this message, but it doesn't.
So I moved the relevant stuff I had at the end of ~/.bashrc over to this script, which is not ideal at all:
#!/bin/bash
export PYENV_ROOT="${HOME}/.pyenv"
if [ -d "${PYENV_ROOT}" ]; then
export PATH="${PYENV_ROOT}/bin:${PATH}"
eval "$(pyenv init -)"
fi
export PYENV_VERSION=3.3.4
export LD_LIBRARY_PATH=~/.pyenv/versions/3.3.4/lib/python3.3/site-packages/PySide-1.2.1-py3.3.egg/PySide/
python $GEDIT_CURRENT_DOCUMENT_NAME
Problems: This last block is terrible. It's just copied from ~/.bashrc, and it even has to include PySide data that ~/.bashrc should take care of. Also, for some reason, using this method always outputs the first line of the .py file (say, import sys). Obviously, no input() can be given using this method, and outputting to GEdit's Embedded Terminal seems impossible. Also, I can't get rid of the "Done" message, even by using set +m or running the command in a subshell.
SOLVED: External Terminal window method:
#!/bin/sh
gnome-terminal -x $SHELL -ic "python $GEDIT_CURRENT_DOCUMENT_NAME; printf \"\nPress any key to continue.\"; read -n 1 -s"
or, define a Terminal profile called, say, Wait, that sets Title and Command->When terminal exits: Hold the terminal open, and do this:
#!/bin/sh
gnome-terminal --profile=Wait -x $SHELL -ic "python $GEDIT_CURRENT_DOCUMENT_NAME; printf \"\nPress any key to continue.\""
This gives a "status 0" message though, so the other method is better. Both methods use an interactive shell to access ~/.bashrc.
Steps to add your custom shortcut key and functionality in GEdit:
1) Open the Manage External Tools dialog (Tools > Manage External Tools).
2) Add a tool
3) Give the tool a name.
4) Enter this code:
#!/bin/sh
python $GEDIT_DOCUMENTS_PATH
5) Set the Shortcut Key to F5 by pressing the F5 function key directly in the box.
To run the current file, you should save it first. You can then see the output in the Shell Output window that comes up when you run the command, either by pressing F5 or by clicking the command manually.
Of course, you can modify it to suit your own needs.
I wanted the same thing. After reading your post, the answers, and the comments, I tried a few things myself.
To run only the currently open document (I am using gedit under Ubuntu 14.04.4 LTS), open
Tools > Manage External Tools, press '+' to add a new tool, and type the following into the box for the shell script:
#!/bin/sh
# run the current document in python
python $GEDIT_CURRENT_DOCUMENT_PATH
$GEDIT_DOCUMENTS_PATH would apply the command to every open document, but you wanted to run only the current one, didn't you? As for the other suggestions: I don't understand why one should make it unnecessarily complicated, or why one should have to ask again; the moment you press the key, you want that file to be executed, don't you?
I tried it myself and it works flawlessly.