Invoking shell-command from function in interactive IPython shell - python

I have just been playing around with IPython. Currently I am wondering how it would be possible to run a shell command with a Python variable inside a function. For example:
def x(go):
    return !ls -la {go}
x("*.rar")
This gives me "sh: 1: Syntax error: end of file unexpected". Could anybody please give me a clue on how to let my "x"-function invoke ls like "ls -la *.rar"? There are *.rar files in my working directory.
Thank you in advance,
Rainer

If you look at the history command output, you'll see that IPython uses the _ip.system method to call external programs.
Hence, this should work for you:
def x(go):
    return _ip.system("ls -la {0}".format(go))
However, please note that outside IPython you should probably use subprocess.Popen.
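For completeness, a rough sketch of what the same function could look like outside IPython with subprocess.Popen (the shell=True call and the splitlines() post-processing are my own assumptions, not part of the original answer):
import subprocess

def x(go):
    # shell=True lets the shell expand the *.rar glob, as "!" does inside IPython
    proc = subprocess.Popen("ls -la {0}".format(go), shell=True,
                            stdout=subprocess.PIPE, universal_newlines=True)
    out, _ = proc.communicate()
    return out.splitlines()

print(x("*.rar"))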

There was a bug in the "!" shell access that made the expansion of function-scoped variables fail. Your version of IPython might be affected.
You can work around it by doing the variable expansion yourself:
def x(go):
    return get_ipython().getoutput("ls -la {0}".format(go))

While subprocess.Popen is probably the way to go, as @jcollado said, just for completeness there is the os.system command to send a command straight to the shell. However, the subprocess module is almost always a better choice than os.system or os.spawn.
Also, depending on what you are trying to do, you may want to use Python functions to interact with the operating system rather than passing commands out to a shell. If you want to deal with lists of files, for instance, os.walk would likely result in cleaner and more portable code than grabbing the directory listing through shell commands; see the documentation for Python's os module.
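For example, a minimal sketch along those lines, collecting .rar files with os.walk (the starting directory '.' and the *.rar pattern are just borrowed from the question for illustration):
import fnmatch
import os

rar_files = []
for dirpath, dirnames, filenames in os.walk('.'):
    # fnmatch applies the shell-style *.rar pattern to each filename
    for name in fnmatch.filter(filenames, '*.rar'):
        rar_files.append(os.path.join(dirpath, name))
print(rar_files)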

Depending on what you wanted to accomplish, this may be the better way:
In [50]: %alias x ls -la %l
In [51]: x *.rar
-rw-r--r-- 1 dubbaluga users 45254 Apr 4 15:12 schoolbus.rar

Maybe it's easier to use Python for this case:
import glob
files = glob.glob('*.rar')
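If you also want some of the detail that ls -la prints, os.stat can supply it; a rough sketch (the size field is just one example of what you might pull out):
import glob
import os

for path in glob.glob('*.rar'):
    st = os.stat(path)
    print(st.st_size, path)  # file size in bytes, then the filename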

Related

subprocess.Popen and relative directories

I am writing a script to open notepad.exe using subprocess.Popen()
import subprocess
command = '%windir%\system32\\notepad.exe'
process = subprocess.Popen(command)
output = process.communicate()
print(output[0])
This throws a FileNotFoundError.
Is it possible to change/add to the above code to make it work with relative paths?
I did try to run the script from C:\Windows after moving it there, which again failed. Setting shell=True failed as well.
Writing a similar script using os.popen() works fine with relative paths, regardless of which directory the script is run from, but as far as I understand os.popen is not the way forward...
Early steps in the world of programming/Python. Any input much appreciated.
Use os.path.expandvars to expand %windir%:
command = os.path.expandvars('%windir%\\system32\\notepad.exe')
The result is a path that then can be passed to subprocess.Popen.
subprocess.Popen does not expand environment variables such as %windir%. The shell might, but you really should not depend on shell=True for that.
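Put together, the fix might look roughly like this (a sketch assuming a standard Windows install; the wait() call is my addition):
import os
import subprocess

# expandvars resolves %windir%; note the doubled backslashes in the literal
command = os.path.expandvars('%windir%\\system32\\notepad.exe')
process = subprocess.Popen(command)
process.wait()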
Pro tip: whenever you get an error asking the system to execute a command, print the command (and, if applicable, the current working directory). The results will often surprise you.
In your case, I suspect you're just missing a backslash. Use this instead:
command = '%windir%\\system32\\notepad.exe'
Before you make that change, try printing the value of command immediately after assignment. I think you'll find the leading "s" in "system" is missing, and that the mistake is obvious.
HTH.
You could use raw strings to avoid having to double-up your backslashes.
command = r'%windir%\system32\notepad.exe'

Using environment variables in Fabric

Assuming:
export TEST=/somewhere
I want to run the command /somewhere/program using:
with cd('$TEST'):
    run('program')
However, this doesn't work because the $ gets escaped.
Is there a way to use an environment variable in a Fabric cd() call?
Following the suggestion from @AndrewWalker, here is a more compact solution that worked for me (and, to my knowledge, the result is the same):
with cd(run("echo $TEST")):
    run("program")
But I decided to go for a (very slightly) more concise yet equally readable solution:
run('cd $TEST && program')
This second solution, if I am correct, produces the same result.
You can capture the value by using echo
testdir = str(run("echo $TEST"))
with cd(testdir):
    run("program")
Alternatively:
import os

def my_task():
    with lcd(os.environ['TEST_PATH']):
        local('pwd')
os.getenv('TEST_PATH') may also be used (with a default, optionally)
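For instance (the '/somewhere' fallback is just an illustrative default borrowed from the question; os is already imported above):
test_path = os.getenv('TEST_PATH', '/somewhere')  # '/somewhere' is used if TEST_PATH is unset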
Hat tip: Send bash environment variable back to python fabric

How to get "canonical unix shell" for Python

According to the documentation for the subprocess module, its default shell is /bin/sh, but I have an ingrained, and probably irrational, aversion to hard-coding such constants.
Therefore, I much prefer to refer to some constant defined in subprocess. I have not been able to find any way to interrogate subprocess directly for this constant. The best I've managed is
def _getshpath():
    return subprocess.check_output('which sh', shell=True).strip()
or
def _getshpath():
    return subprocess.check_output('echo "$0"', shell=True).strip()
...both of which look pathetically fragile, since their validity ultimately depends on precisely the value I'm trying to determine in the first place. (I.e., if the shell executable is not /bin/sh, either definition could easily be nonsensical.)
What's best-practice for getting this path (without hard-coding it as "/bin/sh")?
Thanks!
Hard-coding it as /bin/sh is perfectly valid. If you look at the documentation for C's popen() you'll find it does this too. /bin/sh is, by construction, the system's standard shell.
You could try
>>> import os
>>> shell = os.environ['SHELL']
>>> print shell
/bin/bash
You can use this to set the executable argument of subprocess.Popen.
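A sketch of what that could look like (assuming a POSIX system where SHELL is set; the echo command is only there to show which shell ends up running):
import os
import subprocess

shell = os.environ.get('SHELL', '/bin/sh')
# with shell=True, executable= replaces the default /bin/sh with the user's shell
proc = subprocess.Popen('echo "$0"', shell=True, executable=shell,
                        stdout=subprocess.PIPE, universal_newlines=True)
print(proc.communicate()[0])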

How can I get a file to autorun before I run any command in ipython?

I have a python file that holds a bunch of functions that I'm continually modifying and then testing in ipython. My current workflow is to run "%run myfile.py" before each command. However, ideally, I'd like that just to happen automatically. Is that possible?
If you really want to use rlwrap for this, write a filter! Just define an input_handler that adds %run myfile.py to the input, and an echo_handler to echo your original input so that you won't see this happening (man RlwrapFilter tells you all you ever wanted to know about filter writing, and then some).
But isn't it more elegant to solve this within IPython itself, using IPython.hooks.pre_runcode_hook?
import os
import IPython

ip = IPython.ipapi.get()

def runMyFile(self):
    ip.magic('%run myFile.py')
    raise IPython.ipapi.TryNext()

ip.set_hook('pre_runcode_hook', runMyFile)
I can't find any elegant way. This is the ugly way. Run:
rlwrap awk '{print "%run myfile.py"} {print} {fflush()}' |ipython
This reads from STDIN, but prints the command you wanted before each command. fflush is there to disable buffering and pass things to ipython immediately. rlwrap is there to keep the readline bindings; you can remove it if you don't have it, but this will be less convenient (no arrow keys, etc.).
Mind that you will have to type your commands before the ipython prompt appears. There might be other more annoying things which break, I haven't tested thoroughly.

pythonrc in interactive code

I have a .pythonrc in my path, which gets loaded when I run python:
python
Loading pythonrc
>>>
The problem is that my .pythonrc is not loaded when I execute files:
python -i script.py
>>>
It would be very handy to have tab completion (and a few other things) when I load things interactively.
From the Python documentation for -i:
When a script is passed as first argument or the -c option is used, enter interactive mode after executing the script or the command, even when sys.stdin does not appear to be a terminal. The PYTHONSTARTUP file is not read.
I believe this is done so that scripts run predictably for all users, and do not depend on anything in a user's particular PYTHONSTARTUP file.
As Greg has noted, there is a very good reason why -i behaves the way it does. However, I do find it pretty useful to be able to have my PYTHONSTARTUP loaded when I want an interactive session. So, here's the code I use when I want to be able to have PYTHONSTARTUP active in a script run with -i.
if __name__ == '__main__':
    # do normal stuff
    # and at the end of the file:
    import sys
    if sys.flags.interactive == 1:
        import os
        myPythonPath = os.environ['PYTHONSTARTUP'].split(os.sep)
        sys.path.append(os.sep.join(myPythonPath[:-1]))
        pythonrcName = ''.join(myPythonPath[-1].split('.')[:-1])  # the filename minus the trailing extension, if the extension exists
        pythonrc = __import__(pythonrcName)
        for attr in dir(pythonrc):
            __builtins__.__dict__[attr] = getattr(pythonrc, attr)
        sys.path.remove(os.sep.join(myPythonPath[:-1]))
        del sys, os, pythonrc
Note that this is fairly hacky and I never do this without ensuring that my pythonrc isn't accidentally clobbering variables and builtins.
Apparently the user module used to provide this, but it was removed in Python 3.0. It is a bit of a security hole, depending on what's in your pythonrc...
In addition to Chinmay Kanchi's and Greg Hewgill's answers, IPython and bpython work fine in this case. Perhaps it's time for you to switch? :)
