Using environment variables in Fabric - python

Assuming:
export TEST=/somewhere
I want to run the command /somewhere/program using:
with cd('$TEST'):
    run('program')
However, this doesn't work because the $ gets escaped.
Is there a way to use an environment variable in a Fabric cd() call?

Following the suggestion from @AndrewWalker, here is a more compact solution that worked for me (and to my knowledge, the result is the same):
with cd(run("echo $TEST")):
    run("program")
But I decided to go with a (very slightly) more concise yet equally readable solution:
run('cd $TEST && program')
This second solution, if I am correct, produces the same result.
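For reference, here is a minimal fabfile sketch combining both approaches (Fabric 1.x API assumed; the task names are illustrative, and program is the command from the question):
# fabfile.py -- minimal sketch, assuming the Fabric 1.x API
from fabric.api import cd, run, task

@task
def run_program():
    # Expand $TEST on the remote host, then cd into it
    test_dir = run("echo $TEST")
    with cd(test_dir):
        run("program")

@task
def run_program_oneliner():
    # Same effect in a single remote command
    run("cd $TEST && program")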

You can capture the value by using echo
testdir = str(run("echo $TEST"))
with cd(testdir):
    run("program")

Alternatively:
import os
from fabric.api import lcd, local

def my_task():
    with lcd(os.environ['TEST_PATH']):
        local('pwd')
os.getenv('TEST_PATH') may also be used, optionally with a default.
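For example, a small sketch with an assumed fallback path (the default of /tmp is just an example):
# Fall back to /tmp if TEST_PATH is not set
with lcd(os.getenv('TEST_PATH', '/tmp')):
    local('pwd')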
Hat tip: Send bash environment variable back to python fabric

Related

Pass variable from Python to Bash

I am writing a bash script in which a small Python script is embedded. I want to pass a variable from Python to bash. After some searching, I only found methods based on os.environ.
I just cannot make it work. Here is my simple test.
#!/bin/bash
export myvar='first'
python - <<EOF
import os
os.environ["myvar"] = "second"
EOF
echo $myvar
I expected it to output second; however, it still outputs first. What is wrong with my script? Also, is there any way to pass the variable without export?
Summary
Thanks for all the answers. Here is my summary.
A Python script embedded inside a bash script runs as a child process, which by definition cannot affect the parent bash environment.
The solution is to have Python print assignment strings and eval them subsequently in bash.
An example is
#!/bin/bash
a=0
b=0
assignment_string=$(python -<<EOF
var1=1
var2=2
print('a={};b={}'.format(var1,var2))
EOF
)
eval $assignment_string
echo $a
echo $b
Unless Python is used to do some kind of operation on the original data, there's no need to import anything. The answer could be as lame as:
myvar=$(python - <<< "print('second')") ; echo "$myvar"
Suppose for some reason Python is needed to spit out a bunch of bash variables and assignments, or (cautiously) compose code on-the-fly. An eval method:
myvar=first
eval "$(python - <<< "print('myvar=second')" )"
echo "$myvar"
Complementing Cyrus's useful comment on the question: you just can't do it. Here is why.
Setting an environment variable sets it only for the current process and any child processes it launches. os.environ will set it only for the embedded Python process that runs the code you provided. When that process finishes, it goes away, and so does the environment variable.
You can pretty much do that with a shell script itself and just source it so the change is reflected in the current shell.
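As a minimal sketch of that idea (the file and variable names are placeholders, not part of the original answer), the Python part can write the assignments to a helper file that the parent shell then sources:
# write_vars.py -- write shell assignments for the parent shell to source
with open('vars.sh', 'w') as f:
    f.write('myvar=second\n')

# In the parent shell:
#   python write_vars.py && source vars.sh && echo "$myvar"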
There are a few "dirty" ways of getting something like this done. Here is an example:
#!/bin/bash
myvar=$(python - <<EOF
print "second"
EOF
)
echo "$myvar"
The output of the python process is stored in a bash variable. It gets a bit messy if you want to return more complex stuff, though.
You can make Python print a value and pass it to bash:
pfile.py
print(100)
bfile.sh
var=$(python pfile.py)
echo "$var"
output: 100
Well, this may not be what you want, but one option could be running the other shell commands from Python using subprocess:
import subprocess
x = 400
subprocess.call(["echo", str(x)])
But this is more of a temporary workaround. The other solutions are more along the lines of what you are looking for.
Hope I was able to help!

Odoo - how to use it interactively in python interpreter?

I read that it might be possible to use the Python interpreter to access Odoo and test things interactively (https://www.odoo.com/forum/help-1/question/how-to-get-a-python-shell-with-the-odoo-environment-54096), but doing this in the terminal:
ipython
import sys
import openerp
sys.argv = ['', '--addons-path=~/my-path/addons', '--xmlrpc-port=8067', '--log-level=debug', '-d test',]
openerp.cli.main()
it starts the Odoo server, but I can't type anything in that terminal tab to use it interactively. If, for example, I write something like print 'abc', I get no output. Am I missing something here?
Sometimes I use the logging library to print output to the console/terminal.
For example:
import logging
logging.info('Here is your message')
logging.warning('Here is your message')
For more details, you may check out the reference link.
The closest thing I have found to interactive is to put the line
import pdb; pdb.set_trace()
in the method I want to inspect, and then trigger that method.
It's clunky, but it works.
As an example, I was just enhancing the OpenChatter implementation for our copy of OpenERP, and during the "figure things out" stage I had that line in .../addons/mail/mail_thread.py::mail_thread.post_message so I could get a better idea of what was happening in that method.
The correct way to do this is with shell:
./odoo-bin shell -d <yourdatabase>
Please be aware that if you already have an Odoo instance running, the port will be busy. In that case, the instance you are opening should use a different port, so the command should be something like this:
./odoo-bin shell --xmlrpc-port=8888 -d <yourdatabase>
But if you want to have your addons available in the new instance, you can do something similar to the following:
./odoo-bin shell -c ~/odooshell.conf -d <yourdatabase>
This way you can put whatever you need configured (port, addons_path, etc.) in your odooshell.conf and work smoothly with your shell.
As I always use docker, this is what I do to have my shell configured in docker:
docker exec -ti <mycontainer> odoo shell -c /etc/odoo/odooshell.conf -d <mydatabase>
You will have env available to do anything. You can write Python code directly to do whatever you need; the syntax is very similar to server actions. For example:
partner_ids = env['res.partner'].search([])
for partner in partner_ids:
    partner['name'] = partner.name + '.'
env.cr.commit()
Remember to env.cr.commit() if you make any data change.

Invoking shell-command from function in interactive IPython shell

I have just been playing around with IPython. Currently I am wondering how it would be possible to run a shell command with a Python variable inside a function. For example:
def x(go):
    return !ls -la {go}
x("*.rar")
This gives me "sh: 1: Syntax error: end of file unexpected". Could anybody please give me a clue on how to let my "x"-function invoke ls like "ls -la *.rar"? There are *.rar files in my working directory.
Thank you in advance,
Rainer
If you look at the history command output, you'll see that IPython uses the _ip.system method to call external programs.
Hence, this should work for you:
def x(go):
    return _ip.system("ls -la {0}".format(go))
However, please note that outside ipython you should probably use subprocess.Popen.
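Outside IPython, a rough equivalent with subprocess might look like this (a sketch; the wildcard has to be expanded in Python because no shell is involved here):
import glob
import subprocess

def x(go):
    # Expand the wildcard in Python, since Popen with a list does not invoke a shell
    files = glob.glob(go)
    proc = subprocess.Popen(['ls', '-la'] + files, stdout=subprocess.PIPE)
    out, _ = proc.communicate()
    return out.decode()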
There was a bug in the "!" shell access that made the expansion of function-scoped variables fail. Your IPython version might be affected.
You can avoid it by doing the variable expansion yourself:
def x(go):
    return get_ipython().getoutput("ls -la {0}".format(go))
While subprocess.Popen is probably the way to go, as @jcollado said, just for completeness there is the os.system command to immediately send a command to the shell. However, the subprocess module is almost always a better choice than os.system or os.spawn.
Also, depending on what you are trying to do you may want to use python commands to interact with the operating system rather than passing commands out to a shell. If you want to deal with lists of files for instance, os.walk would likely result in cleaner and more portable code than grabbing the directory list through shell commands. You can look at the documentation for Python's OS module here.
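As a quick sketch of that pure-Python route (the function and argument names here are illustrative, not from the answer):
import fnmatch
import os

def find_files(root, pattern):
    # Walk the tree under root and collect paths matching the glob-style pattern
    matches = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in fnmatch.filter(filenames, pattern):
            matches.append(os.path.join(dirpath, name))
    return matches

# e.g. find_files(".", "*.rar")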
Depending on what you wanted to accomplish, this may be the better way:
In [50]: %alias x ls -la %l
In [51]: x *.rar
-rw-r--r-- 1 dubbaluga users 45254 Apr 4 15:12 schoolbus.rar
Maybe it's easier to use Python for this case:
import glob
files = glob.glob('*.rar')
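If the ls -la style details are also needed, a small sketch combining glob with os.stat could look like this (the output format is just an example):
import glob
import os

for path in glob.glob('*.rar'):
    info = os.stat(path)
    # Print size in bytes and the file name, roughly like `ls -la`
    print('{:>10} {}'.format(info.st_size, path))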

Python subprocess/Popen with a modified environment

I believe that running an external command with a slightly modified environment is a very common case. That's how I tend to do it:
import subprocess, os
my_env = os.environ
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
I've got a gut feeling that there's a better way; does it look alright?
I think os.environ.copy() is better if you don't intend to modify the os.environ for the current process:
import subprocess, os
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
That depends on what the issue is. If it's to clone and modify the environment one solution could be:
subprocess.Popen(my_command, env=dict(os.environ, PATH="path"))
But that somewhat depends on the replaced variables being valid Python identifiers, which they most often are (how often do you run into environment variable names that are not alphanumeric plus underscore, or that start with a number?).
Otherwise you could write something like:
subprocess.Popen(my_command, env=dict(os.environ,
                                      **{"Not valid python name": "value"}))
In the very odd case (how often do you use control codes or non-ASCII characters in environment variable names?) that the keys of the environment are bytes, you can't (on Python 3) even use that construct.
As you can see, the techniques used here (especially the first) rely on the keys of the environment normally being valid Python identifiers that are known in advance (at coding time); otherwise the second approach has issues. In cases where that isn't true, you should probably look for another approach.
With Python 3.5 you could do it this way:
import os
import subprocess
my_env = {**os.environ, 'PATH': '/usr/sbin:/sbin:' + os.environ['PATH']}
subprocess.Popen(my_command, env=my_env)
Here we end up with a copy of os.environ and overridden PATH value.
It was made possible by PEP 448 (Additional Unpacking Generalizations).
Another example. If you have a default environment (i.e. os.environ), and a dict you want to override defaults with, you can express it like this:
my_env = {**os.environ, **dict_with_env_variables}
You might use my_env.get("PATH", '') instead of my_env["PATH"] in case PATH is somehow not defined in the original environment, but other than that it looks fine.
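That is, a defensive variant of the same snippet (a sketch; my_command is the placeholder from the question, and the empty-string fallback is just one choice):
import os
import subprocess

my_env = os.environ.copy()
# Use .get() so a missing PATH does not raise a KeyError
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env.get("PATH", "")
subprocess.Popen(my_command, env=my_env)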
To temporarily set an environment variable without having to copy the os.environ object etc., I do this:
process = subprocess.Popen(['env', 'RSYNC_PASSWORD=foobar', 'rsync',
                            'rsync://username@foobar.com::'], stdout=subprocess.PIPE)
The env parameter accepts a dictionary. You can simply take os.environ, add a key (your desired variable) (to a copy of the dict if you must) to that and use it as a parameter to Popen.
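In other words, something along these lines (a sketch reusing the rsync example above; the variable name and value are placeholders):
import os
import subprocess

env = os.environ.copy()           # copy so the current process is unaffected
env['RSYNC_PASSWORD'] = 'foobar'  # add the variable only for the child process
process = subprocess.Popen(['rsync', 'rsync://username@foobar.com::'],
                           stdout=subprocess.PIPE, env=env)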
I know this has been answered for some time, but there are some points one may want to know about using PYTHONPATH instead of PATH in the environment. I have outlined an explanation of running Python scripts with cron jobs that deals with the modified environment in a different way (found here). I thought it would be of some use for those who, like me, needed just a bit more than this answer provided.
In certain circumstances you may want to only pass down the environment variables your subprocess needs, but I think you've got the right idea in general (that's how I do it too).
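For example, a sketch that passes down only a whitelist of variables (the chosen names are illustrative; my_command is the placeholder from the question):
import os
import subprocess

# Pass through only an explicit set of variables the child actually needs
wanted = ('PATH', 'HOME', 'LANG')
minimal_env = {k: os.environ[k] for k in wanted if k in os.environ}
subprocess.Popen(my_command, env=minimal_env)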
This could be a solution, where env is a dict holding the values you want to prepend (colon-joined) to the corresponding os.environ entries:
new_env = {k: ':'.join([env[k], v]) if k in env else v for k, v in os.environ.items()}
