I would like to deploy an application using Fabric to a proxied server. Normally we ssh to a proxy server and then ssh to the production server; however, Fabric doesn't seem to allow for this directly.
An example of the setup would be local --> server A (Proxy) --> Server B (App server)
The destination is server B.
I have tried using the fabfile below to test.
import os.path

from fabric.api import env, run, sudo, cd, local, put, settings
from fabric.contrib.files import sed, exists
from datetime import datetime


def proxy():
    env.user = "root"
    env.hosts = ['proxy']
    env.key_filename = "/home/root/monitorserver.pem"


def production():
    """Defines production environment."""
    env.is_debuggable = False
    env.user = "root"
    env.hosts = ['appserver']
    env.key_filename = "/home/root/appserver.pem"


def createfile():
    """Execute test commands."""
    sudo("touch /tmp/test_%s" % datetime.now().strftime('%H:%M:%S'))
But running the command
fab proxy createfile production createfile
only seems to work as two separate invocations:
fab proxy createfile
fab production createfile
Is there a way I can run fabric locally and deploy to server B with the proxy in place?
I think this can be done by creating two fabfiles: one on the local machine and one on the proxy server.
from fabric.api import env, run, sudo, cd
from datetime import datetime


def proxy():
    env.user = "root"
    env.hosts = ['proxy']
    env.key_filename = "/home/root/monitorserver.pem"
    with cd('/home/root/'):
        createfile()
        run("fab production")


def production():
    """Defines production environment."""
    env.is_debuggable = False
    env.user = "root"
    env.hosts = ['appserver']
    env.key_filename = "/home/root/appserver.pem"
    createfile()


def createfile():
    """Execute test commands."""
    sudo("touch /tmp/test_%s" % datetime.now().strftime('%H:%M:%S'))
Run fab proxy.
(Haven't tested the code, but something like this should work.)
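For reference, Fabric 1.5+ also exposes an env.gateway setting (and a -g/--gateway command-line flag) that tunnels the connection to the app server through the proxy, which may avoid the two-fabfile approach entirely. A minimal sketch, assuming the same key-based root logins as above:

from datetime import datetime
from fabric.api import env, sudo

env.gateway = 'root@proxy'          # server A, the SSH hop
env.hosts = ['root@appserver']      # server B, the real target
env.key_filename = ['/home/root/monitorserver.pem', '/home/root/appserver.pem']


def createfile():
    """Execute test commands on server B, tunnelled through the proxy."""
    sudo("touch /tmp/test_%s" % datetime.now().strftime('%H:%M:%S'))

With this in place, fab createfile should reach appserver via proxy in a single run.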
Here is my script:
from fabric2 import Connection

c = Connection('127.0.0.1')
with c.cd('/home/bussiere/'):
    c.run('ls -l')
But I have this error:
paramiko.ssh_exception.AuthenticationException: Authentication failed.
So how do I run a command on localhost?
In Fabric2, the Connection object has a local() method.
Have a look at this object's documentation here.
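In other words, something like this minimal sketch should run the command locally without opening an SSH session at all (the host string is only a placeholder):

from fabric2 import Connection

c = Connection('user@some_remote_host')   # placeholder; local() does not connect over SSH
c.local('ls -l /home/bussiere/')          # runs on your own machine, so no authentication is needed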
As of July 2020, with fabric2, if you don't pass a hosts argument to your task decorator, by default you are on the local machine.
For example, the following will run on your local machine (localhost):
Example 1: only on local
# python3
# fabfile.py
from fabric import task, Connection

c = Connection('remote_user@remote_server.com')


@task
def DetailList(c):
    c.run('ls -l')  # will run on the local machine because the @task decorator has no hosts parameter
You then would run this on your machine with
fab DetailList
If you want to mix code that runs on the remote server with code that runs locally, you should pass the hosts to the @task decorator as a parameter.
Example 2: on local and on remote (but different functions)
# python3
# fabfile.py

# imports
from fabric import task, Connection

# variables
list_of_hosts = ['user@yourserver.com']  # you should already have configured SSH access
c = Connection(list_of_hosts[0])
working_dir = '/var/www/yourproject'


# will run on remote
@task(hosts=list_of_hosts)
def Update(c):
    c.run('sudo apt-get update')  # will run on the remote server because hosts are passed to the task decorator
    c.run(f'cd {working_dir} && git pull')  # will run on the remote server because hosts are passed to the task decorator
    c.run('sudo service apache2 restart')  # will run on the remote server because hosts are passed to the task decorator


# will run on local because you do not specify a host
@task
def DetailsList(c):
    c.run('ls -l')  # will run on the local machine because hosts are NOT passed to the task decorator
As mentioned by Ismaïl, there is also a local() method that can be used when the hosts parameter is passed: local() will run on localhost even though you have specified hosts in the task decorator. Be careful, though: you cannot use local() if you didn't specify any hosts parameter; use run() instead, as shown in examples 1 and 2.
Example 3: use both the remote and local servers within the same function. Note that we are not decorating the functions called from UpdateAndRestart.
# python3
# fabfile.py

# imports
from fabric import task, Connection

# variables
list_of_hosts = ['www.yourserver.com']  # you should already have configured SSH access
c = Connection(list_of_hosts[0])
working_dir = '/var/www/yourproject'


def UpdateServer(c):
    c.run('sudo apt-get update')  # will run on the remote server because hosts are passed to the calling task's decorator
    c.local('echo the remote server is now updated')  # will run on the local machine because the local method is used


def PullFromGit(c):
    c.run(f'cd {working_dir} && git pull')  # will run on the remote server
    c.local('echo Git repo is now pulled')  # will run on the local machine


def RestartServer(c):
    c.run('sudo service apache2 restart')  # will run on the remote server
    c.local('echo Apache2 is now restarted')  # will run on the local machine


@task(hosts=list_of_hosts)
def UpdateAndRestart(c):
    UpdateServer(c)
    PullFromGit(c)
    RestartServer(c)
    c.local('echo you have updated, pulled and restarted Apache2')  # will run on the local machine
You will be able to run the entire stack with:
fab UpdateAndRestart
I'm trying to copy a file from my local machine to multiple remote hosts using Fabric's put command. When I run it, it doesn't complain about anything, but it does not copy the file.
Secondly, my remote server already has the file; could that be an issue here? Below is the code.
import sys

from fabric.api import env
from fabric.operations import run, put

env.skip_bad_hosts = True
env.command_timeout = 160
env.user = 'jaggle'
env.shell = "/bin/sh -c"
env.warn_only = True
env.password = '!#heyNaSaFreak'

use_sudo = True


def readhost():
    env.hosts = [line.strip() for line in sys.stdin.readlines()]


def copyLDAP():
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
Below is the output of the run:
$ echo "tt-server01" | fab readhost -f OpenDgCopy.py copyLDAP
[tt-server0] Executing task 'copyLDAP'
[tt-server0] put: /Karn/ldap.conf -> /etc/ldap.conf
Done.
Disconnecting from tt-server0... done.
Instead of the readhost task, you can directly use the -H option with a comma-separated list of hosts:
fab -f OpenDgCopy.py copyLDAP -H tt-server0,tt-server1
According to the documentation of put:
As with the OpenSSH sftp program, .put will overwrite
pre-existing remote files without requesting confirmation.
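For completeness, a trimmed sketch of what the fabfile could look like once readhost is dropped (user, shell and paths are taken from the question; adjust to your setup):

from fabric.api import env
from fabric.operations import put

env.user = 'jaggle'
env.shell = "/bin/sh -c"
env.warn_only = True


def copyLDAP():
    # use_sudo uploads to a temporary location first, then moves the file into
    # place with sudo, overwriting any existing /etc/ldap.conf
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)

You would then invoke it with fab -f OpenDgCopy.py -H <host1>,<host2> copyLDAP.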
I need to ssh to a remote Ubuntu server to do some routine job, in following steps:
ssh in as userA
sudo su - userB
run the daliy_python.py script, which uses psycopg2 to read some info from the database (via a local, non-TCP/IP connection)
scp readings to my local machine
The question is: How to do that automatically?
I've tried to use Fabric, but I ran into a problem with psycopg2: after I run the Fabric script below, I receive this error from my daliy_python.py:
psycopg2.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/xxx/.s.xxxx"?
My fabfile.py code is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False


def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with cd('/home/%s/script/script_folder' % user):
            sudo('whoami')
            sudo('pwd', user=user)
            sudo('ls', user=user)
            sudo('python daliy_python.py', user=user)
Any suggestions? My database can only be accessed by userB locally, but only userA can ssh to the server. That might be a limitation. Both the local and remote machines are running Ubuntu 14.04.
This is what I do to read my root-accessible logfiles without an extra login:
ssh usera@12.34.56.78 "echo hunter2 | sudo -S tail -f /var/log/nginx/access.log"
That is: ssh usera@12.34.56.78 "..run this code on the remote.."
Then on the remote, you pipe the sudo password into sudo -S: echo hunter2 | sudo -S ...
Add -u userb to sudo to switch to a particular user (I am using root in my case). Then, as the sudo'ed user, run your script; in my case tail -f /var/log/nginx/access.log.
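If you prefer to stay inside Fabric, the same idea can be sketched like this (host, password and users are just the placeholders from this answer):

from fabric.api import env, sudo

env.hosts = ['usera@12.34.56.78']
env.password = 'hunter2'   # Fabric reuses this for the sudo prompt as well


def tail_log():
    # equivalent of: echo <password> | sudo -S -u userb tail ...
    sudo('tail -n 100 /var/log/nginx/access.log', user='userb')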
But, reading your post, I would probably simply set up a cronjob on the remote, so it runs automatically. I actually do that for all my databases. A cronjob dumps them once a day to a certain directory, with the date as filename. Then I download them to my local PC with rsync an hour later.
I finally found out where my problem was.
Thanks @chishake and @C14L, I looked at the problem in another way.
After being inspired by these posts (link1, link2), I started to think the problem was related to environment variables.
Thus I added a with statement to alter $HOME, and it worked.
fabfile.py is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False


def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with shell_env(HOME='/home/%s' % user):
            sudo('echo $HOME', user=user)
            with cd('/home/%s/script/script_folder' % user):
                sudo('whoami')
                sudo('pwd', user=user)
                sudo('ls', user=user)
                sudo('python daliy_python.py', user=user)
In my fabfile, I have set env.use_ssh_config to True. Whenever I run the fabfile, it gets the correct hostname and user from the ssh config, but not the correct key. It goes through my keys (all stored in ~/.ssh/) at random, requiring me to enter the passphrase for each of them, until it gets to the correct key.
It's only fabric that gives me this problem. Running scp as a local command in the fabfile uses the correct key.
Entries in my ssh config look like this:

Host example
    HostName example.com
    User elssar
    IdentityFile ~/.ssh/id_example
    PreferredAuthentications publickey
I'm using Fabric 1.10.1, Paramiko 1.14.1, Python 2.7.3 and Ubuntu 12.04.
Edit - There is a related open issue in the fabric repository - https://github.com/fabric/fabric/issues/1282
Edit - basic structure of my fabfile, and how I run it
from fabric.api import env, run


def do_something():
    run("echo test")


def server(host):
    env.hosts = [host]

# command
fab server:hostname do_something
I tried to check on my setup; here is what I did to debug:
>>> from fabric.network import key_filenames
>>> key_filenames()
[]
>>> from fabric.state import env
>>> env.use_ssh_config = True
>>> env.host_string = 'salt-states'
>>> key_filenames()
['/Users/damien/.ssh/salt.rsa.priv']
Update: you could modify your fabfile to instrument your task:
from fabric.api import env, run
from fabric.network import key_filenames


def do_something_debug():
    env.use_ssh_config = True
    print key_filenames()
    run("echo test")


def server(host):
    env.hosts = [host]
then run the command
fab server:hostname do_something_debug
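If the wrong keys keep being offered, another option (not from the original answers, just a sketch using paramiko's SSH config parser) is to resolve the IdentityFile yourself and set env.key_filename explicitly:

import os

from fabric.api import env, run
from paramiko.config import SSHConfig


def server(host):
    config = SSHConfig()
    with open(os.path.expanduser('~/.ssh/config')) as f:
        config.parse(f)
    entry = config.lookup(host)                   # e.g. 'example'
    env.hosts = [entry.get('hostname', host)]
    env.user = entry.get('user', env.user)
    env.key_filename = entry.get('identityfile')  # a list of paths, or None


def do_something():
    run("echo test")

Then run it as before with fab server:example do_something.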
Fabric has a hosts setting to specify which computers to SSH into.
Amazon Web Services has more of a dynamic inventory that can be queried in Python using tools like boto.
Is there a way to combine these two services? Ideally, I want something as simple as Ansible's approach with an inventory file and an external script like ec2.py.
More specifically, is there a prebaked solution for this use case? Ideally, I would like to run something straightforward like this:
from fabric.api import env, run, task
import ec2

env.roledefs = ec2.Inventory()


@task
def command():
    run("lsb_release -a")
And run it like so, assuming env.roledefs['nginx'] exists:
$ fab -R nginx command
You can use fabric and boto concurrently.
First you need to export AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and the region (AWS_EC2_REGION) in your shell. The Fabric file should be named fabfile.py, not ec2.py or anything else.
import boto, urllib2
from boto.ec2 import connect_to_region
from fabric.api import env, run, cd, settings, sudo, task
from fabric.api import parallel
import os
import sys

REGION = os.environ.get("AWS_EC2_REGION")

env.user = "ec2-user"
env.key_filename = ["/home/user/uswest.pem"]


@task
def command():
    run("lsb_release -a")


def _create_connection(region):
    print "Connecting to ", region
    conn = connect_to_region(
        region_name=region,
        aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID"),
        aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY")
    )
    print "Connection with AWS established"
    return conn
Finally, this program can be executed using the command below.
$ fab command
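Note that the snippet above stops at opening the boto connection; to actually point the task at your instances you still need to fill env.hosts (or env.roledefs) from it. A rough sketch, with a placeholder tag filter:

# fill env.hosts before tasks run; 'tag:Role' and 'web' are placeholders
instances = _create_connection(REGION).get_only_instances(filters={'tag:Role': 'web'})
env.hosts = [i.public_dns_name for i in instances if i.state == 'running']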
From http://docs.python-guide.org/en/latest/scenarios/admin/
You can see that if you set
env.hosts = ['my_server1', 'my_server2']
You'll then be able to target those hosts.
With boto, if you just have a function that does
ec2_connection.get_only_instances(filters={'tag:<key>': '<whatever>'})
and returns a list of their dns names, you'll be able to then set
env.hosts = [< list of dns names from ec2>]
Piece of cake!
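Putting that together, a small sketch of what such a helper might look like (boto 2.x; the tag key, role name and region are placeholders, and it relies on Fabric's support for callable roledefs):

import boto.ec2
from fabric.api import env, run, task


def ec2_hosts(role, region='us-east-1'):
    conn = boto.ec2.connect_to_region(region)
    instances = conn.get_only_instances(filters={'tag:Role': role})  # 'Role' is a placeholder tag key
    return [i.public_dns_name for i in instances if i.state == 'running']


env.roledefs = {
    'nginx': lambda: ec2_hosts('nginx'),  # resolved by Fabric at runtime
}


@task
def command():
    run("lsb_release -a")

After which fab -R nginx command works roughly as the question hoped.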