Python Fabric: perform a remote database task

I need to ssh to a remote Ubuntu server to do a routine job, in the following steps:
ssh in as userA
sudo su - userB
run the daliy_python.py script, which uses psycopg2 to read some info from the database (via a local connection (non-TCP/IP))
scp the readings to my local machine
The question is: how do I do that automatically?
I've tried to use Fabric, but I ran into a problem with psycopg2: after I run the Fabric script below, I receive this error from my daliy_python.py:
psycopg2.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/xxx/.s.xxxx"?
My fabfile.py code is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with cd('/home/%s/script/script_folder' % user):
            sudo('whoami')
            sudo('pwd', user=user)
            sudo('ls', user=user)
            sudo('python daliy_python.py', user=user)
Any suggestions? My database can only be accessed by userB locally, but only userA can ssh into the server. That might be a limitation. Both the local and remote machines are running Ubuntu 14.04.

This is what I do to read my root-accessible logfiles without an extra login:
ssh usera@12.34.56.78 "echo hunter2 | sudo -S tail -f /var/log/nginx/access.log"
That is: ssh usera@12.34.56.78 "..run this code on the remote.."
Then on the remote, you pipe the sudo password into sudo -S: echo hunter2 | sudo -S ...
Add -u userb to sudo to switch to a particular user; I am using root in my case. Then, as the sudo'ed user, run your script. In my case that is tail -f /var/log/nginx/access.log.
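Applied to the question above, the whole chain could be collapsed into one command along these lines (a sketch; it assumes userA's sudo password is accepted and reuses the paths from the question):
ssh userA@server.hostname.com "echo <sudo-password> | sudo -S -u userB python /home/userB/script/script_folder/daliy_python.py"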
But, reading your post, I would probably simply set up a cronjob on the remote, so it runs automatically. I actually do that for all my databases. A cronjob dumps them once a day to a certain directory, with the date as filename. Then I download them to my local PC with rsync an hour later.
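For illustration, that setup could look something like this (a sketch; the database name, paths and times are assumptions):
# remote, in the crontab: dump the database daily at 01:00
0 1 * * * pg_dump -Fc mydb > /home/userB/dumps/$(date +\%F).dump
# local PC: fetch the dumps an hour later
0 2 * * * rsync -az userA@server.hostname.com:/home/userB/dumps/ ~/db-backups/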

I finally found out where my problem was.
Thanks @chishake and @C14L, I looked at the problem in another way.
Inspired by these posts (link1, link2), I started to think the problem was related to environment variables.
Thus I added a with statement to alter $HOME, and it worked.
fabfile.py is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with shell_env(HOME='/home/%s' % user):
            sudo('echo $HOME', user=user)
            with cd('/home/%s/script/script_folder' % user):
                sudo('whoami')
                sudo('pwd', user=user)
                sudo('ls', user=user)
                sudo('python daliy_python.py', user=user)
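With that fix in place, the whole routine runs with one local command:
fab -f fabfile.py run_remote
The final scp step from the question could be covered by Fabric's get() operation, e.g. get('/home/userB/script/script_folder/readings.csv', '.') (the filename here is only a placeholder).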

What is remote port forwarding command `ssh -R 80:localhost:8080 nokey@localhost.run` equivalent to in Paramiko?

ssh -R 80:localhost:8080 nokey@localhost.run is equivalent to what in Paramiko?
I searched a lot but had no success.
Code I tried
import paramiko

command = "df"
# Update the next three lines with your
# server's information
username = "localhost:9876"
host = "nokey@localhost.run"
client = paramiko.client.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host, port=80, username=username)
_stdin, _stdout, _stderr = client.exec_command(command)
print(_stdout.read().decode())
client.close()
A partial equivalent is Transport.request_port_forward.
But that will only set up the server-side part of the forwarding. The local-side implementation, which establishes the local connection from the SSH client (Paramiko) to the local server, is up to you.
An example of how to implement it is given in demos/rforward.py.
An equivalent of your command will be:
python3 rforward.py -u nokey -p 80 -r 127.0.0.1:8080 localhost.run
Note that the rforward.py script does not start a shell. It just does the forwarding, so it's actually an equivalent of ssh with the -N switch.
If you need both forwarding and a shell, you will need to modify the script to additionally call SSHClient.invoke_shell, like demos/demo.py does.
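For orientation, the core of the rforward.py approach can be sketched like this (a simplified, hand-written sketch, not the demo itself; host, user and ports are taken from the question):
import select
import socket
import threading

import paramiko

def forward_to_local(chan, local_host, local_port):
    # Shuttle bytes between the forwarded channel and the local server
    sock = socket.socket()
    sock.connect((local_host, local_port))
    while True:
        r, _, _ = select.select([sock, chan], [], [])
        if sock in r:
            data = sock.recv(4096)
            if not data:
                break
            chan.send(data)
        if chan in r:
            data = chan.recv(4096)
            if not data:
                break
            sock.send(data)
    chan.close()
    sock.close()

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("localhost.run", username="nokey")
transport = client.get_transport()
# Ask the server to listen on its port 80 and hand connections back to us
transport.request_port_forward("", 80)
while True:
    chan = transport.accept(60)
    if chan is None:
        continue
    threading.Thread(target=forward_to_local,
                     args=(chan, "127.0.0.1", 8080),
                     daemon=True).start()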

Duo API Directory Sync

By default, Duo Sync runs once daily; due to business demand, this needs to be done every 2 hours. Looking at the Duo API, there is a command for user sync:
python -m duo_client.client --ikey <> --skey <> --host api-<>.duosecurity.com --method POST --path /admin/v1/users username=<> /directorysync/<DIR SYNC>/syncuser
However, I don't see an API for a general overall sync with Active Directory. To work around that, I was hoping to get all the users from the 2FA group and sync them by username in a loop, using the following:
import sys
import os
import duo_client
from ldap3 import Server, Connection, ALL, NTLM, ALL_ATTRIBUTES, ALL_OPERATIONAL_ATTRIBUTES, AUTO_BIND_NO_TLS, SUBTREE
from ldap3.core.exceptions import LDAPCursorError

server_name = ''
domain_name = ''
user_name = ''
password = '!'

admin_api = duo_client.Admin(
    ikey="",
    skey="",
    host="api-.duosecurity.com",
)

format_string = '{:40}'
print(format_string.format('samaccountname'))

server = Server(server_name, get_info=ALL)
conn = Connection(server, user='{}\\{}'.format(domain_name, user_name), password=password,
                  authentication=NTLM, auto_bind=True)
conn.search('dc={},dc=int'.format(domain_name),
            '(&(objectCategory=user)(memberOf=CN=2FA,OU=,OU=,OU=,OU=,DC=,DC=int))',
            attributes=[ALL_ATTRIBUTES, ALL_OPERATIONAL_ATTRIBUTES])

for e in sorted(conn.entries):
    print(e.samaccountname)
    os.system("python -m duo_client.client --ikey --skey --host api-.duosecurity.com --method POST --path /admin/v1/users username={}/directorysync//syncuser".format(e.samaccountname))
The above code somewhat works, but for some users it also re-creates them with User_IDs such as "username/Dir/DIRAPI/usersync", as shown in the screenshots (Duo API, Syncing User).
It seemed the username={} was in the wrong place.
The call below creates a new user, hence why I was seeing username/..../....:
POST /admin/v1/users username={}
Below is the right way to use the API call:
os.system("python -m duo_client.client --ikey --skey --host api-.duosecurity.com --method POST --path /admin/v1/users/directorysync/syncuser username={}".format(e.samaccountname))

Fabric to copy the File from local host to multiple remote hosts

I'm trying to copy a file from my local Fabric system to multiple remote hosts using the Fabric put command. When I run it, it doesn't complain about anything, but it does not copy the file.
Secondly, I see my remote server already has the file; might that be an issue here? Below is the code.
import sys
from fabric.api import env
from fabric.operations import run, put

env.skip_bad_hosts = True
env.command_timeout = 160
env.user = 'jaggle'
env.shell = "/bin/sh -c"
env.warn_only = True
env.password = '!#heyNaSaFreak'
use_sudo = True

def readhost():
    env.hosts = [line.strip() for line in sys.stdin.readlines()]

def copyLDAP():
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
Below is the output of the run:
$ echo "tt-server01" | fab readhost -f OpenDgCopy.py copyLDAP
[tt-server0] Executing task 'copyLDAP'
[tt-server0] put: /Karn/ldap.conf -> /etc/ldap.conf
Done.
Disconnecting from tt-server0... done.
Instead of the readhost task, you can directly use the -H option with a comma-separated list of hosts:
fab -f OpenDgCopy.py copyLDAP -H tt-server0,tt-server1
According to the documentation of put:
As with the OpenSSH sftp program, put will overwrite
pre-existing remote files without requesting confirmation.
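To see whether the transfer actually succeeded on each host, you can also inspect the value returned by put; in Fabric 1.x it exposes .failed and .succeeded attributes (a small sketch of the task above):
def copyLDAP():
    result = put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
    if result.failed:
        print("failed to copy: {}".format(result.failed))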

pg_dump password in python

I have a remote Ubuntu server, say 172.123.342.12. I want to take a backup of a PostgreSQL database onto my local machine via a Python script.
My Script is:
def backUp(self):
    Pass = 'fb2024d4'
    os.putenv("PGPASSWORD", Pass)
    dt = datetime.now()
    format = "%Y_%b_%d"
    cur_time = dt.now()
    form_time = cur_time.strftime(format)
    backup_str = "C:\\Bitnami\\odoo-8.0-7\\postgresql\\bin\\pg_dump.exe --format=c -h 172.123.342.12 -p 5432 -d new_db -U bn_openerp > C:\\Users\\n\\Desktop\\Odoo_Backups\\%s.dump" % form_time
    os.system(backup_str)
    print("Backup Created in Desktop")
    box.showinfo("Information", "Backup Created")

backup()
It does nothing. Some help will be appreciated.
EDIT: The script works on a database on Windows, as I am using an admin account, so it does not ask for a password. But when I try to back up a database from the remote Ubuntu server, it asks for a password.
I have tried the following solutions:
1.) SET PGPASSPASSWORD = C:\foo\bar..\pgpass.conf
2.) os.putenv("PGPASSWORD", "password")
3.) PGPASSWORD='password' pg_dump.exe -h localhost.....
None of them worked for me.
I was able to use a Python script to create a dump file using pg_dump.exe:
import subprocess

filename = 'C:/Path/To/File/mydb_dump.sql'
pgDump = 'C:/Program Files/PostgreSQL/9.5/bin/pg_dump'
subprocess.Popen('"{}" -h 127.0.0.1 dbname > "{}"'.format(pgDump, filename), shell=True)
A few things to note:
I STRONGLY CAUTION AGAINST USING shell=True !!!
There is a huge security hazard with possible shell injections, as per the documentation.
I'm not sure if it will work with a remote Ubuntu server, but I couldn't see why not if all permissions and sharing are set up properly.
I know this is pretty old, but I hope it helps.
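Building on that caution, here is a sketch of the same backup without shell=True, passing PGPASSWORD through the environment (paths and credentials are reused from the question; a pgpass.conf file would be safer still):
import os
import subprocess

# Pass the password via the environment instead of answering the prompt
env = os.environ.copy()
env["PGPASSWORD"] = "fb2024d4"

with open(r"C:\Users\n\Desktop\Odoo_Backups\backup.dump", "wb") as out:
    subprocess.run(
        [r"C:\Bitnami\odoo-8.0-7\postgresql\bin\pg_dump.exe",
         "--format=c", "-h", "172.123.342.12", "-p", "5432",
         "-d", "new_db", "-U", "bn_openerp"],
        stdout=out, env=env, check=True,
    )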

Fabric not using the correct key

In my fabfile, I have set env.use_ssh_config to True. Whenever I run the fabfile, it gets the correct hostname and user from the SSH config, but not the correct key. It goes through my keys (all stored in ~/.ssh/) at random, requiring me to enter the passphrase for all of them, until it gets to the correct key.
It's only Fabric that gives me this problem. Running scp as a local command in the fabfile uses the correct key.
Entries in my ssh config look like this:
Host example
    HostName example.com
    User elssar
    IdentityFile ~/.ssh/id_example
    PreferredAuthentications publickey
I'm using Fabric 1.10.1 and Paramiko 1.14.1, Python 2.7.3 and Ubuntu 12.04.
Edit - There is a related open issue in the fabric repository - https://github.com/fabric/fabric/issues/1282
Edit - basic structure of my fabfile, and how I run it
from fabric.api import env, run

def do_something():
    run("echo test")

def server(host):
    env.hosts = [host]

# command
fab server:hostname do_something
I tried to check on my setup; here is what I did to debug:
>>> from fabric.network import key_filenames
>>> key_filenames()
[]
>>> from fabric.state import env
>>> env.use_ssh_config = True
>>> env.host_string = 'salt-states'
>>> key_filenames()
['/Users/damien/.ssh/salt.rsa.priv']
update: you could update your fabfile to instrument your task:
from fabric.api import env, run
from fabric.network import key_filenames

def do_something_debug():
    env.use_ssh_config = True
    print key_filenames()
    run("echo test")

def server(host):
    env.hosts = [host]
then run the command
fab server:hostname do_something_debug
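If key_filenames() comes back empty for your host, a pragmatic workaround (until the linked issue is resolved) is to point Fabric at the key explicitly instead of relying on the SSH config lookup:
from fabric.api import env

env.use_ssh_config = True
env.key_filename = '~/.ssh/id_example'  # the IdentityFile from the ssh config above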
