I have a remote Ubuntu server, say 172.123.342.12. I want to take a backup of a PostgreSQL database to my local machine via a Python script.
My script is:
def backUp(self):
    Pass = 'fb2024d4'
    os.putenv("PGPASSWORD", Pass)
    dt = datetime.now()
    format = "%Y_%b_%d"
    cur_time = dt.now()
    form_time = cur_time.strftime(format)
    backup_str = "C:\\Bitnami\\odoo-8.0-7\\postgresql\\bin\\pg_dump.exe --format=c -h 172.123.342.12 -p 5432 -d new_db -U bn_openerp > C:\\Users\\n\\Desktop\\Odoo_Backups\\%s.dump" % form_time
    os.system(backup_str)
    print("Backup Created in Desktop")
    box.showinfo("Information", "Backup Created")

backup()
It does nothing. Any help would be appreciated.
EDIT: The script works on a local Windows database; since I am using the admin account, it does not ask for a password. But when I try to back up a database on the remote Ubuntu server, it asks for a password.
I have tried the following solutions:
1.) SET PGPASSFILE=C:\foo\bar..\pgpass.conf
2.) os.putenv("PGPASSWORD","password")
3.) PGPASSWORD='password' pg_dump.exe -h localhost.....
None of them worked for me.
I was able to use a Python script to create a dump file using pg_dump.exe:
import subprocess

filename = 'C:/Path/To/File/mydb_dump.sql'
pgDump = 'C:/Program Files/PostgreSQL/9.5/bin/pg_dump'
subprocess.Popen('"{}" -h 127.0.0.1 dbname > "{}"'.format(pgDump, filename), shell=True)
A few things to note:
I STRONGLY CAUTION AGAINST USING shell=True !!!
There is a huge security hazard with possible shell injection, as per the documentation.
I'm not sure if it will work with a remote Ubuntu server, but I can't see why not if all permissions and sharing are set up properly.
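If you do want to avoid shell=True, one option is to pass an argument list and let Python handle the output file and the PGPASSWORD environment variable. This is a sketch only; the paths, host and credentials are placeholders taken from the question:

import os
import subprocess

pg_dump = r'C:\Program Files\PostgreSQL\9.5\bin\pg_dump.exe'  # adjust to your install
env = dict(os.environ, PGPASSWORD='secret')  # password via the environment, not the command line

# write the dump by redirecting stdout to a file object instead of using a shell
with open(r'C:\Users\n\Desktop\Odoo_Backups\new_db.dump', 'wb') as out:
    subprocess.check_call(
        [pg_dump, '--format=c', '-h', '172.123.342.12', '-p', '5432',
         '-d', 'new_db', '-U', 'bn_openerp'],
        stdout=out, env=env)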
I know this is pretty old, but I hope it helps.
I'm a software tester, trying to verify that the log on a remote QNX (a Unix-like RTOS) machine will contain the correct entries after specific actions are taken. I am able to list the contents of the directory in which the log resides, and use that information in the command to read the file (I really want to use tail -n XX <file>). So far, I always get "(No such file or directory)" when trying to read the file.
We are using Froglogic Squish for automated testing, because the Windows UI (that interacts with the server piece on QNX) is built using Qt extensions for standard Windows elements. Squish uses Python 2.7, so I am using Python 2.7.
I am using paramiko for the SSH connection to the QNX server. This has worked great for sending commands to the simulator piece that also runs on the QNX server.
So, here's the code. Some descriptive names have been changed to avoid upsetting my employer.
import sys
import time
import select
sys.path.append(r"C:\Python27\Lib\site-packages")
sys.path.append(r"C:\Python27\Lib\site-packages\pip\_vendor")
import paramiko

# SSH configuration variables
ssh_host = 'vvv.xxx.yyy.zzz'
thelog_dir = "/logs/the/"
ssh_user = 'un'
ssh_pw = 'pw'

def execute_Command(fullCmd):
    outptLines = []
    #
    # Try to connect to the host.
    # Retry a few times if it fails.
    #
    i = 1
    while True:
        try:
            ssh = paramiko.SSHClient()
            ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            ssh.connect(ssh_host, 22, ssh_user, ssh_pw)
            break
        except paramiko.AuthenticationException:
            log("Authentication failed when connecting to %s" % ssh_host)
            return 1
        except:
            log("Could not SSH to %s, waiting for it to start" % ssh_host)
            i += 1
            time.sleep(2)
        # If we could not connect within the time limit
        if i == 30:
            log("Could not connect to %s. Giving up" % ssh_host)
            return 1
    # Send the command (non-blocking?)
    stdin, stdout, stderr = ssh.exec_command(fullCmd, get_pty=True)
    for line in iter(stdout.readline, ""):
        outptLines.append(line)
    #
    # Disconnect from the host
    #
    ssh.close()
    return outptLines

def get_Latest_Log():
    fullCmd = "ls -1 %s | grep the_2" % thelog_dir
    files = execute_Command(fullCmd)
    theFile = files[-1]
    return theFile

def main():
    numLines = 20
    theLog = get_Latest_Log()
    print("\n\nThe latest log is %s\n\n" % theLog)
    fullCmd = "cd /logs/the; tail -n 20 /logs/the/%s" % theLog
    #fullCmd = "tail -n 20 /logs/the/%s" % theLog
    print fullCmd
    logLines = execute_Command(fullCmd)
    for line in logLines:
        print line

if __name__ == "__main__":
    # execute only if run as a script
    main()
I have tried to read the file using both tail and cat. I have also tried to get and open the file using Paramiko's SFTP client.
In all cases, the attempt to read the file fails, despite the fact that listing the contents of the directory works fine (?!). And BTW, the log file is supposed to be readable by 'world'; permissions are -rw-rw-r--.
The output I get is:
"C:\Users\xsat086\Documents\paramikoTest>python SSH_THE_MsgChk.py
The latest log is the_20210628_115455_205.log
cd /logs/the; tail -n 20 /logs/the/the_20210628_115455_205.log
(No such file or directory)the/the_20210628_115455_205.log"
The file name is correct. If I copy and paste the tail command into an interactive SSH session with the QNX server, it works fine.
Is it something to do with the 'non-interactive' nature of this method of sending commands? I read that some implementations of SSH are built upon a command that offers a very limited environment. I don't see how that would impact this tail command.
Or am I doing something stupid in this code?
I cannot fully explain why you get the results you get.
But in general, corrupted output is the result of enabling terminal emulation and then not handling it. You enable terminal emulation with get_pty=True. Remove it; you should not use terminal emulation when automating command execution.
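In the code above, that means the exec_command call becomes simply:

stdin, stdout, stderr = ssh.exec_command(fullCmd)  # no get_pty, so no terminal emulation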
Related question:
Is there a simple way to get rid of junk values that come when you SSH using Python's Paramiko library and fetch output from CLI of a remote machine?
I have a .yml file on a remote server, and I want to make changes to it using Python Fabric. If it can be done with other Python libraries, feel free to share.
Thank you
You are trying to edit a line in the middle of a remote file, which imo is not possible.
What you can do is make a copy of the remote file on your local machine with the desired values changed, and then send it back to the remote server.
from fabric import Connection as connection, task

@task
def executeTask(ctx):
    with connection(host=dev_server, user=myuser) as c:
        c.put('PATH_TO_YOUR_YML_FILE_LOCALLY', 'PATH_TO_YOUR_REMOTE_YML_FILE')
Don't forget to:
Replace dev_server and myuser with the remote server's IP and your username on it.
Put the code above in a file called fabfile.py, then run fab executeTask from your command line.
The code above is Fabric 2.4 compatible.
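If you also need to pull the file down and edit it before sending it back, a minimal round-trip sketch could look like this (assuming PyYAML is installed locally; the paths and the 'port' key are made up for illustration):

import yaml  # PyYAML, assumed to be available locally
from fabric import Connection

def update_remote_yml(host, user, remote_path, local_path):
    with Connection(host=host, user=user) as c:
        c.get(remote_path, local_path)       # copy the remote file to the local machine
        with open(local_path) as f:
            data = yaml.safe_load(f)
        data['port'] = 8080                  # 'port' is a hypothetical key to change
        with open(local_path, 'w') as f:
            yaml.safe_dump(data, f)
        c.put(local_path, remote_path)       # send the edited file back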
EDIT:
Because of a permissions issue, you can do the following:
@task
def executeTask(ctx):
    with connection(host=dev_server, user=myuser) as c:
        c.put("PATH_TO_YOUR_YML_FILE_LOCALLY")  # implicitly sent to the remote $HOME
        c.sudo("mv YOUR_FILE_NAME YOUR_DESIRED_LOCATION")  # again implicitly with a CWD of $HOME
        c.sudo("chown root:root YOUR_REMOTE_FILE")
Reference:
https://github.com/fabric/fabric/issues/1750#issuecomment-406043571
If you just need to change a port number, you can use sed, like this:

from fabric.api import cd, run

def change_port(filename):
    with cd('/location'):
        run('sed -i "s/old_port_number/new_port_number/g" ' + filename)
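Note that this last snippet uses the Fabric 1.x API (fabric.api); with Fabric 2.x, the rough equivalent would presumably be:

from fabric import Connection

def change_port(c, filename):
    # c is a fabric.Connection; /location and the port numbers are placeholders
    with c.cd('/location'):
        c.run('sed -i "s/old_port_number/new_port_number/g" ' + filename)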
I have a simple request. I want to connect to an already existing google compute engine instance, run a command, and close the connection.
I have used the great sample code here for instance creation and deletion.
Additionally, I have a startup script running which works perfectly.
Now I am reading this article to use paramiko to connect to my instance. This may or may not be the best thing to do, so please correct me if I am going down the wrong path.
I have the following sample code:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('35.***.***.**',username='user',password='pass')
stdin, stdout, stderr = ssh.exec_command("sudo su -")
stdin, stdout, stderr = ssh.exec_command("ls -l")
stdout.readlines()
Now I am not sure which username or password I am supposed to use.
When I run this code, I do not get the list of files and directories in /root as I want, but I do get a list of files and directories in the default user account's home, so it is connecting.
My goal is to connect to a GCE instance, run a command, and that is it! For some reason it is trickier than I anticipated. Am I doing something wrong here?
If you are facing a similar use case, you can explore gcloud compute ssh. It worked for me, but I cannot comment on whether it is best practice or not.
My solution here was something like the following:
import subprocess

def check_for_completion(instance_name=""):
    cmd = "gcloud compute ssh %s --zone=us-east1-b --command=\"sudo -S -i -u root -p '' ls /root/temp/ \"" % instance_name
    try:
        res = subprocess.check_output(cmd, shell=True)
        items = str(res).split('\n')
        return {'response': items, 'complete': False}
    except subprocess.CalledProcessError:  # the ssh/ls command failed
        return {'response': None, 'complete': True}
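A hypothetical call (the instance name is a placeholder) would look like:

status = check_for_completion('my-instance-name')
print(status['response'], status['complete'])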
I need to ssh to a remote Ubuntu server to do some routine job, in the following steps:
1. ssh in as userA
2. sudo su - userB
3. run the daliy_python.py script, which uses psycopg2 to read some info from the database (via a local, non-TCP/IP connection)
4. scp the readings to my local machine
The question is: how do I do that automatically?
I've tried to use Fabric, but I ran into a problem with psycopg2; after I run the Fabric script below, I receive the following error from daliy_python.py:
psycopg2.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/xxx/.s.xxxx"?
My fabfile.py code is as follows:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with cd('/home/%s/script/script_folder' % user):
            sudo('whoami')
            sudo('pwd', user=user)
            sudo('ls', user=user)
            sudo('python daliy_python.py', user=user)
Any suggestions? The database can only be accessed by userB locally, but only userA can ssh to the server. That might be a limitation. Both the local and remote machines are running Ubuntu 14.04.
This is what I do to read my root-accessible logfiles without an extra login:
ssh usera@12.34.56.78 "echo hunter2 | sudo -S tail -f /var/log/nginx/access.log"
That is: ssh usera@12.34.56.78 "..run this code on the remote..".
Then on the remote, you pipe the sudo password into sudo -S: echo hunter2 | sudo -S <command>.
Add -u userb to sudo to switch to a particular user (I am using root in my case). Then, as the sudo'ed user, run your script; in my case that is tail -f /var/log/nginx/access.log.
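The same one-liner can also be driven from Python with subprocess; a sketch reusing the placeholder names from above:

import subprocess

# ssh in as usera and run the script as userb, feeding sudo its password on stdin
subprocess.call([
    'ssh', 'usera@12.34.56.78',
    "echo hunter2 | sudo -S -u userb python daliy_python.py",
])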
But reading your post, I would probably just set up a cronjob on the remote so it runs automatically. I actually do that for all my databases: a cronjob dumps them once a day to a certain directory, with the date as the filename. Then I download them to my local PC with rsync an hour later.
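The remote half of that setup can be a single crontab entry; a hypothetical example (database name and paths are made up; edit with crontab -e as userB):

# dump the database daily at 02:00 into a date-stamped file (% must be escaped in crontab)
0 2 * * * pg_dump mydb > /home/userB/dumps/mydb_$(date +\%F).sql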
I finally found out where my problem was.
Thanks @chishake and @C14L, I looked at the problem in another way.
Inspired by these posts (link1, link2), I started to think the problem was related to environment variables.
Thus I added a with statement to alter $HOME, and it worked.
The fabfile.py is below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with shell_env(HOME='/home/%s' % user):
            sudo('echo $HOME', user=user)
            with cd('/home/%s/script/script_folder' % user):
                sudo('whoami')
                sudo('pwd', user=user)
                sudo('ls', user=user)
                sudo('python daliy_python.py', user=user)
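As before, this is an ordinary Fabric 1.x task, so presumably it is run from the directory containing fabfile.py with:

fab run_remote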
I have changed my hosts file; now how do I change each hostname? My system is Ubuntu.
E.g., my hosts file:
192.168.0.100 host1.mydomain.com
192.168.0.101 host2.mydomain.com
I want the /etc/hostname file on host1 to contain host1.mydomain.com, and the hostname file on host2 to contain host2.mydomain.com.
How do I do that using Fabric? Do I have to ssh to every host and edit its hostname file, or can Fabric do this?
I don't mean using the hostname command, but editing the /etc/hostname file.
I mean, how do I use Fabric to do that? Such as:
def update_hostname():
    get("/etc/hosts", "hosts")
    hosts_content = file("hosts")
    # hostname = <get the hostname corresponding to this host's IP>
    get("/etc/hostname", "hostname")
    # update the local "hostname" file here
    put("hostname", "/etc/hostname")
How do I get the IP? Because Fabric does the job on every host, and each hostname corresponds to one host, I need to know which host the job is currently working on, get its IP, look up the hostname corresponding to that IP, and finally update that host's hostname file.
Fabric is just an SSH wrapper, so what you're looking at is Linux-specific, not Fabric- or Python-specific.
from fabric.api import run
run('hostname your-new-name')
run('echo your-new-hostname > /etc/hostname')
And just do a run(..edit..) according to your Linux dist?
Or just do:
from subprocess import Popen, PIPE

hosts = open('/etc/networking/hosts', 'rb')
for hostline in hosts.readlines():
    ip, name = hostline.split(' ')
    command = ['ssh', '-t', 'root@' + ip.strip('\r\n ,;'),
               'echo ' + name.strip('\r\n ,;') + ' > /etc/hostname']
    stdout, stderr = Popen(command, stdout=PIPE, stderr=PIPE).communicate()
hosts.close()
Note: /etc/networking/hosts might be placed somewhere else for you.
The important part here is that you loop through the hosts file and ssh to each machine, echoing the given hostname to that machine.
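If you would rather stay inside Fabric, a sketch of the same idea (Fabric 1.x; it assumes you have already downloaded the hosts file to a local file called hosts, as in the question's pseudocode) can use env.host to find out which machine the task is currently running on:

from fabric.api import env, run

def update_hostname():
    # build an IP -> FQDN map from the local copy of the hosts file
    with open('hosts') as f:
        mapping = dict(line.split() for line in f if line.strip())
    name = mapping[env.host]  # env.host is the host Fabric is currently working on
    run('echo %s > /etc/hostname' % name)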
from fabric.api import prompt, sudo
from fabric.contrib.files import sed

def hostname():
    '''
    Change the hostname of the Ubuntu server.
    '''
    server_hostname = prompt("The hostname for the server is:")
    # 'current hostname' is a placeholder for the existing value in /etc/hostname
    sed("/etc/hostname", before='current hostname', after='%s' % (server_hostname), use_sudo=True, backup='')
    sudo("init 6")  # reboot so the new hostname takes effect
This will change the hostname according to your choice.
In your Fabric script you'll need to...
ssh into the machine as a user permitted to edit the hosts file (via permissions or groups). If you need to sudo into a user, search Stack Overflow for issues regarding sudo and Fabric; you'll need to tweak your fabfile to not prompt for a password.
Fabric can have an awkward way of dealing with reading/writing/opening files, so you may be best off cd'ing into the right directory first. Something like...
with cd('/etc/'):
    run('echo new_hostname > hostname')